Robotics 10


Transcript of Robotics 10


  • Slide 2: Introduction to Robotics

    CS 491/691(X), Lecture 5

    Instructor: Monica Nicolescu


  • Slide 4: Reflective Optosensors

    Include a light emitter (light-emitting diode, LED) and a light detector (photodiode or phototransistor)

    Two arrangements, depending on the positions of the emitter and detector:

    Reflectance sensors: emitter and detector are side by side; light reflects from the object back into the detector

    Break-beam sensors: emitter and detector face each other; an object is detected if the light between them is interrupted

  • Slide 5: Calibration

    Ambient / background light can interfere with the sensor measurement

    The ambient light level should be subtracted to get only the emitter's light level (see the sketch below)

    Calibration: the process of adjusting a mechanism so as to maximize its performance

    Ambient light can change, so sensors need to be calibrated repeatedly

    Detecting ambient light is difficult if the emitter has the same wavelength; adjust the wavelength of the emitter
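    The subtraction step can be captured in a few lines. Below is a minimal sketch, assuming hypothetical read_adc() and set_emitter() helpers standing in for whatever sensor I/O the robot's library actually provides.

    ```python
    # Minimal sketch of ambient-light subtraction for a reflectance sensor.
    # read_adc() and set_emitter() are hypothetical placeholders, not a real API.
    import time

    def read_adc() -> int:
        """Hypothetical: return the raw light-detector reading (e.g., 0-1023)."""
        raise NotImplementedError

    def set_emitter(on: bool) -> None:
        """Hypothetical: switch the sensor's LED emitter on or off."""
        raise NotImplementedError

    def calibrated_reading() -> int:
        """Return the detector reading due to the emitter alone."""
        set_emitter(False)         # measure background (ambient) light only
        time.sleep(0.001)          # let the detector settle
        ambient = read_adc()

        set_emitter(True)          # measure ambient + reflected emitter light
        time.sleep(0.001)
        total = read_adc()

        return max(total - ambient, 0)   # subtract ambient; clamp at zero
    ```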

  • Slide 6: Infrared (IR) Light

    IR light works at a frequency different from ambient light

    IR sensors are used in the same ways as visible-light sensors, but more robustly: reflectance sensors, break-beams

    The sensor reports the overall illumination: ambient light plus the light from the source

    A more powerful way to use infrared sensing is modulation/demodulation: rapidly turning the light source on and off

  • Slide 7: Modulation/Demodulation

    Modulated IR is commonly used for communication

    Modulation is done by flashing the light source at a particular frequency

    The signal is detected by a demodulator tuned to that particular frequency

    Offers great insensitivity to ambient light: flashes of light can be detected even if they are weak (a demodulation sketch follows)
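    A demodulator can be approximated in software by correlating the detector samples with a reference square wave at the modulation frequency; steady ambient light then cancels out. This is only an illustrative sketch under assumed sample and modulation rates, not the circuit a real IR demodulator chip implements.

    ```python
    # Illustrative software demodulation: correlate detector samples with a
    # reference square wave at the (assumed) modulation frequency.
    def demodulate(samples, sample_rate=10000.0, mod_freq=1000.0):
        period = sample_rate / mod_freq           # samples per modulation cycle
        score = 0.0
        for i, s in enumerate(samples):
            # Reference: +1 during the "on" half-cycle, -1 during the "off" half
            ref = 1.0 if (i % period) < (period / 2) else -1.0
            score += ref * s
        # Constant ambient light multiplies a zero-mean reference and cancels;
        # only light flashing at mod_freq builds up a large |score|.
        return abs(score) / len(samples)

    # A modulated source is judged present if the returned value exceeds a threshold.
    ```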

  • Slide 8: Infrared Communication

    Bit frames

    All bits take the same amount of time to transmit

    Sample the signal in the middle of the bit frame

    Used for standard computer/modem communication; useful when the waveform can be reliably transmitted

    Bit intervals

    Sampled at the falling edge

    The duration of the interval between samplings determines whether it is a 0 or a 1

    Common in commercial use; useful when it is difficult to control the exact shape of the waveform (a decoder sketch follows)
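    The bit-interval scheme can be sketched directly: measure the time between successive falling edges and classify each gap as a 0 or a 1. The short/long durations and tolerance below are made-up values for illustration, not any particular remote-control protocol.

    ```python
    # Sketch of bit-interval decoding: the gap between falling edges encodes the bit.
    def decode_intervals(edge_times_ms, short_ms=1.0, long_ms=2.0, tol_ms=0.3):
        """edge_times_ms: timestamps (in ms) of successive falling edges."""
        bits = []
        for prev, cur in zip(edge_times_ms, edge_times_ms[1:]):
            gap = cur - prev
            if abs(gap - short_ms) <= tol_ms:
                bits.append(0)                 # short interval -> 0
            elif abs(gap - long_ms) <= tol_ms:
                bits.append(1)                 # long interval -> 1
            else:
                raise ValueError(f"unexpected interval of {gap:.2f} ms")
        return bits

    # Edges at 0, 1, 3 and 4 ms decode to [0, 1, 0]
    print(decode_intervals([0.0, 1.0, 3.0, 4.0]))
    ```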

  • Slide 9: Proximity Sensing

    An ideal application for modulated/demodulated IR light sensing

    Light from the emitter is reflected back into the detector by a nearby object, indicating whether an object is present

    The LED emitter and detector are pointed in the same direction

    Modulated light is far less susceptible to environmental variables: the amount of ambient light and the reflectivity of different objects

  • Slide 10: Break-Beam Sensors

    Any pair of compatible emitter-detector devices can be used to make a break-beam sensor

    Examples:

    Incandescent flashlight bulb and photocell

    Red LEDs and visible-light-sensitive phototransistors

    IR emitters and detectors

    Where have you seen these?

    Break beams and clever burglars in movies

    In robotics they are mostly used for keeping track of shaft rotation

  • Slide 11: Shaft Encoding

    Shaft encoders measure the angular rotation of a shaft or an axle

    They provide position and velocity information about the shaft

    Speedometers: measure how fast the wheels are turning

    Odometers: measure the number of rotations of the wheels

  • Slide 12: Measuring Rotation

    A perforated disk is mounted on the shaft

    An emitter-detector pair is placed on opposite sides of the disk

    As the shaft rotates, the holes in the disk interrupt the light beam

    These light pulses are counted, thus monitoring the rotation of the shaft (see the sketch below)

    The more notches, the higher the resolution of the encoder

    With one notch, only complete rotations can be counted
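    Pulse counting can be sketched as below. The read_beam() helper is a hypothetical stand-in for reading the break-beam detector; the conversion from pulses to RPM follows directly from the number of holes in the disk.

    ```python
    # Minimal sketch: count encoder pulses over a time window and convert to RPM.
    import time

    def read_beam() -> bool:
        """Hypothetical: True if the detector currently sees the emitter's light."""
        raise NotImplementedError

    def shaft_rpm(holes_per_rev: int, window_s: float = 0.5) -> float:
        """Count rising edges for window_s seconds and convert to revolutions/minute."""
        pulses = 0
        last = read_beam()
        deadline = time.time() + window_s
        while time.time() < deadline:
            cur = read_beam()
            if cur and not last:       # rising edge: another hole passed the beam
                pulses += 1
            last = cur
        revolutions = pulses / holes_per_rev
        return revolutions * (60.0 / window_s)
    ```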

  • Slide 13: General Encoder Properties

    Encoders are active sensors

    They produce and measure a wave function of light intensity

    The wave peaks are counted to compute the speed of the shaft

    Encoders measure rotational velocity and position

  • Slide 14: Color-Based Encoders

    Use a reflectance sensor to count the rotations

    Paint the disk wedges in alternating contrasting colors

    Black wedges absorb light, white wedges reflect it, and only the reflections are counted

  • Slide 15: Uses of Encoders

    Velocity can be measured at a driven (active) wheel or at a passive wheel (e.g., one dragged behind a legged robot)

    By combining position and velocity information, one can:

    move in a straight line

    rotate by a fixed angle

    This can be difficult due to wheel and gear slippage and to backlash in geartrains

  • Slide 16: Quadrature Shaft Encoding

    How can we measure the direction of rotation?

    Idea: use two encoders instead of one

    Align the sensors to be 90 degrees out of phase

    Compare the outputs of both sensors at each time step with the previous time step

    Only one sensor changes state (on/off) at each time step, based on the direction of the shaft rotation; this determines the direction of rotation

    A counter is incremented in the encoder that was on

  • Slide 17: Which Direction is the Shaft Moving?

    Suppose encoder A = 1 and encoder B = 0

    If moving to position AB = 00, the position count is incremented

    If moving to position AB = 11, the position count is decremented

    State transition table:

    Previous state = current state: no change in position

    Single-bit change: increment / decrement the count

    Double-bit change: illegal transition

    (A quadrature-decoding sketch follows.)
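    The state transition table maps directly onto a small lookup table. The sketch below encodes each AB state as a 2-bit number; it is a generic quadrature decoder consistent with the transitions described above, not code from the lecture.

    ```python
    # Quadrature decoding via a state-transition lookup table.
    # States are encoded as (A << 1) | B. The table gives +1 for a legal step in
    # one direction, -1 for the other, and 0 for "no change" or an illegal
    # double-bit change (which a real driver would report as an error).
    DELTA = [ 0, +1, -1,  0,
             -1,  0,  0, +1,
             +1,  0,  0, -1,
              0, -1, +1,  0]

    def quadrature_update(prev_ab: int, cur_ab: int, position: int) -> int:
        """Return the new position count after one sensor sample."""
        return position + DELTA[(prev_ab << 2) | cur_ab]

    # From A=1, B=0 (state 0b10): moving to AB=00 increments the count, as on
    # the slide; reversing direction (00 -> 10) decrements it again.
    pos = 0
    pos = quadrature_update(0b10, 0b00, pos)   # pos -> 1
    pos = quadrature_update(0b00, 0b10, pos)   # pos -> 0
    print(pos)
    ```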

  • Slide 18: Uses of QSE in Robotics

    Robot arms with complex joints, e.g., rotary/ball joints like knees or shoulders

    Cartesian robots, overhead cranes

    The rotation of a long worm screw moves an arm/rack back and forth along an axis

    Copy machines, printers

    Elevators

    Motion of robot wheels

    Dead-reckoning positioning

  • Slide 19: Ultrasonic Distance Sensing

    Sonar: so(und) na(vigation and) r(anging)

    Based on the time-of-flight principle

    The emitter sends a chirp of sound

    If the sound encounters a barrier, it reflects back to the sensor

    The reflection is detected by a receiver circuit tuned to the frequency of the emitter

    The distance to objects can be computed by measuring the elapsed time between the chirp and the echo; sound travels about 0.89 milliseconds per foot (a worked sketch follows)
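    The time-of-flight arithmetic is simple enough to show directly: the echo time covers the round trip, so the one-way distance uses half the elapsed time at roughly 0.89 ms per foot. A minimal sketch:

    ```python
    # Sonar distance from time of flight: sound covers ~1 foot per 0.89 ms,
    # and the measured time is for the round trip (out and back).
    MS_PER_FOOT = 0.89

    def sonar_distance_ft(echo_time_ms: float) -> float:
        """Distance to the object in feet, given the chirp-to-echo time in ms."""
        one_way_ms = echo_time_ms / 2.0
        return one_way_ms / MS_PER_FOOT

    # An echo arriving 17.8 ms after the chirp puts the object about 10 feet away.
    print(sonar_distance_ft(17.8))
    ```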

  • Slide 20: Sonar Sensors

    The emitter is a membrane that transforms electrical energy into a ping (an inaudible sound wave)

    The receiver is a microphone tuned to the frequency of the emitted sound

    Polaroid Ultrasound Sensor

    Used in a camera to measure the distance from the camera to the subject for the auto-focus system

    Emits a 30-degree sound cone

    Has a range of 32 feet

    Operates at 50 kHz

  • Slide 21: Echolocation

    Echolocation = finding location based on sonar

    Numerous animals use echolocation

    Bats use sound for: finding prey, avoiding obstacles, finding mates, and communicating with other bats

    Dolphins/whales: find small fish, swim through mazes

    Natural sensors are much more complex than artificial ones

  • Slide 22: Specular Reflection

    Sound does not always reflect directly and come right back

    Specular reflection: the sound wave bounces off multiple surfaces before returning to the detector

    Smoothness: the smoother the surface, the more likely it is that the sound will bounce off

    Incident angle: the smaller the incident angle of the sound wave, the higher the probability that the sound will bounce off

  • Slide 23: Improving Accuracy

    Use rough surfaces in lab environments

    Multiple sensors covering the same area

    Multiple readings over time to detect discontinuities

    Active sensing

    In spite of these problems, sonars are used successfully in robotics applications: navigation, mapping

  • Slide 24: Laser Sensing

    High-accuracy sensor; lasers use light time-of-flight

    Light is emitted in a narrow beam (about 3 mm) rather than a cone, providing higher resolution

    For small distances, light travels faster than the time of flight can be measured, so phase-shift measurement is used instead (see the sketch below)

    SICK LMS200: 360 readings over 180 degrees, at 10 Hz

    Disadvantages: cost, weight, power, price; mostly 2D
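    The slide does not give the phase-shift formula, but a common formulation for amplitude-modulated ranging is d = c * Δφ / (4π f_mod), where the factor 4π (rather than 2π) accounts for the round trip. A small sketch with made-up parameter values:

    ```python
    # Phase-shift ranging sketch: distance from the phase difference between
    # the outgoing and returning modulated laser signal.
    import math

    C = 3.0e8   # speed of light, m/s

    def phase_shift_distance(delta_phi_rad: float, f_mod_hz: float) -> float:
        """Distance in meters; delta_phi_rad is the measured phase shift."""
        return (C * delta_phi_rad) / (4.0 * math.pi * f_mod_hz)

    # Example: a pi/2 shift at a 10 MHz modulation frequency is about 3.75 m,
    # one quarter of the 15 m unambiguous range c / (2 * f_mod).
    print(phase_shift_distance(math.pi / 2, 10e6))
    ```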

  • Slide 25: Visual Sensing

    Cameras try to model biological eyes

    Machine vision is a highly difficult research area

    Reconstruction: What is that? Who is that? Where is that?

    Robotics requires answers related to achieving goals; it is not usually necessary to reconstruct the entire world

    Applications: security, robotics (mapping, navigation)

  • Slide 26: Principles of Cameras

    Cameras have many similarities with the human eye

    The light goes through an opening (iris - lens) and hits the image plane (retina)

    The retina is attached to light-sensitive elements (rods and cones - silicon circuits)

    Only objects at a particular range are in focus (fovea) - depth of field

    512x512 pixels (cameras); 120 million rods and 6 million cones (eye)

    The brightness is proportional to the amount of light reflected from the objects

  • Slide 27: Image Brightness

    Brightness depends on:

    reflectance of the surface patch

    position and distribution of the light sources in the environment

    amount of light reflected from other objects in the scene onto the surface patch

    Two types of reflection: specular (smooth surfaces), diffuse (rough surfaces)

    It is necessary to account for these properties for correct object reconstruction, which requires complex computation

  • Slide 28: Early Vision

    The retina is attached to numerous rods and cones which, in turn, are attached to nerve cells (neurons)

    The nerves process the information; they perform "early vision" and pass information on throughout the brain for "higher-level" vision processing

    The typical first step ("early vision") is edge detection, i.e., finding all the edges in the image

    Suppose we have a b&w camera with a 512 x 512 pixel image

    Each pixel has an intensity level between white and black

    How do we find an object in the image? Do we know if there is one?

  • Slide 29: Edge Detection

    Edge = a curve in the image across which there is a change in brightness

    Finding edges: differentiate the image and look for areas where the magnitude of the derivative is large (see the sketch below)

    Difficulties: edges are not the only things that produce changes in brightness; shadows and noise do too

    Smoothing: filter the image using convolution; use filters of various orientations

    Segmentation: get objects out of the lines
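    The "differentiate and threshold" idea can be sketched with plain NumPy: estimate the brightness derivatives with small difference kernels and keep pixels where the gradient magnitude is large. A real pipeline would smooth the image first, as the slide notes.

    ```python
    # Minimal gradient-magnitude edge detector (illustrative, NumPy only).
    import numpy as np

    def edge_map(image: np.ndarray, threshold: float = 30.0) -> np.ndarray:
        """image: 2-D grayscale array. Returns a boolean edge mask."""
        img = image.astype(float)
        dx = np.zeros_like(img)
        dy = np.zeros_like(img)
        dx[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0   # central difference in x
        dy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0   # central difference in y
        magnitude = np.hypot(dx, dy)                     # gradient magnitude
        return magnitude > threshold                     # large derivative -> edge

    # A dark square on a bright background produces edges along its border.
    test = np.full((64, 64), 200.0)
    test[20:44, 20:44] = 50.0
    print(edge_map(test).sum(), "edge pixels")
    ```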

  • Slide 30: Model-Based Vision

    Compare the current image with images of similar objects (models) stored in memory

    Models provide prior information about the objects

    Storing models:

    Line drawings

    Several views of the same object

    Repeatable features (two eyes, a nose, a mouth)

    Difficulties:

    Translation, orientation and scale

    It is not known what the object in the image is

    Occlusion

  • Slide 31: Vision from Motion

    Take advantage of motion to facilitate vision

    A static system can detect moving objects: subtract two consecutive images from each other, and what remains is the movement between frames (see the sketch below)

    A moving system can detect static objects: at consecutive time steps continuous objects move as one, but the exact movement of the camera must be known

    Robots are typically moving themselves, so the movement of the robot needs to be taken into account
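    For the static-camera case, the subtraction is literally a frame difference. A minimal sketch, assuming grayscale frames as NumPy arrays:

    ```python
    # Frame differencing for a static camera: pixels whose intensity changes
    # between consecutive frames are flagged as motion.
    import numpy as np

    def motion_mask(prev_frame: np.ndarray, cur_frame: np.ndarray,
                    threshold: float = 25.0) -> np.ndarray:
        """Boolean mask of pixels that changed significantly between frames."""
        diff = np.abs(cur_frame.astype(float) - prev_frame.astype(float))
        return diff > threshold

    # A bright blob that shifts to the right between frames shows up in the mask.
    f0 = np.zeros((48, 48)); f0[10:20, 10:20] = 255.0
    f1 = np.zeros((48, 48)); f1[10:20, 14:24] = 255.0
    print(motion_mask(f0, f1).sum(), "changed pixels")
    ```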

  • Slide 32: Stereo Vision

    3D information can be computed from two images

    Compute the relative positions of the cameras

    Compute the disparity: the displacement of a 3D point between the two images

    Disparity is inversely proportional to the actual distance in 3D (a worked example follows)
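    For two parallel pinhole cameras the inverse relationship is usually written Z = f * B / d, with focal length f, baseline B, and disparity d. The focal length and baseline below are made-up values for illustration.

    ```python
    # Depth from disparity for an assumed parallel stereo rig.
    def depth_from_disparity(disparity_px: float,
                             focal_px: float = 500.0,
                             baseline_m: float = 0.1) -> float:
        """Distance (m) to a point given its disparity between the two images."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_px * baseline_m / disparity_px

    # Halving the distance doubles the disparity:
    print(depth_from_disparity(10.0))   # 5.0 m
    print(depth_from_disparity(20.0))   # 2.5 m
    ```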

  • Slide 33: Biological Vision

    Similar visual strategies are used in nature

    Model-based vision is essential for object/people recognition

    Vestibulo-ocular reflex: the eyes stay fixed while the head/body is moving, to stabilize the image

    Stereo vision: typical in carnivores

    Human vision is particularly good at recognizing shadows, textures, contours, and other shapes

  • Slide 34: Vision for Robots

    If complete scene reconstruction is not needed, we can simplify the problem based on the task requirements

    Use color

    Use a combination of color and movement

    Use small images

    Combine other sensors with vision

    Use knowledge about the environment

  • Slide 35: Examples of Vision-Based Navigation

    Running QRIO

    Sony Aibo obstacle avoidance

  • Slide 36: Readings

    F. Martin: Chapter 6

    M. Matarić: Chapter 9