DEGREE PROJECT IN TECHNOLOGY, FIRST CYCLE, 15 CREDITS
STOCKHOLM, SWEDEN 2016

Light Tracking Robot
Navigation using light and colour sensors

MIKAELA KARLÉRUS
BEATA TÖRNEMAN

KTH ROYAL INSTITUTE OF TECHNOLOGY
SCHOOL OF INDUSTRIAL ENGINEERING AND MANAGEMENT

Bachelor Thesis MMKB 2016:41 MDAB102
Light Tracking Robot
Mikaela Karlérus, Beata Törneman
Approved: 2016-06-07
Examiner: Martin Edin Grimheden
Supervisor: Baha Alhaj Hasan

ABSTRACT

The increasing demand for safer roads has prompted many companies to develop complete self-driving cars. A self-driving car requires a great number of different sensors, such as gyros, radars, GPS and tachymeters, as well as advanced software. This thesis focuses on the possibilities of using only light-sensing devices for a tracking robot and examines the advantages and disadvantages of doing so.

The purpose is to investigate which type of light sensor is more suitable for a tracking robot and what the limitations of a tracking robot using this technology are.

A demonstrator using two light sensors for controlling speed and direction, and a colour sensor for avoiding obstacles, will be built. Apart from choosing the most suitable sensor for a light-tracking robot, the sensing distance and range of the chosen sensor will be tested.

To investigate the different light-tracking possibilities and the accuracy of the demonstrator, the vehicle will be placed in an open indoor space with arranged coloured luminous obstacles. The robot will be tested both in a completely dark room and in a lit room. The intention is to see how the robot's behaviour changes when disturbances from surrounding light are added as an additional aspect.

The results from the tests are presented and the use of different sensors is discussed. The final conclusion on using light sensing for a tracking robot is that it is an easy and inexpensive method, but it should be used as a complement to other sensing devices, not as a stand-alone method.

SAMMANFATTNING

The increasing demand for safe roads has led to the development of complete self-driving cars. A self-driving car requires a large number of different sensors, such as gyros, GPS and tachymeters, as well as advanced software. This thesis focuses on the possibilities of constructing a tracking robot with only light-sensing devices and investigates the advantages and disadvantages of this.

The purpose is to determine which type of light sensor is best suited for a tracking robot and which limitations a tracking robot using this technology will have.

A prototype using two light sensors for controlling speed and direction, and a colour sensor for avoiding obstacles, will be constructed. Apart from choosing the most suitable sensor for a tracking robot, the sensing distance and range of the chosen sensor will be tested. The robot will be tested both in a completely dark room and in a lit room.

To investigate the different light-sensing alternatives and the accuracy of the robot, the vehicle will be placed indoors in an open area with coloured light sources arranged as obstacles. The intention is to see the differences in the robot's behaviour when disturbances such as reflections and other light sources in the surroundings are added.

The results from the tests will be presented and the use of different sensors will be discussed.

The conclusion is that using light sensing for a tracking robot is a simple and inexpensive method, but that it should primarily be used as a complement to other sensing devices and not as a stand-alone method.
PREFACE

First we would like to thank Baha Alhaj Hasan for the supervision and feedback throughout the project.

We would also like to thank Staffan Qvarntröm for helping with the electrical and mechanical design, and the student assistants who helped us along the way.

Finally, a huge thanks to Martin Åkerblad and Todd Barker for very useful and appreciated feedback on the report and the project.

Mikaela Karlérus
Beata Törneman
Stockholm, May 2016

NOMENCLATURE

This chapter describes the abbreviations used in this project.

Abbreviations

LDR   Light Dependent Resistor
PWM   Pulse Width Modulation
RGB   Red, Green, Blue
LED   Light Emitting Diode
CAD   Computer Aided Design
IR    Infra-Red
SDA   Serial Data Line
SCL   Serial Clock Line
VIN   Input Voltage
GND   Ground
IPS   Indoor Positioning System
GPS   Global Positioning System
AIS   Automatic Identification System

CONTENTS

ABSTRACT
SAMMANFATTNING
PREFACE
NOMENCLATURE
CONTENTS
1 INTRODUCTION
  1.1 BACKGROUND
  1.2 PURPOSE
  1.3 SCOPE
  1.4 METHOD
    1.4.1 Test one
    1.4.2 Test two
    1.4.3 Test three
2 THEORY
  2.1 LIGHT INTENSITY SENSING
    2.1.1 Light dependent resistor
    2.1.2 Photodiode
  2.2 COLOUR SENSING
  2.3 MOTOR CONTROL
3 DEMONSTRATOR
  3.1 PROBLEM FORMULATION
  3.2 SOFTWARE
  3.3 ELECTRONICS
    3.3.1 Microcontroller
    3.3.2 Motor driver
    3.3.3 Light sensor
    3.3.4 Colour sensor
  3.4 HARDWARE
    3.4.1 Chassis
  3.5 RESULTS
    3.5.1 Light sensors
    3.5.2 Colour sensor
4 DISCUSSION AND CONCLUSIONS
  4.1 DISCUSSION
    4.1.1 Light sensor
    4.1.2 Colour sensor
  4.2 CONCLUSIONS
5 RECOMMENDATIONS AND FUTURE WORK
  5.1 RECOMMENDATIONS
  5.2 FUTURE WORK
REFERENCES
APPENDIX A: TEST THREE
APPENDIX B: THE FINISHED ROBOT

1 INTRODUCTION

This chapter describes in detail the thesis to be investigated, the assumptions made, and the limitations in the modelling and construction of the demonstrator.

1.1 Background

Self-driving cars are no longer just a futuristic vision. In the media you can follow the development of self-driving cars that will soon be put on the roads in Sweden for testing purposes by Volvo (Volvo Cars, 2016). Companies like BMW, Mercedes and Tesla have developed self-driving features that are soon to be released on the market, with the ambition to make fully autonomous vehicles (Business Insider, 2015). Google has worked on its self-driving car project since 2009 and is currently testing prototype vehicles on the road (Google, 2016). A self-driving car can be defined as a vehicle with features that can make it accelerate, brake or steer with no human input. It requires a great number of different sensors, such as gyros, radars, GPS and tachymeters, as well as advanced software to make it self-driving. One of the main purposes of a self-driving car is to make the roads safer and facilitate daily life for commuters. Every year approximately 1.2 million people die in traffic accidents, of which 94% are caused by human error, a figure which could be decreased greatly with the use of self-driving technology (Google, 2016).

For this thesis a fully autonomous car would be too complicated to build, even at a smaller scale. On the other hand, a smaller number of sensors could be used for other types of autonomous vehicles with a mission that is simpler to predict. In working environments not optimal for humans, such as mines, it could be possible to develop a much simpler autonomous vehicle which could, for example, follow a light in a dark tunnel. At an airport, a simpler self-driving vehicle could be used to tow airplanes during taxiing, following lights and colours on the ground to steer its way across the field.

1.2 Purpose

This bachelor thesis investigates the possibility of controlling a robot with light only. However, a complete investigation of the subject would take longer and cost more than the given framework allows, so the investigation is limited to answering the following main question:

Which aspects of light are relevant to a tracking robot controlled by light?

This question has been divided further into two more specific questions:

• Which type of light sensor is more suitable for a tracking robot and what are the limitations of a tracking robot using this technology?
• Is it possible to make the robot obstacle-avoiding by using coloured light, and if so, is there any specific colour that is easier to detect?

Different aspects of light could be intensity, brightness, colour and wavelength, and a suitable sensor would then be one that reacts strongly to changes in one or several of these aspects. The answers to the above questions could hopefully be used as a guide for further development of light-controlled vehicles or robots to achieve different basic manoeuvres.
1.3 Scope

The open-source Arduino platform has contributed to making the use of sensors easier and more approachable. The code used for reading the filtered sensor values (red, green and blue) and for calculating lux and colour temperature is taken from an open-source library. Therefore, the theory behind it is covered only briefly. Focus has instead been put on testing and verifying the output from the sensor to determine the sensing range and the sampling frequency that gives a useful compromise between accuracy and braking distance. The I2C communication bus between the colour sensor and the Arduino also comes from an open-source library and is not explained further.

This report covers the theory and concepts behind a light-seeking and obstacle-avoiding robot. The calculations are made based on indoor usage and the robot is run and tested in an indoor environment. The thesis only covers driving in two environments: a completely dark room (0 lux) and a lit room (10 lux). The obstacle avoidance is based on a colour sensor, and therefore obstacles that are supposed to be avoided need to be colour coded. Moreover, the report follows the scope of a bachelor thesis at KTH and corresponds to 10 weeks of work.

1.4 Method

To begin with, a study of the equipment and the theory behind its functionality was made, in order to be able to evaluate the correct usage of the different components. The information retrieval mainly involved scientific articles and datasheets. For the following research, a complete prototype was built based on Arduino software and a robot kit. Sensors and electric motor control were then added to the robot vehicle, together with an extra power source in the shape of a battery package. Three different tests were conducted to evaluate the sensors; they are described below.

1.4.1 Test one

A comparison between two light sensors was performed by placing both sensors next to each other on a breadboard and measuring their response to different light environments. In the initial state both sensors were covered; the sensors were then moved from a darker to a brighter part of the hallway. The most suitable sensor of the two, i.e. the more sensitive sensor, was chosen for the demonstrator and used for the following tests.

1.4.2 Test two

To investigate the tracking abilities of the chosen light sensor, the sensing distance and sensing range were evaluated. The test was done in two environments: a completely dark room (0 lux) and a lit room (10 lux). The sensing distance was determined by first measuring how far away the chosen light sensor could sense the light source in the two different environments. After that, the sensor's response at different distances was recorded according to the following steps:

1. Measuring the ambient value in the room.
2. Placing a light source 3 m in front of the light sensor.
3. Reading the sensor value.
4. Stepping 0.5 m towards the sensor.
5. Reading the sensor value.
6. Repeating steps 4-5 until the distance is 0.5 m.
7. Adding the values to a table.

The sensing range test was conducted according to the following steps:

1. Measuring the ambient value in the room.
2. Placing the light source 1 m in front of the light sensor.
3. Reading the sensor value.
4. Stepping 0.1 m to the side.
5. Reading the sensor value.
6. Repeating steps 4-5 until the sensor value is back at the ambient value.
7. Determining the sensor's range.
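The sensor values in the steps above are raw analogue readings from the Arduino. As an illustration of how such readings can be collected during the tests, the following minimal sketch (an illustrative assumption, not the code used in the project; the pin choice is hypothetical) prints the value of an LDR voltage divider on analogue pin A0 to the serial monitor once per second:

```cpp
// Minimal logging sketch (illustrative assumption): prints the raw 10-bit ADC
// reading of an LDR voltage divider on pin A0 once per second, so the values
// for Test two can be noted at each distance or angle.
const int LDR_PIN = A0;   // hypothetical pin choice for the test setup

void setup() {
  Serial.begin(9600);     // open the serial monitor at 9600 baud
}

void loop() {
  int raw = analogRead(LDR_PIN);  // 0-1023; higher means brighter with a pull-down divider
  Serial.println(raw);
  delay(1000);                    // one reading per second is enough for a static test
}
```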
1.4.3 Test three

A test to determine the colour sensor's sensitivity to different colours was done by placing the demonstrator at set distances, every tenth centimetre between 0.1 and 1.5 metres, from a colour-lit obstacle. The illuminated obstacles are of our own design, with RGB LEDs placed on a breadboard. The test was done in a completely dark room and in a lit room, and obstacles in three different colours (red, green and blue) were tested. The test was conducted according to the following steps:

1. Placing the lighted object 1.5 m in front of the demonstrator.
2. Reading the sensor value.
3. Moving the object 0.1 m forward.
4. Reading the sensor value.
5. Repeating steps 3-4 until the distance is 0.1 m.
6. Repeating steps 1-5 for the three colours.
7. Plotting the values in Excel.

The results from the test are presented in Section 3.5.

2 THEORY

This chapter presents the theoretical framework that the performed research is based on. The different parts are described separately below.

2.1 Light intensity sensing

A light sensor is a device that detects the ambient light level and sends an output signal which varies with the light intensity. Light sensors absorb light energy and react with a physical alteration in a spectrum from infra-red to ultraviolet light, creating electricity in the form of electrons (Eriksson, 2003). Light sensors are commonly known as photo sensors and can be divided into two groups: those that convert the light energy directly into current, for example solar cells, and those that change their electrical properties depending on the intensity of the incident light and convert the measured light to a numerical value, for example light dependent resistors and photodiodes. The numerical value then controls the current supplied to the motors.

2.1.1 Light dependent resistor

A light dependent resistor, also called LDR, changes its resistive value depending on the incident light intensity. The resistance decreases with increasing light intensity and vice versa. To be able to calculate the resistance of the sensor, another resistor needs to be added, either as a pull-down or a pull-up resistor. With a pull-down resistor, the resistor is connected to ground, as illustrated in Figure 1. A pull-up resistor works in the same way, but the resistor is instead connected to the voltage source (Mims III, 2016). Connecting an LDR in series with a resistor in this way gives a circuit called a voltage divider, Figure 1.

Figure 1. Voltage divider with a pull-down resistor

When the resistance of the LDR decreases, the total resistance of the LDR and the pull-down resistor decreases; in turn, the current through both resistances increases, which leads to an increase of the voltage across the fixed resistor. Once the output of the voltage divider is known, the resistance of the sensor can be calculated using Equation 1,

$$V_{out} = V_{in}\,\frac{R_1}{R_1 + R_P} \qquad (1)$$

where V_out is the output voltage, V_in the input voltage, R_1 the fixed resistor and R_P the photoresistor's resistance (Adafruit, 2015).
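As a concrete illustration of Equation 1, the sketch below (a minimal sketch under assumed values, not taken from the project code) reads the divider output with the Arduino's 10-bit ADC and solves Equation 1 for the LDR resistance. The pin assignment is an assumption; the 270 Ω pull-down value matches the resistor later chosen in Section 3.3.3.

```cpp
// Minimal sketch illustrating Equation 1: read the voltage-divider output on A0
// and solve for the LDR resistance. The pin is an assumption for illustration;
// the 270 ohm pull-down resistor matches Section 3.3.3.
const int   LDR_PIN = A0;
const float V_IN    = 5.0;    // supply voltage of the divider
const float R_FIXED = 270.0;  // pull-down resistor in ohms

void setup() {
  Serial.begin(9600);
}

void loop() {
  int   raw  = analogRead(LDR_PIN);     // 0-1023
  float vOut = raw * V_IN / 1023.0;     // ADC counts -> volts
  // Equation 1 rearranged: R_P = R_1 * (V_in - V_out) / V_out
  if (vOut > 0.0) {
    float rLdr = R_FIXED * (V_IN - vOut) / vOut;
    Serial.print("V_out = "); Serial.print(vOut);
    Serial.print(" V, R_LDR = "); Serial.print(rLdr);
    Serial.println(" ohm");
  }
  delay(500);
}
```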
2.1.2 Photodiode

A photodiode converts light into voltage in direct proportion to the light intensity and consists of a lens that focuses the light on a light-detecting material in the diode (Mims III, 2016). A photodiode has two levels of output: either it is off, in a condition called reverse bias, which means that no current flows through the diode, or it is on, when the light intensity is adequate and the diode allows a current to flow (Electronics Tutorials, 2016). A photodiode is therefore useful when on/off control based on light intensity is needed, while an LDR is preferred when varying light intensities need to be measured. In contrast to the light dependent resistor, which reacts to visible light, a photodiode is more sensitive to light at longer wavelengths such as infra-red light.

2.2 Colour sensing

To understand how a colour sensing device works, some knowledge of how colours are created and how humans perceive colour is needed. Electromagnetic radiation with wavelengths from approximately 380-780 nm is defined as visible light, though the radiation needs to be reflected on a surface before colour can be distinguished (Joh Joh, Khee Boon, & Leong, 2006). The combination of an object, a light source and an observer is what creates colour nuances, where each colour represents a different wavelength range as illustrated in Figure 2.

Colour   Wavelength
Red      ~625-740 nm
Orange   ~590-625 nm
Yellow   ~565-590 nm
Green    ~520-565 nm
Cyan     ~500-520 nm
Blue     ~450-500 nm
Indigo   ~430-450 nm
Violet   ~380-430 nm

Figure 2. Colour diagram for the different wavelengths

Colours are represented by the combination of the three primary colours: red, blue and green (Color Matters, 2016). These three colours constitute the additive colour model called RGB. The model shows how the three colours can produce a variety of different colours depending on how they are combined, as shown in Figure 3.

Figure 3. RGB colour model

A colour sensor is created to mimic the human eye as much as possible; it can adjust to differing brightness and detect colours, which gives a high resolution of colour images. An RGB sensor is constructed of several photodiodes behind colour filters and a current-to-voltage conversion circuit (Joh Joh, Khee Boon, & Leong, 2006). When light falls on a photodiode it is converted to a photocurrent and the sensor produces an output. The colour can be determined by interpreting the three resulting voltages, as shown in Figure 4.

Figure 4. Colour sensor (red, green and blue colour filters, photodiodes and current-to-voltage converters producing the outputs VROUT, VGOUT and VBOUT)

2.3 Motor control

Motor control is about regulating and directing the mechanisms that are essential for movement. A motor driver is a controlling device used to control the speed and direction of the motors, i.e. to make the robot turn correctly. An essential component in a functional motor driver is an H-bridge, which can be seen in Figure 5; this is the component that makes it possible for the motors to turn in different directions.

Figure 5. Schematic of an H-bridge (switching elements S1-S4, supply voltage Vin and motor M)

The top of the bridge is connected to the power supply and the bottom is grounded. S1-S4 are the switching elements, usually transistors. The direction of the motor is controlled by the current flowing through the motor in the direction determined by the switches. If S1 and S4 are closed, the left lead of the motor is connected to the power supply and the right lead to ground; the current flows through the motor, it starts spinning, and the motor goes forward. If S2 and S3 are closed, the current flows in the reverse direction and the motor goes backwards. The advantage of an H-bridge circuit is the possibility to drive backwards and forwards at any speed, optionally using an independent power source (McManis, 2006).

In the same way as the direction of the motors can be controlled, it is possible to control their speed. The speed can be controlled with Pulse Width Modulation, PWM, which is a method to create a continuously variable power supply. Instead of a continuously varying analogue signal, a PWM signal delivers energy through a series of pulses. Digital control is used to create a signal that switches between on and off, a so-called square wave (Arduino, 2016). The PWM signal is controlled with two parameters, the clock cycle and the duty cycle. The clock cycle is the frequency of the signal and the duty cycle is linked to the switching time. The duration of the on-time is called the pulse width, and by increasing and decreasing the pulse width the controller regulates the average DC voltage applied to the motor, which changes the speed of the motor. The duty cycle is the proportion of on-time to the period. Figure 6 illustrates a PWM signal and its mean value, in this case with a 50% duty cycle. High means that the motor is supplied with the full amount of available voltage, which can also be described as a 100% duty cycle; low does not supply the motor with any voltage at all, and mean is the average of high and low.

Figure 6. PWM signal with a 50% duty cycle (high and low levels, mean value, pulse width and period)
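On an Arduino, a PWM signal of this kind is generated with analogWrite(), where the 0-255 argument corresponds to the duty cycle, and the H-bridge direction inputs are ordinary digital outputs. The fragment below is a minimal sketch of this principle under assumed pin numbers; it is an illustration, not the wiring used on the demonstrator.

```cpp
// Minimal sketch of H-bridge direction + PWM speed control on an Arduino.
// Pin numbers are assumptions for illustration, not the demonstrator's wiring.
const int ENA = 5;   // PWM-capable enable pin: the duty cycle sets the motor speed
const int IN1 = 7;   // direction input 1 of one H-bridge channel
const int IN2 = 8;   // direction input 2 of the same channel

void setup() {
  pinMode(ENA, OUTPUT);
  pinMode(IN1, OUTPUT);
  pinMode(IN2, OUTPUT);
}

// speed: -255..255, negative values reverse the motor
void driveMotor(int speed) {
  if (speed >= 0) {            // forward: one diagonal pair of switches conducts
    digitalWrite(IN1, HIGH);
    digitalWrite(IN2, LOW);
  } else {                     // backward: the opposite pair conducts
    digitalWrite(IN1, LOW);
    digitalWrite(IN2, HIGH);
    speed = -speed;
  }
  analogWrite(ENA, speed);     // 0 = 0% duty cycle, 255 = 100% duty cycle
}

void loop() {
  driveMotor(128);   // roughly 50% duty cycle forward, as in Figure 6
  delay(2000);
  driveMotor(-128);  // same speed in reverse
  delay(2000);
}
```

The enable/direction pin arrangement matches the kind of inputs (ENA, IN1-IN4, ENB) exposed by the L298N driver described later in Section 3.3.2.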
3 DEMONSTRATOR

This chapter presents the development process of the demonstrator as well as the final result.

3.1 Problem formulation

As stated in the purpose, the scientific question to be evaluated in this project is the possibility of controlling a robot using only light. This was reduced to an investigation of which aspects of light are needed to control the tracking robot.

The demonstrator should be able to steer accurately according to the input from the light sensors and the RGB sensor, i.e. steer towards the higher light intensity and avoid obstacles using the input from the RGB sensor. The obstacle avoidance is based on a colour-coding system inspired by the navigation system used at sea; the principle of the system is visualized in Figure 7. When the robot approaches an obstacle it should perform either a left or a right turn based on the obstacle's colour.

Figure 7. Principle for the obstacle avoidance

3.2 Software

The software can be divided into two subsystems, one for comparing light intensity and one for detecting obstacles. The demonstrator is constructed with a left and a right light sensor, as described in Figure 9. The first system is based on the input from the left and the right light sensor, and the second system is based on the input from the colour sensor. The light sensors measure the light intensity levels, and these levels are then compared to determine in which direction the robot should drive, i.e. the PWM signal to each pair of motors. To keep the robot from constantly turning, an additional condition has been set up: for the robot to turn, the difference between the left and the right light sensor readings needs to be higher than a threshold value, experimentally set to the numerical value 45. The RGB values used for avoiding obstacles are compared to a threshold value that was set as a percentage of the measured clear-light value. This way the robot can drive in different light environments and still be able to sense an obstacle. A flowchart of the steering system is illustrated in Figure 8.

Figure 8. Flowchart of the software system
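To make the control flow of Figure 8 concrete, the following fragment sketches the steering decision in Arduino code. It is a simplified assumption of how the described logic could look; the helper functions and the choice of which colour triggers which turn are hypothetical, and only the turn threshold of 45 and the idea of an obstacle threshold taken as a percentage of the clear-light reading come from the text above.

```cpp
// Simplified sketch of the steering logic in Figure 8. Helper functions are
// hypothetical placeholders; only the turn threshold (45) and the idea of an
// obstacle threshold as a percentage of the clear-light value come from the text.
const int LEFT_LDR  = A0;
const int RIGHT_LDR = A1;
const int TURN_THRESHOLD = 45;           // experimentally chosen in the project
const float OBSTACLE_FRACTION = 0.30f;   // assumed fraction of the clear channel

// Placeholder motor commands: in the demonstrator these would set the PWM
// signal and direction pins of the L298N for each side (see Section 3.3.2).
void driveForward() { /* left and right motors at equal PWM */ }
void turnLeft()     { /* slow down or reverse the left side  */ }
void turnRight()    { /* slow down or reverse the right side */ }

// Placeholder colour readings: in the demonstrator these come from the
// colour-sensor library (see Section 3.3.4).
int readRed()   { return 0; }
int readGreen() { return 0; }
int readClear() { return 1; }

void setup() {
  // sensor and motor pin initialisation would go here
}

void loop() {
  int left  = analogRead(LEFT_LDR);
  int right = analogRead(RIGHT_LDR);

  if (readRed() > OBSTACLE_FRACTION * readClear()) {
    turnRight();                         // red obstacle: pass on one side
  } else if (readGreen() > OBSTACLE_FRACTION * readClear()) {
    turnLeft();                          // green obstacle: pass on the other side
  } else if (left - right > TURN_THRESHOLD) {
    turnLeft();                          // clearly more light on the left sensor
  } else if (right - left > TURN_THRESHOLD) {
    turnRight();                         // clearly more light on the right sensor
  } else {
    driveForward();                      // difference below threshold: go straight
  }
}
```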
3.3 Electronics

In the following section the electrical components and how they are connected are described. Figure 9 presents the overall construction of the vehicle and the relation between the components.

Figure 9. Schematic of the demonstrator control (left and right light sensors, colour sensor, Arduino UNO, motor driver and the left and right pairs of motors)

3.3.1 Microcontroller

The Arduino UNO is a microcontroller-based development board with open-source hardware and software. The board is used to read information from the sensors and control the four DC motors through the motor driver. It has 14 digital I/O pins, of which 6 can be used as PWM outputs, and 6 analogue inputs (Arduino, 2016). The Arduino polls the LDRs, and when it gets a new reading from the sensors it adjusts the motor speed and direction according to the programmed instructions. The speed adjustments are done by sending a PWM signal to the motor driver, which processes the signal and forwards it to the motors. Since the motors in each set are connected in parallel, the right and left sides are controlled separately, which enables the steering.

3.3.2 Motor driver

The motor driver chosen to power and control the motors is an L298N Dual H-Bridge Motor Driver Shield (Art of Circuits, 2015). Each channel on the L298N is capable of delivering an output current of up to 2 A, so it can easily drive two sets of brushed DC motors, as illustrated in Figure 10.

Figure 10. DC motor control (the Arduino UNO drives the ENA, IN1-IN4 and ENB inputs of the L298N, which powers the two pairs of motors from the external power supply)

Two enable inputs are provided to enable or disable the motors independently of the input signals. Moreover, the motor driver has four pins used for controlling the direction of the motors and two pins where it reads the PWM signal that is sent to the motors. The motors in the same set, i.e. the two motors on the same side, are connected in parallel, which means that the PWM output is the same for the two motors on the same side. The motor driver is connected to an external power source, in this case a 7.4 V battery pack, with the benefit of powering the motors directly without having to go through the Arduino.

3.3.3 Light sensor

The light sensors used in this thesis were of the type that converts the measured light intensity to a numerical value. Two different light sensors of that type were tested and evaluated: the LDR and the photodiode, whose functionality is described in Section 2.1.

The photodiode was of the type TSL 252R (Elfa, 2016), shown in Figure 11. The photodiode was only used in the test described in Section 1.4.1 and was thus not mounted on the demonstrator.

Figure 11. Photodiode

The light sensors used on the demonstrator, to control the robot's right and left turns during the forward run, are two LDRs of type B906032 (Elfa, 2016), shown in Figure 12. As mentioned in Section 2.1.1, the light sensors were connected to a pull-down resistor, and the input from the sensors goes through the analogue inputs A0 and A1 to the Arduino, as illustrated in Figure 14. The pull-down resistor was calculated with Equation 1 to 270 Ω. To get some distance between the sensors they are mounted on top of the vehicle in an arrangement similar to the feelers of an insect. The mounting is described further in Appendix B: The finished robot.

Figure 12. Light Dependent Resistor

3.3.4 Colour sensor

The colour sensor used in this project was a TCS34725, as shown in Figure 13 (Adafruit, 2016), which has both RGB and clear-light sensing elements. The sensor features an IR blocking filter that minimizes the IR spectral component of the incoming light and allows colour measurements to be made accurately. Since the human eye cannot see infrared light, the IR blocking filter makes the sensing more realistic. The RGB sensor has 7 pins, labelled LED, INT, SDA, SCL, 3V3, GND and VIN. For this application the SDA, SCL, GND and VIN pins are used. The serial clock line SCL is used to synchronize all data transfers between the Arduino and the rest of the components, while SDA is the data line. The connections between the Arduino and the sensors are illustrated in Figure 14.

Figure 13. RGB colour sensor

Figure 14. Connections between the sensors and the Arduino (the two LDRs on analogue inputs A0 and A1; the colour sensor's VIN, GND, SDA and SCL connected to the Arduino's 5 V, GND, SDA and SCL pins)
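As noted in the scope (Section 1.3), the colour readings come from an open-source library. A minimal sketch of how the TCS34725 can be read over I2C with the Adafruit_TCS34725 Arduino library is shown below; the integration time and gain chosen here are assumptions, not necessarily the settings used on the demonstrator.

```cpp
// Minimal sketch reading the TCS34725 over I2C with the open-source
// Adafruit_TCS34725 library. Integration time and gain are illustrative
// assumptions, not necessarily the demonstrator's settings.
#include <Wire.h>
#include <Adafruit_TCS34725.h>

Adafruit_TCS34725 tcs(TCS34725_INTEGRATIONTIME_50MS, TCS34725_GAIN_4X);

void setup() {
  Serial.begin(9600);
  if (!tcs.begin()) {              // begin() returns false if the sensor is not found
    Serial.println("No TCS34725 found, check the SDA/SCL wiring");
    while (true) { }
  }
}

void loop() {
  uint16_t r, g, b, c;
  tcs.getRawData(&r, &g, &b, &c);  // raw red, green, blue and clear channels
  float lux = tcs.calculateLux(r, g, b);

  Serial.print("R: ");   Serial.print(r);
  Serial.print("  G: "); Serial.print(g);
  Serial.print("  B: "); Serial.print(b);
  Serial.print("  C: "); Serial.print(c);
  Serial.print("  Lux: "); Serial.println(lux);
  delay(200);
}
```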
3.4 Hardware

3.4.1 Chassis

The demonstrator is based on a four-wheeled robot chassis supplied by the Mechatronics department at the Royal Institute of Technology. The chassis features four DC motors which operate between 3-6 V. The current consumption of each motor varies from 100 mA to 300 mA depending on rotational speed and load (Curriculum, 2015). The chassis was built with three levels. The first level holds the two battery packages: a ZIPPY Flightmax 5000 mAh battery pack to supply the motors and a 2200 mAh USB power bank for the Arduino. On the second level the motor driver and sensors were mounted, and the Arduino was placed on the top level. The light sensors and the colour sensor were mounted on a 3D-printed construction, modelled in the CAD program Solid Edge.

3.5 Results

This part covers the results from the different tests that were done during the project.

3.5.1 Light sensors

This section presents the results of the performed tests, divided into paragraphs based on the tests described in Section 1.4.

The sensor output registered during the test described in Section 1.4.1 is sorted according to the measured lux values and plotted in Figure 15. The figure shows the LDR's and the photodiode's responses to different light intensity levels.

Figure 15. Graph of the light sensor response to different light intensity levels (numerical sensor output of the LDR and the photodiode for illuminance levels from 0 up to about 34 000 lux)

From the test described in Section 1.4.2, the sensing distance of the LDR was measured to 7 m in a dark room, and the results of the light sensor's response are presented in Table 1 below. The measured values are the numerical sensor response to a flashlight placed at different distances from the sensor.

Table 1. The LDR's response to a flashlight

Flashlight distance (m)   Lit room   Dark room
No flashlight             345        0
3.0                       351        22
2.5                       356        25
2.0                       360        41
1.5                       372        84
1.0                       401        163
0.5                       446        342

When doing the test on the light sensor's range, the ambient value was 348 in the lit room and 0 in the dark room before the flashlight was turned on. The sensor output registered at different angles is presented in Table 2.

Table 2. The numerical values from the sensing range test

Deviation (°)   Lit room   Dark room
0               394        188
11              378        162
21              375        143
30              367        113
38              365        57
45              352        49
50              350        23
54              347        22

Based on the results in Table 2, the sensing range of the LDR was determined and is illustrated in Figure 16.

Figure 16. Sensing range of the LDR (approximately 7 m straight ahead and 2 m at a 50° deviation)

3.5.2 Colour sensor

The results from the test described in Section 1.4.3 are rather extensive, so the bulk of the data can be found in Appendix A. Figures 17-19 show the results from the tests with red, green and blue obstacles conducted in a lit room. The y-axis shows the numerical sensor output and the x-axis shows the distance from the sensor measured in metres.

Figure 17. The colour sensor's response (R, G and B channels) to a red obstacle at distances from 0.1 to 1.5 m

Figure 18. The colour sensor's response (R, G and B channels) to a green obstacle at distances from 0.1 to 1.5 m

Figure 19. The colour sensor's response (R, G and B channels) to a blue obstacle at distances from 0.1 to 1.5 m

4 DISCUSSION AND CONCLUSIONS

This chapter summarizes and analyses the previously given results with a discussion and conclusions.

4.1 Discussion

4.1.1 Light sensor

From the results in Section 3.5.1 and the theory behind light sensors collected during the information retrieval, the most suitable light sensor for a tracking robot could be determined to be the LDR. The graph in Section 3.5.1 clearly shows that the LDR has a quicker response to changes in light intensity, which is a necessary property for a tracking robot. The major difference between the two sensors is probably that the LDR responds better to the visible light spectrum, whereas the photodiode responds better in the infrared spectrum.

Driving the prototype in indoor light turned out to be a lot more problematic than expected. Something that had not initially been taken into consideration was the light sensors' sensitivity to disturbances from various lights in the room, which became a problem when trying to steer the robot. In a dark room the light sensors worked as expected; there the robot could sense the light with no disturbances from other lights or reflections in the room. The problem was solved by adding an LED strip on the floor with a much brighter light than the surrounding light, so that disturbances from other light sources were reduced.
4.1.2 Colour sensor

The results in Section 3.5.2 show only a slight difference between the sensor's response in a completely dark room and in a lit room. From Figures 17-19 it can be determined that the sensor senses the red obstacle at approximately 0.7 m, the green obstacle at 0.5 m and the blue obstacle at 0.3 m. This is probably because the red colour has a longer wavelength, as described in Section 2.2, which contributes to a lower spread of the colour in the room; the red light is more concentrated and is therefore seen by the colour sensor earlier. The blue colour has the shortest wavelength of these three colours and is therefore detected from the shortest distance. This resulted in the use of the red and green lighted objects as obstacles. Figure 17 shows that when detecting red light, the blue and green values were very low, which is good when trying to detect a specific colour. However, this gives a rather narrow use for this kind of tracking robot. To be able to build a complete construction that only uses light for self-driving, a great variety of light and colour intensities from the obstacles must be handled, otherwise it will not work.

The measurements were done with a stationary vehicle. According to the results in Section 3.5.2, the colour sensor is able to sense the coloured illuminated obstacles at a distance long enough for the robot to turn and avoid the obstacle in time. In practice it did not work as expected. With a driving vehicle, a problem with the colour sensor and the vehicle's responsiveness to the sensor's input occurred: the colour sensor measured the colour differences too slowly, and the robot crashed into the obstacle since it was not reacting fast enough. This problem was solved by increasing the colour sensor's measuring speed, so that the reaction time could be reduced.

4.2 Conclusions

Controlling a robot by using only light may not be the most accurate approach. It is not impossible to follow a track, but it comes with certain limitations. It is achievable if the robot is supposed to follow a specific light, such as a flashlight or a light trail, but letting the robot drive freely in a normally lit room and seek the brightest light source is unattainable without adding other sensors to avoid obstacles.

Although it is not the best possible way to construct a tracking robot, a conclusion can still be drawn based on the information gathered during this thesis. For this kind of tracking robot, the LDR was selected as the most suitable light sensor. Getting the robot to avoid obstacles by using only coloured light was possible with the use of a colour sensor, and the colour easiest to detect was red, due to its long wavelength.

To sum up, using light sensing for a tracking robot is an easy and inexpensive method, but it should be used as a complement to other sensing devices, not as a stand-alone method.

5 RECOMMENDATIONS AND FUTURE WORK

This chapter gives recommendations for more detailed solutions and future work.

5.1 Recommendations

As described in the previous chapter, the colour sensor is mounted facing straight ahead on the lower part of the vehicle, with a screen on top of the sensor. This means that the lighted obstacles need to be placed at a certain height above the ground, which might not be practical in real life. To solve this, more sensors with a wider range could be added.

Furthermore, the colour sensor was set at an angle that might not have been the most optimal angle or direction. Calculation and consideration of the best direction and angle for the light sensors could be investigated further.

As the colour sensor is also a light-sensing device, the surrounding light in a room could affect the sensor's signal, since the light contains different colours depending on the environment; for example, the light from fluorescent lamps is bluer than that from a light bulb. The colour sensor should therefore be calibrated to be able to work in differently lit environments. This can be done by putting a white and a black paper in front of the sensor and setting these readings as reference values for the input from the sensor. The sensor readings could then be compared to these values; this way, environmental differences and disturbances from surrounding light could be avoided.
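As an illustration of this recommendation, the fragment below sketches how such a white/black calibration could be applied to a raw colour channel reading. It is a hypothetical example of the idea described above, not code from the project.

```cpp
// Hypothetical white/black calibration of one colour channel, as suggested above.
// blackRef and whiteRef are the raw readings recorded with a black and a white
// paper in front of the sensor; the function maps a raw reading to 0.0-1.0.
#include <Arduino.h>

struct ChannelCalibration {
  uint16_t blackRef;   // raw value measured against the black paper
  uint16_t whiteRef;   // raw value measured against the white paper
};

float normalizeChannel(uint16_t raw, const ChannelCalibration &cal) {
  if (cal.whiteRef <= cal.blackRef) {
    return 0.0f;                                   // invalid calibration, fail safe
  }
  float scaled = ((int)raw - (int)cal.blackRef) /
                 (float)((int)cal.whiteRef - (int)cal.blackRef);
  if (scaled < 0.0f) scaled = 0.0f;                // clamp readings outside the
  if (scaled > 1.0f) scaled = 1.0f;                // calibrated range
  return scaled;
}

// Usage example (values are made up): a normalized red reading of 0.0 means
// "as dark as the black reference", 1.0 means "as bright as the white reference".
// ChannelCalibration redCal = {12, 240};
// float red = normalizeChannel(rawRed, redCal);
```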
5.2 Future work

The demonstrator and the control system can be improved in many respects. A tracking robot using only light can be constructed, but as described earlier it comes with certain limitations. To construct a complete tracking robot, the demonstrator needs to be equipped with more sensors and further developed software to work correctly. One main problem with the demonstrator when driving indoors with indoor light was its ability to avoid walls and open doors. Implementing an ultrasonic sensor or an infra-red sensor could be a solution to this problem. The demonstrator should also be equipped with a start and stop system. In addition, it could be interesting to further develop the ability to regulate the speed of the vehicle.

The demonstrator could also be extended with an IPS system for indoor use or GPS for outdoor use, depending on its purpose. With the help of such a system the robot could scan the light intensity of a defined area and navigate its way back to the brightest spot of the scanned area. This idea could be used for a self-driving mower charged with, for example, solar cells. Instead of needing a human to carry the mower to its charging station, the mower could find the sun on its own and charge itself using solar cells.

Using this kind of tracking robot in environments not suitable for humans, such as mines or airports, would also require more sensors. For use in a mine, a mapping system and a local navigation system would be needed. For use at an airport, a better safety system is needed: an AIS system, which makes it possible to see other vehicles equipped with AIS, and a sensing system for a fixed reference in the ground (RL, 2016). A reference system in the ground would give the robot positioning help, and a lighted line would give exact precision. The concept with red and green colour sensing could also be used in shipping lanes for navigation of autonomous ships and boats.

REFERENCES

Adafruit, 2015. Using a photocell. Available at: https://learn.adafruit.com/photocells/using-a-photocell. [Accessed: 2016-03-09]

Adafruit, 2016. Datasheet - TCS3472. Available at: https://cdn-shop.adafruit.com/datasheets/TCS34725.pdf. [Accessed: 2016-04-19]

Arduino Uno, 2016. Arduino - Arduino Board Uno. Available at: https://www.arduino.cc/en/main/arduinoBoardUno. [Accessed: 2016-04-10]

Arduino, 2016. Arduino - PWM. Available at: https://www.arduino.cc/en/Tutorial/PWM. [Accessed: 2016-03-09]

Art of Circuits, 2015. Datasheet - L298N. Available at: http://www2.st.com/content/ccc/resource/technical/document/datasheet/82/cc/3f/39/0a/29/4d/f0/CD00000240.pdf/files/CD00000240.pdf/jcr:content/translations/en.CD00000240.pdf. [Accessed: 2016-03-20]
Business Insider, 2015. 10 million self-driving cars will be on the road by 2020. Available at: http://www.businessinsider.com/report-10-million-self-driving-cars-will-be-on-the-road-by-2020-2015-5-6?IR=T. [Accessed: 2016-04-19]

Color Matters, 2016. Color Systems - RGB & CMYK. Available at: http://www.colormatters.com/color-and-design/color-systems-rgb-and-cmyk. [Accessed: 2016-04-17]

Curriculum, 2015. DC-motors. Available at: http://curriculum.vexrobotics.com/curriculum/speed-power-torque-and-dc-motors/dc-motors. [Accessed: 2016-05-08]

Electronics Tutorials, 2016. Light sensors. Available at: http://www.electronics-tutorials.ws/io/io_4.html. [Accessed: 2016-03-21]

Elfa, 2016. Datasheet - B906032. Available at: https://www.elfa.se/Web/Downloads/_t/ds/photocells_eng_tds.pdf?mime=application%2Fpdf. [Accessed: 2016-03-20]

Elfa, 2016. Datasheet - TSL252R. Available at: https://www.elfa.se/Web/Downloads/_t/ds/tsl250r2r_eng_tds.pdf?mime=application%2Fpdf. [Accessed: 2016-02-09]

Eriksson, Patrik, 2003. Ljus- och bildsensorer. Available at: http://www8.tfe.umu.se/courses/elektro/FSE/Kompendier/ljusochbildsensorer.pdf. [Accessed: 2016-03-09]

Google, 2016. Google Self-Driving Car Project. Available at: https://www.google.com/selfdrivingcar/. [Accessed: 2016-04-19]

Joh Joh, Khee Boon and Leong, 2006. Use color sensors for precise measurement. Optoelectronic Products Division, Avago Technologies.

Mims III, Forrest M. Amateur Scientist: Experimenting with Light and Dark Sensors. Available at: http://makezine.com/projects/make-38-cameras-and-av/light-and-dark-sensors/. [Accessed: 2016-03-20]

McManis, Chuck, 2006. H-Bridge: Theory and Practice. Available at: http://www.mcmanis.com/chuck/robotics/tutorial/h-bridge/. [Accessed: 2016-04-17]

RL, 2016. AIS (Automatic Identification System) Decoding. Available at: http://rl.se/ais_eng. [Accessed: 2016-05-07]

Volvo Cars, 2016. Autonomous driving explained. Available at: http://www.volvocars.com/intl/about/our-innovation-brands/intellisafe/intellisafe-autopilot/this-is-autopilot/autonomous-drive-in-detail. [Accessed: 2016-04-17]

APPENDIX A: TEST THREE

The results from the test in the dark room described in Section 1.4.3 are plotted in Figures A1-A3.

Figure A1. The colour sensor's response (R, G and B channels) to a red obstacle in a dark room at distances from 0.1 to 1.5 m

Figure A2. The colour sensor's response (R, G and B channels) to a green obstacle in a dark room at distances from 0.1 to 1.5 m

Figure A3. The colour sensor's response (R, G and B channels) to a blue obstacle in a dark room at distances from 0.1 to 1.5 m

APPENDIX B: THE FINISHED ROBOT

Figure B1. Frontal view of the finished robot

Figure B2. Top view with boxes showing the different components

Figure B3. Front view with boxes showing the different components

Figure B4. Side view with a box showing the motor driver

The components in Figure B2 - Figure B4 are as follows:
1. Arduino UNO
2. Light Dependent Resistor
3. RGB Colour Sensor
4. 3D-printed construction for the sensors
5. L298N Dual H-bridge Motor Driver Shield