Measuring Cooperative Robotic Systems Using Simulation-Based Virtual Environment

Xiaolin Hu
Computer Science Department, Georgia State University, Atlanta, GA, USA 30303

Bernard P. Zeigler
Arizona Center for Integrative Modeling and Simulation, University of Arizona, Tucson, AZ, USA 85721

Agenda
• Background on DEVS and Model Continuity
• The Virtual Measuring Environment
• The Robotic Convoy Example
• Preliminary Simulation Results
• Future Work

DEVS (Discrete Event System Specification)
• Derived from mathematical dynamical system theory
• Based on a formal modeling and simulation (M&S) framework
• Supports hierarchical model construction (atomic models and coupled models)
• Explicit time modeling
[Figure: the basic M&S framework: the source system (a database of real behavior) is related to the model by the modeling relation, and the model is related to the simulator by the simulation relation, all within an experimental frame.]

DEVS Formalism
A Discrete Event System Specification (DEVS) is a structure
M = <X, S, Y, δint, δext, δcon, λ, ta>
where
X is the set of input values;
S is the set of states;
Y is the set of output values;
δint: S → S is the internal transition function;
δext: Q × X^b → S is the external transition function, where Q = {(s, e) | s ∈ S, 0 ≤ e ≤ ta(s)} is the total state set, e is the time elapsed since the last transition, and X^b denotes the collection of bags over X;
δcon: S × X^b → S is the confluent transition function;
λ: S → Y^b is the output function;
ta: S → R⁺₀ ∪ {∞} is the time advance function.

Background on Model Continuity for Robotic Software Development
[Figure: model continuity for the robotic system to be designed: during modeling and simulation & test, the control model (the control logic to be designed) is coupled to a virtual sensor/actuator interface and an environment model; through mapping, the same control model is coupled to the real sensor/actuator interface and the real environment for the system in execution.]

A Virtual Measuring Environment
• Simulation-based measuring uses robot models and an environment model.
• A virtual measuring environment is an intermediate step that allows models and real robots to work together within a "virtual environment".
• Real-system measuring uses all real robots within a real physical environment.
• The virtual measuring environment thus brings simulation-based study one step closer to reality.

Realization of the Virtual Measuring Environment
• Model continuity allows the same control models used in simulation to be deployed to real robots for execution. The couplings among these models are maintained from simulation to execution.
• The clear separation between the control model and the sensor/actuator interface makes it possible for the control model to interact with different types of sensors/actuators, as long as the interface functions between them are maintained.
• The control model of a real robot can use real sensors/actuators to interact with the real environment and virtual sensors/actuators to interact with a virtual environment.
• A real robot can communicate with robot models simulated on computers, resulting in a system that combines real and virtual robots.
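To make the last two bullets concrete, the sketch below shows, in Python (chosen only for readability; it is not the language of the original work), how a control model can be written against an abstract sensor/actuator interface so that the same model drives either a simulated robot or a real one. All class and method names here (SensorActuatorInterface, read_sensor, drive, and so on) are illustrative assumptions, not the actual interfaces of the system described in this presentation.

```python
from abc import ABC, abstractmethod

class SensorActuatorInterface(ABC):
    """Abstract sensor/actuator interface the control model talks to.
    Hypothetical names; the real system defines its own interface functions."""

    @abstractmethod
    def read_sensor(self) -> float:
        """Return a sensor reading, e.g. the distance to the front neighbor."""

    @abstractmethod
    def drive(self, turn_angle: float, distance: float) -> None:
        """Execute a turn-then-move command."""

class VirtualInterface(SensorActuatorInterface):
    """Backed by a simulated environment model (used in conventional simulation
    and in the virtual measuring environment)."""
    def __init__(self, environment_model, robot_id):
        self.env, self.robot_id = environment_model, robot_id
    def read_sensor(self) -> float:
        return self.env.sensor_data(self.robot_id)
    def drive(self, turn_angle: float, distance: float) -> None:
        self.env.start_move(self.robot_id, turn_angle, distance)

class RealInterface(SensorActuatorInterface):
    """Backed by the robot's real sensor and actuator drivers."""
    def __init__(self, hardware):
        self.hw = hardware
    def read_sensor(self) -> float:
        return self.hw.range_reading()
    def drive(self, turn_angle: float, distance: float) -> None:
        self.hw.turn(turn_angle)
        self.hw.move(distance)

class ConvoyControlModel:
    """The control model only sees the interface, so the same logic can be
    simulated, run robot-in-the-loop, or deployed to the real robot unchanged."""
    def __init__(self, io: SensorActuatorInterface, desired_separation: float):
        self.io, self.D = io, desired_separation
    def step(self) -> None:
        gap = self.io.read_sensor()
        # Close (or open) the gap toward the desired separation D.
        self.io.drive(turn_angle=0.0, distance=gap - self.D)
```

A fully simulated or virtual-counterpart robot would be constructed with a VirtualInterface, a deployed robot with a RealInterface; the control model and its couplings stay the same, which is what model continuity requires.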
Architecture of the Virtual Measuring Environment
[Figure: a mobile robot executes its control model and communicates over a wireless link with a computer that simulates the virtual environment; the virtual environment contains virtual robots, a virtual obstacle, and a virtual counterpart of the real robot equipped with virtual sensors and HIL (hardware-in-the-loop) actuators.]

An Incremental Measuring Process
(a) Conventional simulation: robot models with virtual sensors and virtual actuators operate in a virtual environment.
(b) Robot-in-the-loop simulation: a real robot, using virtual sensors and HIL actuators, works together with robot models in a virtual environment.
(c) Real system measurement: real robots with real sensors and real actuators operate in the real environment.

Experimental Frames
• Input stimuli: specification of the class of admissible time-dependent input stimuli. This is the class from which individual samples will be drawn and injected into the model or system under test for particular experiments.
• Control: specification of the conditions under which the model or system will be initialized, continued under examination, and terminated.
• Metrics: specification of the data summarization functions and the measures to be employed to provide quantitative or qualitative characterizations of the input/output behavior of the model. Examples of such metrics are performance indices, goodness-of-fit criteria, and error accuracy bounds.
• Analysis: specification of the means by which the results of data collection in the frame will be analyzed to arrive at final conclusions. The data collected in a frame consists of pairs of input/output time functions.

An Architecture with Experimental Frames
[Figure: the experimental frames define the input stimuli, control, metrics, and analysis from the measuring point of view; they are coupled to the robot models (the control models of the robots together with models of their sensors and actuators) and to the environment model, which models how the environment reacts to and interacts with the robots.]

A Robot Convoy Example
[Figure: Robot1 … Robotn in a line, with each robot's FReadyIn/FReadyOut ports coupled to its front neighbor and its BReadyIn/BReadyOut ports coupled to its back neighbor.]
• The robots are in a line formation, with each robot having a front neighbor and a back neighbor.
• The system tries to maintain the coherence of the line formation.
• A robot's movements are synchronized with its neighbors.
• Each robot (except the leader) goes through a basic "turn – move – adjust – inform" routine.
• No global communication or coordination exists.

Robot Model
[Figure: the robot model contains the Convoy model, the Avoid model, and the HWInterface activity; the Convoy model is coupled to the FReadyIn/FReadyOut and BReadyIn/BReadyOut ports, and both Convoy and Avoid issue move commands and receive moveComplete events through the HWInterface activity.]
• Based on Brooks' subsumption architecture.
• The Avoid model makes a robot move away if it collides with anything. The Convoy model is responsible for controlling the robot so that it convoys with the team.
• The HWInterface activity is responsible for the sensor/actuator hardware interfaces.

Formula of Movement
[Figure: robot Ri follows its front neighbor Ri-1; D is the desired separation, a the current separation, di-1 the distance just moved by Ri-1, θi-1 the angle of that move relative to the line connecting the two robots, and θi and di the turn angle and move distance computed for Ri.]
Let D be the desired separation, a the current separation between Ri and its front neighbor Ri-1, di-1 the distance just moved by Ri-1, and θi-1 the angle of that move relative to the line from Ri to Ri-1. Robot Ri then turns by θi and moves di, where
$$\tan\theta_i = \frac{d_{i-1}\,\sin\theta_{i-1}}{a + d_{i-1}\,\cos\theta_{i-1}}$$
$$d_i = \frac{d_{i-1}\,\sin\theta_{i-1}}{\sin\theta_i} - D$$
(A code sketch of this computation follows the Environment Model below.)

Environment Model
[Figure: each TimeManager_k receives Robot_k's move command as a startMove input and reports the movement duration t_k; the SpaceManager receives the moveComplete_k events, tracks the (x, y) positions of Robot_1 … Robot_N and of the obstacles, and sends sensorData_k back to each robot.]
• The TimeManager determines how long it takes for a robot to finish a movement.
• The SpaceManager is a model of the experimental floor space, including the size, shape, and position of the work area, the static objects, and the robots.
• In this example we have ignored the detailed dynamic processes of a movement.
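The following Python sketch illustrates the movement formula above; it is an illustration under the symbol interpretation given with the figure, not the authors' code. It computes the follower's turn angle and move distance from the front robot's move d_prev and its angle theta_prev relative to the line between the robots, the current separation a, and the desired separation D. It uses the equivalent Cartesian form of the two equations, which avoids dividing by sin(theta_i) when the front robot moves straight ahead.

```python
import math

def follower_move(d_prev: float, theta_prev: float, a: float, D: float):
    """Return (theta_i, d_i): the turn angle and move distance for robot R_i
    after its front neighbor R_{i-1} moved d_prev at angle theta_prev
    (measured from the line R_i -> R_{i-1}), given the current separation a
    and the desired separation D.  Equivalent to
        tan(theta_i) = d_prev*sin(theta_prev) / (a + d_prev*cos(theta_prev))
        d_i          = d_prev*sin(theta_prev) / sin(theta_i) - D
    but written with atan2/hypot so theta_prev = 0 needs no special case."""
    # New position of R_{i-1} relative to R_i, x-axis along R_i -> R_{i-1}.
    x = a + d_prev * math.cos(theta_prev)
    y = d_prev * math.sin(theta_prev)
    theta_i = math.atan2(y, x)      # turn angle toward the new position
    d_i = math.hypot(x, y) - D      # move so the separation returns to D
    return theta_i, d_i

# Example: the front robot, 50 units ahead, turns 30 degrees and moves 10 units.
if __name__ == "__main__":
    theta_i, d_i = follower_move(d_prev=10.0,
                                 theta_prev=math.radians(30),
                                 a=50.0, D=50.0)
    print(math.degrees(theta_i), d_i)   # roughly 4.9 degrees and 8.9 units
```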
Two Noise Factors
Random numbers are used to simulate the noise and variance in the robots' movements.
• Distance noise factor (DNF): the ratio of the maximum distance variance to the robot's moving distance.
• Angle noise factor (ANF): the ratio of the maximum angle variance to the robot's moving distance.

Formation Coherence
The formation coherence is affected by the noise factors.
[Figure: average position error vs. simulation steps for 30 robots, with DNF = 0.1 and ANF = 0.08.]
$$E_i(t) = \sqrt{\bigl(x_i(t) - x_i^{\mathrm{desired}}(t)\bigr)^2 + \bigl(y_i(t) - y_i^{\mathrm{desired}}(t)\bigr)^2} \qquad (4)$$
$$x_i^{\mathrm{desired}}(t) = x_{i-1}(t) - D\cos\theta_{i-1}(t) \qquad (5)$$
$$y_i^{\mathrm{desired}}(t) = y_{i-1}(t) - D\sin\theta_{i-1}(t) \qquad (6)$$
$$E(t) = \frac{\sum_i E_i(t)}{N} \qquad (7)$$
We use the average position error of the robot team as the indicator of the convoy system's formation coherence: the smaller the error, the more coherent the convoy system. The position error does not accumulate over time, so the formation stays coherent. (A code sketch of this metric is given at the end of the deck.)

Noise Sensitivity
[Figure: position errors vs. simulation steps for three noise settings.]
Series 1: DNF = 0.04, ANF = 0.04, average = 35.08
Series 2: DNF = 0.1, ANF = 0.08, average = 35.69
Series 3: DNF = 0.2, ANF = 0.1, average = 36.61
The system is insensitive to the noise factors as long as they remain within a safe boundary.

Scalability
[Figure: average position error vs. number of robots, with DNF = 0.1 and ANF = 0.08.]
• The average position error increases as the number of robots increases.
• If this trend holds with more robots, the system is not scalable, in the sense that it will eventually break as more robots are added to the system.

A Future Experimental Setup
[Figure: a mixed setup coupled to the real environment: Robot_1 and other fully simulated robots use abstract sensors and abstract motors driven by the environment model, Robot_k uses abstract sensors with an HIL motor, and Robot_n uses real sensors with an HIL motor.]
For example, we can check a back robot's position error based on the position and direction of its front robot in the physical environment.

Thank you
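For completeness, here is a short Python sketch of the formation-coherence metric in Eqs. (4)-(7); it is illustrative code, not the authors' implementation. It assumes each robot's pose is given as (x, y, theta) with theta the heading used in Eqs. (5)-(6), and it averages the error over the followers, since the leader has no front neighbor; whether the leader is counted in N in Eq. (7) is an assumption made here.

```python
import math

def average_position_error(poses, D):
    """poses: list of (x, y, theta) ordered from the leader to the last follower.
    D: desired separation.  Returns E(t), the average of the per-robot position
    errors E_i(t) of Eq. (4), where each follower's desired position lies a
    distance D behind its front neighbor along that neighbor's heading
    (Eqs. (5)-(6))."""
    errors = []
    for i in range(1, len(poses)):
        x_prev, y_prev, th_prev = poses[i - 1]
        x_i, y_i, _ = poses[i]
        x_des = x_prev - D * math.cos(th_prev)                # Eq. (5)
        y_des = y_prev - D * math.sin(th_prev)                # Eq. (6)
        errors.append(math.hypot(x_i - x_des, y_i - y_des))   # Eq. (4)
    return sum(errors) / len(errors)                          # Eq. (7), over followers

# Example: a three-robot convoy with D = 50, where the last robot lags slightly.
if __name__ == "__main__":
    poses = [(100.0, 0.0, 0.0), (50.0, 0.0, 0.0), (-3.0, 2.0, 0.0)]
    print(average_position_error(poses, D=50.0))   # about 1.8
```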