Plume tracking hardware

Published on January 7, 2008

Author: Barbara

Source: authorstream.com

2-D System Testbed Concept:
- Actuated sensors (mote-based robots) take "plume" samples.
- A wireless communication system broadcasts commands to the actuated sensors.
- The base station makes the plume prediction and computes the sensor locations.
- A vision system locates the sensors.
- (Diagram labels: a fan blows air (green) through the system; a fog "contaminant" (orange) is introduced into the air stream; air leaves through an outlet.)

Outline:
- System architecture
- Hardware configuration: robot chassis, MICA board & circuit system, camera system
- Software configuration: system diagram, pGPS, mote software
- Simulation results
- Preliminary testbed results

System Architecture:
- Hardware system:
  - 10 mote-based robots
  - 11 MICA2 boards
  - 1 video camera used as a pseudo-GPS
  - 1 PC
  - 1 programming board that connects a MICA2 to the PC via a serial port
- Software system:
  - TinyOS on each MICA2 board
  - Windows XP on the PC
  - Cygwin on the PC as the compilation environment

Motes:
- What is a mote? Motes are tiny, self-contained, battery-powered computers with radio links, which let them communicate, exchange data with one another, and self-organize into ad hoc networks.
- A mote is a processor/radio board (MPR) combination.
- Motes can act as a wireless sensor/data-acquisition system.
- A mote is a self-contained, millimeter-scale sensing and communication platform.

Berkeley Motes:
- (figure) J. Hill et al., "System Architecture Directions for Networked Sensors" [online]. Available: http://www.cs.virginia.edu/~qc9b/cs851/SADofNS_2.ppt

Hardware Configuration of the Mobility Platform (block diagram):
- MICA2 (Berkeley): AVR ATmega128 CPU, CC1000 radio, 3 V power.
- Control board (USU): 6 V power; drives the actuators and reads the sensors.
- Actuators: 2 servos (2 PWM lines).
- Sensors (read over ADC lines): 2 encoders, an optional paper detector (QRB1134), 3 IR rangers (Sharp GP2D12), and a photo resistor.

3-D Model of the Robot: (figure)

Prototype: (photos)

Prototype (cont.): (photos) MICA2, PCB, and motors.

On-chip programming and low-level control: (photo) MICA2 and programming board.

Pseudo-GPS (pGPS):
- Robots estimate their own position from the encoders, which is less precise.
- The pGPS informs the robots of their actual position.
- Robots can then adjust their movement with the updated, more precise information.
- Implemented with a camera.

Camera Hardware:
- Survey of available cameras (table in the original slide).

Camera - Final Selection:
- Camera: LU-205C color 2.0-megapixel camera with a standard C-mount.
  - 2.0-megapixel resolution (1600 x 1200) at 10 frames/s.
  - SVGA (800 x 600) sub-sampling provides 40 fps.
  - USB 2.0 compatibility.
- Lens: manual iris and focus, 3.6 mm, C-mount.
- 4 m USB 2.0 cable.

Outline (section transition): next, Software configuration (system diagram, pGPS, mote software).

Software Architecture (system diagram):
- PC host: Robot Commander (observer, higher-level controller), Message Forwarder, TCP server, pseudo-GPS TCP server, Logger/DB, user application.
- Robots: TinyOS, TinyDB, lower-level controller.
- Connections: a serial port from the PC to the base-station mote, TCP/IP ports between the PC processes, wireless communication (limited bandwidth) to the robots, and a video camera feeding the pGPS.
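The Message Forwarder in the software architecture above bridges the PC-side TCP clients (Robot Commander, Matlab) and the serial link to the base-station MICA2. The sketch below is a minimal POSIX relay loop of that idea, usable under Cygwin; the device name, TCP port, and 57600 baud rate are illustrative assumptions, and the real forwarder would also parse TinyOS packet framing rather than relaying raw bytes.

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <unistd.h>
    #include <fcntl.h>
    #include <termios.h>
    #include <sys/types.h>
    #include <sys/socket.h>
    #include <netinet/in.h>
    #include <arpa/inet.h>
    #include <sys/select.h>

    /* Open the serial port to the base-station MICA2 in raw mode. */
    static int open_serial(const char *dev)
    {
        int fd = open(dev, O_RDWR | O_NOCTTY);
        if (fd < 0) { perror("serial"); exit(1); }
        struct termios t;
        tcgetattr(fd, &t);
        cfmakeraw(&t);               /* raw bytes: 8 data bits, no echo */
        cfsetispeed(&t, B57600);     /* 57600 baud (assumed MICA2 rate) */
        cfsetospeed(&t, B57600);
        tcsetattr(fd, TCSANOW, &t);
        return fd;
    }

    int main(void)
    {
        int serial = open_serial("/dev/ttyS0");      /* hypothetical device   */

        int srv = socket(AF_INET, SOCK_STREAM, 0);
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof addr);
        addr.sin_family      = AF_INET;
        addr.sin_addr.s_addr = htonl(INADDR_ANY);
        addr.sin_port        = htons(9000);          /* hypothetical TCP port */
        bind(srv, (struct sockaddr *)&addr, sizeof addr);
        listen(srv, 1);
        int client = accept(srv, NULL, NULL);        /* e.g. the Robot Commander */

        unsigned char buf[256];
        for (;;) {                                   /* relay loop */
            fd_set rd;
            FD_ZERO(&rd);
            FD_SET(serial, &rd);
            FD_SET(client, &rd);
            int maxfd = (serial > client ? serial : client) + 1;
            if (select(maxfd, &rd, NULL, NULL, NULL) < 0)
                break;

            if (FD_ISSET(serial, &rd)) {             /* mote -> PC-side client */
                ssize_t n = read(serial, buf, sizeof buf);
                if (n <= 0) break;
                write(client, buf, (size_t)n);
            }
            if (FD_ISSET(client, &rd)) {             /* PC-side client -> mote */
                ssize_t n = read(client, buf, sizeof buf);
                if (n <= 0) break;
                write(serial, buf, (size_t)n);
            }
        }
        return 0;
    }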
pGPS Updating Scheme:
- Position updates are broadcast periodically by the pGPS.
  - Saves bandwidth: the update information for all 10 robots can be packed into 3 packets.
  - Least work on the PC host, and the information stays more up to date.
- Other schemes considered:
  - Request by Matlab: more work on the PC.
  - Request by the robots: wastes bandwidth (would need 20 packets).

pGPS:
- Functions: camera calibration, marker detection, position calculation.
- Detection can be triggered by a Timer, by a TCP request, or from the GUI (for debugging).
- Image display: shows the frame and the detected objects; optional, and can be closed.

ARToolKit for pGPS:
- ARToolKit is an augmented-reality library for adding 3-D virtual objects to reality.
- For the pGPS it can calibrate the camera, grab a frame from the camera, and detect markers.

ARToolKit for pGPS (cont.): (figure)

pGPS Positioning:
- Transform pixels to length:
  - Field of view: V = d * (s / f), where d is the distance from the lens to the scene, s is the sensor size, and f is the focal length (see the d, V, f, s labels in the figure).
  - Length per pixel (cm or inch per pixel, for X and Y): V / p = (d * s) / (f * p), where p is the pixel count along that axis.
- ARToolKit changes: the image is left-right flipped, and the buffer size was increased to fit 1280 x 1024.

Mote Software:
- Mote components: components and their interfaces.
- Low-level motion control.
- Communication and its interface.
- Java support and Matlab programming.

Mote Input-Output Structure (diagram):
- Software: the mote application built on the Clock, ADC Server, PWM, and Active Message components.
- Hardware: RFM radio, UART, motors, ADC, sensors.

Mote Components:
- What is a component? Similar to an object in C++: it provides a programming interface (commands/events) and encapsulates its implementation.
- Main components on the mote: ADC server (ADCServer), LED (Led), Timer (Timer), Motor (myMotor), Communication (myComm), Xnp (for wireless download), the Robot component, and the Main component.
- (Diagram from [1]: components wired together; commands are called downward, events are signaled upward, and deferred work runs as tasks.)
- [1] Bill Naurer, "Introduction to TinyOS and nesC Programming," Crossbow Inc., Crossbow CD, Day1amSessionB1_nescIntro.ppt, July 2003.

Component Architecture in the Mote:
- Note: low-level feedback control is implemented in the Robot component; the command interpreter/executer is implemented in the main application.
- Architecture demonstrated by functionality (diagram).

Application = Component + Scheduler:
- A TinyOS application is a set of components plus the scheduler (diagram from [2]).
- [2] Radu Stolern, "TinyOS Programming" [online]. Available: http://www.cs.virginia.edu/~qc9b/cs851/TinyOS.ppt

Low-Level Feedback Control in the Robot:
- Position feedback: PI/PID control, path planning.
- Speed feedback: PI/PID control.
- Both feedback loops are implemented in the Robot component; the algorithms run in the background, with their periods defined by Timers (a minimal sketch of such a loop follows the controller-architecture slides below).

Software-Adjustable Controller Architecture (diagrams):
- Blocks: motors, odometry (ΔX, ΔY, ΔΨ), controller, pGPS (X, Y, Ψ), base station, mote, robot.
- The controller block can be placed on the base station (centralized) or on the robot's mote (distributed).
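As noted under Low-Level Feedback Control, the speed loop is a PI/PID controller run periodically from a Timer inside the Robot component. The C sketch below shows one such PI step; the gains, the tick units, and the signed 8-bit PWM range are illustrative assumptions, not values taken from the actual robot component.

    #include <stdint.h>

    #define KP       4      /* proportional gain (illustrative)          */
    #define KI       1      /* integral gain (illustrative)              */
    #define PWM_MAX  127    /* assumed signed 8-bit PWM command range    */

    typedef struct {
        int16_t target;     /* desired encoder ticks per control period  */
        int16_t integral;   /* accumulated speed error                   */
    } pi_state_t;

    /* One PI step: called from the periodic Timer event with the newest
     * encoder count for one wheel; returns that wheel's PWM command.    */
    int8_t pi_step(pi_state_t *s, int16_t measured_ticks)
    {
        int16_t error = s->target - measured_ticks;
        s->integral += error;

        int32_t u = (int32_t)KP * error + (int32_t)KI * s->integral;

        if (u >  PWM_MAX) u =  PWM_MAX;      /* clamp to the PWM range */
        if (u < -PWM_MAX) u = -PWM_MAX;
        return (int8_t)u;
    }

The position loop would wrap a similar step around the odometry estimate (and the pGPS corrections), feeding its output to the two wheel-speed loops.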
Wireless Communication Protocol:
- The TinyOS ActiveMessage module provides transmit and receive data movement using CSMA/CA-based contention-avoidance schemes.
- In receive mode, the module accepts bytes of data from the radio and performs the necessary preamble detection, synchronization, and CRC calculation/checks.
- To transmit, the stack checks whether the channel is clear by searching for a preamble and monitoring the received signal strength (via the ADC). When the channel is clear, it toggles the radio and pushes out the preamble, sync, and payload bytes.

Message Types and Command Interpreter/Executer:
- Two kinds of output messages:
  - Type 1: mote -> station, report a sensor reading (sent at a constant interval).
  - Type 2: mote -> mote, report position and sensor reading (possibly sent on request).
- Three kinds of input messages:
  - Type 1: station -> mote, give the current position and the next target position.
  - Type 2: station -> mote, set the speed.
  - Type 3: station -> mote, update the (x, y, angle) information.

State Variable Structure:

    struct robotStructure {
        int16_t  x;
        int16_t  y;
        double   angle;
        int8_t   leftMotorSpeed;
        int8_t   rightMotorSpeed;
        uint8_t  leftEncoderReading;
        uint8_t  rightEncoderReading;
        uint8_t  frontIRReading;
        uint8_t  frontPhotoResisterReading;
        uint8_t  rearPhotoResisterReading;
        /* add if necessary */
        uint16_t timeStamp;                 /* optional                      */
        /* other motes' info (optional)                                      */
    } robotInfo;

Distributed Mote Network Database:
- An on-mote database helps with signal processing of the sensor data inside the mote.
- A distributed sensor database can be built on the motes' extended flash memory (TinyDB, for example).
- It may not be necessary at the current stage.

Outline (section transition): next, Simulation results.

Geometric Configuration:
- (figure) A rectangular 2-D domain with a flow inlet and a flow outlet, a rectangular obstacle with corners at (-0.7, 0.2), (-0.7, 0.6), (-0.3, 0.2), and (-0.3, 0.6), and a circular obstacle centered at (0.5, 0) with r = 0.2; the remaining coordinates in the figure mark the domain boundary and the flow openings (roughly x in [-1, 1], y in [-0.6, 0.8]).

Finite Element Model (FEM) for Air Flow Simulation:
- Flow-only model: the incompressible Navier-Stokes equations (standard form),
  ρ ∂u/∂t - η ∇²u + ρ (u·∇)u + ∇p = F,   ∇·u = 0
  - u: velocity of the air (time-invariant at the inlet, initially zero everywhere else).
  - ρ: density of the medium.
  - η: viscosity of the medium.
  - p: pressure (calculated everywhere except at the boundary).
  - F: the force applied to the diffusing chemical (fog; calculated everywhere except at the boundary).

FEM Simulation (cont.):
- 2. Assumptions of the air-flow model:
  - Air enters the container through an inlet and leaves through an outlet.
  - The container is thin, so only the 2-D case is considered.
  - There are 2 obstacles in the container.
- 3. Simulated in FEMLAB.

FEM Simulation (cont.):
- 4. Configurations:
  - Without diffusion, four cases: inlet flow speed {low, high} x viscosity {low, high}.
  - With diffusion: low flow speed, low viscosity, small diffusion rate; and low flow speed, high viscosity, high diffusion rate.

Sample Mesh Configuration: (figure)

Low Viscosity, Low Speed - Vector Field: (figure)

Low Viscosity, Low Speed - Contour: (figure)

Low Viscosity, Low Speed - Flow Plot: (figure)

With Diffusion:
- Convection and diffusion equation: ∂c/∂t + ∇·(-D ∇c + c V) = 0
  - c: concentration.
  - D: diffusion constant.
  - V: medium (air) velocity.
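The flow and diffusion models above were solved in FEMLAB. As a rough illustration of the same convection-diffusion dynamics, the sketch below advances the concentration field with an explicit finite-difference step on a uniform grid, assuming a fixed, uniform air velocity (vx, vy); the grid size, constants, and boundary handling are illustrative and are not the authors' FEM setup.

    #include <string.h>

    #define NX 80               /* grid points in x (illustrative) */
    #define NY 48               /* grid points in y (illustrative) */

    static double conc[NY][NX];     /* concentration field c       */
    static double conc_new[NY][NX];

    /* One explicit finite-difference step of
     *   dc/dt = D * laplacian(c) - V . grad(c),
     * which matches the conservative form above when the air velocity
     * field is divergence-free (incompressible flow).                 */
    void diffusion_step(double D, double vx, double vy,
                        double dx, double dy, double dt)
    {
        memcpy(conc_new, conc, sizeof conc_new);   /* boundary cells keep their values */

        for (int j = 1; j < NY - 1; j++) {
            for (int i = 1; i < NX - 1; i++) {
                /* diffusion term: D * laplacian(c), central differences */
                double lap = (conc[j][i+1] - 2.0*conc[j][i] + conc[j][i-1]) / (dx*dx)
                           + (conc[j+1][i] - 2.0*conc[j][i] + conc[j-1][i]) / (dy*dy);
                /* convection term: V . grad(c), central differences     */
                double adv = vx * (conc[j][i+1] - conc[j][i-1]) / (2.0*dx)
                           + vy * (conc[j+1][i] - conc[j-1][i]) / (2.0*dy);
                conc_new[j][i] = conc[j][i] + dt * (D * lap - adv);
            }
        }
        memcpy(conc, conc_new, sizeof conc);
    }

A full reproduction of the slides' results would couple this to the Navier-Stokes velocity field instead of a constant (vx, vy), which is what the FEMLAB model does.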
Diffusion Surface Contours: (figure)

Outline (section transition): next, Preliminary testbed results.

The Testbed: (photo)

Fog Diffusion Video: (video)

Hardware: (photos, two slides)

Testbed Summary:
- We have presented preliminary results aimed at developing an experimental testbed, motivated by the problem of plume tracking.
- The system architecture uses mote-based robots, a pseudo-GPS system, and software built on TinyOS.
- Simulation results demonstrated basic concepts of diffusion, and a physical testbed has been completed.
- Future efforts will include:
  - Simulation validation via measurements from the physical testbed.
  - Experiments tracking the plume with the mote-based robots.
