1 / 30
CS26020 Introduction to Robotics and Embedded
Systems
Introduction to Robots,
Practical Sessions
and
Assessment
Dr Laurence Tyler
[email protected]
2 / 30
Practicals organisation this year
● Normally:
– 9 × 2-hour practicals in the ISL
– Use CS workstations with specific IDE installed
– Access to shared environment for testing
– Can also access ISL outside practical slots
● This year (for now, anyway):
– Access to ISL and shared environment not possible
– 8 × 2-hour practical sessions online (Discord)
– Use own computer, or CS workstations remotely
– DIY environment using what's available
– Worksheets and assignment will be adjusted to suit
3 / 30
Practicals organisation
● Rough outline schedule:
– Using (and installing) MPLAB-X IDE (1 week)
– Using the API for movement and sensor reading (1 week)
– Sensors and reactive control experiments (2 weeks)
– Combining reactive & deliberative (3 weeks)
● Demonstrators available to help with problems
– Robots can be tricky
– Don't be afraid to ask for help!
4 / 30
Assessment
● 50% of the module marks are for practical assessment, 50% for the exam.
● Usually two parts to practical assessment:
– Live demo of developed robot controller: 10%
– Report on robot controller: 40% (with code submission)
● Assessment details still being finalised
– Live demo activity will probably be changed
5 / 30
Assessment Timetable
● Two assessment deadlines, just after Easter Vac:
– "Live demo": Week 33 (begins Mon 12 April)
● in your regular practical session
– Controller report: Week 34 (due Fri 23 April)
● just before start of "Integration & Testing" week (CS22120)
● submitted to Blackboard
6 / 30
Robot Deposits
● £40 deposit for loan of robot
● Pay through Aber shop:
– shop.aber.ac.uk
● You will need to set up an account
– Use your Aber email account, not any other!
● Deposit refunded on (intact) robot return towards
end of semester
– You will be told how to do this...
7 / 30
New Robots! ☺
● Formula AllCode robot buggy
● Developed by Matrix TSL
● Better* than our previous robots:
– smaller
– more sensors
– wheel encoders
– more powerful CPU
[Photo of the buggy ©Matrix TSL]
* for certain values of "better"...
8 / 30
Sensors, Actuators and System Features
● Inputs
– 2 push-to-make switches
– 8 IR distance sensors
– Light sensor
– Microphone
– I2C accelerometer / compass
– 2 line-following sensors
– Microphone volume control
● Outputs
– 8 LEDs (one port)
– Speaker
– 4 servo outputs
● Motors
– Left and right
– Integrated gear-box
– Integrated encoders
● System
– Reset switch
– 16-bit dsPIC microcontroller
– USB-rechargeable lithium battery
– 128x32 pixel backlit LCD
– Micro SD card
– Integrated Bluetooth
– Micro USB socket
– E-blocks expansion port
– Expansion port (8-bit)
9 / 30
Formula AllCode buggy anatomy
[Annotated photo of the buggy ©Matrix TSL, with labels including the micro USB socket and the reset switch]
10 / 30
Formula AllCode buggy block diagram
©Matrix TSL
Also see the online anatomy diagram at:
https://www.matrixtsl.com/allcode/formula/
11 / 30
Formula AllCode remote API
● "Normal" method of control is using remote API
– API: Application Programming Interface, i.e. a set of
functions provided to use the system
● Write control program on laptop or workstation
– More or less any language
● Run slave control firmware on robot
● Control program sends commands to and reads
sensor data from robot over serial Bluetooth
connection between computer and robot
– Teleoperation
12 / 30
Remote API
[Diagram: Formula AllCode remote API mode. A control program running on the computer sends text commands (e.g. "SetMotors 20 20", "ReadIR 2") over the Bluetooth connection to the slave firmware running on the robot, which replies with sensor values (e.g. "450").]
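To make the teleoperation idea concrete, here is a minimal sketch of one command/reply exchange from the computer side, written in C for a POSIX (Linux/Mac) machine. The device path /dev/rfcomm0, the 115200 baud rate and the newline-terminated text framing are assumptions made for illustration; the real command set and connection details are in the Matrix TSL programming manual.

/*
 * Minimal sketch of one remote-API exchange on a POSIX system.
 * ASSUMPTIONS: the robot's Bluetooth serial link appears as /dev/rfcomm0,
 * commands are plain text terminated by a newline, and the baud rate is
 * 115200 -- check the Matrix TSL manual for the real protocol details.
 */
#include <stdio.h>
#include <string.h>
#include <fcntl.h>
#include <unistd.h>
#include <termios.h>

int main(void)
{
    /* Open the serial device that the Bluetooth link is bound to */
    int fd = open("/dev/rfcomm0", O_RDWR | O_NOCTTY);
    if (fd < 0) { perror("open"); return 1; }

    /* Basic raw 8N1 configuration at the assumed baud rate */
    struct termios tio;
    tcgetattr(fd, &tio);
    cfmakeraw(&tio);
    cfsetispeed(&tio, B115200);
    cfsetospeed(&tio, B115200);
    tcsetattr(fd, TCSANOW, &tio);

    /* Send one command and wait for the text reply, e.g. "450" */
    const char *cmd = "ReadIR 2\n";
    write(fd, cmd, strlen(cmd));

    char reply[64];
    ssize_t n = read(fd, reply, sizeof(reply) - 1);   /* blocks until data arrives */
    if (n > 0) {
        reply[n] = '\0';
        printf("robot replied: %s\n", reply);
    }

    close(fd);
    return 0;
}

Note how the program blocks in read() until the robot's reply comes back over the radio link; that round trip on every single command is exactly the lag problem discussed on the next slide.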
13 / 30
Why don't we use the remote API?
● Every command has to be sent over a serial link, as
a text string
● Every sensor reading has to come back the same
way
● This process takes time
– precious milliseconds both ways!
● Can't continue program until result of command is
known
● Recipe for lag – program can't keep up with real-
time events in a complex environment
● Robot control is dependent on having a working Bluetooth link
14 / 30
Instead: Autonomous operation
● Just like using Arduinos (well, partly...)
● Compile program on computer
– use an "in-robot" library of functions, which may be different from the remote API
● Download program to robot via USB cable
● Disconnect robot and reset
● Robot runs program internally
– full speed sensor/actuator operations (microseconds
instead of milliseconds)
– full access to hardware (potentially)
15 / 30
Remote API vs. Autonomous control
[Diagram: Formula AllCode remote API mode. The control program runs on the computer and exchanges commands and sensor values with the slave firmware on the robot over the Bluetooth connection.]
[Diagram: Fully autonomous operation mode. Source code is compiled on the computer and the compiled code is downloaded to the robot over the USB connection; the autonomous control program then runs on the robot itself, with no connection to the computer.]
16 / 30
Integrated Development Environment
● Cannot use the Arduino IDE
– different hardware, not supported directly
● Instead use MPLAB-X IDE and XC16 compiler from
Microchip
– web site: microchip.com
– DIY: Install IDE first, then compiler
● Latest version of IDE is MPLAB X
– Java app, cross-platform (based on NetBeans)
17 / 30
MPLAB-X IDE
18 / 30
Downloader application: mLoader (Windows only)
● Separate application "mLoader" needed to
download compiled code to robot
– available for download from Matrix TSL at:
https://www.matrixtsl.com/resources/files/software/programs/mLoader.zip
Sadly, there is at present
no cross-platform version
of this software... ☹
19 / 30
Downloader application: faload
● Command line tool for Linux & Mac users
● Locally written
● Easily installable
– Ubuntu/Mint compatible package
– Source code available
20 / 30
Accessing the software: choices
● Install it on your own machine:
– Best solution if you can manage it
– Windows, Linux & MacOS installs available
– Needs quite a lot of space
● about 3.1 GB for the IDE, 1.8 GB for the compiler
● OR use CS Workstations remotely
– Dependent on your Internet connection and VPN
– Edit and compile remotely
– BUT must transfer compiled code to your own machine
for download into robot
● Mount M: drive, or use sftp/scp or (e.g.) FileZilla to copy
21 / 30
In-robot API
● The in-robot API allows access to the sensors and
actuators of the robot from a program running on
the robot itself
● The API is developed by us
– Currently version 1.2 is available for use
● Initially, the most important functions will be
implemented
● Eventually, we expect to support all devices and
operating modes available on the robot
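As a rough illustration of what an autonomous program built on the in-robot API looks like, here is a minimal sketch. The header name and every FA_-style function below are placeholders, not the real API: check the in-robot API (v1.2) files and documentation on Blackboard for the actual identifiers and signatures.

/*
 * Sketch of the overall shape of an autonomous control program.
 * "allcode_api.h" and the FA_-style names are placeholders; the real
 * identifiers come from the module's in-robot API documentation.
 */
#include "allcode_api.h"   /* hypothetical in-robot API header */

int main(void)
{
    FA_Initialise();                 /* placeholder: set up the robot hardware */

    while (1)                        /* reactive control loop */
    {
        int front = FA_ReadIR(2);    /* placeholder: read a front-facing IR sensor */

        if (front > 1000)            /* strong reflection = something close ahead */
            FA_SetMotors(-20, 20);   /* placeholder: spin on the spot */
        else
            FA_SetMotors(25, 25);    /* placeholder: drive forwards */

        FA_DelayMillis(20);          /* short pause between control cycles */
    }
    return 0;
}

The part to take away is the overall shape: initialise once, then loop forever reading sensors and setting actuators, all running on the robot at full speed with no radio link in the loop.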
22 / 30
Sensors: Infra Red obstacle sensors
● 8 sensors arranged around the robot
● Active sensing: IR LED paired with photodiode
– but not the same as the sensors on the old robots
● Reading IR sensor returns a value 0 – 4095
– higher number = more reflected light = closer object
– use as rough distance sensor (needs calibrating)
– In-built function subtracts ambient IR level automatically
● Good for obstacle avoidance, mapping,
communicating with other robots (possibly...)
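As one hedged example of using these readings, the sketch below scans all 8 IR channels and reports which one sees the strongest reflection, i.e. the rough direction of the nearest obstacle. FA_ReadIR() and the 0-7 channel numbering are placeholder assumptions; the real function name and channel layout are in the in-robot API documentation.

/*
 * Sketch: find which of the 8 IR sensors sees the nearest obstacle.
 * Higher return value = more reflected IR = closer object (as above).
 */
#include "allcode_api.h"   /* hypothetical in-robot API header */

/* Returns the channel (0-7) with the highest reading, or -1 if every
 * reading is below 'threshold' (nothing close enough to matter). */
int nearest_obstacle(int threshold)
{
    int best_channel = -1;
    int best_value = threshold;

    for (int ch = 0; ch < 8; ch++) {
        int value = FA_ReadIR(ch);       /* raw reading, 0 - 4095 */
        if (value > best_value) {
            best_value = value;
            best_channel = ch;
        }
    }
    return best_channel;
}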
23 / 30
Sensors: Line sensors and light sensor
● 2 line sensors underneath the robot
– work the same way as the IR obstacle detectors
– read as 0 – 4095, where low = dark, high = light
– again, the value is the difference between transmitted and ambient IR
– good for line following, stripe counting, barcodes etc. (see the sketch below)
● 1 visible light sensor on front of robot
– reads as 0 – 4095, where higher reading = more light
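Here is a minimal line-following sketch using the two line sensors, assuming they straddle a dark line on a light floor. FA_ReadLine() and FA_SetMotors() are placeholder names, and THRESHOLD would need calibrating against real readings (remember: low = dark, high = light).

/*
 * Sketch: simple two-sensor line following (dark line on a light floor),
 * assuming the two sensors sit either side of the line.
 */
#include "allcode_api.h"   /* hypothetical in-robot API header */

#define THRESHOLD 1500     /* below this = sensor is over the dark line (calibrate!) */

void follow_line_step(void)
{
    int left  = FA_ReadLine(0);    /* placeholder: left line sensor, 0 - 4095 */
    int right = FA_ReadLine(1);    /* placeholder: right line sensor, 0 - 4095 */

    if (left >= THRESHOLD && right >= THRESHOLD)
        FA_SetMotors(25, 25);      /* both see floor: line is between them, go straight */
    else if (left < THRESHOLD)
        FA_SetMotors(10, 25);      /* line drifted under left sensor: steer left */
    else if (right < THRESHOLD)
        FA_SetMotors(25, 10);      /* line drifted under right sensor: steer right */
    else
        FA_SetMotors(0, 0);        /* both dark: stop (crossing or end marker?) */
}

The steering rule is simply to slow the wheel on the side whose sensor has gone dark, which turns the robot back so the line sits between the two sensors again.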
24 / 30
Push-buttons, LEDs, Microphone, Speaker
● 2 push-button switches reading as 0 = not pushed, 1
= pushed
● 8 LEDs in a row that can be switched on or off
individually, or set all at once (8-bit pattern)
● Sound input to microphone can be read as value 0 –
4095 (12-bit sample)
– just about possible to trigger something on a sharp sound like a clap or whistle (see the LED "VU meter" sketch below)
● Loudspeaker can play tones and sound effects
– good for debugging while robot is running
– Don't over-use it – can be very annoying!
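The sketch below ties three of these devices together: it shows the microphone level as a simple bar on the 8 LEDs until one of the push-buttons is pressed. All of the FA_-style names are placeholders for whatever the in-robot API actually provides.

/*
 * Sketch: show the microphone level as a "VU meter" on the 8 LEDs
 * until push-button 0 is pressed (0 = not pushed, 1 = pushed).
 */
#include "allcode_api.h"   /* hypothetical in-robot API header */

void mic_meter(void)
{
    while (FA_ReadSwitch(0) == 0)        /* run until button 0 is pushed */
    {
        int level = FA_ReadMic();        /* 12-bit sample, 0 - 4095 */
        int leds  = (level + 511) / 512; /* 0 - 8: how many LEDs to light */

        /* 8-bit pattern with 'leds' low bits set, e.g. 3 -> 0b00000111 */
        unsigned char pattern = (unsigned char)((1u << leds) - 1u);
        FA_LEDWrite(pattern);            /* placeholder: set all LEDs at once */

        FA_DelayMillis(50);
    }
    FA_LEDWrite(0);                      /* switch everything off again */
}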
25 / 30
Motors and wheel encoders
● Left and right motors have gearboxes
● Wheel encoders count up (only) as shaft rotates
● Two control modes supported:
– Set raw motor power (PWM value)
● as with the servo setting on the old robots
– Drive a specific distance or turn through an angle
● inbuilt routines use encoders to correct drift
● Can use encoder readings to implement PID control
of speed, turning etc. in user code
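The slide above suggests using the encoder readings for PID-style control in user code. The sketch below shows just the proportional (P) part, used to keep the robot driving roughly straight; the function names are placeholders and the gain KP would need tuning on the real robot.

/*
 * Sketch: drive (roughly) straight by comparing the two wheel encoders.
 * This is only the P term of a PID controller, with no power clamping.
 */
#include "allcode_api.h"   /* hypothetical in-robot API header */

#define BASE_SPEED 30      /* nominal power for both wheels */
#define KP          2      /* proportional gain (needs tuning) */

static int left_base, right_base;   /* encoder counts when the run started */

/* Call once before starting to drive */
void drive_straight_start(void)
{
    left_base  = FA_ReadEncoder(0);  /* placeholder: left wheel encoder */
    right_base = FA_ReadEncoder(1);  /* placeholder: right wheel encoder */
}

/* Call repeatedly (e.g. every 20 ms) while driving forwards */
void drive_straight_step(void)
{
    /* Encoders only count up, so work with the distance since the start */
    int left_dist  = FA_ReadEncoder(0) - left_base;
    int right_dist = FA_ReadEncoder(1) - right_base;

    int error = left_dist - right_dist;   /* > 0: left wheel has gone further */

    /* Slow the wheel that is ahead, speed up the one that is behind */
    FA_SetMotors(BASE_SPEED - KP * error, BASE_SPEED + KP * error);
}

Adding integral and derivative terms (full PID) follows the same pattern, and the same idea compensates for the left/right motor differences mentioned on the "Wheels" slide later.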
26 / 30
LCD panel
● Liquid Crystal Display panel for showing text &
images
● Black & White, resolution = 128 x 32 pixels
● Backlight can be turned on and off
● Basic routines for displaying text, numbers, simple
graphics
● Use C functions like sprintf for more complex
formatting
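For example, a sensor reading can be formatted with sprintf() into a string and then handed to the LCD text routine. sprintf() is standard C; FA_LCDClear() and FA_LCDPrint() are placeholder names for whatever display functions the in-robot API provides.

/*
 * Sketch: formatting a sensor reading for the LCD with sprintf().
 */
#include <stdio.h>          /* for sprintf() */
#include "allcode_api.h"    /* hypothetical in-robot API header */

void show_ir_reading(int channel)
{
    char line[32];                       /* plenty for one line of a 128x32 LCD */
    int value = FA_ReadIR(channel);      /* placeholder: raw IR reading */

    sprintf(line, "IR%d = %4d", channel, value);   /* e.g. "IR2 =  450" */

    FA_LCDClear();                       /* placeholder: wipe the display */
    FA_LCDPrint(0, 0, line);             /* placeholder: text at pixel (0, 0) */
}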
27 / 30
Other bits
● Battery voltage reading
● 3-axis accelerometer & magnetic compass
– compass is not much use! ☹
● Bluetooth wireless serial interface
– great for status information / debugging
– pair with phone/tablet
– use downloaded serial Bluetooth app
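As a small hedged sketch of the debugging use: format a status line and push it out over the Bluetooth serial link so that it appears in the serial app on a paired phone or tablet. FA_BTSendString() is an invented placeholder name; check the in-robot API documentation for the real call.

/*
 * Sketch: send a status line over the Bluetooth serial link while
 * the robot is running around, for remote debugging.
 */
#include <stdio.h>
#include "allcode_api.h"   /* hypothetical in-robot API header */

void report_status(int front_ir, int battery_mv)
{
    char msg[48];
    sprintf(msg, "front=%d battery=%dmV\r\n", front_ir, battery_mv);
    FA_BTSendString(msg);  /* placeholder: text shows up in the paired serial app */
}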
28 / 30
Sensor Noise
● Most of the robot's sensors are "noisy"
– They don't give the same reading even if nothing changes
– The numbers they give are not accurate or repeatable
– You need to allow for this in your programming
● Don't depend on exact sensor values
– Instead use ranges of acceptable values
– E.g. "far", "middle" and "near" distance zones
● often sufficient for reactive behaviours
– Or use averaging/smoothing, and threshold values (see the sketch below)
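Here is a minimal sketch of the averaging-plus-zones idea. The averaging and thresholding are plain C; FA_ReadIR() is a placeholder for the real API call, and the zone boundaries are made-up values that would need calibrating on the actual robot and surfaces.

/*
 * Sketch: smooth a noisy reading by averaging, then map it onto
 * coarse "far" / "middle" / "near" zones instead of exact values.
 */
#include "allcode_api.h"   /* hypothetical in-robot API header */

enum zone { FAR_ZONE, MIDDLE_ZONE, NEAR_ZONE };

/* Average several samples to knock down random noise */
int read_ir_smoothed(int channel, int samples)
{
    long total = 0;
    for (int i = 0; i < samples; i++)
        total += FA_ReadIR(channel);      /* raw reading, 0 - 4095 */
    return (int)(total / samples);
}

/* Map the smoothed reading onto a zone */
enum zone ir_zone(int channel)
{
    int value = read_ir_smoothed(channel, 8);

    if (value < 500)        /* weak reflection (boundary needs calibrating) */
        return FAR_ZONE;
    else if (value < 2000)  /* boundary needs calibrating */
        return MIDDLE_ZONE;
    else
        return NEAR_ZONE;   /* strong reflection = close object */
}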
29 / 30
Wheels
● They may slip!
– depends greatly on surface
● Left & right motors may behave differently for the
same speed setting
– differences in motor efficiency
– differences in gearbox friction
– differences in exact wheel position on axle
● This is normal and expected!
– try slightly higher speeds
– can use wheel encoders to compensate
30 / 30
Resources (web links)
● Formula AllCode web site
– AllCode downloadable resources - including mLoader
software
– Robot programming course manual - mainly about the
remote API, but much useful reference material as well
● Microchip.com
– MPLAB-X IDE - see "Downloads" tab
– XC16 Compiler - see "Downloads" tab
● Look on the module Blackboard page under "Practical Materials" for in-robot API files, worksheets and other documentation
