In this part, the tricycle-like mobile robot (shown in Figure 1) is assumed to use its front wheel to steer (α) and its front wheel to drive (v). The axial distance d is still 0.5 + 0.01G (m) (to the nearest millimetre), where G is your group number. It is also assumed that the control inputs v and α are corrupted by Gaussian noises with zero means and variances of σ_v² = 10⁻⁴ (m/s)² and σ_α² = 9×10⁻⁴ (rad)², respectively.

In order to estimate the robot position accurately, external landmarks are used to provide measurements from which the pose of the robot can be inferred. Assume that a laser range finder is attached to the centre of the rear wheel axle (as shown in Figure 1). The laser sensor measures the distance r and the bearing angle between the robot and each landmark, and it is assumed that the sensor can measure all landmarks in the workspace. The variances of the sensor measurements are σ_r² = 10⁻⁴ (m²) and σ_β² = 7.62×10⁻⁵ (rad²). The laser readings are given as a matrix with two (2) columns, the first column being the range reading and the second column being the angle (bearing) reading. The number of rows returned by the sensor depends on how many landmarks the sensor can 'see'. The rows are ordered so that the smallest (most negative) bearing is in the first row and the largest (most positive) bearing is in the last row.

It is assumed that the robot could be anywhere in the space x ∈ [0, 5] (m), y ∈ [0, 5] (m), with heading θ ∈ [0, 2π] (rad). The robot is driven by nominal input signals of velocity v = 0.1 m/s and steering angle α = 0.2 rad, and the sampling time of the controller and sensor is 0.05 seconds. There are four landmarks in this space, located at [2, 2.5] m, [3, 2.5] m, [3, 2] m and [3, 1] m.

You are required in this part of the assignment to:

1. Develop a Particle Filter (PF) algorithm to calculate the pose of the vehicle using the measured range and bearing values, and write the MATLAB program that implements this algorithm (a minimal sketch of such a filter is given after this list).
2. Complete and demonstrate the working of the PF in MATLAB to estimate the positions and orientations of the vehicle. (Items 1 and 2 of Part II should be demonstrated by the Week 9 tutorial sessions, 20%.)

Students will be provided with two files:

• The actual vehicle poses during the first 10 seconds of the robot motion, provided on the vUWS site. The car poses are group based (file name 'carpos#.mat', where # is the group number) and are generated using the nominal control velocity and steering angle plus zero-mean Gaussian noises with the variances stated earlier. The 'carpos.mat' file contains 3 rows that represent the car's x, y and θ values at each time step, with a total of 201 columns representing 0 to 10 seconds at 0.05-second intervals.
• A MATLAB function called 'sensor.m' that provides the noisy sensor readings, as in a real robot situation. The function returns the sensor readings with each row corresponding to one landmark, and the rows are sorted according to the bearing reading. Read the comments in sensor.m for more details.

3. Present at least the following results in your report (a driver and plotting sketch follows at the end of this section):
• Plot and compare the pose results obtained from the particle filter with the actual data given in the 'carpos#.mat' file.
• Plot and compare the estimation errors in x, y and θ between the particle filter estimate and the actual pose.
• Comment on the effectiveness (or ineffectiveness) of the filter.
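For item 1, here is a minimal MATLAB sketch of one predict-update-resample cycle. Everything in it beyond the brief is an assumption: the function name pf_step, the argument names, the tricycle kinematic model (front wheel both drives and steers, pose taken at the centre of the rear axle), the bearing-sorted data association, and the resampling threshold. Treat it as a starting point under those assumptions, not as the required implementation.

```matlab
function [particles, weights] = pf_step(particles, weights, z, v, alpha, ...
                                         dt, d, landmarks, Qv, Qa, Rr, Rb)
% One predict-update-resample cycle of a particle filter for the tricycle
% robot.  particles: 3-by-N ([x; y; theta]); weights: 1-by-N; z: sensor
% matrix, one row per landmark ([range bearing]), sorted by bearing;
% landmarks: M-by-2 ([x y] per landmark).
wrap = @(a) mod(a + pi, 2*pi) - pi;      % wrap angles to (-pi, pi]
N = size(particles, 2);

% Predict: propagate every particle through the assumed tricycle
% kinematics (front wheel both drives and steers; pose at the centre of
% the rear axle), with noisy inputs sampled per particle.
vn = v + sqrt(Qv) * randn(1, N);
an = alpha + sqrt(Qa) * randn(1, N);
th = particles(3, :);
particles(1, :) = particles(1, :) + dt * vn .* cos(an) .* cos(th);
particles(2, :) = particles(2, :) + dt * vn .* cos(an) .* sin(th);
particles(3, :) = wrap(th + dt * vn .* sin(an) / d);

% Update: weight each particle by the measurement likelihood.  Predicted
% readings are sorted by bearing to match the sensor's row ordering
% (a simple data-association choice, assumed adequate here).
for i = 1:N
    dx = landmarks(:, 1) - particles(1, i);
    dy = landmarks(:, 2) - particles(2, i);
    rhat = sqrt(dx.^2 + dy.^2);
    [bhat, idx] = sort(wrap(atan2(dy, dx) - particles(3, i)));
    rhat = rhat(idx);
    innovR = z(:, 1) - rhat;
    innovB = wrap(z(:, 2) - bhat);
    weights(i) = weights(i) * exp(-0.5 * (sum(innovR.^2) / Rr + ...
                                          sum(innovB.^2) / Rb));
end
weights = weights + 1e-300;              % guard against all-zero weights
weights = weights / sum(weights);

% Resample (systematic) only when the effective sample size gets low.
if 1 / sum(weights.^2) < N / 2
    edges = cumsum(weights);  edges(end) = 1;
    u = (rand + (0:N-1)) / N;
    idx = zeros(1, N);  j = 1;
    for k = 1:N
        while u(k) > edges(j), j = j + 1; end
        idx(k) = j;
    end
    particles = particles(:, idx);
    weights = ones(1, N) / N;
end
end
```

Resampling only when the effective sample size drops (rather than at every step) is one common way to limit particle depletion; resampling unconditionally each step would also be a valid design.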
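For items 2 and 3, a hedged driver-and-plotting sketch follows. The group number G = 1, the variable name carpos assumed to be stored inside 'carpos#.mat', and the calling syntax of sensor.m are all placeholders used purely for illustration (the real interface is documented in the comments of sensor.m); the particle count, uniform prior, and weighted-mean/circular-mean pose estimate are likewise illustrative choices, and pf_step is the function sketched above.

```matlab
% Driver sketch: run the particle filter over the provided data and plot
% the results.  G, the carpos variable name, and the sensor() call below
% are assumptions -- adjust them to match your files.
G = 1;                                   % hypothetical group number
d = 0.5 + 0.01 * G;                      % axial distance (m)
dt = 0.05;  v = 0.1;  alpha = 0.2;       % sampling time and nominal inputs
Qv = 1e-4;  Qa = 9e-4;                   % input noise variances
Rr = 1e-4;  Rb = 7.62e-5;                % sensor noise variances
landmarks = [2 2.5; 3 2.5; 3 2; 3 1];    % landmark positions (m)

S = load(sprintf('carpos%d.mat', G));    % true poses, 3 x 201
carpos = S.carpos;                       % assumed variable name
T = size(carpos, 2);

N = 1000;                                % number of particles
particles = [5 * rand(2, N); 2 * pi * rand(1, N)];    % uniform prior
weights = ones(1, N) / N;
est = zeros(3, T);

for k = 1:T
    % Placeholder sensor call -- replace with the real interface
    % described in the comments of sensor.m.
    z = sensor(carpos(:, k));            % assumed: returns [range bearing] rows
    [particles, weights] = pf_step(particles, weights, z, v, alpha, ...
                                   dt, d, landmarks, Qv, Qa, Rr, Rb);
    est(1:2, k) = particles(1:2, :) * weights';           % weighted mean x, y
    est(3, k)   = atan2(sin(particles(3, :)) * weights', ...
                        cos(particles(3, :)) * weights'); % circular mean theta
end

t = (0:T-1) * dt;
figure; plot(carpos(1, :), carpos(2, :), 'k', est(1, :), est(2, :), 'r--');
legend('actual', 'PF estimate'); xlabel('x (m)'); ylabel('y (m)');

figure;
labels = {'x error (m)', 'y error (m)', '\theta error (rad)'};
for i = 1:3
    subplot(3, 1, i);
    err = carpos(i, :) - est(i, :);
    if i == 3, err = mod(err + pi, 2*pi) - pi; end        % wrap angle error
    plot(t, err); ylabel(labels{i});
end
xlabel('time (s)');
```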