Tuesday, April 8, 2014

Date: 03/04 - 2014
Duration of activity: 5 Hrs
Group members participating: Alexander Rasmussen & Søren Ditlev

Lesson 7

Single behavior System

Goal:

To build a system that avoids objects using a US sensor.

Plan:

Build a system according to the instructions [1] and implement the AvoidFigure9_3.java class. Observe how the system behaves to find potential issues.

The observations revealed that the system gets stuck in corners. To combat this problem we plan to update the system so that it spins 180 degrees if the distance readings from the left, right and front are all less than the stop threshold.

Results:

The initial test run of the system shows that the system drives slowly forwards. If the US sensor detects an obstacle in front of it, the system will turn left to check whether there are any obstacles to the left; if not, it will drive left. However, if there is an obstacle to the left, the system will turn right and repeat the procedure. This behaviour ensures that the system always finds a free route unless it encounters a corner. In that instance the system will edge closer until it is completely stuck in the corner.
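The decision logic above, together with the planned 180 degree corner escape, can be sketched as a pure function of the three distance readings. The threshold value and all names here are illustrative, not taken from the lesson code:

```java
// Sketch of the avoidance decision described above. The stop threshold
// is a hypothetical value in cm, not measured from our robot.
public class AvoidDecision {
    enum Action { FORWARD, TURN_LEFT, TURN_RIGHT, SPIN_180 }

    static final int STOP_THRESHOLD = 20; // cm, illustrative value

    // Choose the next action from the front, left and right readings.
    static Action choose(int front, int left, int right) {
        if (front >= STOP_THRESHOLD)
            return Action.FORWARD;                  // path ahead is clear
        if (left < STOP_THRESHOLD && right < STOP_THRESHOLD)
            return Action.SPIN_180;                 // boxed in: the corner case
        return (left >= right) ? Action.TURN_LEFT : Action.TURN_RIGHT;
    }
}
```

The SPIN_180 branch is exactly the corner fix planned above: it only fires when all three readings are below the threshold.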

Implementation of 180 degree backup
To make the system turn 180 degrees we modified the Car class to use the NXTRegulatedMotor class instead of the MotorPort class. This enabled us to use the rotate function, which rotates the wheels a given number of degrees, e.g. 360 degrees means turning the wheel one full rotation.
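Note that rotate takes wheel degrees, not robot-turn degrees: how many wheel degrees produce a 180 degree spin on the spot depends on the robot's geometry. A minimal sketch of the conversion, assuming a hypothetical track width and the standard NXT wheel diameter (neither measured from our robot):

```java
// Converts a desired on-the-spot robot turn into wheel degrees, for
// wheels turning in opposite directions. Both constants are assumptions.
public class TurnMath {
    static final double TRACK_WIDTH = 11.2;   // cm between the wheels (assumed)
    static final double WHEEL_DIAMETER = 5.6; // cm, standard NXT wheel

    // Wheel degrees needed for the robot to spin by robotDegrees.
    static int wheelDegrees(double robotDegrees) {
        return (int) Math.round(robotDegrees * TRACK_WIDTH / WHEEL_DIAMETER);
    }
}
```

With these example values, a 180 degree spin means each wheel rotates 360 degrees in opposite directions, i.e. rot.rotate(wheelDegrees(180)) on one motor and the negation on the other.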

A video of the system in action [1].

Conclusion:

With the implementation that detects corners and makes the system back up before it turns around, the system was able to drive around without getting stuck. It should be noted that the system's US sensor cannot detect objects that are low, and the system might therefore get stuck on low objects like the feet of an office chair.

Concurrent Behaviour System


Goal:

Construct a system that uses multiple behaviours to avoid getting stuck.

Plan:

Add a light sensor to the single behaviour system and install the RobotFigure9_9.java class and related classes. Observe how the system behaves.

Add an escape behaviour to the system. We plan to do this by listening to the two touch sensors in the bumper. If the left touch sensor is activated, the system will turn right so that it turns away from the wall, and vice versa for the right touch sensor. If both sensors are pressed, the system will drive backwards and turn 180 degrees.

Results:

The initial test of the multi behaviour system revealed that the system follows the light sensor and subsequently drives towards where there is the most light. If the system detects an obstacle with the US sensor, it behaves exactly like the single behaviour system from the first exercise.

To implement the escape behaviour we first copied the pseudo code [2] and translated it into Java; the Java code was put into the while loop of the Escape class. Because the escape behaviour has top priority, we had to make it tell the lower-prioritized behaviours to run whenever nothing top-prioritized occurred, in this case the robot not bumping into walls. This was done by calling the car.noCommand() function, telling the system to send no more commands from this behaviour.

while (true) {
    leftBumper = tLeft.isPressed();
    rightBumper = tRight.isPressed();

    if (leftBumper && rightBumper) {
        // Both bumpers pressed: back up, then turn around
        car.backward(power, power);
        Delay.msDelay(ms);
        car.forward(0, power);
        Delay.msDelay(ms);
        car.noCommand();
    } else if (leftBumper) {
        // Left bumper pressed: turn away from the obstacle
        car.forward(0, power);
        Delay.msDelay(ms);
        car.noCommand();
    } else if (rightBumper) {
        // Right bumper pressed: turn away from the obstacle
        car.forward(power, 0);
        Delay.msDelay(ms);
        car.noCommand();
    } else {
        // Nothing to escape from: yield to lower-priority behaviours
        car.noCommand();
    }
}
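The noCommand mechanism above amounts to a fixed-priority arbiter: each behaviour either issues a command or yields, and the highest-priority behaviour that issues one wins. A minimal sketch of that scheme, with illustrative names (null plays the role of car.noCommand()):

```java
import java.util.List;

// Minimal sketch of fixed-priority behaviour arbitration. Each behaviour
// either returns a command string or null to yield, mirroring how
// noCommand() lets a lower-prioritized behaviour take over.
public class Arbiter {
    interface Behaviour {
        String command(); // null means "no command", i.e. yield
    }

    // Behaviours are ordered from highest to lowest priority.
    static String arbitrate(List<Behaviour> byPriority) {
        for (Behaviour b : byPriority) {
            String cmd = b.command();
            if (cmd != null)
                return cmd; // first behaviour that wants control wins
        }
        return "stop"; // no behaviour wants control
    }
}
```

With escape ahead of follow in the list, a bumper hit always overrides light following, which matches the observed behaviour of the system.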

Conclusion:

The touch sensors are rarely activated because the system uses the US sensor to avoid obstacles. In corners the system can be stuck for a while, as it has to edge forward until a touch sensor is activated and the system can escape. For escaping corners, the method checking front, left and right (which was used in the single behaviour system) worked better. However, if the system hits an obstacle at a crooked angle that the US sensor did not register, the bumper works well to correct the system's course.

Add a third motor to manage the light sensor

Goal:

Add a motor to control the light sensor on the concurrent behaviour model.

Plan:

We will add a motor to control the light sensor. We plan to do this by using the NXTRegulatedMotor class, which enables us to use the tacho counter to move the motor a precise number of degrees.

Results:

To implement the motor-driven light sensor, a new class called Rotator was made. The class resembles the carDriver class but uses the NXTRegulatedMotor class instead of MotorPort to regulate the motor. It has only one function, rotate, which rotates the motor a given number of degrees.

public class Rotator {
    private NXTRegulatedMotor rot = Motor.A;

    public void rotate(int degrees) {
        rot.rotate(degrees);
    }
}

For the follow behaviour, the only change was that the steering of the wheels was replaced with rotation of the motor driving the light sensor.



// Get the light to the left
rotator.rotate(-45);
leftLight = light.getLightValue();

// Get the light to the right
rotator.rotate(90);
rightLight = light.getLightValue();

// Turn back to start position
rotator.rotate(-45);

When testing the new implementation we found that, because the system drives forward while collecting light sensor data, it sometimes did not have time to turn before driving away from the light again. To compensate, some of the delays in the Follow class were removed to get faster samples, and the delta variable was multiplied by 1.3 to get sharper turns. This improved the behaviour.
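The sharper-turn tweak can be sketched as scaling the difference between the two light samples before applying it to the motor powers. The variable names and the base power below are illustrative, not taken from the Follow class:

```java
// Sketch of the sharper-turn tweak: the light difference (delta) is
// scaled by a gain of 1.3 before steering. BASE_POWER is an assumed value.
public class FollowTuning {
    static final double GAIN = 1.3;  // the multiplier that sharpened the turns
    static final int BASE_POWER = 60;

    // Returns {leftPower, rightPower}: more light on the left slows the
    // left wheel and speeds up the right one, turning towards the light.
    static int[] powers(int leftLight, int rightLight) {
        int delta = (int) Math.round((leftLight - rightLight) * GAIN);
        return new int[] { BASE_POWER - delta, BASE_POWER + delta };
    }
}
```

For example, readings of 50 (left) and 40 (right) give a delta of 13 instead of 10, so the robot turns noticeably harder towards the light.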

Video of the system with moving light sensor [3]

Conclusion:

The implementation of a motor driving the light sensor made it possible for the system to drive while collecting light data. However, this introduced a slight problem: because the system drives forward, the follow behaviour did not have time to steer the system towards the light before encountering obstacles that triggered higher-prioritized behaviours. The problem was solved by getting faster readings and sharper turns.

References:

[2] Jones, Flynn, and Seiger, "Mobile Robots, Inspiration to Implementation", Second Edition, 1999.