
Follow a Line – Line Following Basics

Summary

There are many ways (and sensors) to make a robot follow a line. I have experimented with a few that are of interest.

  • Single Light Sensor
  • Two Light Sensors
  • Mindsensors LineLeader Sensor (Intelligent array of 8 light sensors in one I2C package)

In this article I review a couple of standard ways of staying on track, and then take a look at the new Mindsensors Line Leader sensor. I was part of a beta-testing group for this great sensor.

Single Light Sensor

Using one NXT light sensor, there are a couple of effective possibilities for following a line. The programmer can code the robot to follow either the line itself or the edge of the line.


Following inside the line offers the advantage of knowing the robot is on the line, but if the robot falls off the line, which direction should it turn to return to it? As the robot drifts back and forth, the programmer can only guess which way to correct its course. This method is mostly impractical.


Edge following offers the advantage of knowing the direction to turn in order to stay on the line. It is generally a faster, more accurate method for following a line. You select the right or left side of the line to follow, and code the robot to turn left if the light sensor reports BLACK and right if WHITE. This method allows a robot to follow the edge of the line through straight sections and curves. Is there a better method to follow a line?
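The black-turns-left, white-turns-right decision can be sketched as a tiny function. This is a minimal illustration in plain C, not NXT code: the threshold value and function name are hypothetical, and a real robot would tune the threshold to its sensor and surface.

```c
/* Minimal sketch of single-sensor edge following.
 * LIGHT_THRESHOLD is hypothetical; tune it for the actual sensor.
 * Readings below the threshold count as BLACK. */
#define LIGHT_THRESHOLD 50

/* Returns -1 to turn left (sensor sees BLACK: we have drifted onto the
 * line) or +1 to turn right (sensor sees WHITE: we have drifted off). */
int edge_follow_turn(int lightReading) {
    return (lightReading < LIGHT_THRESHOLD) ? -1 : +1;
}
```

Called every loop iteration, this keeps the robot wiggling along the boundary between black and white.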

Two Light Sensors

Using two NXT light sensors, there are also a couple of useful configurations to review. The sensors are generally set side by side on the front of the robot, either both inside or both outside the line. With two sensors (left and right), more accurate line following is possible: when the line is lost on one side, recovery is straightforward because we know which side lost it.

Placement of the light sensors is very important so that only one sensor at a time can be on the line. As the robot drifts from one side to the other, one sensor contacts the line and the robot can react accordingly.


Coding is similar to edge following with one light sensor, but the robot can run straight when both sensors report WHITE (or both report BLACK, depending on the configuration). This allows the robot to gain speed instead of oscillating back and forth as much as it must with a single sensor.
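The two-sensor decision for the straddling configuration (both sensors normally over WHITE) can be sketched like this. The function name is made up for illustration; readings are pre-thresholded to booleans for clarity.

```c
/* Sketch of two-sensor line following with the sensors straddling the
 * line (both normally see WHITE). Inputs are 1 if that sensor sees
 * BLACK, 0 otherwise.
 * Returns -1 = turn left, 0 = go straight, +1 = turn right. */
int two_sensor_turn(int leftSeesBlack, int rightSeesBlack) {
    if (leftSeesBlack && !rightSeesBlack)
        return -1;  /* line appeared under the left sensor: steer left */
    if (rightSeesBlack && !leftSeesBlack)
        return 1;   /* line appeared under the right sensor: steer right */
    return 0;       /* both white: centered, so drive straight */
}
```

The "go straight" case is what buys the extra speed over a single sensor: the robot only corrects when one sensor actually touches the line.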

Mindsensors Line Leader Sensor

A more advanced method is to utilize three or more light sensors in an array. Many people have custom-built line-following sensor arrays, but Mindsensors has just released the Line Leader sensor for the rest of us. Not only does it contain eight (8) sensors in its array; the intelligent sensor also implements some very nice internal processing and a robust I2C interface for communicating with the robot.


Because there are 8 light detectors under the sensor, it is very easy to determine how to follow a line. The goal is to keep the line centered under the sensor. As the line moves left or right under the robot, it can compensate to center the line under the sensor. How is this done? Mindsensors has a good explanation in the Line Leader user guide.
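The core idea can be sketched as a weighted average (centroid) of the eight detector readings. Everything below is illustrative, not the sensor's actual firmware: the function name, the 0-100 darkness scale, and the scaling are my assumptions. The sign convention matches the article's: the steering value goes negative as the line drifts to the right of center.

```c
/* Illustrative sketch (NOT the Line Leader firmware): estimate the line
 * position as the centroid of eight darkness readings, then map it to a
 * signed steering value. darkness[i] is 0 (white) to 100 (black);
 * detector 0 is leftmost, detector 7 rightmost. */
int line_position_steering(const int darkness[8]) {
    int weightedSum = 0, total = 0;
    for (int i = 0; i < 8; i++) {
        weightedSum += darkness[i] * i;
        total += darkness[i];
    }
    if (total == 0)
        return 0;  /* no line detected: no correction */
    /* Centroid scaled by 10 lies in 0..70; 35 is dead center.
     * Result is negative when the line is right of center. */
    int centroidX10 = (weightedSum * 10) / total;
    return 35 - centroidX10;
}
```

A line centered under detectors 3 and 4 yields 0; a line under detector 5 yields a negative value, telling the robot to steer toward it.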

Mindsensors took all of the hard work and math out of determining how to follow the line. There is a register for retrieving the STEERING value from the sensor. I have used the STEERING value successfully by supplying speed values to two motors. The following example code shows how to read the STEERING value from the sensor and apply it to two drive motors.

int clip(int x, int min, int max) {
  if (x<min)
    return min;
  else if (x>max)
    return max;
  else
    return x;
}

while (true) {
  // TWO MOTORS: DIFFERENTIAL-DRIVE STEERING FROM THE LINE LEADER.
  // Get steering feedback from the sensor to keep the robot on the line.
  int steering = (int) LL_Read(SensorPort, LineLeaderAddr, LL_READ_STEERING);
  motor[RIGHTMOTOR] = clip(70 + steering, -100, 100); // right motor
  motor[LEFTMOTOR]  = clip(70 - steering, -100, 100); // left motor
}

The clip function is a simple utility that belongs in everyone's toolbox. It takes a value and returns it if it lies between min and max; if the value falls outside that range, it returns min or max instead.

Mindsensors supplies a simple driver file that makes reading the STEERING value easier. We call LL_Read() with the LL_READ_STEERING constant to get the steering correction the sensor suggests based on where the line is. The last step is to determine the speed values for the left and right motors. We ideally (in this case) want to travel straight at speed = 70. We add the steering value to the right motor's power and subtract it from the left motor's, clipped to the range the motors accept (-100 to 100). That's it.

As the line drifts to the right (or the robot to the left), steering becomes a negative number. The further the line moves to the right, the larger the magnitude of the correction becomes.

If the STEERING value returned is -20, we set the motors to use the value to correct the robot’s steering.

RIGHTMOTOR power = 70 + (-20) = 50

LEFTMOTOR power = 70 - (-20) = 90

This causes the left motor to spin faster than the right, turning the robot to the right and re-centering the sensor over the line.

Your program would continue to loop, finding the next steering value and correcting the motors accordingly. The robot uses this feedback control to follow a line.

What if the robot takes advantage of the NXT's motor implementation and links the RIGHTMOTOR and LEFTMOTOR together? The NXT firmware uses the internal encoder in each motor to keep the pair running at the same speed (it can move in a straight line!). I often implement my drive motors with this configuration so the robot can easily move straight. Here is a quick code sample using synced motors.

// lastPower and lastSteering are global ints that cache the previous
// settings so the sync registers are only rewritten when something changes.
void _DriveRobot(int power, int steering) {
  if (steering >= 0) { // RIGHT TURN or STRAIGHT: left motor is the master
    if (lastPower != power || lastSteering != steering) {
      nSyncedMotors = synchBA;
      nSyncedTurnRatio = 100 - (2 * abs(steering));
      motor[LEFTMOTOR] = power;
      lastPower = power;
      lastSteering = steering;
    }
  }
  else { // LEFT TURN: right motor is the master
    if (lastPower != power || lastSteering != steering) {
      nSyncedMotors = synchAB;
      nSyncedTurnRatio = 100 - (2 * abs(steering));
      motor[RIGHTMOTOR] = power;
      lastPower = power;
      lastSteering = steering;
    }
  }
}

In this case, I have passed the STEERING value returned from the Line Leader directly to this routine with excellent success. Robot speed is set by the power parameter (-100 to 100), giving reverse, stop, or forward. The steering parameter (-100 to 100) determines left, straight, or right turns.
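To see what the turn ratio does to the motor speeds, here is a plain-C sketch of the math above. The helper name is hypothetical; in RobotC the firmware applies this scaling internally, running the slave motor at turnRatio percent of the master's speed.

```c
#include <stdlib.h>  /* abs */

/* Plain-C sketch of the synced-drive math (helper name is made up; the
 * RobotC firmware does this internally). The master motor runs at
 * 'power'; the slave runs at power * turnRatio / 100, where
 * turnRatio = 100 - 2*|steering|. */
void drive_speeds(int power, int steering, int *master, int *slave) {
    int turnRatio = 100 - 2 * abs(steering);
    *master = power;
    *slave  = power * turnRatio / 100;
}
```

At steering = 0 both motors run at full power (straight line); at steering = 50 the slave stops (pivot on one wheel); at steering = 100 the slave fully reverses (spin in place).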

Comments

  1. Thom (November 7, 2010):

     The code posted in this article will help with basic line following. I put some behaviors into my framework for doing some advanced stuff, but have not organized it for distribution. I plan to revisit that stuff before the end of the year, but am very busy with work lately. When I get the behaviors working in the latest framework, I will send a copy.
