The touch sensor block is found under the "Sensors" dropdown.
Let's start by breaking down how a touch sensor works at its core!
Remember, the sensors and motors available in your program are determined by your configuration! Double check that the correct configuration is active if you do not see a device list.
The information collected by a touch sensor comes in two states, also known as binary states. This information is perfect to use with a conditional statement like an if/else statement.
The block collects the binary TRUE/FALSE state from the touch sensor and acts as the condition for the if/else statement.
Take a moment to think about what this code is asking the robot to do. We could read this line of code as: "If the touch sensor is pressed, do ____; else, if the touch sensor is not pressed, do _____."
It's always helpful to be able to see what the robot thinks it's doing on our Driver Hub's screen. To do this, let's ask the robot to share some telemetry data while our program is active.
We can access the "Telemetry" blocks under the "Utilities" dropdown on the menu. A telemetry block should be added in each section of the if/else statement.
The default "Telemetry" block's information is not helpful for the robot to communicate with us. Therefore we need to change "key" and "text" to match the desired information.
The "key" should be something related to which sensor, motor, or other device we are receiving information from. Meanwhile "text" will tell us what is happening based on the state of our touch sensor and our if/else statement.
We could change it so our robot says "Hello World" when the button is pressed.
Take a moment to think about how else telemetry data could be used with your robot before moving on to the next section!
At the moment, our robot does not have any senses to help it navigate the world around it the way you might. That's the key advantage of adding sensors to our design.
For the touch sensor, one of the most common uses is as a limit switch. A limit switch helps the robot know when to halt the movement of a mechanism, like an arm or lift, that has reached its limit, similar to how your nerves tell your brain to do the same.
We can test this idea by adding on to our existing if/else statement. This time we are going to ask our motor to move until our sensor is pressed.
In the above example, the if/else first checks whether the touch sensor is pressed. The full statement could be read as: "If the touch sensor is pressed, set the motor's power to 0; else, if it is not pressed, set the power to 0.3."
There may be situations where we want our program to check whether the touch sensor is NOT pressed first. Let's take a quick look at how that would function using the "not" block from the "Logic" menu.
The color and light sensor menus are found under the "Sensors" dropdown. Additional blocks to set or call colors are within the "Color" menu under Utilities.
While a touch sensor features a physical switch to gather information, a color sensor makes use of reflected light. From that reflected light it can determine how much light it is seeing, the distance to a surface, and, of course, what color is in front of it.
For our robot we're going to focus on a few key components: hue, saturation, and value. With these we can use something known as the HSV color model to have the robot translate what it's seeing into a recognizable color.
HSV is a cylindrical representation of the RGB color model, used for things like color pickers in digital painting programs, photo editing, and vision code.
Hue, saturation, and value all will play a part in helping our robot tell us what color it detects and allow us to make adjustments for something like a uniquely colored game piece!
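As a quick illustration outside the FTC SDK, the JDK's built-in java.awt.Color class can convert an RGB reading into these three HSV components. The color used here, pure blue, is just an example:

```java
import java.awt.Color;

public class HsvDemo {
    public static void main(String[] args) {
        // Convert pure blue (R=0, G=0, B=255) into hue/saturation/value.
        float[] hsv = new float[3];
        Color.RGBtoHSB(0, 0, 255, hsv);

        // RGBtoHSB reports each component on a 0-1 scale;
        // multiplying hue by 360 gives the familiar degree angle (blue is 240).
        System.out.println("Hue: " + (hsv[0] * 360) + " degrees");
        System.out.println("Saturation: " + hsv[1]);
        System.out.println("Value: " + hsv[2]);
    }
}
```

Fully saturated, fully bright colors like this one always report saturation and value of 1; real sensor readings will land somewhere in between.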
Before we tackle colors, let's start with having our robot use the color sensor to tell us how much light is being reflected.
To start, let's grab a telemetry block to add to our loop. The "key" should be set to "Light detected". To the "number" place we will pull a block from the color sensor menu.
Time to test your program to see what your color sensor detects! While testing think about the following questions:
What happened?
Likely, the numbers and differences you saw while testing are different from those we'd see ourselves. Many factors can change the color sensor's readings, including the lighting in the room and the surface material.
However, one thing that stays the same is the scale: 0 should be the least amount of light, such as when your hand is covering the sensor, and 1 is the most light being seen.
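One common way to act on this 0-to-1 reading is a simple threshold check. This sketch uses plain Java with made-up readings and an arbitrary 0.5 cutoff; neither the helper name nor the threshold comes from the FTC SDK:

```java
public class LightCheck {
    // Hypothetical cutoff; tune it for your own sensor and lighting.
    static final double BRIGHT_THRESHOLD = 0.5;

    public static boolean surfaceIsBright(double lightDetected) {
        // "Light detected" is reported on a 0-1 scale:
        // 0 means no light (sensor covered), 1 means maximum light.
        return lightDetected >= BRIGHT_THRESHOLD;
    }

    public static void main(String[] args) {
        System.out.println(surfaceIsBright(0.02)); // covered sensor: false
        System.out.println(surfaceIsBright(0.85)); // open room: true
    }
}
```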
Let's start by establishing a few variables in our program.
We'll be going over what a variable is in more detail during Part 2: Robot Control, but for this example we are using them to help our robot translate the data it records more clearly. Our variables will be called "color", "hue", "saturation", "value", and "normalizedColors".
We've discussed how most of these are related to the HSV color model, but what about normalizedColors?
Color Normalization is another technique within vision programming intended to help compensate for differences caused by lighting and shadows when looking at colors. This also affects shades of a color. For example, there are a ton of different shades of blue, such as cyan, navy, and aquamarine, but to our robot these will all be referenced as blue.
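The idea of collapsing many shades into one name can be sketched as a simple hue-bucketing function. The degree boundaries below are illustrative choices, not values from the FTC SDK:

```java
public class ColorNamer {
    // Bucket a hue angle (0-360 degrees) into a coarse color name.
    // The boundaries are illustrative, not taken from the FTC SDK.
    public static String classifyHue(double hue) {
        if (hue < 30 || hue >= 330) return "red";
        if (hue < 90)  return "yellow";
        if (hue < 150) return "green";
        if (hue < 270) return "blue"; // cyan (~180) and navy (~240) both land here
        return "purple";
    }

    public static void main(String[] args) {
        System.out.println(classifyHue(240)); // navy-ish blue -> "blue"
        System.out.println(classifyHue(180)); // cyan -> "blue"
    }
}
```

This is how many different shades end up reported to us under one label like "blue".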
Now that we've named our variables, we need to set them to different values.
From our Variable menu we need a "set" block. From its dropdown menu, we can change it to "normalizedColors". Next we will snap a block from the Color Sensor menu into place, below our light-detected telemetry.
Next, let's go ahead and add set blocks for all our variables. To each we can connect their corresponding block from the Color menu under Utilities.
Next we need to change our variable name from the default of "myColor".
Notice that "color" is set using the "normalizedColors" variable, while the rest ("hue", "saturation", and "value") each have their input set to the "color" variable.
From here we can add our telemetry blocks to see what values the color sensor detects!
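The variable steps above map roughly onto this plain-Java sketch. It stands in for the Color Sensor blocks with a made-up normalized reading, and uses the JDK's java.awt.Color instead of the FTC color utilities:

```java
import java.awt.Color;

public class ColorPipeline {
    public static void main(String[] args) {
        // Made-up normalized reading (red, green, blue each on a 0-1 scale).
        float r = 0.05f, g = 0.10f, b = 0.80f;

        // Mirror the Blocks variables: derive hue, saturation, and value
        // from the normalized color.
        float[] hsv = new float[3];
        Color.RGBtoHSB(Math.round(r * 255), Math.round(g * 255), Math.round(b * 255), hsv);
        double hue = hsv[0] * 360; // degrees
        double saturation = hsv[1];
        double value = hsv[2];

        // Telemetry-style printout.
        System.out.println("Hue: " + hue);
        System.out.println("Saturation: " + saturation);
        System.out.println("Value: " + value);
    }
}
```

A mostly-blue reading like this one produces a hue in the 200-260 degree range, which is exactly the kind of number our telemetry blocks will display.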
A touch sensor operates in binary states -- either activated or deactivated. The isPressed() method retrieves this TRUE/FALSE state for use in conditional logic:
if (test_touch.isPressed()) {
    // Touch Sensor is pressed
} else {
    // Touch Sensor is not pressed
}
This structure can be understood as: "If the touch sensor is pressed, execute this block; otherwise, execute that block."
Key states:
Pressed: isPressed() returns TRUE
Not pressed: isPressed() returns FALSE
Displaying sensor feedback on the Driver Hub helps verify robot behavior:
if (test_touch.isPressed()) {
    // Touch Sensor is pressed
    telemetry.addData("Touch Sensor", "Is Pressed");
} else {
    // Touch Sensor is not pressed
    telemetry.addData("Touch Sensor", "Is Not Pressed");
}
The program requires telemetry.update(); after the conditional to refresh the display each loop cycle:
while (opModeIsActive()) {
    if (test_touch.isPressed()) {
        telemetry.addData("Touch Sensor", "Is Pressed");
    } else {
        telemetry.addData("Touch Sensor", "Is Not Pressed");
    }
    telemetry.update();
}
A common application uses touch sensors as limit switches to halt mechanisms when reaching physical boundaries:
if (test_touch.isPressed()) {
    // Touch Sensor is pressed
    test_motor.setPower(0);
    telemetry.addData("Touch Sensor", "Is Pressed");
} else {
    // Touch Sensor is not pressed
    test_motor.setPower(0.3);
    telemetry.addData("Touch Sensor", "Is Not Pressed");
}
This logic reads: "If pressed, stop the motor; otherwise, run it at 0.3 power."
import com.qualcomm.robotcore.eventloop.opmode.Disabled;
import com.qualcomm.robotcore.eventloop.opmode.LinearOpMode;
import com.qualcomm.robotcore.eventloop.opmode.TeleOp;
import com.qualcomm.robotcore.hardware.DcMotor;
import com.qualcomm.robotcore.hardware.Servo;
import com.qualcomm.robotcore.hardware.TouchSensor;

@TeleOp
public class HelloRobot_TouchSensor extends LinearOpMode {
    TouchSensor test_touch; // Touch sensor object
    private DcMotor test_motor = null;
    private Servo test_servo = null;

    @Override
    public void runOpMode() {
        test_motor = hardwareMap.get(DcMotor.class, "test_motor");
        test_servo = hardwareMap.get(Servo.class, "test_servo");
        test_touch = hardwareMap.get(TouchSensor.class, "test_touch");

        // Wait for the game to start (driver presses PLAY)
        waitForStart();

        // Run until the end of the match (driver presses STOP)
        while (opModeIsActive()) {
            if (test_touch.isPressed()) {
                // Touch Sensor is pressed
                test_motor.setPower(0);
                telemetry.addData("Touch Sensor", "Is Pressed");
            } else {
                // Touch Sensor is not pressed
                test_motor.setPower(0.3);
                telemetry.addData("Touch Sensor", "Is Not Pressed");
            }
            telemetry.update();
        }
    }
}
The conditional can be inverted by adding the ! operator before the sensor method:
if (!test_touch.isPressed()) {
    // Touch Sensor is not pressed
    test_motor.setPower(0.3);
    telemetry.addData("Touch Sensor", "Is Not Pressed");
} else {
    // Touch Sensor is pressed
    test_motor.setPower(0);
    telemetry.addData("Touch Sensor", "Is Pressed");
}
In OnBot Java, the ! operator negates a boolean value, telling the code to check for the opposite of what is being called; !test_touch.isPressed() is TRUE when the sensor is not pressed.
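The same inverted logic can be modeled in plain Java with a boolean standing in for isPressed(). The powerFor helper here is hypothetical, not part of the FTC SDK:

```java
public class NotDemo {
    // Hypothetical helper mirroring the op mode's logic, with a plain
    // boolean standing in for test_touch.isPressed().
    public static double powerFor(boolean pressed) {
        if (!pressed) {
            return 0.3; // run the motor while the sensor is NOT pressed
        } else {
            return 0.0; // stop once the sensor is pressed
        }
    }

    public static void main(String[] args) {
        System.out.println(powerFor(false)); // 0.3
        System.out.println(powerFor(true));  // 0.0
    }
}
```

Either ordering of the branches produces identical behavior; the ! version simply puts the "not pressed" case first.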