Now that you have watched the video and co-created learning targets, you are ready to begin exploring AI Vision with the VEX AIM Coding Robot!
You will complete four explorations to develop your understanding of how to use the AI Vision sensor more effectively. Through these explorations, you will start to answer the following questions:
- How far and wide can the robot's AI Vision sensor see?
- How does light affect the robot's AI Vision?
- What is detected by the robot's AI Vision?
- Does the surface the robot is on affect the AI Vision?
In each of these explorations you will start by forming a hypothesis about AI Vision. Then you will go through the process of testing its capabilities and limitations. You will reflect on your hypothesis once you have gathered data.
In this lesson, students will complete four short explorations designed to help them develop fundamental understandings of how AI Vision works. Each exploration should take approximately 15 minutes, and they can be completed in any order.
Introduce students to the format of this lesson.
- Explain that groups will complete four explorations about AI Vision using the scientific method. Each exploration has students watch a video, make a hypothesis, complete an activity, and reflect on their collected data. Task cards will guide each part of the exploration.
- After making a hypothesis, groups will design and run tests. Example variables or factors are available in the video or on the task card if students need help.
- Once all groups have completed the four explorations, they will come together to converge their thinking about AI Vision and document their shared understandings. They should keep their task cards and journal notes for this discussion.
Before beginning, ensure students understand the expectations for completing the explorations. The explorations can be run concurrently as stations that students rotate through, or the whole class can complete one exploration at a time. Expectations you may wish to set include:
- Student roles for participation and collaboration.
- How to clean up or reset the explorations.
- If you are implementing the explorations as stations to rotate through, be sure students understand:
  - How they will know it is time to move to the next station.
  - What order they should complete the explorations in.
How far and wide can the robot's AI Vision sensor see?
In this exploration, you'll determine the sensor's field of view, first for barrels and then for AprilTags. Field of view (FOV), or angle of view, refers to how much of the environment the AI Vision sensor can detect at a given moment, and it is measured as an angle in degrees.
Watch the video below to learn how to complete this exploration.
Use these task cards to record your prediction, complete the activity, discuss, and reflect.
In this exploration, students will determine the field of view of the AI Vision Sensor by testing how far away and across how wide an angle the AI Vision can detect barrels and AprilTags.
Ensure students have the materials needed to mark the field of view. Distribute the Field of View – Barrels task card (Google / .docx / .pdf) first. You can use paper and pens, a field and masking tape or wet-erase markers, chart paper and markers as shown in the video, or whatever combination works best for you and your students. To complete this exploration, students will measure the field of view as an angle, so each group will also need a protractor.
As students share their angle measurements, expect values around 73°, give or take about 2 degrees. This value was recorded in a controlled environment with a white surface and bright lighting, so your classroom may produce slightly different results. If a group's results fall outside the expected range, encourage them to revisit their testing process to ensure they are following the procedures correctly.
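If a group does not have a protractor, or wants to cross-check a protractor reading, the angle can also be estimated from two tape measurements: the distance from the AI Vision sensor to a straight test line in front of the robot, and the width along that line between the leftmost and rightmost points where the barrel is still detected. The short Python sketch below shows the trigonometry; the sample numbers are placeholders, not expected results.

```python
import math

# Placeholder measurements -- replace with your group's tape measurements (same units).
distance_to_line = 50.0  # distance from the AI Vision sensor to the test line, in cm
detected_width = 74.0    # width along that line where the barrel is still detected, in cm

# The detected width spans half on each side of the sensor's centerline,
# so the full field of view is twice the angle to one edge.
half_angle = math.atan((detected_width / 2) / distance_to_line)
field_of_view = 2 * math.degrees(half_angle)

print(f"Estimated field of view: {field_of_view:.1f} degrees")
```

A result within a couple of degrees of the protractor reading is a good sign that the marks were placed consistently.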
Once students have determined the field of view with a barrel, distribute the Field of View – AprilTags task card (Google / .docx / .pdf). Groups should clearly distinguish between the two fields of view. For more ideas on how to implement this activity with the materials available to you, reach out in the PD+ Community.
Circulate around the room as students are completing this activity. Ask students questions such as:
- What do you notice about the detection of the object as you move it away from the robot?
- Does your data match your hypothesis? What are you discovering about the AI Vision's field of view?
- Is there a difference between the field of view when detecting a barrel versus an AprilTag? Why do you think that is?
How does light affect the robot's AI Vision?
In this exploration, you will determine how different lighting conditions affect AI Vision.
Watch the video below to learn how to complete this exploration.
Use this task card (Google / .docx / .pdf) to guide you through the exploration, including the hypothesis, data collection, discussion, and reflection steps.
In this exploration, students will determine how different lighting conditions affect AI Vision.
Ensure students have the materials needed to modify the lighting around the robot. Distribute the task card (Google / .docx / .pdf) first. Lighting can be modified by changing the light source (classroom lights, window, flashlights), the color of the lighting (warm, bright, colored), the brightness (covering the robot to make it darker, adding more light), or other ideas. If students need help determining what to test, provide them with a short list of potential factors.
As students test each variable, they should record: the lighting factor being changed, the barrel’s position, whether it is detected, and whether AI Vision correctly identifies its color.
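Groups that want a digital record instead of (or alongside) the task card table can log each trial in a few lines of Python. This is just one possible format; the rows below are made-up examples rather than expected results.

```python
# A minimal sketch of a digital trial log for the lighting exploration.
# Each entry mirrors the columns described above; the sample rows are made-up examples.
trials = [
    {"lighting": "overhead lights on", "barrel_position": "30 cm ahead",
     "detected": True, "color_correct": True},
    {"lighting": "overhead lights off", "barrel_position": "30 cm ahead",
     "detected": True, "color_correct": False},
    {"lighting": "flashlight behind barrel", "barrel_position": "60 cm ahead",
     "detected": False, "color_correct": False},
]

# Print the log as a simple table for the discussion and reflection steps.
print(f"{'Lighting':<26}{'Position':<14}{'Detected':<10}Color OK")
for t in trials:
    print(f"{t['lighting']:<26}{t['barrel_position']:<14}{str(t['detected']):<10}{t['color_correct']}")
```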
Circulate around the room as students are completing this exploration. Ask students questions such as:
- How does the lighting affect the detection of the barrel when it is close to the robot? When it's further away?
- What lighting variable impacted the detection the most? Why do you think it had such a large impact?
- How does light affect vision? Do the factors that impact the AI Vision match or differ from the factors that impact our vision as humans?
What is detected by the robot's AI Vision?
In this exploration, you will investigate which everyday objects the AI Vision sensor detects as classified objects, such as barrels and sports balls.
Watch the video below to learn more about how you will complete this exploration.
Use this task card (Google / .docx / .pdf) to guide you through the exploration, including the hypothesis, data collection, discussion, and reflection steps.
In this exploration, students will investigate how different objects can be used to trick the AI Vision sensor. They will make hypotheses about which objects the AI Vision will report as barrels, sports balls, AprilTags, or another robot, then test those hypotheses and record their data.
Ensure students have the materials needed to trick the AI Vision. Distribute the task card (Google / .docx / .pdf) first. This exploration can use a wide variety of materials, such as the One Stick Controller, construction paper recreations of objects, printouts of elements like barrels, or other objects from around the classroom. Encourage students to first try objects from around the classroom that are most likely to be detected before creating new objects.
Encourage students to use the AI Vision Utility for this exploration. As students test, they will be able to see the trick object while also seeing what is being reported by the AI Vision Sensor.
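If a group wants to script a tally of what is reported rather than watching the AI Vision Utility by eye, the sketch below shows the general shape of such a loop. The `get_detected_objects()` function is a hypothetical stand-in for however your VEXcode AIM project reads AI Vision detections; it is not the actual API, so check the AIM Python documentation for the real calls before using something like this.

```python
import time
from collections import Counter

def get_detected_objects():
    """Hypothetical stand-in: return the list of label strings AI Vision is
    currently reporting (e.g. ["orange barrel", "sports ball"]). Replace the
    body with the actual calls from your VEXcode AIM project."""
    return []

# Hold one trick object in front of the robot and tally what is reported over
# several samples, so occasional misdetections show up as low counts.
SAMPLES = 20
tally = Counter()
for _ in range(SAMPLES):
    for label in get_detected_objects():
        tally[label] += 1
    time.sleep(0.25)  # about 5 seconds of sampling in total

for label, count in tally.most_common():
    print(f"{label}: reported in {count} of {SAMPLES} samples")
```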
Circulate around the room as students are completing this exploration. Ask students questions such as:
- What patterns do you see about the objects being detected as sports balls and barrels? Why do you think that is?
- Is there one classification that trick objects are detected as more often than others (e.g., sports ball, orange barrel, blue barrel)?
- Is there a difference in your data between the trick objects when they are closer or further away from the robot? Why do you think that is?
Does the surface the robot is on affect the AI Vision?
In this exploration, you will test whether the surface the robot is on affects the AI Vision. Watch the video below to learn more about how you will complete this exploration.
Use this task card (Google / .docx / .pdf) to guide you through the exploration, including the hypothesis, data collection, discussion, and reflection steps.
In this exploration, students will investigate how the surface the robot is on affects the data from the AI Vision. They will form hypotheses about the effect (or lack of effect) of the surface on object detection, then test their ideas and record the results.
Ensure students have the materials needed to test different surfaces. Distribute the task card (Google / .docx / .pdf) first. This exploration can use a wide variety of surfaces, including carpet (of different colors and thicknesses), linoleum tile, stone flooring, wood flooring, tabletops, and mirrors. Make sure students know where in the classroom they can go to find and test those surfaces.
Encourage students to use the AI Vision Utility during this exploration. As students test, they will be able to see the differing surfaces while also seeing what is being reported by the robot's AI Vision.
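If groups log their surface trials in the same way as in the lighting exploration, a short script can summarize how often the barrel was detected on each surface. This is only an optional sketch, and the rows below are made-up examples, not expected results.

```python
# Made-up example rows: (surface, was the barrel detected?) for each trial.
trials = [
    ("white table top", True),
    ("white table top", True),
    ("dark carpet", True),
    ("dark carpet", False),
    ("mirror", False),
]

# Group the trials by surface and report a detection rate for each one.
results = {}
for surface, detected in trials:
    attempts, hits = results.get(surface, (0, 0))
    results[surface] = (attempts + 1, hits + (1 if detected else 0))

for surface, (attempts, hits) in results.items():
    print(f"{surface}: barrel detected in {hits} of {attempts} trials")
```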
Circulate around the room as students are completing this exploration. Ask students questions such as:
- How do the surfaces you are testing differ (color, size, reflectivity, etc.)?
- What patterns do you see in your data as you test the different surfaces? Why do you think that is?
Wrap Up
Now that you have completed the four explorations, it is time to draw conclusions as a class!
Before beginning this whole-class discussion, ensure that you can clearly state your group's conclusions for each of the four questions posed above during the explorations.
Feeling stuck? Write your conclusions in your journal using this sentence stem: ______________________ causes ______________________ because ______________________.
Discuss your group’s conclusions so that you can come to a whole-class consensus.
Guide students to share their discoveries in a whole-class discussion. Students should use their journals and task cards as a reference. The goal of the discussion is to develop a shared understanding of the factors that influence the robot’s AI Vision, based on group conclusions from the explorations. Use those understandings to create a shared artifact students can refer to as they code with AI Vision data in future lessons. Possible artifacts include:
- An anchor chart
- A bulletin board
- A concept map
- A shared document students can access digitally or add to their journals
Taking one exploration at a time, encourage groups to share their conclusions and support their assertions with evidence from the explorations. Then ask follow-up questions to help converge student understandings, such as:
- What data did your group collect to support that conclusion?
- How does that conclusion compare to your hypothesis?
For groups that have a hard time articulating their conclusions, have them use the provided sentence stem to guide their claims.
Select Next > to move to the final activity in this unit.