- Limelight was designed to make robotic perception as easy and reliable as possible without sacrificing raw performance.
- Limelight is easy enough for complete beginners, and powerful enough for professionals.
- Configure zero-code computer vision pipelines for color blobs, AprilTags, neural networks, and more using the built-in web interface.
- Write custom Python SnapScript pipelines with TensorFlow, OpenCV, and more using the built-in web interface or Visual Studio Code.
- The Limelight hardware integrates a high-bandwidth, ultra-low-latency MIPI-CSI imaging sensor, arm64 computer, power conditioning, and LimelightOS.
- LimelightOS supports the REST/HTTP, WebSocket, Modbus, and NetworkTables protocols, with JSON, Protobuf, and raw output formats (a minimal HTTP sketch follows this list).
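As a rough illustration of the HTTP/JSON path, the sketch below fetches the JSON results dump over REST. The hostname `limelight.local`, port `5807`, and the `/results` path are assumptions about a typical default setup rather than values documented above; adjust them for your unit.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class LimelightHttpExample {
    public static void main(String[] args) throws Exception {
        // Assumed defaults: hostname "limelight.local", REST port 5807, "/results" JSON endpoint.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://limelight.local:5807/results"))
                .GET()
                .build();

        // The body is the same JSON dump exposed through the "json" NetworkTables key.
        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```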
Accessing the API
- Limelight OS provides a NetworkTables API for accessing Limelight data and controlling Limelight settings.
- NetworkTables is a protocol for distributed data storage and retrieval.
- NetworkTables is used by the FIRST Robotics Competition (FRC) and other robotics competitions.
- To read a value: `NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").getDouble(0);`
- To set a value: `NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").setDouble(0);`
- To read 3D (array) data: `NetworkTableInstance.getDefault().getTable("limelight").getEntry("<variablename>").getDoubleArray(new double[6]);` A combined example follows.
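Put together, the calls above look like the following minimal WPILib sketch. It assumes the WPILib NetworkTables library is on the classpath and a camera publishing to the default `limelight` table; in a real robot program the reads would live in a periodic method rather than `main`.

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightReadExample {
    public static void main(String[] args) {
        // Grab the default "limelight" table; use your camera's table name if you changed it.
        NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");

        // Read basic targeting data (see the key tables below).
        double tv = table.getEntry("tv").getDouble(0); // 1 if a valid target is in view
        double tx = table.getEntry("tx").getDouble(0); // horizontal offset to target, degrees
        double ty = table.getEntry("ty").getDouble(0); // vertical offset to target, degrees
        double ta = table.getEntry("ta").getDouble(0); // target area, percent of image

        // Write a setting the same way, e.g. select pipeline 2.
        table.getEntry("pipeline").setDouble(2);

        System.out.printf("tv=%.0f tx=%.2f ty=%.2f ta=%.2f%n", tv, tx, ty, ta);
    }
}
```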
Web Interface
- Configure Limelight settings and view data.
- Manage pipelines, SnapScript pipelines, neural networks, color blobs, and AprilTags.
- View Limelight data in real-time.
SnapScript
- SnapScript is a Python-like language designed to be easy to learn and use.
Compatible with Google Coral
- Limelight is compatible with the Google Coral USB Accelerator.
- The Google Coral USB Accelerator is a USB device that provides hardware acceleration for neural networks.
Robot Localization
- Limelight provides robot localization using AprilTags (see the sketch after this list).
- Robot localization is the process of determining the position and orientation of a robot in a known environment.
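As a sketch of how localization output can be consumed, the snippet below unpacks `botpose_wpiblue` into a WPILib `Pose2d` plus a latency-compensated timestamp. Feeding the result into a pose estimator via `addVisionMeasurement` is an assumption about your drivetrain code, not part of the Limelight API itself.

```java
import edu.wpi.first.math.geometry.Pose2d;
import edu.wpi.first.math.geometry.Rotation2d;
import edu.wpi.first.networktables.NetworkTableInstance;
import edu.wpi.first.wpilibj.Timer;

public class LimelightLocalizationExample {
    /** Reads botpose_wpiblue and returns it as a Pose2d, or null if no tags are visible. */
    public static Pose2d readBotPose() {
        double[] botpose = NetworkTableInstance.getDefault()
                .getTable("limelight")
                .getEntry("botpose_wpiblue")
                .getDoubleArray(new double[11]);

        // Layout per the table below: [x, y, z, roll, pitch, yaw, latency, tagCount, span, avgDist, avgArea]
        double tagCount = botpose.length >= 8 ? botpose[7] : 0;
        if (tagCount < 1) {
            return null; // no AprilTags in view, nothing to localize against
        }

        // Timestamp the measurement by subtracting the total latency (cl + tl, milliseconds).
        double latencySeconds = botpose[6] / 1000.0;
        double timestampSeconds = Timer.getFPGATimestamp() - latencySeconds;

        Pose2d pose = new Pose2d(botpose[0], botpose[1], Rotation2d.fromDegrees(botpose[5]));
        // e.g. poseEstimator.addVisionMeasurement(pose, timestampSeconds);  // hypothetical estimator field
        System.out.printf("vision pose %s at t=%.3f%n", pose, timestampSeconds);
        return pose;
    }
}
```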
LimelightOS
- LimelightOS is a custom operating system that is designed to run on the Limelight hardware.
- LimelightOS is based on the Linux kernel.
- LimelightOS is designed to be fast, reliable, and easy to use.
Basic Targeting Data API
| Key | Type | Description |
|---|---|---|
| tv | int | 1 if valid target exists. 0 if no valid targets exist |
| tx | double | Horizontal Offset From Crosshair To Target (degrees) |
| ty | double | Vertical Offset From Crosshair To Target (degrees) |
| txnc | double | Horizontal Offset From Principal Pixel To Target (degrees) |
| tync | double | Vertical Offset From Principal Pixel To Target (degrees) |
| ta | double | Target Area (0% of image to 100% of image) |
| tl | double | The pipeline's latency contribution (ms). Add to "cl" to get total latency. |
| cl | double | Capture pipeline latency (ms). Time between the end of the exposure of the middle row of the sensor to the beginning of the tracking pipeline. |
| tshort | double | Sidelength of shortest side of the fitted bounding box (pixels) |
| tlong | double | Sidelength of longest side of the fitted bounding box (pixels) |
| thor | double | Horizontal sidelength of the rough bounding box (0 - 320 pixels) |
| tvert | double | Vertical sidelength of the rough bounding box (0 - 320 pixels) |
| getpipe | int | True active pipeline index of the camera (0 .. 9) |
| json | string | Full JSON dump of targeting results |
| tclass | string | Class name of primary neural detector result or neural classifier result |
| tc | doubleArray | Get the average HSV color underneath the crosshair region (3x3 pixel region) as a NumberArray |
| hb | double | Heartbeat value. Increases once per frame, resets at 2 billion |
| hw | doubleArray | HW metrics [fps, cpu temp, ram usage, temp] |
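As one way to use the basic keys above, the snippet below computes a proportional steering command from `tx` whenever `tv` reports a valid target. The gain, deadband, and sign convention are illustrative assumptions, not recommended constants.

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightAimExample {
    private static final double kP = 0.03;          // illustrative proportional gain (assumed)
    private static final double kDeadbandDeg = 1.0; // stop correcting inside +/- 1 degree (assumed)

    /** Returns a turn command in [-1, 1] that steers the robot toward the current target. */
    public static double aimCommand() {
        NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");
        double tv = table.getEntry("tv").getDouble(0);
        double tx = table.getEntry("tx").getDouble(0);

        if (tv < 1.0 || Math.abs(tx) < kDeadbandDeg) {
            return 0.0; // no valid target, or already aimed
        }
        // The sign depends on your drivetrain's rotation convention; flip it if needed.
        return Math.max(-1.0, Math.min(1.0, -kP * tx));
    }
}
```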
Advanced Targeting Data API
| Key | Type | Description |
|---|---|---|
| botpose | doubleArray | Robot transform in field-space. Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image) |
| botpose_wpiblue | doubleArray | Robot transform in field-space (blue driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image) |
| botpose_wpired | doubleArray | Robot transform in field-space (red driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image) |
| botpose_orb | doubleArray | Robot transform in field-space (MegaTag2). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image) |
| botpose_orb_wpiblue | doubleArray | Robot transform in field-space (MegaTag2, blue driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image) |
| botpose_orb_wpired | doubleArray | Robot transform in field-space (MegaTag2, red driverstation WPILIB origin). Translation (X, Y, Z) in meters, Rotation (Roll, Pitch, Yaw) in degrees, total latency (cl + tl), tag count, tag span, average tag distance from camera, average tag area (percentage of image) |
| camerapose_targetspace | doubleArray | 3D transform of the camera in the coordinate system of the primary in-view AprilTag (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees) |
| targetpose_cameraspace | doubleArray | 3D transform of the primary in-view AprilTag in the coordinate system of the Camera (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees) |
| targetpose_robotspace | doubleArray | 3D transform of the primary in-view AprilTag in the coordinate system of the Robot (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees) |
| botpose_targetspace | doubleArray | 3D transform of the robot in the coordinate system of the primary in-view AprilTag (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees) |
| camerapose_robotspace | doubleArray | 3D transform of the camera in the coordinate system of the robot (array (6)) [tx, ty, tz, pitch, yaw, roll] (meters, degrees) |
| tid | int | ID of the primary in-view AprilTag |
| priorityid | int (setter) | SET the required ID for tx/ty targeting. Ignore other targets. Does not affect localization |
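For the AprilTag keys above, the snippet below reads the primary tag's ID and its pose in robot space and derives a straight-line distance. It is a reading sketch only, assuming the default `limelight` table; the `-1` is just a local default for when the entry is missing.

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightTagExample {
    public static void main(String[] args) {
        NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");

        // ID of the primary in-view AprilTag; -1 is used here as a local default when the entry is absent.
        long tagId = (long) table.getEntry("tid").getDouble(-1);

        // [tx, ty, tz, pitch, yaw, roll] of the primary tag in the robot's coordinate system.
        double[] target = table.getEntry("targetpose_robotspace").getDoubleArray(new double[6]);
        double distanceMeters = Math.sqrt(target[0] * target[0]
                + target[1] * target[1]
                + target[2] * target[2]);

        System.out.printf("tag %d is %.2f m from the robot%n", tagId, distanceMeters);
    }
}
```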
Other Utils API
| Key | Type | Description |
|---|---|---|
| ledMode | int | Sets limelight's LED state: [0] use the LED Mode set in the current pipeline, [1] force off, [2] force blink, [3] force on |
| pipeline | int | Sets limelight's active pipeline index (0..9) |
| stream | int | Sets limelight's streaming mode: [0] Standard - Side-by-side streams if a webcam is attached to Limelight, [1] PiP Main - The secondary camera stream is placed in the lower-right corner of the primary camera stream, [2] PiP Secondary - The primary camera stream is placed in the lower-right corner of the secondary camera stream |
| crop | doubleArray | Sets the crop rectangle. The pipeline must utilize the default crop rectangle in the web interface. The array must have exactly 4 entries: [X0, X1, Y0, Y1] |
| camerapose_robotspace_set | doubleArray | Set the camera's pose in the coordinate system of the robot |
| robot_orientation_set | doubleArray | SET Robot Orientation and angular velocities in degrees and degrees per second: [yaw, yawrate, pitch, pitchrate, roll, rollrate] |
| fiducial_id_filters_set | intArray | Override valid fiducial ids for localization (array) |
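Putting a few of the setter keys above together, the sketch below forces the LEDs off, switches to the PiP Main stream layout, selects a pipeline, and narrows the crop rectangle. The particular crop bounds are arbitrary illustrative numbers.

```java
import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

public class LimelightConfigExample {
    public static void main(String[] args) {
        NetworkTable table = NetworkTableInstance.getDefault().getTable("limelight");

        table.getEntry("ledMode").setDouble(1);   // [1] force the LEDs off
        table.getEntry("stream").setDouble(1);    // [1] PiP Main stream layout
        table.getEntry("pipeline").setDouble(0);  // select pipeline 0

        // Crop rectangle [X0, X1, Y0, Y1]; these particular bounds are arbitrary examples.
        table.getEntry("crop").setDoubleArray(new double[] {-0.5, 0.5, -0.5, 0.5});
    }
}
```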
Libraries
Limelight Library