Zebravision 6.0 is a crucial development step toward a robust and dynamic codebase. Integrating ROS allows for automation and communication between systems, opening the door to advanced development across all of the robot's sensors. The primary objective was to collect data and extract critical information about the robot's position relative to other objects on the field; in other words, complete localization and environmental visualization for the robot. This task, though expansive, has been finished and polished, leaving almost no robot-relative values unknown. Computer vision on The Zebracorns is closing the gap between the current standard of robotics and the goal of a fully functional, independent, and autonomous robot.
Most swerve profiling algorithms effectively assume that curvature and rotation have a negligible effect. For some paths this assumption can be very problematic, so we have developed an approach that uses at-wheel, acceleration-limited (trapezoidal) profiles.
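The paper itself covers the at-wheel approach; purely as an illustration of what an acceleration-limited (trapezoidal) profile looks like in isolation, here is a minimal rest-to-rest sketch. The function and parameter names are ours, not taken from the paper or our robot code.

```python
import math

def trapezoid_velocity(t, distance, v_max, a_max):
    """Velocity at time t for a rest-to-rest trapezoidal profile.

    Accelerate at a_max up to v_max, cruise, then decelerate.
    Falls back to a triangular profile if the distance is too
    short to ever reach v_max.
    """
    if distance <= 0.0:
        return 0.0

    # Peak velocity actually reachable over this distance.
    v_peak = min(v_max, math.sqrt(distance * a_max))
    t_accel = v_peak / a_max                      # ramp-up (and ramp-down) time
    d_accel = 0.5 * a_max * t_accel ** 2          # distance covered while ramping
    t_cruise = (distance - 2.0 * d_accel) / v_peak
    t_total = 2.0 * t_accel + t_cruise

    if t <= 0.0 or t >= t_total:
        return 0.0
    if t < t_accel:                               # accelerating
        return a_max * t
    if t < t_accel + t_cruise:                    # cruising
        return v_peak
    return a_max * (t_total - t)                  # decelerating

# Example: a 3 m move with a 2 m/s velocity cap and 4 m/s^2 acceleration cap.
for t in (0.0, 0.25, 0.5, 1.0, 1.75, 2.0):
    print(f"t={t:.2f}s  v={trapezoid_velocity(t, 3.0, 2.0, 4.0):.2f} m/s")
```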
In 2016, Team 900 wrote a neural network for detecting boulders. Last year, we integrated the Robot Operating System (ROS) into our vision code to facilitate communication between multiple processors. But this year, we've gone above and beyond what anyone thought we would be crazy enough to attempt: we transitioned our entire robot code -- including hardware control -- into ROS.
A very factual whitepaper about the shouting of ROBOT! in FRC pit areas. The incessant shouting of ROBOT in FRC pits plagues the FIRST community. The Zebracorns have a solution.
During the offseason we worked on a whole bunch of projects, ranging from multiple types of swerve drives to some West Coast drives, in preparation for the game. We've put them into a public GrabCAD space, so feel free to peruse our files. Note: most of the designs are either unfinished or won't necessarily work in their current state.
Team 900 has lots of programmers. A positive cornucopia of programmers. Unfortunately, this often leaves us short on the mechanical side of things. Even after we get our robot built, the lack of a convenient practice field makes it difficult to test all of our robot’s functionality or train drivers. In 2017 we decided to program a world where reality isn't an issue. Our first year exploring VR ended relatively successfully. Our driver quickly adapted to a variety of field conditions and his clever maneuvering improved our score in several cases. We went through three iterations of the simulation, primarily rebuilding the robot drivetrains, and finally reached a stable model.
During the preseason we decided to upgrade our battery cables. This paper details our process and the reasons behind the change.
Zebravision 5.0 is a radical departure from previous paradigms of robot software architecture and completes the computer vision team's takeover of Team 900 as a whole. The work centered on integrating ROS, the Robot Operating System, into the team's overall software framework. The main goal was to ease Jetson-to-roboRIO communication, but ROS represents much more than that. This leap in software interfacing not only allows the two systems to communicate dynamically, but also lays the foundation for sophisticated control paradigms built upon the open-source ROS framework. This distributed computation model will allow advanced work on robot sensor processing, motion planning, environment perception, localization, and mapping.
A presentation given at the 2017 GTC about the history of ZebraVision.
This guide is not complete. It is merely the beginning of your journey with the TX1. It will not answer all of your questions, nor is it meant to (it's "incomplete" for a reason). Before you begin, look at the resources available to you and read the documentation for the TX1. It's entirely possible that this guide will be out of date by the time you read it. It is not definitive and should not be treated as such.