In 2018, we wrote a comprehensive whitepaper explaining our groundbreaking work introducing ROS to FRC. This year, we learned from last year's mistakes and challenges to write better code: code that was effectively organized for automation and took advantage of more of what ROS has to offer. We also took the huge step of transferring some CAN reads and writes to our NVIDIA Jetson TX2, which required setting up a second hardware interface. This whitepaper covers the biggest improvements we made this year.
During the offseason we focused on improving the swerve drive we used last year, along with the tank drive we designed last year as a fallback in case of obstacles. We’ve put them into a public GrabCAD space, so feel free to peruse our files. Note: most of the designs are either unfinished or won’t necessarily work in their current state.
Zebravision 6.0 is a crucial development step toward a robust and dynamic codebase. Integrating ROS allows for automation and communication between systems, opening the way for advanced development across all of the robot's sensors. The primary objective was to collect data and extract critical information about the robot's position relative to other objects on the field -- in other words, complete localization and environmental visualization for every aspect of the robot. This task, though expansive, has been completed and polished, leaving almost no robot-relative values unknown. Computer vision on The Zebracorns is closing the gap between the current standard of robotics and the goal of a fully functional, independent, and autonomous robot.
Most swerve profiling algorithms effectively assume that curvature and rotation have a negligible effect. For some paths this assumption can be very problematic, so we have developed an approach that uses at-wheel acceleration-limited (trapezoidal) profiles.
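The core idea of a trapezoidal, acceleration-limited profile can be sketched in a few lines. This is only an illustrative example of the general technique, not the team's actual swerve implementation; the function name and parameters are hypothetical.

```python
import math

def trapezoidal_velocity(distance, v_max, a_max, t):
    """Velocity at time t for a profile covering `distance`, limited to
    acceleration `a_max` and cruise velocity `v_max`.
    (Illustrative sketch only, not Team 900's implementation.)"""
    t_accel = v_max / a_max               # time to reach cruise speed
    d_accel = 0.5 * a_max * t_accel ** 2  # distance covered while accelerating

    if 2 * d_accel > distance:
        # Short move: profile is triangular and never reaches v_max.
        t_accel = math.sqrt(distance / a_max)
        v_peak = a_max * t_accel
        if t < t_accel:
            return a_max * t                      # accelerating
        elif t < 2 * t_accel:
            return v_peak - a_max * (t - t_accel)  # decelerating
        return 0.0

    t_cruise = (distance - 2 * d_accel) / v_max
    t_total = 2 * t_accel + t_cruise
    if t < t_accel:
        return a_max * t                          # accelerating
    elif t < t_accel + t_cruise:
        return v_max                              # cruising
    elif t < t_total:
        return v_max - a_max * (t - t_accel - t_cruise)  # decelerating
    return 0.0
```

Applying such a profile independently at each wheel keeps every module within its acceleration limit even when path curvature and robot rotation make the wheel speeds differ.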
In 2016, Team 900 wrote a neural network for detecting boulders. Last year, we integrated the Robot Operating System (ROS) into our vision code to facilitate communication between multiple processors. But this year, we’ve gone above and beyond what anyone thought we would be crazy enough to attempt: we transitioned our entire robot code -- including hardware control -- into ROS.
A very factual whitepaper about the shouting of ROBOT! in FRC pit areas. The incessant shouting of ROBOT in FRC pits plagues the FIRST community. The Zebracorns have a solution.
During the offseason we worked on a whole bunch of projects, ranging from multiple types of swerve drives to some west coast drives, in preparation for the game. We’ve put them into a public GrabCAD space, so feel free to peruse our files. Note: most of the designs are either unfinished or won’t necessarily work in their current state.
Team 900 has lots of programmers. A positive cornucopia of programmers. Unfortunately, this often leaves us short on the mechanical side of things. Even after we get our robot built, the lack of a convenient practice field makes it difficult to test all of our robot’s functionality or train drivers. In 2017 we decided to program a world where reality isn't an issue. Our first year exploring VR ended relatively successfully. Our driver quickly adapted to a variety of field conditions and his clever maneuvering improved our score in several cases. We went through three iterations of the simulation, primarily rebuilding the robot drivetrains, and finally reached a stable model.
During the preseason we decided to upgrade our battery cables. This paper details our process and the reasons behind the change.
Zebravision 5.0 is a radical departure from previous paradigms of robot software architecture and completes the computer vision team’s takeover of Team 900 as a whole. The work centered on integrating ROS, the Robot Operating System, into the team’s overall software framework. The main goal was to simplify Jetson-to-roboRIO communication, but ROS represents much more than that. This leap in software interfacing not only allows the two systems to communicate in a dynamic manner, but also lays the foundation for sophisticated control paradigms built upon the open-source ROS framework. This distributed computation model will enable advanced work on robot sensor processing, motion planning, environment perception, localization, and mapping.