Hello everyone!
So my Capstone team at UC Santa Barbara and I were given the amazing opportunity to work with Bristlemouth and Sofar on their Pioneer Program this past year, in collaboration with our school's Marine Science Institute. The original goal was to use this system to establish a subsea livestream to learn more about the marine ecosystems off Arroyo Quemada Beach, but we quickly realized we would have to scale that down a bit to snapping pictures and propagating them through the Bristlemouth ecosystem to deliver near-real-time images to the Sofar API.
So now that we know what our system is supposed to do, I'll get a little bit into the details of our system's implementation! The components we added on top of the existing parts given to us in the Pioneer Program include:
- Raspberry Pi 4B 8GB: great for its computing capabilities, not so great on power consumption… with more time we definitely would have moved all the operations onto a Pi Zero W
- PiSugar 2 Plus + battery pack: this let us power the Pi independently of the dev kit and also control power cycles via bash scripts
- Blue Robotics Low-Light HD USB camera: this camera was easily controllable from our Raspberry Pi, and we were even able to implement frame-by-frame motion detection with it! (a rough sketch of that follows this list)
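In case you're curious about the motion detection bit, here's a rough sketch of the frame-differencing idea (not our exact code; it assumes OpenCV and that the camera shows up as a normal UVC device):

```python
# Frame-differencing motion detection sketch (illustrative, not our exact code)
import cv2

THRESH = 25         # per-pixel intensity change that counts as "motion" (tuning guess)
MIN_PIXELS = 5000   # how many changed pixels before we call it motion (tuning guess)

cap = cv2.VideoCapture(0)              # the camera as a standard UVC device
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    delta = cv2.absdiff(prev_gray, gray)               # difference vs previous frame
    _, mask = cv2.threshold(delta, THRESH, 255, cv2.THRESH_BINARY)
    if cv2.countNonZero(mask) > MIN_PIXELS:
        cv2.imwrite("capture.jpg", frame)              # snap a picture on motion
    prev_gray = gray
```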
Here's a little look at how it was wired to the dev kit: just the UART TX/RX pins, plus, hypothetically, a common ground that looks like it popped off our Raspberry Pi in this picture :o
And the camera was held in place with a custom 3D-printed mount. We purchased this domed lid separately in hopes of optimizing the available space within the underwater enclosure, as well as providing a clearer, non-distorted image.
Working with the Bristlemouth Constraints
This part was probably where our team had to get the most creative, seeing as the entire Bristlemouth platform is primarily a text environment in its current state. Our solution was to convert our image to base64, chop it up into smaller packets, and send them one by one to the dev kit/spotter to be transmitted cellularly. (In case you were wondering why base64: it's a safe way to carry binary image data over a text channel without any loss.) Because the data is sent from the spotter in a very specific format, we performed some quick maths to set the size of our buffers; roughly, it works out like this (the tag strings below are illustrative, our real ones are in the repo):
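```python
# Quick buffer maths (illustrative numbers)
SBD_MAX = 340                             # Iridium SBD: max 340 bytes per message
START_TAG, END_TAG = "<IMG>", "</IMG>"    # hypothetical markers sent as their own packets

raw_per_packet = SBD_MAX * 3 // 4         # base64 packs 3 raw bytes into 4 chars -> 255

image_bytes = 60_000                      # e.g. a ~60 kB JPEG
b64_len = -(-image_bytes // 3) * 4        # base64 length = ceil(n/3) * 4 = 80,000 chars
packets = -(-b64_len // SBD_MAX)          # ceil(80,000 / 340) = 236 packets (+ 2 tags)
```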
The tags mentioned just help us figure out where images start and end, and also let us confirm whether anything went missing along the way. This works out to about 340 bytes per transmission, in accordance with the Iridium Short Burst Data protocol (cellular-only transmissions coming soon?!?!)
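Putting that together, the encode → chunk → send flow on the Pi looks something like this minimal sketch (the serial port, baud rate, and newline framing are assumptions, not our exact code):

```python
# Encode -> chunk -> send over UART to the dev kit (minimal sketch)
import base64
import time
import serial  # pyserial

CHUNK = 340  # sized to the Iridium SBD limit; framing is simplified here

with open("capture.jpg", "rb") as f:
    b64 = base64.b64encode(f.read())

# Pi's UART TX/RX pins to the dev kit (port and baud rate are assumptions)
port = serial.Serial("/dev/serial0", baudrate=115200, timeout=1)

port.write(b"<IMG>\n")                     # start tag
for i in range(0, len(b64), CHUNK):
    port.write(b64[i:i + CHUNK] + b"\n")   # one packet's worth of base64
    time.sleep(0.1)                        # crude pacing so the dev kit can keep up
port.write(b"</IMG>\n")                    # end tag
```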
So anyway, after Bristlemouth is done doing its magic and all our data has made it to the API, we get something that looks a little like this (courtesy of the Bristlemouth UI; our team also used curl to pull the data for a lot of our testing)
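For the curl side of our testing, a programmatic fetch looks roughly like this (the sensor-data endpoint follows Sofar's public API docs; the spotter ID and token are placeholders):

```python
# Pulling transmitted packets back off the Sofar API (rough sketch)
import requests

resp = requests.get(
    "https://api.sofarocean.com/api/sensor-data",
    params={"spotterId": "SPOT-XXXX", "token": "YOUR_API_TOKEN"},  # placeholders
)
resp.raise_for_status()
for row in resp.json().get("data", []):
    print(row)  # each row carries one packet's payload plus metadata
```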
This means the last step for us was creating a script that locates the data columns in the downloaded CSV and pieces our image back together! Very exciting stuff
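Here's a minimal sketch of that reassembly step (the CSV filename, column name, and tag strings are assumptions based on our setup):

```python
# Rebuild the image from the downloaded CSV (minimal sketch)
import base64
import csv

chunks = []
with open("sofar_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        chunks.append(row["value"])        # hypothetical column holding each payload

payload = "".join(chunks)
payload = payload.replace("<IMG>", "").replace("</IMG>", "")  # strip our tags
with open("reassembled.jpg", "wb") as out:
    out.write(base64.b64decode(payload))   # base64 back into a JPEG
```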
Unfortunately, due to the time constraints of our project, this system never made it out into the ocean, but we have the next best thing: our school's touch tanks!!
There are definitely so many improvements our team hoped to make but just didn't have the time for, but I really hope my team's work here can help get people inspired! If you're interested, I'll leave the poster my team made below; it has a more concise version of this post, as well as our GitHub and some pretty amazing flow diagrams
Thanks to our troubleshooting team at Bristlemouth + Sofar for sticking with us throughout the year; we really learned so much from you guys,
-Remora