Subsea Imaging Solution!

Hello everyone!

So my Capstone team at UC Santa Barbara and I were given the amazing opportunity to work with Bristlemouth and Sofar on their Pioneer Program this past year, in collaboration with our school’s Marine Science Institute. The original goal was to use this system to establish a subsea livestream to learn more about the marine ecosystems off Arroyo Quemada Beach, but we quickly realized we would have to scale that down a bit: snap some pictures, propagate them through the Bristlemouth ecosystem, and deliver near-real-time images to the Sofar API.

Now that we know what the system is supposed to do, I’ll get a little bit into the details of our implementation! The components we added on top of the existing parts from the Pioneer program include:

  1. Raspberry Pi 4B 8GB: great for its computing capabilities, not so great on power consumption… with more time we would’ve definitely planned to move all the operations onto a Pi Zero W
  2. PiSugar 2 Plus + battery pack: this let us power the Pi independently of the dev kit and also control power cycles via bash scripts
  3. BlueRobotics Low-Light HD USB camera: easily controllable from our Raspberry Pi, and we were even able to implement frame-by-frame motion detection on it!
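
The motion detection isn’t spelled out in this post, but the core idea is just differencing consecutive grayscale frames; here’s a minimal sketch (the function name and thresholds are ours for illustration, not the actual project code):

```python
import numpy as np

def motion_detected(prev_frame, curr_frame, pixel_delta=25, min_changed=500):
    """Return True if enough pixels changed between two grayscale frames."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    changed = np.count_nonzero(diff > pixel_delta)
    return changed >= min_changed

# Toy frames: a dark background with a bright "object" appearing
prev = np.zeros((480, 640), dtype=np.uint8)
curr = prev.copy()
curr[100:200, 100:200] = 255  # 10,000 pixels change

print(motion_detected(prev, curr))  # True
print(motion_detected(prev, prev))  # False
```

In practice you’d feed this frames grabbed from the USB camera and only snap/send a picture when it fires, which helps a lot with the power budget.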

Here’s a little look at how it was wired to the dev kit: we just got the UART TX/RX pins connected, plus (hypothetically) a common ground that looks like it’s popped off our Raspberry Pi in this picture :o
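
On the software side of that link, sending from the Pi is just serial writes with some pacing. A rough sketch (the port name and delay are assumptions, and the loop takes any writable byte stream so you can dry-run it without hardware):

```python
import io
import time

def send_packets(writer, packets, delay_s=0.5):
    """Write newline-terminated text packets one at a time, pausing between
    writes so the dev kit's UART buffer has time to drain."""
    for p in packets:
        writer.write(p.encode("ascii") + b"\n")
        time.sleep(delay_s)

# On the Pi itself we'd hand this a pyserial port, roughly:
#   import serial
#   port = serial.Serial("/dev/serial0", baudrate=9600, timeout=1)  # assumed port name
# Dry run against an in-memory stream instead:
buf = io.BytesIO()
send_packets(buf, ["abc", "def"], delay_s=0)
print(buf.getvalue())  # b'abc\ndef\n'
```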

And the camera was held in place with a custom 3D-printed mount. We purchased this domed lid separately in hopes of optimizing the available space within the underwater enclosure, as well as providing a clearer, undistorted image.

Working with the Bristlemouth Constraints
This part was probably where our team had to get the most creative, seeing as how the entire Bristlemouth platform is primarily a text environment in its current state. Our solution was to convert each image to base64 (a safe way to carry image data over text without any loss, in case you were wondering why we used it), chop it up into smaller packets, and send them one by one to the dev kit/Spotter to be transmitted over satellite. Because the data is sent from the Spotter in a very specific format, we performed some quick maths to set the size of our buffers:


The tags mentioned just help us figure out where images start and end, and also let us confirm whether anything went missing in transit. This works out to about 340 bytes of payload per transmission, in accordance with the Iridium Short Burst Data protocol (cellular-only transmissions coming soon?!?!)
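
To make the chunking concrete, here’s a minimal sketch of the idea (the tag strings and helper name are hypothetical placeholders; the real packet format lives in our repo):

```python
import base64

PACKET_BYTES = 340  # usable payload per transmission, per our sizing above

def chunk_image(raw: bytes, packet_bytes=PACKET_BYTES):
    """Base64-encode an image and split it into fixed-size text packets,
    bracketed by start/end tags so the receiver can detect missing pieces."""
    text = base64.b64encode(raw).decode("ascii")
    chunks = [text[i:i + packet_bytes] for i in range(0, len(text), packet_bytes)]
    return ["<IMG_START>"] + chunks + ["<IMG_END>"]

packets = chunk_image(b"\x00" * 1000)
# 1000 raw bytes -> 1336 base64 chars -> 4 payload packets + 2 tag packets
print(len(packets))  # 6
```

Base64 inflates the data by about a third (4 output characters per 3 input bytes), which is part of why keeping the images small matters so much at these transmission sizes.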

So anyway, after Bristlemouth is done doing its magic and all our data has made it to the API, we get something that looks a little like this (courtesy of the Bristlemouth UI; our team also used curl to fetch the data for a lot of our testing)


This means the last step for us was creating a script that can locate the data columns in a downloaded CSV and piece our image back together! Very exciting stuff :hugs:
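
For the curious, the reassembly boils down to something like this sketch (the column name and tag strings are placeholders, not the actual CSV schema; the real script is on our GitHub):

```python
import base64
import csv
import io

def reassemble(csv_text, data_col="value"):
    """Pull the data column out of a downloaded CSV, drop the start/end
    tags, and decode the concatenated base64 back into image bytes."""
    rows = csv.DictReader(io.StringIO(csv_text))
    payload = "".join(row[data_col] for row in rows
                      if row[data_col] not in ("<IMG_START>", "<IMG_END>"))
    return base64.b64decode(payload)

# Round trip on a toy CSV:
encoded = base64.b64encode(b"hello remora").decode("ascii")
chunks = ["<IMG_START>", encoded[:8], encoded[8:], "<IMG_END>"]
csv_text = "timestamp,value\n" + "\n".join(f"t{i},{c}" for i, c in enumerate(chunks))
print(reassemble(csv_text))  # b'hello remora'
```

The start/end tags double as a sanity check here: if a middle packet never arrived, the base64 decode fails or the image comes out visibly corrupted, so you know to look at the transmission log.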

Unfortunately, due to the time constraints of our project, this system never made it out into the ocean, but we have the next best thing: our school’s touch tanks!!
Image capture


There are definitely so many improvements our team hoped to make but just didn’t have the time for, but I really hope my team’s work here can help get people inspired! If you’re interested, I’ll leave the poster my team made below; it has a more concise version of this post as well as our GitHub and some pretty amazing flow diagrams :wink:

Thanks for sticking with us throughout the year to our troubleshooting team at Bristlemouth + Sofar; we really learned so much from you all!
-Remora


This is awesome and congrats! We have a proposal pending that will be sending image data through the BM node to Spotter so it was inspiring to see how you tackled things. Thanks for a well-written post with some great insights about power consumption and packaging image data!

Oh man, this is wonderful! Excellent work, @Kaitlynyau and team!!!

Getting images through Bristlemouth is a milestone I know tons of people have been interested in for a long time. This is the first time I’ve ever seen it done. Thank you so much for posting about this awesome work and for sharing it on GitHub. Getting something this complex to work is a great feat, but documenting what you did so well on top of that is extraordinary, and an extremely valuable contribution to the community. Thank you so much for the amazing post. I hope others can build on your amazing work!

Excellent job on the awesome project! I am proud of what you guys have done and glad I could help along the way! :tada: Your feedback and participation on the forums has been extremely valuable!

Hi @Kaitlynyau this is so cool! Thanks for posting the code and documenting the project. Congratulations to you and the team for all you were able to get done.

I’m trying to recreate your project with a Raspberry Pi 4B + Pi Camera module v3 but I’m running into a few issues.

What app did you load onto the dev kit, and did you have to modify the size of the serial buffer?

I’m getting the following error on the dev kit

WARN payload uart user byte stream buffer full!
cobs decode error: 8

I updated the Pi to use a serial baud rate of 115200, which seems more stable, but even with a 1-second delay between messages it’s filling up the buffer.

Thanks for the help.

Hey! I don’t think I changed the buffer size at all; it stays at 2048. We also kept the baud rate at 9600 the whole time…
I think lines 124-149 of the new code are what handle that, though I’m not sure precisely. Either way, here’s the script: user_code.cpp
And I’m not sure how that didn’t make it in in the first place, but I also just added it to GitHub: REMORA_Local
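
A rough back-of-envelope that might help with the buffer-full warning: at a given baud rate the UART can only drain so fast, so sends need to be paced slower than that (8N1 framing assumed, so 10 bits on the wire per byte):

```python
def drain_time_s(n_bytes, baud=9600, bits_per_byte=10):
    """Seconds for the UART to clock out n_bytes (8N1 => 10 bits per byte)."""
    return n_bytes * bits_per_byte / baud

# One ~340-byte packet at our 9600 baud takes ~0.35 s to shift out, so we
# left well over that between packets and never overran the 2048-byte buffer.
print(round(drain_time_s(340), 2))  # 0.35
```

If your packets are much bigger than ours, even a 1-second gap between them might not be enough for the far side to consume them.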

Good luck!