More info on "growing set of abstract parsing tools"

Hi! I’m working with @Taylor.gill on BM DevKit integration with a Sunburst Sensors pH sensor. I’ve made my way through the DevKit User Guides, #1 through #5, and the RS232 Serial Sensor example.

One thing that leapt out at me in the latter webpage was this comment:

“In the Bristlemouth Dev Kit application framework we’ve got a growing set of abstract parsing tools to make it as simple as possible to instantiate one of these abstract protocol patterns and set up a bespoke parser for a new sensor.”

Is there some documentation of what other parsers exist in this “growing set”? Obviously the RS232 example webpage is a good guide to using the OrderedSeparatorLineParser, but that’s not relevant to our sensor output.

(The only other thing that looks similar in that directory is OrderedKVPLineParser, but it’s not obvious from the source/comments what KVP refers to in this context.)

Thanks!
Mike J+

Hey there @mike_j this is a really good question.

Let me tag in my colleague @evan who may know more about how to answer you.

~Z

Hi @mike_j - there isn’t documentation of those libraries outside of the source code, and you’re looking in the right place. “KVP” refers to “Key Value Pair”. There’s a usage example of it in the bristleback_apps/aanderaa app, though it won’t run on your DevKit hardware.

We don’t have any specific roadmap for these - just building them as we go. Do you have a brief description or example of what the output format from this sensor is? We could advise on whether and how the existing libs could be used or augmented to support it.

Hi @evan and @zack_j, thanks for the followup.

The short answer is that our sensor reports in a hexadecimal string, roughly 116 fixed-width numbers of different lengths, no delimiters. Since it will only report once (or maybe twice) per hour, my instinct is to work on transmitting its unparsed output for interpretation by lab-based scripts and processes. No averaging or stdevving will be needed within the BM ecosystem.

A slightly longer answer is that, while I’ve worked with these sensors for decades, this is the first application I’ve worked on that needs to listen to their serial output in real time. So I’m still learning how they talk. Yes the main report is 466 hex characters and parseable by known field widths, but it also seems to report shorter strings for different reasons seemingly on a whim. Which is another reason I think we should just hoover up its output and send it all to shore to be dealt with there.

I started this discussion because I was wondering if any of these “abstract parsing tools” already dealt with fixed-width hexadecimal strings but at this (early) stage I’m not sure I even need them. Mostly I’m trying to figure out what else is in my toolbox before I pull out this hammer and start hitting things that look like nails.

I do have a ton of other questions (if I want temp and humidity from the BM devkit, do I need to write explicit code for that like in the early DevKit Guides? are there other status parameters to look for – like battery voltage? is there a reason that DevKit Guide #5 insists on using v0.5.0 or can I develop new mote firmware with the most recent versions available on github? can I have the BM DevKit listening for my sensor’s output continuously – vs, in your examples, turning on for 10 mins per hour or something – or is this going to exhaust somebody’s power budget? etc etc). I’m still finding my feet on figuring out what my resources are but I was pointed to this discourse page as a good starting place.

[Related to the power budgeting question, note that our sensor will run autonomously on its own battery power. It will not need power from the DevKit and it won’t need prompting (or any commands) by the DevKit. We think we could make the whole thing work with just two wires, comms ground and tx, although we might engineer cabling with more in case future-us has different notions. I’d feel especially weird about leaving out rx even if we’re dead certain we’ll never need it.]

Mike J+

Got it @mike_j - that is indeed an unusual line schema. None of the existing parsers will handle it, but if you’re comfortable in C++ I might encourage you to implement a subclass of LineParser, say FixedWidthLineParser, and open a Pull Request! That said, I’m not sure how much use it would see, as this is a very uncommon pattern for a sensor interface.
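If you do go that route, the heart of it is just walking the line by known field widths - very roughly something like the sketch below (plain C++, not tied to the actual LineParser interface; the field widths would come from your sensor's real field map, which I'm not assuming here):

#include <cstdint>
#include <cstdlib>
#include <cstring>
#include <string>
#include <vector>

// Rough sketch: slice a fixed-width hex record into integer fields.
// The caller supplies the field widths (in hex characters) for their sensor.
// Fields wider than 8 hex characters would need a 64-bit type instead.
static std::vector<uint32_t> parseFixedWidthHex(const char *line,
                                                const std::vector<size_t> &widths) {
  std::vector<uint32_t> fields;
  const size_t len = strlen(line);
  size_t offset = 0;
  for (size_t width : widths) {
    if (offset + width > len) {
      break; // short or partial line - stop rather than read past the end
    }
    std::string chunk(line + offset, width);
    fields.push_back(static_cast<uint32_t>(strtoul(chunk.c_str(), nullptr, 16)));
    offset += width;
  }
  return fields;
}

A real LineParser subclass would wrap something like this in whatever interface the other parsers implement, but the width-based slicing is the whole trick.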
Your approach of just collecting the bytes and sending them all home sounds totally reasonable.
Note – there’s currently a 300-byte payload limit on Spotter messages (this is not enforced in the Motes and is not well documented anywhere currently, so you would just see an error on your Spotter if you tried to send something larger than this - cc @zachary). So you’ll either need to shed some of the data you’re not interested in, or send the 466 bytes in multiple parts. This is a hard limit for Iridium satellite messages, but we have it on the list to remove this restriction for cellular messages.
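If you go the multiple-parts route, the splitting itself is simple. A minimal sketch (sendChunk here is a stand-in for however you call spotter_tx_data in your app - I'm deliberately not pinning down its exact signature):

#include <algorithm>
#include <cstddef>
#include <cstdint>

// Stand-in for your actual spotter_tx_data call; only the splitting below matters.
bool sendChunk(const uint8_t *data, size_t len);

static constexpr size_t kMaxChunkLen = 295; // stay safely under the payload limit

// Split a long sensor record into pieces the Spotter will accept and send each one.
static bool sendInChunks(const uint8_t *record, size_t record_len) {
  for (size_t offset = 0; offset < record_len; offset += kMaxChunkLen) {
    const size_t chunk_len = std::min(kMaxChunkLen, record_len - offset);
    if (!sendChunk(record + offset, chunk_len)) {
      return false; // a real app would probably log and/or retry here
    }
  }
  return true;
}

You’d also want some way to mark the pieces (a sequence byte, say) so your shore-side scripts can stitch them back together in order.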


if I want temp and humidity from the BM devkit, do I need to write explicit code for that like in the early DevKit Guides?

All of the demo apps in apps/bm_devkit use the main file from bmdk_common. This includes the polling and logging of on-board sensors (hum/temp, pressure, and power monitors). You can adjust the polling rates using the system configs from the guide examples. I’d suggest starting there.


is there a reason that DevKit Guide #5 insists on using v0.5.0 or can I develop new mote firmware with the most recent versions available on github?

You can use the most recent release, but you’ll need to update your Bridge and main Spotter firmware for Spotter compatibility as well. That’s the next guide we’ll be releasing - ETA 2 weeks (cc @zack_j @timjoh). If you want to test comms with the Spotter before then, I’d suggest sticking with 0.5.0 and updating to the latest once that guide + binaries are available. If you’ll be focusing on Mote integration and don’t need to test Spotter comms yet, I’d suggest developing on the latest released bm_protocol version (0.9.1).


can I have the BM DevKit listening for my sensor’s output continuously – vs, in your examples, turning on for 10 mins per hour or something – or is this going to exhaust somebody’s power budget?

Exactly - the main consideration is energy budget. Predicting whether a solar-powered system will be power sustainable in a particular environment is fairly complicated. We’re working on tools to support this, but those are a ways out this year.
The energy consumption of the Spotter in default configurations + a single Dev Kit that is constantly on will be ~300mW (150mW for the Dev Kit, and 150mW for the Spotter), and will operate sustainably off of Spotter’s solar panels in most places between 50º N and S year round. If you double that to 600mW, we’re now likely only year-round sustainable up to 25º. If long-dwell power sustainability is a concern for your application, you can duty cycle your sensor directly from the Dev Kit, use the Spotter sampling configurations to duty cycle the entire Dev Kit, dig into the under-the-hood low-power modes in the STM32U5 platform, disable the Spotter visibility LED, or reduce GPS sampling.
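To put rough numbers on the duty-cycling option (a back-of-the-envelope example using the ~150mW Dev Kit figure above and ignoring wake-up overhead): running the Dev Kit 10 minutes per hour averages out to roughly 150mW × 10/60 ≈ 25mW, so Spotter plus a duty-cycled Dev Kit lands near ~175mW rather than ~300mW.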


hope this helps!

Thanks @evan for the detailed response.

This part…

…is simultaneously (1) really! good! to know (thanks) and (2) kind of a ‘whoa!’ moment for me. It makes me wonder what other constraints I might run into without knowing about them, and without an error message that tells me what the actual problem is.

Two comments, and a followup question:

Comment one is just that I’ve been finding firmware compilation and upload to be fussy in ways I haven’t been able to explain. I’ve been cloning and customizing the hello_world and serial_payload_example apps in the bm_devkit folder. Sometimes when I copy the resulting *elf.dfu.bin file over to the SD card and use the “bridge dfu” command to try to send it to the devkit mote, I get an “Invalid param” reply, for no reason I’ve been able to identify. It just hates some of my firmware files. Or their names. Or something about the way I copy them (with the SD card mounted as a volume on the macbook, and dragging+dropping within the Finder).

Comment two is that the way that I connect the bm devkit to my spotter buoy seems to affect how easily I can connect my macbook to the devkit by usb-c cable and use a pyserial terminal to connect to the devkit. When the buoy’s smart mooring cable is plugged into the pair of bm connectors on the mote side of the devkit, it always works. When it’s plugged into the pair of bm connectors on the side with the leds and all the wire terminals (where I attach my tx, rx and ground wires from my serial sensor), it almost always fails to list the two bm /dev/cu.usbmodem* devices for the pyserial terminal to connect to. Except on some rare occasions they are listed there and I connect just fine (to the one ending in 1). I had been proceeding on the assumption that the two pairs of bm connection terminals on the devkit are completely interchangeable, but my experiences are challenging that assumption.

Aaaand my followup question is, would this 300-byte limit be relevant to the PLUART::readLine() code as well? I’ve only just begun to successfully get serial output from my pH sensor, but in these early days I’m seeing that the short (4-byte) sensor messages come through just fine, are written out on the consoles, and captured in the SD data files. Whereas the long (466-byte) messages do correctly trigger the 2nd bmdk LED light and do trigger the firmware to write something to the spotter’s SD data file, etc., but the actual lines it gives me as output are empty. The code I cloned uses a 2048-byte buffer but perhaps I need to take in my sensor’s message in smaller chunks? [Weirdly, my sensor seems to use CR as its line separator, not LF and not CRLF, but I believe I’ve accounted for that in my code.]

Just to be clear my test code is being developed in the v0.5.0 environment. I’m not yet attempting any cell/iridium comms with sensor payload but I hope to start playing with that soon and I don’t plan to wait 2 weeks for the upcoming Dev Guide.

Mike J+

Actually NM about my followup question. My sensor oh-so-kindly spits out an additional CR after it writes out its long measurement string, which means this warning comes into play:

/// Warning: PLUART only stores a single line at a time. If your attached payload sends lines
/// faster than the app reads them, they will be overwritten and data will be lost.

Seems like I’ll need to forego line-oriented input in favor of something that reads character by character.

Sorry, I’ll just be over here talking to myself in the corner.

Mike J+

Ah! :open_mouth: Thanks for the heads-up @evan! I just looked into this, and the reason is that we add a 29-byte header to the message encoding things like device health and location, which means the precise limit is 311 bytes. I just verified this. If I call spotter_tx_data with 311 bytes, it sends just fine, but if I call it with 312 bytes, then I get these messages in my Spotter serial console.

2024-01-26T19:25:56.109Z [MS] [ERROR] Message size is too large!
2024-01-26T19:25:56.109Z [BM_TX] [ERROR] Unable to submit message to sat/cell queue

I just submitted a PR documenting these limits in doxygen comments.

oh. Nifty! Thanks for that.

I did manage to re-code my sensor integration using PLUART::readByte() instead of PLUART::readLine(), so progress is being made. All output from my serial sensor now makes it into the buoy’s SD files without any loss of characters.

Next up, transmitting. Following this example I was able to send an array of uint8_t (length < 295 just to be conservative) using spotter_tx_data(), and it rewarded me with the “Successfully sent Spotter transmit data request” message I cloned from that example.

However, when I look at past transmissions with something like this…

curl "https://api.sofarocean.com/api/raw-messages?token=*redacted*&spotterId=*redacted*"

…it’s not obvious to me that my sensor’s data is contained within the messages it gives me, nor do I see any timestamps that look close to when the sensor’s data was received by the bmdk.

I did previously succeed in tracking a transmission like this…

spotter txdata i Hello, Spotter!

…while following DevKit Guide #3, and I found those bytes in the curl output, so I am confident that our buoy is able to phone home.

What resources or documentation exist for pulling data from the Sofar API and turning it into something recognizable? I realize I might be asking this in the wrong place (that’s more of a Sofar/Spotter concern than a bm function) but if you have pointers to helpful sites or hints based on experience I’d be appreciative.

Mike J+


On the Sofar Ocean support page, there are a bunch of helpful links, including Spotter Data Access documentation.

Messages sent with spotter_tx_data, when viewed using the raw-messages API, can be distinguished from the other Spotter messages because they will always have 0xDE as the first byte. The first 29 bytes (including the 0xDE) are a header, and your payload starts at the 30th byte.
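So once you’ve decoded a raw message back into bytes (however your shore-side tooling does that - I’m not assuming a particular encoding here), recovering your payload is just a header check and a skip. A minimal sketch:

#include <cstddef>
#include <cstdint>
#include <vector>

static constexpr uint8_t kSpotterTxDataMarker = 0xDE; // first byte of spotter_tx_data messages
static constexpr size_t kSpotterTxHeaderLen = 29;     // header bytes before the payload

// Return just the sensor payload from one decoded raw message, or an empty
// vector if it isn't a spotter_tx_data message (or has no payload).
static std::vector<uint8_t> extractPayload(const std::vector<uint8_t> &msg) {
  if (msg.size() <= kSpotterTxHeaderLen || msg[0] != kSpotterTxDataMarker) {
    return {};
  }
  return std::vector<uint8_t>(msg.begin() + kSpotterTxHeaderLen, msg.end());
}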


Full disclosure - there are several such constraints :grimacing:. The good news is we’ve got a team that is highly motivated to write a lot of this stuff down, and some ambitious plans to make great strides in the overall realm of “Bristlemouth Developer Accessibility & Support” this quarter (and beyond).

Comment one is just that I’ve been finding firmware compilation and upload to be fussy in ways I haven’t been able to explain.

There have been several stability improvements in firmware updates in the versions since 0.5.0, specifically around preventing some edge cases where updates would annoyingly hang and time out. But the “invalid param” issue doesn’t ring a bell. If you come across a particular image that reliably results in this error, could you send it our way?

I had been proceeding on the assumption that the two pairs of bm connection terminals on the devkit are completely interchangeable, but my experiences are challenging that assumption.

The ports should be interchangeable, and this is another issue we haven’t come across ourselves. There is an ordering dependency - for the Mote to create a USB serial connection, the USB cable must be plugged into the Mote and host before power is applied to the Mote. Do you think that could be at play here?
Another question - are you able to reproduce this issue when powering the Mote from the Wall-wart, or only from the Ebox?
We’ll try out a few systems here and see if we can reproduce this on our end.

Seems like I’ll need to forego line-oriented input in favor of something that reads character by character.

Yep - there’s a feature in the payload_uart library to support this. You need to call PLUART::setUseByteStreamBuffer(true), then you can do something like:

  // Read a cluster of bytes if available
  // -- A timer is used to try to keep clusters of bytes (say from lines) in the same output.
  static int64_t readingBytesTimer = -1;
  // Note - PLUART::setUseByteStreamBuffer must be set true in setup to enable bytes.
  if (readingBytesTimer == -1 && PLUART::byteAvailable()) {
    // Get the RTC if available
    RTCTimeAndDate_t time_and_date = {};
    rtcGet(&time_and_date);
    char rtcTimeBuffer[32];
    rtcPrint(rtcTimeBuffer, &time_and_date);
    printf("[payload-bytes] | tick: %" PRIu64 ", rtc: %s, bytes:", uptimeGetMs(),
           rtcTimeBuffer);
    // not very readable, but it's a compact trick to overload our timer variable with a -1 flag
    readingBytesTimer = (int64_t)((u_int32_t)uptimeGetMs());
  }
  while (PLUART::byteAvailable()) {
    readingBytesTimer = (int64_t)((u_int32_t)uptimeGetMs());
    uint8_t byte_read = PLUART::readByte();
    printf("%02X ", byte_read);
  }
  if (readingBytesTimer > -1 &&
      (u_int32_t)uptimeGetMs() - (u_int32_t)readingBytesTimer >= BYTES_CLUSTER_MS) {
    printf("\n");
    readingBytesTimer = -1;
  }

Note: For reference, there’s an example of this at the HEAD of the develop branch of the RBR Coda example app [here].

I really think it should be high-priority to open up more direct access to the satellite/cell queue. It would be super valuable to:

  1. Be able to send data directly to my own backend without going through Sofar’s server, for low-latency monitoring, reliability, and data ownership.
  2. Have a cellular only mode for large data uploads.
  3. Have control over message send failures and retries - sometimes I notice a large queue building up if the modem is disconnected, but there’s no way to detect this from bristlemouth.

Largely I think it would be useful to build up a more complex API for triggering sends from bristlemouth instead of the current “write bytes to serial” system. Curious what the bristlemouth devs think though!


Hi Chris,

Much of this is in progress / on-roadmap, and will see a lot of forward progress throughout the year!

I’ll start with the easy ones:

  2. Have a cellular only mode for large data uploads.

The Spotter firmware already has the scaffolding in place for a ‘cellular only’ flag, but the backend implementation for accessibility isn’t complete yet. We’ll post an update in the forums when it is.

  3. Have control over message send failures and retries - sometimes I notice a large queue building up if the modem is disconnected, but there’s no way to detect this from bristlemouth.

YES. We’re imagining a request/reply interface on Spotter that will allow direct interrogation and control of the message queue, including confirmations and failure notifications.

One more feature request I’ll add to this list:

  4. Clean handling of the 1500 byte MTU in the Bristlemouth layer, so I can send data packets larger than 1500 bytes over cellular (or satellite if I’m willing to pay for it), without having to do my own packetization in my app.

2 & 3 are pure Spotter features (Sofar needs to implement these, as Spotter is proprietary), but 4 also involves some core improvements to Bristlemouth protocol to implement mechanisms for multi-message payloads (eg File Transfer Protocol, stream encapsulation, etc).

The tricky one:

  1. Be able to send data directly to my own backend without going through Sofar’s server, for low-latency monitoring, reliability, and data ownership.

Sofar doesn’t currently offer this option to commercial customers, but all of these needs make sense, and we’d like to learn more about the specifics of your application and what you need from a telemetry data provider. @chris If you have some specific application needs and are up for a chat, please lmk!

My current theory is this may have had something to do with testing the bmdk’s creation of files on the buoy’s SD card. For some tests I was cd-ing to a subfolder of the bm directory and cat-ing the contents of the file I’d created.

Perhaps the “bridge dfu” command expected me to be in the same working directory as the new firmware file, and that was triggering an opaque “invalid param” error rather than a more descriptive “file not found” type of thing. Still, I don’t know that this fully explains it since the buoy was restarting every time I pulled/replaced the card to try a new firmware, which required reconnecting the terminal session, and surely that would put my CWD back at the root level.

I did get to a point where it stopped happening, so it’s no longer causing me grief.

Hm. Maybe?? Again, I found my workaround (always power the bmdk with that specific pair of terminals) so I stopped troubleshooting.

On a loosely related note, my reading of the DevKit’s Guide 2 very strongly suggests that connecting a single usb-c cable to the buoy should make both the Spotter and the two Bristlemouth ports show up in the python terminal list of ports, but with our hardware this was only possible to achieve with two usb-c cables, one each connecting my laptop to the buoy and the devkit.

It’s not a huge inconvenience but either my hardware doesn’t work the same way as the stuff used when writing that Guide, or the Guide could use a little tweaking to make it clear that two usb-c cables were at play in the example.

I’m drawing a blank on the term “Ebox”… but for the most part, my setup was this: buoy connected to laptop by usb-c and connected to bmdk by smartmooring. Bmdk connected to laptop by usb-c. No additional/external power supplied to buoy or bmdk during testing, although I was leaving the buoy plugged in to a wall wart (via usb-c) when not actively testing. Every time the buoy restarted, the bmdk appeared to restart as well. Not sure if that method of “repowering” the bmdk would be enough to allow it to recognize newly-connected laptops on the usb-c. All I can say is the situation appeared to respond differently depending on whether the buoy was connected to the front pair of bm terminals or the back pair.

I’ll skip the rest of the advice (but thanks!) because I’d managed to rewrite my code as character-oriented rather than line-oriented a few days after writing my last message.

Mike J+

Speaking of the Spotter duty cycle configuration - How does one turn that on? Can’t seem to find it in the docs.

@chris There are a few ways to do that. One easy way could be to configure your mote firmware to contain a dwell period in the main loop, either before or after your data parsing code. I’m stronger in C, so I’ve made ample use of delay(), but I bet C++ has something like a sleep() as well. The example code in Guide 5, here, shows the general structure of the loop.
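Something like the sketch below is what I have in mind, assuming the FreeRTOS delay call is available to your app (bm_protocol apps run on FreeRTOS); the timing value is just a placeholder, not a recommendation:

#include <cstdint>

#include "FreeRTOS.h"
#include "task.h"

// Example dwell length - placeholder only.
static constexpr uint32_t kDwellMs = 50u * 60u * 1000u; // ~50 minutes

// Sketch of one loop iteration with a dwell after the sampling work is done.
void loopIterationWithDwell(void) {
  // ... read the sensor, parse, queue data for transmit, etc. ...

  // Park this task for the quiet part of the hour instead of spinning.
  vTaskDelay(pdMS_TO_TICKS(kDwellMs));
}

If you need the Mote to keep doing other things during the dwell, the GPS-time idea in the next paragraph is probably the better fit.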

Another way, which could have less computational overhead (since you might want to do other things during the dwell periods), may be to tie the data collection to the GPS time. I don’t know if that’s in the docs yet, but maybe I can tag in @evan on the new post (Setting Duty Cycle) to help us understand whether there’s an easy way to do that or whether we missed it somewhere in the guides.

@chris I’m also going to make this a new topic because I think this will generally be interesting outside the original post question.

FYI - this post has a link to the guide on managing firmware versions across an entire system: Bristlemouth Technical Docs Are Live! - #6 by evan

Hi @evan and @zachary !

I am trying to add an SST probe to the surface node of our buoy while still using the bottom node with the devkit and SAMI pH logger.

I am stuck on one section of this process. I went through the Notion guide on managing firmware versions (linked above) and found all the firmware versions. It looks like they will need updates. The boot loader is V0.40, the bridge is 0.40, and the spotter FW is 2.9.0.

However, we made custom firmware that allows the buoy to receive information from our SAMI properly. I am worried that changing the firmware on the other parts of the buoy will cause a problem with the SAMI recordings.

When I tried to flash our custom firmware to the temp probe in the surface node it gave back error ffffff.

Any thoughts on how I should go about integrating this?

I also have been following the guide to start updating the dev kit with v0.11.1 and ran into some issues after running bootloader.