Sensor/CNS scalability story, does one exist?


Hey all, this is just an open question for anyone who may have relevant information. Is there a strategy brewing for how sensor modules can, or will, be scaled out design-wise? Currently, PFCs co-locate the central nervous system, and expansion or integration is limited to devices connected directly to the BBB signal board (or host platform). The first issues I see relate to protocol and hardware dependencies:

  1. I2C has limited range and would not be suitable for scaling out a large number of sensors, or sensors separated by longer distances. While using the PCA9615 and TCA9548A as implemented in the PFC-EDU schematics will mitigate some issues, it will not provide full-speed sensor communication beyond 10 ft (per the datasheet).

  2. Sensor networks appear to be dependent on the brain with no way to monitor or manage them in a distributed fashion. This creates a costly single point of failure baked into the design.
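For what it's worth, the software side of the TCA9548A mentioned in point 1 is trivial: the mux has a single one-byte control register that acts as a channel bitmask. A minimal sketch (the function name and the `smbus2` usage are my assumptions; 0x70 is the chip's documented default address):

```python
# Hypothetical sketch of driving the TCA9548A I2C multiplexer: its single
# control register is a bitmask where bit N enables downstream channel N.
# The function name is mine; 0x70 is the chip's default I2C address.

TCA9548A_ADDR = 0x70  # default; adjustable to 0x70-0x77 via the A0-A2 pins

def channel_mask(channel: int) -> int:
    """Control-register value that enables only the given channel (0-7)."""
    if not 0 <= channel <= 7:
        raise ValueError("TCA9548A exposes channels 0-7")
    return 1 << channel

# On Linux (BBB/RPi) with the smbus2 package, selecting channel 3 before
# talking to the sensor behind it would look roughly like:
#   with SMBus(1) as bus:                       # /dev/i2c-1
#       bus.write_byte(TCA9548A_ADDR, channel_mask(3))
#       ... then read the sensor at its own address as usual ...
print(hex(channel_mask(3)))  # prints 0x8
```

The mux only solves address collisions and fan-out, though; it does nothing for the range problem, which is why it gets paired with the PCA9615 differential buffers.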

I imagine container farms would have some other method of distributing sensors and sensor integration. Is this something that has been discussed before?


Hello @Drew, good questions. We have two boards in use now: the all-in-one PFC_EDU board, and a second one we use in our larger machines (container, tree computer) that we call the CNS (central nervous system).

The CNS board uses a differential-pair circuit with the PCA chips to extend the range of I2C communication to 50 ft. The CNS board has 8 RJ45 connectors, a multiplexer, voltage regulators, and pins to attach a BeagleBone Black (we also have adapters that allow use of a Raspberry Pi).

The sensor board uses the same temp/humidity and CO2 sensor chips as the PFC_EDU, but adds the differential-pair PCA I2C circuit so we can use an off-the-shelf Ethernet patch cable to connect it to the CNS board. We have other boards we can connect, too: individual LED panels, relays, etc.

So a typical setup on a single rack in our shipping container (holds 12 racks) is:

  • CNS board + BeagleBone
  • I2C air sensor board
  • Four I2C LED panels (water cooled)
  • Four USB cameras
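Purely as an illustration of the topology, the rack above could be described with a small inventory structure like this (all names and fields are hypothetical, not from our repo):

```python
# Hypothetical per-rack inventory mirroring the bullet list above.
# Every field name here is illustrative, not from the actual design files.
rack = {
    "controller": {"board": "CNS", "host": "beaglebone-black"},
    "i2c_devices": [
        {"kind": "air_sensor", "measures": ["temp", "humidity", "co2"]},
        # four water-cooled LED panels, one per mux port
        *[{"kind": "led_panel", "cooling": "water", "port": p} for p in range(4)],
    ],
    "usb_devices": [{"kind": "camera", "index": i} for i in range(4)],
}

# Quick sanity checks on the inventory.
assert len(rack["usb_devices"]) == 4
print(len(rack["i2c_devices"]))  # prints 5 (1 air sensor + 4 LED panels)
```

With ~12 racks per container, a structure like this per rack is all a supervisor process would need to enumerate what to poll.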

I’ll have to ask our electrical design team which is the latest version of the CNS board (I think it is v3.0). Here is the link to the Eagle files in our repo:

That GitHub repo has all the current designs for all our boards.



Thanks @rbaynes, I appreciate the detailed response! Any chance you could briefly explain why I2C won out against RS485 for this application? I bet there’s a reason, and I’d like to know it before I reinvent the wheel using some flavor of RS485 for my own applications.


I’m the software guy, @Drew, so I wasn’t involved in those electrical design decisions. But if I had to guess, I’d say it’s because I2C is built into many of the micros we use (BBB, RPi), and into many of the sensors we like to use.

@jake or @Poitrast would know all the details.


@Drew, I2C ‘won’ mainly due to sensor availability and ease of use. I don’t think anybody is averse to RS485, and there are actually plans to use it on a future system (the tree computer) that will need to interface with PLCs for safety monitoring (e.g. over-temp shutoffs). I second your concern about throttling the data rate at long distances, but in practice this hasn’t been an issue. The max range we typically see is ~10 ft, and it occurs on a per-rack basis: each rack in the shipping container (about 12 racks per 40’ container) has a BeagleBone + CNS + sensors using the PCA chip, and the wiring on a single rack doesn’t seem to exceed 10 ft.

For cases where long distances are frequently encountered, I anticipate the strategy will be either to provision a new device instance closer to the peripheral, or to go the Bluetooth-mesh route with a device driven by a coin cell (no wires, yay!). I know Particle has some great new modules out that the team has been eager to play with, but we haven’t had a chance to dig in too deep yet.

The big debate, a fleet of low-cost / low-computation distributed sensor nodes vs. a centralized high-cost / higher-computation processor connected to multiple nodes, typically ends with the conversation around cameras: the processing required for localized analysis (e.g. leaf detection) and advanced imaging (e.g. depth / thermal). That decision is mainly a function of building containers for a research context, where data prospects typically win over cost. If you are looking at this from a production standpoint, that assumption likely won’t hold (at least until the value of the data being collected can be better defined).

Hope this helps clarify things… If I missed anything, @Poitrast has thought about all this too.

It sounds like you have an interesting application, would love to hear more about it!