Considering alternatives to the Raspberry Pi and Arduino


#1

Greetings fellow nerd farmers,

As many of you are aware, the food computers have been using a Raspberry Pi and an Arduino to interface with sensors and actuators, run recipes, and store data. They have served us well so far; however, we have seen a few downsides with them in terms of reliability and complexity.

The Raspberry Pi’s file system is structured in such a way that the operating system easily becomes corrupted after a few months of use. The communication link between the Arduino and the Raspberry Pi adds complexity by requiring code to be generated in order to change the device configuration once deployed.

Moving all the sensors to the Raspberry Pi would be a good way to reduce complexity, but the operating system corruption issue remains a concern without a straightforward solution.

We have been toying with the idea of moving everything to a BeagleBone or another single-board computer, but it would be great to get more ideas and feedback on this before committing to a new development path.

So, what do you all think?


#2

I just switched all my stuff to Particle Photon in the past several days (https://github.com/wsnook/farm) after a couple of weeks of contemplating the potential for issues like you’re describing. I’m still using a Raspberry Pi to control it, but I’m not trying to do timing-sensitive stuff on the Pi.

Based on what I read, and on my own evaluation of the Pi’s design, my sense was that using 1-Wire or I2C kernel drivers on the Raspberry Pi would lead to a less stable system compared to using a Photon. Raspberry Pi GPIO support seems optimized for the time scale of human responses–button presses, LED blinking, relays switching on and off, etc.–rather than the microsecond or nanosecond scale you need for software implementations of inter-circuit serial protocols. The Raspberry Pi’s I2C pins have built-in 1.8kΩ pullups that seem designed for 400kHz short-distance I2C. For longer cable runs inside a food computer, regular slow 100kHz I2C is more suitable. A lot of I2C sensor breakout boards–like SparkFun or Adafruit’s stuff–have their own built-in pullups that would lead to out-of-spec current levels on the I2C bus if wired in parallel with the Pi’s built-in pullups. Using those sensor breakout boards with Raspberry Pi I2C would require delicate soldering to disable all the extra pullups.
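To put rough numbers on the pull-up concern, here’s a back-of-the-envelope calculation. The 4.7 kΩ breakout pull-up value is an assumption (a common choice on breakout boards), not measured from any particular board:

```python
# Back-of-the-envelope numbers for the parallel pull-up issue above.
# Assumed values: the Pi's built-in 1.8 kOhm pull-ups, plus two sensor
# breakout boards each adding a typical 4.7 kOhm pull-up (an assumption;
# check your actual boards), on a 3.3 V bus.

def parallel(*resistances_ohms):
    """Equivalent resistance of resistors wired in parallel."""
    return 1.0 / sum(1.0 / r for r in resistances_ohms)

V_BUS = 3.3                      # bus voltage, volts
I2C_MAX_SINK_MA = 3.0            # standard I2C low-level sink current limit

r_eq = parallel(1800.0, 4700.0, 4700.0)   # Pi + two breakout boards
sink_ma = V_BUS / r_eq * 1000.0           # current to pull the bus low

print(f"equivalent pull-up: {r_eq:.0f} ohms")
print(f"sink current needed: {sink_ma:.2f} mA "
      f"(spec limit {I2C_MAX_SINK_MA} mA)")
```

With those assumed values, two breakout boards in parallel with the Pi’s pull-ups already push the required sink current past the 3 mA limit.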

The old Raspberry Pi power supply issues seem to have been mostly resolved, and my sense is the platform can be very stable if you use robust techniques–lots of people are using Pis successfully for embedded and industrial stuff. The catch is that you have to approach it as an embedded system design.

I’ve been running 24/7 data logging on different combinations of a Pi Zero W and Pi 3 for about the last five weeks. So far, I’ve had no problems, but I’m using several techniques to keep things stable and be kind to the SD card:

  1. Use high-quality 2.5A USB power supplies. From what I read, lots of Raspberry Pi problems trace back to inadequate power.
  2. Save my initial camera images to a ramdisk so only my final cropped and scaled images get written to disk.
  3. Use big, high quality SD cards so the wear leveling algorithms have lots of spare flash to use beyond the capacity I need for images and sensor logs.
  4. Change a few things with Raspbian’s default configuration to reduce chatty logging (see https://github.com/wsnook/farm/blob/master/README.md#raspberry-pi-setup)
  5. Use a modest sensor logging interval, and cluster disk writes to happen at the same time. I started with a two minute interval, but that gave me way more data than I needed. Now I’ve switched to 20 minute intervals.
  6. Use a lightweight software stack that’s conservative about logging and other disk activity–ROS and CouchDB are the opposite of this (e.g. frequent API polling that generates HTTP access logs).
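For anyone curious what the write-clustering idea looks like in practice, here’s a minimal Python sketch: readings are buffered in RAM and the SD card sees one append per cluster instead of one write per reading. The sensor fields and log path are made up for illustration.

```python
# Minimal sketch of clustered disk writes: buffer sensor readings in
# memory, flush them as one batched append. Field names and the log
# path are invented for illustration.
import json
import time

class ClusteredLogger:
    def __init__(self, path, flush_every=10):
        self.path = path
        self.flush_every = flush_every
        self.buffer = []

    def record(self, reading):
        self.buffer.append(reading)
        if len(self.buffer) >= self.flush_every:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        with open(self.path, "a") as f:     # one append per cluster
            for reading in self.buffer:
                f.write(json.dumps(reading) + "\n")
        self.buffer.clear()

log = ClusteredLogger("/tmp/sensors.log", flush_every=10)
log.record({"t": time.time(), "temp_c": 21.4, "rh_pct": 55.0})
```

A real deployment would also call `flush()` on shutdown so buffered readings aren’t lost.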

My sense is that OpenAg’s experiences with Arduino and Raspberry Pi are probably not representative of what’s possible with a lighter-weight software stack designed to conserve disk use and offload timing-sensitive I/O tasks to dedicated hardware. In particular, using automatic code generation for firmware seems outside the norm for embedded designs. Arduino sensor libraries aren’t uniformly high quality, and even the good ones can’t just be arbitrarily combined or instantiated multiple times. If you want it to work, you have to read the docs, check all the resource requirements, and balance a lot of constraints from the different components. Also, you can’t trust library and example code for sensors to follow the datasheets or implement robust error handling. If you have serial buses operating at or beyond their rated electrical specs–and the PFC designs very well may–poor error detection and recovery can lead to major headaches.

TL;DR:

  1. Particle Photon is great. It works a lot like Arduino, but it’s faster. It’s easier to flash. And, their ecosystem isn’t burdened with lots of cheap, unreliable clone boards.
  2. Raspberry Pi is great, but it’s questionable to rely on Linux kernel modules to do a good job of high-speed GPIO. Offloading the timing-critical stuff to dedicated processors can be more reliable as long as the firmware is written carefully.
  3. Check out https://github.com/wsnook/farm/blob/master/firmware/photon.ino for an example of what I consider a potentially better approach to sensor and actuator firmware. It’s what I’ve been working on partly for me, and partly for the MVP thing Caleb wants to do.

#3

How about the Edison Arduino?


#4

I’ve used the Particle Photon before and it is quite a versatile platform. My only problem is the cost: 19 USD plus the Particle Cloud fee if you exceed 25 devices and 250K events per month.

If I were to re-do what I’ve done in the past few months, I would use ESP32 (https://www.seeedstudio.com/ESP-32S-Wifi-Bluetooth-Combo-Module-p-2706.html) since it is cheap and quite extensible.

I would also place the growing recipe in a text file that is loaded onto the ESP32 using an SD card.

The ESP32 also needs to have a caching ability so that when the internet connection conks out, it will still log the data and send it when the connection comes back. (I am from the Philippines and there are too many variables to assume the internet always works 24/7.)
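The caching behavior described above boils down to a store-and-forward queue. Here’s a sketch in plain Python (the real thing would run on the ESP32); `flaky_send` is a stand-in for whatever upload call the device would actually use:

```python
# Store-and-forward sketch: queue readings while the connection is down,
# drain the backlog once sends start succeeding again. The send function
# is a placeholder for the real upload mechanism (HTTP, MQTT, etc.).
from collections import deque

class StoreAndForward:
    def __init__(self, send):
        self.send = send          # callable; returns True on success
        self.backlog = deque()

    def log(self, reading):
        self.backlog.append(reading)
        self.drain()

    def drain(self):
        while self.backlog:
            if not self.send(self.backlog[0]):
                break             # still offline; keep the backlog
            self.backlog.popleft()

# Simulated flaky link: the first two sends fail, then the link recovers.
attempts = []
def flaky_send(reading):
    attempts.append(reading)
    return len(attempts) > 2

sf = StoreAndForward(flaky_send)
for i in range(4):
    sf.log({"sample": i})
print(f"backlog after reconnect: {len(sf.backlog)}")
```

On a real ESP32 the backlog would live on the SD card rather than in RAM, so it also survives a reboot.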

I’d stream all the data to CouchDB, hosted somewhere, and if I get enough food computers running locally, I will just spin up an RPi to collect all the data.


#5

Yeah. What I’m doing so far is just using the Photon’s wifi for programming firmware. For data logging and control, I’ve been using a USB serial connection to a Raspberry Pi. I like the idea of not relying on always having a working Internet connection.

One thing that’s nice about using either the Pi or Photon with an Internet connection is that they set their clocks from NTP servers or the Particle Cloud. The Photon has an RTC with a pin for adding a backup battery, and the SDK lets you set the time with code if you don’t have an Internet connection. But the Pi doesn’t have an RTC at all, and adding a good one costs about as much as a Particle Photon.
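For reference, the “set the time with code” step mostly comes down to epoch bookkeeping: NTP timestamps count seconds from 1900-01-01, while UNIX time counts from 1970-01-01. A tiny sketch of the conversion (no network I/O here):

```python
# NTP-to-UNIX time conversion: the two epochs differ by a fixed number
# of seconds (1900-01-01 vs. 1970-01-01).
NTP_TO_UNIX_OFFSET = 2208988800  # seconds between the two epochs

def ntp_to_unix(ntp_seconds):
    return ntp_seconds - NTP_TO_UNIX_OFFSET

# Example: the NTP timestamp for 2017-01-01T00:00:00Z
unix_t = ntp_to_unix(3692217600)
print(unix_t)
```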

How would you handle timekeeping on an ESP32?


#6

The ESP32 has an RTC in it as well, so, similar to the Photon, it just needs a battery. This is in theory; I have not tried it practically. I could check it out next weekend. If I remember the documentation correctly, the ESP32 also has an NTP client for getting the current UNIX time.

For the Particle Electron, we need to remove a 0Ω resistor on the back side to enable the use of VBAT, or else the power from VBAT is used to power the entire MCU. I am not sure if this is the same for the Photon, but basically I assume that if power gets disconnected from the Photon, the entire food computer will not function anyway, and when power comes back on, a time sync is needed to get back to the point where the recipe was running.
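The “get back to the point where the recipe was running” step can be sketched as: persist the recipe start time, then after a reboot and time sync, compute the current phase from elapsed time. The phase names and durations below are invented for illustration:

```python
# Resume-after-reboot sketch: given a persisted recipe start time and a
# freshly synced clock, work out which recipe phase should be active.
# The recipe itself (names, durations) is made up for illustration.

RECIPE = [                 # (phase name, duration in hours)
    ("germination", 48),
    ("vegetative", 240),
    ("flowering", 120),
]

def phase_at(start_unix, now_unix):
    """Return (phase_name, hours_into_phase), or None if the recipe is done."""
    elapsed_h = (now_unix - start_unix) / 3600.0
    for name, duration_h in RECIPE:
        if elapsed_h < duration_h:
            return name, elapsed_h
        elapsed_h -= duration_h
    return None

# e.g. rebooting 60 hours after the recipe started:
print(phase_at(0, 60 * 3600))
```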


#7

When I first started building my own food computer over two years ago, I was convinced that the Arduino YUN was the ideal microcontroller. Having an Arduino Uno and a Linux machine running OpenWRT on a single board was amazing because it was so easy to create a REST API for all the connected sensors (res depth, res temp, pH, EC).

However, as the system became more complicated, I began having issues with the size of the Arduino sketches (it’s no MEGA board). The embedded Linux OpenWRT is also limited, and the processor on the YUN is weak…very stable…but weak. I had to use an Arduino board because I had picked the DFRobot sensors, which only had code examples for Arduino. The DFRobot EC sensor was throwing off the pH readings, so I ended up putting the EC sensor on an opto-isolated SSR… it worked… but ultimately those DFRobot sensors aren’t reliable enough and they degrade quickly. Since I’m not confident in the YUN’s future (they even stopped making them for a while!), I will switch to the Pi and Mega combo for the next version of my grow machine.

Now for the Pi…
I use a Pi for environment monitoring (CO2, temp, humidity), and I connected a Pi camera that takes a picture every hour and creates a time lapse of the whole grow from the beginning to the end of the current day (I haven’t figured out how to auto-post to YouTube). I don’t mind using the Pi for crunching data because the YUN is the stable, minimal machine that does the crucial misting/light control of the aeroponics system.

Now for my actual point…
Instead of worrying about either of these microcontrollers, I have a dedicated cloud virtual machine that the YUN and the Pi talk to. If the cloud machine doesn’t hear from either of them, then I know there’s an issue. Another major point of this setup is that it’s much easier to gain access to the grow system from outside the house without dealing with router port-forwarding nonsense.


#8

I have chosen to use an ESP8266. It’s less than 3 USD, so it would be perfect for an MVP. It’s programmed through the Arduino IDE, so it’s easily reprogrammed by beginners. I’m sending data via MQTT, a well-supported and popular protocol, so you have a wide choice of free online servers for your data, as well as the option to host your own server.


Adafruit IO allows up to 10 feeds, and I’m redirecting the rest of my data to a text feed.
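For anyone following along, the message formatting for those feeds looks roughly like this. The topic naming follows Adafruit IO’s `username/feeds/feed-key` MQTT convention; the username and feed key below are placeholders, and the actual publish would go through an MQTT client such as paho-mqtt (not shown, so the sketch stays self-contained):

```python
# Sketch of the MQTT message-formatting side. Only the topic and payload
# construction are shown; publishing would use an MQTT client library.
# The username and feed key are placeholders.
import json
import time

def feed_topic(username, feed_key):
    # Adafruit IO's MQTT topic convention: username/feeds/feed-key
    return f"{username}/feeds/{feed_key}"

def encode_reading(value, timestamp=None):
    # A bare value also works; a JSON payload carries a timestamp too.
    return json.dumps({"value": value,
                       "created_epoch": timestamp or int(time.time())})

topic = feed_topic("grower", "air-temperature")
payload = encode_reading(22.5, timestamp=1500000000)
print(topic, payload)
# With paho-mqtt, the publish would be roughly: client.publish(topic, payload)
```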

If one wanted a camera, I’d suggest going with the Pi Zero and leaving it as an optional and standalone feature.


#9

I like the idea of making a Pi Zero camera tool that could optionally be used alongside of other systems for data collection and sensor/actuator control loops. Potentially, someone who didn’t want to buy a Pi Zero and camera could just take pictures with their phone and upload them to wherever the Pi camera pictures would go.


#10

MVP Software v 0.001
The problem I keep staring at is scheduling. There are a lot of cheap options for collecting sensor data, but the problem I keep seeing is scheduling the periodic collection of that data, especially when I want different intervals for different sensors (a camera picture every 4 hours, a temperature recording every 2 hours, a temperature reading to adjust the thermostat every 2 minutes). Coding in waits/delays (which most example code uses) becomes a nightmare after the second sensor, and many of these packages lack a scheduler (like Linux’s cron).
The direction I am currently looking at is a hack, but it is simple and gets me started.

  1. Cron is my scheduler; it comes with the Raspberry Pi.
  2. Pictures are taken by a one-line shell script that writes to a directory. Cron-scheduled.

raspistill -vf -hf -o /home/pi/picam/selfie.jpg

  3. Temperature/humidity will be handled by the Particle, using an HTTP client called from a curl shell script (via cron). There will be two parameters: log temperature and update thermostat. As an extra, I will probably set two digital pins to the temperature and humidity for remote reading. Logging will be to a text file.

This is by no means ready for prime time, but I think it will minimally work and I want to get something started. This also lets me get my feet wet with HTTP, which has potential future uses for webhooks and pushing data to Amazon’s AWS or the existing CouchDB.

#11

Are you asking for advice on what to do? If so, a classic way to handle this sort of thing is to organize a program around an event loop in your main thread or process with a concurrent thread or process to generate events at the scheduled times. That’s what my farm controller does to take pictures and sensor readings at 10, 30, and 50 minutes past the hour. If you’re familiar with setTimeout() in javascript, it works on the same principle.

On an Arduino-like microcontroller, you can get pseudo-concurrency for the timer and event loop dispatcher functions by calling both of them from loop().
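A sketch of that timer-plus-event-loop idea in Python, using the 10/30/50-past-the-hour schedule mentioned above. The scheduling math is separated from the dispatch so it’s easy to check; in a real controller the main loop would sleep until the computed time and then fire the camera/sensor events:

```python
# Event-scheduling sketch: compute the next scheduled minute mark
# (10, 30, 50 past the hour), which a main loop would sleep toward
# before dispatching its events.

MARKS = (10, 30, 50)  # minutes past the hour

def next_fire(now_unix):
    """UNIX time of the next scheduled mark strictly after now_unix's minute."""
    minute_of_hour = (now_unix // 60) % 60
    hour_start = now_unix - (now_unix % 3600)
    for m in MARKS:
        if minute_of_hour < m:
            return hour_start + m * 60
    return hour_start + 3600 + MARKS[0] * 60  # first mark of the next hour

# e.g. at 37 minutes past the hour, the next event lands at minute 50:
t = 5 * 3600 + 37 * 60
print((next_fire(t) % 3600) // 60)
```

On an Arduino-like board, the equivalent is comparing `millis()` (or an RTC reading) against the next fire time on every pass through `loop()`.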


#12

@webbhm Will’s suggestion is one of many ways to ensure tight scheduling. @jyauxi also mentioned using an NTP client for obtaining UNIX time, so any decent WiFi-enabled system (ESP8266, ESP32, etc.) will have accurate time and the ability to schedule events for certain times.

If you don’t want to implement event-driven or procedural C programs, the ESP gives you the option to run NodeMCU, a Lua-based firmware similar to XTOS. It has built-in task scheduling and background functions for more complicated projects.

Also @wsnook, I stumbled across ArduCam today. Although I think a Pi Zero and camera add-on is still worth developing, an SPI camera with a cheap microcontroller might make for a cheaper MVP.


#13

I just looked at some of Arducam’s stuff on Amazon, and I like how they mostly seem to use changeable lenses. One of my Raspberry Pi camera problems was adjusting the manual focus–I could do it, but it wasn’t easy. Also, it had a lot of resolution but a wide field of view, so I ended up cropping and scaling well over half of the pixels away. A lower resolution camera matched with the right lens might have been a better choice.

How do you feel about the number of pins, flash, RAM, etc. that would need to be dedicated to the camera? How would you handle moving the image data around, especially if wifi or cloud connectivity wasn’t always immediately available?


#14

Thank you all for the great feedback; many great ideas have been floated. My first sense is that we need to clarify the distinction between the PFC 2.0+ and the PFC-MVP. While they pursue a similar end objective, designing a few-hundred-dollar device oriented toward consumers and a few-thousand-dollar device oriented toward research involve different constraints.

I don’t want to dig too much into the PFC-MVP right now, but it makes sense for a cost-constrained device to use a lower-cost processing unit. Something like the Particle Photon could work quite well. I have used it in past projects and it is a really nice device to develop on. Flashing over the air is magical, and having a backend solution within minutes of development time is a huge win for a lot of cases. I do see concerns with image acquisition; however, there are solutions, as previously mentioned. Counter to that, the ESP32 could achieve a lot of the same features with a bit more development work. But the selection of the processor for the MVP should move to a different thread.

As for the PFC 2.0+, having a processor that runs Linux is a huge win. Being able to connect USB devices, run multiple event loops, and have an operating system raises the ceiling of the device quite a bit. I can easily imagine a future where we connect a low-cost USB spectrometer to the bot that feeds the system more data than a microcontroller could handle. For this reason, a processor that runs Linux is a requirement for the PFC 2.0+.

The next major concern is the timing-critical processes. @wsnook makes a great point that “it’s questionable to rely on Linux kernel modules to do a good job of high-speed GPIO. Offloading the timing-critical stuff to dedicated processors can be more reliable as long as the firmware is written carefully.” While most of the processes in our system right now are not timing critical, reliably actuating the dosing pumps needs to be accurate to a few milliseconds (real-ish time). I would be curious to see what timing accuracy we could get from a Raspberry Pi alone.
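As a quick probe of that question, one could measure how much `time.sleep()` overshoots under ordinary (non-real-time) Linux scheduling. On an idle system the overshoot is typically well under a millisecond, but it isn’t bounded, which is the core of the real-time concern. A minimal sketch:

```python
# Measure worst-case sleep overshoot as a rough proxy for userspace
# timing jitter on a non-real-time kernel. Not a rigorous latency test,
# just a quick probe one could run on a Pi.
import time

def sleep_overshoot_ms(target_s, trials=20):
    worst = 0.0
    for _ in range(trials):
        start = time.monotonic()
        time.sleep(target_s)
        overshoot = (time.monotonic() - start - target_s) * 1000.0
        worst = max(worst, overshoot)
    return worst

worst = sleep_overshoot_ms(0.01)
print(f"worst overshoot over 20 trials of a 10 ms sleep: {worst:.3f} ms")
```

A loaded system (or chatty background daemons) will push the worst case up, which is why a dedicated microcontroller or a PRU is attractive for the dosing pumps.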

However, looking forward, as we pursue the research line of the device, it is likely that more timing-sensitive modules will be added. This leads to searching for a system with both Linux and real-time processing abilities. This can be solved either with a single-board computer (like a Raspberry Pi) plus a microcontroller (like an Arduino or Photon), or with a single-board computer that has a programmable real-time unit (like the BeagleBone). Using one board instead of two seems simpler to me. Using the BeagleBone also removes the file-system corruption concerns.

Other important factors to consider are the communities surrounding each device. The Raspberry Pi has the largest, but the BeagleBone also has a thriving one. Another point for the BeagleBone is that its board files are open source, so moving to a single integrated board (instead of board + shield) is possible.

For these reasons, the BeagleBone Black Wireless seems like an optimal solution for the research line of the PFC.


#15

Are you sure you are not grounding the Raspberry Pi before you have really given it a chance to fly?
@jake
Emlid makes the Navio2, an autonomous flight controller built on top of a Raspberry Pi. They offer a build of Raspbian with a real-time option (free download). Not only that, but they are using ROS as part of their architecture. Flight controllers place a lot of demands on real-time processing, where the consequences are more than software crashes.

Before giving up on the Raspberry Pi and switching to the Beagle, I think you ought to review this option. It would appear that all you would have to do is change the kernel, and use your existing PFC code on top of that.


#16

I noticed the BeagleBone Black Wireless appears to be wireless-only. I personally would always want the option to base my grow system’s network topology on wired Ethernet. One can rattle off many reasons for this having to do with security, reliability, and user-support overhead. One other reason comes to mind: I foresee a future where all sensors and actuators are Internet-connected devices. In that world, I would want to wire the food computer’s local LAN and would want the top-level controller (i.e. the BeagleBone) wired into that LAN. One can, however, also imagine a future where everything is wireless, so maybe I just need to chill out.


#17

I agree the option for a wired connection has strong advantages. The beaglebone black wireless has the same pinout as the beaglebone black (w/ethernet) so the same cape / shield should work for both cases. Also if needed, we could use a usb ethernet adaptor. I too share your dream to have all sensors and actuators as internet connected devices. :sunglasses: