@webbhm I’ve got Photon firmware that can read a DS18B20 (°C), Si7021 (°C, %RH), and BME280 (°C, %RH, hPa) over at https://github.com/wsnook/farm/blob/dev/firmware/photon.ino. It can also turn a couple of LEDs on and off–I intended those for use either as notification lights or as pins that could be repurposed for controlling relays. The whole thing is controllable with a human-readable interactive shell protocol (i.e. a REPL) over the serial port, and it can easily be adapted for automation in Python, Go, etc.
I have a Photon coming. Good to hear your evaluation.
I’m not sure if you meant that as a question. Assuming you did, I’ve been very happy with Photon–more so than Arduino. One example of the difference is that if I restart my mac with the Arduino plugged in, something about its interaction with the USB port prevents my mac from booting–not good. I can restart my mac with the Photon plugged in, and everything works fine. The web IDE and wifi flashing are also very convenient, and the 120MHz ARM Cortex M3 processor, large flash, and large RAM mean that I don’t need to be too concerned about optimizing everything or being stingy about including descriptive error messages.
In this context, the key things I want from a microcontroller platform are reliability (well designed hardware) and the potential for repeatable success by people who don’t have a background in programming microcontrollers. Basically, I want a high probability that the microcontroller will just quietly do its job and not be an ongoing source of problems. Photon fits the bill.
[edit: Another big bonus is that Photon doesn’t require downloading dubious serial port drivers. The USB serial port just works. In fairness, this is also true for Arduino Uno and Mega. I mention this to point out why I’m not a fan of super-cheap alternative platforms that save cost by using low-end USB serial chips. I’m not okay with bypassing macOS security to install kernel drivers from mysterious websites. The potential for opening the door to malware–or just a less stable kernel–is too high.]
Thanks for the Particle code example. One of my big worries was that I saw digital and analog pins, but nothing for I2C (i.e. the Si7021 sensor); your example shows that is covered.
I like their pub/sub architecture, which has similarities to ROS. I need to dig deeper to see if that can be done on one device, or requires separate devices (one pub, the other sub). I am getting the Raspberry beta and will see what I can do with that platform. There are a couple of examples of fan controllers, and with the connection to Google there is data storage. Looks like (in theory) my MVP architecture needs are covered. Now just need to get a prototype working.
The one down side with Particle is being proprietary and associated fees, but I think that is just the cost of getting into IoT early on. I suspect that in a year there will be open source solutions that are doing the same thing.
I have been following this MVP initiative with interest. As @wsnook mentioned a while ago as a motivation to create farm, the FCs are pretty complex both hardware-wise and software-wise. Yet an MVP approach usually implies radical simplification. Over the three original posts and threads, it seems to me that this could go way farther into stripping the FC down to an MVP, and then (perhaps) increment with the community.
I have worked on some kind of MVP for months, before eventually finding OpenAg last December—Happy New Year it was. This device costs about $80 all included, hand-crafted, and pretty limited. Yet $80 can help get started and have more people involved. Here is a quick presentation, just as an example of a “more radical” first step. I write now, as it seems to contribute to the direction of this thread.
- Goal: Super simple and cheap stripped-down FC (it may not qualify as an FC…).
- How to cut cost “radically”? Really select readily available materials. The goal is simplicity and cheapness, so there will be compromises on quality and physical properties–fine for toys to get hands dirty. Also, instead of targeting 6 plants at once, 1 is enough—cost drops almost proportionally.
- Hardware (current):
- A 2L PET bottle
- Raspberry Pi 3
- Temperature and humidity sensor (DHT11)
- As many LEDs as you want and can cable to the RasPi (I just use two in the simplest setting).
The count adds up to a bit less than $80 (including cabling, resistors, etc.).
Assembly becomes quite simple, and somewhat modular actually. My trials use just the PET plastic, rubber bands, and a bit of glue to support a mini breadboard. I can assemble a new one in about 10 minutes.
The software consists of a CRON-based data logger and “decision maker” (super simple, based on the locale night time), and a Unix daemon to control LEDs. Logger and daemon communicate on a Unix socket. Data is collected at a fixed interval, and saved locally. I transfer the data periodically for review (rsync, etc.). I can share this code if there is interest—I would just need a few days to put it on a presentable Github repository.
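For anyone curious, the logger/daemon split described above can be sketched in a few lines of Python. This is a hypothetical minimal version, not the actual Mark0 code–the socket path and function names are made up:

```python
# Minimal sketch of the cron-logger / LED-daemon split over a Unix socket.
# Hypothetical names and socket path -- not the actual Mark0 code.
import os
import socket

SOCK_PATH = "/tmp/led-daemon.sock"  # hypothetical location

def set_led(on):
    # Stub: a real daemon would drive a GPIO pin here (e.g. via RPi.GPIO).
    print("LED", "on" if on else "off")

def led_daemon():
    # Daemon side: accept one-line commands ("on"/"off") on a Unix socket.
    if os.path.exists(SOCK_PATH):
        os.remove(SOCK_PATH)
    srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    srv.bind(SOCK_PATH)
    srv.listen(1)
    while True:
        conn, _ = srv.accept()
        cmd = conn.recv(64).decode().strip()
        if cmd in ("on", "off"):
            set_led(cmd == "on")
        conn.close()

def send_command(cmd):
    # Logger side, run from cron: connect, send one command, exit.
    c = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
    c.connect(SOCK_PATH)
    c.sendall(cmd.encode())
    c.close()
```

The nice property of this split is that the cron job stays stateless–it just fires a command and exits, while the daemon owns the GPIO state.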
Such a “radical” approach may go too far with respect to the noble goal of the MVP. I think it should be an incremental endeavour, though. The price tag and relative ease of assembly/hackability may make this appealing. How does it sound?
@_ic I like your radical simplicity description. It sounds like the sort of thing Caleb mentioned doing early on. If nothing else, I think it would be great for everybody who wants to build a food computer to first go through the process of actually growing something using truly simple methods. It’s important to understand that fancy gear can help you achieve certain goals, but it’s not necessary for growing plants. Based on comments I read here on the forums, there’s a non-negligible risk of people cargo culting the food computer concept if they don’t know anything about plants to begin with.
@webbhm I didn’t follow what you’re getting at with the pub/sub stuff. The way I’m using my Photon is strictly as a USB serial device for local communication with a Raspberry Pi or my mac. I just use the wifi cloud stuff for flashing new firmware–that’s free, and Particle provides offline alternatives. I’m not clear if you meant you intend to try running my code on Particle’s Raspberry Pi beta–if so, that’s not likely to work for I2C or 1-Wire because of the hardware differences compared to Photon.
I will keep you posted. I am checking out the limits of the Particle Raspberry beta. It would be nice if everything could be on one board, either now or in the future (camera, sensors, code, data storage). I hope to get into some programming next week (beyond controlling an LED from my phone!). This is beta, and the pin mapping (https://docs.particle.io/datasheets/raspberrypi-datasheet/) looks a bit rough. I2C is the first thing I want to check out; it had problems late last year but is supposed to work now. After that I want to look into the camera integration. Blynk has a nicer looking interface, but this is functional. At the moment Particle is checking off the items on my requirements list, but I still need to see it in action.
Looking a little deeper, the pub/sub appears to be an integration with Webhooks (https://docs.particle.io/reference/webhooks/). This is fairly fancy, as it assumes the ability for multiple independent devices to communicate, which may be overkill for a single Raspberry; but it does give logic independence. Again, this is something I need to investigate before making a decision.
@webbhm I’d offer some words of caution about the way it sounds like you’re approaching this. It’s worth thinking about the difference between “is capable of communicating with I2C sensors” compared to “does rock solid I2C communication”.
If you try to just check the boxes for individual capabilities without weighing the quality and compatibility of their implementations, you run a high risk of mysterious, hard-to-debug problems down the road. IMO, that’s how the current challenges with debugging the PFC2 software stack came about. Beyond that, specifications of components have to be matched. For instance, the Raspberry Pi has built-in 1.8kΩ I2C pullup resistors on its I2C pins–good for 400kbps fast-mode I2C, but less good for 100kbps standard-mode I2C breakout boards. Pullups that small have consequences for sensors meant for standard-mode I2C–to use them, you might need to disable the built-in pullups on your breakout boards (the parallel resistance needs to stay under I2C’s 3mA limit for pullup current).
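To make that arithmetic concrete, here’s a quick back-of-envelope calculation. The 3.3V bus voltage matches the Pi; the 10kΩ breakout pullups are a typical value I’m assuming for illustration:

```python
def parallel(*resistors):
    """Equivalent resistance (ohms) of resistors wired in parallel."""
    return 1.0 / sum(1.0 / r for r in resistors)

def pullup_current_ma(v_bus, r_equiv):
    """Worst-case current (mA) a device must sink when pulling SDA/SCL low."""
    return v_bus / r_equiv * 1000.0

# Pi's built-in 1.8 kohm pullups alone, on a 3.3 V bus:
print(round(pullup_current_ma(3.3, parallel(1800)), 2))  # ~1.83 mA: fine

# Add two breakout boards, each with its own (assumed) 10 kohm pullups:
r = parallel(1800, 10000, 10000)
print(round(pullup_current_ma(3.3, r), 2))  # ~2.49 mA: creeping toward the 3 mA spec limit
```

The point is that every breakout board you stack on the bus lowers the equivalent pullup resistance, so the sink current climbs toward the spec limit faster than you might expect.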
Sacrificing reliability of components isn’t worth it, and skimping on board and cable level electrical design considerations isn’t worth it. It’s easy to fall into traps of seeking superficial conceptual elegance by over-emphasizing component count or the importance of software at the top levels of the stack. If you can’t trust that foundational components of the system are highly likely to do their jobs, integrating them into a more complicated system becomes a nightmare. That’s not to say low component count isn’t achievable–you just have to be aware of the tradeoffs you’re making and the level of unreliability you can tolerate.
[edit: As an example of cabling considerations, if you use a Pi camera, your Pi will need to be close to the camera, and the camera will need to be positioned to get a good angle on your plants–probably a foot or two away from them. Also, the Pi and camera give off meaningful amounts of heat. If you use Pi I2C, you’ll need to get the sensors away from the heat and near the plants. That means potentially a few feet of I2C cable–long enough that you need to pay attention to capacitance, pullups, and bus clock speed. If you use USB to a Photon and inches of I2C cable from the Photon to the sensors, you can easily get plenty of camera-to-environmental-sensor separation without the need to worry much about potential I2C electrical signaling problems. This is why I decided to use Photon instead of staying on the path of Pi I2C.]
[edit 2: What do you think about maybe doing everything on the Photon and relying on their cloud API or Blynk for communication? I haven’t evaluated this carefully, but it seems potentially more reliable than trying to do everything on a Raspberry Pi. The Photon has a built in RTC (real time clock), so with a battery backup, it could potentially do better than a Raspberry Pi for keeping accurate time with unstable power and network connectivity. Image logging would be out, but that’s an easy $50 saved on the BOM ($30 Pi + $30 camera + $10 sd card - $20 Photon) if you’re willing to give it up.]
[edit 3: This also speaks to the old question of, how important is image logging, and should it be a requirement for the MVP? I don’t know the answer.]
This is all wait and see for me. The nice thing about Particle is that I can switch boards and not lose the logic (if the board has the capabilities). At the moment I have a Pi to play with, and I don’t have a Photon, so I am starting development with what I have; but I am open to changes if/when the issues you bring up present themselves.
I want to get something built and see how it looks before making any decisions. There will be trade offs (camera, sensors, boards), but I want to get something up and running (and check stability) before trying to fine tune the details. It is the things I don’t know yet that concern me more than the ones I do.
If you need help with software stuff, particularly at or near the firmware level, feel free to ask me. No promises, but maybe I’ll be able to do something useful. If you want something not involving go (like my current stuff)–perhaps more python-ish–I’m open in principle to helping explore other approaches besides the one I’ve been pursuing.
@_ic Thanks for contributing! I really like your focus on a “radical” approach to the MVP. One thing we all need to keep in mind is that even the PFC V2 isn’t perfect yet. Without actuation of CO2, or ways to control the root zone temperature, there are still major improvements to be made. I only bring this up because I want to make the point that no one really knows what a “Food Computer” will end up being yet. That being said, OpenAg has already made some decisions about what they think a “Personal Food Computer” should be (V2 PFC); the question now becomes, do we agree?
I brought this point up to someone here locally this week. They intend to use the MVP for education in elementary/middle school classrooms this summer. A huge part of that hands-on learning experience is harvesting plants, and to an 8 year old one lettuce plant isn’t nearly as exciting as a dozen. @webbhm reminded me that in open field research the plants on the edges of the field are excluded from datasets because they are typically outliers. If we end up using a type of light that is more of a bulb form, and less distributed like the V2 we may find that this makes a real difference where the plants sit in relation to the light source. All of that being said, when the time comes that we develop a Food Computer for Peppers, or Tomatoes, we will definitely not be building it for a dozen. In all sincerity I’m really not sure about this one, I think you make a great point, I would love to see your MVP if you developed it for just one lettuce plant.
Could you please post a picture of your build? Also, do you have a BOM? If not, would you mind at least telling me what LEDs you are using, and have you had success thus far?

[quote=“wsnook, post:26, topic:1893”]
Based on comments I read here on the forums, there’s a non-negligible risk of people cargo culting the food computer concept if they don’t know anything about plants to begin with.
[/quote]
I couldn’t agree with you more. We want to make sure we don’t go the same route as Tower Gardens or Freight Farms, promising people success without any effort. I believe half of the educational value of a PFC comes from the process of building it. Farmers have always fixed their own equipment, and I don’t think the future is going to be any different. I have learned an incredible amount just by killing plants. When I first got started I would use a simple $5 set-up with lights from Home Depot.
Having individual reservoirs allowed me to experiment with different nutrient concentrations. I started to learn what it meant to “stunt” a plant’s growth. I was using an air pump at the time, and sometimes when the water level dropped faster in one reservoir than another, the variation in pressure would deprive another reservoir of oxygen. This taught me what “root rot” was, something almost anyone who has grown hydroponically is all too familiar with. I experimented with different types of nutrients, teaching me what nitrogen deficiencies look like. I also learned that if I didn’t keep a fan on the plants, the stems would become weak and fall over, causing them to rot and die (a big problem with lettuce seedlings).

It was this curiosity and hands-on learning that caused me to dive deeper and deeper into what was required to make one of these systems work. Due to my inexperience I was sort of forced into using simple materials, and I didn’t have access to a real workshop. I then developed “Oasis,” which was my best effort at finding the cheapest possible way to grow a pepper plant indoors. I wanted to try peppers because I knew that they required much more intense lighting, as well as a transition from vegetative to flowering, and I was very curious about that process. It was also a way for me to practice pruning, something I think is critical in indoor growing (especially at the hobbyist level). Through pruning I pushed my peppers to have as many as 12 primary stems, and one plant would fill up my entire grow box. I could have just read about the value of pruning peppers in a paper and learned that four-stem peppers double the yield of single-stem peppers. But, for me, there was something really exciting about coming home every day and seeing how the plants had changed.
@wsnook pointed out the value of truly understanding the science behind plants. I agree–it’s all too easy to get caught up in thinking that there is one answer to how to grow a plant indoors and it’s a Food Computer, which couldn’t be further from the truth. Everything I did above, I did without any sensors. That being said, I have a background in databases, so I knew there was value in data. I would take daily measurements of EC, pH, and temperature, and record my findings in an Excel sheet.
It’s undeniable that I learned quite a lot from my experimentation; perhaps most importantly, I started to understand the value of a “recipe” and creating the ideal conditions for a plant. All of this was done without any sort of “computer” integration in my boxes. In my personal opinion, what I did above does not fall under the definition of an MVP PFC, because even though I grew food hydroponically, in a container, I did so without any “computer” integration and therefore no form of reliable data collection.

I think that right now it is very easy for us to be focused on the hardware, and we’re starting to be more focused on the software, but we still haven’t really shared any data. I would love for this MVP to not only give us a way to share our system designs and software with each other, but our data as well. That is something that really hasn’t been accomplished visibly by anyone yet, but it is perhaps one of the most exciting parts of OpenAg to me. One of the hardest parts will be measuring our success–after all, who cares if I share a recipe that doesn’t work? This is where I think the camera will prove our most valuable asset in the short term, even if it’s in a very manual “proof that I grew this in this box with these settings” sort of way.

To start us on the minimal path to computing, I think temperature sensors are the simplest/cheapest method for us to establish automated data logging, and a fan (thermostat) is the simplest/cheapest form of actuation. All other aspects of the MVP really are about creating a successful indoor hydroponic system with a light source. The rest of the “recipe” we can try to control manually, helping each other to fill in the blanks. For example: I have access to a PAR meter and will be testing the lighting outputs of the MVPs we develop locally.
Even if we can’t all measure our PAR, we can help each other by saying that if you use this box a plant will have ___ PAR at 12 inches from the light, and ___ PAR at 6 inches from the light. This would enable someone to begin experimenting with different plants, and the impact of changing the PAR, without even having a PAR meter.
I plan to post a survey on this thread tomorrow, since the activity has primarily been here. I will also be revising all three of my first posts to reflect what I feel the community’s opinions have been. PLEASE don’t hesitate to point out if you feel I’ve overlooked something or made a false assumption/assessment.
$300 Food Computer
Can peppers be grown indoors?
These aren’t technically from a “food computer”, but check out my CC BY-SA datasets that are up on GitHub:
Here’s my choice for software and hardware. Considering alternatives to the Raspberry Pi and Arduino
$3 ticks the “radically cheap” box. I’m using a DHT11 on my prototype. The pub/sub stuff sounds like MQTT. You don’t have to use paid servers–there is a lot of support available for rolling your own broker on a Pi (Mosquitto), though a free service might be sufficient for the MVP. Since MQTT is supported on very cheap hardware, we can offload device-side processing, negating the need for a Pi.
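To show how little code the MQTT route takes, here’s a sketch in Python. The topic scheme (`mvp/<device>/<measurement>`) is something I made up for illustration; the commented-out usage assumes the paho-mqtt client and a Mosquitto broker running locally:

```python
# Sketch of publishing sensor readings over MQTT.
# Topic scheme is hypothetical: mvp/<device>/<measurement>
import json
import time

def make_message(device, measurement, value):
    """Build an MQTT topic and JSON payload for one sensor reading."""
    topic = "mvp/{}/{}".format(device, measurement)
    payload = json.dumps({"value": value, "ts": int(time.time())})
    return topic, payload

def publish_reading(client, device, measurement, value):
    # qos=1 asks the broker to acknowledge delivery; retain=True makes
    # the latest reading available to late-joining subscribers.
    topic, payload = make_message(device, measurement, value)
    client.publish(topic, payload, qos=1, retain=True)

# Usage (requires `pip install paho-mqtt` and a running broker):
#   import paho.mqtt.client as mqtt
#   c = mqtt.Client()
#   c.connect("localhost", 1883)
#   publish_reading(c, "pet-bottle-1", "humidity", 62.5)
```

The retained-message trick is handy for dashboards: a UI that subscribes after boot still sees the last known value immediately.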
@Webb.Peter In terms of the frame and skin, can you summarize our plans above and update the MVP Wiki with your choices?
I have not yet cleaned up the code, but you’ll notice it is pretty simple (the whole logic is about 200 lines). The timezone is static, and you should change it so that the LEDs turn on at the “right” time: https://github.com/ic/mark0/blob/master/collect.py#L144
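As an aside, the day/night decision itself only takes a few lines. Here’s a minimal sketch–the default on/off times are placeholders, not the values in collect.py:

```python
# Decide whether grow LEDs should be lit, given a wall-clock time.
# Default photoperiod times below are placeholders for illustration.
from datetime import time

def lights_on(now, on=time(6, 0), off=time(22, 0)):
    """True if the LEDs should be lit at wall-clock time `now`.

    Handles photoperiods that wrap past midnight,
    e.g. on=18:00, off=10:00 the next morning.
    """
    if on <= off:
        return on <= now < off
    return now >= on or now < off
```

For example, with the defaults `lights_on(time(12, 0))` is True, while with a wrapped schedule like `on=time(18, 0), off=time(10, 0)`, 2 a.m. counts as lights-on.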
@Webb.Peter Thank you for your feedback.
Yes, the scale impact on children is probably very important. Also, an argument against a single-plant machine is cost per piece.
The original target user for an MVP is the hobbyist with a spare table at home, rather than a school, no? I think the point of my post is to keep the software simple and easy to hack. The hardware comment does not apply for schools, then.
My personal goal is to have a simple device to iterate on, which seems pretty close to the MVP defined here. As you and @wsnook rightly remind us, there is no solid proof that anything works, nor that anything is simple. The thing I would like to keep simple is how to get started. Yet, re-reading the scope of the MVP here, it seems to me there is perhaps a misalignment when we come to schools: schools may better leverage a simpler PFC, rather than a “formal” MVP.
Here are some pictures. My problem is that my (phone) camera autofocus is broken, so images are blurry. I should have better ones soon, if there is interest.
The PET bottle, cut in half, with the upper part upside down and the bottom part as a base and reservoir. You can see a RasPi attached to the left, and a mini breadboard on the right (for balance). The cables between the RasPi, the breadboard, and the sensors run around the bottle, as I had no proper lengths of jumpers on this version.
This second picture is a view from three-quarters above. I am holding the camera (a Kuman day/night-vision, less-than-$30, RasPi-compatible piece). You can also see LEDs hanging down, and a blurry block lower in the bottle: the DHT11 temperature/humidity sensor.
This last picture is a view from above. The tiny black point at the very bottom, sitting in the bottle cap, is a basil seed.
A night snapshot from the camera:
A daylight snapshot from the camera:
The white plastic bag at the bottom of the pictures is ballast. The RasPi is hooked high due to the short cable lengths (see @wsnook’s several posts on the subject with the RasPi). The ballast should be just plain water, but it was not then. Now it is water, and looks better…
I have no formal BOM, but the list I shared above. This is an experiment I share as a “case study” for an MVP. If there is interest to scale this up, I can write a BOM and put it on the code repository.
The LEDs are cheap ones, plain COTS. The narrow space and reflectivity of PET led me to try first with these plain components. I expect the collected data to provide a base for evaluating what LEDs are really necessary for a valid setting. This is yet to do, though.
Totally agree with the approach. The trial I shared is in that vein, where the fan is replaced by LEDs as actuators.
@_ic Cool. It’s always fun to see pictures of what people are building.
I just finished up the lettuce experiment that I’d been doing, and now I feel like I’m at a crossroads–unsure which direction to head in next. I like the example you’re setting of trying out data logging and control strategies with a minimal investment in parts.
Today, I’ve been thinking about maybe writing a minimal environmental controller on Particle Photon that could use LEDs instead of actuator relays–just fake all the expensive hardware with little lights. Maybe I could even make a scale model of a PFC2 enclosure by gluing together bits of paper, wood, and plastic, then put the LEDs where the pumps and fans would go. The point would be to focus on the control software without spending a lot of time or money on a hardware build. I’m interested in exploring good ways of writing recipes and translating them to actuation and control loops.
If I did it right, maybe people who already have hardware could hook up my controller with real relays and stuff instead of just LEDs.
One problem is that I don’t have a DHT11 or an AM2315. They’re cheap, but I hesitate to spend my money on them because, from what I read, it sounds like they’re usually inaccurate, imprecise, and susceptible to sudden failure. I really like the Si7021 sensor that @webbhm recommended–it seems like a good, reliable design, and it’s cheap. I’ve also written a driver for the BME280, but it might be too expensive and high-quality for consideration as part of the MVP. Air pressure is interesting, but I’m not sure if it’s important. Pressure could potentially be useful for calculating wet bulb temperatures in relation to evaporative cooling, but I’m not sure if anybody here cares about that.
Are you interested in Photon or the Si7021? What are you planning to work on next?
@Webb.Peter (and @webbhm if you have an opinion on this), Suppose that, for my next experiment, I try writing Photon firmware that could be the brains of an MVP and/or maybe be swapped out for the Arduino & Pi 3 in a PFC2. What features would be must-haves?
These seem like obvious essentials, but probably not a complete list:
Provide a way to set the time and keep time–if possible, with a battery backed RTC. (do this for real)
Control the photoperiod for at least one channel of grow lights. I’m tempted to do red and blue separately. (fake this with LEDs)
Log nutrient tank temperature, air temperature, and air percent relative humidity. Also provide some kind of real-time charting capability. (Do this for real with DS18B20 and Si7021)
Provide actuation for an air exchange fan to cool the growth chamber. Make fan control logic configurable as part of the recipe–maybe based on a thermostat set point or a timed schedule. (fake the fan actuator with an LED)
For possibly modding PFC2 hardware to create a minimal spring-break proof food computer, I’d add these essentials:
Top off the tank from a reservoir of pre-mixed nutrient solution. Needs: tank water level sensor and a water pump. Assume pH control and monitoring is manual. (fake this with a switch and an LED)
Remote monitoring for sensor values–and maybe camera images–implemented in a way that’s robust on the type of managed network configuration that’s likely to be present in a school. For example, running a server that needs a hole punched in the school’s firewall would be bad. Providing a way to check the wifi MAC address (often needed to join managed networks) before initial connectivity is working would be good. Building a custom production-worthy Internet-facing cloud service would be bad. We should use an existing hosted service to save a lot of wheel-reinventing and stressful ongoing sysadmin overhead. A twitter bot might be good. Something with IFTTT might be good. (do this for real)
[edit: thinking about my PFC2 mod feature list, a Photon alone probably wouldn’t be enough. So, assume the available control hardware would be Photon + Pi 3 or Pi Zero W.]
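For the “do this for real” sensor items above, the raw-to-units conversions are simple enough to sketch. The formulas are the standard ones from the Si7021 and DS18B20 datasheets; the function names are just mine:

```python
# Raw-code-to-units conversions for the sensors mentioned above.
# Formulas are from the Si7021 and DS18B20 datasheets; names are mine.

def si7021_rh(raw):
    """Convert a raw 16-bit Si7021 humidity code to %RH."""
    return (125.0 * raw) / 65536.0 - 6.0

def si7021_temp_c(raw):
    """Convert a raw 16-bit Si7021 temperature code to degrees C."""
    return (175.72 * raw) / 65536.0 - 46.85

def ds18b20_c(raw):
    """Convert a raw DS18B20 reading (signed, 1/16 degree C steps) to C."""
    return raw / 16.0
```

Having these as plain functions makes the firmware-vs-host split flexible: the microcontroller can ship raw codes over serial and let the logging side do the math.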
Similar activity here (my goal is for testing and evaluating growth strategies).
They are all that, indeed… A short answer on my end is that “it depends on the goal”. As a first shot, the DHT11 is readily available, as are OSS libraries to use it. The Si7021 looks like a good alternative. Although cheap, it runs about 5x the price of the DHT11 where I checked…
Interested, yes! I have just tried so far to be strict and go forward. My version has been running non-stop for more than a week. This is too little to draw any kind of conclusion, so I will post again in a while. In the meantime, I will check on the Si7021, and probably run a PET device in parallel.
I also have an interest in the Photon. In the “strict” thinking, I found it less “available” compared to Arduino and Raspberry Pi. I have been a user of the Onion Omega 1/2 platform for 2 years, and discarded that one for the same reasons. Also, Photons are less commonly imported here (Japan), and relatively hard to get.
As for work, I continue on the PET design for now. Most of the work is on the software side: making the code easier to use, deploy, remote-manage, and visualize (like you did with Farm), and supporting additional sensors/actuators (a light sensor and fan next). Feedback is welcome!
The most important step, though, is to centralize data. The Mark0 repo contains some stubs about logging all sensor readings to an API, as well as sending the snapshots to remote storage. This looks like what @adam has already done with Arduino YUN, and probably many others. I should be done in the next few days. I guess this is the most important piece, with regards to scalability, data, data format—all approached as an MVP, so built for experimentation.
[ Edit: Apparently I can get Si7021 easily. Photons are unfortunately harder to get right now. I’ll check again soon. Anyway interested in building on that one if that is the “thread choice” ]
[ Edit 2: The BME280 is cheaper than the Si7021 here, for some reason ]
Thanks for the code. I realized that if the camera is run from cron, then temp/humidity are the only things left in the code, so it is relatively easy to do the timing there (until I add more sensors).
As to your MVP list:
A real-time clock is necessary if you don’t have an internet connection, but I am assuming that is available. The Pi syncs time from the internet, thus avoiding the need for an RTC. Particle doesn’t do much good if it doesn’t have internet. I could see the need for a non-internet version as an option.
For MVP, I would like to stick to a wall timer and bulb. Separate channels would definitely be a good enhancement/option for the future.
I assume the temperature in the chamber will be uniform (air and tank the same), thus only needing one temperature sensor. Do you have thoughts or experience otherwise? Separate from the MVP, I would personally like to run an experiment with multiple temperature sensors throughout the box to validate that the air circulation is keeping the temperature uniform.
A while back I wrote the logic for a fan controller (5V analog pin voltage). I had a target value as well as high/low values. This can be as simple as on/off when out of range, or variable speed, or fancy PID logic.
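That on/off-when-out-of-range idea can be sketched as a small hysteresis thermostat. The set point and band values here are placeholders, not tuned numbers:

```python
# On/off fan thermostat with hysteresis. Placeholder set point and band.
def fan_state(temp_c, target=24.0, band=1.0, currently_on=False):
    """Decide whether the fan should run.

    Turns the fan on above target+band and off below target-band.
    In the dead band between the two, keep the current state so the
    relay doesn't chatter on and off around the set point.
    """
    if temp_c > target + band:
        return True
    if temp_c < target - band:
        return False
    return currently_on
```

The dead band is the important part: without it, sensor noise of a few tenths of a degree near the set point would cycle the relay constantly.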
Tank top-off. I think schools will need this for vacations. Simple Particle control of a pump should work, though it does need some kind of level sensor. I have not looked into this.
Particle helps with some remote sensing (temp, humidity), but I don’t know what its firewall needs are. At the moment, if data is going to a flat file, charting could be done in Excel. I would like to get beyond this soon, but it covers a ‘minimal’ requirement. It is nice to see that Particle has IFTTT integration already. My ideal would be to have the data go to the same repository as the PFC, with the charting and interface being the same for both.
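For the flat-file option, an Excel-friendly CSV logger is only a few lines. The file path and column names here are just examples:

```python
# Append timestamped sensor rows to a CSV that opens cleanly in Excel.
# Path and column names are illustrative, not a fixed format.
import csv
import os
from datetime import datetime

def log_reading(path, temp_c, rh):
    """Append one timestamped row, writing a header the first time."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        w = csv.writer(f)
        if new_file:
            w.writerow(["timestamp", "temp_c", "rh_percent"])
        w.writerow([datetime.now().isoformat(timespec="seconds"), temp_c, rh])

# Usage, e.g. from a cron job:
#   log_reading("/home/pi/mvp_log.csv", 21.5, 55.0)
```

Because each cron run just appends a row, there’s no daemon to keep alive, and rsync-ing the file off the Pi for review stays trivial.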
I am going to start doing some code and see where I can get.
I think the MQTT route, with data published to Adafruit IO or another free service, provides this feature. No need for exposed ports or firewall exceptions, and it’s simple to use with a great UI. (See: Considering alternatives to the Raspberry Pi and Arduino.)