My GitHub site describes how to use Amazon’s Alexa to create a voice user interface for the OpenAg Personal Food Computer (PFC). I was very fortunate to meet the leaders of Fenome and got the opportunity to build my own OpenAg PFC, which is now in use at my home. I thought I’d give back to Fenome and the OpenAg community by integrating Amazon’s Alexa with the PFC. I find the PFC an amazing device and the OpenAg initiative inspiring! I hope people find the PFC even more useful with Alexa. Please send me your comments and suggestions to help me develop the Alexa interface for the PFC!
I don’t have a food computer, but have been growing microgreens and AeroGarden lettuce using monitoring software that I wrote. What I’ve found most useful to watch is a chart of the nutrient tank temperature compared to the ambient air temperature. I use it to decide when I should turn my fan on and off. Also, it’s fascinating to watch how the indoor temperature changes over time. The big surprise for me was that the AeroGarden’s power supply will heat the nutrient tank about 2 °C above ambient unless I have a fan pointed at it to increase the air circulation.
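That fan decision can be reduced to a simple rule. A minimal sketch in Python, where the function name, the 2 °C threshold, and the hysteresis margin are my own illustrative choices based on the observation above, not part of any real PFC or AeroGarden API:

```python
# Hypothetical fan-control rule: the 2 °C figure comes from the
# observation that the power supply heats the tank about that much
# above ambient without airflow.
FAN_ON_DELTA_C = 2.0   # turn the fan on once the tank runs this far above ambient
HYSTERESIS_C = 0.5     # keep the fan on until well below the threshold, to avoid toggling

def fan_should_run(tank_temp_c, ambient_temp_c, fan_is_on):
    """Decide whether the circulation fan should be running."""
    delta = tank_temp_c - ambient_temp_c
    if fan_is_on:
        # Already running: keep going until the tank has cooled a bit
        # below the turn-on point.
        return delta > FAN_ON_DELTA_C - HYSTERESIS_C
    return delta > FAN_ON_DELTA_C

print(fan_should_run(24.3, 22.0, fan_is_on=False))  # True: tank is 2.3 °C over ambient
```

The hysteresis band is there so the fan doesn’t chatter on and off when the delta hovers right around 2 °C.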
The other main things I check are the water level (every day), leaves I should harvest (every day), and adding or changing nutrients (every two weeks). So far I don’t have anything automated for that. I’m thinking: what if I could say, “Alexa, how are my plants?” and get replies like “you need to add water”, “you need to change the nutrient solution tomorrow”, or “you should probably turn on the fan”? Most of that wouldn’t make sense for a PFC2, and I don’t think the CouchDB API exposes that level of abstraction. But, at a more general level, it would be cool to get a summary that includes any exceptional conditions needing immediate attention along with reminders about upcoming tasks.
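The summary idea could be prototyped independently of where the data comes from. A minimal Python sketch, assuming a hypothetical list of (urgent, message) check results that some monitoring layer would produce; none of these names come from the OpenAg or Alexa APIs:

```python
def plant_status_summary(checks):
    """Build a spoken summary from (urgent, message) check results.

    Urgent conditions are read first as items needing attention;
    everything else is grouped as reminders.
    """
    urgent = [msg for is_urgent, msg in checks if is_urgent]
    reminders = [msg for is_urgent, msg in checks if not is_urgent]
    parts = []
    if urgent:
        parts.append("Needs attention: " + "; ".join(urgent) + ".")
    if reminders:
        parts.append("Reminders: " + "; ".join(reminders) + ".")
    return " ".join(parts) if parts else "Everything looks fine."

# Example checks, matching the replies imagined above.
checks = [
    (True, "you need to add water"),
    (False, "change the nutrient solution tomorrow"),
]
print(plant_status_summary(checks))
# Needs attention: you need to add water. Reminders: change the nutrient solution tomorrow.
```

The string it returns is exactly what an Alexa skill would hand back as the spoken response to “how are my plants?”.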
You mentioned that you’re using your PFC. What are you growing? How have you found yourself using the different voice commands? Have you learned anything surprising from that? How do you feel about using Alexa compared to the OpenAg web UI?
Hi, thanks for the feedback. I really like the suggestion to develop an Alexa skill that informs the user of exceptional conditions. This is in line with my thinking; I’d eventually like to use computer vision to help identify those conditions, as well as deviations between desired and measured parameters. I’ve really just begun to use the PFC; I’ve started to grow lettuce. It’s too early to draw any conclusions about the voice-command interface vs. the Couch UI. At a minimum, I’d expect voice to offer a lower-friction way to understand what’s going on with the PFC, especially in remote situations. Please watch the GitHub site for future enhancements, and don’t hesitate to let me know if you have other ideas for the voice interface. Thanks again.