This is a good place to ask us about the backend projects, data format, the web UI, etc.
@rbaynes I haven’t looked at the code yet, but this looks really awesome!
I’m interested in the flow of data that keeps things up to date in the UI. How do you plan to consistently get data from the food computer to the UI? E.g., will you use polling HTTP requests at an interval, WebSockets, etc.?
I see there is some sort of message queue; what triggers updates to it?
Hopefully these questions make sense; I’m mostly trying to get an overview of the communication between the major components.
Thanks for opening the discussion!
For folks wanting to get up to speed on how this works, here are a few resources I’ve found. @jmitsch, perhaps this will help answer some of your questions.
Architecture diagram (original is on GitHub in OpenAgInitiative/openag-cloud-v1)
“OpenAg Cloud Description” PDF, also from GitHub (check the PDF’s parent folder for additional high-level documentation).
The Google Data Solutions for Change article about OpenAg from this past summer.
@rbaynes with this new model of communicating with a central database, could other machines communicate with the central server/engine? E.g., could something be made to support the MVP model or Raspberry Pi-based food computers?
Yes @jmitsch, any machine could use MQTT to do pub-sub with the backend service. Look in the brain device code for the registration script and the IoT classes.
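To make that concrete, here is a minimal sketch of what a device-side telemetry message might look like. The field names and topic string below are illustrative assumptions, not the actual OpenAg schema; check the IoT classes in openag-device-software for the real message format:

```python
import json
import time

def make_telemetry_message(device_id: str, variable: str, value: float) -> str:
    """Build a JSON telemetry payload.

    NOTE: these field names are assumptions for illustration only --
    the real schema lives in the openag-device-software IoT classes.
    """
    return json.dumps({
        "device_id": device_id,
        "variable": variable,
        "value": value,
        "timestamp": int(time.time()),
    })

# The actual transport would be an MQTT publish, e.g. with a client
# library like paho-mqtt (topic name below is hypothetical):
#   client.publish(f"devices/{device_id}/telemetry", payload)
msg = make_telemetry_message("my-pfc-01", "air_temperature_celsius", 22.5)
print(msg)
```

Any machine that can speak MQTT and emit JSON in the expected format could register and publish this way, regardless of hardware.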
Looks like I have some investigating to do
@rbaynes is the openag-cloud only for the PFC-EDU, or do you plan on it also being the central data hub for the food server? I have designed and built a food server that I would be interested in pushing data from.
Hi @adam, the backend is generic and we already use it for our 4 food servers here and one in India. We also use it for our hazelnut tree computer.
So, yes, any device can use it. Look at the IoT code in the device repo to see the format of the messages and how to register your device with the MQTT system.
this might help:
@rbaynes I see the “flavor_data” repo has been added to openag-cloud-v1. These data summaries of past experiments don’t appear to be in the format of executable sequences like “recipe_bag” and previous JSON-formatted files.
- Have any recipes been developed which are intended for running on the openag-cloud-v1?
- What phenotypical measurements are you collecting in the lab and field (schools)?
I would like to attempt to run a group experiment with a common “recipe” between members of the community using MVPs. Part of this would include manual phenotypical data collection and storage via a form we are developing. I’m considering:
- Canopy Width (CM)
- Canopy Length (CM)
- Plant Height (CM)
- Leaf Count (Count)
- Edible Biomass (Grams)
- Non-edible Biomass (Grams)
- Taproot Length (CM)
- Flavor (9-Point Hedonic Scale used by NASA - see page 10)
- Crispness (5-Point “Just About Right” Scale used by NASA - see page 11)
- Tenderness (5-Point “Just About Right” Scale used by NASA - see page 11)
Curious to hear your feedback.
@Webb.Peter flavor_data and recipe_bag are old. I marked them as deprecated.
See the latest test recipes in the openag-device-software/data/recipes dir: https://github.com/OpenAgInitiative/openag-device-software/tree/master/data/recipes
These recipes are part of the device side code that we use mainly for testing. The real recipe is sent from the web UI on the backend to the device.
All the schools are running the same “Get growing basil” recipe now. Since you probably haven’t made an account on our web UI to look at the recipe directly, here’s a screenshot:
These are the manual measurements the students take:
And here is the device configuration that all the PFC_EDUs in the pilot test are running: https://github.com/OpenAgInitiative/openag-device-software/blob/master/data/devices/edu-v0.3.0.json It tells you which sensors are active. All active sensors publish their values as they change.
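If you want to see programmatically which sensors a config enables, a short script like this works. The JSON below is a simplified stand-in, not the real edu-v0.3.0.json (the actual file’s structure and field names may differ, e.g. it may not use an “enabled” flag):

```python
import json

# Simplified, hypothetical device config in the spirit of edu-v0.3.0.json.
# Consult the real file for the actual structure.
config_json = """
{
  "name": "edu-v0.3.0",
  "peripherals": [
    {"name": "AtlasTemp-Reservoir", "type": "AtlasTemp", "enabled": true},
    {"name": "SHT25-Top",           "type": "SHT25",     "enabled": true},
    {"name": "USBCamera-Top",       "type": "USBCamera", "enabled": false}
  ]
}
"""

config = json.loads(config_json)
# Collect the names of peripherals flagged as enabled.
active = [p["name"] for p in config["peripherals"] if p.get("enabled")]
print(active)  # ['AtlasTemp-Reservoir', 'SHT25-Top']
```

Each of those active peripherals is what publishes its values over MQTT as they change.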
@paula will have to comment about the post-harvest measurements, as she and @hildreth are running the pilot test. The answer may also be in the curriculum that was shared via our wiki, if you care to dig it out.
@rbaynes I appreciate the detailed reply.
I really like what you’ve done with the recipes to define “environments”, which, as I understand it, are different setpoint configurations of the device that can be associated with various phases: https://github.com/OpenAgInitiative/openag-device-software/blob/master/data/schemas/recipe.json
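For anyone else reading along, here is a minimal illustration of that environments/phases idea: named setpoint bundles that phases reference in timed cycles. All field names below are my own assumptions for the sketch, not the exact recipe.json schema:

```python
import json

# Hypothetical recipe in the spirit of the recipe.json schema:
# "environments" name setpoint bundles, and "phases" cycle through them.
recipe = {
    "name": "Example Basil Recipe",
    "environments": {
        "day":   {"light_ppfd_umol_m2_s": 300, "air_temperature_celsius": 24},
        "night": {"light_ppfd_umol_m2_s": 0,   "air_temperature_celsius": 18},
    },
    "phases": [
        {
            "name": "vegetative",
            "repeat": 28,  # run this day/night cycle for 28 days
            "cycles": [
                {"environment": "day",   "duration_hours": 18},
                {"environment": "night", "duration_hours": 6},
            ],
        },
    ],
}

# Sanity check: every cycle must reference a defined environment.
for phase in recipe["phases"]:
    for cycle in phase["cycles"]:
        assert cycle["environment"] in recipe["environments"]

print(json.dumps([p["name"] for p in recipe["phases"]]))
```

Separating environments from phases means one setpoint bundle can be reused across many phases, which is what makes a common community recipe easy to share.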
I appreciate you posting pictures from the UI. I was not aware that I could create an account; I assumed it was not yet available to the public. When I tried to register a device, I received this error:
The “Horticultural Measurements” form is precisely what I was interested in seeing. This kind of approach (I love that the pictures are actually basil) will definitely provide the most meaningful data. I look forward to hearing more about how the students are being instructed to enter this data (individually, in teams, etc.) and at what cadence. I’ve debated whether this should incorporate some sort of workflow with prompts at different “phases”.