Openag-ui web app build diary


#1

Hey OpenAg friends,
I’m working on a little web app that can control Food Computers. This is my build diary. Here’s the idea:

  • This app will be a basic dashboard for Food Computers, aimed at makers and tinkerers. It will expose all the basic controls, along with useful diagnostic features like data export.
  • A hopeful goal is to serve this web app from the Raspberry Pi itself. This way, every Food Computer will have a user interface right out of the box.
  • In addition, we can have other apps that use the same REST API (like the Unity app, for example), designed for specific users (e.g. “schools” or “scientists”).

I’m just getting started. Step 1 is getting a “blink test” up and running.

All the code is on Github if you want to follow along (but I don’t recommend trying to run it yet :slight_smile: ): https://github.com/OpenAgInitiative/openag-ui.


#2

Hi Gordon,

I played with your code on Saturday. I like the approach of generating displays. My plan is to rely on this to generate a web UI (similar to what I see on UBIDOTS). Would that fit?


#3

@thomas at the moment, my priority is getting a “blink test” up and running. In other words, I’m prioritizing getting everything talking to everything.

Sensors -> Raspberry Pi
Raspberry Pi <-> Cloud
Raspberry Pi <-> UI
UI <-> Cloud

After that, my plan is to start working on a dashboard UI that can both read and write to the Food Computer (similar in concept to what you’ve posted above). We’ve actually got a JS framework for this already, along with design files for how those sensor readings should look. Fun stuff!


#4

Wonderful. Will follow this with interest.


#5

Yesterday: got recipes posting from form to DB.
Today: working on getting tri-directional database sync working.


#6

Yesterday: added helpers for push and pull between Raspberry Pi’s database and UI database, as well as the ability to clear the add recipe form.
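A rough sketch of what such push/pull helpers can look like with PouchDB replication. Everything concrete here (the database names, the Pi's URL, the `makeSyncHelpers` function) is hypothetical, for illustration only, not the actual openag-ui code:

```javascript
// Hypothetical sketch of push/pull helpers between the UI's local
// PouchDB and the CouchDB instance on the Raspberry Pi. PouchDB is
// passed in (e.g. loaded via a <script> tag in the browser); the
// database names and URL are made up for illustration.
function makeSyncHelpers(PouchDB) {
  const local = new PouchDB('environmental_data');
  const remote = new PouchDB('http://raspberrypi.local:5984/environmental_data');

  return {
    // "Push": send local changes (e.g. newly posted recipes) up to the Pi.
    push: () => local.replicate.to(remote),
    // "Pull": fetch the Pi's changes (e.g. sensor readings) down to the UI.
    pull: () => local.replicate.from(remote),
    // Or both directions at once.
    sync: () => PouchDB.sync(local, remote),
  };
}
```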

I also looked into CouchDB and PouchDB design docs for building indexes of the various sensor types. This will let us render different control clusters for the different sensor types on the dashboard.
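For example, a design doc that indexes readings by sensor variable might look roughly like this (the `by_variable` view name and the `variable`/`value` fields are assumptions for illustration, not the actual openag schema):

```javascript
// Hypothetical CouchDB/PouchDB design document that indexes sensor
// documents by variable name, so the UI can group readings into
// per-sensor control clusters. Field names are illustrative.
const sensorsDesignDoc = {
  _id: '_design/sensors',
  views: {
    by_variable: {
      // Map functions are stored as strings in the design doc.
      map: function (doc) {
        if (doc.variable && doc.value !== undefined) {
          emit(doc.variable, doc.value); // key: sensor type, value: reading
        }
      }.toString(),
    },
  },
};

// Usage sketch (assuming a PouchDB instance `db`):
// db.put(sensorsDesignDoc)
//   .then(() => db.query('sensors/by_variable', { key: 'air_temperature' }))
//   .then(res => console.log(res.rows));
```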

Today, I’ll be working on continual updating for sensor readings.
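One way to get continual updates is to follow CouchDB's `_changes` feed. A minimal sketch, assuming a global `fetch` (browser or Node 18+); the URL and document fields are hypothetical:

```javascript
// Hypothetical sketch: long-poll CouchDB's _changes feed so the
// dashboard updates as new sensor readings arrive. The database URL
// and doc fields are made up for illustration.
async function followChanges(dbUrl, onReading, since = 'now') {
  for (;;) {
    // longpoll: the request returns as soon as there is a new change.
    const res = await fetch(
      `${dbUrl}/_changes?feed=longpoll&include_docs=true&since=${since}`
    );
    const body = await res.json();
    for (const change of body.results || []) {
      if (change.doc) onReading(change.doc);
    }
    since = body.last_seq; // resume where we left off
  }
}

// Usage sketch (hypothetical URL and fields):
// followChanges('http://raspberrypi.local:5984/environmental_data',
//   doc => console.log(doc.variable, doc.value));
```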


#7

Introduced a Result type for tasks that may fail.

Lesson learned: with tasks you must either

  • always call success (and encode the error as a “success” value)
  • or recover from thrown errors with .capture
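A minimal sketch of the idea. The `ok`/`error` constructors and the `toResult` wrapper are illustrative names, not the actual openag-ui API:

```javascript
// Hypothetical Result type: a task always "succeeds", but the success
// value is either ok(value) or error(reason), so failures travel down
// the success channel instead of escaping as thrown errors.
const ok = value => ({ isOk: true, value });
const error = reason => ({ isOk: false, reason });

// Wrap a promise so callers always get a Result (.capture-style
// recovery: a rejection becomes an error() value).
const toResult = promise => promise.then(ok, error);

// Example task that may fail.
const readSensor = shouldFail =>
  shouldFail
    ? Promise.reject(new Error('sensor offline'))
    : Promise.resolve(22.5);

toResult(readSensor(true)).then(result => {
  if (result.isOk) {
    console.log('reading:', result.value);
  } else {
    console.log('recovered:', result.reason.message); // recovered: sensor offline
  }
});
```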

#8

I see quite a lot of commit activity recently. Is it time to give it a try?


#9

@serein It’s steadily improving. I have the front-end displaying sensor readings from an instance of openag-brain running on the Raspberry Pi, but openag-brain and openag-ui aren’t release-ready yet, so YMMV. Let me know if you want to help develop either/both :).


#10

I can help with testing and troubleshooting. I doubt my coding skills, however :wink: I know enough to read the code, but I’m not sure I could write clean code myself.

Anyway, I can try to give a hand with the openag-brain backend. Do you have any backlog I can take a look at, or is it all informal?


#11

@serein I would love some help reproducing https://github.com/OpenAgInitiative/openag-brain/issues/6, if you get a chance.


#12

Today I put together a flow diagram of the whole v1.1 software system. There’s a lot going on :smiley:.


#13

Thanks a lot! This really helps with understanding the bigger picture, versus trying to reverse-engineer it by tracking GitHub changes :wink:
Maybe it makes sense to put it on the wiki as well?


#14

Good idea :smiley: http://wiki.openag.media.mit.edu/OpenAg%20Food%20Computer%201.1


#15

Looking at the proposed architecture, I see you are considering reimplementing the brain from proprietary Python code to ROS modules. I think it’s a good idea to reuse as much as possible from existing technologies and similar domains.

Are you going to rename the PFC to PFR (personal food robot)? :wink:


#16

Hi, I have openag_brain running on my Pi 2 and have been trying the openag_ui software. Unfortunately, the sensor code for the brain targets the v1.01/2 upgraded sensors at the moment. I get the green bar across the top of the web page and can see the add-recipe pop-up.
Should there also be sensor categories with “no input” or “not connected” messages?
I have forked the code for the new air temperature/humidity sensor (am2315) and changed it to match the am2305, to try to add that sensor to the “brain” and get some readings.
Hmmm, merging the old code won’t work, so I have created a separate repo.
Now I have found that the code is hosted on PlatformIO, so it needs to be part of the OpenAg PlatformIO repository and have a registered number so we have web access.
Any ideas who to contact, and how (direct email or a GitHub issue; Leon Chambers?), about whether support for the v1 sensors should be included with v1.1/2? I’m new to using GitHub.
I will continue trying to understand the openag-ui code, but at this point bug finding and copy/paste is my limit.

My GitHub URL is https://github.com/MisterBurnley


#17

Supporting the original hardware (sensors, etc) with the new software is definitely the plan, but there’s still a lot that is in flight, both in the UI and the backend. If you want to run your food computer today, I suggest using the current release image. Once the new software version stabilizes, we’ll do a release.


#18

Cool, thanks for that. I’ll keep hacking locally and await the next release. I tried the current release, but I don’t have a TFT screen, so it was a no-go. I’m trying to keep costs down to a bare minimum, so I thought I would go straight to the v2 software, since it runs through a web-based interface and doesn’t require a TFT screen.
The gro-bot I’m building is non-standard (I’m using a discarded fridge minus the cooling system: https://misterburnley.com) and is going to take a longer time to put together, so I reckoned this would give me time to get to know the version 2 code.
I love learning about new software. I thought about streaming the webcam through ROS and found robotwebtools.org/index.html. I’ve started thinking of the Food Computer as a robot now that ROS is being used.
Could you get it to speak about how the growing is going? Ubuntu has something relevant: https://insights.ubuntu.com/2016/07/07/mycroft-the-open-source-answer-to-natural-language-platforms/. It might need two RPis networked with ROS. I wish I knew more about writing code. Sorry, I’m just getting carried away; the team has probably brainstormed loads of ideas already.


#19

Hi Gordon,

When you say the “current release”, do you mean the Pi image that works with the Unity Player client? If so, do you have an easy answer for how to get the FC to start a new recipe? My FC seems to have timed out on its current recipe and has stopped controlling things. I realize these are newbie questions; feel free to ignore them. I’ll keep hacking away at it.

Thanks


#20

It’s been too long! Haven’t been updating much, but there has been lots of progress:

  • Charts are working for all the sensors
  • You can export CSV
  • You can drop markers in the chart for timed experiments

I’m hosting the latest version of the UI on gh-pages. You can try it yourself if you’re brave enough to run the new backend software on a Raspberry Pi (here be dragons).

http://openaginitiative.github.io/openag_ui/

…things are coming along nicely.