Openag-ui web app build diary


I was using a non-Docker brain, which appears to be part of the issue. Just installed a Docker-brain and am able to access the UI.

The question remains for me though how to access the UI if I’m using the non-Docker/developer-brain.

Thanks for the help everyone!


Have you gotten any further?
I put your code in a software_module and loaded it to the database, and I am not even seeing the rostopic. The software_module_type is there (from the default.json).
Since this is running on the Raspberry, it shouldn’t need to be flashed (to the Arduino). Was there anything you did to get it up and running (besides possibly a reboot)?


@serein I’m impressed you got the camera working… I didn’t know that code was hooked up yet! Can you share the code you used to get it up and running?


@webbhm, @gordonb

Well, that was a long time ago and I don’t recall the details off the top of my head. Let me hook it up again and refresh my memory. Will do this over a weekend.

And just to manage expectations correctly - I was not able to see webcam output in the OpenAg UI. The only thing I got was a ROS topic /camera_1/image_raw with some image data posted.


A missing piece of the documentation (at this time) is rostopic management: how to easily connect messages to services. I think this is being done through JSON files, but it would be nice if this all got moved into the database. The first step is to get image data to a rostopic, then persist the rostopic to the database, and finally get the database info passed to the UI (likely via an API with curl).
At the moment I will be happy to see how the Raspberry-side Python (camera) scripts get managed and have their output go to a rostopic. I suspect that Python scripts are the counterpart to sensors on the Arduino.
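The pipeline described above (rostopic to database to UI) can be sketched at the database end. A minimal Python sketch, assuming environmental_data_point documents carry `variable`/`timestamp`/`value` fields; that field layout is an assumption based on this thread, not a confirmed schema:

```python
# Sketch of the "database -> UI" step: reduce a list of
# environmental_data_point docs (e.g. fetched with something like
#   curl http://<pi>:5984/environmental_data_point/_all_docs?include_docs=true
# against CouchDB; database name taken from this thread) down to the
# latest reading per variable, which is roughly what the UI needs.

def latest_per_variable(docs):
    """Return {variable: (timestamp, value)}, keeping only the newest point."""
    latest = {}
    for d in docs:
        var, ts = d["variable"], d["timestamp"]
        if var not in latest or ts > latest[var][0]:
            latest[var] = (ts, d["value"])
    return latest

# Hand-made sample documents, purely illustrative:
sample = [
    {"variable": "air_temperature", "timestamp": 1491509454, "value": 22.5},
    {"variable": "air_temperature", "timestamp": 1491509514, "value": 22.7},
    {"variable": "water_temperature", "timestamp": 1491509460, "value": 19.1},
]

print(latest_per_variable(sample))
```

The reduction is deliberately separate from the HTTP fetch, so the same function works whether the docs come from curl, a CouchDB view, or a test fixture.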


@serein yeah, I don’t expect the UI to work yet (because I think you’re the first to have the backend for images all hooked up). However, once the backend is working, I think the UI should be an easy fix.


@gordonb, am I correct that the problem with images has been fixed already?

I finally was able to find some time to play with a recent version of openag_brain (the Arduino-less setup WIP PR) and can see that aerial images are being correctly saved to environmental_data_point.

They are not displayed in the UI though, since, I assume, the UI is hardcoded to some time-lapse accumulation ("http://raspberry/img.jpg?timestamp=1491509454029" class="timelapse--video")
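If the UI really is keyed to that URL shape, the timestamp query parameter looks like a millisecond cache-buster. A tiny sketch of generating such a URL; the parameter name comes from the snippet above, everything else is assumed:

```python
import time

def timelapse_url(host, ts_ms=None):
    """Build the img.jpg URL with a millisecond timestamp cache-buster."""
    if ts_ms is None:
        ts_ms = int(time.time() * 1000)  # default to "now" in ms
    return "http://{}/img.jpg?timestamp={}".format(host, ts_ms)

print(timelapse_url("raspberry", 1491509454029))
```

Regenerating the URL on each poll forces the browser to refetch the image instead of serving it from cache.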


@gordonb @webbhm

How is this coming? I’ve got sensors and actuators up and am interested in connecting ROS topics. How is this done? Where does one set them up?




You would need to add code to the openag_brain repo, mainly under firmware/lib, and edit personal_food_computer_v1.yaml, personal_food_computer_v1.launch, and maybe personal_food_computer_var_types.yaml.
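For reference, a hypothetical sketch of the kind of entry such an edit might add to personal_food_computer_v1.yaml; every field name and value below is illustrative, not copied from the repo:

```yaml
# Hypothetical firmware-module entry: an id for the module instance,
# a type matching a library under firmware/lib, and pin arguments.
modules:
  - id: water_temperature_1
    type: ds18b20        # would need a matching library in firmware/lib
    arguments:
      pin: 5             # one-wire data pin on the Mega 2560
```

The launch file and var_types file would then need matching entries so the topic gets wired up and the variable is recognized.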


Cool! Thank you @spaghet

I see var_types in the openag_brain repo but not in the docker build. Will you be generating a new docker container soon?




The camera is still a WIP, and the UI displays an image from http://<ip>/img.jpg, which is a hack based on fswebcam saving directly to /var/www/html/img.jpg

I tried working on getting the camera image to work on Monday. It looks like there is a ROS topic of sensor_msgs/Image being published, but I was having difficulty getting that into something accessible through the UI (mjpg, etc.)
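The fswebcam hack mentioned above can be scripted. A minimal Python sketch, assuming fswebcam is installed and the web root is /var/www/html (both taken from this thread); the capture interval and resolution are made-up defaults:

```python
import subprocess
import time

def fswebcam_cmd(out_path="/var/www/html/img.jpg", resolution="640x480"):
    """Build the fswebcam command line.

    -r sets the capture resolution; --no-banner drops the text overlay
    fswebcam normally stamps onto the frame.
    """
    return ["fswebcam", "-r", resolution, "--no-banner", out_path]

def capture_loop(interval_s=60):
    """Periodically overwrite img.jpg so the UI always sees a fresh frame."""
    while True:
        subprocess.call(fswebcam_cmd())
        time.sleep(interval_s)

if __name__ == "__main__":
    # Just show the command that would run; call capture_loop() to start.
    print(" ".join(fswebcam_cmd()))
```

Separating command construction from the loop makes it easy to test the flags without a camera attached.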


I think we’re working on getting a build process set up so we can use something like TravisCI to automatically create new docker images and publish to dockerhub.

I’d say since you’re doing a lot of developer work right now, you would probably benefit from doing a local install of openag_brain and pulling the latest changes as they get pushed to github.


Yep, I’m gonna do that. I can contribute a PR on the PFC1 BOM build as well. Let me get that set up and I’ll ping you back.

P.S. I use Travis every day to publish Docker containers to a regionally redundant ECS on Amazon. While we don’t need an ECS deployment, getting the containers to ECR or Docker Hub is easy peasy. I trigger builds/publishes when we merge PRs to the develop and master branches.

If you need help, let me know.



Oh, is that PR already there? I must have missed it. Sorry, I’d be the person to contact regarding the gro-* repos since they aren’t officially maintained anymore but I still try to look at PRs and stuff.

TravisCI is pretty easy to use; I’ve been using it to deploy our wiki theme.

I think we had automated builds on openag_python as well, but not on openag_brain yet. We’re working on it though!



No PR yet. I’ll post it after I get things cleaned up.


Hi All, I am new to OpenAg and working on building a food computer with my professor at the university.
I have a very basic question regarding the UI. I have built the food computer and connected various sensors. I am able to launch the web browser and see the UI. There is no recipe yet, and I see the temperature on the side of the web page, but I don’t see readings from any of the other sensors, such as the CO2 level or the other temperature readings. Is there anything I am missing? I have searched a lot of forums but did not find much help on the UI. Could someone please provide some guidance or insight?


Dear all, a photo from the Openag UI Camera Fix?

Something like the above?
Do you see the white colored “tag”? You can move the mouse pointer to it and click on it; it will change to photo/data/control.
Don’t forget to post your screen capture.


I got that; I have openag_ui working and I can access the UI using the URL: http://{IP Address of Food Computer}:5984/app/_design/app/_rewrite.

I see this below. My question is that I don’t see any readings on the sensors for Water Temperature, Air CO2 or Humidity. Is this how it is supposed to be? Is there a way to know whether the sensors are monitoring the CO2 levels and the other values properly, for testing purposes?


The first time I started up the UI in a browser, I just got a large black box at the centre of the page :fearful:. Three days later, I finally clicked through to the next tag and saw some readings; three more days later I could import a recipe using the WebUI. Then I started over, from installing from source through to the WebUI. That is my story. :nerd_face:

I modified src.ino to include only the DS18B20 sensor and update data from it alone. The UI page is straight from GitHub, and it takes the data into “Humidity”, leaving some fields at “0” and some at “null”.

My readings from the DS18B20 go to Humidity, so I check the echo on Humidity.

You may try to focus on the DS18B20 first, then the I2C bus AM2315.
The DS18B20 uses Pin 5 (with a pull-up resistor), plus 5V and GND from the Mega 2560; if nothing is connected to Pin 5, there will be no reading.
Open a terminal

$rostopic list
$rostopic echo /environments/environment_1/water_temperature/raw

By using “rostopic echo” you can check whether each reading is OK at the moment.
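The topic path in the command above appears to follow a /environments/&lt;env_id&gt;/&lt;variable&gt;/raw convention. A small helper to generate those names for several sensors at once; the variable names besides water_temperature are taken from elsewhere in this thread:

```python
def raw_topic(env_id, variable):
    """Build the raw-reading topic path for one environment variable."""
    return "/environments/{}/{}/raw".format(env_id, variable)

# Print the topics to pass to `rostopic echo`, one per sensor variable:
for var in ("water_temperature", "air_temperature", "air_humidity"):
    print(raw_topic("environment_1", var))
```

Generating the list programmatically avoids typos when checking many sensors in a row.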

Or you may want to check the AM2315. Please unplug all I2C devices from the circuit, leaving only the AM2315; again, use rostopic to check air_temperature/raw and air_humidity/raw.

You may also try stopping openag_brain.service and running it manually, looking for any messages related to sensors like the DS18B20, AM2315 and so on.

$sudo systemctl stop openag_brain.service
$rosrun openag_brain main personal_food_computer_v2.launch