Openag-ui web app build diary


#21

@gordonb, I was able to connect my RPi 3 with the AM2315 (so far the only sensor attached) today. Looks great!
BTW, there is a strange lag between humidity and temperature. Although both come from the same sensor at the same time (I can see this by watching the ROS topic broadcast), the UI displays temperature with some lag.


#22

@serein thanks! I think I know what the problem is!

  • We poll the db every 4s. This is “often enough” but is not live. This will change, though; I’m looking at switching to long-polling.
  • Regarding the delay between humidity and temp, I suspect there is a bug in the UI’s de-duplication code, since we de-dupe by timestamp and these two readings almost certainly share the same timestamp (see the sketch below).

Working on a fix.
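
Roughly, the fix is to key de-duplication on the (variable, timestamp) pair instead of the timestamp alone. The UI itself is JavaScript, so take this as a language-neutral sketch of the idea (names are hypothetical, shown in C++ for illustration):

#include <set>
#include <string>
#include <utility>
#include <vector>

// Hypothetical data point; field names are illustrative, not the UI's actual model.
struct DataPoint {
    std::string variable;   // e.g. "air_temperature", "air_humidity"
    std::string timestamp;  // as stored in the db
    double value;
};

// De-dupe on (variable, timestamp) rather than timestamp alone, so humidity
// and temperature readings that share a timestamp both survive.
std::vector<DataPoint> dedupe(const std::vector<DataPoint> &points) {
    std::set<std::pair<std::string, std::string>> seen;
    std::vector<DataPoint> unique;
    for (const auto &p : points) {
        if (seen.insert({p.variable, p.timestamp}).second) {
            unique.push_back(p);
        }
    }
    return unique;
}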


#25

Hi serein, I’ve been following your progress. I’ve been working on code for the TSL2561; I think I’ve got the basic structure, but I have little idea of how to get values in and out of the module structure (inputs, outputs, and arguments in the module.json file). Fancy looking at where I might be making errors in logic? The code is on my GitHub repo, MisterBurnley.


#26

@JohnB, you can take a look at my driver for the BH1750.

Looking briefly at your code, I would skip arguments for the moment (just use the constant I2C address you have already defined) and, for the output, use a standard variable name so it can be picked up by the OpenAg UI:

"outputs": {
    "light_illuminance": {
        "type": "std_msgs/UInt16"
    }
},

Then in your TSL2561_CalculateLux class you need to declare and implement just 3 public methods:

void begin();
void update();
bool get_light_illuminance(std_msgs::UInt16 &msg);

Those will be called by external code to read data from your sensor. The implementation should be pretty straightforward using the functions you have already implemented (see the sketch after the list):

signed long readVisibleLux();
unsigned long calculateLux(unsigned int iGain, unsigned int tInt, int iType);
void getLux(void);
void init(void);
uint8_t readRegister(int deviceAddress, int address);
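
For example, the wrapper could look something like this (assuming your existing functions behave as their names suggest, and that readVisibleLux() returns a negative value on error; untested):

#include <std_msgs/UInt16.h>

void TSL2561_CalculateLux::begin() {
    init();  // the one-time sensor setup you already have
}

void TSL2561_CalculateLux::update() {
    // cache the latest reading; _lux is a hypothetical signed long member
    _lux = readVisibleLux();
}

bool TSL2561_CalculateLux::get_light_illuminance(std_msgs::UInt16 &msg) {
    if (_lux < 0) {
        return false;  // assuming a negative value signals a failed read
    }
    msg.data = (uint16_t)_lux;
    return true;
}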

I hope it helps.


#27

Brilliant, I was getting a little bogged down; my comprehension of OOP is basic to say the least, and cutting and pasting is about my limit. Thank you. The fact that there might be some standard variable names was just getting into my consciousness. Will have another go tomorrow; time for bed here on the East Coast of England.


#28

@gordonb, during the last few days I was able to attach a few more sensors, including writing drivers for a light sensor based on the BH1750 chip and a CO2 sensor based on the MH-Z19.

I also hope to calibrate and attach Atlas Scientific’s EC and pH sensors over the weekend, and then will be ready to proceed further.

I’d like to check on the current state of recipe handling. I see some related functionality in the UI, but could not get it working: I don’t know the JSON recipe format, and I’m actually not sure the related backend endpoints are implemented yet (I have yet to look at the code). Can you clarify?


#29

@serein this is awesome!

during the last few days I was able to attach a few more sensors, including writing drivers for a light sensor based on the BH1750 chip and a CO2 sensor based on the MH-Z19.

Super cool. Do you have this code posted to GitHub? It would be great to have a PR to add it to the list of supported sensors.

I’d like to check on the current state of recipe handling. I see some related functionality in the UI, but could not get it working: I don’t know the JSON recipe format, and I’m actually not sure the related backend endpoints are implemented yet (I have yet to look at the code). Can you clarify?

Selecting and starting a recipe does work (as long as the openag_brain docker container is running). Adding a recipe is somewhat basic at the moment (copy/paste JSON). In the future we’ll want to build a recipe creator UI. Here’s a simple test recipe we’ve been using that will give you the idea: https://gist.github.com/gordonbrander/22eb2f90c7c2cdda022abe0fec9cf67d. The main change is that recipes are JSON now, a format that many are familiar with.
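
In rough shape, a simple recipe is just a list of timed set-point operations, something like the snippet below. I’m sketching this from memory, so treat the gist above as the authoritative example; the field names and values here are illustrative:

{
    "_id": "my_test_recipe",
    "format": "simple",
    "operations": [
        [0, "air_temperature", 22],
        [0, "light_illuminance", 1500]
    ]
}

Each operation is [seconds from recipe start, variable name, desired value].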


#30

@gordonb, yes, both are on GitHub. Here are the links:

  1. https://github.com/serein7/openag_bh1750
  2. https://github.com/serein7/openag_mhz19

And thank you for the recipe sample!


#31

@gordonb I would like to start testing the web UI with brain_docker running as the backend, but I have an issue: since my PFC is running the original image, I don’t have Raspbian Jessie installed on my RPi, which is a requirement for installing docker and docker-compose.

The following packages have unmet dependencies:
 docker-hypriot : Depends: libc6 (>= 2.14) but 2.13-38+rpi2+deb7u11 is to be installed
                  Depends: libdevmapper1.02.1 (>= 2:1.02.90) but 2:1.02.74-8+rpi1 is to be installed
                  Recommends: aufs-tools but it is not installable
                  Recommends: cgroupfs-mount but it is not installable or
                              cgroup-lite but it is not installable

If I install Jessie on the RPi, what else do I need to do to keep the PFC growing my tomatoes with V1.0 (gro-daemon, …)?


#32

@gordonb, this was due to the missing port 5000 forwarding in docker-compose.yml. Once that is fixed (here is the PR), recipe selection starts to work, but I only see the recipe name selected, still with no target values for the variables. Maybe that’s not implemented yet?
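
For anyone hitting the same thing, the change boils down to publishing port 5000 in docker-compose.yml, roughly like this (the service name here is illustrative; see the PR for the real file):

# illustrative service name; the actual one in the compose file may differ
some_service:
    ports:
        - "5000:5000"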


#33

@pauanta One option is to get a second Raspberry Pi and Arduino. You can get openag_brain running on those without sensors (although the UI won’t show much :D).

The other option could be to install openag_brain on an SD card, wait until your tomatoes are harvested and then swap SD cards.


#34

@gordonb, do you guys already have a working configuration for a Proportional-Integral-Derivative (PID) controller for any variable? Since I’ve managed to start a recipe from the UI, I’m eager to reach the next step and see some variable being controlled, not only monitored :slight_smile:


#35

@serein not yet! We’re waiting on the chassis parts to come back from the machinist so we can build the first couple of v2 chassis before trying to tune the feedback loops. That should happen this week. You’re more than welcome to give it a go, though. It’s just a matter of tuning the PID gains for each actuator.
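
For reference, the gains you’d be tuning act on the standard PID terms. A generic sketch (illustrative only, not openag_brain’s actual implementation):

// Generic PID update; Kp/Ki/Kd are the per-actuator gains to tune.
struct PID {
    double kp, ki, kd;
    double integral = 0.0;
    double prev_error = 0.0;

    double update(double desired, double measured, double dt) {
        double error = desired - measured;
        integral += error * dt;
        double derivative = (error - prev_error) / dt;
        prev_error = error;
        return kp * error + ki * integral + kd * derivative;
    }
};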


#36

@gordonb, so how is it going with the v2 chassis?

I hope you can find a minute to help me out with feedback loop architecture.

Here is my setup:

  1. A light_illuminance sensor, represented by the following ROS topics:
    /sensors/bh1750_1/light_illuminance/filtered
    /sensors/bh1750_1/light_illuminance/raw
    /environment_1/measured/light_illuminance
  2. A binary actuator connected to LED lights, represented by the ROS topic /actuators/light_actuator_1/cmd; I can manually turn the lights on and off by sending True or False with the rostopic pub command.
  3. A simple recipe with light_illuminance set to 1500 lux. This recipe creates the ROS topic /environment_1/desired/light_illuminance, and when I run rostopic echo /environment_1/desired/light_illuminance I see data: 1500 being published constantly.
  4. A light_pid controller with its variable parameter set to light_illuminance. This in turn creates the following topics:
    /light_illuminance
    /light_illuminance_cmd
    /light_illuminance_desired
    /light_illuminance_down_cmd
    /light_illuminance_up_cmd

Now, what I don’t get is how to connect: a) the /environment_1/measured/light_illuminance output to the PID’s /light_illuminance input, b) the /environment_1/desired/light_illuminance output to the PID’s /light_illuminance_desired input, and c) the numerical output on /light_illuminance_cmd to the actuator’s input /actuators/light_actuator_1/cmd, which expects a boolean (for (c), I sketched what I have in mind below).

I checked the topic_connector.py software module, but it looks like it only does a global mapping from the low-level /sensors/* topics to /environment_1/*.

I hope you can point me to the source code or a sample of how to wire a PID to sensors and actuators.
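
For (a) and (b), I assume standard ROS topic remapping of the PID node’s topic names is the intended mechanism. For (c), the simplest thing I can imagine is a small relay node that thresholds the PID command into a boolean. A rough, untested sketch, assuming the PID publishes std_msgs/Float64 on /light_illuminance_cmd:

#include <ros/ros.h>
#include <std_msgs/Bool.h>
#include <std_msgs/Float64.h>

ros::Publisher actuator_pub;

// Treat any positive PID command as "lights on"; the threshold is a guess.
void cmdCallback(const std_msgs::Float64::ConstPtr &msg) {
    std_msgs::Bool out;
    out.data = msg->data > 0.0;
    actuator_pub.publish(out);
}

int main(int argc, char **argv) {
    ros::init(argc, argv, "light_cmd_relay");
    ros::NodeHandle nh;
    actuator_pub = nh.advertise<std_msgs::Bool>("/actuators/light_actuator_1/cmd", 10);
    ros::Subscriber sub = nh.subscribe("/light_illuminance_cmd", 10, cmdCallback);
    ros::spin();
    return 0;
}

If there is already a built-in openag_brain way to do this wiring, that’s what I’m hoping you can point me to.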


#37

@gordonb, can you please share your fixtures for the camera?
I’ve added a USB camera to my setup by adding the following to software_module:

{
    "_id": "camera_1",
    "type": "usb_cam:usb_cam_node"
}

I see the ROS topics /camera_1/camera_info and /camera_1/image_raw created, and some data is being published to them.

But in the UI I still see no camera data. Maybe I need to link image_persistence with my camera_1? But I don’t see any parameter for that.

Any advice?


#38

@serein hmm, I’m not immediately sure. The camera setup is pretty new (i.e. still in heavy development and may have bugs). Can you send me your full fixtures as a GitHub gist? I’ll look into this today.

Camera data is attached to the current recipe_start environmental_data_point. Do you see anything in the db for recipe_start? Are there any attachments on that doc?


#39

@gordonb, I think I have found the openag_ros repo with the missing glue code. It seems to be a WIP, so I’d better wait until it takes a more stable shape.


#40

Hi all,

I’m at Science Hack Day San Francisco right now and am trying to log in to the UI, but am receiving a “No response from the server” error. Any thoughts on how to troubleshoot it?

Hopefully someone sees this before tomorrow morning. But if not, I would still appreciate some help with this roadblock.


#41

@adrianlu Is Rose with you at the place where you’re going to have your activity, or at home? If you moved Rose to the new place, did you check its IP address?

If you have the TFT touchscreen installed, you can get the IP address with this command:
ipconfig getifaddr en1

Another option, from your laptop, is to check the IPs of the devices connected to the network:
nmap -sn (ip of your router)/24

Since you use the 2.0 software, I don’t know if you still have the txt file with the IP address.


#42

Hey @pauanta, I think I found my Pi’s IP address. I’m currently SSHing into it. However, I tried using Adafruit’s Pi Finder and it could not find my Pi.