All posts by tripzero

Auto-irrigation system for raised garden using the Intel Edison

The Plan

I want a raised garden, but I don't want to have to water it by hand; I want it automated, like my lawn sprinkler system.  So I've been planning and gathering parts for an auto-irrigation system.  Here are the key parts:

  • Rainwater gathering system
  • Valve control to drip-water plants
  • Solar power (with solar tracking?)
  • Soil temperature and humidity sensors
  • Auto water-soluble fertilizer mixing

In this part, I’ll talk about the solar power system -specifically power storage.

Solar Power: Power Storage

I have a bunch of 350 farad super capacitors lying around.  The cool thing about super capacitors is that they can charge directly from the solar panel.  I picked up a balancer on ebay and connected six of them in series to give me about 16 volts.  I also have a spare 10W Instapark solar panel that I'll use to charge the cells.  The Instapark panel is rated for 22V open circuit.  I shouldn't charge my super caps over 16 volts, so I will need to reduce the voltage a bit.  The easiest way to drop the voltage is to use a series resistor.  Using Ohm's law we can calculate how much resistance we need:

R = V/I

My voltage drop (V) is the difference between 22V (the panel max) and 16V (my super cap array max), which is 6V.  The current (I) I expect to see is 600mA, or 0.6A.  Plugging in my variables I get 10 ohms.  The resistor only has to dissipate about 3.6 watts (6V x 0.6A), but I went with a 10W part to be safe (I figure a 10W resistor for a 10W panel will be fine).
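The arithmetic is worth sanity-checking in a few lines of Python (using the 22V panel, 16V bank, and 0.6A figures above):

```python
panel_v = 22.0   # panel open-circuit voltage
bank_v = 16.0    # super cap bank maximum
current = 0.6    # expected charge current, amps

drop = panel_v - bank_v      # voltage the resistor must drop
resistance = drop / current  # Ohm's law: R = V / I, ~10 ohms
power = drop * current       # worst-case dissipation, ~3.6 W

print(resistance, power)
```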


I found some water resistant enclosures on amazon.  They were the perfect size for my super cap bank.  I got an additional one to put the Intel Edison and related circuits in.  To keep everything watertight while still allowing cables in and out, I picked up 4 of these cable glands from adafruit along with matching water resistant cables.



I used a 5/8″ spade bit to create two holes for the cable glands for the super cap box.



I carefully screwed in the glands and put some gasket sealer on the inside to seal the uneven spots left by the drill.



I did the same thing with the “Edison box”, but on opposite sides.



I then stacked two power supplies on top of each other.  I got the power supplies from amazon.  They have adjustable output and a wide input range.  I have one set at 12V for the valve solenoid and the other at 4.2V for the Edison.


Finally, I attached a power button so I can turn the system on and off.  This too needed to be water resistant.  The white LED is a nice touch, IMHO:



Finished Power Enclosure

The enclosure works pretty well.  It took about 15 minutes to fully charge.  My hope is that it will power the Edison and friends for an entire day and most of the night.  If it turns off in the night, I can live with that.
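The 15-minute charge time lines up with a quick energy estimate (assuming the panel delivers its full rated 10W in ideal sun, and ignoring losses):

```python
caps_in_series = 6
single_cap_f = 350.0
bank_v = 16.0
panel_w = 10.0  # assume full rated panel output

# capacitors in series divide: C = 350/6 ~ 58 F
bank_f = single_cap_f / caps_in_series
# stored energy: E = 1/2 * C * V^2 ~ 7.5 kJ (about 2 Wh)
energy_j = 0.5 * bank_f * bank_v ** 2
# minutes for a 10W panel to deliver that much energy
charge_min = energy_j / panel_w / 60.0

print(bank_f, energy_j, charge_min)  # ~12 minutes ideal vs. 15 observed
```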




Next part we’ll look at the 2nd Enclosure for the Edison and friends.  Stay tuned!

Geek Home Theater Update – Minnowboard Max and lights in action

Quick update to the home theater system.  I installed the Max with the Silverjaw lure (which adds mPCIe and mSATA).  To attach the lights, I also needed a level shifter to take the 3.3V signals up to the 5V required by the APA102s.  Adafruit had just the component for the job: the 74AHCT125.

Here’s the panel with the max in there:

The results are pretty awesome.  The APA102s perform much better than the WS2801s: no flicker, fast updates and, most of all, more lights!

Next up is to add some buttons to turn the thing off when needed and add USB capabilities and a remote.

Intel Edison + Koyo Sprinkler Controller

AKA: How to build a smart sprinkler controller in less than 100 lines of code

Sound impossible?  It isn't.  Here's how I did it.  But before we start, let's define what we mean by "smart".  "Smart", in a sprinkler context (yes, this will be highly subjective), to me means it waters my lawn and other things that need watering at just the right amounts at just the right time.

It must be green.  If it rains, it can't water unnecessarily, so it has to be able to understand the weather.  It also cannot water more than my lawn or other plants need.  This means that it has to know what I'm watering and how much water each thing needs.  With those features in mind, let's begin.

First, the brains of the operation: The Intel Edison.  The Edison is a powerful and small unit with built in wifi and bluetooth.  That means it’s really easy for it to connect to both other devices and to the cloud.

The wifi feature will help us get weather data from the cloud and also evapotranspiration data (ET say whut?  More on this later).  The Edison also has USB, which we will need later on.

The second piece of hardware we will use is the "Koyo".  It's a PLC.  My brother had an extra one that he was willing to donate.  I understand you can do the same thing with the Edison and a relay array, but if you have a PLC, why not use it, right?

The koyo uses UDP for control.  My brother wrote a python library to interface with the unit.  It's pretty easy to use; the readme that comes with the library will help you find the koyo on the network and change its IP address.  The library also lets us toggle the koyo's relays.  The library requires a connection to the koyo.  If you can plug the koyo into your network via ethernet, awesome.  If not, you need something to bridge to the koyo over wifi.  The Edison has wifi, so we are good there.  But it doesn't have ethernet.  This is where USB comes in.

I picked up a USB-to-ethernet adapter on digikey.  Using the Edison breakout board and a USB OTG to USB female adapter, I now have working ethernet -well, almost.  Connman, the network manager that comes installed on the Edison, was a bit of trouble here.

Once we have a connection set up, we need some data from the Internet so we can make the system smart.  There are two sources we will use: Weather Underground and Agrimet.  Weather Underground can tell us if it has rained, and Agrimet will tell us how much to water.

Agrimet has a table that contains the evapotranspiration data for several crop types including "lawn".  Evapotranspiration (ET) is how much transpiration and evaporation of water has occurred during the day.  This number takes several variables into account, including precipitation, solar irradiation, and wind.  For more on ET and how it is calculated, check out the Wikipedia page.  Many local weather stations calculate ET for different crop types, and Agrimet aggregates that data.  Agrimet organizes the data per station, so we will need to find the weather station nearest your location.  You can find your station on a map here.  Remember the station ID; we will need it later.

Once you have the weather station id nearest your area, we can get an ET data table for that station.  The URL to get the table is this:

Replace {stationID} with your station ID, open the URL in a browser, and you should see the data table.  To parse the data I used this script:

To use this class do the following:

"cropName" can be any crop that you see in the chart.  Since we are watering grass, I will use "LAWN" for my crop.  The next thing we want to do is see if it has rained today.  Since our forecast was predicted using yesterday's numbers, any rain that fell today will not have been factored in, so we just subtract today's precipitation, if any, from our forecast.  I use Weather Underground to get the current weather conditions.  Here's my script that combines the ET data with the weather data and turns my sprinkler zones on and off.


The zones are defined in a json file:

I got the rate for each zone by putting out a cup, running the sprinkler for 10 minutes, then dividing the collected amount by 10 to get how much water my system delivers per minute.  Type is LAWN, but it could be any crop.  No, I'm not using this variable in my code yet.
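The watering math itself is simple: net demand is the ET forecast minus today's rain, and run time is demand divided by the zone's measured rate.  A minimal sketch (the zone names, rates, and helper function here are made up for illustration; this is not my actual script):

```python
zones = {
    # rate: inches per minute, measured with the cup test above
    "front": {"rate": 0.02, "type": "LAWN"},
    "back": {"rate": 0.03, "type": "LAWN"},
}

def minutes_to_water(et_inches, rain_inches, rate):
    """Run time needed to replace today's water loss."""
    demand = max(et_inches - rain_inches, 0.0)  # rain already covered part of it
    return demand / rate

# e.g. 0.25" of forecast ET and 0.05" of rain so far today
for name, zone in zones.items():
    print(name, minutes_to_water(0.25, 0.05, zone["rate"]))
```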

The last thing to do is set it up on a schedule.  To do that, check out my video.


I’m almost completely happy with this setup after a few weeks of operation.  I love being able to ssh into the sprinkler and turn it on and off -even from my android phone.

Building the Geekiest Home Theater: TV wall mounting

My room TV is working great.  I’ve got the minnowboard max hooked up to some ws2801 LEDs and acting as a DLNA renderer.  Since my last post on it, I’ve also paired up my PS3 six-axis controller and have been using it as a mouse.  I have other cool ideas planned for it, but before those are finished, I wanted to get a Max-based system on my other TV and work it into a home theater setup.

First thing I have to do is get my TV off the floor.  The base for the TV broke, so when we moved we decided to mount it on the wall.  We found an “articulating” wall mount that looked like it would meet our needs:

What I like about this system, is that it has an open area on the base mount on the right and left that’s perfect for a double gang box.  So I grabbed one of these

It fit very nicely in the open area:



I think it looks really clean -especially after I zip-tie up those cables!

I also wanted to add some ambilights to this setup like I did my room TV.  However, I want to do something better.  Enter, APA102.

The APA102 is similar to the older WS2801 in that it's an individually addressable LED strip that supports 24 bit color.  But that's about where the similarities end.  The WS2801, shown in the image below on the right, is a larger chip that takes up valuable space.  You can typically only find the WS2801 flavor in strips of 32 LEDs/meter.  The APA102 (shown on the left), however, has the IC built right into the LED.  This allows up to 144 LEDs/meter.


The protocol is very similar to the WS2801's.  You send an array of bytes in 32-bit segments.  The first 32-bit segment (the start frame) must be all 0s.  In each LED frame, the first byte sets the brightness (usually all 1s, i.e. 0xff) and the remaining 3 bytes are the color data.  I updated my python light library to support the APA102 in only a few minutes.  Here's basically the meat of the code:
data = bytearray()

# start frame: four 0x00 bytes
data += chr(0x00) + chr(0x00) + chr(0x00) + chr(0x00)

for rgb in ledsData:
    data += chr(0xff)  # brightness byte: full on
    # apa102 is GBR because THINGS
    data += chr(rgb[1]) + chr(rgb[2]) + chr(rgb[0])

# end frame: four 0xff bytes
data += chr(0xff) + chr(0xff) + chr(0xff) + chr(0xff)

The fun thing was discovering that it uses GBR color format.

Mounting the LEDs

Instead of mounting the LEDs on the back of the TV, I'll be attaching them to the wall.  I'm using the same aluminum brackets I used with my last LEDs.  However, this time I'm going with a flat black strip at 60 LEDs/meter.  4 meters of it, for a grand total of 240 LEDs.  To power them all, I need about 80W at 5V.  This turned out to be a problem: I was only able to find a 5V power brick rated for 50W.  Not good enough.  After some more searching, I was able to find this on amazon:

This can supply up to 100W at 5V, which should be more than enough.  The problem is… where am I going to put it?  It's too large to fit in a double or triple gang box.  And if I run a cable through the wall for this, I'm guaranteed to lose a bit of voltage (4-8% depending on which wire I use).  I also need a place to put the Max.  After a trip to Home Depot, I found what I needed: a "telecommunications" box, 14″ by 14″, that fits and mounts between two studs.  This turned out to be almost perfect.  I cut a hole in the laundry room wall, right behind the TV, to the size of the box.  One thing I didn't plan well: the double-gang box for the TV prevented the two boxes from being in the same spot.  To fix this, I just cut a large hole in the telecom box to let the gang box come through.  This ended up being advantageous because it's much easier to wire and rewire from inside the box than by removing the outlet.
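That 4-8% estimate is straight Ohm's law: at 5V, even a short in-wall run at high current eats a noticeable fraction of the supply.  A rough sketch (standard copper wire resistance values; the 1.5m run and 10A load are assumptions for illustration):

```python
# ohms per meter for solid copper wire
ohms_per_m = {14: 0.00827, 16: 0.0132}

def percent_drop(awg, run_m, amps, volts=5.0):
    loop_ohms = 2 * run_m * ohms_per_m[awg]  # out and back
    return 100.0 * loop_ohms * amps / volts

print(percent_drop(14, 1.5, 10))  # ~5% on 14 AWG
print(percent_drop(16, 1.5, 10))  # ~8% on 16 AWG
```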


I have plenty of room to mount a max in the middle.  There’s also a nice looking cover that screws shut to protect the insides from little hands:


That’s it for this part.  Next part I’ll continue the adventure building a home theater including mounting the LEDs, the Max and speaker system.

Super Dock Qi Upgrade Video

The “Super Dock” is a dock for my moto x (2014) phone that has a giant 4200mAh battery in it.  It keeps my phone charged for a few days instead of just 10-12 hours.

I made the dock several months ago.  It consists of 2 parts.  The first part is the actual dock.  It’s molded so that the moto x just snaps into it.  It has a USB plug at the bottom to charge the phone.

The second part is the cover.  It is molded to fit around the dock and protect the battery and circuit.  It fits snugly on the dock part and the bottom of the phone.

Here’s the video:

Minnowboard Max RFDuino Lure


This lure is still a very rough draft, but it should allow for a number of pretty cool features:

  • Bluetooth Low Energy peripheral mode
  • Programmable with Arduino IDE
  • Adds some IO, PWM, SPI, I2C to the max
  • Communicates with Max over UART

Still to do:
  • Define RFduino <-> Max communication protocol
  • Create protocol library for arduino
  • Create protocol library for Max (python?  C++?)
  • Write examples

The source including kicad schematics is here:

Bluetooth Low Energy LED – Part 1: Hardware

LEDs should be taking the world by storm.  They are so far the most power efficient lighting source available.  Because they are more power efficient, they can save you money.

There are many smart LED lights out there.  The Philips "Hue" is one.  Philips has opened up its API (to registered developers), which allows people to write custom software to take more control over their lighting.

However, there are concerns.  The Hue is a connected device.  It is on the Internet, which brings security concerns.  Aside from security, maybe you want to do your own lights?  Perhaps you just want to learn and discover how such systems work?  Maybe you have a killer feature in mind that isn't supported by the other lights?  For all of these reasons, I have set out to create my own Bluetooth Low Energy smart LED.

DC Power and Solar possibilities

LEDs are DC powered, meaning that in order to use them in your AC-powered home, you need an AC-DC converter.  The AC-DC conversion itself may be no less efficient than a DC-DC step-down converter; however, if your power comes from solar, which is also DC, going from DC to AC and back to DC adds inefficiency.

If you have a solar setup, which someday I hope to have, you may want to try to convert your home lighting system to DC powered, in which case, off-the-shelf smart LEDs won’t work because they have built-in AC-DC converters.
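To put rough numbers on the conversion penalty (the efficiencies here are illustrative guesses, not measurements):

```python
inverter_eff = 0.90  # solar DC -> house AC (assumed)
adapter_eff = 0.90   # wall AC -> LED DC (assumed)
dcdc_eff = 0.90      # single DC -> DC step-down (assumed)

dc_ac_dc = inverter_eff * adapter_eff  # ~0.81: nearly a fifth lost
print(dc_ac_dc, dcdc_eff)  # staying DC the whole way loses only ~10%
```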

Bluetooth Low Energy

Bluetooth Low Energy (BLE) is a new protocol, incorporated into the Bluetooth 4.0 standard, for low energy consumption.  It doesn't have the speed that classic bluetooth has, but it uses a lot less power.  Its range is also limited, but for a small house it's probably tolerable.  Other smart LEDs use BLE too; some use Zigbee, another low-power wireless technology.

I use BLE in some of my other projects, so it made sense to use it here as well.

One cool device that supports BLE is the RFDuino.


This device combines the programming interface of an arduino with a chip that supports BLE.  Its API is pretty flexible, allowing you to customize the UUIDs and device name and even change the transmit power the radio uses.  Best of all, in ultra low power mode it consumes only 4uA.


There are lots of LEDs to choose from.  I went with a 10W RGB LED that I found on Amazon.


10W is bright, but not too bright.  With this hardware guide, it's also possible to use a 20W LED.  If you go bigger, you'll need different cooling than what I talk about here.  You can also use a plain white LED of 10W or 20W.  You'll have to calculate the resistor values yourself, but there's help here.
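If you do go the plain white LED route, the resistor math is the classic R = (Vsupply - Vf) / I.  A sketch with hypothetical numbers (check your LED's datasheet; a 10W white module is often somewhere around 9-11V forward voltage at roughly 900mA):

```python
def series_resistor(v_supply, v_forward, amps):
    """Classic LED series resistor: R = (Vs - Vf) / I."""
    ohms = (v_supply - v_forward) / amps
    watts = (v_supply - v_forward) * amps  # power the resistor must survive
    return ohms, watts

# assumed: 12V supply, 9.6V forward voltage, 0.9A drive current
print(series_resistor(12.0, 9.6, 0.9))  # ~2.7 ohms, ~2.2W
```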


LEDs can get hot.  So cooling is necessary.  I went with the passive cooling route and picked up 5 of these on amazon for pretty cheap:

They have mounting holes for 10W or 20W LEDs and come with the screws and thermal paste.

The Circuit

We want to be able to adjust the brightness on all three color channels.  To do this, we need 3 n-channel MOSFETs and some resistors.  Here’s my circuit diagram:


LEDs 1-3 represent the Red, Green and Blue channels.  If you have only a single white LED, you don't need LEDs 2 and 3 or their MOSFETs.  This circuit diagram uses the RFDuino SMD module instead of the DIP.  If you are using the DIP, you won't need the USB programmer connections.

NOTE: All the code including the RFDuino code, android app and circuit diagrams are all open source.  I will provide links in part 2 of this guide.

NOTE 2: I’m not an electronic engineer.  I’m learning this stuff as I go and sharing what I learn.  Please feel free to correct me, my designs or even better, submit a patch!


For power I use a 12V DC power supply at 3.0A.  The LED takes 12V and the RFDuino uses 3.3V.  To supply the 3.3V for the RFDuino, I use a mini buck converter that takes 12V and converts it down to 3.3V.

The Bulb

How do we house all this hardware?  Well, I designed a “bulb” using a website called “tinkercad“.  Using this website you can design simple 3D models that you can print either using online services, or with your own printer (or in my case, a friend’s printer).

The case is round so that the round heatsink fits perfectly inside it.  I printed the case using clear resin.  My first bulb did not have holes in it to breathe, so it got kinda hot inside.  This latest revision has breathing holes.  Last but not least, it has a hole at the bottom for power input (sized to fit a standard socket adapter) and a lid that fits snugly on top.

You can copy the design I made here.

For the power socket, I ordered these on amazon:

I used clear epoxy to cement the sockets in place.

WARNING: One flaw in this design is that it uses a standard socket, but if you plug this LED into a standard AC socket, bad things could happen.  The LED expects DC.  I modified a lamp to provide DC, which I cover later in this guide.

After putting everything in the bulb, I was pretty happy with the results:

DC Powered Lamp

As noted above, I modified a lamp to accept DC input.  This was pretty easy: I snagged a DC barrel adapter, cut some wires, attached the adapter, and done!

In part 2, I’ll go over the software to get this up and running including an Android App, Linux “LED” Server and the RFDuino code.  Stay tuned!




Minnowboard Max Video Renderer with Ambilights – Part1: Software


The minnowboard max is a pretty cool platform.  It's small -just a little larger than a credit card- power efficient and, best of all, powerful.  It uses a single or dual core Intel Atom with hyperthreading.  The best part about it, however, may be the integrated graphics with open source accelerated drivers.

Because the drivers are open source, you can expect them to generally "just work" on a typical linux distro.  No extra EULAs or compiling necessary like on other embedded systems.

The max's Intel HD graphics also supports OpenCL 1.2 via the open source beignet project.  OpenCL allows you to offload otherwise CPU-intensive computations onto the GPU, which is specialized for certain tasks.  Having OpenCL available in an embedded system opens up a lot of possibilities, including image processing via the open source OpenCV (Computer Vision) project.  I will be using all of these components in this project.

Goal: To create a DLNA renderer that uses an LED strip to display an ambient light which correlates to the image on the screen.  There are several projects out there that do this: boblight and hyperion are a few.  In an effort to teach myself some new skills, I opted not to use any of these projects and instead start from scratch with an architecture that could utilize the power the max avails.  I believe this exercise has created something simple, yet unique.

Components of the system:

  • OpenCV for image analysis
  • Gstreamer to play the video
  • Beignet for OpenCL/GPU offloading
  • Rygel for DLNA renderer support
  • Vaapi for hardware accelerated decoding/encoding
  • MRAA for accessing IO on the Max
  • MaxVideoRenderer
  • Python – the language
  • Ubuntu 15.04

Hardware:
  • Minnowboard Max
  • LED Strip with the WS2801 IC (google for LED strip and WS2801 and you’ll find dozens of options that aren’t very expensive)
  • Aluminium right-angle bracket I got from Home Depot for $2
  • Double-sided heavy duty 3M tape.
    (More about hardware in Part 2!)


OpenCV is a library for computer vision.  It's used for object recognition and detection, and it has a lot of image manipulation routines that can take advantage of hardware acceleration where available.  OpenCV 3.0, now in beta, features transparent OpenCL usage.  In the 2.4 days, you had to use special opencv function calls to take advantage of OpenCL.  In 3.0, these have been unified into the same calls; the underlying OpenCV system decides whether it can use OpenCL on the GPU or not.

Ubuntu 15.04 doesn't have OpenCV 3.0, so we will have to build it from source.  First, let's get the dependencies going.

sudo apt-get install build-essential cmake cmake-gui python-dev python-numpy git

sudo apt-get build-dep opencv

Next, check out opencv from github:

git clone

cd opencv/

mkdir build

cd build/

cmake-gui ..

These commands will bring you to the cmake gui.  Click Configure to generate Unix-style makefiles, then make sure you enable the python module and the python examples.  After configuring, look at the output to confirm the python module was enabled.  If it wasn't, look for clues in the output as to what was missing.

Tip: To make compiling faster and to eliminate errors, I usually turn off the opencv_java module in cmake.

Type "make -j5" and get yourself a drink and maybe something to eat; it takes a while to compile opencv.  After make is done, run "sudo make install" to install opencv.


Beignet is an open source project that provides OpenCL support for Intel graphics platforms.  It supports the minnowboard max as well as Core “i” platforms.  Ubuntu 15.04 has version 1.0.1 already in the repository.  That will work wonderfully for our needs:

sudo apt-get install beignet ocl-icd-libopencl1 ocl-icd-dev

Gstreamer and Vaapi

Gstreamer is a powerful media framework that supports decoding and encoding of numerous media types.  It has a plugin system where you combine several "elements" into a "pipeline".  We will use this framework with our own customized and optimized pipeline.  Ubuntu comes with Gstreamer 1.0 by default, but we need a few extra packages for rygel and for vaapi support:

sudo apt-get install libgstreamer1.0-dev gstreamer1.0-vaapi libgstreamer-plugins-base1.0-dev gstreamer1.0-tools

Test out gstreamer with vaapi support by using gst-launch-1.0:

gst-launch-1.0 videotestsrc ! vaapisink

You should see a test video image.


Rygel is a DLNA framework for serving and rendering DLNA content.  Ubuntu has a slightly older version of rygel that doesn’t have python bindings enabled.  Further, upstream rygel does not yet have python bindings for the gstreamer renderer library.  I created a patch to be merged upstream that enables the bindings.  So for now, we’ll use my github fork until the patch is merged upstream.

git clone

Next, let’s get the build dependencies:

sudo apt-get build-dep rygel

sudo apt-get install python-gi libgirepository1.0-dev

We also need to grab mediaart 2 from github.

git clone

cd libmediaart

./ --enable-introspection=yes

make -j5

sudo make install

Build Rygel:

cd rygel

./ --enable-introspection=yes

make -j5

sudo make install

If everything compiled and installed, we can now test rygel out.  I use BubbleUPNP on my android to control DLNA renderers.  It also allows me to play content from my phone.  There are probably DLNA apps for other platforms.  Look around and find the one that’s best for you.

To run the example rygel renderer, navigate to rygel/examples/gi and run "python".  Note that you may have to edit the interface, which is hardcoded to "eth1" at the time of this writing, to whichever interface on your system has an active connection.  When I run this, I see some output about deprecated "SOUP" calls.  This usually indicates to me that it's working.  I can now launch BubbleUPNP on my phone and select "rygel gst renderer" from the renderers list.


MRAA is a library for accessing IO on various devices including the Max, RPI, Intel Edison and some others.  It has c++ and python bindings and is pretty easy to use.  It supports SPI, I2C, GPIO, PWM, and AnalogIO.

We will need to grab the source:

git clone
sudo apt-get install swig
cd mraa
mkdir build && cd build
cmake ..
make && sudo make install

To test, run python and enter the following:

import mraa
print(mraa.getPlatformName())

This should print "MinnowBoard MAX".  If it did, it works!

MaxVideoRenderer – putting it all together

My source code for this project is found on github:

git clone

cd MaxVideoRenderer

mkdir build

cd build

cmake ..

Part 3 of this project series will go into greater detail on how this system works and the discoveries I made along the way.  For now, let's just run it.

python eth0 MaxRenderer 0

This will run a DLNA renderer named “MaxRenderer” on “eth0” with “0” lights.  We don’t have any lights hooked up yet, so this should be fine.

Now we should be able to see MaxRenderer in our DLNA control app and play content to it.

Part 2 will go into setting up the LEDs.  Stay tuned!

NOTE: much of this comes from memory.  If you run into issues, drop me a comment and I may remember more of what I did to get this all going.