Introduction: How to Easily Control or Monitor Anything

When you get to the point in life where the need to solve a real everyday problem matters more than messing with the latest, greatest and cheapest electronics or software, you will have arrived at the v2 zone.

Step 1: One Board, One Platform to V2 Them All

    • A - 64 MB DDR RAM
    • B - Atheros AR9331 SoC running OpenWrt Linux
    • C - 20-pin Linux GPIO header
    • D - External WiFi antenna
    • E - Oolite AR9331 Linux module
    • F - Internal WiFi antenna
    • G - WiFi driver
    • H - Power button
    • I - DC power socket
    • J - Power management
    • K - External watchdog chip
    • L - Watchdog-enabled power reset switches
    • M - Ethernet transformer
    • N - 10/100 Base-T Ethernet adapter
    • O - USB2 adapter
    • P - Micro SD card adapter
    • Q - Digital pin / I2C headers
    • R - ATmega2560 hard reset switch
    • S - ATmega2560 SPI programmer interface
    • T - Digital pins / 1-wire header
    • U - 1-wire enable switches
    • W - ATmega2560 micro-controller
    • X - Precision analogue sensor header
    • Y - Precision analogue resistor chain enable switch
    • Z - Digital/analogue pin header
    • a - UART headers
    • b - 4 open collector relay drivers
    • c - FTDI USB-to-serial chip
    • d - Digital pin header
    • e - Logic level shifter
    • f - ATmega2560 programming enable switch
    • g - ATmega2560 micro USB connector
    • h - RTC (real-time clock) chip
    • i - USB host adapter chip
    • j - 8 open collector relay drivers
    • k - Real-time clock battery

    Step 2: V2 - Conceptual Design

    In this stage, we were verifying our conceptual design.

    While electronic hardware keeps getting cheaper, the cost of development keeps rising. How could we use the same hardware and the same software for any IoT or control application without further development?

    V2 is wordplay for Version 2.

    The v1.x boards were double-sided, through-hole plated boards, as these were used for DIY educational programs. I was on v1.3 when I met Trigg from Gainstrong and opted to make a v2 board, a self-contained SMT (surface mount) based board.

    I feel that the way different microcontrollers and microcomputers are used across applications tends to follow similar approaches, even though they have different specifications and capabilities. The hardware end of IoT or control projects is too hardware-ish and the software end of such projects is too software-ish; it's almost like oil and water in a jar, they won't seamlessly merge.

    A fighter is considered a mixed martial artist (MMA) when they have a stand-up fighting game such as kickboxing and a ground fighting game such as wrestling or jiu-jitsu. One style is effective in an outside kicking or punching range and the other in an in-tight grappling range; people tend to get knocked out in the middle of the two ranges because neither fighting style addresses the transition. They don't merge seamlessly. Kenpo karate uses "rapid hand" attacks and "shifting" to glue the two ranges together in a true mixed martial arts fighting discipline such as Kajukenbo, where the transition between the two ranges is seamless, safer and scalable.

    IoT projects don't merge seamlessly because software can be abstracted whereas the physical, real-life end of things can't be. So hardware and software solutions for one IoT or control application cannot be used in a completely different application without changing hardware or software parts; it is not scalable.

    The v2 controller addresses this issue by abstracting the sensor interface circuits using generic input potential divider chains, and by using Linux to seamlessly glue the physical end of IoT or control projects to the software presentation layers. In other words, it is a second-generation way of dealing with physical computing applications compared to the traditional approach, hence the name: version 2, or v2. There is more information on abstracting sensor inputs in this earlier Interfacing Sensors instructable.

    As with many PCBs, the path to tracks on a copper-clad board begins with wires on a breadboard. Our control board at v0.x was no different.

    Step 3: Mean Time Before Failure - MTBF

    The reliability of a system is the probability that it will perform as expected under specified conditions for a given duration. For a constant failure rate, reliability decays exponentially with time relative to the Mean Time Before Failure (MTBF): R(t) = e^(-t / MTBF). Each connection on a circuit adds to the failure rate, lowering the MTBF and reducing the circuit's reliability.
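
    As a back-of-the-envelope sketch (assuming a constant failure rate, the usual simplification), here is how stacking boards eats reliability; the MTBF figure is invented for illustration:

    import math

    def reliability(hours, mtbf_hours):
        # R(t) = e^(-t / MTBF) for a constant failure rate
        return math.exp(-hours / mtbf_hours)

    one_board = reliability(8760, 50000)  # one year on a 50,000 hour MTBF board
    print(round(one_board, 2))            # ~0.84

    # Reliabilities of boards stacked in series multiply: three such boards
    # leave the whole stack at ~0.59, so every extra board and connector hurts.
    print(round(one_board ** 3, 2))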

    As farmers, the nature of our physical applications meant the number of extension boards grew with every new application. Soon every application was a precarious stack of unreliable boards.

    Step 4: Linux >= 10+ Expansion Boards

    We found that connecting a Linux microcomputer to the Arduino micro-controller through the serial port was the same as adding more than 10 shields to the micro-controller, but with fewer connections, higher reliability and greater scalability from the integration of a powerful operating system.

    Linux added:

    • WiFi and secure networking tools
    • A real-time clock using NTP
    • SD cards for extra storage
    • USB for video cameras and other uses
    • Powerful operating system and software utilities

    In the images, the micro-controller with test sensors is connected to a BeagleBoard micro-computer running Debian Linux using a logic level shifter. I used this setup for conceptual design development.

    Step 5: Shielding the Breadboard

    The first board I created for this was an Arduino Uno shield with 8 sensor interface circuits and 4 relay driver circuits. Pretty much everything that was on the breadboard transferred to this board, improving testing reliability.

    The v1.1 Arduino Uno shield board is shown above. I created this board when OshPark was new and Laen's panelling software was buggy, which is why the board is zebra-striped; I remember it would short tracks on the edge of the board in production. Laen was always helpful back then.

    Step 6: MightyOhm - Joining the Resistance

    The cheaper BeagleBone Black was not in production yet, so the development board cost almost $100. While trying to figure out a solution to this, my friend Chris pointed me to the MightyOhm WiFi radio project, which hacked an Asus 520g router into a Linux OpenWrt micro-computer.

    This board is hackable because Asus left the console UART connection on the board: all it needs is a PCB header soldered on, as shown above, to expose the RX, TX, Vcc and GND pins, giving us access to the router internals using an FTDI cable or a microcontroller. It also helps that you can build OpenWrt embedded Linux kernel images for its Broadcom BCM5354 System on a Chip (SoC) and upload them to the board easily.

    Step 7: V1.1 - Design for Prototyping

    After verifying the conceptual design on the breadboard, we moved on to designing our first controller prototype. The v1.1 smart controller used an Arduino Uno microcontroller for real-time computing, the custom Arduino shield to interface with the physical world and the Asus 520g router running embedded Linux for brains.

    Design for prototyping using the Asus 520g was really fun and allowed us to build our software harness and API platform for the control system.

    There were still too many physical connections, lowering the MTBF of the system. The Linux microcomputer was not powerful enough and the BOM cost was relatively high.

    Step 8: V1.3 - Design for Cost

    As the backend software suite started coming together, the limitations of our hardware design became apparent. The main issues were cost and reliability, so we decided to make a better, cheaper physical control system.

    The first step was to combine the Arduino Uno and our shield design into one standalone board. The v1.3 microcontroller interface board is shown above.

    Next, we needed a suitable embedded Linux microcomputer.

    Step 9: Hacking the TP-Link 703 Router

    The Asus 520g had 4M of working space for Linux and my application; the TP-Link 703 came with 8M and was only $20. It also had an Atheros AR9331 System on a Chip (SoC) microprocessor and could run OpenWrt Linux. It was heavenly, and perfect for controlling the v1.3 standalone controller board.

    Step 10: Six Out of Ten

    It required a few physical hacks to expose the serial console port before it could be flashed with a new Linux image. This meant adding power connections and header pins on the board as shown above. The problem was that everything on this board was surface mounted and really delicate to work with, as tracks and pads on the board would peel easily. The flimsy tracks and our connections (RX, TX, Vcc, GND to the board and then to the header pins) reduced the MTBF drastically. For every 10 I tried to hack, only 6 worked without surgery and 3 broke off later. I had to glue the connections to keep them from coming off after making them. It was expensive, especially in terms of time.

    But once the connections were securely exposed, the TP-Link 703 running OpenWrt is a fun embedded Linux board for Internet-based physical applications.

    Step 11: V1.3 Controller

    The images above show the combined Arduino standalone microcontroller board connected to the hacked TP-Link 703 board. A close-up of the actual connections is also shown. The standalone board polls the microcontroller's input pins and then sends the readings as a JSON object to the TP-Link 703 running OpenWrt Linux for processing. The microcontroller in turn listens for JSON objects from the microcomputer to do things such as turning on relays. This combination of a microcontroller running in real time with a microcomputer running Linux is the basis of the kj2arduino libraries.
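
    A minimal sketch of the Linux side of this exchange, assuming the pyserial library; the device path and the "relay3" command format are illustrative, not the exact kj2arduino protocol:

    import json
    import serial  # pyserial

    # Assumed serial device; 38400 baud matches the data objects shown later.
    port = serial.Serial("/dev/ttyATH0", 38400, timeout=5)

    # The microcontroller polls its pins and emits one JSON object per line.
    line = port.readline().decode("utf-8", errors="replace").strip()
    readings = json.loads(line)
    print(readings["pins"]["A1"])  # a raw 0-1023 analogue reading

    # Commands travel the other way as JSON too, e.g. switching a relay on.
    port.write((json.dumps({"relay3": 1}) + "\n").encode())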

    Step 12: V1.3 Enclosures

    The v1.3 standalone microcontroller and OpenWrt based microcomputer had a nice form factor and dressed up well. A complete setup with sensors for an aquaponics garden is shown in the images. Sensors were easily terminated with stereo audio connections. This setup worked great when it did.

    Step 13: V2 - Design for Manufacturing

    I think we attained the core objective of the v1.3 controllers, which was to make a better, cheaper smart controller than the v1.1. However, some parts of the process were so cheap that it was not possible to scale production from DIY mode to benefit from industrial production processes. So we decided to design a much better, cheaper and production-capable smart controller: the v2 controller.

    I liked using OpenWrt on the Atheros AR9331 SoC in the TP-Link 703. On further research, I found the Oolite board from GainStrong, which used the same AR9331 SoC as the TP-Link 703 but with 16M, and with all the critical pins exposed for hacking.

    The three embedded solutions are shown above: the Asus 520g, the TP-Link 703 and the tiny "powerful" AR9331 Oolite board from GainStrong. Another reason for choosing the Oolite board is that it carries WiFi certification for RF interference.


    Step 14: V2 Block Diagrams

    The objective for the v2 controller board was to combine the working prototype design (v1.1) and the low-cost design (v1.3) as the basis for a manufacturing design. The v2 controller had to be cost-efficient, functional and easy to manufacture. This meant combining the Linux micro-computer, the Arduino micro-controller, the sensor interfaces, relay drivers and peripheral connectors on the same board for SMT production.

    We used the AR9331 as the Linux computer but upgraded the micro-controller to use the ATMega 2560. We also added sensor interface circuits, relay drivers, PSU, USB, logic level shifters, watchdogs and other circuitry all on the same board. The core design is shown in the block diagram for the v2 smart controller above.

    Of particular pride is the external watchdog, which watches both the micro-computer and the micro-controller and can initiate a cold reboot through the MOSFET switch on the power line if hardware problems are detected, or on demand.
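
    As a sketch of what "petting" such a watchdog from Linux can look like, assuming the classic sysfs GPIO interface; the pin number is invented, not the v2 board's actual watchdog line:

    import time

    PIN = 18  # illustrative GPIO number, not the real v2 mapping

    def write(path, value):
        with open(path, "w") as f:
            f.write(value)

    write("/sys/class/gpio/export", str(PIN))
    write(f"/sys/class/gpio/gpio{PIN}/direction", "out")

    # Toggle the line periodically; if Linux or this process hangs, the
    # external watchdog stops seeing edges and power-cycles the board.
    while True:
        for level in ("1", "0"):
            write(f"/sys/class/gpio/gpio{PIN}/value", level)
            time.sleep(0.5)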

    Step 15: That New PCB Board Smell

    I wanted a board so beautiful that I would have no problem using it as a paperweight if I ever bricked it.

    The ATmega side is relatively easy to add software to, as any Arduino Mega code will run on it and it can be programmed over USB like a regular Arduino.

    Since the hardware connected to the Linux microcomputer is customized, we had to build a custom kernel binary for the brand-new boards. This kernel is preinstalled during manufacture, so users don't have to do anything but start using the boards.

    Outside of a faulty component on the board, which is normal during MTBF burn-in, it is near impossible to brick the OpenWrt side, the ATmega side or the v2 controller.

    Step 16: Those New PCB Board Nightmares

    At the office and at home my workbench is next to the WiFi router, so everything worked. But every time I went out to do demos or presentations, the application would fail, then work again when I got back to the office. To make it worse, it worked fine for some boards and weirdly for others.

    The first 5 boards were OK but needed some upgrades. The next batch of 100 gave me nightmares for a little while. Then the next 100 worked nearly perfectly, except for a dim LED and an electrolytic capacitor rated for 12 VDC rather than the 48 VDC the power supply board can handle.

    It took a long time to realize the problem was related to my proximity to the WiFi router, and to narrow it down to the orientation of a tiny, pencil-tip 0 ohm SMT resistor on the AR9331 module.

    In production, the resistor sits on one side while calibrating WiFi, and is then supposed to be flipped to the other side to default to the internal WiFi antenna. Some of the boards were set for internal WiFi, some for calibration and others for the external WiFi antenna; it drove me crazy. All that was required was to flip the resistor in the internal WiFi direction. It took 3 months to catch this problem, as I could not tell if it was a hardware issue, a networking issue, a kernel issue, an application problem or even an API issue.

    I had to mentally absorb the entire IoT Stack to pinpoint this issue to a factory fabrication problem.

    It is a normal communication problem between teams when moving from 5 boards (hand assembled by an engineer) to 100 boards assembled on the production floor.

    Step 17: V2 Controller Hardware Specifications

    The v2 controller board has the following hardware specifications, determined by the electronic components on the board and the physical connections. The behaviour can be modified using the configuration switches, but the hardware specification cannot change. What I am trying to say is that we have to tell the Linux microcomputer about every connection on the board: USB, the network port, the serial port, the watchdog, the output GPIOs, the input GPIOs, the real-time clock, the Arduino serial connection etc. This requires a custom kernel and a custom U-Boot (roughly, the board's BIOS).

    Step 18: V2 Controller Linux Kernel

    Building a custom Linux kernel for embedded systems is fun. What this means is that we need a bootable binary file that will load and run OpenWrt when the v2 controller is powered on, and that Linux will start aware of, and able to communicate with, all the connections into and out of the v2 controller board. Kernel images are built with the OpenWrt buildroot. In the example images, a kernel image for an Atheros Oolite board with USB video and mjpg-streamer enabled is shown: first the System on a Chip (SoC) is selected, then support for USB is enabled, then the USB video drivers are enabled and finally the video application mjpg-streamer is enabled before the kernel binary is compiled. This process has to be done for every connection on the board before the kernel is compiled to create the Linux installation binary.

    Step 19: V2 Boot Messages

    When the kernel binary is uploaded, the following messages scroll through the screen as Linux boots. The images show the kernel messages as an Atheros Oolite board is initialized, as USB devices are enabled and as the USB video drivers are loaded. Connecting a USB camera will start the mjpg-streamer application and begin video streaming and recording. Likewise, all the other parts of the board configured in the kernel will start automatically, depending on what needs to get done.

    Step 20: V2 Board Suite

    After the Linux kernel is bootstrapped and loaded, execution is passed on to the init system, where the operating system's device drivers, libraries and utilities are initialized. Execution is then passed on to the v2 init system, where the v2 suite of software utilities is enabled or disabled.

    The core tasks are:

    • Monitoring the serial input for valid sensor data and transmitting a JSON data object to the API
    • Checking for v2 commands to execute
    • Checking for sensor error conditions that need an instant response (e.g. if there is a leak, deal with it before sending data to the server)
    • Custom pluggable control tools, such as controlling temperature, humidity or pH
    • Tools for controlling relays
    • And many others, as can be seen from the tree snapshot

    These are controlled using kijani.json, which determines which scripts to run, which relays to activate and other configurations. kijani.json is synchronized with the data at the API, meaning all these scripts, and thus the v2 hardware, can easily be controlled remotely over the Internet.

    There are 12 open collector relay drivers to enable this, and controlling one remotely is as easy as enabling it and giving it a name; it will then show up as a control button on the API.

    "relay3": {
        "name": "vent",
        "description": "air vent",
        "enabled": 0,
        "on": "ventOn",
        "off": "ventOff"
    },
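
    A sketch of how a board-side script might act on such an entry; the dispatch below is illustrative, the real v2 suite runs the scripts named in the "on" and "off" fields:

    import json

    with open("kijani.json") as f:  # file name from this step
        config = json.load(f)

    relay = config["relay3"]
    if relay["enabled"]:
        print("would run", relay["on"], "to switch on", relay["name"])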

    Step 21: V2 Software Specifications

    The default image build for the v2 controller comes with some of the following specifications:

    Step 22: Use Case: Smart Aquaponics

    The best way to explain how the board hardware, software and backend API platform come together is with an example: an organic food factory.

    In this case, a symbiotic relationship between fish, bacteria and plants is used to grow food using aquaponics. The key components are modelled after nature, controlled using relays and monitored using sensors. The physical layer devices, such as the sensors and relays, are connected to the ATmega. The reason is that the micro-controller runs in real time and gives more dedicated attention to the physical end of things. Security, logic and communications are handled by the Linux end. This overview is shown in the image in this step.

    Step 23: Creating Data From Bits and Bytes

    The ATmega input pins are all polled every couple of seconds and the results are collected into a JSON object and passed through the serial port. This object is generalized because it consists only of raw polled data: analogue pins will read between 0 and 1023, digital pins will be either 1 or 0 and serial data lines will return a value. The data is not calibrated to give a human-readable value. The objective of the ATmega micro-controller end of the board is to faithfully collect bits and bytes from sensors and create a JSON data object. Translation of the raw data into meaningful information is done in the presentation stage, on your phone or computer.

    Likewise, the output pins that connect to the Open Collector relay drivers respond to ON/OFF commands sent as a JSON object on the serial port.

    Here is an earlier instructable on How to interface sensors on the v2 controller.

    Step 24: Creating Information From Data

    Even though the JSON data object from the real-time micro-controller is an accurate representation of what the sensors read, it is just generic data. The Linux micro-computer reads the JSON string on the serial line, sanitizes it and appends data such as the hostname, a time zone, a time stamp and other data elements, converting the data object into an information object (it now has a name, a time zone and other informational bread crumbs).
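
    A sketch of this enrichment step; the field names follow the data objects shown in the next steps, and the sanitizing is reduced here to a simple parse:

    import json
    import socket
    import time

    raw_line = '{"name": "kj_v2_01", "pins": {"A1": 449.0}}'  # from the serial port
    data = json.loads(raw_line)  # a line that fails to parse would be dropped

    # Append the informational bread crumbs.
    data["hostname"] = socket.gethostname()
    data["timestamp"] = int(time.time() * 1000)  # epoch milliseconds
    data["timezone"] = time.strftime("%Z")

    info = json.dumps(data)  # the information object, ready for the API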

    The information object is analysed for local errors before it is transmitted to the backend API for further processing over the Internet.

    Step 25: Posting the V2 Data Sensor Object

    The data is sent to the remote API using the following JSON format:

    curl -H "Content-Type: application/json" -X POST -d \
    '{"baudRate": 38400, "name": "kj_v2_44", "uptime": "5:42:57.290000", "pins": {"temperature_0": 70.47, "relay6": 0, "D38": 1, "D36": 1, "D37": 1, "D34": 1, "D35": 1, "D32": 1, "D33": 1, "D30": 0, "D31": 0, "humidity_temperature": 22.0, "A15": 451.0, "A14": 495.0, "temperature_sensor_count": 1, "A11": 679.0, "A10": 744.0, "A13": 502.0, "A12": 564.0, "UART3": 0, "A1": 449.0, "A0": 362.0, "A3": 413.0, "A2": 426.0, "A5": 378.0, "A4": 393.0, "A7": 375.0, "A6": 372.0, "A9": 1023.0, "A8": 378.0, "relay4": 0, "D29": 1, "D28": 0, "nutrientTemp": 21.31, "corizHumidity": 0.2, "D23": 1, "D22": 0, "UART2": 0, "capacitance": 67785, "D49": 0, "D48": 0, "corizCo2": 2, "D43": 1, "D42": 0, "D41": 1, "D40": 1, "D47": 1, "D46": 0, "D45": 1, "D44": 0, "rtc": "2000/8/15 11:18:25", "humidity": 38.0, "flow_rate_sensor": 0.0, "D8": 1, "D9": 0, "D6": 0, "D7": 0, "D4": 1, "D5": 0, "D3": 0, "corizTemp": -99.8}, "version": "v2.0.0", "wlan0": "192.168.1.34", "initialize": 0, "atmegaUptime": "00:05:42:12"}' \
    https://api.kijanigrows.com/v2/put/kj_v2_01

    Hint: ... send your data here in this format for processing and visualization using the v2 API tool. It still works best with a v2 controller.
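
    For readers who would rather script the POST than shell out to curl, a minimal Python equivalent using the requests library (the payload is trimmed; see the full object above):

    import requests

    payload = {"baudRate": 38400, "name": "kj_v2_44", "version": "v2.0.0",
               "pins": {"A1": 449.0, "D38": 1}}

    # requests sets the JSON content type header for you.
    r = requests.post("https://api.kijanigrows.com/v2/put/kj_v2_01", json=payload)
    print(r.status_code)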

    Step 26: A Data Topology Overview

    The data object with information on the board is transmitted to the API at https://api.kijanigrows.com for logging, visualization, alerting, remote control and other application services. The information can then be accessed from a phone or computer by users with permissions for a device. The backend API is mainly Nodejs based. Users then access the physical data from their devices using REST or sockets.

    Step 27: API @ Kijani Grows Backend

    Humans perceive information from their surroundings in multiple ways. I found that representing physical information takes different types of visualizations to recreate complete pictures. For this reason, the API platform is made of many software tools: Nodejs for data flow management, MongoDB for document processing, InfluxDB for time series trends, Python for processing errors, mjpg-streamer for video streaming etc.

    These tools allow the user to visualize and consume the physical data as graphs, dynamic icons, interactive forms, animations, tweeting alerts, video time-lapse streams and other forms in real time as though they are physically next to the actual project.

    Step 28: Accessing Data Remotely Using the API

    The following typical HTTP call is used to get the latest data for a device from the API. A typical response with the latest JSON data object is shown as well.

    curl -k https://api.kijanigrows.com/v2/device/get/kj_v2_01


    { "baudRate": 38400, "name": "kj_v2_01", "uptime": "1:24:10.140000", "pins": { "D38": 0, "D39": 0, "D36": 0, "D37": 0, "D33": 0, "D30": 0, "D31": 0, "A15": 422, "A14": 468, "A11": 624, "A10": 743, "A13": 475, "A12": 527, "relay8": 0, "UART3": 0, "A1": 933, "A0": 1023, "A3": 1022, "A2": 1023, "A9": 1023, "A8": 348, "D29": 0, "D28": 0, "nutrientTemp": 22.44, "D23": 1, "D22": 0 }, "version": "v2.0.0", "wlan0": "192.168.1.2", "initialize": 0, "atmegaUptime": "00:00:34:52", "timestamp": 1473632348121, "day": 1472256000000, "time": "2016-09-11T22:19:08.121Z", "_id": "57d5d85cd065ea4654009fce" }

    Step 29: Device Lists

    V2 devices sending data will show up in the device lists if permission for this is enabled for the device.

    The status of how long each device has been running is shown and is also color coded. The devices in white are active and the green ones have been offline for more than 30 minutes.

    Step 30: Device Sensor Mappings

    Since we transmit JSON objects with all the raw pin data, we now map this object to the sensors the application actually has, so useful information can be presented to the user.

    The sensor mapping is done in the Sensor Mapping window: you select a pin, then select from a library of commonly known sensor types. The raw data will be calibrated during presentation by the kj2arduino library.

    For instance, in the image, the raw data on pin A1 is mapped to a photocell sensor.

    Step 31: Sensor Details

    Clicking a mapped sensor brings up the Sensor Details interface.

    In this example, clicking on the photocell sensor mapped to pin A1 brings up the photocell details page.

    This shows how raw data for the sensor type will be displayed in human-readable form: units, alarm setpoints, alert messages, Unicode icons, messages and a conversion method are shown for this sensor. The kj2arduino method ldr2lumens converts the raw data from the photocell sensor on pin A1 to lumens.
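
    The actual kj2arduino implementation is not reproduced here, but a sketch of what an ldr2lumens-style conversion can look like; the supply voltage, divider resistor and LDR curve constants are all invented for illustration:

    def ldr2lumens(raw):
        # Assumed 5 V supply, 10k fixed divider resistor and a typical
        # LDR curve (lux = k * R^-gamma); every constant is illustrative.
        raw = min(max(raw, 1), 1022)  # keep away from the ADC rails
        voltage = raw * 5.0 / 1023.0
        r_ldr = 10000.0 * (5.0 - voltage) / voltage  # LDR on the high side
        return 5e7 * r_ldr ** -1.25

    print(ldr2lumens(449))  # the raw A1 reading from the sample data object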

    It is very easy to add new or remove existing sensor types. It is even possible to combine sensor types to create new sensor types.

    Step 32: Mapped JSON File

    The mapped JSON object is created after mapping raw sensor data from the v2 controller to sensor type objects on the API. This object says: of all the raw data the v2 controller just sent, this is the data from the sensors the user is interested in, and this is what it means to the user.

    The new object from the API is used for visualizing data in different forms as shown next.

    Step 33: Displaying Sensor Data As Dynamic Icons

    This is fun.

    Even though I am not the greatest at original artwork, I really enjoy graphical elements that change color and shape based on actual sensor conditions as they happen. It is important to enjoy the things you develop and will be testing for extended periods.

    Three color ranges are used for each sensor: blue for less than, green for OK, red for greater than, and in-betweens. I am probably color blind as well, but it is very easy to configure your own sensor icons and sensor state colors or shapes. The sensor icons shown come from the mapped sensor JSON object shown earlier.
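
    The three-band scheme is simple to express in code; a sketch with the per-sensor thresholds passed as arguments (the actual logic lives in the API's presentation layer):

    def sensor_color(value, low, high):
        # Blue below the OK band, red above it, green inside it.
        if value < low:
            return "blue"
        if value > high:
            return "red"
        return "green"

    print(sensor_color(70.5, 65, 85))  # "green" for a comfortable temperature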

    Step 34: Displaying Sensor Data As Animations

    I like to think that if I got alerts while on my bicycle, I would rather see them with respect to the entire system. The idea behind animations is to visualize the sensor elements in a way that allows me to see the entire system as one picture.

    In the picture, sensors on an aquaponic garden are overlaid to create an aquaponics garden animation. The different colors and shapes are used to display the sensor status.

    It is a little bit hard to scale this, as each different application requires a custom animation.

    Step 35: Displaying Sensor Data Trends

    When I have enough time to look at sensor data and ponder about the physical application, I am likely to be interested in how the system has been performing over time and on how the different parts respond with respect to one another.

    For this kind of analysis I prefer graphs that let me display all the sensor data in parallel to see relationships, with the ability to zoom from 5 minutes to 5 years, refreshing every 5 seconds.

    The v2 controller backend API uses InfluxDB as a time series database and Grafana for graphing. This means you can easily create very dynamic, scalable, beautiful, sharable visualizations that make picking out trends easy, as shown above.

    Learning about the system begins here.

    Step 36: Displaying Sensor Alerts

    As shown earlier, alerts are dynamic and scalable. Each device can enable alerts on a sensor type and on a sensor state, i.e. I can choose to alert if someone walks outside the room but not inside it.

    Alerts are sent as tweets by default. 280 characters with a multimedia ability are sufficient to catch my curiosity.

    Generally, I get bored with plain messages, so quotes, custom messages and Unicode images fill up the rest of the tweet, making the applications fun and sassy, as shown in the graphic above.

    It is easy to send out email, text or phone alerts, but the costs are different.

    Step 37: Is the V2 Controller Open Source?

    A successful open source project means more than just posting files on GitHub; it means keeping the project active and supporting it. I have not had the cycles for this, so I haven't done it yet.

    That said, the v2 controller is not the board, it is not the software, nor is it the backend; rather, it is an end-to-end full IoT stack platform. The kernels will only work fully on the v2 controller board; the board utilities could probably be ported easily to a Raspberry Pi or another Linux micro-computer. The ATmega2560 code will run on an Arduino Mega, though you would need numerous shields depending on what you were doing.

    The API will take a well-formed JSON object from any source and apply the same business logic to the data. One would need extra permissions to see their data on the v2 platform.

    The kernel and the ATmega files are at https://kijanigrows.com/downloads. There is lots of documentation on the v2 controller board on my blog and in the v2 knowledgebase. You can get the board preinstalled with all the software and ready to go from here.

    There is so much I have not touched on about the controller and platform in this instructable. With some help I could open source the project; with some artistic help, the sensor images would look better.

    The WiFi module for the V2 controller has EMC certification so the v2 controller board is generally safe to use in production environments globally.
