We aim to build an innovative smart technology that takes the hassle out of gardening as we know it and enables anybody to grow their own food and plants indoors, remotely, from a mobile or web interface, making it simple, fun and interactive. It'll water them, feed them and make sure they have optimal growing conditions 24/7. Just sit back, relax and watch your plants thrive, knowing that everything is being taken care of.

It uses an array of sensors and actuators to create the perfect growing environment for your plants, watering and feeding them exactly what they need when they need it.

The setup will be connected to our mobile app and a web application, so you can always stay connected with your plant buddy. With their help you can track and monitor your plant's growth and understand its progress. They will also let you experiment with custom settings, which opens up endless possibilities.

Step 1: How Does It Work?

The project consists of 4 parts:

1. Plant Habitat

2. Android App and Web app

3. Real-time Cloud hosted backend

4. Social sharing

Plant Habitat

This is the plant together with an array of sensors and actuators, controlled by an Intel Edison. The Edison is connected to the cloud in real time via a NodeJS server.

Realtime Cloud Hosted Backend

The Edison, the mobile app and the website are all connected to the cloud back-end, i.e. Firebase. There is no middleware involved in our architecture; it is a two-tier architecture.

Android App and Web app

To control the setup and monitor the plant's data, we have two client applications.

Social Sharing

You can share your plant's life with your friends and on social media.

Step 2: Features

Intel Edison will act as a data acquisition system, gathering all the data from the plant and its periphery, processing it and sending it out to the server in real time. It will monitor parameters like temperature, light, moisture and humidity of and around the plant. The system will have sensors such as a Light Dependent Resistor (LDR), a temperature sensor and a humidity sensor to monitor all of these parameters.

There will be a simple relay system to control the lighting, water and temperature conditions around the plant. All the processed data will be sent to the server, where it will be saved in a database that will likely run on the Edison itself. Once the data is received by the server, it will be further processed for plotting graphs of all the parameters and for monitoring the other systems.

The browser-based micro-site and the Android app will comprise the following sections:

1. Monitoring

2. Preset Controls

3. Manual control

4. Tweet

5. Messaging

6. Snap

The system can operate in two modes: a) Autonomous b) Preset

Track the Progress! Monitor your plants' current status and living conditions in the form of graphs.

Tap and Grow! Just tell the app what you want to grow and watch it load pre-programmed settings for your plant.

Show some Love! Set the parameters and living conditions for the plant and see them follow your instructions.

Tweet! With one click of the Tweet button, tweet the health and living conditions around the plant to your Twitter account.

Get Alerts! Get notified if any parameter goes out of the optimal values.

Snap! With one click of a button, take a picture of the plant and the living conditions around it.
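The "Get Alerts" feature above boils down to comparing each reading against an optimal range. A minimal sketch of that check, in the Node server's JavaScript, might look like the following (the range values and names here are illustrative placeholders, not the project's actual presets):

```javascript
// Hypothetical sketch of the "Get Alerts" check: compare each reading
// against an optimal range and collect the parameters that are out of bounds.
// The ranges below are illustrative, not the project's actual presets.
const optimalRanges = {
  temperature: { min: 18, max: 30 },   // degrees Celsius
  moisture:    { min: 300, max: 700 }, // raw ADC value
  light:       { min: 40, max: 90 }    // percent
};

function findAlerts(readings, ranges) {
  const alerts = [];
  for (const name of Object.keys(ranges)) {
    const value = readings[name];
    const { min, max } = ranges[name];
    if (value < min || value > max) {
      alerts.push({ parameter: name, value, min, max });
    }
  }
  return alerts;
}
```

Each object in the returned array could then be pushed to Firebase so every watching client gets notified.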

Step 3: Materials Required


Intel Edison

Sensors:

1. Moisture sensor

2. LM35 temperature sensor

3. Light Dependent Resistor (LDR)

4. Water flow sensor

Actuators and peripherals:

1. Hair dryer

2. Motor

3. USB camera

4. 12V relay board

Other materials and tools:

1. Bucket

2. Plant

3. Pipe

4. Scissors

5. Soldering gun

6. Glue gun

Step 4: Sensors and Actuators

Light Sensor

This will be used to check whether the light around the plant is optimal, and it will also be synced with the clock to emulate conditions like sunrise and sunset. It will additionally keep monitoring whether the grow light itself is working; if it is not, the system will send an alert message to the owner.
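The clock-synced sunrise/sunset logic and the dead-bulb check described above can be sketched in a few lines. The function names, hours and the 5% "almost dark" threshold below are assumptions for illustration only:

```javascript
// Illustrative sketch of the clock-synced light schedule: the bulb relay
// follows an "emulated sun" between configured on/off hours, and a light
// reading near zero while the bulb should be on flags a dead bulb.
// All names and thresholds here are assumptions, not the project's code.
function shouldLightBeOn(hour, lightOnHour, lightOffHour) {
  // Handles schedules that cross midnight, e.g. on at 18, off at 6.
  if (lightOnHour < lightOffHour) {
    return hour >= lightOnHour && hour < lightOffHour;
  }
  return hour >= lightOnHour || hour < lightOffHour;
}

function bulbLooksDead(expectedOn, lightReadingPercent) {
  // If the bulb should be on but the LDR sees almost no light,
  // assume the bulb has failed and an alert should be raised.
  return expectedOn && lightReadingPercent < 5;
}
```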

Temperature Sensor

The temperature sensor will monitor the temperature around the plant and will ultimately be used to switch a fan on to control it. Depending on the mode, it will regulate the temperature to emulate the best conditions for the plant to grow.

Moisture Sensor

Measures the moisture levels in the soil to trigger the water pump that will water the plants.

Water flow Sensor

It measures the flow of water, from which the plant's water consumption is calculated.
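Turning flow-sensor pulses into liters consumed is a small calculation. Hall-effect flow sensors output a pulse train roughly proportional to flow rate; the 7.5 pulses-per-second-per-L/min factor below is typical for small YF-S201-style sensors and is only an assumption here — the actual part would need its own calibration:

```javascript
// Rough sketch of turning flow-sensor pulses into water consumption.
// The 7.5 factor is typical for small YF-S201-style hall-effect sensors
// and is an assumption; calibrate for the actual sensor used.
const PULSES_PER_SECOND_PER_LPM = 7.5;

function flowRateLpm(pulsesInOneSecond) {
  return pulsesInOneSecond / PULSES_PER_SECOND_PER_LPM;
}

function litersConsumed(pulseCountsPerSecond) {
  // Sum one-second samples: L/min divided by 60 gives liters per sample.
  return pulseCountsPerSecond
    .map(flowRateLpm)
    .reduce((total, lpm) => total + lpm / 60, 0);
}
```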

Relay Board

It controls the light, temperature and water flow; the relay takes instructions from the Edison and acts accordingly.

USB Camera

The USB camera captures a photo of the plant when a snap event is triggered remotely from the app/site.


Intel Edison is the core of our entire infrastructure. It has the following functionalities:

1) Gather data from Habitat

2) Take action when told to (i.e. from the app/website)

3) Take Photos

4) Send data back to Cloud

Technical aspects

1) NodeJS Server

2) OpenCV code

3) Arduino Sketch


Step 6: Hardware Flow

In our setup the client applications never talk to the plant directly; everything goes over the cloud, because keeping it on local Wi-Fi only would defeat the whole purpose of the 'I' in IoT.

Arduino Sketch

The Arduino sketch contains the logic for sensor-data acquisition and for commanding the actuators. Instructions from the client app are received by the Node server and then communicated to the Arduino over MQTT.

What is MQTT?

MQTT stands for MQ Telemetry Transport. It is an extremely simple and lightweight publish/subscribe messaging protocol, designed for constrained devices and low-bandwidth, high-latency or unreliable networks. Its design principles are to minimise network bandwidth and device resource requirements while also attempting to ensure reliability and some degree of assurance of delivery. These principles also turn out to make the protocol ideal for the emerging "machine-to-machine" (M2M) or "Internet of Things" world of connected devices, and for mobile applications where bandwidth and battery power are at a premium.
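Whatever the transport, the payload that travels between the Node server and the Arduino is the pipe-delimited settings string built in Step 7. A small encode/decode pair makes the format concrete; the field order is taken from the `arduinoSettingString` in the Node server code, and the leading 'abcd' header is kept as-is:

```javascript
// Encode/decode helpers for the pipe-delimited settings payload the Node
// server hands to the Arduino sketch. Field order follows Step 7's
// arduinoSettingString; the 'abcd' header is preserved unchanged.
const FIELDS = ['plant', 'lightState', 'lightOn', 'lightOff', 'moisture', 'temperature'];
const HEADER = 'abcd';

function encodeSettings(settings) {
  return [HEADER].concat(FIELDS.map(f => settings[f])).join('|');
}

function decodeSettings(payload) {
  const parts = payload.split('|');
  const settings = {};
  FIELDS.forEach((f, i) => { settings[f] = parts[i + 1]; });
  return settings;
}
```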

NodeJS server

The NodeJS server runs on the Intel Edison itself and is connected to the cloud backend in real time. One of the crucial architectural decisions we took for GreenBit was to make our architecture two-tier rather than the typical three-tier architecture.

The NodeJS server has four responsibilities:

1) Maintain a realtime connection with the cloud - This connection is implemented using the Firebase Node API, which essentially gives us a set of hooks on the parameters we want to keep under watch. Any change made to these values is immediately communicated to all the clients watching them.

2) Run OpenCV for clicking pictures - Every time a request for a plant selfie comes in, the NodeJS server executes our OpenCV module, which takes a picture and stores it on the file system. The Node server then reads that file and transmits the image to the cloud in Base64-encoded form.

3) Collect logs every minute - At one-minute intervals, Node asks the Arduino to take readings and pass them over MQTT. These values are then logged in the cloud.

4) Invoke actuators - Action calls from the app, such as changing parameters, switching values or taking photos, are communicated to the Arduino.

Step 7: Node Server Code

// Include all modules required in your server
var Firebase = require('firebase');
var mraa = require('mraa');
var fs = require('fs');

// Create a new Firebase reference object with your Firebase application URL
var firebaseRef = new Firebase('');

// Initialize values
var currentSettings = null;
var pushedSettings = null;

/********** Trigger message sending interrupt every 20 seconds *************/
var notifier_pin = new mraa.Gpio(5);
notifier_pin.dir(mraa.DIR_OUT);

// IPC to read data from the Arduino sketch (example content: 123|45|200|...|0)
// Subscribe to interrupt notifications from Arduino
var subscriber_pin = new mraa.Gpio(1);
subscriber_pin.dir(mraa.DIR_IN);
subscriber_pin.isr(mraa.EDGE_RISING, subscriberEvent);

// Attach a change-event listener on the Firebase currentSettings value.
// Every time the currentSettings value changes, the callback is executed.
firebaseRef.child('currentSettings').on('value', function (dataSnapShot) {
  // Get the new settings
  var data = dataSnapShot.val();
  // Set the updated settings in pushedSettings
  pushedSettings = data;

  // If lightState is true the bulb should be switched on, otherwise off
  if (data.lightState) {
    data.lightState = '1';
  } else {
    data.lightState = '0';
  }

  // Now pass the currentSettings value to the Arduino so it can act on it.
  // Since the Arduino sketch cannot understand a Node object, we build a
  // pipe-delimited string from the values and hand it over via the IPC file.
  var arduinoSettingString = 'abcd' + '|' + data.plant + '|' + data.lightState + '|' +
      data.lightOn + '|' + data.lightOff + '|' + data.moisture + '|' + data.temperature;
  fs.writeFileSync('/home/root/ipc_codes/js_notification_out.txt',
      'NodeJS: ' + arduinoSettingString + '\n');

  // Notify all the subscribers of the MQTT broker
  notifyWorld();
});

// Fire an event (a pulse on the notifier pin) to notify all subscribers
function notifyWorld() {
  notifier_pin.write(1);
  notifier_pin.write(0);
}

// Subscribe event of the Node server.
// This event is called when data is sent from the Arduino to Node over MQTT.
function subscriberEvent() {
  var arduinoSettingString =
      fs.readFileSync('/home/root/ipc_codes/arduino_notification_out.txt').toString();
  currentSettings = arduinoSettingString.split('|');

  // If the tweet flag is set, send a tweet through Firebase.
  // This flag is set whenever somebody touches the Tweet touch sensor.
  if (currentSettings[currentSettings.length - 1] === '1') {
    var tweet = 'Temperature: ' + currentSettings[0] + ' DEG | Moisture: ' +
        currentSettings[1] + ' PPM | Light: ' + currentSettings[2] + ' LUX';
    firebaseRef.update({ Tweet: tweet });
  }

  // Add a new entry in the logs
  firebaseRef.child('logs').push({
    temperature: currentSettings[2],
    moisture: currentSettings[1],
    light: currentSettings[0]
  });
}

Step 8: OpenCV Server

// OpenCV node server for sending images
var Firebase = require('firebase');
var fs = require('fs');
var exec = require('child_process').execFile;

var firebaseRef = new Firebase('');
var image_original = '/home/akshay/Desktop/IoT/images/sepia.jpg';

// Attach a watcher on the snap value in Firebase.
// This value is set to true every time a snap request comes from the app or web-app.
firebaseRef.child('snap').on('value', function (snapShot) {
  var value = snapShot.val();

  // If there's a snap request, signal the OpenCV program by writing a
  // command character to the shared file ("D" selects the sepia filter)
  if (value) {
    fs.writeFile('/home/akshay/Desktop/IoT/write.txt', 'D', function (err) {
      if (err) {
        return console.log(err);
      }
      console.log('The file was saved!');
    });

    // Wait two seconds while OpenCV clicks the photo and saves it to a file,
    // then read the image back from the saved location
    setTimeout(function () {
      fs.readFile(image_original, function (err, original_data) {
        // Get the Base64 string
        var base64Image = original_data.toString('base64');
        // Add a new snap in the Snaps array in Firebase
        firebaseRef.child('Snaps').push({ image: base64Image });
      });
    }, 2000);
  }
});

// Start the server: launch the OpenCV capture program
var func = function () {
  console.log('Server starts!!');
  exec('/home/akshay/Desktop/IoT/laptop', function (err, data) {
    if (err) {
      console.log(err);
    }
  });
};
func();
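The Base64 round-trip used when shipping the captured frame to Firebase is just a Buffer conversion. A self-contained sketch of that step:

```javascript
// Base64 round-trip as used when shipping the captured frame to Firebase:
// encode the raw image bytes to a Base64 string for transport, then decode
// them back to the original bytes on the receiving side.
function encodeImage(imageBytes) {
  return Buffer.from(imageBytes).toString('base64');
}

function decodeImage(base64String) {
  return Buffer.from(base64String, 'base64');
}
```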

Step 9: Arduino Sketch

// Arduino sketch
#include "rgb_lcd.h"

rgb_lcd lcd;

// make some custom characters (glyph data as in the stock LCD
// custom-character example; the original values were lost):
byte heart[8] = {
  0b00000, 0b01010, 0b11111, 0b11111, 0b11111, 0b01110, 0b00100, 0b00000
};

byte smiley[8] = {
  0b00000, 0b00000, 0b01010, 0b00000, 0b00000, 0b10001, 0b01110, 0b00000
};

byte armsDown[8] = {
  0b00100, 0b01010, 0b00100, 0b00100, 0b01110, 0b10101, 0b00100, 0b01010
};

byte armsUp[8] = {
  0b00100, 0b01010, 0b00100, 0b10101, 0b01110, 0b00100, 0b00100, 0b01010
};

// Defining Macros
#define DEBUG 1

// Analog Pins Definition
const int light = 0;			
const int temp = 1;
const int moisture = 2;
const int waterFlow = 3;

// Digital Pins Definition
const int touch = 2;
const int bulbRelay = 4;
const int fanRelay = 7;
const int pump = 8;
const int ledPin = 13;

//IPC pins
int notifier_pin = 3;
int js_subscriber_pin = 6;

FILE *fromarduino, *toarduino;
int i = 0;
int c;

// Timer
unsigned long timeOut = 0;
bool newdata = false;
int showType = 0;

int mapLight;
int mapTemp;
int mapMoisture;

String lightString;
String tempString;
String moistureString;
int B = 3975;
char charVal[10]; 

char ipcString[200];

//Touch Bool
boolean touchStarted = false;
boolean tweetState = false;

String plantName = "";
String lightState = "";

char *incomingString;
char *splitVal;
String input = "";
int counter = 0;
int lastIndex = 0;
const int numberOfPieces = 6;
String pieces[numberOfPieces];

// Function to print an error
void printError(char *str)
{
  Serial.print("Error: ");
  Serial.println(str);
}

void setup()
{
  Serial.begin(9600);
  pinMode(light, INPUT);
  pinMode(temp, INPUT);
  pinMode(moisture, INPUT);
  pinMode(waterFlow, INPUT);
  pinMode(bulbRelay, OUTPUT);
  pinMode(fanRelay, OUTPUT);
  pinMode(pump, OUTPUT);
  pinMode(ledPin, OUTPUT);
  pinMode(notifier_pin, OUTPUT);
  pinMode(js_subscriber_pin, INPUT_PULLUP);
  // set up the LCD's number of columns and rows:
  lcd.begin(16, 2);
  lcd.setRGB(0, 255, 0);
  lcd.createChar(0, heart);
  lcd.createChar(1, smiley);
  lcd.createChar(3, armsDown);
  lcd.createChar(4, armsUp);
  lcd.setCursor(0, 0);
  lcd.setCursor(4, 0);
  lcd.setCursor(14, 0);
  digitalWrite(bulbRelay, HIGH);
  digitalWrite(fanRelay, HIGH);
  // Interrupts initialization
  attachInterrupt(touch, touchTweet, CHANGE);
  attachInterrupt(js_subscriber_pin, subscriberEvent, RISING);
}

void loop()
{
  lcd.setCursor(0, 1);
//  lcd.write(3);
//  delay(10);
//  lcd.write(4);
//  delay(10);

  // Target values come from the settings pushed by the Node server; the
  // parsing that filled them was lost, so placeholder defaults are used here.
  int setTemp = 25, setLight = 1, setMoisture = 400;

  if (timeOut == 0)
    timeOut = millis();

  // Take a fresh set of readings once a second
  if ((millis() - timeOut) >= 1000)
  {
    mapLight = analogRead(light);
    mapTemp = analogRead(temp);
    mapMoisture = analogRead(moisture);

    // Light processing: scale the raw reading to a percentage
    mapLight = map(mapLight, 0, 800, 0, 100);

    // Temperature processing: B-parameter thermistor equation
    float floatTemp = (float)(1023 - mapTemp) * 10000 / mapTemp;
    int tempCelsius = 1 / (log(floatTemp / 10000) / B + 1 / 298.15) - 273.15;
    mapTemp = tempCelsius; // keep the converted value for the control logic below

    String lightString = String(mapLight);
    String tempString = String(tempCelsius);
    String moistureString = String(mapMoisture);

    String finalString = tempString + "," + moistureString + "," + lightString;
    finalString.toCharArray(ipcString, finalString.length() + 1);

    // Publish the readings and pulse the notifier so the Node server picks them up
    publishData();
    notifyWorld();

    timeOut = millis();
  }

  // Temperature control: run the fan until the temperature falls back into
  // the target band (the do/while bodies and exit conditions are reconstructed)
  if (mapTemp > setTemp)
  {
    do {
      digitalWrite(fanRelay, HIGH);
      mapTemp = analogRead(temp);
    } while (mapTemp > (setTemp - 5));
    digitalWrite(fanRelay, LOW);
  }
  else if (mapTemp < setTemp)
  {
    // Heating branch: the original body was lost; the heating actuator
    // (the hair dryer from the materials list) would run here until the
    // temperature climbs back toward setTemp.
  }

  // Light control
  if (lightState == "0")
    digitalWrite(bulbRelay, LOW);
  else
    digitalWrite(bulbRelay, HIGH);

  // Moisture and motor control: run the pump until the soil is wet enough
  if (mapMoisture > setMoisture)
  {
    do {
      digitalWrite(pump, HIGH);
      mapMoisture = analogRead(moisture);
    } while (mapMoisture > (setMoisture - 5));
    digitalWrite(pump, LOW);
  }
}
// loop ends

void touchTweet()
{
  if (mapTemp < 20)
  {
    tweetState = 1;
    lcd.setRGB(200, 0, 0);
    lcd.setCursor(0, 1);
    lcd.write((unsigned char)0);
    lcd.setCursor(3, 1);
    lcd.print("I AM SAD");
  }
  else
  {
    tweetState = 0;
    lcd.setRGB(0, 200, 0);
    lcd.setCursor(0, 1);
    lcd.print("Light: ");
    for (int positionCounter = 0; positionCounter < 13; positionCounter++)
    {
      // scroll one position left:
      lcd.scrollDisplayLeft();
    }
    lcd.setCursor(2, 1);
    lcd.setCursor(4, 1);
    lcd.write((unsigned char)0);
    lcd.setCursor(6, 1);
  }
  Serial.println("Touch Detected!!!");
}

// Read the message from the JS notification file
void subscriberEvent()
{
  toarduino = fopen("/home/root/ipc_codes/js_notification_out.txt", "r"); // opening message from JS
  if (toarduino)
  {
    input = "";
    while ((c = getc(toarduino)) != EOF)
    {
      if (c != 10) // skip newlines
        input += (char)c;
    }
    fclose(toarduino);

    // Split the pipe-delimited settings string into pieces[]
    // (this parsing body is reconstructed from the variables declared above)
    counter = 0;
    lastIndex = 0;
    for (int i = 0; i < input.length() && counter < numberOfPieces; i++)
    {
      if (input[i] == '|')
      {
        pieces[counter++] = input.substring(lastIndex, i);
        lastIndex = i + 1;
      }
    }
  }
}

// Write the latest readings where the Node server can pick them up
void publishData()
{
  fromarduino = fopen("/home/root/ipc_codes/arduino_notification_out.txt", "w+");
  fprintf(fromarduino, "[%s]", ipcString);
  fclose(fromarduino);
}

// Notify anybody connected to this interrupt (the C++ program and the NodeJS program)
void notifyWorld()
{
  digitalWrite(notifier_pin, HIGH);
  digitalWrite(notifier_pin, LOW);
}

// Print data on the serial monitor, clamping sensor values to 3 digits
// (the body is reconstructed from the surviving fragments)
void print_data(int val)
{
  int new_val;
  if (val < 0)
    new_val = 0;
  else if (val > 255)
    new_val = 255;
  else
    new_val = val;

  // Pad to a fixed width so the columns line up
  if (new_val < 10)
    Serial.print("00");
  else if (new_val >= 10 && new_val < 100)
    Serial.print("0");
  Serial.println(new_val);
}
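The sketch's temperature conversion (the B-parameter thermistor equation in loop()) is easy to sanity-check off-device. Here is a JavaScript transcription, assuming, as the sketch's constants suggest, a 10 kOhm NTC thermistor with B = 3975 in a voltage divider against a 10 kOhm resistor, read on a 10-bit ADC:

```javascript
// JavaScript transcription of the sketch's B-parameter thermistor conversion.
// Assumes a 10 kOhm NTC thermistor with B = 3975 in a divider against a
// 10 kOhm resistor, read on a 10-bit ADC (0..1023), matching the sketch.
const B = 3975;
const R0 = 10000;   // thermistor resistance at 25 degrees C
const T0 = 298.15;  // 25 degrees C in kelvin

function adcToCelsius(raw) {
  const resistance = (1023 - raw) * R0 / raw;                   // divider math
  const kelvin = 1 / (Math.log(resistance / R0) / B + 1 / T0);  // beta equation
  return kelvin - 273.15;
}
```

A mid-scale reading (raw = 511) should come out very close to 25 degrees C, which is a quick way to confirm the wiring assumptions.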

Step 10: OpenCV Module

//OpenCV code to click images in multiple formats
// like Sobel, Blur, Sepia etc
#include <iostream>
#include <opencv2/opencv.hpp>
#include <opencv2/highgui/highgui.hpp>
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>

using namespace std;
using namespace cv;

int main()
{
    // local variable declaration:
    char parameter;
    cout << "Enter the parameter: ";
    cin >> parameter;

    // image variables
    int width = 640;
    int height = 480;
    Mat capImg;
    Mat grayImg; // Gray image

    // Sobel image
    Mat sobleGrayImg;
    Mat sobelImg;
    Mat grad_x, grad_y;
    Mat abs_grad_x, abs_grad_y;
    int sscale = 2;
    int sobelDelta = 0;
    int ddepth = CV_16S;

    // Blur image
    Mat blurImg, blurGrayImg;

    // Sepia image
    Mat sepiaImg;
    Mat_<float> sepia(3, 3);

    // Canny image
    Mat cannyImg;
    int threshold1 = 1;
    int threshold2 = 150;

    // Laplace image
    Mat laplaceImg;
    Mat src_gray, dstImg;
    int kernel_size = 3;
    int lscale = 1;
    int laplaceDelta = 0;
    int c;

    // Canny blur image
    Mat cblurImg, cb_grayImg;

    // Pattern image
    Mat dst, cir;
    Mat cir_32f, dst_32f;
    int bsize = 8;

    VideoCapture cap(-1);
    if (!cap.isOpened()) {
        cout << "Cam Not opend..." << endl;
        exit(-1);
    }
    cap.set(CV_CAP_PROP_FRAME_WIDTH, width);
    cap.set(CV_CAP_PROP_FRAME_HEIGHT, height);
    cap >> capImg;
    imwrite("/home/root/akshay/final/frame.jpg", capImg);
    cap.release();
    imshow("Color", capImg);

    switch (parameter) {
    // COLOR IMAGE
    case 'A':
        imwrite("/home/root/akshay/final/color.jpg", capImg);
        imshow("Color Image", capImg);
        break;
    // GRAY IMAGE
    case 'B':
        cvtColor(capImg, grayImg, CV_BGR2GRAY);
        imwrite("/home/root/akshay/final/gray.jpg", grayImg);
        imshow("Gray Image", grayImg);
        break;
    // SOBEL IMAGE
    case 'C':
        GaussianBlur(capImg, capImg, Size(3, 3), 0, 0, BORDER_DEFAULT);
        cvtColor(capImg, sobleGrayImg, CV_RGB2GRAY);
        // Gradient X
        Sobel(sobleGrayImg, grad_x, ddepth, 1, 0, 3, sscale, sobelDelta, BORDER_DEFAULT);
        // Gradient Y
        Sobel(sobleGrayImg, grad_y, ddepth, 0, 1, 3, sscale, sobelDelta, BORDER_DEFAULT);
        convertScaleAbs(grad_x, abs_grad_x);
        convertScaleAbs(grad_y, abs_grad_y);
        addWeighted(abs_grad_x, 0.5, abs_grad_y, 0.5, 0, sobelImg);
        imwrite("/home/root/akshay/final/sobel.jpg", sobelImg);
        imshow("Sobel Image", sobelImg);
        break;
    // SEPIA IMAGE
    case 'D':
        sepia << 0.131, 0.534, 0.272,
                 0.168, 0.686, 0.349,
                 0.189, 0.769, 0.393;
        cv::transform(capImg, sepiaImg, sepia);
        imwrite("/home/root/akshay/final/sepia.jpg", sepiaImg);
        imshow("Sepia Image", sepiaImg);
        break;
    // CANNY IMAGE
    case 'E':
        cvtColor(capImg, cannyImg, CV_BGR2GRAY);
        Canny(cannyImg, cannyImg, threshold1, threshold2);
        imwrite("/home/root/akshay/final/cannyImg.jpg", cannyImg);
        imshow("Canny Image", cannyImg);
        break;
    // LAPLACE IMAGE
    case 'F':
        GaussianBlur(capImg, capImg, Size(3, 3), 0, 0, BORDER_DEFAULT);
        Laplacian(capImg, dstImg, ddepth, kernel_size, lscale, laplaceDelta, BORDER_DEFAULT);
        convertScaleAbs(dstImg, laplaceImg);
        imwrite("/home/root/akshay/final/laplace.jpg", laplaceImg);
        imshow("Laplace Image", laplaceImg);
        break;
    // CANNY BLUR IMAGE
    case 'G':
        cvtColor(capImg, cb_grayImg, CV_BGR2GRAY);
        Canny(cb_grayImg, cb_grayImg, threshold1, threshold2);
        blur(cb_grayImg, cblurImg, Size(4, 4));
        imwrite("/home/root/akshay/final/cblur.jpg", cblurImg);
        imshow("Canny Blur Image", cblurImg);
        break;
    // PATTERN IMAGE
    case 'H':
        dst = cv::Mat::zeros(capImg.size(), CV_8UC3);
        cir = cv::Mat::zeros(capImg.size(), CV_8UC1);
        for (int i = 0; i < capImg.rows; i += bsize) {
            for (int j = 0; j < capImg.cols; j += bsize) {
                Rect rect = cv::Rect(j, i, bsize, bsize) &
                            cv::Rect(0, 0, capImg.cols, capImg.rows);
                Mat sub_dst(dst, rect);
                sub_dst.setTo(cv::mean(capImg(rect)));
                circle(cir, cv::Point(j + bsize, i + bsize), bsize / 2 - 1,
                       CV_RGB(255, 255, 255), -1, CV_AA);
            }
        }
        cir.convertTo(cir_32f, CV_32F);
        normalize(cir_32f, cir_32f, 0, 1, cv::NORM_MINMAX);
        dst.convertTo(dst_32f, CV_32F);
        {
            vector<Mat> channels;
            split(dst_32f, channels);
            for (int i = 0; i < channels.size(); ++i)
                channels[i] = channels[i].mul(cir_32f);
            merge(channels, dst_32f);
        }
        dst_32f.convertTo(dst, CV_8U);
        imwrite("/home/root/akshay/final/pattern.jpg", dst);
        imshow("Pattern Image", dst);
        break;
    }
    return 0;
}

Step 11: Software Front End

Technology Stack

1) Ionic ( )

Ionic is a powerful HTML5 SDK that helps you build native-feeling mobile apps using web technologies like HTML, CSS, and JavaScript.

2) AngularJS ( )

HTML is great for declaring static documents, but it falters when we try to use it for declaring dynamic views in web-applications. AngularJS lets you extend HTML vocabulary for your application. The resulting environment is extraordinarily expressive, readable, and quick to develop.

3) Angular Material ( )

The Angular Material project is an implementation of Material Design in Angular.js. This project provides a set of reusable, well-tested, and accessible UI components based on the Material Design system.

Step 12: Software BackEnd

Technology Stack

1) Firebase ( )

Firebase can power your app's backend, including data storage, user authentication, static hosting, and more. Focus on creating extraordinary user experiences. We'll take care of the rest.

2) Zapier ( )

Zaps are automations created using Triggers and Actions. You can use Zaps to connect any two Zapier-supported apps to each other.


Database schema has been shared in the pictures attached with this section.

Step 13: Links

Git Repositories

Mobile App


Kindly inbox me in case of any queries or just for FUN.

Rock on! \m/


Participated in the
First Time Author Contest