Automatic Soil Irrigation over IoT Project Synopsis

Project Title: Automatic Soil Irrigation over IoT

Functional specifications: 

• The system detects the soil moisture level with the help of a moisture sensor.
• The sensor is interfaced to a microcontroller board that tracks the soil moisture status.
• A Wi-Fi module is interfaced to the Arduino board and paired with the wireless router.
• A small water pump is activated automatically when the soil moisture level falls below a set minimum.
• The entire unit runs from a 9 V / 1 A power supply.
• The user or admin gets a live status report of the soil moisture level and the water pump's running state from a remote location.

Non-functional specifications:

• The project electronics are housed in a 200 mm × 150 mm × 100 mm ABS plastic box with mounting holes.
• The enclosure is waterproof to the IP65 standard.
• The project is easy to use; a person with minimal technical knowledge can operate it.
• The system has very few external wires, each with a proper name tag.
• The unit is easy to mount on a tree, pole, or flat surface.
• The project is fully protected against electric shock.
• This is a low-cost automation solution for farmers and the agriculture sector.

Deliverables:

Hardware

• Soil moisture sensor – 1 no.
• 9 V water pump motor – 1 no.
• Water pipe – 2 meters.
• Arduino board – 1 no.
• AC-to-DC adaptor, 9 V / 1 A – 1 no.
• ESP8266 Wi-Fi module – 1 no.
• Connecting wires.

Software:

• The Arduino IDE is used to develop the project source code.
• Eagle CAD is used for the circuit design.

Automatic Touch Screen Testing Machine Project Synopsis

Contents:

  1. Abstract
  2. Electronics circuit and embedded systems
  3. Mechanical design
  4. Microcontroller program
  5. Android app

Abstract

This project designs and develops an automated process for testing the capacitive touch screen of a mobile phone. An Android app installed on the phone displays a few color blocks on the LCD screen. A capacitive touch pen, held on the Z-axis of an X-Y-Z motion platform, is moved to touch particular color blocks. If the touch sensitivity at that point is good, the touched block changes color; otherwise it remains the same. Feedback from the Android phone is used to evaluate the touch screen's performance, and the result is displayed on an LCD screen.

Specifications

• XYZ axis router setup.
• ATmega2560 microcontroller board.
• DC motor driver L298.
• 12V/5A regulated supply.
• Hitec HS-645MG servo motor for Z-axis movement.
• 2 DC motors for x-y plane axis movement.
• 2 × 16-bit DC motor encoders.
• Capacitive pen and its holder.
• 20X4 LCD Display.

Electronics circuit and embedded systems

In this Automatic Touch Screen Testing Machine project we use an Arduino Nano board, which carries an ATmega328 microcontroller with 2 KB of RAM and 32 KB of flash memory. The board acts as the control unit: it drives two DRV8825 stepper motor drivers and a 16 × 2 LCD screen, and four input switches for selecting menu options are also interfaced to the Arduino Nano.

List of components used in this project.

• Arduino Nano – 1 no.
• DRV8825 stepper motor driver – 2 nos.
• NEMA 17 stepper motors – 2 nos.
• Servo motor (90G) – 1 no.
• 16 × 2 alphanumeric LCD – 1 no.
• Input switches.
• Connectors and wires.
• 12 V / 2 A adaptor.
• Dotted PCB.
In our Automatic Touch Screen Testing Machine project, we have used a dotted PCB for interfacing the components.

Block diagram of electronics parts

Circuit working operation

We use a 12 V / 2 A adaptor to power the circuit board, the Arduino Nano, and the DRV8825 drivers. The Arduino Nano plays the central role in this Automatic Touch Screen Testing Machine project.

Real Time Hand Gesture Recognition for Computer Interaction

ABSTRACT

The Real Time Hand Gesture Recognition system presented here uses only a webcam, together with image- and video-processing algorithms developed with the OpenCV computer vision library.

Existing Work:

The existing work was carried out with PC-based MATLAB software, where all the algorithms are readily available; such a system is not used to its full extent on the embedded side.

Proposed Work:

The proposed Real Time Hand Gesture Recognition for Computer Interaction work will be carried out on a Linux-based single-board computer with the ARM11 architecture. The board will run the Raspbian operating system with the OpenCV image processing library. Using these, we will design an algorithm so that the system identifies the fingertips and counts how many fingers are displayed.

BLOCK DIAGRAM


Hardware:

ARM11, USB Camera, Power supply.

Software:

OS: Embedded Linux; Language: C/C++; IDE: Qt Creator; Library: OpenCV.

Applications:

Computer Interaction, Gesture recognition based control

Advantages:

• Eliminates external hardware such as a mouse.
• Easy access and control of appliances using gestures.

Hand Gesture Recognition based on Depth map

ABSTRACT

In this Hand Gesture Recognition based on Depth Map paper, a method for gesture recognition from depth-map images using OpenCV is presented. A feature extraction method based on the Radon transform is used for hand posture recognition.

This method reduces the Radon transform feature vector by averaging its values. A support vector machine is then used to classify the feature vectors.

Progress in depth-map computation in recent years has made depth maps useful for several areas of research. One such use is gesture recognition, since a depth map provides information about the shape of the captured hand as well as its position in the frame. Depth maps have several advantages over traditional color pictures.

Existing Work:

In the existing work, an external hardware device, the Microsoft Kinect, was used to capture the image, which makes the implementation costly. That system also performed only pre-processing of the images; no algorithm was implemented to actually detect the hand.

Proposed Work:

The proposed Hand Gesture Recognition work will be based on the Open Computer Vision (OpenCV) library running on a Linux-based real-time device. Images will be captured with a USB camera connected to the device, and our algorithms will detect an open palm or a closed palm in the captured images and draw a rectangle around it.

 

BLOCK DIAGRAM

Hardware:

ARM11, USB Camera,  Power supply.

Software:

OS: Embedded Linux; Language: C/C++; IDE: Qt Creator.

Applications:

Computer vision systems, gesture-controlled devices

Advantages:

  • Open-source algorithms are used to implement the concept
  • Low cost, with future scope for controlling appliances by gesture.

Wireless Sensor Based Energy Conservation via Bluetooth

Existing Work:

In the existing Wireless Sensor Based Energy Conservation work, the communication protocol was limited to Bluetooth, which has very short range and requires a device interface in order to view or control the data; this is the drawback of the developed system.

Proposed Work:

The proposed Wireless Sensor Based Energy Conservation via Bluetooth work will use low-power, high-accuracy controllers communicating over Bluetooth, and will also allow the data to be viewed and controlled over the intranet network.

The load consists of a bulb. The current and voltage consumed by the load are monitored, and the load is operated automatically from a remote location based on the PIR and LDR sensor values.

BLOCK DIAGRAM

Hardware:

ARM11, PIR Sensor, LDR Sensor, Bluetooth module, Wi-Fi Router, AT89S52, Current Sensor, Voltage Sensor, Relay Driver, Load.

Software:

OS: Embedded Linux; Language: C/C++; IDE: Qt Creator.

Applications:

Home automation, Industrial Power Control

Advantages:

Load control, remote-location access