Category Archives: IOT Projects

Smart Luggage System IoT Project


The main idea of the Smart Luggage System IoT Project is to develop user-friendly luggage. The project is more of a piece of luggage than a robot.


  • Bags have always been an integral part of travel life. Whether it is a travel bag, a plastic bag, or a leather bag, every bag has its own importance and serves different functions.
  • Luggage has been dragged around by hand since the golden ages.
  • A piece of luggage that reports its weight, tracks its location, and follows the user automatically or manually would, with a touch of present-day technology, bring out the true potential of the old baggage.
  • This has motivated the project all along: luggage that is user-friendly and can be operated from a smartphone.

Proposed System:

  • The luggage supports a number of functions, but none of them is controlled from the luggage itself; instead, commands are sent from the mobile phone to the luggage via machine-to-machine (M2M) communication.
  • The mobile phone has a pre-installed application with a pre-defined set of instructions.
  • The application waits for the user to send commands.
  • Once the microcontroller embedded inside the luggage receives an instruction from the user, it acts accordingly: it can track its location and send it to the user, or report the luggage weight.
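The command flow above can be sketched as a simple dispatcher on the luggage's controller. The command names, the weight reading, and the coordinates below are hypothetical placeholders; on the real hardware the values would come from the weight sensor and the GPS module.

```python
# Minimal sketch of the luggage-side command dispatcher (names are hypothetical).
def read_weight_kg():
    # Placeholder for the load-cell reading; a fixed value stands in for the sensor.
    return 17.4

def read_gps():
    # Placeholder for the GPS module; fixed coordinates stand in for a real fix.
    return (17.3850, 78.4867)

def handle_command(command):
    """Dispatch a command received from the phone over M2M communication."""
    if command == "WEIGHT":
        return f"WEIGHT:{read_weight_kg():.1f}kg"
    if command == "TRACK":
        lat, lon = read_gps()
        return f"LOCATION:{lat:.4f},{lon:.4f}"
    return "ERROR:unknown command"

print(handle_command("WEIGHT"))  # weight report sent back to the phone
print(handle_command("TRACK"))   # location report sent back to the phone
```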

Hardware Components:

  • ESP8266 Wi-Fi module

Software components:

  • Android application with a pre-installed set of instructions for controlling the luggage.

Working Model:

  1. When Bluetooth is connected
  2. When Wi-Fi is not connected

App Screens:

Screenshot of app showing the Welcome Page
Screenshot of app showing the Home Page
Screenshot of app showing the Weighing Module
Screenshot of app displaying the Weight
Screenshot of app when Tracking is pressed
Screenshot of app in Autonomous Mode
Screenshot of app in Manual Mode


Applications of GPS Tracking:

  • Painting/Artwork Security: With a GPS device, owners can quickly learn the location of a stolen piece and recover it immediately.
  • Prevention of Car Theft: By installing a tracking device on a vehicle, its location can always be known. Not only can the vehicle be easily recovered, but the device also acts as a deterrent for thieves, preventing the nightmare of a stolen car in the first place.
  • Police & Private Detectives: Police can benefit by attaching tracking devices to baggage during investigations, allowing them to easily track its movement and gather solid evidence.
  • Hiking: With a GPS device, hikers who get lost can be located and recovered quickly.


  • Don’t worry about losing your luggage anywhere! It has an onboard GPS module that helps you track its position using Google Maps.
  • The whole system is controlled through an application installed on Android, which makes operating the device very easy for the user.
  • Weighing the luggage becomes easy.
  • Carrying the luggage is made easier by motorized wheels.

Future Scope:

  • The application should be more dynamic and show a live feed of the luggage's movement, updated whenever the luggage moves. The tracking could be taken online using cloud technology.
  • Introducing digital locks will help the user secure the contents of the luggage, with dynamic encryption algorithms and techniques safeguarding all machine-to-machine communication.
  • A GSM module could be added to help triangulate the location whenever GPS fails to retrieve a fix.
  • Get the status of flights and regulations using the app and pack accordingly.


  • The limitations of traditional luggage will be overcome by smart luggage. The market is still new and has the potential to accept this new face of luggage.
  • Know the weight of your luggage at any time and pack accordingly.
  • Move the luggage easily using autonomous and manual modes.
  • Smart luggage ensures its own safety and builds security for its user. From built-in scales to GPS tracking and mobile apps, these bags won’t make hauling stuff any lighter, but they can make the experience less harrowing.

Department Announcement System ECE Project Abstract


In most institutes, making announcements is a major problem. A universal announcement system with speakers is deployed in most institutes, but there are cases when an announcement needs to reach only a particular classroom or a particular person.

In such cases a universal system is not helpful. One resolution most institutes adopt is to deploy one universal system plus a separate system for each room, and then assign people to handle the entire control system.

Though this resolves many issues, depending on human resources increases the cost of the entire control unit. Automation is one of today’s most thriving technologies, through which many organizations cut costs on human resources and redirect people to more innovative and productive work.

So for this issue too, we have automated the entire announcement process: a pre-recorded voice is played for every announcement in the institute.

Block Diagram:

Communication channel box:

In this communication channel box, signals from the desktop are received by the microcontroller, which then signals the relay driver IC (ULN2803) to switch the relays.

Relays: SPDT
Voltage: 12V

The purpose of the relays is to channel the sound to each classroom.
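The relay selection described above amounts to a mapping from classroom number to driver output. The eight-channel layout matches the ULN2803's eight outputs; the bitmask convention below is an assumption for illustration.

```python
NUM_CHANNELS = 8  # the ULN2803 relay driver has eight output channels

def relay_mask(classrooms):
    """Return the output bitmask that routes audio to the given classrooms.

    Classrooms are numbered 1..8; bit 0 corresponds to classroom 1.
    """
    mask = 0
    for room in classrooms:
        if not 1 <= room <= NUM_CHANNELS:
            raise ValueError(f"no relay channel for classroom {room}")
        mask |= 1 << (room - 1)
    return mask

# Announce to classrooms 1 and 3: bits 0 and 2 are set.
print(bin(relay_mask([1, 3])))  # -> 0b101
```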

Amplifier box

This box receives the signal from the microphone we speak into; the sound is amplified and sent to the relay channel junction box.


Advantages:

  • Human effort is reduced.
  • Information is conveyed much faster.
  • There is no communication gap with this setup.
  • We can announce in multiple classrooms at one time.
  • With the help of a feedback button in each classroom, we can know whether a delivered announcement reached the classroom or not.

Automatic soil irrigation over IOT Project Synopsis

Project Title: Automatic soil irrigation over IOT

Functional specifications: 

• The system will detect the moisture level in the soil with the help of a moisture sensor.
• The sensor is interfaced to a microcontroller board, which tracks the status of the soil moisture level.
• A Wi-Fi module is interfaced to the Arduino board, and the module is paired with the wireless router.
• A small water pump is activated automatically when the soil moisture level decreases below a minimum point.
• The entire unit works on a 9V / 1A power supply.
• The user or admin gets a live status report of the soil moisture level in the ground and the water pump's running status from a remote location.
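The pump-switching rule above can be sketched as a threshold check with a little hysteresis, so the pump does not chatter around a single set point. The thresholds below are hypothetical; real values depend on the moisture sensor's ADC range.

```python
# Sketch of the irrigation decision (thresholds are hypothetical ADC readings).
DRY_THRESHOLD = 300   # below this, the soil is too dry: start the pump
WET_THRESHOLD = 450   # above this, the soil is wet enough: stop the pump

def pump_command(moisture, pump_on):
    """Return the new pump state given the current reading and state.

    The gap between the two thresholds provides hysteresis, so small
    fluctuations around one set point do not toggle the pump rapidly.
    """
    if moisture < DRY_THRESHOLD:
        return True
    if moisture > WET_THRESHOLD:
        return False
    return pump_on  # in the dead band, keep the current state

print(pump_command(250, False))  # dry soil: pump turns on
print(pump_command(400, True))   # dead band: pump stays on
print(pump_command(500, True))   # wet soil: pump turns off
```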

Non-Functional specifications:

• The entire electronics part of the project is fixed in a 200mm X 150mm X 100mm ABS plastic box with mounting holes.
• The entire system is fixed in a waterproof box built to the IP65 standard.
• This project is easy to use; a person with minimal knowledge can also understand it.
• The system has very few wires outside, all with proper name tags.
• The unit is easy to mount on any surface (tree / pole / flat surface).
• This project is fully shock-protected.
• This project is a low-cost automation for farmers and the agriculture sector.



Hardware components:

• Soil moisture sensor 1no.
• Water motor pump 9V 1no.
• Water pipes 2 meters.
• Arduino board 1no.
• AC to DC adaptor 9V / 1A.
• ESP8266 Wi-Fi module.
• Connecting wires.


Software components:

• Arduino IDE is used to develop the entire project source code.
• Eagle CAD is used to develop the circuit design.

Automatic Touch Screen Testing Machine Project Synopsis


  1. Abstract
  2. Electronics circuit and embedded systems
  3. Mechanical design.
  4. Microcontroller program
  5. Android app


This project designs and develops an automated process for testing the capacitive touch screen of a mobile phone. An Android app installed on the phone displays a few colour blocks on the LCD screen. A capacitive touch pen, held on the Z-axis of a motion platform with X-, Y- and Z-axis movement, is driven to touch particular colour blocks on the screen. If the touch sensitivity is good, the touched block changes colour; otherwise the block colour remains the same. Feedback from the Android phone is collected to evaluate the performance of the touch screen, and the result is displayed on an LCD screen.
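The test cycle above can be sketched as a loop over block positions: move the pen, touch, read the phone's feedback, and record a pass or fail per block. The block positions and the feedback function below are hypothetical placeholders for the motion platform and the Android app.

```python
# Sketch of the touch-screen test cycle (positions and feedback are simulated).
def run_test(blocks, touch_feedback):
    """Touch each colour block and record whether its colour changed.

    `blocks` maps a block name to its (x, y) position on the screen;
    `touch_feedback` stands in for the Android app's response and returns
    True when the touched block changed colour.
    """
    results = {}
    for name, (x, y) in blocks.items():
        # On the real rig: move the X/Y axes to (x, y), lower the pen on Z,
        # touch, lift, then query the app for feedback.
        results[name] = "PASS" if touch_feedback(x, y) else "FAIL"
    return results

blocks = {"red": (10, 20), "green": (40, 20), "blue": (70, 20)}
# Simulated feedback: suppose the region around x=70 is unresponsive.
feedback = lambda x, y: x < 60
print(run_test(blocks, feedback))
```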


Hardware components:

• X-Y-Z axis router setup.
• ATmega2560 microcontroller board.
• DC motor driver L298.
• 12V / 5A regulated supply.
• Hitec HS-645MG servo motor for Z-axis movement.
• 2 DC motors for X-Y plane movement.
• 2 DC motor encoders, 16-bit.
• Capacitive pen and its holder.
• 20X4 LCD display.

Electronics circuit and embedded systems

In this Automatic Touch Screen Testing Machine project we are using an Arduino Nano board, which carries an ATmega328 microcontroller with 2KB of RAM and 32KB of flash memory. This board acts as the control unit, sending signals to the DRV8825 stepper motor drivers and a 16X2 LCD screen; four input switches for selecting menu options are also interfaced with the Arduino Nano board.

List of components used in this project.

• Arduino Nano 1no.
• DRV8825 stepper motor driver 2nos.
• NEMA stepper motors 2nos.
• 90G servo motor 1no.
• 16X2 alphanumeric LCD.
• Input switches.
• Connectors and wires.
• 12V / 2A adaptor.
• Dotted PCB.
In our Automatic Touch Screen Testing Machine project, we have used dotted PCB for interfacing.

Block diagram of electronics parts

Circuit working operation

We are using a 12V / 2A adaptor to power our circuit board, the Arduino Nano and the DRV8825. The Arduino Nano plays the major role in this Automatic Touch Screen Testing Machine project.

Real Time Hand Gesture Recognition for Computer Interaction


The Real Time Hand Gesture Recognition system presented here uses only a webcam and algorithms developed with the OpenCV computer vision library's image and video processing functions.

Existing Work:

The existing work was carried out in PC-based MATLAB software, where all the algorithms are readily available; such a system is not used to its full extent on the embedded side.

Proposed Work:

The proposed Real Time Hand Gesture Recognition for Computer Interaction work will be carried out on a Linux-based single-board computer with the ARM11 architecture. The board will run the Raspbian operating system and the OpenCV image processing library, with which we will design an algorithm so that the system identifies the fingertips and counts how many fingers are displayed.
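Fingertip counting in the full system would go through OpenCV contours and convexity defects; the idea can be shown with a deliberately simplified NumPy sketch that scans one row of the binary hand mask and counts separate runs of foreground pixels, since each raised finger crosses the scanline as one run. The mask below is synthetic.

```python
import numpy as np

# Simplified sketch of fingertip counting on a binary hand mask: each finger
# crosses a scanline in the finger region as one separate foreground run.
def count_fingers(mask, row):
    """Count runs of foreground pixels along one row of a binary mask."""
    line = np.asarray(mask[row]) > 0
    # A run starts wherever a foreground pixel follows a background one.
    starts = line & ~np.concatenate(([False], line[:-1]))
    return int(starts.sum())

# Synthetic mask: a palm with three raised fingers.
mask = np.zeros((100, 100), dtype=np.uint8)
mask[60:100, 20:80] = 1          # palm
for x0 in (25, 45, 65):          # three fingers above the palm
    mask[10:60, x0:x0 + 8] = 1

print(count_fingers(mask, row=30))  # -> 3
```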



Hardware: ARM11, USB Camera, Power supply.


Software: OS: Embedded Linux, Language: C/C++, IDE: Qt Creator, OpenCV.


Applications: Computer Interaction, Gesture recognition based control.


Advantages:

• Elimination of external hardware like the mouse.
• Easy access and control of appliances using gestures.

Hand Gesture Recognition based on Depth map


In this Hand Gesture Recognition based on Depth map paper, a method for gesture recognition from depth-map images using OpenCV is presented. Hand posture is recognised using a feature extraction method based on the Radon transform.

This method reduces the Radon transform's feature vector by averaging its values. A support vector machine is used for the classification of the feature vectors.

Progress in depth-map calculation in recent years has led to its exploitation in several areas of research. One use of depth maps is gesture recognition, since a depth map provides information about the shape of the captured hand as well as its position in the frame. Depth maps have several advantages over traditional colour pictures.
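The feature pipeline described above can be sketched in a few lines: projection features (here only two angles, a deliberately reduced stand-in for the full Radon transform) are averaged down to a short vector, and the vectors are classified. A nearest-centroid rule stands in for the paper's SVM to keep the sketch dependency-free; the toy "postures" are synthetic.

```python
import numpy as np

# Sketch of the feature pipeline: two-angle projections (a stand-in for the
# full Radon transform) reduced by averaging consecutive values.
def features(img, bins=4):
    img = np.asarray(img, dtype=float)
    proj = np.concatenate([img.sum(axis=0), img.sum(axis=1)])  # 0 and 90 deg
    return proj.reshape(2 * bins, -1).mean(axis=1)

def classify(vec, centroids):
    """Nearest-centroid classification (stand-in for the paper's SVM)."""
    return min(centroids, key=lambda k: np.linalg.norm(vec - centroids[k]))

# Two toy "postures" on an 8x8 grid: a vertical bar and a horizontal bar.
vertical = np.zeros((8, 8)); vertical[:, 3:5] = 1
horizontal = np.zeros((8, 8)); horizontal[3:5, :] = 1
centroids = {"vertical": features(vertical), "horizontal": features(horizontal)}

probe = np.zeros((8, 8)); probe[:, 2:4] = 1   # another vertical-ish posture
print(classify(features(probe), centroids))   # -> vertical
```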

Existing Work:

In the existing work, an external hardware device, the Microsoft Kinect system, was used to capture the image, which makes the implementation costly. Moreover, that system only performed pre-processing of the images; no exact algorithm was implemented to detect the hand.

Proposed Work:

The proposed Hand Gesture Recognition work will be based on the Open Computer Vision (OpenCV) library running on a Linux-based real-time device. Images will be captured by a USB camera connected to the device, and our algorithms will detect the open palm and closed palm in the captured images and draw a rectangle around them.




Hardware: ARM11, USB Camera, Power supply.

Software: OS: Embedded Linux, Language: C/C++, IDE: Qt Creator.


Applications: Computer Vision system, Gesture control devices.


Advantages:

  • Open-source algorithms are used to implement the concept.
  • Low cost, with future scope of controlling appliances by gesture.

Wireless Sensor Based Energy Conservation via Bluetooth

Existing Work:

In the existing Wireless Sensor Based Energy Conservation work, the communication protocol was limited to Bluetooth, which covers only a very short distance and requires a device interface to view or control the data; this is the drawback of the developed system.

Proposed Work:

The proposed Wireless Sensor Based Energy Conservation via Bluetooth work will include low-power, high-accuracy controllers communicating over Bluetooth, and we will also control and view the data over the intranet network.

The load is a bulb; the current and voltage consumed by the load are monitored, and automated operation of the load based on PIR and LDR values can be performed from a remote location.
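The automated load rule above can be sketched as one decision: switch the bulb on only when the PIR sensor reports motion and the LDR reports darkness. The LDR threshold below is a hypothetical ADC value.

```python
# Sketch of the automated load decision (threshold is a hypothetical ADC value).
DARK_THRESHOLD = 200  # LDR readings below this mean the room is dark

def load_state(pir_motion, ldr_value):
    """Return True when the bulb relay should be energised."""
    return bool(pir_motion) and ldr_value < DARK_THRESHOLD

print(load_state(True, 120))   # motion in a dark room: bulb on
print(load_state(True, 800))   # motion in a bright room: bulb off
print(load_state(False, 120))  # dark but empty room: bulb off
```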



Hardware: ARM11, PIR Sensor, LDR Sensor, Bluetooth module, Wi-Fi Router, 89S52, Current Sensor, Voltage Sensor, Relay Driver, Load.


Software: OS: Embedded Linux, Language: C/C++, IDE: Qt Creator.


Applications: Home automation, Industrial Power Control.


Advantages: Load Control, Remote location access.

Face Identification Implementation in a Standalone Embedded System


In this Face Identification Implementation in a Standalone Embedded System paper, an embedded system for face identification is described. The system, running on an ARM processor, is built around the BCM2835 processor and consists of several IP (Intellectual Property) modules designed as bus peripherals.

Face detection and recognition are accelerated with the help of hardware and software algorithm modules. The system has been designed with the criteria of resource optimization, low power consumption and improved operating speed.

Existing Work:

The existing work was implemented on an FPGA-based processor device, which is complex and costly to implement compared to embedded chips. Moreover, the system description only explains the procedure; no actual output results were shown.

Proposed Work:

The host target for the proposed face detection system is an embedded environment based on the ARM11 architecture, which has much higher RAM and clock speed than FPGA-based devices. Here, the Raspbian operating system, OpenCV algorithms and a Qt-based GUI interface will be used to implement face detection and recognition.

Whenever an authenticated face is identified, the system grants login access; otherwise an SMS with the GPS location is sent to the concerned person, and simultaneously a buzzer is turned on to indicate the unauthorized entry.
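The access decision above can be sketched as a simple threshold on the recogniser's confidence. The match score, the threshold and the action names below are hypothetical placeholders for the recogniser, GSM and buzzer modules.

```python
# Sketch of the access decision (threshold and action names are hypothetical).
MATCH_THRESHOLD = 0.8

def access_actions(match_score):
    """Return the list of actions to take for one recognition attempt."""
    if match_score >= MATCH_THRESHOLD:
        return ["grant_login"]
    # Unauthorised entry: alert the owner with the GPS location and buzz.
    return ["send_sms_with_gps", "buzzer_on"]

print(access_actions(0.93))  # authenticated face
print(access_actions(0.41))  # unknown face
```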



Hardware: ARM11, USB Camera, GSM, GPS, Buzzer, Power supply.


Software: OS: Embedded Linux, Language: C/C++, IDE: Qt Creator, Image Processing Algorithm.


Applications: Home, Security, Authentication sites.


Advantages:

• Hand-held system; online face training can be done.
• Easy installation and usage.

Human Data Interaction in IoT – The Ownership Aspect

In this Human Data Interaction in IoT – The Ownership Aspect project, we develop a password-based, user-authenticated login server for an IoT device. This provides data access to the owner only.
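The owner-only access described above rests on a password check, which can be sketched with the standard salted-hash approach: the device stores a salted hash of the owner's password, never the password itself. The password string below is a hypothetical example.

```python
import hashlib
import hmac
import os

# Sketch of the owner-only login check (the password is a hypothetical example).
def make_record(password):
    """Store a random salt and a slow salted hash instead of the password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify(password, record):
    """Recompute the hash for an attempt and compare in constant time."""
    salt, digest = record
    attempt = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(attempt, digest)

record = make_record("owner-secret")
print(verify("owner-secret", record))  # owner gets data access -> True
print(verify("guess", record))         # anyone else is refused -> False
```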

Existing Work:

The approach of the existing work has not been clearly explained; it does not provide any practical approach to data safety and security in the IoT application.

Proposed Work:

The proposed Human Data Interaction in IoT system will be implemented on an operating system (OS)-based interface with an advanced, high-speed processor architecture, the ARM11, which makes the system very robust; networking over the Internet is done through the on-board Ethernet module with a built-in web server.


Hardware: ARM11, Wi-Fi, Power supply, ADC, Sensors, Load.

Software: OS: Embedded Linux, Language: C/ C++, IDE: Qt Creator.

Applications: Automation, Educational Knowledge, Robotics.


Advantages:

  • Helpful for disabled children and for industrial automation of daily activities, by controlling the devices over the Internet.

A plug-n-play Internet enabled platform for Real time image processing

Existing System:

In the existing system of the A plug-n-play Internet enabled platform for Real time image processing work, a cellular-based technology was used, which is very unreliable for image data transmission over the web: the network speed is very low, so the results cannot be observed rapidly.

Proposed System:

The proposed A plug-n-play Internet enabled platform for Real time image processing system will include an advanced algorithm based on the Ethernet protocol: a server will be hosted on the Raspberry Pi board, the desired image will be processed using an image processing algorithm, and as soon as the processing is done the results will be displayed on the webpage.
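The process-then-publish cycle above can be sketched as a handler that processes the latest camera frame and returns a result for the webpage. The bright-pixel count below is a hypothetical stand-in for the real image processing algorithm; on the device the handler would be wired into a small HTTP server (e.g. Python's standard http.server) on the Raspberry Pi.

```python
import json
import numpy as np

# Sketch of the request/response cycle: process a frame, return the result
# that the webpage would display (the "processing" is a simple stand-in).
def process_frame(frame, threshold=128):
    """Return a JSON string summarising the processed frame."""
    frame = np.asarray(frame)
    bright = int((frame > threshold).sum())
    return json.dumps({"bright_pixels": bright, "shape": list(frame.shape)})

# Simulated 4x4 grayscale frame from the USB camera.
frame = np.array([[0, 255, 0, 255],
                  [0, 0, 0, 0],
                  [255, 255, 0, 0],
                  [0, 0, 0, 200]])
print(process_frame(frame))
```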



Software: Raspbian OS, OpenCV, web server.


Hardware: ARM11 (Raspberry Pi), Ethernet router, USB camera.


Applications: Web-based results, remote location access.


Advantages: Easy to access, live image feed.