Below is a list of projects on Big Data and Hadoop.

1) Twitter data sentiment analysis using Flume and Hive

2) Business insights from user usage records of data cards

3) Wiki page ranking with Hadoop

4) Healthcare data management using the Apache Hadoop ecosystem

5) Sensex log data processing using Big Data tools

6) Retail data analysis using Big Data

7) Facebook data analysis using Hadoop and Hive

8) Archiving LFS (Local File System) and CIFS data to Hadoop

9) Aadhaar-based analysis using Hadoop

10) Web-based data management of Apache Hive

11) Automated RDBMS data archiving and de-archiving using Hadoop and Sqoop

12) Big Data PDF printer

13) Airline on-time performance

14) Climatic data analysis using Hadoop (NCDC)

15) MovieLens data processing and analysis

16) Two-phase approach for data anonymization using MapReduce

17) Migrating data from different sources to Big Data and evaluating performance

18) Flight History Analysis

19) Scripted setup of a pseudo-distributed Hadoop cluster
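As an illustration of project 19, a pseudo-distributed cluster setup can be sketched roughly as below. This is a minimal sketch only: the install path in HADOOP_HOME, the port numbers, and the Hadoop 1.x-style configuration files and property names (fs.default.name, mapred.job.tracker) are assumptions consistent with a JDK 1.6-era stack, and should be adjusted for the actual installation.

```shell
#!/bin/sh
# Hypothetical sketch: configure a single-node, pseudo-distributed
# Hadoop 1.x cluster. HADOOP_HOME and ports are assumptions.
HADOOP_HOME=${HADOOP_HOME:-/usr/local/hadoop}
CONF="$HADOOP_HOME/conf"

# core-site.xml: point the default filesystem at a local NameNode
cat > "$CONF/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# hdfs-site.xml: replication factor 1, since there is only one DataNode
cat > "$CONF/hdfs-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
EOF

# mapred-site.xml: JobTracker on localhost (Hadoop 1.x MapReduce)
cat > "$CONF/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF

# Format HDFS once, then start all daemons
"$HADOOP_HOME/bin/hadoop" namenode -format
"$HADOOP_HOME/bin/start-all.sh"
```

After the daemons start, the NameNode and JobTracker web UIs are typically reachable on localhost, and jobs can be submitted with `hadoop jar` as on a full cluster.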

HARDWARE REQUIREMENTS FOR THE CLUSTER

  • 12-24 hard disks of 1-4 TB each in a JBOD (Just a Bunch Of Disks) configuration
  • 2 quad-, hex-, or octo-core CPUs running at 2-2.5 GHz or faster
  • 64-512 GB of RAM
  • Bonded Gigabit Ethernet or 10 Gigabit Ethernet (the higher the storage density, the more network throughput is needed)

SOFTWARE REQUIREMENTS

  • FRONT END :  Jetty server, web UI in JSP
  • BACK END  :  Apache Hadoop, Apache Flume, Apache Hive, Apache Pig, JDK 1.6
  • OS        :  Linux (Ubuntu)
  • IDE       :  Eclipse