CSE Seminar Topic on Algorithm

Introduction to Seminar Topic on Algorithm:

The primitive forms of algorithms date back centuries; ever since mathematics took predominance in society, algorithms have influenced mathematicians worldwide. So what is an algorithm? It can be loosely defined as a finite sequence of well-defined rules that describes the operations needed to solve a specific problem. We use algorithms everywhere, from small mathematical problems to complex ones. An algorithm is commonly used for calculating equations, processing data, defining logic and understanding various aspects of reasoning.

There are certain common characteristics an algorithm should exhibit, namely input, output, finiteness, definiteness and effectiveness. External data is given as input and comes out as output after processing. The steps must stop after a finite number and not continue infinitely, and each step should be clear and unambiguous. No additional intelligence should be needed to execute the algorithm's steps, and once all the steps have completed successfully, the algorithm should terminate fully.

To process data, computers use algorithms that instruct them to perform in a specific way and order. An algorithm contains a precise, finite number of steps, and its flow is controlled in a definite way. Data can be stored in the entity in which the algorithm is processed. The different asymptotic notations in use are Big Oh, Big Omega, Big Theta, Little Oh and Little Omega.
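As a rough illustration of what those notations describe (the functions and data here are invented for this sketch, not taken from the seminar report), compare the number of comparisons made by a linear search, which grows as O(n), with a binary search on sorted data, which grows as O(log n):

```python
# Counting comparisons: linear search is O(n), binary search is O(log n).

def linear_search_steps(items, target):
    """Return (found, number of comparisons) for a left-to-right scan."""
    steps = 0
    for value in items:
        steps += 1
        if value == target:
            return True, steps
    return False, steps

def binary_search_steps(items, target):
    """Return (found, number of comparisons) on a sorted list."""
    lo, hi, steps = 0, len(items) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] == target:
            return True, steps
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, steps

data = list(range(1024))                 # sorted input of n = 1024 elements
print(linear_search_steps(data, 1023))   # worst case: all 1024 comparisons
print(binary_search_steps(data, 1023))   # roughly log2(1024) = 10-11 comparisons
```

The growing gap between the two step counts as n increases is exactly what Big Oh notation captures.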

Algorithms can be expressed through:

  • Natural Languages
  • Flowcharts
  • Pseudocode
  • Control tables
  • Programming Languages

Dynamic programming is a method by which a complex problem is broken down into simpler overlapping subproblems whose solutions are stored and reused, whereas the greedy approach makes the locally optimal choice at each stage in the hope of solving the overall problem. Greedy algorithms can be classified as pure, relaxed and orthogonal greedy algorithms. The time taken by an algorithm to execute fully is called its time complexity.
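The contrast between the two strategies can be sketched on the classic coin-change problem (the coin values below are chosen for illustration): the greedy choice of always taking the largest coin can miss the optimum, while dynamic programming finds it by solving every smaller amount once and reusing those answers.

```python
# Greedy vs. dynamic programming on coin change.

def greedy_coins(coins, amount):
    """Greedy: repeatedly take the largest coin that still fits."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

def dp_coins(coins, amount):
    """Dynamic programming: best[a] = fewest coins summing to a."""
    best = [0] + [None] * amount
    for a in range(1, amount + 1):
        options = [best[a - c] for c in coins
                   if c <= a and best[a - c] is not None]
        best[a] = min(options) + 1 if options else None
    return best[amount]

print(greedy_coins([1, 3, 4], 6))  # 3 coins: 4 + 1 + 1
print(dp_coins([1, 3, 4], 6))      # 2 coins: 3 + 3
```

With coins {1, 3, 4} and a target of 6, greedy settles for three coins while dynamic programming finds the two-coin answer.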

Download CSE Seminar Topic on Algorithm.

Seminar Report on Active Template Library

Introduction to Seminar Topic on Active Template Library:

In order to make COM (Component Object Model) programming more accessible and simple, Microsoft designed a set of template-based C++ classes, in other words a program library, which is popularly known as the Active Template Library. Earlier it was named the ActiveX Template Library.

The main advantage of using Visual C++ was its ability to adapt to COM classes, and this resulted in creating objects like ActiveX controls and automation servers. By suitable parameter substitution, a template can be expanded when invoked, resulting in the construction of new classes that execute the desired operations. Although there are various C++ libraries, ATL excels at delivering the desired functionality as a class that can be instantiated from a template.

ATL is distributed as source code and is widely known as a primary building block of Component Object Model objects. Code reusability is another factor, along with a variety of deployment options such as building the component into a dynamic link library. Another library that helps in component development is Microsoft's Foundation Class Library, or MFC. ATL has an advantage over MFC in that it is swifter, more reliable and simpler to use. If we are creating a new control from scratch, it is better to use ATL than MFC.

We create ATL applications inside the Visual C++ IDE by choosing ATL COM AppWizard as the application type and then selecting the appropriate server type. The necessary files are generated, and we can view them through the FileView tab inside the project workspace. The exports of the DLL are implemented in Test.cpp. Similarly, generated project files such as the .def, .idl and .rc files hold the export definitions, interface definitions and resource information. Complex controls can be developed using ATL along with COM objects and classes.

Download Seminar Report on Active Template Library.

Access 2007 Step by Step Study Material

When we create software, designing a good database is as important as creating the front-end design. A well-normalized relational database reduces the chance of logical bugs in the future. Access is a database management system created by Microsoft to solve the data storage problem, and it is part of the Microsoft Office suite. Access 2007 comes with the Microsoft Office 2007 edition and has major changes from its predecessors.

To use the Access database, we have to install the Microsoft Office software first; after installation we can find a Microsoft Office folder in our Start menu, and inside it the link to Microsoft Access 2007. We can also reach the same program by going to the Windows folder where Office was installed. We notice the change in design from Access 2003 almost immediately.

The first step is to create a blank database at a desired location. A table is the soul of a database, and we have to create tables according to the needs and demands of the software. Each table column must have a definite data type such as Text, Memo, Number, Date/Time, etc.

We must define a column with unique values in a table and assign the primary key constraint to it; most such columns will have AutoNumber as their data type. The field size for each column must be specified according to requirements. The table is complete only when we save it with a table name; by a common naming convention, the name starts with "tbl".
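The same relational ideas (a "tbl"-prefixed table, an auto-numbered primary key, typed columns) can be sketched with Python's built-in sqlite3 module. SQLite stands in for Access here purely for illustration; the table and column names are invented, and Access itself would be reached through ODBC rather than this API.

```python
# Illustrating a primary-key table design with SQLite (not Access itself).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE tblStudents (
        StudentID INTEGER PRIMARY KEY AUTOINCREMENT,  -- like AutoNumber
        Name      TEXT NOT NULL,                      -- like Text
        Joined    TEXT                                -- like Date/Time
    )
""")
conn.execute("INSERT INTO tblStudents (Name, Joined) VALUES (?, ?)",
             ("Asha", "2007-01-15"))
conn.execute("INSERT INTO tblStudents (Name, Joined) VALUES (?, ?)",
             ("Ravi", "2007-02-20"))
rows = conn.execute("SELECT StudentID, Name FROM tblStudents").fetchall()
print(rows)  # [(1, 'Asha'), (2, 'Ravi')] -- the key auto-numbers itself
```

The auto-incrementing StudentID plays the role Access gives to an AutoNumber primary key: every row gets a unique value without the application supplying one.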

We can connect the front end of our application to the Access database through the Open Database Connectivity driver, commonly known as the ODBC driver, and can populate, edit or search the tables by issuing specific queries. We can also assign a password to the Access database to provide more security. We can also create Access forms instead of separate front-end applications and write code inside them.

Download Access 2007 Step by Step Study Material.

Augmented Reality Seminar Abstract

Introduction to Augmented Reality Seminar Topic:

Most of us love playing video games; they often bring out the child in all of us. Video games have evolved over the generations: they started as big kiosk-type machines that consumed a lot of space, and now we have the small, portable PlayStation from Sony that we can carry anywhere we want. Still, consumers are not satisfied, as most gaming content offers only a 2-D or 3-D perspective and lacks a real feel.

We currently have wireless sticks on the market that connect to gaming consoles, which comes somewhat close to reality, as we can play games like golf and tennis by using the stick as a racket or club. The search for a real gaming experience went on, and it gave birth to a unique concept called augmented reality, or AR.

AR provides a real-time world environment and allows viewers to interact with the game live. This happens with the help of various augmenting inputs such as audio, video, computer graphics and even global positioning data. Augmented reality synchronizes the environment with graphical structures to provide the ultimate virtual-reality gaming experience. A user has to wear a special type of glasses to absorb this technology, and as more research is done on this topic in the coming years, we can expect even better results.

Some of the areas we can use Augmented Reality:

  • Video gaming consoles and mobile gaming
  • Virtual keyboard
  • Head mounted displays
  • Virtual retina displays
  • Location tracking and mapping
  • Sports telecasting
  • Automobile engineering
  • Information technology
  • Medical instruments
  • Projectors

AR technology mainly works with the help of sensors and can be seen as an extension of virtual reality technology. Users get a realistic experience while using it; for example, if we are watching a live telecast of a game, AR gives us the same ambience as sitting inside the stadium. AR is certainly one technology to look out for in the future.

Download Augmented Reality Seminar Abstract.

Seminar Topic on Google Search Engine

Introduction to Seminar Topic on Google Search Engine:

In earlier times we used to refer to books for every doubt or query that arose in our minds. It was time-consuming, as we had to go to a library or bookshop to get our hands on those books and then search through the pages for the desired content. The situation remained the same until a thing called the 'search engine' came over the horizon.

Google Inc. was the pioneer of the search engine revolution, so much so that the internet itself is sometimes referred to as Google. Google search has become so popular that we now rely on Google for almost everything, from searching articles and finding locations to buying goods and much more. With the comfort of searching anything and everything, we often forget that a lot of technique goes on behind these search engines.

A data center (DC) houses computer systems, telecommunication systems, storage mechanisms, backup and redundant power supplies, data communication connections, security devices, and environmental controls such as air conditioning and fire control. A computer program coded in such a way that it can browse the internet on its own is called a web crawler, also commonly known as a data crawler. Its behavior depends on a combination of policies: selection, re-visit, politeness and parallelization.

A web crawler browses data in a:

  • Methodical manner
  • Automated manner
  • Orderly fashion 

The factors that make Google one of the most preferred and swiftest search engines are its strong distributed network and excellent optimization techniques. The query submitted by the user is analyzed and checked against different combinations, which in turn are processed swiftly in parallel. The three distinct parts of Google that help in this cause are the GoogleBot web crawler, the indexer that sorts the data, and the query processor that handles user queries.
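The methodical, orderly crawl described above can be sketched with a toy crawler over an in-memory "web" (a dict mapping pages to their links; the pages and names are invented for illustration). It shows the essential machinery: a frontier queue, a visited set, and a simple selection policy (plain breadth-first order).

```python
# A toy breadth-first web crawler over an in-memory link graph.
from collections import deque

WEB = {
    "home":   ["about", "news"],
    "about":  ["home", "team"],
    "news":   ["home", "sports"],
    "team":   [],
    "sports": ["news"],
}

def crawl(seed):
    visited, order = set(), []
    frontier = deque([seed])          # pages discovered but not yet fetched
    while frontier:
        page = frontier.popleft()     # selection policy: oldest first (BFS)
        if page in visited:
            continue                  # re-visit policy: fetch each page once
        visited.add(page)
        order.append(page)
        frontier.extend(WEB.get(page, []))
    return order

print(crawl("home"))  # ['home', 'about', 'news', 'team', 'sports']
```

A real crawler adds politeness delays per host and parallel fetching, but the frontier-plus-visited-set skeleton stays the same.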

Download Seminar Topic on Google Search Engine.

Balanced Ant Colony Optimization BACO in Grid Computing Abstract

Introduction to Balanced Ant Colony Optimization BACO in Grid Computing:

Huge computing power and technique are required to solve complex and difficult scientific problems. The space required for storing data is also pretty large, as the solutions take up a lot of memory. Grid computing is an innovative computation technique through which we can manage a large number of tasks through an interactive workload distribution system. It mainly focuses on unused processing cycles and harnesses them to solve these problems.

Two types of grids:

  • Computing grid
  • Data grid

It would consume a lot of time to process, solve and store a large amount of data, and grid computing helps us do the same with less storage space and time. The status of resources and networks is closely monitored; if they are found to be unstable, the submitted job is likely to fail and will result in a large computation time. To bring more effectiveness to the job, a scheduling algorithm is proposed to schedule these jobs in the most effective manner.

This scheduling algorithm is extremely important, as hundreds of computers are used as resources and the task is impossible to do manually. Balanced Ant Colony Optimization, or BACO, is such a scheduling algorithm, used in the grid environment to schedule jobs effectively. Although there are other scheduling algorithms such as FCFS and SJF, BACO excels in the dynamic grid environment. Local searches are made extremely quick and efficient, and the scheduling strategy depends on the job types and the present environment.
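The pheromone idea behind ant-colony scheduling can be sketched in miniature (this is a generic ACO sketch, not the actual BACO algorithm; the job lengths, parameter values and names are invented). Each "ant" assigns jobs to machines, shorter schedules (smaller makespan) deposit more pheromone, and later ants are biased toward the assignments that worked well.

```python
# A minimal pheromone-based job scheduler in the spirit of ant colony
# optimization: reinforce job->machine assignments from good schedules.
import random

def aco_schedule(jobs, machines, ants=20, rounds=30, evap=0.5, seed=1):
    random.seed(seed)
    tau = [[1.0] * machines for _ in jobs]      # pheromone[job][machine]
    best_assign, best_makespan = None, float("inf")
    for _ in range(rounds):
        for _ in range(ants):
            loads = [0.0] * machines
            assign = []
            for j, length in enumerate(jobs):
                # pick a machine with probability proportional to pheromone
                m = random.choices(range(machines), weights=tau[j])[0]
                assign.append(m)
                loads[m] += length
            makespan = max(loads)               # finish time of the schedule
            if makespan < best_makespan:
                best_assign, best_makespan = assign, makespan
        # evaporation, then reinforcement along the best schedule so far
        for j in range(len(jobs)):
            for m in range(machines):
                tau[j][m] *= evap
            tau[j][best_assign[j]] += 1.0 / best_makespan
    return best_assign, best_makespan

jobs = [4, 3, 7, 2, 5, 1]                       # processing times
assign, makespan = aco_schedule(jobs, machines=2)
print(assign, makespan)   # the optimum makespan here is 11 (22 work / 2 machines)
```

Evaporation keeps old, stale pheromone from dominating, which is what lets the colony adapt when the environment (here, the job mix) changes, the property BACO relies on in a dynamic grid.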

Some of the problems solved with the help of BACO:

  • Traveling salesman problem
  • Vehicle routing problem
  • Graph coloring problem

The BACO algorithm comes in different variants:

  • Ant colony system
  • Max-min ant system
  • Fast ant system
  • Elitist ant system
  • Rank-based ant system

Download Balanced Ant Colony Optimization BACO in Grid Computing Abstract.

ECE Seminar Topic on Minimizing File Download Time in Stochastic Networks

Introduction to Seminar Topic on Minimizing File Download Time in Stochastic Networks:

Most of us share files with friends, colleagues, family and others; it is a basic thing we all do while using a computer. Peer-to-peer networking is a new and innovative networking process that is catching on with a lot of people in today's world. Such peer-to-peer file sharing uses a large chunk of bandwidth and clogs up the network traffic. In such scenarios, downloading a shared file over the network is a hugely time-consuming process. The slowness may also result from fluctuations in service capacity.

There are many factors affecting the time consumed in downloading files, namely:

  • The service capacity undergoes spatial heterogeneity across different peers
  • Within a single peer, the service capacity is subject to temporal fluctuations

From the two points we can make out that it is the service capacity of the peers that fluctuates, and as a result the download time takes a hit. There is a need for a simple, non-complex algorithm with whose help we can increase the performance of the network by minimizing the negative factors affecting the peers, thereby reducing the download time to a huge extent. As this network is used mainly for content distribution, a single system acts both as a client and as a server, making the system less centralized than other networks. The existing system has many flaws that can be solved by the proposed system.

Steps in the proposed system: 

  • The relationship between download time and service capacity is characterized
  • Proving that non-unique capacities will increase the downloading time
  • A stochastic capacity process defines the download time
  • Increased correlation among capacities increases the downloading time
  • Eliminating any negative impact on peers

The proposed modules include parallel downloading, random chunk-based downloading, random periodic switching, etc.
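A back-of-the-envelope simulation makes the parallel-downloading intuition concrete (the peer capacities and file size below are invented for illustration, and real schemes use smarter chunk allocation than the equal split shown here):

```python
# With spatially heterogeneous peers, fetching the whole file from one
# unlucky peer is far slower than spreading chunks across peers in parallel.

peer_capacity = {"peerA": 2.0, "peerB": 8.0, "peerC": 4.0}  # MB/s
file_size = 80.0                                            # MB

def single_peer_time(capacity):
    """Whole file downloaded from one peer."""
    return file_size / capacity

def parallel_time(capacities):
    """Equal-sized chunks fetched from all peers at once; the slowest
    peer determines when the download finishes."""
    chunk = file_size / len(capacities)
    return max(chunk / c for c in capacities)

worst_single = max(single_peer_time(c) for c in peer_capacity.values())
par = parallel_time(list(peer_capacity.values()))
print(worst_single, par)   # 40.0 seconds vs about 13.3 seconds
```

Note that even the parallel case is bottlenecked by the slowest peer, which is why chunk sizing proportional to capacity, or the random periodic switching mentioned above, improves things further.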

Download ECE Seminar Topic on Minimizing File Download Time in Stochastic Networks.

Seminar Report on Artificial Intelligence Pattern Recognition Using Neural Networks

Introduction to Seminar Topic on Artificial Intelligence Pattern Recognition Using Neural Networks:

Innovative technologies are the ones that take the world forward; the computer was one such innovation, and it made human work pacier and less error-prone. It is a common fact that electronic devices lack the ability to think and act according to a particular situation; this is because they have no intelligence of their own, and only once they are loaded with coded knowledge do they behave intelligently. This method of feeding knowledge to an electronic system is called artificial intelligence. One such application of artificial intelligence is pattern recognition in images with the help of neural networks. It is a simple technique in which the system adapts to the information sensed from the object.

An unpatterned sample is hard for a neural network to be trained on, as the population will be too large. A basic generalization can be built from the pattern, which helps the network learn the relations concerned. During the pattern recognition process, the input image pyramid is first sub-sampled, and extracted windows of 20x20 pixels are formed. The pixels then go through a pre-processing stage that includes lighting correction and histogram equalization.

The neural network then takes these pre-processed image pixels and assigns them as network input. They pass through receptive fields and come out as hidden units, and the final unit is given as output. The discriminants contain the basic structure of pattern recognition. The whole process involving neural networks is noise-tolerant, and this results in its extensive usage in the pattern recognition field.

Pattern recognition is also used in speech recognition, by reading lip movements and acoustic patterns. The technology is also used to detect facial expressions and in security systems in banks and companies. Pattern recognition technology using neural networks is definitely one to look out for in the future.

Download Seminar Report on Artificial Intelligence Pattern Recognition Using Neural Networks.

CSE Technical Seminar on Ubiquitous Smart Home

Introduction to CSE Technical Seminar on Ubiquitous Smart Home:

Most of us share a common dream of owning a house that has all the facilities and meets all our requirements. In an ordinary home we may have difficulty switching our lights and electronic devices on and off, identifying the visitor who has just rung the doorbell, and so on. With the help of this cool and unique concept we can come closer to the complete automation of our home. This article throws light on using omnipresent, ubiquitous computation socially, technically and ethically. The importance of such ubiquitous smart homes is growing in today's society.

Main features of a ubiquitous smart home:

  • Interoperability
  • Manageability
  • Reliability

The first and biggest challenge in creating a ubiquitous smart home arises during its construction itself. If the house is being built from scratch, that is the best time to adopt this innovative technology, since retrofitting an existing building requires a lot of upgrades.

A decision should be made on the boundary of the home and on which devices we plan to connect to the system. The range of the electronic devices and their usage should be kept properly in mind. Another main feature is impromptu interoperability: devices of different genres and makes should be able to operate within the environment without any sort of planning whatsoever.

There would be no system administrator, and most of these devices should be application-centric or utility-centric; moreover, these devices would be designed for domestic use only. The technology should satisfy the social needs of the user and should have an impact on the community we live in. The system should act on our choices and assumptions, and should involve the user when there is ambiguity.

Download CSE Technical Seminar on Ubiquitous Smart Home.

A Vectorizing Compiler for Multimedia Extension M.tech Seminar Report

Introduction to A Vectorizing Compiler for Multimedia Extension Seminar Topic:

Multimedia comprises visual, audio and text files integrated under one roof. A large number of processes happen during a transfer of multimedia files, including encoding, decoding and rendering. MMX, or multimedia extension, is a technology developed by Intel to go with its Pentium series of microprocessors, and it is used in various other embedded applications. The proposed compiler uses scalar expansion to detect vectorizable sections of code, and it also uses array dependence analysis to identify them.

Code transformation is the primary task of these compilers, which includes applications such as:

  • Strip Mining
  • Grouping and reduction
  • Loop fission and distribution
  • Scalar expansion 
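The first of these, strip mining, can be sketched in Python (a hypothetical example; the vector width and data are invented): the original element-by-element loop is rewritten as an outer loop over fixed-width "strips", which is exactly the shape a vectorizing compiler needs before mapping each strip onto one packed MMX-style instruction.

```python
# Strip mining: split a long loop into fixed-width strips so each strip
# could later become a single packed (SIMD/MMX-style) instruction.

VEC_WIDTH = 4   # e.g. four 16-bit values in one 64-bit MMX register

def add_scalar(a, b):
    """The original loop: one addition per iteration."""
    return [a[i] + b[i] for i in range(len(a))]

def add_strip_mined(a, b):
    """The transformed loop: an outer loop over strips of VEC_WIDTH."""
    out = []
    for start in range(0, len(a), VEC_WIDTH):
        stop = min(start + VEC_WIDTH, len(a))   # remainder strip at the end
        strip = [a[i] + b[i] for i in range(start, stop)]
        out.extend(strip)                        # one "packed add" per strip
    return out

a, b = list(range(10)), [1] * 10
assert add_strip_mined(a, b) == add_scalar(a, b)   # semantics unchanged
print(add_strip_mined(a, b))
```

The transformation changes only the loop structure, not the program semantics, which is the correctness condition the compiler's dependence analysis has to guarantee.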

Instructions in inline assembly are generated for the parallel sections of data. The generated code undergoes various testing processes to see if it satisfies multimedia benchmark levels. The vectorized result shows much better performance than the code without it. Usually we compress data before storing it, and multimedia processing is dominated by traits such as huge input/output requirements, large data-set sizes, and tiny native data types.

The specific case where data is processed in parallel after being packed into a word at a much lower precision is called subword parallelism. Intel microprocessors typically implement the x86 and x64 architectures, i.e. 32-bit and 64-bit words respectively. The instruction set architecture of such processors is extended, and these extensions are called the multimedia extensions, or MMX.

The proposed system notes the steps in a loop and, without touching the basic program semantics, executes successive instances in parallel. Writing such code directly in assembly language is pretty daunting and bug-prone; instead, enhanced system libraries and macro calls for the extended instruction set improve parallelism and decrease overhead. Portability can be obtained by vectorizing the whole process rather than tying it to the architecture of one multimedia instruction set. The Stanford University Intermediate Format is used for the compilation.

Download A Vectorizing Compiler for Multimedia Extension M.tech Seminar Report.