RSA Algorithm Computer Science Seminar Topics 2011

The RSA algorithm was developed by three scientists: Ron Rivest, Adi Shamir, and Leonard Adleman. This scheme is a block cipher in which the plaintext and ciphertext are integers between 0 and n-1 for some value n. The typical size of n is 1024 bits, or 309 decimal digits. RSA is a public-key encryption scheme.

The scheme uses two pairs of integers, {e, n} and {d, n}. The first, {e, n}, is called the RSA public key, and the second, {d, n}, is called the RSA secret key. The sender uses the public key to encrypt the message into ciphertext.

Encryption is expressed as C = M^e mod n.

Here C is the ciphertext and M is the message, or plaintext. At the receiving end, the receiver receives the ciphertext C and decrypts it back into M using the secret key {d, n}, computing M = C^d mod n.

KEY GENERATION: The method of key generation consists of the following steps:

1. Randomly select two prime numbers p and q, where p ≠ q.

2. Calculate n = p * q.

3. Calculate φ(n) = (p-1)(q-1).

4. Select an integer e such that 1 < e < φ(n) and gcd(e, φ(n)) = 1.

5. Calculate d such that d·e ≡ 1 (mod φ(n)). Then {e, n} is the public key and {d, n} is the secret key.
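A minimal sketch of these steps in Python, using tiny textbook primes for readability (a real deployment would use 1024-bit or larger primes plus a padding scheme such as OAEP, so this is illustrative only):

```python
# Toy textbook RSA following the key generation steps above.
from math import gcd

p, q = 61, 53                  # step 1: two distinct primes (toy sizes)
n = p * q                      # step 2: n = 3233
phi = (p - 1) * (q - 1)        # step 3: phi(n) = 3120

e = 17                         # step 4: 1 < e < phi(n), coprime to phi(n)
assert gcd(e, phi) == 1

d = pow(e, -1, phi)            # step 5: modular inverse (Python 3.8+), d = 2753

M = 65                         # a plaintext block, with 0 <= M < n
C = pow(M, e, n)               # encryption: C = M^e mod n
assert pow(C, d, n) == M       # decryption: M = C^d mod n
print(f"public key {{{e}, {n}}}, secret key {{{d}, {n}}}, ciphertext {C}")
```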

Applications of RSA: it is widely used for encryption and decryption of messages to secure communication, for digital signatures, for key distribution, and in e-commerce and remote banking.

Conclusion:

It is concluded that RSA is a powerful scheme and the one most widely used for digital signatures and encryption/decryption. It is more secure than DES and similar schemes. The key lengths RSA uses have increased over time for security reasons, which places a heavier processing load on applications; this is felt especially by electronic commerce sites that handle large numbers of transactions. RSA is also fundamentally easier to explain than ECC.

Download RSA Algorithm Computer Science Seminar Topics 2011.

Robotics Engineering Seminar Download

Robotics Engineering Seminar Download: Robotics is the science and technology of robots: their design, manufacture, and application. Work in robotics requires a working knowledge of mechanics, electronics, and software.

A person who works and designs in this field is a roboticist. A robot's mechanical structure is called a kinematic chain, which functions analogously to a human skeleton. The chain consists of actuators (its muscles), links (its bones), and joints.

Autism is a pervasive developmental disorder characterized by communicative and social impairments. Social robots recognize and respond to human social cues with particular behaviors.

The technology used in the construction of social robots can be a unique tool in the field of autism. This report discusses the impact social robots have made on the diagnosis, treatment, and understanding of autism.

The social disability of autism profoundly affects a person's capacity to feel for and understand other people and to establish reciprocal relationships. Today, autism remains a behaviorally specified disorder: there is no genetic screening, no blood test, and no functional imaging test for diagnosing it.

Motivation and engagement are essential in therapy, and many studies have demonstrated that robots elicit a high degree of engagement and motivation in socially interactive tasks, even compared with human therapists. Only three facial expressions are available from the Playtest device and the ESRA robot.

Fine-grained analyses of social capabilities have resulted from work on diagnostic and therapeutic applications, and they have the potential to enhance our understanding of autistic disorders.

Conclusion:

Recognizing and expressing affect is an important part of social participation. Children grow up picking up social cues in order to comprehend the nuances of social behavior and communication and to blend into everyday interaction. This research presents an approach that guides the teaching of emotion recognition to autistic children, a population with a heterogeneous disorder, and it is designed to offload some of the more tedious parts of the teacher's work.

Resource Sharing On Internet Computers Presentation

Resource Sharing On Internet Computers Presentation: The storage technology developed here is known as the Internet Backplane Protocol (IBP). It was designed to test hypotheses about, and explore the implications of, logistical networking, and it is the basic tool of that technology. Logistical networking is a field motivated by viewing data transmission and storage within a unified framework.

The Internet Backplane Protocol (IBP) was developed to support application-driven optimization of data movement. The technology manages storage within the network itself and allows an application to implement inter-process communication through data staging operations; with such staging, locality can be exploited and scarce buffer resources can be managed more effectively.

In the Internet, a large distributed system provides a communication service to the computer systems at its periphery. Considerations of robustness and scalability led its designers to select a stateless model: routers perform stateless data transfer operations, and control is calculated by routing algorithms.

Designers of large-scale computation systems, by contrast, often follow a shared-memory model, because the purely functional model places undesirable limitations on performance and control. Increasingly, the design of Internet-based information systems is adopting shared-memory approaches that support the management, rather than merely the access, of distributed data.

To provide a uniform interface to state management that is better integrated with the Internet, the Internet Backplane Protocol (IBP) was defined. IBP can be viewed as a mechanism for managing either remote files or communication buffers.
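The allocate-then-stage pattern the text describes can be sketched as follows; note that the class and method names here are invented for illustration and are not the actual IBP client API or wire protocol:

```python
# Conceptual model of network-managed storage buffers (names hypothetical).
class StorageDepot:
    """Models a network node that leases byte buffers to remote clients."""

    def __init__(self):
        self._buffers = {}
        self._next_handle = 0

    def allocate(self, size: int) -> int:
        """Lease a buffer of `size` bytes; return a handle (capability)."""
        self._next_handle += 1
        self._buffers[self._next_handle] = bytearray(size)
        return self._next_handle

    def store(self, handle: int, offset: int, data: bytes) -> None:
        """Write data into a leased buffer, e.g. to stage it near a consumer."""
        buf = self._buffers[handle]
        buf[offset:offset + len(data)] = data

    def read(self, handle: int, offset: int, length: int) -> bytes:
        """Read staged data back out of the leased buffer."""
        return bytes(self._buffers[handle][offset:offset + length])


# Staging a payload at an intermediate depot so a nearby consumer can fetch it.
depot = StorageDepot()
h = depot.allocate(1024)
depot.store(h, 0, b"staged payload")
print(depot.read(h, 0, 14))
```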

Conclusion:

This approach is based on the Internet model of resource sharing. It represents one general way of using storage resources to create a common distributed infrastructure, much as the Internet shares communication bandwidth. It is designed on the model of IP to allow storage resources to be shared by users and applications while maintaining the necessary security and protection against abuse. IBP serves mainly as an intermediate resource management component and is accessible to every end-system.

Download Resource Sharing On Internet Computers Presentation.

Remote Method Invocation Technical CSE Presentations

Remote Method Invocation (RMI) technology was first introduced in JDK 1.1. It lifts network programming to a higher plane: RMI is very easy to use, yet it is a remarkably powerful technology that exposes the average Java developer to a complete new paradigm, the world of distributed object computing. It has evolved since JDK 1.1 and was significantly upgraded under the Java 2 SDK.

The primary goal of RMI's designers was to enable programmers to develop distributed Java programs with the same syntax and semantics used for non-distributed programs. To do this, they had to carefully map how Java classes and objects work in a single Java Virtual Machine (JVM) onto a new model of distributed JVMs.

The RMI architecture defines how distributed, or remote, Java objects behave and how they differ in character from local Java objects: how and when exceptions can occur, how memory is managed, and how parameters are passed to and returned from remote methods.

The RMI architecture rests on one main principle: the definition of behavior and the implementation of behavior are separate concepts. RMI enables the code that defines the behavior and the code that implements it to remain separate and to run on separate JVMs.

The implementation of the remote service is coded in a class. Remembering that interfaces define behavior and classes define implementation is the key to understanding RMI. The RMI implementation is built on three abstraction layers: the Stub and Skeleton layer, the Remote Reference layer, and the Transport layer.
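Since this report's natural examples would be Java RMI itself, here is the same behavior/implementation split sketched instead with Python's standard xmlrpc modules, used purely as an analogy for RMI's stub layer (the Calculator service and port are invented for illustration):

```python
# Interface/implementation separation in the RMI style, sketched with
# Python's stdlib xmlrpc modules (an analogy to Java RMI, not RMI itself).
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy


class CalculatorImpl:
    """The implementation class: only the server side ever sees this code."""

    def add(self, a, b):
        return a + b


# The server registers the implementation behind a network endpoint.
server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_instance(CalculatorImpl())
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client holds only a proxy (the "stub"): method calls look local but
# are marshalled over the wire to the remote implementation.
stub = ServerProxy("http://localhost:8000")
print(stub.add(2, 3))  # -> 5, computed on the "remote" side
```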

Conclusion:

RMI is a distributed object model implemented in Java. The goal of the RMI architecture was to create a Java distributed object model that integrates naturally into the Java programming language and the local object model, and its architects have successfully created a system that extends the robustness and safety of the Java architecture to the distributed computing world.

Download Remote Method Invocation Technical CSE Presentations.

Red Tacton CSE Seminar Submission Report

Red Tacton CSE Seminar Submission Report: Red Tacton is a new Human Area Networking technology that uses the surface of the human body as a safe, high-speed network transmission path, and it has made many tasks easier.

The technology uses the weak electric field emitted on the surface of the human body. It relies on the principle that the optical properties of an electro-optic crystal vary with changes in a weak electric field. Beyond what WANs and LANs provide, many applications are best served by Human Area Networks (HANs). Ubiquitous computing, in which everything is networked, is going to have an impact on human society.

There are three main functional features of Red Tacton: Touch, Broadband & Interactive, and Any Media. With Touch, touching, walking, gripping, sitting, stepping, and other human movements can be the triggers for starting or stopping equipment, locking or unlocking, or obtaining data. With Broadband & Interactive, drivers can be downloaded instantly and executable programs sent quickly. With Any Media, many dielectrics and conductors can serve as transmission media, and conductors and dielectrics can be used in combination.

Red Tacton transceiver prototypes come in several forms: PC card type, hub type, and box type.

Applications of Red Tacton: elimination of human error with the help of alarm sounds, marketing applications, personalization of mobile phones and automobiles, touching a printer to print, instant private data exchange, and security applications for user verification and lock management at entrances.

Red Tacton CSE Seminar Submission Report Conclusion:

It is concluded that, compared with other advanced technologies of today's era, Red Tacton provides better performance. Red Tacton is best suited to short-distance connection networks, and there is little threat from hackers since the body itself is the transmission medium.

Download Red Tacton CSE Seminar Submission Report.

Real Time Task Scheduling Download Seminar Report

Real Time Task Scheduling Download Seminar Report: Real-time task scheduling is used when rigid time requirements are placed on the operation of a processor or on the flow of data. Hence it is often used in control devices in dedicated applications.

The aim of real-time scheduling is to execute critical control tasks within their deadlines. The allocation/scheduling problem is: given a set of tasks, their characteristics, resource requirements, precedence constraints, and deadlines, devise an allocation and schedule on a given system.

A task requires some execution time on a processor and may require one or more other resources, such as an amount of memory or access to a bus; tasks may also have precedence constraints among them. A physical resource may be exclusive or non-exclusive depending on the operation to be performed.

The time at which all the data required to begin executing are available is the release time of a task, and the time by which it must complete its execution is its deadline. The deadline may be hard or soft depending on the nature of the task. The absolute deadline minus the release time is the relative deadline of a task.

There are three types of tasks: periodic, sporadic, and aperiodic. If a task Ti is released at regular intervals it is a periodic task; if it is not periodic but is invoked at irregular intervals it is a sporadic task. Sporadic tasks are sometimes also called aperiodic tasks.
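A toy Python sketch of scheduling such periodic tasks (the task set and its parameters are invented): each task is released at multiples of its period, and at every time unit the ready task with the earliest absolute deadline runs, i.e. earliest-deadline-first:

```python
# Toy preemptive EDF dispatcher over periodic tasks. Task parameters are
# (period, execution time); relative deadline = period here for simplicity.
tasks = {"T1": (4, 1), "T2": (5, 2), "T3": (10, 3)}

remaining = {name: 0 for name in tasks}   # unfinished work per task
deadline = {name: 0 for name in tasks}    # current absolute deadline

for t in range(20):                       # simulate 20 time units
    for name, (period, cost) in tasks.items():
        if t % period == 0:               # release time: a new job arrives
            remaining[name] = cost
            deadline[name] = t + period   # absolute deadline of this job
    ready = [n for n in tasks if remaining[n] > 0]
    if ready:
        run = min(ready, key=lambda n: deadline[n])  # EDF choice
        remaining[run] -= 1
        print(f"t={t:2d}: run {run} (deadline {deadline[run]})")
    else:
        print(f"t={t:2d}: idle")
```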

Conclusion:

The process of deciding which task to execute next is called dispatching. Simple priority assignment policies are based on the time constraints associated with a task. More general priority assignment covers the scheduling of tasks with precedence constraints, timing constraints, resource constraints, and arbitrary values on multiprocessors.

The two main considerations are schedule construction and feasibility checking. The complexity of such an algorithm does not depend on the number of tasks but only on the number of processors.

RDBMS Database Management System Seminar

RDBMS Database Management System Seminar: A database is a collection of data with implicit meaning. A Database Management System (DBMS) is a collection of programs that allows users to create and maintain a database. A database system is the combination of the database and the DBMS software.

The advantages of a DBMS: redundancy is controlled, unauthorized access is restricted, multiple user interfaces are provided, integrity constraints are enforced, and backup and recovery are provided.

A data model is a collection of conceptual tools for describing data, data relationships, data constraints, and semantics. The ER data model is based on a perception of the real world that consists of basic objects called entities and of relationships among those objects; entities are described by a set of attributes. The object-oriented model is based on a collection of objects, where an object consists of values stored in instance variables.

An entity is a thing in the real world with an independent existence, and an entity set is the collection of all entities of a particular entity type. An attribute is a property that describes an entity. A relation is defined as a set of tuples, where each tuple is an ordered list of n values, and the degree of a relation is the number of attributes in its relation schema. A relationship is an association between two or more entities, a relationship set is a collection of similar relationships, and the degree of a relationship type is the number of participating entity types.
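These definitions map directly onto a relational database. A small sketch using Python's built-in sqlite3 module (the schema is invented for illustration): entity sets become tables, attributes become columns, each tuple is a row, and a relationship set becomes a table that references the participating entity types:

```python
# Entity sets as tables, attributes as columns, tuples as rows, and a
# relationship set as a linking table (schema names are illustrative).
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Two entity sets, each described by attributes; student has degree 2.
cur.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE course  (id INTEGER PRIMARY KEY, title TEXT)")

# A binary relationship set (degree 2: two participating entity types).
cur.execute("""CREATE TABLE enrolled (
                   student_id INTEGER REFERENCES student(id),
                   course_id  INTEGER REFERENCES course(id))""")

cur.execute("INSERT INTO student VALUES (1, 'Asha')")
cur.execute("INSERT INTO course  VALUES (10, 'DBMS')")
cur.execute("INSERT INTO enrolled VALUES (1, 10)")

# Each row returned is one tuple of the joined relation.
for row in cur.execute("""SELECT s.name, c.title
                          FROM student s
                          JOIN enrolled e ON e.student_id = s.id
                          JOIN course   c ON c.id = e.course_id"""):
    print(row)   # ('Asha', 'DBMS')
```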

Conclusion:

A database schema is specified by a set of definitions expressed in a special language called DDL (Data Definition Language). VDL (View Definition Language) specifies user views and their mappings to the conceptual schema, and DML (Data Manipulation Language) allows users to access or manipulate data as organized by the appropriate data model. 1NF (First Normal Form) requires that the domain of each attribute include only atomic values; full functional dependency builds on the concept of functional dependency, and there are further normal forms such as 2NF, 3NF, BCNF (Boyce-Codd Normal Form), 4NF, and 5NF. Atomicity means that either all actions of a transaction are carried out or none are. Aggregation is a relationship between entities and relationships.

Download RDBMS Database Management System Seminar.

BE BTECH MCA Seminars Quantum Cryptography

BE BTECH MCA Seminars Quantum Cryptography: Cryptography is the conversion of data into a scrambled code that can be sent over a wide area network and deciphered at the receiving end. It is used throughout network communication, such as electronic transactions, the Internet, e-mail, and mobile phones. The encrypted message is transmitted, and the receiver decrypts it by unscrambling the transmission. A key mathematical discovery, public key cryptography (PKC), provides the keys used to scramble and unscramble these transformations.

Classical cryptography relies on the limits of computing technology to keep eavesdroppers from recovering the content of encrypted messages, whereas in quantum cryptography the data is secured by the laws of physics. Cryptography is the art of devising ciphers and codes, cryptanalysis is the art of breaking them, and cryptology is the combination of the two.

There are two kinds of cryptographic techniques: public key encryption and secret key encryption. PKC was mentioned above, and the RSA algorithm is an example of it. In secret key encryption, by contrast, two users share a k-bit "secret key" that is used to transform plaintext inputs into cryptotext for transmission and back into plaintext upon receipt.

The basic idea of quantum cryptography rests on the fact that electromagnetic waves exhibit the phenomenon of polarization, in which the electric field vibrations stay constant or vary in a definite way. The Heisenberg uncertainty principle forms the foundation of quantum cryptography.

The fundamental security of this cryptography comes from the fact that each qubit of information is carried by a single photon, and each photon is altered as soon as it is read once. On its own, however, the technique has no protection against the classic bucket brigade (man-in-the-middle) attack.
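The basis-matching logic behind polarization-based quantum key exchange (the BB84 scheme) can be illustrated with a purely classical simulation; the real security comes from physics, which no classical program can reproduce:

```python
# Classical toy simulation of BB84-style key distillation.
import random

n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]   # polarization bases

# Bob measures each photon in a randomly chosen basis. If his basis matches
# Alice's he reads her bit; otherwise measuring in the wrong basis disturbs
# the photon and his result is random.
bob_bases = [random.choice("+x") for _ in range(n)]
bob_bits = [a if ab == bb else random.randint(0, 1)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Over a public channel they compare bases (never bits) and keep only the
# positions where the bases agree: those bits form the shared secret key.
key_alice = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)
             if ab == bb]
key_bob   = [b for b, ab, bb in zip(bob_bits, alice_bases, bob_bases)
             if ab == bb]
assert key_alice == key_bob
print("shared key:", "".join(map(str, key_alice)))
```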

Conclusion:

Quantum cryptography aims to revolutionize secure network communication by providing security based on the fundamental laws of physics instead of on computing technology or mathematical algorithms. The performance of such systems has been continuously improved, and systems implementing these methods already exist.

Download BE BTECH MCA Seminars Quantum Cryptography.

Quantum Computers Technical Presentation

Quantum Computers Technical Presentation: A quantum computer is a device for computation that makes direct use of quantum mechanical phenomena to perform operations on data, in contrast to a classical computer, which is based on transistors. The basic aim of quantum computation is to use quantum properties to represent data and perform operations on it, exploiting quantum-mechanical interactions.

In a quantum computer, the basic unit of information, called a quantum bit or qubit, is not simply binary. A qubit can exist as a one, a zero, or a superposition of both, with a probability attached to each outcome; this property differs radically from the laws of classical physics. A machine with a given number of qubits is therefore exponentially more complex to describe than a classical computer with the same number of bits.
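A qubit in superposition can be modelled classically as a vector of complex amplitudes, which also shows why simulating n qubits costs 2^n amplitudes; a minimal sketch using numpy (a simulation of the idea, not real quantum hardware):

```python
# State-vector sketch of a single qubit and the exponential register cost.
import numpy as np

zero = np.array([1, 0], dtype=complex)        # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ zero                              # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2                    # measurement probabilities
print(probs)                                  # [0.5 0.5]: both 0 and 1

# n qubits need a 2**n-long amplitude vector: exponential classical cost.
n = 3
register = np.zeros(2 ** n, dtype=complex)
register[0] = 1                               # register initialized to |000>
print(register.size)                          # 8 amplitudes for 3 qubits
```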

Quantum computation languages and systems have been developed to allow simulation of quantum algorithms; examples include QCL, Q-gol, and Qubiter. Many uses have been proposed for quantum computation, from modelling quantum mechanical systems, breaking public key encryption, searching databases, and generating true random numbers to providing secure communication using quantum key distribution.

These computers work on quantum phenomena. A large-scale quantum computer would be able to solve certain problems, such as integer factorization, faster than any classical computer using currently known algorithms.

The applications of quantum computers include breaking ciphers, statistical analysis, factoring large numbers, solving problems in theoretical physics, and solving optimization problems in many variables.

A quantum operating system (QOS) has been created to bring operating software up to date; it is designed to coordinate the configuration and timing of, and to process, the many output signals of quantum computers. Factorization underlies much of today's encryption technology, and with the help of quantum technology computers could factor, and thus decrypt, far faster.

Conclusion:

Today, quantum information technology and quantum computers remain at a pioneering stage. Quantum hardware is still an emerging field, but quantum computers are expected to come forth as the more powerful computational devices. Quantum computation has its origins in highly specialized areas of theoretical physics.

Download Quantum Computers Technical Presentation.

QTP – Testing Seminar Topic Download

QTP – Testing Seminar Topic Download: Test automation is the use of software to control the execution of tests and to compare the actual results against predicted or expected results.

There are many tools and software programs used to test software. Automation tools reduce test time and cost, avoid errors, and require less human intervention, and an automated testing tool is consistent and repeatable.

This presentation briefly describes an automated testing tool called QuickTest Professional (QTP). It supports both web and Windows applications, and technologies such as .NET, J2EE, mainframe, XML, Java, ERP, SAP, and Siebel. The software runs only on the Windows platform and does not support UNIX, Linux, etc.

QuickTest is a Graphical User Interface (GUI) testing tool that lets a user test web-based or client-based software applications. Versions of QTP include 5.5, 5.6, 6.0, 6.5, 8.0, 9.0, and 9.2; QTP 9.2 is the current version at the time of writing.

The QTP testing process: Record & Run settings, develop the automation test, enhance/customize the automation test, debug the automation test, execute the automation test, analyze the test results, and report defects in a bug tracking tool.

There are many Add-ins that provide QTP's support for particular technologies, such as ActiveX, VB, Web, Java, Oracle, PeopleSoft, .NET, Terminal Emulator, SAP, and Siebel.

A checkpoint is a specialized step that compares two values, expected and actual, and reports the result. QTP runs in normal and fast modes, and exception handling enables QuickTest to detect and handle errors that occur during execution.
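Conceptually, a checkpoint is just an expected-versus-actual comparison with a reported verdict. QTP scripts this in VBScript through its own object model; the Python sketch below, with invented values, only illustrates the idea:

```python
# Conceptual checkpoint: compare expected vs. actual and report a verdict.
def checkpoint(name: str, expected, actual) -> bool:
    """Compare an expected value with the actual value and report the result."""
    passed = expected == actual
    print(f"Checkpoint '{name}': {'PASS' if passed else 'FAIL'} "
          f"(expected={expected!r}, actual={actual!r})")
    return passed

# e.g. verifying values captured during a test run (values invented)
checkpoint("login page title", "Welcome", "Welcome")   # PASS
checkpoint("cart item count", 3, 2)                    # FAIL
```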

QTP – Testing Seminar Topic Download Conclusion:

Manual testing is time-consuming and requires heavy investment. Automated testing with QuickTest dramatically speeds up the testing process. The benefits of automated QTP testing: it runs faster than human users, it is reliable and eliminates human error, and it is repeatable.