Processing Methods of Digital Image Processing

Description: The research paper Processing Methods of Digital Image Processing suggests that the methods adopted for image processing in earlier times still serve as fundamental concepts, even though digital technology has revolutionized image processing techniques. Optical applications such as holography are still in use. Digital processing allows enhancing the quality of required features while attenuating features that are of negligible interest.

Digital image processing is a subset of the electronic domain wherein the image is converted to an array of small integers, called pixels, representing a physical quantity such as scene radiance, stored in a digital memory, and processed by computer or other digital hardware. Digital image processing, whether as enhancement for human observers or as autonomous analysis, offers advantages in cost, speed, and flexibility, and with the rapidly falling price and rising performance of personal computers it has become the dominant method in use.

What is Digital Image Processing?

An image may be defined as a two-dimensional function, f(x, y), where x and y are spatial (plane) coordinates, and the amplitude of f at any pair of coordinates (x, y) is called the intensity or gray level of the image at that point. When x, y, and the amplitude values of f are all finite, discrete quantities, we call the image a digital image. The field of digital image processing refers to processing digital images by means of a digital computer. Note that a digital image is composed of a finite number of elements, each of which has a particular location and value. These elements are referred to as picture elements, image elements, pels, and pixels. Pixel is the term most widely used to denote the elements of a digital image.
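The definition above can be illustrated with a minimal sketch: a grayscale digital image is simply a finite grid of intensity values f(x, y). The pixel values below are invented for illustration.

```python
# A grayscale digital image as a 2D grid of pixel intensities f(x, y).
# Values are 8-bit gray levels (0 = black, 255 = white), chosen purely
# for illustration.
image = [
    [0,   64,  128, 192],
    [32,  96,  160, 224],
    [64,  128, 192, 255],
]

height = len(image)     # number of rows (y values)
width = len(image[0])   # number of columns (x values)

# The intensity (gray level) of the image at the point (x=2, y=1):
x, y = 2, 1
print(image[y][x])  # -> 160
```

Because every pixel has a definite location (x, y) and a finite value, any enhancement step is just arithmetic over this grid.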

Conclusion: The research paper concludes by saying that today the image processing field is one of the fastest growing fields and holds great promise for the future.

Download Processing Methods of Digital Image Processing Technical CSE Paper Presentation.

DCOM Technical View Distributed Applications Seminar

Description: The research paper DCOM Technical View Distributed Applications Seminar talks about distributed applications. Some applications are inherently distributed in nature, meaning they have many end users; typical examples are chat, games, and email. Microsoft® Distributed COM (DCOM) extends the Component Object Model (COM) to support communication among objects on different computers, whether on a LAN, a WAN, or even the Internet. With DCOM, your application can be distributed at the locations that make the most sense to your customer and to the application. The research paper meticulously explains the DCOM architecture.

Why Write Distributed Applications: Distributed applications are written for tasks that inherently involve multiple users. Beyond these, there are applications that involve at least two users but, because they were not designed as distributed applications, are limited in scalability and ease of deployment. Any kind of workflow or groupware application, most client/server applications, and even some desktop productivity applications basically control the way their users communicate and cooperate. Thinking of these applications as distributed applications and running the right components in the right places benefits the user and optimizes the use of network and computer resources. An application designed with distribution in mind can accommodate different clients with different capabilities by running components on the client side when possible and on the server side when necessary.

Conclusion

The research paper concludes by suggesting that DCOM makes it easy to write a distributed application that:

  1. Scales from the smallest single computer environment to the biggest pool of server machines.
  2. Provides a rich and symmetric communication between components.
  3. Has a facility to robustly expand to meet new functional requirements.
  4. Can take advantage of existing custom and off-the-shelf components.
  5. Can integrate teams proficient in any programming language and development tool.
  6. Can use network bandwidth carefully, while providing great response times for end-users.
  7. Is inherently secure.
  8. Provides a smooth migration path to sophisticated load-balancing and fault-tolerance features.
  9. Can be efficiently deployed and administered.

Download DCOM Technical View Distributed Applications Seminar.

Data Base Designs Help Systems Technical Paper Presentation

Description: The research paper Data Base Designs Help Systems Technical Paper Presentation talks about the IT help desk and database designs. The research paper suggests that IT problems are relatively tough to deal with. IT problems are solved by IT technicians, and sometimes many technicians might work on a single problem. A problem arises when an issue is resolved by many technicians: each technician may have his own way of resolving it, and there could be a lot of clash and confusion in a scenario like this.

Drawbacks of help systems: The research paper explains in depth the problems that might arise in Help Systems. They are:

  1. Help systems are inherently problematic.
  2. A problem is viewed by several technicians, each specializing in a different area of expertise.
  3. A technician might not be well versed in tackling a problem if it falls outside his zone of specialization.
  4. There is no record of the way the problem was resolved.
  5. The manager is unaware of the number of technicians working on a single issue.

Conclusion: The following solutions have been identified to tackle the problems of help desk systems:

Solutions offered:

  1. It is better if the manager assigns a specific problem to a specific technician.
  2. The manager should build a team by analyzing each technician's area of expertise.
  3. Technicians are then assigned to a problem.
  4. The assigned technicians then start tackling it.

Download Data Base Designs Help Systems Technical Paper Presentation.

Data Base Testing Technical Paper Presentation

Description: The research presentation Data Base Testing Technical Paper Presentation talks about database testing. The presentation suggests that database testing mainly concentrates on the following:

  • Data Integrity test
  • Stored Procedure test
  • Type test
  • Data Size Test
  • Event Driven Item Test
  • Input Item Verification

What is a Data Integrity Test: Once a value undergoes any of the above actions (Update / Delete / Insert), the database should be verified for the changes performed on related entities, i.e., foreign key / primary key and all dependent entities.
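Such a check can be sketched minimally with SQLite; the table and column names (customers / orders) are invented for illustration. After an action on one table, the foreign-key relationship to the dependent table is verified.

```python
import sqlite3

# Hypothetical parent/child tables for a data-integrity test.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite disables FK checks by default
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY)")
conn.execute("""CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER NOT NULL REFERENCES customers(id))""")

conn.execute("INSERT INTO customers (id) VALUES (1)")
conn.execute("INSERT INTO orders (id, customer_id) VALUES (10, 1)")

# Integrity check: an Insert on the child table that references a
# missing parent row must be rejected as a foreign-key violation.
try:
    conn.execute("INSERT INTO orders (id, customer_id) VALUES (11, 99)")
    print("integrity violated")
except sqlite3.IntegrityError:
    print("foreign key enforced")  # -> this branch runs
```

The same pattern applies after Update and Delete: rerun queries against the dependent entities and confirm no orphaned rows remain.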

Stored Procedure Test: Every stored procedure is to be tested separately for its functionality (based on the separate functions it performs). Stored procedures need to be broken up into action items based on functions, and then each action item needs to be tested separately, as the results of complete stored-procedure execution may differ from the results obtained by partial execution. This also helps in validating the modularity of the code (white-box testing).

In the case of stored procedures, to come up with test cases one can consider the following: 

1. The number of arguments being passed

2. The data type of each of the arguments being passed

3. The order of the arguments being passed

4. The return value

5. The data type of the return value

Based on these, you can write both positive and negative test cases. Consider a simple example of a stored procedure taking two numbers as input and returning their sum.
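The example can be sketched as follows. Since this is only an illustration, the stored procedure is modelled as a plain function (usp_add_numbers is an invented name), and each test case exercises one of the five considerations listed above.

```python
def usp_add_numbers(a, b):
    """Stands in for a hypothetical stored procedure that takes two
    numbers and returns their sum."""
    if not isinstance(a, (int, float)) or not isinstance(b, (int, float)):
        raise TypeError("both arguments must be numbers")
    return a + b

# Positive test cases: correct argument count, types, order, and
# return value.
assert usp_add_numbers(2, 3) == 5
assert usp_add_numbers(-1, 1) == 0
assert isinstance(usp_add_numbers(2.5, 0.5), float)  # return-value type

# Negative test cases: wrong argument type, wrong argument count.
try:
    usp_add_numbers("2", 3)   # wrong data type for an argument
except TypeError:
    pass

try:
    usp_add_numbers(2)        # wrong number of arguments
except TypeError:
    pass
```

Against a real database the same cases would be issued through the driver's procedure-call interface, but the positive/negative split is identical.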

Input Item Verification: This is the process of verifying the input items. (Though this is not strictly a part of database testing, it has to be performed during database testing of web-based applications.)

Often it is seen that the input items (text box / RTB / combo box / ActiveX controls) are tested for validation only at the front end (screen testing), but they also need to be tested with junk character values to confirm that they do not push in characters that the database would misrepresent or replace with other characters. (This testing can partially be performed by the developer during unit testing as well.)
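A back-end junk-character check of this kind can be sketched as below; the allowed-character policy is an assumption invented for illustration, and a real application would tailor it per field.

```python
import re

# Hypothetical policy: a field may contain only letters, digits,
# spaces, and basic punctuation; anything else is treated as a junk
# character the database might misrepresent or replace.
ALLOWED = re.compile(r"[A-Za-z0-9 .,'-]*")

def contains_junk(value):
    """Return True if the input holds characters outside the policy."""
    return ALLOWED.fullmatch(value) is None

print(contains_junk("John O'Brien"))          # -> False
print(contains_junk("Robert'); DROP--\x00"))  # -> True
```

Running such a check at the database-testing stage catches values that slipped past front-end (screen) validation.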

Download Data Base Testing Technical Paper Presentation.

Data Validations for Secure Web Applications

Description: The research paper Data Validations for Secure Web Applications speaks of data validation and secure web applications. It is suggested in the research abstract that the information shared on a website is not altogether secure; in simpler words, it is highly vulnerable. There is every need to secure the data and information published on our websites, but with the increase in the flow of information through the medium of the internet there is a correspondingly high increase in the vulnerability factor too. Hackers hide in every nook and corner of the virtual world, eager to steal your information in order to benefit their own business concerns. This research abstract speaks about ways of securing information in a highly threatening scenario like this.

A modular approach has been suggested as one of the most effective ways of securing the information. The paper talks about how to introduce such a scheme in a web application. It says that:

  • Data should be validated in the data model, where the validation rules have maximum scope for interpreting the context; and
  • Escaping of harmful meta-characters should be performed just before the data is processed, typically in the data access components.
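The two bullets can be sketched as follows; the class and function names (Comment, render) are invented for illustration. The model layer validates where the context is known, and escaping of meta-characters happens only at the point where the data is actually processed, here HTML output.

```python
import html

# Model layer: validation rules live where context is known.
class Comment:
    def __init__(self, author, body):
        if not (1 <= len(author) <= 40):
            raise ValueError("author must be 1-40 characters")
        if not body.strip():
            raise ValueError("body must not be empty")
        self.author = author
        self.body = body

# Processing layer: escape harmful meta-characters just before the
# data is used (here, rendered into HTML).
def render(comment):
    return "<p><b>%s</b>: %s</p>" % (
        html.escape(comment.author), html.escape(comment.body))

c = Comment("eve", "<script>alert(1)</script>")
print(render(c))  # the script tag is escaped and rendered inert
```

Separating the two concerns keeps validation context-aware while guaranteeing that escaping is never forgotten at the output boundary.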

When inadequate security measures are implemented, the following things might take place: an attacker could subvert the application logic, execute unauthorized commands or code on backend systems, or compromise the trust the user has in the application. Parameter manipulation, code injection, cross-site scripting, SQL injection, and operating-system command injection are some of the possible attacks.
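As a sketch of how one of these attacks, SQL injection, is neutralized: parameterized queries keep attacker-supplied text bound as data rather than parsed as SQL. The table and input values below are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

# Attacker-supplied input attempting a classic SQL injection.
name = "alice' --"

# Parameterized query: the driver binds the value as data, so the
# quote and comment sequence never reach the SQL parser as syntax.
rows = conn.execute(
    "SELECT name FROM users WHERE name = ?", (name,)).fetchall()
print(rows)  # -> [] : no user is literally named "alice' --"
```

Had the query been built by string concatenation, the trailing `--` would have commented out the rest of the WHERE clause.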

Conclusion: The research paper concludes on the note that it is almost an ocean of information that passes through the medium of the internet these days. In a scenario like this there is every possibility of an attack, and some approaches can be adopted to safeguard the data. The modular approach to data validation and verification is one very effective way of securing data from attack.

Download Data Validations for Secure Web Applications Technical White Paper.

Data Warehousing and Data Mining

Description: The research paper Data Warehousing and Data Mining describes data warehousing and mining techniques. It has been suggested in the research paper that there has been an increase in knowledge and information in colossal proportions ever since the advent of man on the earth. The knowledge and information thus produced and discovered have been helping the human race to evolve. Systematic archiving of the information and data generated becomes important when one talks about database management. Data mining and warehousing techniques hold interesting solutions for database management systems.

What is mining and warehousing: Data mining helps in archiving information in understandable formats. Data mining also helps in extracting hidden information from the ocean of information; it can discover information that proves to be 'gold' to the organization, hence the name. By applying data mining techniques, companies can lay their hands on exquisite information that gives them an edge over their competitors. In this way data mining helps make the procedures of the company extremely transparent, heightening the customer loyalty factor. Data mining has some wonderful solutions to offer the customer fraternity to uphold their integrity and interest. Thus mining helps the company grow in a shorter period of time.

Warehousing, on the other hand, helps in analyzing the data and organizing it into patterns so as to analyze, evaluate, predict, and forecast. Data warehousing helps companies evaluate their own strengths and weaknesses against the backdrop of their competitors'. Data warehousing is constantly coming up with better performance tools, such as the Semantic Web, to gain both an edge and security over data management processes.

Conclusion: Both techniques aim at presenting data and information in the purest form so as to facilitate business development. Much has been achieved in the field of DBMS; still, much remains to be achieved. The reason is that knowledge, information, and data increase from second to second and almost double every few years. Hence better techniques to cope with information in such colossal form ought to be developed simultaneously.

Download Data Warehousing and Data Mining Technical White paper.

Data Warehousing Semantic Web Final Project Seminar Report

Description: The research paper Data Warehousing Semantic Web Final Project Seminar Report suggests that data warehousing has now come up with new trends such as the Semantic Web, which is more user-friendly. The paper suggests that there has been an astronomical increase in the amount of data produced in recent years. Data that passes through networked systems is vulnerable to security threats, and not all data that is produced can be shared or viewed.

What is the Semantic Web: The Semantic Web offers viable solutions to problems like these.

The Semantic Web is a mesh of information linked in such a way that it can easily be processed by machines at a very large scale, i.e. globally. The Semantic Web is about extracting information from many sources and putting it into understandable formats. The Semantic Web, as the name itself suggests, helps in creating a language for data processing. It also aims at the secure transmission of huge amounts of data.

Data that is generally hidden away in HTML files is often useful in some contexts, but not in others. The problem with the majority of data on the web in this form at the moment is that it is difficult to use on a large scale, because there is no global system for publishing data in such a way that it can be easily processed by anyone. Technically, the WWW means a set of protocols and languages driven by a strong standards approach, namely URI, HTTP, HTML, and XML. The principles involved are:

1) Implementation and platform independence, which is crucial; and

2) The World Wide Web Consortium (W3C), the most prominent standards body.
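The idea of machine-processable linked data can be sketched with the Semantic Web's basic unit, the subject-predicate-object triple. The URIs and relations below are invented examples, and the in-memory store is only a toy stand-in for a real RDF triple store.

```python
# A tiny in-memory triple store: data is published as
# subject-predicate-object statements that any machine can query
# uniformly. URIs below are invented examples.
triples = [
    ("http://example.org/alice", "knows",   "http://example.org/bob"),
    ("http://example.org/alice", "worksAt", "http://example.org/acme"),
    ("http://example.org/bob",   "knows",   "http://example.org/carol"),
]

def query(subject=None, predicate=None, obj=None):
    """Return every triple matching the pattern (None = wildcard)."""
    return [(s, p, o) for (s, p, o) in triples
            if subject in (None, s)
            and predicate in (None, p)
            and obj in (None, o)]

# Whom does Alice know?
print(query(subject="http://example.org/alice", predicate="knows"))
```

Because every fact has the same three-part shape, data from many independent sources can be merged and queried by the same machinery at global scale.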

Conclusion: The research paper suggests that the Semantic Web is still in an incipient stage and has to evolve to a very large extent. The Semantic Web does offer a lot of solutions for the security of data processed and transmitted, and it enhances the trustworthiness of a document.

Download Data Warehousing Semantic Web Final Project Seminar Report.

CSE Seminar on Data Warehousing and Data Mining

Description: The research paper CSE Seminar on Data Warehousing and Data Mining talks extensively about data mining and warehousing techniques. It says that these days businesses depend heavily upon information, which is churned out at an astronomical pace. Companies not only churn out data but must manage it almost simultaneously. Data generated from innumerable hours of hard work and research resides in a company, and the information thus generated is important for understanding the scope of the business and future business trends. Data not only has to be managed but has to be archived and presented to customers and during business meetings held to do some quintessential analyses.

The research paper speaks about the drawbacks of these techniques. It says that although there has been an information explosion, effective ways of managing this information, patterning it, and archiving it in comparative and contrastive formats have still not been achieved. The research paper suggests that these techniques have to evolve alongside the information explosion almost unendingly. The paper talks about mining and warehousing and says that warehousing is currently aiming to come up with techniques that make all decision making go 'online'.

Data mining lives up to its name. One of the main objectives of data mining is to discover the 'hidden gold', i.e. very valuable information that has gone unnoticed in the data. Owing to the viability of such discoveries, data mining techniques have merged with neural networks in order to offer some of the best database management techniques of their kind.

Conclusion: Data mining and warehousing thus help in uncovering valuable hidden data in order to come up with better customer care features. Data mining and warehousing have the intrinsic potential to make business processes transparent, thus escalating the customer-loyalty factor. Besides helping organizations predict, analyze, compare, and contrast business procedures, these tools also help in achieving very quick decision making, which is fundamental to any business action.

Download CSE Seminar on Data Warehousing and Data Mining technical Student Seminar.

Data Warehousing Seminar Report and Data Mining Seminar Report

Description: It is suggested in the research paper Data Warehousing Seminar Report and Data Mining Seminar Report that companies now rely heavily on data mining and warehousing techniques to look into their business trends, do competitor analyses, and understand their own strengths and weaknesses as well as those of their competitors. Data mining and warehousing techniques are beneficial not just to organizations and businesses but also to individuals. Although there are many advantages of data mining, there are many disadvantages too; for example, one might gain access to some private matters of the company that are otherwise not to be disclosed to a third party. The research paper discusses these things in depth.

The research paper quotes the example of the retail king Wal-Mart. It is suggested in the research paper that Wal-Mart is pioneering massive data mining to transform its supplier relationships. Wal-Mart captures point-of-sale transactions from over 2,900 stores in 6 countries and continuously transmits this data to its massive 7.5-terabyte Teradata data warehouse. Wal-Mart allows more than 3,500 suppliers to access data on their products and perform data analyses.

Data mining (DM) is also called Knowledge Discovery in Databases (KDD). Data mining has been defined as "the nontrivial extraction of implicit, previously unknown, and potentially useful information from data" and "the science of extracting useful information from large data sets or databases". Data warehousing is defined as a process of centralized data management and retrieval. Data warehousing represents an ideal vision of maintaining a central repository of all organizational data; centralization of data is needed to maximize user access and analysis.

Conclusion: The research paper concludes on the note that although there is an information explosion, highly evolved and secure database management systems have so far not kept pace, and it voices a need for this. Data mining can be beneficial for businesses, governments, and society as well as the individual person. However, the major flaw with data mining is that it increases the risk of insecure operations. Data mining and warehousing are tools that need continuous evolution, considering the almost continuous 'change' factor in businesses.

Download Data Warehousing Seminar Report and Data Mining Seminar Report.

Information Explosion Data Mining and Warehousing

Description: The research abstract Information Explosion Data Mining and Warehousing talks about the change in business trends these days. There has been an information explosion: all industries and businesses, be they big or small, are using data from innumerable sources to identify their own business trends. They do this in order to understand the strengths and weaknesses of their competitors and thus build their own formidable business empires. Owing to an almost unending release of data and information, the need to channel it into meaningful formats and patterns is becoming more and more important and inevitable for smooth and secure business functioning.

The data and information thus released are managed by highly evolved and effective database management systems. Data mining and data warehousing have been the buzzwords of success these days. If data mining helps in securing and processing the data into understandable chunks, warehousing helps in analyzing the data and arranging it so as to facilitate comparison between trends, analysis for business predictions, and above all expedited decision making to attain quicker solutions. Data warehousing aims at expediting business and research decision making to such an extent that it almost happens 'online'. Breakthrough research is going on in these fields, and much more has to be done to stay at par with information generation. The research paper suggests a wide gap between information generation and processing.

Conclusion:

Although much has been attained, much still needs to be done. The research paper identifies the need for sophisticated analysis techniques to manage the information generated in industries and businesses. The lacuna between information generation and information analysis has to be lessened: the process of data production and the process of data management, and that too in an utmost sophisticated way, have to go hand in hand. There is a long way to go, and the emergence of the latest tools and techniques in the field of mining and warehousing is an almost never-ending process.

Download Information Explosion Data Mining and Warehousing Technical Seminar.