Tuesday, December 30, 2008

Network security and cryptography

ABSTRACT


The world is surging towards a digital revolution in which computer networks mediate every aspect of modern life. Not many years ago, most computers were carefully guarded mainframes, held tightly in the hands of skilled professionals. The systems and their guardians combined to provide ironclad protection of the organization’s all-important data. Today, the threat to information on the network has grown enormously. Information is the most vital asset of every organization. Access to the internet can open the world to communicating with customers and vendors, and is an immense source of information. But the same opportunities can open your Local Area Network to attack by thieves and vandals. These attackers and hackers exploit vulnerabilities to harm systems and disrupt information, using various techniques, methods, and tools.

It is important to identify the areas that are causing insecurity to the computer and the likely sources of the threat. These days everybody is more security conscious as they browse and conduct business on the Internet. As with any type of security, be it personal or domestic, the likelihood of an internet security breach depends on one’s degree of preparedness and use of proper prevention tools.

The report provides certain practices that can be implemented by any organization that wants to protect the confidentiality, availability and integrity of its system data when it contracts with outside parties to install, configure, manage or update any of its information technology. Overall, the report focuses on creating an effective “internet security policy” and also highlights current issues such as network security and cryptography.

Saturday, December 27, 2008

SENSORS OF REMOTE SENSING SATELLITE

Abstract:

Remote sensing is a technique to observe the earth’s surface or atmosphere from space (spaceborne) or from the air (airborne). It is also known as “earth observation” or “teledetection”.
Satellites are also used for remote sensing. The main payload on such satellites is the sensors, which pick up light and heat from the ground and convert them into images. The sensors used on satellites take images in various wavelength bands and produce false-color images, which are used to identify objects on earth.
Such images are used to observe the earth, to explore hidden minerals, to detect global pollution levels, to monitor the depletion of natural resources, and so on.
Sensors generally operate in cross-track mode and take images in different bands. Spatial resolution, spectral resolution, radiometric resolution and temporal resolution differ from sensor to sensor.

Remote sensing systems basically consist of four components:
1) Source of EM energy – this can be the sun’s reflected energy or heat radiated by the earth.
2) Atmospheric interaction – EM energy passing through the atmosphere is distorted, absorbed and scattered.
3) Earth’s surface interaction – the characteristics of the EM energy are a function of the objects present on earth as well as of the wavelength.
4) Sensors – these register emitted and reflected EM energy and produce images of the earth’s surface.
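
To make the bands-and-identification idea concrete, here is a minimal, illustrative sketch (synthetic reflectance values, not real sensor data) of how two wavelength bands are combined to identify objects on earth: the NDVI index contrasts the red and near-infrared bands, in which healthy vegetation reflects very differently.

import numpy as np

red = np.array([[0.08, 0.30], [0.09, 0.28]])   # red-band reflectance per pixel
nir = np.array([[0.50, 0.32], [0.48, 0.30]])   # near-infrared reflectance per pixel

ndvi = (nir - red) / (nir + red)               # NDVI lies in [-1, 1]
print(np.round(ndvi, 2))                       # high values indicate vegetation
print(np.where(ndvi > 0.4, "vegetation", "bare/other"))

High NDVI pixels correspond to healthy vegetation; the same idea, with more bands, underlies false-color classification of minerals, water and pollution.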

Thursday, December 25, 2008

OPTIMIZATION OF DATABASE PLACEMENT USING DYNAMIC PROGRAMMING IN MOBILE COMMUNICATION NETWORK

ABSTRACT

User mobility causes two main problems with call set-up and routing that are not encountered in a network containing only stationary users. Firstly, call set-up requires at least one database access to find the current location of the user being called. Secondly, if the network maintains an up-to-date list of the location of each mobile user, then each location change requires one or more database updates. One possible solution to these problems is to use a single database which stores the locations of all users in the network. This database must then be updated (accessed) every time a user changes location (makes a call). This solution becomes infeasible if a single database cannot handle the number of updates and accesses generated in this manner.
Keeping track of mobile users and assigning databases according to an up-to-date list of each user’s location requires a great deal of computation, which can be optimized with dynamic programming. We propose a hierarchical arrangement of databases for mobility tracking which reduces the access and update load on each database, relative to the centralized scheme just described. This scheme was motivated by recent studies which estimate the volume of updates, queries, and signaling traffic. In GSM it is assumed that each user is identified with a particular node in the signaling network, called a Home Location Register (HLR), which contains the user’s current location. Each location change (call set-up) then requires that the HLR be updated (accessed). This system becomes inefficient if a user travels far from his HLR, or if there is a relatively large amount of traffic between two users whose HLRs are far apart in the network. In each case, update and query traffic must travel long distances over the signaling network. The hierarchical scheme presented here can greatly reduce the distances over which signaling traffic must travel in these situations.
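
As a rough sketch of the hierarchical idea (an illustration under an assumed tree topology, not the paper’s exact scheme), the following arranges location databases in a tree: a move or a query climbs only as far as the lowest common ancestor of the cells involved, instead of always visiting one central database.

class Node:
    def __init__(self, name, parent=None):
        self.name, self.parent = name, parent
        self.pointer = {}   # user -> child subtree holding the user (None = resides here)

def path_to_root(leaf):
    path = []
    while leaf is not None:
        path.append(leaf)
        leaf = leaf.parent
    return path

def register(user, leaf):
    child = None
    for n in path_to_root(leaf):        # install pointers all the way to the root
        n.pointer[user] = child
        child = n

def move(user, old_leaf, new_leaf):
    old_path, new_path = path_to_root(old_leaf), path_to_root(new_leaf)
    lca = next(n for n in old_path if n in new_path)   # lowest common ancestor
    for n in old_path:                  # purge stale entries below the LCA
        if n is lca:
            break
        n.pointer.pop(user, None)
    child = None
    for n in new_path:                  # re-install entries up to the LCA only
        n.pointer[user] = child
        if n is lca:
            break
        child = n

def find(caller_leaf, user):
    node = caller_leaf
    while user not in node.pointer:     # ascend until an ancestor knows the user
        node = node.parent
    while node.pointer[user] is not None:   # descend along pointers to the user's cell
        node = node.pointer[user]
    return node

root = Node("root")
east, west = Node("east", root), Node("west", root)
c1, c2, c3 = Node("cell1", east), Node("cell2", east), Node("cell3", west)

register("alice", c1)
move("alice", c1, c2)             # local move: only databases up to "east" are touched
print(find(c3, "alice").name)     # -> cell2

A local move between two cells under the same region touches only the databases up to that region’s node, which is exactly the load reduction the hierarchical scheme aims for.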

Wednesday, December 24, 2008

GENETIC ALGORITHMS – AN OVERVIEW


ABSTRACT:

In this paper, we intend to discuss one of the latest fields of technology, which provides a key to many real-world problems – GENETIC ALGORITHMS (GAs). Millions of species have evolved, and continue evolving, over millions of years. Their lives are dictated by the laws of natural selection and Darwinian evolution. GAs exploit the ideas of the survival of the fittest and an interbreeding population to create a novel and innovative search strategy. GAs are adaptive heuristic search algorithms. GAs have been widely studied, experimented with and applied in many fields of the engineering world. Not only do GAs provide alternative methods for solving problems, they consistently outperform traditional methods on many of them. The appeal of GAs comes from their simplicity and elegance as robust search algorithms as well as from their power to discover good solutions rapidly for difficult high-dimensional problems.

KEYWORDS:
Survival of the fittest, interbreeding population, Evolutionary computing, chromosomes, selection, crossover, mutation, random search.
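
To make the selection–crossover–mutation loop behind these keywords concrete, here is a minimal, illustrative GA sketch (the toy problem and all parameters are assumptions of this example): it maximizes the number of 1-bits in a binary chromosome.

import random

def fitness(ch):
    return sum(ch)                          # count of 1-bits

def select(pop):                            # roulette-wheel: survival of the fittest
    total = sum(fitness(c) for c in pop)
    if total == 0:
        return random.choice(pop)
    r = random.uniform(0, total)
    for c in pop:
        r -= fitness(c)
        if r <= 0:
            return c
    return pop[-1]

def crossover(a, b):                        # one-point crossover
    cut = random.randint(1, len(a) - 1)
    return a[:cut] + b[cut:]

def mutate(ch, rate=0.01):                  # bit-flip mutation
    return [1 - g if random.random() < rate else g for g in ch]

pop = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
for _ in range(50):                         # evolve the interbreeding population
    pop = [mutate(crossover(select(pop), select(pop))) for _ in pop]
print("best fitness:", fitness(max(pop, key=fitness)), "of 20")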

DATA MINING AND WAREHOUSE


ABSTRACT


Data mining, the extraction of hidden predictive information from large databases, is a powerful new technology with great potential to help companies focus on the most important information in their data warehouses.


A wide range of companies has deployed successful applications of data mining. Two critical factors for success with data mining are: a large, well-integrated data warehouse and a well-defined understanding of the business process within which data mining is to be applied.


Our major focus in this context is about the FBI plan to acquire and employ modern information technology to thwart future terrorist attacks. The FBI has selected "investigative data warehousing" as a key technology to use in the war against terrorism. The technique uses data mining and analytical software to comb vast amounts of digital information to discover patterns and relationships that indicate criminal activity.


There is a growing gap between more powerful storage and retrieval systems and the users’ ability to effectively analyze and act on the information they contain. A new technological leap is needed to structure and prioritize information for specific end-user problems. The data mining tools can make this leap.

Tuesday, December 23, 2008

Kritansh 2009 – The Annual Techno-Management Fest of KIIT University

Introduction

Kritansh is the annual Techno-Management Fest organised by KIIT University. Kritansh ‘09 will be a cluster of new and exciting events covering the scientific, technological and managerial dimensions, besides enlightening workshops, guest lectures and awe-inspiring shows. The second largest of its kind in eastern India, Kritansh draws participation from all over India and boasts attractive cash prizes. Kritansh 2009 has events ranging from guest lectures and workshops to VLSI coding, circuit designing, software designing, robotics, corporate events and computer gaming contests.

Website: http://www.kritansh.in/K09

Organized by: KIIT University, Bhubaneswar

Start Date : 14.02.2009

FootPrints 2009

Introduction

The Faculty of Technology & Engineering is organizing its event, FOOTPRINTS, a national-level technical festival, on the 20th, 21st and 22nd of February 2009. FOOTPRINTS has the distinction of being the most popular technical festival in the state of Gujarat and one of the most prominent in the country. The primary objective behind FOOTPRINTS is to provide an ideal platform for young engineers to test their technical prowess and enhance their ingenuity. The festival comprises technical and non-technical paper presentations, model presentations, various workshops, lectures and quizzes. In addition, the fiesta also includes social awareness campaigns, various on-the-spot competitions and a virtual stock exchange. Furthermore, FOOTPRINTS is well known for bringing some of the best entertainment to the city. With three action-packed days and a campus buzzing with excitement, FOOTPRINTS ‘09 promises to be an affair to remember for one and all.

Website: http://www.msu-footprints.org/

Organized by: The Faculty of Technology & Engineering, The Maharaja Sayajirao University

Start Date : 20.02.2009

Monday, December 22, 2008

UTILIZATION OF WASTE PLASTIC BAGS IN ROAD CONSTRUCTION

Abstract

Disposal of waste plastic bags from domestic sources has become a major problem for the agencies in cities. The waste plastic bags found in domestic waste consist mainly of low-density polyethylene. Plastic bags dumped in dustbins find their way into the drainage system and clog it. Often they are burnt along the roadside, which produces fumes causing air pollution.

This paper emphasizes the laboratory evaluation of bituminous mixes for the utilization of plastic bags in road construction. Laboratory performance studies were conducted on bituminous concrete mixes, a dense mix used as a surface course in road construction. The laboratory studies proved that waste plastic enhances the properties of the mix, in addition to providing a way for its disposal in a useful manner. In the present investigation, a comparison was made between a conventional bituminous concrete mix and a mix modified with waste plastic. The results indicated that the strength properties of the mix with waste plastic were better than those of the conventional mix. The fatigue life was doubled by using waste plastic in the mix under laboratory conditions.

GLOBAL POSITIONING SYSTEM AND POSITIONING METHODS

Abstract

The Global Positioning System (GPS) has made navigation systems practical for a number of vehicle navigation applications. Today, GPS-based navigation systems can be found in motor vehicles, farming and mining equipment, and a variety of other land-based vehicles (e.g., golf carts and mobile robots). In Section II of this paper, each of these applications is discussed, and the reader is introduced to some of the issues involved with each one. Beginning in Section III, one particular technical aspect of navigation for land vehicles is discussed. Specifically, the research discussed in this paper presents a quantitative examination of the impact that individual navigation sensors have on the performance of a land-vehicle navigation system. A range of navigation sensor performance levels and their influence on vehicle positioning accuracy are examined. Results show that, for a typical navigation system, positioning error is dominated by the accuracy of the position fixes provided by the GPS receiver when GPS position fixes are available and by the rate gyro’s bias drift when GPS position fixes are not available. Furthermore, results show that the accuracy of the GPS position fixes has a significant impact on the relative contributions that each dead-reckoning navigation sensor error makes. The implications of these results for navigation system design and sensor design are discussed.
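
To illustrate why the rate gyro’s bias drift dominates positioning error when GPS fixes are unavailable, here is a minimal planar dead-reckoning sketch (the speed, duration and bias value are illustrative assumptions, not the paper’s data):

import math

def dead_reckon(speed, yaw_rates, dt, bias=0.0):
    x = y = heading = 0.0
    for w in yaw_rates:
        heading += (w + bias) * dt          # gyro bias corrupts the integrated heading
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
    return x, y

truth = dead_reckon(10.0, [0.0] * 600, 0.1)          # 60 s straight drive at 10 m/s
drift = dead_reckon(10.0, [0.0] * 600, 0.1, 0.001)   # same drive, 0.001 rad/s gyro bias
print(f"position error after 60 s without GPS: {math.dist(truth, drift):.1f} m")

Even a small constant bias makes the heading, and therefore the position error, grow steadily until the next GPS fix corrects it.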

INTRODUCTION

Global Positioning System (GPS) was developed by the Department of Defense (DOD).
The technologies on which GPS is based were initially tested on three “Timation” satellites with the first launched on 31st May 1967 and other two launched in 1969 and 1974.
The first actual GPS satellite, named “Navstar”, was launched by a U.S. Air Force (USAF) Atlas Centaur booster in February 1978. That satellite, plus the ten after it, were designated “Block I” and were built by Rockwell International. They were intended as technology demonstrators and differed from the later operational GPS satellites in that they were placed into orbit with an inclination of 63 degrees rather than 55 degrees.

An Air Force Delta booster launched the first “Block II” operational satellite on 14th February 1989. Another eight Block II satellites were launched, followed by 15 slightly improved “Block IIA” satellites. The Block II and Block IIA satellites were also built by Rockwell.

The full 24-satellite operational constellation was finally completed with the launch of a Block IIA satellite in March 1994. The Block II/IIA satellites have a design lifetime of over 7 years.

Sunday, December 21, 2008

ROBOTICS AND AUTOMATION

ABSTRACT

This paper reviews the present revolution in the field of robots and automation. The vast resources devoted to robotics research and development have already begun to bear fruit. Robots are classified by level of sophistication according to generation, manipulative function, manipulator geometry, motion characteristics, type of control, technology involved, and method of information input and teaching. The elements of a robot are the manipulator, robot controller, end effectors, energy source and robotic sensors. Robot specification is based on mechanical power, control, and specific and non-specific criteria. Robot teaching methods can be divided into those in which the robot arm physically moves from point to point in series during teaching and those in which it does not. Some of the important considerations for overall design are geometrical dexterity, kinematic chains and their stability, selection of the drive system, power transmission system, path measuring system, selection of materials, bearings and couplings. We discuss robot applications in industry as well as non-industrial applications in the manufacturing environment. Automation is basically of two types: hard automation and flexible automation. Flexible automation is a new technology in which numerical engineering radically improves design and production efficiency. The electronic computer is at the heart of flexible automation, which includes Computer Aided Design (CAD), Computer Aided Manufacturing (CAM) and computer-aided techniques for management. Automation components include sensors, analyzers, actuators and drives. Future research will focus on the development of artificial intelligence (AI) and, at the same time, on teaching methods.
We cannot afford to miss the present revolution in robotics and automation if our country is to compete in the global market of robot technology and automation.

Robotics And Automation

Abstract

The rapid advancement in the electronics field has led to revolutionary developments in the mechanical field, such as the automation of vehicles, industries and machines. Due to this automation, there has been a revolutionary change in industrial production and in the quality of products. This automation is only possible due to robots, sensors and control systems.

Automation and robotics are two interrelated technologies; industrial robots are an example of programmable automation. According to the Robotic Industries Association (RIA), “A robot is a software-controllable mechanical device that uses sensors to guide one or more end effectors through programmed motions in a workspace in order to manipulate physical objects”.

What brought about the resurrection of industrial automation? The answer is simply better robots. Industrial robots are reducing costs, boosting productivity, and minimizing errors. Robotic systems are being deployed in factories and buildings, and in tasks ranging from cell-phone assembly and packaging to welding and even making cookies.

This paper covers the evolution of robotics, phases in robotics, different types of robots such as the ASIMO robot and walking robots, the relation between AI and robotics, a case study and, most importantly, applications.

Saturday, December 20, 2008

DIGITAL LIGHT PROCESSING TECHNOLOGY

ABSTRACT

DIGITAL Light Processing (DLP™) is a revolutionary new way to project and display information. Based on the Digital Micromirror Device (DMD™) developed by Texas Instruments, DLP creates the final link to display digital visual information. DLP technology is being provided as subsystems or “engines” to market leaders in the consumer, business, and professional segments of the projection display industry. In the same way the compact disc revolutionized the audio industry, DLP will revolutionize video projection.

INTRODUCTION

DLP has three key advantages over existing projection technologies. The inherent digital nature of DLP enables noise-free, precise image quality with digital gray scale and color reproduction. Its digital nature also positions DLP to be the final link in the digital video infrastructure. DLP is more efficient than competing transmissive liquid crystal display (LCD) technology because it is based on the reflective DMD and does not require polarized light. Finally, close spacing of the micromirrors causes video images to be projected as seamless pictures with higher perceived resolution. Whether for movie projection, a computer slide presentation, or an interactive, multi-person, worldwide collaboration, DLP is the only choice for digital visual communications, today and in the future.
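
As a rough illustration of the digital gray scale mentioned above (the frame time and bit-plane scheme here are simplified assumptions, not TI’s actual timing), each DMD mirror is only ever fully on or off, so an 8-bit intensity is rendered by holding the mirror on for binary-weighted fractions of the frame:

def mirror_on_time(intensity, frame_ms=16.7, bits=8):
    """Total on-time within one frame for an 8-bit intensity value."""
    lsb = frame_ms / (2**bits - 1)       # duration of the least significant bit-plane
    on = 0.0
    for b in range(bits):                # one bit-plane per binary digit
        if intensity & (1 << b):
            on += lsb * (1 << b)
    return on

print(mirror_on_time(255))   # full white: mirror on for the whole frame
print(mirror_on_time(128))   # mid gray: on for roughly half the frame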

GAS LIQUEFACTION SYSTEMS IN CRYOGENICS

INTRODUCTION

The word cryogenics means, literally, the production of icy cold; however, the term is used today as a synonym for low temperatures. The field of cryogenic engineering is concerned with developing and improving low-temperature techniques, processes and equipment.

The point on the temperature scale at which refrigeration ends and cryogenics begins is not sharply defined, but by convention the field of cryogenics involves temperatures below 123 K. This is a logical dividing line because the normal boiling points of the so-called permanent gases lie below 123 K. The position and range of the field of cryogenics are illustrated on a logarithmic thermometer scale.

Friday, December 19, 2008

Virus Programming

ABSTRACT

In the following discussion we trace the history and evolution of viruses, the reasons for creating a virus, the types of viruses, and their different infection techniques. The aim of the seminar is to understand the structure and working of different types of viruses, so as to promote thinking about antivirus programming techniques, tackle the problem of system infection, and save the valuable data on the system. Symptoms are also discussed so that the warning signs of a virus attack can be detected. Some examples are also considered to give a brief idea of virus attacks.


Introduction:

A computer virus is a computer program that can infect other computer programs by modifying them in such a way as to include a (possibly evolved) copy of itself. Note that a program does not have to perform outright damage (such as deleting or corrupting files) in order to be called a "virus". However, Fred Cohen, whose definition this is, uses its terms (e.g. "program" and "modify") a bit differently from the way most anti-virus researchers use them, and classifies as viruses some things which most of us would not consider viruses.

Many people use the term loosely to cover any sort of program that tries to hide its (malicious) function and tries to spread onto as many computers as possible. (See the definition of "Trojan".) Be aware that what constitutes a "program" for a virus to infect may include a lot more than is at first obvious - don't assume too much about what a virus can or can't do!

These software "pranks" are very serious; they are spreading faster than they are being stopped, and even the least harmful of viruses could be fatal. For example, a virus that stops your computer and displays a message, in the context of a hospital life-support computer, could be fatal. Even those who created the viruses could not stop them if they wanted to; it requires a concerted effort from computer users to be "virus-aware", rather than the ignorance and ambivalence that have allowed them to grow to such a problem.

VIDEO ON DEMAND

ABSTRACT

Video on Demand (VoD) is an interactive multimedia system that works like cable television, the difference being that the customer can select a movie from a large video database. Individual customers in an area are able to watch different programs whenever they wish, making the system a realization of the video rental shop brought into the home. The paper describes a video-on-demand system that uses a practical, technologically sophisticated model to serve the home movie-viewing needs of a wide audience, including meeting peak demand for popular, newly released films.

Adaptive Brain Interface

Abstract

In simple words ABI can be defined as a human computer interface that accepts voluntary commands directly from the brain. The central aim of ABI is to extend the capabilities of physically impaired people. The brain-computer interface provides new ways for individuals to interact with their environment. The computer will continue to be a necessary component as long as detecting a brain response reliably remains a complex analytical task. In most cases, the brain response itself is not new, just the means of detecting it and applying it as a control. However, the necessary feedback associated with experimental trials frequently resulted in improved, or at least changed performance. Little is known about the long-term effects of such training either from an individual difference, or from a basic human physiology point of view.
A brain-computer interface (BCI) is a system that acquires and analyzes neural (brain) signals with the goal of creating a high bandwidth communications channel directly between the brain and the computer. The objective of the ABI project is to use EEG signals as an alternative means of interaction with computers. As such, the goal is to develop a brain-actuated mouse.
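
As a toy illustration of the kind of feature such a brain-actuated mouse could use (synthetic signals and an arbitrary threshold, not the ABI project’s actual pipeline), the sketch below measures EEG power in the 8–12 Hz mu band, which motor imagery tends to suppress:

import numpy as np

def band_power(signal, fs, lo, hi):
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    return spectrum[(freqs >= lo) & (freqs <= hi)].sum()

fs = 128                                     # assumed sampling rate (Hz)
t = np.arange(fs) / fs                       # one second of synthetic data
rest = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(fs)        # strong mu rhythm
move = 0.2 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(fs)  # mu suppressed

threshold = 1000.0   # would be calibrated per user in a real system
for name, sig in [("rest", rest), ("imagined movement", move)]:
    p = band_power(sig, fs, 8, 12)
    print(name, "mu power:", round(p, 1), "->", "no click" if p > threshold else "click")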

Active Directory Services in Windows 2000

Abstract

We use directory service to uniquely identify users and resources on a network. Active Directory in Microsoft Windows 2000 is a significant enhancement over the directory services provided in previous versions of Windows. Active Directory provides a single point of network management, allowing us to add, remove and relocate users and different resources easily.
Windows 2000 uses Active Directory to provide directory services. It is important to understand the overall purpose of Active Directory and the key features it provides. Understanding the interactions of Active Directory architectural components provides the basis for understanding how Active Directory stores and retrieves the data. This seminar concentrates on the Active Directory functions, its features and architecture.

Introduction

A Windows 2000 domain is a logical grouping of networked computers that share a central directory database containing security and user account information. This directory database is known as the directory and is the database portion of Active Directory, which is the Windows 2000 directory service. In a domain, security and administration are centralized because the directory resides on domain controllers, which manage all security aspects of user-domain interactions.
Active Directory also provides a method for designing a directory structure that meets the needs of our network.
Active Directory is an enterprise-class directory service that is scalable, built from the ground up using Internet-standard technologies, and fully integrated at the operating system level. Active Directory simplifies administration and makes it easier for users to find resources. It provides a wide range of features and capabilities, including group policies, scalability without complexity, support for multiple authentication protocols, and the use of Internet standards.
Active Directory services are secure, distributed, partitioned and replicated. It is designed to work well in any size of installation, from a single server with a few hundred objects to thousands of servers with millions of objects. Active Directory adds many new features that make it easy to navigate and manage large amounts of information, saving time for both administrators and end users.
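
As an illustration of what finding a resource in the directory looks like in practice, here is a hedged sketch of an LDAP query against Active Directory using the third-party ldap3 Python library; the server name, credentials and base DN are hypothetical placeholders.

from ldap3 import Server, Connection, ALL

server = Server("dc1.example.com", get_info=ALL)   # hypothetical domain controller
conn = Connection(server, user="EXAMPLE\\admin", password="secret", auto_bind=True)

# Search the domain for a user by account name and read a few attributes.
conn.search(
    search_base="dc=example,dc=com",
    search_filter="(&(objectClass=user)(sAMAccountName=jsmith))",
    attributes=["displayName", "mail", "memberOf"],
)
for entry in conn.entries:
    print(entry.displayName, entry.mail)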

Tuesday, December 16, 2008

BRAIN-COMPUTER INTERFACE USING EEG SIGNALS

Abstract--- Brain-Computer Interface (BCI) is an interface technique that provides communication and control capabilities to people with severe motor disabilities. The further development of BCI depends on the types of brain signals, protocols, recording formats and processing methods used. In this paper we describe an application of BCI, i.e., dialing a telephone through motor senses.

I. INTRODUCTION

A. Brain-Computer Interface (BCI) Technology
Human Computer Interface (HCI) has been one of the growing fields of research and development in recent years. Most of the effort has been dedicated to the design of user-friendly and ergonomic systems by means of innovative interfaces such as voice and vision, and I/O devices in virtual reality. A direct Brain-Computer Interface (BCI) adds a new dimension to HCI. Interesting research work in this direction has already been initiated, mainly motivated by the hope of creating new communication channels for persons with severe motor disabilities.

Wednesday, December 10, 2008

IC Engine Seminar Topic

Abstract


In this seminar the undesirable emissions generated in the combustion process of automobile and other IC engines are explored. These emissions pollute the environment and contribute to global warming, acid rain, smog, odors, and respiratory and other health problems. The emissions of concern are unburned hydrocarbons (HC), carbon monoxide (CO), oxides of nitrogen (NOx), sulphur, and solid carbon particulates. Ideally, engines and fuels could be developed so that very few harmful emissions are generated, and these could be exhausted to the surroundings without major impact on the environment. With present technology, aftertreatment of exhaust gases to reduce emissions is very important. This mainly consists of the use of thermal and catalytic converters and particulate traps.
In Los Angeles, smoke and other pollutants from many factories and automobiles combined with the fog that was common in the area, and "smog" (smoke + fog) resulted. During the 1950s, the smog problem increased along with the number of automobiles, and it was recognized that the automobile was one of the major contributors to the problem. After the 1960s, emission standards were enforced in California and the rest of the United States, as well as in Europe and Japan. By making engines more fuel-efficient and with the use of exhaust aftertreatment, emissions of HC, CO and NOx were reduced during the 1980s. However, during this time the number of automobiles greatly increased, resulting in more fuel use and the production of more pollutants. Additional reduction will be difficult and costly, and as the world population grows, emission standards become more stringent out of necessity.

Tuesday, December 9, 2008

CDMA Technology

Abstract


This report discusses the application of CDMA technology to communication between input ports and output ports inside a switch, with emphasis on high-speed optical applications. The report discusses: 1) history; 2) how the signal is generated; 3) how and why it travels; 4) how it is modulated; 5) how it can be coded/decoded and shared among several transmitters; 6) the basic principles of CDMA; and 7) its advantages and disadvantages for communications. At least 9 modes should be used for error-free transmission at 1 Gbit/s for the laser we investigated in this work. The optical code division multiple access (OCDMA) technique, where different users are assigned different “signature codes”, is a promising candidate for next-generation broadband access networks. As it allows many users to share the same transmission channel, it has unique advantages, inherently allowing dynamic allocation of bandwidth, protocol transparency, and a fully asynchronous operation mode with low latency that is suitable for a bursty traffic environment. It also offers robust signal security, permits quality-of-service guarantees to be managed at the physical layer by assigning different-weight codes to different users, and simplifies management of large numbers of users by requiring only minimal network control. For the coding operation in OCDMA, the optical source should have a relatively high speed for temporal spreading and broadband optical spectra for spectral coding, as well as a high time-bandwidth (TB) product. Code Division Multiple Access is a digital cellular technology that uses spread-spectrum techniques. Unlike competing systems, such as GSM, that use TDMA, CDMA does not assign a specific frequency to each user; instead, every channel uses the full available spectrum, and individual conversations are encoded with a pseudo-random digital sequence. CDMA consistently provides better capacity for voice and data communications than other commercial mobile technologies, allowing more subscribers to connect at any given time, and it is the common platform on which 3G technologies are built. Once the information became public, Qualcomm claimed patents on the technology and became the first to commercialize it.
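
The spread-spectrum principle at the heart of CDMA can be sketched in a few lines (an illustrative toy, not a real air interface): each user spreads its bits with its own orthogonal Walsh signature code, the chip streams share one channel by simple addition, and each receiver despreads with its own code.

import numpy as np

def walsh(n):                     # Hadamard matrix rows serve as signature codes
    h = np.array([[1]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])
    return h

codes = walsh(4)                  # 4 orthogonal codes of length 4
bits_a, bits_b = [1, -1, 1], [-1, -1, 1]      # two users' antipodal data bits

# Spread: each bit becomes one code-length burst of chips; addition shares the channel.
channel = np.concatenate([a * codes[1] + b * codes[2] for a, b in zip(bits_a, bits_b)])

# Despread: correlate each code-length chunk with the user's own code.
def despread(chips, code):
    chunks = chips.reshape(-1, len(code))
    return np.sign(chunks @ code)

print(despread(channel, codes[1]))   # recovers user A's bits:  1 -1  1
print(despread(channel, codes[2]))   # recovers user B's bits: -1 -1  1

Because the codes are orthogonal, each correlation cancels the other user’s contribution exactly; this is why every conversation can occupy the full spectrum at once.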

Monday, December 8, 2008

MAGNETIC REFRIGERATION

Abstract


With the rapid phase-out of chlorofluorocarbons (CFCs) and hydrochlorofluorocarbons (HCFCs), many researchers and users are looking at alternative, or "not-in-kind", technologies for performing heating and cooling duties. Magnetic refrigeration is one such not-in-kind technology. The report deals with the construction, working and advantages of the system.
Magnetic refrigeration is based on the magnetocaloric effect, i.e., the ability of some materials to heat up when a magnetic field is applied to them and to cool down when the field is removed.
The currently used refrigeration system, i.e., the vapor compression refrigeration system, is not environmentally friendly, since it causes depletion of the ozone layer and thus promotes global warming; hence an alternative system is required.
Compared with the vapor compression refrigeration system, a magnetic refrigeration system offers a number of advantages in terms of space and efficiency, but the main thing is that it is environmentally friendly.

Sunday, December 7, 2008

NANOMEDICINE: A Biomedical Engineering Boon

Abstract


In recent times one of the greatest innovations in the field of biomedical engineering is nanomedicine. Only within the last 50 years has medical science begun to examine disease pathology on a molecular level; thus, from a molecular viewpoint, modern medicine remains crude. For example, today's drug is essentially a single molecule with an often sophisticated but always limited repertoire. With nanomedicine, tomorrow's "smart pharmaceuticals" could essentially be programmable machines with a range of "sensory," "decision-making," and "effector" capabilities.
So what exactly is nanomedicine? Technically, it is the application of nanotechnology to the prevention and treatment of disease in the human body. The most elementary nanomedical devices will be used in the diagnosis of illnesses. A more advanced use of nanotechnology might involve implanted devices that dispense drugs or hormones as needed in people with chronic imbalance or deficiency states. Lastly, the most advanced nanomedicine involves the use of nanorobots as miniature surgeons.



Saturday, December 6, 2008

Mobile computing & Wireless communication

ABSTRACT


Mobile Computing: - An umbrella term used to describe technologies that enable people to access network services anytime, anywhere. Ubiquitous computing and nomadic computing are synonymous with mobile computing.
In the recent past, cellular phone companies have shown an interesting growth pattern. The number of customers has steadily increased, and this has changed the lifestyle of large groups of consumers and has had a tremendous impact on the business landscape as well.

The paper will begin by describing some basic concepts and will have some advance concept too. Some of which are listed below.
What is Mobile computing?
Some other types of computing?
Need of acceptance
Advantages & Disadvantages.
Future developments.
Some system issues in ‘Mobile Computing’.

Wireless communication: - In general, wireless communication is text, voice or video communication between two or more parties that does not involve the use of any cords, such as phone cords or cables. Wireless communication is one of the most important emerging and developing technologies. The drive to eliminate phone cords and cable connections is just the beginning of this promising technology.
As our world becomes more of a "village", and the pace of business increases, the importance of wireless communication increases as well. But what exactly is wireless communication? Why is it important for businesses today? What types of wireless communications exist?

Thursday, December 4, 2008

Data Mining and Warehouse

Abstract


Most enterprises using a data warehouse do not suffer from a lack of data but from an overabundance of it. The problem is the difficulty of accessing the data for decision-support applications, which involve complex queries. Query performance in such an environment is critical because decision-support applications often require interactive query response times. Data warehouses are updated infrequently, so it becomes possible to improve query performance by caching the sets retrieved by queries in addition to query execution plans. In this paper we describe an algorithm for finding frequent item sets and the problems associated with it. We suggest two algorithms that are better suited to caching sets retrieved by queries than traditional LRU in a data warehousing environment: the Cache Replacement algorithm (LNC-R) and the Cache Admission algorithm (LNC-A). These algorithms achieve a substantial performance improvement in a decision-support environment when compared to a traditional LRU replacement algorithm.
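
The core idea can be sketched as follows (a simplified illustration of cost-aware caching; the published LNC-R/LNC-A algorithms differ in detail): instead of evicting the least recently used set, each cached result set is scored by reference frequency, execution cost and size, and the same score serves as an admission test.

class CostAwareCache:
    def __init__(self, capacity):
        self.capacity, self.used, self.sets = capacity, 0, {}

    def profit(self, key):
        freq, cost, size = self.sets[key]
        return freq * cost / size          # benefit of keeping this result set cached

    def admit(self, key, cost, size):
        if key in self.sets:               # a hit: bump the reference frequency
            freq, c, s = self.sets[key]
            self.sets[key] = (freq + 1, c, s)
            return
        while self.used + size > self.capacity and self.sets:
            victim = min(self.sets, key=self.profit)    # evict lowest-profit set first
            if self.profit(victim) >= cost / size:
                return                     # admission test: new set is not worth caching
            self.used -= self.sets.pop(victim)[2]
        self.sets[key] = (1, cost, size)
        self.used += size

cache = CostAwareCache(capacity=100)
cache.admit("Q1", cost=50.0, size=60)   # expensive aggregate query
cache.admit("Q2", cost=5.0, size=60)    # cheap query: Q1 survives, Q2 is rejected
print(list(cache.sets))                 # -> ['Q1']

Under LRU the cheap query would have displaced the expensive one; scoring by cost per unit of space is what makes this approach pay off for decision-support workloads.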

Tuesday, December 2, 2008

RAPID PROTOTYPING

ABSTRACT

In the development of a new product there is invariably a need to produce a single prototype of a designed part before the allocation of large amounts of capital to a new production system. The main reason for this need is that capital cost is very high and production tooling takes a long time to prepare; consequently, a working prototype is needed for troubleshooting and for design evolution before a complicated system is ready to be produced and marketed. A new technology which considerably speeds up the iterative product development process is the concept and practice of rapid prototyping. Advantages of RP: 1. Physical models of parts can be manufactured directly from CAD data files. 2. With suitable materials, final parts can be produced. 3. RP operations can also be used to produce tooling.
RP processes are classified into three major groups: 1. subtractive, 2. additive, and 3. virtual. In this paper we focus on additive processes (building up a part by adding material incrementally).


INTRODUCTION

Though the principle of concurrent engineering (CE) is quite clear, and the advantages of the concept for improved quality and reduced cost are implicit, it is not possible to incorporate CE effectively in the absence of some technique for the quick development of prototypes. To reduce development time and adopt concurrent engineering in its true spirit, quick and inexpensive fabrication of prototype parts is essential, and "rapid prototyping" (RP) technology has made that possible. By a rapid prototyping process, a solid object with prescribed shape, dimensions and finish can be produced directly from the CAD-based geometric model data stored in a computer, without human intervention.

In all generative manufacturing processes, the shape of the workpiece is not obtained by removal of chips, forming, or casting. It is achieved by the addition of material without any prior recognizable form or shape, and no part-specific tooling is necessary.
Types of generative manufacturing process:

A. Stereo lithography (SL) with photo polymerization
B. Fused deposition modeling (FDM)
C. Laminated object manufacturing (LOM)
D. Selective laser sintering (SLS)
E. Selective powder binding (SPB)
F. Programmable moulding
G. Building metallic objects by GMP
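
To make the additive principle concrete, here is a minimal slicing sketch (toy geometry only; real RP systems slice complete STL meshes and also plan deposition paths): a triangular facet is intersected with each horizontal build plane to produce the contour segments that layer-by-layer processes such as A-E above would deposit.

import numpy as np

def slice_triangle(tri, z):
    """Return the segment where a triangle (3x3 array of xyz points) crosses plane z."""
    pts = []
    for i in range(3):
        p, q = tri[i], tri[(i + 1) % 3]
        if (p[2] - z) * (q[2] - z) < 0:          # this edge crosses the build plane
            t = (z - p[2]) / (q[2] - p[2])
            pts.append(p + t * (q - p))          # linear interpolation along the edge
    return pts                                    # 0 or 2 intersection points

tri = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [0.0, 0.0, 10.0]])
layer_thickness = 2.5
for z in np.arange(layer_thickness, 10.0, layer_thickness):
    seg = slice_triangle(tri, z)
    print(f"z={z}: {[list(np.round(p, 2)) for p in seg]}")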