Wednesday, September 29, 2010

Current Opening At Aftek


BSP Openings

Position: Software Engineer / Associate

Responsibilities:

* Development of BSP based on Linux/Windows CE / eCos
* Development of low level device drivers for different peripherals
* Debugging of the hardware; must be proficient in using debugging tools
* Coordinating with hardware and application teams for end-to-end performance of the system
* Debugging of the kernel level modules
* Porting of the boot loader

Experience: 2 to 5 years

Skills: OS porting, Device Drivers in Linux/eCos/Windows CE, Bootloader

Qualification: BE Electronics / E&TC


Position: Sales Professionals

Responsibilities:

* Identifying new/niche market segments and prospective clients for our products
* Initiating and setting up a channel partner/reseller network with dealers, shop owners and distributors
* Visiting prospective customers/clients and delivering presentations and demonstrations of our products
* Travel within Maharashtra and outside for maintaining and expanding customer base
* Creating awareness of our products in the market

Experience: 3 to 5 years

Skills:

* Experience in sales of industrial automation products and projects; industrial hardware - PLCs, PID controllers, Data Scanners, Loggers, Protocol Converters / Gateways, Remote Monitoring Systems, Building Automation Systems, DAS (Data Acquisition Systems), MES (Manufacturing Execution Systems), etc.
* Experience of catering to the manufacturing industry would be preferred

Note: All positions are based in Pune.

Submit Your Resume

resume@aftek.com

Please mention the position you are applying for in the subject line.

BIONIC AGE


Abstract:


Bionic age is basically a technology combining biology and electronics. Advanced Bionics Corporation of Sylmar, CA, announced that the U.S. Food and Drug Administration (FDA) has approved its Bionic Ear System. The new technology is approved for use in children and adults with profound hearing loss in both ears. An estimated 460,000 to 740,000 people in the United States who are severely or profoundly hearing impaired may benefit from bionic ear surgery.

SPEAR3 (Speech Processor for Electrical and Acoustic Research) is an advanced body-worn speech processor developed by the CRC for Cochlear Implant and Hearing Aid Innovation to enable high-level speech processing research applicable to cochlear implants and/or hearing aids.

The purpose of this seminar is to introduce the bionic age and to describe in detail the bionic ear and the speech processor used in cochlear implants.


For more info, visit:
http://www.enjineer.com/forum

BIOCHIP


ABSTRACT:


A biochip is a collection of miniaturized test sites (microarrays) arranged on a solid substrate that permits many tests to be performed at the same time in order to achieve higher throughput and speed. Typically, a biochip's surface area is no larger than a fingernail. Like a computer chip that can perform millions of mathematical operations in one second, a biochip can perform thousands of biological reactions, such as decoding genes, in a few seconds.


For more info, visit:
http://www.enjineer.com/forum

AUTOMATED TELLER MACHINE


ABSTRACT:


In simple words, an ATM can be described as a machine which dispenses money after reading information from a card inserted into it. ATMs can now be seen in many places.
This seminar focuses on the working of an ATM: everything that happens from the moment the card is inserted until the money is received is explained in detail. Other aspects, such as the parts of an ATM and ATM security, are also covered.
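As a rough sketch of that card-to-cash sequence (every name, account and message below is hypothetical, not taken from any real ATM software or banking standard):

    # Hypothetical ATM transaction flow; data and names are illustrative only.
    ACCOUNTS = {"4000123412341234": {"pin": "1234", "balance": 5000}}

    def atm_transaction(card_number, pin_entered, amount):
        acct = ACCOUNTS.get(card_number)            # 1. read card, look up account
        if acct is None or pin_entered != acct["pin"]:
            return "REJECTED: card or PIN invalid"  # 2. verify the PIN
        if amount > acct["balance"]:
            return "REJECTED: insufficient funds"   # 3. check the balance
        acct["balance"] -= amount                   # 4. post the debit
        return f"DISPENSE {amount}; new balance {acct['balance']}"  # 5. pay out

    print(atm_transaction("4000123412341234", "1234", 1500))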


For more info, visit:
http://www.enjineer.com/forum

Animatronics


INTRODUCTION:


Need of special effects in entertainment business:

Special effects are of great importance in the entertainment business. Using “Animatronics” it is possible to achieve wonderful effects that are very realistic.

Following are some of its applications:

Men in Black
Terminator 1 and 2
Jurassic Park 1, 2 and 3
Godzilla
Mighty Joe Young
Babe

Trade-offs between different tools used for achieving special effects:

There are different tools for achieving special effects, such as:

Computer imagery and animation
Animatronics

The choice of tool depends on, and varies with, the application. If we want various views and movements of the character, then animatronics is used to achieve a realistic effect.
But if, say, the character makes a complete rotation of its head, then animation is the right choice.

The choice of tool also depends on cost. In many cases the software cost is much higher than that of creating an “Animatron”.


For more info, visit:
http://www.enjineer.com/forum

Animation studio


Introducing Maya:


Today, Maya is the one tool, software package and scripting language that gives digital content creators such power for producing world-class 3D games, 3D animation and visual effects.

History:

Maya is the product of the merger of two companies, Alias and Wavefront of Toronto. The combined company was formed in 1995, and the first release was Maya 1.0 for Windows NT. Today Maya 4.5 is available for Windows XP, Windows 2000, Linux and Mac OS. Alias|Wavefront customers include CNN, Disney, Electronic Arts, Industrial Light & Magic, Sony Pictures, Square, Microsoft and many more.

Nonlinear animation:

The Trax Editor provides a way to layer and mix character animation sequences nonlinearly, independently of time. You can layer and blend any type of keyed animation, including motion capture; motion path animation is the exception. The Trax Editor is ideal for developing complex character animations in combination with other Maya features.

The Trax Editor lets you synchronize animation created with a variety of Maya techniques. At the heart of the Trax Editor is the clip: a span of animation you obtain from an existing animated character. The most commonly used clips are independent, reusable animation sequences that are easy to merge or blend with other clips. For example, a walk cycle, run cycle, jump, and tumble are ideal candidates for clips because you can blend and sequence them in various ways.
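To make the idea of blending clips concrete, here is a minimal sketch of a linear cross-fade between two clips; this is generic illustrative code, not Maya's actual API, and the joint-angle numbers are invented:

    # Cross-fade a "walk" clip into a "run" clip with a linear blend weight.
    walk = [[0.0, 10.0], [2.0, 12.0], [4.0, 14.0], [6.0, 16.0]]  # per-frame joint angles
    run  = [[0.0, 30.0], [4.0, 34.0], [8.0, 38.0], [12.0, 42.0]]

    def blend_clips(clip_a, clip_b, frames):
        blended = []
        for f in range(frames):
            w = f / (frames - 1)                  # weight ramps from 0 to 1
            pose = [(1 - w) * a + w * b           # weighted mix of each channel
                    for a, b in zip(clip_a[f], clip_b[f])]
            blended.append(pose)
        return blended

    for pose in blend_clips(walk, run, 4):
        print(pose)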


For more info, visit:
http://www.enjineer.com/forum

Adaptive Brain Interface


Abstract:


In simple words, ABI can be defined as a human computer interface that accepts voluntary commands directly from the brain. The central aim of ABI is to extend the capabilities of physically impaired people. The brain-computer interface provides new ways for individuals to interact with their environment. The computer will continue to be a necessary component as long as detecting a brain response reliably remains a complex analytical task. In most cases, the brain response itself is not new, just the means of detecting it and applying it as a control. However, the necessary feedback associated with experimental trials frequently resulted in improved, or at least changed, performance. Little is known about the long-term effects of such training, either from an individual-difference or from a basic human physiology point of view.

A brain-computer interface (BCI) is a system that acquires and analyzes neural (brain) signals with the goal of creating a high-bandwidth communication channel directly between the brain and the computer. The objective of the ABI project is to use EEG signals as an alternative means of interaction with computers. As such, the goal is to develop a brain-actuated mouse.
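As a toy sketch of the kind of classification step a brain-actuated mouse needs (the feature vectors and class centroids are invented for illustration; real ABI classifiers are trained on recorded EEG):

    # Nearest-centroid classification of an EEG feature vector into a command.
    import math

    CENTROIDS = {                 # invented mean feature vectors per mental task
        "move_left":  [0.8, 0.1],
        "move_right": [0.1, 0.9],
        "click":      [0.5, 0.5],
    }

    def classify(eeg_features):
        return min(CENTROIDS,
                   key=lambda c: math.dist(eeg_features, CENTROIDS[c]))

    print(classify([0.7, 0.2]))   # -> "move_left"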

Introduction:

In today’s fast-paced world, information and communication technologies are dramatically transforming industries, economies and the quality of our lives. Access to new emerging technologies can be taken for granted. Unfortunately, not everyone can enjoy the benefits provided by information and communication systems on equal terms. People with severe physical disabilities are practically excluded. But what if they could communicate their wishes or control electronic appliances merely by thinking? After all, the brain is the most important part of the body, controlling how the rest of it functions. We wrongly assume that a person whose hands or legs are disabled cannot work on a computer or control electronic appliances. So let us enable such people to use their brain to control a computer or even turn on electrical appliances. The European Commission’s Joint Research Centre (JRC) is coordinating a project called Adaptive Brain Interfaces (ABI) as part of the European Union’s ESPRIT information technologies programme.

For more info, visit:
http://www.enjineer.com/forum

ACTIVE DIRECTORY SERVICES IN WINDOWS 2000


Abstract:


We use directory service to uniquely identify users and resources on a network. Active Directory in Microsoft Windows 2000 is a significant enhancement over the directory services provided in previous versions of Windows. Active Directory provides a single point of network management, allowing us to add, remove and relocate users and different resources easily.

Windows 2000 uses Active Directory to provide directory services. It is important to understand the overall purpose of Active Directory and the key features it provides. Understanding the interactions of Active Directory architectural components provides the basis for understanding how Active Directory stores and retrieves the data. This seminar concentrates on the Active Directory functions, its features and architecture.

Introduction to Directory Services:

In case of networks, the terms ‘directory’ and ‘directory service’ refer to the directories found in public and private networks. A directory provides a means of storing information related to the network resources to facilitate locating and managing these resources.

A directory service is a network service that identifies all resources on a network and makes them accessible to users and applications. A directory service differs from a directory in that it is both the source of the information and the services making the information available to the users.

A directory service acts as the main switchboard of the network operating system. It is the central authority that manages the identities and relationships between distributed resources. It enables these resources to work together.

A directory service supplies these fundamental operating system functions. Therefore it must be tightly coupled with the management and security mechanisms of the operating system to ensure the integrity and privacy of the network.
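Since Active Directory is queried over LDAP, a minimal sketch of such a lookup using the Python ldap3 library may help; the domain controller, credentials and base DN below are hypothetical placeholders:

    # Querying a directory service over LDAP with the ldap3 library.
    from ldap3 import Server, Connection, ALL

    server = Server("dc1.example.com", get_info=ALL)     # hypothetical DC
    conn = Connection(server, user="EXAMPLE\\reader",
                      password="secret", auto_bind=True)

    # Find all user objects in the domain and print their common names.
    conn.search("dc=example,dc=com", "(objectClass=user)", attributes=["cn"])
    for entry in conn.entries:
        print(entry.cn)
    conn.unbind()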


For more info, visit:
http://www.enjineer.com/forum

Network Security


Abstract:


“Privacy and security are a must for me, for you and for everyone.”

For the first few decades of their existence, computer networks were primarily used by university researchers for sending e-mail and by corporate employees for sharing printers.

Under these conditions, security did not get a lot of attention. But now, as millions of ordinary citizens use networks for banking, shopping and filing their taxes, network security has become a real problem. In this paper we have tried to point out the threats to computers and networks, and we also show how to tackle these problems using methods like firewalls and cryptography.

Our focus is mainly on the cryptographic techniques. We have included the RSA method, which is used by many network security software developers.
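A toy demonstration of the RSA method with tiny textbook primes (real deployments use keys of 2048 bits or more, together with padding schemes):

    # Textbook RSA with tiny primes -- for illustration only.
    p, q = 61, 53
    n = p * q                      # public modulus: 3233
    phi = (p - 1) * (q - 1)        # Euler's totient: 3120
    e = 17                         # public exponent, coprime with phi
    d = pow(e, -1, phi)            # private exponent: 2753 (Python 3.8+)

    message = 42
    cipher = pow(message, e, n)    # encryption: c = m^e mod n
    plain = pow(cipher, d, n)      # decryption: m = c^d mod n
    print(cipher, plain)           # plain == 42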

Security is a broad topic; this paper covers almost all the basics with sufficient information.

Introduction To Network Security :

Network:

A network is a data highway designed to increase access to computer systems. Any number of computers can be connected to form a network.

Security:

Whenever we use the term “network”, the word “security” comes with it. Security in a network is essentially the control of access to resources and data.

Network Security:

The process of protecting data and equipment from unauthorized access is collectively known as network security.

Need for Security:

Most early computer applications had no, or at best very little, security. This continued for a number of years, until the importance of data was truly realized. Until then, computer data was considered useful, but not something to be protected. When computer applications were used to handle financial and personal data, the real need for security was felt like never before. People realized that data on the computer is an extremely important aspect of modern life. Therefore, various areas of security began to gain prominence.

For more info, visit:
http://www.enjineer.com/forum


Computer Security


Introduction:


Because of the increased reliance on powerful, networked computers to help run businesses and keep track of our personal information, industries have been formed around the practice of network and computer security. Enterprises have solicited the knowledge and skills of security experts to properly audit systems and tailor solutions to fit the operating requirements of the organization.

Unfortunately, most organizations (as well as individual users) regard security as an afterthought, a process that is overlooked in favor of increased power, productivity, and budgetary concerns. Proper security implementation is often enacted "postmortem" — after an unauthorized intrusion has already occurred. Security experts agree that the right measures, taken prior to connecting a site to an untrusted network such as the Internet, are an effective means of thwarting most attempts at intrusion.

What is Computer Security?:

Computer security is a general term that covers a wide area of computing and information processing. Industries that depend on computer systems and networks to conduct daily business transactions and access crucial information regard their data as an important part of their overall assets. Several terms and metrics have entered our daily business lives, such as total cost of ownership (TCO) and quality of service (QoS). In some industries, such as electronic commerce, the availability and trustworthiness of data can be the difference between success and failure.


For more info, visit:
http://www.enjineer.com/forum

NETWORK SECURITY


ABSTRACT:


A computer network is simply a system of interconnected computers; how they are connected is irrelevant. Over the last 25 years or so, a number of networks and network protocols have been defined and used. Some are "public" networks: anyone can connect to them, or use the same types of networks to connect their own hosts (computers) together without connecting to the public networks. The Internet is the prime example. When you want to access the resources offered by the Internet, you don't really connect to the Internet; you connect to a network that is eventually connected to the Internet backbone, a network of extremely fast network components. This is an important point: the Internet is a network of networks -- not a network of hosts. Initially, computer networks were used mainly by university researchers and corporate employees, for sending e-mail and for printer sharing. Hence, security was not of great concern. But now millions and millions of people across the globe use networks for various purposes, including banking, insurance and taxes. As a result, security has become a necessity. Security is concerned with people trying to access remote services that they are not authorized to use.

Network security is a complicated subject, historically tackled only by well-trained and experienced experts. Network security problems can be divided into four closely intertwined areas: secrecy, authentication, nonrepudiation, and integrity control. Secrecy (confidentiality) deals with keeping information out of the hands of unauthorized users. Authentication deals with determining whom you are talking to before revealing sensitive information or entering into a business deal. Nonrepudiation deals with signatures, to prove that a commitment was really made. In the paper world, integrity and secrecy are achieved by using registered mail and locking documents up, and people authenticate other people by recognizing their faces, voices and handwriting. However, none of this carries over directly to electronic communication. Furthermore, each layer of the OSI model contributes differently; user authentication and nonrepudiation, for example, are dealt with in the application layer.

Cryptographic methods deal with transmitting data secretly, that is, sending the data in such a way that nobody else can read or modify it. Here, in communication security, we deal with getting the data secretly and without modification from source to destination. To ensure this type of security, different techniques are used, namely IPsec, firewalls, VPNs, etc. IPsec is the long-term direction for secure networking. It provides a key line of defense against private network and Internet attacks, balancing security with ease of use. VPNs provide a more active form of security by either encrypting or encapsulating data for transmission through an unsecured network. These two types of security—encryption and encapsulation—form the foundation of virtual private networking. However, both encryption and encapsulation are generic terms that describe a function that can be performed by a myriad of specific technologies. A firewall is a system designed to prevent unauthorized access to or from a private network. Firewalls can be implemented in hardware, software, or a combination of both. Firewalls are frequently used to prevent unauthorized Internet users from accessing private networks connected to the Internet, especially intranets. All messages entering or leaving the intranet pass through the firewall, which examines each message and blocks those that do not meet the specified security criteria.
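As a small, concrete illustration of encryption with a built-in integrity check (one building block that such systems combine), using the Python cryptography package:

    # Symmetric encryption plus integrity protection with Fernet.
    from cryptography.fernet import Fernet, InvalidToken

    key = Fernet.generate_key()            # shared secret key
    f = Fernet(key)
    token = f.encrypt(b"transfer $100 to account 42")   # secrecy

    print(f.decrypt(token))                # authentic token decrypts fine

    tampered = bytes([token[0] ^ 1]) + token[1:]        # flip one bit
    try:
        f.decrypt(tampered)
    except InvalidToken:
        print("tampering detected")        # integrity control in action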

There are different security techniques for different wireless media. Here our focus is on Bluetooth. Bluetooth wireless technology is designed to replace cables between devices such as printers, keyboards, and mice. A Bluetooth device can transmit through walls, pockets, and briefcases. There are three modes of security for Bluetooth access between two devices. Mode 1: non-secure; Mode 2: service-level enforced security; Mode 3: link-level enforced security. The different threats to Bluetooth, such as bluejacking, bluebugging and bluesnarfing, are discussed, and the usage of a personal identification number (PIN) for ensuring security is a key issue covered.

For more info, visit:
http://www.enjineer.com/forum

The Evolution of Virtual Instrumentation


Abstract:


This paper presents the evolution of instrumentation and its applications. Virtual instrumentation is computer software that a user employs to develop a computerized test and measurement system, to control an external measurement hardware device from a computer desktop, and to display test or measurement data collected by the external device on instrument-like panels on a computer screen. Virtual instrumentation also extends to computerized systems for controlling processes based on data collected and processed by a computerized instrumentation system.

Virtual instrumentation systems typically also comprise pure software instruments, such as oscilloscopes and spectrum analysers, for processing the collected sensor data and massaging it into forms useful to the users of the data. Also, data-collecting hardware devices differ in their internal structures and functions, requiring virtual instrumentation systems to take these differences into account. Thus, some data acquisition devices are so-called “register-based” instruments, controlled by streams of 1s and 0s sent directly to control components within the instrument; others are “message-based” instruments, controlled by “strings” of ASCII characters, effectively constituting written instructions that must be decoded within the instrument.
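A minimal sketch of driving a message-based instrument from software with the PyVISA library; the resource address and the SCPI command strings here are illustrative, not tied to any particular instrument:

    # Controlling a message-based instrument with ASCII ("SCPI") strings.
    import pyvisa

    rm = pyvisa.ResourceManager()
    dmm = rm.open_resource("GPIB0::12::INSTR")    # hypothetical multimeter

    print(dmm.query("*IDN?"))                     # standard identification query
    reading = float(dmm.query("MEAS:VOLT:DC?"))   # request a DC voltage reading
    print(f"measured {reading} V")
    dmm.close()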


For more info, visit:
http://www.enjineer.com/forum

VIRTUAL INSTRUMENTATION


Abstract:


Historically, instrumentation systems used measuring rods, thermometers and scales. Today an instrumentation system consists of individual instruments; a pressure gauge, for example, comprises a transducer, a signal conditioner and a display panel, and may also have a line recorder to record changes in pressure. Virtual instrumentation extends to controlling processes based on data collected and processed by a computerized instrumentation system. This paper discusses how virtual instrumentation can extend the capability of existing instruments, its applications in the real world, a comparison of virtual instrumentation with traditional instrumentation, the future of virtual instrumentation, and recent developments in the field.

History of Instrumentation Systems:

An instrument is a device designed to collect data from an environment, or from a unit under test, and to display information to a user based on the collected data. Historically, instrumentation systems originated in the distant past, with measuring rods, thermometers, and scales. In modern times, instrumentation systems have generally consisted of individual instruments: for example, an electro-mechanical pressure gauge comprising a sensing transducer wired to signal conditioning circuitry, outputting a processed signal to a display panel and perhaps also to a line recorder, in which a trace of changing conditions is inked onto a rotating drum by a mechanical arm, creating a time record of pressure changes. Even complex systems such as chemical process control applications typically employed, until the 1980s, sets of individual physical instruments wired to a central control panel that comprised an array of physical data display devices such as dials and counters, together with sets of switches, knobs and buttons for controlling the instruments. The introduction of computers into the field of instrumentation began as a way to couple an individual instrument, such as a pressure sensor, to a computer, and to display measurement data on an instrument panel rendered in software on the computer monitor, containing buttons or other means for controlling the operation of the sensor. Such instrumentation software thus enabled the creation of a simulated physical instrument with the capability to control physical sensing components.


For more info, visit:
http://www.enjineer.com/forum

VIRTUAL INSTRUMENTATION


ABSTRACT:


Every parameter in industry or the laboratory needs measurement, and dedicated instruments are most often used for measuring those quantities. These instruments provide very accurate measurements and are reliable, but they cannot be customized. They are very useful in industry, but they cannot meet the requirements of scientists and research workers. A virtual instrument overcomes the drawbacks of traditional instruments.

Virtual instruments are fueled by the rapid advancement of chip and PC technology. Virtual instruments represent a fundamental shift from traditional hardware-centered instrumentation systems to software-centered systems that exploit the computing power, productivity, display and connectivity capabilities of popular desktop computers and workstations. Virtual instruments are real instruments: real-world data is collected, recorded and displayed; they simply use the data acquisition, processing, storage and other capabilities of a computer.

This paper explains the main concept of virtual instrumentation and its application in power engineering lab.

INTRODUCTION:

In industry we find many parameters to be controlled, and many electronic instruments are used to control them. All these instruments are dedicated to measuring or controlling those parameters only. They differ entirely from one another, but they have one thing in common: they are all box-shaped and have controls and knobs on them. Stand-alone electronic instruments are very powerful and expensive, and are designed to perform one or more specific tasks defined by the vendor. The user cannot extend or customize them. The knobs and buttons, the built-in circuitry and the functions available to the user are all specific to the nature of the instrument. In addition, special technology and costly components must be developed to build these instruments.

Widespread adoption of the PC over the past twenty years has given rise to a new way for scientists and engineers to measure and automate the world around them. One major development resulting from the advancement of the PC is the concept of virtual instrumentation. A virtual instrument consists of an industry-standard computer or workstation equipped with off-the-shelf application software and cost-effective hardware, which together perform the functions of traditional instruments. Today virtual instrumentation is used by engineers and scientists to achieve faster application development and higher-quality products at lower cost.

Virtual instruments represent a fundamental shift from traditional hardware-centered instrumentation systems towards software-centered systems that exploit the computing power, productivity, display and connectivity capabilities of popular desktop computers and workstations. Although PC and IC technologies have experienced strong growth, it is the software that makes building virtual instruments a reality.


For more info, visit:
http://www.enjineer.com/forum

SMART SENSORS


VEHICLE OPERATOR SAFETY: THE ADVANTAGES OF USING ELECTRONIC SENSORS IN OFF-ROAD VEHICLES

Abstract:


Vehicles designed for extreme outdoor conditions must perform arduous tasks. They must navigate uneven surfaces while lifting and moving heavy loads in industries such as construction, agriculture and forestry. For vehicle operators, the chances of tipping or rolling over are high, and safety is a concern. Operator safety is increased by installing preventative measures such as electronic sensors to detect the vehicle’s operating condition and alignment. The sensor families used in off-road vehicles include tilt sensors, inductive position sensors and pressure sensors. This paper describes the advantages of using these three distinct sensor types in off-road vehicles to ensure operator safety.

Introduction:

To prevent vehicle operator injury, electronic sensors can be used in off-road vehicles to warn the operator if the vehicle or its load is in danger. The technologies behind each sensor family will be examined as well as application examples presented.
Environmental exposure is also a safety factor. As virtual "plants-on-wheels," off-road vehicles are exposed to extreme shock and vibration, harsh chemicals, dirt and electrical interference. The sensors used on these vehicles must be able to withstand these same extreme conditions to prevent mechanical damage and downtime.

Tilt Sensors Monitor the Safe Horizontal Alignment of Vehicles

Rugged terrain and moving machine parts can quickly shift the balance of a vehicle. A dangerous alignment can cause hazardous conditions for a vehicle operator. To assist operators in monitoring the horizontal alignment of vehicles, tilt sensors precisely detect slight angle variations. Tilt sensors can report, for example, the exact road-grade angle, boom angle, platform angle and crane-level angle. Upon receiving these signals, an operator can take action to avoid an unsafe situation.
Tilt sensors must be able to withstand the extreme shock, vibration, and harsh elements associated with outdoor use. Direct exposure to chemicals, dirt, moisture, sunlight, and electrical interference is common. Design features that enable tilt sensors to resist the elements and perform in extreme environments include:

• Compact housings, rated for IP67 protection, encase and protect the electronics from chemicals and liquid ingress.
• UV-resistant plastic and metal housings prevent damage from exposure to sunlight.
• Noise-immune technology enables the sensors to ignore conducted and radiated electrical noise.
• Outputs protected from short circuits and overloads eliminate damage during installation.
• Highly flexible cables, jacketed to resist chemicals, perform at temperatures as low as -40 °F.

Two very different, yet highly effective sensing technologies can be applied to verify horizontal alignment.
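One common way a tilt angle can be derived is from a 3-axis accelerometer's gravity vector; a sketch, with an alarm threshold that is purely illustrative:

    # Tilt angle (degrees) from a 3-axis accelerometer reading, in g units.
    import math

    def tilt_angle_deg(ax, ay, az):
        # angle between the gravity vector and the sensor's vertical axis
        return math.degrees(math.atan2(math.hypot(ax, ay), az))

    ax, ay, az = 0.20, 0.10, 0.97        # vehicle close to level
    angle = tilt_angle_deg(ax, ay, az)
    print(f"tilt = {angle:.1f} deg")
    if angle > 15.0:                     # illustrative limit, not a vendor spec
        print("WARNING: unsafe tilt, alert the operator")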


For more info, visit:
http://www.enjineer.com/forum

TRAFFIC CONTROL AND TRESPASSER DETECTION


ABSTRACT:


In a densely vehicle-populated country, optimizing fuel use would yield conservation to a great extent; in particular, fuel consumption can be optimized by eliminating unnecessary engine idling near traffic signals. Implementing this system would make our nation a more disciplined one, and the detection of trespassers would reduce the number of accidents. Traffic control with trespasser detection is feasible with our system.

INTRODUCTION:

Each vehicle will be given a unique identification card (i-card) fixed on its top, which bears its category and vehicle number. An arch scanner constructed at a certain distance from the junction will continuously scan the i-cards of vehicles passing underneath it. The microcontroller inside the arch calculates the signal timing and transmits this data (time in seconds) to the display, which shows the decreasing counter value every second. As soon as the counter reaches zero, a trespasser-detection scanner near the junction detects the vehicle numbers of trespassers by scanning their i-cards. To achieve efficient traffic control, circular priority is employed among the lanes, and special vehicles such as ambulances and fire engines have higher priority over the others.
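A sketch of the circular lane priority with preemption for special vehicles; the lane list and green time are illustrative values:

    # Circular (round-robin) priority among lanes, with preemption.
    from collections import deque

    lanes = deque(["north", "east", "south", "west"])
    GREEN_SECONDS = 30                       # illustrative fixed green time

    def next_green(special_vehicle_lane=None):
        if special_vehicle_lane in lanes:    # ambulance or fire engine detected
            while lanes[0] != special_vehicle_lane:
                lanes.rotate(-1)             # jump the cycle to that lane
        lane = lanes[0]
        lanes.rotate(-1)                     # keep the circular order moving
        return lane, GREEN_SECONDS

    print(next_green())              # ('north', 30)
    print(next_green("south"))       # ('south', 30): priority preempts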


For more info, visit:
http://www.enjineer.com/forum

THE SPINTRONIC SCANNER FOR CANCER DETECTION


ABSTRACT:


Spintronics is a branch of science that deals with the spin of an electron rather than its charge, which is what electronics deals with. There are two spin states (UP spin and DOWN spin), like 0s and 1s, to represent two states. The important property used in our paper is that spins can be aligned in a particular direction, and changes in spin can be detected.

Cancer cells are easy to identify only when they are large in number. When matured, these cancer cells result in the formation of a tumor, which has to be removed by surgery. After surgery, the presence of even a single remaining cancer cell can result in the regrowth of a tumor in that part of the body.

This spintronic scanning technique is an efficient technique to detect cancer cells even when they are less in number.

The steps involved are,

1) The patient is exposed to a strong magnetic field so that the body's cells become magnetized.
2) A beam of spin-polarized electrons is directed at an unaffected part of the body, and the change in spin is detected by a polarimeter. Let this be X.
3) A beam of spin-polarized electrons is directed at the part which underwent surgery. Let the corresponding change in spin be Y.
4) If X − Y = 0, the cancer cells have been removed from the body; if not, traces of cancer cells remain and the patient has to be treated again to ensure complete safety.

Thus this technique efficiently identifies the presence of cancer cells in that part of the body that has undergone surgery to prevent any further development.
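In practice the spin-change readings X and Y are noisy measurements, so the X − Y = 0 test of step 4 would be applied with a tolerance; a one-function sketch (the tolerance value is invented):

    # Decision rule from step 4, with an invented noise tolerance.
    def cancer_cleared(x, y, tol=0.01):
        # x: spin change on healthy tissue; y: on the operated region
        return abs(x - y) < tol      # True -> no residual cancer detected

    print(cancer_cleared(0.523, 0.521))   # True: within tolerance
    print(cancer_cleared(0.523, 0.460))   # False: treat again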


For more info, visit:
http://www.enjineer.com/forum

SMART SENSORS


ABSTRACT:


Smart sensors are sensors with integrated electronics that can perform one or more of the following functions: logic functions, two-way communication, and decision making. A smart sensor consists of a transduction element, signal-conditioning electronics, and a controller or processor supporting some intelligence, all in a single package. This paper systematically introduces the concept of smart sensors. Progress in integrated circuits, made possible by tremendous progress in semiconductor technology, has resulted in the low-cost microprocessor. By designing a low-cost, silicon-based sensor, the overall cost of a control system can be reduced. The paper covers the usefulness of silicon technology for smart sensors, the conversion of physical phenomena to electrical output using silicon sensors, and the characteristics of smart sensors. A silicon sensor can produce its output as a voltage, current, resistance or capacitance, and the output format can be analog or digital. Suitable signal-conditioning circuits along with a processor can easily be designed using silicon technology. The presence of a controller or processor in a smart sensor allows corrections for undesirable sensor characteristics, including input offset and span variations, non-linearity and cross-sensitivity. As these corrections are carried out in software, no additional hardware is required, and calibration becomes an electronic process. Reduced cost of bulk cables and connectors, cost improvement and remote diagnostics are further qualities of smart sensors. This paper specifically explains a laser-based smart displacement sensor and elaborates how smart sensors help the anesthesiologist in anesthesia supervision.
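A sketch of the software corrections described above, applied to a raw reading; the coefficients are hypothetical values of the kind stored during factory calibration:

    # Offset, span and non-linearity correction in a smart sensor's processor.
    OFFSET = 0.12        # volts of input offset to remove (hypothetical)
    SPAN_GAIN = 1.045    # span (gain) correction factor (hypothetical)
    NONLIN = -0.003      # small quadratic non-linearity term (hypothetical)

    def corrected(raw_volts):
        x = (raw_volts - OFFSET) * SPAN_GAIN   # offset and span correction
        return x + NONLIN * x * x              # polynomial linearization

    print(corrected(2.50))   # corrected output, with no extra hardware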


For more info, visit:
http://www.enjineer.com/forum

SMART SENSORS


ABSTRACT:


Smart sensors represent the next evolutionary tools for studying the environment. The smart environment relies first and foremost on sensory data from the real world, which comes from smart sensors of different modalities in distributed locations. Smart sensor systems are capable of prediction, interpretation, communication and intelligent interaction with the environment, and hence will enable new fault management of devices and control of distributed resources. Tremendous advances in digital signal processing and laser capabilities in recent years have enabled many new sensor developments, one of these being smart sensors.

Fundamental research has already been carried out to develop smart sensors to monitor and control robotics, mobile vehicles, cooperative autonomous systems, mechatronics and bio-engineering systems. Emerging sensor and instrumentation technology can be exploited for enhanced research and operational capabilities, and such smart information technology has the potential for varied applications. It is envisioned that the concepts of smart sensors and information technology can be transferred and applied to numerous systems. Large networks of interconnected smart sensors can monitor and control our world. A better understanding of how smart sensors perform in real-world conditions can help improve efficiency and reliability. Sensor networks consisting of large numbers of smart sensors, enabling the collection, processing, analysis and dissemination of valuable information gathered in a variety of environments, are being implemented rapidly.


For more info, visit:
http://www.enjineer.com/forum

Smart Sensors Smart IR Temperature Sensor


Synopsis:


This paper discusses the new trend in sensor technology, smart sensors, covering the principles of “Smart IR Temperature Sensors”.
Today’s new smart IR sensors represent a union of two rapidly evolving sciences, combining IR temperature measurement with the high-speed digital technologies usually associated with the computer. These instruments are called smart sensors because they incorporate microprocessors programmed to act as transceivers for bidirectional serial communications between sensors on the manufacturing floor and computers in the control room.

Today’s more powerful sensor systems have the following characteristics:

Accepts inputs from various sensors.
Provides local display of sensor readings.
Allows for non-intrusive sensor calibration.
Provides relays for local alarm action.
Follows a user-defined alarm strategy.
Accepts feedback signals from the final control element.
Has independent back-up.
Provides a low-cost link to the control room.
Provides centralized monitoring.

In this paper we give an overview of recent advances in smart IR temperature sensors, including their application in space heaters.
Integrating smart sensors into new or existing process control systems thus provides engineers with a new level of sophistication in temperature monitoring and control.
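A sketch of the monitoring loop such a sensor's microprocessor might run, touching the local display, a user-defined alarm strategy and the alarm relay; all names and thresholds here are hypothetical:

    # Hypothetical main loop of a smart IR temperature sensor.
    import random, time

    ALARM_LIMIT_C = 250.0                     # user-defined alarm threshold

    def read_ir_temperature():
        return 240 + random.uniform(-5, 15)   # stand-in for the IR detector

    for _ in range(3):
        t = read_ir_temperature()
        print(f"local display: {t:.1f} C")    # local display of the reading
        if t > ALARM_LIMIT_C:
            print("relay ON -> local alarm")  # relay for local alarm action
        time.sleep(0.1)                       # then report over the serial link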


For more info, visit:
http://www.enjineer.com/forum

SMART DUST


ABSTRACT:


With improvements in integration, packaging, circuit design, and process technology, autonomous sensor nodes will continue to shrink in size and power consumption while growing in capability, incorporating the requisite sensing, communication, and computing hardware, along with a power supply, in a volume of no more than a cubic millimeter, while still achieving impressive performance in terms of sensor functionality and communications capability. These millimeter-scale nodes are called “Smart Dust,” although mimicking the mobility of dust is not a primary goal.

A smart dust mote can be partitioned into four subsystems: sensors and analog signal conditioning, the power system, the transceiver front end, and the core. The core is essentially all the digital circuits in the system, including the receiver back end, sensor processing circuits, computation circuits, and memory. One requirement of the core is that it have a degree of on-the-fly reconfigurability determined by the changing needs of the mission. In this paper we define an ultra-low-energy architecture for the mote core that will meet the needs of the military base monitoring scenario, look at general architecture concerns to provide guidance in mapping other applications onto a mote architecture, and perform a brief theoretical comparison of three possible mote transmission techniques.

INTRODUCTION:

Smart dust motes are millimeter-scale sensing and communication platforms. A distributed sensor network system can consist of hundreds to thousands of dust motes and one or more interrogating transceivers. Each dust mote consists of a power supply, one or more sensors, analog and digital circuitry, and a system for receiving and transmitting data. Depending on the power source (solar cells, thick-film batteries, or commercially available batteries), a dust mote can vary in size from 1 mm³ to as large as a sugar cube.

There are both military and commercial applications for the dust motes. The military could use dust motes containing acoustic, vibration, and magnetic field sensors distributed across many square miles of territory to monitor the passage of vehicles. The sensors could be delivered to the area by unmanned air vehicles (UAV), artillery, or distributed like seeds from moving vehicles. They could be interrogated by manned air vehicles (MAV) or soldiers with modified binoculars. In the future, chemical and biological sensors could be incorporated into the dust motes to detect the use of chemical or biological agents in combat. Both the military and industry could use dust motes to monitor the performance of critical parts of aircraft, vehicles, and manufacturing equipment. This could dramatically reduce the cost of maintenance.


For more info, visit:
http://www.enjineer.com/forum

AUTOMATED GUIDED VEHICLE SYSTEM


INTRODUCTION:


An automated guided vehicle system is a material handling system that uses independently operated, self-propelled vehicles guided along defined pathways on the floor. The vehicles are powered by on-board batteries that allow operation for several hours (8-16 hrs.) between rechargings. Guidance is achieved by sensors on the vehicles that follow guide wires. Each vehicle is controlled by an off-board controller or a microprocessor, which sends commands to the vehicle such as the identification of a load, its destination and other special instructions. An AGV system provides material handling that is both flexible and readily adaptable to production changes.

AGV systems were originally developed for the distribution of material in warehouse environments. Although this remains an important use, two major growth areas have evolved: the movement of material to and from production areas in manufacturing facilities, and the use of carriers for work in progress in assembly plants, replacing serial, asynchronous or fixed-index assembly conveyor systems. AGVs are also used for moving small packages, and in hospitals to deliver meals and handle material.

AGV systems were first introduced in the 1950s in the USA and in Europe in the early 1960s; the technology caught on much faster in Europe.


For more info, visit:
http://www.enjineer.com/forum

Hydrogen Fueled Vehicles


INTRODUCTION:


Interest in the use of hydrogen as a vehicular fuel has been revived in recent times by the desire to control urban pollution resulting from the ever-increasing number of internal combustion engines in our major cities, and by concern over our dwindling petroleum reserves. Indeed, serious penalties associated with the storage and use of hydrogen rendered it virtually unusable until quite recently.

BACKGROUND:

Hydrogen is widely regarded as a promising transportation fuel because it is clean, abundant, and renewable. In a gaseous state, it is colorless, odorless, and non-toxic. When hydrogen is combusted with oxygen, it forms water as the by-product. Due to hydrogen’s wide flammability range, it can be completely combusted over a wide range of air/fuel ratios. Unlike gasoline, which if combusted outside its optimal air/fuel ratio will produce excess carbon monoxide (CO) and hydrocarbons (HC), hydrogen contains no carbon and therefore will not produce those toxic gases. Like gasoline, however, when hydrogen is combusted in air (a mixture of oxygen and nitrogen), the temperature of combustion can cause the formation of nitrogen oxides (NOx). Hydrogen nevertheless has an advantage over gasoline in this area because it can be combusted using very high air/fuel ratios. Using a high air/fuel ratio (i.e. combusting hydrogen with more air than is theoretically required) causes the combustion temperature to drop dramatically and thus reduces the formation of NOx. Unfortunately, the use of excess air also lowers the power output of the engine.
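The stoichiometry behind these air/fuel ratios is easy to work out; a short calculation using standard values (2 H2 + O2 -> 2 H2O, air about 23.2% oxygen by mass):

    # Stoichiometric air/fuel ratio of hydrogen by mass.
    M_H2, M_O2 = 2.016, 32.0           # molar masses, g/mol
    O2_MASS_FRACTION_AIR = 0.232       # oxygen mass fraction of air

    o2_per_h2 = M_O2 / (2 * M_H2)              # ~7.94 kg O2 per kg H2
    afr = o2_per_h2 / O2_MASS_FRACTION_AIR     # ~34.2 kg air per kg H2
    print(f"stoichiometric AFR ~ {afr:.1f}:1 (gasoline is ~14.7:1)")

    # Running lean, e.g. with twice the theoretical air, lowers the
    # combustion temperature and hence NOx, at the cost of power:
    print(f"lean mixture example: ~ {2 * afr:.0f}:1")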


For more info, visit:
http://www.enjineer.com/forum

AUTOMOTIVE AIR FILTERS


INTRODUCTION:


Definition: Filtration can be defined as the process of separating particles from a dispersing fluid by means of a porous medium. The dispersing medium can be a gas or a liquid.

The filtration process can be characterized by several parameters, one of which is the pressure drop across the filter:

ΔP = P1 − P2

where P1 is the pressure before the filter and P2 is the pressure after the filter.


It is believed that the primary function of an air filter is to deliver both high airflow and superior dirt protection. Air filters are designed to provide minimum restriction allowing high airflow into an engine. In the vast majority of cases increased airflow will increase engine performance measured by horsepower and throttle response (torque). The performance benefits of maximum airflow are clear, compelling and well documented. That is why so many professional racers are willing to run expensive vehicles with no air filter, as opposed to installing a disposable air filter. They are seeking the additional horsepower and throttle response needed to win the race.

We design our air filters to provide superior filtration of the contaminants that can harm your engine while maximizing the airflow characteristics of the filter in question. The ability of an air filter to protect your engine is generally measured using a testing procedure developed by the Society of Automotive Engineers identified as the SAE J726 procedure. We subject a sample of our filter designs to this test procedure using Coarse Test Dust, which includes particles ranging in size from less than 5.5 microns to 176 microns. As a point of reference, a human hair is approximately 50 microns in diameter. The result of the above test procedure is a specific air filtration efficiency number. This efficiency number represents the percentage of test dust retained by the filter and thereby kept out of an engine. Our goal is to design our air filters to achieve maximum airflow while targeting overall filtration efficiency at 98%.
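That efficiency number is a simple ratio; a sketch with made-up dust masses:

    # Overall filtration efficiency: percentage of test dust retained.
    dust_fed_g = 50.0         # coarse test dust fed to the filter (made up)
    dust_passed_g = 1.1       # dust reaching the engine side (made up)

    efficiency = 100.0 * (dust_fed_g - dust_passed_g) / dust_fed_g
    print(f"filtration efficiency = {efficiency:.1f}%")   # ~97.8%, near 98%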

Because no two air filters are alike, the specific airflow and overall filtration efficiency will vary depending on the filter in question. However, you can rest assured that each air filter we sell has been designed to achieve high airflow while providing superior filtration.


For more info, visit:
http://www.enjineer.com/forum

SYTECH ENGINE


Abstract:


The scotch yoke is an inversion of the double slider crank chain. The scotch yoke mechanism can be used in an engine to convert the reciprocating motion of the piston into the rotary motion of the crankshaft. An engine built this way is called a Scotch Yoke Technology Engine, or simply a ‘SYTech Engine’.

In addition to their cost effectiveness and simplicity, SYTech engines have many advantages. Their width can be kept small. The short engine block and low engine height provide the greatest freedom for the design of drag-efficient bonnet styling and effective crush zones, even in small vehicles. The absence of unbalanced inertia forces and moments in SYTech engines reduces the need for expensive measures to reduce cabin noise and vibration.

SYTech engines run more quietly and smoothly for three main reasons: firstly, the engines are perfectly balanced, with no free inertia forces or moments; secondly, the mechanical piston noise is very low; and finally, the peak-to-peak variation of the output torque is much lower under all important operating conditions.

NOx is the exhaust gas component that is most difficult to reduce. The sinusoidal motion of the SYTech engine piston can provide up to 30% NOx reduction with no increase in specific fuel consumption.
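The sinusoidal motion follows directly from the geometry; a sketch comparing the two piston displacement laws, with illustrative dimensions:

    # Piston displacement about the crank axis: scotch yoke vs. crank-slider.
    import math

    r, l = 0.04, 0.15    # crank radius and connecting-rod length, metres

    for deg in (0, 45, 90, 135, 180):
        th = math.radians(deg)
        yoke = r * math.cos(th)              # scotch yoke: a pure sinusoid
        slider = r * math.cos(th) + math.sqrt(l**2 - (r * math.sin(th))**2) - l
        print(f"{deg:3d} deg   yoke {yoke:+.4f} m   crank-slider {slider:+.4f} m")
    # The square-root term gives the crank-slider higher harmonics; the
    # scotch yoke has none, which is why it can be perfectly balanced.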

The SYTech crank mechanism can be applied to diesel and spark-ignition engines, both two-stroke and four-stroke.

The SYTech engine was tested in a dynamometer durability test by Collins Motor Corporation (CMC), Melbourne, Australia. The engine was also tested in the Australian concept family car ‘aXcessaustralia II’ during many kilometres of road running under day-to-day traffic conditions.


For more info, visit:
http://www.enjineer.com/forum

Independent Wheel Vehicle suspension


Introduction:


The suspension system connecting a vehicle body to its wheels and tyres allows each wheel to move in an essentially vertical direction in response to road surface irregularities. A spring element temporarily stores and releases energy, thus insulating the vehicle body from acceleration peaks. A shock absorber (damper) ensures that oscillations induced by road unevenness or aerodynamic forces (or by accelerating and braking), which would impair ride comfort and road holding, die away quickly.
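The spring-plus-damper behaviour can be captured in a quarter-car sketch, m·x'' + c·x' + k·x = 0 after a bump; the parameters below are illustrative:

    # Quarter-car spring/damper response after a 5 cm bump (Euler integration).
    m, c, k = 300.0, 1500.0, 20000.0   # mass kg, damping N*s/m, stiffness N/m
    x, v, dt = 0.05, 0.0, 0.001        # initial deflection, velocity, time step

    for step in range(2001):
        a = -(c * v + k * x) / m       # Newton's second law
        v += a * dt
        x += v * dt
        if step % 500 == 0:
            print(f"t = {step * dt:.1f} s   x = {x * 100:+.2f} cm")
    # The damper term c*v makes the oscillation die away quickly.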

Conventional systems offer greater simplicity. They include rigid axle suspensions, which are most widely used in heavy carriers. Suspension is provided directly by leaf springs, so complexity is low. In this system both wheels of the vehicle are suspended in combination, since the axle itself is suspended. Simplicity is the only advantage; such systems do not provide effective suspension.


With the development of technology, a new concept called independent wheel suspension has emerged. In this system the wheels are suspended independently, as every wheel has its own suspension. This provides suspension matched to actual road conditions. The system is complex, but it has advantages such as low weight, plenty of scope for achieving favorable elasto-kinematic effects, no coupling of masses, and no suspension parts running right across the vehicle. Hence, nowadays this system is used and preferred over the conventional system in most passenger cars, providing a really smooth drive.

For more info, visit:
http://www.enjineer.com/forum

Rotor Blades Used In Wind Turbines


INTRODUCTION:

The energy of the wind's airflow is efficiently converted into a rotating motion by the sophisticated shape of the rotor blades. The rotor blades are fixed to the rotor hub, which is placed directly on the main shaft, causing it to rotate. This rotating motion is accelerated in a gearbox to drive a high-speed generator producing electricity.

The rotor thus passes the rotational motion of the blades on to the generator, which converts it into electrical energy. Rotors therefore have a great importance in the design and working of a windmill. On a modern 1000 kW wind turbine, each rotor blade measures about 27 metres (80 ft.) in length and is designed much like the wing of an aeroplane.

Working of a Rotor:

Blades attached to the turbine rotor extract power (the kinetic energy of the wind) from the air flowing through the turbine and make the rotor turn. This rotational energy passes on to the energy converter, typically an electrical generator.
The energy production of a turbine depends on the diameter of the rotor and the wind strength: the larger the diameter of the rotor blades, the more energy can be harnessed. Another consideration is the size of the generator used; a large blade diameter is paired with a large generator so as to produce more energy, since with a small generator there would be unnecessary wastage of available power. Modern wind turbines use high-efficiency airfoil-shaped blades that extract energy from the wind by generating aerodynamic lift. Air flowing over the airfoil generates two forces: lift, perpendicular to the airflow, and drag, in the direction of the airflow. Air flowing over the top of the airfoil creates a slight decrease in pressure, producing a lift force that pushes the airfoil 'upward'.

The amount of upward force depends on the length of the airfoil, velocity of the air, and the angle at which the air meets the airfoil or the angle of attack.
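The dependence on diameter and wind strength is captured by the standard relation P = 1/2 * rho * A * v^3 * Cp, where the power coefficient Cp is capped by the Betz limit of about 0.593; a quick check with illustrative numbers:

    # Power captured by a wind turbine rotor: P = 0.5 * rho * A * v^3 * Cp.
    import math

    rho = 1.225        # air density, kg/m^3
    diameter = 54.0    # rotor diameter, metres (two ~27 m blades)
    v = 12.0           # wind speed, m/s (illustrative)
    cp = 0.45          # power coefficient; the Betz limit is ~0.593

    area = math.pi * (diameter / 2) ** 2
    power = 0.5 * rho * area * v ** 3 * cp
    print(f"P ~ {power / 1e3:.0f} kW")   # on the order of a 1000 kW turbine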

For more info, visit:
http://www.enjineer.com/forum

ERGONOMICS IN AUTOMOBILES


ABSTRACT:


This report discusses the ergonomic considerations for automobiles, i.e. the design and arrangement of things in an automobile so that the interaction between the user and the vehicle is beneficial for the human purpose.

Ergonomics amalgamates many sciences and technologies to maximize the user’s safety, efficiency and reliability of performance of the car, to make the user’s task easier, and to increase the feeling of comfort.

An ergonomic apparatus was designed and built to aid in the design of a Formula race car cockpit. The apparatus incorporated adjustable cockpit dimensions that were adjusted accordingly for the tested subjects. A compilation of this data was used to design a cockpit suitable for people within the range of 95th percentile male and 5th percentile female. By testing subjects in various cockpit designs, the ergonomics apparatus was also used to validate the final design of the cockpit.


For more info, visit:
http://www.enjineer.com/forum

ANTIMATTER PROPULSION


INTRODUCTION:


Humankind has been exploring space for four decades, and in time our reach has extended throughout the solar system with the use of unmanned probes. Finally, what about the exploration of other solar systems?

These issues are being addressed by the NASA Advanced Space Transportation Program (ASTP), which is currently investigating new ways to propel an unmanned spacecraft to Alpha Centauri within the span of a human lifetime of 50 years, and to send a manned spacecraft to Mars. Both tasks suffer the same dilemma: chemical propellants simply will not work. For the first case, chemical propellants lack the energy needed to boost a space probe up to 10% of the speed of light; the overall mass of such a booster would be unthinkable. For the latter case, the spacecraft only needs to attain the velocity necessary to get to Mars within 3-6 months; however, the mass of a manned payload once again places a burden on the size of the booster engine.

Many concepts have been devised. For years, scientists have suggested nuclear fission as an alternative approach for sending a manned spacecraft to Mars. Although the specific impulse (Isp) is still too low for interstellar missions, it does open new avenues in the vicinity of Earth. Unfortunately, environmental issues have all but "grounded" the use of nuclear fission as a propulsion source. Nuclear fusion is cleaner, and it is a more exciting prospect with its higher energy density and specific impulse. However, scientists have yet to develop a fusion device that offers beyond-break-even energy (more energy output than input), let alone one small enough to be sent into deep space. Lastly, electric propulsion, as used for Deep Space 1, cannot accelerate a spacecraft fast enough for the tasks mentioned above due to its low thrust-to-weight ratio.

It is here that antimatter attracts attention. Upon annihilation with matter, antimatter offers the highest energy density of any material currently found on Earth. As shown in the table below, antimatter offers the greatest specific impulse of any propellant currently available or in development, and its thrust-to-weight ratio is still comparable with that of chemical propulsion. Simply put, it would take only 100 milligrams of antimatter to equal the propulsive energy of the Space Shuttle.

Propulsion Type           Specific Impulse [sec]   Thrust-to-Weight Ratio

Chemical Bipropellant     200 - 410                0.1 - 10
Electromagnetic           1,200 - 5,000            10^-4 - 10^-3
Nuclear Fission           500 - 3,000              0.01 - 10
Nuclear Fusion            10^4 - 10^5              10^-5 - 10^-2
Antimatter Annihilation   10^3 - 10^6              10^-3 - 1

Antimatter is one of the most recognized and attractive words in science fiction. It's the stuff that drives fictional starships from one side of the universe to the other. Now NASA is giving it serious consideration as a rocket propellant to get around the solar system. A gram of antimatter would carry as much potential energy as 1,000 Space Shuttle external tanks carry.
The rockets will employ the age-old action-reaction principle, in an interesting meeting of Albert Einstein (E=mc²) and Isaac Newton (F=ma).
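The E = mc² bookkeeping is easy to check for the 100-milligram figure quoted above (annihilation converts the antimatter plus an equal mass of ordinary matter):

    # Energy released by annihilating 100 mg of antimatter with 100 mg of matter.
    c = 2.998e8              # speed of light, m/s
    m = 2e-4                 # total mass converted, kg (0.1 g + 0.1 g)

    energy = m * c ** 2      # E = m * c^2
    print(f"E = {energy:.2e} J")   # ~1.8e13 J, the order of magnitude of the
                                   # Shuttle's propulsive energy cited above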


For more info, visit:
http://www.enjineer.com/forum

ADVANCE IN CAR SAFETY


INTRODUCTION:


The death toll on the world's roadways makes driving the number one cause of death and injury for young people aged 15 to 44, killing more than one million people a year and seriously injuring another thirty-eight million. Cars have been getting much safer in recent years because safety is a selling point in new cars: people actively seek out and buy safer cars. With the introduction of airbags, crash testing and other smart equipment in vehicles, the number of people killed and injured by motor vehicles has decreased in many countries.

FUNDAMENTALS OF MOVING OBJECTS:

Before looking at the specifics, let's review the laws of motion. Moving objects have momentum: unless an outside force acts on it, an object will continue to move at its present speed and direction. If loose objects (passengers, luggage, etc.) in a car are not restrained, they will continue moving at the speed at which the car was traveling, even if the car is stopped by a collision. When a car crashes, the force required to stop its occupants is very great, because the car's momentum has changed in an instant while the passengers' has not. [1]

When your body is moving at 35 mph (56 km/h), it has a certain amount of kinetic energy. After the crash, when you come to a complete stop, you have zero kinetic energy. To minimize the risk of injury, that kinetic energy has to be removed as slowly and evenly as possible. Ideally, the car has seat belt pretensioners and force limiters: the pretensioners tighten the seat belts as soon as the car hits the barrier, but before the airbag deploys. The seat belt then absorbs some of the kinetic energy as you move forward towards the airbag. Milliseconds later, the force limiters kick in, making sure the force in the seat belts doesn't get too high. Next the airbag deploys, absorbing more of the forward motion and protecting you from hitting anything hard. [2]
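
To put numbers on this, kinetic energy is KE = ½mv², and the average force on the body falls as the stopping distance grows, which is exactly what belts and airbags buy you (a minimal sketch; the 75 kg occupant mass and the two stopping distances are illustrative assumptions, not figures from the text):

    # Kinetic energy and average stopping force for a 35 mph (56 km/h) crash.
    # KE = 0.5 * m * v^2; average force = KE / stopping distance.
    mass = 75.0                    # occupant mass [kg] (assumed)
    v = 56 * 1000 / 3600           # 56 km/h in m/s (~15.6 m/s)
    ke = 0.5 * mass * v**2         # kinetic energy [J] (~9 kJ)
    for distance in (0.05, 0.5):   # rigid impact vs. belt + airbag ride-down [m]
        print(f"stop in {distance} m -> average force {ke / distance / 1000:.0f} kN")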


for more info visit.
http://www.enjineer.com/forum

MAGNETIC LEVITATION TECHNOLOGY FOR TRAINS

MAGNETIC LEVITATION TECHNOLOGY FOR TRAINS

ABSTRACT:


In today’s computer era, the world is becoming a global village; distances between places keep shrinking thanks to high-technology transport systems. So I decided to choose this seminar topic, as it is very interesting as well as a different one. The transport systems of the world are at a crossroads. Now as never before there is a wealth of invention, and the commitment to any one system involves a financial investment at least as large as that needed to put a man on the moon.

At the same time there has been marked improvement in the field of high-speed transport technology. The seminar covers both theoretical and experimental contributions on the suspension, guidance, and propulsion of land transport with no vehicle-to-ground contact, and it covers information about the HSST and TRANSRAPID systems.

And finally the seminar deals with the existing trends and speculates about the future prospects of magnetic levitation trains.

Magnetic levitation:

Every element in the periodic system is magnetic; however, there are three main levels, or domains, of magnetism. The smallest effect is known as diamagnetism and is a property of elements like antimony, bismuth, copper, and carbon, but also phosphorus and hydrogen. Paramagnetism has an intermediate value, about two orders of magnitude stronger than diamagnetism. Paramagnetism is seen in, e.g., aluminum, zinc, and liquid oxygen, but also in glass and rubber. The largest magnetic effect is known as ferromagnetism. This is, in our daily life, the most common form of magnetism. The only elements with ferromagnetic properties are iron, cobalt, and nickel. Ferromagnetism is about 10^9 times stronger than diamagnetism. Ferro- and paramagnetic elements may easily be levitated in low-gradient magnetic fields, a system used in various commercial products. To levitate diamagnetic materials, however, much higher field strengths and field gradients are necessary. In such a high-power magnet, organic materials or even living biological samples may be levitated, thereby counteracting the force of gravity.
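
The required field follows from balancing the magnetic volume force against gravity, B·dB/dz = μ₀ρg/|χ| (a minimal sketch; the density and susceptibility of water are standard handbook values, used here as an illustrative example):

    # Field-gradient product needed to levitate a diamagnetic sample.
    # Levitation condition: (|chi| / mu0) * B * dB/dz = rho * g
    import math

    mu0 = 4 * math.pi * 1e-7   # permeability of free space [T*m/A]
    g = 9.81                   # gravitational acceleration [m/s^2]
    rho = 1000.0               # density of water [kg/m^3]
    chi = 9.0e-6               # |volume susceptibility| of water (dimensionless)

    b_dbdz = mu0 * rho * g / chi
    print(f"required B*dB/dz = {b_dbdz:.0f} T^2/m")  # ~1.4e3 T^2/m: needs a ~16 T magnet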

Although the various systems have their specific limitations, for some small samples this microgravity 'simulation' may be used to study living samples in preparation for a real microgravity experiment.


for more info visit.
http://www.enjineer.com/forum

Tuesday, September 28, 2010

NETWORK SECURITY

NETWORK SECURITY

Abstract:


A ``network'' has been defined as ``any set of interlinking lines resembling a net, a network of roads || an interconnected system, a network of alliances.'' A computer network is simply a system of interconnected computers. Networking lets users share programs and resources, but granting others access to a computer also opens a window for those with foul motives. Network security is the effort to create a secure computing platform. Here we focus on one tool for network security: the firewall. We have also discussed some types of protocols. Like its namesake, a firewall is a barrier that helps keep out dangers such as viruses and hackers; it makes the connection between a trusted and an untrusted network secure. Firewalls can come in the form of hardware or software. Firewall software is a basic requirement for anyone using broadband, to prevent hacking, viruses, and other security risks: it is designed to prevent unauthorized access to a computer or network that is connected to the Internet. Firewall software comes in a variety of forms, offering a wide variety of features, protection capabilities, scalability, and cost. Different types of firewalls are available in the market.
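
To make the idea concrete, a packet filter, the simplest kind of firewall, is essentially an ordered list of rules checked against each packet's source address and destination port (a minimal sketch, not any particular firewall product; the rule set is an illustrative assumption):

    # Minimal packet-filter sketch: the first matching rule decides the packet's fate.
    RULES = [
        # (source network prefix, destination port, action)
        ("10.0.0.", 80,  "ALLOW"),   # trusted LAN may reach the web server
        ("10.0.0.", 443, "ALLOW"),
        ("",        23,  "DENY"),    # block telnet from everywhere
    ]
    DEFAULT = "DENY"                 # default-deny is the safer policy

    def filter_packet(src_ip, dst_port):
        for prefix, port, action in RULES:
            if src_ip.startswith(prefix) and dst_port == port:
                return action
        return DEFAULT

    print(filter_packet("10.0.0.7", 80))       # ALLOW
    print(filter_packet("198.51.100.9", 23))   # DENY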

Introduction to Networking:

A basic understanding of computer networks is requisite in order to understand the principles of network security. In this section, we'll cover some of the foundations of computer networking and security, then move on to security measures. Once we've covered this, we'll go back and discuss some of the threats that managers and administrators of computer networks need to confront, and then some measures that can be used to reduce exposure to the risks of network computing.

What is a Network?:

A ``network'' has been defined as ``any set of interlinking lines resembling a net, a network of roads || an interconnected system, a network of alliances.'' This definition suits our purpose well: a computer network is simply a system of interconnected computers.


for more info visit.
http://www.enjineer.com/forum

Network Security

Network Security

Introduction:


A “network” has been defined as “any set of interlinking lines resembling a net, a network of roads || an interconnected system, a network of alliances.” This definition suits our purpose well: a computer network is simply a system of interconnected computers. How they're connected is irrelevant, and as we'll soon see, there are a number of ways to do this.

Network Security:

Network security is the process of preventing and detecting unauthorized use of your network. Prevention measures help you to stop unauthorized users (also known as “intruders”) from accessing any part of your computer system.
Detection helps you to determine whether or not someone attempted to break into your system, if they were successful, and what they may have done.

Need of Network Security:

We use networks for everything from banking and investing to shopping and communicating with others through email or chat programs.

• Although you may not consider your communications "top secret," you probably do not want strangers reading your email, using your computer to attack other systems, sending forged email from your network, or examining personal information stored on your network.
• Protecting data from being read by unauthorized persons.
• Preventing unauthorized persons from inserting or deleting messages.
• Verifying the sender of each message (see the sketch below).
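
As one concrete way of verifying the sender of a message, the two endpoints can share a secret key and attach a message authentication code to each message (a minimal sketch using Python's standard hmac module; the shared key is an illustrative assumption):

    # Sender authentication with an HMAC over the message body.
    import hmac, hashlib

    SHARED_KEY = b"pre-shared secret"   # known only to the two endpoints (assumed)

    def sign(message: bytes) -> bytes:
        return hmac.new(SHARED_KEY, message, hashlib.sha256).digest()

    def verify(message: bytes, tag: bytes) -> bool:
        # compare_digest avoids timing side channels
        return hmac.compare_digest(sign(message), tag)

    msg = b"transfer 100 units"
    tag = sign(msg)
    print(verify(msg, tag))                    # True: message is authentic
    print(verify(b"transfer 999 units", tag))  # False: tampered message rejected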


for more info visit.
http://www.enjineer.com/forum

DBMS

DBMS

ABSTRACT:


The development pace of computing appears to accelerate year on year. DBMSs have been maturing slowly over the last twenty years and have reached a high level of reliability. The database supplied with your library application is important: it must work well (reliably, efficiently, and flexibly), be able to respond to upcoming changes in the computing and information-handling world, and be commercially viable.

So when looking at a library system, consider the advantages and disadvantages of the various types of DBMS that underlie the system, and assess whether your needs are likely to be limited by the technologies offered. If you need lower hardware and administration costs, then the greater efficiency of a Nested Relational DBMS could be useful. If you need in-depth retrieval based on the text of the documents rather than assigned terms, then look for a product with a free-form DBMS or text retrieval engine incorporated.

Although database management system (DBMS) technology is very mature, there is a potential for much future innovation in integrating structured and unstructured data, virtualizing access to data, and simplifying data management through greater automation and intelligence.

• A DBMS is used to maintain and query large datasets.
• Benefits include recovery from system crashes, concurrent access, quick application development, data integrity, and security (see the sketch below).
• Levels of abstraction give data independence.
• A DBMS typically has a layered architecture.
• DBMS is one of the broadest, most exciting areas in CS.
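
To illustrate two of the benefits above, all-or-nothing transactions and declarative queries, here is a minimal sketch using Python's built-in sqlite3 module (the books table and the simulated crash are illustrative assumptions):

    # Transactions give all-or-nothing updates; SQL gives declarative queries.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE books (title TEXT, copies INTEGER)")
    conn.execute("INSERT INTO books VALUES ('DBMS Concepts', 3)")
    conn.commit()                     # make the initial state durable

    try:
        with conn:  # a transaction: commits on success, rolls back on error
            conn.execute("UPDATE books SET copies = copies - 1")
            raise RuntimeError("simulated crash mid-transaction")
    except RuntimeError:
        pass

    # The failed transaction was rolled back, so the data is unchanged.
    print(conn.execute("SELECT copies FROM books").fetchone())  # (3,)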

The future will call for efficient handling of objects and sophisticated Web serving. Simple Web serving is a feature of most DBMSs and should not pose problems. Object storage will be interesting to watch, especially as the demand ramps up for libraries to store and deliver more and more images, digitized texts, video, and so on.


for more info visit.
http://www.enjineer.com/forum

NEW TRENDS IN NETWORKING

NEW TRENDS IN NETWORKING

ABSTRACT:


In today’s New Economy, small businesses that might once have dealt with just local or regional concerns now have to consider global markets and logistics. At the same time, concerns about securing their networks against hackers and Denial-of-Service (DoS) attacks, and about sending data over the Internet, have become more widespread.

Until recently, such communications were only available by using leased telephone lines to maintain a Wide Area Network (WAN). Leased lines enabled companies to expand their private networks beyond their immediate geographic area. Moreover, a WAN provided advantages over a public network like the Internet when it came to reliability, performance, and security.

The continuing popularity of the Internet has led to the evolution of Virtual Private Networks (VPNs). A VPN is a connection that allows private data to be sent securely over a shared or public network, such as the Internet. In fact, one of the driving forces behind VPNs is the Internet and its global presence.

With VPNs, communication links between users and sites can be established quickly, inexpensively, and safely across the world. In this way, VPNs empower organizations to extend their network services to branch offices and remote users, such as traveling employees, telecommuters, and strategic partners, by creating a private WAN via the Internet. With all these benefits, small businesses are eager to reap the advantages afforded by VPNs. This paper explains what a VPN is and how VPNs provide secure, private connections to network applications.
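
The core mechanism a VPN depends on, an encrypted and authenticated channel across an untrusted network, can be sketched with Python's standard ssl module (a minimal sketch of the idea for a single connection, not a full VPN; the host name is an illustrative assumption and the example needs network access):

    # An encrypted tunnel over the public Internet: a VPN applies this same
    # idea to whole networks, shown here for a single TLS connection.
    import socket, ssl

    context = ssl.create_default_context()   # verifies the server certificate
    with socket.create_connection(("example.com", 443)) as raw_sock:
        with context.wrap_socket(raw_sock, server_hostname="example.com") as tls:
            # Everything sent from here on is encrypted and authenticated.
            tls.sendall(b"GET / HTTP/1.0\r\nHost: example.com\r\n\r\n")
            print(tls.recv(120))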

By reading this paper, you will gain a fundamental understanding of VPNs, including their security mechanisms, benefits, and cost-saving advantages.


for more info visit.
http://www.enjineer.com/forum

NETWORK SECURITY

NETWORK SECURITY

Abstract:


Network security is a complicated subject, historically only tackled by well-trained and experienced experts. However, as more and more people become ``wired'', an increasing number of people need to understand the basics of security in a networked world. This document was written with the basic computer user and information systems manager in mind, explaining the concepts needed to read through the hype in the marketplace and understand risks and how to deal with them.

Some history of networking is included, as well as an introduction to TCP/IP and internetworking. We go on to consider risk management, network threats, firewalls, and more special-purpose secure networking devices.

This is not intended to be a ``frequently asked questions'' reference, nor is it a ``hands-on'' document describing how to accomplish specific functionality.

It is hoped that the reader will have a wider perspective on security in general, and better understand how to reduce and manage risk personally, at home, and in the workplace.


for more info visit.
http://www.enjineer.com/forum

Network security

Network security

ABSTRACT:


These days, computer security is a serious and complex business. True security requires the coordination of staff and technology across the enterprise infrastructure, as well as educated and cooperative users. But even the best information security policies and plans will fail if the underlying network is not secure. You may think you are doing all you can to protect your network, but think again: security dangers you're not even aware of can be lurking in every corner of it. This paper provides an overview of network security, including firewalls and proxy servers.

One of the pillars of network security is commonly referred to as AAA, an abbreviation for authentication, authorization, and accounting.

The process of identifying an individual is referred to as authentication.
A common method of authentication is based upon a user name and password, requiring a person to enter a preassigned data pair to gain access to a network facility. One such method is the Password Authentication Protocol (PAP), in which the data pair is compared to entries stored in a table at the destination. Although the stored data may be encrypted, the transmitted data is not, a weakness that can be exploited by a network monitor.
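
One standard way to protect the stored side of that data pair is to keep only a salted hash of each password rather than the password itself (a minimal sketch using Python's standard hashlib; note this does nothing about PAP's real weakness, the cleartext transmission):

    # Storing and checking a salted password hash (never the password itself).
    import os, hashlib, hmac

    def store(password: str):
        salt = os.urandom(16)
        digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return salt, digest   # what the destination keeps in its table

    def check(password: str, salt: bytes, digest: bytes) -> bool:
        candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
        return hmac.compare_digest(candidate, digest)

    salt, digest = store("s3cret")
    print(check("s3cret", salt, digest))   # True
    print(check("guess", salt, digest))    # False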

This paper mainly provides the details of firewalls used in network security: the types of firewalls, what they do, how they protect your system, and the architecture of a firewall.


for more info visit.
http://www.enjineer.com/forum

Artificial Intelligence

Artificial Intelligence

Abstract:


Artificial Intelligence is the hot topic on many boards today. It is also the hot topic in many laboratories and software houses. The military is trying to harness it to replace humans, the game houses are trying to control it to make games more realistic, and the appliance makers and homebuilders are trying to perfect it to make our lives easier. The expert system is one of the most important and leading applications of artificial intelligence, used in the modern world for expert work. The primary goal of expert systems research is to make expertise available to decision makers and technicians who need answers quickly. There is never enough expertise to go around; certainly it is not always available at the right place and the right time. Portable computers loaded with in-depth knowledge of specific subjects can bring decades' worth of knowledge to a problem. The same systems can assist supervisors and managers with situation assessment and long-range planning. Many small systems now exist that bring a narrow slice of in-depth knowledge to a specific problem, and these provide evidence that the broader goal is achievable. These knowledge-based applications of artificial intelligence have enhanced productivity in business, science, engineering, and the military. With advances in the last decade, today's expert system clients can choose from dozens of commercial software packages with easy-to-use interfaces. Each new deployment of an expert system yields valuable data about what works in what context, thus fueling the AI research that provides ever better applications.
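
At its core, a classic expert system is a set of if-then rules applied repeatedly to a working memory of facts until nothing new can be inferred (a minimal forward-chaining sketch; the rules and facts are illustrative assumptions, not from any real system):

    # Forward-chaining inference over simple if-then rules.
    RULES = [
        ({"fever", "cough"},        "flu_suspected"),
        ({"flu_suspected", "rash"}, "see_specialist"),
    ]

    def infer(facts: set) -> set:
        changed = True
        while changed:              # keep firing rules until a fixed point
            changed = False
            for conditions, conclusion in RULES:
                if conditions <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    print(infer({"fever", "cough", "rash"}))
    # contains 'flu_suspected' and 'see_specialist' (set order may vary)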


for more info visit.
http://www.enjineer.com/forum

Artificial Intelligence

Artificial Intelligence

ABSTRACT:


This paper is an introduction to Artificial Intelligence (AI). Artificial intelligence is intelligence exhibited by an artificial entity, generally assumed to be a computer. AI systems are now in routine use in economics, medicine, engineering, and the military, as well as being built into many common home computer software applications, traditional strategy games like computer chess, and other video games.

We explain the basic ideas of AI and its applications in various fields, and clarify the distinction between the computational and conventional categories. The paper covers advanced systems such as neural networks, fuzzy systems, and evolutionary computation. AI is applied to typical problems such as pattern recognition, natural language processing, and more. Such systems are at work throughout the world as artificial brains.

Intelligence involves mechanisms, and AI research has discovered how to make computers carry out some of them and not others. If doing a task requires only mechanisms that are well understood today, computer programs can give very impressive performances on these tasks. Such programs should be considered ``somewhat intelligent''. It is related to the similar task of using computers to understand human intelligence.

We can learn something about how to make machines solve problems by observing other people or just by observing our own methods. On the other hand, most work in AI involves studying the problems the world presents to intelligence rather than studying people or animals. AI researchers are free to use methods that are not observed in people or that involve much more computing than people can do. We discussed conditions for considering a machine to be intelligent. We argued that if the machine could successfully pretend to be human to a knowledgeable observer then you certainly should consider it intelligent.


for more info visit.
http://www.enjineer.com/forum

Artificial Intelligence

Artificial Intelligence

Abstract:


Artificial Intelligence is often presented as the solution for reducing all kinds of stress on mankind, and it has created a new paradigm of research and applications for itself. Users will demand greater capabilities that allow them to access more applications, and different forms of those applications, in the future.

AI has come a long way since John Haugeland's early formulations in the 1970s.

This paper discusses the theory of the development of AI, along with experimental research performed on it. It also deals with the approach from psychology, and emphasizes what may be called linguistic intelligence in the development of AI.

The paper also discusses the types of AI and their sub-types, viz. Strong AI (with its sub-types) and Weak AI.
Ultimately the paper focuses on the tests used for the detection of AI in both strata, i.e. the neuropsychological and the linguistic, which are still in use by the majority. It also covers the programming languages used for AI.

Finally, the paper stresses the various fields of application of AI and lists some famous applications in each field with a short description.

The paper ends with a list of the sub-fields of AI research.


for more info visit.
http://www.enjineer.com/forum
ARTIFICIAL INTELLIGENCE

ABSTRACT:


Today, in the world of computers, everything has become automated. Computers have taken over our boring and tedious jobs. Earlier they were designed merely to help humans, but nowadays computer engineers are working on a new breed of computers known as intelligent machines. These machines can simulate human intelligence to an extent. Although they are designed by humans, they are able to learn by themselves, learning and working the way a newborn child does. These machines can adapt to a changing environment by observing their surroundings and learning from past experience, just like a small child.

To help us and our organizations cope with the unpredictable eventualities of an ever-more volatile world, these systems need capabilities that will enable them to adapt readily to change. Health-care providers require easy access to information systems so they can track health-care delivery and identify the most recent and effective medical treatments for their patients' conditions. Crisis management teams must be able to explore alternative courses of action and support decision making. Educators need systems that adapt to a student's individual needs and abilities. Businesses require flexible manufacturing and software design aids to maintain their leadership position in information technology, and to regain it in manufacturing.

By providing computer programs that amplify human cognitive abilities and increase human productivity, reach, and effectiveness, we can help meet national needs in industries like health care, education, service, and manufacturing. This paper reviews the issues arising from the combination of artificial intelligence techniques with those of virtual environments created by humans.


for more info visit.
http://www.enjineer.com/forum

Silicon technology for optical MEMS

Silicon technology for optical MEMS

ABSTRACT:

A system is nothing but an implementation of logic that regulates outputs in response to inputs. Electronic systems can be designed using discrete components or IC technology; however, systems involving a large number of operations cannot be realized well with discrete components, which is why design has moved towards IC technology.

The enormous developments in electronic microfabrication technologies have led to very large scale integration and the emergence of an era of microelectromechanical systems (MEMS). The fascinating aspect of the field of MEMS and micro devices is its multidimensional nature.

Because semiconductor silicon has well-suited characteristics, it is one of the most important materials for the development of MEMS. Driven by the progress in optical systems for telecommunications and the need for increased bandwidth, this paper deals with silicon technology for optical MEMS.

Through this paper we wish to introduce MEMS, the different fabrication technologies in MEMS, their application to optoelectronics, the various optical devices used in MEMS, and the future prospects of MEMS technology.


for more info visit.
http://www.enjineer.com/forum

Sensor Technology

Sensor Technology

ABSTRACT:


The recent advances in sensor technology have been powered by high-speed, low-cost electronic circuits, novel signal processing methods, and innovative advances in manufacturing technologies. The synergetic interaction of new developments in these fields allows completely novel approaches that increase the performance of technical products. Innovative sensor structures have been designed that permit self-monitoring or self-calibration. The rapid progress of sensor manufacturing technologies allows the production of systems and components with a low cost-to-performance ratio. Among microsystems manufacturing technologies, surface and bulk micromachining are increasingly winning recognition. The potential of digital signal processing opens new approaches for improving sensor properties. Multi-sensor systems can significantly enhance the quality and availability of information; for this purpose, sophisticated signal processing methods based on data fusion techniques compute measurement values or decisions more accurately than the usual threshold-based algorithms. In this state-of-the-art lecture we give an overview of the recent advances and future development trends in the field of sensor technology, focusing on novel sensor structures, manufacturing technologies, and signal processing methods in individual systems. The predominant future development trends are the miniaturization of sensors and components, the widespread use of multi-sensor systems, and the increasing relevance of wireless and autonomous sensors.
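
As a small illustration of why data fusion beats a single threshold check, two noisy sensors can be combined by inverse-variance weighting, which always yields a lower-variance estimate than either sensor alone (a minimal sketch; the temperature readings and variances are illustrative assumptions):

    # Inverse-variance weighted fusion of two noisy sensor readings.
    def fuse(x1, var1, x2, var2):
        w1, w2 = 1 / var1, 1 / var2
        estimate = (w1 * x1 + w2 * x2) / (w1 + w2)
        variance = 1 / (w1 + w2)     # always smaller than var1 and var2
        return estimate, variance

    # Two temperature sensors, one noisier than the other (assumed values).
    est, var = fuse(21.3, 0.04, 20.8, 0.16)
    print(f"fused estimate {est:.2f} degC, variance {var:.3f}")  # 21.20 degC, 0.032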


for more info visit.
http://www.enjineer.com/forum

Security as a New Dimension in Embedded System Design

Security as a New Dimension in Embedded System Design

INTRODUCTION:


Today, security in one form or another is a requirement for an increasing number of embedded systems, ranging from low-end systems such as PDAs, wireless handsets, networked sensors, and smart cards, to high-end systems such as routers, gateways, firewalls, storage servers, and web servers. It has been observed that the cost of insecurity in electronic systems can be very high. For example, it was estimated that the “I Love You” virus caused nearly one billion dollars in lost revenues worldwide. With an increasing proliferation of such attacks, it is not surprising that a large number of users in the mobile commerce world (nearly 52% of cell phone users and 47% of PDA users, according to a survey by Forrester Research) feel that security is the single largest concern preventing the successful deployment of next-generation mobile services.

With the evolution of the Internet, information and communications security has gained significant attention. For example, various security protocols and standards, such as SSL, WEP, and WTLS, are used for secure communications. While security protocols and the cryptographic algorithms they contain address security from a functional perspective, many embedded systems are constrained by the environments they operate in and by the resources they possess. For such systems, several factors are turning security from a function-centric consideration into a system architecture design issue. For example:

• An ever-increasing range of attack techniques for breaking security, such as software, physical, and side-channel attacks, requires that the embedded system be secure even when it can be logically or physically accessed by malicious entities. Resistance to such attacks can be ensured only if it is built into the system architecture and implementation.

• The processing capabilities of many embedded systems are easily overwhelmed by the computational demands of security processing, leading to undesirable tradeoffs between security and cost, or security and performance (see the sketch after this list).

• Embedded system architectures need to be flexible enough to support the rapid evolution of security mechanisms and standards.

• New security objectives, such as denial of service and digital content protection, require a higher degree of co-operation between security experts and embedded system architects.
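
The second point is easy to make concrete: even a fast cryptographic primitive has a measurable throughput ceiling, and an embedded core may run the same workload orders of magnitude slower than a desktop (a minimal sketch using Python's standard library; the 64 MiB workload is an illustrative assumption):

    # Measuring the raw throughput of a cryptographic primitive (SHA-256).
    import hashlib, time

    payload = b"\x00" * (1 << 20)   # 1 MiB block
    blocks = 64                     # hash 64 MiB in total (assumed workload)

    start = time.perf_counter()
    h = hashlib.sha256()
    for _ in range(blocks):
        h.update(payload)
    elapsed = time.perf_counter() - start

    print(f"SHA-256 throughput: {blocks / elapsed:.0f} MiB/s")
    # On a low-MHz embedded core this figure can drop by 100x or more,
    # which is the security-vs-performance tradeoff described above.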

This paper will introduce the embedded system designer to the importance of embedded system security, review evolving trends and standards, and illustrate how the security requirements translate into system design challenges. Emerging solutions to address these challenges through a combination of advanced embedded system architectures and design methodologies will be presented.


for more info visit.
http://www.enjineer.com/forum

DEVELOPMENT OF A MICROPROCESSOR-BASED DATA ACQUISITION SYSTEM FOR SUBSTATION AUTOMATION

DEVELOPMENT OF A MICROPROCESSOR-BASED DATA ACQUISITION SYSTEM FOR SUBSTATION AUTOMATION

Abstract:


In high-voltage transmission systems, fault levels are generally high and, if faults are not cleared rapidly, can cause system instability as well as extensive damage and hazards to personnel. Speed of operation, selectivity, reliability, and security of the trip decision assume greater importance at higher operating voltages, as these lines handle the bulk of the power. Data acquisition is the prerequisite for achieving the above objectives. The paper deals with the design and assembly of the data acquisition architecture, both hardware and software, for automating a substation; a scaled-down model of a substation has been developed for this purpose. The hardware mainly consists of signal processing and data acquisition stages, with an Intel 8086 microprocessor used to acquire the data. The software, developed in 8086 assembly language, is written in modules for easy understanding and implementation, and acquires data such as bus voltages and line currents from the system. To demonstrate the use of the acquired data, we have developed a simple fault detection scheme using over-voltage and over-current criteria: when a fault is detected, the processor sends a trip signal to the appropriate circuit breaker to isolate the faulted component from the system. The program also calculates the power factor of the load present in the system. Possibilities of future development to improve the design and performance of the data acquisition scheme are also discussed.
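
The over-voltage/over-current scheme described above reduces to comparing each acquired sample against preset limits and issuing a trip command on the first violation (a minimal sketch in Python rather than 8086 assembly; the nominal values and thresholds are illustrative assumptions):

    # Over-voltage / over-current fault detection on acquired samples.
    V_LIMIT = 1.1 * 230.0   # trip above 110% of nominal voltage (assumed)
    I_LIMIT = 1.5 * 10.0    # trip above 150% of rated current (assumed)

    def check_sample(bus_voltage, line_current):
        if bus_voltage > V_LIMIT:
            return "TRIP: over-voltage"
        if line_current > I_LIMIT:
            return "TRIP: over-current"
        return "OK"

    for v, i in [(231.0, 9.8), (260.0, 9.9), (229.0, 18.0)]:
        print(check_sample(v, i))   # OK, TRIP: over-voltage, TRIP: over-current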

Introduction:

Substations are switching and transforming stations located between generating and load centres. A modern substation is a complex structure, as it requires numerous items of equipment and allied services. This equipment provides supervision, control, and protection of the substation. In a typical substation, the control equipment can be either analog or digital in nature, and its operation either manual or automatic. In either case, the control equipment gives commands to circuit breakers to perform duties such as switching, auto-reclosure, bus bar protection, etc. With the increase in the complexity and size of substations, automatic control, i.e. the automation of the substation, has assumed great importance.


for more info visit.
http://www.enjineer.com/forum

The New Integrated WAN Interconnected SCADA Systems

The New Integrated WAN Interconnected SCADA Systems

ABSTRACT:


A number of new technologies for the monitoring, protection, and control of the power grid have been perfected in recent years, and a judicious application of these technologies can help reduce the frequency and severity of future catastrophic failures. A Supervisory Control And Data Acquisition (SCADA) system provides an excellent tool for monitoring and controlling grid operations. With the opening of the electrical utility market, SCADA systems have changed from a "luxury" to a "necessity".

Traditional SCADA approaches assume that each function, such as protection, control, monitoring, and maintenance, is supported by a separate infrastructure of recording instruments and/or controllers for obtaining and processing data. One issue that did not get adequate attention in past SCADA systems is data integration and information exchange. These SCADA models provide acceptable performance and reliability, but they have numerous drawbacks, particularly in the areas of flexibility and open access to information. New trends, such as Wide Area Network (WAN) interconnected SCADA systems, can eliminate these drawbacks. The configuration and communication techniques of a WAN-interconnected SCADA system are presented in the paper. Such a system can greatly improve the reliability and processing capabilities of existing SCADA systems.


for more info visit.
http://www.enjineer.com/forum

SUPERVISORY CONTROL AND DATA ACQUISITION IN POWER SYSTEMS

SUPERVISORY CONTROL AND DATA ACQUISITION IN POWER SYSTEMS

ABSTRACT:


The implementation of “SCADA” (Supervisory Control And Data Acquisition), along with its software, aims at the “CONTROL AND OPERATION OF AN INTERCONNECTED POWER SYSTEM”.

By using a SCADA system, a large network comprising several generating stations, substations, and large load centres is controlled from a central load dispatch centre.

In this system, real-time data from the power system is acquired through transducers (which convert the a.c. signals from the C.T.s and P.T.s into d.c. signals proportional to the measured values of the respective parameters) and converted into digital signals. The data is then transmitted from the RTUs (Remote Terminal Units), located at the generating stations or substations, to the load control centre through power line carrier (PLCC), fibre-optic, and microwave channels. In this way the data acquisition is completed. The data, processed by computer systems running energy management software, provides automatic and remote control of the network at the load control centre. Instructions from the load control centre are then transmitted to the control rooms of the substations and generating stations for executing the appropriate action.
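
At the receiving end, the transducer-to-digital step described above amounts to a linear scaling of ADC counts back into engineering units, repeated for every polled point (a minimal sketch; the 12-bit resolution, scaling ranges, and point names are illustrative assumptions):

    # Converting raw RTU/ADC counts into engineering units at the master station.
    ADC_FULL_SCALE = 4095   # 12-bit ADC (assumed)

    # point name -> (engineering value at 0 counts, at full scale, unit)
    POINTS = {
        "bus_voltage":  (0.0, 400.0,  "kV"),
        "line_current": (0.0, 1200.0, "A"),
    }

    def to_engineering(point, counts):
        lo, hi, unit = POINTS[point]
        value = lo + (hi - lo) * counts / ADC_FULL_SCALE
        return value, unit

    for point, counts in [("bus_voltage", 2250), ("line_current", 512)]:
        value, unit = to_engineering(point, counts)
        print(f"{point}: {value:.1f} {unit}")   # ~219.8 kV, ~150.0 A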

Digital computers and microprocessors installed in the control rooms of large substations, generating stations, and the load control centre are used for data collection, data monitoring, and automatic control.

Thus, applying a SCADA system to the power distribution network provides an integrated approach to power system protection, operation, control, and monitoring, with the least intervention from the control room operator.


for more info visit.
http://www.enjineer.com/forum

SUPERVISORY CONTROL AND DATA ACQUISITION FOR AUTOMATIC SUB-STATION CONTROL

SUPERVISORY CONTROL AND DATA ACQUISITION FOR AUTOMATIC SUB-STATION CONTROL

ABSTRACT:


“Good people all, of every sort, give ear unto my song”: this was the cry made by thousands of operators running to and fro to detect, analyze, and report faults to their senior officials, then waiting for instructions to follow. Finally someone heard their cry, and a master system was born. Energy is generated, transmitted, distributed, and finally utilized, and at every stage certain supervision, control, and protection are necessary. This paper deals with the basic SCADA system: its components, basic block diagram, functions, applications, and advantages.

SCADA helps in monitoring the system and has an alarm-generation facility in case of faults, which helps in real-time analysis. This paper deals with the data logging ability of SCADA, which enables offline analysis. Thus remote control over the power system is achieved by SCADA, which is effectively depicted in this paper.

This paper is an attempt to highlight the features of SCADA, a revolutionary development in the automatic monitoring and control of processes. The need for SCADA, as well as its hierarchical structure, is also covered. The components of SCADA, divided into hardware and software, are treated in detail, as are features such as simulation options and data import and export functions. Criteria for the economic operation of the grid, and the states in which it operates, are also considered. Security and reliability, prime factors for any automated system, are explained, along with the different levels of redundancy required for various automated systems. The facilities provided by SCADA are also highlighted in the paper.



for more info visit.
http://www.enjineer.com/forum

SCADA (Supervisory Control and Data Acquisition)

SCADA (Supervisory Control and Data Acquisition)

ABSTRACT:


An industrial SCADA system will be used for the development of the controls of the four LHC experiments. This paper describes SCADA systems in terms of their architecture, their interface to the process hardware, and the functionality and application development facilities they provide. Some attention is also paid to the industrial standards to which they adhere, their planned evolution, and the potential benefits of their use.

MASON-GREY designed a SCADA (Supervisory Control and Data Acquisition) system to provide overall control of a client’s material handling process from a central control room in the plant. The purpose of the SCADA system is to provide centralized control of the process, and to log a history of important process parameters for trending, quality control, and similar purposes.

The SCADA system consists of an IBM-compatible computer running a SCADA software package. Because of the importance of the SCADA system’s uptime, a failsafe design is of critical importance.


for more info visit.
http://www.enjineer.com/forum

SCADA

SCADA

ABSTRACT:


An industrial SCADA system will be used for the development of the controls of the four LHC experiments. This paper describes SCADA systems in terms of their architecture, their interface to the process hardware, and the functionality and application development facilities they provide. Some attention is also paid to the industrial standards to which they adhere, their planned evolution, and the potential benefits of their use.

SCADA is an acronym for supervisory control and data acquisition: a computer system for gathering and analyzing real-time data. SCADA systems are used to monitor and control plant or equipment in industries such as telecommunications, water and waste control, energy, oil and gas refining, and transportation. A SCADA system gathers information (such as where a leak on a pipeline has occurred), transfers it back to a central site, alerts the home station that the leak has occurred, carries out the necessary analysis and control (such as determining whether the leak is critical), and displays the information in a logical and organized fashion. SCADA systems can be relatively simple, such as one that monitors the environmental conditions of a small office building, or incredibly complex, such as a system that monitors all the activity in a nuclear power plant or of a municipal water system.

SCADA (supervisory control and data acquisition) is an industrial measurement and control system consisting of a central host or master (usually called a master station, master terminal unit, or MTU); one or more field data-gathering and control units or remotes (usually called remote stations, remote terminal units, or RTUs); and a collection of standard and/or custom software used to monitor and control remotely located field data elements. Contemporary SCADA systems exhibit predominantly open-loop control characteristics and utilize predominantly long-distance communications, although some elements of closed-loop control and/or short-distance communications may also be present.

Systems similar to SCADA systems are routinely seen in factories, treatment plants, etc. These are often referred to as Distributed Control Systems (DCS). They have functions similar to SCADA systems, but the field data-gathering or control units are usually located within a more confined area. Communications may be via a local area network (LAN), and will normally be reliable and high-speed. A DCS usually employs a significant amount of closed-loop control. SCADA systems, on the other hand, generally cover larger geographic areas and rely on a variety of communications systems that are normally less reliable than a LAN; closed-loop control in this situation is less desirable.

for more info visit.
http://www.enjineer.com/forum