
Tuesday, January 27, 2009


Open Hack 2009 Yahoo!

a Yahoo! Hack Day in India (Bangalore)

What is this?

We’ll be inviting 200-300 developers to attend this FREE event, which will begin with hack-related presentations from some of the Web’s most respected developers. We’ll then dive into 24 hours of hacking on a very nice collection of tools, Yahoo!’s Open Platforms, other APIs, and data, and end with awards from the sponsors plus bragging rights until the end of eternity or the next Hack Day, whichever comes first.

As a Hacker Guru you will get a chance to play with our newest releases such as BOSS, Blueprint, YAP, YQL and more…
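If you haven't tried YQL yet, here's the flavour of it: a single SQL-like query against Yahoo!'s public web service. The sketch below (in Python) uses the public endpoint and the search.web table as Yahoo! documented them at the time; the query itself is just an illustration, not official sample code.

```python
# Minimal sketch: issuing a YQL query against Yahoo!'s public
# endpoint. The endpoint URL and the 'search.web' table follow
# Yahoo!'s YQL documentation of the time; the query is illustrative.
import json
import urllib.parse
import urllib.request

yql = 'select title, url from search.web where query="open hack bangalore"'
url = ("http://query.yahooapis.com/v1/public/yql?"
       + urllib.parse.urlencode({"q": yql, "format": "json"}))

with urllib.request.urlopen(url) as resp:
    data = json.load(resp)

for item in data["query"]["results"]["result"]:
    print(item["title"], "->", item["url"])
```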

Naturally, we plan to provide physical and mental sustenance throughout the event.
If you’d like to be considered for participation, please let us know now. Due to the limited size of the venue, we will be reviewing applications for attendance and giving preference to those we believe will best be able to contribute to the development of a hack at the event. You may hack individually or in groups, but do sign up individually.

To support your application, please enter information (such as a link to your blog or other online presence) in the “Special Requirements” section on the third page of the application form. Also, please tell us what you would expect from this event that would make it useful to you.

When?

14th February '09 – 15th February '09.

Where?

The Taj Residency, Bangalore

Link : http://hackdayindia.eventwax.com/open-hack-2009


Centre for Biotechnology, Anna University - BIOTECHCELLENCE '09: 15th National Level Technical Symposium

Introduction :
As the name suggests, this event is all about showcasing excellence in the various facets of biotechnology. Serving as a common platform of interaction and discussion for students, professors and eminent scientific personalities from all over India, it also kindles the spirit of research in the field, which is most needed at the moment!
An additional highlight of this year's Biotechcellence is a move to go eco-friendly: an effort has been made to make every aspect of the symposium gentle on the environment. We hope to carry on the high reputation set by this event and also add our own dimensions to it.
Welcome to the big world of micro and nano!

Workshops

Description

A one-day workshop on gene expression analysis, with hands-on training in real-time PCR, will be conducted for selected participants on February 19, 2009, at the Centre for Biotechnology, Anna University.
The workshop will be conducted by experts from Eppendorf India Limited, who will provide quality training. It will throw light on real-time PCR and microarray techniques and their applications in the analysis of gene expression.
Hands-on training is provided for each participant for a better understanding of the technique. Participants will be provided with hand-outs containing protocols and material of international standard.

Click here to register for the workshop

Eligibility

Students in the II or III year of B.Tech are eligible to participate.
Separate registration is required for the workshop.
Only a limited number of students can participate. Selected candidates will be notified by email 10 days before the event.

Contact

For enquiries, drop a mail to: workshopbtx09@gmail.com

Important Dates :

Event starts : 02/19/09

Event finishes : 02/22/09

Last date for online registration: 01/31/09

Source: http://www.biotechcellence09.com/


Sun Microsystems - Sun Tech Days 2009: A Worldwide Developer Conference, Hyderabad

Introduction:

Sun Tech Days is a worldwide developer conference organized by Sun Microsystems, and this year it is being held in Hyderabad. It is an opportunity to hear from and meet some of the brightest minds from around the world, and a chance to network with your peers and learn more about what is going on at Sun and in the industry, all in a local environment. This is a three-day event. The eminent speakers, all from Sun Microsystems, Inc., include Robert Brewin, Distinguished Engineer and Chief Technology Officer, Software; Jeff Jackson, Senior Vice President, Solaris Engineering Group; Jeet Kaul, Vice President of Developer Products and Programs; Aisling MacRunnels, Vice President of Software Marketing; and Karen Tegan-Padir, Vice President, Enterprise Java Platforms.

Location :

Sun Tech Days 2009, Hyderabad International Convention Centre, Novotel & HICC Complex (Near Hitec City), P O Bag 1101, Cyberabad Post Office, Hyderabad - 500 081, India
Telephone: +91 40 66824422 / 66134422

Important Dates :

Conference starts : 02/18/09

Conference ends : 02/20/09

Link : http://www.sercononline.com/suntechdays09/index.htm

Thursday, January 22, 2009


WEB 2.0

Abstract

In recent months, Web 2.0 has floated as a buzzword through the World Wide Web, signifying something new, exciting, and promising. This report aims at capturing the major ideas that stand behind this term, presenting them in a comprehensible way, and trying to summarize and draw conclusions from the current hype.

According to recent reports, universities are using Web 2.0 to reach out to and engage with Generation Y and other prospective students. Examples of this are: social networking websites such as YouTube, MySpace, Facebook, Twitter and Flickr; upgrading institutions' websites in Gen Y-friendly ways, for example stand-alone micro-websites with minimal navigation; placing current students in cyberspace through student blogs; and blogging that enables prospective students to log on and ask questions.

Tuesday, January 20, 2009


Intrusion Detection System

ABSTRACT

This seminar answers simple questions about detecting intruders who attack systems through the network, and especially about how such intrusions can be detected. Intrusions lead to frequent problems in today's vast networks. If they are not dealt with at the proper time, they result in improper functioning of the network and loss of confidential information, which is of utmost importance.

The seminar gives a brief overview of:

Intrusion: What is it?

Types of intrusion.

Detecting intrusions with known signatures.

Measures to be taken to prevent intrusion.

INTRODUCTION

What is a "network intrusion detection system (NIDS)"?

An intruder is somebody (a "hacker" or "cracker") attempting to break into or misuse your system. The word "misuse" is broad, and can cover anything from something as severe as stealing confidential data to something as minor as misusing your email system for spam.

An "Intrusion Detection System (IDS)" is a system for detecting

such intrusions. Network intrusion detection systems (NIDS) monitors’ packets on

the network wire and attempts to discover if a hacker/cracker is attempting to break into a system (or cause a denial of service attack). A typical example is a system that watches for large number of TCP connection requests (SYN) to many different ports on a target machine thus discovering if someone is attempting a TCP port scan. A NIDS may run either on the target machine who watches its own traffic (usually integrated with the stack and services themselves), or on an independent machine promiscuously watching all network traffic (hub, router, probe). Note that a "network" IDS monitors many machines, whereas the others monitor only a single machine system integrity verifiers (SIV) monitors system files to find when a intruder changes them (thereby leaving behind a backdoor). The most famous of such systems is "Tripwire". A SIV may watch other components as well, such as the Windows registry and chron configuration, in order to find well know signatures. log file monitors (LFM) monitor log files generated by network services. In a similar manner to NIDS, these systems look for patterns in the log files that suggest an intruder is attacking.

A typical example would be a parser for HTTP server log files that looks for intruders trying well-known security holes, such as the "phf" attack.

A typical example of such a tool is "swatch". There are also deception systems, which contain pseudo-services whose goal is to emulate well-known holes in order to entrap hackers.
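To make two of these ideas concrete, here is a toy sketch of the SYN-counting port-scan detector described above (our own illustration, not any product's actual algorithm). It assumes the third-party scapy package, needs root privileges to sniff, and uses an arbitrary alert threshold.

```python
# Toy NIDS sketch: alert when one source sends TCP SYNs to many
# distinct ports (a crude port-scan signature). Assumes the
# third-party 'scapy' package; run with root privileges.
from collections import defaultdict
from scapy.all import sniff, IP, TCP

PORT_THRESHOLD = 20            # arbitrary: distinct ports before alerting
ports_seen = defaultdict(set)  # source IP -> set of targeted ports

def inspect(pkt):
    if pkt.haslayer(IP) and pkt.haslayer(TCP) and (pkt[TCP].flags & 0x02):  # SYN bit set
        src = pkt[IP].src
        ports_seen[src].add(pkt[TCP].dport)
        if len(ports_seen[src]) == PORT_THRESHOLD:
            print(f"ALERT: possible TCP port scan from {src}")

sniff(filter="tcp", prn=inspect, store=False)
```

A log file monitor can be sketched just as briefly. The following illustration (not swatch itself) scans an Apache-style access log for requests probing the old "phf" CGI hole; the log path and regex are assumptions.

```python
# Minimal LFM sketch: scan an Apache-style access log for requests
# probing the well-known "phf" CGI hole. Path and regex are
# illustrative assumptions.
import re
import sys

PHF_PROBE = re.compile(r'GET\s+\S*/cgi-bin/phf\S*', re.IGNORECASE)

def scan(logfile):
    with open(logfile, encoding="utf-8", errors="replace") as fh:
        for lineno, line in enumerate(fh, 1):
            if PHF_PROBE.search(line):
                print(f"{logfile}:{lineno}: possible phf probe: {line.strip()}")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else "access.log")
```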

Sunday, January 18, 2009


Computational Fluid Dynamics (CFD)

ABSTRACT

The sleek & beautiful aircraft roles down the run away, takes off & rapidly climbs out of sight within a minute, this same aircraft has accelerated to hypersonic speed; still within atmosphere. Its powerful supersonic engine continues to propel aircraft with velocity near 26000 ft/s orbital velocity and vehicle simply coasts into low earth orbit. Is this the stuff of dreams? Not really; indeed this is possible due to CFD.

CFD is the art of replacing the differential equations governing fluid flow with a set of algebraic equations, which in turn can be solved with the aid of a digital computer to get an approximate solution.

The physical aspects of any fluid flow are governed by three fundamental principles:

(1) Mass is conserved (continuity equation)

(2) Newton’s second law (Navier-Stokes equations)

(3) Energy is conserved (energy equation)
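To give a feel for what "replacing a differential equation with algebraic equations" means, here is a minimal sketch (our own illustration, not part of the seminar material) that discretizes the 1-D heat equation u_t = alpha * u_xx with an explicit finite difference scheme. All parameters are arbitrary but chosen to satisfy the stability condition alpha*dt/dx^2 <= 0.5.

```python
# Minimal FDM sketch: the 1-D heat equation u_t = alpha * u_xx
# becomes an algebraic update rule on a grid. Parameters are
# arbitrary; r = alpha*dt/dx**2 must stay <= 0.5 for stability.
nx, alpha, dx, dt, steps = 50, 1.0, 0.02, 0.0001, 500

u = [0.0] * nx
u[nx // 2] = 1.0            # initial condition: a heat spike in the middle

r = alpha * dt / dx ** 2    # here r = 0.25, within the stable range
assert r <= 0.5

for _ in range(steps):
    new = u[:]
    for i in range(1, nx - 1):
        # central-difference replacement of u_xx at grid point i
        new[i] = u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
    u = new

print("peak temperature after diffusion:", max(u))
```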

CFD is a research tool and is also playing a strong role as a design tool, in aerodynamics as well as in non-aerospace applications such as automobiles, the flow field inside I.C. engines, industrial manufacturing (the actual behavior of molten metal in a mould), and civil and environmental engineering.

In the next decade CFD will become a critical technology for aerodynamic design; on the other hand, it enables concurrent engineering through drastic changes, a shortening of the design process, and optimization of air vehicle systems in terms of overall economic performance. CFD is a growth industry, with an unlimited number of new applications and new ideas just waiting in the future.

This seminar includes discretization, the Finite Difference Method (FDM), the Finite Volume Method (FVM), the Finite Element Method (FEM), grid generation and simulation, and the great advantage of CFD over wind tunnel testing. The seminar also highlights "AERODYNAMIC DESIGN OF A FORMULA ONE RACING CAR BY CFD" (a simulation technique).




Saturday, January 17, 2009


NON-CONVENTIONAL ENERGY SOURCES


ABSTRACT

Energy was, is & will remain the basic foundation which determines the stability of economic development of any nation. In a world, which is threatened with the non-availability of fuel, oil supplies & menace of nuclear war, one is bound to look into the future energy prospects with considerable skepticism. The tides are caused by the gravitational forces of moon & sun on the ocean water & are affected by the spinning of earth around its axis & the relative positions of earth, moon & sun. Tides are periodically renewed & hence tidal energy is renewable energy. This paper mainly emphasizes on capsizing the most of alternate power generation using Ocean Thermal Energy Conversion (OTEC) & Tidal Energy.

OCEAN THERMAL ENERGY CONVERSION (OTEC):

Oceans cover 71% of the earth's surface. On an average day, 60 million sq. km of tropical sea absorb an amount of solar radiation equivalent in heat content to about 245 billion barrels of oil. India has 15 million sq. meters of tropical water zone, and the OTEC potential available to India is 180,000 MW. OTEC is more convenient than wave and tidal energy, and the capacity of a single unit is up to 400 MW.


WORKING PRINCIPLE OF OTEC:

The surface waters of the ocean are at a temperature of 28-30ºC. This temperature decreases with depth, and at a depth of about 1000 m we get a temperature of 6-8ºC. This is a consequence of Lambert's law of radiation, according to which the intensity of radiation decreases exponentially with depth, so the surface receives more solar radiation (also called insolation) than the waters below. Moreover, warm water is less dense and hence stays at the top. Hence the temperature distribution in the ocean is stable, and the temperature difference remains more or less constant throughout the year.
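A quick back-of-the-envelope calculation (our own illustration, not from the paper) shows what this roughly 22ºC difference means thermodynamically: the Carnot limit puts a low ceiling on OTEC efficiency.

```python
# Back-of-the-envelope Carnot ceiling for an OTEC plant, using the
# surface (28 C) and deep-water (6 C) temperatures quoted above.
T_warm = 28 + 273.15   # surface water, in kelvin
T_cold = 6 + 273.15    # water at ~1000 m depth, in kelvin

eta = 1 - T_cold / T_warm
print(f"Carnot efficiency ceiling: {eta:.1%}")   # roughly 7%
```

Real plants operate well below even this ideal figure, which is why an OTEC plant must move enormous volumes of water to produce useful power.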


Friday, January 16, 2009


ADHESION OF DIAMOND COATINGS

ABSTRACT

Diamond is the hardest material known to man, and diamond coatings are emerging as a recent technique in the fields of cutting tools, optics, defense and medicine. Realizing this, various new techniques of diamond deposition are being developed. The paper tabulates different properties of diamond and the applications that depend on them.

The central focus of the article is the adhesion of diamond coatings on cutting tools. The mechanism of diamond growth and the factors affecting the adhesion of diamond are briefly discussed.

Composite electroplating and flame-assisted CVD have been studied as methods to improve the adhesion of diamond films, with respect to substrate material, pretreatment of the substrate and the experimental set-up.

The paper also describes the Abrasion Wear Test and the Indentation Test employed for testing the diamond films. The article concludes with the results of these tests and a discussion.

Thursday, January 15, 2009


TUBELESS TYRE TECHNOLOGY


INTRODUCTION

It has been over a century from the time Dunlop patented his 'mummified wheel' to the modern radial tyres of today. Yet with all the improvements a tyre has undergone, one thing has remained unchanged: only when it is inflated to the optimum level, and that inflation is kept constant, can it deliver maximum comfort and performance. This is one of the basic reasons all tyre manufacturers focus, at the development stage, on a tyre with the best air retention ability. Using a tube, an extra air container within the tyre, was regarded as the best solution for many years.

Mr. P.W. Litchfield demonstrating the manufacture of a tubeless tyre in the 50's at Akron

It may come as a surprise to many that in 1903, engineer Paul Weeks Litchfield, then in his early 20s, was granted a patent for the first 'tubeless' tyres. He later rose to be the chairman of the Board of Goodyear in the year 1940. Just like many other patents granted during that period, this concept was not pursued until late 1939, when the requirement for the first amphibious tyre was felt. The 120x33.5 - 66 smooth tread Marsh Buggy tyres, by far the largest tyres produced then, were used on Admiral Byrd's Snow Cruiser. This vehicle was capable of carrying very heavy loads over all sorts of terrain and could even float on water. These were off-the-road tyres: flexible but inextensible pressure vessels, pre-stressed and skin-stressed by air pressure. To produce such tyres, Goodyear at Akron employed Litchfield's idea, using nylon cords for the first time and a newly developed synthetic rubber compound called Chemigum to line the inner casing of the tyre, to lighten its weight and eliminate the tube.

The Second World War highlighted the need for reliable tyres, as loss of air or punctures cost precious moments or even endangered lives. Though the tubeless concept was not used during the war, tyres with a 'run-flat' capability were subsequently developed by introducing tubes with a special construction of sealant on the lower side, which allowed them to run without air loss even after a penetration. The added weight of the tube made the steering wheel heavy and restricted speed. They were used on low-speed trucks that travelled in areas with puncture hazards, such as wreckers' equipment, dock and warehouse vehicles, and other utility trucks.

To reduce weight, lifeguard tubes were introduced, having two air chambers: an outer rubber tube with a thick canvas tube inside. In case of a blowout, only the outer chamber gave way, while the reserve air in the thick canvas tube would not allow the tyre to deflate completely, allowing the vehicle to come to a safe, gradual, straight-line stop.

After the war, a more determined effort towards the elimination of the inner tube was made, as the tube was considered the main source of service trouble and failures while being clearly superfluous and costly. Experiments were therefore conducted both in the USA (initially by Goodrich) and in the UK (by Dunlop) towards providing a near-perfect seal between the tyre bead and the rim under all service conditions. This meant that the tyre had to run even at low inflation pressures, or with a penetration, for a safe distance without loss of vehicle control. It was in the year 1954 that the first commercially realised tubeless tyre was fitted as original equipment, by the now defunct Packard marque.

During the mid-1950s and early 1960s, India too manufactured tubeless tyres, which were not only supplied as original equipment for cars but also came in a number of sizes meant for the replacement market. While the rest of the world accepted this new technology, and by the middle of 1962 nearly all commercial vehicles, trucks and passenger cars used tubeless tyres, we in India reverted to the old tube-tyre theory. Even though most companies in India still manufacture tube-type tyres, many have the tubeless technology available with them and do manufacture tubeless tyres, meant for export only.

Tubeless tyres have reappeared on the Indian scene, but many users are reluctant to use them; some fit tubes in them. So which is actually better? Let us see where the construction difference lies. Apart from the basic construction, which remains the same (with the run of the cords distinguishing the type of tyre construction, whether cross-ply or radial-ply), the main difference lies in the inner liner of the carcass. Whereas in a tube-type construction the inner liner acts as a medium for reducing friction between the cord body and the tube, in a tubeless construction the liner itself is the tube. Thus the inner liner in a tubeless tyre is made of a halogenated butyl rubber, such as chlorobutyl or bromobutyl, for better air impermeability together with high heat and weather resistance.

Though the compounds used in tubeless and tube-type tyres may vary, the other major difference lies in the bead area of the tyre. In a radial tyre, both types have a flexible yet rigid bead, where the bead bundle is very thin and the stability of the tyre is enhanced by the bead apex or bead filler controlling it; in a tubeless tyre, the bead also has to maintain the air pressure within. Thus the bead heel in the tubeless tyre sits more tightly within the flange of the rim, and to ensure this tight fit most tyre manufacturers add an extra wrapping over the bead area. This enhances high-speed performance while achieving better cornering ability on the tubeless tyre.


Tuesday, January 13, 2009


NANOTECHNOLOGY

Abstract

Albert Einstein first proved that each molecule measures about a nanometer (a billionth of a meter) in diameter. And in 1959, it was Richard P. Feynman who predicted a technological world composed of self-replicating molecules whose purpose would be the production of nano-sized objects.


Almost a hundred years after Einstein's insight and 40 years after Feynman's initial proposition, the nanometer scale looms large on the research agenda. The semiconductor industry is edging closer to the world of Nanotechnology where components are miniaturized to the point of individual molecules and atoms.

Nanotechnology broadly refers to the manipulation of matter on the atomic and molecular scales. This technology enables creation of things one atom or molecule at a time. The possibilities with nanotechnology are enormous and are of great benefit to us.

This seminar report aims to discuss various concepts behind nanotechnology and the implementation of nanotechnology in the medical field and in other fields as well.

Introduction

Nanotechnology is the development and use of devices that have a size of only a few nanometres. Research has been carried out into very small components, which depend on electronic effects and may involve the movement of a countable number of electrons in their action. Such devices would act faster than larger components. Considerable interest has been shown in the production of structures on a molecular level by suitable sequences of chemical reactions. It is also possible to manipulate individual atoms on surfaces using a variant of the atomic force microscope.

Nanotechnology is the creation of functional materials, devices and systems through control of matter on the nanometer length scale (1-100 nanometers), and exploitation of novel phenomena and properties (physical, chemical, biological) at that length scale. For comparison, 10 nanometers is 1000 times smaller than the diameter of a human hair. A scientific and technical revolution has just begun based upon the ability to systematically organize and manipulate matter at nanoscale. Payoff is anticipated within the next 10-15 years.

A push is well underway to invent devices that manufacture at almost no cost, by treating atoms discretely, like computers treat bits of information. This would allow automatic construction of consumer goods without traditional labor, like a Xerox machine produces unlimited copies without a human retyping the original information.

Electronics is fueled by miniaturization. Working smaller has led to tools capable of manipulating individual atoms, much as the proteins in a potato manipulate the atoms of soil, air and water to make copies of the potato.

The shotgun marriage of chemistry and engineering called "Nanotechnology" is ushering in the era of self-replicating machinery and self-assembling consumer goods made from cheap raw atoms (Drexler, Merkle, paraphrased).


Nanotechnology is molecular manufacturing or, more simply, building things one atom or molecule at a time with programmed nanoscopic robot arms. A nanometer is one billionth of a meter (3 - 4 atoms wide). Utilizing the well understood chemical properties of atoms and molecules (how they "stick" together), nanotechnology proposes the construction of novel molecular devices possessing extraordinary properties. The trick is to manipulate atoms individually and place them exactly where needed to produce the desired structure.


Working at the resolution limit of matter, it will enable the ultimate in miniaturization and performance. By starting with cheap, abundant components--molecules--and processing them with small, high-frequency, high-productivity machines, it will make products inexpensive.


6-SIGMA: A LEAP TOWARDS QUALITY & PERFECTION


Abstract

6-Sigma is a quality management program which is not limited to any specific field; it covers all types of industries that strive to achieve the highest quality in their products. It was developed at Motorola in the early 1980s. It achieves quality by reducing the number of defects in the total quantity of manufactured product.

Its basic structure is formed of six levels. Each level signifies the maximum number of defects that can be allowed per million parts of a product. As we step up through the levels, the defect rate reduces. At the highest level, we can achieve a defect rate as small as 3.4 parts per million.
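The per-level defect figures can be reproduced from the normal distribution with the conventional 1.5-sigma long-term shift. The sketch below (our own illustration, assuming the third-party scipy package) prints the familiar numbers, ending at 3.4 defects per million for the 6th level.

```python
# Reproduce the conventional defects-per-million (DPMO) figure for
# each sigma level, applying the standard 1.5-sigma long-term shift.
# Assumes the third-party 'scipy' package.
from scipy.stats import norm

for level in range(1, 7):
    dpmo = norm.sf(level - 1.5) * 1_000_000   # tail beyond the shifted limit
    print(f"{level}-sigma: {dpmo:12.1f} defects per million")
# The 6-sigma line prints approximately 3.4.
```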

Achieving the 6th sigma level is not an easy job. It requires perfection at each and every step of production. This takes care of quality even in the smallest part of the production system and helps in achieving a world-class quality product.

Introduction to 6-sigma

6-Sigma is a structured, data-driven methodology that can be applied to any aspect of a business. It is used to improve customer satisfaction, eliminate waste and increase profit. It is a quality management program covering even the smallest details in the pursuit of better quality.

The initial 6-Sigma strategies were developed at Motorola in the early 1980s. The program was initiated because the company's management found that some of their products were losing competitiveness on quality and cost; they later found that this was because of poor quality, at only 3 or 4 sigma levels. The CEO of Motorola then started a company-wide program to improve quality to the 6-Sigma level, which is essentially 3.4 ppm, i.e. 3.4 defective parts in each million parts produced. The methodology only became well known after GE's Jack Welch made it a central focus of his business strategy in 1995.

The main thrust of 6-Sigma is the application of statistical tools in the context of a well-disciplined, easy-to-follow methodology. It also attacks a basic source of loss and profit issues: decision making. Many times, decisions are based solely on intuition or experience.

6-Sigma removes the personal touch from decisions by quantifying the issues and using statistics to provide probabilities of success and failure.

To apply such strategies in a business, one needs proper training.

People expert in these methods are called Black Belts. The whole initiative is broken up into projects, which are led by Green Belts.

The program is a culture change, with the Black Belts and Champions as the agents of change. It requires tenacity, mental toughness and, above all, dedication to the pursuit of perfection.

Monday, January 12, 2009


System Improvements through Total Quality Management (TQM) and Total Productive Maintenance (TPM) Techniques


ABSTRACT

System improvement through TQM, TPM, JIT and KANBAN is based on quality improvement. These techniques are used to improve the quality of a product with minimum cost of investment. Nowadays there is tremendous competition for any product in the market, and the product with the highest quality will absorb the whole market. Because of this competition, new techniques have been developed to improve quality.
Total Quality Management and Total Productive Maintenance techniques are used to maximize operational efficiency and to provide a strategic competitive edge to a business. Just In Time is a management approach that focuses the organization on continuously identifying and removing sources of waste.
As with any new technique, many questions arise. How do these techniques help to improve quality? What are the advantages of using them?
Some of the basic advantages of using these techniques are reduced finished and WIP inventory, shorter product flow times, increased worker productivity and lower production cost.


INTRODUCTION

Quality is a customer issue. The success of any organization depends on its product quality. This is because customers require products which not only meet their performance requirements but are also satisfactory in terms of safety, working life and pride of ownership. So every organization is looking for a new technique which, when applied, will give a better quality product without increasing its cost.

Total Quality Management and Total Productive Maintenance are the latest and most powerful techniques that can be applied to improve quality. These techniques have the potential to maximize operational efficiency, to provide a strategic competitive edge to a business, and to make the best use of all available resources and opportunities.


How does TQM correlate with TPM, JIT and KANBAN?

In a manufacturing organization, the achievement of quality standards is not restricted to the production department. It extends to all parts of the business, from conceptual design to marketing, and from order processing to distribution.

Thus, to include every employee, the concept of TPM evolved. Total Productive Maintenance is a means of creating a safe and participative work environment in which all employees target the elimination of losses in order to continuously enhance the capacity, flexibility and capability of processes, leading to higher employee morale and greater organizational profitability.

Thus these techniques aim at improving quality through overall improvement of the system and through elimination or reduction of waste in the system. Just In Time (JIT) is a management approach that focuses the organization on continuously identifying and removing sources of waste so that processes are continuously improved.

Kanban is a simple to operate visual control signal/tool to facilitate a ‘pull system’, i.e. JIT, which offers the opportunity to delegate routine material transactions on the shop floor.


DATA MIRRORING

The RAID (Redundant Array of Independent Disks)

ABSTRACT

There are various techniques that are used to achieve data recovery by means of data redundancy. RAID (Redundant Array of Independent Disks) devices use various data mirroring techniques. The chief advantage of mirroring is that it provides not only complete redundancy of data, but also reasonably fast recovery from a disk failure.

In this document we are going to cover the basic idea behind mirroring techniques.

Mirroring is actually the creation of another copy of the data, which can be retrieved immediately after a data loss.

Mirroring techniques help to improve read performance considerably. Techniques like network data mirroring, asynchronous data mirroring and remote data mirroring are considered for discussion in the following sections.

INTRODUCTION

Data mirroring is the act of copying data from one location to a storage device in real time. Because the data is copied in real time, the information stored on the mirror is always an exact copy of the data on the production device. Data mirroring is useful for the speedy recovery of critical data after a disaster, and it can be implemented locally or offsite at a completely different location.

Mirroring is one of the two data redundancy techniques used in RAID (the other being parity). In a RAID system using mirroring, all data in the system is written simultaneously to two hard disks instead of one; thus the "mirror" concept. The principle behind mirroring is that this 100% data redundancy provides full protection against the failure of either of the disks containing the duplicated data. Mirroring setups always require an even number of drives for obvious reasons.

The chief advantage of mirroring is that it provides not only complete redundancy of data, but also reasonably fast recovery from a disk failure. Since all the data is on the second drive, it is ready to use if the first one fails. Mirroring also improves some forms of read performance (though it actually hurts write performance). The chief disadvantage of RAID 1 is expense: the data duplication means half the space in the RAID is "wasted", so you must buy twice the capacity that you want to end up with in the array. Performance is also not as good as in some other RAID levels.
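As a toy illustration of the mirrored-write idea (real RAID 1 works at the block-device level, not in application code), the sketch below duplicates every write across two files, so either copy can serve reads if the other is lost. The file names are made up.

```python
# Toy RAID-1 sketch: every write is duplicated to two "disks"
# (ordinary files here), so either copy survives the loss of the
# other. Real mirroring happens at the block-device level.
class MirroredStore:
    def __init__(self, path_a, path_b):
        self.disks = [open(path_a, "wb"), open(path_b, "wb")]

    def write(self, data: bytes):
        for disk in self.disks:   # the same block goes to both disks
            disk.write(data)
            disk.flush()

    def close(self):
        for disk in self.disks:
            disk.close()

store = MirroredStore("disk_a.img", "disk_b.img")
store.write(b"critical business data")
store.close()
```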





Zero Configuration Networking

Introduction

The evolution of the IP standards suite has concentrated on achieving a reliable and scalable networking architecture. Emphasis has always been placed on mechanisms that allow decentralized administration. Individual networks have been operated with local configuration, while Internet-wide configuration has been coordinated through different agencies handling registration of domain names, network numbers, and other parameters.

Network operation requires consistent configuration of all hosts and servers and normally requires centralized, knowledgeable network administration and increasingly complex configuration management services.

It is important to note that there are other networking protocol suites with different priorities and deployment characteristics. In particular the AppleTalk Protocol Suite is simple to operate in small networks, though it requires system administration in larger deployments. Automatic configuration and ease of deployment were of primary importance to AppleTalk's designers. As a consequence, AppleTalk networks can be and are used in homes, schoolrooms, small offices, and conference rooms. In short, AppleTalk is successful in environments where IP networks have been absent, since IP networks have been too complicated and costly to administer. One advantage AppleTalk has over IP is that it functions even in networks where no services have been deployed.

As computers become cheaper and more pervasive, the obvious thing to do is network them together. Many companies are investigating adding features to their products to allow communication between consumer electronics, home appliances, personal computers, telecommunications devices, and more. Until the IP suite becomes as easy to operate as the AppleTalk Protocol Suite, the notion of a networked home or office using IP is impractical. There are many new networking protocols that offer easy deployment (for devices using that technology) in very simple network topologies. This diversity complicates the integration of communicating entities into a single network -- precisely what the IP suite is supposed to achieve.

Several computer software companies have taken the initiative to enhance the IP suite to address this challenge. The IETF has begun work on zero configuration networking for IP. The goal is to allow hosts to communicate using IP without requiring any prior configuration or the presence of network services. True to the traditional architectural principles of the IP suite, care is being taken to ensure that zero configuration networking protocols and operation do not detract from the scalability of larger configured networks with fully administered services.
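For a taste of what zero configuration networking later came to look like in practice, here is a sketch using the third-party python-zeroconf package, one implementation of the multicast-DNS service discovery approach. The service name, address and port are made-up examples, not anything from this paper.

```python
# Sketch: advertising a service with zero configuration, using the
# third-party 'zeroconf' package (an mDNS/DNS-SD implementation).
# The service name, address and port are illustrative.
import socket
import time

from zeroconf import ServiceInfo, Zeroconf

info = ServiceInfo(
    "_http._tcp.local.",
    "demo-printer._http._tcp.local.",
    addresses=[socket.inet_aton("192.168.1.42")],
    port=8080,
    properties={"path": "/"},
)

zc = Zeroconf()
zc.register_service(info)      # announce on the local link; no server needed
try:
    time.sleep(60)             # stay discoverable for a minute
finally:
    zc.unregister_service(info)
    zc.close()
```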

This paper discusses the services and configuration essential for IP networking and surveys the IP suite standards for providing configuration to IP-enabled devices. The central topic of this paper is the emergence of protocols for operation without services or configuration. Work in the area of zero configuration protocols has been motivated by new demands in the marketplace. Some architectural principles are generally agreed upon and undergird the development of standards in this area. Finally, the paper summarizes the status of this work and concludes by discussing implications for the future.

Sunday, January 11, 2009


BLUETOOTH

Abstract:

Communication can take many forms: audio, visual, written, electronic and so on. In the realm of electronics, analog and digital communications are so pervasive in modern society that they are taken for granted. The exchange of data using these forms of communication has led to the use of terms such as "information industry" and "information age". From telephones to computers to televisions, communication in many respects makes the world go around. Bluetooth wireless technology is one form of electronic communication. In this paper we present some of its fundamentals and specific characteristics.

Describing the main concepts of Bluetooth, the paper proceeds in such a fashion that by the end the reader will be quite familiar with the technology. Starting with an introduction, the paper illustrates various technology details in order to let the reader understand the various features of this technology.

Bluetooth has emerged as the best solution for wireless communication between mobile devices such as laptops and mobile phones. In this paper we have included a few such important applications.




Wednesday, January 7, 2009


NETWORK SECURITY

ABSTRACT


Threats to information security are appearing more frequently, and are of greater magnitude, than ever before. Threats come from both internal and external sources. They can be online or local, and range from accidental, to malicious, to criminal attacks that can expose your most sensitive business information to unauthorized use, disclosure, modification or even total loss.
Network security is a complicated subject, historically tackled only by well-trained and experienced experts. However, as more and more people become "wired", an increasing number of people need to understand the basics of security in a networked world. This paper is written with the basic computer user and information systems manager in mind, explaining the concepts needed to read through the hype in the marketplace and to understand risks and how to deal with them.
This paper discusses the importance of security, threats to data, who the enemies are, what these enemies can do, and security tools. We go on to consider risk management, network threats, firewalls, and more special-purpose secure networking devices.




ENERGY EFFICIENT BUILDINGS

AN ANSWER TO OPTIMUM ENERGY UTILIZATION

ABSTRACT

The built environment has a huge impact on our economy, environment, health and productivity. Construction is one of the largest activities imposing a major impact on various environmental and energy-related problems. The building sector is growing at a rapid pace and is the third largest consumer of energy, after industry and agriculture. In spite of various efforts at achieving sustainable structures, buildings still consume a major portion of the available energy. So, to address these energy-related sustainability issues in a holistic manner, the focus the world over today is on energy efficient buildings. These are buildings that maximize system efficiency and minimize the use of natural resources. It is basically an integrated process to reduce energy demand and maximize efficiency. An energy efficient structure promises a 30-40% reduction in energy cost and so is a sound answer to the ever-rising energy problems.

This paper mainly throws light on the issues related to energy efficient structures, with the main emphasis on the various methodologies and materials adopted to make a structure energy efficient and eco-friendly (with special emphasis on thermal comfort conditions). A brief review of an energy efficient structure is also provided with reference to its climatic zone.


Sunday, January 4, 2009


Digital TV

Introduction

The Internet has changed the way we do business, and the next revolution is Digital TV, which is changing the way we live. The Internet was a dress rehearsal for Digital TV, which is bringing in all the convergence. TV viewing will be a different and rich experience, as it is going to be highly interactive compared to conventional passive viewing. According to Allied Business Intelligence, 242 million TVs will be digital by 2005, up from 40 million in 2000. The number of TV installations globally today exceeds 1 billion, so there is huge potential in converting all these TVs to digital. Many governments are passing regulations to make digital TV mandatory. In the US, all commercial broadcasting will be digital by 2003, and by 2006 there won't be any analog transmission.


Thursday, January 1, 2009


EMBEDDED APPLICATIONS DESIGN USING REAL-TIME OS


INTRODUCTION

You read about it everywhere: distributed computing is the next revolution, perhaps relegating our desktop computers to the museum. But in fact the age of distributed computing has been with us for quite a while. Every time we withdraw money from an ATM, start our car, use our cell phone, or microwave our dinner, microprocessors are at work performing dedicated functions. These are just a very few examples of the thousands of "embedded systems."

Until recently, the vast majority of these embedded systems used 8- and 16-bit microprocessors, requiring little in the way of sophisticated software development tools, including an Operating System (OS). But 32-bit processors are now driving an explosion in high-volume embedded applications. And a new trend towards integrating a full system-on-a-chip (SOC) promises a further dramatic expansion of 32-bit embedded applications as we head into the 21st century.

Aerospace companies were the first end-markets for embedded systems, driven by military and space applications. But that market never developed the growth potential of the newer commercial applications. In the past five years, the major end-markets for embedded systems have been telecommunications, computer networking, and office equipment. But now we see consumer and automotive electronics as major emerging markets. And looming on the horizon is the expected wave of networked appliances, with Sun Microsystems, IBM, Microsoft and others targeting products and standards, envisioning billions of embedded network connections running embedded Java applications across a network.