


Optical Computers

Added on: February 26th, 2012 by Afsal Meerankutty

Computers have enhanced human life to a great extent. The goal of improving computer speed has resulted in the development of Very Large Scale Integration (VLSI) technology, with smaller device dimensions and greater complexity.

VLSI technology has revolutionized the electronics industry, and our daily lives increasingly demand solutions to sophisticated and complex problems, which require more speed and better performance from computers.

For these reasons, it is unfortunate that VLSI technology is approaching its fundamental limits in the sub-micron miniaturization process. It is now possible to fit up to 300 million transistors on a single silicon chip. Moore's law estimates that the number of transistor switches that can be put onto a chip doubles every 18 months. Further miniaturization of lithography introduces several problems such as dielectric breakdown, hot carriers, and short-channel effects. All of these factors combine to seriously degrade device reliability. Even if developing technology succeeded in temporarily overcoming these physical problems, we would continue to face them as long as the demand for higher integration keeps increasing. Therefore, a dramatic solution to the problem is needed: unless we gear our thoughts toward a totally different pathway, we will not be able to further improve computer performance in the future.
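The 18-month doubling can be written as a simple growth formula; a minimal sketch (the 300-million starting point is taken from the figure above):

```python
def transistors(initial, months, doubling_period=18):
    """Projected transistor count after `months`, doubling every `doubling_period` months."""
    return initial * 2 ** (months / doubling_period)

# Starting from 300 million transistors, a 3-year (36-month) projection:
print(round(transistors(300e6, 36)))  # → 1200000000, a four-fold increase
```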

Optical interconnections and optical integrated circuits will provide a way out of these limitations to computational speed and complexity inherent in conventional electronics. Optical computers will use photons traveling in optical fibers or thin films, instead of electrons, to perform the appropriate functions. In the optical computer of the future, electronic circuits and wires will be replaced by a few optical fibers and films, making the systems more efficient with no interference, more cost effective, lighter and more compact. Optical components would not need insulators like those required between electronic components, because they don't experience crosstalk. Indeed, multiple frequencies (or different colors) of light can travel through optical components without interfering with each other, allowing photonic devices to process multiple streams of data simultaneously.


Biochips

Added on: February 26th, 2012 by Afsal Meerankutty

Biochips were invented nine years ago by gene scientist Stephen Fodor. In a flash of light he saw that photolithography, the process used to etch semiconductor circuits into silicon, could also be used to assemble particular DNA molecules on a chip.

The human body is the next big target of chip makers. Medical researchers have long been working to integrate the human body with chips, and biochips that can be implanted in the human body are expected within a short period. Integration of humans and chips would be achieved this way.

Money and research have already gone into this area of technology, and such implants are already being experimented with in animals. A simple chip has been implanted in tens of thousands of animals, especially pets.

Expert Systems

Added on: February 26th, 2012 by Afsal Meerankutty

An intelligent fault diagnosis and operator support system targeting the safer operation of generators and distribution substations in power plants is introduced in this paper. Based on Expert Systems (ES) technology, it incorporates a number of rules for the real-time state estimation of the generator's electrical part and the distribution substation topology. Within every sampling cycle the estimated state is compared to an a priori state formed from measurements and digital signals coming from current and voltage transformers as well as the existing electronic protection equipment. Whenever a conflict between the estimated and measured state arises, a set of heuristic rules is activated to infer and report the fault scenario. An included SCADA system helps operators process large amounts of data quickly, thanks to the user-friendly graphical representation of the monitored system. Enhanced with many heuristic rules, the proposed knowledge-based system goes beyond imitation of expert operators' knowledge, being able to infer fault scenarios concerning even components such as the power electronic circuits of the generator excitation system. For example, abnormal measurements on the generator's terminals can activate rules that generate fault hypotheses possibly related to abnormal switching of the excitation thyristors.
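The compare-then-infer cycle described above can be sketched in a few lines; the signal names and the single heuristic rule below are hypothetical illustrations, not the paper's actual rule base:

```python
def diagnose(estimated, measured, rules):
    """Return fault hypotheses for every signal where estimate and measurement conflict."""
    hypotheses = []
    for signal, value in measured.items():
        if estimated.get(signal) != value:          # conflict detected this cycle
            for rule in rules:                       # activate heuristic rules
                hypothesis = rule(signal, estimated.get(signal), value)
                if hypothesis:
                    hypotheses.append(hypothesis)
    return hypotheses

# Hypothetical heuristic rule: an abnormal terminal voltage suggests
# abnormal switching of the excitation thyristors.
def excitation_rule(signal, expected, actual):
    if signal == "terminal_voltage" and expected != actual:
        return "possible abnormal switching of excitation thyristors"

faults = diagnose({"terminal_voltage": "normal", "breaker": "closed"},
                  {"terminal_voltage": "low", "breaker": "closed"},
                  [excitation_rule])
print(faults)  # ['possible abnormal switching of excitation thyristors']
```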

Interstate Hydrogen Highway

Added on: February 26th, 2012 by Afsal Meerankutty

The Interstate Hydrogen Highway was proposed by Justin Eric Sutton. This highway mainly depends on hydrogen and water. Electricity is produced when sunlight strikes electro-photovoltaic (EPV) panels; that electricity is then used to convert distilled water into hydrogen and oxygen. While the oxygen could be bottled and sold cheaply, the hydrogen would serve as a "battery," stored in compressed form in cooling tanks adjacent to the traveler system in utility centers. Electricity is recovered from the hydrogen using fuel-cell technology. Electricity generated on the hydrogen highway by the Magnetic Levitation (MAGLEV) system may be used for other power needs, such as utility stations, access stations, lighting and maintenance, and the rest can be used for domestic supply.

A certain amount of hydrogen would be stored each day to cover night-time travel and weather-related burdens. The speed of the trailblazer on the hydrogen highway is 250-300 mph. All it takes is $15,000,000 per mile and $250,000 per rail car. An eventual system size of nearly 54,000 miles would yield as much as 45 billion watts of continuous electrical power.
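A back-of-the-envelope check of the figures quoted above (track cost and watts per mile follow directly from the quoted numbers):

```python
cost_per_mile = 15_000_000       # $15,000,000 per mile
miles = 54_000                   # eventual system size
power_watts = 45e9               # 45 billion watts of continuous power

total_cost = cost_per_mile * miles
print(f"track cost: ${total_cost:,}")                 # $810,000,000,000
print(f"watts per mile: {power_watts / miles:,.0f}")  # 833,333
```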

Quantum Mirage

Added on: February 26th, 2012 by Afsal Meerankutty

Since it first appeared on the cover of Nature in February 2000, the "quantum mirage" has featured on posters, calendars, websites and the covers of various books and magazines. The image, which was obtained using a scanning tunnelling microscope, shows the electronic wavefunctions inside an elliptical "quantum corral" made of cobalt atoms on a copper surface. It was created by Hari Manoharan, Christopher Lutz and Don Eigler of the IBM Almaden Research Center in California. In 1990, working with Erhard Schweizer, Eigler spelt out the letters "IBM" using 35 xenon atoms. And three years later, working with Lutz and Michael Crommie, he released the first images of the "quantum corral", which have also been reproduced in numerous places.

The quantum mirage uses the wave nature of electrons to move the information, instead of a wire, so it has the potential to enable data transfer within future nano-scale electronic circuits so small that conventional wires do not work. It will be years before this technology becomes practical, but it could eventually yield computers that are many orders of magnitude smaller, faster, and less power-hungry than anything we can conceive today.
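The "projection" of information relies on the focus geometry of the elliptical corral: electron waves leaving one focus reconverge at the other, so an atom placed at one focus produces a mirage at the second. A sketch of the geometry (the semi-axis values are illustrative, not the experiment's actual dimensions):

```python
import math

# Foci of an ellipse sit at x = +/-c on the major axis, with c = sqrt(a^2 - b^2).
a, b = 7.13, 4.0   # semi-major / semi-minor axes in nm (assumed values)
c = math.sqrt(a**2 - b**2)
print(f"atom at x = +{c:.2f} nm, mirage appears at x = -{c:.2f} nm")
```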

Rapid Prototyping

Added on: February 26th, 2012 by Afsal Meerankutty

The term rapid prototyping (RP) refers to a class of technologies that can automatically construct physical models from Computer-Aided Design (CAD) data. It is also called Desktop Manufacturing or Freeform Fabrication. These technologies enable us to make even complex prototypes that act as an excellent visual aid to communicate with co-workers and customers. These prototypes are also used for design testing.

Why Rapid Prototyping?

  • Objects can be formed with any geometric complexity or intricacy without the need for elaborate machine set-up or final assembly.
  • Freeform fabrication systems reduce the construction of complex objects to a manageable, straightforward, and relatively fast process.
  • These techniques are currently being advanced further to such an extent that they can be used for low-volume, economical production of parts.
  • They significantly cut costs as well as development times.

Air Car

Added on: February 26th, 2012 by Afsal Meerankutty

The Air Car is a car currently being developed, and eventually to be manufactured, by Moteur Developpement International (MDI), founded by the French inventor Guy Nègre. It will be sold by this company as well as by ZevCat, a US company based in California.

The air car is powered by an air engine specifically tailored for the car. The air engine is manufactured by CQFD Air Solution, a company closely linked to MDI.

The engine is powered by compressed air, stored in a glass or carbon-fibre tank at 4500 psi. The engine has injection similar to normal engines, but uses special crankshafts and pistons, which remain at top dead center for about 70% of the engine’s cycle; this allows more power to be developed in the engine.
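The energy such a tank can hold can be estimated from the isothermal compression work E = P·V·ln(P/P0); a rough sketch, where the 300-litre tank volume is an assumption and not a figure from the text:

```python
import math

P0 = 101_325            # atmospheric pressure, Pa
psi_to_pa = 6894.76
P = 4500 * psi_to_pa    # tank pressure from the text, ~310 bar
V = 0.300               # assumed tank volume: 300 litres (not from the text)

# Isothermal energy stored in the compressed-air tank.
energy_joules = P * V * math.log(P / P0)
print(f"{P / 1e5:.0f} bar, {energy_joules / 3.6e6:.1f} kWh")  # 310 bar, 14.8 kWh
```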

Though some consider the car to be pollution-free, it must be taken into account that the tanks are recharged using electric (or gasoline) compressors, resulting in some pollution if the electricity used to operate the compressors comes from polluting power plants (such as gas- or coal-fired plants). Solar power could possibly be used to run the compressors at fuel stations.

Crusoe Processor

Added on: February 26th, 2012 by Afsal Meerankutty

Mobile computing has been the buzzword for quite a long time. Mobile computing devices like laptops and notebook PCs are becoming common nowadays. The heart of every PC, whether desktop or mobile, is the microprocessor. Several microprocessors are available in the market for desktop PCs from companies like Intel, AMD and Cyrix. The mobile computing market has never had a microprocessor specifically designed for it; the microprocessors used in mobile PCs are optimized versions of desktop PC microprocessors.

Mobile computing makes very different demands on processors than desktop computing. Desktop PC processors consume lots of power, and they get very hot. When you're on the go, a power-hungry processor means you have to pay a price: run out of power before you've finished, or run through the airport with pounds of extra batteries. A hot processor also needs fans to cool it, making the resulting mobile computer bigger, clunkier and noisier. The market will still reject a newly designed microprocessor with low power consumption if the performance is poor, so any attempt in this regard must strike a proper performance-power balance to ensure commercial success. A newly designed microprocessor must also be fully x86 compatible, that is, it should run x86 applications just like conventional x86 microprocessors, since most presently available software has been designed to work on the x86 platform.

Crusoe is a new microprocessor designed specially for the mobile computing market, with the above-mentioned constraints in mind. A small Silicon Valley startup company called Transmeta Corp. developed this microprocessor.

The concept of Crusoe is well understood from a simple sketch of the processor architecture, called the 'amoeba'. In this concept, the x86 architecture is an ill-defined amoeba containing features like segmentation, ASCII arithmetic and variable-length instructions. Thus Crusoe was conceptualized as a hybrid microprocessor: it has a software part and a hardware part, with the software layer surrounding the hardware unit. The role of the software is to act as an emulator, translating x86 binaries into native code at run time. Crusoe is a 128-bit microprocessor fabricated using a CMOS process. The chip's design is based on a technique called VLIW to ensure design simplicity and high performance. The other two technologies used are Code Morphing Software and LongRun Power Management. The Crusoe hardware can be changed radically without affecting legacy x86 software: for the initial Transmeta products, models TM3120 and TM5400, the hardware designers opted for minimal space and power.
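The code-morphing idea of translate-once, cache, and reuse can be illustrated with a toy interpreter; the "ISA" here is a made-up stand-in, not Transmeta's actual translator:

```python
class CodeMorpher:
    def __init__(self):
        self.cache = {}   # translation cache: block address -> native code

    def translate(self, block):
        # Translation is the expensive step; here each op just becomes a lambda.
        ops = {"inc": lambda x: x + 1, "dec": lambda x: x - 1, "dbl": lambda x: x * 2}
        return [ops[op] for op in block]

    def run(self, address, block, value):
        if address not in self.cache:          # translate only on first sight
            self.cache[address] = self.translate(block)
        for native_op in self.cache[address]:  # replay cached "native" code
            value = native_op(value)
        return value

vm = CodeMorpher()
print(vm.run(0x1000, ["inc", "dbl"], 5))  # 12
print(vm.run(0x1000, ["inc", "dbl"], 5))  # 12 again, served from the cache
```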

Mine Detection Using Radar Bullets

Added on: February 26th, 2012 by Afsal Meerankutty

Land mines are a serious threat to the lives of civilians, especially in mine-affected countries like Afghanistan and Iraq. Mines implanted during wartime may remain undetected for several decades and may suddenly be activated after that. There are several methods for the detection of land mines, such as metal detection and explosive detection. These methods are dangerous because they are carried out very close to the mine.

A safer method for detecting land mines is "mine detection using radar bullets". As the name suggests, detection is done using radar bullets and hence can be done from further away from the mine. Detection is usually done from helicopters. Research is being conducted to overcome certain inefficiencies of this method.


iDEN

Added on: February 25th, 2012 by Afsal Meerankutty

iDEN is a mobile telecommunications technology, developed by Motorola, which provides its users the benefits of a trunked radio and a cellular telephone. iDEN places more users in a given spectral space, compared to analog cellular and two-way radio systems, by using speech compression and time division multiple access (TDMA). Notably, iDEN is designed, and licensed, to operate on individual frequencies that may not be contiguous. iDEN operates on 25 kHz channels, but only occupies 20 kHz in order to provide interference protection via guard bands. By comparison, TDMA cellular (IS-54 and IS-136) is licensed in blocks of 30 kHz channels, but each emission occupies 40 kHz, and is capable of serving the same number of subscribers per channel as iDEN. iDEN supports either three or six interconnect users (phone users) per channel, and either six or twelve dispatch users (push-to-talk users) per channel. Since there is no analogue component of iDEN, mechanical duplexing in the handset is unnecessary, so time-domain duplexing is used instead, the same way that other digital-only technologies duplex their handsets. Also, like other digital-only technologies, hybrid or cavity duplexing is used at the base station (cell site).
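Quick arithmetic on the channel figures above, taking the higher interconnect figure for the users-per-kHz estimate:

```python
channel_khz = 25           # licensed iDEN channel width
occupied_khz = 20          # occupied emission width
guard_khz = channel_khz - occupied_khz
print(guard_khz / 2)       # 2.5 kHz of guard band on each side

interconnect_users = 6     # phone users per channel (upper figure from the text)
print(interconnect_users / channel_khz)   # 0.24 phone users per licensed kHz
```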

Microwave Superconductivity

Added on: February 25th, 2012 by Afsal Meerankutty

Superconductivity is a phenomenon occurring in certain materials, generally at very low temperatures, characterized by exactly zero electrical resistance and the exclusion of the interior magnetic field (the Meissner effect). It was discovered by Heike Kamerlingh Onnes in 1911. By applying the principle of superconductivity in the microwave and millimeter-wave (mm-wave) regions, components with superior performance can be fabricated. A major problem in the early days was that the cryogenic burden was perceived as too great compared to the performance advantage that could be realized. There were very specialized applications, such as low-noise microwave and mm-wave mixers and detectors for highly demanding radio astronomy applications, where the performance gained was worth the effort and complexity. With the discovery of high-temperature superconductors like the copper oxides, rapid progress was made in the field of microwave superconductivity.

This topic describes the properties of superconductivity that can be exploited in microwave and mm-wave technologies to yield components with appreciable performance enhancement over conventional systems. Superconducting signal transmission lines can yield low-attenuation, zero-dispersion signal transmission behavior for signals with frequency components less than about one tenth of the superconducting energy gap. No other known microwave device technology can provide similar behavior. Superconductors have also been used to make high-speed digital circuits, Josephson junctions, and RF and microwave filters for mobile phone base stations.
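The one-tenth-of-the-gap limit can be put in numbers. Niobium is assumed here as the example material (the text does not name one), with its commonly quoted low-temperature gap of 2Δ ≈ 3.05 meV:

```python
h = 6.626e-34            # Planck constant, J*s
eV = 1.602e-19           # joules per electron-volt
gap_2delta_eV = 3.05e-3  # 2*delta for niobium at low temperature (assumed material)

# Gap frequency f = 2*delta / h; usable transmission-line band is ~f/10.
f_gap = gap_2delta_eV * eV / h
print(f"gap frequency ~{f_gap / 1e9:.0f} GHz, usable up to ~{f_gap / 10 / 1e9:.0f} GHz")
```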

Beamed Power Transmission

Added on: February 24th, 2012 by Afsal Meerankutty

The desire of modern man for more and more amenities and sophistication has led to the unscrupulous exploitation of natural treasures. Though nature has provided abundant resources, they are not unlimited, and their exhaustion is imminent. The only exception to this is sunlight.

Scientists who understood this naked truth thought of exploiting solar energy and started experimenting in this direction as early as 1970. But progress was very slow, and much headway is yet to be made. However, as an important source of non-conventional energy, and given the limited sources of conventional energy, emphasis has been given to the better utilization of solar energy.

With the application of solar cells (photovoltaic cells) we are able to convert only a small percentage of solar energy into electrical energy. But by using beamed power transmission from a solar power satellite we can envisage a higher percentage of conversion. By beamed power transmission, we can extend the present two-dimensional transmission network to three dimensions, and it does not pose any environmental problem as well.


Memristor

Added on: February 24th, 2012 by Afsal Meerankutty

Generally, when most people think about electronics, they may initially think of products such as cell phones, radios and laptop computers. Others, having some engineering background, may think of resistors, capacitors and the like, which are the basic components necessary for electronics to function. Such basic components are fairly limited in number, and each has its own characteristic function.

Memristor theory was formulated and named by Leon Chua in a 1971 paper. Chua strongly believed that a fourth device existed to provide conceptual symmetry with the resistor, inductor, and capacitor. This symmetry follows from the description of basic passive circuit elements as defined by a relation between two of the four fundamental circuit variables. A device linking charge and flux (themselves defined as time integrals of current and voltage), which would be the memristor, was still hypothetical at the time. However, it would not be until thirty-seven years later, on April 30, 2008, that a team at HP Labs led by the scientist R. Stanley Williams would announce the discovery of a switching memristor. Based on a thin film of titanium dioxide, it has been presented as an approximately ideal device.
The reason that the memristor is radically different from the other fundamental circuit elements is that, unlike them, it carries a memory of its past. When you turn off the voltage to the circuit, the memristor still remembers how much was applied before and for how long. That’s an effect that can’t be duplicated by any circuit combination of resistors, capacitors, and inductors, which is why the memristor qualifies as a fundamental circuit element.
The arrangement of these few fundamental circuit components form the basis of almost all of the electronic devices we use in our everyday life. Thus the discovery of a brand new fundamental circuit element is something not to be taken lightly and has the potential to open the door to a brand new type of electronics. HP already has plans to implement memristors in a new type of non-volatile memory which could eventually replace flash and other memory systems.
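The memory effect can be sketched with the widely cited linear ion-drift model of the titanium-dioxide memristor: the memristance depends on the doped fraction w/D of the film, and that fraction moves with the total charge that has flowed. The parameter values below are illustrative assumptions, not HP's published figures:

```python
R_ON, R_OFF = 100.0, 16_000.0   # resistance when fully doped / fully undoped (ohms)
D = 10e-9                        # film thickness, m (assumed)
mu = 1e-14                       # dopant mobility, m^2 s^-1 V^-1 (assumed)

def memristance(q):
    """Memristance after a total charge q has passed through the device."""
    w_frac = min(max(mu * R_ON * q / D**2, 0.0), 1.0)   # doped fraction w/D
    return R_ON * w_frac + R_OFF * (1.0 - w_frac)

print(memristance(0.0))    # 16000.0: no charge has flowed yet
print(memristance(5e-5))   # 8050.0: the device "remembers" the charge so far
```

Turning the voltage off leaves q, and hence the resistance, where it was, which is exactly the memory behavior described above.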

Blue Gene

Added on: February 24th, 2012 by Afsal Meerankutty

Blue Gene/L (BG/L) is a 64K (65,536) node scientific and engineering supercomputer that IBM is developing with partial funding from the United States Department of Energy. This paper describes one of the primary BG/L interconnection networks, a three-dimensional torus. We describe a parallel performance simulator that was used extensively to help architect and design the torus network, and present sample simulator performance studies that contributed to design decisions. In addition to such studies, the simulator was also used during the logic verification phase of BG/L for performance verification, and its use there uncovered a bug in the VHDL implementation of one of the arbiters. BG/L is a scientific and engineering, message-passing supercomputer that IBM is developing with partial funding from the U.S. Department of Energy Lawrence Livermore National Laboratory. A 64K node system is scheduled to be delivered to Livermore, while a 20K node system will be installed at the IBM T.J. Watson Research Center for use in life sciences computing, primarily protein folding. A more complete overview of BG/L may be found in [1], but we briefly describe the primary features of the machine.
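In a 3-D torus each node has six neighbours, reached by stepping ±1 along each axis with wraparound. A minimal sketch; the 64 × 32 × 32 layout below is one plausible arrangement of 65,536 nodes, as the text does not give the dimensions:

```python
def torus_neighbors(node, dims):
    """The six wraparound neighbours of a node in a 3-D torus."""
    x, y, z = node
    X, Y, Z = dims
    return [((x + 1) % X, y, z), ((x - 1) % X, y, z),
            (x, (y + 1) % Y, z), (x, (y - 1) % Y, z),
            (x, y, (z + 1) % Z), (x, y, (z - 1) % Z)]

dims = (64, 32, 32)                            # 65,536 nodes (assumed layout)
print(len(torus_neighbors((0, 0, 0), dims)))   # 6
print(torus_neighbors((0, 0, 0), dims)[1])     # (63, 0, 0): the wraparound link
```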

Image Authentication Techniques

Added on: February 24th, 2012 by Afsal Meerankutty

This paper explores the various techniques used to authenticate the visual data recorded by automatic video surveillance (VS) systems. Automatic video surveillance systems are used for continuous and effective monitoring and reliable control of remote and dangerous sites. Some practical issues must be taken into account in order to take full advantage of the potential of VS systems. The validity of visual data acquired, processed and possibly stored by the VS system as proof in front of a court of law is one such issue.

But visual data can be modified using sophisticated processing tools without leaving any visible trace of the modification. So digital video or image data have no value as legal proof, since doubt would always exist that they had been intentionally tampered with to incriminate or exculpate the defendant. Besides, video data can be created artificially by computerized techniques such as morphing. Therefore the true origin of the data must be indicated if they are to be used as legal proof. By data authentication we mean here a procedure capable of ensuring that data have not been tampered with and of indicating their true origin.
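One standard way to realize both goals, integrity and origin, is a keyed message authentication code over each frame; this is a generic sketch, not the paper's specific scheme:

```python
import hashlib
import hmac
import os

key = os.urandom(32)   # secret key held by the camera

def sign_frame(frame_bytes):
    """Tag a frame; only a holder of the key can produce a valid tag."""
    return hmac.new(key, frame_bytes, hashlib.sha256).hexdigest()

def verify_frame(frame_bytes, tag):
    """Any change to the pixels invalidates the tag."""
    return hmac.compare_digest(sign_frame(frame_bytes), tag)

frame = b"\x00\x01\x02\x03"   # stand-in for raw image data
tag = sign_frame(frame)
print(verify_frame(frame, tag))                 # True: untampered, known origin
print(verify_frame(frame + b"edit", tag))       # False: modification detected
```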


Hovercraft

Added on: February 24th, 2012 by Afsal Meerankutty

Hovercraft are vehicles designed to travel close to but above ground or water. These vehicles are supported in various ways. Some of them have a specially designed wing that lifts them just off the surface over which they travel once they have reached a sufficient horizontal speed (the ground effect). Hovercraft are usually supported by fans that force air down under the vehicle to create lift; air propellers, water propellers, or water jets usually provide forward propulsion. Air-cushion vehicles can attain higher speeds than either ships or most land vehicles, and use much less power than helicopters of the same weight. Air-cushion suspension has also been applied to other forms of transportation, in particular trains, such as the French Aérotrain and the British hovertrain.

A hovercraft is a transportation vehicle that rides slightly above the earth's surface. Air is continuously forced under the vehicle by a fan, generating the cushion that greatly reduces friction between the moving vehicle and the surface. The air is delivered through ducts and injected at the periphery of the vehicle in a downward and inward direction. This type of vehicle can ride equally well over ice, water, marsh, or relatively level land.
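Since the cushion supports the craft's weight, the required cushion pressure is simply weight divided by cushion area. A sketch with assumed figures for a small recreational hovercraft (mass and footprint are illustrative, not from the text):

```python
mass_kg = 250.0            # assumed craft-plus-driver mass
g = 9.81                   # gravitational acceleration, m/s^2
area_m2 = 2.0 * 4.0        # assumed 2 m x 4 m cushion footprint

pressure_pa = mass_kg * g / area_m2
print(f"cushion pressure: {pressure_pa:.0f} Pa")   # ~307 Pa, far below 1 atm
```

The tiny pressure compared to atmospheric is why a modest fan suffices for lift.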

Plastics Explosion

Added on: February 24th, 2012 by Afsal Meerankutty

Synthetic polymers are often referred to as “plastics”, such as the well-known polyethylene and nylon. However, most of them can be classified in at least three main categories: thermoplastics, thermosets and elastomers.

Man-made polymers are used in a bewildering array of applications: food packaging, films, fibers, tubing, pipes, etc. The personal care industry also uses polymers to aid in texture of products, binding, and moisture retention (e.g. in hair gel and conditioners).

Plastics explosion: Acrylic, Polyethylene, etc…

Voice Browsers

Added on: February 24th, 2012 by Afsal Meerankutty

Browser technology is changing very fast these days, and we are moving from the visual paradigm to the voice paradigm. The voice browser is the technology for entering this paradigm. A voice browser is a "device which interprets a (voice) markup language and is capable of generating voice output and/or interpreting voice input, and possibly other input/output modalities." This paper describes the requirements for two forms of character-set grammar, offered as a matter of preference or implementation; one is more easily read by (most) humans, while the other is geared toward machine generation.


AgentOS

Added on: February 23rd, 2012 by Afsal Meerankutty

The Internet is rapidly becoming an integral aspect of the desktop computer, if it has not already become so. The Internet can be visualized as a worldwide information repository enabling resource sharing on a worldwide basis through the use of distributed applications. Agents are a new approach to the development of distributed client-server applications built to exploit this information resource.

AgentOS provides an environment for the development and deployment of agent-based client-server applications. Agents are an object representation of distributed systems containing both computational logic and state information. Agents are active and mobile, in that they (along with their state information) can migrate between the various hosts that exist in an agent system such as AgentOS. Agents are autonomous in that they contain code to execute and take decisions on behalf of a human user or another agent as they carry out their assigned tasks. Agents can be as simple as a single algorithm or as complex as a complete application.
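The key property, that logic and state travel together, can be illustrated with a toy agent whose migration is just serialization on one host and resumption on another; AgentOS's real mechanism is not described at this level here:

```python
import pickle

class Agent:
    """A toy mobile agent: its pending work and results travel with it."""
    def __init__(self, task_items):
        self.pending = list(task_items)   # state carried by the agent
        self.results = []

    def step(self):
        if self.pending:
            self.results.append(self.pending.pop(0).upper())

agent = Agent(["query-a", "query-b"])
agent.step()                              # partial work done on "host 1"

wire = pickle.dumps(agent)                # migrate: logic + state leave host 1
resumed = pickle.loads(wire)              # the agent resumes on "host 2"
resumed.step()
print(resumed.results)                    # ['QUERY-A', 'QUERY-B']
```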

Viewed from the perspective of a single host, AgentOS behaves as a traditional server providing an environment for the execution of agents and accessibility of services. Abstracting one level further, i.e., viewed from the perspective of the entire network, AgentOS exceeds the role of the traditional server by assuming the role of a peer in a network of similar servers collectively providing an environment for distributed applications.

The first section in this paper discusses the agent-paradigm, the range of applications most suited for this mode of programming and the typical life cycle of an agent within an agent system such as AgentOS. The second section of this report discusses the requirements for AgentOS, and the design of AgentOS. Finally, a brief survey of related works and an overview of future directions for AgentOS are presented.

Water Jet Cutter

Added on: February 23rd, 2012 by Afsal Meerankutty

In the battle to reduce costs, engineering and manufacturing departments are constantly on the lookout for an edge. The water jet process provides many unique capabilities and advantages that can prove very effective in the cost battle. Learning more about waterjet technology will give us an opportunity to put these cost-cutting capabilities to work. Beyond cost cutting, the water jet process is recognized as the most versatile and fastest-growing process in the world. Waterjets are used in high-production applications across the globe. They complement other technologies such as milling, laser, EDM, plasma and routers. No poisonous gases or liquids are used in waterjet cutting, and waterjets do not create hazardous materials or vapors. No heat-affected zones or mechanical stresses are left on a waterjet-cut surface. It is truly a versatile, productive, cold cutting process. The waterjet has shown that it can do things that other technologies simply cannot. From cutting whisper-thin details in stone, glass and metals, to rapid hole drilling of titanium, to cutting of food and the killing of pathogens in beverages and dips, the waterjet has proven itself unique.

Reverse Engineering

Added on: February 23rd, 2012 by Afsal Meerankutty

Engineering is the profession involved in designing, manufacturing, constructing, and maintaining of products, systems, and structures. At a higher level, there are two types of engineering: forward engineering and reverse engineering.

Forward engineering is the traditional process of moving from high-level abstractions and logical designs to the physical implementation of a system. In some situations, there may be a physical part without any technical details, such as drawings, bills-of-material, or without engineering data, such as thermal and electrical properties.
The process of duplicating an existing component, subassembly, or product, without the aid of drawings, documentation, or computer model is known as reverse engineering.

Reverse engineering can be viewed as the process of analyzing a system to:
1. Identify the system’s components and their interrelationships
2. Create representations of the system in another form or a higher level of abstraction
3. Create the physical representation of that system

Reverse engineering is very common in such diverse fields as software engineering, entertainment, automotive, consumer products, microchips, chemicals, electronics, and mechanical designs. For example, when a new machine comes to market, competing manufacturers may buy one machine and disassemble it to learn how it was built and how it works. A chemical company may use reverse engineering to defeat a patent on a competitor’s manufacturing process. In civil engineering, bridge and building designs are copied from past successes so there will be less chance of catastrophic failure. In software engineering, good source code is often a variation of other good source code.

In some situations, designers give a shape to their ideas by using clay, plaster, wood, or foam rubber, but a CAD model is needed to enable the manufacturing of the part. As products become more organic in shape, designing in CAD may be challenging or impossible. There is no guarantee that the CAD model will be acceptably close to the sculpted model. Reverse engineering provides a solution to this problem because the physical model is the source of information for the CAD model. This is also referred to as the part-to-CAD process.

Another reason for reverse engineering is to compress product development times. In the intensely competitive global market, manufacturers are constantly seeking new ways to shorten lead-times to market a new product. Rapid product development (RPD) refers to recently developed technologies and techniques that assist manufacturers and designers in meeting the demands of reduced product development time. For example, injection-molding companies must drastically reduce the tool and die development times. By using reverse engineering, a three-dimensional product or model can be quickly captured in digital form, re-modeled, and exported for rapid prototyping/tooling or rapid manufacturing.

Maglev Train

Added on: February 23rd, 2012 by Afsal Meerankutty

Magnetic levitation is the latest in transportation technology and has been the interest of many countries around the world. The idea has been around since 1904, when Robert Goddard, an American rocket scientist, created a theory that trains could be lifted off the tracks by the use of electromagnetic rails. Many assumptions and ideas were brought about throughout the following years, but it was not until the 1970s that Japan and Germany showed interest in it and began researching and designing.

The motion of the Maglev train is based purely on magnetism and magnetic fields. This magnetic field is produced by using high-powered electromagnets. By using magnetic fields, the Maglev train can be levitated above its track, or guideway, and propelled forward. Wheels, contact with the track, and moving parts are eliminated on the Maglev train, allowing the Maglev train to essentially move on air without friction.

Maglev can be used for both low and high speed transportation. The low speed Maglev is used for short distance travel. Birmingham, England used this low speed transportation between the years of 1984 and 1995. However, engineers are more interested in creating the high-speed Maglev vehicles. The higher speed vehicle can travel at speeds of nearly 343 mph (552 km/h). Magnetic levitation mainly uses two different types of suspension, which are Electromagnetic Suspension and Electrodynamic Suspension. However, a third suspension system (Inductrack) has recently been developed and is in the research and design phase. These suspension systems are what keep the train levitated off the track.
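For the electromagnetic-suspension (EMS) case, a back-of-envelope sketch shows how coil current, air gap and vehicle weight relate. The force model F = k·(I/g)² is a common textbook idealization, and every number below is an illustrative assumption, not real vehicle data:

```python
# Back-of-envelope EMS levitation sketch. Attractive electromagnet force is
# commonly modeled as F = k * (I / g)^2, with coil current I, air gap g,
# and a magnet constant k. All values here are illustrative assumptions.

G0 = 9.81  # gravitational acceleration, m/s^2

def required_current(mass_kg, gap_m, k):
    """Coil current at which magnetic attraction balances the vehicle weight:
    k * (I/g)^2 = m * G0  ->  I = g * sqrt(m * G0 / k)."""
    weight_n = mass_kg * G0
    return gap_m * (weight_n / k) ** 0.5

# A hypothetical 30-tonne vehicle section held at a 10 mm gap,
# with an assumed magnet constant k = 1e-4 N*m^2/A^2:
i_hold = required_current(30_000, 0.010, k=1e-4)  # a few hundred amperes
```

Note that EMS attraction is inherently unstable (the force grows as the gap shrinks), which is why real systems continuously feedback-control the coil current to hold the gap.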

Heat Pipe

Added on: February 22nd, 2012 by Afsal Meerankutty No Comments

A heat pipe is a simple device that can quickly transfer heat from one point to another. Heat pipes are often referred to as the “superconductors” of heat, as they possess an extraordinary heat transfer capacity and rate with almost no heat loss.

The development of the heat pipe originally started with Angier March Perkins, who worked initially with the concept of the working fluid only in one phase (he took out a patent in 1839 on the hermetic tube boiler, which works on this principle). Jacob Perkins (father of Angier March) patented the Perkins Tube in 1836, and these tubes became widespread for use in locomotive boilers and baking ovens. The Perkins Tube was a system in which a long and twisted tube passed over an evaporator and a condenser, which caused the water within the tube to operate in two phases. Although these early designs for heat transfer systems relied on gravity to return the liquid to the evaporator (later called a thermosyphon), the Perkins Tube was the jumping-off point for the development of the modern heat pipe.

The concept of the modern heat pipe, which relied on a wicking system to transport the liquid against gravity and up to the condenser, was put forward by R.S. Gaugler of the General Motors Corporation. In his patent of 1944, Gaugler described how his heat pipe would be applied to refrigeration systems. Heat pipe research became popular after that, and many industries and labs, including Los Alamos, RCA, and the Joint Nuclear Research Centre in Italy, began to apply heat pipe technology in their fields. By 1969, there was a vast amount of interest on the part of NASA, Hughes, the European Space Agency, and other aircraft companies in regulating the temperature of a spacecraft and how that could be done with the help of heat pipes. There has been extensive research done to date regarding specific heat transfer characteristics, in addition to the analysis of various material properties and geometries.
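The “superconductor” claim above comes down to latent heat: the working fluid carries heat as Q = ṁ·h_fg, so even a tiny vapor flow moves a lot of power. The sketch below compares that against plain conduction through a copper rod; all numbers are rough textbook values chosen for illustration:

```python
# Why a heat pipe outperforms a solid conductor: evaporation/condensation
# carries latent heat, Q = m_dot * h_fg, so a tiny vapor flow moves a lot
# of power. Rough textbook values below, for illustration only.

H_FG_WATER = 2.26e6  # latent heat of vaporization of water near 100 C, J/kg

def latent_heat_rate_w(m_dot_kg_s, h_fg=H_FG_WATER):
    """Heat carried by a vapor mass flow: Q = m_dot * h_fg."""
    return m_dot_kg_s * h_fg

def conduction_rate_w(k, area_m2, delta_t, length_m):
    """Fourier conduction through a solid rod: Q = k * A * dT / L."""
    return k * area_m2 * delta_t / length_m

q_pipe = latent_heat_rate_w(1e-4)                   # 0.1 g/s of vapor
q_copper = conduction_rate_w(400, 1e-4, 50, 0.3)    # 1 cm^2 copper rod, 50 C drop
```

Under these assumptions the vapor flow carries on the order of thirty times the heat of the copper rod, with almost no temperature drop along the pipe.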

Blue Eyes

Added on: February 21st, 2012 by Afsal Meerankutty No Comments

Imagine yourself in a world where humans interact with computers. You are sitting in front of your personal computer that can listen, talk, or even scream aloud. It has the ability to gather information about you and interact with you through special techniques like facial recognition, speech recognition, etc. It can even understand your emotions at the touch of the mouse. It verifies your identity, senses your presence, and starts interacting with you. You ask the computer to call your friend at his office. It realizes the urgency of the situation through the mouse, dials your friend at his office, and establishes a connection.

Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding extraordinary perceptual abilities to computers would enable computers to work together with human beings as intimate partners. Researchers are attempting to add more capabilities to computers that will allow them to interact like humans, recognize human presence, talk, listen, or even guess their feelings.

The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses non-obtrusive sensing methods, employing the most modern video cameras and microphones to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even realize his physical or emotional states.

Tunable Spiral Inductors

Added on: February 20th, 2012 by Afsal Meerankutty No Comments

A tunable micro-electromechanical systems integrated inductor with a large-displacement electro-thermal actuator is discussed here. Based on a transformer configuration, the inductance of a spiral inductor is tuned by controlling the relative position of a magnetically coupled short-circuited loop. Theoretical studies are backed by a variety of fabricated and measured tunable inductors that show a 2:1 inductance tuning ratio over a wide frequency range of approximately 25 GHz. In addition, the maximum and minimum quality factors of the tunable inductor are measured to be 26 and 10, which are high compared to previous designs. They can considerably extend the tuning capabilities of critical reconfigurable circuits such as tunable impedance matching circuits, phase shifters, voltage-controlled oscillators, and low-noise amplifiers.
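The tuning mechanism described (a magnetically coupled short-circuited loop) has a standard idealized model: with the loop shorted and losses ignored, the effective inductance drops to L1·(1 − k²), where k is the coupling coefficient, so a 2:1 tuning ratio corresponds to k ≈ 0.71. A sketch of that lossless idealization, not of the measured device itself:

```python
# Idealized transformer model of the tunable inductor: a short-circuited
# secondary loop with coupling coefficient k reduces the primary's
# effective inductance to L_eff = L1 * (1 - k^2). Losses are ignored.

def effective_inductance(l1, k):
    """Effective inductance of a coil loaded by a shorted coupled loop."""
    return l1 * (1 - k ** 2)

# Moving the loop away (k -> 0) restores the full inductance; moving it
# close (k -> 0.71) halves it, giving the 2:1 tuning ratio quoted above.
l_far = effective_inductance(1.0, 0.0)        # loop decoupled: L1
l_near = effective_inductance(1.0, 0.5 ** 0.5)  # k ~ 0.71: L1 / 2
```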

Burj Dubai Tower

Added on: February 20th, 2012 by Afsal Meerankutty No Comments

As with all super-tall projects, difficult structural engineering problems needed to be addressed and resolved. This paper presents the approach to the structural system for the Burj Dubai Tower. It first presents the architectural background and compares the Burj Dubai tower with other tall buildings of the world. It also describes the geotechnical procedures and structural detailing of the building, besides the wind engineering applied to the tower.

Mobile Phone Cloning

Added on: February 20th, 2012 by Afsal Meerankutty No Comments

Mobile communication has been readily available for several years, and is a major business today. It provides a valuable service to its users, who are willing to pay a considerable premium over a fixed-line phone to be able to walk and talk freely. Because of its usefulness and the money involved in the business, it is subject to fraud.

Unfortunately, the advance of security standards has not kept pace with the dissemination of mobile communication.
Some of the features of mobile communication make it an alluring target for criminals. It is a relatively new invention, so not all people are quite familiar with its possibilities, for good or for bad. Its newness also means intense competition among mobile phone service providers as they attract customers. The major threat to mobile phones is cloning.

Flexible AC Transmission Systems

Added on: February 19th, 2012 by Afsal Meerankutty No Comments

The rapid development of power electronics technology provides exciting opportunities to develop new power system equipment for better utilization of existing systems, such as enhancing the security, capacity and flexibility of power transmission systems. FACTS solutions enable power grid owners to increase existing transmission network capacity while maintaining or improving the operating margins necessary for grid stability. Supply of reliable, high-quality electrical energy at a reasonable cost is at the heart of the nation’s economy. The electric power system is one of the nation’s important infrastructures, and the infrastructure most closely tied to the gross domestic product. Both the operating and business sectors of the electric utility industry are changing; these changes have been mandated by provisions of the Energy Policy Act of 1992, which requires electric utilities to provide open access to the transmission system.

Several advanced methods have been developed for maintaining a high degree of power quality and reliability under a deregulated environment. At the distribution level, flexible control strategies involve computerized automation of system control devices such as capacitor banks, under-load tap-changing transformers (ULTCs) and voltage regulators. In the transmission system, a new method of achieving this control is through the use of power-electronics-based Flexible AC Transmission System (FACTS) devices. This paper provides a comprehensive guide to FACTS, covering all the major aspects in research and development of FACTS technologies. Various real-world applications are also included to demonstrate the issues and benefits of applying FACTS. The objective of this project is to create a multi-institutional power curriculum to address this new environment and these technologies.
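One way to see what FACTS devices buy is the classic two-bus transfer equation P = V1·V2·sin(δ)/X: a series-compensating device lowers the effective line reactance X and so raises the power that can be transferred at the same angle. The per-unit sketch below uses illustrative values only:

```python
import math

# Classic two-bus power transfer: P = V1 * V2 * sin(delta) / X.
# Series compensation (one FACTS application) lowers the effective
# reactance X, raising transfer capacity. Per-unit illustrative values.

def line_power_pu(v1, v2, delta_deg, x_pu):
    """Real power transferred across a lossless line, in per unit."""
    return v1 * v2 * math.sin(math.radians(delta_deg)) / x_pu

p_base = line_power_pu(1.0, 1.0, 30.0, 0.5)         # uncompensated line
p_comp = line_power_pu(1.0, 1.0, 30.0, 0.5 * 0.7)   # 30% series compensation
```

With 30% of the reactance compensated, transfer capacity at the same angle rises by the factor 1/0.7, roughly 43%.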

XML Parser
Added on: February 19th, 2012 by Afsal Meerankutty No Comments

XML is a language used to develop web applications. XML is a set of rules for designing structured data in a text format, as opposed to a binary format, which is useful to both humans and machines. A parser is used for syntactical and lexical analysis. An XML parser extracts the information from an XML document, which is very much needed in all web applications. Simple Object Access Protocol is a protocol that lets a program send XML over HTTP to invoke methods on remote objects. An XML parser can serve as an engine for implementing this or a comparable protocol.

An XML parser can also be used to send data messages formatted as XML over HTTP. By adding XML and HTTP capabilities to an application, software developers can begin to offer alternatives to traditional browsers that have significant value to their customers. This paper presents an XML parser that implements a subset of the XML specification. This is useful to all developers and users for checking the well-formedness and validity of XML documents.
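The well-formedness check mentioned above can be sketched with Python's standard-library parser; this is an illustration of the concept, not the parser the paper presents:

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text):
    """True if the text parses as XML, i.e. is well-formed:
    every opening tag properly matched and nested."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

ok = is_well_formed("<report><title>Optical Computers</title></report>")
bad = is_well_formed("<report><title>Optical Computers</report></title>")  # bad nesting
```

Validation (checking a document against a DTD or schema, as opposed to mere well-formedness) goes beyond the standard library; third-party parsers such as lxml handle that.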

Biometrics
Added on: February 17th, 2012 by Afsal Meerankutty No Comments

Biometrics refers to the automatic identification of a person based on his or her physiological or behavioral characteristics, like a fingerprint or iris pattern, or some aspects of behaviour, like handwriting or keystroke patterns. Biometrics is being applied both to identity verification and to recognition, and the problem each involves is somewhat different. Verification requires the person being identified to lay claim to an identity, so the system has two choices: either accepting or rejecting the person’s claim. Recognition requires the system to look through many stored sets of characteristics and pick the one that matches the unknown individual being presented. A biometric system is essentially a pattern recognition system, which makes a personal identification by determining the authenticity of a specific physiological or behavioral characteristic possessed by the user.
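The verification/recognition distinction above maps to 1:1 versus 1:N matching. A toy sketch using a Euclidean distance on made-up feature vectors follows; every name, vector, and the threshold here are hypothetical, and real systems use far richer features and matchers:

```python
import math

# Toy biometric matcher: verification compares a sample against ONE claimed
# template (accept/reject); identification searches ALL enrolled templates.

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(claimed_template, sample, threshold=0.5):
    """1:1 match against the identity the person lays claim to."""
    return distance(claimed_template, sample) <= threshold

def identify(database, sample, threshold=0.5):
    """1:N search over enrolled templates; None if nobody is close enough."""
    name, template = min(database.items(), key=lambda kv: distance(kv[1], sample))
    return name if distance(template, sample) <= threshold else None

db = {"alice": [0.1, 0.2], "bob": [0.9, 0.8]}  # hypothetical enrollment
```

The threshold sets the trade-off between false accepts and false rejects, which is the central tuning problem in any biometric system.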

Biometrics is a rapidly evolving technology which is being used in forensics, such as criminal identification and prison security, and has the potential to be used in a large range of civilian application areas. Biometrics can be used to secure transactions conducted via telephone and Internet (electronic commerce and electronic banking). In automobiles, biometrics can replace keys with key-less entry devices.

3 D ICs

Added on: February 16th, 2012 by Afsal Meerankutty 1 Comment

The unprecedented growth of the computer and information technology industry is demanding Very Large Scale Integrated (VLSI) circuits with increasing functionality and performance at minimum cost and power dissipation. VLSI circuits are being aggressively scaled to meet this demand, which in turn has created serious problems for the semiconductor industry.

Additionally heterogeneous integration of different technologies in one single chip (SoC) is becoming increasingly desirable, for which planar (2-D) ICs may not be suitable.

3-D ICs are an attractive chip architecture that can alleviate the interconnect related problems such as delay and power dissipation and can also facilitate integration of heterogeneous technologies in one chip (SoC). The multi-layer chip industry opens up a whole new world of design. With the Introduction of 3-D ICs, the world of chips may never look the same again.
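The interconnect argument can be made concrete with the usual distributed-RC estimate, in which wire delay grows with the square of length, so shortening the longest cross-chip wires by stacking tiers pays off quadratically. The per-millimeter resistance and capacitance values below are placeholders, not real process data:

```python
# Distributed RC (Elmore-style) wire delay: t ~ 0.38 * R' * C' * L^2,
# with R', C' per unit length. Because delay grows as L^2, halving the
# wire-length footprint by stacking tiers cuts global-wire delay sharply.
# Per-mm values below are placeholders, not real process data.

def rc_delay_s(r_per_mm, c_per_mm, length_mm):
    """Approximate delay of a distributed RC wire of the given length."""
    return 0.38 * r_per_mm * c_per_mm * length_mm ** 2

d_planar = rc_delay_s(10.0, 0.2e-12, 10.0)             # 10 mm wire, planar chip
d_stacked = rc_delay_s(10.0, 0.2e-12, 10.0 / 2 ** 0.5)  # same route, 2 tiers
```

Folding the die into two tiers shortens the longest wire by roughly √2 in this idealization, which halves its RC delay.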

Artificial Turf

Added on: February 13th, 2012 by Afsal Meerankutty No Comments

Artificial turf is a surface manufactured from synthetic fibres made to look like natural grass. It is most often used in arenas for sports that were originally or are normally played on grass. The main reason is maintenance: artificial turf stands up better to heavy use, such as in sports, and requires no irrigation or trimming. Domed, covered, and partially covered stadiums may require artificial turf because of the difficulty of getting grass enough sunlight to stay healthy.

A common misconception is that the new synthetic grass is similar to household carpet. In fact, this intricate system involves properly constructing a porous sub-base and using turf with holes in the backing; the product is then filled with a sand/rubber-granule mix known as infill.

Artificial turf, also known as synthetic turf, has found a prominent place in sports today. Manufactured from synthetic materials, this man-made surface looks like natural grass. With the international sports associations and governing bodies approving the use of artificial surfaces, sports like football and hockey, which were originally played on natural grass, have moved to these artificial sports pitches. So, next time, you find players playing on an artificial hockey pitch, do not be surprised.

Artificial turf has been manufactured since the early 1960s, and was originally produced by Chemstrand Company (later renamed Monsanto Textiles Company). It is produced using manufacturing processes similar to those used in the carpet industry. Since the 1960s, the product has been improved through new designs and better materials. The newest synthetic turf products have been chemically treated to be resistant to ultraviolet rays, and the materials have been improved to be more wear-resistant, less abrasive, and, for some applications, more similar to natural grass.

SCADA
Added on: February 13th, 2012 by Afsal Meerankutty No Comments

The rate of rise of power utilization in a country represents its development in any field; “necessity is the mother of invention.” Efficient and reliable power supply is therefore the major requirement of any power system. As in many other areas, computer applications in electrical power systems have grown tremendously over the last several decades, penetrating apparently all aspects of electrical power systems, including operational planning, energy management systems, automation of power generation, transmission and distribution, protective relaying, etc. A system used to monitor and control equipment and processes is called SCADA (Supervisory Control and Data Acquisition).

The present paper explains the whole SCADA system, its hardware components, software components and its application programs. Among the hardware components, field instrumentation, remote stations, the communication network and the central monitoring station are explained. Among the software components, data acquisition and processing, control, alarm handling, logging and archiving, automated mapping and facility management, etc. are considered. Application programs such as load shedding, load balancing, remote metering, maintaining a good voltage profile, map maintenance, fuse call-off operations, energy accounting, etc. are also emphasized.
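The monitor-and-control loop at the heart of SCADA can be caricatured in a few lines of code: poll readings from remote stations, then compare each tag against its limits for alarm handling. Every station name, tag, and limit below is hypothetical:

```python
# Minimal caricature of SCADA polling and alarm handling.
# All station names, tags and limits are hypothetical.

def poll(remote_stations):
    """Collect one reading from each remote station (RTU)."""
    return {name: read() for name, read in remote_stations.items()}

def check_alarms(readings, limits):
    """Flag any tag whose value falls outside its (low, high) limits."""
    alarms = []
    for tag, value in readings.items():
        low, high = limits[tag]
        if not (low <= value <= high):
            alarms.append((tag, value))
    return alarms

# Stand-ins for field instrumentation; real RTUs answer over a comms network.
stations = {"feeder_voltage_kV": lambda: 11.6, "xfmr_temp_C": lambda: 91.0}
limits = {"feeder_voltage_kV": (10.5, 12.0), "xfmr_temp_C": (0.0, 85.0)}

alarms = check_alarms(poll(stations), limits)  # transformer temperature trips
```

A real central monitoring station would also log and archive each scan and push alarms to operator displays, as described above.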

Blu-Ray Technology

Added on: February 13th, 2012 by Afsal Meerankutty No Comments

Blu-ray is a new optical disc standard based on the use of a blue laser rather than the red laser of today’s DVD players. The standard, developed collaboratively by Hitachi, LG, Matsushita (Panasonic), Pioneer, Philips, Samsung, Sharp, Sony, and Thomson, threatens to make current DVD players obsolete. It is not clear whether new Blu-ray players might include both kinds of lasers in order to be able to read current CD and DVD formats.
The new standard, developed jointly in order to avoid competing standards, is also being touted as the replacement for writable DVDs. The blue laser has a 405 nanometer (nm) wavelength that can focus more tightly than the red lasers used for writable DVD and, as a consequence, write much more data in the same 12-centimeter space. Like the rewritable DVD formats, Blu-ray uses phase-change technology to enable repeated writing to the disc.

Blu-ray’s storage capacity is enough to store a continuous backup copy of most people’s hard drives on a single disc. The first products will have a 27 gigabyte (GB) single-sided capacity, 50 GB on dual-layer discs. Data streams at 36 megabits per second (Mbps), fast enough for high-quality video recording. Single-sided Blu-ray discs can store up to 13 hours of standard video data, compared to a single-sided DVD’s 133 minutes. People are referring to Blu-ray as the next generation DVD, although according to Chris Buma, a spokesman from Philips (quoted in New Scientist), “Except for the size of the disc, everything is different.”
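The quoted figures can be cross-checked with a little arithmetic, assuming decimal gigabytes and a 36 Mbit/s 1x stream:

```python
# Sanity-checking the Blu-ray numbers quoted above.
# Assumptions: decimal GB (1e9 bytes) and a 36 Mbit/s 1x stream.

GB = 1e9
capacity_bits = 27 * GB * 8   # single-layer, single-sided disc

rate_1x = 36e6                # bits per second at full streaming rate
minutes_at_1x = capacity_bits / rate_1x / 60   # ~100 minutes of full-rate video

# The "13 hours of standard video" figure implies a much lower average
# bitrate for standard-definition material:
sd_bitrate = capacity_bits / (13 * 3600)       # ~4.6 Mbit/s
```

So the same disc holds about 100 minutes at the full 36 Mbit/s rate, and the 13-hour figure is consistent with standard-definition video compressed to roughly 4.6 Mbit/s.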

Blu-ray discs will not play on current CD and DVD players, because they lack the blue-violet laser required to read them. If the appropriate lasers are included, Blu-ray players will be able to play the other two formats. However, because it would be considerably more expensive, most manufacturers may not make their players backward compatible. Panasonic, Philips, and Sony have demonstrated prototypes of the new systems.

Sky X Technology

Added on: February 13th, 2012 by Afsal Meerankutty No Comments

Satellites are an attractive option for carrying internet and other IP traffic to many locations across the globe where terrestrial options are limited or prohibitively expensive. But data networking over satellite must overcome the large latency and high bit error rate typical of satellite communications, as well as the asymmetric bandwidth design of most satellite networks. Satellites are ideal for providing internet and private network access over long distances and to remote locations. However, the internet protocols are not optimized for satellite conditions, so the throughput over satellite networks is restricted to only a fraction of the available bandwidth. Mentat, the leading supplier of TCP/IP to the computer industry, has overcome these limitations with the development of the Sky X product family.

The Sky X system replaces TCP over the satellite link with a protocol optimized for the long latency, high loss and asymmetric bandwidth conditions of typical satellite communication. The Sky X family consists of the Sky X Gateway, Sky X Client/Server and Sky X OEM products. Sky X products increase the performance of IP over satellite by transparently replacing TCP. The Sky X Gateway works by intercepting the TCP connection from the client and converting the data to the Sky X protocol for transmission over the satellite. The Sky X Client/Server product operates in a similar manner, except that the Sky X client software is installed on each end user’s PC; connections from applications running on the PC are intercepted and sent over the satellite using the Sky X protocol.
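A rough sketch of why unmodified TCP struggles here: a sender can keep at most one receive window of unacknowledged data in flight per round trip, so throughput is capped at window/RTT no matter how fat the pipe is. The numbers below are illustrative of a geostationary link:

```python
# TCP's window/RTT throughput ceiling, the core problem Sky X-style
# protocol gateways work around. Illustrative GEO-link numbers.

def tcp_ceiling_bps(window_bytes, rtt_s):
    """Max TCP throughput: one window of data per round trip."""
    return window_bytes * 8 / rtt_s

geo_rtt = 0.55                              # ~550 ms round trip via GEO satellite
ceiling = tcp_ceiling_bps(65_535, geo_rtt)  # classic 64 KB TCP window
# Under 1 Mbit/s, even if the transponder offers tens of Mbit/s.
```

This is why replacing TCP on the satellite hop with a protocol designed for long latency recovers the unused bandwidth.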

Brain Computer Interface

Added on: February 12th, 2012 by Afsal Meerankutty 2 Comments

Brain Computer Interface (BCI) is a communication system that recognizes a user’s commands only from his or her brainwaves and reacts according to them. For this purpose both the PC and the subject are trained. A simple task can consist of moving an arrow displayed on the screen in the desired direction only through the subject’s imagining of something (e.g. motion of his or her left or right hand). As a consequence of the imaging process, certain characteristics of the brainwaves change and can be used for recognition of the user’s command, e.g. motor mu waves (brain waves of alpha-range frequency associated with physical movements or the intention to move).

An Electroencephalogram based Brain-Computer-Interface (BCI) provides a new communication channel between the human brain and a computer. Patients who suffer from severe motor impairments (late stage of Amyotrophic Lateral Sclerosis (ALS), severe cerebral palsy, head trauma and spinal injuries) may use such a BCI system as an alternative form of communication by mental activity.

The use of EEG signals as a vector of communication between man and machine represents one of the current challenges in signal theory research. The principal element of such a communication system, better known as a “Brain Computer Interface”, is the interpretation of the EEG signals related to the characteristic parameters of brain electrical activity.
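As a toy illustration of such “characteristic parameters”, the sketch below estimates the power of a signal in the 8–13 Hz mu/alpha band with a plain DFT. It is pure Python for self-containment; a real BCI pipeline would use numpy/scipy with proper windowing and artifact rejection:

```python
import math

def band_power(samples, fs, lo=8.0, hi=13.0):
    """Power of a signal in the mu/alpha band (8-13 Hz) via a plain DFT.
    Educational sketch only; real BCIs use FFT libraries and windowing."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

fs = 128  # samples per second
mu = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]    # 10 Hz: in band
beta = [math.sin(2 * math.pi * 25 * t / fs) for t in range(fs)]  # 25 Hz: out of band
```

Comparing band power during imagined left- versus right-hand movement is (in highly simplified form) how the mu-wave commands described above are recognized.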

Artificial Intelligence

Added on: February 9th, 2012 by Afsal Meerankutty 5 Comments

The AP is an artificial intelligence–based companion that will be resident in software and chips embedded in the automobile dashboard. The heart of the system is a conversation planner that holds a profile of you, including details of your interests and profession.

A microphone picks up your answer and breaks it down into separate words with speech-recognition software. A camera built into the dashboard also tracks your lip movements to improve the accuracy of the speech recognition. A voice analyzer then looks for signs of tiredness by checking to see if the answer matches your profile. Slow responses and a lack of intonation are signs of fatigue.
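The cues mentioned above, slow responses and flat intonation, could be combined into a score roughly as follows; the function name and both thresholds are invented purely for illustration:

```python
# Toy fatigue heuristic in the spirit described above. The thresholds
# (2 seconds, 0.1 pitch variance) are made-up illustrative values.

def fatigue_score(response_delay_s, pitch_variance):
    """Count how many fatigue warning signs are present (0-2)."""
    score = 0
    if response_delay_s > 2.0:   # slow answer
        score += 1
    if pitch_variance < 0.1:     # monotone voice, little intonation
        score += 1
    return score

alert_needed = fatigue_score(3.1, 0.05) == 2  # both warning signs fire
```

A deployed system would of course fuse many more signals (eye movements, lane keeping) and learn thresholds from the driver's own profile rather than hard-coding them.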

This research suggests that we can make predictions about various aspects of driver performance based on what we glean from the movements of a driver’s eyes and that a system can eventually be developed to capture this data and use it to alert people when their driving has become significantly impaired by fatigue.

Microbial Fuel Cells

Added on: February 9th, 2012 by Afsal Meerankutty No Comments

In an era of climate change, alternative energy sources are desired to replace oil and carbon resources. At the same time, climate change effects in some areas and the increasing production of biofuels are putting pressure on available water resources. Microbial Fuel Cells have the potential to simultaneously treat wastewater for reuse and generate electricity, thereby producing two increasingly scarce resources.

While the Microbial Fuel Cell has generated interest in the wastewater treatment field, knowledge is still limited and many fundamental and technical problems remain to be solved. Microbial fuel cell technology represents a new form of renewable energy by generating electricity from what would otherwise be considered waste, such as industrial wastes or wastewater. A microbial fuel cell (MFC) is a biological reactor that turns chemical energy present in the bonds of organic compounds into electrical energy, through the reactions of microorganisms under anaerobic conditions.
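A common figure of merit for such reactors is volumetric power density: the electrical power P = V·I delivered per unit of anode volume. The values below are hypothetical lab-scale numbers chosen for illustration, not results reported in this text:

```python
# Volumetric power density of a microbial fuel cell: P = V * I per anode
# volume. The cell voltage, current and volume below are hypothetical
# lab-scale values, for illustration only.

def power_density_w_per_m3(voltage_v, current_a, anode_volume_m3):
    """Electrical power output normalized by anode chamber volume."""
    return voltage_v * current_a / anode_volume_m3

p = power_density_w_per_m3(0.5, 0.01, 2.5e-4)  # 0.5 V, 10 mA, 250 mL anode
```

Normalizing by volume (rather than quoting raw watts) is what lets small lab cells be compared against pilot-scale wastewater reactors.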

Animatronics
Added on: February 9th, 2012 by Afsal Meerankutty No Comments

Animatronics is a cross between animation and electronics. Basically, an animatronic is a mechanized puppet. It may be preprogrammed or remotely controlled. The term, originally coined by Walt Disney as “Audio-Animatronics” (used to describe his mechanized characters), can actually be traced in various forms as far back as Leonardo da Vinci’s automaton lion (theoretically built to present lilies to the King of France during one of his visits), and has now developed into a career that may require combined talent in mechanical engineering, sculpting/casting, control technologies, electrical/electronics, airbrushing, and radio control.

Long before digital effects appeared, animatronics were making cinematic history. The scare generated by the Great White coming out of the water in “Jaws” and the tender otherworldliness of “E.T.” were its outcomes. The Jurassic Park series combined digital effects with animatronics.

It is possible for us to build our own animatronics by making use of ready-made animatronic kits provided by companies such as Mister Computers.

Nanotechnology
Added on: February 8th, 2012 by Afsal Meerankutty No Comments

Nanotechnology is the manipulation of matter on the nanoscale. A nanometer is a very small measure of length: it is one billionth of a meter, a length so small that only three or four atoms lined up in a row would span a nanometer. So, nanotechnology involves designing and building materials and devices where the basic structure of the material or device is specified on the scale of one or a few nanometers. Ultimately, nanotechnology will mean materials and devices in which every atom is assigned a place, and having every atom in the right place will be essential for the functioning of the device.

The kinds of products that could be built range from microscopic, very powerful computers to super-strong materials ten times as strong as steel but much lighter, to foods and other biological tissues. All these products would be very inexpensive because the molecular machines that build them would basically take atoms from garbage or dirt, and energy from sunshine, and rearrange those atoms into useful products, just as trees and crops take dirt, water and sunshine and rearrange the atoms into wood and food.

Nanotechnology cannot be defined as a definite branch of science distinct from the conventional ones that we have as of now. It is set to encompass all the technological aspects that we have today; it is nothing but the extension of scientific applications to a microscopic scale, thereby reaching closer to perfection if not right there.