


Artificial Passenger

Added on: March 6th, 2017 by Afsal Meerankutty No Comments

The Artificial Passenger (AP) is an artificial intelligence–based companion that will reside in software and chips embedded in the automobile dashboard. The heart of the system is a conversation planner that holds a profile of you, including details of your interests and profession.

A microphone picks up your answer and breaks it down into separate words with speech-recognition software. A camera built into the dashboard also tracks your lip movements to improve the accuracy of the speech recognition. A voice analyzer then looks for signs of tiredness by checking to see if the answer matches your profile. Slow responses and a lack of intonation are signs of fatigue.
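The fatigue check described above can be sketched as a toy heuristic. The thresholds, weights, and units below are illustrative assumptions, not values from the source; a real system would learn them from the driver's profile.

```python
def fatigue_score(response_delay_s, pitch_variance,
                  expected_delay_s=1.0, baseline_variance=400.0):
    """Combine response latency and intonation flatness into a 0..1 score.

    Both thresholds are illustrative: a rested driver is assumed to answer
    within about 1 s and to show normal pitch variation (in Hz^2).
    """
    # Slower-than-expected answers push the score up (capped at 1.0).
    delay_factor = min(response_delay_s / (2 * expected_delay_s), 1.0)
    # A monotone voice (low pitch variance) also pushes the score up.
    flatness_factor = 1.0 - min(pitch_variance / baseline_variance, 1.0)
    return 0.5 * delay_factor + 0.5 * flatness_factor

# An alert driver: quick answer, lively intonation -> low score.
print(fatigue_score(0.8, 500.0))   # 0.2
# A drowsy driver: slow, monotone answer -> high score.
print(fatigue_score(3.0, 50.0))    # 0.9375
```

A deployed system would of course fuse this with the camera's lip and eye tracking rather than rely on voice alone.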

This research suggests that we can make predictions about various aspects of driver performance based on what we glean from the movements of a driver’s eyes and that a system can eventually be developed to capture this data and use it to alert people when their driving has become significantly impaired by fatigue.

Light Emitting Polymer

Added on: February 13th, 2017 by Afsal Meerankutty No Comments

The seminar is about polymers that emit light when a voltage is applied to them. The structure comprises a thin film of semiconducting polymer sandwiched between two electrodes (cathode and anode). When electrons and holes are injected from the electrodes, these charge carriers recombine, leading to the emission of light. The band gap, i.e. the energy difference between the valence band and the conduction band, determines the wavelength (colour) of the emitted light.
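The band-gap-to-colour relation can be made concrete with the standard formula λ = hc/E_gap. The example gap energies below are illustrative, not taken from the source.

```python
# Convert a semiconducting polymer's band gap (eV) to emitted wavelength (nm),
# using lambda = h*c / E_gap, with h*c ≈ 1239.84 eV·nm.
HC_EV_NM = 1239.84  # Planck constant times speed of light, in eV·nm

def emission_wavelength_nm(band_gap_ev):
    return HC_EV_NM / band_gap_ev

# A ~2.1 eV gap emits around 590 nm (orange-red); a ~2.7 eV gap around 459 nm (blue).
print(round(emission_wavelength_nm(2.1)))  # 590
print(round(emission_wavelength_nm(2.7)))  # 459
```

This is why chemically tuning the polymer's band gap tunes the display colour.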

These displays are usually made by an inkjet printing process, in which red, green and blue polymer solutions are jetted into well-defined areas on the substrate. This is possible because PLEDs are soluble in common organic solvents such as toluene and xylene. Film thickness uniformity is obtained either by slow multi-passing or by print heads with drive-per-nozzle technology. The pixels are controlled by an active or passive matrix.

It is a polymer that emits light when a voltage is applied to it. The structure comprises a thin film of semiconducting polymer sandwiched between two electrodes (anode and cathode), as shown in fig. 1. When electrons and holes are injected from the electrodes, these charge carriers recombine, leading to emission of light that escapes through the glass substrate. The band gap, i.e. the energy difference between the valence band and the conduction band of the semiconducting polymer, determines the wavelength (colour) of the emitted light.

The advantages include low cost, small size, no viewing-angle restrictions, low power requirements and biodegradability. They are poised to replace the LCDs used in laptops and the CRTs used in desktop computers today.

Their future applications include flexible displays which can be folded, wearable displays with interactive features, camouflage etc.

MOCT (Magneto-Optical Current Transformer)

Added on: February 7th, 2017 by Afsal Meerankutty No Comments

An accurate current transducer is a key component of any power system instrumentation. To measure currents, power stations and substations conventionally employ inductive current transformers. As the short-circuit capacity of power systems grows larger and voltage levels go higher, conventional current transducers become more bulky and costly.

The newly emerged MOCT technology appears to solve many of the problems posed by conventional current transformers. An MOCT measures the rotation angle of plane-polarised light caused by the magnetic field and converts it into a signal of a few volts proportional to that field.

The main advantage of an MOCT is that the conductor need not be broken to enclose the optical path in the current-carrying circuit, and there is no electromagnetic interference.
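The Faraday rotation at the heart of an MOCT can be sketched numerically. For a closed optical path around the conductor, Ampère's law makes the rotation θ = V·μ₀·N·I regardless of path shape. The Verdet constant and gain below are illustrative stand-ins, not figures from the source.

```python
import math

MU0 = 4e-7 * math.pi  # permeability of free space (T·m/A)

def rotation_angle_rad(current_a, verdet_rad_per_t_m, turns=1):
    """Faraday rotation for a closed optical path enclosing the conductor.

    The line integral of B along the closed path is mu0 * N * I, so
    theta = V * mu0 * N * I, independent of the path's exact shape.
    """
    return verdet_rad_per_t_m * MU0 * turns * current_a

def sensor_output_volts(current_a, verdet_rad_per_t_m,
                        gain_v_per_rad=100.0, turns=1):
    """Small-angle polarimetric readout: a few volts, proportional to theta."""
    return gain_v_per_rad * rotation_angle_rad(current_a, verdet_rad_per_t_m, turns)

# Illustrative numbers: a glass with V ~ 20 rad/(T·m) and a 1 kA primary current.
theta = rotation_angle_rad(1000.0, 20.0)
print(round(math.degrees(theta), 3))       # 1.44 (degrees of rotation)
print(round(sensor_output_volts(1000.0, 20.0), 2))  # 2.51 (volts)
```

The output stays linear in the primary current, which is exactly the property a current transducer needs.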

Magneto Hydro Dynamic Power Generation

Added on: February 1st, 2017 by Afsal Meerankutty No Comments

When an electrical conductor is moved so as to cut lines of magnetic induction, charged particles in the conductor experience a force in a direction mutually perpendicular to the B field and to the velocity of the conductor. The negative charges tend to move in one direction, and the positive charges in the opposite direction. This induced electric field, or motional emf, provides the basis for converting mechanical energy into electrical energy. In conventional steam power plants, the heat released by the fuel is converted into rotational mechanical energy by means of a thermodynamic cycle, and the mechanical energy is then used to drive the electric generator. Thus two stages of energy conversion are involved, and the heat-to-mechanical-energy conversion has an inherently low efficiency.

Also, the rotating machine has its associated losses and maintenance problems. In MHD generation, electrical energy is generated directly from hot combustion gases produced by the combustion of fuel, without moving parts. Conventional electrical machines are basically electromechanical converters, while an MHD generator is a heat engine operating on a turbine cycle, transforming the internal energy of the gas directly into electrical energy.
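The motional emf described above can be put in numbers. The channel voltage is u·B·d, and the standard textbook result for peak electrical power density is σu²B²/4 (at a load factor of 1/2). All figures below are illustrative assumptions, not from the source.

```python
def mhd_open_circuit_voltage(u_m_s, b_tesla, electrode_gap_m):
    """Motional emf across the channel: V = u * B * d."""
    return u_m_s * b_tesla * electrode_gap_m

def mhd_max_power_density(sigma_s_m, u_m_s, b_tesla):
    """Peak electrical power density sigma*u^2*B^2/4 (W/m^3), at load factor 1/2."""
    return sigma_s_m * u_m_s ** 2 * b_tesla ** 2 / 4.0

# Illustrative figures: gas at 1000 m/s through a 5 T field, a 1 m electrode
# gap, and a seeded-plasma conductivity of ~10 S/m.
print(mhd_open_circuit_voltage(1000.0, 5.0, 1.0))   # 5000.0 (volts)
print(mhd_max_power_density(10.0, 1000.0, 5.0))     # 62500000.0 (W/m^3)
```

The strong dependence on u and B is why MHD generators need very hot, fast, highly conducting gas and powerful magnets.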


Memristor

Added on: January 25th, 2017 by Afsal Meerankutty No Comments

Generally, when most people think about electronics, they may initially think of products such as cell phones, radios and laptop computers. Others, with some engineering background, may think of resistors, capacitors and the other basic components necessary for electronics to function. Such basic components are fairly limited in number, and each has its own characteristic function.

Memristor theory was formulated and named by Leon Chua in a 1971 paper. Chua strongly believed that a fourth device existed to provide conceptual symmetry with the resistor, inductor, and capacitor. This symmetry follows from the description of basic passive circuit elements as defined by a relation between two of the four fundamental circuit variables. A device linking charge and flux (themselves defined as time integrals of current and voltage), which would be the memristor, was still hypothetical at the time. However, it would not be until thirty-seven years later, on April 30, 2008, that a team at HP Labs led by the scientist R. Stanley Williams would announce the discovery of a switching memristor. Based on a thin film of titanium dioxide, it has been presented as an approximately ideal device.

The reason that the memristor is radically different from the other fundamental circuit elements is that, unlike them, it carries a memory of its past. When you turn off the voltage to the circuit, the memristor still remembers how much was applied before and for how long. That’s an effect that can’t be duplicated by any circuit combination of resistors, capacitors, and inductors, which is why the memristor qualifies as a fundamental circuit element.
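This memory effect can be illustrated with the linear-drift model often used to describe the HP titanium dioxide device. The parameter values below are illustrative placeholders, not measured figures.

```python
import math

def simulate_memristor(voltage, dt, d=1e-8, r_on=100.0, r_off=16e3,
                       mu_v=1e-14, w0=0.5e-8):
    """Linear-drift memristor model (all parameter values are illustrative).

    The state w (width of the doped region) integrates the current:
        dw/dt = mu_v * (r_on / d) * i(t)
    and the memristance interpolates between r_on and r_off with w/d.
    """
    w = w0
    for v in voltage:
        m = r_on * (w / d) + r_off * (1.0 - w / d)   # memristance M(w)
        i = v / m
        w += mu_v * (r_on / d) * i * dt              # state drifts with charge
        w = min(max(w, 0.0), d)                      # clamp to physical range
    return w

dt = 1e-4
# Half a cycle of a 1 Hz sine drive (always >= 0), then the voltage is removed.
drive = [math.sin(2 * math.pi * 1.0 * k * dt) for k in range(5000)]
w_driven = simulate_memristor(drive, dt)
w_rest = simulate_memristor([0.0] * 1000, dt, w0=w_driven)
# The state moved while driven and stays put once the voltage is off: memory.
print(w_driven > 0.5e-8, w_rest == w_driven)  # True True
```

No combination of R, L and C reproduces this behaviour: with zero applied voltage their state relaxes, while the memristor's resistance simply stays where the charge history left it.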

The arrangement of these few fundamental circuit components forms the basis of almost all of the electronic devices we use in our everyday life. Thus the discovery of a brand new fundamental circuit element is something not to be taken lightly and has the potential to open the door to a brand new type of electronics. HP already has plans to implement memristors in a new type of non-volatile memory which could eventually replace flash and other memory systems.

Methanol Fueled Marine Diesel Engine

Added on: January 19th, 2017 by Afsal Meerankutty No Comments

Nowadays, concerns about methanol have grown worldwide from the viewpoints of environmental protection and fuel versatility. Energetic research on methanol-fueled automobile engines has advanced since the oil crisis, motivated by low environmental pollution and the use of alternative fuels, and such engines are now being tested on vehicles in various countries. In the field of marine engines, the desire to save maintenance cost and labour prevails alongside the environmental concerns. From these motives, scientists have carried out research and development of a methanol-fueled marine diesel engine, which is quite different from automobile engines in size, main particulars, working conditions and durability.

Although scientists have made great use of invaluable knowledge from automotive technology, some special studies were necessary because of these differences. The ignition method is a typical one. A dual fuel injection system was tried for trouble-free ignition of the methanol fuel. This system is thought to be the most favourable ignition method for marine diesel engines, which have to withstand quick load changes and tolerate no misfiring.

Pill Camera

Added on: January 12th, 2017 by Afsal Meerankutty No Comments

The aim of technology is to make products on a large scale for cheaper prices and with increased quality. Current technology has attained part of this, but manufacturing remains at the macro level. The future lies in manufacturing products right from the molecular level. Research in this direction started back in the eighties, when manufacturing at the molecular and atomic level was laughed at. Thanks to the advent of nanotechnology, we have since realized it to a certain level. One such product is the pill camera, which is used for the treatment of cancer, ulcers and anaemia. It has revolutionized the field of medicine. This tiny capsule can pass through our body without causing any harm.

It takes pictures of our intestine and transmits them to a receiver for computer analysis of our digestive system. This process can help track any kind of disease related to the digestive system. We also discuss the drawbacks of the pill camera and how they can be overcome using a grain-sized motor and a bi-directional wireless telemetry capsule. Besides this, we review the process of manufacturing products using nanotechnology. Some other important applications are also discussed, along with their potential impact on various fields.

We have made great progress in manufacturing products. Looking back from where we stand now, we started with flint knives and stone tools and have reached the stage where we make such tools with more precision than ever. The leap in technology is great, but it is not going to stop here. With our present technology we manufacture products by casting, milling, grinding, chipping and the like. With these technologies we have made more things at lower cost and greater precision than ever before. In the manufacture of these products we have been arranging atoms in great thundering statistical herds. All manufactured products are made from atoms, and their properties depend on how those atoms are arranged: rearrange the atoms in dirt, water and air and you can get grass. The next step in manufacturing technology is to manufacture products at the molecular level. The technology used to achieve this is nanotechnology: the creation of useful materials, devices and systems through the manipulation of such minuscule matter. Nanotechnology deals with objects measured in nanometres; a nanometre is a billionth of a metre, a millionth of a millimetre, or about 1/80,000 of the width of a human hair.

Overall Equipment Effectiveness

Added on: January 11th, 2017 by Afsal Meerankutty No Comments

In today’s economy, you’re expected to continuously improve your Return on Total Capital. And as capital to build new, more efficient plants becomes more difficult to obtain, you often have to meet growing production demands with current equipment and facilities — while continuing to cut costs.

There are several ways you can optimize your processes to improve profitability. But it can be difficult to understand the overall effectiveness of a complex operation so you can decide where to make improvements. That’s especially true when the process involves multiple pieces of equipment that affect each other’s effectiveness.

One metric that can help you meet this challenge is Overall Equipment Effectiveness, or OEE. OEE measures the health and reliability of a process relative to the desired operating level. It can show you how well you’re utilizing resources, including equipment and labor, to satisfy customers by matching product quality and supply requirements.

Overall Equipment Effectiveness (OEE) measures total performance by relating the availability of a process to its productivity and output quality.

OEE addresses all losses caused by the equipment, including:
• not being available when needed because of breakdowns or set-up and adjustment losses;
• not running at the optimum rate because of reduced speed or idling and minor stoppage losses;
• not producing first-pass quality output because of defects and rework or start-up losses.

OEE was first used by Seiichi Nakajima, the founder of total productive maintenance (TPM), in describing a fundamental measure for tracking production performance. He challenged the complacent view of effectiveness by focusing not simply on keeping equipment running smoothly, but on creating a sense of joint responsibility between operators and maintenance workers to extend and optimize overall equipment performance.

First applied in discrete manufacturing, OEE is now used throughout process, batch, and discrete production plants.

Medical Imaging

Added on: January 10th, 2017 by Afsal Meerankutty No Comments

The increasing capabilities of medical imaging devices have strongly facilitated diagnosis and surgery planning. During the last decades, the technology has evolved enormously, resulting in a never-ending flow of high-dimensional and high-resolution data that need to be visualized, analyzed, and interpreted. The development of computer hardware and software has provided invaluable tools for performing these tasks, but it is still very hard to exclude the human operator from the decision making. The process of stating a medical diagnosis or conducting surgical planning is simply too complex to fully automate. Therefore, interactive or semi-automatic methods for image analysis and visualization are needed, where the user can explore the data efficiently and provide his or her expert knowledge as input to the methods.

All software currently being written for medical imaging systems has to conform to the DICOM (Digital Imaging and Communications in Medicine) standard, to ensure that systems from different vendors can successfully share information. You can, for example, acquire an image on a Siemens viewing station and do the processing on a Philips station; multimodal stations (a single station able to process, say, MRI as well as CT images) are already in common use. Vendors can also embed private information that only their own software and viewing stations can read, so as to enhance their equipment. For example, a Philips acquisition system can acquire and transmit more information than the standard prescribes; such extra information can be deciphered only by Philips' own software. Even though the basic job is image processing, the algorithms used in medical software can be vastly different from those used in commercial image-manipulation software such as movie software or Photoshop. The reason is that medical systems have to preserve a very high degree of accuracy and detail, or there could be fatal results.
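To make the interoperability point concrete, here is a hand-rolled sketch of how a single DICOM data element is laid out on the wire under the explicit-VR little-endian transfer syntax: tag (group, element), a two-character VR code, a length field, then the value padded to even length. Real systems use a conformant DICOM toolkit rather than code like this.

```python
import struct

def encode_element(group, element, vr, value):
    """Encode one DICOM data element (explicit VR little endian, short form).

    Short-form VRs such as PN, LO and DA use a 2-byte length field. String
    values are padded to even length with a space, as the standard requires.
    """
    data = value.encode("ascii")
    if len(data) % 2:
        data += b" "  # pad to even length
    header = struct.pack("<HH2sH", group, element, vr.encode("ascii"), len(data))
    return header + data

# Patient's Name is tag (0010,0010) with VR "PN":
blob = encode_element(0x0010, 0x0010, "PN", "DOE^JOHN")
print(blob.hex())  # 10001000504e0800444f455e4a4f484e
```

Because every vendor serializes elements this same way, a dataset written by one acquisition station can be parsed by any other conformant viewer.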

Smart Quill

Added on: January 3rd, 2017 by Afsal Meerankutty No Comments

Lyndsay Williams of Microsoft Research's Cambridge (UK) lab is the inventor of SmartQuill, a pen that can remember the words it is used to write and then transform them into computer text. The idea that "it would be neat to put all of a handheld PDA-type computer in a pen" came to the inventor in her sleep. "It's the pen for the new millennium," she says. Encouraged by Nigel Ballard, a leading consultant to the mobile computer industry, Williams took her prototype to the British Telecommunications research lab, where she was promptly hired and given money and institutional support for her project. The prototype, called SmartQuill, was developed at the world-leading research laboratories run by BT (formerly British Telecom) at Martlesham, in eastern England. It is claimed to be the biggest revolution in handwriting since the invention of the pen.

With the introduction of handheld computers, the trend has been to prefer small computers for everyday computation, pushing manufacturers toward almost gadget-like machines. But reducing the size of handheld computers can only be taken so far before they become unusable: keyboards become so tiny you need needle-like fingers to operate them, and screens need constant cursor control just to read simple text.

The introduction of SmartQuill solves some of these problems. SmartQuill is a pen that can remember the words it is used to write and then transform them into computer text. The pen is slightly larger than an ordinary fountain pen, with a screen on the barrel. The user enters information into its applications by pushing a button, writing in his or her own handwriting on any platform: paper, a screen, a tablet, or even the air. There is also a small three-line screen for reading the information stored in the pen; users scroll it by tilting the pen. The pen is then plugged into an electronic docking station, and the text data is transmitted to a desktop computer, printer or modem, or to a mobile telephone for sending files electronically.

Graphical Password Authentication

Added on: December 29th, 2016 by Afsal Meerankutty No Comments

The most common computer authentication method is to use alphanumerical usernames and passwords. This method has been shown to have significant drawbacks. For example, users tend to pick passwords that can be easily guessed. On the other hand, if a password is hard to guess, then it is often hard to remember.

To address this problem, some researchers have developed authentication methods that use pictures as passwords. In this paper, we conduct a comprehensive survey of the existing graphical password techniques. We classify these techniques into two categories: recognition-based and recall-based approaches. We discuss the strengths and limitations of each method and point out the future research directions in this area.

We also try to answer two important questions: "Are graphical passwords as secure as text-based passwords?" and "What are the major design and implementation issues for graphical passwords?" Alongside the survey of existing graphical password authentication techniques, we also propose a new technique for graphical authentication.
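The survey does not fix a particular scheme here, so the following is a generic sketch of the recognition-based category: the user's ordered selection of pass-images is salted and hashed exactly as a text password would be. Image names and parameters are hypothetical.

```python
import hashlib
import hmac
import os

def enroll(image_ids):
    """Store a salted hash of the ordered image choices (recognition-based)."""
    salt = os.urandom(16)
    digest = hashlib.sha256(salt + ",".join(image_ids).encode()).digest()
    return salt, digest

def verify(salt, digest, attempt_ids):
    """Recompute the hash for the attempted selection and compare."""
    candidate = hashlib.sha256(salt + ",".join(attempt_ids).encode()).digest()
    return hmac.compare_digest(candidate, digest)  # constant-time compare

salt, digest = enroll(["cat", "bridge", "guitar", "lake"])
print(verify(salt, digest, ["cat", "bridge", "guitar", "lake"]))  # True
print(verify(salt, digest, ["bridge", "cat", "guitar", "lake"]))  # False (order matters)
```

Note that hashing only covers storage; the scheme's real security question, how guessable the image choices themselves are, is exactly what the survey's comparison with text passwords examines.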


BrainGate

Added on: December 26th, 2016 by Afsal Meerankutty No Comments

BrainGate is a brain implant system developed by the bio-tech company Cyberkinetics in 2003 in conjunction with the Department of Neuroscience at Brown University. The device was designed to help those who have lost control of their limbs, or other bodily functions, such as patients with amyotrophic lateral sclerosis (ALS) or spinal cord injury. The computer chip, which is implanted into the brain, monitors brain activity in the patient and converts the intention of the user into computer commands. Cyberkinetics describes that “such applications may include novel communications interfaces for motor impaired patients, as well as the monitoring and treatment of certain diseases which manifest themselves in patterns of brain activity, such as epilepsy and depression.”

The BrainGate Neural Interface device consists of a tiny chip containing 100 microscopic electrodes that is surgically implanted in the brain's motor cortex. The whole apparatus is the size of a baby aspirin. The chip can read signals from the motor cortex, send that information to a computer via connected wires, and translate it to control the movement of a computer cursor or a robotic arm. According to Dr. John Donoghue of Cyberkinetics, there is practically no training required to use BrainGate because the signals read by a chip implanted, for example, in the area of the motor cortex for arm movement, are the same signals that would be sent to the real arm. A user with an implanted chip can immediately begin to move a cursor with thought alone.

The BrainGate technology platform was designed to take advantage of the fact that many patients with motor impairment have an intact brain that can produce movement commands. This may allow the BrainGate system to create an output signal directly from the brain, bypassing the route through the nerves to the muscles that cannot be used in paralysed people.

Ac Synchronous Generators

Added on: January 23rd, 2016 by Afsal Meerankutty No Comments

AC generators come in two basic types: synchronous and non-synchronous. Synchronous generators lock in with the fundamental line frequency and rotate at a synchronous speed related to the number of poles, similar to AC synchronous motors. Synchronous generator stator windings are similar to a three-phase synchronous motor stator winding. Synchronous generator rotor fields may be either salient or non-salient pole. Salient pole (also called spider or projected pole) means the rotor has distinct pole windings mounted on the rotor shaft using dovetail joints; these pole windings are wound around field poles. Salient pole rotors are most commonly used in slow-speed applications and tend to have at least six poles. Salient pole rotors typically have damper windings to reduce the rotor oscillations (caused by large flux changes between the individual poles) that occur during operation.
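The relation between line frequency, pole count and locked-in speed is the standard formula Ns = 120·f/P, which can be checked in a couple of lines:

```python
def synchronous_speed_rpm(line_freq_hz, poles):
    """Ns = 120 * f / P: the speed at which a synchronous machine locks in."""
    return 120.0 * line_freq_hz / poles

# A 6-pole salient-pole generator on a 50 Hz system:
print(synchronous_speed_rpm(50, 6))   # 1000.0
# The same machine on a 60 Hz system:
print(synchronous_speed_rpm(60, 6))   # 1200.0
```

This is why slow-speed salient-pole machines need many poles: the more poles, the lower the shaft speed for the same line frequency.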

Wearable Computers

Added on: November 3rd, 2013 by Afsal Meerankutty 4 Comments

As computers move from the desktop to the palmtop, onto our bodies and into our everyday lives, infinite opportunities arise to realize applications that have never before been possible. To date, personal computers have not lived up to their name. Most machines sit on a desk and interact with their owners only a small fraction of the day. A person's computer should be worn, much as eyeglasses or clothing are worn, and should interact with the user based on the context of the situation. With the current accessibility of wireless local area networks and a host of other context-sensing and communication tools, coupled with the current scale of miniaturization, it is becoming clear that the computer should act as an intelligent assistant, whether through a remembrance agent, augmented reality, or intellectual collectives. It is also important that a computer be small, something we could slip into a pocket or, better still, wear like a piece of clothing. It is rapidly becoming apparent that the next technological leap is to integrate the computer and the user in a non-invasive manner; this leap will bring us into the fascinating world of wearable computers.


DakNet

Added on: November 3rd, 2013 by Afsal Meerankutty 10 Comments

DakNet provides extraordinarily low-cost digital communication, letting remote villages leapfrog past the expense of traditional connectivity solutions and begin development of a full coverage broadband wireless infrastructure. DakNet, an ad hoc network that uses wireless technology to provide asynchronous digital connectivity, is evidence that the marriage of wireless and asynchronous service may indeed be the beginning of a road to universal broadband connectivity.
This paper briefly explains what DakNet is, how wireless technology is implemented in DakNet, its fundamental operation and applications, cost estimates, advantages and disadvantages, and finally how to connect Indian villages with towns, cities and global markets.

BiCMOS Technology

Added on: November 3rd, 2013 by Afsal Meerankutty No Comments

The need for high-performance, low-power, and low-cost systems for network transport and wireless communications is driving silicon technology toward higher speed, higher integration, and more functionality. Furthermore, this integration of RF and analog mixed-signal circuits into high-performance digital signal-processing (DSP) systems must be done with minimum cost overhead to be commercially viable. While some analog and RF designs have been attempted in mainstream digital-only complementary metal-oxide semiconductor (CMOS) technologies, almost all designs with stringent RF requirements use bipolar or BiCMOS technology. Silicon integrated circuit (IC) products that, at present, require modern bipolar or BiCMOS silicon technology in the wired application space include synchronous optical network (SONET) and synchronous digital hierarchy (SDH) parts operating at 10 Gb/s and higher.

The viability of a mixed digital/analog/RF chip depends on the cost of making the silicon with the required elements; in practice, it must approximate the cost of a plain CMOS wafer. Cycle times for processing the wafer should not significantly exceed those for a digital CMOS wafer, and yields of the SOC chip must be similar to those of a multi-chip implementation. Much of this article examines process techniques that achieve the objectives of low cost, rapid cycle time, and solid yield.

Space Time Adaptive Processing

Added on: October 31st, 2013 by Afsal Meerankutty 2 Comments

Space-time adaptive processing (STAP) is a signal processing technique most commonly used in radar systems. It involves adaptive array processing algorithms to aid in target detection. Radar signal processing benefits from STAP in areas where interference is a problem (e.g. ground clutter and jamming). Through careful application of STAP, it is possible to achieve order-of-magnitude sensitivity improvements in target detection.
STAP involves a two-dimensional filtering technique using a phased-array antenna with multiple spatial channels. Coupling multiple spatial channels with pulse-Doppler waveforms is what gives "space-time" its name. From the statistics of the interference environment, an adaptive STAP weight vector is formed, and this weight vector is applied to the coherent samples received by the radar.
In a ground moving target indicator (GMTI) system, an airborne radar collects the returned echo from the moving target on the ground. However, the received signal contains not only the reflected echo from the target, but also the returns from the illuminated ground surface. The return from the ground is generally referred to as clutter.
The clutter return comes from all the areas illuminated by the radar beam, so it occupies all range bins and all directions. The total clutter return is often much stronger than the returned signal echo, which poses a great challenge to target detection. Clutter filtering, therefore, is a critical part of a GMTI system.
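One common form of the adaptive weight computation sketched above is the MVDR-style solution w = R⁻¹v / (vᴴR⁻¹v), where R is the interference-plus-noise covariance estimated from training range bins and v is the space-time steering vector. The covariance and steering vector below are toy stand-ins, not data from any radar.

```python
import numpy as np

def stap_weights(R, v):
    """MVDR-style space-time weights: w = R^{-1} v / (v^H R^{-1} v)."""
    Rinv_v = np.linalg.solve(R, v)          # avoids explicitly inverting R
    return Rinv_v / (v.conj() @ Rinv_v)

# Toy scenario: 4 channels x 8 pulses = 32 adaptive degrees of freedom,
# white noise plus one strong interferer with space-time signature j.
rng = np.random.default_rng(0)
n = 32
j = np.exp(1j * 2 * np.pi * rng.random(n))        # interference signature
R = np.eye(n) + 100.0 * np.outer(j, j.conj())     # noise + jammer covariance
v = np.ones(n, dtype=complex)                     # target steering vector
w = stap_weights(R, v)

# The adapted weights keep unit gain on the target while nulling the jammer:
print(round(abs(w.conj() @ v), 6))                # 1.0
print(abs(w.conj() @ j) < 0.1)                    # True
```

In a GMTI system the same mechanism places a two-dimensional null along the clutter ridge in angle-Doppler space while preserving gain on the target under test.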


Biosensors

Added on: October 31st, 2013 by Afsal Meerankutty 2 Comments

A biosensor is a device for the detection of an analyte that combines a biological component with a physicochemical detector component. Many optical biosensors based on the phenomenon of surface plasmon resonance use evanescent wave techniques. The most widespread example of a commercial biosensor is the blood glucose biosensor, which uses the enzyme glucose oxidase to break blood glucose down.
Biosensors are the combination of a bioreceptor and a transducer. The bioreceptor is a biomolecule that recognizes the target, whereas the transducer converts the recognition event into a measurable signal. Biosensors are used in many diverse areas of the market; among the biggest is clinical testing, a diagnostics market of some US$4,000 million.
They are very useful for measuring a specific analyte with great accuracy, and readings can be taken directly and rapidly. They are also simple: receptor and transducer are integrated into a single sensor that needs no reagents.
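The transducer-to-readout step can be sketched as a calibration problem: fit the sensor current against standards of known concentration, then invert the fit for unknown samples. The glucose figures below are made up but plausibly linear over an enzyme electrode's working range.

```python
def fit_calibration(concentrations, currents):
    """Least-squares line i = a*c + b from standards of known concentration."""
    n = len(concentrations)
    mean_c = sum(concentrations) / n
    mean_i = sum(currents) / n
    num = sum((c - mean_c) * (i - mean_i)
              for c, i in zip(concentrations, currents))
    den = sum((c - mean_c) ** 2 for c in concentrations)
    a = num / den
    b = mean_i - a * mean_c
    return a, b

def read_concentration(current, a, b):
    """Invert the calibration: transducer current back to analyte concentration."""
    return (current - b) / a

# Calibrate with glucose standards (mmol/L) against sensor current (uA):
a, b = fit_calibration([0.0, 5.0, 10.0, 15.0], [0.1, 2.6, 5.1, 7.6])
print(round(read_concentration(3.85, a, b), 2))  # 7.5
```

Commercial glucose meters embed exactly this kind of factory calibration, which is how a raw electrode current becomes a clinically meaningful reading.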

Facial Recognition System

Added on: October 29th, 2013 by Afsal Meerankutty 1 Comment

Wouldn't you love to replace password-based access control, to avoid having to reset forgotten passwords and worry about the integrity of your system? Wouldn't you like to rest secure in the comfort that your healthcare system does not rely merely on your social security number as proof of your identity when granting access to your medical records?
Because each of these questions is becoming more and more important, access to reliable personal identification is becoming increasingly essential. Conventional methods of identification based on possession of ID cards, or on exclusive knowledge such as a social security number or a password, are not altogether reliable. ID cards can be lost, forged or misplaced; passwords can be forgotten or compromised. But a face is undeniably connected to its owner; it cannot be borrowed, stolen or easily forged. Hence the importance of facial recognition systems.


IPv6: The Next Generation Protocol

Added on: October 23rd, 2013 by Afsal Meerankutty 2 Comments

Internet Protocol version 6 (IPv6) comes with a package of advantages, including a simple header format, a very large address space and extensibility. However, IPv6 packet transmission still uses the traditional infrastructure of protocol stacks such as TCP/IP, so these advantages cannot be exploited fully. One of the limitations of TCP/IP is the duplication of error-detection-code verification and regeneration in the Data Link layer: every router has to verify the CRC code at the incoming port and regenerate it at the outgoing port before forwarding an IPv6 packet to the next router. At today's networking speeds this is a time-consuming task. This paper proposes a CRC Extension Header (CEH) that performs error detection in the Network layer and replaces the current error detection in the Data Link layer. With CEH, verification of the CRC code is done only at the final destination, as indicated by the destination address field of the IPv6 header. Experimental results showed that the network latency of IPv6 packet transmission decreased by 68%.

Automatic Teller Machine

Added on: October 23rd, 2013 by Afsal Meerankutty No Comments

An Automatic Teller Machine (ATM) is a machine permitting a bank's customers to make cash withdrawals and check their account balances at any time, without the need for a human teller. Many ATMs also allow people to deposit cash or cheques and transfer money between their bank accounts.

You’re short on cash, so you walk over to the automated teller machine (ATM), insert your card into the card reader, respond to the prompts on the screen, and within a minute you walk away with your money and a receipt. These machines can now be found at most supermarkets, convenience stores and travel centers. Have you ever wondered about the process that makes your bank funds available to you at an ATM on the other side of the country?

Biocolours – Safe food colours

Added on: October 23rd, 2013 by Afsal Meerankutty No Comments

Biocolours: A New Generation Additive for Industries?
Biocolours, or natural dyes, are derived from plants, insects and minerals. The use of such colouring matter is rooted in antiquity. Relics from the excavations of the Harappan culture have yielded evidence of ropes and fabrics dyed with natural colours. The caves of Ajanta (the earliest dating back to the first century B.C.) still preserve the beauty of biocolours in their fullest splendour. In short, the use of biocolours through the art of dyeing and printing is one of our richest heritages. Biocolours paid a very heavy price with the development of the synthetic genre of dyestuffs: synthetic dyes made their advent in India in the 18th century and gradually pushed natural dyes into oblivion, owing to their superior speed of dyeing or printing and fastness of colour.

Bio-Medical Waste Management

Added on: October 11th, 2013 by Afsal Meerankutty No Comments

Medical care is vital for our life and health, but the waste generated by medical activities represents a real problem for the natural and human world. Improper management of the waste generated in health care facilities has a direct health impact on the community, on health care workers and on the environment. Every day, relatively large amounts of potentially infectious and hazardous waste are generated in health care hospitals and facilities around the world. Indiscriminate disposal of biomedical or hospital waste, and exposure to such waste, poses a serious threat to the environment and to human health, and requires specific treatment and management prior to final disposal. The present review article deals with basic issues such as the definition and categories of biomedical waste, the problems relating to it, and the procedures for handling and disposal in biomedical waste management. It also intends to create awareness among the personnel involved in health care units.

Compensation of Harmonic Currents Utilizing AHC

Added on: October 10th, 2013 by Afsal Meerankutty 2 Comments

In little more than ten years, electric power quality has grown from obscurity to a major issue.

Electronic converters and power electronics have given birth to numerous new applications, offering unmatched comfort, flexibility and efficiency to customers. However, their proliferation during the last decade is creating growing concern and generating more and more problems: not only do these electronic loads pollute the AC distribution system with harmonic currents, but they also appear to be very sensitive to voltage distortion. Power quality is therefore becoming a major issue for utilities and for their customers, and both are quickly adopting the philosophy and the limits proposed by the new international standards (IEEE 519-1992, IEC 61000-3-2/4).

Today, recent advances in power electronic technology provide an unprecedented capability for conditioning and compensating the harmonic distortion generated by non-linear loads. The case study presented in this paper demonstrates the roles of the power source, the load and the AC distribution system as regards power quality. The benefit of harmonic cancellation equipment is clearly shown. Among the different technical solutions, a shunt, current-injection-mode active harmonic conditioner is evaluated, and detailed site measurements are presented as confirmation of its unsurpassed performance. This new active conditioner appears to be the easiest to use and the most flexible, efficient and cost-effective solution.
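The standard way to quantify the harmonic pollution described above is total harmonic distortion (THD): the RMS of the harmonic components relative to the fundamental. The sketch below computes THD from a list of current amplitudes; the 6-pulse-rectifier values are illustrative textbook figures, not measurements from this paper's case study.

```python
import math

def thd(amplitudes):
    """Total harmonic distortion: RMS of harmonics over the fundamental.
    amplitudes[0] is the fundamental; the rest are harmonic amplitudes."""
    fundamental, harmonics = amplitudes[0], amplitudes[1:]
    return math.sqrt(sum(h * h for h in harmonics)) / fundamental

# A 6-pulse rectifier draws characteristic 5th, 7th, 11th, 13th harmonics,
# roughly 1/h of the fundamental (illustrative values, in amps):
currents = [100.0, 20.0, 14.3, 9.1, 7.7]  # I1, I5, I7, I11, I13
print(f"THD = {thd(currents):.1%}")       # → THD = 27.3%
```

An active harmonic conditioner in shunt, current-injection mode measures this harmonic content and injects the opposite currents, driving the THD seen by the source toward the limits set by IEEE 519-1992.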

E-Paper Technology

Added on: October 10th, 2013 by Afsal Meerankutty 1 Comment

E-paper is a revolutionary material that can be used to make next-generation electronic displays. It is a portable, reusable storage and display medium that looks like paper but can be written on repeatedly, thousands of times. These displays mark the beginning of a new era for battery-powered information appliances such as cell phones, pagers, watches and hand-held computers.

Two companies are carrying out pioneering work in the development of electronic ink, and both have devised ingenious methods to produce it. One is E Ink, a company based in Cambridge in the U.S.A.; the other is Xerox, doing research at Xerox's Palo Alto Research Center. Both technologies, being developed commercially for electronically configurable paper-like displays, rely on microscopic beads that change colour in response to the charges on nearby electrodes.

Like traditional paper, e-paper must be lightweight, flexible, glare-free and low cost. Research suggests that in just a few years this technology could replace paper in many situations, leading us into a truly paperless world.

Haptic Systems

Added on: October 10th, 2013 by Afsal Meerankutty 5 Comments

‘Haptics’ is a technology that adds the sense of touch to virtual environments. Users are given the illusion that they are touching or manipulating a real physical object.
This seminar discusses the important concepts in haptics and some of the most commonly used haptic systems, such as the Phantom, the CyberGlove and the Novint Falcon. Following this, it describes how sensors and actuators are used to track the position and movement of haptic devices.
The different types of force-rendering algorithms are discussed next, along with the building blocks of force rendering. Finally, a few applications of haptic systems are taken up for discussion.
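The force-rendering idea can be sketched with the simplest penalty-based model: when the virtual tool penetrates a surface, the device pushes back with a spring force proportional to penetration depth. The function below is a generic illustration of that principle under assumed units, not the API of the Phantom, CyberGlove or Novint Falcon.

```python
def render_contact_force(tool_pos, surface_y=0.0, k=500.0):
    """Penalty-based force rendering: Hooke's-law push out of the surface.
    tool_pos and surface_y in metres, stiffness k in N/m (assumed values)."""
    penetration = surface_y - tool_pos
    if penetration <= 0:
        return 0.0          # tool is above the surface: free motion, no force
    return k * penetration  # restoring force in newtons, directed upward

assert render_contact_force(0.01) == 0.0      # 1 cm above surface
assert render_contact_force(-0.002) == 1.0    # 2 mm penetration, k = 500 N/m
```

Real force-rendering loops run this computation at around 1 kHz so the spring force feels like a solid wall rather than a soft bounce; the stiffness k is limited by the device's actuators and stability.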

Cryogenic Grinding

Added on: October 9th, 2013 by Afsal Meerankutty 1 Comment

Cryogenic grinding, also known as freezer milling, freezer grinding, and cryomilling, is the act of cooling or chilling a material and then reducing it into a small particle size. For example, thermoplastics are difficult to grind to small particle sizes at ambient temperatures because they soften, adhere in lumpy masses and clog screens. When chilled by dry ice, liquid carbon dioxide or liquid nitrogen, the thermoplastics can be finely ground to powders suitable for electrostatic spraying and other powder processes. Cryogenic grinding of plant and animal tissue is a technique used by microbiologists. Samples that require extraction of nucleic acids must be kept at −80 °C or lower during the entire extraction process. For samples that are soft or flexible at room temperature, cryogenic grinding may be the only viable technique for processing samples. A number of recent studies report on the processing and behavior of nanostructured materials via cryomilling.

IP Spoofing

Added on: October 9th, 2013 by Afsal Meerankutty 1 Comment

IP spoofing is a method of attacking a network in order to gain unauthorized access. The attack is based on the fact that Internet communication between distant computers is routinely handled by routers which find the best route by examining the destination address, but generally ignore the origination address. The origination address is only used by the destination machine when it responds back to the source.

In a spoofing attack, the intruder sends messages to a computer indicating that the message has come from a trusted system. To be successful, the intruder must first determine the IP address of a trusted system, and then modify the packet headers so that the packets appear to be coming from that trusted system.
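The header modification at the heart of the attack can be illustrated by packing an IPv4 header whose source-address field is simply set to the trusted machine's address (field layout per RFC 791). This sketch only builds the bytes for illustration: actually sending them would require a raw socket and elevated privileges, and the header checksum is left at zero here.

```python
import socket
import struct

def build_ipv4_header(src_ip, dst_ip, payload_len=0):
    """Pack a 20-byte IPv4 header with a forged source address (illustrative)."""
    version_ihl = (4 << 4) | 5            # IPv4, 5 x 32-bit words = 20-byte header
    return struct.pack(
        "!BBHHHBBH4s4s",
        version_ihl, 0, 20 + payload_len,  # version/IHL, TOS, total length
        0, 0,                              # identification, flags/fragment offset
        64, socket.IPPROTO_TCP, 0,         # TTL, protocol, checksum (left 0 here)
        socket.inet_aton(src_ip),          # forged "trusted" source address
        socket.inet_aton(dst_ip),          # real destination
    )

hdr = build_ipv4_header("10.0.0.5", "192.168.1.1")
assert len(hdr) == 20
assert socket.inet_ntoa(hdr[12:16]) == "10.0.0.5"  # spoofed source in place
```

Because routers forward on the destination address alone, nothing on the path challenges the forged source field; only ingress filtering at the network edge (or the destination's own checks) can catch it.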

Black Box

Added on: October 8th, 2013 by Afsal Meerankutty 5 Comments

As technology has progressed, travel speeds have increased and source and destination have come ever closer together. The major advance has been air travel by airplane, one of technology's great discoveries. But as speeds increased, the horror of the air crash also appeared: a crash from a height of 2,000 m or more is a terror for anybody. So, to obtain feedback on the various activities that happen in a plane, engineers needed a mechanism to record them.
With any airplane crash, there are many unanswered questions as to what brought the plane down. Investigators turn to the airplane’s flight data recorder (FDR) and cockpit voice recorder (CVR), also known as “black boxes,” for answers. In Flight 261, the FDR contained 48 parameters of flight data, and the CVR recorded a little more than 30 minutes of conversation and other audible cockpit noises.

Thermo Acoustic Refrigeration

Added on: October 7th, 2013 by Afsal Meerankutty 2 Comments

Thermoacoustic effects have been known for many years, but the use of this phenomenon to develop engines and pumps is fairly recent. Thermoacoustic refrigeration uses high-intensity sound waves in a pressurized gas tube to pump heat from one place to another and produce a refrigeration effect. In this type of refrigeration all conventional refrigerants are eliminated and sound waves take their place: all we need is a loudspeaker and an acoustically insulated tube. The system also completely eliminates the need for lubricants and results in about 40% less energy consumption. Thermoacoustic heat engines have the advantage of operating with inert gases and with little or no moving parts, making them highly efficient, ideal candidates for environmentally safe refrigeration with almost zero maintenance cost. We then look into a thermoacoustic refrigerator, its principle and its functions.

Construction and Safety Management

Added on: October 6th, 2013 by Afsal Meerankutty No Comments

Managing construction sites is difficult owing to the temporary nature of the work, changing work areas, untrained workers, overtime work and so on. Most accidents occur in construction activities such as building structures, demolition, excavation, roof work, alteration, scaffolding and painting. In Britain, about 1,500 people are killed in typical construction work and 25,000–30,000 more are seriously injured. These accident statistics represent not only serious human tragedies but also substantial economic losses: accidents cause damage to plant and equipment, loss of productive time, loss of morale among workers, increased compensation, and loss of image and reputation for the industry. The key to a successful construction project is to identify vulnerable hazards and eliminate or minimize them. To avoid accidents, the causes of accidents and the reliability of the statistics must be analysed. The various construction regulations should be followed, a safety policy should be framed, workers should be trained in safe methods of work, safe physical conditions should be provided on construction sites, sufficient and suitable personal protective equipment should be provided and its use insisted upon, and supervision of the work area must be ensured.

Underwater Communication Systems

Added on: October 3rd, 2013 by Afsal Meerankutty 7 Comments

There is a high demand for underwater communication systems due to the increase in current human underwater activities. Underwater communication systems employ either sonar or electromagnetic waves as a means of transferring signals. These waves are different physically and electrically, and thus the systems that employ them also differ in their design architecture, wave propagation and devices used for emission and reception. As a result, the two systems have varying advantages and limitations. This paper presents an in-depth review of underwater communication based on sonar and electromagnetic waves, a comparison of the two systems and a discussion of the environmental impacts of using these waves for underwater communication. In the tradeoff between preserving the underwater environment and the need for underwater communication, it appears that underwater electromagnetic wave communication has the most potential to be the environmentally-friendly system of the future.
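One physical fact behind this comparison is how quickly electromagnetic waves attenuate in conductive seawater, captured by the skin depth δ = 1/√(π f μ σ). The sketch below evaluates that standard formula using a typical assumed seawater conductivity of 4 S/m; the exact value varies with salinity and temperature.

```python
import math

MU0 = 4 * math.pi * 1e-7   # permeability of free space, H/m
SIGMA_SEAWATER = 4.0       # typical seawater conductivity, S/m (assumed value)

def skin_depth(freq_hz):
    """Depth at which an EM wave in seawater decays to 1/e of its amplitude."""
    return 1.0 / math.sqrt(math.pi * freq_hz * MU0 * SIGMA_SEAWATER)

for f in (100.0, 10e3, 1e6):
    print(f"{f:>9.0f} Hz -> skin depth {skin_depth(f):6.2f} m")
```

Running this shows roughly 25 m at 100 Hz shrinking to a fraction of a metre at 1 MHz, which is why practical underwater EM links are limited to very low frequencies or very short ranges, while acoustic waves propagate for kilometres.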

Sonar and Acoustic Waves Communication

Added on: October 3rd, 2013 by Afsal Meerankutty 1 Comment

In recent years, the demand for underwater communication has increased with the growth of human underwater activity. Underwater communication can be carried out using sonar waves, electromagnetic waves or acoustic waves, which differ in nature. This paper presents an overview of sonar-wave and acoustic-wave underwater communication, shows that acoustic-wave communication is better than sonar-wave communication, and explains the factors that affect acoustic-wave communication.

Finger Scan Technology

Added on: October 1st, 2013 by Afsal Meerankutty No Comments

Reliable user authentication is becoming an increasingly important task in the Web-enabled world. The consequences of an insecure authentication system in a corporate or enterprise environment may include loss of confidential information, denial of service, and compromised data integrity. The prevailing techniques of user authentication, which involve the use of either passwords and user IDs (identifiers), or identification cards and PINs (personal identification numbers), suffer from several limitations. Once an intruder acquires the user ID and the password, the intruder has total access to the user’s resources.
Fortunately, automated biometrics in general, and fingerprint technology in particular, can provide a much more accurate and reliable user authentication method. Biometrics is a rapidly advancing field concerned with identifying a person based on his or her physiological or behavioural characteristics. Examples of automated biometrics include fingerprint, face, iris and speech recognition. Because a biometric property is an intrinsic property of an individual, it is difficult to duplicate surreptitiously and nearly impossible to share. The greatest strength of biometrics, the fact that a biometric does not change over time, is at the same time its greatest liability: once a set of biometric data has been compromised, it is compromised forever.

Adding Intelligence to Internet Using Satellites

Added on: September 30th, 2013 by Afsal Meerankutty 2 Comments

Two scaling problems face the Internet today. First, it will be years before terrestrial networks are able to provide adequate bandwidth uniformly around the world, given the explosive growth in Internet bandwidth demand and the amount of the world that is still unwired. Second, the traffic distribution is not uniform worldwide: Clients in all countries of the world access content that today is chiefly produced in a few regions of the world (e.g., North America). A new generation of Internet access built around geosynchronous satellites can provide immediate relief. The satellite system can improve service to bandwidth-starved regions of the globe where terrestrial networks are insufficient and supplement terrestrial networks elsewhere. This new generation of satellite system manages a set of satellite links using intelligent controls at the link endpoints. The intelligence uses feedback obtained from monitoring end-user behavior to adapt the use of resources. Mechanisms controlled include caching, dynamic construction of push channels, use of multicast, and scheduling of satellite bandwidth. This paper discusses the key issues of using intelligence to control satellite links, and then presents as a case study the architecture of a specific system: the Internet Delivery System, which uses INTELSAT’s satellite fleet to create Internet connections that act as wormholes between points on the globe.

River Linking

Added on: September 28th, 2013 by Afsal Meerankutty No Comments

River linking is a project that links two or more rivers by creating a network of man-made canals, providing river water to land areas that otherwise lack access to it and reducing the flow of water into the sea. It is based on the assumption that surplus water in some rivers can be diverted to deficit rivers by creating a network of canals to interconnect them.
It aims to transfer water from surplus areas to water-deficit areas in the country.
The Inter-Linking of Rivers programme will help save people living in drought-prone zones from hunger, and people living in flood-prone areas from the destruction caused by floods.

Augmented Reality

Added on: September 28th, 2013 by Afsal Meerankutty No Comments

Augmented Reality (AR) is a growing area in virtual reality research. Computer graphics have become much more sophisticated, and game graphics are pushing the barriers of photo realism. Now, researchers and engineers are pulling graphics out of your television screen or computer display and integrating them into real-world environments. This new technology blurs the line between what’s real and what’s computer-generated by enhancing what we see, hear, feel and smell.
The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. An augmented reality system generates a composite view for the user. It is a combination of the real scene viewed by the user and a virtual scene generated by the computer that augments the scene with additional information.
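At the pixel level, generating the composite view reduces to alpha blending the virtual scene over the real one. The function below is a minimal sketch of that idea; the RGB tuples and blend factor are illustrative, not a specific AR toolkit's API.

```python
def composite(real_pixel, virtual_pixel, alpha):
    """Blend a computer-generated pixel over the real scene.
    alpha = 0 shows only the real scene; alpha = 1 only the virtual overlay."""
    return tuple(round(alpha * v + (1 - alpha) * r)
                 for r, v in zip(real_pixel, virtual_pixel))

# Half-transparent red overlay on a grey camera pixel:
assert composite((100, 100, 100), (200, 0, 0), 0.5) == (150, 50, 50)
# Fully opaque overlay replaces the real pixel entirely:
assert composite((10, 10, 10), (255, 255, 255), 1.0) == (255, 255, 255)
```

A full AR system does the hard work before this step — tracking the user's viewpoint and rendering the virtual scene in registration with the real one — but the final superimposition is exactly this per-pixel combination.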

5 Pen PC Technology

Added on: September 27th, 2013 by Afsal Meerankutty 2 Comments

5 Pen PC technology, called P-ISM ("Pen-style Personal Networking Gadget Package"), is a new concept under development by NEC Corporation. At the 2003 ITU Telecom World exhibition held in Geneva, the Tokyo-based NEC Corporation displayed a conceptual $30,000 prototype of P-ISM. It is a new invention in computing associated with the communication field, and it will surely have a great impact on that field. In this device, Bluetooth is the main interconnection between the different peripherals. P-ISM is a gadget package with five functions: a pen-style cellular phone with handwriting data input, a virtual keyboard, a very small projector, a camera scanner, and a personal ID key with a cashless-pass function. P-ISMs are connected to one another through short-range wireless technology, and the whole set is connected to the Internet through the cellular phone function. This personal gadget in a minimalist pen style enables the ultimate in ubiquitous computing.

Soft Ground Tunneling

Added on: September 27th, 2013 by Afsal Meerankutty 1 Comment

In Soft Ground Tunneling workers dig soft-ground tunnels through clay, silt, sand, gravel or mud. In this type of tunnel, stand-up time — how long the ground will safely stand by itself at the point of excavation — is of paramount importance. Because stand-up time is generally short when tunneling through soft ground, cave-ins are a constant threat. To prevent this from happening, engineers use a special piece of equipment called a shield. A shield is an iron or steel cylinder literally pushed into the soft soil. It carves a perfectly round hole and supports the surrounding earth while workers remove debris and install a permanent lining made of cast iron or precast concrete. When the workers complete a section, jacks push the shield forward and they repeat the process.

Cruise Missile Technology

Added on: September 26th, 2013 by Afsal Meerankutty 2 Comments

A cruise missile is basically a small, pilotless airplane. Cruise missiles have an 8.5-foot (2.61-meter) wingspan, are powered by turbofan engines and can fly 500 to 1,000 miles (805 to 1,610 km) depending on the configuration. A cruise missile’s job in life is to deliver a 1,000-pound (450-kg) high-explosive bomb to a precise location — the target.
Cruise missiles come in a number of variations and can be launched from submarines, destroyers or aircraft. Cruise missiles generally consist of a guidance system, payload, and propulsion system, housed in an airframe with small wings and empennage for flight control. Payloads usually consist of a conventional warhead or a nuclear warhead. Cruise missiles tend to be propelled by a jet engine, turbofan engines being preferred due to their greater efficiency at low altitude. Cruise missiles are designed to deliver a large warhead over long distances with high accuracy. Modern cruise missiles can travel at supersonic or high subsonic speeds, are self-navigating, and can fly on a non-ballistic, extremely low altitude trajectory. They are distinct from unmanned aerial vehicles (UAV) in that they are used only as weapons and not for reconnaissance. In a cruise missile, the warhead is integrated into the vehicle and the vehicle is always sacrificed in the mission.
