

Engineering Topics Category

Plastics Explosion

Added on: February 24th, 2012

Synthetic polymers, such as the well-known polyethylene and nylon, are often referred to as “plastics”. Most of them can be classified into at least three main categories: thermoplastics, thermosets and elastomers.

Man-made polymers are used in a bewildering array of applications: food packaging, films, fibers, tubing, pipes, etc. The personal care industry also uses polymers to aid in texture of products, binding, and moisture retention (e.g. in hair gel and conditioners).

Plastics explosion: Acrylic, Polyethylene, etc…

Voice Browsers

Added on: February 24th, 2012

Browser technology is changing very fast these days, and we are moving from the visual paradigm to the voice paradigm. The voice browser is the technology for entering this paradigm. A voice browser is a “device which interprets a (voice) markup language and is capable of generating voice output and/or interpreting voice input, and possibly other input/output modalities.” This paper describes the requirements for two forms of character-set grammar, offered as a matter of preference or implementation: one is more easily read by (most) humans, while the other is geared toward machine generation.

AgentOS

Added on: February 23rd, 2012

The Internet is rapidly becoming an integral aspect of the desktop computer, if it has not already become so. The Internet can be visualized as a worldwide information repository enabling resource sharing on a worldwide basis through the use of distributed applications. Agents are a new approach to the development of distributed client-server applications built to exploit this information resource.

AgentOS provides an environment for the development and deployment of agent-based client-server applications. Agents are an object representation of distributed systems containing both computational logic and state information. Agents are active and mobile, in that they (along with their state information) can migrate between the various hosts that exist in an agent system such as AgentOS. Agents are autonomous in that they contain code to execute and take decisions on behalf of a human user or another agent as they carry out their assigned tasks. Agents can be as simple as a single algorithm or as complex as a complete application.
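A minimal sketch of the agent idea described above, in Python: an object that bundles its state with its logic and can be serialized, shipped to another host, and resumed there. This is an illustration of the concept only, not the actual AgentOS API; the class name, task data and use of pickle are assumptions for the example.

# Sketch of a mobile agent: state travels with the object; "migration" is
# serialization plus reconstruction on the destination host. Note that pickle
# carries state only; the class definition must be available on both hosts.
import pickle

class Agent:
    def __init__(self, task_data):
        self.state = {"task": task_data, "hops": 0}   # state moves with the agent

    def run(self):
        # computational logic executed on whichever host currently holds the agent
        self.state["hops"] += 1
        return f"processed {self.state['task']} after {self.state['hops']} hop(s)"

blob = pickle.dumps(Agent("index remote catalogue"))   # serialize on the sending host
migrated = pickle.loads(blob)                          # reconstruct on the receiving host
print(migrated.run())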

Viewed from the perspective of a single host, AgentOS behaves as a traditional server providing an environment for the execution of agents and accessibility of services. Abstracting one level further, i.e., viewed from the perspective of the entire network, AgentOS exceeds the role of the traditional server by assuming the role of a peer in a network of similar servers collectively providing an environment for distributed applications.

The first section in this paper discusses the agent-paradigm, the range of applications most suited for this mode of programming and the typical life cycle of an agent within an agent system such as AgentOS. The second section of this report discusses the requirements for AgentOS, and the design of AgentOS. Finally, a brief survey of related works and an overview of future directions for AgentOS are presented.

Water Jet Cutter

Added on: February 23rd, 2012

In the battle to reduce costs, engineering and manufacturing departments are constantly on the lookout for an edge. The water jet process provides many unique capabilities and advantages that can prove very effective in the cost battle. Learning more about water jet technology gives us an opportunity to put these cost-cutting capabilities to work. Beyond cost cutting, the water jet process is recognized as the most versatile and fastest growing process in the world. Waterjets are used in high-production applications across the globe. They complement other technologies such as milling, laser, EDM, plasma and routers. No poisonous gases or liquids are used in waterjet cutting, and waterjets do not create hazardous materials or vapors. No heat-affected zones or mechanical stresses are left on a waterjet-cut surface. It is truly a versatile, productive, cold cutting process. The waterjet has shown that it can do things that other technologies simply cannot: from cutting whisper-thin details in stone, glass and metals, to rapid hole drilling of titanium, to cutting of food and the killing of pathogens in beverages and dips, the waterjet has proven itself unique.

Reverse Engineering

Added on: February 23rd, 2012

Engineering is the profession involved in the design, manufacture, construction, and maintenance of products, systems, and structures. At a higher level, there are two types of engineering: forward engineering and reverse engineering.

Forward engineering is the traditional process of moving from high-level abstractions and logical designs to the physical implementation of a system. In some situations, there may be a physical part without any technical details, such as drawings, bills-of-material, or without engineering data, such as thermal and electrical properties.
The process of duplicating an existing component, subassembly, or product, without the aid of drawings, documentation, or computer model is known as reverse engineering.

Reverse engineering can be viewed as the process of analyzing a system to:
1. Identify the system’s components and their interrelationships
2. Create representations of the system in another form or a higher level of abstraction
3. Create the physical representation of that system

Reverse engineering is very common in such diverse fields as software engineering, entertainment, automotive, consumer products, microchips, chemicals, electronics, and mechanical designs. For example, when a new machine comes to market, competing manufacturers may buy one machine and disassemble it to learn how it was built and how it works. A chemical company may use reverse engineering to defeat a patent on a competitor’s manufacturing process. In civil engineering, bridge and building designs are copied from past successes so there will be less chance of catastrophic failure. In software engineering, good source code is often a variation of other good source code.

In some situations, designers give a shape to their ideas by using clay, plaster, wood, or foam rubber, but a CAD model is needed to enable the manufacturing of the part. As products become more organic in shape, designing in CAD may be challenging or impossible. There is no guarantee that the CAD model will be acceptably close to the sculpted model. Reverse engineering provides a solution to this problem because the physical model is the source of information for the CAD model. This is also referred to as the part-to-CAD process.

Another reason for reverse engineering is to compress product development times. In the intensely competitive global market, manufacturers are constantly seeking new ways to shorten lead-times to market a new product. Rapid product development (RPD) refers to recently developed technologies and techniques that assist manufacturers and designers in meeting the demands of reduced product development time. For example, injection-molding companies must drastically reduce the tool and die development times. By using reverse engineering, a three-dimensional product or model can be quickly captured in digital form, re-modeled, and exported for rapid prototyping/tooling or rapid manufacturing.

Maglev Train

Added on: February 23rd, 2012

Magnetic levitation is the latest in transportation technology and has been the interest of many countries around the world. The idea has been around since 1904, when Robert Goddard, an American rocket scientist, created a theory that trains could be lifted off the tracks by the use of electromagnetic rails. Many assumptions and ideas were brought forward throughout the following years, but it was not until the 1970s that Japan and Germany showed interest in it and began researching and designing.

The motion of the Maglev train is based purely on magnetism and magnetic fields. This magnetic field is produced by using high-powered electromagnets. By using magnetic fields, the Maglev train can be levitated above its track, or guideway, and propelled forward. Wheels, contact with the track, and moving parts are eliminated on the Maglev train, allowing the Maglev train to essentially move on air without friction.

Maglev can be used for both low- and high-speed transportation. The low-speed Maglev is used for short-distance travel; Birmingham, England used this low-speed transportation between 1984 and 1995. However, engineers are more interested in creating high-speed Maglev vehicles. The higher-speed vehicle can travel at speeds of nearly 343 mph (552 km/h). Magnetic levitation mainly uses two different types of suspension: Electromagnetic Suspension and Electrodynamic Suspension. However, a third suspension system (Inductrack) has recently been developed and is in the research and design phase. These suspension systems are what keep the train levitated off the track.

Heat Pipe

Added on: February 22nd, 2012

A heat pipe is a simple device that can quickly transfer heat from one point to another. Heat pipes are often referred to as the “superconductors” of heat, as they possess an extraordinary heat transfer capacity and rate with almost no heat loss.

The development of the heat pipe originally started with Angier March Perkins who worked initially with the concept of the working fluid only in one phase (he took out a patent in 1839 on the hermetic tube boiler which works on this principle). Jacob Perkins (descendant of Angier March) patented the Perkins Tube in 1936 and they became widespread for use in locomotive boilers and baking ovens. The Perkins Tube was a system in which a long and twisted tube passed over an evaporator and a condenser, which caused the water within the tube to operate in two phases. Although these early designs for heat transfer systems relied on gravity to return the liquid to the evaporator (later called a thermosyphon), the Perkins Tube was the jumping off point for the development of the modern heat pipe.

The concept of the modern heat pipe, which relied on a wicking system to transport the liquid against gravity up to the condenser, was put forward by R.S. Gaugler of the General Motors Corporation. In his 1944 patent, Gaugler described how his heat pipe would be applied to refrigeration systems. Heat pipe research became popular after that, and many industries and labs, including Los Alamos, RCA, and the Joint Nuclear Research Centre in Italy, began to apply heat pipe technology in their fields. By 1969 there was a vast amount of interest on the part of NASA, Hughes, the European Space Agency, and other aerospace organizations in regulating the temperature of a spacecraft and how that could be done with the help of heat pipes. Extensive research has been done to date regarding specific heat transfer characteristics, in addition to the analysis of various material properties and geometries.

Blue Eyes

Added on: February 21st, 2012

Imagine yourself in a world where humans interact with computers. You are sitting in front of your personal computer that can listen, talk, or even scream aloud. It has the ability to gather information about you and interact with you through special techniques like facial recognition, speech recognition, etc. It can even understand your emotions at the touch of the mouse. It verifies your identity, feels your presence, and starts interacting with you. You ask the computer to dial your friend at his office. It realizes the urgency of the situation through the mouse, dials your friend at his office, and establishes a connection.

Human cognition depends primarily on the ability to perceive, interpret, and integrate audio-visual and sensory information. Adding extraordinary perceptual abilities to computers would enable them to work together with human beings as intimate partners. Researchers are attempting to add more capabilities to computers that will allow them to interact like humans: recognize human presence, talk, listen, or even guess their feelings.

The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It uses non-obtrusive sensing methods, employing the most modern video cameras and microphones to identify the user's actions through the use of imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.

Tunable Spiral Inductors

Added on: February 20th, 2012

A tunable micro-electromechanical systems (MEMS) integrated inductor with a large-displacement electro-thermal actuator is discussed here. Based on a transformer configuration, the inductance of a spiral inductor is tuned by controlling the relative position of a magnetically coupled short-circuited loop. Theoretical studies are backed by a variety of fabricated and measured tunable inductors that show a 2:1 inductance tuning ratio over a wide frequency range of approximately 25 GHz. In addition, the maximum and minimum quality factors of the tunable inductor are measured to be 26 and 10 respectively, which is high compared to previous designs. Such inductors can considerably extend the tuning capabilities of critical reconfigurable circuits such as tunable impedance-matching circuits, phase shifters, voltage-controlled oscillators, and low-noise amplifiers.
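For reference, the two figures of merit quoted above can be read against the standard textbook definitions (these are general formulas, not equations taken from the paper):

Q = \frac{\omega L}{R_s}, \qquad \text{tuning ratio} = \frac{L_{\max}}{L_{\min}}

where \omega is the angular frequency, L the inductance and R_s the effective series resistance of the coil. A 2:1 tuning ratio therefore means the coupled loop can halve the effective inductance, and the quoted Q of 10 to 26 describes how lossy the inductor is across that tuning range.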

Burj Dubai Tower

Added on: February 20th, 2012

As with all super-tall projects, difficult structural engineering problems needed to be addressed and resolved. This paper presents the approach to the structural system for the Burj Dubai Tower. It first presents the architectural background and a comparison of the Burj Dubai Tower with other tall buildings of the world. It also describes the geotechnical procedures and structural detailing of the building, besides the wind engineering applied to the tower.

Mobile Phone Cloning

Added on: February 20th, 2012

Mobile communication has been readily available for several years and is a major business today. It provides a valuable service to its users, who are willing to pay a considerable premium over a fixed-line phone to be able to walk and talk freely. Because of its usefulness and the money involved in the business, it is subject to fraud.

Unfortunately, the advance of security standards has not kept pace with the dissemination of mobile communication. Some of the features of mobile communication make it an alluring target for criminals. It is a relatively new invention, so not all people are quite familiar with its possibilities, for good or for bad. Its newness also means intense competition among mobile phone service providers as they attract customers. The major threat to mobile phones is cloning.

Flexible AC Transmission Systems

Added on: February 19th, 2012

The rapid development of power electronics technology provides exciting opportunities to develop new power system equipment for better utilization of existing systems, such as enhancing the security, capacity and flexibility of power transmission systems. FACTS solutions enable power grid owners to increase existing transmission network capacity while maintaining or improving the operating margins necessary for grid stability. Supply of reliable, high-quality electrical energy at a reasonable cost is at the heart of the nation's economy. The electric power system is one of the nation's important infrastructures, and the infrastructure most closely tied to the gross domestic product. The industry is also seeing changes in both the operating and business sectors of the electric utility industry; these changes have been mandated by provisions of the Energy Policy Act of 1992, which requires electric utilities to provide open access to the transmission system.

Several advanced methods have been developed for maintaining a high degree of power quality and reliability in a deregulated environment. At the distribution level, flexible control strategies involve computerized automation of system control devices such as capacitor banks, under-load tap-changing transformers (ULTCs) and voltage regulators. In the transmission system, a new method of achieving this control is through the use of power-electronics-based Flexible AC Transmission System (FACTS) devices. This paper provides a comprehensive guide to FACTS, covering all the major aspects of research and development of FACTS technologies. Various real-world applications are also included to demonstrate the issues and benefits of applying FACTS. The objective of this project is to create a multi-institutional power curriculum to address this new environment and these technologies.

XML

Added on: February 19th, 2012

XML is a language used to develop web applications. XML is a set of rules for designing structured data in a text format, as opposed to a binary format, which is useful for both humans and machines. A parser is used for syntactical and lexical analysis; an XML parser extracts from the XML document the information that is needed in all Web applications. Simple Object Access Protocol (SOAP) is a protocol that lets a program send XML over HTTP to invoke methods on remote objects. An XML parser can serve as an engine for implementing this or a comparable protocol.

An XML parser can also be used to send data messages formatted as XML over HTTP. By adding XML and HTTP capabilities to an application, software developers can begin to offer alternatives to traditional browsers that have significant value to their customers. This paper presents an XML parser that implements a subset of the XML specification. This is useful to developers and users for checking the well-formedness and validity of XML documents.
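As a minimal illustration of the well-formedness check mentioned above (a generic sketch using Python's standard library parser, not the parser presented in the paper; the sample documents are made up):

# Check whether an XML document is well-formed using Python's built-in parser.
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    try:
        ET.fromstring(xml_text)   # raises ParseError on malformed markup
        return True
    except ET.ParseError as err:
        print(f"not well-formed: {err}")
        return False

print(is_well_formed("<order><item>bolt</item></order>"))   # True
print(is_well_formed("<order><item>bolt</order>"))           # False: mismatched tag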

Biometrics

Added on: February 17th, 2012

Biometrics refers to the automatic identification of a person based on his or her physiological or behavioral characteristics, such as a fingerprint or iris pattern, or aspects of behaviour like handwriting or keystroke patterns. Biometrics is applied both to identity verification and to recognition, and the problem each involves is somewhat different. Verification requires the person being identified to lay claim to an identity, so the system has two choices: either accepting or rejecting the person's claim. Recognition requires the system to look through many stored sets of characteristics and pick the one that matches the unknown individual being presented. A biometric system is essentially a pattern recognition system, which makes a personal identification by determining the authenticity of a specific physiological or behavioral characteristic possessed by the user.
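A minimal sketch of the two matching modes contrasted above, assuming each user is reduced to a feature vector and a simple distance threshold decides a match (illustrative only; the templates, threshold value and distance metric are assumptions, and real systems use far richer templates and matchers):

# Toy templates: user -> feature vector (e.g. derived from a fingerprint)
import math

TEMPLATES = {"alice": [0.1, 0.9, 0.3], "bob": [0.7, 0.2, 0.5]}
THRESHOLD = 0.25   # maximum distance accepted as a match (illustrative value)

def distance(a, b):
    return math.dist(a, b)

def verify(claimed_id, sample):
    """1:1 verification: accept or reject a claimed identity."""
    return distance(TEMPLATES[claimed_id], sample) <= THRESHOLD

def identify(sample):
    """1:N recognition: search every stored template for the best match."""
    best_id, best_d = min(((uid, distance(t, sample)) for uid, t in TEMPLATES.items()),
                          key=lambda x: x[1])
    return best_id if best_d <= THRESHOLD else None

probe = [0.12, 0.88, 0.31]
print(verify("alice", probe))   # True  - claim checked against one template
print(identify(probe))          # 'alice' - searched against all templates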

Biometrics is a rapidly evolving technology which is being used in forensics, such as criminal identification and prison security, and has the potential to be used in a large range of civilian application areas. Biometrics can be used in transactions conducted via telephone and the Internet (electronic commerce and electronic banking). In automobiles, biometrics can replace keys with key-less entry devices.

3 D ICs

Added on: February 16th, 2012 by Afsal Meerankutty

The unprecedented growth of the computer and information technology industry is demanding Very Large Scale Integrated (VLSI) circuits with increasing functionality and performance at minimum cost and power dissipation. VLSI circuits are being aggressively scaled to meet this demand, which in turn creates serious problems for the semiconductor industry.

Additionally heterogeneous integration of different technologies in one single chip (SoC) is becoming increasingly desirable, for which planar (2-D) ICs may not be suitable.

3-D ICs are an attractive chip architecture that can alleviate interconnect-related problems such as delay and power dissipation, and can also facilitate integration of heterogeneous technologies in one chip (SoC). The multi-layer chip industry opens up a whole new world of design. With the introduction of 3-D ICs, the world of chips may never look the same again.

Artificial Turf

Added on: February 13th, 2012

Artificial turf is a surface manufactured from synthetic fibres made to look like natural grass. It is most often used in arenas for sports that were originally or are normally played on grass. The main reason is maintenance — artificial turf resists heavy use, such as in sports, better, and requires no irrigation or trimming. Domed, covered, and partially covered stadiums may require artificial turf because of the difficulty of getting grass enough sunlight to stay healthy.

A common misconception is that the new synthetic grass is similar to household carpet. In fact, this intricate system involves properly constructing a porous sub-base and using turf with holes in the back; the product is then filled with a sand/rubber granule mix called infill.

Artificial turf, also known as synthetic turf, has found a prominent place in sports today. Manufactured from synthetic materials, this man-made surface looks like natural grass. With the international sports associations and governing bodies approving the use of artificial surfaces, sports like football and hockey, which were originally played on natural grass, have moved to these artificial sports pitches. So, next time, you find players playing on an artificial hockey pitch, do not be surprised.

Artificial turf has been manufactured since the early 1960s, and was originally produced by Chemstrand Company (later renamed Monsanto Textiles Company). It is produced using manufacturing processes similar to those used in the carpet industry. Since the 1960s, the product has been improved through new designs and better materials. The newest synthetic turf products have been chemically treated to be resistant to ultraviolet rays, and the materials have been improved to be more wear-resistant, less abrasive, and, for some applications, more similar to natural grass.

SCADA

Added on: February 13th, 2012

The rate of rise of power utilization in a country represents its development in any field; as the saying goes, “necessity is the mother of invention”. Efficient and reliable power supply is therefore a major requirement of any power supply system. As in many other areas, computer applications in electrical power systems have grown tremendously over the last several decades, penetrating apparently all aspects of electrical power systems, including operational planning, energy management systems, automation of power generation, transmission and distribution, protection relaying, etc. A system used to monitor and control equipment and processes is called SCADA (Supervisory Control and Data Acquisition), and the SCADA system is one such application.

The present paper explains the whole SCADA system, its hardware components, software components and its application programs. Among the hardware components, field instrumentation, remote stations, the communication network and the central monitoring station are explained. Among the software components, data acquisition and processing, control, alarm handling, logging and archiving, automated mapping and facility management, etc., are considered. Application programs such as load shedding, load balancing, remote metering, maintaining a good voltage profile, map maintenance, fuse call-off operations, energy accounting, etc., are also emphasized.

Blu-Ray Technology

Added on: February 13th, 2012

Blu-ray is a new optical disc standard based on the use of a blue laser rather than the red laser of today’s DVD players. The standard, developed collaboratively by Hitachi, LG, Matsushita (Panasonic), Pioneer, Philips, Samsung, Sharp, Sony, and Thomson, threatens to make current DVD players obsolete. It is not clear whether new Blu-ray players might include both kinds of lasers in order to be able to read current CD and DVD formats.
The new standard, developed jointly in order to avoid competing standards, is also being touted as the replacement for writable DVDs. The blue laser has a 405 nanometer (nm) wavelength that can focus more tightly than the red lasers used for writable DVDs and, as a consequence, write much more data in the same 12-centimeter disc. Like the rewritable DVD formats, Blu-ray uses phase-change technology to enable repeated writing to the disc.

Blu-ray’s storage capacity is enough to store a continuous backup copy of most people’s hard drives on a single disc. The first products will have a 27 gigabyte (GB) single-sided capacity, and 50 GB on dual-layer discs. Data streams at 36 megabits per second (Mbps), fast enough for high-quality video recording. Single-sided Blu-ray discs can store up to 13 hours of standard video data, compared to a single-sided DVD’s 133 minutes. People are referring to Blu-ray as the next-generation DVD, although according to Chris Buma, a spokesman from Philips (quoted in New Scientist), “Except for the size of the disc, everything is different.”
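A rough sanity check of the figures above, as a back-of-the-envelope sketch (the 4.5 Mbps standard-definition bit rate is an assumed typical value, not a figure from the source):

# Back-of-the-envelope check of the capacity and recording-time figures.
capacity_gb = 27                      # single-sided, single-layer disc
capacity_bits = capacity_gb * 1e9 * 8

write_rate_mbps = 36                  # stream rate, megabits per second
full_rate_hours = capacity_bits / (write_rate_mbps * 1e6) / 3600
print(f"continuous recording at 36 Mbps: {full_rate_hours:.1f} h")   # ~1.7 h

sd_video_mbps = 4.5                   # assumed typical standard-definition bit rate
sd_hours = capacity_bits / (sd_video_mbps * 1e6) / 3600
print(f"standard-definition video: {sd_hours:.1f} h")                # ~13 h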

Blu-ray discs will not play on current CD and DVD players, because they lack the blue-violet laser required to read them. If the appropriate lasers are included, Blu-ray players will be able to play the other two formats. However, because it would be considerably more expensive, most manufacturers may not make their players backward compatible. Panasonic, Philips, and Sony have demonstrated prototypes of the new systems.

Sky X Technology

Added on: February 13th, 2012

Satellites are an attractive option for carrying Internet and other IP traffic to many locations across the globe where terrestrial options are limited or prohibitively expensive. But data networking over satellite must overcome the large latency and high bit error rate typical of satellite communications, as well as the asymmetric bandwidth design of most satellite networks. Satellites are ideal for providing Internet and private network access over long distances and to remote locations. However, the Internet protocols are not optimized for satellite conditions, so the throughput over satellite networks is restricted to only a fraction of the available bandwidth. Mentat, the leading supplier of TCP/IP to the computer industry, has overcome these limitations with the development of the Sky X product family.

The Sky X system replaces TCP over the satellite link with a protocol optimized for the long latency, high loss and asymmetric bandwidth conditions of typical satellite communication. The Sky X family consists of the Sky X Gateway, Sky X Client/Server and Sky X OEM products. Sky X products increase the performance of IP over satellite by transparently replacing TCP over the satellite link. The Sky X Gateway works by intercepting the TCP connection from the client and converting the data to the Sky X protocol for transmission over the satellite. The Sky X Client/Server product operates in a similar manner, except that the Sky X client software is installed on each end user's PC; connections from applications running on the PC are intercepted and sent over the satellite using the Sky X protocol.
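A generic sketch of the "intercept and relay" idea behind such a gateway, in Python (this is not the Sky X protocol or product; the listen port and upstream host are assumed values, and a real performance-enhancing proxy would re-encode the stream into a satellite-optimized transport rather than copy bytes):

# Minimal TCP-intercepting relay: accept a local connection and forward its
# bytes to an upstream host; a real gateway would convert to another protocol
# at this point instead of copying verbatim.
import socket, threading

LISTEN = ("0.0.0.0", 9000)            # clients connect here (assumed)
UPSTREAM = ("example.org", 80)        # where relayed traffic goes (assumed)

def pump(src, dst):
    try:
        while (chunk := src.recv(4096)):
            dst.sendall(chunk)        # conversion to the optimized protocol would happen here
    except OSError:
        pass
    finally:
        try:
            dst.shutdown(socket.SHUT_WR)
        except OSError:
            pass

def handle(client):
    upstream = socket.create_connection(UPSTREAM)
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    pump(upstream, client)

with socket.create_server(LISTEN) as server:
    while True:
        conn, _ = server.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()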

Brain Computer Interface

Added on: February 12th, 2012 by Afsal Meerankutty

A Brain Computer Interface (BCI) is a communication system that recognizes a user's commands only from his or her brainwaves and reacts according to them. For this purpose both the PC and the subject are trained. A simple task can consist of moving an arrow displayed on the screen in a desired direction purely through the subject's imagination of something (e.g. motion of his or her left or right hand). As a consequence of the imaging process, certain characteristics of the brainwaves are raised and can be used to recognize the user's command, e.g. motor mu waves (brain waves of alpha-range frequency associated with physical movements or the intention to move).
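A minimal sketch of one early processing step such a system might use: band-pass filtering an EEG channel to the 8-13 Hz mu/alpha band before any command classification. This is illustrative only; the sampling rate, band edges and synthetic signal are assumptions, and a real BCI adds feature extraction and a trained classifier on top.

# Band-pass an EEG channel to the mu/alpha band (roughly 8-13 Hz) with SciPy.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250.0                                   # assumed EEG sampling rate, Hz
t = np.arange(0, 4, 1 / fs)
# synthetic "EEG": a 10 Hz mu-like rhythm buried in noise plus a 50 Hz artifact
eeg = 2 * np.sin(2 * np.pi * 10 * t) + np.sin(2 * np.pi * 50 * t) + np.random.randn(t.size)

b, a = butter(4, [8 / (fs / 2), 13 / (fs / 2)], btype="band")   # 4th-order Butterworth
mu_band = filtfilt(b, a, eeg)                # zero-phase band-pass filtering

# band power is a crude feature a classifier could use to detect imagined movement
print(f"mu-band power: {np.mean(mu_band ** 2):.2f}")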

An Electroencephalogram based Brain-Computer-Interface (BCI) provides a new communication channel between the human brain and a computer. Patients who suffer from severe motor impairments (late stage of Amyotrophic Lateral Sclerosis (ALS), severe cerebral palsy, head trauma and spinal injuries) may use such a BCI system as an alternative form of communication by mental activity.

The use of EEG signals as a vector of communication between man and machine represents one of the current challenges in signal theory research. The principal element of such a communication system, better known as a “Brain Computer Interface”, is the interpretation of the EEG signals related to the characteristic parameters of brain electrical activity.

Artificial Intelligence

Added on: February 9th, 2012 by Afsal Meerankutty

The AP is an artificial intelligence–based companion that will be resident in software and chips embedded in the automobile dashboard. The heart of the system is a conversation planner that holds a profile of you, including details of your interests and profession.

A microphone picks up your answer and breaks it down into separate words with speech-recognition software. A camera built into the dashboard also tracks your lip movements to improve the accuracy of the speech recognition. A voice analyzer then looks for signs of tiredness by checking to see if the answer matches your profile. Slow responses and a lack of intonation are signs of fatigue.

This research suggests that we can make predictions about various aspects of driver performance based on what we glean from the movements of a driver’s eyes and that a system can eventually be developed to capture this data and use it to alert people when their driving has become significantly impaired by fatigue.

Microbial Fuel Cells

Added on: February 9th, 2012 by Afsal Meerankutty

In an era of climate change, alternate energy sources are desired to replace oil and carbon resources. Subsequently, climate change effects in some areas and the increasing production of biofuels are also putting pressure on available water resources. Microbial Fuel Cells have the potential to simultaneously treat wastewater for reuse and to generate electricity; thereby producing two increasingly scarce resources.

While the Microbial Fuel Cell has generated interest in the wastewater treatment field, knowledge is still limited and many fundamental and technical problems remain to be solved. Microbial fuel cell technology represents a new form of renewable energy, generating electricity from what would otherwise be considered waste, such as industrial wastes or wastewater. A microbial fuel cell (MFC) is a biological reactor that turns the chemical energy present in the bonds of organic compounds into electric energy through the reactions of microorganisms under anaerobic conditions.

Animatronics

Added on: February 9th, 2012 by Afsal Meerankutty

Animatronics is a cross between animation and electronics. Basically, an animatronic is a mechanized puppet. It may be preprogrammed or remotely controlled. The term, originally coined by Walt Disney as “Audio-Animatronics” to describe his mechanized characters, can actually be traced in various forms as far back as Leonardo da Vinci's automaton lion (theoretically built to present lilies to the King of France during one of his visits), and has now developed into a career which may require combined talent in mechanical engineering, sculpting/casting, control technologies, electrical/electronics, airbrushing and radio control.

Long before digital effects appeared, animatronics were making cinematic history. The scare generated by the Great White coming out of the water in “Jaws” and the tender otherworldliness of “E.T.” were its outcomes. The Jurassic Park series combined digital effects with animatronics.

It is possible for us to build our own animatronics by making use of ready-made animatronic kits provided by companies such as Mister Computers.

Nano-Machines

Added on: February 8th, 2012 by Afsal Meerankutty

Nanotechnology is the manipulation of matter on the nanoscale. A nanometer is a very small measure of length: it is one billionth of a meter, a length so small that only three or four atoms lined up in a row would be a nanometer. So, nanotechnology involves designing and building materials and devices where the basic structure of the material or device is specified on the scale of one or a few nanometers. Ultimately, nanotechnology will mean materials and devices in which every atom is assigned a place, and having every atom in the right place will be essential for the functioning of the device.

The kinds of products that could be built will range from microscopic, very powerful computers, to super-strong materials ten times as strong as steel but much lighter, to food and other biological tissues. All these products would be very inexpensive because the molecular machines that build them will basically take atoms from garbage or dirt, and energy from sunshine, and rearrange those atoms into useful products, just as trees and crops take dirt, water and sunshine and rearrange the atoms into wood and food.

Nanotechnology cannot be defined as a definite branch of science distinct from the conventional ones that we have today. It is set to encompass all the technological aspects that we have today and is nothing but the extension of scientific applications to a microscopic scale, thereby coming closer to perfection if not reaching it.

E-Intelligence

Added on: February 6th, 2012 by Afsal Meerankutty

Organizations have, over the years, successfully employed business intelligence tools like OLAP and data warehousing to improve the supply of business information to end users for cross-industry applications like finance and customer relationship management, and in vertical markets such as retail, manufacturing, healthcare, banking, financial services, telecommunications, and utilities. In recent years, the Internet has opened up an entirely new channel for marketing and selling products, and companies are taking to e-business in a big way. The issue facing end users as organizations deploy e-business systems is that they have not had the same business intelligence capabilities available to them in e-business systems as they do in the traditional corporate operating environment. This prevents businesses from exploiting the full power of the Internet as a sales and marketing channel.

As a solution, vendors are now developing business intelligence applications to capture and analyze the information flowing through e-business systems, and are developing Web-based information portals that provide an integrated and personalized view of enterprise-wide business information, applications, and services. These advanced business intelligence systems are called e-intelligence systems.

The Hy-Wire Car

Added on: February 6th, 2012 by Afsal Meerankutty

The Hy-Wire car has no mechanical or hydraulic linkages and no conventional engine. Instead, it contains a fuel cell stack and a drive-by-wire system. It is a fully automated car, a car of the future, and it will have wide application: the problems of fuel consumption and pollution can be reduced to a certain level.

Cars are immensely complicated machines, but when you get down to it, they do an incredibly simple job. Most of the complex stuff in a car is dedicated to turning wheels, which grip the road to pull the car body and passengers along. The steering system tilts the wheels side to side to turn the car, and brake and acceleration systems control the speed of the wheels.

Given that the overall function of a car is so basic (it just needs to provide rotary motion to wheels), it seems a little strange that almost all cars have the same collection of complex devices crammed under the hood and the same general mass of mechanical and hydraulic linkages running throughout. Why do cars necessarily need a steering column, brake and acceleration pedals, a combustion engine, a catalytic converter and the rest of it?

According to many leading automotive engineers, they don’t; and more to the point, in the near future, they won’t. Most likely, a lot of us will be driving radically different cars within 20 years. And the difference won’t just be under the hood — owning and driving cars will change significantly, too.

Airborne Internet

Added on: February 6th, 2012 by Afsal Meerankutty

The Airborne Internet is a network in which all nodes would be located in aircraft. The network is intended for use in aviation communications, navigation, and surveillance (CNS) and would also be useful to businesses, private Internet users, and the military. In time of war, for example, an airborne network might enable military planes to operate without the need for a communications infrastructure on the ground. Such a network could also allow civilian planes to continually monitor each other's positions and flight paths.

The Airborne Internet network will serve tens of thousands of subscribers within a super-metropolitan area by offering ubiquitous access throughout the network's signal “footprint”. The aircraft will carry the “hub” of a wireless network having a star topology. The aircraft will fly in shifts to provide continuous service, 24 hours per day, 7 days per week, with an overall system reliability of 99.9% or greater. At least three different methods have been proposed for putting communication nodes aloft: the first method would employ manned aircraft, the second would use unmanned aircraft, and the third would use blimps. The nodes would provide air-to-air, surface-to-air, and surface-to-surface communications. The aircraft or blimps would fly at altitudes of around 16 km and would cover regions of about 40 mi (64 km) in radius. Any subscriber within this region will be able to access the network's ubiquitous multi-gigabit-per-second “bit cloud” on demand. What the Airborne Internet will do is provide an infrastructure that can reach areas that don't have broadband cables and wires. Data transfer rates would be on the order of several gigabits per second, comparable to those of high-speed cable modem connections. Network users could communicate directly with other users, and indirectly with conventional Internet users through surface-based nodes.

Like the Internet, the Airborne Network would use TCP/IP as the set of protocols for specifying network addresses and ensuring message packets arrive. This technology is also called High Altitude Long Operation (HALO). The concept of the Airborne Internet was first proposed at NASA Langley Research Center's Small Aircraft Transportation System (SATS) Planning Conference in 1999.

Flywheel Energy Storage

Added on: January 26th, 2012 by Afsal Meerankutty

Energy storage is becoming increasingly important with the advent of individual electronic devices and the rising need to accommodate a greater population which relies on these devices. A flywheel is a simple form of mechanical (kinetic) energy storage: energy is stored by causing a disk to spin on its axis. Flywheels are one of the most promising technologies for replacing conventional lead-acid batteries as energy storage systems for a variety of applications, including automobiles, economical rural electrification systems, and stand-alone, remote power units commonly used in the telecommunications industry.
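As a point of reference for the spinning-disk idea above, the standard textbook relations (not figures from the source) show that the stored energy grows with the square of the rotational speed:

E = \tfrac{1}{2} I \omega^2, \qquad I_{\text{disk}} = \tfrac{1}{2} m r^2 \;\Rightarrow\; E = \tfrac{1}{4} m r^2 \omega^2

where m is the rotor mass, r its radius and \omega its angular velocity. Doubling the spin rate quadruples the stored energy, which is why high-speed rotors on low-loss bearings are attractive for this kind of storage.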

Commercially available Flywheel Energy Storage (FES) systems are used for small uninterruptible power systems. More recently, the flywheel has regained consideration as a viable means of supporting a critical load during mains power interruption, due to the lower capital expense and extended run time now available from many systems, as well as continued customer dissatisfaction with traditional electrochemical energy storage.
