
Space Time Adaptive Processing

Added on: October 31st, 2013 by Afsal Meerankutty

Space-time adaptive processing (STAP) is a signal processing technique most commonly used in radar systems. It involves adaptive array processing algorithms to aid in target detection. Radar signal processing benefits from STAP in areas where interference is a problem (e.g. ground clutter, jamming). Through careful application of STAP, it is possible to achieve order-of-magnitude sensitivity improvements in target detection.
STAP involves a two-dimensional filtering technique using a phased-array antenna with multiple spatial channels. Coupling multiple spatial channels with pulse-Doppler waveforms gives rise to the name “space-time”. Using the statistics of the interference environment, an adaptive STAP weight vector is formed, and this weight vector is applied to the coherent samples received by the radar.
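As a rough numerical sketch of that last step (all dimensions and training data below are invented for illustration), the adaptive weight vector can be formed as w ∝ R⁻¹v, where R is the estimated interference-plus-noise covariance and v is the space-time steering vector:

```python
import numpy as np

# Hypothetical array geometry: N spatial channels x M coherent pulses.
N, M = 4, 8
NM = N * M
rng = np.random.default_rng(0)

# Stand-in training snapshots (in practice: range bins near the cell
# under test) used to estimate the interference covariance matrix R.
X = rng.standard_normal((NM, 10 * NM)) + 1j * rng.standard_normal((NM, 10 * NM))
R = X @ X.conj().T / X.shape[1]

# Space-time steering vector for a target at spatial frequency fs and
# normalized Doppler fd: Kronecker product of temporal and spatial steering.
fs, fd = 0.10, 0.25
a = np.exp(2j * np.pi * fs * np.arange(N))   # spatial steering
b = np.exp(2j * np.pi * fd * np.arange(M))   # temporal (Doppler) steering
v = np.kron(b, a)

# MVDR-style adaptive weight: w proportional to R^{-1} v,
# normalized for unit gain in the target direction.
w = np.linalg.solve(R, v)
w /= v.conj() @ w

# The weights are applied to each space-time snapshot x of the cell
# under test: y = w^H x is the detection statistic.
```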
In a ground moving target indicator (GMTI) system, an airborne radar collects the returned echo from the moving target on the ground. However, the received signal contains not only the reflected echo from the target, but also the returns from the illuminated ground surface. The return from the ground is generally referred to as clutter.
The clutter return comes from all the areas illuminated by the radar beam, so it occupies all range bins and all directions. The total clutter return is often much stronger than the returned signal echo, which poses a great challenge to target detection. Clutter filtering, therefore, is a critical part of a GMTI system.

Biosensors

Added on: October 31st, 2013 by Afsal Meerankutty

A biosensor is a device for the detection of an analyte that combines a biological component with a physicochemical detector component. Many optical biosensors based on the phenomenon of surface plasmon resonance are evanescent wave techniques. The most widespread example of a commercial biosensor is the blood glucose biosensor, which uses the enzyme glucose oxidase to break blood glucose down.
A biosensor is thus the combination of a bioreceptor and a transducer: the bioreceptor is a biomolecule that recognizes the target, while the transducer converts that recognition event into a measurable signal. Biosensors are used in many diverse market areas, including clinical testing, one of the biggest diagnostics markets at about US$4,000 million.
They measure a specific analyte with great accuracy, give a direct and rapid readout, and are very simple to use: receptor and transducer are integrated into a single sensor that works without added reagents.

Facial Recognition System

Added on: October 29th, 2013 by Afsal Meerankutty

Wouldn’t you love to replace password-based access control, to avoid having to reset forgotten passwords and worry about the integrity of your system? Wouldn’t you like to rest secure in the comfort that your healthcare system does not rely merely on your social security number as proof of your identity for granting access to your medical records?
Because each of these questions is becoming more and more important, access to reliable personal identification is becoming increasingly essential. Conventional methods of identification based on the possession of ID cards, or on exclusive knowledge like a social security number or a password, are not altogether reliable. ID cards can be lost, forged or misplaced; passwords can be forgotten or compromised. But a face is undeniably connected to its owner. It cannot be borrowed, stolen or easily forged. Hence the importance of the Facial Recognition System.

IPv6

Added on: October 23rd, 2013 by Afsal Meerankutty

IPv6: The Next Generation Protocol
Internet Protocol version 6 (IPv6) comes with a package of advantages, including a simple header format, a very large address space and extensibility. However, IPv6 packet transmission still uses the traditional protocol-stack infrastructure of TCP/IP, so these advantages cannot be exploited optimally. One limitation of TCP/IP is the duplication of error-detection code verification and regeneration in the Data Link layer: every router has to verify the CRC code at the incoming port and regenerate it at the outgoing port before forwarding an IPv6 packet to the next router. With advancing networking technology, this is a time-consuming task. This paper proposes a CRC Extension Header (CEH) that performs error detection in the Network layer and replaces the current error detection in the Data Link layer. With CEH, verification of the CRC code is done only at the final destination indicated by the destination address field of the IPv6 header. Experimental results showed that the network latency of IPv6 packet transmission decreases by 68%.
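The paper’s exact CEH layout is not reproduced here, but the end-to-end idea can be sketched with a toy example: the source appends a CRC over the payload, intermediate routers forward without touching it, and only the final destination verifies.

```python
import zlib

def add_ceh(payload: bytes) -> bytes:
    """Source node: append a hypothetical CRC Extension Header,
    modelled here as a plain 4-byte CRC-32 trailer."""
    crc = zlib.crc32(payload) & 0xFFFFFFFF
    return payload + crc.to_bytes(4, "big")

def verify_ceh(packet: bytes) -> bool:
    """Final destination: the only node that verifies the CRC,
    sparing every intermediate router the verify/regenerate work."""
    payload, crc = packet[:-4], int.from_bytes(packet[-4:], "big")
    return zlib.crc32(payload) & 0xFFFFFFFF == crc

pkt = add_ceh(b"IPv6 payload")
assert verify_ceh(pkt)                 # intact packet passes
assert not verify_ceh(b"X" + pkt[1:])  # bit error caught at the destination
```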

Automatic Teller Machine

Added on: October 23rd, 2013

An Automatic Teller Machine (ATM) is a machine that permits a bank’s customers to make cash withdrawals and check their account balances at any time, without the need for a human teller. Many ATMs also allow people to deposit cash or cheques and transfer money between their bank accounts.

You’re short on cash, so you walk over to the automated teller machine (ATM), insert your card into the card reader, respond to the prompts on the screen, and within a minute you walk away with your money and a receipt. These machines can now be found at most supermarkets, convenience stores and travel centers. Have you ever wondered about the process that makes your bank funds available to you at an ATM on the other side of the country?

Biocolours – Safe food colours

Added on: October 23rd, 2013

Biocolours-A New Generation Additive For Industries?
Biocolours, or natural dyes, are derived from plants, insects and minerals. The use of such colouring matter is rooted in antiquity: relics from the excavations of the Harappan culture have yielded evidence of ropes and fabrics dyed with natural colours, and the caves of Ajanta (the earliest dating back to the first century B.C.) still preserve the beauty of biocolours in their fullest splendour. In short, the use of biocolours through the art of dyeing and printing is one of our richest heritages. Biocolours paid a very heavy price with the development of synthetic dyestuffs: synthetic dyes made their advent in India in the 18th century and gradually pushed natural dyes into oblivion owing to their superior speed of dyeing and printing and the fastness of their colours.

Bio-Medical Waste Management

Added on: October 11th, 2013

Medical care is vital for our life and health, but the waste generated by medical activities represents a real problem for the living world. Improper management of the waste generated in health-care facilities has a direct health impact on the community, on health-care workers and on the environment. Every day, relatively large amounts of potentially infectious and hazardous waste are generated in health-care hospitals and facilities around the world. Indiscriminate disposal of biomedical waste (BMW), and exposure to such waste, poses a serious threat to the environment and to human health, and requires specific treatment and management prior to final disposal. The present review article deals with basic issues such as the definition and categories of biomedical waste, the problems relating to it, and the procedures for handling and disposal in biomedical waste management. It also intends to create awareness amongst the personnel involved in health-care units.

Compensation of Harmonic Currents Utilizing AHC

Added on: October 10th, 2013

In little more than ten years, electric power quality has grown from obscurity to a major issue.
Electronic converters and power electronics gave birth to numerous new applications, offering unmatched comfort, flexibility and efficiency to customers. However, their proliferation during the last decade is creating a growing concern and generating more and more problems: not only do these electronic loads pollute the AC distribution system with harmonic currents, they also appear to be very sensitive to voltage distortion.
Power quality is therefore becoming a major issue for utilities and for their customers, and both are quickly adopting the philosophy and the limits proposed by the new international standards (IEEE 519-1992, IEC 61000-3-2/4).
Today, recent advances in power electronic technology provide an unprecedented capability for conditioning and compensating the harmonic distortion generated by non-linear loads.
The case study presented in this paper demonstrates the roles of the power source, the load and the AC distribution system as regards power quality, and clearly shows the benefit of harmonic cancellation equipment. Among the different technical solutions, a shunt (current-injection mode) active harmonic conditioner is evaluated, and detailed site measurements are presented as confirmation of its unsurpassed performance. This new innovative active conditioner appears to be the easiest to use, the most flexible, and the most efficient and cost-effective one.
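As a toy illustration of the shunt, current-injection principle (not the conditioner’s real-time control loop), the harmonic content of a measured load current can be isolated and the opposite current injected so the source supplies only the fundamental:

```python
import numpy as np

fs, f1 = 10_000, 50                     # sample rate and fundamental (Hz)
t = np.arange(0, 0.2, 1 / fs)

# Stand-in non-linear load current: fundamental plus 5th and 7th harmonics.
i_load = (10.0 * np.sin(2 * np.pi * f1 * t)
          + 2.0 * np.sin(2 * np.pi * 5 * f1 * t)
          + 1.0 * np.sin(2 * np.pi * 7 * f1 * t))

spectrum = np.fft.rfft(i_load)
freqs = np.fft.rfftfreq(len(i_load), 1 / fs)

harmonics = spectrum.copy()
harmonics[np.abs(freqs - f1) < 1.0] = 0       # keep everything but 50 Hz
i_harm = np.fft.irfft(harmonics, len(i_load))

i_inject = -i_harm               # conditioner injects the inverse, in shunt
i_source = i_load + i_inject     # the source now sees ~pure fundamental
print(np.max(np.abs(i_source - 10.0 * np.sin(2 * np.pi * f1 * t))))  # ~0
```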

E-Paper Technology

Added on: October 10th, 2013

E-paper is a revolutionary material that can be used to make next-generation electronic displays. It is a portable, reusable storage and display medium that looks like paper but can be repeatedly written on thousands of times. These displays mark the beginning of a new era for battery-powered information appliances such as cell phones, pagers, watches and hand-held computers.

Two companies are carrying out pioneering work in the development of electronic ink, and both have devised ingenious methods to produce it. One is E Ink, a company based in Cambridge in the U.S.A.; the other is Xerox, doing research work at Xerox’s Palo Alto Research Center. Both technologies being developed commercially for electronically configurable paper-like displays rely on microscopic beads that change color in response to the charges on nearby electrodes.

Like traditional paper, e-paper must be lightweight, flexible, glare-free and low-cost. Research suggests that in just a few years this technology could replace paper in many situations, leading us into a truly paperless world.

Haptic Systems

Added on: October 10th, 2013

‘Haptics’ is a technology that adds the sense of touch to virtual environments. Users are given the illusion that they are touching or manipulating a real physical object.
This seminar discusses the important concepts in haptics and some of the most commonly used haptic systems, such as the ‘Phantom’, ‘CyberGlove’, ‘Novint Falcon’ and similar devices. Following this, a description of how sensors and actuators are used for tracking the position and movement of haptic systems is provided.
The different types of force-rendering algorithms are discussed next, along with the blocks that make up force rendering. Then a few applications of haptic systems are taken up for discussion.
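To give a flavour of force rendering, here is a minimal penalty-based sketch (stiffness and geometry are arbitrary assumptions): when the haptic probe penetrates a virtual wall, a Hooke’s-law restoring force is computed along the surface normal, typically at a ~1 kHz servo rate.

```python
import numpy as np

def render_wall_force(probe_pos, wall_y=0.0, k=800.0):
    """Penalty-based force for a virtual wall at y = wall_y.
    k is an assumed stiffness in N/m; F = k * penetration * n."""
    penetration = wall_y - probe_pos[1]      # depth below the wall surface
    if penetration <= 0.0:
        return np.zeros(3)                   # free space: no force
    normal = np.array([0.0, 1.0, 0.0])       # wall surface normal
    return k * penetration * normal

# Each servo tick: read device position, compute force, drive actuators.
print(render_wall_force(np.array([0.1, -0.002, 0.0])))  # ~1.6 N, pushing out
```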

Cryogenic Grinding

Added on: October 9th, 2013

Cryogenic grinding, also known as freezer milling, freezer grinding, and cryomilling, is the act of cooling or chilling a material and then reducing it into a small particle size. For example, thermoplastics are difficult to grind to small particle sizes at ambient temperatures because they soften, adhere in lumpy masses and clog screens. When chilled by dry ice, liquid carbon dioxide or liquid nitrogen, the thermoplastics can be finely ground to powders suitable for electrostatic spraying and other powder processes. Cryogenic grinding of plant and animal tissue is a technique used by microbiologists. Samples that require extraction of nucleic acids must be kept at −80 °C or lower during the entire extraction process. For samples that are soft or flexible at room temperature, cryogenic grinding may be the only viable technique for processing samples. A number of recent studies report on the processing and behavior of nanostructured materials via cryomilling.

IP Spoofing

Added on: October 9th, 2013

IP spoofing is a method of attacking a network in order to gain unauthorized access. The attack is based on the fact that Internet communication between distant computers is routinely handled by routers which find the best route by examining the destination address, but generally ignore the origination address. The origination address is only used by the destination machine when it responds back to the source.

In a spoofing attack, the intruder sends messages to a computer indicating that the message has come from a trusted system. To be successful, the intruder must first determine the IP address of a trusted system, and then modify the packet headers so that it appears that the packets are coming from the trusted system.
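At the packet level, the forgery is nothing more than writing someone else’s address into the source field. A hedged sketch using the Scapy library (for understanding only; raw-packet crafting needs root privileges and should be run only in a lab you own):

```python
from scapy.all import IP, ICMP, send

trusted = "10.0.0.5"   # hypothetical address of the trusted system
victim = "10.0.0.9"    # hypothetical target host

# Routers forward on the destination address alone, so this packet is
# delivered; the victim's reply goes to 'trusted', not to the real sender.
pkt = IP(src=trusted, dst=victim) / ICMP()
send(pkt)
```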

Black Box

Added on: October 8th, 2013

As technology progresses, travelling speeds have also increased, bringing source and destination ever closer together. The major advancement in air travel has been the airplane, a major achievement of technology. But as speed increased, the horror of the air crash was also introduced: a crash from a height of 2000 m or more is a terror for anybody. So, to take feedback from the various activities that happen in a plane and to record them, engineers needed a recording mechanism.
With any airplane crash, there are many unanswered questions as to what brought the plane down. Investigators turn to the airplane’s flight data recorder (FDR) and cockpit voice recorder (CVR), also known as “black boxes,” for answers. In Flight 261, the FDR contained 48 parameters of flight data, and the CVR recorded a little more than 30 minutes of conversation and other audible cockpit noises.

Thermo Acoustic Refrigeration

Added on: October 7th, 2013

Thermoacoustic effects have been known for many years, but the use of this phenomenon to develop engines and pumps is fairly recent. Thermoacoustic refrigeration uses high-intensity sound waves in a pressurized gas tube to pump heat from one place to another to produce a refrigeration effect. In this type of refrigeration all conventional refrigerants are eliminated and sound waves take their place; all we need is a loudspeaker and an acoustically insulated tube. The system also completely eliminates the need for lubricants and results in 40% less energy consumption. Thermoacoustic heat engines have the advantage of operating with inert gases and with little or no moving parts, making them highly efficient, ideal candidates for environmentally safe refrigeration with almost zero maintenance cost. We will now look into a thermoacoustic refrigerator, its principle and its functions.

Construction and Safety Management

Added on: October 6th, 2013

Managing construction sites is difficult due to the temporary nature of the work, changing work areas, untrained workers, overtime work, etc. Most accidents occur in construction activities such as building structures, demolition, excavation, roof work, alteration, scaffolding and painting. In construction work in Britain, about 1,500 people are killed and 25,000-30,000 more are seriously injured. These accident statistics represent not only serious human tragedies but also substantial economic losses, since accidents cause damage to plant and equipment, loss of productive time, loss of morale among workers, increased compensation, and loss of image and reputation for the industry. The key to a successful construction project is to identify vulnerable hazards and eliminate or minimize them. To avoid accidents, the causes of accidents and the reliability of the statistics must be analysed, the various construction regulations followed, and a safety policy framed. Workers are to be trained in safe methods of work, safe physical conditions provided on construction sites, sufficient and suitable personal protective equipment provided and its use insisted upon, and supervision of the work area ensured.

Underwater Communication Systems

Added on: October 3rd, 2013

There is a high demand for underwater communication systems due to the increase in current human underwater activities. Underwater communication systems employ either sonar or electromagnetic waves as a means of transferring signals. These waves are different physically and electrically, and thus the systems that employ them also differ in their design architecture, wave propagation and devices used for emission and reception. As a result, the two systems have varying advantages and limitations. This paper presents an in-depth review of underwater communication based on sonar and electromagnetic waves, a comparison of the two systems and a discussion of the environmental impacts of using these waves for underwater communication. In the tradeoff between preserving the underwater environment and the need for underwater communication, it appears that underwater electromagnetic wave communication has the most potential to be the environmentally-friendly system of the future.

Sonar and Acoustic Waves Communication

Added on: October 3rd, 2013

In recent years, the demand for underwater communication has increased with growing interest in, and human activity under, the sea. Underwater communication is done with the help of sonar waves, electromagnetic waves and acoustic waves, which are different in nature. This paper presents an overview of underwater communication using sonar and acoustic waves, and shows that acoustic-wave communication is better than sonar-wave communication. In addition, the factors which affect acoustic-wave communication are explained.
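One of those factors is frequency-dependent absorption. A commonly used empirical model is Thorp’s formula for seawater (the choice of model here is our assumption, not the paper’s); it shows why long-range acoustic links must use low frequencies:

```python
def thorp_absorption(f_khz: float) -> float:
    """Thorp's empirical absorption coefficient for seawater,
    in dB/km, with frequency given in kHz."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2)
            + 44.0 * f2 / (4100 + f2)
            + 2.75e-4 * f2
            + 0.003)

for f in (1, 10, 50, 100):   # kHz
    print(f"{f} kHz -> {thorp_absorption(f):.2f} dB/km")
```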

Finger Scan Technology

Added on: October 1st, 2013

Reliable user authentication is becoming an increasingly important task in the Web-enabled world. The consequences of an insecure authentication system in a corporate or enterprise environment may include loss of confidential information, denial of service, and compromised data integrity. The prevailing techniques of user authentication, which involve the use of either passwords and user IDs (identifiers), or identification cards and PINs (personal identification numbers), suffer from several limitations. Once an intruder acquires the user ID and the password, the intruder has total access to the user’s resources.
Fortunately, automated biometrics in general, and fingerprint technology in particular, can provide a much more accurate and reliable user authentication method. Biometrics is a rapidly advancing field concerned with identifying a person based on his or her physiological or behavioral characteristics. Examples of automated biometrics include fingerprint, face, iris, and speech recognition. Because a biometric property is an intrinsic property of an individual, it is difficult to duplicate surreptitiously and nearly impossible to share. The greatest strength of biometrics, the fact that a biometric does not change over time, is at the same time its greatest liability: once a set of biometric data has been compromised, it is compromised forever.

Adding Intelligence to Internet Using Satellites

Added on: September 30th, 2013

Two scaling problems face the Internet today. First, it will be years before terrestrial networks are able to provide adequate bandwidth uniformly around the world, given the explosive growth in Internet bandwidth demand and the amount of the world that is still unwired. Second, the traffic distribution is not uniform worldwide: Clients in all countries of the world access content that today is chiefly produced in a few regions of the world (e.g., North America). A new generation of Internet access built around geosynchronous satellites can provide immediate relief. The satellite system can improve service to bandwidth-starved regions of the globe where terrestrial networks are insufficient and supplement terrestrial networks elsewhere. This new generation of satellite system manages a set of satellite links using intelligent controls at the link endpoints. The intelligence uses feedback obtained from monitoring end-user behavior to adapt the use of resources. Mechanisms controlled include caching, dynamic construction of push channels, use of multicast, and scheduling of satellite bandwidth. This paper discusses the key issues of using intelligence to control satellite links, and then presents as a case study the architecture of a specific system: the Internet Delivery System, which uses INTELSAT’s satellite fleet to create Internet connections that act as wormholes between points on the globe.

River Linking

Added on: September 28th, 2013

River linking is a project for linking two or more rivers by creating a network of manually dug canals, providing river water to land areas that otherwise do not have access to it, and reducing the flow of water to the sea. It is based on the assumption that surplus water in some rivers can be diverted to deficit rivers by creating a network of canals to interconnect them.
It aims to transfer water from surplus areas to water-deficit areas in the country.
The Inter-Linking of Rivers programme will help save people living in drought-prone zones from hunger, and people living in flood-prone areas from the destruction caused by floods.

Augmented Reality

Added on: September 28th, 2013

Augmented Reality (AR) is a growing area in virtual reality research. Computer graphics have become much more sophisticated, and game graphics are pushing the barriers of photo realism. Now, researchers and engineers are pulling graphics out of your television screen or computer display and integrating them into real-world environments. This new technology blurs the line between what’s real and what’s computer-generated by enhancing what we see, hear, feel and smell.
The basic idea of augmented reality is to superimpose graphics, audio and other sensory enhancements over a real-world environment in real time. An augmented reality system generates a composite view for the user. It is a combination of the real scene viewed by the user and a virtual scene generated by the computer that augments the scene with additional information.

5 Pen PC Technology

Added on: September 27th, 2013

5 Pen PC Technology, called P-ISM (“Pen-style Personal Networking Gadget Package”), is a new concept under development by NEC Corporation. At the 2003 ITU Telecom World exhibition held in Geneva, the Tokyo-based NEC Corporation displayed a conceptual $30,000 prototype of P-ISM. It is a new invention in computing, associated with the communication field, and it will surely have a great impact on the computer field. In this device, Bluetooth is the main interconnecting technology between the different peripherals. P-ISM is a gadget package with five functions: a pen-style cellular phone with a handwriting data-input function, a virtual keyboard, a very small projector, a camera scanner, and a personal ID key with a cashless-pass function. P-ISMs are connected to one another through short-range wireless technology, and the whole set is connected to the Internet through the cellular-phone function. This personal gadget in a minimalist pen style enables the ultimate in ubiquitous computing.

Soft Ground Tunneling

Added on: September 27th, 2013

Soft ground tunneling involves the excavation of tunnels through cohesive and loose soils, including clay, silt, sand, gravel, and mud. The stability of the ground during the excavation, known as stand-up time, is a critical concern in this type of tunneling due to the inherent challenges posed by the soft soil. The limited stand-up time increases the risk of cave-ins, necessitating the use of specialized equipment known as shields to ensure the safety of the workers and the integrity of the tunnel.

The primary focus of this seminar topic report is to explore the techniques and methods employed in soft ground tunneling to mitigate the risks associated with unstable soil conditions. The use of shields is a fundamental aspect of this process, as they play a crucial role in stabilizing the tunnel face. These shields, typically constructed of iron or steel, are carefully pushed into the soft soil, creating a precisely round hole while providing support to the surrounding earth. As debris is removed and a permanent lining, often made of cast iron or precast concrete, is installed, the shield ensures the stability of the tunnel face.

The cyclic nature of soft ground tunneling is essential to comprehend. Upon completing a section, hydraulic jacks are employed to advance the shield, and the tunneling process is repeated. This step-by-step progression enables the workers to work safely and efficiently, avoiding potential cave-ins and ensuring that the tunnel maintains its integrity throughout the construction process.

Throughout the report, the significance of stand-up time and its correlation with the choice of equipment and construction methods are explored in depth. Understanding the ground conditions and the behavior of soft soils is crucial for selecting the appropriate shield design and ensuring the safety of the workers and the overall success of the tunneling project.

Moreover, the report investigates the various types of shields available and their suitability for different soft ground conditions. Engineers and construction professionals need to consider factors such as soil type, tunnel depth, groundwater levels, and the potential impact on existing structures to determine the most effective shield design for a specific project.

In conclusion, this seminar topic report provides valuable insights into the complex world of soft ground tunneling. By addressing the challenges of working with soft soils and the importance of stand-up time, the report emphasizes the significance of using shields as a protective measure during excavation. The utilization of shields not only ensures the safety of workers but also contributes to the successful construction of tunnels through cohesive and loose soils, advancing infrastructure development and enhancing transportation networks.

Cruise Missile Technology

Added on: September 26th, 2013

A cruise missile is basically a small, pilotless airplane. Cruise missiles have an 8.5-foot (2.61-meter) wingspan, are powered by turbofan engines and can fly 500 to 1,000 miles (805 to 1,610 km) depending on the configuration. A cruise missile’s job in life is to deliver a 1,000-pound (450-kg) high-explosive bomb to a precise location — the target.
Cruise missiles come in a number of variations and can be launched from submarines, destroyers or aircraft. Cruise missiles generally consist of a guidance system, payload, and propulsion system, housed in an airframe with small wings and empennage for flight control. Payloads usually consist of a conventional warhead or a nuclear warhead. Cruise missiles tend to be propelled by a jet engine, turbofan engines being preferred due to their greater efficiency at low altitude. Cruise missiles are designed to deliver a large warhead over long distances with high accuracy. Modern cruise missiles can travel at supersonic or high subsonic speeds, are self-navigating, and can fly on a non-ballistic, extremely low altitude trajectory. They are distinct from unmanned aerial vehicles (UAV) in that they are used only as weapons and not for reconnaissance. In a cruise missile, the warhead is integrated into the vehicle and the vehicle is always sacrificed in the mission.

Ad hoc Networks

Added on: September 25th, 2013

Recent advances in portable computing and wireless technologies are opening up exciting possibilities for the future of wireless mobile networking. A mobile ad hoc network (MANET) is an autonomous system of mobile hosts connected by wireless links. Mobile networks can be classified into infrastructure networks and mobile ad hoc networks according to their dependence on fixed infrastructures. In an infrastructure mobile network, mobile nodes have wired access points (or base stations) within their transmission range.
The access points compose the backbone of an infrastructure network. In contrast, mobile ad hoc networks are autonomously self-organized networks without infrastructure support. In a mobile ad hoc network, nodes move arbitrarily, so the network may experience rapid and unpredictable topology changes. Additionally, because nodes in a mobile ad hoc network normally have limited transmission ranges, some nodes cannot communicate directly with each other. Hence, routing paths in mobile ad hoc networks potentially contain multiple hops, and every node has the responsibility to act as a router.

Mobile ad hoc networks originated from the DARPA Packet Radio Network (PRNet) and SURAN projects. Being independent of pre-established infrastructure, mobile ad hoc networks have advantages such as rapid and easy deployment, improved flexibility and reduced costs.
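The multi-hop idea can be sketched with a breadth-first route search over a snapshot of the topology (real protocols such as AODV or DSR discover routes by flooding request packets and must cope with the topology changing underneath them):

```python
from collections import deque

def find_route(adjacency, src, dst):
    """Shortest-hop route over a static snapshot of a MANET topology."""
    parent = {src: None}
    queue = deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:                      # reconstruct path back to src
            path = []
            while node is not None:
                path.append(node)
                node = parent[node]
            return path[::-1]
        for neighbor in adjacency.get(node, ()):  # nodes in radio range
            if neighbor not in parent:
                parent[neighbor] = node
                queue.append(neighbor)
    return None                              # no route at this instant

# A and D are out of each other's radio range; B and C act as routers.
topology = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
print(find_route(topology, "A", "D"))        # -> ['A', 'B', 'C', 'D']
```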

Axial-Field Electrical Machines

Added on: September 25th, 2013

Axial-field electrical machines offer an alternative to conventional machines. In the axial-field machine, the air-gap flux is axial in direction and the active current-carrying conductors are radially positioned. This paper presents the design characteristics, special features, manufacturing aspects and potential applications of axial-field electrical machines. Experimental results from several prototypes, including d.c. machines, synchronous machines and single-phase machines, are given. The special features of the axial-field machine, such as its planar and adjustable air gap, flat shape and ease of diversification, give it distinct advantages over conventional radial-field machines in certain applications, especially special-purpose ones. The axial-field machines described in this paper are particularly suitable as d.c. and synchronous machines, where the double air gaps present no difficulty, since these machines normally require fairly large air gaps to reduce the effect of armature reaction. One of the major objections to the use of AFMs lies in the difficulty of cutting the slots in their laminated cores.

Basalt Rock Fibre (BRF)

Added on: September 24th, 2013

Basalt is well known as a rock found in virtually every country around the world. Basalt rock fibres have no toxic reaction with air or water, and are non-combustible and explosion-proof. In contact with other chemicals they produce no reactions that may damage health or the environment. Basalt-based composites can replace steel and known reinforced plastics (1 kg of basalt reinforcement equals 9.6 kg of steel). There is something quite poetic in using a fibre made from natural rock to reinforce a material which might quite reasonably be described as artificial rock. The raw material for producing basalt fibre is rock of volcanic origin: fibres are obtained by melting basalt stones at a temperature of 1400 °C, and the melted basalt mass passes through a platinum bushing and is drawn into fibres. The special properties of basalt rock fibre reduce the cost of products whilst improving their performance.

Scope: Low-cost, high-performance basalt fibres offer the potential to solve the largest problem in the cement and concrete industry: cracking and structural failure of concrete. Basalt-fibre-reinforced concrete could become the leading reinforcement system in the world for minimizing cracking, reducing road wear, improving concrete product life, lowering maintenance and replacement costs, and minimizing concrete-industry lawsuits. It was therefore with considerable interest that the use of basalt fibres as a reinforcing material for concrete was taken up. We propose here to investigate the use of basalt fibres in low-cost composites for civil infrastructure applications requiring excellent mechanical support properties and long lifetimes. Because of the higher performance (strength, temperature range and durability) and lower potential cost predicted for basalt fibres, they have the potential to cost-effectively replace fiberglass, steel fibre, polypropylene, polyethylene, polyester, aramid and carbon-fibre products in many applications.

Intelligent Transportation System

Added on: March 27th, 2012

National highway ITS has shifted from traditional “Transportation Management” to “Integrated Road Transportation Management”, which incorporates road safety and management technologies. This paper describes a road management system suitable for national highway ITS. When an efficient integrated road transportation management system (transportation + road management) is developed by introducing a road management system into national highway ITS, reductions in traffic congestion cost, travel time and traffic accidents, and improvements in road management, are expected thanks to the integration of road safety and management technologies. Based on the automobiles and mobile phones distributed in Korea, the creation of a new market in the telematics and ubiquitous area is also highly expected.
Keywords: Advanced Road Management System, Intelligent Transportation System (ITS), IT technology

Light Peak

Added on: March 27th, 2012

Light Peak is the code name for a new high-speed optical cable technology designed to connect electronic devices to each other. Light Peak delivers high bandwidth starting at 10 Gb/s, with the potential to scale to 100 Gb/s over the next decade. At 10 Gb/s, a full-length Blu-ray movie can be transferred in less than 30 seconds. Light Peak allows for smaller connectors and longer, thinner and more flexible cables than currently possible. It also has the ability to run multiple protocols simultaneously over a single cable, enabling the technology to connect devices such as peripherals, displays, disk drives, docking stations and more.
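The Blu-ray claim is easy to sanity-check (the disc size below is our assumption, not Intel’s figure):

```python
link_gbps = 10                       # Light Peak's starting bandwidth
movie_gb = 25                        # assumed single-layer Blu-ray, gigabytes
seconds = movie_gb * 8 / link_gbps   # gigabits over gigabits-per-second
print(seconds)                       # -> 20.0 s, consistent with "< 30 s"
```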

Free Space Optics

Added on: March 26th, 2012

Free space optics (FSO) is a line-of-sight technology that currently enables optical transmission of up to 2.5 Gbps of data, voice and video communications through the air, allowing optical connectivity without deploying fiber-optic cables or securing spectrum licenses. An FSO system can carry full-duplex data at gigabit-per-second rates over metropolitan distances of a few city blocks to a few kilometres. FSO, also known as optical wireless, overcomes the last-mile access bottleneck by sending high-bitrate signals through the air using laser transmission.

Security Issues in Grid Computing

Added on: March 25th, 2012

The last decade has seen a considerable increase in commodity computer and network performance, mainly as a result of faster hardware and more sophisticated software. Nevertheless there are still problems in the fields of science, engineering and business which cannot be dealt with effectively by the current generation of supercomputers. In fact, due to their size and complexity, these problems are often numerically and/or data intensive and require a variety of heterogeneous resources that are not available from a single machine. A number of teams have conducted experimental studies on the cooperative use of geographically distributed resources conceived as a single powerful computer. This new approach is known by several names, such as metacomputing, seamless scalable computing, global computing and, more recently, Grid computing.

The early efforts in Grid computing started as a project to link supercomputing sites, but it has now grown far beyond its original intent. The rapid and impressive growth of the Internet has made it an attractive means of sharing information across the globe. The idea of Grid computing has emerged from the fact that the Internet can also be used for several other purposes, such as sharing computing power, storage space, scientific devices and software programs. The term “Grid” was chosen as it is analogous to the electrical power grid, which provides consistent, pervasive and ubiquitous power irrespective of its source. The main aim of this paper is to present the state of the art and the issues in Grid computing, and to survey the major international efforts in developing this emerging technology.

Intelligent Cooling System

Added on: March 25th, 2012

In the present paper, efforts have been made to highlight the concept of an “INTELLIGENT COOLING SYSTEM”. The basic principle behind it is to control the flow rate of the coolant by regulating a valve using FUZZY LOGIC.

In the conventional process the flow rate is constant over the entire engine jacket. This induces thermal stresses and a reduction in efficiency.

The “INTELLIGENT COOLING SYSTEM”, i.e. the implementation of fuzzy logic, will overcome the above-stated drawbacks of any crisp control scheme. The flow rate of the coolant will be controlled by a control unit and intelligent sensors.

This is a concept and an innovative idea that has not been implemented yet.
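To give a feel for how such a controller might behave, here is a minimal Sugeno-style sketch (all temperature ranges and valve settings are invented): instead of snapping the valve open at a crisp threshold, overlapping fuzzy sets blend the rule outputs smoothly.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def coolant_valve_opening(temp_c: float) -> float:
    """Blend three rules into a valve opening in [0, 1]."""
    rules = [
        (triangular(temp_c, 60, 75, 90), 0.2),     # COOL -> valve mostly shut
        (triangular(temp_c, 80, 95, 110), 0.6),    # WARM -> valve half open
        (triangular(temp_c, 100, 115, 130), 1.0),  # HOT  -> valve fully open
    ]
    total = sum(weight for weight, _ in rules)
    if total == 0.0:
        return 0.0 if temp_c < 60 else 1.0         # outside all fuzzy sets
    return sum(w * out for w, out in rules) / total  # weighted average

for t in (70, 90, 105, 120):
    print(f"{t} C -> valve {coolant_valve_opening(t):.2f}")
```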

Optical Fibers

Added on: March 25th, 2012

An optical fiber (or fibre) is a glass or plastic fiber that carries light along its length. Fiber optics is the overlap of applied science and engineering concerned with the design and application of optical fibers. Optical fibers are widely used in fiber-optic communications, which permits transmission over longer distances and at higher bandwidths (data rates) than other forms of communications. Fibers are used instead of metal wires because signals travel along them with less loss, and they are also immune to electromagnetic interference. Fibers are also used for illumination, and are wrapped in bundles so they can be used to carry images, thus allowing viewing in tight spaces. Specially designed fibers are used for a variety of other applications, including sensors and fiber lasers.

Light is kept in the core of the optical fiber by total internal reflection. This causes the fiber to act as a waveguide. Fibers which support many propagation paths or transverse modes are called multi-mode fibers (MMF), while those which can only support a single mode are called single-mode fibers (SMF). Multi-mode fibers generally have a larger core diameter, and are used for short-distance communication links and for applications where high power must be transmitted. Single-mode fibers are used for most communication links longer than 550 meters (1,800 ft).
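The guiding condition is easy to quantify. With representative (assumed) refractive indices for a silica fiber, the critical angle for total internal reflection and the numerical aperture follow directly:

```python
import math

n_core, n_clad = 1.475, 1.460    # representative values, not from the text

# Rays hitting the core-cladding boundary closer to the normal than the
# critical angle escape into the cladding; shallower rays are guided.
critical_angle = math.degrees(math.asin(n_clad / n_core))

# Numerical aperture: sine of the maximum acceptance half-angle in air.
na = math.sqrt(n_core**2 - n_clad**2)
acceptance = math.degrees(math.asin(na))

print(f"critical angle        ~ {critical_angle:.1f} deg")  # ~81.8
print(f"numerical aperture    ~ {na:.3f}")                  # ~0.210
print(f"acceptance half-angle ~ {acceptance:.1f} deg")      # ~12.1
```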

Optical Fiber Communication System

Added on: March 25th, 2012

Communication is an important part of our daily life. The communication process involves information generation, transmission, reception and interpretation. As the needs for various types of communication such as voice, images, video and data increase, demands for large transmission capacity also increase. This need for large capacity has driven the rapid development of lightwave technology, and a worldwide industry has developed. An optical or lightwave communication system is a system that uses light waves as the carrier for transmission. An optical communication system mainly involves three parts: transmitter, receiver and channel. In optical communication the transmitters are light sources, the receivers are light detectors, and the channels are optical fibers. The channel, i.e. the optical fiber, plays an important role because it carries the data from the transmitter to the receiver. Hence we shall discuss mainly optical fibers here.

Pneumatic Control System

Added on: March 24th, 2012

Pneumatic control systems use a pressure differential created by a gas source to drive the transfer of material. They are all about using compressed air to operate and power a system: air taken from the atmosphere is squeezed, or compressed, and this compressed air is then used in a pneumatic system to do work. Pneumatic systems are used in many fields, such as lorry brakes, bicycle and car tyres, paint spraying and aircraft hydraulic systems. In this paper, we propose an intelligent control method for a pneumatic servo nonlinear system with static friction. Real-machine experiments confirmed the improvement in speed of response and stopping accuracy, and the effectiveness of the proposed method.

Pulse Detonation Engine

Added on: March 24th, 2012

Rocket engines that work much like an automobile engine are being developed at NASA’s Marshall Space Flight Center in Huntsville, Ala. Pulse detonation rocket engines offer a lightweight, low-cost alternative for space transportation. Pulse detonation rocket engine technology is being developed for upper stages that boost satellites to higher orbits. The advanced propulsion technology could also be used for lunar and planetary landers and excursion vehicles that require throttle control for gentle landings.

The engine operates on pulses, so controllers could dial in the frequency of the detonation in the “digital” engine to determine thrust. Pulse detonation rocket engines operate by injecting propellants into long cylinders that are open on one end and closed on the other. When gas fills a cylinder, an igniter—such as a spark plug—is activated. Fuel begins to burn and rapidly transitions to a detonation, or powered shock. The shock wave travels through the cylinder at 10 times the speed of sound, so combustion is completed before the gas has time to expand. The explosive pressure of the detonation pushes the exhaust out the open end of the cylinder, providing thrust to the vehicle.
A major advantage is that pulse detonation rocket engines boost the fuel and oxidizer to extremely high pressure without a turbo pump—an expensive part of conventional rocket engines. In a typical rocket engine, complex turbo pumps must push fuel and oxidizer into the engine chamber at an extremely high pressure of about 2,000 pounds per square inch or the fuel is blown back out.

The pulse mode of pulse detonation rocket engines allows the fuel to be injected at a low pressure of about 200 pounds per square inch. Marshall Engineers and industry partners United Technology Research Corp. of Tullahoma, Tenn. and Adroit Systems Inc. of Seattle have built small-scale pulse detonation rocket engines for ground testing. During about two years of laboratory testing, researchers have demonstrated that hydrogen and oxygen can be injected into a chamber and detonated more than 100 times per second.

NASA and its industry partners have also proven that a pulse detonation rocket engine can provide thrust in the vacuum of space. Technology development now focuses on determining how to ignite the engine in space, proving that sufficient amounts of fuel can flow through the cylinder to provide superior engine performance, and developing computer code and standards to reliably design and predict performance of the new breed of engines.
A developmental, flight-like engine could be ready for demonstration by 2005 and a full-scale, operational engine could be finished about four years later. Manufacturing pulse detonation rocket engines is simple and inexpensive. Engine valves, for instance, would likely be a sophisticated version of automobile fuel injectors. Pulse detonation rocket engine technology is one of many propulsion alternatives being developed by the Marshall Center’s Advanced Space Transportation Program to dramatically reduce the cost of space transportation.

Humanoid Robot

Added on: March 23rd, 2012

The field of humanoid robotics is widely recognized as the current challenge for robotics research. Humanoid research is an approach to understanding and realizing the complex real-world interactions between a robot, an environment and a human. Humanoid robotics motivates social interactions such as gesture communication or cooperative tasks in the same context as the physical dynamics. This is essential for three-term interaction, which aims at fusing physical and social interaction at fundamental levels.

People naturally express themselves through facial gestures and expressions. Our goal is to build a facial-gesture human-computer interface for use in robot applications. This system does not require special illumination or facial make-up. By using multiple Kalman filters we accurately predict and robustly track facial features. Since we reliably track the face in real time, we are also able to recognize motion gestures of the face. Our system can recognize a large set of gestures (13), ranging from “yes”, “no” and “maybe” to detecting winks, blinks and sleeping.
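The per-feature tracking step can be sketched as a constant-velocity Kalman filter on each tracked coordinate (the frame rate and noise levels below are assumptions): predict where the feature will be in the next frame, then correct with the measured pixel position.

```python
import numpy as np

dt = 1 / 30                      # assumed frame interval (30 fps video)
F = np.array([[1, dt], [0, 1]])  # state transition for [position, velocity]
H = np.array([[1.0, 0.0]])       # we only measure position
Q = np.eye(2) * 1e-3             # process noise covariance (assumed)
R = np.array([[2.0]])            # measurement noise covariance (assumed)

x = np.array([[0.0], [0.0]])     # initial state estimate
P = np.eye(2)                    # initial uncertainty

def kalman_step(x, P, z):
    # Predict the feature's state at the next frame...
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # ...then correct the prediction with the measured position z.
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (np.array([[z]]) - H @ x_pred)
    P_new = (np.eye(2) - K @ H) @ P_pred
    return x_new, P_new

for z in (100.0, 102.1, 104.2, 106.0):  # noisy x-coordinates, frame by frame
    x, P = kalman_step(x, P, z)
print(x.ravel())   # filtered position and velocity (px, px per frame)
```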

Solar Power Satellites

Added on: March 22nd, 2012

The new millennium has brought increased pressure to find new renewable energy sources. The exponential increase in population has led to global crises such as global warming, environmental pollution and change, and the rapid depletion of fossil reserves. The demand for electric power also increases at a much higher pace than other energy demands as the world becomes industrialized and computerized. Under these circumstances, research has been carried out into the possibility of building a power station in space to transmit electricity to Earth by way of radio waves: the Solar Power Satellite. A Solar Power Satellite (SPS) converts solar energy into microwaves and sends them in a beam to a receiving antenna on Earth for conversion into ordinary electricity. SPS is a clean, large-scale, stable electric power source. The Solar Power Satellite is known by a variety of other names, such as Satellite Power System, Space Power Station, Space Power System, Solar Power Station and Space Solar Power Station. One of the key technologies needed to enable the future feasibility of SPS is microwave wireless power transmission (WPT). WPT is based on the energy-transfer capacity of a microwave beam, i.e. energy can be transmitted by a well-focused microwave beam. Advances in phased-array antennas and rectennas have provided the building blocks for a realizable WPT system.
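A back-of-the-envelope diffraction estimate (all figures here are illustrative assumptions) shows why the ground rectenna in SPS concepts is kilometres across: a microwave beam from geostationary orbit cannot be focused more tightly than roughly λ/D.

```python
import math

c = 3.0e8          # speed of light, m/s
f = 2.45e9         # frequency often assumed in WPT studies, Hz
wavelength = c / f # ~0.122 m

d = 35_786e3       # geostationary altitude, m
D_tx = 1_000.0     # assumed transmitting-array diameter, m

# A diffraction-limited beam spreads by about theta ~ lambda / D_tx,
# so the main lobe on the ground is roughly d * lambda / D_tx across.
spot_km = d * wavelength / D_tx / 1e3
print(f"ground spot ~ {spot_km:.1f} km across")   # several kilometres
```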

Geothermal Energy

Added on: March 21st, 2012

The word geothermal comes from the Greek words geo (earth) and therme (heat). So, geothermal energy is heat from within the earth. We can use the steam and hot water produced inside the earth to heat buildings or generate electricity. Geothermal energy is a renewable energy source because the water is replenished by rainfall and the heat is continuously produced inside the earth.

Historically, the first applications of geothermal energy were for space heating, cooking and medical purposes. The earliest record of space heating dates back to 1300 in Iceland. In the early 1800s, geothermal energy was used on what was then a large scale by Count Francesco de Larderel to recover boric acid. The first mechanical conversion was in 1897, when steam from the field at Larderello, Italy, was used to heat a boiler producing steam which drove a small steam engine. The first attempt to produce electricity also took place at Larderello, in 1904, with an electricity generator that powered four light bulbs. This was followed in 1912 by a condensing turbine, and by 1914, 8.5 MW of electricity was being produced. By 1944 Larderello was producing 127 MW. The plant was destroyed near the end of World War II, but was fortunately rebuilt and expanded, eventually reaching 360 MW in 1981.

SSL and TLS

Added on: March 20th, 2012

Secure Socket Layer (SSL) denotes the predominant security protocol of the Internet for World Wide Web (WWW) services relating to electronic commerce or home banking.

The majority of web servers and browsers support SSL as the de-facto standard for secure client-server communication. The Secure Socket Layer protocol builds up point-to-point connections that allow private and unimpaired message exchange between strongly authenticated parties.

In the ISO/OSI reference model [ISO7498], SSL resides in the session layer between the transport layer (4) and the application layer (7); with respect to the Internet family of protocols this corresponds to the range between TCP/IP and application protocols such as HTTP, FTP, Telnet, etc. SSL provides no intrinsic synchronization mechanism; it relies on the reliable transport layer (TCP) below.

The SSL protocol allows mutual authentication between a client and server and the establishment of an authenticated and encrypted connection. SSL runs above TCP/IP and below HTTP, LDAP, IMAP, NNTP, and other high-level network protocols.
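In practice, the handshake and the encrypted channel amount to a few lines in most languages. A minimal client-side sketch with Python's standard library (note that the library today negotiates TLS, SSL's successor; the hostname is a placeholder):

```python
import socket
import ssl

hostname = "example.com"                # placeholder server
context = ssl.create_default_context()  # CA verification, sane defaults

with socket.create_connection((hostname, 443)) as sock:
    # wrap_socket performs the handshake: the server authenticates with
    # its certificate and an encrypted session is established.
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        print(tls.version())                  # e.g. 'TLSv1.3'
        print(tls.getpeercert()["subject"])   # authenticated identity
```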
