
Engineering Topics Category

Protection of Distribution System

Added on: July 25th, 2023 by Webmaster

Protection of Distribution System: Challenges, Innovations, and Future Prospects

The distribution system plays a crucial role in ensuring the reliable and efficient supply of electrical power to end-users. However, with the growing complexity and interconnection of power grids, the protection of distribution systems has become an ever more critical concern. This seminar report delves into the challenges faced by the distribution system’s protection, the innovative solutions proposed to address these challenges, and the potential future prospects in this rapidly evolving field.

The report commences by highlighting the vulnerabilities of the distribution system, which are primarily attributed to the increasing penetration of distributed energy resources (DERs) such as solar photovoltaics, wind turbines, and energy storage systems. These DERs introduce bidirectional power flows, power quality issues, and islanding risks, necessitating the adaptation of existing protection schemes or the development of new ones.

In the subsequent section, the report delves into the advancements in protection technologies, including the implementation of smart grid solutions and communication systems. These innovations enable more reliable and adaptive protection, fault detection, and isolation. Additionally, the incorporation of advanced relaying techniques, such as distance, differential, and adaptive protection, is explored for their enhanced accuracy and sensitivity.

Furthermore, the seminar report examines the role of artificial intelligence and machine learning techniques in revolutionizing distribution system protection. These data-driven approaches enable predictive maintenance, fault classification, and event prediction, thereby contributing to the overall resilience and stability of the grid.
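
To make the data-driven idea above concrete, the sketch below trains a generic classifier on a handful of made-up disturbance records; the feature set, class labels and library choice are illustrative assumptions for the sketch, not the schemes surveyed in the report.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative features per recorded event:
# [rms current (pu), rms voltage (pu), total harmonic distortion, zero-sequence current (pu)]
X_train = np.array([[1.0, 1.00, 0.02, 0.00],   # normal operation
                    [6.5, 0.55, 0.08, 0.10],   # three-phase fault
                    [3.2, 0.80, 0.05, 2.40]])  # single line-to-ground fault
y_train = ["normal", "3ph_fault", "slg_fault"]

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)
new_event = [[3.0, 0.82, 0.06, 2.10]]
print(clf.predict(new_event))   # likely the ground-fault class for this toy training set

A real protection application would of course be trained on thousands of labelled waveform records and validated against relay coordination requirements before any field use.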

Moreover, the report emphasizes the significance of cybersecurity in safeguarding the distribution system. As power grids become increasingly digitalized and interconnected, the risk of cyber-attacks on protection relays and control systems escalates. The seminar explores various cybersecurity measures and best practices to mitigate these threats.

Furthermore, the seminar report addresses the regulatory and policy aspects related to distribution system protection. It discusses the challenges in coordinating protection settings across multiple stakeholders, including utilities, independent power producers, and consumers. The analysis of international standards and guidelines related to distribution system protection aids in understanding the global efforts to ensure grid resilience and reliability.

Lastly, the report outlines potential future prospects in the field of distribution system protection. These encompass emerging technologies like blockchain, advanced sensor networks, and self-healing systems that hold promise for further enhancing the resilience and robustness of distribution grids.

In conclusion, the protection of distribution systems is a multifaceted challenge that demands constant innovation and adaptation. This seminar report provides insights into the vulnerabilities of the distribution system, the latest protection technologies, the integration of artificial intelligence, the importance of cybersecurity, and the future prospects. By addressing these issues, the power industry can move closer to achieving a highly reliable, flexible, and secure distribution grid that meets the needs of a sustainable and energy-diverse future.

Solar Based Refrigerator

Added on: July 20th, 2023 by Webmaster

The solar-based refrigerator is an innovative and sustainable solution that harnesses the power of solar energy to provide cooling without relying on traditional electricity sources. This seminar presentation report explores the principles, working mechanisms, advantages, and applications of solar-based refrigerators. The report delves into the environmental benefits of such systems, including reduced carbon emissions and energy conservation. Furthermore, it highlights the potential for solar-based refrigerators to improve access to refrigeration in remote areas and off-grid regions.

In recent years, the adverse effects of conventional refrigeration methods have become evident, prompting a shift towards more environmentally friendly alternatives. Conventional refrigeration systems, relying on electricity and synthetic refrigerants with high Global Warming Potential (GWP), contribute significantly to greenhouse gas emissions and ozone depletion. Moreover, their energy-intensive nature leads to substantial electricity consumption, resulting in rising energy costs and strained power grids.

In light of these challenges, solar-based refrigerators offer a beacon of hope for a sustainable future. The working principles of solar refrigeration are based on the well-established thermodynamic cycle, where solar energy is utilized to drive the cooling process. By using solar panels to capture sunlight, solar-based refrigerators can produce electricity or direct heat to power the refrigeration cycle, significantly reducing the reliance on conventional grid electricity and mitigating the environmental impact.

This report further delves into the technical aspects of solar-based refrigeration, exploring the components and system design considerations. Energy storage methods, such as batteries and phase change materials, are discussed to ensure continuous cooling during periods of limited sunlight. Additionally, the selection of appropriate cooling components, like evaporators and compressors, and effective insulation techniques are vital for optimizing the performance and efficiency of solar-based refrigeration systems.
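
As a rough illustration of the sizing questions raised above, the PV power needed to sustain a given cooling load can be estimated from the refrigeration cycle's coefficient of performance; the COP, load and conversion-efficiency figures below are assumptions made for the sketch, not values from the report.

def required_pv_power_w(cooling_load_w, cop, conversion_efficiency=0.9):
    """PV power a solar vapour-compression refrigerator needs on average:
    P_pv = Q_cool / (COP * eta), before any margin for storage and cloudy days."""
    return cooling_load_w / (cop * conversion_efficiency)

# Assumed figures: 60 W average cooling load, COP of 1.5, 90 % electrical conversion chain.
print(required_pv_power_w(60, 1.5))   # about 44 W of PV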

The advantages of adopting solar-based refrigerators are multifaceted. Beyond the substantial environmental benefits, these systems offer reduced operating costs, making refrigeration more economically viable in the long run. Furthermore, the versatility and portability of solar-based refrigerators make them ideal for use in remote areas, off-grid communities, disaster relief scenarios, and mobile applications, where conventional cooling solutions may be impractical or inaccessible.

Real-world applications and case studies demonstrate the feasibility and success of solar-based refrigeration across various sectors. From providing essential healthcare services in remote clinics to supporting agriculture through cold storage for perishable produce, these case studies showcase the transformative potential of solar refrigeration in addressing pressing challenges faced by diverse communities.

The seminar presentation report also examines the environmental impact and sustainability of solar-based refrigerators. By substantially reducing greenhouse gas emissions and promoting energy conservation, these systems align closely with global efforts to combat climate change and achieve sustainable development goals. The report emphasizes the positive ecological footprint of solar refrigeration as a powerful tool for environmental preservation.

Despite the numerous advantages, challenges still exist in the widespread adoption of solar-based refrigeration. High initial costs, energy storage limitations, and climatic influences remain areas of concern. However, ongoing research and technological advancements are continuously improving the efficiency, affordability, and reliability of solar-based refrigeration, bolstering its potential to become a mainstream cooling solution in the future.

In conclusion, this seminar presentation report advocates for the adoption of solar-based refrigeration technology as a viable and sustainable alternative to conventional cooling systems. Embracing solar-based refrigerators can lead to a more equitable distribution of cooling solutions worldwide, improving the quality of life for underserved communities and contributing to a greener, more environmentally conscious global society.

TigerSHARC Processor

Added on: February 25th, 2020 by Afsal Meerankutty

In the past three years, several multiple-datapath and pipelined digital signal processors have been introduced into the marketplace. This new generation of DSPs takes advantage of higher levels of integration than were available to their predecessors. The TigerSHARC processor is the newest and most powerful member of this family; it incorporates mechanisms such as SIMD, VLIW and short-vector memory access in a single processor. This is the first time that all these techniques have been combined in a real-time processor.

The TigerSHARC DSP is an ultra-high-performance static superscalar architecture that is optimized for telecommunications infrastructure and other computationally demanding applications. This unique architecture combines elements of RISC, VLIW, and standard DSP processors to provide native support for 8-, 16-, and 32-bit fixed-point as well as floating-point data types on a single chip.

Large on-chip memory, extremely high internal and external bandwidths, and dual compute blocks provide the capabilities needed to handle a vast array of computationally demanding, large signal processing tasks.

Solar Sail

Added on: February 24th, 2020 by Webmaster

Hundreds of space missions have been launched since the last lunar mission, including several deep space probes that have been sent to the edges of our solar system. However, our journeys to space have been limited by the power of chemical rocket engines and the amount of rocket fuel that a spacecraft can carry. Today, the weight of a space shuttle at launch is approximately 95 percent fuel. What could we accomplish if we could reduce our need for so much fuel and the tanks that hold it?
International space agencies and some private corporations have proposed many methods of transportation that would allow us to go farther, but a manned space mission has yet to go beyond the moon. The most realistic of these space transportation options calls for the elimination of both rocket fuel and rocket engines — replacing them with sails. Yes, that’s right, sails.
Solar-sail mission analysis and design is currently performed assuming constant optical and mechanical properties of the thin metalized polymer films that are projected for solar sails. More realistically, however, these properties are likely to be affected by the damaging effects of the space environment. The standard solar-sail force models therefore cannot be used to investigate the consequences of these effects on mission performance. The aim of this paper is to propose a new parametric model for describing the sail film’s optical degradation with time. In particular, the sail film’s optical coefficients are assumed to depend on its environmental history, that is, the radiation dose. Using the proposed model, the optimal control laws for degrading solar sails are derived using an indirect method, and the effects of different degradation behaviors are investigated for an example interplanetary mission.
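
The paper's exact parametric law is not reproduced here; the sketch below only illustrates the kind of dose-dependent degradation model being described, assuming a simple exponential decay of reflectivity with accumulated radiation dose and purely illustrative parameter values.

import math

def degraded_reflectivity(r_begin, r_end, dose, decay_rate):
    """Illustrative degradation law: reflectivity falls from its beginning-of-life
    value towards an end-of-life value as the accumulated radiation dose grows."""
    return r_end + (r_begin - r_end) * math.exp(-decay_rate * dose)

def accumulated_dose(solar_distance_au, days):
    """Crude dose proxy: solar flux scales with 1/r^2, integrated over the elapsed time."""
    return days / solar_distance_au ** 2

# Example: reflectivity after 200 days of cruising at 0.75 AU, with assumed parameters.
dose = accumulated_dose(0.75, 200)
print(degraded_reflectivity(0.88, 0.70, dose, 0.002))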

Tele Immersion

Added on: February 24th, 2020 by Afsal Meerankutty

Tele-immersion may be the next major development in information technology. Using tele-immersion, you can visit an individual across the world without setting foot outside.
Tele-immersion is a new medium that enables a user to share a virtual space with remote participants. The user is immersed in a 3D world that is transmitted from a remote site. This medium for human interaction, enabled by digital technology, approximates the illusion that a person is in the same physical space as others, even though they may be thousands of miles distant. It combines the display and interaction techniques of virtual reality with new computer-vision technologies. Thus, with the aid of this new technology, users at geographically distributed sites can collaborate in real time in a shared, simulated, hybrid environment, immersed in one another’s presence and feeling as if they share the same physical space.

Plagiarism Detection Of Images

Added on: February 23rd, 2020 by Afsal Meerankutty

“Plagiarism is defined as presenting someone else’s work as your own. Work means any intellectual output, and typically includes text, data, images, sound or performance.” Plagiarism is the unacknowledged and inappropriate use of the ideas or wording of another writer. Because plagiarism corrupts values to which the university community is fundamentally committed – the pursuit of knowledge and intellectual honesty – plagiarism is considered a grave violation of academic integrity, and the sanctions against it are correspondingly severe. Plagiarism can be characterized as “academic theft.”
CBIR, or Content-Based Image Retrieval, is the retrieval of images based on visual features such as colour, texture and shape. The reason for the development of CBIR systems is that in many large image databases, traditional methods of image indexing have proven to be insufficient, laborious and extremely time-consuming. These older methods of image indexing, ranging from storing an image in the database and associating it with a keyword or number, to associating it with a categorized description, have become obsolete. In CBIR, each image stored in the database has its features extracted and compared to the features of the query.
Feature (content) extraction is the basis of content-based image retrieval. In a broad sense, features may include both text-based features (keywords, annotations, etc.) and visual features (colour, texture, shape, faces, etc.). Within the visual feature scope, features can be further classified as general features and domain-specific features. The former include colour, texture and shape features, while the latter are application dependent and may include, for example, human faces and fingerprints.
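
As a concrete illustration of the extract-and-compare step described above, a minimal colour-histogram matcher might look like the sketch below; the histogram resolution and the simple Euclidean distance are illustrative choices, not the report's prescribed method.

import numpy as np

def colour_histogram(image_rgb, bins=8):
    """A simple global visual feature: a normalised 3-D colour histogram."""
    hist, _ = np.histogramdd(image_rgb.reshape(-1, 3),
                             bins=(bins, bins, bins), range=((0, 256),) * 3)
    hist = hist.flatten()
    return hist / hist.sum()

def similarity_distance(query_feature, stored_feature):
    """Euclidean distance between feature vectors; smaller means more similar."""
    return float(np.linalg.norm(query_feature - stored_feature))

# Typical use: precompute colour_histogram() for every database image, then rank
# the stored images by similarity_distance() to the query image's histogram.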

Telepresence

Added on: February 23rd, 2020 by Afsal Meerankutty

Telepresence refers to a set of technologies which allow a person to feel as if they were present, to give the appearance that they were present, or to have an effect at a location other than their true location. Telepresence requires that the senses of the user, or users, are provided with such stimuli as to give the feeling of being in that other location. Additionally, the user(s) may be given the ability to affect the remote location. In this case, the user’s position, movements, actions, voice, etc. may be sensed, transmitted and duplicated in the remote location to bring about this effect. Therefore, information may be travelling in both directions between the user and the remote location.

TelePresence is a new technology that creates unique, “in-person” experiences between people, places, and events in their work and personal lives. It combines innovative video, audio, and interactive elements (both hardware and software) to create this experience over the network. Telepresence means “feeling like you are somewhere else”. Some people have a very technical interpretation of this, where they insist that you must have head-mounted displays in order to have telepresence. Other people have a task-specific meaning, where “presence” requires feeling that you are emotionally and socially connected with the remote world. It’s all a little vague at this time.

Positron Emission Tomography

Added on: February 19th, 2020 by Afsal Meerankutty

Positron emission tomography, also called PET imaging or a PET scan, is a type of nuclear medicine imaging. Positron emission tomography (PET) is a nuclear medicine imaging technique which produces a three-dimensional image or map of functional processes in the body. The system detects pairs of gamma rays emitted indirectly by a positron-emitting radioisotope, which is introduced into the body on a metabolically active molecule. Images of metabolic activity in space are then reconstructed by computer analysis, often in modern scanners aided by results from a CT X-ray scan performed on the patient at the same time, in the same machine, which provides more complete information and more accurate diagnoses.
A PET scan measures important body functions, such as blood flow, oxygen use, and sugar (glucose) metabolism, to help doctors evaluate how well organs and tissues are functioning.
PET is actually a combination of nuclear medicine and biochemical analysis. Used mostly in patients with brain or heart conditions and cancer, PET helps to visualize the biochemical changes taking place in the body, such as the metabolism (the process by which cells change food into energy after food is digested and absorbed into the blood) of the heart muscle.
PET differs from other nuclear medicine examinations in that PET detects metabolism within body tissues, whereas other types of nuclear medicine examinations detect the amount of a radioactive substance collected in body tissue in a certain location to examine the tissue’s function.

Speech Processing

Added on: February 18th, 2020 by Afsal Meerankutty

Speech processing is the study of speech signals and the processing methods of these signals. The signals are usually processed in a digital representation, whereby speech processing can be seen as the intersection of digital signal processing and natural language processing. Its main areas include:

  • Speech recognition, which deals with analysis of the linguistic content of a speech signal.
  • Speaker recognition, where the aim is to recognize the identity of the speaker.
  • Enhancement of speech signals, e.g. audio noise reduction.
  • Speech coding, a specialized form of data compression, is important in the telecommunication area.
  • Voice analysis for medical purposes, such as analysis of vocal loading and dysfunction of the vocal cords.
  • Speech synthesis: the artificial synthesis of speech, which usually means computer generated speech.
  • Speech enhancement: enhancing the perceptual quality of a speech signal by removing the destructive effects of noise, limited-capacity recording equipment, impairments, etc. (a minimal noise-reduction sketch follows this list).
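
The sketch below illustrates one classic enhancement idea, spectral subtraction: estimate the noise magnitude spectrum and subtract it frame by frame. It is a toy example under assumed framing choices, not an implementation drawn from the report.

import numpy as np

def spectral_subtraction(noisy, noise_sample, frame=256):
    """Toy speech enhancement: subtract an estimated noise magnitude spectrum
    from each frame and resynthesise using the noisy signal's phase."""
    enhanced = np.zeros(len(noisy))
    noise_mag = np.abs(np.fft.rfft(noise_sample[:frame]))
    for start in range(0, len(noisy) - frame + 1, frame):
        spectrum = np.fft.rfft(noisy[start:start + frame])
        clean_mag = np.maximum(np.abs(spectrum) - noise_mag, 0.0)
        enhanced[start:start + frame] = np.fft.irfft(clean_mag * np.exp(1j * np.angle(spectrum)))
    return enhanced

# Typical use: pass a recording plus a short noise-only segment (e.g. the first
# few hundred milliseconds before speech starts) sampled at the same rate.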

SILC

Added on: February 17th, 2020 by Afsal Meerankutty

Secure Internet Live Conferencing (SILC) Protocol

The Secure Internet Live Conferencing (SILC) protocol is a new-generation chat protocol which provides full-featured conferencing services, just like any other contemporary chat protocol. In addition, it provides security by encrypting and authenticating the messages in the network. Security has been the primary goal of the SILC protocol, and the protocol has been designed from day one with security in mind. All packets and messages travelling in the SILC network are always encrypted and authenticated. The network topology is also different from, for example, the IRC network: the SILC network topology attempts to be more powerful and scalable than the IRC network. The basic purpose of the SILC protocol is to provide secure conferencing services. The SILC protocol has been developed as an open-source project. The protocol specifications are freely available and they have been submitted to the IETF. The very first implementations of the protocol are also already available.

SILC provides security services that no other conferencing protocol offers today. The most popular conferencing service, IRC, is entirely insecure. If you need a secure place to talk to some person or to a group of people over the Internet, IRC or any other conferencing service, for that matter, cannot be used. Anyone can see the messages and their contents in the IRC network, and in the worst case, someone is able to change the contents of the messages. Also, all authentication data, such as passwords, is sent in plaintext in IRC.

SILC is much more than just ‘encrypting the traffic’. That is easy enough to do with IRC and SSL hybrids, but even then the entire network cannot be secured, only part of it. SILC provides security services such as sending private messages entirely securely; no one can see the message except you and the real receiver of the message. SILC also provides the same functionality for channels; no one except the clients joined to the channel may see the messages destined to it. Communication between client and server is also secured with session keys, and all commands, authentication data (such as passwords) and other traffic are entirely secured. The entire network, and all parts of it, is secured. We are not aware of any other conferencing protocol providing the same features at the present time.
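
SILC's own packet format, ciphers and key-exchange procedure are specified in its IETF drafts and are not reproduced here; the sketch below only illustrates the underlying idea of symmetric, authenticated message protection on a channel, using a generic library rather than SILC's actual primitives.

from cryptography.fernet import Fernet

# In SILC the channel key is negotiated by the protocol's own key exchange;
# here it is simply generated locally for the sake of the illustration.
channel_key = Fernet.generate_key()
channel_cipher = Fernet(channel_key)

token = channel_cipher.encrypt(b"hello, channel")    # encrypted and authenticated
print(channel_cipher.decrypt(token))                 # only key holders can read or verify it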

On-road Charging of Electric Vehicles

Added on: February 15th, 2020 by Afsal Meerankutty

This seminar topic report delves into the concept of Contactless Power Transfer (CPT) systems and their potential application for charging electric vehicles (EVs) without requiring any physical interconnection. The focus of the investigation centers on the feasibility of implementing on-road charging systems to extend the driving range of EVs and reduce the size of their batteries. The paper examines critical aspects such as the necessary road coverage and power transfer capability of the CPT system.

One of the primary objectives of this study is to explore how on-road charging can positively impact EVs by offering continuous charging while they are in motion. By seamlessly charging the EVs while driving, it becomes possible to extend their range and mitigate range anxiety, a crucial concern for many potential EV owners. Moreover, with reduced battery size requirements, the overall weight and cost of EVs could potentially be lowered, making them more accessible and efficient.

To achieve these benefits, the paper addresses essential design considerations concerning the distribution and length of CPT segments across the road. Determining the optimal arrangement of these charging segments is critical to ensure efficient and reliable charging for vehicles on the move. Additionally, understanding the power transfer capability of the CPT system is essential to match the charging requirements of various EV models and ensure compatibility.

A significant aspect of this study is to assess the total power demand generated by all passing vehicles using the on-road charging system. Understanding the overall power consumption is essential to gauge the system’s scalability and the potential burden it may place on the electrical grid. This assessment can also shed light on the feasibility of powering EVs directly from renewable energy sources, which aligns with the broader sustainability goals of the transportation sector.
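
To make the power-demand assessment concrete, a back-of-the-envelope estimate can be scripted as below; the traffic count, per-kilometre consumption, speed and transfer efficiency are purely illustrative assumptions, not figures from the report.

def roadway_power_demand_kw(vehicles_on_segment, consumption_kwh_per_km,
                            speed_km_h, transfer_efficiency):
    """Power the CPT-equipped road segment must supply so that each passing EV
    at least replaces the energy it uses while driving over it."""
    demand_per_vehicle_kw = consumption_kwh_per_km * speed_km_h   # kWh/km * km/h = kW
    return vehicles_on_segment * demand_per_vehicle_kw / transfer_efficiency

# Example: 40 EVs on an electrified stretch, 0.18 kWh/km at 100 km/h, 85 % efficient link.
print(roadway_power_demand_kw(40, 0.18, 100, 0.85))   # roughly 850 kW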

By exploring the possibility of integrating EVs with renewable energy sources, the paper seeks to contribute to the ongoing efforts towards a cleaner and greener future for transportation. If successful, on-road charging systems could significantly reduce the carbon footprint of EVs and promote their widespread adoption, thus making substantial progress towards a more sustainable mobility landscape.

In conclusion, this seminar topic report presents a comprehensive investigation into the on-road charging of electric vehicles using Contactless Power Transfer systems. By addressing key aspects such as road coverage, power transfer capability, design considerations, and potential reliance on renewable energy, the study aims to shed light on the promising opportunities and challenges in this domain. The outcomes of this research have the potential to reshape the EV charging infrastructure and accelerate the transition to an emission-free transportation era.

Computer Aided Process Planning (CAPP)

Added on: February 13th, 2020 by Afsal Meerankutty

Technological advances are reshaping the face of manufacturing, creating paperless manufacturing environments in which computer automated process planning (CAPP) will play a preeminent role. There are two reasons for this effect: costs are declining, which encourages partnerships between CAD and CAPP developers, and access to manufacturing data is becoming easier to accomplish in multivendor environments. This is primarily because of the increasing use of LANs; because IGES and the like are facilitating the transfer of data from one point to another on the network; and because relational databases (RDBs) and the associated structured query language (SQL) allow distributed data processing and data access.
With the introduction of computers in design and manufacturing, the process planning function needed to be automated. The shop-trained people who were familiar with the details of machining and other processes were gradually retiring, and such people would be unavailable in the future to do process planning. An alternative way of accomplishing this function was needed, and Computer Aided Process Planning (CAPP) was that alternative. Computer-aided process planning was usually considered to be a part of computer-aided manufacturing. However, computer-aided manufacturing was a stand-alone system. In fact, a synergy results when CAM is combined with CAD to create CAD/CAM. In such a system CAPP becomes the direct connection between design and manufacturing.

Moreover, the knowledge-based computer-aided process planning application MetCAPP looks for the least costly plan capable of producing the design, and it continuously generates and evaluates plans until it is evident that none of the remaining plans will be any better than the best one seen so far. The goal is to find a useful, reliable solution to a real manufacturing problem in a safer environment. If alternative plans exist, a rating that takes safety conditions into account is used to select the best plan.
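
MetCAPP's internal search is proprietary, so the sketch below only illustrates the generate-and-evaluate loop described above in the abstract: candidate plans are scored by a cost function, and any plan whose optimistic lower bound already exceeds the best complete plan seen so far is discarded. The function names and the idea of a caller-supplied lower bound are assumptions made for the illustration.

def best_process_plan(candidate_plans, cost, lower_bound):
    """Generate-and-evaluate search: keep the cheapest plan seen so far and
    skip candidates that provably cannot beat it."""
    best_plan, best_cost = None, float("inf")
    for plan in candidate_plans:
        if lower_bound(plan) >= best_cost:
            continue                      # cannot improve on the incumbent plan
        plan_cost = cost(plan)
        if plan_cost < best_cost:
            best_plan, best_cost = plan, plan_cost
    return best_plan, best_cost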

Air Muscles

Added on: February 10th, 2020 by Afsal Meerankutty

The air muscle is essentially a robotic actuator which is replacing conventional pneumatic cylinders at a rapid pace. Due to their low production costs and very high power-to-weight ratio, as high as 400:1, the preference for air muscles is increasing. Air muscles find huge application in biorobotics and in the development of fully functional prosthetic limbs with superior control and functional capabilities compared with current models. This paper discusses air muscles in general: their construction, principle of operation, operational characteristics and applications.

Robotic actuators are conventionally pneumatic or hydraulic devices. They have many inherent disadvantages, such as low operational flexibility, high safety requirements, and high operational as well as constructional cost. The search for an actuator which would satisfy all these requirements ended in air muscles. They are easy to manufacture, low in cost and can be integrated with human operations without any large-scale safety requirements. Furthermore, they offer an extremely high power-to-weight ratio of about 400:1. As a comparison, electric motors offer a power-to-weight ratio of only about 16:1. Air muscles are also called McKibben actuators, named after the researcher who developed them.

Green Engine

Added on: February 6th, 2020 by Afsal Meerankutty

Every day, radios, newspapers, televisions and the internet warn us of energy exhaustion, atmospheric pollution and hostile climatic conditions. After a few hundred years of industrial development, we are facing these global problems while at the same time we maintain a high standard of living. The most important question we face is whether we should continue “developing” or “die”.

Coal, petroleum, natural gas, water and nuclear energy are the five main energy sources that have played important roles and have been widely used by human beings.

The United Nations Energy Organization names all of them “elementary energies” as well as “conventional energies”. Electricity is merely a “second energy” derived from these sources. At present, the energy consumed all over the world relies almost completely on the supply of the five main energy sources. The consumption of petroleum constitutes approximately 60 percent of the energy used from all sources, making it the most heavily used of them.

The green engine is one of the most interesting developments of the new millennium. It has some unique features that were used for the first time in the making of engines. This engine is a pistonless one with features such as sequential variable compression ratio, direct air intake, direct fuel injection and multi-fuel usage. The efficiency of this engine is high when compared to contemporary engines, and its exhaust emissions are near zero. The significance of the engine lies in its efficiency when the present world conditions of limited energy resources are considered. Prototypes of the engine have been developed, and generators have been produced with the green engine.

Skylights

Added on: February 3rd, 2020 by Afsal Meerankutty

Adding a skylight is one of the quickest and easiest ways to make any room of your home lighter and brighter, adding an open and airy feeling. There are two basic types of skylights for residential use – flat glass and domed acrylic – and each has some advantages.
Domed acrylic skylights are less expensive than glass, and their convex shape tends to let the rain wash accumulated dust and dirt off a little more easily. The acrylic dome is mounted in an aluminum frame, which is in turn mounted on a 2×6 box called a “curb.” Once the hole is cut in the roof to the manufacturer’s specifications, the curb is constructed on-site to raise the skylight above the level of the roof sheathing. Site-built or factory-supplied flashings are used to seal the roofing around the curb.

Domed skylights are available in clear, smoked, bronze or other tints. Most are double- or triple-glazed in order to achieve the level of energy efficiency required by the building codes. Several sizes are available, with the most common being 2×2, 2×4 and 4×4 feet.
Flat glass skylights come mounted in a wood or integrated rubber and metal framework, and require no additional curb construction. After the hole is cut, the skylight frame is simply attached to the roof sheathing with L-brackets, then the installation is completed using the factory-supplied flashing kit. Easy installation, superior insulating qualities, less tendency to scratch and a cleaner finished appearance all add to the popularity and somewhat higher cost of glass skylights.

Glass skylights also have a greater number of optional accessories. These include tempered, laminated or wire glass; shades and blinds for light control; glass tints for heat retention or to block sunlight; and the ability to open fully or partially for ventilation. At least one company, Velux – a leading manufacturer of quality glass skylights that are available at most local home centers and lumber yards – even offers an electric motor coupled to a rain sensor that automatically shuts the skylight if it detects rain.

Powerline Communication (PLC)

Added on: February 1st, 2020 by Afsal Meerankutty

Connecting to the Internet is a fact of life for business, government, and most households. The lure of e-commerce, video on demand, and e-mail has brought 60 million people to the Internet. Once they get to the Internet, they find out what it’s really like. That includes long waits for popular sites, substantial waits for secure sites, and horrible video quality over the web.

Telephone companies have offered high bandwidth lines for many years. For the most part, the cost of these lines and the equipment needed to access them has limited their usefulness to large businesses. The lone exception has been ISDN (Integrated Services Digital Network) which has won over some residential customers. ISDN offers fast Internet access (128k) at a relatively low cost.

The solution here is powerline communications (PLC). Powerline communications is a rapidly evolving market that utilizes electric power lines for the high-speed transmission of data and voice services.

None of the available Internet access services offer the right balance of cost, convenience, and speed. Digital Powerline technology could change all that. It gives customers high-speed Internet access through electrical networks. Lower costs are achieved because the service is implemented on standard electrical lines. The service is also convenient because the wiring is already in your home. Internet access through Digital Powerline would run at 1 Mbps or more, roughly 20 times faster than a standard phone/modem connection.

Remote Frame Buffer (RFB) Protocol

Added on: January 30th, 2020 by Afsal Meerankutty

Remote desktop software provides remote access: the ability of a user to log onto a remote computer or network from a distant location. This usually comprises computers, a network, and some remote access software to connect to the network. Since this involves clients and servers connected across a network, a protocol is essential for efficient communication between them. The RFB protocol is one such protocol, used by clients and servers for communicating with each other and thereby making remote access possible. The purpose of this paper is to give a general idea of how this protocol actually works. The paper also gives a broad idea of the various messages of the protocol and how these messages are sent and interpreted by the client and server modules. It also includes a simple implementation of the protocol which shows the various messages and methods and how the protocol is practically used for gaining remote access.

RFB (remote framebuffer) is a simple and efficient protocol which provides remote access to graphical user interfaces. As its name suggests, it works at the framebuffer level and is thus applicable to all windowing systems and applications, e.g. X11, Windows and Macintosh. It should also be noted that there are other protocols available; RFB is the protocol used in Virtual Network Computing (VNC) and its various derivatives. Due to the increase in the number of software products and services, such protocols play a very important role nowadays.
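
As a small taste of the protocol on the wire, the very first exchange is a 12-byte ProtocolVersion string sent by the server, which the client answers with the version it wants to use; the security and framebuffer messages follow from there. The sketch below assumes a VNC/RFB server is listening on localhost display 0 (TCP port 5900).

import socket

with socket.create_connection(("localhost", 5900)) as sock:
    server_version = sock.recv(12)       # e.g. b"RFB 003.008\n"
    print("server speaks:", server_version.decode().strip())
    sock.sendall(server_version)         # reply with the same protocol version
# The security handshake, ServerInit/ClientInit and framebuffer-update
# messages would follow at this point.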

Contactless Energy Transfer to a Moving Actuator

Added on: January 29th, 2020 by Afsal Meerankutty

Most high-precision machines are positioning stages with multiple degrees of freedom (DOF), which often consist of cascaded long- and short-stroke linear actuators that are supported by mechanical or air bearings. Usually, the long-stroke actuator has micrometer accuracy, while the submicron accuracy is achieved by the short-stroke actuator. To build a high-precision machine, as many disturbances as possible should be eliminated. Common sources of disturbance are vibrations, Coulomb and viscous friction in bearings, crosstalk of multiple cascaded actuators, and cable slabs. A possibility to increase throughput while maintaining accuracy is to use parallel processing, i.e. movement and positioning in parallel with inspection, calibration, assembling, scanning, etc. To meet the design requirements of high accuracy while improving performance, a new design approach is necessary, especially if vacuum operation is considered, which will be required for the next generation of lithography machines. A lot of disturbance sources can be eliminated by integrating the cascaded long- and short-stroke actuators into one actuator system.

Since most long-stroke movements are in a plane, this can be done by a contactless planar actuator. A contactless planar actuator or planar motor is supported by magnetic bearings that levitate the actuator platform while controlling all six DOF of the platform. Long-stroke linear movement in 2D is also provided by the magnetic bearing, while small translations in height and small rotations remain possible. Magnetic bearings can also operate in vacuum. Parallel processing requires power on the platform to drive the actuators on the platform. In order to remove as many disturbances as possible, the power transfer needs to be contactless, i.e. without wires from the ground to the platform. A coil topology and geometry for a contactless energy transfer system is proposed for energy transfer to a planar moving platform. The platform is equipped with permanent magnets and is levitated and propelled by a matrix of coils which are fixed to the ground. Such a planar actuator is currently under investigation at Eindhoven University of Technology. The aim of this research project is to transfer energy to the moving platform continuously and at every position in order to enhance the functionality of the platform, while maintaining the advantages of operating without contact and cable slabs.
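
As a reminder of the physics such a contactless link relies on, the open-circuit voltage induced in the platform's pick-up coil follows the usual mutual-inductance relation V = omega * M * I. The coupling, current and frequency values below are assumed purely for the illustration.

import math

def pickup_voltage_rms(mutual_inductance_h, primary_current_rms_a, frequency_hz):
    """RMS open-circuit voltage induced in the secondary coil: V = 2*pi*f * M * I."""
    return 2 * math.pi * frequency_hz * mutual_inductance_h * primary_current_rms_a

# Assumed values: 2 uH of coupling, 10 A RMS primary current at 100 kHz.
print(pickup_voltage_rms(2e-6, 10, 100e3))   # about 12.6 V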

Steam Turbine

Added on: January 28th, 2020 by Afsal Meerankutty

A steam turbine is a mechanical device that extracts thermal energy from pressurized steam, and converts it into rotary motion. Its modern manifestation was invented by Sir Charles Parsons in 1884.
Definitions of steam turbine:

  • Turbine in which steam strikes blades and makes them turn
  • A system of angled and shaped blades arranged on a rotor through which steam is passed to generate rotational energy. Today, normally used in power stations
  • A device for converting energy of high-pressure steam (produced in a boiler) into mechanical power which can then be used to generate electricity.
  • Equipment unit flown through by steam, used to convert the energy of the steam into rotational energy.

A machine for generating mechanical power in rotary motion from the energy of steam at temperature and pressure above that of an available sink. By far the most widely used and most powerful turbines are those driven by steam. Until the 1960s essentially all steam used in turbine cycles was raised in boilers burning fossil fuels (coal, oil, and gas) or, in minor quantities, certain waste products. However, modern turbine technology includes nuclear steam plants as well as production of steam supplies from other sources.

The illustration shows a small, simple mechanical-drive turbine of a few horsepower. It illustrates the essential parts for all steam turbines regardless of rating or complexity: (1) a casing, or shell, usually divided at the horizontal center line, with the halves bolted together for ease of assembly and disassembly; it contains the stationary blade system; (2) a rotor carrying the moving buckets (blades or vanes) either on wheels or drums, with bearing journals on the ends of the rotor; (3) a set of bearings attached to the casing to support the shaft; (4) a governor and valve system for regulating the speed and power of the turbine by controlling the steam flow, and an oil system for lubrication of the bearings and, on all but the smallest machines, for operating the control valves by a relay system connected with the governor; (5) a coupling to connect with the driven machine; and (6) pipe connections to the steam supply at the inlet and to an exhaust system at the outlet of the casing or shell.

Steam turbines are ideal prime movers for driving machines requiring rotational mechanical input power. They can deliver constant or variable speed and are capable of close speed control. Drive applications include centrifugal pumps, compressors, ship propellers, and, most important, electric generators.
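
The energy conversion described above can be put into numbers with the usual steady-flow estimate: shaft power is roughly the steam mass flow times the enthalpy drop across the blading, reduced by an isentropic efficiency. The flow rate, enthalpies and efficiency below are assumed for the sketch.

def turbine_shaft_power_mw(mass_flow_kg_s, inlet_enthalpy_kj_kg,
                           outlet_enthalpy_kj_kg, isentropic_efficiency):
    """Steady-flow estimate of shaft power: P = m_dot * (h_in - h_out) * eta."""
    power_kw = (mass_flow_kg_s * (inlet_enthalpy_kj_kg - outlet_enthalpy_kj_kg)
                * isentropic_efficiency)
    return power_kw / 1000.0

# Assumed values: 50 kg/s of steam, enthalpy drop from 3400 to 2400 kJ/kg, 88 % efficiency.
print(turbine_shaft_power_mw(50, 3400, 2400, 0.88))   # 44 MW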

Navbelt and Guidecane

Added on: January 27th, 2020 by Afsal Meerankutty

Recent revolutionary achievements in robotics and bioengineering have given scientists and engineers great opportunities and challenges to serve humanity. This seminar is about the NavBelt and the GuideCane, two computerised devices based on advanced mobile-robotic navigation for obstacle avoidance, useful for visually impaired people. This is “bioengineering for people with disabilities”. The NavBelt is worn by the user like a belt and is equipped with an array of ultrasonic sensors. It provides acoustic signals via a set of stereo earphones that guide the user around obstacles or display a virtual acoustic panoramic image of the traveller’s surroundings. One limitation of the NavBelt is that it is exceedingly difficult for the user to comprehend the guidance signals in time to allow fast walking. A newer device, called the GuideCane, effectively overcomes this problem. The GuideCane uses the same mobile robotics technology as the NavBelt but is a wheeled device pushed ahead of the user via an attached cane. When the GuideCane detects an obstacle, it steers around it. The user immediately feels this steering action and can follow the GuideCane’s new path easily without any conscious effort. The mechanical, electrical and software components, the user-machine interface and the prototypes of the two devices are described below.
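
As an illustration of what each ultrasonic sensor in the array actually measures, the range to an obstacle follows from the echo's time of flight; the speed of sound and the warning threshold below are generic values, not specifics of the NavBelt or GuideCane hardware.

def echo_distance_m(time_of_flight_s, speed_of_sound_m_s=343.0):
    """Ultrasonic ranging: the pulse travels out and back, so halve the path length."""
    return speed_of_sound_m_s * time_of_flight_s / 2.0

def obstacle_ahead(time_of_flight_s, warn_at_m=1.5):
    """Crude decision used to trigger a steering correction or an acoustic warning."""
    return echo_distance_m(time_of_flight_s) < warn_at_m

print(echo_distance_m(0.006), obstacle_ahead(0.006))   # about 1.03 m -> True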

Brushless DC Electric Motor

Added on: January 25th, 2020 by Afsal Meerankutty

A microprocessor-controlled BLDC motor powering a micro remote-controlled airplane. This external rotor motor weighs 5 grams, consumes approximately 11 watts (15 millihorsepower) and produces thrust of more than twice the weight of the plane.
Brushless DC motors (BLDC motors, BL motors) also known as electronically commutated motors (ECMs, EC motors) are synchronous electric motors powered by direct-current (DC) electricity and having electronic commutation systems, rather than mechanical commutators and brushes. The current-to-torque and voltage-to-speed relationships of BLDC motors are linear.
BLDC motors may be described as stepper motors, with fixed permanent magnets and possibly more poles on the stator than the rotor, or reluctance motors. The latter may be without permanent magnets, just poles that are induced on the rotor then pulled into alignment by timed stator windings. However, the term stepper motor tends to be used for motors that are designed specifically to be operated in a mode where they are frequently stopped with the rotor in a defined angular position; this page describes more general BLDC motor principles, though there is overlap.
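
The linear current-to-torque and voltage-to-speed relationships mentioned above are usually expressed with a torque constant Kt and a speed constant Kv; the sketch below uses assumed constants for a small hobby-sized motor, not the 5-gram motor described above.

def bldc_torque_nm(current_a, kt_nm_per_a):
    """Ideal BLDC behaviour: torque is proportional to current, T = Kt * I."""
    return kt_nm_per_a * current_a

def bldc_no_load_speed_rpm(voltage_v, kv_rpm_per_v):
    """Ideal BLDC behaviour: no-load speed is proportional to voltage, n = Kv * V."""
    return kv_rpm_per_v * voltage_v

# Assumed constants: Kt = 0.01 N*m/A, Kv = 1000 rpm/V, driven at 5 A and 7.4 V.
print(bldc_torque_nm(5, 0.01), bldc_no_load_speed_rpm(7.4, 1000))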

Superconducting Rotating Machine

Added on: January 25th, 2020 by Afsal Meerankutty

Advances in High Temperature Superconductors (HTS) are enabling a new class of synchronous rotating machines (SuperMotors and SuperGenerators) that can generically be categorized as SuperMachines. Compared to conventional machines of equivalent rating, these SuperMachines are expected to be less expensive, lighter, more compact and more efficient, and to provide significantly superior stable operation in a power system. The field windings are made with HTS conductor material (BSCCO, or Bi-2223), which operates at 35-40 K and can be cooled with inexpensive, off-the-shelf cryocoolers available from a number of manufacturers throughout the world. As will be discussed, these advanced SuperMachines are attractive for use in industrial as well as naval and commercial maritime applications. This paper discusses recent SuperMachine work at AMSC and other companies. HTS rotating machine technology is maturing rapidly, and electricity producers as well as end-users will undoubtedly benefit enormously from these advancements.

Evolution of SIM to eSIM

Added on: October 31st, 2018 by Afsal Meerankutty

Every GSM (Global System for Mobile Communications) phone, also called a 2G mobile phone, and every UMTS (Universal Mobile Telecommunications System) phone, aka 3G mobile phone, requires a smart card to connect and function in the mobile network. This smart card is called the SIM, which stands for Subscriber Identity Module. This module contains the International Mobile Subscriber Identity (IMSI) and the credentials that are necessary for the identification and authentication of the subscriber. Without the SIM the user will not be allowed to connect to the mobile network and hence will not be able to make or receive phone calls.
As a smart card, the SIM is a tamper-resistant microprocessor card with its own operating system, storage and built-in security features that prevent unauthorized individuals from accessing, retrieving, copying or modifying the subscriber's IMSI and credentials. Abuse of the subscriber's account and fraudulent access to the mobile network can hence be avoided. Furthermore, as a removable and autonomous module, the SIM introduces great flexibility, since the user can easily move the SIM to other mobile phones or replace a SIM with another one. So far, the smart card and its content, the SIM application, have been bound together and are simply called the SIM.
With advances in wireless and storage technology, new demands have arisen. Because of the cumbersome task of opening machines and installing a removable SIM, M2M applications are designed with a pre-installed SIM application. M2M applications based on cellular networks, with the ability to install the user subscription, have advantages and disadvantages for the different stakeholders.
This master's thesis provides multiple alternative solutions to this installation problem and also describes the SIM evolutions, i.e. the eUICC and the soft SIM, to give a comprehensive view of the SIM's situation. The thesis also presents a security assessment of these evolutions, which differ from the current removable SIM.

Security in Apple iOS

Added on: March 7th, 2018 by Afsal Meerankutty

Apple iOS has been a very advanced and sophisticated mobile operating system ever since it was first released in 2007. In this seminar paper, we will first focus on introducing iOS security by talking about the implementation details of its essential building blocks, such as system security, data security, hardware security and app security. Then we will talk about some potential and existing security issues of iOS.

iOS has been one of the most popular mobile operating systems in the world ever since it was first released in 2007. As of June 2014, the Apple App Store contained more than 1.2 million iOS applications, which have collectively been downloaded more than 60 billion times. iOS was designed and created by Apple Inc. and is distributed exclusively for Apple hardware. iOS protects not only the data stored on the iOS device, but also the data transmitted on networks when using internet services. iOS provides advanced and sophisticated security for iOS devices and is also very easy to use. Users do not need to spend a lot of time on security configuration, as most of the security features are automatically configured by iOS. iOS also supports biometric authentication (Touch ID), which has recently been incorporated into iOS devices; users can easily use their fingerprints to perform private and sensitive tasks such as unlocking the iPhone and making payments. This survey discusses how the security elements are implemented in iOS and some issues with iOS security.

Artificial Passenger

Added on: March 6th, 2017 by Afsal Meerankutty

The Artificial Passenger (AP) is an artificial intelligence-based companion that will be resident in software and chips embedded in the automobile dashboard. The heart of the system is a conversation planner that holds a profile of you, including details of your interests and profession.

A microphone picks up your answer and breaks it down into separate words with speech-recognition software. A camera built into the dashboard also tracks your lip movements to improve the accuracy of the speech recognition. A voice analyzer then looks for signs of tiredness by checking to see if the answer matches your profile. Slow responses and a lack of intonation are signs of fatigue.

This research suggests that we can make predictions about various aspects of driver performance based on what we glean from the movements of a driver’s eyes and that a system can eventually be developed to capture this data and use it to alert people when their driving has become significantly impaired by fatigue.

Light Emitting Polymer

Added on: February 13th, 2017 by Afsal Meerankutty

The seminar is about polymers that can emit light when a voltage is applied to them. The structure comprises a thin film of semiconducting polymer sandwiched between two electrodes (cathode and anode). When electrons and holes are injected from the electrodes, the recombination of these charge carriers takes place, which leads to emission of light. The band gap, i.e. the energy difference between the valence band and the conduction band, determines the wavelength (colour) of the emitted light.

They are usually made by an ink-jet printing process, in which red, green and blue polymer solutions are jetted into well-defined areas on the substrate. This is possible because PLEDs are soluble in common organic solvents like toluene and xylene. Film thickness uniformity is obtained either by multi-passing (slow) or by heads with drive-per-nozzle technology. The pixels are controlled using an active or passive matrix.

It is a polymer that emits light when a voltage is applied to it. The structure comprises a thin-film of semiconducting polymer sandwiched between two electrodes (anode and cathode) as shown in fig.1. When electrons and holes are injected from the electrodes, the recombination of these charge carriers takes place, which leads to emission of light that escapes through glass substrate. The bandgap, i.e. energy difference between valence band and conduction band of the semiconducting polymer determines the wavelength (colour) of the emitted light.
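
The band-gap-to-colour relation stated above is simply lambda = h*c / E_gap. A quick check with an illustrative band gap of 2.3 eV (a typical value for a green-emitting polymer, assumed here for the example):

PLANCK_EV_S = 4.135667e-15    # Planck constant in eV*s
LIGHT_SPEED_M_S = 2.998e8     # speed of light in m/s

def emission_wavelength_nm(band_gap_ev):
    """Wavelength of the emitted photon for a given band gap: lambda = h*c / E."""
    return PLANCK_EV_S * LIGHT_SPEED_M_S / band_gap_ev * 1e9

print(emission_wavelength_nm(2.3))   # about 539 nm, i.e. green light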

The advantages include low cost, small size, no viewing angle restrictions, low power requirement, biodegradability etc. They are poised to replace LCDs used in laptops and CRTs used in desktop computers today.

Their future applications include flexible displays which can be folded, wearable displays with interactive features, camouflage etc.

MOCT (Magneto-Optical Current Transformer)

Added on: February 7th, 2017 by Afsal Meerankutty

An accurate current transducer is a key component of any power system instrumentation. To measure currents, power stations and substations conventionally employ inductive-type current transformers. With the short-circuit capabilities of power systems getting larger and voltage levels going higher, conventional current transducers become more bulky and costly.

It appears that the newly emerged MOCT technology provides a solution for many of the problems posed by conventional current transformers. The MOCT measures the rotation angle of plane-polarized light caused by the magnetic field and converts it into a signal of a few volts proportional to the magnetic field.

The main advantage of an MOCT is that there is no need to break the conductor to enclose the optical path in the current-carrying circuit, and there is no electromagnetic interference.
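
The rotation the MOCT senses is the Faraday effect: the polarization angle grows with the Verdet constant of the sensing material, the magnetic flux density and the optical path length. The values below are assumed purely for the illustration.

import math

def faraday_rotation_deg(verdet_rad_per_t_m, flux_density_t, path_length_m):
    """Faraday effect: theta = V * B * L, converted here to degrees."""
    return math.degrees(verdet_rad_per_t_m * flux_density_t * path_length_m)

# Assumed values: Verdet constant 30 rad/(T*m), 0.05 T field, 0.10 m optical path.
print(faraday_rotation_deg(30, 0.05, 0.10))   # about 8.6 degrees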

Magneto Hydro Dynamic Power Generation

Added on: February 1st, 2017 by Afsal Meerankutty

When an electrical conductor is moved so as to cut lines of magnetic induction, charged particles in the conductor experience a force in a direction mutually perpendicular to the B field and to the velocity of the conductor. The negative charges tend to move in one direction, and the positive charges in the opposite direction. This induced electric field, or motional emf, provides the basis for converting mechanical energy into electrical energy. In conventional steam power plants, the heat released by the fuel is converted into rotational mechanical energy by means of a thermodynamic cycle, and the mechanical energy is then used to drive the electric generator. Thus two stages of energy conversion are involved, in which the heat-to-mechanical-energy conversion has an inherently very low efficiency.

Also, the rotating machine has its associated losses and maintenance problems. In MHD generation, electrical energy is generated directly from hot combustion gases produced by the combustion of fuel, without moving parts. Conventional electrical machines are basically electromechanical converters, while an MHD generator is a heat engine operating on a turbine cycle and transforming the internal energy of the gas directly into electrical energy.
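
For a feel of the numbers, the ideal electrical output per unit volume of an MHD duct is often quoted as P = sigma * v^2 * B^2 * K * (1 - K), where sigma is the gas conductivity, v the gas velocity, B the flux density and K the loading factor. The figures below are illustrative assumptions.

def mhd_power_density_mw_per_m3(conductivity_s_m, gas_velocity_m_s,
                                flux_density_t, load_factor=0.5):
    """Ideal MHD duct output per unit volume: P = sigma * v^2 * B^2 * K * (1 - K)."""
    p_w_per_m3 = (conductivity_s_m * gas_velocity_m_s ** 2 * flux_density_t ** 2
                  * load_factor * (1 - load_factor))
    return p_w_per_m3 / 1e6

# Assumed values: 10 S/m seeded combustion gas, 800 m/s, 5 T field, K = 0.5.
print(mhd_power_density_mw_per_m3(10, 800, 5))   # 40 MW per cubic metre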

Memristors

Added on: January 25th, 2017 by Afsal Meerankutty

Generally, when most people think about electronics, they may initially think of products such as cell phones, radios and laptop computers. Others, having some engineering background, may think of resistors, capacitors and the like, which are the basic components necessary for electronics to function. Such basic components are fairly limited in number, and each has its own characteristic function.

Memristor theory was formulated and named by Leon Chua in a 1971 paper. Chua strongly believed that a fourth device existed to provide conceptual symmetry with the resistor, inductor, and capacitor. This symmetry follows from the description of basic passive circuit elements as defined by a relation between two of the four fundamental circuit variables. A device linking charge and flux (themselves defined as time integrals of current and voltage), which would be the memristor, was still hypothetical at the time. However, it would not be until thirty-seven years later, on April 30, 2008, that a team at HP Labs led by the scientist R. Stanley Williams would announce the discovery of a switching memristor. Based on a thin film of titanium dioxide, it has been presented as an approximately ideal device.

The reason that the memristor is radically different from the other fundamental circuit elements is that, unlike them, it carries a memory of its past. When you turn off the voltage to the circuit, the memristor still remembers how much was applied before and for how long. That’s an effect that can’t be duplicated by any circuit combination of resistors, capacitors, and inductors, which is why the memristor qualifies as a fundamental circuit element.
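
Chua's definition links flux and charge through a charge-dependent resistance, d(phi) = M(q) dq. The much-cited HP titanium-dioxide device is commonly described with a linear dopant-drift model in which a state variable tracks how far the doped region has spread; the sketch below follows that commonly quoted model, with parameter values chosen for illustration rather than taken from the HP paper.

def simulate_memristor(current_samples_a, dt_s, d_m=10e-9, mu_v=1e-14,
                       r_on=100.0, r_off=16e3, x0=0.1):
    """Linear dopant-drift memristor model: the doped fraction x integrates the
    charge that has flowed and sets the instantaneous resistance."""
    x, voltages = x0, []
    for i in current_samples_a:
        memristance = r_on * x + r_off * (1.0 - x)
        voltages.append(memristance * i)
        x += mu_v * r_on / d_m ** 2 * i * dt_s    # dx/dt = mu_v * R_on / D^2 * i(t)
        x = min(max(x, 0.0), 1.0)                 # the dopant front stays inside the film
    return voltages

# Driving this model with a sinusoidal current produces the pinched hysteresis
# loop that is the memristor's signature.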

The arrangement of these few fundamental circuit components forms the basis of almost all of the electronic devices we use in our everyday life. Thus the discovery of a brand new fundamental circuit element is something not to be taken lightly, and it has the potential to open the door to a brand new type of electronics. HP already has plans to implement memristors in a new type of non-volatile memory which could eventually replace flash and other memory systems.

Methanol Fueled Marine Diesel Engine

Added on: January 19th, 2017 by Afsal Meerankutty

Nowadays, interest in methanol has increased from the viewpoints of environmental protection and fuel versatility on a global scale. Energetic research on methanol-fueled automobile engines has been carried forward from the viewpoints of low environmental pollution and the use of alternative fuel since the oil crisis, and such engines are now being tested on vehicles in various countries around the world. In the field of marine engines, the desire to save maintenance cost and labour prevails alongside the environmental concerns. From these motives, scientists have carried out research and development of a methanol-fueled marine diesel engine, which is quite different from automobile engines in size, main particulars, working conditions and durability.

Although scientists have made great use of invaluable knowledge from automotive technology, some special studies were necessary due to these differences. The ignition method is a typical one. A dual fuel injection system was tried for trouble-free ignition of the methanol fuel. This system is thought to be the most favourable ignition method for marine diesel engines, which have to withstand quick load changes and tolerate no misfiring.

Pill Camera

Added on: January 12th, 2017 by Afsal Meerankutty

The aim of technology is to make products on a large scale for cheaper prices and with increased quality. Current technologies have attained a part of this, but manufacturing is still carried out at the macro level. The future lies in manufacturing products right from the molecular level. Research in this direction started way back in the eighties. At that time, manufacturing at the molecular and atomic level was laughed at. But due to the advent of nanotechnology we have realized it to a certain extent. One such product is the pill camera, which is used in the investigation of cancer, ulcers and anemia. It has made a revolution in the field of medicine. This tiny capsule can pass through our body without causing any harm.

It takes pictures of our intestine and transmits them to a receiver for computer analysis of our digestive system. This process can help in tracking any kind of disease related to the digestive system. We also discuss the drawbacks of the pill camera and how these drawbacks can be overcome using a grain-sized motor and a bi-directional wireless telemetry capsule. Besides this, we review the process of manufacturing products using nanotechnology. Some other important applications are also discussed along with their potential impacts on various fields.

We have made great progress in manufacturing products. Looking back from where we stand now, we started with flint knives and stone tools and have reached the stage where we make such tools with more precision than ever. The leap in technology is great, but it is not going to stop here. With our present technology we manufacture products by casting, milling, grinding, chipping and the like. With these technologies we have made more things at lower cost and greater precision than ever before. In manufacturing these products we have been arranging atoms in great thundering statistical herds. We all know that manufactured products are made from atoms, and the properties of those products depend on how those atoms are arranged: if we rearrange the atoms in dirt, water and air, we get grass. The next step in manufacturing technology is to manufacture products at the molecular level. The technology used to achieve this is NANOTECHNOLOGY: the creation of useful materials, devices and systems through the manipulation of matter at this minuscule scale. Nanotechnology deals with objects measured in nanometers; a nanometer is a billionth of a meter, a millionth of a millimeter, or roughly 1/80,000 of the width of a human hair.
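The scale comparisons above can be checked with a few lines of arithmetic; the hair width of about 80 micrometers is an assumed typical value, not a figure from the report:

```python
# Quick check of the scale comparisons above; the hair width of ~80 micrometers
# is an assumed typical value, not a figure from the report.
nanometer = 1e-9       # meters
millimeter = 1e-3      # meters
hair_width = 80e-6     # meters (assumed)

print(nanometer / 1)            # 1e-09   -> a billionth of a meter
print(nanometer / millimeter)   # 1e-06   -> a millionth of a millimeter
print(hair_width / nanometer)   # 80000.0 -> a nanometer is ~1/80,000 of a hair width
```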

Overall Equipment Effectiveness

Added on: January 11th, 2017 by Afsal Meerankutty No Comments

In today’s economy, you’re expected to continuously improve your Return on Total Capital. And as capital to build new, more efficient plants becomes more difficult to obtain, you often have to meet growing production demands with current equipment and facilities — while continuing to cut costs.

There are several ways you can optimize your processes to improve profitability. But it can be difficult to understand the overall effectiveness of a complex operation so you can decide where to make improvements. That’s especially true when the process involves multiple pieces of equipment that affect each other’s effectiveness.

One metric that can help you meet this challenge is Overall Equipment Effectiveness, or OEE. OEE measures the health and reliability of a process relative to the desired operating level. It can show you how well you’re utilizing resources, including equipment and labor, to satisfy customers by matching product quality and supply requirements.

Overall Equipment Effectiveness (OEE) measures total performance by relating the availability of a process to its productivity and output quality.

OEE addresses all losses caused by the equipment, including the following (a worked calculation is sketched after this list):
• Not being available when needed because of breakdowns or set-up and adjustment losses
• Not running at the optimum rate because of reduced speed or idling and minor stoppage losses
• Not producing first-pass A1 quality output because of defects and rework or start-up losses.
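A minimal sketch of the conventional OEE calculation, Availability × Performance × Quality, with made-up shift data to show how the three loss categories above enter the figure:

```python
# Conventional OEE calculation (Availability x Performance x Quality) with
# made-up shift data; the figures below are purely illustrative.
planned_time_min = 480        # planned production time for the shift
downtime_min = 60             # breakdowns + set-up/adjustment losses
ideal_cycle_time_min = 1.0    # ideal time to produce one unit
total_units = 360             # units actually produced
defective_units = 18          # defects, rework and start-up rejects

run_time_min = planned_time_min - downtime_min

availability = run_time_min / planned_time_min                    # availability losses
performance = ideal_cycle_time_min * total_units / run_time_min   # speed/idling losses
quality = (total_units - defective_units) / total_units           # quality losses

oee = availability * performance * quality
print(f"Availability {availability:.1%}, Performance {performance:.1%}, "
      f"Quality {quality:.1%} -> OEE {oee:.1%}")
```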

OEE was first used by Seiichi Nakajima, the founder of total productive maintenance (TPM), in describing a fundamental measure for tracking production performance. He challenged the complacent view of effectiveness by focusing not simply on keeping equipment running smoothly, but on creating a sense of joint responsibility between operators and maintenance workers to extend and optimize overall equipment performance.

First applied in discrete manufacturing, OEE is now used throughout process, batch, and discrete production plants.

Medical Imaging

Added on: January 10th, 2017 by Afsal Meerankutty No Comments

The increasing capabilities of medical imaging devices have strongly facilitated diagnosis and surgery planning. During the last decades, the technology has evolved enormously, resulting in a never-ending flow of high-dimensional and high-resolution data that need to be visualized, analyzed, and interpreted. The development of computer hardware and software has provided invaluable tools for performing these tasks, but it is still very hard to exclude the human operator from the decision making. The process of making a medical diagnosis or planning a surgery is simply too complex to fully automate. Therefore, interactive or semi-automatic methods for image analysis and visualization are needed, where the user can explore the data efficiently and provide his or her expert knowledge as input to the methods.

All software currently being written for medical imaging systems has to conform to the DICOM (Digital Imaging and Communications in Medicine) standard to ensure that systems from different vendors can successfully share information. So you can, for example, acquire an image on a Siemens station and do the processing on a Philips station; multimodal stations (the same station being able to process, say, MRI as well as CAT scan images) are already in common use. Vendors are also able to send private information that only their own software and viewing stations can read, so as to enhance their equipment. For example, a Philips acquisition system can acquire and transmit more information than prescribed by the standard; such extra information can be deciphered only by Philips equipment. Even though the basic job is image processing, the algorithms used in medical software can be vastly different from, say, those used in other commercial image manipulation software such as movie software or Photoshop. The reason is that medical systems have to preserve a very high degree of accuracy and detail, or there could be fatal results.
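As a small illustration of vendor-neutral DICOM interchange (not taken from the report), the open-source pydicom library can read a file produced by any conforming acquisition station; the file name below is a placeholder:

```python
# Minimal sketch using the open-source pydicom library; "example.dcm" is a
# placeholder path, and reading pixel data also requires numpy to be installed.
import pydicom

ds = pydicom.dcmread("example.dcm")   # parse a DICOM file from any vendor

# Standard attributes defined by the DICOM data dictionary:
print(ds.Modality)                    # e.g. "MR" or "CT"
print(ds.Rows, ds.Columns)            # image matrix dimensions

# Vendor-specific private tags travel alongside the standard ones and are
# generally meaningful only to that vendor's own software.
pixels = ds.pixel_array               # decode the image for further processing
print(pixels.shape)
```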

Smart Quill

Added on: January 3rd, 2017 by Afsal Meerankutty No Comments

Lyndsay Williams of Microsoft Research’s Cambridge UK lab is the inventor of SmartQuill, a pen that can remember the words it is used to write and then transform them into computer text. The idea that “it would be neat to put all of a handheld PDA-type computer in a pen” came to the inventor in her sleep. “It’s the pen for the new millennium,” she says. Encouraged by Nigel Ballard, a leading consultant to the mobile computer industry, Williams took her prototype to the British Telecommunications Research Lab, where she was promptly hired and given money and institutional support for her project. The prototype, called SmartQuill, has been developed by world-leading research laboratories run by BT (formerly British Telecom) at Martlesham, eastern England. It is claimed to be the biggest revolution in handwriting since the invention of the pen.

With the introduction of handheld computers, the trend has shifted towards small computers for everyday computation, which has pushed manufacturers towards almost gadget-like machines. Reducing the size of handheld computers can only be taken so far before they become unusable: keyboards become so tiny that you need needle-like fingers to operate them, and screens so small that they need constant cursor controls to read even simple text.

The introduction of SmartQuill solves some of these problems. The pen is slightly larger than an ordinary fountain pen, with a screen on the barrel. Users enter information into its applications by pushing a button and writing in their own handwriting, on any surface: paper, a screen, a tablet, or even the air. A small three-line screen displays the information stored in the pen, and users scroll through it by tilting the pen. When the pen is plugged into an electronic docking station, the text data is transmitted to a desktop computer, printer, or modem, or to a mobile telephone, so files can be sent electronically.
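The report does not describe SmartQuill's firmware; the sketch below only illustrates the kind of tilt-to-scroll logic mentioned above, with a hypothetical stream of accelerometer pitch readings and an assumed threshold angle:

```python
# Illustrative sketch only: the report does not describe SmartQuill's firmware.
# The pitch readings and threshold angle below are hypothetical.
LINES_ON_SCREEN = 3          # the pen's small three-line display
SCROLL_THRESHOLD_DEG = 20    # assumed tilt angle that triggers a scroll step

def scroll_position(pitch_samples, total_lines):
    """Walk through a stream of pitch readings (degrees) and return the
    index of the top visible line after applying tilt-to-scroll."""
    top = 0
    for pitch in pitch_samples:
        if pitch > SCROLL_THRESHOLD_DEG:                       # tipped forward
            top = min(top + 1, max(0, total_lines - LINES_ON_SCREEN))
        elif pitch < -SCROLL_THRESHOLD_DEG:                    # tipped back
            top = max(top - 1, 0)
    return top

# Simulated pitch readings: two forward tips, one back tip.
print(scroll_position([25, 5, 30, -2, -28], total_lines=10))   # -> 1
```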

Graphical Password Authentication

Added on: December 29th, 2016 by Afsal Meerankutty No Comments

The most common computer authentication method is to use alphanumerical usernames and passwords. This method has been shown to have significant drawbacks. For example, users tend to pick passwords that can be easily guessed. On the other hand, if a password is hard to guess, then it is often hard to remember.

To address this problem, some researchers have developed authentication methods that use pictures as passwords. In this paper, we conduct a comprehensive survey of the existing graphical password techniques. We classify these techniques into two categories: recognition-based and recall-based approaches. We discuss the strengths and limitations of each method and point out the future research directions in this area.

We also try to answer two important questions: “Are graphical passwords as secure as text-based passwords?” and “What are the major design and implementation issues for graphical passwords?” Finally, we propose a new technique for graphical authentication.
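As one concrete example of a recall-based scheme (a generic sketch, not the new technique proposed in the paper), a click-point password can be verified by checking that each click lands within a tolerance radius of the enrolled point; the tolerance value is an assumption:

```python
# Generic sketch of recall-based (click-point) verification; the tolerance
# radius is an assumed value, not taken from any particular published scheme.
import math

TOLERANCE_PX = 15   # assumed acceptance radius around each enrolled click point

def verify(enrolled_points, attempt_points, tolerance=TOLERANCE_PX):
    """True if every click lands close enough to its enrolled point, in order."""
    if len(enrolled_points) != len(attempt_points):
        return False
    return all(math.dist(e, a) <= tolerance
               for e, a in zip(enrolled_points, attempt_points))

enrolled = [(120, 45), (300, 210), (64, 180)]                  # chosen at registration
print(verify(enrolled, [(118, 47), (305, 205), (60, 183)]))    # True
print(verify(enrolled, [(118, 47), (305, 205), (10, 10)]))     # False
```

A deployed system would typically store a hash of discretised click regions rather than the raw coordinates, so that the server never holds the password in recoverable form.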

BrainGate

Added on: December 26th, 2016 by Afsal Meerankutty No Comments

BrainGate is a brain implant system developed by the bio-tech company Cyberkinetics in 2003 in conjunction with the Department of Neuroscience at Brown University. The device was designed to help those who have lost control of their limbs or other bodily functions, such as patients with amyotrophic lateral sclerosis (ALS) or spinal cord injury. The computer chip, which is implanted into the brain, monitors brain activity in the patient and converts the intention of the user into computer commands. Cyberkinetics states that “such applications may include novel communications interfaces for motor impaired patients, as well as the monitoring and treatment of certain diseases which manifest themselves in patterns of brain activity, such as epilepsy and depression.”

The BrainGate Neural Interface device consists of a tiny chip containing 100 microscopic electrodes that is surgically implanted in the brain’s motor cortex. The whole apparatus is the size of a baby aspirin. The chip can read signals from the motor cortex, send that information to a computer via connected wires, and translate it to control the movement of a computer cursor or a robotic arm. According to Dr. John Donoghue of Cyberkinetics, there is practically no training required to use BrainGate because the signals read by a chip implanted, for example, in the area of the motor cortex for arm movement are the same signals that would be sent to the real arm. A user with an implanted chip can immediately begin to move a cursor with thought alone.
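BrainGate's actual decoding algorithms are not given in this report; the sketch below only illustrates the general idea of a linear decoder that maps per-electrode firing rates to cursor velocity, with made-up weights and simulated spike counts:

```python
# Illustrative only: the report does not give BrainGate's decoder. A common
# approach in the literature maps per-electrode firing rates to cursor
# velocity with a linear decoder; the weights here are made-up numbers.
import numpy as np

N_ELECTRODES = 100                      # the implant's 100 microelectrodes
rng = np.random.default_rng(0)
W = rng.normal(scale=0.05, size=(2, N_ELECTRODES))   # assumed decoder weights (vx, vy)

def decode_velocity(firing_rates):
    """Map a length-100 vector of firing rates (Hz) to a 2-D cursor velocity."""
    return W @ firing_rates

cursor = np.zeros(2)
for _ in range(10):                                   # ten 50 ms decoding steps
    rates = rng.poisson(lam=20, size=N_ELECTRODES)    # simulated spike counts
    cursor += 0.05 * decode_velocity(rates)           # integrate velocity over dt
print(cursor)                                         # final cursor position
```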

The BrainGate technology platform was designed to take advantage of the fact that many patients with motor impairment have an intact brain that can produce movement commands. This may allow the BrainGate system to create an output signal directly from the brain, bypassing the route through the nerves to the muscles that cannot be used in paralysed people.

AC Synchronous Generators

Added on: January 23rd, 2016 by Afsal Meerankutty No Comments

AC Generators come in two basic types – synchronous and non-synchronous. Synchronous generators lock in with the fundamental line frequency and rotate at a synchronous speed related to the number of poles, similar to AC synchronous motors. Synchronous generator stator windings are similar to a three-phase synchronous motor stator winding. Synchronous generator rotor fields may be either salient or non-salient pole. Salient pole (also called spider or projected pole) means the rotor has distinct pole windings mounted on the rotor shaft using dovetail joints. These pole windings are wound around field poles. Salient-pole rotors are most commonly used in slow-speed applications and tend to have at least six poles. Salient-pole rotors typically have damper windings to reduce rotor oscillations (caused by large flux changes between the individual poles) during operation.
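The relation between line frequency and the number of poles mentioned above is the standard synchronous-speed formula, Ns = 120·f/P revolutions per minute; a short check:

```python
# Standard synchronous-speed relation: a machine locked to line frequency f (Hz)
# with P poles turns at Ns = 120 * f / P revolutions per minute.
def synchronous_speed_rpm(frequency_hz, poles):
    return 120.0 * frequency_hz / poles

# Salient-pole rotors tend to have many poles and therefore turn slowly:
for poles in (2, 4, 6, 12, 24):
    print(poles, "poles ->", synchronous_speed_rpm(50, poles), "rpm at 50 Hz")
```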

Wearable Computers

Added on: November 3rd, 2013 by Afsal Meerankutty 4 Comments

As computers move from the desktop to the palmtop, and onto our bodies and into our everyday lives, infinite opportunities arise to realize applications that have never before been possible. To date, personal computers have not lived up to their name. Most machines sit on a desk and interact with their owners only a small fraction of the day. A person’s computer should be worn, much as eyeglasses or clothing are worn, and interact with the user based on the context of the situation. With the current accessibility of wireless local area networks, the host of other context-sensing and communication tools available, and the current scale of miniaturization, it is becoming clear that the computer should act as an intelligent assistant, whether through a remembrance agent, augmented reality, or intellectual collectives. It is also important that a computer be small: something we could slip into a pocket or, better still, wear like a piece of clothing. It is rapidly becoming apparent that the next technological leap is to integrate the computer and the user in a non-invasive manner; this leap will bring us into the fascinating world of wearable computers.

DakNet

Added on: November 3rd, 2013 by Afsal Meerankutty 10 Comments

DakNet provides extraordinarily low-cost digital communication, letting remote villages leapfrog past the expense of traditional connectivity solutions and begin development of a full coverage broadband wireless infrastructure. DakNet, an ad hoc network that uses wireless technology to provide asynchronous digital connectivity, is evidence that the marriage of wireless and asynchronous service may indeed be the beginning of a road to universal broadband connectivity.
This paper briefly explains what DakNet is, how wireless technology is implemented in DakNet, its fundamental operations and applications, cost estimation, advantages and disadvantages, and finally how to connect Indian villages with towns, cities, and global markets.
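To get a rough feel for store-and-forward capacity (all numbers below are illustrative assumptions, not figures from the paper), consider a vehicle-mounted access point that exchanges data with a village kiosk only while it drives past:

```python
# All numbers are illustrative assumptions, not figures from the paper.
link_rate_mbps = 11.0      # assumed 802.11b-class radio link (ignores protocol overhead)
contact_time_s = 120       # assumed time the vehicle is within radio range per pass
passes_per_day = 4         # assumed vehicle trips past the kiosk each day

bytes_per_pass = link_rate_mbps * 1e6 / 8 * contact_time_s
daily_megabytes = bytes_per_pass * passes_per_day / 1e6

print(f"~{bytes_per_pass / 1e6:.0f} MB carried per pass")
print(f"~{daily_megabytes:.0f} MB of asynchronous traffic per day")
```

Even with conservative assumptions, a few passes a day can ferry hundreds of megabytes, which is the economic argument behind asynchronous connectivity for remote villages.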

BiCMOS Technology

Added on: November 3rd, 2013 by Afsal Meerankutty No Comments

The need for high-performance, low-power, and low-cost systems for network transport and wireless communications is driving silicon technology toward higher speed, higher integration, and more functionality. Furthermore, this integration of RF and analog mixed-signal circuits into high-performance digital signal-processing (DSP) systems must be done with minimum cost overhead to be commercially viable. While some analog and RF designs have been attempted in mainstream digital-only complementary metal-oxide semiconductor (CMOS) technologies, almost all designs that require stringent RF performance use bipolar or BiCMOS technology. Silicon integrated circuit (IC) products that, at present, require modern bipolar or BiCMOS silicon technology in the wired application space include those for the synchronous optical network (SONET) and synchronous digital hierarchy (SDH) operating at 10 Gb/s and higher.

The viability of a mixed digital/analog/RF chip depends on the cost of making the silicon with the required elements; in practice, it must approximate the cost of a CMOS wafer. Cycle times for processing the wafer should not significantly exceed those for a digital CMOS wafer, and yields of the SOC chip must be similar to those of a multi-chip implementation. Much of this article examines process techniques that achieve the objectives of low cost, rapid cycle time, and solid yield.
