
Category Archives: TUTORIALS

SPICE (Simulation Program with Integrated Circuit Emphasis)

Circuit simulation is a technique for predicting the behavior of a real circuit with a computer program that replaces real components with predefined electrical models. A circuit-level simulation cannot consider all the physical processes in the parts, nor every PCB parasitic, so it will only reflect the specific models put into it. This is why simulators cannot substitute for breadboarding and prototyping. But they allow measurements of internal currents, voltages and power that in many cases are virtually impossible to make any other way.

SPICE (Simulation Program with Integrated Circuit Emphasis) is a general-purpose open source analog electronic circuit simulator. It is a powerful program that is used in integrated circuit and board-level design to check the integrity of circuit designs and to predict circuit behavior.

Integrated circuits, unlike board-level designs composed of discrete parts, are impossible to breadboard before manufacture. Further, the high costs of photo lithographic masks and other manufacturing prerequisites make it essential to design the circuit to be as close to perfect as possible before the integrated circuit is first built. Simulating the circuit with SPICE is the industry-standard way to verify circuit operation at the transistor level before committing to manufacturing an integrated circuit.

Board-level circuit designs can often be breadboarded for testing. Even with a breadboard, some circuit properties may not match the final printed wiring board, such as parasitic resistances and capacitances. These parasitic components can often be estimated more accurately using SPICE simulation. Also, designers may want more information about the circuit than is available from a single mock-up. For instance, circuit performance is affected by component manufacturing tolerances. In these cases it is common to use SPICE to perform Monte Carlo simulations of the effect of component variations on performance, a task which is impractical using calculations by hand for a circuit of any appreciable complexity.
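The Monte Carlo idea is easy to see in miniature. The sketch below, in plain Python rather than any SPICE dialect, perturbs the two resistors of a voltage divider within a 5% tolerance band and records the spread of the resulting gain; the function names and component values are made up for illustration.

```python
import random

def divider_gain(r1, r2):
    """Voltage gain of a resistive divider: Vout/Vin = R2 / (R1 + R2)."""
    return r2 / (r1 + r2)

def monte_carlo(nominal_r1, nominal_r2, tolerance, runs=10000, seed=1):
    """Randomly perturb each resistor within its tolerance band and collect
    the resulting gains, as a SPICE Monte Carlo analysis would."""
    rng = random.Random(seed)
    gains = []
    for _ in range(runs):
        r1 = nominal_r1 * (1 + rng.uniform(-tolerance, tolerance))
        r2 = nominal_r2 * (1 + rng.uniform(-tolerance, tolerance))
        gains.append(divider_gain(r1, r2))
    return gains

gains = monte_carlo(10e3, 10e3, 0.05)   # two 10 kΩ resistors, 5% tolerance
print(min(gains), max(gains))           # spread around the nominal gain of 0.5
```

With 5% parts the gain can drift anywhere between about 0.475 and 0.525, which is exactly the kind of answer a tolerance study is after.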

Circuit simulation programs, of which SPICE and derivatives are the most prominent, take a text netlist describing the circuit elements (transistors, resistors, capacitors, etc.) and their connections, and translate this description into equations to be solved. The general equations produced are nonlinear differential algebraic equations which are solved using implicit integration methods, Newton’s method and sparse matrix techniques.
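As a toy illustration of that flow, the sketch below writes down the nodal (conductance) equations for a two-node resistor circuit by hand and solves G·v = i with a tiny dense Gaussian elimination. A real SPICE engine assembles the matrix automatically from the netlist, uses sparse LU factorisation, and wraps this solve in Newton iterations for nonlinear elements; the circuit and helper names here are invented for the example.

```python
def solve(a, b):
    """Tiny dense Gaussian elimination with partial pivoting (real SPICE
    uses sparse LU factorisation, but the idea is the same)."""
    n = len(b)
    a = [row[:] for row in a]
    b = b[:]
    for k in range(n):
        p = max(range(k, n), key=lambda i: abs(a[i][k]))  # pivot row
        a[k], a[p] = a[p], a[k]
        b[k], b[p] = b[p], b[k]
        for i in range(k + 1, n):
            f = a[i][k] / a[k][k]
            for j in range(k, n):
                a[i][j] -= f * a[k][j]
            b[i] -= f * b[k]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (b[i] - sum(a[i][j] * x[j] for j in range(i + 1, n))) / a[i][i]
    return x

# Nodal analysis of: 1 mA current source into node 1, R1 = 1 kΩ from
# node 1 to node 2, R2 = 2 kΩ from node 2 to ground.  Solve G * v = i.
R1, R2, I = 1e3, 2e3, 1e-3
G = [[ 1/R1, -1/R1],
     [-1/R1,  1/R1 + 1/R2]]
v = solve(G, [I, 0.0])
print(v)   # node voltages: v1 = I*(R1+R2) = 3 V, v2 = I*R2 = 2 V
```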

SPICE was developed at the Electronics Research Laboratory of the University of California, Berkeley by Laurence Nagel with direction from his research advisor, Prof. Donald Pederson. SPICE1 was largely a derivative of the CANCER program, which Nagel had worked on under Prof. Ronald Rohrer. CANCER was an acronym for “Computer Analysis of Nonlinear Circuits, Excluding Radiation,” a nod to Berkeley’s liberalism of the 1960s: at the time, many circuit simulators were developed under United States Department of Defense contracts that required the capability to evaluate the radiation hardness of a circuit. When Nagel’s original advisor, Prof. Rohrer, left Berkeley, Prof. Pederson became his advisor. Pederson insisted that CANCER, a proprietary program, be rewritten enough that restrictions could be removed and the program could be put in the public domain.

SPICE1 was first presented at a conference in 1973. SPICE1 was coded in FORTRAN and used nodal analysis to construct the circuit equations. Nodal analysis has limitations in representing inductors, floating voltage sources and the various forms of controlled sources. SPICE1 had relatively few circuit elements available and used a fixed-time step transient analysis. The real popularity of SPICE started with SPICE2 in 1975. SPICE2, also coded in FORTRAN, was a much-improved program with more circuit elements, variable time step transient analysis using either the trapezoidal (second order Adams-Moulton method) or the Gear integration method (also known as BDF), equation formulation via modified nodal analysis (avoiding the limitations of nodal analysis), and an innovative FORTRAN-based memory allocation system developed by another graduate student, Ellis Cohen. The last FORTRAN version of SPICE was 2G.6 in 1983. SPICE3 was developed by Thomas Quarles (with A. Richard Newton as advisor) in 1989. It is written in C, uses the same netlist syntax, and added X Window System plotting.
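To show what the trapezoidal method mentioned above does, here is a minimal transient analysis of an RC discharge in plain Python. For this linear one-node circuit the implicit trapezoidal update can be solved in closed form at each step; the function names and component values are illustrative, not taken from any SPICE source.

```python
import math

def rc_discharge_trapezoidal(v0, r, c, h, steps):
    """Integrate dv/dt = -v/(R*C) with the trapezoidal rule, one of the
    two implicit methods offered by SPICE2.  The implicit update
    v1*(1 + h/2τ) = v0*(1 - h/2τ) is solved in closed form each step."""
    tau = r * c
    a = (1 - h / (2 * tau)) / (1 + h / (2 * tau))
    v = v0
    out = [v]
    for _ in range(steps):
        v *= a
        out.append(v)
    return out

# 5 V on a 1 kΩ / 1 µF network, 100 µs steps (time constant τ = 1 ms)
vs = rc_discharge_trapezoidal(5.0, 1e3, 1e-6, h=1e-4, steps=10)
exact = 5.0 * math.exp(-10 * 1e-4 / 1e-3)   # analytic value after 10 steps
print(vs[-1], exact)
```

Even with a step a tenth of the time constant, the trapezoidal result tracks the analytic exponential to a few parts in a thousand, which is why variable-step implicit integration made SPICE2 practical.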

As an early open source program, SPICE was widely distributed and used. Its ubiquity became such that “to SPICE a circuit” remains synonymous with circuit simulation. SPICE source code was from the beginning distributed by UC Berkeley for a nominal charge (to cover the cost of magnetic tape). The license originally included distribution restrictions for countries not considered friendly to the USA, but the source code is currently covered by the BSD license.

SPICE inspired and served as a basis for many other circuit simulation programs, in academia, in industry, and in commercial products. The first commercial version of SPICE was ISPICE, an interactive version on a timeshare service, National CSS. The most prominent commercial versions of SPICE include HSPICE (originally commercialized by Shawn and Kim Hailey of Meta Software, but now owned by Synopsys) and PSPICE (now owned by Cadence Design Systems). The academic spinoffs of SPICE include XSPICE, developed at Georgia Tech, which added mixed analog/digital “code models” for behavioural simulation, and Cider (previously CODECS, from UC Berkeley/Oregon State Univ.) which added semiconductor device simulation. The integrated circuit industry adopted SPICE quickly, and until commercial versions became well developed many IC design houses had proprietary versions of SPICE. Today a few IC manufacturers, typically the larger companies, have groups continuing to develop SPICE-based circuit simulation programs. Among these are ADICE at Analog Devices, LTspice at Linear Technology, Mica at Freescale Semiconductor, and TISPICE at Texas Instruments. (Other companies maintain internal circuit simulators which are not directly based upon SPICE, among them PowerSpice at IBM, Titan at Qimonda, Lynx at Intel Corporation, and Pstar at NXP Semiconductor.)

SPICE became popular because it contained the analyses and models needed to design integrated circuits of the time, and was robust enough and fast enough to be practical to use. Precursors to SPICE often had a single purpose: The BIAS program, for example, did simulation of bipolar transistor circuit operating points; the SLIC program did only small-signal analyses. SPICE combined operating point solutions, transient analysis, and various small-signal analyses with the circuit elements and device models needed to successfully simulate many circuits.

Some of the popular circuit simulators are as follows:

1. ASTAP
2. Advanced Design System
3. CircuitLogix
4. CPU Sim
5. GNU Circuit Analysis Package
6. Gpsim
7. ICAP/4
8. Logisim
9. Micro-Cap
10. NI Multisim
11. Ngspice
12. PSpice
13. PowerEsim
14. Quite Universal Circuit Simulator
15. SPICE
16. SapWin
17. SmartSpice
18. SNAP
19. Spectre Circuit Simulator
20. SpectreRF

These simulators differ from each other and are generally application specific. The most popular SPICE simulators for analog circuit simulation are PSpice, originally offered by MicroSim and now incorporated in Cadence’s OrCAD, and National Instruments Multisim.

“Do Business with Style” – cool new templates for Microsoft Office 2007

Are you bored of the traditional old Microsoft Office environment? Then here is something new. Microsoft Small Business is now giving out some cool free designer templates for Office 2007 for Windows and Office 2008 for Mac. Best of all, they come in 9 colour schemes. The six templates comprise a spreadsheet, presentation, invoice, letterhead, business card and newsletter blast. You can also get a 60-day free trial of Microsoft Office if you don’t have it. Just click the link:

Microsoft Office 2007 free templates

Don’t forget that these templates will only work if you have Office 2007 for Windows or Office 2008 for Mac, because they are in the new formats (.docx, .xlsx, etc.). The templates are zip files and you can download them directly. No verification required.

“TOUCH-SCREEN” general introduction

You must have seen touch screens at ATMs, on cellphones and at information kiosks. A touch screen based system allows easy navigation around a GUI-based environment.



Tap on the screen twice for a double click, and drag a finger on the screen to move the cursor. We can do almost everything the mouse does, and do away with the additional device (the mouse), leaving only the screen.

So “they’re all the rage, they are the buzzword, they are probably the hottest thing in the technological cosmos right now!” Falling prices and improved technology have fueled the use of these devices in smaller consumer devices like mobiles, Tablet PCs and handheld gaming consoles.

The components

There are several popular touch screen technologies, but all of them share three main components:

·        Touch sensitive surface

·        The controller

·        The software driver


>> The touch sensitive surface is an extremely durable and flexible glass or polymer touch-response surface, and this panel is placed over the viewable area of the screen. In most sensors there is an electric signal going across the screen, and a touch on the surface causes a change in the signal, depending on the touch sensor technology used. This change allows the controller to identify the location of the touch.

>> The controller is a device that acts as the intermediary between the screen and the computer. It interprets the electrical signal of the touch event into a digital signal that the computer can understand. The controller can be placed with the screen or housed externally.

>> The software driver is an interpreter that converts the signal coming from the controller into information that the operating system can understand.

Touch screen sensor technologies

Resistive touch screens

Resistive touch screens have two glass or acrylic layers; one is coated with a conducting material and the other with a resistive material. When these layers are brought into contact, the resistance changes; the change is registered and the location of the touch is calculated.

These are durable and resistant to humidity and liquid spills. But they offer limited clarity, and the surface can be easily damaged by sharp objects.
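For the common 4-wire resistive variant, the position readout reduces to a voltage ratio: drive one layer with a known voltage and the other layer acts as a probe, so the measured voltage is proportional to the touch coordinate along the driven axis. A minimal sketch; the drive voltage and pixel geometry below are invented example values.

```python
def resistive_touch_position(vx, vy, vdrive=3.3, width=800, height=480):
    """Simplified 4-wire resistive readout: the probe voltage along each
    driven axis is a linear fraction of the drive voltage, so the touch
    coordinate is just (v_measured / v_drive) scaled to the screen size."""
    x = vx / vdrive * width
    y = vy / vdrive * height
    return x, y

# roughly mid-screen in x, quarter-screen in y
print(resistive_touch_position(1.65, 0.825))
```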

Capacitive touch screens

These devices use a glass panel coated with a charge-storing material on its surface. When the panel is touched, a small amount of charge is drawn to the point of contact. Circuits located at each corner of the screen measure the difference in charge and send the information to the controller, which calculates the position of the touch. These are used where clarity and precision are important, as in laptops and medical imaging.

Acoustic wave touch screens

Touch Screen Acoustic Wave Technology

This newer technology uses ultrasonic waves that pass over the screen. When the panel is touched, there is a change in the frequency of the ultrasonic wave, and the receiver at the end of the panel registers this change.

Since only glass is used with no coating, there is nothing that wears out.

Infrared touch screens

Infrared touch screens are primarily used for large displays, banking machines, and in military applications.

Touch Screen Infrared Technology

Infrared touch screens are based on light-beam interruption technology. Instead of an overlay on the surface, a frame surrounds the display. The frame has light sources, or light emitting diodes (LEDs) on one side and light detectors on the opposite side, creating an optical grid across the screen.

When an object touches the screen, the invisible light beam is interrupted, causing a drop in the signal received by the photo sensors.
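A toy version of that readout: given which beams along each axis are interrupted, report the touch at the centre of the blocked run. Purely illustrative; the function name and grid sizes are made up.

```python
def ir_touch(blocked_x, blocked_y):
    """Infrared grid readout sketch: blocked_x / blocked_y hold one flag
    per LED/photodetector pair along each axis; a touch is reported at
    the centre of the run of interrupted beams (None = no touch)."""
    def centre(blocked):
        hits = [i for i, b in enumerate(blocked) if b]
        return sum(hits) / len(hits) if hits else None
    return centre(blocked_x), centre(blocked_y)

# a finger blocks beams 3-5 horizontally and beam 2 vertically
print(ir_touch([0, 0, 0, 1, 1, 1, 0, 0], [0, 0, 1, 0, 0, 0]))  # → (4.0, 2.0)
```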

Optical touch screen

Touch Screen Optical Technology

Optical touch screen technology is ideal for large LCD and plasma displays up to 100″ diagonal.

Optical touch screen technology uses two line scanning cameras located at the corners of the screen. The cameras track the movement of any object close to the surface by detecting the interruption of an infra-red light source. The light is emitted in a plane across the surface of the screen and can be either active (infra-red LED) or passive (special reflective surfaces).

Dispersive Signal Technology

Dispersive Signal Technology (DST) consists of a chemically-strengthened glass substrate with piezos mounted on each corner, mated to a sophisticated, dedicated controller. The DST Touch System determines the touch position by pinpointing the source of “bending waves” created by finger or stylus contact within the glass substrate. This process of interpreting bending waves within the glass substrate helps eliminate traditional performance issues related to on-screen contaminants and surface damage, and provides fast, accurate touch attributes.


Other technologies

The above are the main technologies, but there are others as well, like strain gauges and Microsoft’s Surface technology. Through surface computing you can seamlessly synchronize electronic devices that touch its surface.

Advantages and disadvantages

With improvements in touch screen technology, precise pointing is also possible. Touch screens with a glass surface can resist dirt, grease, moisture and even household cleaning agents.

Touch screens have some disadvantages: people with fat fingers may mishit keys, and the screens need to be handled carefully. Touch screens are complex items, and unlike a keypad there are many things which may go wrong.

Future prospects

Many large companies like Microsoft and Apple have gotten on the touch screen bandwagon, multi-touch screens in particular. Apple has patented an “Integrated Sensing Display” wherein display elements are integrated with image sensing elements. If this is put to use, you could have a single device that looks like a monitor for video conferencing, wherein the monitor would also be the camera!


“GRAPHICS CARD” an introduction

A graphics card, also known as a video card or graphics accelerator card, is a piece of computer hardware that is responsible for creating the images you see on the monitor. It is usually used to refer to the type of card that is an expansion to the computer’s motherboard, and not the one integrated into the computer already. They have been used in many other types of electronic devices, particularly game consoles.

                                                                                                                              
A basic modern video card consists of several circuitry items: a Graphics Processing Unit (GPU), video memory and BIOS, a Random Access Memory Digital to Analog Converter (RAMDAC), outputs, and a cooling device. I’ll discuss these in order.

Graphics Processing Unit
The GPU is responsible for what you see on the screen. It is a separate microprocessor that frees up the CPU. Unlike the CPU, the GPU makes use of much more complicated mathematical and geometrical equations to complete the graphics rendering. Most models make use of filtering and anti-aliasing to make images crisper and to smooth out the edges of objects. Current models are often optimized for 3D functions, due to the common usage for games. The GPU makes use of “pipelines,” which translate a 3D scene into the 2D image shown on screen.

Video Memory and Bios
Video RAM or VRAM is the number of megabytes you will see on the box of a graphics card. Graphics cards have their own memory space to help the computer process images. This part of the card holds the z-buffer, which is particularly helpful in 3D graphics as it keeps track of depth in an image. It is this memory that makes a 3D environment possible. The BIOS is just the basic programming for interaction between the card and the computer. It contains the memory for all of the cycles the card must go through. It is an extremely important area on the card, and if damaged, will cause the card to stop working.
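The z-buffer mentioned above is simple to sketch: for every pixel, keep the depth of the nearest fragment seen so far, and only overwrite the stored colour when a nearer fragment arrives. The fragment list and colour names below are invented for the example.

```python
def render(fragments, width, height):
    """Minimal z-buffer: for each pixel, keep only the colour of the
    nearest fragment seen so far (smaller depth = closer to the viewer)."""
    INF = float("inf")
    zbuf = [[INF] * width for _ in range(height)]
    frame = [[None] * width for _ in range(height)]
    for x, y, depth, colour in fragments:
        if depth < zbuf[y][x]:        # nearer than what is stored: overwrite
            zbuf[y][x] = depth
            frame[y][x] = colour
    return frame

# three fragments; at pixel (1, 0) the blue one is nearer and wins
frame = render([(1, 0, 5.0, "red"), (1, 0, 2.0, "blue"), (0, 0, 9.0, "green")],
               width=2, height=1)
print(frame)
```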

Random Access Memory Digital to Analog Converter
The RAMDAC is slowly disappearing as the CRT monitor disappears and more people use digital equipment. Its function is to convert the digital image data held in the card’s memory into the analog signal that a CRT monitor can display. Because of the conversion, a certain amount of quality is lost.

Outputs
These are the links from the video card to other instruments and peripherals. Common outputs are SVGA outputs to CRT monitors, DVI outputs to digital displays such as LCD monitors and projectors, and S-video outputs to TVs, game consoles, and video recorders.

Cooling devices
As graphics cards become more and more powerful, they generate more and more heat. Cooling devices remove this heat in some way or another. Fans are common and simply circulate the warm air away; because a fan consists of moving parts, it can break. Liquid cooling uses a cool liquid to carry the heat away; it is much less common, but favorable because of its efficiency, though much more expensive.

These parts describe a general graphics card. The most important information on how the graphics card works is in the GPU subsection and the VRAM subsection.

Intel® Centrino® Processor Technology with New Intel® Core™2 Duo Processor

More performance. More freedom. More bandwidth.

 

With new laptop PCs based on Intel® Centrino® 2 processor technology for the home, or Intel® Centrino® 2 with vPro™ technology for business, you’ll experience a new breakthrough in mobile performance, enabled longer battery life, the future of wireless now with 802.11n standard, and more, right at your fingertips.‡ Delivering performance gains of up to 50%¹ enabled by a minimum 3MB Smart Cache and 1066MHz Front Side Bus, these laptops are equipped to handle everything from robust business to masterful multimedia and everything in between. And with Intel Centrino 2 processor technology, you’ll make quick work of the toughest computing tasks like HD video encoding—up to 90% faster², so you can accomplish more without the wait.

• New 45nm Intel® Core™2 Duo processor
• Mobile Intel® 965 Express Chipset Family
• Intel® Next-Gen Wireless-N (Intel® Wireless WiFi Link 4965AGN) or Intel® PRO/Wireless 3945ABG Network Card

• CPU performance for high-def movies and music: Enjoy the performance of Intel® Centrino® processor technology – it provides the computing power for a great high-def experience, and you can enjoy the rich sound quality of your music, movies and gaming with Intel® High Definition Audio.

• Innovate with amazing multimedia performance: Now you can get up to 50% more performance on intensive multimedia like HD video encoding with Intel® HD Boost.1

• Designed for the longest possible battery life5: Unique new power management and power-saving features help extend battery life so you can enjoy more entertainment, gaming, and music while on the go. Stay unplugged longer.

Reinventing Performance, Redefining Efficiency.

• Use less, do more: New T8000∆ and T9000∆ series CPUs are lead-free and have unique power-saving features like Intel® Deep Power Down for conserving energy and protecting the environment.

• Faster, broader, better: Now up to 5X faster wireless performance and up to 2X greater range than 802.11a/g solutions with an Intel Centrino processor technology-based notebook with optional Intel® Next-Gen Wireless-N and a wireless-N home network.2

• Need for speed?: Look for Vista*-based notebooks with new Intel® Turbo Memory for up to 2X faster performance when loading frequently used, memory-intensive applications3 and better boot time.4

Intel Centrino processor technology featuring the new Intel® Core™2 Duo processor enables outstanding mobile dual-core performance while enabling improved battery life and enhanced wireless connectivity — all in a variety of laptop designs so you can choose the right one for you, your family or your business.

Here is a PDF document with more details.

Stop the intruders using a firewall

A firewall is a ‘must-have’ if you want to be safe online. Arm your PC with this critical interface to stay safe.
You know you have to keep your PC safe from viruses. But an anti-virus program cannot stop malicious people or automated bots from trying to enter your PC. Many websites run malicious code which can put your computer in trouble. This is the reason you need a firewall.
 
Ever noticed at work, when you are trying to send a file to your friend on chat, and you get a message that says you cannot send the file to him because you are behind a firewall? That is the precaution that your company has taken to ensure that chat programs do not compromise the office network.
 
A firewall can be a software or hardware device, or a combination of both, used to monitor and filter network traffic according to a set of predefined rules. Generally, firewalls are set up to protect against unauthenticated logins from the cyber world. A firewall can also guard you against malicious websites and potential hackers. More elaborate firewalls block traffic from the outside to the inside, but allow users to communicate freely with the outer world.
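Conceptually, such a rule set is just an ordered list checked until the first match, with a default of "deny". The sketch below is a hedged illustration in plain Python, not the syntax of any real firewall; the rule fields and packet representation are invented.

```python
def allowed(packet, rules):
    """First-match packet filter: each rule is (direction, protocol, port,
    action); '*' matches anything; default-deny when no rule matches."""
    for direction, proto, port, action in rules:
        if direction in ("*", packet["direction"]) and \
           proto in ("*", packet["proto"]) and \
           port in ("*", packet["port"]):
            return action == "allow"
    return False   # default deny

rules = [
    ("in",  "tcp", 80,  "allow"),   # inbound web traffic is permitted
    ("out", "*",   "*", "allow"),   # users may talk freely to the outside
    ("in",  "*",   "*", "deny"),    # everything else inbound is rejected
]
print(allowed({"direction": "in", "proto": "tcp", "port": 80},   rules))  # True
print(allowed({"direction": "in", "proto": "tcp", "port": 5190}, rules))  # False
```

Note how the rule ordering implements exactly the policy described above: block from outside to inside, allow freely from inside out.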
 
If you have a home computer, it’s enough to go for a personal software firewall. Windows XP (SP2) comes with one installed. You can switch it on by going to Control Panel – Windows Firewall. In offices, a firewall is placed at the gateway of web connectivity with its predefined rules based on the company specifications. Once in place, a firewall monitors and restricts information travelling between your computer and the Internet.
 
So if an intruder attempts to enter your system, his request is rejected by the firewall. The firewall only allows permissible transactions to take place. So while you may be able to surf the web or download files, requests from hacker programs or suspicious code are rejected. A skillfully configured firewall at times makes your system appear practically invisible! Remember that the efficiency of a firewall greatly depends on the way the “rules” are set.
 
There are many free software firewalls available for download. You can use a reputed one and install it on your computer. But remember to disable Windows Firewall before you start using it. Although many firewalls are designed to fight viruses, it’s best to have a separate anti-virus program installed on your computer.
 
There are limitations with firewalls too. Not all firewalls can protect you against viruses. A firewall cannot tell you whether it has been correctly configured or not. Also, not many firewalls can indicate whether your system has been compromised.
 

LHC – Large Hadron Collider

Big bang

The Creation of the Universe

First major CERN test complete, scientists cheer

The world’s largest particle collider successfully completed its first major test by firing a beam of protons all the way around a 17-mile (27-kilometre) tunnel on Wednesday in what scientists hope is the next great step to understanding the makeup of the universe.

After a series of trial runs, two white dots flashed on a computer screen at 10:36 a.m. (0836 GMT), indicating that the protons had travelled the full length of the 3.8 billion US dollar Large Hadron Collider.

Cheers erupted from the assembled scientists, including project leader Lyn Evans, in the collider’s control room at the Swiss-French border when the beam completed its lap.

Champagne corks popped in labs as far away as Chicago, where contributing scientists watched the proceedings by satellite.

Physicists around the world now have much greater power than ever before to smash the components of atoms together in attempts to see how they are made.

The European Organisation for Nuclear Research, known as CERN, began firing the protons – a type of subatomic particle – around the tunnel in stages less than an hour earlier.

Now that the beam has been successfully tested in the clockwise direction, CERN plans to send it counterclockwise.

Eventually two beams will be fired in opposite directions with the aim of recreating conditions a split second after the big bang, which scientists theorise was the massive explosion that created the universe.

The start of the collider – described as the biggest physics experiment in history – comes over the objections of some skeptics who fear the collision of protons could eventually imperil the earth.
The skeptics theorised that a byproduct of the collisions could be micro black holes, subatomic versions of collapsed stars whose gravity is so strong they can suck in planets and other stars.
James Gillies, chief spokesman for CERN, dismissed this as nonsense before Wednesday’s start.

CERN is backed by leading scientists like Britain’s Stephen Hawking in dismissing the fears and declaring the experiments to be absolutely safe.

Gillies said that the most dangerous thing that could happen would be if a beam at full power were to go out of control, and that would only damage the accelerator itself and burrow into the rock around the tunnel.

Nothing of the sort occurred on Wednesday, though the accelerator is still probably a year away from full power. The project, organised by the 20 European member nations of CERN, has attracted researchers from 80 nations.

Some 1,200 are from the United States, an observer country which contributed 531 million US dollars. Japan, another observer, is also a major contributor.

The collider is designed to push the proton beam close to the speed of light, whizzing 11,000 times a second around the tunnel.
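That figure is easy to check: 27 km per lap times 11,000 laps per second comes out just under the speed of light.

```python
circumference_m = 27_000          # length of the LHC tunnel, metres
laps_per_second = 11_000
speed = circumference_m * laps_per_second   # metres per second
c = 299_792_458                   # speed of light, m/s
print(speed, speed / c)           # 297,000,000 m/s, about 99% of c
```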

Smaller colliders have been used for decades to study the makeup of the atom. Less than 100 years ago scientists thought protons and neutrons were the smallest components of an atom’s nucleus, but in stages since then experiments have shown they were made of still smaller quarks and gluons and that there were other forces and particles.

The CERN experiments could reveal more about “dark matter,” antimatter and possibly hidden dimensions of space and time.

It could also find evidence of the hypothetical particle – the Higgs boson – believed to give mass to all other particles, and thus to the matter that makes up the universe.

How a detector works


The job of a particle detector is to record and visualise the explosions of particles that result from the collisions at accelerators. The information obtained on a particle’s speed, mass, and electric charge help physicists to work out the identity of the particle.

The work particle physicists do to identify a particle that has passed through a detector is similar to the way someone would study the tracks of footprints left by animals in mud or snow. In animal prints, factors such as the size and shape of the marks, length of stride, overall pattern, direction and depth of prints, can reveal the type of animal that came past earlier. Particles leave tell-tale signs in detectors in a similar manner for physicists to decipher.

Modern particle physics apparatus consists of layers of sub-detectors, each specialising in a particular type of particle or property. There are three main types of sub-detector: tracking devices, calorimeters and particle identification detectors.

To help identify the particles produced in the collisions, the detector usually includes a magnetic field. A particle normally travels in a straight line, but in the presence of a magnetic field, its path is bent into a curve. From the curvature of the path, physicists can calculate the momentum of the particle which helps in identifying its type. Particles with very high momentum travel in almost straight lines, whereas those with low momentum move forward in tight spirals.
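The momentum-from-curvature relation is the standard p[GeV/c] ≈ 0.3·q·B[T]·r[m] for a particle of charge q (in units of the electron charge). A one-line illustration, with made-up field and radius values:

```python
def momentum_gev(b_tesla, radius_m, charge=1):
    """Transverse momentum from the curvature of a track in a magnetic
    field: p[GeV/c] ≈ 0.3 * q * B[T] * r[m] for charge q in units of e."""
    return 0.3 * charge * b_tesla * radius_m

# a track curving with a 1.5 m radius in a 4 T solenoid field
print(momentum_gev(4.0, 1.5))   # ≈ 1.8 GeV/c
```

The tighter the spiral (smaller r), the lower the momentum, which is exactly the behaviour described above.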

Tracking devices

Tracking devices reveal the paths of electrically charged particles through the trails they leave behind. There are similar every-day effects: high-flying airplanes seem invisible, but in certain conditions you can see the trails they make. In a similar way, when particles pass through suitable substances the interaction of the passing particle with the atoms of the substance itself can be revealed.

Most modern tracking devices do not make the tracks of particles directly visible. Instead, they produce tiny electrical signals that can be recorded as computer data. A computer program then reconstructs the patterns of tracks recorded by the detector, and displays them on a screen.

They can record the curvature of a particle’s track (made in the presence of a magnetic field), from which the momentum of a particle may be calculated. This is useful for identifying the particle.

Muon chambers are tracking devices used to detect muons. These particles interact very little with matter and can travel long distances through metres of dense material. Like a ghost walking through a wall, muons can pass through successive layers of a detector. The muon chambers usually make up the outermost layer.

Calorimeters

A calorimeter measures the energy lost by a particle that goes through it. It is usually designed to entirely stop or ‘absorb’ most of the particles coming from a collision, forcing them to deposit all of their energy within the detector.

Calorimeters typically consist of layers of ‘passive’ or ‘absorbing’ high–density material (lead for instance) interleaved with layers of ‘active’ medium such as solid lead-glass or liquid argon.

Electromagnetic calorimeters measure the energy of light particles – electrons and photons – as they interact with the electrically charged particles inside matter.

Hadronic calorimeters sample the energy of hadrons (particles containing quarks, such as protons and neutrons) as they interact with atomic nuclei.

Calorimeters can stop most known particles except muons and neutrinos.

Particle identification detectors

Two methods of particle identification work by detecting radiation emitted by charged particles:

  • Cherenkov radiation: this is light emitted when a charged particle travels faster than the speed of light through a given medium. The light is given off at a specific angle according to the velocity of the particle. Combined with a measurement of the momentum of the particle the velocity can be used to determine the mass and hence to identify the particle.
  • Transition radiation: this radiation is produced by a fast charged particle as it crosses the boundary between two media with different electrical properties (different dielectric constants). The phenomenon is related to the energy of the particle and distinguishes different particle types.
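The Cherenkov relation cos θ = 1/(nβ) can be inverted to get the particle’s velocity, and combined with a momentum measurement it yields the mass, as described above. A small sketch in natural units (c = 1); the 30° angle and water-like refractive index are example values, not measurements.

```python
import math

def beta_from_cherenkov(theta_rad, n):
    """Invert cos(theta) = 1 / (n * beta):  beta = 1 / (n * cos(theta))."""
    return 1.0 / (n * math.cos(theta_rad))

def mass_gev(p_gev, beta):
    """Natural units (c = 1): p = gamma * m * beta,
    so m = p * sqrt(1 - beta**2) / beta."""
    return p_gev * math.sqrt(1.0 - beta * beta) / beta

beta = beta_from_cherenkov(math.radians(30.0), n=1.33)  # water-like radiator
print(beta)                 # velocity as a fraction of c
print(mass_gev(1.0, beta))  # inferred mass for a 1 GeV/c track
```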