Author Archives: techspaceofatul

Evolution from 1G to 4G LTE: Understanding the Mobile Technologies

Long Term Evolution (LTE), also known as 4G, is a rapidly growing global technology that continues to evolve, offering unmatched data rates, higher capacity, and new levels of user experience.

According to Qualcomm, 65% of the world's population was forecast to have LTE coverage by 2019.

To bring 5G to fruition, the electronics industry is working to get the most out of every bit of spectrum across a wide range of available bands: low bands below 1 GHz, mid bands from 1 GHz to 6 GHz, and high bands known as mmWave.

This will make available nearly 11 GHz of spectrum (3.85 GHz of licensed spectrum and 7 GHz of unlicensed spectrum), more spectrum for wireless/mobile broadband than ever before. 5G technologies will take advantage of these ultra-wide bandwidths to meet the rising demands for speed, quality of performance, and network capacity. As with the move from 3G to 4G, advanced 5G technologies, including new antenna technologies to mobilize mmWave, will mark another generational change.
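
As a quick sanity check of the arithmetic above, here is a minimal Python sketch using only the figures quoted in this paragraph (everything else is just addition):

```python
# Spectrum totals quoted above for 5G, in GHz.
licensed_ghz = 3.85      # cleared spectrum for exclusive mobile use
unlicensed_ghz = 7.0     # unlicensed/shared spectrum

total_ghz = licensed_ghz + unlicensed_ghz
print(f"Total spectrum: {total_ghz:.2f} GHz")             # 10.85 GHz, i.e. "virtually 11 GHz"
print(f"Licensed share: {licensed_ghz / total_ghz:.0%}")  # roughly a third of the total
```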

Let's take a glimpse at the evolution of the mobile and communication technology generations.

1G established seamless mobile connectivity, introducing mobile voice services.

  • Mobile 1G established the foundation of mobile cellular phones.
  • Commercial mobile communications systems (1G) first appeared in the early 1980s. These systems were based on analog transmission, and only a relatively small proportion of the population had access to them.
  • Licensed Spectrum: Cleared spectrum for exclusive use by mobile technologies. Operator-deployed base stations provide access for subscribers.
  • Frequency Reuse: Reusing frequencies without interference through geographical separation. Neighboring cells operate on different frequencies to avoid interference.
  • Mobile Network: Coordinated network for seamless access and seamless mobility. Integrated, transparent backhaul network provides seamless access.
  • AMPS, NMT, and TACS were standardized as 1G techniques.

Mobile 1G was amazing, but limited:

  • These systems provided little security, and standardization was not controlled particularly well.
  • Limited Capacity: Analog voice consumed an entire channel per call. Frequency Division Multiple Access (FDMA) supported only one user (one analog phone call) per channel.
  • Spectrum Inefficiency: Large guard bands of spectrum were required between users to avoid interference; analog transmission made inefficient use of the limited spectrum.
  • Limited Scalability: Analog devices were large, heavy, power-inefficient, and costly, leaving little scope for scaling.

2G digital wireless technologies increased voice capacity, delivering mobile voice services to the masses – more people, in more places.

  • The first 2G systems were standardized and deployed in the early 1990s. The GSM (Global System for Mobile Communications) standard has come to dominate, but other co-existing technologies such as cdmaOne also operate.
  • 2G systems were developed primarily for voice communications and incorporated circuit-switching technology. Later, some data capability was added with SMS (Short Message Service) and WAP (Wireless Application Protocol), but these were quite limited in terms of functionality and available capacity.
  • In a GSM network, mobile phones are in constant contact with the nearest BTS (Base Transceiver Station), which is connected (along with several other BTSs) to a BSC (Base Station Controller). The BSCs report to an MSC (Mobile Switching Centre), which manages the communications and performs a number of functions including mobility management, authentication, encryption, billing, and connectivity to the PSTN (Public Switched Telephone Network).
  • Initial 2G technologies D-AMPS and GSM were based on TDMA, while cdmaOne was based on Code Division Multiple Access.
  • Early Mobile 2G technologies enabled more users per channel: digital transmission allows voice to be compressed and multiple users to be multiplexed onto one radio channel. Time Division Multiple Access (TDMA) lets several users share a channel, each transmitting in turn.
  • The D-AMPS technique, standardized as IS-54 by the TIA in 1992 and used mainly in North America, enabled three users per radio channel by compressing digital voice into smaller "packages" (it is no longer in use).
  • The GSM technique, standardized by ETSI in 1990 (Phase 1) and initiated in Europe, enabled eight users per radio channel and is still widely used today. It provides simple data services with GPRS. GSM can offer a maximum of 9.6 Kbps per channel, which is sufficient for voice but not suitable for any substantial data traffic.
  • GPRS allows a GSM channel to be divided into 8 streams of 13 Kbps each, with concurrent transmission of packets along each, to yield a maximum of around 100 Kbps (see the sketch after this list). Another feature of GPRS is that it allows simultaneous data transfer and voice calls.
  • TDMA still required large frequency gaps to reduce interference and relied on potentially unreliable "hard" handoffs: switching channels between adjacent cells made it prone to dropped calls. It also kept a rigid transmission schedule whether or not the user was actively talking.
  • Code Division Multiple Access (CDMA), developed by Qualcomm, enabled users to share the same frequency and communicate at the same time – like multiple users talking at once in different languages ("codes"). Each user's information is encoded with a unique code, and because there is no rigid delivery schedule, the capacity freed when a user is not talking can be used to support more callers.
  • CDMA utilizes all the available spectrum and can support many more users (>10x 1G) in the same spectrum.
  • In February 1990, the first CDMA field trial was completed by Qualcomm and NYNEX.
  • Qualcomm's new CDMA standard (IS-95), referred to as cdmaOne, was developed in May 1995 and commercially deployed in December 1995.
  • In practice, GPRS speeds are considerably lower than the 100 Kbps mark. EDGE (Enhanced Data rates for GSM Evolution) was an attempt to improve data rates, delivering up to 384 Kbps; it represents another step towards 3G.
  • Scalable Technology: Digital components cost and weigh far less, deliver a more secure signal, and offer ample scope for further scaling using large-scale semiconductor integration.
  • Due to GSM and CDMA, more and more people had a mobile subscription.
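
To make the GPRS arithmetic in the list above concrete, here is a minimal Python sketch; the timeslot count and per-slot rate are taken from the bullets, and real-world coding schemes and protocol overhead (which keep actual speeds well below this ceiling) are ignored:

```python
# GPRS ceiling: a GSM carrier divided into 8 timeslots of ~13 Kbps each,
# with packets sent concurrently on all of them (figures from the article).
timeslots = 8
kbps_per_slot = 13

peak_kbps = timeslots * kbps_per_slot
print(f"Theoretical GPRS peak: {peak_kbps} Kbps")   # 104 Kbps, i.e. "around 100 Kbps"

# Compare with plain circuit-switched GSM data on a single channel.
gsm_csd_kbps = 9.6
print(f"Speed-up over 9.6 Kbps GSM data: {peak_kbps / gsm_csd_kbps:.1f}x")
```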

3G optimized mobile for data, enabling mobile broadband services, and continues to evolve for faster and better connectivity.

  • 3G extends mobile services to include multi-rate multimedia content.
  • CDMA is the foundation of Mobile 3G technologies. Mobile 3G evolved into competing standards, all based on CDMA: CDMA2000/EV-DO, WCDMA/HSPA+ and TD-SCDMA.
  • Using 3G technology consumers were introduced to mobile broadband internet access in the home and office.
  • Amazing innovations in device technology resulted in the era of the smartphone.
  • CDMA2000 uses a 1.25 MHz carrier, which enabled easy migration from cdmaOne. Qualcomm introduced EV-DO (Evolution-Data Optimized) in 1999, an optimized data channel for CDMA2000 providing mobile broadband services; it was launched commercially in 2002.
  • WCDMA (UMTS) uses a 5 MHz carrier and leverages the GSM core network; it was introduced in 2004. HSPA (High Speed Packet Access) is the optimized data channel for WCDMA providing mobile broadband services. The first DC-HSPA+ network (42 Mbps) was commercially launched in 2010.
  • While the 2G GSM data rate was <0.5 Mbps, it reached up to 14.7 Mbps with CDMA2000/EV-DO, whose data-optimized channel supports larger packet sizes, enabling richer content.
  • 3G techniques use higher-order adaptive modulation to deliver more bits per second per Hz (bps/Hz) to users with good signal quality, increasing peak data rates; 64-QAM enabled 50% more bps/Hz (see the sketch after this list).
  • The channel is optimized by scheduling users at the instants when they have good radio signal conditions (with fairness), increasing overall capacity; aggregating spectrum further increases user and peak data rates.
  • 3G techniques delivered achievable throughput >2 Mbps, reduced operator cost for data services, and continue to evolve for enhanced services.
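
The "50% more bps/Hz" figure for 64-QAM quoted in the list above follows directly from the modulation order. A minimal Python sketch, assuming the comparison baseline is 16-QAM (an assumption – the article does not name the baseline):

```python
import math

def bits_per_symbol(modulation_order: int) -> int:
    """Bits carried by one symbol of an M-ary modulation: log2(M)."""
    return int(math.log2(modulation_order))

qam16 = bits_per_symbol(16)   # 4 bits per symbol (assumed baseline)
qam64 = bits_per_symbol(64)   # 6 bits per symbol

gain = (qam64 - qam16) / qam16
print(f"16-QAM: {qam16} bits/symbol, 64-QAM: {qam64} bits/symbol")
print(f"Spectral efficiency gain: {gain:.0%}")   # 50% more bits per second per Hz
```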

4G LTE delivers more capacity for faster and better mobile broadband experiences and is also expanding into new frontiers.

  • Mobile 4G LTE is the first global standard for mobile broadband.
  • It enables two modes, LTE FDD and LTE TDD, with a common standard and the same ecosystem.
  • Frequency Division Duplex (FDD) uses paired spectrum for better coverage, while Time Division Duplex (TDD) uses unpaired spectrum, enabling asymmetrical DL/UL for more downlink capacity.
  • Multimode 3G/LTE is the foundation for successful 4G LTE.
  • 4G LTE provides more data capacity for richer content and more connections, while 3G enables a consistent broadband experience outside 4G LTE coverage, delivering ubiquitous voice services and global roaming.
  • Flexible support for wider channels, up to 20 MHz, is enabled with OFDMA, supporting more users.
  • More antennas and advanced MIMO techniques create spatially separated paths; 2×2 MIMO is mainstream.
  • Aggregate channels, up to 100 MHz in total, for higher data rates (see the sketch after this list).
  • Connect in real time with a simplified core network: an all-IP network with a flattened architecture, resulting in less equipment per transmission.
  • Low latencies: optimized response times for both the user and control planes improve the user experience.
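
The channel-width bullets above imply some simple arithmetic; here is a minimal Python sketch using the 20 MHz and 100 MHz figures quoted in the list (the five-carrier split and the idealized 2×2 MIMO doubling are illustrative assumptions consistent with those figures, not details taken from this article):

```python
# LTE channel arithmetic based on the figures quoted in the list above.
channel_mhz = 20        # widest single LTE channel mentioned above
aggregated_mhz = 100    # aggregation target mentioned above

carriers = aggregated_mhz // channel_mhz
print(f"{carriers} x {channel_mhz} MHz carriers = {aggregated_mhz} MHz aggregated")

# 2x2 MIMO creates two spatially separated paths; under ideal conditions this
# roughly doubles peak throughput (illustrative assumption).
spatial_streams = 2
print(f"2x2 MIMO spatial streams: {spatial_streams} (~{spatial_streams}x peak rate, ideal case)")
```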

For a summarized view, a comparison table of the features that evolved across generations may be seen below.

Comparison table

Source: Internet and Qualcomm

SPICE (Simulation Program with Integrated Circuit Emphasis)

Circuit simulation is a technique for predicting the behavior of a real circuit using a computer program. It replaces real components with predefined electrical models. It is not possible to consider all the physical processes in the parts and all the PCB parasitics in a circuit-level simulation, so the result only reflects the specific models put into it. This is the reason why simulators cannot substitute for breadboarding and prototyping. But they allow measurements of internal currents, voltages, and power that in many cases are virtually impossible to make any other way.

SPICE (Simulation Program with Integrated Circuit Emphasis) is a general-purpose open source analog electronic circuit simulator. It is a powerful program that is used in integrated circuit and board-level design to check the integrity of circuit designs and to predict circuit behavior.

Integrated circuits, unlike board-level designs composed of discrete parts, are impossible to breadboard before manufacture. Further, the high costs of photolithographic masks and other manufacturing prerequisites make it essential to design the circuit to be as close to perfect as possible before the integrated circuit is first built. Simulating the circuit with SPICE is the industry-standard way to verify circuit operation at the transistor level before committing to manufacturing an integrated circuit.

Board-level circuit designs can often be breadboarded for testing. Even with a breadboard, some circuit properties may not be accurate compared to the final printed wiring board, such as parasitic resistances and capacitances. These parasitic components can often be estimated more accurately using SPICE simulation. Also, designers may want more information about the circuit than is available from a single mock-up. For instance, circuit performance is affected by component manufacturing tolerances. In these cases it is common to use SPICE to perform Monte Carlo simulations of the effect of component variations on performance, a task which is impractical using calculations by hand for a circuit of any appreciable complexity.
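
As a toy illustration of the Monte Carlo idea described above (plain Python, not SPICE itself), here is a minimal sketch that spreads component tolerances over a resistive voltage divider; the 5% tolerance and the divider values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical voltage divider: Vout = Vin * R2 / (R1 + R2)
vin = 5.0                       # source voltage, volts
r1_nom, r2_nom = 10e3, 10e3     # nominal 10 kOhm resistors
tolerance = 0.05                # 5% component tolerance (hypothetical)

n = 10_000                      # number of Monte Carlo trials
r1 = r1_nom * (1 + tolerance * rng.uniform(-1, 1, n))
r2 = r2_nom * (1 + tolerance * rng.uniform(-1, 1, n))
vout = vin * r2 / (r1 + r2)

print(f"Nominal Vout:     {vin * r2_nom / (r1_nom + r2_nom):.3f} V")
print(f"Monte Carlo Vout: mean={vout.mean():.3f} V, std={vout.std():.3f} V, "
      f"min={vout.min():.3f} V, max={vout.max():.3f} V")
```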

Circuit simulation programs, of which SPICE and its derivatives are the most prominent, take a text netlist describing the circuit elements (transistors, resistors, capacitors, etc.) and their connections, and translate this description into equations to be solved. The general equations produced are nonlinear differential algebraic equations which are solved using implicit integration methods, Newton's method, and sparse matrix techniques.
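
For a feel of how a simulator turns a netlist into equations, here is a minimal Python sketch of plain nodal analysis for a linear resistive circuit. The two-node network and its SPICE-style netlist are hypothetical, and real SPICE uses modified nodal analysis, nonlinear device models, Newton iteration, and sparse solvers, as described above:

```python
import numpy as np

# Hypothetical circuit (SPICE-style netlist for reference):
#   I1 0 1 1m     ; 1 mA current source driving node 1
#   R1 1 2 1k     ; 1 kOhm between node 1 and node 2
#   R2 2 0 2k     ; 2 kOhm from node 2 to ground
g1 = 1 / 1e3      # conductance of R1, siemens
g2 = 1 / 2e3      # conductance of R2
i_src = 1e-3      # current injected into node 1, amps

# Nodal analysis: solve G * v = i, with G assembled by "stamping" each element.
G = np.array([[ g1, -g1     ],
              [-g1,  g1 + g2]])
i = np.array([i_src, 0.0])

v = np.linalg.solve(G, i)     # SPICE uses sparse LU; a dense solve is fine here
print(f"V(node 1) = {v[0]:.3f} V, V(node 2) = {v[1]:.3f} V")   # 3.000 V and 2.000 V
```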

SPICE was developed at the Electronics Research Laboratory of the University of California, Berkeley by Laurence Nagel with direction from his research advisor, Prof. Donald Pederson. SPICE1 was largely a derivative of the CANCER program, which Nagel had worked on under Prof. Ronald Rohrer. CANCER was an acronym for "Computer Analysis of Nonlinear Circuits, Excluding Radiation," a nod to Berkeley's liberalism of the 1960s: at the time, many circuit simulators were developed under United States Department of Defense contracts that required the capability to evaluate the radiation hardness of a circuit. When Nagel's original advisor, Prof. Rohrer, left Berkeley, Prof. Pederson became his advisor. Pederson insisted that CANCER, a proprietary program, be rewritten enough that restrictions could be removed and the program could be put in the public domain.

SPICE1 was first presented at a conference in 1973. SPICE1 was coded in FORTRAN and used nodal analysis to construct the circuit equations. Nodal analysis has limitations in representing inductors, floating voltage sources and the various forms of controlled sources. SPICE1 had relatively few circuit elements available and used a fixed-time step transient analysis. The real popularity of SPICE started with SPICE2 in 1975. SPICE2, also coded in FORTRAN, was a much-improved program with more circuit elements, variable time step transient analysis using either the trapezoidal (second order Adams-Moulton method) or the Gear integration method (also known as BDF), equation formulation via modified nodal analysis (avoiding the limitations of nodal analysis), and an innovative FORTRAN-based memory allocation system developed by another graduate student, Ellis Cohen. The last FORTRAN version of SPICE was 2G.6 in 1983. SPICE3 was developed by Thomas Quarles (with A. Richard Newton as advisor) in 1989. It is written in C, uses the same netlist syntax, and added X Window System plotting.

As an early open source program, SPICE was widely distributed and used. Its ubiquity became such that “to SPICE a circuit” remains synonymous with circuit simulation. SPICE source code was from the beginning distributed by UC Berkeley for a nominal charge (to cover the cost of magnetic tape). The license originally included distribution restrictions for countries not considered friendly to the USA, but the source code is currently covered by the BSD license.

SPICE inspired and served as a basis for many other circuit simulation programs, in academia, in industry, and in commercial products. The first commercial version of SPICE was ISPICE, an interactive version on a timeshare service, National CSS. The most prominent commercial versions of SPICE include HSPICE (originally commercialized by Shawn and Kim Hailey of Meta Software, but now owned by Synopsys) and PSPICE (now owned by Cadence Design Systems). The academic spinoffs of SPICE include XSPICE, developed at Georgia Tech, which added mixed analog/digital “code models” for behavioural simulation, and Cider (previously CODECS, from UC Berkeley/Oregon State Univ.) which added semiconductor device simulation. The integrated circuit industry adopted SPICE quickly, and until commercial versions became well developed many IC design houses had proprietary versions of SPICE. Today a few IC manufacturers, typically the larger companies, have groups continuing to develop SPICE-based circuit simulation programs. Among these are ADICE at Analog Devices, LTspice at Linear Technology, Mica at Freescale Semiconductor, and TISPICE at Texas Instruments. (Other companies maintain internal circuit simulators which are not directly based upon SPICE, among them PowerSpice at IBM, Titan at Qimonda, Lynx at Intel Corporation, and Pstar at NXP Semiconductor.)

SPICE became popular because it contained the analyses and models needed to design integrated circuits of the time, and was robust enough and fast enough to be practical to use. Precursors to SPICE often had a single purpose: The BIAS program, for example, did simulation of bipolar transistor circuit operating points; the SLIC program did only small-signal analyses. SPICE combined operating point solutions, transient analysis, and various small-signal analyses with the circuit elements and device models needed to successfully simulate many circuits.

Some of the popular circuit simulators are as follows:

1. ASTAP
2. Advanced Design System
3. CircuitLogix
4. CPU Sim
5. GNU Circuit Analysis Package
6. Gpsim
7. ICAP/4
8. Logisim
9. Micro-Cap
10. NI Multisim
11. National Instruments Electronics Workbench Group
12. Ngspice
13. PSpice
14. PowerEsim
15. Quite Universal Circuit Simulator
16. SPICE
17. SapWin
18. SmartSpice
19. SNAP
20. Spectre Circuit Simulator
21. SpectreRF

These simulators differ from one another and are generally application-specific. The most popular SPICE simulators for analog circuit simulation are PSpice, originally offered by MicroSim and now incorporated into Cadence OrCAD, and National Instruments Multisim.

Being Single

Well, after going through an article called "The 50 Best Things about Being Single", I was thinking: does it really hurt to be single? Let's talk about the other side of the mirror – being single may be fun. We just have to ask: is it really so much fun being in a relationship? The soul of a healthy relationship lies in better understanding. Your best friend could be in the relationship from hell, another friend could be in a monotonous and uneventful relationship, yet people still feel sorry for being single. Is something wrong here? They might even imply that something's "wrong" with them if they're single. Single life can be the most exciting time of your life. It's just a matter of attitude, so what really matters is your attitude towards life. Change the attitude and you will never feel sorry for being single, and you can say out loud with pride, "I am single and loving it." And that's not the only reason – I can name a lot of reasons, some of which follow.

Being single, we can always focus on friendship. Being single doesn't have to mean being lonely. When you're single, you have more time to do a variety of things, all of which are opportunities for new friendships. Even if you're an introvert, this can be an excellent time to tap into your extroverted side. Make it a priority in your life to create meaningful friendships and enrich your existing ones. Being single, "You can spend as much time as you want with your friends and family and nobody's lip will drag the ground."

Generally, people's advice about relationships dictates that compromise and sacrifice are essential to a healthy relationship. If you've been in a relationship before, you perhaps realize how much stuff you had to give up and what you paid in order to make that relationship work. Or maybe you forgot about that stuff, because you're focused on the things you miss. Well, this is a good time to shift that focus. If you're an unsystematic person, isn't it great not to have to be systematic for someone else's sake? If you're a neat freak, isn't it wonderful to be able to organize everything and find it the way you left it? Isn't it nice to be able to work in ways a partner might be averse to? Isn't it cool to be able to go out spontaneously, without wondering whether your partner can or should be invited along? A relationship can add many good things to your life, but it also adds rigidity and many constraints, so take the time to appreciate your current flexibility. Being single, "You can have eight hours of undisturbed sleep with the covers all to yourself." And "You can be happy with who you are, not who he or she wants you to be."

Relationships tend to come along with a lot of planning; for example, you can't just accept a job across the country without touching base with your partner. And generally, if you're in it for the long term, you'll likely talk about what you'll be doing years from now. But when you're single, the future is completely open. Today you're at your desk, and a year from now you might be somewhere else, doing something new. So, being single, "You don't have to deal with anyone's grumpy, moody personalities."

Being single, you can always enjoy your freedom. Everybody has their own dreams and desires, and the chance of pairing up with someone who shares such a fantasy with equal fervour is not something to hold your breath for. So what are you waiting for? Find some people who have the same idea, or just go for it alone, and you'll meet like-minded people along the way. There might be some adventurous endeavours you want to indulge in. Being single, "The TV Guide crossword puzzle is YOURS, ALL YOURS."

And finally, you can spoil yourself. Being single gives you a lot more "you" time. Go spoil yourself! Do whatever you love to do. Just like this:

“You can go out and flirt as much as your heart desires, without a worry in the world.”

So enjoy your singledom and just say, "I love being single."

“INS ARIHANT” INDIA’s First Nuclear Submarine

India launched its first nuclear-powered submarine at a ceremony in the southern port city of Visakhapatnam on 26 July 2009, becoming one of just six nations in the world to have successfully built one. According to the official news release, the 367-foot-long INS Arihant means "Destroyer of the Enemies" in Hindi; the name has its origins in the Jain religion, and unofficial news reports render it as "Destroyer of Enemies", omitting the definite article. Besides the US, which has 74 nuclear submarines, Russia (45), the UK (13), France (10) and China (10) also possess nuclear-powered submarines – the US has nearly as many nuclear submarines as all other countries combined.

India is a nation that struggled to enter the select group of countries that build nuclear-powered submarines. Its ATV (Advanced Technology Vessel) program was initiated in 1974, but after three decades it had not produced results that could change the picture of navies with nuclear propulsion.

The INS Arihant, India's first nuclear submarine, until now known by the code name S 2, was launched at a simple ceremony in the port town of Visakhapatnam [Vizag] with the traditional breaking of a coconut on its hull by Prime Minister Manmohan Singh's wife, Gursharan Kaur. It was expected to be ready for induction into the Navy by 2011 after a series of exhaustive trials.

The launch ceremony was attended by the Prime Minister, Dr. Manmohan Singh, accompanied by Smt. Gursharan Kaur; Raksha Mantri Shri A.K. Antony; Chief Minister of Andhra Pradesh Dr. Y.S. Rajasekhar Reddy; Raksha Rajya Mantri Shri M.M. Pallam Raju; Minister of State for Human Resource Development Smt. D. Purandareswari; Chief of the Naval Staff Admiral Sureesh Mehta; and high-ranking officials from the Navy, the Department of Atomic Energy, and the Defence Research and Development Organisation.


On this occasion, the Prime Minister congratulated the Director General of the ATV (Advanced Technology Vessel) Programme, Vice Admiral D.S.P. Verma (Retd), and all personnel associated with it for achieving this historic milestone in the country's defence preparedness. He noted that they had overcome several hurdles and barriers to enable the country to acquire self-reliance in the most advanced areas of defence technology. The Prime Minister made a special mention of the cooperation extended by Russia. He stated that the Government is fully committed to ensuring the defence of our national interests and the protection of our territorial integrity, and would render all support to the constant modernization of our defence forces and to ensuring that they remain at the cutting edge of technology.

The project director, Vice Admiral (Retd) D.S.P. Verma, said that the Arihant is a 6,000-tonne submarine with a length of 110 meters and a breadth of 11 meters. The length is about 10 percent longer than previously published estimates, while the 11-meter beam is much less than the 15 meters of previous unofficial estimates. Experts say the vessel will be able to carry 12 K-15 submarine-launched ballistic missiles that have a range of over 700 km. The Indian nuclear-powered attack submarine design was said in some reports to have a 4,000-ton displacement and a single-shaft nuclear power plant of Indian origin; by other accounts it would displace 9,400 tons submerged and be 124 meters long.

The MoD/PMO decided not to release any photographs of the submarine, and no filming or photography by the media was permitted inside the Matsya Dock. One report stated that the submarine was visibly based on the Russian Borei-class SSBN, and claimed that the official invitation carried a silhouette of the submarine indicating that it is almost certainly based on the Borei. But the 935 Borei has a length of 170 meters (580 feet), a beam of 13 meters (42 feet), and a displacement of 11,750-12,250 tons surfaced and 17,000 tons submerged.

India has been working actively since 1985 to develop an indigenously constructed nuclear-powered submarine, one possibly based on elements of the Soviet Charlie II-class design, detailed drawings of which are said to have been obtained from the Soviet Union in 1989. This project illustrates India's industrial capabilities and weaknesses. The secretive Advanced Technology Vessel (ATV) project to provide nuclear propulsion for Indian submarines has been one of India's more ill-managed projects. With the participation of Russian scientists and technicians in the various phases of the program came the possibility that the first Indian submarine with nuclear propulsion could be operational in 2009, having been launched in 2006-2007.

Although India has the capability to build the hull and develop or acquire the necessary sensors, its industry had been stymied by several system integration and fabrication problems in trying to downsize a 190 MW pressurized water reactor (PWR) to fit into the space available within the submarine's hull. The Prototype Testing Centre (PTC) at the Indira Gandhi Centre for Atomic Research, Kalpakkam, was used to test the submarine's turbines and propellers. A similar facility is operational at Visakhapatnam to test the main turbines and gearbox.

In 1998, L&T began fabricating the hull of the ATV, but the struggle with the reactor continued. After BARC designs failed, India bought reactor designs from Russia. By 2004 the reactor had been built, tested on land at the IGCAR, and had gone critical. Its modest size, around 6,000 tons (the Ohio-class SSBN in the movie Crimson Tide weighs over 14,000 tons), led experts to call it a "baby boomer".

India had ample experience building Pressurised Heavy Water Reactors (PHWRs) using natural, un-enriched uranium as fuel and heavy water as moderator and coolant. But this was the first time that India had built a PWR using enriched uranium as fuel and light water as both coolant and moderator. The electrical power reactors that India would be importing (potentially from Russia, France, and the US) would also be PWRs with enriched uranium as fuel and light water as both coolant and moderator. Naval nuclear reactors typically use uranium enriched to much higher levels than is the case with shore-based power reactors.

While the present project reportedly ends at three units, defence officials have not ruled out building larger submarines on the basis of national strategic imperatives, which have changed since the conception of the project. By the time the first unit was launched in July 2009, construction of the hull for the next one was reportedly already underway at the Larsen & Toubro (L&T) facility at Hazira where the first hull was built. The cost of the three submarines was reported at over Rs 3,000 crore, over US$600,000,000 [the Indian numbering system is denominated in crore (1,00,00,000) and lakh (1,00,000), so Rs 3,000 crore is Rs 30,000,000,000, or US$623,104,807.77 on the day INS Arihant was launched]. Another report said that the first submarine alone had cost Rs 14,000 crore [US$2.9 billion]. In April 2006, the larger American Virginia-class subs were priced at $2.4 billion apiece, at which time the goal was to cut the program's cost to about $2 billion per sub (a baseline expressed in fiscal 2005 dollars). As of late 2008 the procurement cost for the first three units of the British Astute-class SSN was forecast at £3,806 M (outturn prices) [about US$6.275 billion at 2009 conversion rates], for a unit cost of about US$2.1 billion.

The three submarines would be based at a facility being developed at Rambilli, close to Visakhapatnam, where hundreds of acres of land had already been acquired. The Indian Navy hoped to commission the base by 2011, in time for INS Arihant's commissioning; two of these submarines would be at sea at any given time while the third would be in maintenance at the base. Other reports claim that India plans to build a fleet of five nuclear-powered submarines. One report in 2009 stated that the government had given clearance for the construction of much bigger SSBNs, nuclear-powered submarines capable of launching ballistic missiles, each costing about $2 billion (approximately Rs 10,000 crore). This would take off once the three Arihant-class submarines were ready.

By 2004 it was reported that the first ATV would be launched by 2007. At that time it was reported that it would be an SSGN displacing some 6,500 tons, with a design derived from Russia's Project 885 Severodvinsk-class (Yasen) SSN. The ATV multirole platform would be employed for carrying out long-distance interdiction and surveillance of both submerged targets and principal surface combatants. It would also facilitate Special Forces operations by covertly landing such forces ashore. The ATV pressure hull would be fabricated from HY-80 steel obtained from Russia.

This would give it a multi-role capability: it could carry medium-range cruise missiles (1,000 km), short-range ballistic missiles (300 km), torpedoes and mines, besides taking part in special operations.

The ATV is said to be a modified Akula-I class submarine. The Russian Akula-II and Yasen are also modified Akula-Is. By this line of reasoning the ATV would be in the league of the Yasen, so the ATV would be 6,500 tons light, 8,500 tons armed and surfaced, and 10,000 tons submerged. It would be the biggest and heaviest combat naval vessel built in India to date.

The 100-member crew, which will man the submarine, was trained at an indigenously-developed simulator in the School for Advanced Underwater Warfare (SAUW) at the naval base in Vizag. Hands-on training will be done on the INS Chakra, a 12,000-tonne Akula-II class nuclear-powered attack submarine being taken on a 10-year lease from Russia. SBC in Vizag is to become the assembly line for three ATVs, costing a little over Rs 3,000 crore each or the cost of a 37,000 ton indigenous aircraft carrier built at the Cochin Shipyard. Larsen and Toubro (L&T) has begun building the hull of the second ATV at its facility in Hazira, to be inducted into the navy by 2012.

As of 2007 the first of the five long-delayed ATVs was scheduled to be fully ready by 2010 or so. In August 2008 it was reported that on January 26, 2009, the sluice gates of an enclosed dry-dock in Visakhapatnam were to be opened and the world was to take its first look at India's first nuclear-powered submarine, the Advanced Technology Vessel (ATV), as it entered the waters.

In February 2009 defence minister A K Antony confirmed that India's nuclear-powered submarine was in the final stages. "The Advanced Technology Vessel (ATV) project is in the final stage. We had some problems with the raw material in the initial phase. But now the project is in its final stage," he said at the ongoing Aero India show. This was a rare admission by the defence minister – not only of the existence of the secretive project to build an indigenous nuclear submarine, but also of its developmental status. The submarine, modelled on the Russian Charlie-class submarine, was slated for sea trials in 2009. Officials in the Navy and the atomic energy department were hopeful of meeting the deadline this time. In the long run, the government plans to buy three nuclear submarines to provide the Navy with the capability to stay underwater for a very long time. Though defence and nuclear scientists have been working on this project since 1985, they had initial setbacks with the material and with miniaturisation of the nuclear reactor which will be fitted into the submarine's hull.

"Do Business with Style": cool new templates for Microsoft Office 2007

Are you bored of the traditional old Microsoft Office environment? Then here is something new. Microsoft Small Business is now giving away some cool free designer templates for Office 2007 for Windows and Office 2008 for Mac. Best of all, they come in 9 colour schemes. The six templates comprise a spreadsheet, presentation, invoice, letterhead, business card and newsletter blast. You can also get a 60-day free trial of Microsoft Office if you don't have it. Just click the link:

Microsoft Office 2007 free templates

Don't forget that these templates will only work if you have Office (2007 for Windows or 2008 for Mac), because they are in the new formats (.docx, .xlsx, etc.). The templates are zip files and you can download them directly. No verification required.

List of Approved Post Graduate Education and Research institutions for M Tech & M E

Here is a link with information about M.Tech and M.E. seats in various colleges in India. Although the document is a little old, it is very helpful for a basic idea of the available courses and seats. Click the link below to download the document.

List of Approved Post Graduate Education and Research institutions Upto 30th September, 2004 (M.E/M.Tech)

Save Energy Go green


Not with envy, but kindness to the environment, and alertness to your electricity bills
You leave your PC on all day. But you care for the environment, so you switch off the monitor. Good move, but did you know you are still wasting about 45 Watts with the CPU running?
That’s what Tufts University’s Climate Initiative says. And it also says that if you leave your PC on for the entire day, 850-1500 pounds of carbon dioxide is released into the atmosphere a year. And this means that you need 60-300 trees to absorb that much CO2 in a year.
That does get you thinking, doesn't it? Now, all that noise about climate change because of our insensitivity to the environment starts to make sense. We cut down trees, we waste electricity, we replace cellphones and gadgets with the latest ones, without bothering about what really happens to the old ones. While there could be debates on how much all this really impacts our environment, most of us know intuitively that what we are doing mindlessly is really not right, and is likely to have negative repercussions.
That is why, the world over, the word Green is becoming red hot. Green computing is in. This means that you start to use computing resources efficiently. It's not just about being good to Mother Nature, but also about saving a lot of money being spent on electricity. For companies that have thousands and thousands of computers running, and datacenters keeping their businesses up and running, all this can add up to a huge fortune. In fact, going green is now gaining so much momentum that those companies which do not have green computing initiatives are seen as enemies of the environment.
So how does it matter to you? You might have one PC and a laptop at home, in addition to the multiple electronic devices you run. And you might even be considering another PC for the little one. Think if you really need all those PCs. Buy only if you have to. While buying, remember that laptops consume less power, so it could be wiser to go the portable way. Or if you need to look at a PC, opt for monitors that consume less power. LCD monitors need much less power than CRT ones. Look for Energy Star ratings and save energy.
Explore the power saving options of your PC and customize them to suit the way you work. If you often go away from your PC for a long time, you can set the monitor and hard disk to be switched off after a few minutes of inactivity.
To dispose of old PCs and gadgets, get in touch with NGOs working in this area and figure out the best way to do so. Or if you own a branded PC or gadget, get in touch with the company and ask how this e-waste can be managed. Most computer vendors and cellphone makers have green initiatives on, so you might get some help.
These are still early days for green initiatives for companies in India. But if you and I make a start, the rest will fall in place.

ISRO all set for Chandrayaan II in 2012, Mars mission in 2013

24 December 2008
Indian Space Research Organisation (ISRO) is all set for its second moon mission, Chandrayaan II, and space scientists plan to send a robot to the moon in 2012, followed by a spacecraft to Mars in 2013.

ISRO will also send a man into space aboard a Russian spaceship the same year, its chairman G Madhavan Nair said.

Besides, ISRO has lined up a slew of missions, which also include landing a spacecraft on an asteroid and sending a probe to fly past a comet, the ISRO chief told reporters at a function organised by the Confederation of Indian Industry (CII) to felicitate the members of the Chandrayaan I team.

"Chandrayaan II, the design is complete, we hope by 2012, we will be ready for the launch," Madhavan Nair said.

The launch of Chandrayaan-II, approved by the government, will include a rover that will land on the moon. The mission will map a three-dimensional atlas of the moon and analyse the chemical and mineral composition of the lunar surface.

India hopes to send an astronaut into space by 2013 and a manned mission to the moon by 2020.

India, which started its space programme in 1963, now accounts for around 16 satellites currently in Earth orbit.

India also has the world’s largest constellation of seven earth-observation satellites, which is being used for telecommunications, TV broadcasting, earth observation, weather forecasting, remote education and healthcare.

Nair, however, said the mission to Mars is still at a conceptual stage and ISRO expects to finalise plans by next year, with take-off in 2013. He said the same Geosynchronous Satellite Launch Vehicle (GSLV) will be used to launch the probe to Mars.

"Semulation" an Introduction

Semulation is a computer science-related neologism that combines simulation and emulation. It is the process of controlling an emulation through a simulator.

Semulation in computer science

Digital hardware is described using hardware description languages (HDLs) like VHDL, Verilog or SystemVerilog. These descriptions are simulated together with a problem-specific testbench. The initial functional verification of most IP designs is done via simulation at register transfer level (RTL) or gate level. In an event-driven simulation the code must be processed sequentially by a CPU, because a normal computer cannot process the described hardware in parallel. This sequential approach leads to long simulation times, especially for complex system-on-chip (SoC) designs.
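
To illustrate why event-driven HDL simulation is inherently sequential, here is a toy Python sketch of an event-driven kernel (a deliberately simplified model, not an actual HDL simulator): every signal change is an event, and a single loop pops and processes the events one at a time in simulation-time order.

```python
import heapq

events = []                       # priority queue of (time, signal, value) events
signals = {"clk": 0, "q": 0}      # current signal values

def schedule(time, signal, value):
    heapq.heappush(events, (time, signal, value))

def on_change(time, signal, value):
    """Tiny 'design': q toggles on every rising edge of clk."""
    if signal == "clk" and value == 1:
        schedule(time, "q", 1 - signals["q"])

# Stimulus ("testbench"): a clock toggling every 5 time units.
for t in range(0, 50, 5):
    schedule(t, "clk", (t // 5) % 2)

# The sequential bottleneck: one event processed per loop iteration.
while events:
    t, sig, val = heapq.heappop(events)
    if signals[sig] != val:
        signals[sig] = val
        on_change(t, sig, val)
        print(f"t={t:2d}: {sig} -> {val}")
```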

After simulation, the RTL description must be synthesized to fit the final hardware (e.g. FPGA, ASIC). This step brings a lot of uncertainty because the real hardware is normally not as ideal as the simulation model. The differences between the real world and simulation are a major reason why emulation is used in hardware design.

Generally the simulation and emulation environments are two independent systems. Semulation is a symbiosis of both methods: one part of a hardware design is processed sequentially in software (e.g. the testbench) while the other part is emulated in hardware.

An example design flow for semulation is depicted in the following block chart:

[Block chart: semulation design flow]

The database holds the design and testbench files along with the information about each block, namely whether it will be simulated or emulated. The left part shows the normal simulation path, where the design files must be compiled for an HDL simulator. The right part of the chart handles the flow for the emulation system: design files for the FPGA must be synthesized to the appropriate target technology. A major point in semulation is the connection between the emulation system and the HDL simulator; this interface is necessary for the simulator to drive the connected hardware.

Advantages of Semulation

  • Simulation acceleration: Simulating huge designs with an HDL simulator is a tedious task. When the designer transfers parts of the design to an emulation system and co-simulates them with the HDL simulation, the simulation run times can be decreased.
  • Using real hardware early in the design flow.

Virtual Instrumentation

Virtual Instrumentation is the use of customizable software and modular measurement hardware to create user-defined measurement systems, called virtual instruments.

Traditional hardware instrumentation systems are made up of pre-defined hardware components, such as digital multimeters and oscilloscopes, that are completely specific to their stimulus, analysis, or measurement function. Because of their hard-coded function, these systems are more limited in their versatility than virtual instrumentation systems. The primary difference between hardware instrumentation and virtual instrumentation is that software is used to replace a large amount of hardware. The software enables complex and expensive hardware to be replaced by already-purchased computer hardware; e.g. an analog-to-digital converter can act as the hardware complement of a virtual oscilloscope, and a potentiostat enables frequency-response acquisition and analysis in electrochemical impedance spectroscopy with virtual instrumentation.
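
As a toy illustration of that software-defined idea, here is a minimal Python sketch of a "virtual voltmeter": generic digitized samples (synthesized here, standing in for an ADC or data-acquisition card) are turned into measurements entirely in software. The sampling rate and test signal are hypothetical:

```python
import numpy as np

# Stand-in for an ADC: in a real virtual instrument these samples would come
# from acquisition hardware; here we synthesize a 50 Hz, 1 V-amplitude sine.
fs = 10_000                        # sampling rate, Hz (hypothetical)
t = np.arange(0, 0.2, 1 / fs)      # 200 ms of samples
samples = 1.0 * np.sin(2 * np.pi * 50 * t)

# The "instrument" is just software: different analyses give different instruments.
def dc_voltmeter(x):
    return x.mean()

def ac_rms_voltmeter(x):
    return np.sqrt(np.mean((x - x.mean()) ** 2))

print(f"DC reading: {dc_voltmeter(samples):+.4f} V")     # ~0 V for a pure sine
print(f"AC RMS:     {ac_rms_voltmeter(samples):.4f} V")  # ~0.707 V for 1 V amplitude
```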

The concept of a synthetic instrument is a subset of the virtual instrument concept. A synthetic instrument is a kind of virtual instrument that is purely software defined. A synthetic instrument performs a specific synthesis, analysis, or measurement function on completely generic, measurement agnostic hardware. Virtual instruments can still have measurement specific hardware, and tend to emphasize modular hardware approaches that facilitate this specificity. Hardware supporting synthetic instruments is by definition not specific to the measurement, nor is it necessarily (or usually) modular.

Leveraging commercially available technologies, such as the PC and the analog to digital converter, virtual instrumentation has grown significantly since its inception in the late 1970s. Additionally, software packages like National Instruments’ LabVIEW and other graphical programming languages helped grow adoption by making it easier for non-programmers to develop systems.