Friday, December 23, 2011

The Evolution of the Microprocessor


The first programmable microprocessor — the Intel® 4004 — made its debut in 1971 in a business calculator. Since then, multiple generations of Intel microprocessors have gone on to be the brains in a variety of everyday products, from gas pumps and traffic light controllers to some of history’s most profound moments, like the Apollo space missions and medical research into the human genome.

It would take up to one million original Intel 4004 chips to provide the effortless computing power we have all come to expect from today’s Ultrabook. By our calculation, that would make today’s laptop measure roughly 23 feet by 10 feet, and it would consume 4,000 times more energy than a modern laptop and cost about $150,000 a year to power. Cheers to progress!

The dramatic evolution of computing over the past few decades has unleashed wave after wave of innovation. Yet, Intel believes we are still at the very early stages in the evolution of computing. Fueled by the relentless advancement of Moore’s Law, the pace of technological innovation is, in fact, accelerating. Intel believes the sheer number of advances in the next 40 years will equal all of the innovative activity that has taken place over the last 10,000 years of human history. Intel envisions billions of connected people, and trillions of connected electronic and electromechanical devices, creating the so-called “Internet of things”. Today, technology is no longer the limiting factor. We are limited only by our own imagination, so the really important question is, “What do you want from the future of computing?”

Sunday, November 6, 2011

What is a Motherboard?


Hi Friends,
How are you all,

I hope you enjoyed reading the details about processors. Today I am gonna elaborate on one of the most vital components of a computer, the MOTHERBOARD.

In today's article I will briefly explain what a motherboard is, cover some other details about motherboards, and also help you select the one best suited to you.

What is a motherboard?

A motherboard is the central or primary circuit board making up a complex electronic system, such as a modern computer. It is also known as a mainboard, baseboard, system board, planar board or, on Apple computers, a logic board, and is sometimes abbreviated as mobo.

The motherboard of a typical desktop consists of a large printed circuit board. It holds electronic components and interconnects, as well as physical connectors. It often consists of two components or chips known as the Northbridge and Southbridge, though they may also be integrated into a single component. These chips determine, to an extent, the features and capabilities of the motherboard.

Most motherboards include, at a minimum:

sockets (or slots) in which one or more microprocessors are installed.
slots into which the system's main memory is installed (typically in the form of DIMM modules containing DRAM chips)
a Chipset which forms an interface between the CPU's front-side bus, main memory, and peripheral buses
non-volatile memory chips (usually Flash ROM in modern motherboards) containing the system's firmware and BIOS
a clock generator which produces the system clock signal to synchronize the various components
slots for expansion cards (these interface to the system via the buses supported by the chipset)
power connectors and circuits, which receive electrical power from the computer power supply and distribute it to the CPU, chipset, main memory, and expansion cards.
Additionally, nearly all motherboards include logic and connectors to support commonly used input devices, such as PS/2 connectors for a mouse and keyboard.

There are a lot of motherboards on the market to choose from. The big question is, how do you go about choosing which one is right for you?

The first factor to think about concerning motherboards is the size, or form factor. The most popular motherboard form factor today is ATX, which evolved from its predecessor, the Baby AT, a smaller version of the AT (Advanced Technology) form factor.

The important differentiator is which CPU the board supports. Two of the biggest makers of CPUs at the moment are Intel and AMD, yet you cannot buy a motherboard that supports both: it will support one or the other, due to physical differences in the connectors. Furthermore, you must choose a specific type of processor; for example, an AMD Athlon 64 or an Intel Core 2 Duo.

Chipsets are a crucial part of a motherboard: a chipset supports the facilities offered by the processor. A chipset is part of the motherboard and cannot be upgraded without upgrading the whole board. There are a few main producers of chipsets: AMD, Intel, NVIDIA and VIA. The latter two make chipsets for both AMD and Intel processors; AMD and Intel only make chipsets compatible with their own processors.


The next thing is how much RAM you want. RAM, or Random Access Memory, is the main memory in a computer, and is used mainly to store information that is being actively used or that changes often. It is always wise to choose a motherboard that can support more RAM than you currently need. For example, if you want 512MB of RAM in your computer, it would be wise to buy a motherboard that supports at least 1GB of RAM (many now support 4GB). This is simply to help make your computer ‘future proof': if you need to upgrade your memory, you will not need to upgrade your motherboard too.

You are likely to want various expansion cards (such as graphics cards, sound cards and so on). These components tend to have physically different connectors. The PCI-E slot is the most common graphics card interface nowadays, but the AGP slot is still in use.

Aside from the main differences I have covered, there are a few more details to consider. All motherboards have USB sockets for peripheral devices. You also need to ensure that your motherboard has the right connectors for your drives (hard drive, CD-ROM drive, etc.), which are generally SATA and IDE.

Unless you have limitless resources, price is always a consideration when buying computer components. A motherboard usually takes up a fairly large part of any PC budget, so it requires careful consideration. A cheap motherboard may be unreliable and more trouble than it is worth. A motherboard is one of those components where it pays to spend a little extra.

Finally, try to buy from a reputable retailer; it is always worth doing so just in case you have any problems.

That's it for today, I hope you got a brief idea about how you should go about buying a motherboard and a CPU.

Please feel free to comment or write views on the above article. It will help me do better.

Image of Motherboard will be uploaded soon

Processor War

Hello Friends,

I hope you had fun reading the history of computers.

Taking a leaf out of that history, I bring you details about the heart of the computer, the PROCESSOR.

The two most prominent chipmakers in the world, INTEL and AMD, are slugging it out to provide the best possible processors in today's world. With time the companies have moved ahead from single-core CPUs to quad-core CPUs, with 8-core CPUs in the pipeline.

Just 3 years back we were arguing about the performance difference between single-core and dual-core CPUs. Now, with Intel's dual-core becoming mainstream in less than a year, we have already begun to look to the future and speculate about the performance increase of dual versus quad cores.

What exactly are single-core, dual-core and quad-core CPUs?

Let me explain each of them in brief

Single Core : In a single-core or traditional processor the CPU is fed strings of instructions it must order, execute, then selectively store in its cache for quick retrieval. When data outside the cache is required, it is retrieved through the system bus from random access memory (RAM) or from storage devices. Accessing these slows down performance to the maximum speed the bus, RAM or storage device will allow, which is far slower than the speed of the CPU. The situation is compounded when multi-tasking. In this case the processor must switch back and forth between two or more sets of data streams and programs. CPU resources are depleted and performance suffers.

Dual Core : A dual-core processor is a single chip that contains two distinct processors or "execution cores" in the same integrated circuit. Each core handles incoming data strings simultaneously to improve efficiency.
A dual core processor is different from a multi-processor system.

Multi Core refers to two or more CPUs working together on one single chip (like the AMD Athlon X2 or Intel Core Duo), in contrast to DUAL CPU, which refers to two separate CPUs working together.

Quad Core : In this processor, there are four complete execution cores in a single processor, with up to 12 MB of L2 cache and up to a 1333 MHz Front Side Bus. Four dedicated physical cores help the OS and applications deliver additional multi-tasking and multi-threaded performance. More instructions can be carried out per clock cycle, shorter and wider pipelines execute commands more quickly, and improved bus lanes move data through the system faster.

To take full advantage of more than one core, operating systems as well as applications should be developed so that they can utilize the processor's multiple cores, thereby improving performance.
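
As a rough illustration of what "utilizing multiple cores" means in practice, here is a minimal Python sketch that spreads CPU-bound work across all available cores; the prime-counting workload and chunk sizes are invented purely for demonstration and are not tied to any benchmark mentioned above.

# A minimal sketch: split CPU-bound work across cores with Python's
# multiprocessing module. The workload and numbers below are illustrative only.
from multiprocessing import Pool
import os

def count_primes(limit):
    # Count primes below 'limit' the slow way, just to keep a core busy.
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [50000, 50000, 50000, 50000]         # one chunk of work per task
    with Pool(processes=os.cpu_count()) as pool:  # one worker per available core
        results = pool.map(count_primes, chunks)  # chunks run in parallel
    print("primes counted per chunk:", results)

On a single-core CPU these chunks would run one after another; on a dual- or quad-core CPU the pool runs them side by side, which is exactly the gain the multi-core designs above are chasing.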

The obvious question that comes to every computer user will be:

What sort of processor do I need to buy?

The answer to this is quite straightforward.

In today's world we usually run two or three applications at a time. While surfing the web, we listen to music, burn a DVD and run other applications as well. A single-core CPU won't be able to perform these tasks efficiently, so the best option would be to go for a dual-core or quad-core CPU.

Value-conscious users who want the best of both worlds should go for a higher-end dual-core CPU, as these can easily be overclocked to meet all the basic requirements.

Those with no budget constraints can go for the best quad-core processors available on the market.

For users interested in overclocking their machines for the best performance, quad cores are the way to go, while those who don't like to tinker with their machines may well conclude that dual cores are better, as the lab tests mentioned below suggest.

Games, which usually run about 2 years behind hardware innovations because of their long development cycles, are just now starting to cater to dual-core CPUs. However, with games like Unreal Tournament 2007 and Alan Wake looking to take advantage of multiple cores, it makes sense for individuals who want to play the next wave of next-gen games to look to the quad-core environment. Also, the ratio of productivity to power consumption is better in quad-core machines versus dual-core ones.

I think it won't be until the end of '07 or into '08 that we really see common programs and apps, as well as games, truly become multi-threaded.

We're not yet at a point of diminishing returns, where adding more cores will cease to improve performance, so the switch should be even more likely. As far as upgradability goes, dual-cores can't be upgraded to quads, but the new motherboards coming out will be able to support dual quad-core configurations, potentially offering up to 8 cores within a single machine.

While programs today don't really take advantage of quad-core CPUs unless they are extremely multi-threaded, they eventually will.

In my personal opinion, if you are a gaming freak it is better to go for a quad core for the reasons mentioned above. Those who are more into software development and heavy application use need not worry either, as buying a quad core should ensure that they don't have to upgrade their computers for 5 to 7 years.

A recent XBIT Labs review pitted a 3 GHz dual-core processor against a 2.4 GHz quad-core processor priced almost the same. Though the performance difference is not huge, it has to be kept in mind that today's applications and games are still lacking in terms of utilizing all the cores. One can expect all this to change in the near future, as near as the end of 2008 or the start of 2009.

Here is the link for the battle between the two processors.

http://www.xbitlabs.com/articles/cpu/display/core2quad-q6600.html

Finally, it all boils down to what you need and how are you gonna utilize the processor so that your money is well spent and you get the most out of your CPUs.

There is another report in which the dual-core and quad-core processors of INTEL and AMD are compared.

Find it out at http://techreport.com/articles.x/12091/1

Not surprisingly, INTEL wins comprehensively, considering they are working on 65 nm as well as 45 nm processors. AMD has a long way to go to catch up with its arch rival; till then it can consolidate its position as the #2 chipmaker in the world.

I hope you enjoyed reading about the processors. If you wish to make any comments or give your personal views you are most welcome to do so, as they are very valuable to me.

So have a good day, I look forward to hearing from you soon.

Computer History Part 3

Hey, Greetings to my friends.

How are you all. Hope you are having a good time.

Continuing from my last post on computer history, I bring you the last part of the history of computers.

This is gonna be a long post, so grab some popcorn while you relish the happenings in the computer world since 1990.

So get ready for the ride and fasten your seat belts, as it's gonna be one hell of a ride.

1990 Tim Berners-Lee, working with Robert Cailliau at CERN, proposes a 'hypertext' system, which is the start of the World Wide Web as we know it today.

1990 Microsoft releases Windows 3.0, a completely new version of Microsoft Windows. The version will sell more than 3 million copies in one year.

1990 Microsoft exceeds $1 billion in sales and becomes the first software company to do so.

1990 The World, the first commercial Internet dial-up access provider comes online.

1990 Microsoft and IBM stop working together to develop operating systems.

1990 The first search engine, Archie, written by Alan Emtage, Bill Heelan, and Mike Parker at McGill University in Montreal, Canada, is released on September 10, 1990.

1991 Linux is introduced by Linus Torvalds in 1991.

1991 The World Wide Web is launched to the public on August 6, 1991. Tim Berners-Lee, a scientist at the European Particle Physics Laboratory (CERN) in Geneva, Switzerland, develops the Web as a research tool.

1991 Following its decision not to develop operating systems cooperatively with IBM, Microsoft renames its version of OS/2 to Windows NT.

1992 Open Database Connectivity (ODBC) is developed by SQL Access Group.

1993 Intel releases the Pentium Processor. The processor is a 60 MHz processor, incorporates 3.2 million transistors and sells for $878.00.

1994 Microsoft releases its beta for Windows 95, code named Chicago.

1994 YAHOO is created in April, 1994.

1994 MS-DOS 6.22 is released in April 1994.

1995 The dot-com boom starts.

1995 The first VoIP software (Vocaltec) is released allowing end users to make voice calls over the Internet.

1995 Apple develops FireWire.

1995 Microsoft Releases Windows 95, within four days the software sells more than 1 million copies.

1995 One of the largest and well known e-commerce sites today opens its website for the first time. Amazon.com is officially opened July of 1995.

1996 Google is first developed by Sergey Brin and Larry Page.

1996 Microsoft introduces DirectX.

1996 Sony enters the PC market with the release of VAIO.

1997 IEEE releases 802.11 (WiFi) standard.

1997 Digital Video Discs / Digital Versatile Discs (DVDs) go on sale.

1997 Microsoft releases Microsoft Office 97.

1997 Microsoft announces Windows 98.

1997 The Li-Ion battery begins to be used commercially.

1998 Intel releases the Celeron processor.

1998 Bill Gates is hit in the face with a cream pie.

1998 MySQL is introduced.

1999 The D programming language starts development.

1999 On December 1, 1999 the most expensive Internet domain was sold by Marc Ostrofsky for $7.5 Million.

2000 Microsoft releases Windows ME June 19, 2000.

2000 Microsoft introduces C# to the public in June 2000.

2001 Bill Gates unveils the Xbox on January 7th 2001.

2001 Apple introduces Mac OS X 10.0 code named Cheetah.

2001 USB 2.0 is introduced.

2001 Apple introduces the iPod.

2001 Microsoft Windows XP home and professional editions are released October 25, 2001.

2002 PCI Express is approved as standard.

2002 Microsoft release DirectX 9, December 19, 2002.

2003 Microsoft Windows XP 64-Bit Edition (Version 2003) for Itanium 2 systems is released on March 28, 2003.

2003 The first computer is infected with the spybot worm on April 16, 2003.

2004 Google announces Gmail on April 1, 2004.

2005 YouTube is founded.

2005 Microsoft announces on July 23, 2005 that its next operating system, codenamed "Longhorn", will be named Windows Vista.

2006 Blu-ray is first announced and introduced at the 2006 CES on January 4, 2006.

2006 On January 5, 2006 Intel introduces the Core Duo.

2006 On July 27, 2006 Intel introduces the Core 2 Duo processors.

2006 On November 14, 2006 Microsoft releases its portable Zune media player.

2006 Microsoft releases Microsoft Windows Vista to corporations on November 30, 2006.

2007 Apple announces in January 2007 that it will drop "Computer" from its name, as it has become a company that deals with more than computers.

2007 Apple introduces the iPhone to the public at the January Macworld Conference & Expo.

2007 Microsoft releases Microsoft Windows Vista and Office 2007 to the general public January 30, 2007.

2007 Apple releases the Apple iPhone to the public June 29, 2007.

2007 Intel releases quad-core processors to the public.

I hope you enjoyed the happenings till present.

Please leave your comments and views.

Computer History Part 2

Hello Friends,
I hope you enjoyed the evolution of computers and development of some of the most amazing hardware systems and software.
Now, continuing along the same lines, here is some more history of the world of computers, covering what happened from 1951 to 1989.

1951 The first commercial computer, the Ferranti Mark I, is now functional at Manchester University.

1952 Alexander Sandy Douglas creates the first graphical computer game, a version of Tic-Tac-Toe known as "OXO", on the EDSAC.

1953 IBM introduces the first IBM computer, the 701.

1953 A magnetic memory smaller and faster than existing vacuum tube memories is built at MIT.

1953 The IBM 701 becomes available to the scientific community. A total of 19 are produced and sold.

1954 IBM produces and markets the IBM 650. More than 1,800 of these computers are sold in an eight-year span.

1954 The first version of FORTRAN (formula translator) is published by IBM.

1955 William (Bill) H. Gates is born October 28, 1955.

1955 Bell Labs introduces its first transistor computer. Transistors are faster, smaller and create less heat than traditional vacuum tubes, making these computers more reliable and efficient.

1956 On September 13, 1956 the IBM 305 RAMAC becomes the first computer to be shipped with a hard disk drive, which contained 50 24-inch platters and was capable of storing 5MB of data.

1960 The Common Business-Oriented Language (COBOL) programming language is invented.

1962 Steve Russell creates "SpaceWar!" and releases it in February 1962. This game is considered the first game intended for computers.

1963 IEEE is founded.

1963 The American Standard Code for Information Interchange (ASCII) is developed to standardize data exchange among computers.

1964 Dartmouth College's John Kemeny and Thomas Kurtz develop Beginner's All-purpose Symbolic Instruction Code (BASIC).

1965 Lawrence G. Roberts of MIT performs the first long-distance dial-up connection between a TX-2 computer in Massachusetts and a Q-32 in California.

1967 GPS becomes available for commercial use.

1968 Intel Corporation is founded by Robert Noyce and Gordon Moore.

1969 AT&T Bell Laboratories develops Unix.

1969 AMD is founded.

1970 The U.S. Department of Defense develops Ada, a computer programming language capable of designing missile guidance systems.

1970 Centronics introduces the first dot matrix printer.

1971 Intel introduces the first microprocessor, the Intel 4004.

1971 The first 8" floppy diskette drive is introduced.

1971 FTP is first proposed.

1971 Niklaus Wirth invents the Pascal programming language.

1972 Dennis Ritchie at Bell Labs invents the C programming language.

1972 The compact disc is invented in the United States.

1973 The first VoIP call is made.

1974 IBM develops SEQUEL, which today is known as SQL.

1975 MITS ships one of the first PCs, the Altair 8800, with one kilobyte (KB) of memory. The computer is sold as a mail-order kit for $397.00.

1975 Paul Allen and Bill Gates write the first programming language product for personal computers, a form of BASIC designed for the Altair. Gates later drops out of Harvard and founds Microsoft with Allen.

1976 Steve Wozniak designs the first Apple computer, the Apple I, in 1976; later, Wozniak and Steve Jobs co-found Apple Computer.

1976 The first 5.25-inch floppy disk is invented.

1976 Microsoft introduces an improved version of BASIC.

1977 Apple Computer's Apple II, the first personal computer with color graphics, is demonstrated.

1978 TCP splits into TCP/IP driven by Danny Cohen, David Reed, and John Shoch to support real-time traffic. This allows the creation of UDP.

1979 Oracle introduces the first commercial version of SQL.

1979 The programming language DoD-1 is officially changed to Ada.

1979 The Motorola 68000 is released and is later chosen as the processor for the Apple Macintosh.

1979 Oracle is founded.

1980 IBM hires Microsoft to develop versions of BASIC, FORTRAN, COBOL, and Pascal for the PC being developed by IBM.

1980 Microsoft licenses Unix and starts to develop a PC version, XENIX.

1981 MS-DOS 1.0 is released in August 1981.

1981 VHDL is proposed and begins development.

1981 Adam Osborne introduces the Osborne I, the first successful portable computer, which weighs 25 pounds.

1981 Commodore ships the VIC-20, which later becomes the world's most popular computer, costing only $299.95.

1981 Logitech is founded in Apples, Switzerland.

1982 WordPerfect Corporation introduces WordPerfect 1.0, a word processing program that will become one of the computer market's most popular word processing programs.

1982 Sun is incorporated in February 1982, with four employees.

1982 Lotus Development Corporation is founded and Lotus 1-2-3, a spreadsheet program is introduced.

1982 Compaq Computer Corp. is founded by Rod Canion and other former Texas Instruments Incorporated engineers. Compaq is the first company to introduce a clone of the IBM PC (the Compaq Portable in 1983) and becomes IBM's biggest challenger in the corporate market.

1982 Apple Computer is the first personal computer manufacturer to hit the $1 billion mark for annual sales.

1983 THX is established.

1983 Microsoft Windows is announced in November 1983.

1984 The 3.5-inch floppy diskette is introduced and later becomes an industry standard.

1984 Dell Computer is founded May 3, 1984 in Austin Texas.

1984 The now famous Apple commercial is shown during the Super Bowl; the commercial introduces the Apple Macintosh, a computer with a graphical user interface instead of needing to type in commands. In six months, sales of the computer reach 100,000.

1984 Microsoft introduces MS-DOS 3.0 for the IBM PC AT and MS-DOS 3.1 for networks.

1985 Microsoft Windows 1.0 is introduced in November, 1985 and is initially sold for $100.00.

1986 Pixar is co-founded by Steve Jobs.

1986 The AT or 101 key keyboard is introduced by IBM.

1987 Microsoft introduces Microsoft Works.

1987 The SPARC processor is introduced by Sun.

1987 IBM introduces VGA.

1989 The networking routing protocol OSPF is introduced.


I hope you had a good time reading. Please post your comments and views. It will help me do better in the future.

Computer History Part 1

Hello Friends

I will be starting off with a little bit of computer history.
Here is how computers evolved from a counting board in 300 B.C. to the quad-core systems of 2008.

300 B.C. The counting board, much like the later abacus, is believed to have been
first used by the Babylonians around this time.

1617 John Napier introduces a system called "Napier's Bones." Made from horn,
bone or ivory, the device allows multiplication by adding numbers and
division by subtracting.

1623 The first known workable mechanical calculating machine is invented by
Germany's Wilhelm Schickard.

1642 France's Blaise Pascal invents a machine, called the Pascaline, that can
add, subtract, and carry between digits.

1674 Germany's Gottfried Wilhelm Leibniz creates a machine that can add,
subtract, multiply and divide automatically.

1804 France's Joseph-Marie Jacquard completes his fully automated loom, which is
programmed by punched cards.

1820 Thomas de Colmar creates the first reliable, useful and commercially
successful calculating machine.

1838 Samuel Morse invents a code (later called Morse code) that uses different
sequences of signals to represent the letters of the English alphabet and the ten digits.

1847 Siemens is founded.

1868 Christopher Sholes invents the typewriter in the United States utilizing
the QWERTY keyboard.

1875 Tanaka Seizo-sho is established in Japan and later merges with another
company called Shibaura Seisaku-sho to form Tokyo Shibaura Denki. Later
this company's name is shortened to the name we know today, TOSHIBA.

1876 Scottish-Canadian-American Alexander Graham Bell invents the telephone.

1877 The microphone is invented in the United States by Emile Berliner.

1885 American Telegraph and Telephone company (AT&T) is incorporated.

1896 Herman Hollerith starts the Tabulating Machine Company, which later
becomes the well-known computer company IBM (International Business
Machines).

1911 The company now known as IBM is incorporated on June 15, 1911 in the state of New York as the Computing-Tabulating-Recording Company (C-T-R), a consolidation of the Tabulating Machine Company, the Computing Scale Company, and the International Time Recording Company.

1924 The Computing-Tabulating-Recording Company (C-T-R) is renamed International Business Machines (IBM).

1927 The first publicly demonstrated TV is demonstrated at Bell Telephone Laboratories.

1928 On September 25, 1928, the Galvin Manufacturing Corporation begins; the company will later be known as MOTOROLA.

1930 Galvin Manufacturing Corporation auto radios begin to be sold as an accessory for the automobile. Paul Galvin coins the name Motorola for the company's new products, linking the ideas of motion and radio.

1933 Canon is established.

1934 The FCC is established.

1936 Germany's Konrad Zuse creates the Z1, one of the first binary digital computers and a machine that can be controlled through punch tape.

1936 Dvorak keyboard is developed.

1937 Iowa State College's John Vincent Atanasoff and Clifford Berry begin work on creating the binary-based ABC (Atanasoff-Berry Computer), considered by most to be the first electronic digital computer.

1938 The company now known as Hewlett Packard creates its first product, the HP 200A.

1939 George Stibitz completes the Complex Number Calculator capable of adding, subtracting, multiplying and dividing complex numbers. This device provides a foundation for digital computers.

1939 Iowa State College's John Vincent Atanasoff and Clifford Berry create a prototype of the binary-based ABC (Atanasoff-Berry Computer).

1939 Hewlett Packard is founded by William Hewlett and David Packard. The name is decided by the flip of a coin, though it was David who won the toss.

1940 The first handheld two-way radio, called the "Handy Talkie", is created by Motorola for the U.S. Army Signal Corps.

1941 German Konrad Zuse finishes the Z3, a fully operational calculating machine.

1943 Construction begins on ENIAC (Electronic Numerical Integrator and Computer), the first general-purpose electronic digital calculator. This machine is considered by most to be the first electronic computer.

1944 The relay-based Harvard-IBM MARK I, a large program-controlled calculating machine, provides vital calculations for the U.S. Navy. Grace Hopper becomes its programmer.

1945 The term "computer bug" is coined by Grace Hopper while programming the MARK II.

1946 F.C. Williams applies for a patent on his cathode-ray tube (CRT) storing device, an original form of random-access memory (RAM).

1946 ENIAC computer completed.

1946 Robert Metcalfe is born. He would go on to co-invent Ethernet, found 3Com and formulate Metcalfe's Law.

1947 John Bardeen, Walter Brattain and William Shockley invent the first transistor at the Bell Laboratories.

1948 IBM builds the SSEC (Selective Sequence Electronic Calculator). The computer contains 12,000 tubes.

1948 Andrew Donald Booth creates magnetic drum memory, which is two inches long and two inches wide and capable of holding 10 bits per inch.

1948 The 604 multiplying punch, based upon vacuum tube technology, is produced by IBM.

1949 Claude Shannon builds the first machine that plays chess at the Massachusetts Institute of Technology.

1949 The Harvard MARK III, the first of the MARK machines to use an internally stored program and indirect addressing, goes into operation under the direction of Howard Aiken.

1949 The first computer company, the Electronic Controls Company, is founded by J. Presper Eckert and John Mauchly, the same individuals who helped create the ENIAC computer.

1949 The EDSAC performs its first calculation on May 6, 1949.

1949 The Small-Scale Experimental Machine (SSEM) is fully operational at Manchester University.

1949 The Australian computer CSIRAC is first run.

1950 The first electronic computer is created in Japan by Hideo Yamachito.

1950 Steve Wozniak is born August 11, 1950. He is the co-founder of Apple Computer.

This is a brief history up to 1950.
I will be posting the history from 1951 to 2008 in my next posts.

Please feel free to comment as it will be very valuable to me and help me do better.

Saturday, October 8, 2011

Alien Dalvik 2.0 brings Android apps to iOS devices

Since Android is an open source operating system, it isn't impossible to make other operating systems compatible with applications developed for it, and Myriad has done just that. In fact even on Android phones, applications run on a virtual machine called Dalvik that interprets Android applications that have been compiled to bytecode.

Myriad's Alien Dalvik technology allows Android applications to run on platforms they weren't created for by creating a compatibility layer that makes the device's resources available to the Android application as it expects them.

Now their 2.0 release promises to have Android applications running on "tablets, TVs, automobiles and more", and most impressively on the iPad. This could effectively bridge the gap between different mobile platforms by offering one alternative capable of working across devices. Another alternative is of course HTML5/CSS3/JavaScript; open web technologies have been playing catch-up with native technologies, and for many applications they might be a better choice, considering that such applications can bypass the limitations of the app stores.

For those wondering when they can get this technology on their non-Android phone and access the numerous Android applications out there, we're sorry to say it does not work that way. Alien Dalvik is a technology that device manufacturers can choose to integrate with their device in order for it to take advantage of the Android ecosystem; it does not work in running individual applications on individual phones.

Simon Wilkinson, the CEO of Myriad Group had this to say, "We have seen incredible momentum in Android adoption, but we are just scratching the surface. Digital screens such as Internet- enabled TVs and in-vehicle displays, along with other consumer devices like tablets and e-books are proliferating at an astounding rate. Consumers are driving multimedia evolution and are demanding more converged multi-screen services. With Alien Dalvik 2.0, we are creating a more flexible, consistent user experience by mobilizing content such as live sports, recorded TV shows and on-demand movies, so users can enjoy content seamlessly from one device to the next."

Friday, October 7, 2011

3G

3G refers to the third generation of mobile telephony (that is, cellular) technology. The third generation, as the name suggests, follows two earlier generations.

The first generation (1G) began in the early 80s with commercial deployment of Advanced Mobile Phone Service (AMPS) cellular networks. Early AMPS networks used Frequency Division Multiple Access (FDMA) to carry analog voice over channels in the 800 MHz frequency band.

The second generation (2G) emerged in the 90's when mobile operators deployed two competing digital voice standards. In North America, some operators adopted IS-95, which used Code Division Multiple Access (CDMA) to multiplex up to 64 calls per channel in the 800 MHz band. Across the world, many operators adopted the Global System for Mobile communication (GSM) standard, which used Time Division Multiple Access (TDMA) to multiplex up to 8 calls per channel in the 900 and 1800 MHz bands.

The International Telecommunications Union (ITU) defined the third generation (3G) of mobile telephony standards, IMT-2000, to facilitate growth, increase bandwidth, and support more diverse applications. For example, GSM could deliver not only voice, but also circuit-switched data at speeds up to 14.4 Kbps. But to support mobile multimedia applications, 3G had to deliver packet-switched data with better spectral efficiency, at far greater speeds.

However, to get from 2G to 3G, mobile operators had to make "evolutionary" upgrades to existing networks while simultaneously planning their "revolutionary" new mobile broadband networks. This led to the establishment of two distinct 3G families: 3GPP and 3GPP2.

The 3rd Generation Partnership Project (3GPP) was formed in 1998 to foster deployment of 3G networks that descended from GSM. 3GPP technologies evolved as follows.

• General Packet Radio Service (GPRS) offered speeds up to 114 Kbps.

• Enhanced Data Rates for Global Evolution (EDGE) reached up to 384 Kbps.

• UMTS Wideband CDMA (WCDMA) offered downlink speeds up to 1.92 Mbps.

• High Speed Downlink Packet Access (HSDPA) boosted the downlink to 14 Mbps.

• LTE Evolved UMTS Terrestrial Radio Access (E-UTRA) is aiming for 100 Mbps.

GPRS deployments began in 2000, followed by EDGE in 2003. While these technologies are defined by IMT-2000, they are sometimes called "2.5G" because they did not offer multi-megabit data rates. EDGE has now been superseded by HSDPA (and its uplink partner HSUPA). According to the 3GPP, there were 166 HSDPA networks in 75 countries at the end of 2007. The next step for GSM operators: LTE E-UTRA, based on specifications completed in late 2008.
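
To put those peak rates in perspective, here is a small, purely hypothetical Python calculation of the best-case time to download a 5 MB file at some of the speeds listed above; real-world throughput is always well below the peak.

# Best-case download times for a 5 MB file at the peak rates quoted above.
# Illustrative only; real networks deliver well below their peak rates.
FILE_SIZE_MB = 5
rates_kbps = {
    "GPRS": 114,
    "EDGE": 384,
    "WCDMA": 1920,
    "HSDPA": 14000,
}

file_size_bits = FILE_SIZE_MB * 1024 * 1024 * 8       # megabytes -> bits
for name, rate_kbps in rates_kbps.items():
    seconds = file_size_bits / (rate_kbps * 1000)     # Kbps = 1000 bits per second
    print(name, "about", round(seconds, 1), "seconds")

The jump from roughly six minutes on GPRS to a few seconds at HSDPA peak rates is the difference that made mobile multimedia practical.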

A second organization, the 3rd Generation Partnership Project 2 (3GPP2), was formed to help North American and Asian operators using CDMA2000 transition to 3G. 3GPP2 technologies evolved as follows.

• One Times Radio Transmission Technology (1xRTT) offered speeds up to 144 Kbps.

• Evolution Data Optimized (EV-DO) increased downlink speeds up to 2.4 Mbps.

• EV-DO Rev. A boosted downlink peak speed to 3.1 Mbps and reduced latency.

• EV-DO Rev. B can use 2 to 15 channels, with each downlink peaking at 4.9 Mbps.

• Ultra Mobile Broadband (UMB) was slated to reach 288 Mbps on the downlink.

1xRTT became available in 2002, followed by commercial EV-DO Rev. 0 in 2004. Here again, 1xRTT is referred to as "2.5G" because it served as a transitional step to EV-DO. EV-DO standards were extended twice – Revision A services emerged in 2006 and are now being succeeded by products that use Revision B to increase data rates by transmitting over multiple channels. The 3GPP2's next-generation technology, UMB, may not catch on, as many CDMA operators are now planning to evolve to LTE instead.

In fact, LTE and UMB are often called 4G (fourth generation) technologies because they increase downlink speeds by an order of magnitude. This label is a bit premature because what constitutes "4G" has not yet been standardized. The ITU is currently considering candidate technologies for inclusion in the 4G IMT-Advanced standard, including LTE, UMB, and WiMAX II. Goals for 4G include data rates of at least 100 Mbps, use of OFDMA transmission, and packet-switched delivery of IP-based voice, data, and streaming multimedia.

Steve Jobs


The Mac, the iPod and iPhone, born out of his vision of marrying high technology to an elegant and simple form, are already recognised by designers as among the most iconic products of the digital age.

Creations from the founder of Apple not only changed the way people communicate, watch films, listen to music and shop on the Internet but large Mac screens and graphics-friendly Mac software also make life easier for architects, publishers, artists and fashion designers.

"One of the truly great designers and mentors," said British architect Norman Foster, known for working on major projects such as the Millennium Bridge in London, the Millau Viaduct in southern France and Swiss Re's headquarters in London dubbed "The Gherkin."

"Steve Jobs encouraged us to develop new ways of looking at design to reflect his unique ability to weave backwards and forwards between grand strategy and the minutiae of the tiniest of internal fittings," Foster added.

The iPod, Apple's big game-changer launched a decade ago, has a special place on the wall of fame of global consumer icons, alongside the Volkswagen Beetle, the Coca-Cola bottle, the Swiss Army pocket knife or the Olivetti portable typewriter.

Every country or culture can have its own consumer design icons -- Italy's Vespa motorscooter or America's Cadillac -- but only relatively few go truly global and endure.

Rarer still are those that change the way people do things.

"Steve Jobs has shown that breakthrough products come from taking intuitive risks, not from listening to focus groups. He was a master of semiotic design", said British industrial designer James Dyson, best known for the dual-cyclone bagless vacuum cleaner.

From its inception in 2001, the iPod spread like electricity and reshaped the music industry in a way its predecessor, the Sony Walkman, failed to do in a lasting fashion.

Apple is a computer company yet it was the first to successfully commercialise digital music on the Internet well before industry giants EMI, Warner Music Group and Sony Music and helped save the industry from slow death by piracy.

Hundreds of millions of iPods have been sold, each featuring a simple retro dial that bears the hallmark of Jobs' design philosophy of clean minimalism.

All over the world, iPods are tucked into the back of torn jeans and in the pockets of suits, strapped to the arms of joggers or entertaining commuters on tedious journeys home.

"Many credit Apple as probably the best advertisement for professional design and the role of design that we have ever seen," said Brandon Gien, an executive member of the International Council of Societies of Industrial Design.

Then came the iPad, released in 2010, which changed the way people read newspapers and books, took notes, surfed the Internet, called each other on Skype and dealt with everyday practical problems thanks to hundreds of savvy applications.

At Paris Fashion Week, which ended on Wednesday, fashion buyers took photos of dresses with their iPad and once the show was over, they flicked through them as a catalogue they had just created and decided which ones they wanted to buy.

"We saw a lot of iPads on the front row," said Marigay McKee, Harrods Fashion,Beauty Director who was at Fashion Week.

"All the bloggers and a lot of the fashion editors diligently carried iPads," she added. Many luxury brands, including Louis Vuitton and Hermes (at a price of $1,400), make iPad holders as chic accessories.

The iPad is also getting the airline industry to rethink the entertainment technology used on board. Airlines such as Australia's Qantas are looking into using iPads for in-flight entertainment to help trim costs and weight.

Professional designers regard Jobs not as one of them per se but as an innovator and businessman who recognised that form was just as important as function for a product's success.

They say there is no question Jobs directed the design fundamentals at Apple -- from the elimination of any unsightly screws in product casings to the type-face used to stamp them -- but he also relied on talented professional designers, from Hartmut Esslinger in the 1980s to Jonathan Ive who joined in the 1990s and still heads up product design at Apple.

Jobs was so obsessed with design that he hired Esslinger in 1982 on the then astronomical salary of $1 million a year to create Apple's design strategy, which produced the "Snow White" design of all Apple products for the rest of that decade.

"Design was not a department that was buried in bureaucracy. He lifted that right up to where it rightfully belonged," said Gien, an Australian industrial designer based in Sydney.

FROM CALLIGRAPHY TO THE APPLE MAC

Jobs was inspired by design early on, having revealed in a famous 2005 speech to Stanford University students that one of his formative experiences was attending a calligraphy class at Reed College before finally dropping out of university himself.

"None of this had even a hope of any practical application in my life. But 10 years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography," Jobs said at the time.

Museums around the world have been collecting early Apple and Jobs products, starting from the original Apple 1 developed in a bedroom in the 1970s by Jobs and Apple co-founder Steve Wozniak to the first NeXt computer, a magnesium "cube" developed by Jobs during a break with Apple in the 1980s.

Sydney's Powerhouse Museum, which collects icons of contemporary culture, has no doubt the iPod and perhaps the iPhone will one day also take their place alongside the greats of earlier eras, such as the Olivetti Lexikon 80 typewriter designed by Italy's Marcello Nizzoli or the Braun shaver developed by legendary German designer Dieter Rams in 1951.

"It (the iPod) may not be working in 20 years time but it will remain in that echelon of great designs for sure," said Campbell Bickerstaff, curator for the museum's information and communication technology collections.

Thursday, October 6, 2011

Artificial Intelligence


When it comes to making complex judgement calls, computers can't replace people. But with artificial intelligence, computers can be trained to think like humans do. Artificial intelligence allows computers to learn from experience, recognize patterns in large amounts of complex data and make complex decisions based on human knowledge and reasoning skills. Artificial intelligence has become an important field of study with applications in fields ranging from medicine to agriculture.

Expert Systems

Two of the most important and most used branches of AI are neural networks and expert systems.

An expert system can solve real-world problems using human knowledge and following human reasoning skills. The knowledge and thinking processes of experts are collected and encoded into a knowledge base. From that point on, the expert system can replace or assist the human experts in making complex decisions by integrating all the knowledge it has in its knowledge base.
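
As a toy sketch of the idea, an expert system's knowledge base can be reduced to a handful of if-then rules that a small inference loop applies to known facts until no new conclusions appear. The rules and facts below are invented purely for illustration and are not taken from any real system.

# A toy forward-chaining expert system: if-then rules fire on known facts
# until no new conclusions can be drawn. Rules and facts are made up.
rules = [
    ({"fever", "cough"}, "flu_suspected"),
    ({"flu_suspected", "short_of_breath"}, "see_doctor"),
]

def infer(facts):
    facts = set(facts)
    changed = True
    while changed:                        # keep applying rules until nothing new fires
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)     # rule fires: its conclusion becomes a new fact
                changed = True
    return facts

print(infer({"fever", "cough", "short_of_breath"}))
# The result includes 'flu_suspected' and 'see_doctor'.

Real expert systems also attach certainty factors and explanations to their rules, but the loop above captures the core of how encoded knowledge drives a decision.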

Neural Networks

Illustration of a neural network: the diagram represents an artificial neural network. A neural network is made of nodes arranged in different patterns representing the "intelligence" of the network, and the line thickness indicates the strength of the connections.

The most important application of neural networks is in pattern recognition. Humans, through neurons in their brains, learn how to read human writing, recognize a bad apple from a good one or identify their children from a set of kids. Neural networks allow computers to use the same principles that neurons in the brains use to recognize and classify different patterns. So in a way, neural networks are a digital representation (although very simplified) of our brains. They are made of artificial neurons, connected by weights, which are indicative of the strengths of the connections. The neurons are arranged in layers, and depending on the complexity of the application, there could be a few of them or a very large number of them (hundreds or thousands). Iterative propagation of input from one layer of neurons to the next (training) is what enables the neural network to learn from experience.
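
To make the idea concrete, here is a minimal Python sketch of such a network: two layers of artificial neurons connected by weights, trained by repeatedly propagating inputs forward and nudging the weights until it learns the classic XOR pattern. The layer sizes, learning steps and iteration count are arbitrary choices for this toy example.

# A tiny two-layer neural network trained on the XOR pattern with plain NumPy.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(size=(2, 4))   # weights: input layer -> hidden layer of 4 neurons
W2 = rng.normal(size=(4, 1))   # weights: hidden layer -> single output neuron

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(10000):
    hidden = sigmoid(X @ W1)        # forward pass, layer by layer
    output = sigmoid(hidden @ W2)
    error = y - output
    # Backpropagation: nudge each weight in proportion to its share of the error.
    d_output = error * output * (1 - output)
    d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)
    W2 += hidden.T @ d_output
    W1 += X.T @ d_hidden

print(np.round(output, 2))   # should end up close to [[0], [1], [1], [0]] for most seeds

After enough passes the weights encode the pattern, which is the "learning from experience" described above.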

Unlike humans, when a neural network is fully trained, it can classify and identify patterns in massive amounts of complex data, and it can do this at speeds that cannot be duplicated by humans.

Real-World Applications of Artificial Intelligence

Intelligent control is beneficial in many real-world applications because it is good at solving complex problems. The flexibility inherent in AI techniques makes the technology adaptable to fields as diverse as agriculture, business, and literature.

UGA scientists have used artificial intelligence in many different ways:

monitor and adjust the climate in greenhouses and poultry production houses
help forecast weather and predict crop development
determine when vegetables are ripe
identify molecules by their "chemical fingerprints"
candle eggs to determine which have cracks and other defects

Intro to tech gyan


Hey guys, this is the first post of a cool blog about technology knowledge (gyan). Here you will get cutting-edge and general knowledge about technology, especially information technology. Reviews of new gadgets, advice on buying tech stuff, and DIY projects will be provided. So stay tuned and stay updated.

About me : I am an engineering student from India. Technology has been my passion from childhood.
Contact: niklabh811@gmail.com