Tuesday, June 21, 2011

Invention of the Transistor

Bell Telephone Laboratories develops the transistor in 1947.

The transistor revolutionized the electronics industry, ushered in the Information Age, and made possible the development of almost every modern electronic device, from telephones to computers to missiles. Bardeen's later work on superconductivity, which won him his second Nobel Prize, underlies magnetic resonance imaging (MRI).

Bell Labs

In October 1945, John Bardeen began work at Bell Labs and moved his family to Summit, New Jersey. He was a member of the Solid State Physics Group, led by William Shockley and chemist Stanley Morgan; others in the group included physicist Walter Brattain, physicist Gerald Pearson, chemist Robert Gibney, electronics expert Hilbert Moore, and several technicians. Bardeen had first met Shockley when they were both in school in Massachusetts, and he knew Brattain from his graduate school days at Princeton, where he had been introduced through Brattain's brother Bob, a fellow Princeton graduate student. Over the years the friendship of Bardeen and Brattain grew, both in the lab, where Brattain put together the experiments and Bardeen wove theories to explain the results, and on the golf course, where they spent time on the weekends.

The assignment of the group was to seek a solid-state alternative to fragile glass vacuum tube amplifiers.

On December 23, 1947, Bardeen and Brattain, working without Shockley, succeeded in creating a point-contact transistor that achieved amplification.[1]

[1] The invention of the transistor; http://en.wikipedia.org/wiki/John_Bardeen#The_invention_of_the_transistor



At the end of WWII, ENIAC, the Electronic Numerical Integrator And Computer, was developed for the Ballistics Research Laboratory in Maryland to assist in the preparation of firing tables for artillery. It was built at the University of Pennsylvania's Moore School of Electrical Engineering and completed in November 1945.

ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory. When ENIAC was announced in 1946 it was heralded in the press as a "Giant Brain". It boasted speeds one thousand times faster than electro-mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general-purpose programmability, excited scientists and industrialists. The inventors promoted the spread of these new ideas by teaching a series of lectures on computer architecture.

The ENIAC's design and construction was financed by the United States Army during World War II. The construction contract was signed on June 5, 1943, and work on the computer began in secret by the University of Pennsylvania's Moore School of Electrical Engineering starting the following month under the code name "Project PX". The completed machine was announced to the public the evening of February 14, 1946 and formally dedicated the next day at the University of Pennsylvania, having cost almost $500,000 (nearly $6 million in 2010, adjusted for inflation). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland in 1947. There, on July 29, 1947, it was turned on and was in continuous operation until 11:45 p.m. on October 2, 1955.

See the figures below of the ENIAC computer:

ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania. The team of design engineers assisting the development included Robert F. Shaw (function tables), Chuan Chu (divider/square-rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer) and Jack Davis (accumulators).

The ENIAC was a modular computer, composed of individual panels that performed different functions. Twenty of these modules were accumulators, which could not only add and subtract but also hold a ten-digit decimal number in memory. Numbers were passed between these units across a number of general-purpose buses, or trays, as they were called. To achieve its high speed, the panels had to send and receive numbers, compute, save the answer, and trigger the next operation, all without any moving parts. Key to its versatility was the ability to branch; it could trigger different operations depending on the sign of a computed result.
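As a rough illustration, the behavior of one such accumulator module, and the sign-based branching described above, can be sketched in modern code (the class and names below are invented for illustration; this is not a model of ENIAC's actual decimal circuitry):

```python
class Accumulator:
    """One panel's worth of storage: a signed ten-digit decimal number."""
    DIGITS = 10

    def __init__(self):
        self.value = 0

    def add(self, n):
        self.value += n
        assert abs(self.value) < 10 ** self.DIGITS, "accumulator overflow"

    def subtract(self, n):
        self.add(-n)

    def sign(self):
        # ENIAC programs could branch on the sign of a computed result
        return -1 if self.value < 0 else 1


acc = Accumulator()
acc.add(5000)
acc.subtract(8000)
next_step = "negative_branch" if acc.sign() < 0 else "positive_branch"
print(acc.value, next_step)  # -3000 negative_branch
```

The point of the sketch is the last line: the machine's next operation depends on the sign of the result, which is what made ENIAC general-purpose rather than a fixed calculator.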

Besides its speed, the most remarkable thing about ENIAC was its size and complexity. ENIAC contained 17,468 vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and around 5 million hand-soldered joints. It weighed more than 30 short tons (27 t), was roughly 8 by 3 by 100 feet (2.4 m × 0.9 m × 30 m), took up 1800 square feet (167 m2), and consumed 150 kW of power. Input was possible from an IBM card reader, and an IBM card punch was used for output. These cards could be used to produce printed output offline using an IBM accounting machine, such as the IBM 405.[1]


[1] ENIAC; http://en.wikipedia.org/wiki/Eniac


Code Breakers

In WWII, Nazi Germany used electro-mechanical rotor machines for the encryption and decryption of secret messages, code-named Enigma.[1]


Fortunately, the British developed a code-breaking machine called Colossus.

Colossus was designed by engineer Tommy Flowers with input from Harry Fensom, Allen Coombs, Sidney Broadhurst and William Chandler[1] at the Post Office Research Station, Dollis Hill to solve a problem posed by mathematician Max Newman at Bletchley Park. The prototype, Colossus Mark 1, was shown to be working in December 1943 and was operational at Bletchley Park by February 1944. An improved Colossus Mark 2 first worked on 1 June 1944, just in time for the Normandy Landings. Ten Colossi were in use by the end of the war.[2]


The Colossus machines were electronic computing devices used by British codebreakers to help read encrypted German messages during World War II. They used vacuum tubes (thermionic valves) to perform the calculations.

The Colossus computers were used to help decipher teleprinter messages that had been encrypted using the Lorenz SZ40/42 machine. British codebreakers referred to encrypted German teleprinter traffic as "Fish" and called the SZ40/42 machine and its traffic "Tunny".[3]


[1] Enigma; http://en.wikipedia.org/wiki/Enigma_machine

[2] Colossus computer; http://en.wikipedia.org/wiki/Colossus_computer

[3] Lorenz cipher; http://en.wikipedia.org/wiki/Lorenz_SZ40/42

The rise of Big Blue

In the 1890s, Thomas J. Watson, Sr., a young tree farmer from the town of Erwin, NY, eventually left farming, took accounting and business courses at the Miller School of Commerce in Elmira, NY, and became a traveling salesman.

In 1896, Watson Sr. became a sales apprentice to John J. Range, an NCR (National Cash Register) branch manager in Buffalo, NY.

Range became the model for Watson Sr.'s sales and management style, and Watson went on to become the most successful salesman on the East Coast, earning $100 per week.

From 1908 until 1911, during his time at NCR, Watson Sr. made NCR a monopoly in Rochester, NY; his main job was to knock out the competition in the cash register business. However, in 1912 he was indicted in an antitrust lawsuit.

Watson Sr. then joined the Computing Tabulating Recording Corporation (CTR) in 1914, taking over as general manager of a company with $9 million in revenues.

In 1924, he renamed CTR to IBM. IBM became so dominant during that era that a federal antitrust lawsuit was filed against it in 1952.

During WWII, IBM became more involved in the war effort for the United States, producing large quantities of data processing equipment and experimenting with analog computers for the US military.

Watson Sr.'s eldest son, Thomas J. Watson Jr., joined the United States Army Air Corps, where he became a bomber pilot. He was soon hand-picked to become the assistant and personal pilot of General Follett Bradley, who was in charge of all Lend-Lease equipment supplied to the Soviet Union from the United States.

After World War II, Watson worked to extend IBM's influence abroad and, in 1949, the year he stepped down, created the IBM World Trade Corporation to control IBM's foreign business.

Watson was named chairman emeritus of IBM in 1956. A month before his death, Watson handed over the reins of the company to his oldest son, Thomas J. Watson, Jr. Thomas Watson Sr. was interred in Sleepy Hollow Cemetery in Sleepy Hollow, New York.[1]

The figure above shows Thomas J. Watson, Sr., circa 1920s.

IBM Supercomputer - September 2009


[1] Thomas J. Watson; http://en.wikipedia.org/wiki/Thomas_J._Watson

Thursday, June 16, 2011

To be OR not to be counted

In 18th- and 19th-century America, the US Constitution mandated that a census be taken of all US citizens every 10 years; this was needed to determine state representation in Congress.

The first census, in 1790, took only 9 months to complete. By 1880, however, the US population had grown so much that the census took 7.5 years to complete.

The Census Bureau offered a prize to anyone who could automate the count for the 1890 census. An inventor named Herman Hollerith won that prize.
Hollerith adapted Jacquard's punched cards for the computation.

Known as the Hollerith desk, it consisted of a card reader, a gear for counting, and dialed wall indicators to display the results.
Hollerith had the insight to convert punched cards into what is today called a read/write technology. Unknown to Hollerith, Babbage had already proposed this concept long before.
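The counting idea behind the desk is simple enough to sketch (the field layout below is invented for illustration; it is not Hollerith's actual card format): each card is a row of hole positions, and each counting dial advances for every hole sensed.

```python
from collections import Counter

# Each "card" is modeled as a string of columns; '1' means a hole is punched.
# Columns (an invented layout): 0 = male, 1 = female, 2 = married, 3 = single.
FIELDS = ["male", "female", "married", "single"]

def tabulate(cards):
    """Advance a counting dial for every hole sensed on every card."""
    dials = Counter()
    for card in cards:
        for column, punched in enumerate(card):
            if punched == "1":              # the circuit closes through the hole...
                dials[FIELDS[column]] += 1  # ...and the dial clicks forward
    return dials

cards = ["1010",   # a married man
         "0101",   # a single woman
         "1001"]   # a single man
print(dict(tabulate(cards)))  # {'male': 2, 'married': 1, 'female': 1, 'single': 2}
```

Tallying a census question then reduces to feeding a stack of cards through the reader, which is exactly why the 1890 count went so much faster.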

Hollerith's invention worked. The 1890 census took only 3 years to complete and saved $5 million (the start of venture capitalism; here is where the real fun begins).
Because of his invention, Hollerith built a company called the Tabulating Machine Company. After a few buyouts, it eventually became International Business Machines, or, as it is more commonly known today, IBM.[1]

See figure below of Hollerith’s Desk:

IBM built mechanical calculators and sold them to businesses in the private sector to help with accounting and inventory. These calculators could only add and subtract; they could not divide or multiply.

However, the US government, and more specifically the US military, needed a calculator that could perform scientific computation.

In World War II, US battleships needed a way to determine the trajectory of a shell once it was fired from a cannon; this was needed for proper aiming. In the beginning, physicists and mathematicians had to manually compute the effects of atmospheric drag, wind direction, gravity, muzzle velocity, and so on (manual labor for the brain), and put all this information into manuals called "Firing Tables", published for gunnery use. It was a laborious, painstaking task, and there were not enough people to do it.
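The kind of computation those human computers ground through by hand can be sketched numerically. This is a deliberately simplified model with a made-up drag coefficient, not the Army's actual ballistics equations: it steps a shell forward in time under gravity and a quadratic drag force.

```python
import math

def shell_range(muzzle_velocity, elevation_deg, drag_coeff=0.0001, dt=0.01):
    """Return the horizontal range (m) of a projectile with quadratic drag,
    integrated step by step (the way firing tables were computed by hand)."""
    g = 9.81
    theta = math.radians(elevation_deg)
    vx = muzzle_velocity * math.cos(theta)
    vy = muzzle_velocity * math.sin(theta)
    x = y = 0.0
    while y >= 0.0:
        speed = math.hypot(vx, vy)
        # drag opposes the velocity and grows with the square of the speed
        vx -= drag_coeff * speed * vx * dt
        vy -= (g + drag_coeff * speed * vy) * dt
        x += vx * dt
        y += vy * dt
    return x

# One miniature "firing table" column: range at several barrel elevations
for elev in (15, 30, 45, 60):
    print(f"{elev:2d} deg -> {shell_range(800, elev):9.1f} m")
```

Every cell of a real firing table required a run like this, repeated across elevations, charges, and wind conditions, which is why a machine such as ENIAC was so badly wanted.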

So in 1944 the Harvard Mark I computer was built, a joint project between Harvard and IBM to provide a solution for firing tables. It was the first programmable digital computer made in the US. However, the Mark I was not a purely electronic computer: it was constructed out of switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and 51 feet long, and had a 50-foot rotating shaft running its length, turned by a 5-horsepower electric motor.[2]

The Mark I used paper tape machines for programming the computer, a far better scheme than the stacks of punched cards IBM used for its mechanical calculators.

See the figure below of the Harvard Mark I, an electro-mechanical computer.


One of the Mark I's programmers was Grace Hopper, later a Rear Admiral in the Navy Reserve. She found the first computer "bug" -- literally. The bug was a dead moth whose wings were blocking the reading of the holes in the paper tape machine. Hopper was credited with coining the word "debugging" to describe the process of eliminating program faults.

See the figure below of the first computer bug [photo © 2002 IEEE].

Invention of the First High-Level Programming Language

In 1953 Grace Hopper invented the first high-level language, Flow-matic. This language eventually evolved into COBOL, the language most affected by the infamous Y2K problem. A high-level language is designed to be more understandable by humans than the binary language understood by the computing machinery. A high-level language is worthless without a program, known as a compiler, to translate it into the binary language of the computer, and so Grace Hopper also constructed the world's first compiler. She remained active as a Rear Admiral in the Navy Reserve until she was 79 (another record).[3]
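What a compiler does can be sketched in a few lines (the toy stack-machine instructions below are invented for illustration; this is not Flow-matic or COBOL): take a human-readable expression and translate it into low-level instructions a simple machine could execute.

```python
import ast

# Map Python's parsed operators onto invented stack-machine opcodes.
OPS = {ast.Add: "ADD", ast.Sub: "SUB", ast.Mult: "MUL", ast.Div: "DIV"}

def compile_expr(source):
    """Compile an arithmetic expression like '2 + 3 * 4' into instructions."""
    def emit(node):
        if isinstance(node, ast.BinOp):
            # compile both operands first, then the operation (postfix order)
            return emit(node.left) + emit(node.right) + [OPS[type(node.op)]]
        if isinstance(node, ast.Constant):
            return [("PUSH", node.value)]
        raise ValueError("unsupported construct")
    return emit(ast.parse(source, mode="eval").body)

def run(program):
    """A tiny interpreter standing in for the machine's hardware."""
    stack = []
    for instr in program:
        if isinstance(instr, tuple):          # a ("PUSH", n) instruction
            stack.append(instr[1])
        else:                                 # an arithmetic opcode
            b, a = stack.pop(), stack.pop()
            stack.append({"ADD": a + b, "SUB": a - b,
                          "MUL": a * b, "DIV": a / b}[instr])
    return stack.pop()

code = compile_expr("2 + 3 * 4")
print(code)       # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), 'MUL', 'ADD']
print(run(code))  # 14
```

Note that operator precedence ("do the multiplication first") is handled by the compiler, not the machine, which is exactly the burden a high-level language lifts off the programmer.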



[1] An Illustrated History of Computers; http://www.computersciencelab.com/ComputerHistory/HistoryPt3.htm

The figure below is one of the four paper tape readers on the Harvard Mark I (you can observe the punched paper roll emerging from the bottom):



[1] Hollerith desk; http://www.computersciencelab.com/ComputerHistory/HistoryPt2.htm



Tuesday, June 14, 2011

The First Computer Programmer:

In 1833, Ada Byron, the Countess of Lovelace, met Babbage. She described the Analytical Engine as weaving "algebraic patterns just as the Jacquard loom weaves flowers and leaves." Her published analysis of the Analytical Engine is our best record of its programming potential. In it she outlines the fundamentals of computer programming, including data analysis, looping, and memory addressing.[1]

Born Augusta Ada Byron, she was an English writer chiefly known for her work on Charles Babbage's early mechanical general-purpose computer, the Analytical Engine. Her notes on the engine include what is recognized as the first algorithm intended to be processed by a machine; as such, she is sometimes portrayed in popular culture as the "World's First Computer Programmer".

She was the only legitimate child of the poet Lord Byron (with Anne Isabella Milbanke). See the figure below.


[2]


Greetings and Welcome to Nerds and Venture Capitalism Blog

This blog provides a history of computers, their inventors, the Internet, and the venture capitalists who saw a business in this new medium.

To understand how these machines became a billion-dollar business and how they work, we have to go back to the beginning and see how it all started.

We have to go way back, before there were transistors, integrated chips, or even vacuum tubes, to 1642, when Blaise Pascal built a mechanical calculator.

This mechanical calculator had the capacity for eight-digit calculations, but it had a lot of problems: it had trouble carrying, and its gears tended to jam.[1]

Pascal conceived the mechanical calculator while trying to help his father, who had been assigned the task of reorganizing the tax revenues of the French province of Haute-Normandie. First called the Arithmetic Machine, then Pascal's Calculator, and later the Pascaline, it could add and subtract directly, and multiply and divide by repetition.
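"Multiply and divide by repetition" simply means reducing those operations to the machine's two primitives; a minimal sketch of the operator's procedure (in modern code, purely for illustration):

```python
def multiply(a, b):
    """Multiply non-negative integers by repeated addition,
    the only arithmetic primitive the Pascaline offered (besides subtraction)."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Divide by repeated subtraction, returning (quotient, remainder)."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient, a

print(multiply(7, 6))  # 42
print(divide(45, 7))   # (6, 3)
```

Slow, but correct: every operation a desk calculator would later do in one crank could be built from addition and subtraction alone.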

Pascal went through 50 prototypes before presenting his first machine to the public in 1645. He dedicated it to Pierre Séguier, the chancellor of France at the time. He built around twenty more machines during the next decade, often improving on his original design. Nine machines have survived the centuries, most of them on display in European museums. In 1649 a royal privilege, signed by Louis XIV of France, gave him exclusivity over the design and manufacture of calculating machines in France.

See display below for the Pascal Calculator:

The introduction of Pascal's calculators launched the development of mechanical calculators, first in Europe and then all over the world. This development culminated three centuries later in the invention of the microprocessor, developed for a Busicom calculator in 1971. Refer to: http://en.wikipedia.org/wiki/Pascal’s_calculator

The First Program
In 1801, Joseph Marie Jacquard, a silk-weaver, invented an improved textile loom. Known as the Jacquard loom, it was the first machine to use punched cards. These punched cards controlled the weaving, enabling an ordinary workman to produce the most beautiful patterns in a style previously accomplished only with patience, skill, and hard work.[2]

See figure below.

[3]

The loom was controlled by punched cards, each row of holes corresponding to one row of the design. Multiple rows of holes were punched on each card, and the many cards that composed the design of the textile were strung together in order. The loom was based on earlier inventions by the Frenchmen Basile Bouchon (1725), Jean Baptiste Falcon (1728), and Jacques Vaucanson (1740).[4]
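As a toy illustration of the idea (the card encoding here is my own simplification, not the loom's real mechanism), each card's holes can be read as the set of warp threads to raise for one row of the design:

```python
def weave(cards):
    """Render each punched card as one row of cloth: '1' = hole = raised thread."""
    return ["".join("#" if hole == "1" else "." for hole in row) for row in cards]

# Each string is one card; the cards, strung together in order, compose the design.
diamond = ["00100",
           "01010",
           "10001",
           "01010",
           "00100"]
for row in weave(diamond):
    print(row)  # prints a small diamond of '#' characters
```

Swapping the deck of cards swaps the pattern without touching the machine, which is precisely the "stored program" idea Babbage and Hollerith would later borrow.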


[1] A History of the Computer: Prehistory; http://www.pbs.org/nerds/timeline/pre.html

See the image above for a demonstration of the power of Jacquard's loom: a portrait of Jacquard himself, woven on his own invention.

[4] Jacquard loom; http://en.wikipedia.org/wiki/Jacquard_loom

Harsh Business Competition:

Jacquard's technology was a real boon to mill owners, but put many loom operators out of work. Angry mobs smashed Jacquard looms and once attacked Jacquard himself. History is full of examples of labor unrest following technological innovation yet most studies show that, overall, technology has actually increased the number of jobs. Go to link: http://www.computersciencelab.com/ComputerHistory/HistoryPt2.htm

Charles Babbage - good with numbers and anything mechanical:

In 1820 or 1821, an English mathematician and engineer, Charles Babbage, invented the "Difference Engine", a massive steam-powered mechanical calculator designed to print astronomical tables. Babbage had another invention, the "Analytical Engine", but he did not live to see it come to fruition.

The Analytical Engine was undoubtedly the first design for what we now think of as a computer: a machine that takes an input, mathematically manipulates it according to a customizable program, and produces an output.


The figure above shows the reconstruction of Babbage's Difference Engine at the London Science Museum.

Babbage sought a way to remove human errors from the mathematical tables available in the early 19th century, devising his mechanical Difference Engine to calculate polynomial functions (a type of algebraic equation). Though it was never finished, the first Difference Engine would have contained more than 25,000 parts and weighed over 13 tons. A revised design was completed in 1991 by the Science Museum and found to work perfectly.
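The method of finite differences that the engine mechanized is easy to sketch: after computing a few seed values the hard way, every further table entry needs only additions, exactly what gears and cranks do well. The code below is an illustrative reconstruction of the arithmetic, not a model of Babbage's actual mechanism:

```python
def difference_table(coeffs, n_rows):
    """Tabulate a polynomial (coefficients given low-to-high) using the
    method of finite differences: after setup, each row needs only additions."""
    degree = len(coeffs) - 1

    def poly(x):
        return sum(c * x**i for i, c in enumerate(coeffs))

    # Seed values poly(0)..poly(degree), computed the "hard" way exactly once.
    column = [poly(x) for x in range(degree + 1)]
    # Keep the first entry of each successive row of differences.
    state = []
    while column:
        state.append(column[0])
        column = [b - a for a, b in zip(column, column[1:])]

    table = []
    for _ in range(n_rows):
        table.append(state[0])
        # One turn of the crank: fold each difference into the level above it.
        for i in range(len(state) - 1):
            state[i] += state[i + 1]
    return table

# x^2 + x + 41, the prime-generating polynomial often used in
# Difference Engine demonstrations
print(difference_table([41, 1, 1], 6))  # [41, 43, 47, 53, 61, 71]
```

After the seed column is set, no multiplication ever occurs again; that is what made the scheme practical for a machine built from addition wheels.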

More complex still, and also unfinished, Babbage's analytical engine added features that define modern computers. It could be programmed with cards, but could also store the results of calculations and perform new calculations on those. Babbage intended to support conditional branches and loops, fundamental to all modern programming languages. His death in 1871 meant that he never finalized his designs for the engine, but his son Henry completed its core computing unit – 'the mill' – in 1888.