Getting More Kids into Computing!

I’ve been working on a range of N5 books with Bright Red Publishing (N5 relates to the new syllabus within Scottish schools), and just now I’m working on the new N5 Computing book. The syllabus looks to be a great improvement on the previous one, with over half of it on software development, and good coverage of things such as security and databases. It thus highlights a changing world, as we move towards new subjects within computing.

So many job roles …

Over the past few years I’ve been presenting at events on how we need to get more kids into Computing. So why, in the UK, are we still funding so many university places in subjects in which there are few jobs? Shouldn’t we be funding more student places in Computing? Few subjects can offer the breadth of jobs that Computing can … from software development to network support, and from user interface design to computer security. In fact, there are so many job titles that someone entering a computing programme can work in lots of interesting areas: networking, computer security, software development, media design, mobile devices, web development, and many others. Along with this, there are new areas such as Cloud Computing, Big Data and lots of developments around mobile devices. So a first-year student on a computing degree can often select from a wide range of subjects, and pick the one that interests them most and which, possibly, has the best career options for them.

It’s at the core of everything now …

The Internet is probably one of the greatest creations ever, and one which provides the core of the modern world. Without it, many industries such as banking, energy and education could not exist in their current form. We can see from the increasing creation and consumption of digital information that there is a growing reliance on the Internet, with over 12TB of tweets every day, and almost 90% of all the data in the Cloud produced in the last two years. Along with this, over 2.5 quintillion bytes of data are being produced – that’s over 1 billion hard disks of data – every day. And the Internet is not just about computer data: we are moving towards digitizing a whole range of media, including voice, video and sensor data. Areas such as health and social care could also be radically changed by digital methods, where patients could Skype with their GP rather than having to arrange a face-to-face appointment.

For example … computer security

A good example of the new industries being created within Computing, and of the rise in academic requirements, is computer security. Within it there is a wide range of things to focus on, including network security, operating systems, people, encryption, identity, mobile devices, wireless, and so on. There are also new applications for the Internet, and new threats appear every day, especially as we become more mobile and more reliant on the Cloud. Many application areas, including banking, shopping, government services and health care, are all going on-line, increasing the threats we are all under. This requires a new range of professionals which, ten years ago, would not even have existed … computer security consultants.

Where are the employers …

Well, they are everywhere … with large and small companies expanding, from key application sectors such as banking to core IT companies. From my side, I see an increasing demand for graduates in computing, with a wide range of companies looking for many graduates. Many years ago I would see graduates move away from Scotland, but now there are companies recruiting them on their home base, with the likes of Dell SecureWorks and Amazon recruiting students to work on Princes Street … how great is that? And there are great SMEs which are leading on an international basis, whether it is miiCard for identity provision or Rockstar leading the way in computer games. There has never been a better time to get into computing.

So … why don’t we fund more graduate places in computing, and fewer in areas which struggle to find enough jobs for their graduates?

Introducing Bob and Alice …

Over the past few years, Edinburgh Napier University has been engaging with local schools to create interest in computing, both with IT4U, which brings school pupils into the university and provides them with a range of interesting workshops, and with the Cyber Christmas lecture. For these we have presented on a range of things, including computer security, showing kids some fun things in cracking codes and introducing Bob and Alice. Can you remember the time when you passed secret messages to your friends at school? Well, the need for code cracking increases by the day, as new threats evolve … so the time is right to engage with these young minds and get them interested in some of the new problems that the world faces. It is within areas such as computer security that we create the architectures of the future, ones in which physical buildings are replaced with virtual ones.
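By way of illustration (and not the actual workshop material), a few lines of Python show the kind of puzzle involved: Bob encrypts a message to Alice with a classic Caesar shift, and a code-cracker can break it simply by trying all 26 keys.

```python
# A minimal Caesar cipher sketch: Bob shifts each letter by a secret key,
# Alice shifts it back, and an attacker can simply try all 26 keys.
import string

ALPHABET = string.ascii_uppercase

def encrypt(plaintext: str, key: int) -> str:
    # Shift each letter forward by 'key' places; leave other characters alone.
    return "".join(
        ALPHABET[(ALPHABET.index(c) + key) % 26] if c in ALPHABET else c
        for c in plaintext.upper()
    )

def decrypt(ciphertext: str, key: int) -> str:
    return encrypt(ciphertext, -key)

message = encrypt("MEET ME AT THE CASTLE", key=3)   # Bob -> Alice
print(message)                                      # PHHW PH DW WKH FDVWOH

# Eve's brute-force attack: only 26 possible keys, so just try them all.
for key in range(26):
    print(key, decrypt(message, key))
```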

My History of Computing (Punch cards to FORTRAN)

The Beginnings of the Industry

Herman Hollerith

One of the first appearances of computer technology occurred in the USA in the 1880s, and was due to the American Constitution demanding that a census be undertaken every 10 years. As the population of the USA increased, it took an increasing amount of time to produce the statistics. It got so bad that, by the 1880s, it looked likely that the 1880 census would not be complete until 1890. To overcome this, a government employee named Herman Hollerith devised a machine that accepted punch cards with information on them. These cards allowed an electrical current to pass when a hole was present (a ‘true’), and did not conduct a current when a hole was not present (a ‘false’). This was one of the first uses of binary information, which represents data as one of two states (such as true or false, or 0 or 1). Thus Hollerith’s system stored the census information as a sequence of binary values.
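As a rough sketch of the principle (in Python, and with an invented card layout rather than Hollerith’s actual census encoding), a row of card positions, each either punched or not, is simply a binary number:

```python
# Sketch of the punch-card idea: each position either has a hole (True, a
# current flows) or no hole (False, no current), so a row of positions is
# just a sequence of binary digits. The card layout here is invented for
# illustration; it is not Hollerith's actual encoding.
row = [True, False, True, True, False, False, True, False]  # holes on a card row

bits = "".join("1" if hole else "0" for hole in row)
value = int(bits, 2)

print(bits)    # 10110010
print(value)   # 178 -- the same information, read as a binary number
```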

Hollerith Machine

1890. Hollerith’s electromechanical machine was extremely successful and was used in the 1890 and 1900 censuses. He then went on to found the company that would later become International Business Machines (IBM): CTR (Computing-Tabulating-Recording). Unfortunately, his business fell into financial difficulties and was saved by Tom Watson, a young salesman at CTR, who recognized the potential of selling punch card-based calculating machines to American businesses. He eventually took over the company and, in the 1920s, renamed it International Business Machines Corporation (IBM). IBM would eventually control much of the computer market, and it was only their own creation, the IBM PC, which would reduce this domination. For the next 50 years the electromechanical machines were speeded up and improved, but electronic computers, using valves, would eventually supersede them.

Harvard Mark I

1943. The first generation of electronic computers started in 1943. These computers used the flow of electrons within an electronic valve to represent the binary states, rather than the magnetic fields stored in electromagnets which earlier machines had used. This had the advantage that they did not rely as much on the movement of mechanical components or on magnetic fields. This led to the first generation of computers that used electronic valves and punched cards for their main non-volatile storage (non-volatile storage allows for long-term storage, even when the power is taken away). Among the first of these machines were the Harvard Mk I, which was developed at Harvard University and was a general-purpose electromechanical programmable computer, and Colossus, which was developed in the UK and was used to crack the German coding system (the Lorenz cipher).

ENIAC

1946. During World War II, John Eckert at the University of Pennsylvania built the world’s first large electronic computer. It contained over 19,000 valves and was called ENIAC (Electronic Numerical Integrator and Computer). It was so successful that it ran for over 11 years before it was switched off (not many modern-day computers will run for more than a few years before they are considered unusable). By today’s standards, ENIAC was a lumbering dinosaur; by the time it was dismantled it weighed over 30 tons and spread itself over 1,500 square feet. Amazingly, it also consumed over 25kW of electrical power (equivalent to the power of over 400 60W light bulbs), but could perform over 100,000 calculations per second (which, even by today’s standards, is reasonable). Unfortunately, it was unreliable and would work only for a few hours, on average, before an electronic valve needed to be replaced. Fault-finding, though, was much easier in those days, as a valve that was not working would not glow, and would be cold to the touch.

John von Neumann

While ENIAC was important in the history of the modern computer, its successor, the EDVAC (Electronic Discrete Variable Automatic Computer), would provide a much greater legacy: the standard architecture that has been used in virtually every computer built since. Its real genius was due to John von Neumann, a scientific researcher who had already built up a strong reputation in the field of quantum mechanics. For computing, he used his superior logical skills to overcome the shortcomings of ENIAC: too little storage, too many valves, and too lengthy a time to program it. His new approach used the stored-program concept, which has been used by virtually every computer made since. With this, the storage device of the computer (its memory) is used to hold both the program instructions and the data used by the computer and the program. His computer, as illustrated in Figure 1, was designed around five major elements:

  • Central control.  This reads program instructions from the memory, which are interpreted by the central control unit.
  • Central arithmetic unit. This performs arithmetic operations, such as add/subtract, multiply/divide, binary manipulation, and so on.
  • Memory. This holds both the program instructions and program/system data.

  • Input device. This is used to read data into the memory. Example input devices include keyboards, disk storage and punch card readers (which were used extensively before the large-scale introduction of disk storage devices). The input device loads both program instructions and data into memory.
  • Output device. This is used to output data from memory to an output device, such as a printer or display device.
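To make the stored-program concept concrete, here is a minimal Python sketch of a von Neumann-style machine (the instruction set and memory layout are invented purely for illustration): instructions and data sit in the same memory, the control unit fetches and decodes each instruction, and the arithmetic unit carries it out.

```python
# A toy von Neumann machine (illustrative only): instructions and data share
# one memory, the control unit fetches/decodes, the arithmetic unit executes.
memory = {
    # --- program (instructions) ---
    0: ("LOAD", 100),    # accumulator <- memory[100]
    1: ("ADD", 101),     # accumulator <- accumulator + memory[101]
    2: ("STORE", 102),   # memory[102] <- accumulator
    3: ("HALT", None),
    # --- data ---
    100: 7,
    101: 5,
    102: 0,
}

accumulator = 0
pc = 0                   # program counter: address of the next instruction

while True:
    opcode, operand = memory[pc]      # fetch and decode (central control)
    pc += 1
    if opcode == "LOAD":
        accumulator = memory[operand]
    elif opcode == "ADD":             # central arithmetic unit
        accumulator += memory[operand]
    elif opcode == "STORE":
        memory[operand] = accumulator
    elif opcode == "HALT":
        break

print(memory[102])   # 12 -- the result, written back into the same memory
```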

Typically, these days, the central control unit and the central arithmetic unit have been merged into a device known as a microprocessor. The environment in which to run programs, typically known as user programs, is defined by the operating system. The von Neumann architecture made it easier to load programs into the system, as the operating system could load all the associated data in the same place as it loaded the program. Before this architecture, a user would have to load the program into one area of memory, and all the associated data into another. The computer would then read from the program area for its instructions, and read and write to a data area.

After ENIAC, progress was fast in the computer industry, and by 1948 small electronic computers were being produced in quantity. By the start of the 1950s, 2,000 were in use; by the start of the 1960s this was 10,000; and by 1970 it was 100,000.

Electronic valves

At the time, electronic valves were used in many applications, such as TV sets and radios, but they were unreliable and consumed great amounts of electrical power, mainly for the heating element on the cathode. By the 1940s, several scientists at the Bell Laboratories were investigating materials called semiconductors, such as silicon and germanium, which conduct electricity only moderately well. To the researchers, their most interesting property was that their resistance could be changed by doping them with impurities. From this work they made a crystal called a diode, which worked like an electronic valve but had many advantages, including the fact that it did not require a vacuum and was much smaller. It also worked well at room temperature, required little electrical current and had no warm-up time. This was the start of the microelectronics industry, which has since become one of the most important technologies in the history of mankind, and without which we could hardly exist in our current form.

William Shockley

One of the great computing revolutions occurred in December 1947, when William Shockley, Walter Brattain and John Bardeen at the Bell Laboratories produced a transistor that could act as a triode (a valve which could amplify electrical signals). It was made from a germanium crystal with a thin p-type section sandwiched between two n-type materials. Rather than release its details to the world, Bell Laboratories kept the invention secret for over seven months so that they could fully understand its operation. They applied for a patent and, on 30 June 1948, they finally revealed the transistor to the world. Unfortunately, as with many other great scientific inventions, it received little public attention and even less press coverage (the New York Times gave it 4½ inches on page 46). It must be said that few people have made such a profound change to the world, and Shockley, Brattain and Bardeen were deservedly awarded the Nobel Prize in 1956.

Magnetic Core Memory

A major problem with early computers was how to store data and program information when the power was taken away (typically after a fault). Punch cards were slow and extremely labor-intensive, and it would typically take many hours to save and properly restore the memory of the computer. This was overcome when, in 1951, Jay Forrester at MIT created the magnetic core memory. This used an array of ferrite toroids (or cores) to store binary information. As these cores were magnetic, they held the state of the binary digit and were thus non-volatile (which means that they retain their state even when the power is taken away). Unfortunately, they were relatively slow to access, but within a few years they could be accessed within a fraction of a millionth of a second. Toroid memories were used in most systems in the 1950s and 1960s, and it was only a small but innovative company, Intel, which broke their considerable share of the memory market when they developed a silicon memory in 1970.

A year later, G.W. Dummer, a radar expert from Britain’s Royal Radar Establishment, presented a paper proposing that a solid block of materials could be used to connect electronic components, without connecting wires. This would lay the foundation for the integrated circuit, but the world would have to wait for another decade before it was properly realized.

IBM

At the time, the Snow White of the industry, IBM, had a considerable share of the computer market. In fact, it had gained so much of the market that a complaint was filed against it alleging monopolistic practices in its computer business. This, their competitors reckoned, was in violation of the Sherman Act. By January 1954, the US District Court made a final judgment on the complaint against IBM: a ‘consent decree’ which placed limitations on how IBM conducted business with respect to ‘electronic data processing machines’. The word computer was never really used, as most people at the time reckoned that there was a worldwide market for just a few hundred computers (nowadays there can be that many inside a single automobile).

From 1954 … say hello to the silicon transistor

First transistor

After the success at the Bell Labs, transistors had been made from germanium, but this is not a robust material and cannot withstand high temperatures. The solution to this problem came from a surprising place: Texas Instruments, a geophysical survey company which had diversified into transistors. They were the first to propose the use of silicon transistors and then, in May 1954, they started the first commercial production of silicon transistors. Soon many companies were producing silicon transistors and, by 1955, the market for electronic valves had peaked, while the market for transistors was rocketing. Unfortunately, as with many major changes in technology, the larger companies failed to change their business quickly enough to cope with the new techniques. Thus Western Electric, CBS, Raytheon and Westinghouse all quickly lost their market share in electronic components to the new transistor manufacturing companies, such as Texas Instruments, Motorola, Hughes and RCA.

IBM 650

Valves, though, were still popular at the time, and IBM used them to build the IBM 650 which, at the time, was considered the workhorse of the industry. It was, though, the beginning of the end for valves in computers when the Massachusetts Institute of Technology produced the first transistorized computer: the TX-0 (Transistorized Experimental computer). IBM could see the potential of the transistor for computers, and they quickly switched from valves to transistors and, in 1959, produced the first commercial transistorized computer. This was the IBM 7090/7094 series, and it dominated the computer market for years.

RAMAC 305 disk

1956. A year later, a court decree ruled that IBM still had too much control of the industry, and that it would be forced to sell its computers, rather than just renting them out. In November 1956, IBM also showed that, apart from being fast to adopt new technology, they could also innovate, when they introduced the first hard disk: the RAMAC 305. It was towering by today’s standards, with 50 two-foot-diameter platters giving a total capacity of 5MB, but compared with magnetic cores it could store much more binary information.

The beginnings of the software industry can be traced back to the early fifties, when work was being undertaken on assemblers. These used simple text representations of the binary operations that the computer understood (such as ADD A, B to add two numbers), and the assembler would then convert them into binary form. This aided programmers, as they did not have to continually look up the binary equivalent of the command that they required. It also made programs easier to read. The great advance occurred around 1956 when one of the

Grace Hopper

all-time greats, Grace Hopper (1906-1992), started to develop compilers for the UNIVAC computer. These graceful programs would convert a language which was readable by humans into a form that a computer could understand. This work would lead to the development of the COBOL programming language (which has survived to the present day, although it is still blamed for many of the Year 2000 problems).
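To illustrate the assembler idea described above (in Python, with an invented instruction set rather than any real machine’s), an assembler is essentially a lookup from mnemonics to binary patterns:

```python
# Toy assembler sketch: translate readable mnemonics such as "ADD A, B" into
# binary machine words. The opcodes and registers are invented for illustration.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "SUB": 0b0011, "STORE": 0b0100}
REGISTERS = {"A": 0b00, "B": 0b01, "C": 0b10, "D": 0b11}

def assemble(line: str) -> str:
    """Pack one instruction into an 8-bit word: 4-bit opcode, two 2-bit registers."""
    mnemonic, operands = line.split(maxsplit=1)
    dst, src = [REGISTERS[r.strip()] for r in operands.split(",")]
    word = (OPCODES[mnemonic] << 4) | (dst << 2) | src
    return f"{word:08b}"

program = ["LOAD A, B", "ADD A, C", "STORE A, D"]
for line in program:
    print(line, "->", assemble(line))
# LOAD A, B  -> 00010001
# ADD A, C   -> 00100010
# STORE A, D -> 01000011
```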

1957. To capitalize on his success, Shockley had founded Shockley Semiconductor in 1955. Then, in 1957, eight engineers decided that they could not work within Shockley Semiconductor and formed Fairchild Semiconductor, which would become one of the most innovative companies in Silicon Valley. Unfortunately, Fairchild Semiconductor seldom exploited its developments fully, and was more of an incubator for many of the innovators in the electronics industry.

Ken Olsen
PDP-1

Around the same time, Kenneth Olsen founded the Digital Equipment Corporation (DEC) in an old woolen mill in Maynard, Massachusetts. DEC would go on to become one of the key companies in the computer industry, along with IBM, but would eventually become one of the main casualties of the development of the IBM PC. Initially they developed plug-in computer boards with transistorized logic circuits, but by 1960 they had developed the first of their computers: the PDP-1, which cost just over one-tenth of the normal cost of the systems available at the time. After an extremely successful period of selling the PDP range, in 1977 they developed a complete range of computer systems: the VAX range, which used the excellent VMS operating system.

Punch cards

Programs on the mainframe computers of the 1950s were typically written either in machine code (using the actual binary language that the computer understood) or in one of the new compiled languages, such as COBOL (COmmon Business Oriented Language) and FORTRAN (FORmula TRANslation). FORTRAN was well suited to engineering and science as it was based around mathematical formulas, whereas COBOL was more suited to business applications and was written in a form that business managers could understand. FORTRAN was developed in 1957 (typically known as FORTRAN 57) and was a considerable enhancement in the development of computer programs, as programs could be written in a near-English form, rather than in a binary language. With FORTRAN, the compiler converts the FORTRAN statements into a form that the computer can understand. At the time, FORTRAN programs were stored on punch cards and loaded into a punch card reader to be read into the computer. Each punch card had holes punched into it to represent characters, and any changes to a program would require a new set of punch cards.
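As a loose illustration of what such a compiler does (a Python sketch targeting an invented instruction set, not how IBM’s FORTRAN compiler actually worked), a formula written in near-English form can be translated mechanically into simple machine-style steps:

```python
# Sketch of the compilation idea: turn a formula such as "AREA = 3.14159 * R * R"
# into simple machine-style steps. The target instruction set is invented here;
# real FORTRAN compilers generated actual machine code for their target computer.
def compile_assignment(statement: str) -> list[str]:
    target, expression = (s.strip() for s in statement.split("="))
    factors = expression.split("*")          # handle only '*' chains, for brevity
    code = [f"LOAD {factors[0].strip()}"]
    for factor in factors[1:]:
        code.append(f"MUL {factor.strip()}")
    code.append(f"STORE {target}")
    return code

for instruction in compile_assignment("AREA = 3.14159 * R * R"):
    print(instruction)
# LOAD 3.14159
# MUL R
# MUL R
# STORE AREA
```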

See more in the next blog … the transistors on a chip