Computer
A computer is a machine that can be programmed to carry out sequences of arithmetic or logical operations (computation) automatically. Modern digital electronic computers can perform generic sets of operations known as programs. These programs enable computers to perform a wide range of tasks. A computer system is a nominally complete computer that includes the hardware, operating system (main software), and peripheral equipment needed and used for full operation. This term may also refer to a group of computers that are linked and function together, such as a computer network or computer cluster.
Computers and computing devices from different eras – clockwise from top left:
Early vacuum tube computer (ENIAC)
Mainframe computer (IBM System 360)
Desktop computer (IBM ThinkCentre S50 with monitor)
Supercomputer (IBM Summit)
Video game console (Nintendo GameCube)
Smartphone (LYF Water 2)
A broad range of industrial and consumer products use computers as control systems. Simple special-purpose devices like microwave ovens and remote controls are included, as are factory devices like industrial robots and computer-aided design, as well as general-purpose devices like personal computers and mobile devices like smartphones. Computers power the Internet, which links billions of other computers and users.
Early computers were meant to be used only for calculations. Simple manual instruments like the abacus
have aided people in doing calculations since ancient times. Early in the Industrial Revolution, some
mechanical devices were built to automate long, tedious tasks, such as guiding patterns for looms. More
sophisticated electrical machines did specialized analog calculations in the early 20th century. The first
digital electronic calculating machines were developed during World War II. The first semiconductor
transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and monolithic
integrated circuit chip technologies in the late 1950s, leading to the microprocessor and the microcomputer
revolution in the 1970s. The speed, power and versatility of computers have been increasing dramatically
ever since then, with transistor counts increasing at a rapid pace (as predicted by Moore's law), leading to
the Digital Revolution during the late 20th to early 21st centuries.
Conventionally, a modern computer consists of at least one processing element, typically a central
processing unit (CPU) in the form of a microprocessor, along with some type of computer memory,
typically semiconductor memory chips. The processing element carries out arithmetic and logical
operations, and a sequencing and control unit can change the order of operations in response to stored
information. Peripheral devices include input devices (keyboards, mice, joystick, etc.), output devices
(monitor screens, printers, etc.), and input/output devices that perform both functions (e.g., the 2000s-era
touchscreen). Peripheral devices allow information to be retrieved from an external source and they enable
the result of operations to be saved and retrieved.
Etymology
According to the Oxford English Dictionary, the first known
use of computer was in a 1613 book called The Yong Mans
Gleanings by the English writer Richard Brathwait: "I haue
[sic] read the truest computer of Times, and the best
Arithmetician that euer [sic] breathed, and he reduceth thy
dayes into a short number." This usage of the term referred to a
human computer, a person who carried out calculations or
computations. The word continued with the same meaning
until the middle of the 20th century. During the latter part of
this period women were often hired as computers because they
could be paid less than their male counterparts.[1] By 1943,
most human computers were women.[2]
A human computer, with microscope and calculator, 1952
The Online Etymology Dictionary gives the first attested use of
computer in the 1640s, meaning 'one who calculates'; this is an
"agent noun from compute (v.)". The Online Etymology Dictionary states that the use of the term to mean
" 'calculating machine' (of any type) is from 1897." The Online Etymology Dictionary indicates that the
"modern use" of the term, to mean 'programmable digital electronic computer' dates from "1945 under this
name; [in a] theoretical [sense] from 1937, as Turing machine".[3]
History
Pre-20th century
Devices have been used to aid computation for thousands of years, mostly using one-to-one
correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later record
keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which represented
counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a][4] The use of
counting rods is one example.
The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices used in
Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables have been
invented. In a medieval European counting house, a checkered cloth would be placed on a table, and
markers moved around on it according to certain rules, as an aid to calculating sums of money.[5]
The Antikythera mechanism is believed to be the earliest known mechanical analog computer, according to
Derek J. de Solla Price.[6] It was designed to calculate astronomical positions. It was discovered in 1901 in
the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated
to approximately 100 BCE.
Devices of comparable complexity
to the Antikythera mechanism would
not reappear until the fourteenth
century.[7]
Many mechanical aids to calculation
and measurement were constructed
for astronomical and navigation use.
The Chinese suanpan (算盘). The number represented on this abacus is 6,302,715,408.
The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th century.[8] The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was effectively an analog computer capable of working out several different kinds of problems in spherical astronomy. An astrolabe incorporating a mechanical calendar computer[9][10] and gear-wheels was invented by Abi Bakr of Isfahan, Persia in 1235.[11] Abū Rayhān al-Bīrūnī invented the first mechanical geared lunisolar calendar astrolabe,[12] an early fixed-wired knowledge processing machine[13] with a gear train and gear-wheels,[14] c. 1000 AD.
The sector, a calculating instrument used for solving problems in
proportion, trigonometry, multiplication and division, and for various
functions, such as squares and cube roots, was developed in the late 16th
century and found application in gunnery, surveying and navigation.
The planimeter was a manual instrument to calculate the area of a closed
figure by tracing over it with a mechanical linkage.
The Ishango bone, a bone tool dating back to prehistoric Africa
The Antikythera mechanism, dating back to ancient Greece circa 150–100 BCE, is an early analog computing device.
A slide rule
The slide rule was invented around 1620–1630 by the English clergyman William Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing multiplication and division. As slide rule development progressed, added scales provided reciprocals, squares and square roots, cubes and cube roots, as well as transcendental functions such as logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with special scales are still used for quick performance of routine calculations, such as the E6B circular slide rule used for time and distance calculations on light aircraft.
In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could write
holding a quill pen. By switching the number and order of its internal wheels different letters, and hence
different messages, could be produced. In effect, it could be mechanically "programmed" to read
instructions. Along with two other complex machines, the doll is at the Musée d'Art et d'Histoire of
Neuchâtel, Switzerland, and still operates.[15]
In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine, which,
through a system of pulleys and cylinders, could predict the perpetual calendar for every year from
0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day length. The tide-predicting
machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in
shallow waters. It used a system of pulleys and wires to automatically calculate predicted tide levels for a
set period at a particular location.
The differential analyser, a mechanical analog computer designed to solve differential equations by
integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William Thomson had
already discussed the possible construction of such calculators, but he had been stymied by the limited
output torque of the ball-and-disk integrators.[16] In a differential analyzer, the output of one integrator
drove the input of the next integrator, or a graphing output. The torque amplifier was the advance that
allowed these machines to work. Starting in the 1920s, Vannevar Bush and others developed mechanical
differential analyzers.
First computer
Charles Babbage c. 1850
Charles Babbage, an English mechanical engineer and polymath, originated the concept of a programmable computer. Considered the "father of the computer",[17] he conceptualized and invented the first mechanical computer in the early 19th century.
After working on his difference engine, designed to aid in navigational calculations, he announced his invention in 1822 in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables".[18] In 1833 he realized that a much more general design, an analytical engine, was possible. The input of programs and data was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. The Engine incorporated an arithmetic logic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first design for a general-purpose computer that could be described in modern terms as Turing-complete.[19][20]
A diagram of a portion of Babbage's Difference engine
The Difference Engine Number 2 at the Intellectual Ventures laboratory in Seattle
The machine was about a century ahead of its time. All the parts for his
machine had to be made by hand – this was a major problem for a device
with thousands of parts. Eventually, the project was dissolved with the decision of the British Government
to cease funding. Babbage's failure to complete the analytical engine can be chiefly attributed to political
and financial difficulties as well as his desire to develop an increasingly sophisticated computer and to
move ahead faster than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a
simplified version of the analytical engine's computing unit (the mill) in 1888. He gave a successful
demonstration of its use in computing tables in 1906.
In his work Essays on Automatics published in 1914, the Spanish engineer Leonardo Torres Quevedo
wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical
Engine. He described the Analytical Engine as exemplifying his theories about the potential power of
machines, and took the problem of designing such an engine as a challenge to his skills as an inventor of electromechanical devices. The paper contains a design of a machine capable of calculating completely automatically the value of the formula a^x(y − z)^2, for a sequence of sets of values of the variables
involved. The whole machine was to be controlled by a read-only program, which was complete with
provisions for conditional branching. He also introduced the idea of floating-point arithmetic.[21][22] In
1920, to celebrate the 100th anniversary of the invention of the arithmometer, Torres presented in Paris the
Electromechanical Arithmometer, a prototype that used relays to implement the functions of an arithmetic
unit connected to a (possibly remote) typewriter, on which commands could be typed and the results printed
automatically.[23][24][25]
Analog computers
Sir William Thomson's third tide-predicting machine design, 1879–81
During the first half of the 20th century, many scientific computing
needs were met by increasingly sophisticated analog computers, which
used a direct mechanical or electrical model of the problem as a basis
for computation. However, these were not programmable and
generally lacked the versatility and accuracy of modern digital
computers.[26] The first modern analog computer was a tide-predicting
machine, invented by Sir William Thomson (later to become Lord
Kelvin) in 1872. The differential analyser, a mechanical analog
computer designed to solve differential equations by integration using
wheel-and-disc mechanisms, was conceptualized in 1876 by James
Thomson, the elder brother of the more famous Sir William
Thomson.[16]
The art of mechanical analog computing reached its zenith with the
differential analyzer, built by H. L. Hazen and Vannevar Bush at MIT
starting in 1927. This built on the mechanical integrators of James Thomson and the torque amplifiers
invented by H. W. Nieman. A dozen of these devices were built before their obsolescence became obvious.
By the 1950s, the success of digital electronic computers had spelled the end for most analog computing
machines, but analog computers remained in use during the 1950s in some specialized applications such as
education (slide rule) and aircraft (control systems).
Digital computers
Electromechanical
By 1938, the United States Navy had developed an electromechanical analog computer small enough to
use aboard a submarine. This was the Torpedo Data Computer, which used trigonometry to solve the
problem of firing a torpedo at a moving target. During World War II similar devices were developed in
other countries as well.
Early digital computers were electromechanical; electric switches drove mechanical relays to perform the
calculation. These devices had a low operating speed and were eventually superseded by much faster all-electric computers, originally using vacuum tubes. The Z2, created by German engineer Konrad Zuse in
1939 in Berlin, was one of the earliest examples of an electromechanical relay computer.[27]
Replica of Konrad Zuse's Z3, the first fully automatic, digital (electromechanical) computer
Konrad Zuse, inventor of the modern computer[28][29]
In 1941, Zuse followed his earlier machine up with the Z3, the world's first working electromechanical programmable, fully automatic digital computer.[30][31] The Z3 was built with 2000 relays, implementing a 22-bit word length that operated at a clock frequency of about 5–10 Hz.[32] Program code was supplied on punched film while data could be stored in 64 words of memory or supplied from the keyboard. It was quite similar to modern machines in some respects, pioneering numerous advances such as floating-point numbers. Rather than the harder-to-implement decimal system (used in Charles Babbage's earlier design), using a binary system meant that Zuse's machines were easier to build and potentially more reliable, given the technologies available at that time.[33] The Z3 was not itself a universal computer but could be extended to be Turing complete.[34][35]
Zuse's next computer, the Z4, became the world's first commercial computer; after initial delay due to the
Second World War, it was completed in 1950 and delivered to the ETH Zurich.[36] The computer was
manufactured by Zuse's own company, Zuse KG, which was founded in 1941 as the first company with
the sole purpose of developing computers in Berlin.[36]
Vacuum tubes and digital electronic circuits
Purely electronic circuit elements soon replaced their mechanical and electromechanical equivalents, at the
same time that digital calculation replaced analog. The engineer Tommy Flowers, working at the Post
Office Research Station in London in the 1930s, began to explore the possible use of electronics for the
telephone exchange. Experimental equipment that he built in 1934 went into operation five years later,
converting a portion of the telephone exchange network into an electronic data processing system, using
thousands of vacuum tubes.[26] In the US, John Vincent Atanasoff and Clifford E. Berry of Iowa State
University developed and tested the Atanasoff–Berry Computer (ABC) in 1942,[37] the first "automatic
electronic digital computer".[38] This design was also all-electronic and used about 300 vacuum tubes, with
capacitors fixed in a mechanically rotating drum for memory.[39]
During World War II, the British code-breakers at Bletchley
Park achieved a number of successes at breaking encrypted
German military communications. The German encryption
machine, Enigma, was first attacked with the help of the
electro-mechanical bombes which were often run by
women.[40][41] To crack the more sophisticated German
Lorenz SZ 40/42 machine, used for high-level Army
communications, Max Newman and his colleagues
commissioned Flowers to build the Colossus.[39] He spent
eleven months from early February 1943 designing and
building the first Colossus.[42] After a functional test in
December 1943, Colossus was shipped to Bletchley Park,
where it was delivered on 18 January 1944[43] and attacked its
first message on 5 February.[39]
Colossus, the first electronic digital programmable computing device, was used to break German ciphers during World War II. It is seen here in use at Bletchley Park in 1943.
Colossus was the world's first electronic digital programmable computer.[26] It used a large number of
valves (vacuum tubes). It had paper-tape input and was capable of being configured to perform a variety of
boolean logical operations on its data, but it was not Turing-complete. Nine Mk II Colossi were built (the Mk I was converted to a Mk II, making ten machines in total). Colossus Mark I contained 1,500 thermionic valves (tubes), but Mark II, with 2,400 valves, was both five times faster and simpler to operate than Mark I,
greatly speeding the decoding process.[44][45]
ENIAC was the first electronic, Turing-complete device, and performed ballistics trajectory calculations for the United States Army.
The ENIAC[46] (Electronic Numerical Integrator and
Computer) was the first electronic programmable computer
built in the U.S. Although the ENIAC was similar to the
Colossus, it was much faster, more flexible, and it was Turing-complete. Like the Colossus, a "program" on the ENIAC was
defined by the states of its patch cables and switches, a far cry
from the stored program electronic machines that came later.
Once a program was written, it had to be mechanically set into
the machine with manual resetting of plugs and switches. The
programmers of the ENIAC were six women, often known
collectively as the "ENIAC girls".[47][48]
It combined the high speed of electronics with the ability to be
programmed for many complex problems. It could add or
subtract 5000 times a second, a thousand times faster than any
other machine. It also had modules to multiply, divide, and
square root. High speed memory was limited to 20 words (about 80 bytes). Built under the direction of
John Mauchly and J. Presper Eckert at the University of Pennsylvania, ENIAC's development and
construction lasted from 1943 to full operation at the end of 1945. The machine was huge, weighing 30
tons, using 200 kilowatts of electric power and contained over 18,000 vacuum tubes, 1,500 relays, and
hundreds of thousands of resistors, capacitors, and inductors.[49]
Modern computers
Concept of modern computer
The principle of the modern computer was proposed by Alan Turing in his seminal 1936 paper,[50] On
Computable Numbers. Turing proposed a simple device that he called "Universal Computing machine"
and that is now known as a universal Turing machine. He proved that such a machine is capable of
computing anything that is computable by executing instructions (program) stored on tape, allowing the
machine to be programmable. The fundamental concept of Turing's design is the stored program, where all
the instructions for computing are stored in memory. Von Neumann acknowledged that the central concept
of the modern computer was due to this paper.[51] Turing machines are to this day a central object of study
in the theory of computation. Except for the limitations imposed by their finite memory stores, modern
computers are said to be Turing-complete, which is to say, they have algorithm execution capability
equivalent to a universal Turing machine.
Stored programs
Early computing machines had fixed programs. Changing the function of such a machine required re-wiring and re-structuring it.[39] With the proposal of the stored-program computer this changed. A stored-program computer includes by design an instruction set and can store in memory a set of instructions (a
program) that details the computation. The theoretical basis for the stored-program computer was laid out
by Alan Turing in his 1936 paper. In 1945, Turing joined the
National Physical Laboratory and began work on developing
an electronic stored-program digital computer. His 1945 report
"Proposed Electronic Calculator" was the first specification for
such a device. John von Neumann at the University of
Pennsylvania also circulated his First Draft of a Report on the
EDVAC in 1945.[26]
A section of the reconstructed Manchester Baby, the first electronic stored-program computer
The Manchester Baby was the world's first stored-program computer. It was built at the University of Manchester in England by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[52] It was designed as a testbed for the Williams tube, the first random-access digital storage device.[53] Although the computer was described as "small and primitive" by a 1998 retrospective, it was the first working machine to contain all of the elements essential to a modern electronic computer.[54] As soon as the Baby had demonstrated the feasibility of its design, a project began at the university to develop it into a practically useful computer, the Manchester Mark 1.
The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the world's first commercially
available general-purpose computer.[55] Built by Ferranti, it was delivered to the University of Manchester
in February 1951. At least seven of these later machines were delivered between 1953 and 1957, one of
them to Shell labs in Amsterdam.[56] In October 1947 the directors of British catering company J. Lyons &
Company decided to take an active role in promoting the commercial development of computers. Lyons's
LEO I computer, modelled closely on the Cambridge EDSAC of 1949, became operational in April
1951[57] and ran the world's first routine office computer job.
Grace Hopper was the first to develop a compiler for a programming language.[2]
Transistors
The concept of a field-effect transistor was proposed by Julius
Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while
working under William Shockley at Bell Labs, built the first
working transistor, the point-contact transistor, in 1947, which was
followed by Shockley's bipolar junction transistor in 1948.[58][59]
Bipolar junction transistor (BJT)
From 1955 onwards, transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Compared to vacuum tubes, transistors have many advantages: they are smaller, and require less power than vacuum tubes, so give off less heat. Junction transistors were much more reliable than vacuum tubes and had longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. However, early junction transistors were relatively bulky devices that were difficult to manufacture on a mass-production basis, which limited them to a number of specialised applications.[60]
At the University of Manchester, a team under the leadership of Tom Kilburn designed and built a machine
using the newly developed transistors instead of valves.[61] Their first transistorised computer, and the first
in the world, was operational by 1953, and a second version was completed there in April 1955. However,
the machine did make use of valves to generate its 125 kHz clock waveforms and in the circuitry to read
and write on its magnetic drum memory, so it was not the first completely transistorized computer. That
distinction goes to the Harwell CADET of 1955,[62] built by the electronics division of the Atomic Energy
Research Establishment at Harwell.[62][63]
The metal–oxide–silicon field-effect transistor (MOSFET), also
known as the MOS transistor, was invented by Mohamed M. Atalla
and Dawon Kahng at Bell Labs in 1959.[64] It was the first truly
compact transistor that could be miniaturised and mass-produced for
a wide range of uses.[60] With its high scalability,[65] and much
lower power consumption and higher density than bipolar junction
transistors,[66] the MOSFET made it possible to build high-density
integrated circuits.[67][68] In addition to data processing, it also
enabled the practical use of MOS transistors as memory cell storage
elements, leading to the development of MOS semiconductor
memory, which replaced earlier magnetic-core memory in
computers. The MOSFET led to the microcomputer revolution,[69]
and became the driving force behind the computer
revolution.[70][71] The MOSFET is the most widely used transistor
in computers,[72][73] and is the fundamental building block of
digital electronics.[74]
MOSFET (MOS transistor), showing gate (G), body (B), source (S) and drain (D) terminals. The gate is separated from the body by an insulating layer (pink).
Integrated circuits
Die photograph of a MOS 6502, an early 1970s microprocessor integrating 3500 transistors on a single chip
Integrated circuits are typically packaged in plastic, metal, or ceramic cases to protect the IC from damage and for ease of assembly.
The next great advance in computing power came with the advent of the integrated circuit (IC). The idea of
the integrated circuit was first conceived by a radar scientist working for the Royal Radar Establishment of
the Ministry of Defence, Geoffrey W.A. Dummer. Dummer presented the first public description of an
integrated circuit at the Symposium on Progress in Quality Electronic Components in Washington, D.C., on
7 May 1952.[75]
The first working ICs were invented by Jack Kilby at Texas Instruments and Robert Noyce at Fairchild
Semiconductor.[76] Kilby recorded his initial ideas concerning the integrated circuit in July 1958,
successfully demonstrating the first working integrated example on 12 September 1958.[77] In his patent
application of 6 February 1959, Kilby described his new device as "a body of semiconductor material ...
wherein all the components of the electronic circuit are completely integrated".[78][79] However, Kilby's
invention was a hybrid integrated circuit (hybrid IC), rather than a monolithic integrated circuit (IC)
chip.[80] Kilby's IC had external wire connections, which made it difficult to mass-produce.[81]
Noyce also came up with his own idea of an integrated circuit half a year later than Kilby.[82] Noyce's
invention was the first true monolithic IC chip.[83][81] His chip solved many practical problems that Kilby's
had not. Produced at Fairchild Semiconductor, it was made of silicon, whereas Kilby's chip was made of
germanium. Noyce's monolithic IC was fabricated using the planar process, developed by his colleague
Jean Hoerni in early 1959. In turn, the planar process was based on Mohamed M. Atalla's work on
semiconductor surface passivation by silicon dioxide in the late 1950s.[84][85][86]
Modern monolithic ICs are predominantly MOS (metal–oxide–semiconductor) integrated circuits, built
from MOSFETs (MOS transistors).[87] The earliest experimental MOS IC to be fabricated was a 16-transistor chip built by Fred Heiman and Steven Hofstein at RCA in 1962.[88] General Microelectronics
later introduced the first commercial MOS IC in 1964,[89] developed by Robert Norman.[88] Following the
development of the self-aligned gate (silicon-gate) MOS transistor by Robert Kerwin, Donald Klein and
John Sarace at Bell Labs in 1967, the first silicon-gate MOS IC with self-aligned gates was developed by
Federico Faggin at Fairchild Semiconductor in 1968.[90] The MOSFET has since become the most critical
device component in modern ICs.[87]
The development of the MOS integrated circuit led to the invention of the microprocessor,[91][92] and
heralded an explosion in the commercial and personal use of computers. While the subject of exactly which
device was the first microprocessor is contentious, partly due to lack of agreement on the exact definition of
the term "microprocessor", it is largely undisputed that the first single-chip microprocessor was the Intel
4004,[93] designed and realized by Federico Faggin with his silicon-gate MOS IC technology,[91] along
with Ted Hoff, Masatoshi Shima and Stanley Mazor at Intel.[b][95] In the early 1970s, MOS IC technology
enabled the integration of more than 10,000 transistors on a single chip.[68]
Systems on a Chip (SoCs) are complete computers on a microchip (or chip) the size of a coin.[96] They may or may not have integrated RAM and flash memory. If not integrated, the RAM is usually placed directly above (known as Package on package) or below (on the opposite side of the circuit board) the SoC, and the flash memory is usually placed right next to the SoC. This is all done to improve data transfer speeds, as the data signals do not have to travel long distances. Since ENIAC in 1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon 865) being the size of a coin while also being hundreds of thousands of times more powerful than ENIAC, integrating billions of transistors, and consuming only a few watts of power.
Mobile computers
The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg) IBM 5100 was an
early example. Later portables such as the Osborne 1 and Compaq Portable were considerably lighter but
still needed to be plugged in. The first laptops, such as the Grid Compass, removed this requirement by
incorporating batteries – and with the continued miniaturization of computing resources and advancements
in portable battery life, portable computers grew in popularity in the 2000s.[97] The same developments
allowed manufacturers to integrate computing resources into cellular mobile phones by the early 2000s.
These smartphones and tablets run on a variety of operating systems and recently became the dominant
computing device on the market.[98] These are powered by System on a Chip (SoCs), which are complete
computers on a microchip the size of a coin.[96]
Types
Computers can be classified in a number of different ways, including:
By architecture
Analog computer
Digital computer
Hybrid computer
Harvard architecture
Von Neumann architecture
Complex instruction set computer
Reduced instruction set computer
By size, form-factor and purpose
Supercomputer
Mainframe computer
Minicomputer (term no longer used),[99] Midrange computer
Server
Rackmount server
Blade server
Tower server
Personal computer
Workstation
Microcomputer (term no longer used)[100]
Home computer (term fallen into disuse)[101]
Desktop computer
Tower desktop
Slimline desktop
Multimedia computer (non-linear editing system computers, video editing PCs and the like; this term is no longer used)[102]
Gaming computer
All-in-one PC
Nettop (Small form factor PCs, Mini PCs)
Home theater PC
Keyboard computer
Portable computer
Thin client
Internet appliance
Laptop
Desktop replacement computer
Gaming laptop
Rugged laptop
2-in-1 PC
Ultrabook
Chromebook
Subnotebook
Netbook
Mobile computers:
Tablet computer
Smartphone
Ultra-mobile PC
Pocket PC
Palmtop PC
Handheld PC
Wearable computer
Smartwatch
Smartglasses
Single-board computer
Plug computer
Stick PC
Programmable logic controller
Computer-on-module
System on module
System in a package
System-on-chip (also known as an Application Processor or AP if it lacks circuitry such as radio circuitry)
Microcontroller
Hardware
Video demonstrating the standard components of a "slimline" computer
The term hardware covers all of those parts of a computer that are tangible physical objects. Circuits, computer chips, graphics cards, sound cards, memory (RAM), motherboards, displays, power supplies, cables, keyboards, printers and "mice" input devices are all hardware.
History of computing hardware
First generation (mechanical/electromechanical):
    Calculators: Pascal's calculator, Arithmometer, Difference engine, Quevedo's analytical machines
    Programmable devices: Jacquard loom, Analytical engine, IBM ASCC/Harvard Mark I, Harvard Mark II, IBM SSEC, Z1, Z2, Z3
Second generation (vacuum tubes):
    Calculators: Atanasoff–Berry Computer, IBM 604, UNIVAC 60, UNIVAC 120
    Programmable devices: Colossus, ENIAC, Manchester Baby, EDSAC, Manchester Mark 1, Ferranti Pegasus, Ferranti Mercury, CSIRAC, EDVAC, UNIVAC I, IBM 701, IBM 702, IBM 650, Z22
Third generation (discrete transistors and SSI, MSI, LSI integrated circuits):
    Mainframes: IBM 7090, IBM 7080, IBM System/360, BUNCH
    Minicomputer: HP 2116A, IBM System/32, IBM System/36, LINC, PDP-8, PDP-11
    Desktop computer: HP 9100
Fourth generation (VLSI integrated circuits):
    Minicomputer: VAX, IBM AS/400
    4-bit microcomputer: Intel 4004, Intel 4040
    8-bit microcomputer: Intel 8008, Intel 8080, Motorola 6800, Motorola 6809, MOS Technology 6502, Zilog Z80
    16-bit microcomputer: Intel 8088, Zilog Z8000, WDC 65816/65802
    32-bit microcomputer: Intel 80386, Pentium, Motorola 68000, ARM
    64-bit microcomputer[c]: Alpha, MIPS, PA-RISC, PowerPC, SPARC, x86-64, ARMv8-A
    Embedded computer: Intel 8048, Intel 8051
    Personal computer: Desktop computer, Home computer, Laptop computer, Personal digital assistant (PDA), Portable computer, Tablet PC, Wearable computer
Theoretical/experimental:
    Quantum computer: IBM Q System One
    Chemical computer: DNA computing
    Optical computer
    Spintronics-based computer
    Wetware/Organic computer
Other hardware topics
Peripheral device (input/output):
    Input: Mouse, keyboard, joystick, image scanner, webcam, graphics tablet, microphone
    Output: Monitor, printer, loudspeaker
    Both: Floppy disk drive, hard disk drive, optical disc drive, teleprinter
Computer buses:
    Short range: RS-232, SCSI, PCI, USB
    Long range (computer networking): Ethernet, ATM, FDDI
A general-purpose computer has four main components: the arithmetic logic unit (ALU), the control unit,
the memory, and the input and output devices (collectively termed I/O). These parts are interconnected by
buses, often made of groups of wires. Inside each of these parts are thousands to trillions of small electrical
circuits which can be turned off or on by means of an electronic switch. Each circuit represents a bit (binary
digit) of information so that when the circuit is on it represents a "1", and when off it represents a "0" (in
positive logic representation). The circuits are arranged in logic gates so that one or more of the circuits may
control the state of one or more of the other circuits.
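To make that concrete, here is a minimal sketch in Python (illustrative, not from the article; the gate functions and the least-significant-bit-first list convention are assumptions) showing circuits controlling circuits: AND/OR/XOR gates composed into a half adder, a full adder, and finally a ripple-carry adder.
    # Illustrative sketch: logic gates as functions, composed into adders.
    def AND(a, b): return a & b
    def OR(a, b): return a | b
    def XOR(a, b): return a ^ b

    def half_adder(a, b):
        # Add two bits: returns (sum bit, carry bit).
        return XOR(a, b), AND(a, b)

    def full_adder(a, b, carry_in):
        # Add two bits plus an incoming carry: returns (sum, carry out).
        s1, c1 = half_adder(a, b)
        s2, c2 = half_adder(s1, carry_in)
        return s2, OR(c1, c2)

    def add_bits(x_bits, y_bits):
        # Ripple-carry addition of two equal-length bit lists (LSB first).
        carry, out = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        return out + [carry]

    print(add_bits([1, 1, 0], [1, 1, 0]))  # 3 + 3 = 6 -> [0, 1, 1, 0]
Real hardware realizes the same composition with transistors rather than function calls; the point is that each output bit is determined purely by gate logic over the input bits.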
Input devices
When unprocessed data is sent to the computer with the help of input devices, the data is processed and
sent to output devices. The input devices may be hand-operated or automated. The act of processing is
mainly regulated by the CPU. Some examples of input devices are:
Computer keyboard
Digital camera
Digital video
Graphics tablet
Image scanner
Joystick
Microphone
Mouse
Overlay keyboard
Real-time clock
Trackball
Touchscreen
Light pen
Output devices
The means through which a computer gives output are known as output devices. Some examples of output
devices are:
Computer monitor
Printer
PC speaker
Projector
Sound card
Video card
Control unit
The control unit (often called a control system or
central controller) manages the computer's various
components; it reads and interprets (decodes) the
program instructions, transforming them into
control signals that activate other parts of the
computer.[d] Control systems in advanced
computers may change the order of execution of
some instructions to improve performance.
Diagram showing how a particular MIPS architecture instruction would be decoded by the control system
A key component common to all CPUs is the
program counter, a special memory cell (a register) that keeps track of which location in memory the next
instruction is to be read from.[e]
The control system's function is as follows (this is a simplified description, and some of these steps may
be performed concurrently or in a different order depending on the type of CPU):
1. Read the code for the next instruction from the cell indicated by the program counter.
2. Decode the numerical code for the instruction into a set of commands or signals for each of
the other systems.
3. Increment the program counter so it points to the next instruction.
4. Read whatever data the instruction requires from cells in memory (or perhaps from an input
device). The location of this required data is typically stored within the instruction code.
5. Provide the necessary data to an ALU or register.
6. If the instruction requires an ALU or specialized hardware to complete, instruct the hardware
to perform the requested operation.
7. Write the result from the ALU back to a memory location or to a register or perhaps an output
device.
8. Jump back to step (1).
Since the program counter is (conceptually) just another set of memory cells, it can be changed by
calculations done in the ALU. Adding 100 to the program counter would cause the next instruction to be
read from a place 100 locations further down the program. Instructions that modify the program counter are
often known as "jumps" and allow for loops (instructions that are repeated by the computer) and often
conditional instruction execution (both examples of control flow).
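The eight steps above, together with the jump behaviour just described, can be sketched as a toy interpreter. The miniature instruction set below (LOAD/ADD/STORE/JMP/HALT) is invented purely for illustration and does not correspond to any real CPU:
    # Illustrative sketch of the fetch-decode-execute cycle; not a real ISA.
    memory = [
        ("LOAD", 100),    # 0: acc = memory[100]
        ("ADD", 101),     # 1: acc += memory[101]
        ("STORE", 102),   # 2: memory[102] = acc
        ("JMP", 5),       # 3: overwrite the program counter
        ("ADD", 101),     # 4: (skipped by the jump)
        ("HALT", None),   # 5: stop
    ] + [None] * 94 + [7, 35, 0]  # cells 100-102 hold data

    pc, acc = 0, 0                 # program counter and accumulator
    while True:
        op, arg = memory[pc]       # steps 1-2: fetch and decode
        pc += 1                    # step 3: increment the program counter
        if op == "LOAD":           # steps 4-5: read data into a register
            acc = memory[arg]
        elif op == "ADD":          # step 6: ALU operation
            acc += memory[arg]
        elif op == "STORE":        # step 7: write the result back
            memory[arg] = acc
        elif op == "JMP":          # a jump: simply change the program counter
            pc = arg
        elif op == "HALT":
            break                  # step 8 would otherwise loop back to step 1

    print(memory[102])  # 7 + 35 = 42
Note how JMP implements control flow exactly as described in the text: it does nothing but overwrite the program counter.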
The sequence of operations that the control unit goes through to process an instruction is in itself like a
short computer program, and indeed, in some more complex CPU designs, there is another yet smaller
computer called a microsequencer, which runs a microcode program that causes all of these events to
happen.
Central processing unit (CPU)
The control unit, ALU, and registers are collectively known as a central processing unit (CPU). Early
CPUs were composed of many separate components. Since the 1970s, CPUs have typically been
constructed on a single MOS integrated circuit chip called a microprocessor.
Arithmetic logic unit (ALU)
The ALU is capable of performing two classes of operations: arithmetic and logic.[103] The set of
arithmetic operations that a particular ALU supports may be limited to addition and subtraction, or might
include multiplication, division, trigonometry functions such as sine, cosine, etc., and square roots. Some
can operate only on whole numbers (integers) while others use floating point to represent real numbers,
albeit with limited precision. However, any computer that is capable of performing just the simplest
operations can be programmed to break down the more complex operations into simple steps that it can
perform. Therefore, any computer can be programmed to perform any arithmetic operation—although it
will take more time to do so if its ALU does not directly support the operation. An ALU may also compare
numbers and return Boolean truth values (true or false) depending on whether one is equal to, greater than
or less than the other ("is 64 greater than 65?"). Logic operations involve Boolean logic: AND, OR, XOR,
and NOT. These can be useful for creating complicated conditional statements and processing Boolean
logic.
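As a small illustration of that last point, multiplication can be synthesized in software on a machine whose ALU only adds, subtracts, and compares. This deliberately naive sketch uses repeated addition; real systems use much faster shift-and-add methods:
    # Illustrative: multiply non-negative integers using only add/subtract/compare.
    def multiply(a, b):
        result = 0
        while b > 0:             # compare
            result = result + a  # add
            b = b - 1            # subtract
        return result

    print(multiply(6, 7))  # 42, at the cost of 7 additions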
Superscalar computers may contain multiple ALUs, allowing them to process several instructions
simultaneously.[104] Graphics processors and computers with SIMD and MIMD features often contain
ALUs that can perform arithmetic on vectors and matrices.
Memory
A computer's memory can be viewed as a list of cells into which
numbers can be placed or read. Each cell has a numbered "address"
and can store a single number. The computer can be instructed to
"put the number 123 into the cell numbered 1357" or to "add the
number that is in cell 1357 to the number that is in cell 2468 and put
the answer into cell 1595." The information stored in memory may
represent practically anything. Letters, numbers, even computer
instructions can be placed into memory with equal ease. Since the
CPU does not differentiate between different types of information,
it is the software's responsibility to give significance to what the
memory sees as nothing but a series of numbers.
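The two quoted instructions can be rendered almost literally in Python, with memory modeled as a list of numbered cells (the cell addresses come from the text above; the second value, 877, is an arbitrary example):
    # Illustrative: memory as a flat list of numbered cells holding numbers.
    memory = [0] * 4096

    memory[1357] = 123                          # "put the number 123 into the cell numbered 1357"
    memory[2468] = 877                          # arbitrary example value
    memory[1595] = memory[1357] + memory[2468]  # "add cell 1357 to cell 2468, answer into cell 1595"

    print(memory[1595])  # 1000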
Magnetic-core memory (using magnetic cores) was the computer memory of choice in the 1960s, until it was replaced by semiconductor memory (using MOS memory cells).
In almost all modern computers, each memory cell is set up to store
binary numbers in groups of eight bits (called a byte). Each byte is
able to represent 256 different numbers (2^8 = 256); either from 0 to
255 or −128 to +127. To store larger numbers, several consecutive bytes may be used (typically, two, four
or eight). When negative numbers are required, they are usually stored in two's complement notation. Other
arrangements are possible, but are usually not seen outside of specialized applications or historical contexts.
A computer can store any kind of information in memory if it can be represented numerically. Modern
computers have billions or even trillions of bytes of memory.
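These byte and two's-complement facts can be checked directly with Python's integer-to-bytes conversions (the specific values are arbitrary examples):
    # One byte holds 256 values: 0..255 unsigned, or -128..+127 signed.
    print((255).to_bytes(1, "little"))                      # b'\xff'
    print(int.from_bytes(b"\xff", "little", signed=True))   # -1 when read as signed

    # Larger numbers use several consecutive bytes (here two), stored in
    # two's-complement notation when negative.
    raw = (-1000).to_bytes(2, "little", signed=True)
    print(raw.hex())                                        # '18fc'
    print(int.from_bytes(raw, "little", signed=True))       # -1000 round-trips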
The CPU contains a special set of memory cells called registers that can be read and written to much more
rapidly than the main memory area. There are typically between two and one hundred registers depending
on the type of CPU. Registers are used for the most frequently needed data items to avoid having to access
main memory every time data is needed. As data is constantly being worked on, reducing the need to
access main memory (which is often slow compared to the ALU and control units) greatly increases the
computer's speed.
Computer main memory comes in two principal varieties:
random-access memory or RAM
read-only memory or ROM
RAM can be read and written to anytime the CPU commands it, but ROM is preloaded with data and
software that never changes, therefore the CPU can only read from it. ROM is typically used to store the
computer's initial start-up instructions. In general, the contents of RAM are erased when the power to the
computer is turned off, but ROM retains its data indefinitely. In a PC, the ROM contains a specialized
program called the BIOS that orchestrates loading the computer's operating system from the hard disk drive
into RAM whenever the computer is turned on or reset. In embedded computers, which frequently do not
have disk drives, all of the required software may be stored in ROM. Software stored in ROM is often
called firmware, because it is notionally more like hardware than software. Flash memory blurs the
distinction between ROM and RAM, as it retains its data when turned off but is also rewritable. It is
typically much slower than conventional ROM and RAM however, so its use is restricted to applications
where high speed is unnecessary.[f]
In more sophisticated computers there may be one or more RAM cache memories, which are slower than
registers but faster than main memory. Generally computers with this sort of cache are designed to move
frequently needed data into the cache automatically, often without the need for any intervention on the
programmer's part.
Input/output (I/O)
I/O is the means by which a computer exchanges information with
the outside world.[106] Devices that provide input or output to the
computer are called peripherals.[107] On a typical personal
computer, peripherals include input devices like the keyboard and
mouse, and output devices such as the display and printer. Hard
disk drives, floppy disk drives and optical disc drives serve as both
input and output devices. Computer networking is another form of
I/O. I/O devices are often complex computers in their own right, with their own CPU and memory. A graphics processing unit might contain fifty or more tiny computers that perform the calculations necessary to display 3D graphics. Modern desktop computers contain many smaller computers that assist the main CPU in performing I/O. A 2016-era flat screen display contains its own computer circuitry.
Hard disk drives are common storage devices used with computers.
Multitasking
While a computer may be viewed as running one gigantic program stored in its main memory, in some
systems it is necessary to give the appearance of running several programs simultaneously. This is achieved
by multitasking i.e. having the computer switch rapidly between running each program in turn.[108] One
means by which this is done is with a special signal called an interrupt, which can periodically cause the
computer to stop executing instructions where it was and do something else instead. By remembering
where it was executing prior to the interrupt, the computer can return to that task later. If several programs
are running "at the same time". then the interrupt generator might be causing several hundred interrupts per
second, causing a program switch each time. Since modern computers typically execute instructions several
orders of magnitude faster than human perception, it may appear that many programs are running at the
same time even though only one is ever executing in any given instant. This method of multitasking is
sometimes termed "time-sharing" since each program is allocated a "slice" of time in turn.[109]
Before the era of inexpensive computers, the principal use for multitasking was to allow many people to
share the same computer. Seemingly, multitasking would cause a computer that is switching between
several programs to run more slowly, in direct proportion to the number of programs it is running, but most
programs spend much of their time waiting for slow input/output devices to complete their tasks. If a
program is waiting for the user to click on the mouse or press a key on the keyboard, then it will not take a
"time slice" until the event it is waiting for has occurred. This frees up time for other programs to execute so
that many programs may be run simultaneously without unacceptable speed loss.
Multiprocessing
Some computers are designed to distribute their work across
several CPUs in a multiprocessing configuration, a technique
once employed in only large and powerful machines such as
supercomputers, mainframe computers and servers.
Multiprocessor and multi-core (multiple CPUs on a single
integrated circuit) personal and laptop computers are now
widely available, and are being increasingly used in lower-end
markets as a result.
Cray designed many supercomputers that used multiprocessing heavily.[g]
Supercomputers in particular often have highly unique architectures that differ significantly from the basic stored-program architecture and from general-purpose computers.
They often feature thousands of CPUs, customized high-speed
interconnects, and specialized computing hardware. Such
designs tend to be useful for only specialized tasks due to the large scale of program organization required
to use most of the available resources at once. Supercomputers usually see usage in large-scale simulation,
graphics rendering, and cryptography applications, as well as with other so-called "embarrassingly parallel"
tasks.
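An "embarrassingly parallel" workload is one whose tasks need no communication with one another, which is why it spreads easily across CPUs. Here is a minimal sketch using Python's standard multiprocessing module; the work function is an arbitrary stand-in:
    # Illustrative: spread independent tasks across several CPUs.
    from multiprocessing import Pool

    def work(n):
        # An independent unit of work: no data shared with other tasks.
        return sum(i * i for i in range(n))

    if __name__ == "__main__":
        with Pool(processes=4) as pool:            # e.g. one worker per core
            results = pool.map(work, [10**5] * 8)  # 8 independent tasks
        print(len(results), results[0])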
Software
Software refers to parts of the computer which do not have a material form, such as programs, data,
protocols, etc. Software is that part of a computer system that consists of encoded information or computer
instructions, in contrast to the physical hardware from which the system is built. Computer software
includes computer programs, libraries and related non-executable data, such as online documentation or
digital media. It is often divided into system software and application software. Computer hardware and
software require each other and neither can be realistically used on its own. When software is stored in
hardware that cannot easily be modified, such as with BIOS ROM in an IBM PC compatible computer, it is
sometimes called "firmware".
Operating system / system software:
    Unix and BSD: UNIX System V, IBM AIX, HP-UX, Solaris (SunOS), IRIX, List of BSD operating systems
    Linux: List of Linux distributions, Comparison of Linux distributions
    Microsoft Windows
Library
Data
User interface
Application software