
Sord SMP80/20, one of the first microcomputers, released in 1974. It was a successor to the first microcomputer, the Sord SMP80/08 (1972).

MOS transistor (MOSFET), the basic building block of microcomputers.

Intel 4004 (1971), the first single-chip MOS microprocessor. MOS microprocessors form the basis of microcomputers.
Although there is no rigid definition, a microcomputer (sometimes shortened to micro) is most often taken to mean a computer with a microprocessor (µP) as its CPU along with MOS memory. Another general characteristic of these computers is their physically small size.
The microcomputer came after the minicomputer, most notably replacing the many distinct components that made up the minicomputer's CPU with a single integrated microprocessor chip. Such early models were primitive, the earliest microprocessors being little more than general-purpose calculator chips. However, as microprocessor design advanced rapidly from the early 1970s onwards, microcomputers in turn grew faster and cheaper, resulting in an explosion in their popularity.
While the microcomputer displaced older-style designs in many cases, its most significant effects have been to widen access to computers and to expand their use into completely new areas.
History
Background

Mohamed M. Atalla co-invented the MOS transistor (1959) and proposed the MOS integrated circuit (1960), the basis for microcomputers.
The minicomputer ancestors of the modern personal computer used early integrated circuit (microchip) technology, which reduced size and cost, but they contained no microprocessor. This meant that they were still large and difficult to manufacture just like their mainframe predecessors. After the "computer-on-a-chip" was commercialized, the cost to manufacture a computer system dropped dramatically. The arithmetic, logic, and control functions that previously occupied several costly circuit boards were now available in one integrated circuit, making it possible to produce them in high volume. Concurrently, advances in the development of solid state memory eliminated the bulky, costly, and power-hungry magnetic core memory used in prior generations of computers.

Masatoshi Shima and Stanley Mazor developed the Intel 4004 (1971), the first single-chip microprocessor.
The basic building block of every microprocessor and memory chip is the metal-oxide-semiconductor field-effect transistor (MOSFET), also known as the MOS transistor,[1] which was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959.[2][3] Atalla then proposed the MOS integrated circuit (MOS IC) chip in 1960.[4] The MOS transistor made it possible to build high-density microchips,[5][6] which led to the development of the first microprocessors[7] and memory chips.[1]
Microprocessors made their introduction in 1971 with the Intel 4004, developed at Busicom and Intel by Masatoshi Shima, Federico Faggin and Stanley Mazor. The single-chip microprocessor was made possible by an improvement in MOS technology: the silicon-gate MOS integrated circuit, developed by Faggin in 1968, which he later used to design the 4004.[7]
Early microcomputers

Altair 8800 (1975), one of the earliest microcomputers that was affordable and marketed for private use.
The first microcomputer was the SMP80/08, developed in April 1972 by Japan's Sord Computer Corporation (now Toshiba Personal Computer System Corporation).[8] Built in tandem with the Intel 8008 microprocessor that it used, the SMP80/08 never had a commercial release. After the first general-purpose microprocessor, the Intel 8080, was announced in April 1974, Sord announced the SMP80/x, the first microcomputer to use the 8080, in May 1974. The SMP80/x marked a major leap toward the popularization of microcomputers.[8]
The French developers of the Micral N (1973) filed their patents with the term "Micro-ordinateur", a literal equivalent of "Microcomputer", to designate a solid state machine designed with a microprocessor.

Apple II (1977), one of the first home computers.
Early microcomputers such as the Altair 8800 (1975) from MITS had front-mounted switches and diagnostic lights (nicknamed "blinkenlights") to control and indicate internal system status, and were often sold in kit form to hobbyists. These kits would contain an empty printed circuit board which the buyer would fill with the integrated circuits, other individual electronic components, wires and connectors, and then hand-solder all the connections.[9]
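To give a sense of how such front-panel machines were programmed, the sketch below simulates depositing a small Intel 8080 program into memory byte by byte, the way an owner would toggle it in with the address and data switches, and then interprets the handful of opcodes it uses. This is a minimal illustrative model, not any real Altair software: the memory size, the `FrontPanelMachine` class, and its method names are assumptions made for the example; only the 8080 opcode encodings shown in the comments (LDA, MOV B,A, ADD B, STA, HLT) come from the real instruction set.

```python
# Minimal sketch (assumed: tiny memory, no flags, no I/O) of entering and
# running an "add two numbers" program on a front-panel kit computer.

class FrontPanelMachine:
    """Toy model of an 8080-based kit computer programmed via switches."""

    def __init__(self, memory_size=256):
        self.memory = [0x00] * memory_size  # RAM, all zeroes after power-on
        self.a = 0       # accumulator
        self.b = 0       # B register
        self.pc = 0      # program counter
        self.halted = False

    def deposit(self, address, byte):
        """Set the address switches, set the data switches, press DEPOSIT."""
        self.memory[address] = byte & 0xFF

    def step(self):
        """Fetch and execute one instruction (only the opcodes we need)."""
        op = self.memory[self.pc]
        if op == 0x3A:                      # LDA addr: load A from memory
            addr = self.memory[self.pc + 1] | (self.memory[self.pc + 2] << 8)
            self.a = self.memory[addr]
            self.pc += 3
        elif op == 0x47:                    # MOV B,A
            self.b = self.a
            self.pc += 1
        elif op == 0x80:                    # ADD B (carry ignored in this sketch)
            self.a = (self.a + self.b) & 0xFF
            self.pc += 1
        elif op == 0x32:                    # STA addr: store A to memory
            addr = self.memory[self.pc + 1] | (self.memory[self.pc + 2] << 8)
            self.memory[addr] = self.a
            self.pc += 3
        elif op == 0x76:                    # HLT
            self.halted = True
        else:
            raise ValueError(f"opcode {op:#04x} not modelled in this sketch")

    def run(self):
        while not self.halted:
            self.step()


machine = FrontPanelMachine()

# Program: load the byte at 0x80, copy it to B, add the byte at 0x81,
# store the sum at 0x82, then halt. Entered one byte at a time.
program = [
    0x3A, 0x80, 0x00,   # LDA 0080h
    0x47,               # MOV B,A
    0x3A, 0x81, 0x00,   # LDA 0081h
    0x80,               # ADD B
    0x32, 0x82, 0x00,   # STA 0082h
    0x76,               # HLT
]
for address, byte in enumerate(program):
    machine.deposit(address, byte)

machine.deposit(0x80, 12)   # first operand
machine.deposit(0x81, 30)   # second operand

machine.run()
print(machine.memory[0x82])  # -> 42
```

On the real hardware the same byte-at-a-time entry was done entirely with switches, and the result could only be read back by examining memory and decoding the data lights in binary.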
Microcomputer revolution

Steve Jobs co-founded Apple, which introduced the Apple II, one of the first home computers, in 1977.
Home computers made their debut in 1977:

Sord M200 Smart Home Computer, one of the first home computers, released in 1977.
- In North America, the Apple II, Commodore PET, and TRS-80 debuted in 1977. In retrospect they were dubbed "the 1977 trinity" and are considered the watershed in bringing computers to the mainstream market.
- In Japan, Sord Computer Corporation introduced the Sord M200 Smart Home Computer in 1977.[10] Similar to the American trinity, Japan has the term "the three 8-bit houses" (8ビット御三家, hachi-bitto gosanke) for its most significant machines of that era. The group does not include the aforementioned Sord machine, but rather the Hitachi Basic Master, Sharp MZ-80K, and NEC PC-8001.
The microcomputer revolution (also known as the personal computer revolution, home computer revolution, or digital revolution) is a phrase describing the rapid rise of MOS microprocessor-based computers, known as microcomputers, from esoteric hobbyist projects in the 1970s to commonplace fixtures of homes in industrial societies during the 1980s. Within a decade, personal computers became common consumer goods.
References
1. Colinge, Jean-Pierre; Greer, James C. (2016). Nanowire Transistors: Physics of Devices and Materials in One Dimension. Cambridge University Press. p. 2. ISBN 9781107052406. https://books.google.com/books?id=FvjUCwAAQBAJ&pg=PA2.
2. "1960 - Metal Oxide Semiconductor (MOS) Transistor Demonstrated". The Silicon Engine. Computer History Museum. https://www.computerhistory.org/siliconengine/metal-oxide-semiconductor-mos-transistor-demonstrated/.
3. Lojek, Bo (2007). History of Semiconductor Engineering. Springer Science & Business Media. pp. 321–323. ISBN 9783540342588.
4. Moskowitz, Sanford L. (2016). Advanced Materials Innovation: Managing Global Technology in the 21st century. John Wiley & Sons. pp. 165–167. ISBN 9780470508923. https://books.google.com/books?id=2STRDAAAQBAJ&pg=PA165.
5. "Who Invented the Transistor?". Computer History Museum. 4 December 2013. Retrieved 20 July 2019.
6. Hittinger, William C. (1973). "Metal-Oxide-Semiconductor Technology". Scientific American 229 (2): 48–59. Bibcode: 1973SciAm.229b..48H. doi:10.1038/scientificamerican0873-48. ISSN 0036-8733. JSTOR 24923169.
7. "1971: Microprocessor Integrates CPU Function onto a Single Chip". Computer History Museum. Retrieved 22 July 2019.
8. http://museum.ipsj.or.jp/en/computer/personal/0086.html
9. "Ed Roberts Interview". Retrieved 22 May 2016.
10. http://museum.ipsj.or.jp/en/computer/personal/0087.html