What is the history behind the microcomputer?
The world’s first commercial microprocessor was the Intel 4004, released on November 15, 1971. The 4004 processed 4 binary digits (bits) of data in parallel; in other words, it was a 4-bit processor.

Thirty years later, at the turn of the century, microcomputers in embedded systems (built into home appliances, vehicles, and all sorts of equipment) were most often 8-bit, 16-bit, 32-bit, or 64-bit. Desktop and consumer microcomputers, such as the Apple Macintosh and PCs, were predominantly 32-bit but increasingly 64-bit, while most science and engineering workstations and supercomputers, as well as database and financial transaction servers, were 64-bit (with one or more CPUs).

The first generation of microcomputers, aimed at engineering development and hobbyist personal use, was launched in the mid-1970s, with the MITS Altair the best-known example. 1977 saw the introduction of the second generation, known as home computers. These were considerably easier to use than their predecessors, whose operation oft