Once accountants started to run this ship, they sailed onto rocky shores. Profits should have gone to research; instead they wasted ~$100 billion on stock buy-backs to keep the funds happy. Those billions, if spent on research, might have kept them off the rocks.
Depends on your scope. From an Intel perspective it would have been wise to keep their cash, and maybe they would be in a better position today. From a market perspective, we unlocked $100B and pumped it into other companies (for example Apple, TSMC, Nvidia) or IPO’ed new ones. These seemed to achieve better multipliers on the capital, hence as a whole we are better off (theoretically).
Now of course, all of this is at the macro level. If you look closely, the collapse of Intel would cause severe disruption on both the business and the geopolitical fronts.
This is the story of the birth of Intel, and with it so many of the firsts that laid the foundation for our current technology landscape: The first DRAM chip, the creation of the first microprocessor (the 4004), on through the release of the Intel 8080.
Debatable to claim the 4004 as "the first microprocessor". It's safer to specify it as the first "commercially available, general-purpose" microprocessor. See https://en.wikipedia.org/wiki/Microprocessor#First_projects for a few pre-4004 chips that are also debatably the first microprocessor:
- Four-Phase Systems AL1 chip (1969), which was later demonstrated in a courtroom hack to act as a microprocessor (though there is much debate on whether that hack was too hacky)
- The F-14 CADC's ALU chip (1970), which was classified at the time
- Texas Instruments TMS 1802NC (announced September 17, 1971, two months before the 4004), which is more specifically termed a microcontroller nowadays, but nevertheless the core was entirely inside a single chip.
I do not consider the 4004 "general purpose".
It was designed for implementing a desktop calculator, not a general-purpose computer. With some effort it could be repurposed to implement a simple controller, but it was completely unsuitable for implementing the processor of a general-purpose programmable computer.
For implementing a general-purpose processor, it is likely that using MSI TTL integrated circuits would have been simpler than using the Intel 4004.
The Intel 8008 (which implemented the architecture of the Datapoint 2200) was the first commercially available monolithic processor that could be used to make a general-purpose computer, and which was actually used for this.
Around the same time as the first monolithic processors, Intel invented the ultraviolet-erasable programmable read-only memory (EPROM).
The EPROM invented by Intel was almost as important as the microprocessor for enabling the appearance of cheap personal computers, because it avoided the need for other kinds of non-volatile program storage (e.g. punched-tape readers or magnetic core memory), which would have been more expensive than the entire computer.
I get your point. I was speaking in relative terms about it being "general purpose" and probably should have instead said "that can run a program from an external ROM"... and in that aspect it is more "general purpose" relative to the TMS 1802NC, which could only run a fixed program burned into its internal ROM. Nevertheless, while it is unsuited for use as a general-purpose processor, the 4004 has indeed been proven capable of running Linux (albeit by emulating MIPS).
For what it is worth, Intel does refer to the 4004 as the "first general-purpose" processor:
"That’s when the Intel® 4004 became the first general-purpose programmable processor on the market—a "building block" that engineers could purchase and then customize with software to perform different functions in a wide variety of electronic devices." https://www.intel.com/content/www/us/en/history/museum-story...
"The 4004 would replace that system with a general-purpose chip that could be mass produced and then programmed through its software to perform specific functions, such as those of a desktop calculator. That idea could make computing cheaper, more powerful and smaller in one fell swoop. It could, in other words, facilitate the modern information age. In 1969, the Nippon Calculating Machine Corporation approached Intel to design 12 custom chips for its new Busicom 141-PF printing calculator. Intel's engineers proposed a new design of just four chips, including one that could be programmed for use. That programmable chip, later known as the Intel 4004, became the first general-purpose microprocessor." https://www.intel.com/content/www/us/en/history/virtual-vaul...
What they say now is irrelevant, because it has been retconned to increase the apparent importance of what they did in 1971 vs. what they did in 1972.
Moreover, you can be pretty certain that whoever wrote that text never read the Intel 4004 datasheets, and so was in no position to evaluate whether it was "general purpose" or not.
Even at its launch in 1971, when Intel began to offer the 4004 for sale because Busicom was unable to pay the price Intel wanted for it, Intel's marketing attempted to present the 4004 as much more general-purpose than it really was, in order to find customers for it and to ensure the profitability that Busicom alone could not provide.
During its design, the 4004 was never intended to be general-purpose, because the plan was to sell it to a single customer. Only after delivery to that customer did Intel's marketing make great efforts to find other possible applications for it.
Inexpensive personal computers weren’t shipped with EPROMs; they were shipped with mask-programmable ROMs. EPROMs were used in development, but they were nowhere near as important as the microprocessor.
Mask-programmable ROM could be used only by companies whose computer production volumes had grown enough to make the masks worthwhile.
Moreover, in the beginning Intel was also the main producer of mask-programmable ROMs, which were launched at the same time as the corresponding EPROMs, with the 23xx mask-programmable ROMs corresponding to the 27xx EPROMs.
All these part numbers belonged to a system used by Intel, where the first digit was "1" for PMOS, "2" for NMOS and "3" for bipolar, with the second digit being the kind of IC, e.g. 21xx for RAM, 23xx for mask-programmable ROM, 27xx for UV-EPROMs and 28xx for electrically-erasable PROMs.
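To make the scheme concrete, here is a small illustrative sketch (my own, not from any Intel documentation) that decodes part numbers under the convention described above, using only parts mentioned in this thread:

    # Decoder for the Intel memory part-numbering convention described
    # above: first digit = process technology, second digit = kind of IC.
    PROCESS = {"1": "PMOS", "2": "NMOS", "3": "bipolar"}
    KIND = {
        "1": "RAM",                         # 21xx
        "3": "mask-programmable ROM",       # 23xx
        "7": "UV-EPROM",                    # 27xx
        "8": "electrically-erasable PROM",  # 28xx
    }

    def decode(part: str) -> str:
        process = PROCESS.get(part[0], "unknown process")
        kind = KIND.get(part[1], "unknown kind")
        return f"{part}: {process} {kind}"

    for part in ("1702", "2708", "3101"):
        print(decode(part))
    # 1702: PMOS UV-EPROM
    # 2708: NMOS UV-EPROM
    # 3101: bipolar RAM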
Other manufacturers started producing memories after a delay of a few years; most of them made memories compatible with those introduced by Intel and kept the 23xx and 27xx Intel part numbers.
Didn’t PROM come before EPROM? While I agree EPROM enabled easier testing, PROMs would fit the bill once their contents got stable.
Bipolar PROM was too small to contain the equivalent of the BIOS of a computer in the early seventies.
Bipolar PROMs were initially used mainly to store microprograms for CPUs with microprogrammed control and for implementing various kinds of programmable logic, in which role they were later replaced by PLAs, then by PALs.
I do not think there has ever been any kind of computer that stored, in bipolar PROMs, programs used during normal operation, except perhaps some embedded controllers designed before microprocessors, together with their associated EPROMs, became widespread.
By the time Intel EPROMs like the 1702 and 2708 appeared (with capacities of 256 bytes and 1 kbyte, respectively), typical bipolar PROMs had capacities of either 32 bytes or 128 bytes.
In that space you could fit at most some kind of initial loader that would load a real bootstrapping program from something like a punched-tape reader. This kind of solution was used in some minicomputers, replacing the operator's toggling in of such an initial loader from the console switches. Such minicomputers were still at least an order of magnitude more expensive than the first computers with microprocessors, mainly due to the expensive peripherals required for a working system.
The Apple I had 256 bytes of bipolar PROM for its WozMon debugger. Though the Apple II used mask ROMs (EPROMs were popular in clones), the expansion cards were still supposed to use tiny PROMs for their drivers (in the case of the disk controller there was even a second PROM for the state machine; Woz did love these devices).
The Altair 680 had sockets for four 256-byte PROMs. One was for the debugger, but a popular option for the other three was the VTL-2 (Very Tiny Language) interpreter. Pretty amazing that they fit a BASIC-like language in just 768 bytes, though implementing a language in the 510-byte boot sector has since become a popular hobby.
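As an aside on that 510-byte figure (a sketch of my own, not from the thread): an x86 boot sector is 512 bytes, but the BIOS requires the last two bytes to be the 0x55 0xAA signature, which leaves 510 bytes for the program. In Python:

    # Why a boot-sector program gets 510 bytes, not 512: the BIOS loads
    # the 512-byte first sector and requires the last two bytes to be
    # the 0x55 0xAA boot signature.
    SECTOR_SIZE = 512
    SIGNATURE = b"\x55\xaa"

    payload = b"\xeb\xfe"  # hypothetical placeholder: x86 "jmp $" loop

    room = SECTOR_SIZE - len(SIGNATURE)   # 510 usable bytes
    assert len(payload) <= room
    image = payload.ljust(room, b"\x00") + SIGNATURE
    assert len(image) == SECTOR_SIZE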
How did Intel lose DRAM to Micron?!
When the Japanese DRAM manufacturers began to make DRAM chips that were both better and cheaper than Intel's, Intel decided in 1985, probably rightly, that instead of investing a lot of money in trying to catch up with the Japanese they should cut their losses by exiting the DRAM market and concentrate on what they did better, i.e. CPUs for IBM-compatible PCs.
This decision was taken not long after the launch of the IBM PC/AT, while Intel was preparing the launch of the 80386, so they were pretty certain they could make a lot of money from CPUs even if they abandoned their traditional market.
Intel likely reached the point where such a decision had to be taken because for many years they had underestimated the competence of the Japanese, believing that they were not innovating but only copying what the Americans did, exactly as many Americans now claim about the Chinese. By the time they realized that the quality of the Japanese DRAMs was actually higher and that their semiconductor plants had much better fabrication yields, it was too late.
Deliberate decision to focus on higher-margin products that aren't commodities (like memory). I believe similar logic was used to justify the sale of their flash business.
Micron itself was often touch-and-go until several competitors went bankrupt around 2010.
I don’t really get the Intel/Micron relationship. Much later, Intel collaborated with Micron on their NVMe tech (3D XPoint/Optane), but in the end they gave up the product line to Micron, right?
Companies don’t have friends. But they seem quite cozy?
DRAM has very tight profit margins. It's a very cost-focused product line to be in; a company like Intel would never be able to get costs low enough. It was the right call.
During the early eighties, DRAM became a product with very tight profit margins, thanks to the Japanese competitors.
Before that, Intel and Mostek could charge an arm and a leg for their DRAMs, obtaining handsome profits.
Another very good history of early Intel is the Asianometry video and associated write-up: https://www.asianometry.com/p/intel-and-amd-the-first-30-yea...
I recall when Mike Magee of the UK's The Inquirer coined the term 'Chimpzilla' for AMD, Intel's ('Chipzilla') perpetual rival.
Magee was originally at The Register. They've lost all their colorful characters, except for the BOFH feature.
Yes he was. I was the person who suggested he hook up with the BOFH, years ago. Sadly, Mike passed in 2024.
> The 3101 held 64 bits of data (eight letters of sixteen digits)
The 3101 held 64 bits of data (eight bytes, each representing values from 0 to 255).
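For the arithmetic behind that correction (a trivial check; the chip-organization line is an addition of mine): 64 bits is eight 8-bit bytes, and the 3101 itself was organized as sixteen 4-bit words:

    # 64 bits of storage, counted two ways.
    TOTAL_BITS = 64
    assert TOTAL_BITS // 8 == 8      # eight 8-bit bytes...
    assert 2 ** 8 - 1 == 255         # ...each holding values 0 to 255
    assert 16 * 4 == TOTAL_BITS      # chip organization: 16 words x 4 bits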