How did the Soviets react to American advances in CPU design?

by M-A-C_doctrine

So, reading about the Intel 4004 CPU and this caught my attention:

Federico Faggin accomplished what no one had achieved before: to fit a general-purpose CPU into a small, commercial silicon chip

Which seems like a huge leap! This in turn brought Intel to the forefront of CPU technology and put them on the path to become what they are today.

But given that this was at the height of the cold war...what were the Soviets doing while all this happened?

Did Soviet leadership recognize how important this was, or did they dismiss it?

What was the reaction inside the Soviet scientific community? As far as I know, around these years cybernetics was a huge topic in the Soviet Union.

Why did the Soviets (and, for that matter, the rest of Europe too) fall behind so fast compared to the Americans when it came to CPU research?

ExistentialTenant

I found a highly relevant thread on this subject with plenty of input from knowledgeable people.

The thread really went into detail about why replicating American advances in CPU technology was so difficult for the USSR and how they tried to work around the issue instead.

EDIT: I didn't know I was supposed to ping the original author of the answers. There were many knowledgeable people in the thread, but /u/restricteddata started the parent comment and provided a lot of the details.

Origin_of_Mind

It is worth noting that the inevitability of putting a processor on a single chip was taken for granted long before it happened. But it could only have happened after the semiconductor technology had matured sufficiently to allow fabrication of such a relatively complex chip.

Intel was created as a memory company, built on specific technological know-how (silicon-gate MOS) which the founders had developed. The details of Intel's process steps remained a trade secret for some years and enabled the company to economically produce higher-density chips than its competitors could. This is ultimately what made Intel among the first to put an entire processor on a single chip, and the first to offer such a product commercially.

Even though today we celebrate the microprocessor as a great milestone, the microprocessor project at Intel was very low-priority work, one of several side projects taken up on contract. (The first engineer was hired and assigned to the project just as the customer had sent a representative to check on the progress of what was supposed to be a half-finished design:

So what happened next is that I had now this task where I am essentially six months late the day after I start. I have four chips to develop, an angry customer, and basically Intel had no experience whatsoever in designing random logic circuits, particularly with silicon gate which is a new technology, that requires a new methodology, a new way of doing it. Intel was a memory company. They had done memories before, and those designs are very different from random logic. I was on my own. Vadasz, my boss, was completely absorbed by the memory activity. And the application group, it was not even their job to help me, so basically I was on my own. Of course, I tried to do my best to make the customer happy. I felt that even if I had nothing to do with it, Intel was late, and the customer was right in complaining.

[Faggin, at 4004 oral history panel, page 11, Computer History Museum])

After the microprocessor was created, it took Intel management a very long time to realize its potential:

... the [main] preoccupation of Intel was memories. The microprocessor was a sort of a, “well, let’s see if it works first, and let’s try it out.” There was no commitment, really, behind it, no real commitment behind it. They were saying, “Oh, let’s figure it out, see if it works.” And the bottom line that was there for a lot of time -- many years, even after I left -- was, “Oh, we make microprocessors so that we can sell more memories.” That was the punch line of their strategy. There was not really an understanding that microprocessors were actually changing the industry. And it wasn’t until the microprocessor was firmly entrenched -- so, in the early ‘80’s -- before it was clear that something fundamentally different had happened in the marketplace.

[Federico Faggin oral history, page 65, Computer History Museum]