Reading this article, it jumped out at me that Asimov thought the schools of the future would have to teach binary arithmetic so that people would have an easier time working with machines. However, I have never heard of a machine that had trouble taking inputs or providing outputs in decimal. From a technical standpoint, converting something from binary to decimal is extremely fast compared to nearly anything else you might ask a computer to do. Were computers of the day really that slow, or did Asimov just want to sound futuristic?
From the article you mentioned:
All the high-school students will be taught the fundamentals of computer technology, will become proficient in binary arithmetic and will be trained to perfection in the use of the computer languages that will have developed out of those like the contemporary "Fortran" (from "formula translation").
I read this as "people will need to understand how computers work", not as "people will have an easier time working with machines by learning binary".
Knowing about the binary system becomes useful once you start looking at how binary digital computers work, that is, once you are working at the level of electronic components. Transistors and digital circuits were the future.
I don't know how familiar you are with digital electronics, but to get a sense of why getting into this binary state of mind is useful, see how "binary" is used in things like logic gates, adders and multipliers (sketched below).
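Just as a rough illustration (in Python rather than in actual circuitry, and with function names made up for the example), here is how one-bit addition can be built out of nothing but logic gates, and how chaining those adders gives you multi-bit arithmetic:

```python
# Logic gates and adders expressed as Python functions, purely to illustrate
# how arithmetic can be built from two-state signals. In hardware these are
# transistors and wires, not function calls.

def xor_gate(a, b):
    return a ^ b

def and_gate(a, b):
    return a & b

def or_gate(a, b):
    return a | b

def half_adder(a, b):
    """Add two single bits; return (sum_bit, carry_bit)."""
    return xor_gate(a, b), and_gate(a, b)

def full_adder(a, b, carry_in):
    """Add two bits plus an incoming carry; return (sum_bit, carry_out)."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, or_gate(c1, c2)

def add_4bit(a_bits, b_bits):
    """Add two 4-bit numbers bit by bit, like a ripple-carry adder circuit."""
    result, carry = [], 0
    for a, b in zip(reversed(a_bits), reversed(b_bits)):  # least significant bit first
        s, carry = full_adder(a, b, carry)
        result.insert(0, s)
    return result, carry

print(add_4bit([0, 0, 1, 1], [0, 1, 0, 1]))  # 3 + 5 -> ([1, 0, 0, 0], 0), i.e. 8
```

The same idea scales up: multipliers, memory and everything else in the machine are built from those same two-state building blocks.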
It's not about converting from decimal to binary to handle numbers. It's about being able to put EVERYTHING in terms that are representable with two states (binary), so that a binary digital computer can do what you need it to do.
One binary digit represents two possibilities: one is written as 1 and the other as 0. That's all you have. It can be "made into a real thing" if you say "1 means there's electricity going through this wire and 0 means there's no electricity going through it." If you have 4 wires, 1000 can mean that only the leftmost wire has a current going through it. It has nothing to do with the number 8 (even though 8 in decimal is 1000 in binary).
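To make that concrete, here is a small Python sketch (the "wire" names are invented for the example) showing that the very same 4-bit pattern can be read as a number or as four on/off signals; the meaning comes from the interpretation we agree on, not from the bits themselves:

```python
# The same four bits, interpreted two different ways.
pattern = [1, 0, 0, 0]

# Interpretation 1: a binary number (1000 in binary = 8 in decimal).
as_number = int("".join(str(bit) for bit in pattern), 2)
print(as_number)  # 8

# Interpretation 2: four independent on/off signals, e.g. which wires carry current.
wires = ["wire A", "wire B", "wire C", "wire D"]
powered = [name for name, bit in zip(wires, pattern) if bit == 1]
print(powered)  # ['wire A'] -- only the leftmost wire is "on"
```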
How does your phone know the way you are holding it? How is it able to recognize people in pictures? How come you speak and it types for you? It is a digital machine, and stuff is converted to binary so it can do useful work.
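As one tiny example of that conversion, here is text being turned into the bit patterns a digital machine actually stores and moves around (a Python sketch; real systems pile layers of encoding, compression and so on on top of this):

```python
# Each character becomes a pattern of bits; from then on it's all binary.
text = "Hi"
bits = " ".join(format(byte, "08b") for byte in text.encode("utf-8"))
print(bits)  # 01001000 01101001 -- 'H' and 'i' as 8-bit patterns
```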
Today you can get computers to work without knowing much about electronics, and you don't have to deal with binary stuff that often. To get to that point, a lot of very smart people had to do a hell of a lot of work.
Fortran (a computer language) is one of the tools that made using binary (and understanding electronics) less necessary. It was a common language in the 1960s, and it's still used in science and engineering today.
A programming language is a way for us to tell a computer what we want it to do (well, actually more like "how to do stuff" based on the limited operations it is capable of doing) without having to deal with the specifics of its electronics (where the binary is relevant).
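A rough Python illustration of that difference in levels (the helper function is just for the example): the first line simply states what we want, while the second does the "same" addition the way the hardware does it, one bit operation at a time:

```python
# High level: say what you want; the language and the machine handle the bits.
print(19 + 23)  # 42

# Lower level (still Python, but mimicking the circuitry): addition done with
# nothing but bitwise AND, XOR and shifts. Works for non-negative integers.
def add_with_bits(a, b):
    while b != 0:
        carry = (a & b) << 1  # positions where a carry is generated
        a = a ^ b             # sum ignoring the carries
        b = carry
    return a

print(add_with_bits(19, 23))  # 42
```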
Most people can use computers these days without having to learn one of those languages, but that was not the case even in the 1980s... In the 1960s, you really needed to know about computers to get them to do anything. A lot of work has been done since to make interacting with computers simpler and more natural.
Were computers of the day really that slow
Compared to ours, yes, they were extremely slow (but still managed to get the job done!). An average smartphone is beyond anything from that time.
Was there ever a time when computers struggled to work with decimal numbers?
That is an interesting question. The answer is yes.
Computers do struggle with decimal numbers (among other things). Look at how floating point data is handled: numbers are stored in binary with limited precision, so many decimal values cannot be represented exactly and you get "good enough" approximations instead (e.g. 0.1 + 0.2 does not come out as exactly 0.3), because that makes things easier and faster. There are ways to handle exact decimal quantities instead of those approximations, and they give you some extra complications and extra cost.
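Here's a quick Python demonstration of both sides of that trade-off: binary floating point cannot represent a decimal value like 0.1 exactly, while an exact decimal type can, at the cost of extra work and slower arithmetic:

```python
from decimal import Decimal

# Binary floating point: 0.1 has no exact binary representation, so tiny
# errors creep in and comparisons can surprise you.
print(0.1 + 0.2)         # 0.30000000000000004
print(0.1 + 0.2 == 0.3)  # False

# Exact decimal arithmetic exists, but you have to opt in and it costs more.
print(Decimal("0.1") + Decimal("0.2"))                    # 0.3
print(Decimal("0.1") + Decimal("0.2") == Decimal("0.3"))  # True
```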