I really am curious if there was a time when things were the opposite of how they are now with regard to public feelings about the economy, and advice doled out based on those feelings. Certainly not within my lifetime! It's hard for me to even conceive of how society would operate that way, which I suppose is what makes it such a fascinating historical question to me.
The US economy was strong through a good portion of the 1980s, but I'd say the post-WWII era up until the early 70s would have been the last time that public opinion of the US economy was overwhelmingly positive.