We always hear/talk about the natives during the conquering of the New World as being subject to the diseases from Europe, which (many believe) were the primary reason for their fall. There had to have been diseases in America which the Europeans weren't accustomed to, right? Why don't we ever hear about them being affected? Did they have better medical means to prevent/cure them?
Sorry if it's the wrong place (I don't think it is) or a poor question, I only explore this sub every now and again. I learn a lot here though, so thanks!
hi! not a poor question, actually quite a popular one! there's always room for more expert input on this, but meanwhile, get started on this section of the FAQ*: take a scan through the post titles to find posts asking similar questions
Native Americans and (European) Diseases
*see the link on the sidebar or the wiki tab