I get why salt is always added to food, since our bodies need it, but I'm really curious where the practice of adding pepper to almost everything comes from. Does anyone know when and how that became common practice? (In the US at least, as that's where I'm from.)