Feet get a lot of negativity because many people choose not to maintain them properly, but in this regard they are no different from any other body part. The only real difference is social perception. When neglected, any part of the human body can be "gross" in the way people often view feet, and some of the most sexualized parts are arguably even worse than feet when they aren't given proper care.
Speaking only of the American culture I grew up in, where anything under our clothes is a taboo, untouchable subject that can never be discussed, I feel like feet are far enough removed from our bodies for us to display them openly.