For many east coast folks, the west coast is a mythical land of sunshine, healthy people, and endless sandy beaches. And yes, you can find those things in California, Arizona, and other western states. But there are some things about the west coast that folks back east just don't understand. These stereotypes should be put to rest!