Log and Greger, I think, brought up the subject of American car culture in another thread, and it seemed large enough to overwhelm that original thread, so I’m hijacking it here.

My cousin still pines for his first love - a white ‘68 T-bird. If he found one tomorrow he might hock the farm to get it.

Considering cars haven’t been around all that long, they’ve taken us (primarily US Americans) over like Godzilla in Tokyo. My question is when it started, and why. Did it start with the post-war behemoths? Was it as early as the Model T?

When did it become second nature to drive places that were walkable?

Or was the takeover by car just a natural outgrowth of the (US) American urge to hit the road and look for a better place?

Julia
“It’s the shipwreck that leads you to the magical island.”
(Trevor Noah)