Cobaltblue's topic on Nudism at the Maternity Shop got me wondering about your opinion on going topless at the beach.

I live in a Spanish city where this is extremely common, and it's becoming more so every year. I've never done it because I don't feel comfortable with my breasts. I don't like them, I think they get enough attention covered by my bikini top, and I don't want to expose them to the world. However, I'm an exception these days.

Women of all ages and sizes sunbathe without their bikini tops (women don't usually wear one-piece swimsuits until they're in their 60s or 70s). They might put them on when they go into the water, maybe because they feel uncomfortable with the bouncing (I would).

The truth is, breasts have become a normal thing to see at the beach. And they're not considered something sexual anymore (at the beach, that is). Of course there will be the occasional glance, but you'd probably get that with your top on too. I've been to the beach with four friends, three of them going topless, and no one came to bother us. No one said anything to us. Nothing happened at all, because we were surrounded by other women doing exactly the same thing.

I remember a discussion I had some years ago with an American friend from SF. He was so shocked by this; when he came to Barcelona he couldn't believe what he was seeing. But I think he actually sexualized the breasts because it was something strange for him (and therefore alluring). He looked. He stared. It was a taboo for him. It's really not a taboo for most guys here.

I think of it like when women weren't supposed to show their legs and men went crazy when they saw an ankle. So there's really nothing wrong with a specific body part. It's what we're taught about it that changes our perception.

So, what's your take? I know what my American friend thinks, but he's a guy, so he doesn't have boobs. I want to know what other women think and how they feel about this!