I was watching the Rachael Ray talk show today and one of the segments was about learning to feel more confident naked. Or, "Look Better Naked" is what I think it was actually called (it's a book someone wrote, I guess). The book seemed pretty artificial and vain. It recommended spray tanning and correct posture--things to disguise your body so it appears more visually appealing than it actually is. But it got me thinking more about self-acceptance and body image--things I'm really working to get a grip on. I'm learning to love myself one step at a time. And I think I'm learning to feel pretty good about myself, for the most part. But naked? Boy, this complicates things.
If I struggle to love the way I look with clothes on, I don't know if or how I'll ever feel good about myself with clothes off. Is it even important? I guess I'd never thought about it before. I've never liked the thought of being naked. I spend very little time without clothes, and that's the way I prefer to keep it. Part of it is that I just don't think it's proper. But beyond that, I've always been so afraid. Wearing a swimsuit is bad enough. But naked? No one sees me naked. No one. How terrifying.
Is this only terrifying to me? Or is this something everyone experiences?
It's not like I want to become a nudist or an exhibitionist. I'm not talking about sex at all. I'm just saying that I'd never considered on my journey toward self-acceptance that I'd have to accept myself as is, naked and flawed. Flawed I signed up for. But naked? How is that going to be possible? I have to love and accept all of myself. Isn't that the deal?
Isn't that the point?