Being a woman has no connection with not feeling confident and well in your body. Indeed, skin lightening has been shown unequivocally to cause illness in the body. Most people lose weight because it is medically unhealthy for them to be overweight; being darker of skin is not an unhealthy state. 'Fake tans', as opposed to naturally developed suntans, look ridiculous.
Manicures are undertaken for neatness and hygiene reasons; skin whitening is not. What would we look like if we could all scale buildings using just our toenails and fingernails!
Non-essential cosmetic surgery is largely undertaken out of vanity and insecurity. There is also a great deal of medical and governmental effort geared towards counselling those who would have it done for those reasons, because they genuinely need help.
If men love you, as you put it, surely they would first reassure you that you don't need skin lightening for them to love you, or for you to 'feel good & confident', and that you are harming yourself for superficial reasons.
Being emotional has absolutely nothing to do with this discussion. Clearly, many women on this forum don't feel that they need skin whitening. Many others think it's a shame that some do feel that need.
Similarly, there is no reliable data supporting the notion that women are more body-conscious than men. The existing literature suggests that men are just as conscious about their bodies and appearance, but that they deal with it in markedly different ways from their female counterparts.
That you want to undergo skin whitening is fine; it's totally your choice, and no one can take that away from you. I would also defend your right to do what you want with your body. However, your attempts to justify it with social-science constructs are, in my view, fundamentally flawed.