I've got this post that goes around on Tumblr about the gender wage gap. It stays alive primarily because every now and then some dork will revive it to try and argue with me about it. They start by arguing that it doesn't exist, which is hilarious because what they actually argue, every time, is that it does and should exist. That it's women's fault it exists. Because we "choose" jobs that pay less.
But who chooses the worth of these jobs that are female-dominated? Who decided that these jobs were worth less than others? Even if we buy into the false notion of absolute, unimpeded choice, what accounts for the worth of each individual profession?
You should see the occupations that these assholes list. Nurse. Teacher. Therapist. Social worker. Hair stylist. Secretary.
You can see how these dudes are stuck in the 1950s. The word "secretary" hardly exists in the job market anymore. It's been changed to "office assistant" or "receptionist" because the word "secretary" was too giiiiirly ewwwwww and so men wouldn't apply.
And hair stylists? You can easily spend more at a hair salon than you would to get your oil changed and your tires rotated. But of course men wouldn't know that, being MANLY MEN WHO DON'T NEED HAIR STYLING, THAT'S SOOOOO GIIIIIIIRLY EWWWWWW.
As for nurses and teachers, let's see. Both work ridiculous hours, with nurses often pulling super long shifts that go all night. Both often go the extra mile for those under their care, with teachers working 60-80 hours a week even though they only get paid for 40, so that they can grade papers, make lesson plans, and figure out how to still educate kids around teaching to ridiculous standardized tests (a US problem). Nurses often do more work than doctors and have to do more of the unpleasant work (you think being a garbage man is grosser than being a nurse? I have news for you). They spend a whole lot more time actually keeping patients alive and caring for them.
And therapists and social workers? Really, we don't value those jobs? Social workers, who make sure kids aren't being abused and who help adults get back on their feet? Therapists, who work to cure people's mental illnesses and help them function in day-to-day life? I get dudes complaining that men have a higher rate of (completed) suicide. But a likely cause of that is that men won't go to therapy because, again, talking about feelings is what GIRLS do (ew) and MEN deal with their problems ALONE until they FUCKING KILL THEMSELVES.
Maybe it's time for men to value therapists and social workers.
The counterargument I get a lot at this point is that professions like medicine and STEM fields require more education. This is a poor argument, because the amount of education required varies. Many teaching positions require master's degrees, and there are plenty of STEM jobs that only require a BA. In Washington State, you generally have to have a master's to be a licensed therapist, but in Florida, you only need a BA.
And yes, doctors require more education than nurses. In the US, college education costs money. A lot of money. An astronomical amount of money. A rising amount of money, and for more and more citizens, an unobtainable amount of money. Since more women than men live in poverty, and since women are more likely to be the ones taking care of kids, which means either staying home rather than working while going to school or paying extra for childcare, they obviously have less access to medical degrees. And that's not to mention the whole issue of boys' clubs and active hiring discrimination against women. It's not so appealing to spend years in college classes full of men who make shitty jokes and comments and generally act as though you're less capable, only to then be asked in job interviews if you're planning to "start a family."
Yes, it still happens all the time, even if it is illegal.
But the dudes who insist that the wage gap is women's fault won't believe the stories upon stories of women being treated like shit in med school or being asked sexist questions in interviews. They're not interested in listening to women or learning the truth. Or thinking in general. Their only goal is to convince everyone that the wage gap is a myth (by which they mean women's fault) and that anyone who says otherwise is just a man-hating feminist hag.
But they've been pretty successful. The "wage gap myth" myth persists despite mounting evidence that women get paid less for doing the same work with the same education and experience, despite the fact that the wage gap exists even for little adolescent babysitters, despite the fact that in Russia doctors are paid poorly because being a doctor is considered a "nurturing" profession and therefore "women's work," and despite the fact that scientists find again and again that even if you factor in all those reasons why the wage gap is supposedly women's fault, like "choosing" lower paying work and "choosing" to spend more time taking care of their kids, there is still an unexplained wage gap.
The first thing I think we need to question is why jobs are worth what they are. Dudes try to throw out weak arguments like "the market decides" and... well basically that's it. But anyone with a slight grasp of economics and psychology knows that it's so much more complicated than that. That's like saying the sky is blue because "outer space." Um... well... a little, but no.
I've worked an interesting variety of jobs. I've worked in retail, food service, customer service, and marketing. I got paid the most with the best benefits by far in marketing. Marketing was the easiest job I ever had. I sat in a cushy chair all day, worked maybe half the time while spending the other half on Tumblr and shit, and the work I did do was not hard. There simply wasn't much to do in a 40 hour work week, and no one seemed to have high expectations for me. I earned an actual salary at my last marketing job until I could no longer stand working in bullshit and constantly feeling guilty because I hardly did any fucking work.
I know that this isn't everyone's experience in an industry like marketing, but the fact is that it's still a cushy job. No one stayed late. You got to sit in a chair all day. Every Friday we'd all go have a long lunch at a pub across the street. And for some reason we deserved more money than someone working their ass off in retail.
For a very short period of time, I worked as a dog bather. It was the hardest work I'd ever done. They should call it dog wrestling, for fuck's sake. It was so hard that I couldn't do it. It was not only physically demanding, but the grooming salon was frequently busy enough that there was no time to take a breather. In fact, most of the bottom-rung jobs I've worked basically don't let you have your legally mandated breaks. You can take them (because legally they have to let you), but you'll get scowls from managers and can then expect to be treated like shit by them, and most of the time there really just isn't time for breaks or, in the case of tip-fueled delivery driving, it doesn't make sense to take them.
So I could right now be making way more money per hour with benefits for doing a fraction of the work, and why? Because I have a BA in creative writing? Because I was lucky enough to have parental support through most of college?
That shit doesn't add up. Please explain to me why marketing is worth more than serving you food. Please explain why working my ass off to make sure your dog is clean, dry, and healthy is worth barely over minimum wage.
And for the love of god, why don't teachers get paid more? No one can fucking explain that to me. They raise and educate our fucking children for the love of all that is holy oh my god just