Anyone who’s ever held a minimum wage job (which is a lot of people) knows that it’s impossible to support a family solely on $7.25 per hour. With an average monthly rent of somewhere around $700, you would have to devote nearly one hundred hours of work each month solely to keeping a roof over your head. And since many of these jobs are available only part-time, the prospect of making ends meet becomes even more remote. In an economy that makes it so hard for individuals at the lower end of the economic scale to meet their basic needs, any argument like “take personal responsibility for your financial situation” or “pull yourself up by your bootstraps” rings woefully hollow.
Of course, it wasn’t always this way. The United States has not always had a minimum wage, so it’s important not to take it for granted. Nor has the minimum wage always been as high as $7.25 an hour—Congress did not approve that figure until 2007, and it did not take effect until 2009. That much may seem obvious, but what might be surprising is that the first minimum wage was a mere $0.25 per hour. Compared to today’s minimum wage, this seems laughably—perhaps even impossibly—low.
Except for one thing: it’s not. Adjusted for inflation, that rate isn’t so far from recent ones. In today’s money, 25 cents in 1938 would be worth a little over $4 an hour—not drastically below the $5.15 minimum wage that was in effect before the 2007 increase.