Last week we discussed how the problem of “adverse selection” works to undermine insurance markets, including the market for paid sick leave and unemployment insurance. We noted that the cure is to make insurance mandatory for all – as President Obama did recently, by making paid sick leave mandatory for employees of federal contractors. In developed countries, many forms of insurance are mandatory: disability, unemployment, old age, auto, and health are among the most common.
But even universal insurance faces the “moral hazard” problem: the fact that people, for better and for worse, behave differently when they have insurance. Old age insurance (like social security) may diminish your incentive to save for retirement. Auto insurance may facilitate riskier driving. Unemployment insurance may make you less deferential to your boss. And indeed, all else being equal, people with paid sick leave should be expected to miss work more often than people without it.
But the key insight about moral hazard is that we are still better off with insurance than without it. In other words, while it imposes a cost, that cost is almost invariably exceeded by the benefits. For example, it’s been suggested that unemployment insurance and social security have the combined effect of allowing people to make riskier decisions about where to work – giving a high-tech start-up a chance, for example, instead of playing it safe with an established firm. In the aggregate, such risk-taking may be a significant boost to a modern economy dependent upon constant innovation.
Auto insurance makes transportation risks more manageable, letting people commute to their job of choice, while also facilitating distribution networks that give consumers more options. Health insurance correlates with better health, and can reduce costs when people make use of preventive care before a problem gets out of hand and lands them in the ER.
There is no free lunch. Paid sick leave, ultimately, is paid for by employees: it reduces their wage compensation, leaving total compensation (including benefits) unchanged. The same is true for paid maternity leave, unemployment insurance and even social security. While an employer nominally pays out for those items, direct employee compensation is reduced by the same amount.
Taken together, benefits like paid sick leave confer a further benefit: they seem to make employment more desirable, such that more people offer themselves on the labor markets. This is seen in higher labor-force participation rates in the working-age population of countries that have liberal labor standards; and in relatively low labor-force participation rates in the US.
Most people want benefits with employment – including caps on hours, unemployment insurance, paid holiday and vacation time, and paid sick and maternity leave. But the market has no route from this equilibrium (their absence) to another equilibrium (their ubiquity), without help from legislation.
The point is not that a central planner knows better than individuals at the point of contract. Rather, we must recognize that there are obstacles that prevent market participants from coming to terms. Well-tailored labor regulations can ameliorate these obstacles, to let the market work its magic.
Share the Field Guide: https://liberalfieldguide.org/
Share this post: https://liberalfieldguide.org/2015/09/22/sick-pay-pays-ii/
On Labor Day, President Obama issued an executive order mandating that federal contractors offer paid sick leave to their employees. Such mandates serve as a back-door means of improving labor standards, albeit for a relatively small number of workers.
Most countries make paid sick leave mandatory for full-time employees. The US is alone among developed countries in not doing so. Outside of a few American cities and states that legally require paid sick leave, most Americans are at the mercy of their employer.
One can fairly ask why we shouldn’t simply leave it to the markets. If paid sick leave is really so desirable, one can argue, laborers will ask for it, and employers will offer it. But the problem is that paid sick leave – like paid maternity leave – is a lot like insurance, and beset by the same problems. Chief among them is that insureds typically have better information on their own circumstances than might a would-be insurer. And so when someone asks for insurance, an insurer can reasonably infer that that person – for reasons that may be undetectable – is more likely to be a bad risk than someone taken at random.
That simple fact naturally leads to a feedback loop, whereby insurance gets pricier, making the people who are willing to pay that price even worse risks; which in turn makes insurance pricier still, and so on and so on until the market fails – with many people who want insurance, and firms who would provide it, unable to come to terms.
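That feedback loop can be sketched in a toy simulation (all numbers here are illustrative assumptions, not estimates): each round, the insurer prices coverage at the average expected loss of the remaining pool, and everyone whose own expected loss falls below the premium drops out, making the pool riskier still.

```python
# Toy model of the adverse-selection "death spiral".
# All figures are illustrative assumptions, not empirical estimates.

def death_spiral(expected_losses, rounds=10):
    """Each round the insurer sets the premium to the average expected
    loss of the remaining pool; anyone whose own expected loss is below
    the premium drops out, because insurance is a bad deal for them."""
    pool = sorted(expected_losses)
    history = []
    for _ in range(rounds):
        if not pool:
            break
        premium = sum(pool) / len(pool)  # break-even price for this pool
        history.append((premium, len(pool)))
        # Only people expecting to lose at least the premium stay in.
        pool = [loss for loss in pool if loss >= premium]
        if len(pool) == history[-1][1]:  # pool stopped shrinking
            break
    return history

# A pool of ten people with expected annual losses from $100 to $1,000.
for premium, size in death_spiral([100 * i for i in range(1, 11)]):
    print(f"premium ${premium:,.0f}, pool size {size}")
```

Starting from a $550 break-even premium for the whole pool, the price ratchets up each round as the best risks leave, until only the single worst risk remains. Making coverage mandatory freezes the first step: the premium stays at the pool-wide average, and the spiral never starts.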
This dynamic was famously observed in the market for used cars, in a piece entitled “The Market for Lemons,” which won its author, George Akerlof, the Nobel Prize in economics. People are suspicious of used cars because some defects are readily known to the seller, but exceedingly difficult for a buyer to ascertain. Because of the buyer’s perception of risk, his offer price drops. As a consequence, sellers of good used cars can’t get fair compensation, making them less likely to bring those cars to market. This dynamic feeds on itself until the market contains only the worst used cars.
You can readily envision the same problem occurring in the market for unemployment insurance, if it weren’t mandatory. A worker asking for such insurance at the time of hire would flag himself as a bad risk – one who is likely to be let go. He might be passed over for a position simply for asking! Consequently, we should expect unemployment insurance to get more and more expensive; and as it does, only the most at-risk employees would be willing to pay for it – and so on, as the market fails.
This same problem befalls virtually every form of insurance – including disability, old age, and health. The cure is to make insurance mandatory, so that people can’t “self-select” into or out of insurance. Insurers are then better able to estimate the risks, because they can look at the population as a whole.
Once you solve the “adverse-selection” problem (also known as the “asymmetrical information” problem), you run into the next big issue in insurance markets: moral hazard, which we’ll take up when the Field Guide returns next week.
Share this post: https://liberalfieldguide.org/2015/09/18/sick-pay-pays/
The US frequently goes through regional recessions. One part of the country slumps because of the failure of some industry, or the fluctuation of commodity prices, or natural disaster. But since social insurance is largely federal – medicare, SNAP, TANF, social security – those programs continue despite the weak local economy, as does big-ticket federal spending: highways, agriculture, student loans, etc. All told, these programs combine to guarantee a minimum of income to recession-hit areas, smoothing out some of the bumps in the road that every economy suffers from time to time.
Local economies in Las Vegas, Miami and Phoenix were ground zero for the worst financial crisis in 75 years – but they had a safety net to ensure that their recession had a bottom. Six years later, they are growing again, as fast as ever. The US economy as a whole added more jobs in 2014 than in any year since 1999. The cost was significant – the US ran trillion-dollar deficits for several consecutive years, with President Obama and Treasury Secretary Geithner resisting pressure to reduce deficit spending too much too soon in the face of the worst economy since World War II.
The European Union operates very differently. While the currency is federalized – controlled by a single central bank, the ECB – social programs and government spending are almost entirely dependent on the finances of individual countries.
Ground zero for the financial crisis in the European Union was Greece, Spain, Portugal and Ireland. On the eve of the crisis, they were all up-and-coming economies, seeing fantastic year-over-year growth. That kind of growth lures investors – and banks made progressively riskier loans in the hope of cashing in on the boom. You could tell the same story about Vegas, South Florida and Arizona through 2007.
But when the recession hit, the European Union had a very different plan for its hardest-hit countries. Instead of increasing government spending, and allowing them to run deficits – as every liberal economist urged – the ECB called for the exact opposite. With many countries unable to raise the cash to meet their obligations to creditors, the ECB would only underwrite further lending if those countries practiced “austerity.” To avoid default, they were forced to cut government spending dramatically – at a time when people and businesses had no money to spend either. Government programs of every stripe were cut back or eliminated.
At the time, conservative economists theorized that countries that drastically cut government spending would find the bottom of their recession more rapidly, and bounce back that much faster. For the US they predicted “debasement of the dollar,” “hyper-inflation” and a prolonged slump.
Liberals at the time warned that austerity would send weak European economies into full-fledged depressions, and be a drag on growth across the EU. They projected that deficit-spending would save the US from a much deeper recession, and while the US would come out on the other side with more debt, it would also have many more jobs. Liberal economists dismissed the threat of inflation entirely, warning instead of the threat of deflation in Europe.
Years later, liberals’ theories have been borne out, resoundingly. The US rebounded faster and stronger than Europe, and inflation remained near historic lows all along. In Europe, conservative policies have proved to be an abject failure – as liberals also foresaw. Time will tell whether the Euro itself will survive years of conservative mismanagement.
Fast food workers are striking for better pay – and while many media outlets are dutifully burying this worldwide event, a trickle of information is coming in, with the usual anecdotes about impoverished single mothers flipping burgers for $8/hr, year in and year out, on public assistance, struggling to buy groceries and pay rent.
Conservatives often come back with the smarmy insight that fast-food jobs weren’t “intended” to be lifelong careers. To which we ask: intended by whom?!? Y’see, one slap-in-the-head aspect of the invisible hand (which conservatives love, honor and obey, if not understand) is that markets don’t have intentions, any more than they have fingernails. Employers advertise jobs, and by market magic, workers appear, offering their labor. And it so happens that adults with dependent children – not just high school kids – make up a large fraction of the fast-food workforce.
The good news is that when conservatives deem a market outcome or market-participant behavior to be “wrong,” they unwittingly take a giant conceptual leap forward. Because in order to reject classical economics, one must first accept the premise that markets sometimes screw the pooch, making it necessary for Keynesians to step in and apply a fix. Seems many conservatives, in their limited way, already appreciate that we have a national problem with many single heads-of-households trying to support families on minimum wage jobs.
Conservatives just have to take the next step. Complaining that an increase in the minimum wage will force fast-food firms to raise prices misses the point: their prices are held artificially low through a kind of corporate welfare. Income that fast-food workers receive in the form of public assistance (through medicaid, the EITC, food stamps, housing, etc.) must be regarded as a government subsidy to fast-food firms, which, in its absence, would have to pay more in compensation. Thus, raising the minimum wage will serve primarily to shift that burden off of taxpayers, and onto those firms and their customers, where it belongs.
Earlier this year, CBO came out with a report that both puffed up and pooped on Obama’s plan to raise the minimum wage. While they predict that an increase in the minimum wage to $10.10 will succeed in increasing the wages of most low-wage workers (raising many families out of poverty), they also predict that some will lose their jobs, and that the overall rate of employment for low-wage workers will likely drop a bit (0.3%).
CBO is a respectable non-partisan source, so we will for the moment politely ignore the fact that a long line of empirical research has failed to observe any such drop in low-wage employment rates following a modest increase in the minimum wage. We might focus instead on CBO’s bottom line: the prediction that an increase in the minimum wage to $10.10 will send an additional $17 billion of income to workers making less than triple the poverty line. In a time of high poverty and sky-high inequality, that might not be a whopper of an increase, but surely it would be a modest mcpositive.
Conservatives like to pooh-pooh the US economy’s recovery from the worst downturn since the 1930s. But because the Great Recession is so singular, it’s difficult to judge recent economic performance, and the effectiveness of the government and Fed response. No downturn since World War II compares. In both the conditions that triggered it and its severity, the closest comparisons we have are the Great Depression and the Panic of 1893 – and the US economy did much better this time around.
The Great Depression remains the worst of them all: GDP fell 30% and unemployment reached 25%. The Panic of 1893 saw GDP drop 5 to 10%, and unemployment peak at 12 to 18%. (Measures for that period remain crude.) The Great Recession was less severe: US GDP dropped 4.7% and unemployment topped out at 10%.
The key to our escape from what might have been a replay of the Great Depression was massive, directed spending on the part of the federal government, and perhaps more importantly, a commitment on the part of the Federal Reserve to pump cash into the economy, to prevent deflation. By one measure, US GDP in 2010 was 13% higher than it would have been in the absence of Fed and fiscal action.
It is not generally appreciated that the initial drop in economic activity during the first three quarters of the Great Recession, in 2007–08, was in fact STEEPER than in 1929–30. In other words, at the outset, the US was on track for a 1930s-style depression. The difference, according to the best research on the subject (cited below), was aggressive fiscal and monetary intervention.
Financial bubbles happen when banks continue to pour money into an economy, even as asset prices inflate. When banks collectively get cold feet and stop lending, asset sellers quickly outnumber buyers, and prices collapse – and as they do, a lot of money vanishes. It doesn’t merely change hands – it ceases to exist, no longer available for borrowing, investing or buying. In 1893, the asset bubble was concentrated in railroads. In the Great Recession, it was housing. In the Great Depression, the bubble wasn’t specific to a particular industry. In all three, the crash was preceded by a massive run-up in private debt, and followed by a prolonged economic malaise, in which banks were insolvent and personal savings were wiped out – there was no money left in private hands to buy anything.
Getting out of such a funk takes time. After the Panic of 1893, real per capita GDP needed six years to get back to 1892 levels. And even after it did, unemployment (which lags behind other indicators) was 12% – the economy wouldn’t get back to full employment until 1900, seven years after the bubble burst. The Great Depression was worse: real per capita GDP didn’t get back to its 1929 level until 1937, and full employment wasn’t achieved until World War II.
By comparison, real per capita GDP after the Great Recession needed five years to get back to its 2007 level. Full employment (which is not well defined) may be achieved next year, which would make for a seven-to-eight-year recovery. Not quite five and a half years after the Great Recession began in December 2007, unemployment today is a manageable 6.7% (though labor-force participation remains quite low). Five and a half years after the Panic of 1893 and the Great Depression began, unemployment was still in double digits.
The relative shallowness of the Great Recession – in both unemployment and GDP contraction – can be directly attributed to a policy of deficit spending by the federal government, and aggressive action by the Fed to shore up banks and maintain the money supply. The aim of these policies at the time was to take the edge off – and they succeeded. In 1893 and 1929, prices collapsed soon after asset values. During and after the Great Recession, the US teetered on the edge of deflation but never succumbed – this alone may have halved the depth of the contraction.
The short of it is that financial crises don’t make for ordinary recessions – the recovery that follows has always been slow, and is beset by persistent unemployment. But the US economy has come a long, long way since the dark days of 2008, thanks in large part to aggressive government and central bank action.
Great source for historical macro data: http://www.measuringworth.com/
Help wanted: I’d be very grateful for a historical graph of private debt for the US that looks like this one for Australia: http://www.creditwritedowns.com/wp-content/uploads/2011/11/Australia-private-debt-to-GDP.png
Objections over birth control coverage in employer-provided health insurance are no more than an attempt by employers to intrude upon, control the lives of, and impose their religious beliefs on their employees, outside the course and scope of their job. No one can stop a private employer from posting the Ten Commandments in your cubicle, installing Vishnu as your screensaver, or (Christ have mercy) leaving “A Clay Aiken Xmas” on an endless loop on the factory floor. But insinuating their beliefs into an employee’s family planning decisions – medical matters reserved for consultation with one’s doctor – is offensive.
Imagine an employer is a Jehovah’s Witness – and he objects to providing health insurance coverage to his employees for blood transfusions. (Faith prohibits Jehovah’s Witnesses from donating, storing or receiving blood – though I’ve never heard of a Jehovah’s Witness making such an objection as an employer, so this is strictly hypothetical.) Next imagine that Jehovah’s Witnesses sued the US Government so they could exclude transfusions from health insurance coverage mandated of large employers by the ACA.
This is not intended to be a slippery-slope argument – that if we permit employers to deny certain kinds of health insurance coverage to their employees, it would open the door to all manner of 11th-century healthcare policies. Rather the illustration is meant to highlight the absurdity of allowing one person’s religious beliefs to impinge on another person’s access to modern medicine. Few would quarrel with Jehovah’s Witnesses’ choice to die for their religious beliefs – but most would have a problem with their expectation that other people should die for them.
Employers, under the Civil Rights Act, cannot discriminate in hiring on the basis of a job applicant’s religion – nor can they fire an employee for practicing their religion. (Churches are exempted, and can hire and fire based on an employee’s religion alone.) This means, among other things, that an employee is free to donate a fraction of his salary to the Church of Satan, or use it on Friday to enjoy a Philly cheesesteak, or purchase a condom from the corner pharmacy – and his employer can’t do anything about it.
Health insurance is just another form of compensation. Whether an employee acquires birth control with salary, or with employment-based health-insurance, in either case the employer is providing the compensation, and the employee is making the final decision on how he will use that compensation – to obtain birth control, or not. Distinctions between the two cases are spurious. What an employee does with the compensation he earns is up to him – not his boss.
The company at the center of the controversy – Hobby Lobby, an Oklahoma retailer – claims to be very much concerned about employee compensation being used to obtain birth control. But it has no compunctions about sending money to its Chinese suppliers, from whom it gets the vast majority of its merchandise. China’s abortion rate is TRIPLE that of the US, with more than 13 million abortions per year – and that doesn’t include another 10 million morning-after pills sold annually. Abortion in China is effectively REQUIRED by law under the one-child policy. When a woman who’s already had a child becomes pregnant, she may face fines and other sanctions if she does not obtain an abortion.
If abortion were a serious concern, Hobby Lobby could not send money to China, knowing that it’s far more likely to finance abortions there, compared to the same money being sent practically anywhere else on earth. One can only infer that their preoccupation with abortion does not rise to the level where it might cut into their profits. Hobby Lobby is happy to force its employees to make sacrifices for the firm’s religious beliefs – but the firm is unwilling to make sacrifices itself – and happy to turn a blind eye to make a buck.
Religious freedom is a good thing, if only because the alternative is so noxious. But that liberty in a polyglot society is about an individual’s freedom within his or her defined individual sphere – such freedom does NOT include an employer’s right to reach into his employees’ private lives, to impose his religious beliefs on them.
Inequality begins with poverty, and is perpetuated by underinvestment in education, health and social insurance. One in four American children are born into a poverty that’s deeper and harder to escape than poverty in other western countries. They arrive at public school at age five or six as damaged goods – one can hardly expect any public school system to reverse the harm done, no matter the budget. The US spends a lot on education – but like healthcare, education spending is tilted toward the heroic, not the fundamental: America is the land of elite $50k/year universities – and of failing elementaries and high schools.
Top universities like Harvard operate like modeling agencies: they only want you if you’re pretty. The Marine Corps, by contrast, believes it can take anyone and turn him or her into a Marine. Americans so thoroughly accept the distinct roles of public and elite schools that they hardly give it a thought. The best American universities – public and private alike – admit only the best of the best, and reject the rest. Yet Americans expect their public elementaries and high schools to function like the Marine Corps, and turn out disciplined, literate and numerate young people, no matter their circumstances when they enter.
We already know that poor children are different from other children in real, observable ways. Being in poverty as a child has long-lasting negative health and income effects, and the differences even show up in brain scans. Poor kids arrive at kindergarten with all-but-insurmountable deficits. If public schools are to be effective, they have to take kids at a younger age. By beginning public school at age three or four – adding pre-K, and even pre-pre-K – and guaranteeing at least two quality meals per day, five days per week, over what should be a 200-day school year, the public will have the opportunity to invest in all of our children at a critical, formative age, so that when they get to kindergarten, they arrive ready to learn.
Head Start, America’s most famous pre-K program, has had fantastic results. When Head Start kids become young adults, they are more likely to finish high school, begin college and go to work – and less likely to become teen parents. They’re also healthier. This should be the model for a nationwide public pre-K system – it is how America can escape its cycle of poverty and inequality. By giving every child the means to reach his or her full potential, America can live up to its meritocratic ideals. Its self-image notwithstanding, America today is the least meritocratic country in the West. An American child’s destiny lies not in his talents, but in the circumstances of his birth. This isn’t surprising, given the vast disparity in health and education resources available to American children, depending on who they were born to rather than on their innate talents.
While investing in public pre-K now, the US should follow up with free public community colleges at the other end. It is an embarrassment that America’s only federal universities are military schools. The federal government might lead by example and create a federal college system that’s free to anyone who passes an entrance exam. The exam could itself be a tool for maintaining high school standards. Alternatively or additionally, the federal government could provide aid and offsets to reduce the cost of locally based tertiary education to zero.
For decades, each successive American generation had far more education than generations past. But that trend ended abruptly around 1970, after which American education levels flat-lined and inequality exploded. Jump-starting growth in American education – at both the front and back end – is the key to future prosperity, and to breaking America out of its funk.