Thursday, December 19, 2013

Pentagon Spending Undermining US Economy



In fiscal year 2014, the federal government will spend around $3.8 trillion. Of that total, military spending will occupy $831 billion. This includes spending on military defense ($626.8 billion), veterans aid ($148.2 billion), foreign military aid ($14.3 billion), the war in Afghanistan ($92.3 billion) and the Department of Energy's nuclear weapons programs ($7.9 billion).

This massive sum takes into account the sequester, which forced the Pentagon to slice $52 billion from its budget for the 2014 fiscal year that began October 1.

Military spending is second only to Social Security. However, Social Security is funded by the payroll tax, which is paid by every working American.

Military spending, on the other hand, comes from the federal government's general fund. In other words, military spending comes at the expense of other domestic programs and needs.

U.S. government spending is divided into three groups: mandatory spending, discretionary spending and interest on debt.

Discretionary spending refers to the portion of the budget which goes through the annual appropriations process each year. In other words, Congress directly sets the level of spending on programs which are discretionary. Congress can choose to increase or decrease spending on any of those programs in a given year.

Military expenditures account for 57 percent of discretionary spending.

It is quite justifiable for the U.S. to aim to maintain the world's most powerful military, and to spend the most money to provide for it. However, it is not justifiable for the U.S. to grotesquely outspend not only any conceivable enemy, but essentially the rest of the world combined.

In 2012, U.S. defense spending was six times that of China, 11 times that of Russia, 27 times that of Iran and 33 times that of Israel.

In fact, the U.S. consumed 41 percent of total global military spending that year.

It's not just that the U.S. has a bigger budget and can therefore spend more. The U.S. also ranked among the top 10 countries in military spending as a percentage of Gross Domestic Product (GDP).

Unnecessary Pentagon spending creates fewer jobs than any other form of federal spending, including tax cuts to promote personal consumption. In fact, it destroys American jobs if the money for it comes out of our domestic economy, according to a 2011 study by the University of Massachusetts.

This study focuses on the employment effects of military spending versus alternative domestic spending priorities, in particular investments in clean energy, health care and education.

The study compared spending $1 billion on the military versus the same amount of money spent on clean energy, health care, and education, as well as for tax cuts which produce increased levels of personal consumption.

The authors concluded that $1 billion spent on each of the domestic spending priorities will create substantially more jobs within the U.S. economy than would the same $1 billion spent on the military.

The study also concludes that investments in clean energy, health care and education create a much larger number of jobs across all pay ranges, including mid-range jobs (paying between $32,000 and $64,000) and high-paying jobs (paying over $64,000).

Ultimately, all of this unnecessary military spending is bloating the federal budget, driving the deficit and piling onto our ever-expanding debt.

Admiral Mike Mullen, former chairman of the Joint Chiefs of Staff, gave Congress a very powerful warning in 2010.

"I think the biggest threat we have to our national security is our debt," the Admiral intoned.

It would be wise to heed his admonition.

Wednesday, December 04, 2013

By Almost Any Measure, the U.S. Healthcare System Is Failing



The U.S. spends more on healthcare each year than any other country in the world. Yet, according to a new report, the spending problem isn't the result of our rapidly growing senior population.

Instead, most of the money is being spent on people under age 65, and it is being directed toward chronic and preventable conditions, such as diabetes and heart disease.

The U.S. spends a whopping $2.7 trillion per year on health care, or nearly 18 percent of gross domestic product (GDP). But the nation gets relatively little for the enormous amount it is spending. In fact, "The U.S. 'system' has performed relatively poorly," reads the report.

The report, co-written by Dr. Hamilton Moses of the Alerion Institute in Virginia and Johns Hopkins University, had a rather surprising conclusion.

“In 2011, chronic illnesses account for 84 percent of costs overall among the entire population, not only of the elderly. Chronic illness among individuals younger than 65 years accounts for 67 percent of spending."

Despite the conventional wisdom, quite remarkably, the problem isn't old people.

The "price of professional services, drugs and devices, and administrative costs, not demand for services or aging of the population, produced 91 percent of cost increases since 2000,” reads the report.

Dr. Moses says that unlike a normal market, the healthcare market has no price discovery. Consumers operate in the darkness, entirely unaware of how much they are paying or what they are paying for. There are no market forces reining in costs because healthcare in the U.S. doesn't exist in a true market.

“This is not a market," Moses says. "It’s far from a market. Few prices are known. They are not publicized.”

Moses also says his team’s study shows that one of the biggest problems in the U.S. healthcare system is that it is based on a fee-for-service model in which doctors and other caregivers are motivated to give lots of tests and individual treatments, as well as to prescribe drugs, instead of keeping patients well.

In other words, it's all about the treatment of illness and disease, rather than prevention. If the system were based on performance and outcomes, we'd be spending a whole lot less money.

Yet, individuals can play a bigger role in their own health and wellness than any doctor simply by making lifestyle choices that lessen sickness, improve quality of life, and perhaps even extend longevity.

Keeping people from being afflicted by preventable diseases in the first place is the best way to reduce medical costs.

Some people contend that the U.S. has the world’s best health care system. But that claim simply doesn't square with the facts.

A study released in November (which was many years in the making) shows that Americans pay more per capita for health care than people in any other industrialized country. In return, we are sicker and die younger.

The Commonwealth Fund, which does research on health care and health reform, has continually shown that Americans spend far more on health care than any other nation — currently $2.7 trillion annually. That amounts to $8,508 per person, compared to $5,669 per person in Norway and $5,643 in Switzerland, the next-highest-spending countries.

In other words, no other nation's spending is even close to ours — even on a per capita basis.
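For readers who want to check the math, here's a minimal sketch of the per-capita figure. The population number is an assumption (roughly the U.S. population at the time), so the result only approximates the $8,508 cited above:

```python
# Rough per-capita health spending check; the population figure is an approximation.
total_spending = 2.7e12   # ~$2.7 trillion in annual U.S. health spending (from the report above)
us_population = 314e6     # assumed ~314 million people, an approximate figure

per_capita = total_spending / us_population
print(f"Per-capita spending: ${per_capita:,.0f}")  # roughly $8,600, in line with the $8,508 cited
```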

Yet, all that money isn't buying us much. There's very little return on the investment.

The U.S. has the eighth-lowest life expectancy in the Organization for Economic Co-operation and Development (OECD), a group of 34 developed nations.

Commonwealth Fund researchers found that 37 percent of Americans went without recommended care, did not see a doctor when sick, or failed to fill prescriptions because of costs, compared to as few as 4 percent to 6 percent in Britain and Sweden.

Additionally, 23 percent of American adults either had serious problems paying medical bills or were unable to pay them, compared to fewer than 13 percent of adults in France and six percent or fewer in Britain, Sweden, and Norway.

But what about access? Defenders of the U.S. system say Americans have a much easier time seeing their physician than patients in other countries. Not so.

Americans wait longer to see primary care doctors. In Germany, 76 percent said they could get a same or next-day appointment, and 63 percent in the Netherlands said the same. Meanwhile, just 48 percent in the U.S. said they had that level of access. In fact, only Canada scored worse, with 41 percent saying they could see their doctor that soon.

Sadly, the U.S. health system is plagued by problems.

An Institute of Medicine report released in 2012 found that the U.S. health care system wasted $750 billion in 2009 (about 30 percent of all health spending) on unnecessary services, excessive administrative costs, fraud, and other problems.

The Institute also found that as many as 75,000 people who died in 2005 would have lived if they had received the kind of care provided in the states with the best medical systems.

Quite plainly, the claims that the U.S. healthcare system is the greatest in the world are false or, at the least, misleading. By almost any measure, the U.S. lags the developed world, and even many developing nations.

Not only do Americans pay significantly more per capita for their healthcare than the citizens of any other nation on earth, they are also fatter, sicker, have less access, and die younger than those in other industrialized countries.

It's still unclear whether the Affordable Care Act (aka, Obamacare) will positively affect any of this, but it had better. The current state of affairs isn't just unacceptable; it's untenable.

Monday, November 18, 2013

American Poverty and Economic Decay Being Driven by Low Wage Jobs



You may notice the signs of economic decay all around in your community. Perhaps you have personally experienced (or are still experiencing) joblessness, the need for government assistance, or are somehow living on the edge economically.

One way or another, there are numerous signs that our economic security has deteriorated and that the American dream has faded away.

Nearly 50 million Americans (49.7 million) are living below the poverty line. But the level of economic insecurity goes well beyond those officially recognized by the government as living in poverty.

According to The Associated Press, four out of five U.S. adults struggle with joblessness, live near poverty, or rely on welfare for at least parts of their lives. That amounts to roughly 80 percent of American adults, a figure that is simply mind-blowing.

However, poverty is not a problem that plagues only racial and ethnic minorities. More than 19 million whites fall below the poverty line of $23,021 for a family of four, accounting for more than 41 percent of the nation’s destitute — nearly double the number of poor blacks.

Economic insecurity afflicts more than 76 percent of white adults by the time they turn 60, according to a new economic gauge to be published next year by the Oxford University Press. Measured across all races, the risk of economic insecurity rises to 79 percent.

“Economic insecurity” is defined as experiencing unemployment at some point in one's working life, or a year or more of reliance on government aid (such as food stamps), or income below 150 percent of the poverty line.

Millions of Americans cycle in and out of poverty at various points in their lives; four in 10 adults fall into poverty for at least a year.

The risk of falling into poverty has been rising in recent decades, particularly for those in their prime earning years (ages 35-55). For example, people ages 35-45 had a 17 percent risk of encountering poverty during the 1969-1989 time period. However, that risk increased to 23 percent during the 1989-2009 period.

The future projections are quite sobering. Based on the current trend of widening income inequality, close to 85 percent of all working-age adults in the U.S. will experience bouts of economic insecurity by 2030.

Yet, government safety net programs are the only thing keeping millions of additional Americans from falling into poverty.

The nation's poverty rate was 16 percent in 2012, according to new Census Bureau data that looks at how benefits and expenses affect family resources. Social Security, for example, kept 26.6 million Americans out of poverty last year. Food stamps provided by the Supplemental Nutrition Assistance Program, or SNAP, kept another 5 million people above the poverty level.

The main reason people fell into poverty last year was out-of-pocket health care expenses.

There are a near-record 47.6 million Americans, representing 23.1 million households, on the SNAP program. In other words, the program helps one in seven Americans put breakfast, lunch and dinner on the table.

Even as the stock market soars to new heights and income disparity widens to Great Depression-era levels, SNAP participation has doubled over the past 10 years and increased nearly 25 percent over the past four.

The cost of the program will reach $63.4 billion in 2013.

Poverty is becoming so widespread that it is creating a culture of government dependence. But safety net programs are becoming increasingly difficult to sustain, given the share of Americans who draw upon these various programs rather than fund them.

Tens of millions of Americans earn so little that they pay no federal income taxes. These folks do, however, pay payroll taxes, federal excise taxes (on things like gas, tobacco, alcohol and airfare), state taxes and local taxes.

A report from the Tax Policy Center (TPC) finds that 43 percent of Americans paid no federal income tax last year. About half of them earned too little to qualify, and many more were retired people who live on Social Security. In fact, two-thirds of this group are elderly. The remaining households likely qualified for tax breaks such as the Earned Income Tax Credit or the Child Tax Credit.

However, more than 70,000 households with income over $200,000 paid no federal income tax in 2013, according to the TPC.

The biggest culprit in all of this is low wages and incomes, which choke off demand and consumption — the basic components of economic growth. Consumer spending comprises 70 percent of our GDP. Low wages and incomes are also starving the Treasury of much needed tax revenue.

Since seven out of the 10 fastest-growing U.S. occupations pay less than the national median wage, more and more Americans are forced to rely on the social safety net.

To illustrate this point, a whopping 52% of fast-food employees’ families are forced to rely on public assistance for food and medical care due to low wages, which means American taxpayers are picking up the tab for corporations that pay poverty wages.

The average fast-food worker is now over 28 years old, meaning many support families with a combination of low fast-food wages and public assistance. That assistance is provided by American taxpayers.

A recent report from the University of California, Berkeley Labor Center estimated the cost of this at nearly $7 billion per year.

This puts a tremendous burden on American taxpayers, who have to fund these low-wage workers because their employers won't adequately do so.

Obviously, there are limits to the carrying capacity of the 57 percent of Americans who pay the federal income taxes that fund most safety net programs.

According to a recent in-depth study from the Heritage Foundation, "128,818,142 people are enrolled in at least one government program," based on U.S. Census Bureau information.

To be fair, the bulk of them are receiving Social Security (35,770,301) and Medicare (43,834,566) benefits, programs that workers pay into throughout their working lives.

However, Heritage researchers note that 48,580,105 people are on Medicaid, the health insurance program for the poor, and 6,984,783 people are living in subsidized rental housing.

There has always been poverty and there will always be poverty. It's as old as society itself. Some people will always be more skilled, more educated and more industrious. So, they will typically earn more as a result.

But there are millions of Americans working two jobs to get by, putting in as many as 80 hours per week. These people are not poor due to a lack of will, effort or hard work. And, as I recently reported, most American households now have at least two adult workers.

Fifty-eight percent of the jobs created during the "recovery" have been low-wage positions, according to a 2012 report by the National Employment Law Project. These low-wage jobs paid a median hourly wage of $13.83 or less.

Even worse, 30 million Americans are scraping by on the federal minimum wage. That’s one in five people with a job.

Someone working full-time at the federal minimum wage of $7.25 makes $15,080 in a year, before taxes. In real terms, the federal minimum wage has stagnated for 45 years because it hasn't kept pace with inflation.

Back in 1968, the minimum wage in the United States was $1.60 an hour. After you account for inflation, that is equivalent to $10.74 today.

If you were to work a full-time job at $10.74 an hour for a full year, you would make about $22,339 for the year.
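Here's a quick back-of-the-envelope sketch of where those annual figures come from, assuming full-time work of 40 hours a week for 52 weeks:

```python
# Annual pay at an hourly wage, assuming full-time work (40 hours/week x 52 weeks).
HOURS_PER_YEAR = 40 * 52  # 2,080 hours

current_minimum = 7.25          # federal minimum wage in 2013, dollars per hour
minimum_1968_adjusted = 10.74   # the 1968 minimum ($1.60) in today's dollars, per the figure above

print(f"Full-time pay at $7.25/hr:  ${current_minimum * HOURS_PER_YEAR:,.0f}")        # about $15,080
print(f"Full-time pay at $10.74/hr: ${minimum_1968_adjusted * HOURS_PER_YEAR:,.0f}")  # about $22,339
```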

That's not a lot of money. Yet, according to the Social Security Administration, 40.28% of all American workers make less than $20,000 a year.

This means more than 40% of all U.S. workers actually make less than what a full-time minimum wage worker made back in 1968.

That's how far we have fallen.

Low-wage jobs have undermined our economy and our society. They have swelled the ranks of the working poor, created a surging dependence on taxpayer-funded welfare programs, robbed the federal tax base and driven down demand and consumption, making a genuine economic recovery an impossibility.

Economic security is merely a fantasy for millions of Americans who have watched their American dreams fade to black like the final frames of a sad movie.

Tuesday, October 15, 2013

The Federal Budget Process 101



The federal budget process, and the debt ceiling, explained in relatively simple terms.

The Independent Report strives to be independent and apolitical in reporting on the U.S. economy. The focus is typically on the Federal Reserve's monetary policy, inflation, interest rates, unemployment, the housing market and the energy markets.

But another area of focus is fiscal policy: federal spending, revenues and deficits. It would be irresponsible, if not inconceivable, to ignore the nearly $17 trillion national debt.

The level of dysfunction in Washington has angered most Americans and it has dismayed other governments around the world, particularly our trading partners and the holders of our debt. With the current melodrama playing out in DC, a review of the federal budget process is in order. Think of it as "Federal Budget Process 101."

There is a lot of finger-pointing going on right now, but many Americans do not understand how the budget process works. Thanks to the Center on Budget and Policy Priorities for laying out the broad strokes.

The Congressional Budget Act of 1974 is the template for Congressional tax and spending legislation. Under the Budget Act, each year Congress is required to develop a "budget resolution" that sets aggregate limits on spending and targets for federal revenue.

The President's annual budget request kicks off the budget process. On or before the first Monday in February, the President submits to Congress a detailed budget request for the coming federal fiscal year, which begins on October 1. This budget request is developed by the President's Office of Management and Budget (OMB).

The President's proposed budget provides Congress with a recommendation for overall federal fiscal policy, including: total federal spending, how much revenue the government should collect, and how much of a deficit (or surplus) the federal government should run — which is simply the difference between spending and revenue. Cumulative, yearly deficits add to the overall national debt.
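The bookkeeping behind that last sentence is simple. Here's a minimal sketch using made-up numbers (purely for illustration) showing how each year's deficit gets added to the debt:

```python
# Hypothetical figures, in billions of dollars, for illustration only.
# Each year's deficit is spending minus revenue; the deficits accumulate into the debt.
years = [(3600, 2900), (3700, 3000), (3800, 3100)]  # (spending, revenue) per year

debt = 0
for spending, revenue in years:
    deficit = spending - revenue
    debt += deficit
    print(f"spending={spending}, revenue={revenue}, deficit={deficit}, cumulative debt={debt}")
```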

The President's budget is very specific, laying out recommended funding levels for all individual federal programs. The proposed budget typically outlines fiscal policy and budget priorities not only for the coming year, but for the next five years, or more. It is also accompanied by historical tables that illustrate past budget figures.

Nearly all of the federal tax code is set in permanent law and does not expire. Similarly, more than one-half of federal spending — including the three largest entitlement programs (Medicare, Medicaid and Social Security) — is also permanently enacted. Additionally, interest paid on the national debt is also paid automatically, with no need for specific legislation.

Funding for "discretionary" or "appropriated" programs falls under the jurisdiction of the House and Senate Appropriations Committees. Discretionary programs make up about one-third of all federal spending. Almost all defense spending is discretionary, as are the budgets for education, health research, housing, science, technology and transportation, to name just a few examples.

The next step in the federal budget process is the Congressional Budget Resolution. After Congress receives the President's budget request, the House and Senate Budget Committees generally hold hearings to question Administration officials about their requests and then develop their own budget resolutions. These resolutions then go to the House and Senate floors, where they can be amended by a majority vote. A House-Senate conference then resolves any differences in the resolutions and a conference report is passed by both houses.

The budget resolution requires only a majority vote to pass, and its consideration is one of the few actions that cannot be filibustered in the Senate. The budget resolution is supposed to be passed by April 15, but it often takes longer. If Congress does not pass a budget resolution, the previous year's resolution, which is a multi-year plan, stays in effect.

With all of the above in mind, hearing certain members of Congress complain — after the fact — about a budget that they approved and voted for rings hollow. All federal budgets, which typically include deficit spending, are approved by Congress before being signed into law by the President.

Which brings us to the current fiscal year.

The 2014 United States federal budget was issued by President Obama on April 10, 2013. As in any year, the actual appropriations for fiscal year 2014 must be enacted by both houses of Congress before they can take effect. The President's budget was submitted two months past the February 4 legal deadline due to negotiations over the fiscal cliff and implementation of the sequester cuts mandated by the Budget Control Act of 2011 (which was the result of the last debt crisis).

This means Congress still had nearly six months to review and counter the President's proposal before the new fiscal year commenced. The onset of the new fiscal year nearly coincides with the arrival of the "debt ceiling" on October 17, a date that has been looming for months. Waiting this long was clearly a tactic — a means of exacting negotiating leverage for a budget that should have been resolved months earlier.

The House Budget Bill was introduced on March 15, 2013 and passed the House with a simple majority, 221-207, on March 21, 2013. All 221 votes in favor of passage were from Republicans. Of those voting, every Democrat voted against passage, along with 10 Republicans.

The Senate rejected the House budget on March 21, 2013 with a vote of 59-40 and continued working on its own budget bill, which was introduced on March 15, 2013. On March 23, 2013 the Senate passed the resolution, 50-49, with 48 Democrats, 0 Republicans, and 2 Independents voting in favor of passage. Four Democrats and 45 Republicans voted against, with one Democrat not voting.

The political divide in both chambers is clearly evident.

By law, the two chambers of Congress were supposed to reconcile the two bills. Under regular procedures, the Senate and House were to appoint representatives to a joint budget conference committee to negotiate a compromise. However, the House balked.

Democratic Party members of the House Appropriations Committee wrote a letter on April 17, 2013 urging Speaker Boehner to appoint House members to the budget conference committee. Yet, the House majority refused to engage in a conference to reconcile total 2014 discretionary spending levels. Despite its refusal, the House was adamant that it would not raise revenues in any way, or by any amount.

Ultimately, there was no unified congressional budget. The House and Senate each moved forward with appropriations bills, but none passed. With fiscal 2014 approaching, Congress debated a Continuing Appropriations Resolution to temporarily fund the government. However, it failed to pass before the beginning of the new fiscal year (Oct. 1), leading to the current government shutdown.

While the House Republicans initially stated that their intention was to negotiate a budget that defunded the Affordable Care Act (aka, "Obamacare"), their strategy soon shifted to refusing to increase the debt ceiling, which is supposed to signify their opposition to government spending levels.

However, every federal budget, every year, has been approved by both houses of Congress before being signed into law by the President. It is ludicrous for Congress to now complain about spending that it previously approved. The time for dissent and negotiation has long since passed. A new fiscal year has already begun. Previously incurred debts are now due.

Spending has been reduced twice in recent years: under the Budget Control Act of 2011 and through the sequester cuts. More cuts are needed. But those should have been negotiated in April or May, not October.

Republicans in Congress are now taking the position that the only way to control future spending is by refusing to raise the debt ceiling. It's a tacit admission that they cannot control themselves and lack the ability to stop spending money the nation doesn't have.

However, the debate over the debt limit is a false argument; the debt ceiling and current spending levels are not correlated. While many Americans may not understand how the debt limit works, members of Congress surely do. Yet, they are playing on the public's lack of understanding to score political points.

The "debt ceiling," or debt limit, is a legislative restriction on the amount of national debt that can be issued by the Treasury. However, since expenditures are authorized by separate Congressional legislation, the debt ceiling does not actually restrict deficits. In effect, it can only restrain the Treasury from paying for expenditures that have already been incurred by Congress.

In other words, the debt ceiling only limits how much the Treasury can borrow to pay for past expenditures approved by Congress. The debt ceiling is raised as necessary through separate legislation.

A 2011 Government Accountability Office study found "the debt limit does not control or limit the ability of the federal government to run deficits or incur obligations. Rather, it is a limit on the ability to pay obligations already incurred."

A January, 2013 poll of a panel of highly regarded economists found that 84% agreed or strongly agreed that, since Congress already approves spending and taxation, "a separate debt ceiling that has to be increased periodically creates unneeded uncertainty and can potentially lead to worse fiscal outcomes."

The United States and Denmark are the only democratic countries to have legislative restrictions on issuing debt.

The U.S. has had some sort of legislative restriction on debt since 1917. However, since 1960, Congress has acted 78 separate times to permanently raise, temporarily extend or revise the definition of the debt limit — 49 times under Republican presidents and 29 times under Democratic presidents.

The United States has never reached the point of default where the Treasury was unable to pay its obligations. However, in 2011 the U.S. reached a point of near default. The delay in raising the debt ceiling resulted in the first downgrade in the United States' credit rating, a sharp drop in the stock market and an increase in borrowing costs.

Here's the important question: if the debt limit is so useless in restraining Congress' deficit spending and the issuance of new debt, why have a limit? Why the charade? Congress has continually voted to raise the ceiling anyway. Congress approves all expenditures, so it just needs to show some restraint in spending (or increase its revenues, or both). However, that restraint is due long before it's time to pay our bills for already incurred debt.

If Congress wants to place limits on the amount it can borrow to pay its debts, it must also place limits on the amount it first spends. The time for such decisions is in the spring or early summer, not October.

The U.S. is in a quandary. Not raising the debt ceiling would be economic suicide. The U.S. government would default for the first time in our nation's history — and it would be by choice. That would be insane. World markets would be rocked and the cost of borrowing would skyrocket.

As it stands, three-month Treasury bills today sold at a high rate of 0.13%, well above the 0.035% paid to sell the notes a week ago. That's nearly a fourfold jump, and the nation hasn't defaulted... yet.
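For those keeping score, here's the quick arithmetic behind that jump, using the two rates quoted above:

```python
# Change in the three-month Treasury bill rate, using the figures quoted above.
old_rate = 0.035  # percent, a week earlier
new_rate = 0.13   # percent, at the latest auction

ratio = new_rate / old_rate
pct_increase = (new_rate - old_rate) / old_rate * 100
print(f"The new rate is about {ratio:.1f}x the old one (a {pct_increase:.0f}% increase).")
# Roughly 3.7x, or an increase of about 270 percent.
```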

However, the U.S. is facing a genuine fiscal dilemma: It must borrow even more money, and go further into debt, in order to service its already massive debts. In essence, we're continually borrowing new money just to pay off old debts.

Tuesday, September 24, 2013

Americans Facing Economic Insecurity



For the past seven years, the Independent Report has been chronicling the decline in American incomes and living standards, the reliance on debt, and the increases in poverty, joblessness and low-wage work. But the trouble began long before.

Though the Great Recession wreaked havoc on the U.S. economy and on millions of families, the American worker — in fact, the entire U.S. middle class — had already been under assault for decades.

The inflation-adjusted wages of full-time, male workers are now lower than they were in 1973, according to Census figures, which has forced more women into the workforce. Yet, despite the emergence of female workers and two-income families, household income continues to fall.

The median income of American households was $51,017 in 2012, following a median of $51,100 in 2011, the Census Bureau reported on Sept. 17. While the Bureau said the decline was not statistically significant, it was a decline nonetheless and it followed two previous annual declines.

Yet, while incomes have been steadily falling, prices have been steadily rising.

Inflation rose 1.6 percent, 3.2 percent and 2.1 percent in each of the last three years, respectively. The "core" inflation measure favored by policymakers strips out the costs of food and gas, so the inflation households actually experience is higher than the headline figures suggest. Oil still trades at over $100 per barrel, which raises the cost of gas, food and all other consumer goods.

This fall in incomes was not merely the result of the Great Recession. The decline had already been underway for many years.

In 1999, median household income was $56,080, adjusted for inflation, according to the Census Bureau. So, our median household income has fallen more than $5,000 since that time. That's not progress; it's a stark decline.

We're just slowly, steadily, sliding backwards as a nation.

Meanwhile, the U.S. poverty rate was essentially unchanged at 15 percent in 2012, as roughly 46.5 million people were stuck living at or below the poverty line, the Census Bureau reports.

As the Associated Press noted, "It was the sixth straight year that the poverty rate had failed to improve, hurt by persistently high levels of unemployment after the housing bust."

In 2012, 13.7 percent of people ages 18 to 64 (26.5 million) were in poverty compared with 9.1 percent of people 65 and older (3.9 million) and 21.8 percent of children under 18 (16.1 million).

Numbers like these may seem remote and impersonal, but the stark takeaway is this: there are 16 million American children, or one-in-five, living in poverty today.

Yet, the federal poverty rate surely underestimates the true number of poor Americans. For example, the poverty level for 2012 was set at $23,050 (total yearly income) for a family of four. However, if a family of four has $30,000 in annual income, it's safe to say that they are still living in poverty.

For a family trying to pay for housing, medical insurance, food and utilities — especially in a large metropolitan area — even $30,000 doesn't go far.

The official poverty measure ignores critical information like geographical location and the cost of housing. It is determined using the price of certain food staples nationwide and uses a flawed formula to adjust for inflation that doesn't consider the cost of gas.

If the government were being honest about the true scope of poverty and struggle in America, an ugly and shameful picture would emerge.

Forty-five percent of Americans lack basic economic security, or the ability to pay for necessities like housing, utilities, food, health care, child care and transportation, according to a recent report by the nonprofit Wider Opportunities for Women.

It's more evidence that the middle class has been eviscerated.

The wealthiest Americans, however, are doing just fine. The top 10 percent of earners made half of all income in 2012, the most on record, according to IRS data.

Consequently, the gulf between the richest 1% of Americans and the rest of the country reached its widest level in history last year.

The top 1% of earners in the U.S. pulled in 19.3% of total household income in 2012, which is their biggest slice of total income in more than 100 years, according to an analysis by economists at the University of California, Berkeley, the Paris School of Economics, and Oxford University.

One of the economists behind the research, Emmanuel Saez of UC Berkeley, is a top researcher on the topic of wealth and income inequality. He won the John Bates Clark Medal in 2009.

In a separate analysis, Saez found the top 1% of earners posted 86% real income growth between 1993 and 2000. Meanwhile, the real income of the bottom 99% of earners grew just 6.6%.

Yet, the disparity has only worsened since that time.

According to the latest figures from Saez, the top 1 percent received 95 percent of all real income gains between 2009 and 2012. In 2011, when real income fell for the bottom 99 percent, the top percentile accounted for 121 percent of the year's income gains.

The richest percentile now accounts for 22.5 percent of total U.S. income.

How can this state of affairs exist in a country that prides itself on greatness?

The U.S. is a plutocracy. The richest one percent have become a controlling class that runs Big Banking, Big Energy, Big Pharma, Big Insurance, Big Healthcare, Big Agra, Big Media and the Military-Industrial Complex. These industries are the core of the U.S. economy.

To repeat: the top 10 percent of earners made half of all income in 2012, the most on record, according to IRS data.

This isn't good in a consumption-based economy. When so many Americans have so little disposable income, it chokes off demand.

It's not a matter of fairness or economic equality; it's a matter of national economic survival. Siphoning off so much income to the top 1 percent, or even top 10 percent, is crushing economic growth and it is driving the emergence of a low-wage economy.

Fifty-eight percent of the jobs created during the recovery have been low-wage positions, according to a 2012 report by the National Employment Law Project. These low-wage jobs had a median hourly wage of $13.83 or less.

Wealth isn't easily defined. To someone making minimum wage, a person who makes $100,000 annually may seem rich. Yet, to someone who earns $1 million annually, $100,000 may seem like chump change. And then there are the stunningly wealthy Americans who make tens of millions of dollars each year, like hedge fund managers and elite Wall Street bankers.

So, how much money do Americans make across the economic strata?

The median income for those employed full-time between the ages of 25 and 64 is $39,000, according to the Census Bureau. The median household income is roughly $51,000 (this means half of American households earned more than that amount, while half earned less).

A household, as the Census defines it, consists of all the people who occupy one house or apartment. That means anyone living under the same roof and includes families, roommates sharing an apartment, and people living on their own.

In 2010, 39% of all households had two or more income earners. As a result 19.9% of households had six figure incomes, even though just 6.61% of Americans had incomes exceeding $100,000.

To be clear, less than 7 percent of Americans make more than $100,000 annually. That's not a large group. And fewer than 1 percent of the U.S. population has an annual income of more than $1 million.

On the other hand, one household out of every four (24.9 percent) makes less than $25,000 a year. This isn't just a national shame; it's a national crisis.

Workers in seven of the 10 largest occupations typically earn less than $30,000 a year, according to data published by the Bureau of Labor Statistics.

As long as capital is favored over labor, this state of affairs will continue until the economy breaks down entirely, with American workers unable to purchase whatever goods they still produce (assuming those goods aren't already produced overseas).

America is devolving into a nation of serfs, ruled by a small class of oligarchs.

As it stands, the U.S. already has the highest income inequality in the developed world, and the fourth highest among all nations. All signs point to this blight continually worsening.

Economic opportunity and a respect for labor drove the emergence of a robust American middle class in post-war America. But falling wages, the off-shoring of jobs, inflation, income inequality and diminished opportunity have destroyed the middle class — a group that previously separated America from the rest of the world.

Tuesday, September 10, 2013

The Rise of (and need for) Female Economic Power



When I was growing up, my father noted that something had noticeably changed in the American economy since he was a young man. Whereas his parents could get by quite well and even live a middle-class lifestyle on just one income, that had changed by the early 1970s.

By that time, maintaining a grip on a middle-class lifestyle typically required two incomes, forcing both parents into the workforce.

Since that time, the situation has become even more pronounced.

According to the Families and Work Institute in New York, 80 percent of today's married/partnered couples have both people in the work force, up from 66 percent in 1977.

The proportion of wives working year-round in married-couple households with children increased from 17% in 1967 to 39% in 1996.

By 2012, the share of married-couple families with children where both parents worked was 59 percent. And the labor force participation rate (the percent of the population working or looking for work) of married mothers with a spouse present was 68.3 percent, according to the BLS.

Of course, the number of women in the workforce is considerably higher when you include single and divorced women. And a dual-income household includes unmarried people, such as cohabiting couples of both sexes, as well as roommates who share expenses.

The issue is the number of families with children that require two incomes just to make ends meet. For most American families, there is no choice in whether a mother or father gets to stay home and parent their children. Having two working parents is now an economic necessity for almost all families.

The annual median wage fell in 2010 for the second year in a row to $26,364, a 1.2 percent drop from 2009, and the lowest level since 1999, according to David Cay Johnston at Reuters. And according to the Census Bureau, per capita income was $27,915 in 2011.

This is why most households now require more than one income to get by.

According to the Social Security Administration, 40.28% of all American workers currently make less than $20,000 a year. One in five people with a job earns only the minimum wage.

This just reaffirms why the majority of American households require two income earners, not just one.

Median household income has been sliding backward for the last six years, according to a new Census Bureau report. In 2007, at the beginning of the Great Recession, it was $55,480. By June, 2009, when the recession had officially ended, it had fallen to $54,478. And by June of this year, it had dropped to $52,098.

Income of that level does not go far in our economy, given the cost of food, energy, housing, education, and healthcare.

During the Great Recession, and even in its aftermath, companies cut jobs and salaries to get leaner and lower costs. But the financial struggles of average Americans are not just a matter of lower incomes; the problem is coupled with the continually diminished purchasing power of our money.

This is attributable to the pernicious effects of inflation, which is engineered by the Federal Reserve (meaning that it is intended). Inflation is not some mysterious phenomenon. As the Fed has continually increased the money supply through the decades, it has steadily eroded and devalued the buying power of our money.

Inflation is a topic that I have explored repeatedly on this page through the years.

The effects of inflation, plus stagnant wages, have driven most American women into the workforce over the past four decades. This has resulted in such a historic shift that it can be aptly described as a sea change.

In 2010, for the first time in American history, the balance of the workforce shifted toward women, who now hold a majority of the nation’s jobs.

Women also dominate today’s colleges and professional schools: for every two men who will receive a B.A. this year, three women will do the same.

According to the Bureau of Labor Statistics, women now hold 51.4 percent of managerial and professional jobs — up from 26.1 percent in 1980. They make up 54 percent of all accountants and hold about half of all banking and insurance jobs. About a third of America’s physicians are now women, as are 45 percent of associates in law firms — and both those percentages are rising fast.

To be clear: not all working women are in the workforce simply out of economic necessity. Many women desire to work for a variety of personal reasons. Many of them have a skill or degree that they wish to utilize. Work can provide all people with a sense of community and of belonging. It can provide structure and a sense of purpose. Work can also be socially and mentally engaging. Additionally, it can provide a sense of identity and pride.

But there is no denying the economic impetus that has driven so many mothers into, or back into, the workforce.

It's bad enough that so many mothers are compelled to work as a result of economic necessity, even if they'd rather be at home with their young children. But they are also paid considerably less than their male counterparts for the very same jobs.

Women in the United States today are paid on average 77 cents for every dollar paid to men. The gap is even worse for African-American and Latina women.

Women ages 25 to 34 with only a high-school diploma currently have a median income of $25,474, while men in the same position earn $32,469.

Even among educated women, this wage-gap still exists.

The lifetime earnings of a male with a professional degree are roughly 40 percent (39.59%) higher than those of a female with a professional degree, according to the Census Bureau. The lifetime earnings gap between males and females is smallest for those holding an associate degree, with male lifetime earnings being 27.77% higher than those of females.

According to a new study done by the National Partnership For Women And Families (NPWF), the median yearly pay for women who are employed full time is $11,084 less than men’s.

This has major implications for the ability of families and single women to afford essentials like food, housing and gas. According to NPWF, in more than 15.1 million families the woman is the breadwinner. And 31 percent of these families fall below the poverty line.

So, while women have advanced professionally in so many ways relative to men in recent decades, their pay still lags their male counterparts. This affects the men in dual-income households as much as it does the women.

If women were paid commensurately with men, all American families would benefit.

In 1970, women contributed 2 to 6 percent of the family income. Now the typical working wife brings home 42.2 percent. And four in 10 mothers — many of them single mothers — are the primary breadwinners in their families.

However, as women climb the ranks of the professional world, advancement eventually stalls out. Only 3 percent of Fortune 500 CEOs are women. But given societal trends, that will likely change sooner than later.

Women now earn 60 percent of master’s degrees, about half of all law and medical degrees, and 42 percent of all M.B.A.s. Most important, women earn almost 60 percent of all bachelor’s degrees — the minimum requirement, in most cases, for an affluent life.

Americans ages 30 to 44 are now the first age group in which college-educated women outnumber college-educated men. So this is not a new trend; it has been underway for some time.

As women have stepped up and taken on increasingly larger roles in society and the workplace, their earning power has allowed many more families to maintain their foothold in the middle-class.

In fact, in many cases wives now out-earn their husbands — another historic shift. Of all married couples, 24 percent include a wife who earns more, versus 6 percent in 1960.

But as women have increasingly taken on a larger role in the workplace — whether voluntarily for personal reasons, or less voluntarily due to economic reasons — it has created a greater demand for child care and resulted in more latch-key kids. This has increased the pressure (and expense) for both parents.

It's laudable when women are able to enter the labor force at will to utilize their skills or degrees for their own self-fulfillment. But it's not so great when women must work just to make ends meet for their families, yet get paid less than their male counterparts for doing the very same jobs.

The great middle-class expansion in the U.S. began following World War II. Women had entered the workforce during the war as a matter of patriotic duty and necessity. Millions of men were fighting overseas, so women stepped up and fulfilled many of the jobs suddenly left vacant by men.

In 1940, only 28 percent of women were working; by 1945, this figure exceeded 34 percent. In fact, the 1940s saw the largest proportional rise in female labor during the entire twentieth century.

However, more than half of the women drawn into the workforce by the war had left at the end of the decade. The Baby Boom had begun and for most of them work had become a choice — not a necessity.

In contrast, by 2011, 58.1 percent of women were in the labor force (which includes the noninstitutionalized civilian population, 16 years of age and over, that is willing and able to work and is either employed or actively seeking employment). But in recent years that percentage has been shrinking due to high unemployment. The women’s labor force participation rate peaked at 60.0 percent back in 1999.

Having more women working can be viewed as a sign of progress and of gender equality. But the wage difference between men and women remains very antiquated and even sexist. This bias is impacting nearly all American households.

The primary issue is that the post-war rise of the American middle-class was built largely on the backs of a single income-earner (typically men). But in order for the vast majority of American families to maintain their tenuous grip on that middle-class status, it almost certainly requires two incomes these days.

That's surely not a sign of progress.

Saturday, July 27, 2013

Savings Crisis Leaving Americans Unprepared for Emergencies/Retirement



For many years, Americans have shown a consistent inability to put money into savings, be it for emergencies, retirement, or the larger, necessary purchases that may arise.

In fact, the savings rate has been declining for decades, notwithstanding a brief uptick during the Great Recession. Here's a look at where the savings rate stood at the start of each of the last six decades, according to the Commerce Department's Bureau of Economic Analysis:

1960: 7.2%
1970: 9.4%
1980: 9.8%
1990: 6.5%
2000: 2.9%
2010: 5.1%

In 2005, the savings rate actually turned negative for the first time since the Great Depression, and it stayed that way for about two years.

While the savings rate reached a high of 5.4% in 2008, as Americans were trying to pay down their debts during the initial phase of the financial crisis, it started declining again in 2011.

This will have horrible consequences for the millions of Americans who will be facing a retirement funded entirely by Social Security. Pensions are largely a thing of the past and 50 percent of Americans don't participate in a retirement savings plan at work.

As it stands, current retirees are already relying far too heavily on Social Security.

According to the Social Security Administration, 23 percent of married couples and 46 percent of single people receive 90 percent or more of their income from Social Security. Furthermore, 53 percent of married couples and 74 percent of unmarried people receive half of their income or more from the program.

The average monthly Social Security benefit for retirees is just $1,262. That amounts to just $15,144 annually. Obviously, that doesn't go far.

According to a report by AARP, three out of five families headed by a retiree over 65 had no retirement savings. And half of those 65 and older had annual individual incomes of less than $18,500.

The problem for millions of Americans is that they simply can't afford to save. Adjusted for inflation, wages have been stagnant since the 1970s.

As bad as the Great Recession was to household incomes, those incomes have continued falling during the alleged recovery. Between June 2009, when the recession officially ended, and June 2011, inflation-adjusted median household income fell 6.7 percent, to $49,909, according to a study by two former Census Bureau officials.

The problem continues: inflation-adjusted wages fell 0.4% in 2012, following a 0.5% decline in 2011.

So this gives a pretty clear indication as to why Americans aren't saving for retirement or an emergency; they simply can't afford to.

Here's a look at the U.S. savings rate in recent years, according to the Organization for Economic Cooperation and Development (OECD).

2006: 2.6%
2007: 2.4%
2008: 5.4%
2009: 5.1%
2010: 5.3%
2011: 4.7%
2012: 4.3%
2013: 4.0%

However, the personal savings rate was just 3.20% in May. It had been as high as 6.4% in December.

The low savings rate creates the potential for crisis for millions of individuals and families.

Nearly three-quarters of Americans don’t have enough money saved to pay their bills for six months, according to survey results released in June by Bankrate.com.

Half of the survey respondents said they had less than three months’ worth of expenses saved up, and more than one-quarter have no reserves to draw on in case of emergency.

In addition to stagnant wages, persistent unemployment has made it difficult for Americans to put any money away.

Low- and middle-income Americans were hit harder by the recession and slow recovery than the wealthy. While the annual wages of the bottom 90 percent of workers declined between 2009 and 2011, the wages of the top one percent rose 8.2 percent during the same period, according to a January analysis by the Economic Policy Institute.

According to Bankrate, if Americans want to ensure they're protected in the event of a financial emergency, like a job loss or a medical issue, they should have enough money saved to cover about six months' worth of bills.

The U.S. retirement savings deficit is between $6.8 and $14.0 trillion, according to the National Institute on Retirement Security.

That is a staggering sum of money, or rather, a staggering deficit.

The average working household has virtually no retirement savings. When all households are included — not just households with retirement accounts — the median retirement account balance is $3,000 for all working-age households and $12,000 for near-retirement households.

Previous generations were much better at saving. During World War II, Americans were encouraged to buy government bonds as a matter of patriotic duty to aid the war effort. Following the Great Depression, regulation of the banking/financial industry greatly diminished bank failures, encouraging more Americans to save.

Credit was also not nearly as available in earlier eras, which encouraged people to save for future needs, whether it was a car, or a house, or an education. But, beginning in the '80s, there was an explosion of cheap and easy credit. That allowed people to get by without saving.

But the larger issue is the fact that household incomes have been flat for decades, while inflation has driven the prices of everything higher.

I've been covering the savings crisis and its implications for the retirement security of Americans since 2005, and the story hasn't gotten any better.

I asked the question "Will You Have Enough to Retire?" in February, 2006, and later that year I asked, "Are You Retirement Ready?"

And in September, 2010, I noted that "Americans' Retirement Savings Look Bleak."

Sadly, nothing has changed in recent years. Millions of people have simply moved closer to, or into, retirement quite unprepared. This has huge implications for our country.

Seniors are among the most vulnerable people in our society. Many will have to rely on their adult children or other family members to help them get through their final years. That will place tremendous burdens on already struggling families.

Assisted living facilities are very expensive. Nursing homes are even more expensive. Home health care and aides are also beyond the reach of great numbers of our senior population.

None of this is appropriate or acceptable for such a wealthy nation — one that sees itself as a first-rate world leader.

For millions of Americans, what were supposed to be their "golden years" will not be so golden after all. At the least, they won't be nearly as golden as those of their parents and grandparents.

Saturday, July 20, 2013

U.S. Trade Deficit Dragging Down Economic Growth, Sending Billions Overseas



Warnings about the size of the federal budget deficit have made headlines for many years. We've repeatedly been cautioned about the threat the federal deficit poses to the U.S. However, we hear relatively little about the nation's trade deficit.

The U.S. has consistently run a gaping trade deficit for decades because we import more than we export. In fact, the U.S. has led the world in imports for decades and is also the world's biggest debtor nation.

Countries with big, persistent trade deficits have to continually borrow to fund themselves. The problem for the U.S. is that we don't export nearly enough to continue paying for all those cheap foreign goods that we've grown so accustomed to.

A trade surplus is preferable to a trade deficit since it generally implies that a nation's goods are competitive on the world stage, its citizens are not consuming too much, and that it is amassing capital for future investment and economic pursuits. The U.S. hasn't known such a position since 1975.

The trade deficit acts as a drag on economic growth because it means the U.S. is earning less on overseas sales of American-produced goods while spending more on foreign products. Buying goods from abroad means they are not being made here at home, and that displaces American jobs. Simply put, the trade gap subtracts from economic growth (GDP).
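That's just the standard expenditure accounting for GDP: consumption plus investment plus government spending plus net exports (exports minus imports). A minimal sketch with hypothetical numbers shows how a trade deficit subtracts from the total:

```python
# GDP by the expenditure approach. All figures are hypothetical (billions of dollars),
# chosen only to illustrate how net exports enter the calculation.
def gdp(consumption, investment, government, exports, imports):
    net_exports = exports - imports  # negative when imports exceed exports
    return consumption + investment + government + net_exports

balanced = gdp(consumption=11000, investment=2700, government=3100, exports=2200, imports=2200)
with_deficit = gdp(consumption=11000, investment=2700, government=3100, exports=2200, imports=2700)

print(balanced - with_deficit)  # 500: the trade gap comes straight out of measured GDP
```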

Each and every month, tens of billions of dollars are being drained out of the U.S. economy.

Such an imbalance has been able to persist for nearly four decades only because the U.S. has run a surplus in the trade of services (tourism, financial services, telecommunications, etc.). However, the overall trade deficit is unsustainable in the longer term. You can't buy more than you sell indefinitely.

This problem will not be easy to rectify. It can't be fixed in a quarter, a year, or even during a president's term. This is a structural problem, not merely a cyclical one. It has been decades in the making, and at this point it will be very difficult to reverse.

There was some modest progress last year, however.

According to the U.S. Census Bureau, the U.S. trade deficit in goods and services declined from $559.9 billion in 2011 to $540.4 billion in 2012, an improvement of $19.5 billion (3.5 percent). It marked the first time in three years that the trade deficit fell.

Record exports, a drop in the cost of imported oil, and a slowdown in the country’s demand for imported consumer goods led to the decline.

However, there is no getting around the fact that the U.S. ran a trade deficit in excess of half a trillion dollars in successive years. And the problem remains persistent.

This year, the U.S. posted a trade deficit of $44.5 billion in January, $43.6 billion in February, $37.1 billion in March, $40.1 billion in April and $45.0 billion in May. That amounts to a cumulative deficit of $210.3 billion through the first five months of this year, meaning that the trade deficit is once again on track to exceed half a trillion dollars this year.
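
As a quick arithmetic check, here is a minimal sketch (using only the monthly figures just cited) that sums the five monthly deficits and projects the pace over a full year:

```python
# Monthly U.S. trade deficits cited above, in billions of dollars (Jan-May 2013).
monthly_deficits = [44.5, 43.6, 37.1, 40.1, 45.0]

cumulative = sum(monthly_deficits)                          # 210.3 billion through May
annualized_pace = cumulative / len(monthly_deficits) * 12   # roughly 505 billion per year

print(f"Cumulative, January-May: ${cumulative:.1f} billion")
print(f"Full-year pace at this rate: ${annualized_pace:.1f} billion")
```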

Oil has long been a major factor in the trade deficit. Yet, a structural shift is underway. While the U.S. trade deficit in petroleum goods declined $34.8 billion (10.7 percent), the U.S. trade deficit in non-petroleum goods increased $35.3 billion (8.8 percent).

As the Economic Policy Institute (EPI) put it:

"Growing goods trade deficits have eliminated millions of U.S. manufacturing jobs over the past decade, and non-petroleum goods were responsible for the vast majority of the jobs displaced. Rapidly growing trade deficits in non-petroleum goods, especially manufactured products, represent a substantial threat to the recovery of U.S. manufacturing employment."

Obviously, the trade deficit is the result of a whole lot more than crude oil. We'll explore the role of oil in the trade deficit in a moment.

The good news was that the U.S. shipped a record $2.19 trillion in exports in 2012, despite significant economic headwinds around the world that undercut global trade. Europe, for example, is in a recession.

Exports as a share of GDP held steady at a record 13.9 percent, according to the Commerce Department. But that remains one of the lowest export levels in the world among large, industrialized nations. Clearly, the U.S. will not export its way to an economic recovery.

The U.S. relies heavily on consumption to drive its economy, yet American consumers are buying a disproportionate share of their goods from foreign nations. That doesn't help the U.S. economy much. Though cheap foreign goods help hold down consumer prices, they come at the expense of American jobs.

Americans are literally buying tons of foreign goods each month instead of making them here at home. That's a shortsighted policy.

Consumer spending now comprises 71% of the U.S. economy. Yet, far too much of that spending is directed toward foreign goods. For comparison, consumer spending was about 62% of GDP in 1960, when the economy was more balanced. At that time, the U.S. manufactured and exported far more than today.

Durable (6 percent) and non-durable (6 percent) manufacturing amount to just 12 percent of the U.S. economy. While the U.S. is still the world's leading manufacturer (by some estimates, China may have recently surpassed the U.S.), our share of global manufacturing has been declining for decades.

The U.S. share of global manufacturing now stands at 18%, down from 29% in 1970. And just 9 percent of U.S. workers are employed directly in manufacturing.

In January 2004, the number of manufacturing jobs stood at 14.3 million, down by 3.0 million jobs, or 17.5 percent, since July 2000 and about 5.2 million since the historical peak in 1979. Employment in manufacturing was at its lowest level since July 1950.

So, the decline in manufacturing is contributing to the trade deficit, which has been decades in the making. The U.S. has been running consistent trade deficits since 1976 due to high imports of oil and consumer products.

More than half of the U.S. trade deficit is with China, making it by far the largest deficit with any individual country. So far this year, the U.S. deficit with China is running 3% higher than last year.

While this trade deficit is benefitting the Chinese job market, it is hurting American workers.

According to a 2011 Economic Policy Institute report, the growth in the U.S. trade deficit with China displaced 2.8 million U.S. jobs between 2001 and 2010 alone.

For many years, China has undervalued its currency (the yuan) in relation to the dollar to keep its products artificially inexpensive in the U.S. while discouraging U.S. exports to China. This policy is contributing to high unemployment in the U.S.

As noted earlier, one of the biggest drivers of the U.S. trade deficit is imported crude oil. Oil is priced in dollars and the dollar is buying less these days.

In 2001, the U.S. Dollar Index traded at around 120. Today, it is trading at about 81, roughly 32 percent below the 2001 high. That's a serious decline in value in little more than a decade.

The weakened dollar is punishing Americans every time they fill up their gas tanks.

Though a declining dollar makes U.S. exports cheaper overseas, our No. 1 import is oil, which is also priced in dollars. A weak dollar makes oil, and ultimately gasoline, more expensive, forcing the trade deficit further into the negative.

Right now, virtually all developed nations are seeking to devalue their currencies to increase exports. But every country can't have a trade surplus. Someone has to buy. For decades, the primary buyer has been the U.S.

However, the reason our economy melted down in the first place is that it was built on a bubble of debt. American consumers simply cannot continue taking on ever more debt while serving as the world's primary consumer.

Many economists believe that a currency war is currently underway. Call it a race to the bottom.

Japan’s money-printing and bond-buying program is the latest salvo in the global currency war. The U.S., U.K. and Switzerland have already joined the battle. In fact, nations that constitute around 70% of world economic output are “at war,” pursuing policies that cause devaluation and currency debasement to differing degrees. A weaker currency boosts exports, driven by cheaper prices.

However, just as every nation cannot be a net exporter, neither can every nation have the cheapest currency by implementing similar devaluation policies. And if everyone devalues, then what's the point?

As long as the trade deficit continues, the U.S. will have to continue borrowing from abroad to pay the difference. That's why the trade deficit is so pernicious.

Since imports subtract from the nation's gross domestic product, U.S. GDP will continue to face downward pressure. Every $1 billion of a larger deficit subtracts about 0.1 of a percentage point from the annualized growth rate. That's bad news for an economy that is currently struggling to eke out a mere 2 percent growth rate.
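
Taking the post's rule of thumb at face value, a small sketch shows how a widening deficit translates into a growth drag. (The $5 billion figure below is purely hypothetical, chosen only to illustrate the arithmetic.)

```python
# Rule of thumb cited above: each $1 billion of additional trade deficit
# shaves roughly 0.1 percentage point off the annualized GDP growth rate.
DRAG_PER_BILLION = 0.1  # percentage points

def growth_drag(widening_in_billions: float) -> float:
    """Estimated reduction in annualized GDP growth, in percentage points."""
    return widening_in_billions * DRAG_PER_BILLION

# Hypothetical example: the deficit widens by $5 billion.
print(growth_drag(5.0))  # 0.5 percentage point shaved off growth
```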

The U.S., the world's No. 1 importer, has been able to run continual trade deficits for many years because it has been receiving an inflow of capital from surplus nations, such as China, Japan and Saudi Arabia. If these surplus nations ever hope to get repaid (i.e. to reverse those capital flows) then those trade imbalances must be reversed.

That will require less consumption, more saving and more production here at home, plus more consumption and less saving in places like China.

America needs to produce more, export more, and save more. For more than a quarter-century, we did exactly the opposite. China, meanwhile, needs to do the reverse of what it has long been doing: import more and spend more.

No nation can continually buy more from abroad than it sells abroad. It's simple arithmetic. Where will the money for all the purchases come from?

For decades, the U.S. has consumed more than it has produced, imported more than it has exported, and borrowed more than it has saved. The trade deficit is the unfortunate result of all those imbalances.

Sunday, July 07, 2013

Home Prices vs. Incomes: The Unravelling of the American Dream



The run-up in home prices in the last decade was unprecedented. Though home owners in some markets were thrilled to see their homes appreciating by 10, 15 or even 20 percent annually, these large increases priced out numerous first-time buyers and led many others to take on far too much debt in order to get aboard the runaway property train.

The rapid and substantial increase in home prices far outstripped income gains, otherwise known as Americans' ability to pay for these new homes. Despite the decline in home prices since the bubble burst, the disparity between prices and incomes still exists.

Home prices nationally had finally returned to their autumn 2003 levels by November of 2012, yet were still down about 30 percent from their respective peaks in June/July 2006.

That gives some perspective to just how over-inflated our national housing bubble truly was in the previous decade. The annual price appreciation from 2000 through 2006 was quite remarkable.

The following shows U.S. home-price appreciation by year, beginning in 2000, according to the Federal Housing Finance Agency (FHFA). Keep in mind that this was on a national level, and that in some markets the increases were considerably higher.

2000: 6.76 percent
2001: 6.61 percent
2002: 7.47 percent
2003: 7.57 percent
2004: 9.78 percent
2005: 9.84 percent
2006: 3.12 percent
2007: -2.35 percent
2008: -9.95 percent
2009: -2.06 percent
2010: -4.28 percent
2011: -3.12 percent

Even after steadily falling for five consecutive years, home prices are still 50% higher than they were in January 2000, according to the S&P/Case-Shiller Home Price Index (which refers to a typical home located within the 20 surveyed metropolitan areas).
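
To see how the annual FHFA rates listed above compound, here is a minimal sketch that chains them together. It yields roughly +31 percent for 2000 through 2011 on the FHFA series; the 50 percent figure just mentioned comes from the S&P/Case-Shiller 20-city index, which tracks a different set of homes, so the two cumulative numbers differ.

```python
# Annual U.S. home-price appreciation (percent), FHFA, 2000-2011, from the list above.
annual_changes = [6.76, 6.61, 7.47, 7.57, 9.78, 9.84, 3.12,
                  -2.35, -9.95, -2.06, -4.28, -3.12]

index = 1.0
for pct in annual_changes:
    index *= 1 + pct / 100   # compound each year's change

# Compounds to roughly 1.31, i.e. about +31% over the period for this series.
print(f"Cumulative change, 2000-2011: {100 * (index - 1):.1f}%")
```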

The price of new homes increased by 5.4% annually from 1963 to 2008, on average, according to the Census Bureau. That period includes the enormous price bubble of the last decade.

However, taking a longer view, the average annual home price increase in the U.S. from 1900 to 2012 was only 3.1%. So, the bubble years were truly an anomaly.

Your parents and grandparents viewed their homes as places to live, eat, sleep, raise a family and create memories. A home was a shelter that would gradually increase in value over time. Prior generations did not view their homes as investments. There was no expectation that houses could be a means to get rich.

Homes require upkeep, maintenance, insurance, taxes and interest payments. Yet, home values steadily increased for many years, decade after decade. It was enough to overlook the associated costs of ownership.

According to the Census Bureau, median home values (adjusted for inflation) nearly quadrupled over the 60-year period from the first housing census in 1940 to 2000. The median value of single-family homes in the United States rose from $30,600 in 1940 to $119,600 in 2000, after adjusting for inflation.

Median home value increased in each decade of this 60-year period, rising fastest in the 1970s (43 percent) and slowest in the 1980s (8.2 percent).
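
As a rough back-of-the-envelope check on what that 60-year run implies per year, the inflation-adjusted 1940 and 2000 medians cited above can be converted into a compound annual rate; it works out to a bit over 2 percent in real terms.

```python
# Inflation-adjusted median single-family home values (Census Bureau figures above).
value_1940 = 30_600
value_2000 = 119_600
years = 60

# Compound annual growth rate over the period.
cagr = (value_2000 / value_1940) ** (1 / years) - 1
print(f"Implied real annual appreciation, 1940-2000: {cagr:.2%}")  # roughly 2.3%
```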

This all occurred before the Federal Reserve initiated the housing bubble by slashing interest rates in response to the bursting of the tech bubble, which hammered Wall St. and its investors.

Let's examine home price increases over the past four decades. None of the following Census Bureau figures are inflation-adjusted.

In 1970 the median home price was $23,400.
In 1980, the median price was $64,600 (176 percent increase).
In 1990, the median price was $122,900 (90 percent increase).
In 2000, the median price was $169,000 (38 percent increase).
In 2010, the median price was $221,800 (31 percent increase).

The median home price peaked at $247,900 in 2007, the height of the housing bubble.

The median home price had advanced every year since 1963, with the exception of 1970 (when it dropped to $23,400 from $25,600 in 1969) and 1991 (when it dropped to $120,000 from $122,900 in 1990). In both of those years ('70 and '91), economic recessions were occurring.

So, the bursting of the 2000s housing bubble was largely an anomaly, and the price decline was extraordinary. The median home price fell from $247,900 in 2007 to $232,100 in 2008. Then it fell yet again, to $216,700 in 2009. A decline in back-to-back years was unprecedented in the modern U.S. economy, and it had not been experienced since the dark days of the Great Depression.

The big question is, how have median home prices stacked up against median household incomes? In other words, have incomes kept up with the almost perpetual rise in home prices?

The Census Bureau publishes median household income data from 1967 to the present day.

According to the Census Bureau, "household median income" is defined as "the amount which divides the income distribution into two equal groups; half having income above that amount, and half having income below that amount."

The Census Bureau publishes both nominal (not inflation-adjusted) and inflation-adjusted figures for household income. Since we looked at nominal figures for home prices, we'll do the same with household incomes.

In 1970, the median household income was $7,143.
In 1980, the median household income was $16,200 (126 percent increase).
In 1990, the median household income was $27,922 (72 percent increase).
In 2000, the median household income was $40,418 (44 percent increase).
In 2010, the median household income was $47,425 (17 percent increase).

We can see that median household income more than doubled from 1970 to 1980, climbing 126 percent. But then income growth slowed in the ensuing decades, rising 72 percent between 1980 and 1990, 44 percent between 1990 and 2000, and then just 17 percent between 2000 and 2010.

Clearly, median household incomes were not keeping up with the rise in home prices, decade after decade.

Let's look at the comparison more closely.

While the median home price went up 176 percent from 1970 to 1980, median household income went up just 126 percent.

While the median home price went up 90 percent from 1980 to 1990, median household income went up just 72 percent.

While the median home price went up 38 percent from 1990 to 2000, median household income went up 44 percent.

While the median home price went up 31 percent from 2000 to 2010, median household income went up just 17 percent.

The only decade in which income gains eclipsed the rise in home prices was the 1990s. In every other decade, the increase in home prices significantly outpaced the rise in household incomes.
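
A small sketch makes the decade-by-decade comparison explicit, computing the percent change in each series from the nominal Census figures listed above; the results match the comparisons just described, within rounding.

```python
# Nominal Census Bureau figures listed above, by decade.
median_home_price = {1970: 23_400, 1980: 64_600, 1990: 122_900,
                     2000: 169_000, 2010: 221_800}
median_household_income = {1970: 7_143, 1980: 16_200, 1990: 27_922,
                           2000: 40_418, 2010: 47_425}

def pct_change(series, start, end):
    """Percent change in a series between two years."""
    return 100 * (series[end] - series[start]) / series[start]

for start, end in [(1970, 1980), (1980, 1990), (1990, 2000), (2000, 2010)]:
    homes = pct_change(median_home_price, start, end)
    incomes = pct_change(median_household_income, start, end)
    print(f"{start}-{end}: home prices {homes:+.0f}%, household incomes {incomes:+.0f}%")
```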

As if that wasn't bad enough, the income numbers are quite different, and tell an even worse story, after adjusting for inflation.

In 1970, the inflation-adjusted median household income was $45,146.
In 1980, the inflation-adjusted median household income was $46,024.
In 1990, the inflation-adjusted median household income was $49,950.
In 2000, the inflation-adjusted median household income was $54,841.
In 2010, the inflation-adjusted median household income was $50,831.

This means that household incomes barely budged for a couple of decades, and then they went backward in the last decade. Americans are making roughly the same amount today, in inflation-adjusted terms, as they were making back in 1990.

And the problem of falling incomes stubbornly persists. Median household income after inflation fell again in 2011, to $50,054. It was part of an ongoing pattern. Inflation-adjusted wages fell 0.4% in 2012, following a 0.5% decline in 2011. Simply put, wages aren't keeping up with inflation.

Sixty-five percent of the jobs added to the U.S. economy since the recession officially ended have been lower wage jobs ($35,000 annually, or less). Those aren't the kind of jobs that allow people to service an existing mortgage, buy a new home, or firmly entrench anyone in the middle class.

Given the steady increase in home prices and the rate of inflation in general, that's been back-breaking for the typical American family.

Historically, from 1914 through 2012, the U.S. inflation rate averaged 3.36 percent a year. Compounded over a decade, that means the dollar loses roughly 28 percent of its buying power, close to a third, every ten years.
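
The compounding behind that figure is simple enough to check with a couple of lines:

```python
# Average annual U.S. inflation, 1914-2012 (percent), as cited above.
avg_inflation = 3.36

# Purchasing power of one dollar after a decade of inflation at that rate.
remaining = 1 / (1 + avg_inflation / 100) ** 10
print(f"Buying power left after 10 years: {remaining:.1%}")       # about 72%
print(f"Buying power lost over the decade: {1 - remaining:.1%}")  # about 28%
```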

Given that inflation-adjusted median household incomes are now at the same level as in 1990, and that they are only marginally higher than in 1970, inflation has really punished most Americans and that is clearly seen in the housing market.

Since home prices are still well below where they were during the housing boom, when so many people bought into the market, millions of homeowners remain underwater.

A total of 13 million borrowers, or 25.4 percent of all homeowners with a mortgage, still owe more on their mortgages than their homes are worth, according to a new report from Zillow.

This prevents these people from moving and it is keeping millions of homes off the market. Another 9 million borrowers, while not entirely underwater, likely do not have enough equity in their homes to afford to move, according to Zillow.

That means a whopping 22 million homeowners are essentially stuck in their homes. This is limiting the supply of houses on the market and is driving up prices once again.

So, while home ownership was traditionally seen as the embodiment of the "American dream," it has become a nightmare for millions of Americans. And for millions of others, it is nothing more than a pipe dream, an aspiration that has become entirely out of reach.

Every time there is an increase in home prices, home owners cheer. Meanwhile, those on the sidelines, who hope to one day become homeowners, watch as their dream seems to move further off into the ether.

Saturday, June 01, 2013

The End of Economic Growth?



Perpetual or infinite growth is the economic paradigm in most of the world. Nowhere is this more evident than on Wall St., where companies are expected to consistently grow their revenues and profits year after year, even quarter after quarter.

In fact, a company can show growth, but if that growth is deemed inadequate — meaning it misses estimates — the company will be punished by Wall St.

Perpetual growth isn't just expected, it is mandated. But while infinite growth has long been taken for granted, it seems that we are now living in a time where it will no longer be the norm.

As a nation's population grows, its economy also has to grow in order to support all of the new workers entering the workforce. The US economy, for example, needs to grow at least 2.5 percent annually just to keep up with its population growth. But that kind of growth is becoming harder to rely on. As I reported previously, the US economy has been slowing for many years.

This trend has gotten the attention of some respected economists who have some sobering warnings for us.

The IMF projects that global growth “will slip below 2% in 2013.” Yet, through much of history, that sort of growth would be cause for celebration.

In August of 2012, economist Robert Gordon published a disturbing research paper titled “Is U.S. Economic Growth Over?” In it, he notes that for the five centuries leading to the 18th century, the per capita growth rate was only 0.2 percent annually.

Then, during the Industrial Revolution, the U.S. growth rate shot up to 2.5 percent through 1930. A string of innovations, such as the steam engine, railroads and electricity, drove that growth. But “it’s been downhill since 1950,” says Gordon, with growth averaging 2.1 percent.

On this trajectory, Gordon warns, the American economy will be back where it started by 2100, at annual growth of just 0.2 percent.

Given that the current economic, political and social systems are predicated on infinite economic expansion, the mere suggestion of this is unacceptable to our national leaders. It doesn't fit within the framework of how we view ourselves as a great nation, and publicly accepting this certainly won't help any politician win an election. That's because many Americans will similarly refuse to accept this, even when it proves to be true.

Some truths are just too hard and too bitter.

Gordon found that prior to 1750 there was little or no economic growth (as measured by increases in gross domestic product per capita).

It took approximately five centuries (from 1300 to 1800) for the standard of living to double in terms of income per capita. Between 1800 and 1900, it doubled again. The 20th Century saw rapid improvements in living standards, which increased five- to six-fold. Living standards doubled between 1929 and 1957 (28 years) and again between 1957 and 1988 (31 years).
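
A quick way to see what those doubling times imply is to invert them. The sketch below converts each doubling period cited above into the compound annual growth rate it implies, broadly consistent with the roughly 0.2 percent pre-industrial figure and the 2 to 2.5 percent modern one.

```python
# Periods over which per-capita living standards doubled, per Gordon (years).
doubling_periods = {"1300-1800": 500, "1800-1900": 100,
                    "1929-1957": 28, "1957-1988": 31}

for span, years in doubling_periods.items():
    # Solve (1 + r)^years = 2 for the implied annual growth rate r.
    rate = 2 ** (1 / years) - 1
    print(f"{span}: doubling in {years} years implies about {rate:.2%} a year")
```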

Gordon argues that the rapid advancement of living standards achieved since 1750 was driven by three distinct phases of the Industrial Revolution: 1.) steam engines; 2.) electricity, internal combustion engines, modern communication, entertainment, petroleum and chemicals; and 3.) computing.

However, these means to growth have either reached, or will soon reach, their limits. And Gordon isn't the only one who recognizes this. The slowdown is not only underway, it may be picking up a head of steam that will soon make annual growth of even 1 percent hard to come by.

Historically, from 1948 through 2013, the United States annual GDP growth rate averaged 3.21 percent.

Yet, over the last two decades, as with many other developed nations, its growth rates have been decreasing. In the 1950’s and 60’s the average growth rate was above 4 percent. In the 70’s and 80’s it dropped to around 3 percent. And in the last ten years, the average rate has been below 2 percent.

Famed investment strategist Jeremy Grantham, founder and chief investment strategist for the $100 billion asset management firm GMO — one of the largest such firms in the world — provided a recent warning to his clients: America’s long-term 3.4 percent annual GDP growth is a thing of the past.

In a recent Quarterly Letter, “On the Road to Zero Growth,” Grantham issued the following stark forecast:

“Going forward, GDP growth (conventionally measured) for the U.S. is likely to be about only 1.4% a year, and adjusted growth about 0.9%,” writes Grantham.

"Capitalism’s greatest weakness is its absolute inability to process the finiteness of resources and the mathematical impossibility of maintaining rapid growth in physical output,” Grantham states.

"Investors should be wary of a Fed whose policy is premised on the idea that 3% growth for the U.S. is normal," says Grantham.

The “bottom line for U.S. real growth,” he says, “is 0.9% a year through 2030, decreasing to 0.4% from 2030 to 2050.”

The problem comes down to the rapid depletion of our natural resources. Once we dig up and utilize our resources, there's no getting them back. Most of our key resources are finite and non-renewable. Yet, they have been, and will continue to be, vital to our economic growth.

Many economists and political leaders have ignored this problem because their focus is solely on the short term. Most projections are done on an annual, or even quarterly, basis. That shortsightedness isn't helpful in identifying longer term problems or crises.

Unfortunately, the rising cost of resources is often read as a boost to GDP. For example, drilling a deeper, more complicated well that requires more steel, more energy and more manpower is recorded as a boost to GDP. However, using ever more resources and manpower just to extract the same resources isn't real growth. The GDP figures are misleading.

The reality is that the rising cost of resources is hindering real economic growth all over the world.

"I’ve been obsessing about the shift in resource prices that started 10 years ago, which is reducing the growth rate of every [country]," says Grantham. "We calculated the percentage of global GDP that was going to resources, and it declined beautifully, forever, until 2002, when it hit some very low number like 9 percent. The price of pretty well everything has doubled and tripled since then. This has taken a bite of three points out of global GDP."

This is a major shift, since typical commodity prices dropped by about 70 percent during the 20th Century, according to Grantham. It was easy to get rich and grow an economy in that environment. However, that is no longer the case. The entire 100-year decline in commodity prices was reversed from 2002 to 2008. That's right: in just six short years, a century of resource-price declines was completely erased.

The era of cheap resources is over, and that is a game-changer. The world is bigger today and there are more people than ever before competing for the same limited resources. The developing economies of China and India, alone, are home to more than a third of the earth's population.

For example, oil was $25 per barrel in 2000. It is now trading at roughly $100 per barrel. This has raised the price of everything else in the US and global economies. The extraction of all other resources is wholly reliant on oil, which makes them more expensive as oil becomes more expensive. In fact, resource prices have been rising faster than the global growth rate.

Under the perpetual-growth paradigm, oil — the most critical of commodities — is a requisite. But given that oil is a finite resource, that is an inherent limitation to growth.

In 2012, the Worldwatch Institute published a report titled, “Planet’s Tug-of-War Between Carrying Capacity and Rising Demand: Can We Keep This Up?”

The short answer is no. The planet’s “shrinking resources” cannot satisfy the exploding population’s “growing demand for food and energy,” stated the report's authors.

Around 1960, the world had 3 billion people. By the end of the century, that had doubled to 6 billion. Now it’s 7 billion, with the United Nations predicting 10 billion by 2050.

Yes, by 2050, another 3 billion people will be added to the planet. This means that the world’s farmers, ranchers, and fishers must find a way to produce more food in the next 37 years than they have in all of human history. That will prove daunting, since arable farmland per person is shrinking, not growing.

In 1960 there were 1.1 acres of arable farmland per capita globally, according to data from the United Nations. By 2000 that had fallen to 0.6 acre. Yet, during that time, the global population doubled from 3 billion to more than 6 billion. In other words, productive farm land and the human population are moving in the wrong directions.

"Even if we could produce enough food globally to feed everyone satisfactorily, the continued steady rise in the cost of inputs will mean increasing numbers will not be able to afford the food we produce,” says Grantham.

The world is facing an impending shortage of phosphorus, which is primarily used to make fertilizer. Plants remove phosphorus from the soil, so using fertilizer replenishes what is lost.

The trouble is, some scientists now believe that "peak phosphorus" will occur in 30 years, leading to a global shortage. What's particularly troubling is that there is no synthetic alternative to phosphorus.

"At current rates, reserves will be depleted in the next 50 to 100 years," warned a 2008 article in the Sunday Times.

"It`s an element. You can`t make it," Grantham cautions. "You can`t substitute for it and no living thing -- humans, animals, vegetables, everything needs phosphorous to grow. You can`t grow anything without it and we are mining it in what we call big AG, big agriculture. It`s a finite resource. Now that should make you pretty scared."

In the absence of phosphorus, there is no way to feed the current global population of 7 billion, much less the 9-10 billion humans set to inhabit this planet by mid-century.

"You can`t substitute for very few things in this world. You can`t substitute for water, not really for soil, not potassium and not phosphorus," says Grantham.

Moreover, mining phosphorus requires enormous amounts of energy, which will only add to future costs.

All of this will act as a drag on future economic growth.

Many people take economic growth for granted and expect it to continue indefinitely. But, as Gordon and Grantham have noted, in historical terms, economic growth is a relatively recent phenomenon.

Governments cannot simply conjure economic growth at will. If that were the case, there would be no business cycle. The economy would never slow down. It would just grow continually, instead of being plagued by regular recessions.

Massive government deficits are not leading to rapid growth and neither is the relentless money-printing of central banks. The days of financially engineered growth are over.

The reality is that growth is driven by an ever-increasing amount of debt. By 2008, $4 to $5 of debt was required to create $1 of growth. Fiat currencies are being continually devalued and we have finally reached the limits of the debt super cycle.

As finance expert and author Satyajit Das puts it, "If government deficit spending, low interest rates and policies to supply unlimited amounts of cash to the financial system were universal economic cures, then Japan’s economic problems would have been solved many years ago."

As Europe is painfully learning, reducing debt simultaneously reduces demand and locks an economy into a negative spiral of ever lower growth.

"We have been living in an unsustainable world of Ponzi-like prosperity where the wealth was based on either borrowing from or pushing problems into the future," says Das.

Going forward, economic growth will be much lower than what we, our parents, and our grandparents became accustomed to.

Grantham warns that from the late 1800s until the early 1980s “the trend for U.S. GDP growth was up 3.4% a year for a full hundred years,” powering the American Dream. But after 1980, under Reaganomics and the new conservative capitalism, “the trend began to slip,” warns Grantham.

In other words, trickle down economics never worked out as promised.

Quite remarkably, after a century of high-growth prosperity, our GDP growth dropped “by over 1.5% from its peak in the 1960s and nearly 1% from the average of the last 30 years.”

Looking ahead at long-term macro-trends, “The U.S. GDP growth rate that we have become accustomed to for over a hundred years” is “not going back to the glory days of the U.S. GDP growth.”

Despite all the optimistic projections from in-house economists at Wall Street banks that are breathlessly reported in the mainstream media, “It is gone forever.”

Living in denial will only make the ultimate reality more bitter and more challenging. We're wasting precious time acting as if the last 100 years were an indicator of the next 100 years, when they clearly are not.

Our alleged leadership is ignoring our accelerating GDP decline. Grantham puts it this way: “Most business people (and the Fed) assume that economic growth will recover to its old rates.”

"Clearly, Bernanke seems to believe [growth] will go back to 3 percent — the good old days," Grantham laments.

But looking ahead to 2050, Grantham warns, “GDP growth (conventionally measured) for the U.S. is likely to be about only 1.4% a year, and adjusted growth about 0.9%.”

The evidence is clear: the American economy is in a long-term decline. That's a bitter pill.

Unless we shift to a model of sustainability driven by less consumption and more conservation, efficiency and renewables, we will continue running headlong into an eventual collapse.

History is littered with collapsed civilizations that lived beyond their means and exhausted their natural resources.

We may be no different.