Wednesday, March 26, 2014

Too Few High Tech Jobs, Too Many Fast Food & Retail Jobs



Facebook recently bought WhatsApp — a company with just 55 employees — for $19 billion. It's rather stunning that so small a company can be valued at $19 billion. What's more, WhatsApp had estimated revenues of just $20 million in 2013, which makes the purchase price seem all the more irrational.

The acquisition is indicative of the tremendous money and growth in the tech industry. There are so many high-paying jobs in the sector, and so many millionaires in Silicon Valley, that most people would surely love to work for one of the famous (or not so famous) tech companies in the area.

But, despite their sizable revenues and market capitalization, these companies employ a relatively small number of people. For example:

Apple employs 80,300 full-time employees, plus 4,100 full-time temporary employees and contractors. However, 42,800 of its employees work in the retail part of the business.
Revenue: $170.9 billion
Market cap: $446 billion

Google employs 47,756 people.
Revenue: $59.82 billion
Market cap: $268.44 billion

LinkedIn employs roughly 5,000 people.
Revenue: $1.52 billion
Market cap: $24.92 billion

Facebook had 6,337 employees as of December 31st.
Revenue: $7.87 billion
Market cap: $134 billion

Twitter had “over 2,300 employees” when it filed for its IPO late last year.
Revenue: $664 million
Market cap: $29.16 billion

In total, these five major technology companies — companies almost everyone has heard of, and where virtually all young people would feel privileged to work — employ roughly 140,000 people.

For comparison, here are America's 10 largest employers, each of which has a workforce of more than 300,000 people. Combined, they employ roughly 4.8 million workers.

General Electric: 305,000 total employees
Hewlett-Packard: 331,800 total employees
Home Depot: 340,000 total employees
Kroger: 343,000 total employees
Target: 361,000 total employees
United Parcel Service: 399,000 total employees
IBM: 434,246 total employees
McDonald's: 440,000 total employees
Yum! Brands (owner of KFC, Taco Bell and Pizza Hut): 523,000 total employees
Walmart: 1.3 million total employees in the United States, making it the largest American employer

The reality is that much of the workforce at many of these companies is part-time, temporary and seasonal, and many of these jobs are low-paying and of poor quality. The bulk of the employees of at least six of these companies (Walmart, Home Depot, Kroger, Target, McDonald's and Yum! Brands) are low-wage workers.

Unfortunately, each of these companies employs more than twice as many workers as the five technology companies listed above combined. And Walmart employs more than nine times as many.
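
A rough back-of-the-envelope check, using the headcounts cited above, makes the gap concrete:

$$ \frac{305{,}000}{140{,}000} \approx 2.2 \qquad\qquad \frac{1{,}300{,}000}{140{,}000} \approx 9.3 $$

Even General Electric, the smallest of the ten, has more than double the combined headcount of those five tech firms, and Walmart alone has more than nine times as many U.S. workers.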

That's a sobering perspective.

We would all benefit from an economy built on higher-wage work in fields such as science, technology and engineering. But those are not the types of industries that employ the masses. Moreover, jobs in those fields typically require advanced degrees.

A recent analysis by the National Employment Law Project shows that low-wage positions account for nearly three out of five jobs generated in the first three years of the economic recovery.

So, while America aspires to be a 21st Century global leader — with a workforce largely composed of well-paid, highly skilled workers in modern, first-world industries — our economy is bogged down by far too many fast food and retail jobs.

Of course, those are the kind of jobs that keep workers and their families in poverty and on public assistance.

We need more companies like Apple and Google all around the country, not just in Silicon Valley, employing millions more people.

But what we have instead are way too many Walmarts, McDonald's and KFCs.

Wednesday, March 19, 2014

Great Recession Still Casting Its Shadow Over Employment/Wages



Perhaps the greatest and most lasting scar of the Great Recession has been its effect on employment and the job market. More jobs were wiped out in the Great Recession than in any other post-World War II downturn.

According to the Department of Labor, roughly 8.7 million jobs were shed from February 2008 through February 2010. Meanwhile, about 8.4 million jobs have been created since February 2010. And that remaining deficit doesn't even account for all the high school and college graduates who have entered the workforce in that span.

It's a rather stunning state of affairs. Four years after our supposed recovery took hold, we still have a significant jobs deficit. Meanwhile, the economy needs to add 143,000 jobs each month just to keep pace with population growth.

More than three million students are expected to graduate from high school this year, according to the National Center for Education Statistics. Additionally, 943,000 students are expected to receive associate’s degrees this year, while 1.8 million more will earn bachelor's degrees.

As a result, millions of jobs will have to be created this year just to employ this group of graduates, which doesn't include all of the currently unemployed adults who are still searching for work. This same cycle has been playing out each and every year since the recession "officially" ended.

From the late 1940s until the early 1990s, the U.S. economy never took more than a year to regain all the jobs lost during downturns. Yet, after the 1990-91 recession, it took 21 months to recover all the lost jobs. And following the dot-com bubble, when 2.7 million jobs evaporated, it took 18 months after payrolls bottomed out for them all to come back.

The pattern is clear: it is taking longer and longer to recover from each subsequent recession. But the Great Recession was a different animal altogether.

While the unemployment rate has been decreasing in recent years (though it ticked back up to 6.7 percent in February), the headline figure masks some rather troubling underlying statistics.

The number of involuntary part-time workers was 7.2 million in February, according to the Bureau of Labor Statistics (BLS). These people were working part-time because their hours had been cut back or because they were unable to find full-time work.

In February, 2.3 million people were "marginally attached" to the labor force. These individuals were not in the labor force, wanted and were available for work, and had looked for a job sometime in the prior 12 months. They were not counted as unemployed because they had not searched for work in the four weeks preceding the survey.

The labor force participation rate was 63 percent in February, meaning that just 63 percent of people in the civilian non-institutional population either had a job or were actively seeking one in the previous four weeks. This group includes all people in the United States, 16 or older, who are not on active duty in the military or in an institution, such as a prison, nursing home or mental hospital.

A rate this low has become a disturbing trend, and one that may not reverse.

The average annual labor force participation rate in 2013 was 63.2 percent, a 35-year-low, according to data from the BLS. The rate peaked at 67.1 percent in 1997 and has dropped annually ever since.

A Philadelphia Federal Reserve study on the topic notes that, “retirement had not played much of a role until around 2010.” By then, the rate had already dropped 2.4 percentage points.

In other words, retirement played little role in the decline in participation through 2010, and it seems to be only a marginal factor even now.

Yes, older people are retiring, but younger workers are continually entering the workforce, which should be helping to offset those retirements. Additionally, older Americans are staying in the workforce longer than in the past, usually for financial reasons.

What's particularly alarming is that the participation rate for workers between ages 25 and 54 fell sharply during the recession and still hasn't recovered. So, the low participation rate is not simply a matter of older workers retiring.

Younger workers have been hit especially hard by unemployment, as well as by low wages. Older workers — those 55 and older, and even those 65 and older — have actually increased their labor force participation, and they have fared much better than younger workers.

This suggests that the economy is in much worse shape than the official unemployment rate indicates. The official jobless rate (known as U-3) is currently 6.7 percent, but that only counts people who are actively seeking work — not labor-force dropouts.

The U-6 unemployment rate was 13.1 percent in February. This calculation includes: "discouraged workers", or those who have stopped looking for work because current economic conditions make them believe that no work is available for them; "marginally attached workers", or those who "would like" and are able to work, but have not looked for work recently; and part-time workers who want to work full-time, but cannot due to economic reasons (also known as "underemployment").

Here's a troubling fact: In February, there were more than 92 million Americans not in the labor force, a record number. Meanwhile, there were just 144.1 million employed workers. That ratio is quite unsettling. It means there are only about 57 percent more adults working than not working.

In other words, there aren't even two adults working for every adult that isn't. That's rather stunning.

Of the nation's 144.1 million employed workers, about 27.8 million of them are part-time. This means roughly 19 percent of all workers are part-timers.
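
A quick bit of arithmetic with the February figures cited above makes both ratios concrete:

$$ \frac{144.1\ \text{million employed}}{92\ \text{million not in the labor force}} \approx 1.57 \qquad\qquad \frac{27.8\ \text{million part-time}}{144.1\ \text{million employed}} \approx 0.19 $$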

While part-time work typically increases during recessions, the percentage of part-time jobs has remained stubbornly high since the recession officially ended. As previously noted, there were 7.2 million involuntary part-time workers in February.

Though the BLS reports the economy has added private sector jobs for 48 straight months, and job growth has averaged 189,000 per month over the past 12 months, far too many of those jobs have been of the low-wage variety.

Roughly half of the jobs created in the United States in the past three years have been low-paying jobs, according to economists at the Royal Bank of Scotland, who sent a research note to their clients in May 2013 titled, "A Closer Look At The Labor Market Recovery."

RBS defines "low-paying" jobs as those paying 80 percent or less of the average private-sector wage of $20.04 per hour (that is, about $16.03 per hour or less). The sectors providing many of these low-paying jobs include retail sales, leisure & hospitality, and education.

The RBS study echoes several others in recent years, including a National Employment Law Project study from August that found three-fifths of jobs created since the recession have been low-paying, roughly matching the number of middle-income jobs that were lost.

About a third of working families in the U.S., representing about 47 million people, are in low-wage jobs today, according to the Working Poor Families Project. That's startling.

So, even as the unemployment rate has been generally trending downward with the creation of new jobs, these are not the kind of jobs that can meaningfully drive demand and consumption in our economy. To the contrary, they are the kind of jobs that keep people in relative poverty.

In fact, the U.S. now has the highest proportion of low-wage workers in the developed world, according to the Organization for Economic Cooperation and Development. One in four makes less than two-thirds of the median wage, which is the same proportion that relies on public aid.

The big picture reveals a national crisis. There simply aren't enough jobs to employ the long-term unemployed, as well as all of the recent high school and college graduates entering the workforce.

When you add all of the low-paying and part-time jobs to the mix, the situation is downright bleak.

This is no way to sustain an economy, much less grow one.

Friday, February 07, 2014

Economic Squeeze: Americans Don't Have Enough Income to Drive Demand



Our economic model is based on demand and consumption. In order for the economy to continually grow, people must continually buy more stuff.

Of course, there are the everyday staples that everyone will always buy because they must, such as food, toilet paper, toothpaste, soap, etc. Spending on these basic life-essentials is considered non-discretionary because there is no choice involved. Paying for the roof over one's head is also considered non-discretionary.

Yet, what businesses and the broader economy really need is for people to also make lots of discretionary purchases. This includes things such as household furniture, appliances, cars, luxury items, vacations and entertainment, plus all other non-essential goods and services.

The problem is that household income has been shrinking for many years, leaving people with less to spend.

For many years, Americans made up the difference by using credit cards and going into debt. In essence, people were spending money they didn't have to finance their lifestyles. But people are now taking on less debt than during the bubble years.

For example, credit card debt outstanding is 7% lower than its level in 2010 and 16% below its peak in 2008.

The debt-to-income ratio for American households is now down to 109% – well below the peak of 135% reached in late 2007. But it's still 35 percentage points above the average of the final three decades of the twentieth century, according to Yale economist Stephen Roach.

In other words, Americans still have a lot of work to do to pay down their rather substantial debts.

The personal savings rate also remains below past levels. The savings rate fell to 4.48 percent in 2013, according to the Bureau of Economic Analysis.

After turning negative in 2005 for the first time since the Great Depression, and staying that way for about two years, the savings rate then began to climb, reaching a high of 6.1% in 2009.

After witnessing the collapse of the debt bubble, Americans were trying to pay down their debts as the financial crisis was still unfolding.

But the savings rate quickly returned to a downward trajectory the very next year, falling to 5.6 percent in 2010. It stayed relatively stable for the next two years: 5.7 percent in 2011, and 5.6 percent in 2012.

So, the decline to 4.48 percent in 2013 was rather substantial, amounting to a 20 percent drop.
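
For what it's worth, the arithmetic behind that figure is simply the relative decline from the 2012 rate to the 2013 rate:

$$ \frac{5.6 - 4.48}{5.6} = \frac{1.12}{5.6} = 0.20 $$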

As the Baby Boomers continue to retire each year and draw down on their retirement savings, the savings rate will be driven down even further.

There are a couple of different perspectives on the savings rate.

On the one hand, if people are saving, they are not spending, which tends to hold back economic growth. On the other hand, if people aren't saving, there is less money for national investment, meaning money is instead borrowed from overseas. As it stands, that's already a big problem for the U.S.

People haven't been able to save because they don't have the means. Incomes have fallen considerably. In past decades, people made up for that fact by going further into debt. This helped prop up economic growth. But there was an eventual and inevitable reckoning, as people came face to face with an enormous pile of debt that had become overwhelming.

So, Americans are spending less now than in the bubble years of the last decade. Home equity has been crushed, leaving nothing for them to extract and spend.

This is all bad news for retailers and for the economy in general, since consumer spending accounts for more than 70 percent of GDP.

None of this should come as a surprise; our economic decline was quite predictable. Real median household income is now at 1990 levels. Yes, that's how far backward the typical American household has fallen.

In 1970, the inflation-adjusted median household income was $45,146.
In 1980, the inflation-adjusted median household income was $46,024.
In 1990, the inflation-adjusted median household income was $49,950.
In 2000, the inflation-adjusted median household income was $54,841.
In 2010, the inflation-adjusted median household income was $50,831.

This means that household incomes barely budged for a couple of decades, and then they went backward in the last decade. Americans are making roughly the same amount today, in inflation-adjusted terms, as they were making back in 1990.

Yet, the problem of falling incomes stubbornly persists. Median household income after inflation fell again in 2011, to $50,054. After falling 0.5% in 2011, inflation-adjusted wages declined 0.4% in 2012.

Simply put, wages aren't keeping up with inflation.

Yet, the cost of everything continues to rise. For example, average gasoline prices over the last three years have been the highest on record. Gas prices in 2013 were the second highest in history, trailing only 2012, which saw the highest average price. And the third highest average price was in 2011.

Oil prices have surged from $25 per barrel in 2003 to $100 per barrel today. That has raised the cost of virtually everything in our economy. All goods that are transported are more costly as a result.

Outstanding student loan debt has reached a whopping $1.2 trillion, which means that more than 40 million Americans are not buying houses or cars, starting businesses or families, or otherwise creating demand in the economy.

As previously noted, there are fewer people spending because they have less income to spend. But, additionally, there are fewer people working. The labor force participation rate is at its lowest level in 35 years, a level last seen when the economy was in recession.

There are 247 million working-age Americans (the civilian non-institutional population, age 16 and older, excluding those in the military, jail/prison, mental facilities or homes for the aged). Only about 155 million of them are in the labor force. This means that 92 million working-age people are neither working nor looking for work.

While some of these people are retirees, the number of working-age Americans keeps growing, since thousands of teenagers turn 16 every day and join the potential labor force.

It's a sad fact that the labor force participation rate was just 62.8 percent in December. The figure hasn't been that low since at least 1978. Again, a whopping 92 million working-age people simply aren't working.

The problem is there are too many people looking for too few jobs. At present, there are three applicants for every available job nationwide.

Then there's the additional problem of low-paying and part-time jobs. Nearly two out of three new jobs created from January through August of last year were part time.

During the supposed recovery, there have been far more low-wage jobs created than high-wage jobs.

Between 2009 and 2013, low-wage jobs outnumbered high-wage jobs by some 600,000, with 1.7 million created versus 1.1 million. Though low-wage jobs made up less than one in five (19 percent) of all employment in 2009, they accounted for nearly 40 percent (39 percent) of all new jobs created through 2013.

Put all these pieces together and you realize why demand and consumption are so low, and why economic growth has been so weak.

Growth rates in the U.S. have been decreasing for decades. In the 1950s and '60s, the average growth rate was above 4 percent. But in the 1970s and '80s, it dropped to around 3 percent. And in the last ten years, the average rate has been below 2 percent.

In fact, the U.S. economy has not surpassed 3 percent annual growth since 2005. Last year likely continued that trend. The advance estimate for economic growth in 2013 is 2.7 percent, according to the White House. That number will be revised a couple of times in the coming months.

Think of it as the new normal.

There is, however, plenty of money still in the U.S. economy. It's just being siphoned off to the richest 1 percent.

The average CEO-to-worker pay ratio in 2012 was 354 to 1. That is far more than the ratio in other developed countries.

In the 1970s and early '80s, the U.S. ratio was roughly 20 to 1.

While most Americans continue to struggle financially, and even regress, the rich just keep getting richer.

The United States has led a worldwide growth in wealth concentration, according to a recent Oxfam report, titled "Working for the Few."

The share of income going to the richest 1% in the U.S. has grown by nearly 150% since 1980. That small elite has captured 95% of the income growth since 2009, in the wake of the financial crisis, while the bottom 90% of Americans have become poorer, Oxfam said.

The new normal is plainly abnormal, even despicable.

Wednesday, January 08, 2014

Disequilibrium: Debt Growth vs. Economic Growth



Despite its difficulty generating consistently strong economic growth, the U.S. is having no trouble growing its already sizable debt.

For years now, I've discussed the slower U.S. economic growth rates on this page, and all of the reasons for this. Growth has been slowing for many years and the downward trend will likely continue well into the future, perhaps to the point of zero growth.

Since 2001, GDP growth has reached at least 3 percent in only two years: 2004 (3.8 percent) and 2005 (3.4 percent). In every other year, through 2012, growth failed to crack even 3 percent, a number that was once considered customary.

We won't know for another couple of months how much the U.S. economy grew in 2013, but we already have some estimates.

The Federal Reserve projects that the economy grew between 2 percent and 2.3 percent in 2013.

However, Kiplinger projects that, on the heels of the sequester and government shutdown, the economy grew just 1.8 percent in 2013.

Meanwhile, the Conference Board projects that the economy grew 1.7 percent last year.

Growth has been, and will likely remain, a challenge. The economy will have to expand much faster than those estimates just to keep up with the nation's continually mounting debt. The U.S. needs a combination of growth and inflation to pay off years of already accumulated debt.

The national debt has surpassed $17.3 trillion and continues to grow each and every month. Since all money is loaned into existence, and is therefore created as debt, the process of perpetual debt growth will continue until the system can no longer sustain itself and ultimately collapses.

While funding our massive debt expansion, the Federal Reserve is concurrently expanding, or inflating, the nation's money supply. This process devalues our money.

By devaluing the dollar through the process of money printing (or quantitative easing), the Federal Reserve is simultaneously trying to create an export boom while also reducing the impact of the national debt.

A devalued dollar makes American goods cheaper overseas, and it also makes paying down existing debts easier because those dollars are worth less. In essence, it allows a nation to repay its creditors with money that is worth less than the money that was lent.

If the economy were growing strongly enough, it would help to offset the continually expanding debt to some degree. But that is not the case.

Historically, from 1948 through 2012, the United States' annual GDP growth rate averaged 3.21 percent.

Yet, as with many other developed nations, U.S. growth rates have been decreasing over the last two decades. In the 1950s and '60s, the average growth rate was above 4 percent. In the '70s and '80s, it dropped to around 3 percent. And in the last ten years, the average rate has been below 2 percent.

Some contend that the U.S. can simply grow its way out of debt. This is delusional. The historical data shows a consistently downward trend in growth, coupled with a steady growth in debt.

Since about 1980, debt has been growing much faster than GDP. But it is not possible to continually grow your debts faster than your income.

Consider America's economic growth rate over more than two centuries — essentially the entire existence of this nation. Over the past 220 years, U.S. GDP has averaged an annual growth rate of 3.8 percent.

Yet, between 1980 and 2013, total credit market debt (corporate, state, federal and household borrowing) grew by a whopping 8 percent per year. Anything growing by 8% per year will double every nine years. That is plainly unsustainable.
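
That nine-year doubling figure isn't a guess; it follows from the standard doubling-time formula (the familiar "rule of 72" is an approximation of the same result):

$$ t_{\text{double}} = \frac{\ln 2}{\ln(1.08)} \approx \frac{0.693}{0.077} \approx 9\ \text{years} $$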

It's quite optimistic to believe that the U.S. economy, which hasn't surpassed 3 percent annual growth since 2005, can sustain a 3.8 percent rate going forward. Even if it did, an annual growth rate of 3.8 percent cannot support annual credit growth of 8 percent.
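
To see why those two growth rates cannot coexist for long, compare how each compounds over an illustrative 20-year stretch (a simple sketch, not a forecast):

$$ \frac{(1.08)^{20}}{(1.038)^{20}} \approx \frac{4.66}{2.11} \approx 2.2 $$

Debt compounding at 8 percent against an economy growing at 3.8 percent would more than double the debt-to-GDP ratio every two decades.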

The total U.S. debt-to-GDP ratio currently stands at around 350 percent. That extraordinary debt level is already hindering economic growth. The stark reality is that we cannot, and will not, grow our way out of debt. Debt growth is simply outpacing economic growth.

After four consecutive years of trillion-dollar deficits, the fiscal 2013 deficit fell to $680 billion. This is being celebrated in Washington as a real sign of progress, even a victory. While the fact that the deficit has fallen for the last three years is a good thing, let's be clear: by any definition, the U.S. still has an absolutely massive deficit.

Let's take the most optimistic view: At $680 billion, the fiscal 2013 deficit was 51 percent less than in 2009, when it hit a nominal record high of $1.4 trillion. As a percent of the economy, the deficit is also considerably smaller than it's been in the past five years, coming in at 4.1 percent of GDP. By contrast, the budget deficit in 2009 topped 10 percent of GDP. And last year it was 6.8 percent.

But no matter how you spin it, this nation is still faced with a continual budget deficit that is adding to the national debt each and every year.

Here's a key data point: Federal revenues for fiscal 2013 were $2.77 trillion, yet the government spent $3.45 trillion. Given its deficit of $680 billion, this means that the government spent 25% more than it received in taxes. It also means that deficit spending represented nearly 20% of the entire federal budget.
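
Both percentages come straight from the budget totals just cited:

$$ \frac{3.45}{2.77} \approx 1.25 \qquad\qquad \frac{0.68}{3.45} \approx 0.20 $$

That is, spending ran about 25 percent above revenue, and borrowed money covered roughly 20 percent of everything the government spent.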

And that was considered progress.

With all of this in mind, you have to possess an extraordinarily optimistic point of view to perceive the $680 billion deficit as a reason for celebration or back-slapping.

Furthermore, the U.S. made $415.6 billion in interest payments in 2013, nearly 16% higher than in 2012. Our debt has massive costs and it robs from critical domestic needs.

If the interest rate on the national debt simply returned to its 20-year average of 5.7 percent, annual interest payments would approach $1 trillion, consuming more than a third of all federal tax revenue and crowding out spending on healthcare, food stamps, bridges and roads, Social Security and defense.
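
The rough arithmetic, using the $17.3 trillion debt and the fiscal 2013 revenue figure cited above:

$$ 0.057 \times \$17.3\ \text{trillion} \approx \$0.99\ \text{trillion} \qquad\qquad \frac{0.99}{2.77} \approx 0.36 $$

In other words, interest alone would swallow roughly 36 percent of federal tax revenue at that rate.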

The Congressional Budget Office (CBO) says it, "expects interest rates to rebound in coming years from their current unusually low levels, sharply raising the government’s cost of borrowing."

What's critical to consider is that even if the government somehow balances its budget and eliminates deficits going forward, it still wouldn't address the trillions in underlying debt. It would simply stop adding to it. In essence, it would just stop digging a deeper hole.

By 2038, the CBO projects that "the federal government’s net interest payments would grow to 5 percent of GDP, compared with an average of 2 percent over the past 40 years, mainly because federal debt would be much larger."

The most vexing problem for the U.S. going forward will be its inability to grow the economy fast enough to manage all of the new debt that will continually be added each and every year in the decades ahead.

The nation will have to make some really difficult decisions about how much government it wants to pay for. One thing is certain: current tax and spending levels will have to be altered in the years ahead. The economy simply won't grow enough to pay for all future expenditures at current levels, much less all the interest on our mountainous debt.

That debt will continue to squeeze the economy and the federal budget. As I've said previously, tough choices lie ahead.

As the CBO notes, "At some point, investors would begin to doubt the government’s willingness or ability to pay U.S. debt obligations, making it more difficult or more expensive for the government to borrow money."

That's a predicament the U.S. has never before faced in its history. While such a scenario was once unimaginable, it is now likely.

Thursday, December 19, 2013

Pentagon Spending Undermining US Economy



In fiscal year 2014, the federal government will spend around $3.8 trillion. Of that total, military spending will occupy $831 billion. This includes spending on military defense ($626.8 billion), veterans aid ($148.2 billion), foreign military aid ($14.3 billion), the war in Afghanistan ($92.3 billion) and the Department of Energy's nuclear weapons programs ($7.9 billion).

This massive sum takes into account the sequester, which forced the Pentagon to slice $52 billion from its budget for the 2014 fiscal year that began October 1.

Military spending is second only to Social Security. However, Social Security is funded by the payroll tax, which is paid by every working American.

Military spending, on the other hand, comes from the federal government's general fund. In other words, military spending comes at the expense of other domestic programs and needs.

U.S. government spending is divided into three groups: mandatory spending, discretionary spending and interest on debt.

Discretionary spending refers to the portion of the budget which goes through the annual appropriations process each year. In other words, Congress directly sets the level of spending on programs which are discretionary. Congress can choose to increase or decrease spending on any of those programs in a given year.

Military expenditures account for 57 percent of discretionary spending.

It is quite justifiable for the U.S. to have a goal of maintaining the world's most powerful military, and one that spends the most money to provide for that. However, it is not justifiable for the U.S. to grotesquely outspend not only any conceivable enemy, but essentially the rest of the world combined.

In 2012, U.S. defense spending was six times more than China's, 11 times more than Russia's, 27 times more than Iran's and 33 times more than Israel's.

In fact, the U.S. consumed 41 percent of total global military spending that year.

It's not just that the U.S. has a bigger budget and can therefore spend more. The U.S. was also among the top 10 highest-spending countries as a percentage of Gross Domestic Product (GDP).

Unnecessary Pentagon spending creates fewer jobs than any other form of federal spending, including tax cuts to promote personal consumption. In fact, it destroys American jobs if the money for it comes out of our domestic economy, according to a 2011 study by the University of Massachusetts.

This study focuses on the employment effects of military spending versus alternative domestic spending priorities, in particular investments in clean energy, health care and education.

The study compared spending $1 billion on the military with spending the same amount on clean energy, health care, and education, as well as on tax cuts that produce increased levels of personal consumption.

The authors concluded that $1 billion spent on each of the domestic spending priorities will create substantially more jobs within the U.S. economy than would the same $1 billion spent on the military.

The study also concludes that investments in clean energy, health care and education create a much larger number of jobs across all pay ranges, including mid-range jobs (paying between $32,000 and $64,000) and high-paying jobs (paying over $64,000).

Ultimately, all of this unnecessary military spending is bloating the federal budget, driving the deficit and piling onto our ever-expanding debt.

Admiral Mike Mullen, former chairman of the Joint Chiefs of Staff, gave Congress a very powerful warning in 2010.

"I think the biggest threat we have to our national security is our debt," the Admiral intoned.

It would be wise to heed his admonition.

Wednesday, December 04, 2013

By Almost Any Measure, the U.S. Healthcare System Is Failing



The U.S. spends more on healthcare each year than any other country in the world. Yet, according to a new report, the spending problem isn't the result of our rapidly growing segment of seniors.

Instead, most of the money is being spent on people under age 65, and it is being directed toward chronic and preventable conditions, such as diabetes and heart disease.

The U.S. spends a whopping $2.7 trillion per year on health care, or nearly 18 percent of gross domestic product (GDP). But the nation gets relatively little for the enormous amount it is spending. In fact, "The U.S. ‘system’ has performed relatively poorly,” reads the report.

The report, co-written by Dr. Hamilton Moses of the Alerion Institute in Virginia and Johns Hopkins University, had a rather surprising conclusion.

“In 2011, chronic illnesses account for 84 percent of costs overall among the entire population, not only of the elderly. Chronic illness among individuals younger than 65 years accounts for 67 percent of spending."

Despite the conventional wisdom, quite remarkably, the problem isn't old people.

The "price of professional services, drugs and devices, and administrative costs, not demand for services or aging of the population, produced 91 percent of cost increases since 2000,” reads the report.

Dr. Moses says that unlike a normal market, the healthcare market has no price discovery. Consumers operate in the dark, entirely unaware of how much they are paying or what they are paying for. There are no market forces reining in costs because healthcare in the U.S. doesn't exist in a true market.

“This is not a market," Moses says. "It’s far from a market. Few prices are known. They are not publicized.”

Moses also says his team’s study shows that one of the biggest problems in the U.S. healthcare system is that it is based on a fee-for-service model in which doctors and other caregivers are motivated to give lots of tests and individual treatments, as well as to prescribe drugs, instead of keeping patients well.

In other words, it's all about the treatment of illness and disease, rather than prevention. If the system were based on performance and outcomes, we'd be spending a whole lot less money.

Yet, individuals can play a bigger role in their own health and wellness than any doctor, simply by making lifestyle choices that lessen sickness, improve quality of life, and perhaps even extend longevity.

Keeping people from being afflicted by preventable diseases in the first place is the best way to reduce medical costs.

Some people contend that the U.S. has the world’s best health care system. But that claim simply doesn't square with the facts.

A study released in November (which was many years in the making) shows that Americans pay more per capita for health care than people in any other industrialized country. In return, we are sicker and die younger.

The Commonwealth Fund, which does research on health care and health reform, has continually shown that Americans spend far more on health care than any other nation — currently $2.7 trillion annually. That amounts to $8,508 per person, compared to $5,669 per person in Norway and $5,643 in Switzerland, the next-highest-spending countries.

In other words, no other nation's spending is even close to ours — even on a per capita basis.

Yet, all that money isn't buying us much. There's very little return on the investment.

The U.S. has the eighth-lowest life expectancy in the Organization for Economic Co-operation and Development (OECD), a group of 34 developed nations.

Commonwealth Fund researchers found that 37 percent of Americans went without recommended care, did not see a doctor when sick, or failed to fill prescriptions because of costs, compared to as few as 4 percent to 6 percent in Britain and Sweden.

Additionally, 23 percent of American adults either had serious problems paying medical bills or were unable to pay them, compared to fewer than 13 percent of adults in France and six percent or fewer in Britain, Sweden, and Norway.

But what about access? Defenders of the U.S. system say Americans have a much easier time seeing their physician than patients in other countries. Not so.

Americans wait longer to see primary care doctors. In Germany, 76 percent said they could get a same or next-day appointment, and 63 percent in the Netherlands said the same. Meanwhile, just 48 percent in the U.S. said they had that level of access. In fact, only Canada scored worse, with 41 percent saying they could see their doctor that soon.

Sadly, the U.S. health system is plagued by problems.

An Institute of Medicine report released in 2012 found that the U.S. health care system wasted $750 billion in 2009 (about 30 percent of all health spending) on unnecessary services, excessive administrative costs, fraud, and other problems.

The Institute also found that as many as 75,000 people who died in 2005 would have lived if they got the kind of care provided in the states with the best medical systems.

Quite plainly, the claim that the U.S. healthcare system is the greatest in the world is false or, at the least, misleading. By almost any measure, the U.S. lags the developed world, and even many developing nations.

Not only do Americans pay significantly more per capita for their healthcare than the citizens of any other nation on earth, they are also fatter, sicker, have less access, and die younger than those in other industrialized countries.

It's still unclear whether the Affordable Care Act (aka, Obamacare) will positively affect any of this, but it had better. The current state of affairs isn't just unacceptable; it's untenable.

Monday, November 18, 2013

American Poverty and Economic Decay Being Driven by Low Wage Jobs



You may notice the signs of economic decay all around in your community. Perhaps you have personally experienced (or are still experiencing) joblessness, the need for government assistance, or are somehow living on the edge economically.

One way or another, there are numerous signs that our economic security has deteriorated and that the American dream has faded away.

Nearly 50 million Americans (49.7 million) are living below the poverty line. But the level of economic insecurity goes well beyond those officially recognized by the government as living in poverty.

According to The Associated Press, four out of five U.S. adults struggle with joblessness, live near poverty, or rely on welfare for at least parts of their lives. That amounts to roughly 80 percent of American adults, a figure that is simply mind-blowing.

However, poverty is not a problem that plagues only racial and ethnic minorities. More than 19 million whites fall below the poverty line of $23,021 for a family of four, accounting for more than 41 percent of the nation’s destitute — nearly double the number of poor blacks.

Economic insecurity afflicts more than 76 percent of white adults by the time they turn 60, according to a new economic gauge to be published next year by the Oxford University Press. Measured across all races, the risk of economic insecurity rises to 79 percent.

“Economic insecurity” is defined as experiencing unemployment at some point in one's working life, or a year or more of reliance on government aid (such as food stamps), or income below 150 percent of the poverty line.

Millions of Americans cycle in and out of poverty at various points in their lives; four in 10 adults fall into poverty for at least a year.

The risk of falling into poverty has been rising in recent decades, particularly for those in their prime earning years (ages 35-55). For example, people ages 35-45 had a 17 percent risk of encountering poverty during the 1969-1989 time period. However, that risk increased to 23 percent during the 1989-2009 period.

The future projections are quite sobering. Based on the current trend of widening income inequality, close to 85 percent of all working-age adults in the U.S. will experience bouts of economic insecurity by 2030.

Yet, government safety net programs are the only thing keeping millions of additional Americans from falling into poverty.

The nation's poverty rate was 16 percent in 2012, according to new Census Bureau data that looks at how benefits and expenses affect family resources. Social Security, for example, kept 26.6 million Americans out of poverty last year. Food stamps, provided through the Supplemental Nutrition Assistance Program, or SNAP, kept another 5 million people above the poverty level.

The main reason people fell into poverty last year was out-of-pocket health care expenses.

There are a near-record 47.6 million Americans, representing 23.1 million households, on the SNAP program. In other words, the program helps one in seven Americans put breakfast, lunch and dinner on the table.

Even as the stock market soars to new heights and income disparity widens to Great Depression-era levels, SNAP participation has doubled over the past 10 years and increased nearly 25 percent over the past four.

The cost of the program will reach $63.4 billion in 2013.

Poverty is becoming so widespread that it is creating a culture of government dependence. But safety net programs are becoming increasingly difficult to fund, given the portion of Americans who draw upon these various programs rather than pay into them.

Tens of millions of Americans earn so little that they pay no federal income taxes. These folks do, however, pay payroll taxes, federal excise taxes (on things like gas, tobacco, alcohol and airfare), state taxes and local taxes.

A report from the Tax Policy Center (TPC) finds that 43 percent of Americans paid no federal income tax last year. About half of them earned too little to owe any income tax, and many more were retired people who live on Social Security. In fact, two-thirds of this group are elderly. The remaining households likely qualified for tax breaks such as the Earned Income Tax Credit or the Child Tax Credit.

However, more than 70,000 households with income over $200,000 paid no federal income tax in 2013, according to the TPC.

The biggest culprit in all of this is low wages and incomes, which choke off demand and consumption — the basic components of economic growth. Consumer spending comprises 70 percent of our GDP. Low wages and incomes are also starving the Treasury of much-needed tax revenue.

Since seven out of the 10 fastest-growing U.S. occupations pay less than the national median wage, more and more Americans are forced to rely on the social safety net.

To illustrate this point, a whopping 52% of fast-food employees’ families are forced to rely on public assistance for food and medical care due to low wages, which means American taxpayers are picking up the tab for corporations that pay poverty wages.

The average fast-food worker is now over 28 years old, meaning many support families with a combination of low fast-food wages and public assistance. That assistance is provided by American taxpayers.

A recent report from the University of California, Berkeley Labor Center estimated the cost of this at nearly $7 billion per year.

This puts a tremendous burden on American taxpayers, who have to support these low-wage workers because their employers won't adequately do so.

Obviously, there are limits to the carrying capacity of the 57 percent of Americans who pay the federal income taxes that fund most safety net programs.

According to a recent in-depth study from the Heritage Foundation, "128,818,142 people are enrolled in at least one government program," based on U.S. Census Bureau information.

To be fair, the bulk of them are receiving Social Security (35,770,301) and Medicare (43,834,566) benefits, which all workers pay throughout the course of their working lives.

However, Heritage researchers note that 48,580,105 people are on Medicaid, the health insurance program for the poor, and 6,984,783 people are living in subsidized rental housing.

There has always been poverty and there will always be poverty. It's as old as society itself. Some people will always be more skilled, more educated and more industrious. So, they will typically earn more as a result.

But there are millions of Americans working two jobs to get by, putting in as many as 80 hours per week. These people are not poor due to a lack of will, effort or hard work. And, as I recently reported, most American households now have at least two adult workers.

Fifty-eight percent of the jobs created during the "recovery" have been low-wage positions, according to a 2012 report by the National Employment Law Project. These low-wage jobs paid a median hourly wage of $13.83 or less.

Even worse, some 30 million Americans are scraping by at or near the federal minimum wage. That’s roughly one in five people with a job.

Someone working full-time at the federal minimum wage of $7.25 makes $15,080 in a year, before taxes. In real terms, the minimum wage has been losing ground for 45 years because increases haven't kept pace with inflation.

Back in 1968, the minimum wage in the United States was $1.60 an hour. After you account for inflation, that is equivalent to $10.74 today.

If you were to work a full-time job at $10.74 an hour for a full year, you would make about $22,339 for the year.
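
Both annual figures follow from simple full-time arithmetic, assuming 40 hours a week for 52 weeks:

$$ \$7.25 \times 40 \times 52 = \$15{,}080 \qquad\qquad \$10.74 \times 40 \times 52 \approx \$22{,}339 $$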

That's not a lot of money. Yet, according to the Social Security Administration, 40.28% of all American workers make less than $20,000 a year.

This means more than 40% of all U.S. workers actually make less than what a full-time minimum wage worker made back in 1968.

That's how far we have fallen.

Low-wage jobs have undermined our economy and our society. They have swelled the ranks of the working poor, created a surging dependence on taxpayer-funded welfare programs, robbed the federal tax base and driven down demand and consumption, making a genuine economic recovery an impossibility.

Economic security is merely a fantasy for millions of Americans who have watched their American dreams fade to black like the final frames of a sad movie.

Tuesday, October 15, 2013

The Federal Budget Process 101



The federal budget process, and the debt ceiling, explained in relatively simple terms.

The Independent Report strives to be independent and apolitical in reporting on the U.S. economy. The focus is typically on the Federal Reserve's monetary policy, inflation, interest rates, unemployment, the housing market and the energy markets.

But another area of focus is fiscal policy: federal spending, revenues and deficits. It would be irresponsible, if not inconceivable, to ignore the nearly $17 trillion national debt.

The level of dysfunction in Washington has angered most Americans and it has dismayed other governments around the world, particularly our trading partners and the holders of our debt. With the current melodrama playing out in DC, a review of the federal budget process is in order. Think of it as "Federal Budget Process 101."

There is a lot of finger-pointing going on right now, but many Americans do not understand how the budget process works. Thanks to the Center on Budget and Policy Priorities for laying out the broad strokes.

The Congressional Budget Act of 1974 is the template for Congressional tax and spending legislation. Under the Budget Act, each year Congress is required to develop a "budget resolution" that sets aggregate limits on spending and targets for federal revenue.

The President's annual budget request kicks off the budget process. On or before the first Monday in February, the President submits to Congress a detailed budget request for the coming federal fiscal year, which begins on October 1. This budget request is developed by the President's Office of Management and Budget (OMB).

The President's proposed budget provides Congress with a recommendation for overall federal fiscal policy, including: total federal spending, how much revenue the government should collect, and how much of a deficit (or surplus) the federal government should run — which is simply the difference between spending and revenue. Cumulative, yearly deficits add to the overall national debt.

The President's budget is very specific, laying out recommended funding levels for all individual federal programs. The proposed budget typically outlines fiscal policy and budget priorities not only for the coming year, but for the next five years, or more. It is also accompanied by historical tables that illustrate past budget figures.

Nearly all of the federal tax code is set in permanent law and does not expire. Similarly, more than one-half of federal spending — including the three largest entitlement programs (Medicare, Medicaid and Social Security) — is also permanently enacted. Additionally, interest paid on the national debt is also paid automatically, with no need for specific legislation.

Funding for "discretionary" or "appropriated" programs falls under the jurisdiction of the House and Senate Appropriations Committees. Discretionary programs make up about one-third of all federal spending. Almost all defense spending is discretionary, as are the budgets for education, health research, housing, science, technology and transportation, to name just a few examples.

The next step in the federal budget process is the Congressional Budget Resolution. After Congress receives the President's budget request, the House and Senate Budget Committees generally hold hearings to question Administration officials about their requests and then develop their own budget resolutions. These resolutions then go to the House and Senate floors, where they can be amended by a majority vote. A House-Senate conference then resolves any differences in the resolutions and a conference report is passed by both houses.

The budget resolution requires only a majority vote to pass, and its consideration is one of the few actions that cannot be filibustered in the Senate. The budget resolution is supposed to be passed by April 15, but it often takes longer. If Congress does not pass a budget resolution, the previous year's resolution, which is a multi-year plan, stays in effect.

With all of the above in mind, hearing certain members of Congress complain — after the fact — about a budget that they approved and voted for rings hollow. All federal budgets, which typically include deficit spending, are approved by Congress before being signed into law by the President.

Which brings us to the current fiscal year.

The 2014 United States federal budget was issued by President Obama on April 10, 2013. As in any year, the actual appropriations for fiscal year 2014 must be enacted by both houses of Congress before they can take effect. The President's budget was submitted two months past the February 4 legal deadline due to negotiations over the fiscal cliff and implementation of the sequester cuts mandated by the Budget Control Act of 2011 (which was the result of the last debt crisis).

This means Congress still had nearly six months to review and counter the President's proposal before the new fiscal year commenced. The onset of the new fiscal year nearly coincides with the arrival of the "debt ceiling" on October 17, a date that has been looming for months. Waiting this long was clearly a tactic — a means of exacting negotiating leverage for a budget that should have been resolved months earlier.

The House Budget Bill was introduced on March 15, 2013 and passed the House with a simple majority, 221-207, on March 21, 2013. All 221 votes in favor of passage were from Republicans. Of those voting, every Democrat voted against passage, along with 10 Republicans.

The Senate rejected the House budget on March 21, 2013 with a vote of 59-40 and continued working on its own budget bill, which was introduced on March 15, 2013. On March 23, 2013 the Senate passed the resolution, 50-49, with 48 Democrats, 0 Republicans, and 2 Independents voting in favor of passage. Four Democrats and 45 Republicans voted against, with one Democrat not voting.

The political divide in both chambers is clearly evident.

By law, the two chambers of Congress were supposed to reconcile the two bills. Under regular procedures, the Senate and House were to appoint representatives to a joint budget conference committee to negotiate a compromise. However, the House balked.

Democratic Party members of the House Appropriations Committee wrote a letter on April 17, 2013 urging Speaker Boehner to appoint House members to the budget conference committee. Yet, the House majority refused to engage in a conference to reconcile total 2014 discretionary spending levels. Despite its refusal, the House was adamant that it would not raise revenues in any way, or by any amount.

Ultimately, there was no unified congressional budget. The House and Senate each moved forward with appropriations bills, but none passed. With fiscal 2014 approaching, Congress debated a Continuing Appropriations Resolution to temporarily fund the government. However, it failed to pass before the beginning of the new fiscal year (Oct. 1), leading to the current government shutdown.

While the House Republicans initially stated that their intention was to negotiate a budget that defunded the Affordable Care Act (aka, "Obamacare"), their strategy soon shifted to refusing to increase the debt ceiling, which is supposed to signify their opposition to government spending levels.

However, every federal budget, every year, has been approved by both houses of Congress before being signed into law by the President. It is ludicrous for Congress to now complain about spending that it previously approved. The time for dissent and negotiation has long since passed. A new fiscal year has already begun. Previously incurred debts are now due.

Spending was twice reduced in recent years: under the Budget Control Act of 2011 and through the sequester cuts. More cuts are needed. But those should have been negotiated in April or May, not October.

Republicans in Congress are now taking the position that the only way to control future spending is by refusing to raise the debt ceiling. It's a tacit admission that they cannot control themselves and lack the ability to stop spending money the nation doesn't have.

However, the debate over the debt limit is a false argument; the debt ceiling and current spending levels are not correlated. While many Americans may not understand how the debt limit works, members of Congress surely do. Yet, they are playing on the public's lack of understanding to score political points.

The "debt ceiling," or debt limit, is a legislative restriction on the amount of national debt that can be issued by the Treasury. However, since expenditures are authorized by separate Congressional legislation, the debt ceiling does not actually restrict deficits. In effect, it can only restrain the Treasury from paying for expenditures that have already been incurred by Congress.

In other words, the debt ceiling only limits how much the Treasury can borrow to pay for past expenditures approved by Congress. The debt ceiling is raised as necessary through separate legislation.

A 2011 Government Accountability Office study found "the debt limit does not control or limit the ability of the federal government to run deficits or incur obligations. Rather, it is a limit on the ability to pay obligations already incurred."

A January 2013 poll of a panel of highly regarded economists found that 84% agreed or strongly agreed that, since Congress already approves spending and taxation, "a separate debt ceiling that has to be increased periodically creates unneeded uncertainty and can potentially lead to worse fiscal outcomes."

The United States and Denmark are the only democratic countries to have legislative restrictions on issuing debt.

The U.S. has had some sort of legislative restriction on debt since 1917. However, since 1960, Congress has acted 78 separate times to permanently raise, temporarily extend or revise the definition of the debt limit — 49 times under Republican presidents and 29 times under Democratic presidents.

The United States has never reached the point of default where the Treasury was unable to pay its obligations. However, in 2011 the U.S. reached a point of near default. The delay in raising the debt ceiling resulted in the first downgrade in the United States' credit rating, a sharp drop in the stock market and an increase in borrowing costs.

Here's the important question: if the debt limit is so useless in restraining Congress' deficit spending and the issuance of new debt, why have a limit? Why the charade? Congress has continually voted to raise the ceiling anyway. Congress approves all expenditures, so it just needs to show some restraint in spending (or increase its revenues, or both). However, that restraint is due long before it's time to pay our bills for already incurred debt.

If Congress wants to place limits on the amount it can borrow to pay its debts, it must also place limits on the amount it first spends. The time for such decisions is in the spring or early summer, not October.

The U.S. is in a quandary. Not raising the debt ceiling would be economic suicide. The U.S. government would default for the first time in our nation's history — and it would be by choice. That would be insane. World markets would be rocked and the cost of borrowing would skyrocket.

As it stands, three-month Treasury bills sold today at a high rate of 0.13%, well above the 0.035% paid to sell the bills a week ago. That's nearly a fourfold increase, and the nation hasn't defaulted... yet.
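
A quick back-of-the-envelope check of that jump, using only the two auction rates cited above, is sketched in Python below:

    # Compare the two three-month T-bill auction rates cited above.
    old_rate = 0.035   # percent: the prior week's auction rate
    new_rate = 0.13    # percent: today's auction rate

    multiple = new_rate / old_rate                         # how many times higher
    pct_increase = (new_rate - old_rate) / old_rate * 100  # percentage increase

    print(f"The new rate is {multiple:.1f}x the old rate ({pct_increase:.0f}% higher).")
    # -> The new rate is 3.7x the old rate (271% higher).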

However, the U.S. is facing a genuine fiscal dilemma: It must borrow even more money, and go further into debt, in order to service its already massive debts. In essence, we're continually borrowing new money just to pay off old debts.

Tuesday, September 24, 2013

Americans Facing Economic Insecurity



For the past seven years, the Independent Report has been chronicling the decline in American incomes and living standards, the reliance on debt, and the increases in poverty, joblessness and low-wage work. But the trouble began long before.

Though the Great Recession wreaked havoc on the U.S. economy and on millions of families, the American worker — in fact, the entire U.S. middle class — had already been under assault for decades.

The inflation-adjusted wages of full-time, male workers are now lower than they were in 1973, according to Census figures, which has forced more women into the workforce. Yet, despite the emergence of female workers and two-income families, household income continues to fall.

The median income of American households was $51,017 in 2012, following a median of $51,100 in 2011, the Census Bureau reported on Sept. 17th. While the Bureau said the change was not statistically significant, it was a decline nonetheless, and it followed two previous annual declines.

Yet, while incomes have been steadily falling, prices have been steadily rising.

Inflation rose 1.6 percent, 3.2 percent and 2.1 percent in each of the last three years, respectively. The government's preferred "core" inflation measure strips out the costs of food and gas, so the inflation that consumers actually experience often runs higher than the figure policymakers cite. Oil still trades at over $100 per barrel, which raises the cost of gas, food and all other consumer goods.

This fall in incomes was not merely the result of the Great Recession. The decline had already been underway for many years.

In 1999, median household income was $56,080, adjusted for inflation, according to the Census Bureau. So, our median household income has fallen more than $5,000 since that time. That's not progress; it's a stark decline.

We're just slowly, steadily, sliding backwards as a nation.

Meanwhile, the U.S. poverty rate was essentially unchanged at 15 percent in 2012, as roughly 46.5 million people were stuck living at or below the poverty line, the Census Bureau reports.

As the Associated Press noted, "It was the sixth straight year that the poverty rate had failed to improve, hurt by persistently high levels of unemployment after the housing bust."

In 2012, 13.7 percent of people ages 18 to 64 (26.5 million) were in poverty compared with 9.1 percent of people 65 and older (3.9 million) and 21.8 percent of children under 18 (16.1 million).

Numbers like these may seem remote and impersonal, but the stark takeaway is this: there are 16 million American children, or one-in-five, living in poverty today.

Yet, the federal poverty rate surely underestimates the true number of poor Americans. For example, the poverty level for 2012 was set at $23,050 (total yearly income) for a family of four. However, if a family of four has $30,000 in annual income, it's safe to say that they are still living in poverty.

For a family trying to pay for housing, medical insurance, food and utilities — especially in a large metropolitan area — even $30,000 doesn't go far.

The official poverty measure ignores critical information like geographical location and the cost of housing. It is based on the nationwide price of certain food staples and adjusted with an inflation formula that fails to reflect rising costs like gas.

If the government were being honest about the true scope of poverty and struggle in America, an ugly and shameful picture would emerge.

Forty-five percent of Americans lack basic economic security, or the ability to pay for necessities like housing, utilities, food, health care, child care and transportation, according to a recent report by the nonprofit Wider Opportunities for Women.

It's more evidence that the middle class has been eviscerated.

The wealthiest Americans, however, are doing just fine. The top 10 percent of earners made half of all income in 2012, the most on record, according to IRS data.

Consequently, the gulf between the richest 1% of Americans and the rest of the country reached its widest level in history last year.

The top 1% of earners in the U.S. pulled in 19.3% of total household income in 2012, their biggest slice of total income in more than 100 years, according to an analysis by economists at the University of California, Berkeley, the Paris School of Economics and Oxford University.

One of the economists behind the research, Emmanuel Saez of UC Berkeley, is a top researcher on the topic of wealth and income inequality. He won the John Bates Clark Medal in 2009.

In a separate analysis, Saez found that the top 1% of earners posted 86% real income growth between 1993 and 2000, while the real incomes of the bottom 99% of earners rose just 6.6%.

Yet, the disparity has only worsened since that time.

According to the latest figures from Saez, the top 1 percent received 95 percent of all real income gains between 2009 and 2012. In 2011, when real income fell for the bottom 99 percent, the top percentile accounted for 121 percent of the year's income gains.
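
A figure above 100 percent can look like a misprint, but it isn't: when the bottom 99 percent's income falls while the top 1 percent's rises, the top group's gain exceeds the total net gain. The short Python sketch below uses made-up units, not Saez's actual data, to show the arithmetic:

    # Illustrative (made-up) units, not actual income data.
    top_1_gain = 121      # top 1% income change
    bottom_99_gain = -21  # bottom 99% income change (a decline)

    total_gain = top_1_gain + bottom_99_gain     # net gain for everyone: 100 units
    top_1_share = top_1_gain / total_gain * 100  # top 1% share of the net gain

    print(f"Top 1% captured {top_1_share:.0f}% of the year's total income gains.")
    # -> Top 1% captured 121% of the year's total income gains.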

The richest percentile now accounts for 22.5 percent of total U.S. income.

How can this state of affairs exist in a country that prides itself on greatness?

The U.S. is a plutocracy. The richest one percent have become a controlling class that runs Big Banking, Big Energy, Big Pharma, Big Insurance, Big Healthcare, Big Agra, Big Media and the Military-Industrial Complex. These industries are the core of the U.S. economy.

To repeat: the top 10 percent of earners made half of all income in 2012, the most on record, according to IRS data.

This isn't good in a consumption-based economy. When so many Americans have so little disposable income, it chokes off demand.

It's not a matter of fairness or economic equality; it's a matter of national economic survival. Siphoning off so much income to the top 1 percent, or even top 10 percent, is crushing economic growth and it is driving the emergence of a low-wage economy.

Fifty-eight percent of the jobs created during the recovery have been low-wage positions, according to a 2012 report by the National Employment Law Project. These low-wage jobs had a median hourly wage of $13.83 or less.

Wealth isn't easily defined. To someone making minimum wage, a person who makes $100,000 annually may seem rich. Yet, to someone who earns $1 million annually, $100,000 may seem like chump change. And then there are the stunningly wealthy Americans who make tens of millions of dollars each year, like hedge fund managers and elite Wall Street bankers.

So, how much money do Americans make across the economic strata?

The median income for those employed full-time between the ages of 25 and 64 is $39,000, according to the Census Bureau. The median household income is roughly $51,000 (this means half of American households earned more than that amount, while half earned less).

A household, as the Census defines it, consists of all the people who occupy one house or apartment. That means anyone living under the same roof and includes families, roommates sharing an apartment, and people living on their own.

In 2010, 39% of all households had two or more income earners. As a result, 19.9% of households had six-figure incomes, even though just 6.61% of individual Americans had incomes exceeding $100,000.
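
The arithmetic behind that gap is simple; the Python sketch below uses hypothetical incomes to show how two earners who each make well under $100,000 still produce a six-figure household:

    # Hypothetical individual incomes, chosen only for illustration.
    earner_a = 55_000   # below six figures on its own
    earner_b = 52_000   # below six figures on its own

    household_income = earner_a + earner_b
    print(f"Combined household income: ${household_income:,}")
    # -> Combined household income: $107,000
    # Neither earner makes six figures alone, but the household does.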

To be clear, less than 7 percent of Americans make more than $100,000 annually. That's not a large group. And fewer than 1 percent of the U.S. population has an annual income of more than $1 million.

On the other hand, one household out of every four (24.9 percent) makes less than $25,000 a year. This isn't just a national shame; it's a national crisis.

Workers in seven of the 10 largest occupations typically earn less than $30,000 a year, according to data published by the Bureau of Labor Statistics.

As long as capital is favored over labor, this state of affairs will continue until the economy breaks down entirely, with American workers unable to purchase whatever goods they still produce (assuming those goods aren't already produced overseas).

America is devolving into a nation of serfs, ruled by a small class of oligarchs.

As it stands, the U.S. already has the highest income inequality in the developed world, and the fourth highest among all nations. All signs point to this blight continually worsening.

Economic opportunity and a respect for labor drove the emergence of a robust American middle class in post-war America. But falling wages, the off-shoring of jobs, inflation, income inequality and diminished opportunity have destroyed the middle class — a group that previously separated America from the rest of the world.

Tuesday, September 10, 2013

The Rise of (and need for) Female Economic Power



When I was growing up, my father noted that something had noticeably changed in the American economy since he was a young man. Whereas his parents could get by quite well, and even live a middle-class lifestyle, on just one income, that had changed by the early 1970s.

By that time, maintaining a grip on a middle-class lifestyle typically required two incomes, forcing both parents into the workforce.

Since that time, the situation has become even more pronounced.

According to the Families and Work Institute in New York, 80 percent of today's married/partnered couples have both people in the work force, up from 66 percent in 1977.

The proportion of wives working year-round in married-couple households with children increased from 17% in 1967 to 39% in 1996.

By 2012, the share of married-couple families with children where both parents worked was 59 percent. And the labor force participation rate (the percent of the population working or looking for work) of married mothers with a spouse present was 68.3 percent, according to the BLS.

Of course, the number of women in the workforce is considerably higher when you include single and divorced women. And a dual-income household includes unmarried people, such as cohabiting couples of both sexes, as well as roommates who share expenses.

The issue is the number of families with children that require two incomes just to make ends meet. For most American families, there is no choice in whether a mother or father gets to stay home and parent their children. Having two working parents is now an economic necessity for almost all families.

The annual median wage fell in 2010 for the second year in a row to $26,364, a 1.2 percent drop from 2009, and the lowest level since 1999, according to David Cay Johnston at Reuters. And according to the Census Bureau, per capita income was $27,915 in 2011.

This is why most households now require more than one income to get by.

According to the Social Security Administration, 40.28% of all American workers currently make less than $20,000 a year. One in five people with a job earns only the minimum wage.

This just reaffirms why the majority of American households require two income earners, not just one.

Median household income has been sliding backward for the last six years, according to a new Census Bureau report. In 2007, at the beginning of the Great Recession, it was $55,480. By June 2009, when the recession had officially ended, it had fallen to $54,478. And by June of this year, it had dropped to $52,098.

Income of that level does not go far in our economy, given the cost of food, energy, housing, education, and healthcare.

During the Great Recession, and even in its aftermath, companies cut jobs and salaries to get leaner and lower costs. But the financial struggles of average Americans are not just a matter of lower incomes; the problem is coupled with the continually diminished purchasing power of our money.

This is attributable to the pernicious effects of inflation, which is engineered by the Federal Reserve (meaning that it is intended). Inflation is not some mysterious phenomenon. As the Fed has continually increased the money supply through the decades, it has steadily eroded and devalued the buying power of our money.

Inflation is a topic that I have explored repeatedly on this page through the years.

The effects of inflation, plus stagnant wages, have driven most American women into the workforce over the past four decades. This has resulted in such a historic shift that it can be aptly described as a sea change.

In 2010, for the first time in American history, the balance of the workforce shifted toward women, who now hold a majority of the nation’s jobs.

Women also dominate today’s colleges and professional schools: for every two men who will receive a B.A. this year, three women will do the same.

According to the Bureau of Labor Statistics, women now hold 51.4 percent of managerial and professional jobs — up from 26.1 percent in 1980. They make up 54 percent of all accountants and hold about half of all banking and insurance jobs. About a third of America’s physicians are now women, as are 45 percent of associates in law firms — and both those percentages are rising fast.

To be clear, not all working women are in the workforce simply out of economic necessity. Many women desire to work for a variety of personal reasons. Many of them have a skill or degree that they wish to utilize. Work can provide all people with a sense of community and of belonging. It can provide structure and a sense of purpose. Work can also be socially and mentally engaging. Additionally, it can provide a sense of identity and pride.

But there is no denying the economic impetus that has driven so many mothers into, or back into, the workforce.

It's bad enough that so many mothers are compelled to work as a result of economic necessity, even if they'd rather be at home with their young children. But they are also paid considerably less than their male counterparts for the very same jobs.

Women in the United States today are paid on average 77 cents for every dollar paid to men. The gap is even worse for African-American and Latina women.

Women ages 25 to 34 with only a high-school diploma currently have a median income of $25,474, while men in the same position earn $32,469.

Even among educated women, this wage-gap still exists.

The lifetime earnings of a male with a professional degree are roughly 40 percent (39.59%) higher than those of a female with a professional degree, according to the Census Bureau. The lifetime earnings gap between males and females is smallest for those holding an associate degree, with male lifetime earnings 27.77% higher than those of females.

According to a new study done by the National Partnership For Women And Families (NPWF), the median yearly pay for women who are employed full time is $11,084 less than men’s.
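
Those two figures, 77 cents on the dollar and an $11,084 annual gap, can be checked against each other with a little algebra. The Python sketch below derives the implied medians from those two numbers alone; the resulting dollar amounts are inferences, not figures from the NPWF report:

    # Back-of-the-envelope consistency check using only the two figures above.
    ratio = 0.77    # women's pay per dollar of men's pay
    gap = 11_084    # gap in median yearly pay for full-time workers

    male_median = gap / (1 - ratio)   # because gap = male_median * (1 - ratio)
    female_median = male_median * ratio

    print(f"Implied male median pay:   ${male_median:,.0f}")   # about $48,200
    print(f"Implied female median pay: ${female_median:,.0f}")  # about $37,100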

This pay gap has major implications for the ability of families and single women to afford essentials like food, housing and gas. According to NPWF, in more than 15.1 million families the woman is the breadwinner, and 31 percent of these families fall below the poverty line.

So, while women have advanced professionally in so many ways relative to men in recent decades, their pay still lags their male counterparts. This affects the men in dual-income households as much as it does the women.

If women were paid on par with men, all American families would benefit.

In 1970, women contributed 2 to 6 percent of the family income. Now the typical working wife brings home 42.2 percent. And four in 10 mothers — many of them single mothers — are the primary breadwinners in their families.

However, as women climb the ranks of the professional world, advancement eventually stalls out. Only 3 percent of Fortune 500 CEOs are women. But given societal trends, that will likely change sooner than later.

Women now earn 60 percent of master’s degrees, about half of all law and medical degrees, and 42 percent of all M.B.A.s. Most important, women earn almost 60 percent of all bachelor’s degrees — the minimum requirement, in most cases, for an affluent life.

This is the first time that Americans ages 30 to 44 include more college-educated women than college-educated men. And because it takes years for graduates to age into that cohort, this is not a new trend; it has been underway for some time.

As women have stepped up and taken on increasingly larger roles in society and the workplace, their earning power has allowed many more families to maintain their foothold in the middle-class.

In fact, in many cases wives now out-earn their husbands — another historic shift. Of all married couples, 24 percent include a wife who earns more, versus 6 percent in 1960.

But as women have increasingly taken on a larger role in the workplace — whether voluntarily for personal reasons, or less voluntarily due to economic reasons — it has created a greater demand for child care and resulted in more latch-key kids. This has increased the pressure (and expense) for both parents.

It's laudable when women are able to enter the labor force at will to utilize their skills or degrees for their own self-fulfillment. But it's not so great when women must work just to make ends meet for their families, yet get paid less than their male counterparts for doing the very same jobs.

The great middle-class expansion in the U.S. began following World War II. Women had entered the workforce during the war as a matter of patriotic duty and necessity. Millions of men were fighting overseas, so women stepped up and filled many of the jobs suddenly left vacant.

In 1940, only 28 percent of women were working; by 1945, this figure exceeded 34 percent. In fact, the 1940s saw the largest proportional rise in female labor during the entire twentieth century.

However, more than half of the women drawn into the workforce by the war had left at the end of the decade. The Baby Boom had begun and for most of them work had become a choice — not a necessity.

In contrast, by 2011, 58.1 percent of women were in the labor force (which includes the noninstitutionalized civilian population, 16 years of age and over, that is willing and able to work and is either employed or actively seeking employment). But in recent years that percentage has been shrinking due to high unemployment. The women’s labor force participation rate peaked at 60.0 percent back in 1999.

Having more women working can be viewed as a sign of progress and of gender equality. But the persistent wage gap between men and women is antiquated and even sexist, and the bias impacts nearly all American households.

The primary issue is that the post-war rise of the American middle-class was built largely on the backs of a single income-earner (typically men). But in order for the vast majority of American families to maintain their tenuous grip on that middle-class status, it almost certainly requires two incomes these days.

That's surely not a sign of progress.

Saturday, July 27, 2013

Savings Crisis Leaving Americans Unprepared for Emergencies/Retirement



For many years, Americans have shown a consistent inability to put money into savings, be it for emergencies, retirement, or the larger, necessary purchases that may arise.

In fact, the savings rate has been declining for decades, notwithstanding a brief uptick during the Great Recession. Here's a look at where the savings rate stood at the start of each of the last six decades, according to the Commerce Department's Bureau of Economic Analysis:

1960: 7.2%
1970: 9.4%
1980: 9.8%
1990: 6.5%
2000: 2.9%
2010: 5.1%

In 2005, the savings rate actually turned negative for the first time since the Great Depression, and it stayed that way for about two years.

While the savings rate reached a high of 5.4% in 2008, as Americans were trying to pay down their debts during the initial phase of the financial crisis, it started declining again in 2011.

This will have horrible consequences for the millions of Americans who will be facing a retirement funded entirely by Social Security. Pensions are largely a thing of the past and 50 percent of Americans don't participate in a retirement savings plan at work.

As it stands, current retirees are already relying far too heavily on Social Security.

According to the Social Security Administration, 23 percent of married couples and 46 percent of single people receive 90 percent or more of their income from Social Security. Furthermore, 53 percent of married couples and 74 percent of unmarried people receive half of their income or more from the program.

The average monthly Social Security benefit for retirees is just $1,262. That amounts to just $15,144 annually. Obviously, that doesn't go far.

According to a report by AARP, three out of five families headed by a retiree over 65 had no retirement savings. And half of those 65 and older had annual individual incomes of less than $18,500.

The problem for millions of Americans is that they simply can't afford to save. Adjusted for inflation, wages have been stagnant since the 1970s.

As bad as the Great Recession was to household incomes, those incomes have continued falling during the alleged recovery. Between June 2009, when the recession officially ended, and June 2011, inflation-adjusted median household income fell 6.7 percent, to $49,909, according to a study by two former Census Bureau officials.

The problem continues: inflation-adjusted wages fell 0.4% in 2012, following a 0.5% decline in 2011.

So this gives a pretty clear indication of why Americans aren't saving for retirement or emergencies: they simply can't afford to.

Here's a look at the U.S. savings rate in recent years, according to the Organization for Economic Cooperation and Development (OECD).

2006: 2.6%
2007: 2.4%
2008: 5.4%
2009: 5.1%
2010: 5.3%
2011: 4.7%
2012: 4.3%
2013: 4.0%

However, the personal savings rate was just 3.20% in May. It had been as high as 6.4% in December.

The low savings rate creates the potential for crisis for millions of individuals and families.

Nearly three-quarters of Americans don’t have enough money saved to pay their bills for six months, according to survey results released in June by Bankrate.com.

Half of the survey respondents said they had less than three months’ worth of expenses saved up, and more than one-quarter have no reserves to draw on in case of emergency.

In addition to stagnant wages, persistent unemployment has made it difficult for Americans to put any money away.

Low- and middle-income Americans were hit harder by the recession and slow recovery than the wealthy. While the annual wages of the bottom 90 percent of workers declined between 2009 and 2011, the wages of the top one percent rose 8.2 percent during the same period, according to a January analysis by the Economic Policy Institute.

According to Bankrate, if Americans want to ensure they're protected in the event of a financial emergency, like a job loss or a medical issue, they should have enough money to cover about six months' worth of bills saved.

The U.S. retirement savings deficit is between $6.8 and $14.0 trillion, according to the National Institute on Retirement Security.

That is a staggering sum of money, or rather, a staggering deficit.

The average working household has virtually no retirement savings. When all households are included — not just households with retirement accounts — the median retirement account balance is $3,000 for all working-age households and $12,000 for near-retirement households.

Previous generations were much better at saving. During World War II, Americans were encouraged to buy government bonds as a matter of patriotic duty to aid the war effort. Following the Great Depression, regulation of the banking/financial industry greatly diminished bank failures, encouraging more Americans to save.

Credit was also not nearly as available in earlier eras, which encouraged people to save for future needs, whether it was a car, or a house, or an education. But, beginning in the '80s, there was an explosion of cheap and easy credit. That allowed people to get by without saving.

But the larger issue is the fact that household incomes have been flat for decades, while inflation has driven the prices of everything higher.

I've been covering the savings crisis and its implications for the retirement security of Americans since 2005, and the story hasn't gotten any better.

I asked the question "Will You Have Enough to Retire?" in February, 2006, and later that year I asked, "Are You Retirement Ready?"

And in September, 2010, I noted that "Americans' Retirement Savings Look Bleak."

Sadly, nothing has changed in recent years. Millions of people have simply moved closer to, or into, retirement quite unprepared. This has huge implications for our country.

Seniors are among the most vulnerable people in our society. Many will have to rely on their adult children or other family members to help them get through their final years. That will place tremendous burdens on already struggling families.

Assisted living facilities are very expensive. Nursing homes are even more expensive. Home health care and aides are also beyond the reach of great numbers of our senior population.

None of this is appropriate or acceptable for such a wealthy nation, one that sees itself as a first-rate world leader.

For millions of Americans, what were supposed to be their "golden years" will not be so golden after all. At the least, they won't be nearly as golden as those of their parents and grandparents.