Saturday, September 16, 2017
The Census Bureau reported this week that median household income rose to $59,039 in 2016, a 3.2 percent increase from the previous year and the second consecutive year of healthy gains.
To be clear, household income is the amount of income earned by an entire household, whether it’s a single occupant, a couple, or roommates.
Let's explore the term 'median', which denotes the midpoint: the "middle" value in a list of numbers.
When it comes to median income, this means that half of households have incomes above that amount, and half have incomes below it.
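As a quick illustration of how a median works (a hypothetical sketch with made-up incomes, not Census methodology), sort the values and take the middle one:

```python
from statistics import median

# Hypothetical household incomes (illustrative only, not Census data)
incomes = [18_000, 24_000, 31_000, 59_000, 72_000, 95_000, 410_000]

mid = median(incomes)  # the middle value of the sorted list
print(mid)  # -> 59000

# Half the households fall below the median, half above
below = sum(1 for x in incomes if x < mid)
above = sum(1 for x in incomes if x > mid)
print(below, above)  # -> 3 3
```

Notice that the $410,000 outlier does not drag the median upward the way it would drag a simple average, which is exactly why income statistics use the median.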
The fact that median household income rose last year is obviously good news, but before we pop the champagne, let’s have a little perspective.
Middle-class households are only now seeing their income eclipse 1999 levels. The Census Bureau reports that in 1999, median household income, adjusted for inflation, was $58,655.
Yes, it took us 17 years just to get back to where we started. Are we really supposed to celebrate that?
According to the Social Security Administration, 50 percent of wage earners make less than $30,000 annually, 61.5 percent make less than $40,000 and 70.5 percent make less than $50,000.
Again, 50 percent of all wage earners make less than $30,000 per year, and a lot of them come from single-income households.
The proportion of Americans who live alone rose to 27 percent in 2013, according to the latest Current Population Survey from the Census Bureau; that's more than one in four people. Obviously, these are single-income households, and many of them are surely earning less than the median.
As of 2012, 60 percent of married couples with kids had dual-income households. Among the households with just one income earner, the father was the sole earner in 31 percent and the mother was the sole earner in 6 percent.
The average American household consisted of 2.53 people in 2016.
So, the median household income is generally a byproduct of two earners. That suddenly makes $59,000 seem a lot less impressive.
The incomes of the richest households are much further from the median than the median is from the bottom households. In other words, the gap between the top and the middle is far wider than the gap between the middle and the bottom.
Consider how different the earnings are among those above and below the median income.
According to the Internal Revenue Service (IRS), the top 1 percent had an adjusted gross income of $465,626 or higher for the 2014 tax year.
Clearly, the top 1 percent are not all billionaires.
Remember, a significant majority of earners take home less than $40,000 per year. Thirty-seven percent of earners take home less than $20,000. However, the lifestyles of those two groups aren’t as radically different from each other as they are from those who make at least $465,000 a year.
Roughly 534,000 Americans made more than a million dollars in 2013, according to the latest IRS data available.
The mega wealthy — a fraction of the top 1 percent — now earn an average of $1.3 million a year. That's more than three times as much as in the 1980s, when the super rich "only" made $428,000, on average, according to economists Thomas Piketty, Emmanuel Saez and Gabriel Zucman.
The top fifth of earners are taking home more than half of all overall income, a record.
The top 1 percent take home more than 20 percent of all income.
Meanwhile, the bottom 50 percent collect about 12 percent of national income.
This is why the savings rate was just 3.5 percent in July. People simply don’t have enough left-over income to save. Most Americans are spending everything they make just to get by and are using credit cards to make up the difference, even for basics like groceries and gas.
Credit-card debt in the U.S. recently surpassed the peak set just before the 2008 financial crisis. Outstanding revolving credit, which includes credit-card debt, rose to $1.02 trillion in June, according to a monthly report from the Federal Reserve.
Low incomes and slow income growth are having some rather glaring effects, especially on younger Americans.
Last year, the Pew Research Center reported that 32.1 percent of 18- to 34-year-olds lived at their parents’ homes in 2014, exceeding the 31.6 percent of young adults who were married or living with a partner in their own household.
Yes, one third of young adults — people up to the age of 34 — are still living with their parents. Yet the following year's findings were even worse.
Almost 40 percent of young Americans were living with their parents, siblings or other relatives in 2015, the largest percentage since 1940, according to an analysis of census data by real estate tracker Trulia.
It should be noted that 1940 was the end of the Great Depression, so the current state of affairs is quite troubling.
This huge percentage of adult children living with their parents has added at least one additional working adult to millions of households. Assuming the adult child lives with both parents and is working, it results in at least three income earners in millions of households.
Once again, that $59,000 median doesn’t seem so impressive.
Yes, it’s nice that median household income went up last year, but we’re still only back to 1999 levels and it’s clear that the masses are still struggling.
Remember, the bottom 50 percent collect just 12 percent of national income.
That’s important to consider, if you’re celebrating the rise in median household income.
Sunday, September 03, 2017
Labor Day, the first Monday in September, is a celebration of the American worker. As you might expect, this day of tribute was created by the labor movement. Since labor has always had less capital than management, its strength is always in its numbers.
The day is an acknowledgement of the American workers who built this nation, and continue to do so, while advancing it economically and socially.
Though municipal ordinances were passed in 1885 and 1886, the first state to officially recognize Labor Day by law was Oregon, on February 21, 1887. Colorado, Massachusetts, New Jersey, and New York all followed suit later that year, and by 1894 a total of 28 states had adopted the holiday in honor of workers.
Finally, on June 28 of that year, Congress passed an act making the first Monday in September of each year a legal holiday in the U.S.
Labor unions, such as the American Federation of Labor, the Brotherhood of Carpenters, the American Railway Union and the International Association of Machinists, were instrumental in fighting for better working standards, as well as promoting a holiday to celebrate the working man (and later, working women).
The fight for better labor standards, such as the eight-hour work day, the five-day work week and the minimum wage, has raised the living standards of all workers and led to higher economic and political principles for all.
While Labor Day also symbolizes the unofficial end of summer and is celebrated with beach parties, backyard barbecues and retail-sales events, no American worker should ever lose sight of the true meaning of Labor Day, or the sacrifices that the labor movement made over the past century to improve work and living standards.
In the late 1800s, the average American worked 12-hour days and seven-day weeks to simply get by. Children as young as five or six worked in mills, factories and mines across the country, earning a fraction of their adult counterparts’ wages. Conditions for all workers were poor and often terribly unsafe. Many employees lacked access to clean air, toilet/wash facilities and work breaks.
There were examples of workers living in company-owned buildings, in company-owned towns, where the workers’ only option was to spend their hard-earned money in company-owned stores.
These circumstances drove workers to organize and strike for better conditions, hours and pay. At times, these strikes turned violent and dozens of workers died in protest. Chicago was the scene of the infamous Haymarket Riot of 1886 and the bloody Pullman strike in 1894, the latter of which resulted in federal troops being deployed to crush the rioting workers.
These violent, brutal events led President Grover Cleveland to sign the Labor Day holiday into law just days later.
The incidents also led to common labor statutes that are now accepted as standard, such as child labor laws, mandatory breaks, paid overtime, vacation time and even paid holidays.
The unfair, unsafe, often inhumane labor practices of the 19th Century ultimately resulted in United States labor law, whose basic aim is to remedy the "inequality of bargaining power" between employees and employers.
The Fair Labor Standards Act of 1938 requires a federal minimum wage and discourages unhealthy working weeks over 40 hours through time-and-a-half overtime pay. The Occupational Safety and Health Act of 1970 requires that employees have a safe system of work.
The Clayton Act of 1914 guarantees all people the right to organize and the National Labor Relations Act of 1935 gave the right for most employees to organize over unfair labor practices without punishment.
None of these things would have resulted if not for the struggles and organization of American workers. We owe better pay, better hours, paid overtime and safe workplaces to the workers who fought for these things.
There is still a long way to go in order for most workers to be fairly compensated for their labors.
The federal minimum wage has remained at $7.25 an hour since 2009. Adjusted for inflation, it peaked in 1968 at $8.68 (in 2016 dollars), according to the Pew Research Center.
Contrary to popular belief, minimum wage workers are not all, or even mostly, teens. Less than half (45 percent) of the 2.6 million hourly workers who were at or below the federal minimum in 2015 were ages 16 to 24.
If a minimum wage worker puts in 40 hours per week and works 52 weeks a year (no vacation time), his/her annual earnings would be $15,080. The federal poverty guideline in 2017 for a family of two (one working parent, with one child) is $16,240. In other words, that minimum wage worker is below the federal poverty line.
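The arithmetic behind that comparison is simple enough to check directly, using the figures above:

```python
# Full-time, year-round earnings at the federal minimum wage
hourly_wage = 7.25      # federal minimum, unchanged since 2009
hours_per_week = 40
weeks_per_year = 52     # no vacation time

annual_earnings = hourly_wage * hours_per_week * weeks_per_year
print(annual_earnings)  # -> 15080.0

# 2017 federal poverty guideline for a family of two
poverty_line_family_of_2 = 16_240
print(annual_earnings < poverty_line_family_of_2)  # -> True: below the poverty line
```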
This isn't theoretical; it's real life for millions of Americans. According to the U.S. Census Bureau, there were 23 million single-parent families in 2016, and the vast majority were headed by single mothers. Today, 1 in 4 children under the age of 18 is being raised without a father, and nearly half (40 percent) of those children live below the poverty line.
It's not just minimum wage workers who are struggling. A reasoned argument can be made that most American workers are underpaid, which is a major factor in our troubling income inequality problem.
Sadly, America ranks among the worst in the world in this category. Of the 35 developed countries in the Organisation for Economic Co-operation and Development (OECD), only three have worse income inequality than the US (Turkey, Mexico and Costa Rica).
Today the top 1 percent take home more than 20 percent of all U.S. income. Meanwhile, the bottom 50 percent earn barely 12 percent of income.
According to the Social Security Administration, in 2015:
- 50 percent of wage earners made less than $30,000
- 61.5 percent of wage earners made less than $40,000
- 70.5 percent of wage earners made less than $50,000
Unionized workers are rarely minimum wage workers. Many are skilled laborers who rightfully earn more. The problem is that there are relatively few union workers left in America today.
The share of American workers who actually belong to a labor union has been falling for decades and is at its lowest level since the Great Depression, according to the Pew Research Center.
As of 2013, only 11.3 percent of wage and salary workers belonged to unions, down from 20.1 percent in 1983, according to the Bureau of Labor Statistics.
At their peak in 1954, 34.8 percent of all U.S. wage and salary workers belonged to unions, according to the Congressional Research Service.
In many ways, American workers have come a long way over the past century, but they clearly have much further to go.
However, the labor movement's past achievements are worth remembering and celebrating on this, and every, Labor Day.
Saturday, August 19, 2017
August 2nd was a rather inauspicious day this year. That's because what is known as Earth Overshoot Day (EOD) fell on the second day of August — the earliest it has ever arrived.
What is Earth Overshoot Day, you may be asking?
It is the date on which humanity’s resource consumption for the year exceeds earth’s capacity to regenerate those resources that year. The problem with EOD arriving on August 2nd is that there were still five months left in the year. From this point forward, humanity is stealing from its own future.
In a normal, healthy world, humanity would not use up all its available resources until Dec. 31. In fact, until the past few decades, humanity didn’t even come close to depleting all of the earth’s renewable resources on an annual basis. In 1963, humanity used just 78 percent of the earth's biocapacity.
Yet, Earth Overshoot Day has, on the whole, been arriving earlier each year since the early 1970s. Look at where it fell at the start of recent decades:
1971 - December 21
1980 - November 4
1990 - December 7
2000 - November 1
2010 - August 21
Notice the trend? The arrival of EOD has really accelerated over the past two decades. As recently as 1975, EOD fell in December. The last time it fell in November was 1985. At the current pace, it will arrive in July in 2019.
Aside from representing the day when the human population overshoots its environment, economically speaking, EOD also represents the day on which humanity begins its ecological deficit spending.
This earth contains finite resources, some of which are renewable if given adequate time to replenish. Therein lies the problem; humanity is using these resources far faster than they can be restored annually.
Obviously, this has limits, which are recognizable in the form of shrinking forests, topsoil erosion, species loss, fisheries collapse, diminishing fresh-water supplies and higher commodity prices.
Earth Overshoot Day is calculated by Global Footprint Network, which calls itself, “An international think tank that coordinates research, develops methodological standards and provides decision-makers with a menu of tools to help the human economy operate within Earth’s ecological limits."
According to Global Footprint Network's calculations, our demand for renewable ecological resources and the services they provide is now equivalent to that of more than 1.5 earths. The data shows us on track to require the resources of two planets well before mid-century.
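Global Footprint Network derives the date by dividing the planet's biocapacity by humanity's ecological footprint and scaling by the length of the year. A minimal sketch, using the 1.5-earths ratio from the paragraph above (the exact inputs GFN uses vary by year):

```python
import datetime

def overshoot_day(biocapacity, footprint, year=2017):
    """Day of year when demand exhausts the year's biocapacity:
    (biocapacity / footprint) * 365, per Global Footprint Network."""
    day_of_year = int((biocapacity / footprint) * 365)
    return datetime.date(year, 1, 1) + datetime.timedelta(days=day_of_year - 1)

# Demand equivalent to ~1.5 earths means footprint / biocapacity = 1.5
print(overshoot_day(biocapacity=1.0, footprint=1.5))  # -> 2017-08-31
```

Note that a 1.5-earths ratio puts the date at the end of August; an August 2 overshoot day actually implies a ratio closer to 1.7 earths, suggesting the oft-quoted 1.5 figure is already a few years stale.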
As you may know, there are no additional earths from which to extract precious resources. That poses some rather obvious problems, the kind that don’t have easy answers.
The obvious solution is to change our behavior, but humanity has never been very good at that. However, reality doesn't negotiate. Its terms are firm and non-negotiable.
Environmental groups, such as Global Footprint Network and the World Wildlife Foundation, recommend some fairly simple remedies, such as eating more vegetarian meals and cutting food waste. Yet, those are tough sells in America, where people believe in “American exceptionalism” and don't like being told what to do.
Big problems usually don't have easy solutions. The above suggestions are fairly straightforward and achievable. However, getting big industries to stop deforestation and overfishing, for example, will be much more challenging.
The earth needs more trees, not fewer. Without a massive tree-planting campaign, climate change will advance more quickly and be more devastating.
Every single minute, an area of forest the size of fifty soccer fields is cut down. Some 129 million hectares of forest — an area almost equivalent in size to South Africa — have been lost since 1990, according to the United Nations' Food and Agriculture Organization (FAO).
Overfishing has left vulnerable the millions upon millions of people around the world who are dependent on the sea for food and income. Nearly 90 percent of global fish stocks are either fully fished or overfished, based on an analysis from the UN’s FAO.
Ocean species are not the only animals vanishing from the earth at a dangerously rapid pace.
A report from World Wildlife Fund found that more than half the globe's vertebrates — mammals, birds, reptiles, amphibians and fish — were wiped out in a mere four-decade span. Specifically, these populations declined by 58 percent between 1970 and 2012.
The world's topsoil, which is vital to growing crops, is in a perilous state; about a third of it is already degraded and the decline is projected to continue. Generating three centimeters of topsoil takes 1,000 years, and if current rates of degradation continue, all of the world's topsoil could be gone within 60 years, according to a senior UN official.
The global population is presently 7.5 billion. The United Nations predicts it will increase to 10 billion by 2050.
This means, by that time, the world’s farmers, ranchers, and fishers must find a way to produce more food than they have in all of human history. That will prove daunting since farmland is decreasing instead of increasing.
In 1960, there were 1.1 acres of arable farmland per capita globally, according to data from the UN. By 2000 that had fallen to 0.6 acre. Clearly, productive farmland and the human population are moving in opposite directions.
Fresh water scarcity afflicts much of the world. Only about half of the world’s population has a connection to a piped-water supply in the home, whereas 30 percent rely on wells or local village pipes, and about 20 percent have no access at all to clean water.
Though 71 percent of the earth's surface is covered by water, only 3 percent of it is freshwater, the kind suitable for drinking. However, most of this is unavailable for human use. Roughly three-quarters of all freshwater is locked in the frozen, and largely uninhabited, ice caps and glaciers. What remains for our use is about 1 percent of the total.
While a concerted, global effort is needed to stop and ultimately reverse the exhaustion of the earth’s resources, much of the developed world remains woefully unaware of the crises that are currently unfolding.
People who live in places where massive deforestation has occurred and continues, such as the Amazon, are well aware of the emergency. People in fishing communities around the world come face to face with empty nets on a daily basis. The half of the global population without a piped water supply in their homes likely views clean drinking water as the vital, precious resource that it is, while the other half likely takes it for granted.
Governments around the world need to act collectively, and quickly, to solve these problems or the entire planet will be facing multiple, crushing resource shortages all at once, just a few decades from now.
For much of the world, these issues are already at full-blown crisis levels right at this very moment.
Friday, August 11, 2017
U.S. housing inventory hit a record low this year and the trend in tightness shows no sign of abating.
The inventory of homes for sale was down more than 11 percent in June, year over year, according to Zillow, with steeper drops in big markets like San Francisco (minus 26 percent), Minneapolis-St. Paul (down 30 percent), Washington, D.C. (down 20 percent) and Seattle (minus 24 percent).
Remarkably, housing inventory experienced a year-over-year decline for the 104th consecutive month, dating back to October 2008, according to RE/MAX.
Homebuilders haven’t picked up the slack by constructing more homes. Though housing starts have increased dramatically from the crash-era bottom, they’re still below average. In short, building hasn’t caught up with demand yet, which has kept supply painfully low.
“There are about as many homes for sale now as there were in 1994, except there are about 63 million more people in this country now than there were then,” reports Svenja Gudell, chief economist at Zillow.
That has driven home prices to record highs. The June 2017 Median Sales Price of $245,000 was the highest in the history of the RE/MAX National Housing report.
Unless supply rises quickly and dramatically, or unless demand suddenly falls due to lack of affordability, expect the housing crunch to continue.
Though median household income rose to $56,516 in 2015 (the latest data available), according to the U.S. Census Bureau, it remains 2.2 percent lower than in 2000, when it hit $57,790, and 2.4 percent below the 1999 peak of $57,909.
However, homes were much less expensive back then. The median U.S. home value was $119,600 in 2000, according to the U.S. Census Bureau. Because homes are far less affordable today relative to incomes, the next, inevitable recession will likely bring more defaults and foreclosures.
This isn’t the only evidence that Americans are living beyond their means.
Credit-card debt in the U.S. rose again in June, surpassing the peak set just before the 2008 financial crisis.
Outstanding revolving credit, which includes credit-card debt, rose to $1.02 trillion in June, according to a monthly report from the Federal Reserve.
This has consequences; defaults are once again on the rise. The New York Federal Reserve observed a 7.5 percent rise in the share of credit-card balances that were seriously delinquent, or at least 90 days past due, in the first quarter.
Mortgage debt ($8.6 trillion), credit card debt ($1 trillion), student loan debt ($1.4 trillion) and auto debt ($1.2 trillion) have all piled up to record levels. Consequently, U.S. household debt surpassed its pre-crisis peak in the first quarter. Total household debt increased to $12.73 trillion, surpassing the previous record level seen in 2008.
If you’re looking for a ray of light in this story, it might be that household income is now higher than in 2008. However, it still remains lower than in 1999 and 2000. That’s not good news.
Another positive view: in the fourth quarter of 2007, when many of us realized that the wheels were coming off the wagon, Americans were devoting 13 percent of their disposable personal income to household debt service. By the first quarter of 2017, that percentage was 10 percent.
Maybe that means everything is OK and Americans have it all under control.
But there's no escaping the fact that wages have been very sluggish, rising just 2.5 percent over the past year. Wages typically grow by 3.5 percent to 4 percent when the unemployment rate is this low.
However, inflation has also been sluggish, remaining below the Federal Reserve’s 2 percent target for five years. In fact, the Consumer Price Index rose just 1.7 percent, year-over-year, in July.
So, though wages have remained weak, inflation has remained even weaker.
Yet, the CPI is misleading. Things such as college tuition, prescription drugs and home prices are all far above the overall inflation rate. For example:
* Tuition at four-year public colleges has risen 225 percent over the past 20 years, according to College Board data. Student loan debt has now risen to $1.4 trillion.
* From about mid-2015 to mid-2016, prescription drug costs jumped by nearly 10 percent. Furthermore, according to an AARP study, the price for an AARP-selected basket of widely used prescription drugs rose from $4,140 in 2005 to $11,341 in 2013, an average annual increase of 13.4 percent and a total jump of 174 percent.
* The median U.S. home value rose 96 percent from 2000 - 2016, according to the U.S. Census Bureau, meaning it roughly doubled over 17 years. That’s an average annual increase of 4.1 percent, well above the inflation rate.
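Both of those "average annual increase" figures are compound annual growth rates, which can be checked directly with the numbers cited above (a quick sketch, not the original studies' methodology):

```python
def cagr(start, end, years):
    """Compound annual growth rate: the steady yearly rate that
    turns `start` into `end` over `years` years."""
    return (end / start) ** (1 / years) - 1

# AARP drug basket: $4,140 (2005) -> $11,341 (2013), 8 years
print(round(cagr(4_140, 11_341, 8) * 100, 1))   # -> 13.4

# Median home value: up 96 percent over roughly 17 years
print(round(cagr(100, 196, 17) * 100, 1))       # -> 4.0
```

The home-value rate works out to about 4 percent a year, in line with the figure cited above and well over double the recent CPI rate.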
Though consumers may have seen modest increases for consumer goods, such as clothing and footwear, the bulk of our money is spent on much more expensive items, such as housing, tuition, prescription drugs, health insurance and healthcare. None of those things have undergone modest price increases.
The heavily inflated costs of these components are creating enormous and unhealthy debt levels. Record-high debts should be seen as a canary in the coal mine.
The last time debt levels were this high, it didn’t end well. In fact, we’re still grappling with the aftermath a decade later.
Tuesday, August 08, 2017
In the first quarter, real GDP increased 1.2 percent, according to the Bureau of Economic Analysis. Yet, that weak performance didn’t stop all the major US stock indexes from closing at or near record highs.
Though the economy strengthened in the second quarter, expanding at a 2.6 percent clip, it was still tepid by historical standards. However, the Dow Jones has just experienced a streak of nine record closes. Over the course of 2017, the Dow has posted 35 record finishes.
Records have become commonplace for the Dow in recent years. In fact, the Dow has reached a new high, on average, once every seven days since fully recovering from the Great Recession in March 2013. In all, the Dow has achieved a new record 154 times in that span.
The S&P 500 also rose to a new record this week. The index has advanced nearly 11 percent so far in 2017. The NASDAQ and Russell 2000 indexes also closed at record highs this summer.
So far this year, the US economy has expanded just 1.9 percent, yet the stock markets are going nuts. It’s all come on the heels of a weak 1.6 percent expansion for all of 2016.
With that in mind, ask yourself this: Why are all of the stock markets at, or near, all-time highs?
This is the definition of “irrational exuberance,” as former Fed Chairman Alan Greenspan once described it.
The stock market is supposed to be forward looking. Yet, that sort of wisdom has become a thing of the past. Federal Reserve officials now expect GDP to remain around 2 percent through 2019. The markets are somehow unconcerned with this projection.
While most members of the 30-stock Dow make a big chunk of their money overseas, those 30 firms are a tiny sliver of the thousands of U.S. companies that do not.
The truth is, the stock market is not an accurate measure of the health and strength of the economy. The markets are simply a bet on the future performances of a select group of companies listed on a few stock exchanges.
Most American companies aren't even publicly traded. In fact, less than 1 percent of the 27 million businesses in the U.S. are publicly traded on the major exchanges.
Additionally, the number of public companies in the U.S. decreased by nearly 50 percent from 1996 to 2014, according to the National Bureau of Economic Research.
So, in reality, Wall St. is not a true reflection of how the average American worker, or the average family, is faring. In fact, nearly half of us don't own any stocks at all.
According to Gallup, 52 percent of U.S. adults owned stock in 2016. Since Gallup started measuring this in 1998, that's only the second time ownership has been this low. These figures include ownership of an individual stock, a stock mutual fund or a self-directed 401(k) or IRA.
Furthermore, the gains from this surging stock market have been flowing mainly to richer Americans. Roughly 80 percent of stocks are held by the richest 10 percent of households.
Clearly, the ballooning stock market is not a reflection of the financial well-being of the vast majority of Americans. Half of them aren't even investors. The markets are simply Wall Street’s betting games.
The reason for the markets' meteoric rise has been the Federal Reserve’s vast financial engineering.
During the 2008 financial crisis, the Fed cut its key interest rate to zero. After determining that this radical move wasn’t sufficient, it took the dramatic step of initiating quantitative easing, or QE.
Following three successive rounds of these Treasury and mortgage bond purchases with magically conjured money, the Federal Reserve’s balance sheet ballooned to $4.5 trillion. That amounted to a fourfold increase from late 2008 to late 2014. Much of that freshly-created money flooded into the stock markets. The money had to go somewhere.
Most people can’t live with, or on, the measly interest rates from savings accounts or certificates of deposit, which seem downright antiquated at this point.
Treasuries offer little help. Check out these yields (as of today):
1-year: 1.22 percent
2-year: 1.36 percent
5-year: 1.83 percent
10-year: 2.28 percent
30-year: 2.86 percent
Remember that the S&P has advanced nearly 11 percent so far just this year. It’s little wonder that investors are willing to roll the dice and hope that the good times just keep on rolling.
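To see why those yields push investors toward stocks, compare a decade of compounding at the 10-year Treasury rate with the same money compounding at this year's S&P pace (a hypothetical sketch; stock returns, unlike Treasury yields, are never guaranteed):

```python
def future_value(principal, annual_rate, years):
    """Value of `principal` compounded once a year at `annual_rate`."""
    return principal * (1 + annual_rate) ** years

# $10,000 in a 10-year Treasury at 2.28 percent, held to maturity
print(round(future_value(10_000, 0.0228, 10)))  # -> 12529

# The same $10,000 compounding at ~11 percent a year for a decade
print(round(future_value(10_000, 0.11, 10)))    # -> 28394
```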
Of course, bets don’t always pan out. The markets always correct and they sometimes crash. Right now, there are plenty of reasons to worry, or at least be deeply concerned.
Michael Lebowitz of 720Global assembled “22 Troublesome Facts” behind his reluctance to follow the bullish stock market herd. Here’s a small sampling:
• The S&P 500 cyclically adjusted price-to-earnings (CAPE) valuation has only been higher on one occasion, in the late 1990s, during the Tech Bubble. It is currently on par with levels preceding the Great Depression.
• Total domestic corporate profits (w/o IVA/CCAdj) have grown at an annualized rate of just .097% over the last five years. Prior to this period and since 2000, five-year annualized profit growth was 7.95% (note: period included two recessions).
• Over the last 10 years, S&P 500 corporations have returned more money to shareholders via share buybacks and dividends than they have earned.
• At $8.6 trillion, corporate debt levels are 30% higher today than at their prior peak in September 2008.
• At 45.3%, the ratio of corporate debt to GDP is at historical highs, having recently surpassed levels preceding the last two recessions.
John Mauldin summed it all up this way:
"So, US corporations are simultaneously more indebted, less profitable, and more highly valued than they have been in a long time. Furthermore, they are intentionally making themselves more leveraged by distributing cash as dividends and buying back shares instead of saving or investing that cash. Yet investors cannot buy their shares fast enough. Maybe this will end well… but it’s hard to imagine how."
As I have long said, this will end in tears. History tells us so. What goes up must come down. Nothing grows in perpetuity.
Millions of investors will be wiped out when this market has its eventual collision with reality. Many think they can time the market, but no one has a crystal ball. When markets tumble, investors by the millions sell in a panic. The trouble is, for every seller, there must be a buyer. When everyone is trying to exit the market at the same time, there won’t be enough buyers. It will turn into a bloodbath.
The U.S. has entered its ninth full year of expansion — making it the third longest since the 1850s — and that creates reason for concern.
Throughout U.S. history, the gap between one recession’s end and the next one’s beginning has averaged just under five years. In other words, this expansion has gotten really long in the tooth, which is a very uncomfortable reality.
Whether it’s the next, inevitable recession that sparks a stock market meltdown (remember recessions often begin before they are officially recognized) or if it's a market collapse that ignites the next recession doesn’t really matter.
The outcome will be the same, and it will be brutal.
Wednesday, July 19, 2017
During the Great Recession, the ranks of the poor and impoverished in America swelled. Yet, a decade after the recession began, its aftermath remains broad and extensive.
Following the Great Recession, the poverty rate increased to 15.1 percent in 2010 and registered 15.0 percent in 2011, according to a report by Stanford University.
By 2015, 43.1 million Americans, or 13.5 percent, lived in poverty. However, according to a supplemental poverty measure, the poverty rate was 14.3 percent. That was six years after the Great Recession ended.
The official poverty rate in the United States has ranged from a high of 22.4 percent when it was first estimated for 1959 to a low of 11.1 percent in 1973. Since the launch of the major “War on Poverty” programs in 1964, the rate has fluctuated between roughly 11 and 15 percent.
What does it all mean?
Despite the “War on Poverty" and repeated economic expansions, roughly 1 in 7 Americans still lives in poverty. But the problem goes well beyond that.
On top of the 43.1 million Americans living in poverty, an additional 97.3 million people (33 percent) in the US are low-income, defined as having incomes below twice the federal poverty line, or $47,700 for a family of four.
Taken together, this means that 48 percent of the US population is poor or low income. That’s 1 in every 2 people.
Consider that for a moment: census data shows that half the population qualifies as poor or low income.
If you’re wondering how this can be, take a look at national income data; it's quite illuminating.
According to the Social Security Administration, in 2015:
- 50 percent of wage earners made less than $30,000
- 61.5 percent of wage earners made less than $40,000
- 70.5 percent of wage earners made less than $50,000
It’s a snapshot of the economic struggles most Americans are facing. Again, half of US workers earn less than $30K annually.
What officially defines poverty? The federal government offers this definition, according to the 2017 poverty guidelines:
1 person - $12,060
2 people - $16,240
3 people - $20,420
4 people - $24,600
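The guideline figures above follow a simple linear rule: for 2017, in the 48 contiguous states, each additional household member adds $4,180 to the one-person base of $12,060. A quick sketch:

```python
# 2017 federal poverty guideline (48 contiguous states): a linear rule
# where each additional household member adds $4,180 to a $12,060 base.

def poverty_guideline_2017(household_size: int) -> int:
    """Return the 2017 poverty guideline, in dollars, for a household."""
    return 12060 + 4180 * (household_size - 1)

for size in range(1, 5):
    print(size, poverty_guideline_2017(size))
# 1 → 12060, 2 → 16240, 3 → 20420, 4 → 24600
```

Running this reproduces the four figures listed above exactly.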
By this standard, a family of four that makes $25,000 this year is not considered by the federal government to be in poverty.
I beg to differ. If we’re being honest, millions more Americans are genuinely impoverished, though they are not officially recognized as such.
This is not even a discussion about fairness. The poor always have been, and always will be, among us. But surely we can do better. After all, the US remains one of the richest nations in the world.
These weak income levels are a primary reason that our economy has averaged a meager 2.16 percent annual growth rate over the last five years. That’s less than two-thirds of the historic average.
As I’ve said many times for many years, the majority of Americans simply do not have enough income to grow the economy at levels once considered normal or customary.
The US economy is based on demand and consumption, not manufacturing and exporting. If Americans aren’t vigorously spending, the economy grinds to a halt, as it has over the past decade.
There is nothing to suggest that incomes are going to rise significantly this year, or any time in the near future, or that the poverty rate will suddenly decline toward its 1973 low.
We can’t even get the government to be honest about how many of its citizens are truly impoverished.
The first step to correcting a problem is to admit that you have one, and the federal government can’t — or won’t — do that.
Thursday, June 08, 2017
Healthcare and health insurance have been hot topics in the U.S. for well over a decade now. But, to be clear, they are not one and the same.
Healthcare is about access and affordability: the ability to get care when needed and to pay for it without going bankrupt or into debt. Health insurance, on the other hand, is supposed to provide financial protection in the event of a costly illness or injury.
What the U.S. suffers from is a lack of accessible and affordable healthcare. What good is health insurance if you can't access or afford healthcare anyway?
The Affordable Care Act (ACA) was supposed to remedy these problems. However, after seven years of existence, the law has had, at best, uneven results.
The good news is that the number of Americans without health insurance has been reduced by roughly 24 million. The CDC reported that the percentage of people without health insurance fell from 16.0 percent in 2010 to 8.9 percent during the January–June 2016 period.
However, most of the newly insured are poor Americans who receive subsidies through the ACA’s health exchanges, along with people who qualify for Medicaid, which was designed for the poor.
The Congressional Budget Office reported in March 2016 that there were approximately 12 million people covered by the exchanges, 10 million of whom received subsidies to help pay for insurance. Another 11 million were made eligible for Medicaid by the law, a subtotal of 23 million people. An additional 1 million were covered by the ACA’s "Basic Health Program," for a total of 24 million.
Wealthy Americans can afford healthcare, even though it is expensive. They do not avoid the doctor or monthly insurance payments because the cost interferes with their mortgage payments or ability to eat.
The people squeezed by the ACA are those in the broad middle — the ones who make too much to receive a subsidy, yet too little to practically afford soaring premiums and deductibles.
The way insurance is supposed to work is that you either pay a high premium that affords you a low deductible (out-of-pocket treatment cost) or you pay a low premium that puts you on the hook for a higher deductible when you eventually get sick or injured.
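That tradeoff can be made concrete with a little arithmetic. The dollar figures below are purely hypothetical, chosen only to show the mechanics: in a bad year, your total outlay is a year of premiums plus the full deductible, so a high premium should buy you a smaller worst case.

```python
# Illustrative sketch of the premium/deductible tradeoff (all figures hypothetical).
# Worst-case annual cost = 12 months of premiums + the full deductible.

def worst_case_annual_cost(monthly_premium: float, deductible: float) -> float:
    """Total outlay in a year with enough medical bills to hit the deductible."""
    return 12 * monthly_premium + deductible

plan_a = worst_case_annual_cost(500, 1000)   # high premium, low deductible
plan_b = worst_case_annual_cost(200, 7000)   # low premium, high deductible
print(plan_a)  # 7000.0 — you prepaid for protection
print(plan_b)  # 9400.0 — you saved on premiums but bear more risk
```

The complaint in the text is precisely that this tradeoff has broken down: when premiums and deductibles rise together, both numbers climb at once.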
Middle-income Americans are rightly angry about health insurance costs; our deductibles and our premiums have shot upward at the same time. This is not how insurance is supposed to work.
Healthcare and insurance costs have become unreasonable and unmanageable, and the ACA hasn't changed that. But it wasn't always this way.
As recently as 1980, health care expenditures were just 4.2 percent of gross domestic product. Since that time, costs have taken off like a rocket; they now account for roughly one of every six dollars spent in the U.S.
National health expenditures grew 5.8 percent to $3.2 trillion in 2015, or $9,990 per person, and accounted for 17.8 percent of Gross Domestic Product (GDP), according to the Centers for Medicare & Medicaid Services (CMS).
Per capita healthcare spending in the U.S. is more than twice the average of other developed countries. Meanwhile, the share of GDP allocated to health spending in the 34 OECD (developed) countries was just 8.9 percent in 2013, roughly half the percentage the U.S. spent.
Yet, things are only getting worse. The CMS projected that total health care spending for 2016 reached nearly $3.4 trillion, up 4.8 percent from 2015. The costs just keep rising, with no end in sight. The CMS projects that U.S. health care spending will grow by an average 5.6 percent annually over the next decade, due to the aging population and rising prices for health care services.
This problem is compounded by the fact that healthcare costs are growing more rapidly than the economy. The CMS projects that national health care spending will outpace GDP growth by 1.2 percentage points over the next decade. As a result, the CMS estimates that health care spending will account for 19.9 percent of GDP by 2025, up from 17.8 percent in 2015.
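Those two CMS projections hang together as simple compounding: grow the 2015 share of GDP by the 1.2-percentage-point annual gap for a decade and you land close to the projected 2025 share. A rough check (rough because CMS models spending year by year, not at a constant gap):

```python
# Back-of-the-envelope check on the CMS figures cited above.
share_2015 = 0.178   # health spending as a share of GDP in 2015
annual_gap = 0.012   # projected excess of health-spending growth over GDP growth
share_2025 = share_2015 * (1 + annual_gap) ** 10
print(round(share_2025 * 100, 1))  # ~20.1, close to the CMS 19.9% projection
```

The small discrepancy versus 19.9 percent reflects the constant-gap simplification, but the arithmetic confirms the projections are internally consistent.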
These are the issues the ACA was supposed to rectify. Back in 2009, US healthcare costs totaled $2.5 trillion and comprised 17.6 percent of the economy. However, the treatment for this disease turned out to be no cure at all. By 2015, healthcare costs had risen to $3.2 trillion and 17.8 percent of GDP.
Despite these massive expenditures, the U.S. gets very little in return. Bloomberg put it this way:
The U.S. health-care system remains among the least-efficient in the world.
America was 50th out of 55 countries in 2014, according to a Bloomberg index that assesses life expectancy, health-care spending per capita and relative spending as a share of gross domestic product… and life expectancy was 78.9. Only Jordan, Colombia, Azerbaijan, Brazil and Russia ranked lower.
By those measures, the U.S. healthcare system is clearly failing.
A 2014 report from the Commonwealth Fund found that the US “ranked last overall among 11 industrialized countries on measures of health system quality, efficiency, access to care, equity and healthy lives.” The kicker was that the US has the highest costs while also displaying the lowest performance.
Whoever says, “You get what you pay for,” clearly hasn’t seen America’s deplorable healthcare statistics.
These findings are not a recent development. They are decades in the making.
Back in 2000, the World Health Organization ranked the US 37th of 191 countries for "overall health system performance," 72nd for "level of health," and first for "health expenditures per capita."
Yet, even as healthcare costs continue to soar, Americans are doing very little to help themselves. The U.S. is beset by lifestyle diseases, such as heart disease, hypertension and diabetes. In other words, Americans suffer from things that are preventable, but we choose to do little about them. The American diet is famously unhealthy and Americans exercise far too little, if at all.
A 2013 study by the CDC estimated that nearly 80 percent of adult Americans do not get the recommended amounts of exercise each week, potentially setting themselves up for years of health problems.
However, it’s a relatively small group of Americans that is driving the cost of healthcare.
Just 5 percent of the population is responsible for almost 50 percent of all healthcare spending. At the other end, half of the population accounts for just 3 percent of spending.
Unlike other sectors, healthcare is one area where improvements in technology don’t lower costs. It’s a true anomaly, and it’s that way by design.
More than anything else, the cost of healthcare in America is driven by greed. Hospitals, pharmaceutical companies and medical-device makers, for example, have determined that they can stick it to the federal government and the insurance companies. Those costs are simply passed along to taxpayers and consumers, as Steven Brill so eloquently pointed out in his authoritative TIME magazine piece, Bitter Pill: Why Medical Bills Are Killing Us.
If we’ve learned anything from the ACA experiment, it’s that we can’t expect the government to save us. Accidents will surely happen, but the more we take care of ourselves, the less likely we are to become sick and in need of healthcare.
Regular exercise results in health and wellness. Fitness means taking care of your health. In other words, fitness is health care. On the other hand, going to the doctor is sick care.
Taking care of yourself to the best of your ability is likely the best insurance policy you will ever have, or ever be able to afford.
As the old saying goes, an ounce of prevention is worth a pound of cure.