Tuesday, November 02, 2021

FAANGs Contribute Disproportionately to S&P 500, Yet Little to US Employment





Tech giants Facebook, Amazon, Apple, Netflix and Google constitute a disproportionate share of US stock market performance.

Often referred to as the FAANGs, they make up about 19 percent of the S&P 500, as of August 2021. That means roughly one-fifth of the performance of a 500-company index is driven by just five of its members. This disproportionate representation is staggering considering that the S&P 500 is generally viewed as a proxy for the United States economy as a whole. It's unsettling that just five stocks can have such an outsized effect on the performance of an entire index.

The five FAANG stocks are among the largest companies in the world, with a combined market capitalization of nearly $7.1 trillion as of August 2021.
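
For a sense of scale, here is a rough back-of-envelope check of how those two figures fit together. This is a minimal sketch: the S&P 500 is float-adjusted and cap-weighted, so the implied index total below is an inference for illustration, not a published number.

```python
# Back-of-envelope check of the FAANG weighting claim. In a market-cap-weighted
# index, a group's weight is its combined market cap divided by the index's
# total market cap, so the two figures cited above imply a total for the index.
faang_market_cap = 7.1e12   # combined FAANG market cap, August 2021 (from the post)
faang_weight = 0.19         # FAANG share of the S&P 500 (from the post)

implied_index_cap = faang_market_cap / faang_weight
print(f"Implied S&P 500 total market cap: ${implied_index_cap / 1e12:.1f} trillion")
# -> roughly $37 trillion carried by 500 companies, a fifth of it by just five
```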

However, despite their enormous size, these companies contribute very little to US employment.

Total Number of Employees

Facebook - 58,604

Apple - 147,000

Amazon - 950,000 in US (1.3 million globally)

Netflix - 12,135

Google/Alphabet - 135,301

In total, these five powerhouses employ just 1.3 million Americans. The civilian labor force amounts to 161.35 million people. In other words, the FAANGs employ roughly 0.8 percent of the US civilian labor force, or fewer than one worker in a hundred.
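
Here is the quick arithmetic behind that share, using the headcounts listed above (a minimal sketch; Amazon's figure is its US workforce, while the others are company-wide totals):

```python
# FAANG headcounts as listed above
employees = {
    "Facebook": 58_604,
    "Apple": 147_000,
    "Amazon": 950_000,   # US workforce
    "Netflix": 12_135,
    "Alphabet": 135_301,
}

civilian_labor_force = 161_350_000  # US civilian labor force (from the post)

total_faang = sum(employees.values())
share = total_faang / civilian_labor_force
print(f"FAANG headcount: {total_faang:,}")   # about 1.3 million
print(f"Share of labor force: {share:.1%}")  # about 0.8 percent
```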

Yet, they have a combined market capitalization of over $7 trillion and make up one-fifth of the S&P 500.

Meanwhile, the FAANGs, plus Microsoft, had a combined “tax gap” of more than $100 billion in the decade ending in 2019, according to an analysis by Fair Tax Mark, a British organization that certifies businesses for good tax conduct.

Many argue that the FAANGs are vital to the US economy. At the least, they are quite vital to the S&P 500. However, when it comes to employment and paying taxes, the FAANGs contribute relatively little.


Tuesday, April 23, 2019

American Despair



Life expectancy in the US declined for the third consecutive year in 2017. Americans could expect to live to 78.6 years, down from 78.7 years in 2016. In all, life expectancy fell 0.3 years from 2014 to 2017. Most alarmingly, death rates even increased among adults between the ages of 25 and 34.

It was a rather stunning development since successive declines of that sort had not been witnessed since World War I, a span of 100 years. At that time, the world was gripped by a global flu pandemic. Since then, life expectancy had been continually progressing due to medical advances and better-coordinated public health efforts.

The opioid epidemic, suicide and alcoholism (e.g., alcohol overdoses and cirrhosis) are all driving this steady decline in life expectancy. Deaths from all of these causes are surging.

There were 70,237 drug overdose deaths in the US in 2017, according to the CDC, a 10 percent increase from 2016. That amounts to 192 overdose deaths every single day.

The national suicide rate increased 33 percent between 1999 and 2017, according to the CDC. In 2017, 47,173 Americans died by suicide, an average of 129 suicides per day. Suicide is at its highest point in 50 years and is now the second-leading cause of death for Americans under the age of 35.

Even alcohol-related deaths are on the rise. An estimated 88,000 people die from alcohol-related causes annually, which makes alcohol the fourth-leading preventable cause of death in the United States, according to the National Institute on Alcohol Abuse and Alcoholism.

These are all indicators of widespread, national despair.

What’s driving this? It’s likely a number of factors.

In what has to qualify as a national emergency, 61 percent of Americans don't have enough savings to cover a $1,000 emergency. Since this is the status of the majority, you may well be one of them. If so, you know just how stressful this is. It’s like a weight that’s never lifted.

It isn’t that much of a surprise when you consider how little the typical American earns. The median (middle) income per capita in 2017 was just $31,786, according to Census Bureau data. Remember, that is the midpoint: half of Americans earn even less.

The real average wage of American workers (that is, the wage after accounting for inflation) has about the same purchasing power it did 40 years ago. In fact, the $4.03-an-hour rate recorded in January 1973 had the same purchasing power as $23.68 would today, according to Pew Research.
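
The conversion behind that comparison is a simple inflation adjustment. Here is a minimal sketch; the CPI index values are rough approximations I'm assuming, so the result only lands near Pew's $23.68 figure rather than matching it exactly:

```python
# Adjust a past nominal wage into today's dollars using the ratio of price levels:
#   real_value_today = nominal_past * (CPI_today / CPI_past)
wage_jan_1973 = 4.03    # nominal average hourly wage, January 1973 (from the post)
cpi_jan_1973 = 42.6     # approximate CPI-U, January 1973 (assumed here)
cpi_2018 = 252.0        # approximate CPI-U, mid-2018 (assumed here)

wage_in_todays_dollars = wage_jan_1973 * (cpi_2018 / cpi_jan_1973)
print(f"${wage_in_todays_dollars:.2f}")  # roughly $24, close to Pew's $23.68 figure
```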

Additionally, the share of wealth held by the bottom 90 percent of Americans fell from just over 33 percent to less than 23 percent from 1989 to 2016.

With stagnant wages and a shrinking piece of the wealth pie, debts are growing.

As of the last quarter of 2018, total US household debt stood at $13.58 trillion, nearly a trillion dollars above its pre-recession peak.

Experian data from the fourth quarter of 2018 revealed record-setting debt in several credit categories:

• Mortgage debt reached a new high of $9.4 trillion.

• Student loan debt reached a record high of $1.37 trillion.

• Auto loan balances hit $1.27 trillion, an all-time high, coupled with a new record for the average monthly auto payment, at $523.

• Credit card debt reached an all-time high of $834 billion.

• Personal loan debt totaled $291 billion and was the fastest-growing type of consumer debt in the past year.

Keep in mind that these figures are from the fourth quarter of 2018; they’ve only grown larger since then.

Remember what a big news story it was when student debt topped $1 trillion back in 2011? Well, it’s now over $1.5 trillion. That means it’s risen an additional 50 percent in less than a decade.

But Americans’ problems go well beyond money and finances. We all know that Congress doesn’t give a damn about what we think. Our elected officials in Washington serve only their corporate masters.

Professors Martin Gilens of Princeton University and Benjamin Page of Northwestern University looked at more than 20 years' worth of data and found that the opinions of 90 percent of Americans have essentially no impact at all. As the professors noted, “The preferences of the average American appear to have only a minuscule, near-zero, statistically non-significant impact upon public policy.” What does have influence? You guessed it -- money.

Americans feel powerless and at the mercy of corporate and government forces much larger than them. It’s like David versus Goliath. The deck is stacked and Americans feel that their battles are already lost.

Trust in institutions has been deeply eroded or nearly ruined. Public confidence in Congress, the presidency, the Supreme Court, Big Business, banks, newspapers, TV news, the medical system, churches/religion and the criminal justice system has sunk to deplorable levels.

Speaking of the criminal justice system, the United States easily tops all nations in incarceration rates, with over 700 people behind bars per 100,000 in the population as a whole. Russia incarcerates about 500 per 100,000; Singapore just under 250; France about 100; and well toward the bottom is Japan, at 55.

Americans are feeling more lonely, too, despite the prevalence of social media.

Researchers recently conducted a study examining the lives of over 20,000 American adults. The results were shocking. Among other things, the findings revealed that 46 percent of respondents feel alone either sometimes or always, and 43 percent say that their relationships are meaningless.

About 60 percent of adults under age 35 now live without a spouse or a partner. One in three adults in this age range live with their parents, making that the most common living arrangement for the cohort. Think about that for a moment: a third of adults under 35 still live with their parents. That’s not liberating.

This is a seismic change in American culture. In 1978, 59 percent of 18-34 year olds were married. By 2018, that figure had plunged to 29 percent.

In 2017, the US census reported 110.6 million unmarried people over the age of 18 — that’s 45.2 percent of the American adult population. It's the highest number in US history.

Yet, studies have shown that married people tend to be healthier and live longer, though the reasons why are still not clearly understood.

Our health is not all that good, especially given the amount of money the nation spends on healthcare. Health spending per person in the U.S. was $10,224 in 2017, which was 28 percent higher than in Switzerland, the next-highest per capita spender.

On average, other wealthy countries spend about half as much per person on health as the U.S. spends. It's not money well spent.

The CDC reported in 2016 that obesity affected about 93.3 million US adults, or 40 percent of the adult population. In all, 72 percent of the adult population is either overweight or obese.

Financial costs aside, this has horrible impacts on our national health. For example, more than 100 million adults in the US are now living with diabetes or pre-diabetes, according to a 2017 report from the CDC.

Being obese and having diabetes, for example, create more than just physical consequences; they are stressful and depressing states.

No single factor is driving this 21st-century American despair. Instead, it is the convergence of many factors that has created a perfect storm of misery.

Examined through this lens, the surge in overdose deaths and suicides, and the accompanying decline in life expectancy, are not all that surprising, though they are still quite alarming.

Too many Americans are facing financial distress and poor job prospects, despite the fact that unemployment remains historically low. They have massive debts that may seem insurmountable. They see the rich getting richer, while the masses struggle to tread water. They see corporate and other special interests controlling the political system and public policy. They trust almost no institutions anymore. Many adults live alone or with their parents, and they feel quite lonely. Too many are obese, sick and out of hope.

Taken as a whole, this is a recipe for disaster and despair. It’s led us to addiction and abuse, suicide and declining life expectancy, at a time when we should instead expect life spans to be increasing.

We need each other. We need our families and our friends, and we need to remember that we are not alone in our struggles. Ultimately, we can take better care of ourselves, and of each other.

Wednesday, February 13, 2019

National Debt Reaches $22 Trillion; Washington Shows No Concern



The national debt has surpassed $22 trillion for the first time in U.S. history, according to new Treasury Department data. And trillion dollar annual deficits are now the norm, according to the Congressional Budget Office (CBO), which has issued some rather gloomy projections for the years ahead.

The CBO estimates the deficit will average $1.2 trillion each year from 2019 through 2028, for a total of $12.4 trillion. At that rate, the nation will tack on roughly another $10 trillion in new debt over the next eight-plus years.

This explosion in debt has occurred despite the fact that the economy has grown in each of the past nine years. In fact, this is the second-longest economic expansion since World War II ended. Employment and job creation continue to be robust. Just imagine what happens when the next recession or financial crisis unfolds and the government decides to implement another emergency spending package.

Annual deficits exploded during the Great Recession, when the government stepped in to fill the gaping void left by the crippled private sector. Yet, though we are no longer in an economic crisis, massive deficits continue because tax revenue has fallen (due to tax cuts) and federal spending continues to rise. This combination is a recipe for disaster.

The U.S. now pays an average of $1 billion every day in interest on the debt, which is money that could otherwise be spent on infrastructure, healthcare, education or anything else. The CBO projects that the costs of servicing the debt will surpass defense spending by 2025.

The public has been able to ignore this explosion in debt because interest rates are historically low, as I've previously illustrated. For example, at the beginning of this century, the yield on the 10-year Treasury averaged 6.03 percent. It fell to an average of just 1.80 percent in 2012 and is averaging 2.71 percent this year. If the 10-year were to revert to its average in 2000, the effects would be widely felt and impossible to ignore.
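
To see why a reversion in rates would be impossible to ignore, consider a crude back-of-envelope calculation. This sketch simply multiplies the full debt by an average rate; in reality the debt rolls over gradually across many maturities, so it only shows the order of magnitude:

```python
# Back-of-envelope annual interest cost on the national debt at various average rates.
# In practice only a fraction of the debt reprices each year, so this overstates
# the immediate impact; it is meant only to show the order of magnitude.
national_debt = 22e12  # roughly $22 trillion (from the post)

for avg_rate in (0.018, 0.0271, 0.0603):   # the 2012, 2019-to-date and 2000 averages cited above
    annual_interest = national_debt * avg_rate
    print(f"At {avg_rate:.2%}: about ${annual_interest / 1e9:,.0f} billion per year")
# At roughly 2.7 percent the bill is already near $600 billion a year;
# at the 2000-era 6 percent it would exceed $1.3 trillion.
```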

Higher interest rates would lead to higher borrowing costs across the economy and would surely slow economic growth, which would only worsen the effects of the nation’s debt burden.

Who owns (or is owed) the U.S. debt? Most of it is owned domestically by individual banks and investors, the Federal Reserve, state and local governments, mutual funds, pension funds and insurance companies. However, 30 percent of the national debt is held by foreign governments and investors.

In a fiscal squeeze, the U.S. government will likely give precedence to its foreign creditors; otherwise, they would stop lending. The government can always impose new taxes on its own citizens and hope that inflation lessens the burden of the debt, even while it imposes heavy burdens on the citizenry.

Some argue that the debt isn’t such a big deal since the economy continues to grow right along with the debt. According to this argument, it’s the debt-to-GDP ratio that really matters. The U.S. economy will produce an estimated output of $21.5 trillion this year, while the debt is already above $22 trillion in February.

However, the U.S. doesn’t fund its annual budget with the entire economy -- it pays for it with tax revenues, which continually come up short. The nation must finance these annual shortfalls by selling Treasuries (that is, debt). Naturally, the country can’t pay down its debt with tax collections either. In essence, the government issues new debt just to pay off old debt, like borrowing from Peter to pay Paul. If the U.S. could finance its budget and deficits with its entire economy, we wouldn't be in this financial mess in the first place.
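
The difference between measuring the debt against GDP and measuring it against the revenue that actually services it is easy to see. A rough sketch follows; the roughly $3.3 trillion federal revenue figure is my own approximation for fiscal 2018, not a number cited above:

```python
# Compare the debt with what actually services it: tax revenue, not total GDP.
national_debt = 22.0e12      # from the post
gdp = 21.5e12                # estimated 2019 output, from the post
federal_revenue = 3.3e12     # approximate FY2018 federal tax revenue (my assumption)

print(f"Debt-to-GDP:     {national_debt / gdp:.0%}")               # ~102%
print(f"Debt-to-revenue: {national_debt / federal_revenue:.1f}x")  # ~6.7x
# Measured against the revenue that actually pays the bills, the debt
# looks far larger than the familiar debt-to-GDP ratio suggests.
```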

So, where is the reasonable concern in D.C. about this debt mountain? It’s nowhere to be found. While the GOP used to howl about the debt in past years, it has suddenly fallen silent.

The Daily Beast reported that when senior officials attempted to warn the president of the impending debt crisis in early 2017, Trump replied, “Yeah, but I won’t be here.”

That’s shirking his responsibility in a horribly negligent manner, but it merely echoes the sentiments of many elected representatives in Congress and the White House for the past few decades. Most have played a game of “kick the can,” passing along the fiscal responsibility -- and a looming crisis -- to some future group of officials.

Have no doubt, however, a debt crisis is speeding our way, like a freight train with no brakes.

Thursday, September 13, 2018

Is This Really the "Greatest Economy Ever"?



"This is an incredible time for our nation. We have the best economy in history. The stock market is at record highs. Unemployment is at historic lows. And more Americans are working today than ever, ever, ever before.” — Donald Trump, addressing a political rally in Billings, Montana, September 6

Such remarks are a recurring theme for Trump. All summer long, he has made some variation of this claim:

"We have the strongest economy in the history of our nation." -- Trump, in remarks to reporters, June 15

"We have the greatest economy in the history of our country." -- Trump, in an interview with Sean Hannity on Fox News, July 16

"We're having the best economy we've ever had in the history of our country." -- Trump, in a speech at a steel plant in Illinois, July 26

"This is the greatest economy that we've had in our history, the best." -- Trump, in a rally in Charleston, W.Va., Aug. 21

"You know, we have the best economy we've ever had, in the history of our country." -- Trump, in an interview on "Fox and Friends," Aug. 23

"It's said now that our economy is the strongest it's ever been in the history of our country, and you just have to take a look at the numbers." -- Trump, in remarks on a White House vlog, Aug. 24

“We have the best economy the country's ever had and it's getting better." -- Trump, in an interview with the Daily Caller, Sept. 3


Do these claims square with reality?

Last year (2017), the US economy expanded just 2.2 percent.

In the first quarter of 2018, gross domestic product (GDP) grew at a 2.2 percent rate. Economic growth in the second quarter came in at 4.2 percent. However, quarterly measures are mere snapshots of economic health, and they are volatile. Annual GDP measures are more illustrative and revealing.

Economists broadly expect growth to slow in the coming months, to round out the year at about 3 percent. Estimates for the current quarter range widely, from a forecast of 2 percent growth in the third quarter by the Federal Reserve Bank of New York to 4.6 percent from the Federal Reserve Bank of Atlanta.

Historically, the GDP growth rate in the US averaged 3.22 percent from 1947 through 2018. The all-time annual high of 18.9 percent came earlier, in 1942, during World War II.

Yet, over the last two decades, as in many other developed nations, our growth rates have been decreasing. In the 1950s and '60s, the average growth rate was above 4 percent. In the '70s and '80s, it dropped to around 3 percent. It has dropped even further in the 21st century.

Since 2001, GDP growth has reached at least 3 percent in just two years: 2004 (3.8 percent) and 2005 (3.4 percent). In every other year through 2017, growth failed to crack even 3 percent, a figure that was once considered customary.

So, if GDP growth cracks 3 percent this year, it would be reason to celebrate. Yet, even if it somehow reaches 4 percent, it would still be nowhere near the record. Look at these double-digit growth rates from previous years:

1934 - 10.8 percent
1936 - 12.9 percent
1941 - 17.7 percent
1942 - 18.9 percent
1943 - 17.0 percent

It makes sense that the economy benefited from the pent-up demand left over from the Great Depression and the massive output generated by World War II.

Since 1943, the US economy has never again experienced double-digit growth. However, there were some very robust years, nonetheless.

The best year for the U.S. economy since 1943 came in 1950, when the economy expanded by 8.7%.

Here are the years since 1943 that GDP growth registered at least 5 percent:

1944 - 8.0 percent
1950 - 8.7 percent
1951 - 8.0 percent
1955 - 7.1 percent
1959 - 6.9 percent
1962 - 6.1 percent
1964 - 5.8 percent
1965 - 6.5 percent
1966 - 6.6 percent
1972 - 5.3 percent
1973 - 5.6 percent
1976 - 5.4 percent
1978 - 5.5 percent
1984 - 7.2 percent

It may have been tedious reading all of those yearly GDP numbers, but I listed them to illustrate a point: very clearly and demonstrably, this is NOT “the greatest,” “the strongest” or “the best” economy in US history. Since 1934, there have been 19 years in which GDP growth significantly outpaced what we are experiencing in 2018.

Trump counts on his supporters naively and willingly believing everything that comes out of his mouth. However, his boastful claims are easily disproven.

How about unemployment?

The US unemployment rate held at 3.9 percent in August, slightly above the 3.8 percent low reached in May. However, those are monthly snapshots. What matters is consistency — the unemployment rate for the entire year.

The unemployment rate in 2017 was 4.1 percent. The lowest annual unemployment rate this century was 3.9 percent in 2000. So, the current unemployment rate isn't even the lowest of this century.

To smooth out the distortions from the Great Depression and WWII, let’s look at the years since 1950 in which the unemployment rate fell to 4 percent or less:

1951 - 3.1 percent
1952 - 2.7 percent
1965 - 4.0 percent
1966 - 3.8 percent
1967 - 3.8 percent
1968 - 3.4 percent
1969 - 3.5 percent
1999 - 4.0 percent
2000 - 3.9 percent

Once again, Trump is full of it. The unemployment rate is NOT at historic lows. It would be wonderful if the annual unemployment rate finishes the year at or below 4 percent, but it will not be historic or “the greatest” or “the strongest” or “the best” ever.

How about the suggestion that, “More Americans are working today than ever, ever, ever before”?

In 2017, about 125.97 million people were employed on a full-time basis in the US, up from 98.67 million full-time employees in 1990. That is an increase of roughly 27 million full-time workers over that span.

However, the US population has grown considerably since 1990 and this must be taken into account. Since there are more people, surely there should be more working people.

US Population by Year:

1990 - 249.6 million
2000 - 282.1 million
2010 - 309.3 million
2018 - 327.2 million

In the past 28 years, the population has grown by 77.6 million. The fact that there are more people working today is not the least bit surprising. In fact, it is entirely predictable.

However, the labor force participation rate in August was 62.7 percent, which tied the lowest reading this year. By comparison, it was 66.2 percent in January 2008, and it reached an all-time high of 67.3 percent in January 2000.

The Bureau of Labor Statistics defines the labor force participation rate as the share of the civilian noninstitutional population, ages 16 and over, that is either working or actively looking for work.

Labor Force Participation Rate by Year:

1990 - 66.5 percent
2000 - 67.1 percent
2010 - 64.7 percent
2017 - 62.9 percent

Though the number of working Americans has grown due to population growth, the participation rate continues an alarming tumble. Additionally, the number of persons employed part time for economic reasons (referred to as involuntary part-time workers) stood at 4.4 million in August.

Yes, the retirement of the Boomers affects the participation rate, but the Millennials are now the largest generation in the US labor force.

The president can try to spin this, but the reality is that 96.3 million working-age people were not in the labor force in August. Add in the 4.4 million people involuntarily working part-time jobs and we’ve got a big problem.
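
For what it's worth, that 96.3 million figure falls straight out of the participation rate. A minimal sketch, assuming a civilian noninstitutional population of roughly 258 million in August 2018 (that population figure is my assumption; the participation rate is the one cited above):

```python
# Derive "not in the labor force" from the participation rate:
#   labor force = participation rate * civilian noninstitutional population (ages 16+)
#   not in labor force = population - labor force
population_16_plus = 258.1e6   # approximate Aug 2018 civilian noninstitutional population (assumed)
participation_rate = 0.627     # August 2018 rate, from the post

labor_force = population_16_plus * participation_rate
not_in_labor_force = population_16_plus - labor_force
print(f"Labor force:        {labor_force / 1e6:.1f} million")         # ~161.8 million
print(f"Not in labor force: {not_in_labor_force / 1e6:.1f} million")  # ~96.3 million
```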

Lastly, let's examine the record-high stock market the president regularly brags about.

The Dow Jones Industrial Average closed at an all-time high of 26,616.71 on January 26, 2018. It is now trading at about 25,900, well off that high, but still robust, nonetheless.

The S&P 500 closed above 2,900 for the first time on August 29, 2018.

The NASDAQ Composite closed at a record high of 8,109.69 on August 29, 2018.

This is all great news… for those who are invested in the US stock market.

The reality is that a huge segment of Americans have little, if anything, in the market.

The Chicago Tribune put it this way:

Nearly half of the country has $0 invested in the market, according to the Federal Reserve and numerous surveys by groups such as Gallup and Bankrate. That means people have no money in pension funds, 401(k) retirement plans, IRAs, mutual funds or ETFs. They certainly don’t own individual stocks such as Facebook or Apple. Wall Street is not Main Street.

The rich are far more likely to own stocks than middle or working-class families. Eighty-nine percent of families with incomes over $100,000 have at least some money in the stock market, compared with just 21 percent of households earning $30,000 or less, a recent Gallup survey found.

Stock ownership before 2008 was 62 percent, Gallup found. Even after recent inflows, only 54 percent of Americans are invested now. More adults in the United States own homes than stocks.

People overseas seem to be benefitting, however. Foreign holdings of US securities rose to a record $18.4 trillion at the end of June, according to the Treasury.

It’s been said time and time again: the stock market is not the economy.

In truth, the stock market is not an accurate measure of the health and strength of the economy. The markets are simply a bet on the future performances of a select group of companies listed on a few stock exchanges.

Most American companies aren't even publicly traded. In fact, less than 1 percent of the 27 million businesses in the U.S. are publicly traded on the major exchanges.

Additionally, the number of public companies in the U.S. decreased by nearly 50 percent from 1996 to 2014, according to the National Bureau of Economic Research.

So, in reality, Wall Street is not a true reflection of how the average American worker, or the average family, is faring.

The following is an excerpt from the New York Times. Ask yourself if this sounds like a healthy stock market:

The US stock market is half the size of its mid-1990s peak, and 25 percent smaller than it was in 1976. In the mid-1990s, there were more than 8,000 publicly traded companies on exchanges in the United States. By 2016, there were only 3,627.

Profits are increasingly concentrated in the cluster of giants — with Apple at the forefront — that dominate the market. In 2015, for example, the top 200 companies by earnings accounted for all of the profits in the stock market. In aggregate, the remaining 3,281 publicly listed companies lost money.

Aside from the top 200 companies, the rest of the market, as a whole, is burning, not earning, money.

In sum, the economy is indeed healthier now than it was 10 years ago, during the throes of the Great Recession. But unless you take an exceptionally narrow view of the economy — treating the stock market as the be-all and end-all — it's clear that this economy is primarily serving corporations, not ordinary workers.

Corporate profits in the US are now at record highs. Profits increased in the second quarter by $47.3 billion, or 2.4 percent, to an all-time high of over $2.12 trillion, following an 8.2 percent jump in the previous quarter.

Meanwhile, median household net worth remains below where it stood in 1998, according to the Federal Reserve, even as households take on more debt than ever before.

Household debt grew for the 16th consecutive quarter in the April-to-June period, rising by 0.6 percent, or $82 billion, to $13.29 trillion, the New York Fed reported. Overall household debt is now 19.2 percent above the post-financial-crisis trough.

That’s not a sign of health. Excessive indebtedness spurred the last economic collapse and it is now even higher.

Rising prices have erased US workers’ meager wage gains. The cost of living was up 2.9 percent from July 2017 to July 2018, according to the Labor Department, outstripping a 2.7 percent increase in wages over the same period.
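
The net effect on purchasing power is small but negative, as a one-line calculation using those two figures shows:

```python
# Real wage growth is roughly nominal wage growth adjusted for inflation:
#   real_growth = (1 + nominal_growth) / (1 + inflation) - 1
nominal_wage_growth = 0.027   # 2.7 percent wage increase, July 2017 to July 2018
inflation = 0.029             # 2.9 percent rise in the cost of living over the same period

real_growth = (1 + nominal_wage_growth) / (1 + inflation) - 1
print(f"Real wage growth: {real_growth:.2%}")  # about -0.19 percent, i.e. a slight decline
```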

This occurred despite the fact that the US economy continues to grow. However, that growth just isn’t trickling down to workers. It's part of a 40-year trend. After adjusting for inflation, today’s average hourly wage has just about the same purchasing power it did in 1978.

In short, if you’re part of the top 1 percent, this economy probably seems great. Otherwise, you’re probably not buying all of Trump’s boasts about this being “the greatest,” “the strongest” or “the best” economy in US history.

That’s because it’s not.

Monday, September 10, 2018

Many Will Be Blind When the Next Recession Unfolds



According to the U.S. National Bureau of Economic Research (the official arbiter of U.S. recessions), the Great Recession began in December 2007 and ended in June 2009, a period of 18 months.

Yet, many Americans (economists and policy-makers included) didn’t realize that the economy was truly in trouble until the financial crisis unfolded in September of 2008. By that time, the recession had been underway for nine months.

That's not uncommon. Recessions have a tendency to be underway for a while before they are fully recognized. That's partly due to the way a recession is defined.

The technical definition of a recession is two consecutive quarters of contracting gross domestic product, which is often referred to as negative growth (an oxymoron).
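
As a quick illustration of that two-consecutive-quarters rule, here is a minimal sketch using made-up quarterly growth numbers (not actual GDP data):

```python
# Flag a "technical recession": two consecutive quarters of negative GDP growth.
def technical_recession_quarters(quarterly_growth):
    """Return the indices of quarters that complete two straight negative quarters."""
    flagged = []
    for i in range(1, len(quarterly_growth)):
        if quarterly_growth[i] < 0 and quarterly_growth[i - 1] < 0:
            flagged.append(i)
    return flagged

# Hypothetical quarterly growth rates (percent, annualized), for illustration only
growth = [2.1, 0.5, -0.8, -1.4, 0.3, 1.9]
print(technical_recession_quarters(growth))  # [3] -> the fourth quarter completes the rule
```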

However, this does not necessarily need to occur for the National Bureau of Economic Research to call a recession.

According to the NBER, "a recession is a significant decline in economic activity spread across the economy, lasting more than a few months, normally visible in real GDP, real income, employment, industrial production, and wholesale-retail sales."

One primary indicator of recession is a rising unemployment rate. However, unemployment is a lagging indicator of a recession. Employment may remain elevated months after a recession has begun, as it did after the Great Recession began. Corporate profits are another lagging indicator. And, since it takes a month for the first estimate to be made and another month for the second estimate, GDP is itself a lagging indicator.

A recession typically lasts from six to 18 months, according to Investopedia. Economists say there have been 33 recessions in the United States since 1854.

Since 1960, the U.S. has gone through eight recessions -- an average of one or two per decade. To be specific: one in the 1960s, two in the 1970s, two in the 1980s, one in the 1990s and two in the 2000s.

The fact that the U.S. hasn't undergone a recession in nine years is kind of odd, at least by historical measures.

The period from March 1991 to March 2001, a 120-month stretch, was the longest period of economic expansion in U.S. history. The current expansion has been underway since June 2009, a span of 110 months.

This expansion will become the longest on record in July 2019, based on National Bureau of Economic Research figures that go back to the 1850s.

Simply put, this expansion is getting long in the tooth. Expansions don’t last indefinitely and the longer this one continues, the closer we are to the next inevitable recession.

When it arrives, don’t be surprised if the media, policy makers, economists and some segments of the public don’t fully realize it for a while.

Friday, August 31, 2018

Interest Rates, Supply, Demand and Home Prices



Though the Federal Reserve doesn’t actually set mortgage rates, it determines the federal funds rate, which generally affects short-term and variable (adjustable) interest rates.

The federal funds rate is, perhaps, the most important benchmark in financial markets. This interest rate is the Fed's primary policy tool for influencing how freely money and credit flow through the U.S. economy.

The funds rate is the interest rate banks charge each other for overnight loans. When the federal funds rate increases, it becomes more expensive for banks to borrow from other banks. Those higher costs are typically passed on to consumers in the form of higher interest rates on lines of credit, auto loans and, to some extent, mortgages.

The funds rate was raised to an all-time high of 20 percent in 1980 and 1981 to combat double-digit inflation.

In response, the average 30-year fixed-mortgage rate in 1981 reached an all-time high of 16.63 percent. Consider that for a moment; such an extraordinary rate was paid by some of our parents back then, which is unimaginable today.

Mortgage rates continually fell over the ensuing years, finally dropping into the single digits in 1991, and finished the decade at around 7 percent. The downward trajectory continued in the 2000s, when rates dipped as low as 5 percent, which of course helped fuel the housing bubble and the resulting Great Recession.

The Federal Reserve cut the funds rate to an all-time low of 0-0.25 percent on December 17, 2008, in response to the financial crisis. Before that, the lowest the funds rate had ever been was 1.0 percent, set in 2003 to counter the lingering effects of the 2001 recession.

The Fed didn't raise the funds rate until December 2015 -- an extraordinary span of seven years. The central bank subsequently raised the funds rate once in 2016 and three times in 2017. The Fed has, so far, raised its benchmark rate twice in 2018 (March and June), and noted that two more rate hikes are appropriate this year.

At its June meeting, the Federal Open Market Committee set the funds rate to a range of 1.75-2.00 percent. As of August 30, the federal funds rate was 1.92 percent.

So, how has the series of rate hikes since 2015 affected mortgage rates?

In 2012, the average 30-year fixed-mortgage rate was 3.66 percent, the lowest annual average on record, according to Freddie Mac.

In 2015, the annual average was 3.85 percent. In 2016, it was 3.65 percent, and in 2017 it was 3.99 percent. Through July 2018, the 30-year fixed-mortgage rate averaged 4.42 percent. As of August 31, it is 4.53 percent.

The trend is clear: mortgage rates are steadily rising, right along with the funds rate.

A little history, according to Freddie Mac:

In 2008, the average 30-year fixed-mortgage rate was 6.03 percent.

In 2000, the average 30-year fixed-mortgage rate was 8.05 percent.

In 1990, the average 30-year fixed-mortgage rate was 10.13 percent.

In 1981, the average 30-year fixed-mortgage rate reached an all-time high of 16.63 percent.

Though mortgage rates are still historically low, they have been steadily rising since 2016, after the Fed began raising the funds rate.

Historically, mortgage rates can affect home prices, with an inverse relationship. As rates rise, prices may fall. Conversely, as rates fall, prices may rise. Low interest rates make money cheap. The lower cost of borrowing allows people to purchase more expensive homes. Realtors (and, consequently, home sellers) understand this and jack up prices to meet the capacity of the market.
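
The mechanism is easy to see with the standard fixed-rate mortgage payment formula. A minimal sketch follows; the $300,000 loan amount is just an illustrative assumption, and the rates are the averages cited in this post:

```python
# Monthly payment on a fixed-rate mortgage:
#   M = P * r * (1 + r)**n / ((1 + r)**n - 1)
# where P is the principal, r the monthly rate, and n the number of payments.
def monthly_payment(principal, annual_rate, years=30):
    r = annual_rate / 12
    n = years * 12
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

principal = 300_000  # illustrative loan amount (assumed)
for rate in (0.0366, 0.0453, 0.0805):   # the 2012 low, August 2018, and the 2000 average
    print(f"{rate:.2%}: ${monthly_payment(principal, rate):,.0f} per month")
# Roughly $1,374 at 3.66 percent, $1,525 at 4.53 percent, and $2,212 at 8.05 percent --
# the higher the rate, the less house a given monthly budget can buy.
```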

Though incremental changes to mortgage rates may not affect home prices very much, rising rates over a longer period can result in weaker demand. Less competition for homes can, in turn, lower prices or, at the least, moderate price growth.

Of course, inventory affects prices (supply and demand) and the market has been tight in recent years. Simply put, there have been more buyers than sellers, so demand has outstripped supply. Consequently, after declining for a few years in the aftermath of the crash, home prices have climbed back above the levels they were before the bubble began to burst in 2006.

Interestingly, mortgage rates and home prices have been rising in tandem over the past couple of years. Yet, rates are still remarkably low by historical standards. Remember, back in 2000, the average 30-year fixed-mortgage rate was 8.05 percent. In essence, the present rate would have to nearly double to climb back above that figure.

So, incrementally rising rates may not be enough to push down home prices. For that to occur, a lot more supply will have to come onto the market, something that doesn’t appear likely any time soon.

Friday, May 11, 2018

The US is Nowhere Near Full Employment



The US unemployment rate fell to a 17-year low of 3.9 percent in April.

While, on its face, that seems like good news, the unemployment rate has been falling for years because out-of-work Americans have exited the labor force.

Unfortunately, 236,000 people left the labor force in April, adding to the 158,000 who quit in March. The labor force participation rate, or the proportion of working-age Americans who have a job or are looking for one, fell to 62.8 percent last month from 62.9 percent in March. It was the second straight monthly drop in the participation rate.

There is something deeply wrong with, and misleading about, an unemployment reading that improves because people stop looking for work. That's on top of the fact that it's absurd to count people who aren't even working as part of the labor force simply because they are looking for a job.

The so-called U-6 unemployment rate -- which is a broader measure of unemployment because it includes people who want to work but have given up searching and those working part-time because they cannot find full-time employment -- dropped to 7.8 percent last month, the lowest level since July 2001, from 8.0 percent in March.

Yet, that's twice the official U-3 unemployment rate, which the government prefers to reference, for obvious reasons.

The labor force participation rate reached an all-time high of 67.3 percent in January 2000 and a record low of 58.1 percent in December 1954.

The participation rate rose for many years as women entered the labor force en masse beginning in the 1960s, greatly enlarging the pool of working Americans.

The labor force participation rate has averaged 62.99 percent since 1950, which is almost exactly where it is now. Yet, there are a lot more women who want or need to work today, which is why that current number is so troubling.

Some Americans don't want or need to work, but millions of people who want jobs can't find one. A total of 6.4 million Americans remain unemployed, yet it's even worse than that.

A whopping 5 million people were employed part time for economic reasons in April. These are sometimes referred to as involuntary part-time workers. These individuals, who would have preferred full-time employment, were working part time because their hours had been reduced or because they were unable to find full-time jobs, according to the US Bureau of Labor Statistics.

Additionally, 1.4 million people were marginally attached to the labor force. This means they "wanted and were available for work, and had looked for a job sometime in the prior 12 months." They were not counted as unemployed because they had not searched for work in the four weeks preceding the survey.

No one who wants to be taken seriously should be uttering the term "full employment" right now. That's disingenuous and insincere, at best. At worst, it's just plain misleading.

More job training would help and a public/private partnership could get it done. Over and over, it's been widely reported that employers can't find skilled workers. The Federal Reserve backs that contention, saying that there are labor shortages all over the country.

We need a program to close the skills gap and get workers prepared for the jobs of today.

That would require political will and consensus, little of which exists in the nation's capital.