Technically Speaking, October 2014

LETTER FROM THE EDITOR

We have a variety of articles for you to consider this month. We start with a look back. Many of us forget that trading software is relatively new, and in the first article, one of the pioneers in the development of trading software, Louis Mendelsohn, provides insights into the evolution of software and technical analysis. We are also reprinting some of Lou’s work from the 1990s, which details problems the financial industry still faces today.

We then look at volatility using articles that rely on more than the common VIX indicator, including the thoughts of three Federal Reserve economists. Other articles provide insights into the state of the markets and work being done by MTA members and chapters around the world.

MTA member Stella Osoba, CMT, published “Women on Wall Street” on Traders Planet. The article provides valuable career advice, highlighting ten tips both for individuals breaking into the field and for those looking to advance within their current positions. You can read the whole article at http://go.mta.org/12184

After reading that article, please let us know if you think it would be beneficial to include similar content in Technically Speaking. You can always reach us at editor@mta.org.

Sincerely,
Michael Carr

What's Inside...

THE EVOLUTION OF TECHNICAL ANALYSIS: THE CATALYZING FORCE

Evolution is about change… adaptation… survival. For a thing to evolve, though, it needs a catalyst – an environmental shift...

Read More

BROADENING THE SCOPE OF TECHNICAL ANALYSIS: THE IMPORTANCE OF AN INTERMARKET PERSPECTIVE

Editor’s note: this was originally published in the March 2001 issue of Technically Speaking.

With the proliferation of microcomputers and...

Read More

VOLATILITY IN PERSPECTIVE

Editor’s note: This was originally published by Crestmont Research (www.CrestmontResearch.com) and is reprinted with permission.

Who or what is rocking...

Read More

WHAT CAN WE LEARN FROM PRIOR PERIODS OF LOW VOLATILITY

Editor’s note: This was originally posted at the New York Federal Reserve’s Liberty Street Economics, a site that “features insight...

Read More

CONTRARY OPINION FORUM

Editor’s note: Susan recently attended the Contrary Opinion Forum, an annual meeting sponsored by Fraser Advisors that “centers around the...

Read More

SUMMARY OF MTA SINGAPORE CHAPTER MEETING

The MTA Singapore Chapter Meeting was honored to host Alex Siew, Dr Sun, and Caner Seren Varol, CMT, CFA, ERP...

Read More

INTERVIEW WITH GREGORY HARMON, CMT, CFA

How would you describe your job?

I don’t know that I really have a job in the sense that many...

Read More

SUMMARY OF THE CHICAGO MTA CHAPTER EVENT

Ralph Acampora is an excellent lecturer and it was no surprise that he drew a large crowd to the Chicago...

Read More

REMEMBERING 1995 DOW AWARD WINNER WILLIAM X. SCHEINMAN

William X. Scheinman received the Charles H. Dow Award in 1995 for his paper, Information, Time and Risk. In his...

Read More

U.S. SECTOR ETF PORTFOLIO

Editor’s note: This was originally published on September 29, 2014 at WWW.JBECKINVESTMENTS.COM and is reprinted with permission.

ETF selection has...

Read More

WINANS LEGACY STOCK INDEX (WILSI)

The Winans Legacy Stock Index (WILSI) is an unweighted composite of...

Read More

CHART OF THE MONTH

Other Winans Indexes include the Winans Real Estate Index™ (WIREI) and the Winans Preferred Stock Index® (WIPSI). The data is...

Read More

THE EVOLUTION OF TECHNICAL ANALYSIS: THE CATALYZING FORCE

Evolution is about change… adaptation… survival. For a thing to evolve, though, it needs a catalyst – an environmental shift of a substantial magnitude or a subtle mutation – that forces change to occur.

In the late 1970s, the magnitude of the newly introduced personal computer (PC) caused an environmental shift that forced change upon the world. Adapt or perish was the new reality for everyone. In the financial sector, this reality was equally true, except that a not-so-subtle mutation in technical analysis made adapting even more imperative. If those who traded markets did not adapt, they perished.

Recently, a number of trading publications have featured articles highlighting the evolution of technical analysis since the advent of personal computers nearly forty years ago. As history tells us, technical analysis evolved dramatically with the introduction of PC-based trading software and then again later as the global economy emerged and the world’s financial markets became increasingly interconnected.

One commodities trader turned trading-software developer has been on the leading edge of the evolution of technical analysis – Louis Mendelsohn. Working full time in the mid-to-late 1970s as a hospital administrator while trading commodity futures contracts on the side, Mendelsohn fostered a new age in both technical analysis and trading software. His innovative work with PC-based trading software catalyzed the evolution of technical analysis, which transformed from hand-drawn charts based on indicators computed with hand-held calculators (and software that merely automated those calculations) to analysis performed on powerful personal computers.

Prior to the use of trading software, charts were maintained by hand, as shown in the nearby example: a hand-drawn chart from Mendelsohn’s archives, circa 1979, tracking U.S. Treasury bill futures from the days before computerized trading software.

Mendelsohn had a vision – he was determined to squeeze all the power from personal computers to produce technical analysis previously only available to traders at big trading firms with large research staffs and mainframe computers at their disposal. In 1979, at 31, he started Market Technologies to develop PC-based trading software for his own use and to license it to other like-minded commodities traders. Mendelsohn understood that the PC coupled with powerful analytic software would become the future for traders.

Mendelsohn unveiled his first new tool for traders in 1983, ProfitTaker Futures Trading Software. With that software, he introduced strategy back-testing for personal computers, an innovative and unprecedented approach to analyzing potential trades.

Other futures traders began to follow his approach, which he presented in a series of articles on trading software published in Futures magazine in 1983. He also presented his visionary ideas as a speaker and expert on technical analysis in panel discussions at international trading conferences. His expertise expanded and his influence spread. Hand-drawn charts and hand-held calculators for technical analysis would soon join Tyrannosaurus Rex as relics of the past because of Mendelsohn’s vision. PC-based trading software would transform from mere automation of functions into a truly powerful tool. Mendelsohn worked with industry leaders to build a fully functional software package.

Figure 1: A letter to Mendelsohn from Microsoft, written on a typewriter. The Microsoft registered trademark is circled with blue ink.  As Lou notes, “That’s right – no word processing at Microsoft in 1982.  There are also typos in the letter which were fixed with ‘correcto-type’”

His strategy back-testing software, the first such commercially available software to offer this incredibly powerful capability at the PC level, furthered the evolution of technical analysis and energized the growth of what would soon become a multi-million dollar trading-software industry, an industry built in large part on Mendelsohn’s pioneering innovations.

During the incubation period for PC-based trading software, while still a cottage industry of individual developers, early customers of Market Technologies became trading software developers themselves and created their own software, incorporating strategy back-testing patterned after ProfitTaker.

Mendelsohn’s genius was about creating a new paradigm for traders in a world that would rapidly evolve – a world the same in that moment, yet on the verge of unprecedented new possibilities. Like most forward thinkers who are doers, Mendelsohn acted in the moment of his time because he saw the possibilities. And, like most visionary entrepreneurs, he didn’t just stumble upon back-testing strategies with ProfitTaker. He worked through a process that had steps, a process that created other important pieces along the way that would add real value to the trading tool chests of traders around the world.

Those pieces include other significant contributions to the world of computerized technical analysis and more stages in the evolution of the trading-software industry. Many of those contributions were incorporated into ProfitTaker in 1983, including the capability to perform rollover testing on actual expiring futures contracts, testing for lock-limit conditions that would prohibit a trade from being taken, forecasting closing prices a day in advance so that traders could execute trades just prior to the close rather than having to wait until the next morning’s open, quantifying the impact of execution timing on trading system performance, and displaying comprehensive history tester performance results.

Below are reproductions of handwritten notes that were used in the development process. The first two pages show Mendelsohn’s handwritten notes on how to program software to test actual futures contracts with rollovers, circa 1980 – 1981.

The next three figures are handwritten notes on the design, layout, and content of various reports to include in a history tester, circa 1980 – 1981.

Again, as it is with most proactive visionaries, the technological limitations of his time did not impede his forward movement. Hardware and software were evolving, and as they did, Mendelsohn was right there and aware enough to take advantage of Moore’s Law, the technological maxim that the number of transistors on a chip – and with it overall processing power – doubles roughly every two years.

In the mid-to-late 1980s, as the trading-software industry matured, what would soon be referred to as the global economy also began to take shape.

In those early days of the PC genesis and trading-software development, Mendelsohn anticipated another aspect of the burgeoning technology boom. His vision now included an emerging global economy driven by advancements in global telecommunications, including the then yet-to-be-understood Internet technology of the 1980s. This evolutionary step would open the door to high-speed global data transmission and globally interconnected economies and financial markets, which in turn would necessitate a new way to analyze markets – intermarket analysis, a revolutionary new take on market analysis.

To Mendelsohn, it no longer made sense to look at a single market in isolation, since in the rapidly evolving financial world, one market would influence another instantaneously, and that market in turn would influence yet another in that moment, and so on. It was no longer productive to calculate and chart linear ratios or differences in the price of just two markets.

In this light, he shifted his research efforts away from the single-market analysis approach that had been the mainstay of technical analysis for the past century and turned it toward intermarket analysis.

Mendelsohn’s research over the next few years proved effective in addressing the globalization of the markets and the need to employ intermarket analysis. In 1991, Market Technologies released its second-generation, intermarket analysis software, VantagePoint Intermarket Analysis Software. This software, unlike its less robust predecessor, applied the nonlinear, pattern-recognition and forecasting capabilities of neural networks to intermarket analysis, marking yet another significant step in the evolution of PC-based technical analysis.

Mendelsohn’s introduction of VantagePoint signaled a leapfrog evolutionary moment in the trading world. VantagePoint demonstrated that software could take advantage of the interconnection between global events and their effects on markets and how those ripples would affect technical analysis.

His approach to intermarket analysis gave him the ability to produce highly accurate, short-term forecasts of market direction for each target market under study, based upon both its own behavior and that of other related markets found to have the greatest impact on each target market.

Recognizing his pioneering work on intermarket analysis, the financial press asked him to present, in numerous editorial articles, his ideas about the potential risks of globalization and the possible adverse effects on the financial markets under strained global economic conditions.

His editorials in the Journal of Commerce in February 1990, entitled “Build a Global Safety Net” (reproduced below), and in Futures in April 1990, entitled “24-hour trading: Let’s do it right,” were prescient in identifying a number of systemic risks and proposing possible solutions that regulators, exchanges, clearing houses, and central banks could implement to avoid a future global financial meltdown.

In two editorial pieces for Technically Speaking, the Market Technicians Association newsletter (“It’s Time to Rethink the Role of Technical Analyst” in September 1991 and “Broadening the Scope of Technical Analysis: The Importance of an Intermarket Perspective” in March 2001), Mendelsohn called for a broadened definition of technical analysis to include intermarket approaches that could identify and quantify the effects of related global markets on each market traded. The first of these editorials is reprinted below. The second is included elsewhere in this issue.

In the 23 years since the first release of VantagePoint, Mendelsohn and his research team at Market Technologies have continued to develop and perfect their proprietary mathematical processes and algorithms, which now apply intermarket analysis to hundreds of global markets each day. These include commodities, ETFs, stocks, Treasuries, indices, currencies, and metals.

In 2013, the United States Patent and Trademark Office recognized Mendelsohn’s novel work, granting him two patents for his invention of technologies that combine intermarket analysis with neural networks to create leading technical indicators which forecast global markets (with an accuracy rate above 80%). His work goes on.

Over the past several years, Market Technologies’ R&D team has been developing its next-generation trading software platform, TradeShark. Currently available on a limited basis to select VantagePoint power-users, TradeShark applies Mendelsohn’s patented algorithms and processes that integrate global intermarket analysis with trend forecasting.

Throughout this evolution in technical analysis, Mendelsohn has succeeded at tailoring his unique analytic methodologies to meet the challenges of the financial markets of the 21st century. In today’s global era, traders must look beyond single-market analysis and trend following or risk extinction.

One man, one moment, sometimes that’s all it takes to spark massive transformation, and, sometimes, that one man in that one moment can disrupt the status quo. This is what Mendelsohn has done over and over again for the last forty years. He catalyzed the evolution of the trading software industry, an industry so critical today that no one would trade without utilizing analytical trading software. Neither would he or she look at any market in isolation and make trading decisions without at least paying attention to what’s happening in closely related markets.

His influence on the financial world has been featured in articles in Barron’s, Futures, Technical Analysis of Stocks & Commodities, Investor’s Business Daily, the Wall Street Journal, Technically Speaking, and on CNN, Bloomberg Television, and CNBC. He has written two books on this subject, collaborated on more than half a dozen books on technical analysis, and has published dozens of articles and editorials on technical analysis and the effects of globalization on the financial markets.

Mendelsohn, a Market Technicians Association member since 1988, introduced so many mutations into technical analysis and the trading-software industry that it is fair to say he is not only a trading software pioneer but, more impressively, has remained a driving force in this arena for four decades. His innovations and achievements in technical analysis and prodigious contributions to the evolution of computerized technical analysis continue today.

Now in his late sixties, Louis Mendelsohn, married 38 years, with three grown sons and four grandchildren, is still active as the CEO of Market Technologies. Although his firm’s day-to-day operations are overseen by other executives (including his oldest son, Lane), he still provides his visionary guidance to help create the next generation of trading software – software relying on apps in the cloud or next-generation computers perhaps, or on something as yet unseen by most people, save for people such as Louis Mendelsohn who will see it and then help foster the evolution process.

Evolution is what happens when an intelligent, motivated, and analytical person discovers his true passion early in life and then devotes the rest of it to pursuing his dream and staying at the forefront of innovations in that field.

Thankfully, Mendelsohn (with his wife, Illyce’s, encouragement and support) made the decision to leave the world of hospital administration to devote all of his energies and efforts to technical analysis and trading software development – at a time when very few traders even owned a personal computer or would have known what to do with it. Hospital administration’s loss is clearly a big win for technical analysis and traders all over the world.

Contributor(s)

Brandon Jones

Brandon Jones is an active trader and writer. He has worked with Lou Mendelsohn’s innovations for many years.

BROADENING THE SCOPE OF TECHNICAL ANALYSIS: THE IMPORTANCE OF AN INTERMARKET PERSPECTIVE

Editor’s note: this was originally published in the March 2001 issue of Technically Speaking.

With the proliferation of microcomputers and trading software over the past two decades, there has been a surge of interest by futures and equities traders in applying technical analysis to their trading decisions. Concurrently, a transformation has been underway, due in part to advancements in global telecommunications and information technologies, in which the financial markets have become increasingly globally interconnected and interdependent.

Despite the globalization of the financial markets, technical analysis is still directed primarily at analyzing each individual market by itself. This is done utilizing various technical indicators, many of which have undergone little, if any, change in their construction since first being applied by technicians decades ago. These include subjective charting analysis techniques such as head and shoulders, flags, pennants, and triangles, which attempt to find repetitive patterns in single-market data thought to be useful for market forecasting, and objective trend-following indicators such as moving averages, which due to their mathematical construction tend to lag behind market action.

These recent structural changes in the financial markets call into question the efficacy of trading strategies that rely solely upon single-market technical analysis methods and indicators to examine the price movements of individual markets. Now it is imperative to take external effects of related markets into consideration as well. This realization has brought about the emergence of an approach to market analysis, called intermarket analysis, which I have been involved in developing since the mid-1980s.

While most traders today will readily acknowledge that the world’s financial markets have become interconnected and influence each other, these same traders will just as quickly admit that they still do not perform intermarket analysis.  Instead, they continue to focus on one market at a time, while paying scant attention, if any, to what’s occurring in related markets. For instance, a QQQ equities trader or a Nasdaq-100 Index futures trader might keep a cursory eye on one or two related markets, such as Treasury bonds, the Nasdaq Composite, and maybe even crude oil. Typically, this is done by glancing over at price charts of these related markets. Intermarket analysis conducted at this rudimentary level is not amenable to rigorous evaluation or historical testing. While better than not performing intermarket analysis at all, this minimal effort still limits traders’ perceptions of what is really happening – and more importantly what is about to happen – in the markets that they are trading. No wonder there is a revolving door of traders who dabble with technical analysis for a while, only to fail at trading.

As the financial markets become increasingly more complicated over the next few years, with the ongoing melding of futures and equities, both domestically and internationally, and the inauguration of futures contracts on individual stocks, traders who continue to restrict their analysis to a single market’s past prices (or rely solely upon subjective chart pattern analysis or trend-following lagging indicators) for clues regarding an individual market’s future trend direction, will find themselves at a severe disadvantage. Since the stated purpose of technical analysis is to identify market trends and forecast (or at least extrapolate) their likely future direction, it stands to reason that traders could more easily attain this goal through the use of leading indicators that quantify the simultaneous linkages between markets and their effects on the traded market.

Today’s financial markets are an intensely competitive arena and, in the case of futures markets, a zero-sum game. In this battlefield-like environment, predictive intermarket analysis tools – which expand the scope of analysis beyond that of a single market – demand serious attention from technical analysts and traders. I am not suggesting, however, that traders should quit performing single-market analysis altogether or abandon the use of popular technical indicators such as moving averages, which have been the mainstay of technical analysis for decades.

Many widely-used single-market indicators can be adapted to today’s global markets. There is no need to throw the baby out with the bath water. Intermarket analysis is not the elusive Holy Grail of technical analysis. The Holy Grail does not exist. Intermarket analysis is simply another facet of technical analysis, and should be implemented in conjunction with single-market analysis. In this way, marginal trades, which are otherwise indiscernible using only a single-market approach, can be identified and thereby avoided, while potentially outstanding trades can be seized upon early in their formation, with greater confidence.

Intermarket analysis brings an added dimension to the analytical framework so that the behavior of each individual market can be examined from without as well as from within. Intermarket analysis is a natural extension of single-market analysis, thereby broadening the definition of technical analysis. This evolution is necessary, given the complexity of today’s global financial markets.

Yet, it is quite challenging for individual traders to perform intermarket analysis beyond simply “eyeballing” the charts of two or three related markets. Relatively simple quantitative methods have been developed by technical analysts in the past to measure the effects of related markets on a given market. One such approach, widely used by futures traders, performs a “spread analysis” on two markets to measure the degree to which their prices move in relation to one another.  This is accomplished by calculating and comparing the ratio of or difference between the prices over time.
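As a point of reference, such a two-market spread analysis amounts to only a line or two of code. The sketch below uses synthetic price series and illustrative names:

```python
# Toy sketch of two-market "spread analysis": the ratio of, and difference
# between, two price series over time. Data here is synthetic.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
market_a = pd.Series(100 + np.cumsum(rng.normal(0, 1, 250)))
market_b = pd.Series(95 + np.cumsum(rng.normal(0, 1, 250)))

spread_ratio = market_a / market_b   # ratio of the two prices
spread_diff = market_a - market_b    # difference between the two prices
print(spread_ratio.tail(3), spread_diff.tail(3), sep="\n")
```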

As the number of related markets to be taken into consideration increases, the ineffectiveness of such approaches to analyzing intermarket relationships for trend identification and market forecasting becomes apparent. These methods are limited to price comparisons of only two markets at a time, assume that the effects of one market on another occur without any leads or lags, and that the relationships are linear.

These assumptions are not borne out by how today’s global financial markets actually behave. Intermarket linkages between markets are neither fixed nor linear. They are dynamic, and have varying strengths, as well as varying leads and lags to one another that shift over time. Therefore, in order to perform intermarket analysis effectively, more robust mathematical tools need to be employed. One such tool, which I have worked with for more than a decade and found to be very effective, is neural networks.

Neural networks can find reoccurring patterns and relationships within both intra-market and inter-market data. Through a process known as “training”, neural networks can be designed to make highly accurate market forecasts of the trend direction of various financial markets. For instance, in order to forecast the short term trend direction of the Nasdaq-100 Index, neural networks can be trained on past market data on the Nasdaq-100 Index itself (including open, high, low, close, volume and open interest), in addition to inter-market data from any number of related markets. These related markets might include the Dow Jones Industrial Average, 30-year Treasury bonds, S&P 500 Index, U.S. Dollar Index, S&P 100, New York Stock Exchange Composite Index, Bridge/CRB Index, Dow Jones Utility Average, and New York light crude oil.

The trend forecasts made by neural networks through their pattern recognition capabilities often forewarn of impending changes in market direction before they would even show up on conventional price charts or could be identified through the use of single-market trend following indicators. One innovative way in which I have been able to amalgamate intermarket analysis with single-market analysis has been to use both intra-market and inter-market data as inputs into neural networks which are then trained to make forecasts of moving averages.
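The proprietary models themselves are not described here in enough detail to reproduce, but the general shape of the idea can be sketched. The example below, a minimal illustration only, trains a small neural network on synthetic intra-market and inter-market price histories to forecast a five-day simple moving average two days ahead; every input series, parameter, and architecture choice is an invented stand-in.

```python
# Minimal illustrative sketch only -- not the proprietary method described
# in the text. Train a small neural network on intra-market and inter-market
# inputs to forecast a 5-day simple moving average 2 days into the future.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n = 1000

# Synthetic stand-ins for a target market and two related markets.
target = np.cumsum(rng.normal(0, 1, n)) + 100
related1 = target + rng.normal(0, 2, n)          # a tightly correlated market
related2 = np.cumsum(rng.normal(0, 1, n)) + 50   # a loosely related market

LOOKBACK, MA_LEN, HORIZON = 10, 5, 2
X, y = [], []
for t in range(LOOKBACK, n - HORIZON):
    # Inputs: recent prices of the target market plus the related markets.
    X.append(np.concatenate([target[t - LOOKBACK:t],
                             related1[t - LOOKBACK:t],
                             related2[t - LOOKBACK:t]]))
    # Label: the 5-day simple moving average as of HORIZON days ahead.
    y.append(target[t + HORIZON - MA_LEN + 1:t + HORIZON + 1].mean())

X, y = np.array(X), np.array(y)
split = int(0.8 * len(X))                        # simple walk-forward split
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])
print("out-of-sample R^2:", round(model.score(X[split:], y[split:]), 3))
```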

Moving averages have long been recognized by traders and technical analysts as an important quantitative trend identification tool. While the “lag effect” inherent in the mathematical construction of moving averages has continued to challenge technical analysts and market researchers, moving averages are still extensively relied upon by technicians to gauge current market behavior and discern future market direction. If this shortcoming could be eliminated, moving averages could rank as perhaps the most effective trend forecasting tool in the technical analyst’s arsenal. 

For traders it is critical to know what the market direction is expected to be in the immediate future, since profitable trading decisions are predicated on these expectations being correct more often than not. Unlike yesteryear, it is no longer good enough to find out that a market made a top or a bottom two or three days ago. In today’s highly volatile markets even a one day lag can be the difference between profits and losses.

By contrast, if it were possible to forecast accurately, for example, a five-day simple moving average of closes for two days in the future, then the lag effect would be eliminated from a practical trading standpoint. Changes in the trend direction of a market could be identified just before or at the time of their occurrence, not days after the fact.

One approach that I have found highly successful incorporates forecasted moving averages into predictive moving average crossover trading strategies. In this design, the value of a predicted moving average, based upon both intra-market and inter-market data inputs into the training of neural networks, is compared mathematically with the value of a calculated moving average, which is based strictly on past single-market prices. The resulting metric indicates the expected market trend direction. When the predicted moving average value for a future date is greater than today’s calculated moving average value, the market can be expected to move higher over that time frame. Similarly, when the predicted moving average value for a future date is less than today’s calculated moving average value, the market is likely to move lower over that time frame.

For instance, a predicted five-day simple moving average for two days from today can be compared to today’s calculated five-day simple moving average. If the predicted average is greater than today’s calculated average, this indicates that the market is likely to move higher over the next two days. The difference between the two moving average values from one day to the next measures the strength of the anticipated move.

Another intriguing application of predicted moving averages is to compare one to another. For example, a predicted five-day moving average for two days from today can be compared to a predicted ten-day moving average for four days from today. When the shorter predicted average is above the longer predicted average (and both are above their respective calculated averages), this is a strong indication of near-term upward movement.
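The comparison logic of the last few paragraphs is simple once the forecasts exist. Here is a minimal sketch; the predicted values would come from a trained forecasting model (such as the neural network sketched earlier), and all names and numbers are illustrative.

```python
# Sketch of the predictive crossover comparisons described above. The
# predicted moving-average values are assumed to come from a forecasting
# model; here they are just function arguments.
def expected_direction(predicted_ma_future: float,
                       calculated_ma_today: float) -> str:
    """Compare a forecasted moving average for a future date against
    today's calculated moving average of actual prices."""
    if predicted_ma_future > calculated_ma_today:
        return "up"    # market expected to move higher over the horizon
    if predicted_ma_future < calculated_ma_today:
        return "down"  # market expected to move lower over the horizon
    return "flat"

def dual_predicted_crossover(pred_short: float, pred_long: float,
                             calc_short: float, calc_long: float) -> bool:
    """True when a predicted short average is above a predicted long average
    and both are above their respective calculated averages -- the strong
    near-term bullish configuration described in the text."""
    return (pred_short > pred_long
            and pred_short > calc_short
            and pred_long > calc_long)

# Example: predicted 5-day SMA two days out vs. today's calculated 5-day SMA.
print(expected_direction(1021.4, 1017.8))  # -> "up"
```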

Predictive moving average crossover strategies can be devised to indicate when to enter and exit a position, where to place market or limit orders, and at what price to set trailing stops. Further research on various lengths and types of forecasted moving averages, as well as on the application of optimization techniques to forecasted moving averages, should be conducted.

As the world’s financial markets become increasingly intertwined, intermarket analysis will play a more critical role within the overall field of technical analysis in the 21st century, just as back-testing and optimization of single-market trading strategies became integral to computerized technical analysis in the late 20th century. In fact, I can envision the definition of technical analysis broadening even further, in which technical, intermarket and yes, even fundamental data are brought together within one unified (quantitative) analytical framework.

I have previously referred to this paradigm as “synergistic market analysis.” This concept expands upon an earlier article of mine published in the September 1991 issue of the MTA Newsletter entitled “It’s Time to Rethink the Role of Technical Analyst.” Given the unprecedented transformation of the global financial markets presently underway, open-mindedness on the part of technical analysts toward accepting a re-definition of technical analysis along these lines is warranted.

Obviously, trend identification and market forecasting will never achieve 100% accuracy, due to randomness and unpredictable events that are inherent in the financial markets, as well as due to the daunting task of developing powerful market forecasting tools. Technical analysis is as much art as science, if not more. Still, it is our responsibility as technical analysts to push the quantitative envelope of financial market analysis as far as possible. That’s what makes this arena so intellectually challenging to those of us who are fortunate enough to be involved professionally in technical analysis.

Contributor(s)

Louis B. Mendelsohn

VOLATILITY IN PERSPECTIVE

Editor’s note: This was originally published by Crestmont Research (www.CrestmontResearch.com) and is reprinted with permission.

Who or what is rocking the boat? Is the current level of volatility “normal” or is it extreme in either direction? The purpose of this presentation is to graphically put volatility into historical perspective. This report will be updated periodically as volatility is just too volatile to be ignored.

The first look at volatility uses the common measure of standard deviation. For this analysis, the monthly percentage changes in the S&P 500 Index are used and then the result is annualized to reflect a measure of the amount of variability in the market. This measure is often used by financial market professionals as an indication or measure of risk in models that assess risk versus return. It’s not important for this discussion to go into detail about the statistic—it is only necessary to appreciate that it is the most common measure of volatility and to recognize that a higher value means higher volatility.

Figure 1. S&P 500 Volatility: twelve-month rolling standard deviation

Volatility declined by more than 7% over the past quarter and remains more than 35% below its historical average. High or rising volatility often corresponds to declining markets; low or falling volatility is associated with good markets. The current period of low volatility is a reflection of a good market, not a predictor of good markets in the future.

Let’s look at six decades of volatility…to put volatility into perspective. To present a view of volatility and its change over time, Figure 1 presents the twelve-month rolling standard deviation for the S&P 500 Index. The concept of rolling periods just means that the value that is used for each month is the ‘standard deviation’ for the most recent twelve months. So as the market goes through periods of higher and lower volatility, the measure reflects those changes.
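For readers who want to reproduce the measure, a minimal sketch follows. It assumes a series of month-end S&P 500 closes (synthetic here) and the usual square-root-of-twelve annualization; Crestmont’s exact construction may differ in detail.

```python
# Minimal sketch of the volatility measure described above: the standard
# deviation of monthly percentage changes over a rolling twelve-month
# window, annualized (sqrt-of-12 annualization assumed).
import numpy as np
import pandas as pd

# Synthetic stand-in for month-end S&P 500 closing levels.
rng = np.random.default_rng(1)
monthly_close = pd.Series(
    100 * np.exp(np.cumsum(rng.normal(0.005, 0.04, 120))),
    index=pd.date_range("2004-01-31", periods=120, freq="ME"),
)

monthly_change = monthly_close.pct_change()
rolling_vol = monthly_change.rolling(12).std() * np.sqrt(12)  # annualized
print(rolling_vol.dropna().tail())
```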

As you can see in Figure 1, volatility tends to average near 15% (the average that many models and academics use for stock market volatility). Yet one of the most interesting aspects of the history of volatility is that it tends to move around a lot. Although most periods generally fall within a band of 10% to 20% volatility, there have been periods when volatility was unusually high and periods when it was unusually low…and often extreme periods in one direction are followed by oppositely extreme periods. The light grey vertical bars on the graph are spaced at four-year intervals. So some of the extreme periods can last for a while, yet few last a long time.

For most of the mid-2000s, volatility had been unusually low—and by late 2006 and early 2007, volatility fell into the lowest three percent of all periods since 1950. No wonder that investors and market spectators had become complacent to market volatility…or maybe complacency about risk led to the low volatility. Nonetheless, the waters of the market were unusually calm.

Then, in 2008, volatility surged to startling and anxiety-producing levels. This longer-term measure (which is a little slow to react since it requires twelve months of information) increased to more than 25%—fairly high by historical standards, yet not without precedent. In recent years, it has settled back within the typical range. This midrange acts as a holding pattern until volatility again breaks in either direction.

Across most of 2013, an eerie calm has again come over the stock market. Volatility has plunged to near 2006/2007 levels. If history is again a guide, a surge to high volatility may not be far away.

For a better reflection of near-term changes and trends in volatility, we can look at two other measures: the frequency of days each month that close up or down by more than 1% and the intra-day range expressed as a percentage. The first of these measures reflects the “six o’clock news summary” of daily volatility—since significant moves in the market often make the news—and the second reflects the “rollercoaster” that many professionals experience. For example, there are days when the market opens higher or lower and stays there—so measuring 1% days reflects the magnitude of daily changes. Therefore, with only a week or month of trading days, we can quickly see emerging changes in the overall level of volatility.

On other days, when the market professionals get home with that worn-out-look, the market may have swung wildly yet closed with little change from the previous day. So to capture that aspect of volatility, we can measure the difference between the high and low and present it as a percentage of the previous closing price. A higher percentage reflects higher volatility.

Next, let’s look at the other shorter-term measure of volatility trends and changes: the average daily range. This one could be called the “rollercoaster factor” since it measures the trough-to-peak each day as a percent of the market index. For example, if the S&P 500 Index starts at 1015 and falls to 1000 before ending at 1014, the daily range was 15 points (i.e. 1015 minus 1000) or 1.5% (i.e. 15 divided by 1000). The intra-day information that is needed for this measure is available from 1962, providing over four decades of data. The average daily swing over more than forty years has been approximately 1.4%. At today’s levels, that’s about 18 points for the S&P 500 Index and the equivalent for the Dow Jones Industrial Average would be almost 170 points.
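Both short-term measures are easy to compute from daily high/low/close data. The sketch below is illustrative only; the column names, synthetic data, and month-end grouping are assumptions, not Crestmont’s actual code.

```python
# Illustrative sketch of the two short-term measures described above.
import numpy as np
import pandas as pd

# Synthetic daily data standing in for S&P 500 high/low/close columns.
idx = pd.date_range("2014-01-01", periods=120, freq="B")
rng = np.random.default_rng(2)
close = pd.Series(1000 + np.cumsum(rng.normal(0, 8, 120)), index=idx)
daily = pd.DataFrame({"close": close, "high": close + 6, "low": close - 6})

# "Six o'clock news" measure: days per month closing up or down more than 1%.
big_move = daily["close"].pct_change().abs() > 0.01
one_pct_days = big_move.resample("ME").sum()

# "Rollercoaster factor": daily high-low range as a percent of the prior close.
range_pct = (daily["high"] - daily["low"]) / daily["close"].shift(1) * 100
avg_daily_range = range_pct.resample("ME").mean()

print(one_pct_days.head(3))
print(avg_daily_range.round(2).head(3))
```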

Figure 3 reflects that the average daily range has been similarly variable to our other measures of volatility. As our most quickly reacting measure of volatility, the average daily range has declined to near-record lows.

In Figure 4, as an update to the information originally presented on page 48 of Unexpected Returns and discussed in the book, the table reflects the propensity for the stock market to perform well in lower volatility periods and perform poorly in higher volatility periods. The principles of valuation and volatility that are explored in Unexpected Returns are the key drivers of stock market returns and performance over multi-year periods. As a result, the current environment of lower volatility certainly suggests that defensive and/or hedged strategies are appropriate due to the risk of a shift toward higher volatility.

CONCLUSION

From ultra-low levels of volatility to ultra-high levels and now back again, the past decade has been unique – but not unprecedented. There are several ways to measure volatility: some offer a longer-term, bigger-picture perspective, while others provide a shorter-term, more current view of conditions. All measures currently reflect that volatility has declined, with some near historical lows.

An historical perspective of volatility reflects that higher volatility periods are normal and can extend for quarters or years.  Many investors had anchored on the mid-2000s extreme low volatility years as a normal condition and were surprised by the subsequent period of high volatility. A true understanding of history provides a more rational perspective that can help investors take action to protect their portfolios should high volatility return…while being positioned to participate in improved market conditions as volatility abates.

As we saw about six years ago, extremely low volatility periods are often followed by surges in volatility. And such surges generally correspond to periods of market disruption and decline. Today’s favorable conditions can be a good time to assess portfolios for their ability to weather market storms.

Contributor(s)

Ed Easterling

Ed Easterling is the author of Probable Outcomes: Secular Stock Market Insights and the award-winning Unexpected Returns: Understanding Secular Stock Market Cycles. He is currently president of an investment management and research firm. In addition, he previously served as an adjunct professor...

WHAT CAN WE LEARN FROM PRIOR PERIODS OF LOW VOLATILITY

Editor’s note: This was originally posted at the New York Federal Reserve’s Liberty Street Economics, a site that “features insight and analysis from economists working at the intersection of research and Fed policymaking.”

Volatility, a measure of how much financial markets are fluctuating, has been near its record low in many asset classes.  Over the last few decades, there have been only two other periods of similarly low volatility: in May 2013, and prior to the financial crisis in 2007. Is there anything we can learn from the recent period of low volatility versus what occurred slightly more than one year ago and seven years ago? Probably; the current volatility environment appears quite similar to the one in May 2013, but it’s substantially different from what happened prior to the financial crisis.

For the three periods we consider, the chart below compares how low volatilities are across the following asset classes: two-year Treasury securities, ten-year Treasury securities, and the S&P 500 index. The bars represent the percentiles of historical ranges. For instance, the S&P 500 value for July 2014 is 7. That means only 7 percent of historical monthly observations, which we took back to 1994, have had a lower volatility. For the two-year interest rate, only 8 percent of past observations are lower. For the ten-year interest rate, less than 1 percent are lower. We focus on May 2007, May 2013, and July 2014 because these periods were low points for volatility across asset classes. Although volatilities are a bit higher today, they remain near their long-term lows. And while volatilities in the last few months are similar to those in May 2013 and 2007, there are important differences in the underlying drivers that can help us understand the recent market environment.
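The percentile comparison itself is straightforward to reproduce: for a given month’s volatility reading, compute the share of all historical monthly readings that are lower. A toy sketch with synthetic numbers (the authors’ actual sample runs back to 1994):

```python
# Toy sketch of the percentile calculation: what share of historical
# monthly volatility readings fall below the current reading?
import numpy as np

rng = np.random.default_rng(3)
historical_vols = rng.lognormal(-2.0, 0.5, 240)   # synthetic monthly readings
current_vol = np.quantile(historical_vols, 0.07)  # contrive a low reading

pct_lower = 100 * np.mean(historical_vols < current_vol)
print(f"{pct_lower:.0f}% of historical observations had lower volatility")
```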

Before we get into those differences, we had to think long and hard to answer the question: What should drive volatility?  To come up with possible explanations, we asked economists and hedge fund managers, combed the academic literature, and scrutinized the popular press. For example, economists generally believe the strength of economic growth and level of inflation may influence volatility. Recent economic research has focused on the dispersion of forecasts around various economic and financial market variables, and this may also be an important driver of market volatility. Lower forecast dispersion, as well as stronger growth and lower inflation, should be associated with lower volatility. The press has also been discussing the current stance of monetary policy, with low interest rates and large-scale asset purchases by the Federal Reserve, as a factor in explaining the current levels of volatility.

Clearly, many variables might help explain volatility dynamics. For our regression analysis, we start with a wide range of potential variables that might matter, and then sequentially narrow our broad list by removing the least important variable, one at a time (in statistical terms, the least significant variable in explaining volatilities). The chart below shows the five most important variables (for the wonks, here are further details on estimation approach and regression results). The bars represent the differences of those variables from their long-run means, measured as a number of standard deviations.  The three bars for each variable correspond to three different times: July 2014, May 2013, and May 2007.
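The authors point to a technical appendix for their exact specification; the sketch below only illustrates the generic backward-elimination procedure they describe, with invented variable names and synthetic data.

```python
# Illustrative backward elimination: start with many candidate drivers and
# repeatedly drop the least statistically significant one until five remain.
# Variable names and data are invented stand-ins, not the authors' model.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 240
cols = ["tbill_forecast_dispersion", "ted_spread", "ten_yr_yield",
        "household_dispersion", "gdp_growth", "inflation",
        "credit_spread", "fed_balance_sheet"]
X = pd.DataFrame(rng.normal(size=(n, len(cols))), columns=cols)
# Synthetic "volatility" that truly depends on only a few candidates.
y = (0.6 * X["ted_spread"] + 0.4 * X["ten_yr_yield"]
     + 0.3 * X["tbill_forecast_dispersion"] + rng.normal(0, 1, n))

keep = list(cols)
while len(keep) > 5:
    fit = sm.OLS(y, sm.add_constant(X[keep])).fit()
    weakest = fit.pvalues.drop("const").idxmax()  # least significant variable
    keep.remove(weakest)

print("five retained variables:", keep)
```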

Broadly, our results confirm what has been reported in the press—namely that reduced uncertainty regarding the policy outlook, proxied by the dispersion of the three-month Treasury-bill forecast, as well as low levels of financial stress, are contributing to the recent low levels of volatility. Also, low interest rates tend to mute volatility, with the ten-year Treasury yield low by historical standards. These three variables were similarly low last May, contributing to the interpretation that current underlying fundamentals resemble those from a year ago. When thinking about what was happening back in 2007, there are some substantial differences. Uncertainty regarding short-term policy was higher, with survey forecast dispersion for the three-month Treasury bill being substantially greater than it is today or was last year. Financial stress, as represented by the TED spread (the difference between the interest rates on interbank loans and Treasury bills), and the ten-year Treasury note yield were also higher. In contrast, households seemed much more certain in their outlook, based on the dispersion of forecasts from household surveys. This measure was substantially lower prior to the financial crisis, but is currently near average, as it was last May.

During times of relative market calm, like today, it could be that low financial market volatility is pushing these fundamental drivers lower, rather than the other way around. This note does not specifically address whether volatility is affecting or is being affected by these drivers. Rather, it approaches the links between these potential fundamental factors and volatility as instructive in providing more historical context when approaching the question of why volatility may be low now versus in the past.

Disclaimer: The views expressed in this post are those of the authors and do not necessarily reflect the position of the Federal Reserve Bank of New York or the Federal Reserve System. Any errors or omissions are the responsibility of the authors.

Contributor(s)

Fernando Duarte

Fernando Duarte is an economist in the Federal Reserve Bank of New York’s Research and Statistics Group. 

Juan Navarro-Staicos

Juan Navarro-Staicos is a financial markets analyst in the Bank’s Markets Group.

Carlo Rosa

Carlo Rosa is an economist in the Bank’s Markets Group.

CONTRARY OPINION FORUM

Editor’s note: Susan recently attended the Contrary Opinion Forum, an annual meeting sponsored by Fraser Advisors that “centers around the idea of learning to think for yourself in solving practical investment problems.”

It was a small but quality crowd. Some spouses come, but typically do not stay past one presentation and look for something else to do. Alas, their partners are glued to the overhead charts with a passion that would be cruelty to most people. The presenters were either very bullish or very bearish, and closely split. Below, I highlight a few situations that I found interesting.

A few bearish charts and thoughts gave me pause, and I looked at them more closely:

  • One commented that there are enough black swans out there to blacken the sky.
  • Margin Debt on the NYSE
  • The Russell 2000 leads the market down

Bullish arguments:

  • There is a Major Base Breakout
  • Economic indicators are pointed up
  • The majority of presenters were bullish on China.

Bearish Argument
I would not disagree that there are a lot of potential problems out there. However, black swans are something you cannot plan for. What I can look at is the supply and demand forces in the market, or how people feel about the stock market.

Margin Debt on the NYSE

I used a very long-term chart to look at Margin Debt, just the way they did. This is displayed differently, but I think it is fair to say that it does spike around the time there is a turn in the market.

Right now margin debt is trending up along with the stock market, which it can do for some time. Rising margin debt in a bull market, by itself, is not anything new.

Rate of Change of Margin Debt on the NYSE Monthly

The Annual Rate of Change is not in a spike so far. There are also a couple of circled spikes that occurred at the beginning of a significant advance, which indicates they can also be bullish.

Small Caps

In time frame “A,” the Russell underperformed for years and did signal trouble. In “B,” both rolled over simultaneously after “topping” for a few years. Right now, there is a very tiny consolidation, and not much can be gleaned.

This small H&S Top pattern in the short-term chart of the Russell 2000 was on a lot of people’s minds. Friday it rallied back inside the pattern, so it is still neutral. The short-term downside risk is still there. However, I have no conviction unless there is follow-through on the downside. Sideways trending is not out of the question.

The Bullish Case

From a chart pattern perspective: A Major Base Breakout in the Major Indexes is enough for me to be bullish until I see signs of Major Distribution.

Intuitively, we know that it is good when the economic charts hit an extreme on the downside. They are now heading up. There were plenty of those charts at the forum.

I included a few China Indexes.

A bearish descending triangle in gold was also mentioned. One person suggested buying the capitulation selloff. Two other presenters said that gold should be in a portfolio. I would say most had no interest in gold at all, which is interesting.

Contributor(s)

Susan Berger

Susan Berger worked with John Magee from 1968 until his retirement and then worked as a technical analyst at Fidelity Investments. She now provides independent research and additional information can be found at http://susanbergerstocks.com/.

SUMMARY OF MTA SINGAPORE CHAPTER MEETING

The MTA Singapore Chapter was honored to host Alex Siew, Dr Sun, and Caner Seren Varol, CMT, CFA, ERP, at its September 30th, 2014 meeting. Over 110 market professionals and investors attended the presentation, making it a great success for this young and growing chapter.

Jamie Coutts, CMT, CFTe is the incoming Singapore Chapter Chair, taking over for newly elected MTA board member James Brodie, CMT. Jamie opened the event with a welcome address that introduced the MTA and provided a brief overview of the CMT Program. The guest speakers then took the floor, covering a diverse range of investment topics.

Alex Siew, Fund Manager at VCB Capital, presented “Detecting Reversal Points Using Traditional Indicators, Quantitative Tools and How to Systemize Your Process.” Dr Sun’s presentation was titled “Quantitative Approaches to Tail Risk Control.” Jamie Coutts then covered “US Equities: Sentiment, Breadth & Inter Market Analysis of Current Market Conditions.” Finally, Caner Seren Varol delivered “Applying Technical Analysis and Risk Management to the Global Oil Markets.”

The speakers’ slides have all been made available via the Singapore Chapter page. Please visit go.mta.org/singapore

Each of the guest speakers was engaging and informative and kept the audience involved as they moved through a broad range of analysis. After the presentations concluded, the meeting was opened up for a Q&A discussion, which lasted for 10 minutes with excellent questions coming from the audience.

It’s evident from feedback that the success of the evening can be attributed to several key factors:

  1. The cross section and diversity of topics and speakers
  2. The recognition of quantitative methods as an extension of classical technical topics – people were interested in understanding automated strategies and rules based systems for applying technical tools

James Brodie, CMT built a strong base in Singapore and the success of this event can be traced back to his hard work and dedication. It is also important to thank Bloomberg for graciously hosting this and other events. If it is possible to find a disappointment in such a successful event, it is that we were unable to record the event.

Jamie and the Singapore Chapter of the MTA will be leveraging the success of the event to encourage greater recognition of technical analysis throughout the Asia-Pacific region. Collaborations with the CFA Society and GARP are of keen interest. Given the large statistical analysis and risk management component embedded in the FRM program, it makes sense to partner with them for an event on topics such as position sizing or technical analysis risk management methods. Please stay tuned for more information on a professional networking event in December.

Contributor(s)

Jamie Coutts, CMT, CFTe

Jamie Coutts, CMT, CFTe is a crypto market analyst for Bloomberg Intelligence and is responsible for creating crypto market structure, asset, sector and thematic analysis content for clients. He takes a multi-disciplinary approach to his analysis of this emergent asset class, combining...

INTERVIEW WITH GREGORY HARMON, CMT, CFA

How would you describe your job?

I don’t know that I really have a job in the sense that many see it. I have two small businesses that require an awful lot of my time. It is more a lifestyle than a job. I am a Partner and CIO at Presidium Capital, a firm where, with my two other partners, we manage separate accounts for clients. The second is Dragonfly Capital, which is a subscription newsletter service with macro technical analysis and detailed trade ideas for those that still want to do it themselves. The work that goes into both overlaps a lot, but each also has its distinct features. I spend a lot of time after dinner answering subscriber questions, for instance, while on the management side there are a host of other issues. So I guess I am a trader, an analyst, a teacher and a manager.

If that were not enough I am also part of a trio restarting the North East Ohio Chapter of the MTA, in my spare time.

What led you to look at the particular markets you specialize in?

I have traded many markets over the years, starting with repurchase agreements and commercial paper, to securities lending, then equities, options and derivatives. The most fun I had was with options and derivatives, so I like to focus there. There are a lot of variables involved when you trade options, and that keeps many out of the space.

Do you look at any fundamental or economic inputs to develop your opinions?

I try to be true to the technicals, but we all live in a world of 24/7 news, so it is hard to shut out everything. I follow the macro-economic indicators to keep abreast of the economy – not for trading signals, but to help with a longer-term view and the risks to the technical picture. I have a CFA as well but do not find it useful for trading. It has been most useful for the occasional relative value analysis or for showing extremes in valuation, but again not as a trade signal.

What advice would you have for someone starting in the business today?

I think the most important thing for someone new coming into the trading/money management business is to try as hard as possible not to have a bias, and where that is not possible understand what your bias is and how it will influence you. This is not easy for new entrants. It takes experience, success and failure to figure out how you will react to a stock you own being down 15% overnight, or up 20% on a rumor. You cannot know how every scenario will play out every day, but you can plan and prepare for the ones that you can imagine. Preparation with an open mind will be the key to your success.

What is the most interesting piece of work you’ve seen in technical analysis?

I have to admit that I am not the fastest at getting to all the analysis out there, but I recently read the report from David M. Smith and Ying Wang of SUNY Albany School of Business, and Christophe Faugere of Kedge Business School Bordeaux, titled Head and Shoulders Above the Rest? The Performance of Institutional Portfolio Managers Who Use Technical Analysis. It was the first piece that I had seen that showed a meaningful quantitative advantage to using technical analysis measured in the institutional space.

What research area do you think offers the greatest potential in technical analysis at this time (something like an indicator, charting technique or trading tool)?

There are a lot of styles of technical analysis that have been around for a long time. But many that require heavy quantitative analysis did not become practical until around 30 years ago. The work John Bollinger did with his Bollinger Bands was a great step in looking at volatility. Carson Dahlberg and Kirk Northington are doing some great things in that area as well. But my guess is that research on volatility is just getting started.

Contributor(s)

Greg Harmon, CMT

Gregory W. Harmon, who holds a Chartered Market Technician (CMT) designation, is the President and Founder of Dragonfly Capital Management, LLC, a company that provides daily technical analysis of securities markets and consulting services to the marketplace. Greg was previously the CIO...

Amber Hestla-Barnhart

Bio coming soon.

SUMMARY OF THE CHICAGO MTA CHAPTER EVENT

Ralph Acampora is an excellent lecturer, and it was no surprise that he drew a large crowd to the Chicago chapter meeting last month. He gave us a rich perspective on what Wall Street was like 40 years ago, how he and a close friend and colleague cofounded the Market Technicians Association, and, most importantly, the need for serious recognition of technical analysts or, as they were referred to back then, “chartists.” Ralph gave the audience a vibrant perspective on the history of the development of the MTA and the CMT designation. He highlighted key events, important dates, and how the CMT Program became recognized as the premier designation for technical analysts worldwide.

Ralph then went on to share a presentation he will be giving in Kuwait later this month for Altaira Ltd, where he serves as the Director of Technical Analysis. The presentation is titled “A Global Secular Bull Market” and starts with what he believes to be a generational low in March 2009. Ralph supports this view with numerous charts, most of which show long-term bases that could be starting points for global rallies. His analysis covered the Middle East, BRIC countries, Europe, and the Pacific. Everybody in the room also got a good laugh when he said he will be extra cautious when giving his analysis on crude oil to investors in Kuwait; optimistic or neutral views are much better received.

After the event we had the chance to discuss Ralph’s vision of the MTA at present and into the future. We discussed the MTA’s dedication to delivering quality guest speakers at local events and our continued partnership with technology leaders like Bloomberg. Ralph is a true ambassador for our craft, and we are honored that he visited the Chicago MTA Chapter.

I would also like to thank Tom Schneider at Bloomberg for graciously hosting this meeting.

Contributor(s)

Brian Stoner, CMT, CFA

Brian Stoner, who holds a Chartered Market Technician (CMT) designation, is a futures account broker at NinjaTrader Group, LLC. Brian started his career in the financial industry as a stockbroker in the mid 90’s and spent the better part of twelve years...

REMEMBERING 1995 DOW AWARD WINNER WILLIAM X. SCHEINMAN

William X. Scheinman received the Charles H. Dow Award in 1995 for his paper, Information, Time and Risk. In his paper, William outlined the core principles of Dow Theory and concepts developed by Edson H. Gould, and applied the ideas to the market. Dow Theory was combined with Gould’s resistance lines, unit measurement and the rule of three. His Dow Award winning paper provides detail on each of these techniques.

William started his career on Wall Street in the mid 60’s. He learned to invest by reading works from prominent Wall Street figures and eventually became licensed as a broker. He worked at Furman Selz Mager Dietz & Birney and then Arthur Wiesenberger & Company as a technical research analyst. He started a newsletter for institutional investors called Timings and published it until his death. He was one of the first members of the Market Technicians Association when it was founded in 1973. In 1970, William wrote Why Most Investors Are Mostly Wrong Most of the Time, a book that was reissued in paperback in 1991.

William passed away in 1999 at the age of 72. His ashes were interred on a small island in Lake Victoria next to the grave of Tom Mboya, a Kenyan political figure who was killed in 1969. William had met Mboya on a trip to Africa on behalf of Arnav Aircraft Associates, a company William had founded. William served on the executive board of the American Committee on Africa, established the African American Students Foundation and worked with Mboya to bring hundreds of Kenyans to study at American universities from 1957 to 1961.

According to his obituary in The New York Times, he lived a noteworthy life:

“…a self-taught Wall Street analyst who became an enthusiastic supporter of the independence movement in Africa

Mr. Scheinman, who was the pilot of an amphibious Navy landing craft in the Pacific during World War II, told friends that he had once brought down an attacking Japanese fighter plane with a burst of machine gun fire. After his try at college, his son said, he “bummed around” for a couple of years with professional poker players in the Midwest. Then he came to New York, drifted into the jazz world of Harlem and became a publicist for Count Basie and other musicians before becoming a salesman for an aircraft-parts company and then starting his own company.

Though Mr. Scheinman spent years on Wall Street, he never looked the part, eschewing pinstripes for turtlenecks and cowboy boots and, in later life, drawing his graying hair back into a ponytail.

“He was unconventional,” said Peter Griffiths, an investment manager in Denver, “in the way he dressed and the way he approached the market.”

Contributor(s)

U.S. SECTOR ETF PORTFOLIO

Editor’s note: This was originally published on September 29, 2014 at WWW.JBECKINVESTMENTS.COM and is reprinted with permission.

ETF selection has been based upon a traditional approach to ETF research, incorporating the best-suited benchmark domestic sector ETFs in a portfolio designed to outperform the overall US equities market. The analysis uses a classical technical approach to uncover trends, patterns, relative strength leaders and more, in order to reveal the leading and lagging US equity sectors.

This unique dual approach to ETF research seeks not only to outperform but also to manage risk in your U.S. equity portfolio, while staying invested in a very challenging market.

Our methodology

A rigorous ranking methodology was used to select among broad-based benchmark US sector ETFs. For the purposes of this asset allocation portfolio, ETFs tracking a strategy index methodology designed to “beat the market” or “become a market” were excluded from our selection process. Only ETFs that track benchmark indexes are eligible, and they are ranked according to various factors including costs, efficiency, breadth, and liquidity. It’s easy to pick the wrong ETF, but it’s not necessary.
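The ranking methodology itself is proprietary, but a hypothetical sketch of how benchmark ETFs might be scored on the four stated factors could look like the following; the tickers, figures, and equal-weighted composite are illustrative assumptions, not the author’s actual model.

    import pandas as pd

    # Hypothetical factor data for three candidate benchmark ETFs.
    etfs = pd.DataFrame({
        "expense_ratio": [0.10, 0.12, 0.09],        # cost: lower is better
        "tracking_error": [0.05, 0.08, 0.04],       # efficiency: lower is better
        "num_holdings": [350, 180, 410],            # breadth: higher is better
        "avg_dollar_volume": [90e6, 40e6, 120e6],   # liquidity: higher is better
    }, index=["ETF_A", "ETF_B", "ETF_C"])

    # Rank each factor so that 1 = best, then average the ranks into a composite.
    ranks = pd.DataFrame({
        "cost": etfs["expense_ratio"].rank(),
        "efficiency": etfs["tracking_error"].rank(),
        "breadth": etfs["num_holdings"].rank(ascending=False),
        "liquidity": etfs["avg_dollar_volume"].rank(ascending=False),
    })
    composite = ranks.mean(axis=1).sort_values()    # lowest composite rank wins
    print(composite)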

Our mission

The goal of our sector allocation will be to tactically outperform the S&P 500 on a relative basis (see note 1) using ETFs selected based on a proprietary ranking methodology.

Our Conclusion

Roughly 61% of the S&P 500 is made up of its four largest sectors. My analysis, in sum, is overweight these sectors by +3% and therefore implicitly suggests that the market is in a healthy state, at least relatively speaking. Note that since this is a relative and not an absolute game, my sector work does not directly imply higher equity prices. In fact, my recent blog – A Major Test is Upon Us – clearly suggests to me that there is a battle going on between the bulls and the bears. However, keep in mind that at key inflection points it is not always clear which side has the advantage. At times like this, I believe that sector/macro/sentiment analysis plays a crucial role.

Vanguard Information Technology (VGT) – The daily relative strength chart began showing some near-term weakness earlier this month. My 9/3/14 blog (Technology might be losing its charm) highlighted the breakdown, but note that it was not yet considered a sustainable reversal. The recent lower low and violation of the May 2014 RS uptrend line give me reason to believe that a tactical opportunity is at hand. As a result, the tactical allocation is pared back to Neutral from Overweight.

Vanguard Financials (VFH) – A relative strength trading range that began back in early May 2014 now looks to be firmly in place. Even tactical moves with a 3-6 month outlook might be too difficult to chase within this well-defined range. However, this rangebound scenario does allow for a Neutral allocation, and it seems advisable to let the range resolve itself before making a more aggressive call.

Vanguard Health Care (VHT) – Despite the relative strength weakness in the earlier part of this year, VHT maintains a well-defined uptrend. In fact, VHT has just broken out to RS highs, which can be considered a technical victory for the bulls and for this portfolio. As a result, I continue to step up the weighting, now at +3%, up from +2%. With that said, keep in mind that this is a 4-year relative strength uptrend, and weakness should be viewed as temporary, at least until there is evidence of a sustainable shift out of this sector ETF.

Vanguard Consumer Discretionary (VCR) – The change from relative strength (RS) leader occurred in March 2014 as the 2008 RS uptrend line was violated. The next few months of underperformance have been followed by what looks, for the time being, to be a basing effort. A convincing move above the March 2014 breakdown level and the Jun/Sep 2014 highs would help confirm an RS bottom, but a move below the Jun/Jul 2014 lows warns that further weakness may follow.

Vanguard Energy (VDE) – They say “the trend is your friend” and “don’t get emotional”, but why does energy FEEL weak even though it’s above its 2011 uptrend? Feelings aside, I have to abide by the facts, and the ones that matter are that the 2011 uptrend IS intact and VDE is approaching relative strength support after a steep decline. Also, despite the converging 10/30-week moving averages, they have not crossed. Hold tight and stay the course.

Vanguard Consumer Staples (VDC) – I believe there are three ways to interpret the relative strength chart of the past three years: 1) a broad sideways pattern has developed, leaving one best suited to stay on the sidelines and not chase performance; 2) shorten your trading outlook to try to get an edge and time the near-term changes in trend; or 3) a large complex head and shoulders top pattern is forming. With that said, let’s try to catch a bit of alpha on the RS breakout from the May 2014 downtrend line.

Vanguard Industrials (VIS) – From a relative strength perspective, VIS broke down from both its 2012 uptrend line and the bottom of a 7-month trading range in June 2014. This marked the beginning of a sustainable cycle of underperformance, in my view. The weekly chart is now showing another RS breakdown as VIS moves to a lower low, which I believe further substantiates the case for continued underperformance.

Vanguard Materials (VAW) – Unlike VIS, VAW’s relative strength chart is still trading in a sideways fashion, and looks to have been doing so since March 2014. A breakout above this trading range would help confirm a more favorable positioning, while a breakdown could force one to become more negative. Until then, a Neutral allocation is advised. When managing your risk, note that initial support resides near 108-110, or at the 30-week moving average, the July 2014 low, and the 2011 uptrend line.

Vanguard Utilities (VPU) – I am maintaining my Neutral stance for the time being, but I am keeping a very close eye on interest rates as the 10-Year (TNX) looks to be on the verge of a turnaround, which could weigh on this interest-sensitive sector ETF. I am also not thrilled with the relative underperformance of late. Admittedly, I may be late to the game on this call, but I am holding back while awaiting confirmation from the Treasury market.

Vanguard Telecommunications Services (VOX) – Maybe this has been in the making for some time, but despite the volatile swings in relative strength, the fairly consistent downtrend channel suggests investing with the current. I am therefore tweaking the portfolio to underweight this sector ETF. From a purely price perspective, I am also monitoring the potential 5-month head and shoulders top pattern that is developing. Neckline support looks to be in the 85-86 range.

Notes:

  1. When relative strength is referred to, please consider it in conjunction with the S&P 500 unless otherwise noted; a minimal sketch of the calculation follows these notes.
  2. Sep. 28th, 2014 – courtesy of www.standardandpoors.com.
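As a minimal sketch of the relative strength (RS) line referenced in note 1 and used throughout the sector commentary above (the sector ETF's price divided by the S&P 500's), consider the following; normalizing the series to start at 1.0 is a common charting convention, not something specified in the article.

    import pandas as pd

    def relative_strength_line(etf_close: pd.Series, spx_close: pd.Series) -> pd.Series:
        """RS line: sector ETF close divided by the S&P 500 close.
        A rising line means the sector is outperforming the benchmark."""
        rs = etf_close / spx_close
        return rs / rs.iloc[0]   # normalize so the series starts at 1.0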

Contributor(s)

Jonathan Beck

Jonathan Beck brings over 10 years of buy-side and sell-side equity research experience to J. Beck Investments. He previously spent more than half of his career working exclusively as a technical analyst on one of the most well respected technical...

WINANS LEGACY STOCK INDEX (WILSI)

The Winans Legacy Stock Index (WILSI) is an unweighted composite of approximately 110 common stocks (from diverse industry sectors) that have been continuously traded on the New York Stock Exchange (NYSE) since 1970 and have been in continuous operation since 1897 (on average). Because the WILSI’s underlying components have remained unchanged over the last 44 years, the index provides a baseline for comparing today’s market conditions to past market cycles using the exact same securities. The WILSI offers an alternative means of evaluating stock market activity (past and present), and it eliminates many of the statistical flaws inherent in conventional stock market indices (i.e., the S&P 500 Index and Dow Jones Industrial Average) that stem from frequent changes in underlying components and data weighting methods.

Applications: The WILSI provides certain unique advantages.

  1. Focuses on Core Stock Investments: WILSI measures the performance of senior U.S. common stocks from numerous industry sectors which comprise the longstanding backbone of the U.S. economy (i.e., corporate blue bloods). This allows for effective monitoring of an important subset of the stock market universe which is owned by a large cross section of investors who buy and hold these securities over many decades.
  2. Effective Historical Comparison Tool: Long-term market analysis can be unreliable when using conventional stock market indices, because they are weighted by market capitalization and frequently change a significant number of their underlying components. The WILSI is comprised of a static, unweighted dataset of seasoned U.S. common stocks that have been through numerous bull and bear markets. This allows for direct, “apples to apples” comparisons with past market cycles for effective market analysis.
  3. Multiple Sets of Data: WILSI includes supplemental data for a more complete market analysis based on both average and median calculations of the components’ price data. Data on annual dividends and daily trading volume is also included.

Data Used: NYSE data has been the most accurate and consistently tracked U.S. common stock dataset for historical studies. Data was provided by Reuters, Dial Data, Yahoo and Google, with calculations made by Winans Investments.

Data Granularity: The index is calculated on an end of day basis from 1970 to present. It is an unweighted index with no changes in the underlying stocks.

Index Calculations: The WILSI is the average of the median and average prices of the 110 individual stocks comprising the index.
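Since the definition reduces to the mean of two summary statistics over the components’ split-adjusted prices, a single day’s reading can be sketched as:

    import statistics

    def wilsi_value(component_prices: list[float]) -> float:
        """One day's WILSI: the average of the mean and the median of the
        ~110 component stocks' split-adjusted closing prices."""
        return (statistics.mean(component_prices) + statistics.median(component_prices)) / 2

Averaging the mean with the median dampens the influence that a handful of very high-priced components would otherwise have on an unweighted composite.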

Index Construction:

  • Originally constructed in 2013. Delisted stocks are left in the index and are not replaced with another security in order to eliminate “survivor bias” from results.
  • Each individual stock has an even weighting in the index. Stock prices are split adjusted. Dividend income is included in annual performance calculations.
  • Data Filters – Companies were excluded if: 1. the company was founded after 1951; 2. the common stock has not continuously traded since March 31, 1974; 3. price data was incomplete from the data vendors; or 4. stock price data was affected by a spinoff, merger or reorganization with an NYSE stock symbol change. (A sketch of these filters follows this list.)
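A hypothetical encoding of those four exclusion rules might look like the sketch below; the field names are illustrative assumptions, not Winans Investments’ actual screening code.

    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Company:
        founded: int                   # year the company was founded
        traded_since: date             # start of continuous NYSE trading
        price_data_complete: bool      # vendors supplied a full price history
        symbol_changed_by_event: bool  # spinoff/merger/reorg changed the NYSE symbol

    def eligible(c: Company) -> bool:
        """A stock must pass all four stated filters to enter the index."""
        return (
            c.founded <= 1951
            and c.traded_since <= date(1974, 3, 31)
            and c.price_data_complete
            and not c.symbol_changed_by_event
        )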

Key WILSI Statistics:

  • 1897 was the average year the WILSI’s companies were founded.
  • 17% of the WILSI’s components are currently in the Dow Jones Industrial Average, and 7% were previously in the DJIA. 76% are currently in the S&P 500 Index.
  • 90% of the components have price data back to 1970.
  • Only one stock delisting since 2009.
  • 89% of the individual components have outperformed the S&P 500 Index on a cumulative basis since 1970. (WILSI components 5,943% vs. S&P500 1,918%).
  • 46% of the individual companies are at record stock price levels.
  • 2.4% annual dividend yield vs. 1.8% for S&P 500 in 2013.
  • $5.4 trillion was the components’ combined 2013 market capitalization, with combined 2013 revenues of $4.2 trillion.
  • 51% of 2013 revenues were derived from foreign sales.
  • 35% had Value Line financial ratings above A+; 6% had ratings below B-.

Initial Research Observations:

  • The WILSI correlates well with the S&P 500 Index but eliminates many of the statistical flaws (such as survivor bias) inherent in conventional stock market indices (i.e., the S&P 500 Index and Dow Jones Industrial Average) that stem from frequent changes in underlying components and data weighting methods.
  • WILSI share volume calculations are constant and are not affected by the NYSE volume calculation adjustments made in 2004.
  • Conventional market analysis indicators (trend indicators and oscillators) work effectively on the WILSI.
  • WILSI is an effective benchmark for inter-market analysis with preferred stocks and corporate bonds.

Significant divergence occurred between the WILSI and the S&P 500 in 1998-99, indicating a late-stage bull market.

Conclusions:

  • Contrary to popular opinion, mature stocks have performed well against the overall stock market over multiple time periods and market conditions on both a nominal and a risk-adjusted basis.
  • WILSI provides a unique way to examine overall market conditions of U.S. common stocks with a static, unweighted dataset. This allows for direct, “apples to apples” comparisons with past market cycles and thus effective market analysis.
  • WILSI is a good confirming indicator of trend changes in the S&P 500 Index. It can replace the DJIA as a secondary indicator. In fact, WILSI trend change signals often occur earlier than those of either the S&P 500 or the DJIA.
  • WILSI works well with many conventional trend indicators and oscillators: the Winans Trend Indicator, the 40-Week Moving Average Spread and Relative Strength Comparatives (a sketch of one common spread calculation follows this list).
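The Winans Trend Indicator is proprietary, but a 40-week moving average spread has a common generic form, the percent distance of the weekly close from its 40-week average; a minimal sketch under that assumption:

    import pandas as pd

    def ma_spread(weekly_close: pd.Series, window: int = 40) -> pd.Series:
        """Percent spread of the weekly close above (+) or below (-) its
        40-week simple moving average; one generic form of an MA spread."""
        ma = weekly_close.rolling(window).mean()
        return (weekly_close / ma - 1.0) * 100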

Contributor(s)

Ken Winans, CMT

Ken Winans, CMT, President & Founder, Winans Investments. For 28 years, Ken Winans, CMT, has conducted groundbreaking financial research within the discipline of technical analysis while serving as a portfolio manager, investment analyst, broker and investor. Ken is the President and Founder...

CHART OF THE MONTH

Other Winans Indexes include the Winans Real Estate Index™ (WIREI) and the Winans Preferred Stock Index® (WIPSI). The data is available through Global Financial Data, Metastock or Stockcharts.

The WIREI (Patent Pending 11/670,914) tracks new U.S. home prices since 1830. Its unique approach rescaled and combined several well-known government studies of U.S. new home prices into a continuous dataset without the “gapping” and time lag problems found in other studies. The WIREI has several sub-indices: 1. sales since 1962, 2. inflation since 1932, 3. home sizes (i.e., average square feet) since 1973 and 4. geographic regions (Northeast, Midwest, South, West) since 1975. The WIREI is shown on the left.

The WIPSI was the first modern index to track these cornerstone securities. It is an even-weighted index consisting of 85 traditional preferred stocks of US companies that have consistently issued listed preferred stocks on the NYSE. As of December 2013, the industry breakdown was 58% financial services, 22% real estate investment trusts, 12% utilities and 8% industrials. The issuers’ average revenues are $24 billion. The index has price, yield and volume data back to 1890. The WIPSI is shown to the right.

Contributor(s)