Monthly Market Commentary - August 2021

"On Galactic Valuations"

MR. MARKET.

The beginning of September is always a peculiar period for me. These days I am celebrating my 21st anniversary in the business, and it is the time of the year when I wonder about everything that has happened since I started my career. This year the feeling is particularly strange because, since mid-August, I have had the pleasure of a bright and motivated intern collaborating with me, and he is… 20 years old. He was not even born when I took my first baby steps in finance in September 2000. It sounds like ancient history, another era, another world. In a sense it is.

Over these years there has been a common theme underlying my experience with markets: I have always been a "valuation junkie". I believe that the predominant valuation methodology of a given period tells you a lot about the investment environment. Imagine my shock when I read my first real report on a listed stock. It was full of "price per click", "price per user", "P/ARPU", and so on and so forth. It was the end of the great bull market of the nineties, and I admit that analysts had gone nuts. PEG (price/earnings to growth) ratios, a standard of the nineties, were already history. At an aggregate level, average market valuations were not much more expensive than they are today; if you look at the "median" valuation, the market was actually cheaper. The move up to the stratosphere was concentrated in a relatively limited number of "new economy" stocks, while the rest of the market, the so-called "old economy", was trading in line with or at a discount to historical norms. With nominal and real interest rates well in positive territory, analysts had to invent new valuation yardsticks to justify ridiculously high-priced stocks, but the ground was still fertile for value investors. Then the bubble burst, and those reports full of fantasy measures were lost in the dust (with the stocks involved down 80-90%).

After this period, several cash-flow-based measures became fashionable. In a short span we got a proliferation of well-structured valuation approaches such as "Cash Flow Return on Investment", "Cash Return on Capital Invested", EV/Invested Capital versus ROIC/WACC, and EVA- or NOPAT-based multiples. With hindsight, 2001-2004 was a good period for financial analysis. At least it was a rational one, with several "new" and "old" economy stocks trading at attractive levels.

From 2004 until the great financial crisis, the dominant valuation approach shifted to EV/EBITDA ratios and the like. EV/EBITDA was not a new thing. Ask John Malone. However, the rise of the high-yield bond market created the foundation for a more "corporate finance" approach to valuation. From an accounting point of view, EV/EBITDA is a very sexy thing: in theory it is a good proxy for operating cash flow and it is easy to explain. EBITDA also has the key characteristic of being a measure outside the official IFRS or GAAP framework, so it definitely leaves room for "creative accounting". We witnessed a proliferation of adjustments for restructuring, rents, leasing, M&A costs, fines, and a plethora of exceptionals. An immediate success for sell-side research. This "corporate" approach was so successful that LBO/takeover assumptions on terminal values also became a common feature of financial modeling in those days.

Then the great financial crisis came. Honestly speaking… it was a mess. The key aspect of any valuation work became assessing whether a firm would survive at all. P/Book values were back in vogue.
For banks, those book values always seemed overstated, so we moved to core book value, then to tangible book value, then to core tangible book value, and so on. In a situation unseen since World War II, some stocks were trading below the cash held on their balance sheets, the so-called "net-nets" popularized by Benjamin Graham in the 1930s. As is always the case, "the balance sheet does not matter until it is the only thing that matters". A rule to remember.

Fortunately, that is the past. Markets have rebounded strongly since 2009. In these last 12 years we have had two different periods. In the first few years most valuation methods coexisted: until 2015 there was a push for a re-rating of quality growth and dividend-paying stocks, which scored well on most measures and got lifted up. From 2016 onwards the focus shifted heavily to top-line growth. In this latter period, to my surprise, the discounted cash flow (DCF) model emerged as the dominant valuation tool. Today, I see DCFs all over the place.

You could ask: why the surprise? At the end of the day, the DCF is the "correct" closed-form formula for company valuation. Every analyst knows it and accepts it as such. However, if you have done some DCF work in your life, you realize that the sensitivity to the basic assumptions can make a huge difference. Move some of those assumptions one way or the other and you can easily get target prices spanning a 40-50% range. They call it fair value… In a nutshell, DCFs are like telescopes: move one by a few centimeters and you end up watching another galaxy. Do not get me wrong, DCFs can be useful at the extremes. If, no matter how hard you try, you cannot reasonably justify more prudent assumptions and you still get a target value above the current price, that is a buy. Conversely, if you cannot reasonably justify rosier assumptions and you still get a target value below the current price, that is a sell. Everything in the middle is a gray area where DCFs do not add much value to buy and sell decisions. Their best alternative use is to reverse engineer market prices and understand the assumptions currently implied.

Then why the success of DCFs in recent times? I believe it is because analysts must justify prices that have reached another galaxy, and DCFs are the only available option in the toolkit. Acting as the devil's advocate, I did the exercise of building a theoretical "uber bullish" DCF for an imaginary "greatest company of all time". Translating my results into simple multiples, I got something like 10x P/Sales. Today, it is common to find stocks trading at much higher valuations on the proposition that they are disruptive and their "TAM" (total addressable market) is very large. Yet, by definition, there cannot be that many "greatest companies of all time". So, I went back to my "uber bullish" model to understand what the missing piece was. If people buy stocks at 50x P/Sales, they must have some idea that these are the correct valuations. How is it possible that I cannot get to such high levels? After some tweaking, I got my answer: I keep repeating the same mistake. In my assumptions I used a positive cost of capital, with the idea of some return to the mean. This sounded reasonable to me. I am valuing a business as a going concern, which means my assumptions must hold for an indefinite future. I do not know what interest rates will be in 20-30 years' time, so I used some regression to the mean from the historically low levels we trade at today. Right? Wrong.
I have to use a cost of capital very close to zero forever, and live happily with the huge present value of highly uncertain future cash flows. That way I can justify 50x, 70x, even 100x P/Sales. Voilà, the "galactic valuation" problem sorted. Easy, ain't it?
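
As an illustration of the telescope problem, here is a minimal two-stage DCF sketch in Python. To be clear, it is not the model described above and all the inputs are invented purely for illustration. It shows that nudging the discount rate by one percentage point and the terminal growth rate by half a point moves the computed "fair value" across a spread comparable to the 40-50% range mentioned earlier, and that the same function can be run in reverse to back out the growth rate a given market price implies.

```python
# A minimal two-stage DCF; every number here is hypothetical and for
# illustration only: ten years of explicit free cash flows growing at a
# constant rate, then a Gordon-growth terminal value.

def dcf_value(fcf0, growth, years, wacc, terminal_growth):
    """Present value of `years` explicit cash flows plus a terminal value."""
    value, fcf = 0.0, fcf0
    for t in range(1, years + 1):
        fcf *= 1 + growth
        value += fcf / (1 + wacc) ** t
    terminal = fcf * (1 + terminal_growth) / (wacc - terminal_growth)
    return value + terminal / (1 + wacc) ** years

def implied_growth(price, fcf0, years, wacc, terminal_growth):
    """Reverse-engineer the explicit growth rate a market price implies,
    holding the other assumptions fixed (bisection; value rises with growth)."""
    lo, hi = 0.0, 0.50
    while hi - lo > 1e-6:
        mid = (lo + hi) / 2
        if dcf_value(fcf0, mid, years, wacc, terminal_growth) < price:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Base case, then nudge the discount rate by +/-1pp and terminal growth by +/-0.5pp.
base = dcf_value(fcf0=100, growth=0.10, years=10, wacc=0.08, terminal_growth=0.02)
print(f"Base-case fair value: {base:,.0f}")
for wacc in (0.07, 0.08, 0.09):
    for tg in (0.015, 0.020, 0.025):
        v = dcf_value(100, 0.10, 10, wacc, tg)
        print(f"  WACC {wacc:.1%}, terminal growth {tg:.1%}: "
              f"{v:,.0f} ({v / base - 1:+.0%} vs. base)")

# The "best alternative use": back out the growth a given market value implies.
print(f"Growth implied by a market value of 4,000: "
      f"{implied_growth(4000, 100, 10, 0.08, 0.02):.1%}")
```

Nothing sophisticated is going on here; the point is simply that the gap between perfectly defensible cases is wide enough to swallow most buy and sell signals, which is the gray-area argument above.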
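
And here is the same back-of-the-envelope arithmetic applied to the "galactic valuation" trick itself. Again, this is only a sketch with hypothetical inputs, not the actual "uber bullish" model: an imaginary hyper-grower compounding sales at 20% for ten years with a 20% free-cash-flow margin. At a normalised cost of capital the implied multiple happens to echo the roughly 10x sales of the exercise described above; let the discount rate drift towards the 3% terminal growth rate and the implied P/Sales marches through 50x and beyond.

```python
# Back-of-the-envelope sketch of the "galactic valuation" effect; every input
# is hypothetical. An imaginary company compounds sales at 20% a year for ten
# years, converts 20% of sales into free cash flow, then grows 3% forever.
# Only the cost of capital changes from row to row.

SALES_0 = 100.0     # current sales, arbitrary units
GROWTH = 0.20       # explicit-period sales growth
MARGIN = 0.20       # free-cash-flow margin on sales
YEARS = 10
TERMINAL_G = 0.03   # perpetual growth after year 10

def price_to_sales(wacc):
    """Implied P/Sales from a simple two-stage DCF at a given cost of capital."""
    value, sales = 0.0, SALES_0
    for t in range(1, YEARS + 1):
        sales *= 1 + GROWTH
        value += sales * MARGIN / (1 + wacc) ** t
    terminal = sales * MARGIN * (1 + TERMINAL_G) / (wacc - TERMINAL_G)
    value += terminal / (1 + wacc) ** YEARS
    return value / SALES_0

for wacc in (0.10, 0.06, 0.045, 0.04, 0.035):
    print(f"Cost of capital {wacc:.1%}: implied P/Sales ~{price_to_sales(wacc):.0f}x")
```

The blow-up in the last rows is just the terminal-value term, which behaves like 1/(wacc - g): as the cost of capital drifts towards the perpetual growth rate, the present value of the distant future goes vertical, and so does the multiple one can "justify".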

Peppe Ganci, CFA


