Wednesday, October 31, 2012

When performance isn't so good

A year ago Monday, the northeastern United States was greeted with a horrendous storm, which included a lot of snow and damaging wind. That day my wife and I went to a civil union ceremony; a trip that should have taken about an hour instead took four.

Well, on this storm's (and the happy couple's) one-year anniversary we were visited by Sandy, a hurricane which has pelted us quite hard. Its performance was quite good, for a hurricane, but quite bad, for those of us who were recipients of its forces.

The NYSE closed for two days, as did our offices. Many, including almost everyone in our NJ offices, are without power. Our office, however, has power, so those of us who can are here.

Many roads remain closed, which can make travel quite difficult. While I had no problem making it in (I live less than 10 minutes away), it took Patrick Fowler three times as long as usual to get here, and others may not make it in.

Because of the loss of power, many (most?) gas stations are closed, and those that are open are doing lots of business, with lines extending for quite some distance (Chris counted 150 at one station, waiting for a single pump!).

Hurricanes have hit this area before; what made Sandy different?

I am not, of course, a meteorologist. But from what I can gather, most hurricanes come ashore much farther south (e.g., in Florida) and then, if they so choose, work their way north along the coastal states, but inland, meaning their power weakens as they move northward. Sandy moved north over the ocean and made landfall right along the New Jersey coast, with its width spreading north (to NYC, Long Island, and Connecticut) and south (to Philadelphia, Delaware, and Maryland). It caused extremely high waves, which pummeled the shoreline, flooding many towns and communities. Its winds, sometimes in excess of 100 mph, caused much destruction.

We're the lucky ones

Despite some of the damage that we've had, and our power issues, our troubles are not at all like those of many others, whose homes, cars, and property have been lost or severely damaged. Our thoughts are also with those who have died as a result of this devastation (55 so far), as well as with their families, who are left behind to mourn their passing.

Friday, October 26, 2012

You can pay me (annualize) now, or you can pay (annualize) me later

One of our clients introduced me to alternative ways to calculate two commonly used risk-adjusted return measures: the information ratio and the Sharpe ratio. The client encountered them in Zephyr, and I am attempting to identify their origin; I have confirmed that Morningstar also uses them.

In both cases, they annualize and then do the math, rather than do the math and then annualize. This calls to mind the commutative property of addition, which says that order does not matter (2 + 4 = 4 + 2 = 6, for example). Here, however, order does matter, as the two approaches yield different results.
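
To make the difference concrete, here is a minimal sketch in Python. The monthly returns are made up, and I am assuming that the "alternative" approach compounds the returns when annualizing; that is my reading of it, not a statement of Zephyr's or Morningstar's exact formulas.

    import math

    # Hypothetical monthly portfolio returns and a 0.2% monthly risk-free rate (illustrative only)
    portfolio = [0.012, -0.008, 0.015, 0.004, -0.011, 0.009,
                 0.013, -0.002, 0.007, 0.010, -0.006, 0.008]
    risk_free = 0.002

    excess = [r - risk_free for r in portfolio]
    mean_excess = sum(excess) / len(excess)
    stdev = math.sqrt(sum((x - mean_excess) ** 2 for x in excess) / (len(excess) - 1))

    # "Typical": do the math on the monthly data first, then annualize the ratio
    typical = (mean_excess / stdev) * math.sqrt(12)

    # "Alternative": annualize first (compounding the returns, scaling the risk), then divide
    annualized_portfolio = math.prod(1 + r for r in portfolio) ** (12 / len(portfolio)) - 1
    annualized_risk_free = (1 + risk_free) ** 12 - 1
    annualized_stdev = stdev * math.sqrt(12)
    alternative = (annualized_portfolio - annualized_risk_free) / annualized_stdev

    print(typical, alternative)  # the two orderings produce different Sharpe ratios

The same ordering question applies to the information ratio, with the benchmark return taking the place of the risk-free rate.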

This also reminded me of how some attempted to derive my favorite risk-adjusted measure, Modigliani-Modigliani (annualize first or last?).

The information ratio differences:


And, the differences with Sharpe ratio:

 
This raises numerous questions. For example, is one approach superior to another? My suspicion is that most firms use what I refer to as the "typical" formulas. The reference materials I've checked only show these approaches. But clearly, some favor the alternative. The CFA Institute's CIPM(R) program, as I recall, also references the "typical" approach.
 
We know that there are multiple ways to derive various statistics, and here are just two more cases. As I learn more, I will share it with you. In the meantime, feel free to "chime in" with your thoughts.

Tuesday, October 23, 2012

Working my way through the Alternative Investments GS

I have read through the new GIPS(R) (Global Investment Performance Standards) Alternative Investments Guidance Statement a few times, and frequently make what I think are interesting discoveries. Here's just one:


I have highlighted the part I find of interest (I hope it's readable). We see "firms may wish to present simulated, model, or back-tested hypothetical performance results due to the lack of an actual historical track record." [emphasis added] This further adds credibility to my prior suggestions that the Standards permit the use of non-real performance, which is, I think, quite a good thing. And so, for example, if a firm has a new strategy for which it has only simulated, model, or back-tested performance, it can use those results, along with a disclosure making it evident that this isn't a real track record (i.e., not achieved with managed assets).

However, when we continue to read we find that these same results "can be presented as Supplemental Information to a compliant presentation." [emphasis added] What compliant presentation?

I think this is where it can get interesting, and even confusing, but it shouldn't be.

First, we see the word "can," which isn't the same as "must." And so, in the absence of a track record for a strategy, a firm can use hypothetical results.

When would the "can" apply?

In a couple of cases.

First, even though this sentence is in the same paragraph as the wording dealing with the absence of a track record, the hypothetical performance can be used even if there is a track record, to demonstrate performance for periods not covered by the real investments.

Second, just because a firm doesn't have a track record doesn't mean it can't have a presentation; I have long been an advocate of this. The firm would have all the necessary disclosures, but no performance. The performance would be the supplemental hypothetical results.

Make sense?

We also see the same language that appears in the Supplemental Information guidance statement: that you can't link hypothetical and actual performance. By "link" we don't mean "geometric linking," though this would also hold true, but rather "visual" linking, where, by presenting hypothetical and actual on the same page, the reader might mistakenly think they're one and the same (i.e., actual for the full period). And so, to avoid the potential confusion, you're required to have them on separate pages.

I'll have more to say about this GS in future posts.

Thursday, October 18, 2012

Did you hire a GIPS verifier or a cell phone provider?

If truth be told (and I am about to tell it), I do not know how common this practice is, but we have become aware of cases where GIPS(R) (Global Investment Performance Standards) verifiers require their clients to sign multi-year contracts. E.g., they will perform the verification for the period 2010-2012. Does this remind you of anything?

Cell phones come to mind.

AT&T, Verizon, etc. all do this: they require you to sign up for a multi-year contract.

Can you get out of them? Sure, if you want to pay them. But who does? Regardless of the quality of the phone or service, you're pretty much stuck.

The Spaulding Group has NEVER thought of doing this. Our view is, if you don't like our service, fire us! Our feelings won't be hurt (okay, maybe a little bit, but we'll get over it). While we always strive to deliver the highest degree of service, we don't want to require our clients to keep us if they don't want to.

Why would you want to require your clients to sign a multi-year contract?

Oh, I know why: for YOUR benefit! Now I get it. Lock the client in. Even if they discover a better option, you lock them in, so that they are forced to retain your services.

Our advice to firms looking for a verifier: if the firm you choose is making you sign a long-term agreement, just say "no!"

If you have confidence in what you deliver, you won't have such a practice. Have a different view? Chime in!

p.s., if you'd like to learn more about TSG's GIPS and non-GIPS verification service, go to our website; better yet, fill out a questionnaire and get a detailed no-obligation proposal, along with a surprise gift!

Tuesday, October 16, 2012

A new GIPS rule being introduced in a non-standard way

It came to my attention yesterday that a new GIPS(R) (Global Investment Performance Standards) rule is being introduced into the GIPS Handbook regarding the treatment of significant cash flows (SCF): compliant firms will no longer be able to use the "number of portfolios" as a factor to employ the firm's SCF policy.
 
Where did this new rule come from? And more importantly, why wasn't the public given a chance to comment? Would such a change not be better handled through a revision to the SCF guidance statement, with the traditional public comment period?
 
This change is no doubt being couched within a "Q&A" that was perhaps fabricated for the purpose of introducing it, but the Q&As were never intended to be the source of new rules, but rather interpretations of existing rules. Was this a Question that was asked of and Answered by the GIPS Help Desk? I suspect not, since it's not listed in the Q&A section of the GIPS website. Was this rule vetted with the Interpretations subcommittee or the GIPS Executive Committee? Or was it added as part of the last-minute editing process, without the benefit of public discourse?
 
The irony here, as you will see, is that the earlier version of the guidance statement conflicts with what is now apparently a rule!

Here's what is being changed: Let's say you have an SCF policy that you employ firm-wide, which says that if there's a flow greater than 30%, you will remove the portfolio for one month. Great! BUT you have a few small composites (small in number of portfolios) that could have "gaps" or breaks in performance if you ever employed the SCF rule there. And so, you want to condition your policy with something like "if the composite has fewer than five accounts, the policy does not apply." I.e., you'd rather suffer the impact of the flow than experience a break in your performance history.

Or, what if you want a policy that says composites with ten or more accounts have a 30% threshold, composites with five to nine have a 40% threshold, and those with fewer than five do not participate in the SCF policy, so as to avoid possible breaks; what is the harm in such a policy?
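
For illustration only, here is a rough sketch of the sort of size-conditioned policy I'm describing; the thresholds and function names are hypothetical, not anyone's actual policy:

    def scf_threshold(num_portfolios):
        # Hypothetical significant cash flow thresholds, conditioned on composite size
        if num_portfolios >= 10:
            return 0.30   # 30% flow threshold for larger composites
        if num_portfolios >= 5:
            return 0.40   # 40% flow threshold for smaller composites
        return None       # fewer than five accounts: SCF policy not applied, avoiding breaks

    def remove_portfolio_for_flow(flow_pct, num_portfolios):
        threshold = scf_threshold(num_portfolios)
        return threshold is not None and abs(flow_pct) > threshold

    # A 35% flow triggers removal in a 12-portfolio composite, but not in a 3-portfolio one
    print(remove_portfolio_for_flow(0.35, 12))  # True
    print(remove_portfolio_for_flow(0.35, 3))   # False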

This new rule will prohibit firms from doing this. Why? What's the point? What is the harm in firms having such policies? I find the change unnecessary, but, more important, the manner in which it is being done is in total conflict with the traditional methods of introducing new rules. What happened to protocol?


I mentioned above that these new "rules" seem to conflict with language in the earlier version of the SCF guidance:

 
As you can see, the SCF guidance recognized that a firm could experience gaps if a composite had just a few accounts, and so it cautioned firms about applying the same rules across the firm without regard to the number of portfolios a composite might have. For an unknown reason, this wording was removed from the most recent version of the guidance. Are we now witnessing a total "180" regarding a firm's ability to have rules partly conditioned on the number of portfolios in a composite? Why?

Since this is clearly a rule change, why isn't there an effective date?


I find this frustrating, ridiculous, absurd, and senseless! What do you think?

By the way, firms can STILL make their policy sensitive to the number of portfolios in a composite; they will simply have to specifically identify those composites which will employ the SCF policy. It's a little more cumbersome, perhaps, but it can still be done. There is no requirement that firms have a single SCF policy that applies to all composites (at least not yet!).

Also, one might wonder what other new rules are being introduced this way; I guess we'll find out soon.

Monday, October 15, 2012

TSG Hires Steve Sobhi to head up our western region sales

This was just released by The Spaulding Group:

One possible reason for the reluctance to hear different views

The weekend WSJ often provides fodder for this blog, though this is the first time I recall it coming from Peggy Noonan. Her assessment of Thursday's Vice Presidential debate ("Confusing Strength With Aggression") in this past weekend's edition was quite insightful. But the source for today's post can be seen as independent of the debate critique. She offered the following:

"Age can seem reactionary, resistant to change in part because change carries a rebuke: You and your friends have been doing things wrong, we need a new approach." [emphasis added]

A few of the new ideas that I and others have championed have met with resistance from some in our industry. For example:
  • My call to eliminate the aggregate method of calculating composite returns (for the Global Investment Performance Standards (GIPS(R)))
  • Our desire to see broader acceptance and employment of money-weighted returns.
It had never occurred to me before that some may interpret these suggestions as rebukes, as this was never our intent. But surely we can rise above this, can we not?

The irony is Noonan's linking of the resistance to change to age, as if those who don't want it are older. In spite of my youthful appearance, the reality is that I am one of the oldest folks in our industry! Those who oppose the ideas put forward are in all cases younger than I am. Why must they be so wedded to their ideas that they resist being open to new ones? Can our industry advance when some (and sometimes those who hold positions of authority) refuse to even consider change?

Thursday, October 11, 2012

Valuing securities with stale prices for GIPS compliance

I am preparing to meet with a hedge fund client who wants to comply with GIPS(R) (Global Investment Performance Standards). As part of this preparation I am revisiting the recently published Guidance Statement on Alternative Investment Strategies and Structures.

This guidance makes it clear from the start that what is set forth within it applies to all asset classes.

When the GS was being discussed by the GIPS Executive Committee, I inquired into one particular section:


This essentially gives firms the opportunity to use stale prices, if those prices are deemed "the best estimate of the current fair value of the investment."

I asked (in order to confirm) whether this could be applied to fixed income securities, for example, and was told "yes."

This is a HUGE change. I recall individuals at the annual GIPS conference, around the time the 2010 edition was introduced, asking about the need to revalue for large cash flows when their portfolios might contain less liquid securities (e.g., municipal bonds) that do not price daily. While I don't recall the specific response, I do not believe it was what is stated here.

If a GIPS-compliant manager with such assets finds that they may need to use historical prices, they need to document this in their valuation policy and indicate that they do so when they believe it's the "best estimate of the current fair value of the investment." Alternatives, such as matrix pricing, are also available, but it's good to know that firms can also use historical prices.
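
As a minimal sketch of what such a policy could look like in practice (the function and the price history are hypothetical, purely to illustrate carrying a stale price forward as the best estimate of fair value):

    from datetime import date

    def price_as_of(price_history, valuation_date):
        # Use the most recent available (possibly stale) price on or before the valuation date,
        # on the basis that it is deemed the best estimate of current fair value
        available = [(d, p) for d, p in price_history if d <= valuation_date]
        if not available:
            return None  # nothing available; fall back to another method (e.g., matrix pricing)
        return max(available)[1]

    history = [(date(2012, 9, 14), 101.25), (date(2012, 10, 1), 100.80)]
    print(price_as_of(history, date(2012, 10, 11)))  # 100.8: the October 1 price carried forward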

Wednesday, October 10, 2012

Mixing currency returns

A software vendor client of ours described a situation involving one of their clients. Until a certain date, the portfolio's return was in US Dollars (USD). After that date, it was in Japanese Yen (JPY). The question: can we link these periodic returns?

My initial reaction was to distinguish between local and base returns. The local return is the return of the assets. For the initial period, my guess is that the portfolio was invested in USD-denominated securities, so the local return was in USD. More recently, it appears the portfolio was invested in yen-denominated securities, so the local return is in JPY. To paraphrase my friend, Steve Campisi, you can't eat local returns.

Further, as Denis Karnosky and Brian Singer wrote in their monograph, Global Asset Management and Performance Attribution, "Because of the unavoidable impact of interest rate differentials in controlling exchange rate exposures, local Eurodeposit returns are an inseparable component of currency returns." I.e., investors cannot achieve local returns.

My initial reaction, therefore, was that you wouldn't want to link these different currency returns, and that the investor should be more interested in their base return, which is expressed in their own currency.

However, upon further reflection (which occurred in the wee hours of this morning, while attempting to remain asleep), I was once again reminded of the fundamental principle that one should always keep in mind when discussing returns: what's the question? That is, what question does the investor wish to get an answer to?

If they're interested in how the manager(s) performed at the local currency level, before the impact of currency movements (as well as any hedging which may have been employed) is taken into consideration, then linking these returns is fine. We often mix different currencies together to obtain our local return, so linking them is acceptable. The why behind the question (that is, why would they want to know this?) may have value, perhaps at a minimum from an academic perspective, but in the end, if the client asks that the two periods' returns be linked, this is okay. But again, "what are they hoping to gain from this information?" is a valid point to raise.
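
For what it's worth, the linking itself is just geometric compounding, and arithmetically it doesn't care which currency each period's return was measured in. Here is a small sketch with made-up numbers:

    def link_returns(period_returns):
        # Geometrically link periodic returns into a cumulative return
        cumulative = 1.0
        for r in period_returns:
            cumulative *= (1 + r)
        return cumulative - 1

    # Hypothetical quarterly returns: the first two measured in USD, the last two in JPY
    returns = [0.021, -0.013, 0.034, 0.008]
    print(link_returns(returns))  # a mixed, "local" growth rate, not a base-currency return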

Tuesday, October 9, 2012

True or False: Verification verifies a firm's claim of compliance with GIPS?

In The Spaulding Group's Fundamentals of Performance and GIPS(R) Fundamentals courses, I often pose this question to attendees:

Verification verifies a firm's compliance with GIPS:
true or false?

Simple question, right? What does it mean to be verified?

Is it not intuitive that verification would verify compliance with GIPS (Global Investment Performance Standards)? It's probably not surprising therefore that many, if not most, of those attending will respond "true."

Therefore, it's also not surprising that when they learn that the statement is false, they appear a bit perplexed.

If you check the Standards' glossary you'll learn that verification is "a process by which an independent verifier assesses whether (1) the firm has complied with all the composite construction requirements of the GIPS standards on a firm-wide basis and (2) the firm's policies and procedures are designed to calculate and present performance in compliance with the GIPS standards."

Going through the Standards we also find that it "tests the construction of the firm’s composites as well as the firm’s policies and procedures as they relate to compliance with the GIPS standards." And while it "is intended to provide a firm and its existing clients and prospective clients additional confidence in the FIRM’S claim of compliance with the GIPS standards," and that it "brings additional credibility to the claim of compliance" [emphasis added] it does not specifically verify compliance.

Semantics? Perhaps, but we should be accurate in our claims.

We sometimes see statements such as "XYZ verified our compliance with the Standards." This is technically incorrect, since a verifier does not verify claims of compliance.

Under the AIMR-PPS(R), the verifier did verify compliance; however, this provision did not carry over to GIPS.

Confused? You're not alone.

Friday, October 5, 2012

Announcing Free One-Day Fundamentals of GIPS Workshops

The Spaulding Group has offered training for close to 20 years, including our one-day GIPS Fundamentals Workshop. To date we've trained over 3,500 individuals throughout the world (all over the United States, in several cities in Canada, various parts of Europe, in South Africa, as well as in the Middle East, and in several locations in Australia and Asia). In addition to open enrollment classes, which we hold in various cities throughout the year, we also regularly conduct in-house training.

The Global Investment Performance Standards (GIPS(R)) have become an important part of the investment performance area, but there is still a great amount of confusion and misinterpretation about the Standards. The course has been designed to educate individuals who work with the Standards, as well as to address some of the problem areas we see at many firms.

We have decided to offer this course for FREE, but there are a few catches:
  • The course is free only to investment firms (i.e., money managers and plan sponsors)
  • Each class is limited to only 20 participants, to make it more intimate: easier to share ideas and discuss issues and concerns
  • Only one participant per firm may attend. (If a firm wants to send more than one, then a modest fee of $500 will be applied)
We've already scheduled the course for five dates and locations, with more to follow:
  • Monday, November 19, 2012 (New York, NY)
  • Monday, January 28, 2013 (Los Angeles, CA)
  • Monday, April 15, 2013 (Toronto, ON)
  • Wednesday, May 15, 2013 (Philadelphia, PA)
  • Monday, June 10, 2013 (London, UK)
We have taught similar workshops for the CFA Institute for roughly ten years. This course is intended for those responsible for GIPS and performance measurement, chief compliance officers, marketing, sales, and others for whom the Standards have value and importance.

John D. Simpson, CIPM, Jed Schneider, CIPM, FRM and I will take turns conducting these courses. This means you will be taught by performance and GIPS specialists with more than 20 years' experience each, who work regularly with the Standards, through our verification work as well as consulting.

We are accepting registrations for the November 19th NYC and January 28th LA classes now. If you'd like to join us, please visit our special registration page. But hurry, as the spaces will fill quickly.

Tuesday, October 2, 2012

Rating the ways to present performance to prospective clients

I am conducting a "non-GIPS(R)" verification this week for a client who uses "rep" (representative) account performance for marketing purposes. This particular firm is not easily able to comply with GIPS (Global Investment Performance Standards) today, and so this seems to be a reasonable alternative.

Last week we held a meeting with the Universal Advisor Performance Standards (UAPS) board, to continue our review of the draft. We briefly discussed the various approaches to creating historical performance for marketing.

These events caused me to think that perhaps we should develop a rating scale that ranks the various approaches, from "best practice" to "not best practice" (would we call the worst of these "worst practice"?). The following graphic represents my current thinking:


Perhaps we should have numeric designations for these, too. I'd give composite returns that are derived using equal weighting a ten (or maybe it should be a 9.9). Note that asset weighting has three options, and these should get individual ranks: perhaps 9.5 when beginning value plus weighted cash flows is used, 9.3 when beginning value alone is used, and 9.1 (or perhaps 8.9?) when the aggregate method is used.

As an aside, I've addressed the asset-weighting issue in the past, and my basic question would be: what value is it to an investor to see an asset-weighted composite return if they are not aware that the result may be skewed by one or two very large accounts? As for the aggregate method, I stand by my earlier position that this return fails to meet the definition of a composite return and can present nonsensical results.
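
To illustrate the skewing concern with a hypothetical example (made-up account values and returns), compare an equal-weighted composite return with one asset-weighted by beginning value:

    # (beginning value, monthly return) for a hypothetical five-account composite
    accounts = [(1_000_000, 0.020), (1_200_000, 0.030), (900_000, 0.025),
                (1_100_000, 0.028), (50_000_000, -0.010)]  # one very large account

    equal_weighted = sum(r for _, r in accounts) / len(accounts)

    total_beginning_value = sum(bv for bv, _ in accounts)
    asset_weighted = sum(bv * r for bv, r in accounts) / total_beginning_value

    print(round(equal_weighted, 4))   # 0.0186: close to the typical account's experience
    print(round(asset_weighted, 4))   # -0.0072: dominated by the one large account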

I will have more to say on this in this month's newsletter. In the meantime, feel free to offer your thoughts.

Monday, October 1, 2012

Responding to a GIPS Composite Question

We were recently sent the following question, which relates to GIPS(R) (Global Investment Performance Standards) and composites:

What becomes of a composite's performance if the strategy (value, growth, etc.) remains the same, but a new team of portfolio managers and analysts takes over the strategy and implements a new and different investment process and construction for the composite?

Does a new composite need to be created? Either way, what disclosure should be made so that the difference in performance is clearly indicated?

Our reply:

It has to do with "materiality." Performance belongs to the "firm." There is an expectation that staff will change over time. If the change is materially significant, then creating a new composite may be justified; however, I'm thinking of changes of the magnitude of almost going from a fundamental to a quantitative style of investing. If the strategy, per se, hasn't changed, it might be difficult to justify creating a new composite.

Changes in management would be considered "significant events," requiring a disclosure. And so, disclosing that the management changed effective a certain date will communicate to the prospect that the "firm" has managed the strategy for the period shown, and that as of that date the management changed. Likewise, if the benchmark changes, you would document this.

Agree? Have different thoughts? Please chime in!