
Articles tagged 'derivatives'

This is no way to run a financial system

The micro-cracks are turning into fissures, soon to be gaping crevasses as (finally) the obsolescence of our industrial age banking system plays itself out in spectacular front page headlines. Meanwhile it would seem that our society and our leaders are (mostly) frozen in some kind of macabre trance – eating popcorn and mesmerized by the inevitable Crash.

If you look at the LIBOR scandal in the context of the technology of the fast emerging information economy, it is absolutely mind-boggling that such an anachronistic process even exists in the world of 2012. In a world where every financial flow is digitized and only really exists as an entry in a database. In a world where truly enormous real-time data sets (ones that make the underlying data required for a true LIBOR look puny) are routinely captured and analyzed in the time it takes to read this sentence. In a world where millions (soon billions) of people have enough processing power in their pocket to compute complex algorithms. In a world where a high school hacker can store terabytes of data in the cloud.  In this world, we continue to produce one of the most important inputs into global financial markets using the equivalent of a notebook and a biro… WTF???

You think I’m joking? Libor is defined as:

The rate at which an individual Contributor Panel bank could borrow funds, were it to do so by asking for and then accepting inter-bank offers in reasonable market size, just prior to 11.00 London time.

For each of the 10 currencies, a panel of 7-18 contributing banks is asked to submit its opinion (yes, you read that right) each morning on what each rate (by maturity) should be. The published rate is then the “trimmed arithmetic mean”; basically they throw out the highest and lowest submissions and average the rest. No account is taken of the size, creditworthiness or funding position of each bank, and the sample size after the “trimming” for each calculation is between 4 and 10 banks. However, the BBA assures us that this calculation method means that:

…it is out of the control of any individual panel contributor to influence the calculation and affect the bbalibor quote.

You don’t need to be a banker, or a quantitative or statistical genius, or an expert in sociology, or even particularly clever to figure out that this is a pretty sub-optimal way to calculate any sort of index, let alone one that has an impact on the pricing and outcomes of trillions of dollars’ worth of contracts…
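To make the mechanics above concrete, here is a sketch of the “trimmed arithmetic mean” fixing (the exact number of quotes discarded varies with panel size; the panel rates are invented for illustration):

```python
def bbalibor_fixing(submissions):
    """The 'trimmed arithmetic mean' described above (a sketch): sort the
    submissions, throw out the top and bottom quartile, average the rest."""
    s = sorted(submissions)
    k = len(s) // 4                       # quotes to discard at each end
    trimmed = s[k:len(s) - k] if k else s
    return sum(trimmed) / len(trimmed)

panel = [2.10, 2.12, 2.13, 2.15, 2.15, 2.16, 2.18, 2.20]
print(round(bbalibor_fixing(panel), 4))  # 2.1475

# A single bank lowballing its quote still drags the fixing down,
# even though its own quote gets trimmed out of the average.
panel[4] = 2.10
print(round(bbalibor_fixing(panel), 4))  # 2.14
```

Note that no submission is weighted by size or checked against an actual transaction; the “trimming” merely discards the extremes.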

In the 1980s, when LIBOR was invented – and (lest the angry mob now try to throw the baby out with the bathwater) it should be said it was an important and good invention – this methodology might just have been acceptable as the “best practical solution available given the market and technological context.” Banks used to have to physically run their bids in Gilt auctions to the Bank of England (which is why historically banks were located in the City; it was tough to compete on that basis from the West End or Canary Wharf, at least without employing a few Kenyan middle-distance Olympians…) But you know what? And this is shocking, I know… They don’t do it that way anymore!!!

So if LIBOR is important (and it is), how should we be calculating it in the 21st century? Here are a few ideas:

  • include all banks participating in the market – and not necessarily just those in London – how about G(lobal)IBOR??
  • collect and maintain (in quasi-real time) important meta-data for each contributing bank (balance sheet size and currency breakdown of same by both deposits and loans, credit rating, historical interbank lending positions, volatility/consistency of submissions, derivative exposure to LIBOR rates, etc.)
  • collect rates and volumes for all realized interbank trades and live (executable) bids and offers (from say 9-11am GMT each day)
  • build robust, complex (but completely transparent and auditable) algorithms for computing a sensible LIBOR fixing from this data; consider open-sourcing this using the Linux model (you might even get a core LIBOR and then forks that consenting counterparties might choose to use for their transactions, which is ok as long as the calculation inputs and algorithms are totally transparent and subject to audit upon request[1])
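One way to make the wish-list above concrete is to compute the fixing from realised trades rather than opinions. A minimal sketch (the weighting scheme and all numbers are my own illustration, not a proposal spec):

```python
def transaction_fixing(trades):
    """One candidate 'newLIBOR' algorithm (illustrative only): a
    volume-weighted average rate over all realised interbank trades in
    the collection window, so size actually counts -- unlike a panel
    of unweighted opinions."""
    total_volume = sum(volume for _, volume in trades)
    return sum(rate * volume for rate, volume in trades) / total_volume

# (rate, volume in millions) for trades executed in the 9-11am window
trades = [(2.12, 500), (2.15, 1200), (2.18, 300)]
print(round(transaction_fixing(trades), 4))  # 2.147
```

A production algorithm would of course fold in the bank metadata above (credit rating, funding position, consistency of submissions), but even this trivial version is anchored to things that actually happened.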

This is not only possible but in fact relatively trivial today. Indeed, companies like the Climate Corporation*, Zoopla*, Metamarkets*, Palantir and Splunk (and dozens and dozens more, including newcomers like Indix* and Premise Data Corp) routinely digest, analyze and publish analogous datasets that are at least as big and complex as the newLIBOR I’m suggesting – almost certainly far more so.

Indeed, the management of this process could easily be outsourced to one – or better, many – big data companies, with a central regulatory authority playing the role of guardian of standards (the heavy lifting of which could itself be outsourced to other smart data-processing auditors…) In theory this “standards guardian” could continue to be the BBA (the “voice of banking and financial services”), but the political and practical reality is that it should almost certainly be replaced in this role, perhaps by the Bank of England; given the global importance of this benchmark, though, I think it is worth thinking creatively about what institution could best play this role. Perhaps the BIS? Or ISO? Or a new agency along the lines of ICANN or the ITU – call it the International Financial Benchmarks Standards Institute (IFBSI)? The role of this entity would be to set the standards for data collection, storage and computation, and to vet and safekeep the calculation models and the minimum standards (including the power to audit at any time) required to be a calculation agent (a kitemark). Under this model you could have multiple organizations – both private and public – publishing the calculation, and in principle, if done correctly, they should all get the same answer (same data in + same model = same benchmark rate). A pretty basic “many eyes” principle to improve robustness and quickly identify corrupt data or models.
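The “same data in + same model = same benchmark rate” check can be sketched in a few lines (the agent names and the toy averaging model are hypothetical; a real deployment would run the full open-sourced algorithm):

```python
import hashlib
import json

def publish(agent, data, model):
    """Each calculation agent runs the same open model on the same input
    data and publishes its rate plus a hash of the inputs it used, so
    anyone can verify that all agents worked from identical data."""
    digest = hashlib.sha256(json.dumps(data, sort_keys=True).encode()).hexdigest()
    return {"agent": agent, "rate": model(data), "inputs_sha256": digest}

model = lambda rates: round(sum(rates) / len(rates), 4)   # stand-in model
data = [2.12, 2.15, 2.18]
reports = [publish(a, data, model) for a in ("BigDataCo", "CentralBank", "AuditCorp")]

# Any divergence in (rate, hash) pairs flags corrupt data or a corrupt
# model at one of the agents -- the "many eyes" check.
print(len({(r["rate"], r["inputs_sha256"]) for r in reports}) == 1)  # True
```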

As my friend (and co-founder of Metamarkets and now Premise Data Corporation) David Soloff points out:


And it’s not just LIBOR as Gillian Tett highlights in the FT:

If nothing else, this week’s revelations show why it is right for British political figures, such as Alistair Darling, to call for a radical overhaul of the Libor system. They also show why British policy makers, and others, should not stop there. For the tale of Libor is not some rarity; on the contrary, there are plenty of other parts of the debt and derivatives world that remain opaque and clubby, and continue to breach those basic Smith principles – even as bank chief executives present themselves as champions of free markets. It is perhaps one of the great ironies and hypocrisies of our age; and a source of popular disgust that chief executives would now ignore at their peril.

Rather than join the wailing crowd of doomsayers, I remain optimistic. The solutions to this – and other similar issues in global finance – either exist or are emerging at a tremendous pace. I know this because this is what we do here at Anthemis. But I’m clear-headed enough to know that we have only a tiny voice. It would seem that our long-predicted Financial Reformation is starting to climb up the J-curve. I just hope that if Mr. Cameron does launch some sort of parliamentary commission, voices that understand both finance and technology are heard and listened to. Excellent, robust, technology-enabled solutions are entirely within our means; I’m just not confident that the existing players have the willingness to bring these new ideas to the table.

* Disclosure: I have an equity interest, either directly or indirectly in these companies.

[1] There may exist some good reasons for keeping some of the underlying data anonymous, but I think it would be perfectly possible to find a good solution whereby the data was made available to all for calculation purposes but the actual contributor names and associated price, volume and metadata were kept anonymous and known only to the central systemic guardian. Of course you’d have to do more than just replace the bank name with some static code; it would need to be dynamically changing – different keys for different calculation agents, etc. – but all very doable, I’m sure. You’d be amazed what smart kids can do with computers these days.
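For what it’s worth, those dynamically changing codes could be built from keyed hashes – a sketch, with the key handling and all names purely illustrative:

```python
import hashlib
import hmac

def pseudonym(bank, calc_agent, date, master_key):
    """Dynamic per-agent anonymisation as suggested in the footnote: the
    central guardian derives a different opaque code for each (bank,
    calculation agent, day) triple, so codes can't be correlated across
    agents or over time -- yet the guardian, holding the master key,
    can always regenerate and reverse the mapping."""
    msg = f"{bank}|{calc_agent}|{date}".encode()
    return hmac.new(master_key, msg, hashlib.sha256).hexdigest()[:12]

key = b"guardian-master-secret"   # held only by the systemic guardian
a = pseudonym("BankX", "AgentA", "2012-07-05", key)
b = pseudonym("BankX", "AgentB", "2012-07-05", key)   # different agent
c = pseudonym("BankX", "AgentA", "2012-07-06", key)   # different day
print(a != b and a != c)  # True: same bank, three uncorrelatable codes
```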


Markets in everything, Part 471: Hooray for Hollywood

The LA Times published an interesting article yesterday discussing the arrival of two new exchanges focused on helping hedge box office risk:

Two trading firms, one of them an established Wall Street player and the other a Midwest upstart, are each about to premiere a sophisticated new financial tool: a box-office futures exchange that would allow Hollywood studios and others to hedge against the box-office performance of movies, similar to the way farmers swap corn or wheat futures to protect themselves from crop failures.

The Cantor Exchange, formed by New York firm Cantor Fitzgerald and set to launch in April, last week demonstrated its system to 90 Hollywood executives in a packed Century City hotel conference room….

…On Wednesday, Indiana company Veriana Networks, which says its management includes “veterans of the Chicago exchange community,” unveiled the Trend Exchange, its own rival futures exchange for box-office receipts.

These are exactly the kind of novel risk management marketplaces that will continue to emerge over the next 5 to 10 years, as technology enables robust, easy and cost-effective trading and settlement mechanisms, and as data (the raw material of any exchange or risk management toolkit) continues to grow in size, richness and availability across every sector of the economy. Indeed, the greatest impediment to the development of such markets is cultural: there is still an irrational, sometimes hysterical, aversion to any risk management tool that is non-traditional and can be characterized as gambling. Of course gambling, trading and hedging are indistinguishable in practice; they can only be differentiated in context and really only represent differences in intent. As such, it is very difficult to proscribe one while allowing the others. There are, however, reasonably good, tried and tested regulatory frameworks, developed over decades, for managing unhealthy practices (insider trading, market abuse, etc.) in traded markets for outcomes and commodities. Using these, regulators should be happy to quickly approve as many new marketplaces or exchanges as creative entrepreneurs and traders invent, and let a thousand flowers bloom. I don’t think it is for the regulators to second-guess who might be interested in trading such markets and why, as long as the market rules and framework are robust and transparent, and participants are swiftly held accountable for any abusive behavior.

But that certainly isn’t the way the establishment sees things and even those that are developing new markets often see their market as an exceptional addition to the risk management landscape rather than a specific example of a more general case. (Although to be fair this may be simply a tactic to curry favor with the forces defending the status quo in order not to appear to be too heretical and so smooth approval for their specific new initiative.)

“The day that a widow or orphan bets against ‘Finding Nemo 3’ — that’s not a good day,” said Rob Swagger, Veriana’s chief executive.

Why? Why shouldn’t anyone be able to put their knowledge and insights to work to make a return? Why is it ok for a ‘widow or orphan’ to bet on GE’s future performance (by buying or selling its shares) but not to bet on the potential return of a film? And why the view that certain risks or outcomes are worthy of being traded and managed but not others? It simply doesn’t make sense.

Government authorities have generally approved only those futures exchanges that allow for the redistribution of a preexisting risk. Sports betting is not approved because, unlike a farmer selling a futures contract to offset losses from crop failure, neither party involved in the wager has an economic interest in the underlying event.

This statement is of course patently ridiculous. Many, many agricultural risk contracts are traded amongst principals who are neither producers nor end consumers, and to say that there are no ‘real world’ economic risks that could be managed via sports trading is just silly, given that sports is an enormous global business with hundreds of billions of dollars of capital at risk. And if that weren’t enough, it is happening anyway, with admittedly high risks of fraud and abuse. Wouldn’t it make more sense (in the context of protecting vulnerable market participants) to encourage regulated, robust, well-monitored marketplaces rather than cling to the current Potemkin-esque prohibition? (Disclosure: I am a shareholder in Betfair.)

In any event, I can only endorse Cantor’s vision of creating a new, more vibrant and useful market for managing risk and structuring finance in the entertainment industry:

Now Cantor hopes for its exchange to be the first of many complex financing products for the entertainment industry. In one of the more ambitious plans, Jaycobs wants to team with filmmakers to create something like an initial public offering of stock in a specific film, staking out a potential new way to finance production.

And I hope they (and Trend Exchange), working alongside the CFTC, are able to quickly illustrate that well-built and well-regulated marketplaces can mitigate the potential dangers while at the same time providing a powerful and useful set of tools for managing risk and generating returns. Perhaps this will help pry open the door to seeing more and more outcome markets develop over the course of the next several years.


Probably the best start-up you’ve never heard of.

Today Markit Group announced that General Atlantic has invested $250 million, valuing the 7-year-old company at a whopping $3.3 billion. Founded by Lance Uggla, Kevin Gould and Rony Grushka in 2003 to address the growing need for quality data in the burgeoning credit derivatives market, the company has delivered several years of unbelievably good execution and disciplined acquisitions, which have positioned it as a critical component at the heart of the trillion-dollar OTC derivative markets. The products they provide aren’t considered sexy (something that is often given all too much importance in this status-conscious industry) – but their data, valuations, indices, trade processing and other products and services are the plumbing that keeps many financial markets running. They are a great example of creating value by building a great platform and understanding how to monetize data. I had the good fortune to be a non-executive director from 2003 to 2006, and I can say without hesitation that this team is one of the best I’ve ever seen and fully deserves the success they have achieved. (Congratulations guys. Awesome, truly awesome.)

And I am certain there is more to come. Their primary constraint has been, and will likely continue to be, the physical/logistical limitations of growing as fast as they have, but each year they only improve, and in terms of disciplined and deliberate acquisitions the company they most remind me of is Cisco. Besides, General Atlantic doesn’t invest in companies where they don’t think they can make 20-30% annual returns or more.

And yet many (most) people in the ‘start-up’/’tech’ scene, whether in the UK or the US, have never (or only vaguely) heard of Markit. (For example, I counted only about 50 or so tweets referencing today’s announcement – fewer than for any start-up launching on TechCrunch…) Why is that? Obviously I can’t say for sure, but (in no particular order) I would guess the explanation lies in the following:

  • not venture capital funded; funding came initially from its cornerstone customers, the investment banks, and later from some very smart hedge funds
  • focused on the wholesale financial services industry (not on consumer, media or other mass markets)
  • key products and services (and associated economics) unknown to those outside finance and, even worse, generally considered ‘boring’
  • management team laser-focused on execution, not PR (although to be fair they had this luxury, not needing to sell to the mass market)
  • and so folks like TechCrunch and VentureBeat don’t know or write about them (aka “if a start-up isn’t listed in CrunchBase, does it really exist?” syndrome)

Indeed, for me Markit is a poster child for the cognitive, cultural and expertise chasm that exists between ‘Wall Street’ and ‘the Valley’ (or the ‘City’ and the ‘Roundabout’, to use the weaker UK-centric metaphor.) They might as well be on different planets. Bridging this divide is at the core of what we set out to do at Nauiokas Park, and was the driver that led Paul Kedrosky and Tim O’Reilly to launch the Money:Tech conference in 2008 (which sadly didn’t survive the financial crisis and, quite frankly, was met with deafening indifference by the vast majority of the Wall Street side of the equation.)

And yet the opportunities available to those who can successfully bridge this gap are enormous. Well, anyway, that’s what we think. And the crisis in venture capital ostensibly caused by too much capital? I’m going to disagree with Paul and Fred and suggest it’s not too much money overall; rather, it’s too much money concentrated with too few investors, focused on too few sectors, who all end up chasing the same deals. So to the LPs out there my message would be: don’t shrink the pool, enlarge the opportunity space. Oh, and try to make sure you’ve got exposure to the next Markit Group.


Data, AI, Web, Repeat.

Today Zoopla announced a further GBP3.75 million investment round, in which we are very excited to be participating alongside Atlas Venture and Octopus Ventures. I will also be joining their advisory board. We were first introduced to the very talented founder and CEO, Alex Chesterman, by my friend Fred Destin almost a year ago, after I had congratulated him on Atlas’ original investment in Zoopla and expressed my admiration for Zoopla’s site and approach.

So what is Zoopla!? In their own words: is a unique property website offering users information and tools to help them make better-informed property decisions. Our aim is to provide the most comprehensive source of residential property market information in the UK to help buyers, sellers, owners and estate agents alike and give them an advantage in the property market…

…We have started by providing FREE value estimates, sold prices and local information as well as letting users add content by editing information and uploading photos. We are the UK’s fastest growing property website and by far the largest and most active property community in the UK, with over a million user contributions to our website in 2008 alone…

…Our value estimates are calculated using a proprietary algorithm (a secret formula) that we have developed by analysing millions of data points relating to property sales and home characteristics throughout the UK. The algorithm works by comparing relationships between home prices, economic trends and property characteristics in given geographic areas. Our estimates are constantly refined, using the most recent data available and a variety of statistical methodologies, in order to provide the most current information on any home.

We are still testing and improving our features and tools and recognise that things aren’t perfect yet…
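The “secret formula” above is proprietary, but the general shape of a comparables-based estimator can be sketched. Everything here – the weighting scheme, field names and numbers – is my own illustration, not Zoopla’s actual method:

```python
def estimate_value(target, comparables):
    """A toy comparables-based value estimate: price nearby sold
    properties per square metre, weight recent sales more heavily,
    then scale the weighted average to the target's floor area."""
    weighted, weights = 0.0, 0.0
    for sale_price, floor_area, years_ago in comparables:
        w = 1.0 / (1.0 + years_ago)            # fresher sales count more
        weighted += w * (sale_price / floor_area)
        weights += w
    return (weighted / weights) * target["floor_area"]

# (sale price, floor area m2, years since sale) for nearby sold homes
comps = [(250_000, 80, 0.5), (310_000, 100, 1.0), (270_000, 90, 2.0)]
print(round(estimate_value({"floor_area": 85}, comps)))  # 262556
```

A real system layers on location, property characteristics, economic trends and the user-contributed corrections discussed below; the point is simply that the estimate is a function of data, and gets better as the data does.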

So what’s so interesting about Zoopla!? Or perhaps more specifically, how does Zoopla fit into Nauiokas Park’s investment universe? Two words: rich data.

  1. In Zoopla, Alex and Simon Kain (co-founder and CTO) have leveraged the web to feed intelligent algorithms that allow them to bootstrap basic, publicly available data into an increasingly robust, accurate, rich and granular dataset of UK residential property.
  2. They have built the site in a way that naturally compels visitors to improve and enrich the dataset. This user-generated data is not only very valuable but is itself subject to Metcalfe’s Law, and so adds tremendously to the sustainable advantage of the site and their database. This is not trivial. When I was running a Credit Trading business, data-quality issues on complex instruments were absolutely critical to running the business efficiently and to effective risk management. We, like other banks, were plagued with bad-quality (inconsistent, out-of-date, missing, etc.) data. As part of the ‘web-ification’ of our business (pre Digital Markets stuff), one of the single most effective things we did was to expose our various data structures to broad populations of users within the bank and allow them to correct and enhance the data on an ad hoc basis. Of course the ‘data priests’ were aghast… but it worked. Really, I think it’s just a variation of Linus’ Law: “given enough eyeballs, all bugs are shallow.”

But how does a unique, rich, ever-improving, granular, transparent database of UK property prices fit with Nauiokas Park’s focus on disruptive business models and technologies in financial services and markets? Well, we think Zoopla is ideally positioned to drive, and benefit from, a fundamental shift in the economic structure underlying the property markets (a theme regular readers will recognise): the shift from a market predicated on information scarcity to one built on information abundance. And you don’t even have to be particularly clever to work out how this is likely to play out, as property is just the ith market in a series of [N] markets to have this thrust upon it. I don’t want to give too much away, but for the City types out there, just think back to the bond markets of 1990. (For Wall Street types you only have to think back to, oh, about 2004…) All other things being equal, as this “phase change” occurs in an industry, value moves away from transactions (matching) to data. (Think Merrill Lynch vs. Bloomberg LP over the past few years as a reasonable pair trade in this vein. Or all investment banks vs. Markit Group…)

Post-2008, even the proverbial man in the street knows there was a data… how would you say… “issue”… at the intersection of residential property and finance. Now I’m not suggesting (not quite, anyway) that had Zoopla existed and been well established globally years ago, the sub-prime (sub-crime?) crisis would not have occurred (stupid is as stupid does)… but having easy access to the kind of readily “digestible” data available from Zoopla would clearly have been a boon to any responsible mortgage underwriter or securitization professional. In fact, I’d go so far as to say that were I an institutional investor in UK RMBS today, I would require that the underwriters/originators of the pools provide me with an FTP feed of the individual Zoopla data for every property in the pool. And if I were running, say, a big UK mortgage book and/or originator, I would certainly be interested in having an independent, automated, external mark-to-market run at least monthly, probably weekly… you get the idea.
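That external mark-to-market is a very small amount of code once the data feed exists. A sketch (the field names and figures are hypothetical; a real feed would carry far richer per-property metadata):

```python
def pool_mark_to_market(loans, valuations):
    """Re-mark every property in an RMBS pool against an independent
    valuation feed and recompute the pool's aggregate loan-to-value --
    the monthly/weekly external mark suggested above."""
    balance = sum(loan["outstanding"] for loan in loans)
    value = sum(valuations[loan["property_id"]] for loan in loans)
    return balance / value                  # current pool LTV

loans = [{"property_id": "p1", "outstanding": 180_000},
         {"property_id": "p2", "outstanding": 140_000}]
marks = {"p1": 200_000, "p2": 160_000}      # independent value estimates
print(round(pool_mark_to_market(loans, marks), 3))  # 0.889
```

Run weekly against fresh estimates, a rising pool LTV is an early-warning signal no originator’s own stale appraisal file will give you.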

And finally, whenever you have good, digital, reproducible data, well, there my friend you have the makings of a myriad of listed and OTC markets in that underlying. Think Case-Shiller, only better.
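For readers who haven’t met Case-Shiller: it is built on repeat sales of the same properties. A stripped-down version of that idea (illustrative only; the real methodology is an interval-weighted regression):

```python
from math import exp, log

def repeat_sales_index(pairs):
    """A toy repeat-sales appreciation factor in the Case-Shiller
    spirit: average the log price change of properties that sold
    twice, then exponentiate to get the market's growth factor."""
    growth = [log(second / first) for first, second in pairs]
    return exp(sum(growth) / len(growth))

# (first sale, second sale) for the same properties
pairs = [(200_000, 220_000), (150_000, 162_000), (300_000, 321_000)]
print(round(repeat_sales_index(pairs), 4))  # 1.0833, i.e. ~8.3% appreciation
```

With a rich, continuously updated property database you could compute something like this at a far finer geographic and temporal grain than a quarterly metro-level index.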

We are truly excited by the myriad of business opportunities available to Zoopla as it continues to grow and improve its core database and builds products and services on top, but perhaps most exciting is being able to participate once again at the early stages of a company that is set to play a key role in transforming an important and large marketplace, reducing friction and creating an entirely new value paradigm. Even reminds me a little of another UK start-up you might have heard of called Betfair… And we can’t wait to see what Alex and the team will achieve in the next few years and look forward to helping them in any way we can.

So, if you live in the UK, what are you waiting for? Go Zoopla! your home, claim it, enhance the data and presto, you now have effectively a pretty good proxy ticker-tape for (probably) the most important asset you own.


Alchemy is not a good core strategy in financial services.

Last September I was asked to give a presentation at the DerivaTech conference in London on the merits of derivative markets. My basic premise was that derivatives are (just) tools: they can be incredibly useful and are not intrinsically ‘good’ or ‘bad’ but rather their utility (or danger to society) depends on how they are used. You can use a hammer to build a house. Or you can use it to bash someone’s head in. Getting rid of hammers because of this undesirable use case obviously wouldn’t make too much sense.

Further, I made the case that the industry had done itself an enormous disservice by “using the hammer” in the “wrong” way – by (deliberately) exploiting the ability of derivatives to obfuscate, the industry had not only ended up losing hundreds of billions but had done a great job in destroying perhaps its single most important core value creator. That of course would be trust. And in the bargain all the beneficial uses of derivatives risked being thrown out with the proverbial bath water.
The arbitrage alchemists...
Basically, as their traditional businesses and cash cows – agency trading, underwriting, etc. – had their margins melt and their business models and compensation structures made obsolete by the rise of the networked information economy (destroying the information scarcity which lay at the core of the traditional banking business model), the banks turned more and more to principal risk-taking – prop trading, derivatives ‘arbitrage’, etc. – to make up the difference. Putting aside the moral hazard issues this raised (too big to fail, insured deposits, etc.), and ignoring for a moment whether or not it is an intrinsically good business model for a bank, it got worse, as this shift coincided with a long period of low volatility and benign economic growth… This meant that the (real) opportunities disappeared quickly and – still needing to shore up the bottom line, to feed the blue line – what had started out as science slowly but surely slid into alchemy.

Of course this didn’t happen overnight, but slowly and therein lay the heightened danger: like the apocryphal frog boiling in a slowly heating pot, what started out as useful and reasonable ended up dangerous and irresponsible.

It strikes me that the whole Madoff affair was in fact a particularly acute and egregious manifestation of this phenomenon. I was reminded of this by Andy Kessler’s excellent analysis in Forbes:

My guess is that this is what went down. Even though Madoff Securities was on the leading edge of automated trading, the business itself was becoming less and less lucrative. Everyone had the same computers. Spreads, the difference between the bid price and the ask price that became Wall Street trading profits, began shrinking. And the move to list stocks in penny increments instead of eighths (12.5 cents) whacked trading desks all over Wall Street.

So you make it up in volume. Beyond cocktail parties, Madoff really created the money management business to feed himself trades. But his strategy was garbage. He absolutely bombed as a money manager, but he desperately needed the assets under management to feed his trading operations, so he started to make the numbers up. As is usually the case, most don’t set out to be crooks, but Madoff became one when his talents proved lacking. There is your “why.”

It’s not new. This was the Enron story: They lost tons in water ventures and Indian power plants, so concocted fraudulent entities to cover up their losses. Same for Sam Israel and his Bayou hedge fund. And even (without the fraud) the Citigroup/Wall Street story, too. They tried to be investors to make up the difference of their bread-and-butter business deteriorating and were awful at it, so they levered up in off-balance-sheet vehicles.
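The spread compression Kessler describes is easy to quantify with back-of-envelope arithmetic (all figures here are hypothetical, purely to illustrate the order of magnitude):

```python
def annual_spread_revenue(shares_per_day, spread, capture=0.5, days=250):
    """Back-of-envelope market-making revenue: daily volume times the
    quoted spread, assuming the desk captures half the spread on
    average, over a 250-day trading year."""
    return shares_per_day * spread * capture * days

eighths = annual_spread_revenue(5_000_000, 0.125)  # pre-decimalisation
pennies = annual_spread_revenue(5_000_000, 0.01)   # post-decimalisation
print(int(eighths), int(pennies))
```

On the same volume, moving from eighths to pennies takes this hypothetical desk from roughly $78m to roughly $6m a year – which is exactly the hole that “making it up in volume” (or making the numbers up) was meant to fill.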

So why are smart people seduced into these kinds of strategies (i.e. bloody-mindedly pursuing disappearing returns to the point of destruction)? Obviously any trite answer in a blog post will fail miserably to do justice to this question, but if I had to venture a pithy hypothesis, it would be that – like it or not – most people are wired to prefer risking conventional failure over embracing unconventional success. Just ask the behavioral finance guys… I think it has something to do with continuing to dance.

So I can get my head around a ‘Madoff’ happening. What is harder to understand is what on earth the funds-of-funds who invested so much money with him were thinking. I may be obtuse, but I thought the main (the only?) reason for these businesses to exist was to identify, understand and monitor good investment managers. On this I have to say I agree with Martin: financial companies that made money selling Madoff products should return their commissions. And it is worth pointing out that regulators haven’t exactly covered themselves in glory either (which should be a cautionary tale for those who suggest that regulation is a panacea…)

Perhaps the only good thing to come out of all of this is that the cult of secrecy that for too long permeated finance will disappear. Don’t misunderstand me, there is a time and place for confidentiality. But too often it is indiscriminately invoked like some sort of fantastical talisman – out of all proportion and context – to hide not skill but incompetence.

And to end on a more optimistic note: the problem is with the ‘traditional’ (i.e. 19th/20th century) business models in finance, not with finance itself. And here at the dawn of the 21st century there is an abundance of opportunity to discover, invent and build the financial services industry of the future. This hasn’t changed in 2008. It has just become a bit more likely to happen sooner rather than later. Remember the wise words of William Gibson:

The future is already here – it’s just not evenly distributed.


Sunshine Guaranteed.


Great to see David and his team at WeatherBill line up another great deal – this time with Priceline, bringing the benefits of (weather) derivatives to Main Street:

Under the limited-time Sunshine Guaranteed promotion launched today, customers who book a qualifying Priceline vacation package between June 2 and July 17, 2008 and travel between July 1 and September 7, 2008 will be eligible for a refund if their vacation is rained out. For full details on Priceline’s Sunshine Guaranteed promotion, visit:

Brett Keller, Priceline’s Chief Marketing Officer, commented, “Ten years ago with our Name Your Own Price® launch, and more recently with our elimination of booking fees on published-price domestic and international airfares, Priceline has demonstrated a commitment to continually innovate in order to get great deals for our customers. Now we’re also offering them great weather. Best of all, these Sunshine Guaranteed vacations are available at the same great prices we offer for all of our packages. Our customers can book their Sunshine Guaranteed trips and rest assured that there’s a silver lining waiting if Mother Nature doesn’t cooperate.”

There is no additional charge to book a Sunshine Guaranteed vacation package. Qualifying vacation packages must be 3-8 days in length. Travel must commence at least 12 days after a package is purchased. If it rains more than 0.50 inches per day on half or more of the days of a Sunshine Guaranteed vacation (including travel days), Priceline will provide a refund for 100% of the cost of the airfare, hotel, rental car, and attractions and services components of the Sunshine Guaranteed vacation package.
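The payout rule quoted above is precise enough to express directly in code – which is rather the point of parametric weather derivatives like this one:

```python
def sunshine_refund_due(daily_rainfall_inches):
    """Payout rule as described in the promotion: a full refund is due
    if it rains more than 0.50 inches per day on half or more of the
    days of the trip (travel days included)."""
    rainy = sum(1 for r in daily_rainfall_inches if r > 0.50)
    return rainy * 2 >= len(daily_rainfall_inches)

# 7-day trip with four days of heavy rain -> refund triggered
week = [0.0, 0.6, 0.8, 0.1, 0.55, 0.7, 0.0]
print(sunshine_refund_due(week))  # True
```

Because the trigger is an objective, measurable index (rainfall) rather than a loss assessment, the whole contract can be settled automatically from weather-station data, with no claims adjusters involved.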
