The micro-cracks are turning into fissures, soon to be gaping crevasses as (finally) the obsolescence of our industrial age banking system plays itself out in spectacular front page headlines. Meanwhile it would seem that our society and our leaders are (mostly) frozen in some kind of macabre trance – eating popcorn and mesmerized by the inevitable Crash.
If you look at the LIBOR scandal in the context of the technology of the fast emerging information economy, it is absolutely mind-boggling that such an anachronistic process even exists in the world of 2012. In a world where every financial flow is digitized and only really exists as an entry in a database. In a world where truly enormous real-time data sets (ones that make the underlying data required for a true LIBOR look puny) are routinely captured and analyzed in the time it takes to read this sentence. In a world where millions (soon billions) of people have enough processing power in their pocket to compute complex algorithms. In a world where a high school hacker can store terabytes of data in the cloud. In this world, we continue to produce one of the most important inputs into global financial markets using the equivalent of a notebook and a biro… WTF???
The rate at which an individual Contributor Panel bank could borrow funds, were it to do so by asking for and then accepting inter-bank offers in reasonable market size, just prior to 11.00 London time.
For each (of 10) currencies, a panel of 7-18 contributing banks is asked to submit its opinion (yes, you read that right) each morning on what each rate (by maturity) should be. The published rate is then the “trimmed arithmetic mean”; basically they throw out the highest and lowest quartiles of submissions and average the rest. No account is taken of the size or creditworthiness or funding position of each bank, and the sample size after the “trimming” for each calculation is between 4 and 10 banks. However, the BBA assures us that this calculation method means that:
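For the mechanically minded, the entire published methodology boils down to a few lines of code. Here is a minimal sketch in Python (the panel quotes below are invented for illustration; the real inputs are, of course, just opinions):

```python
def trimmed_mean_libor(submissions):
    """BBA-style 'trimmed arithmetic mean': discard (roughly) the top and
    bottom quartile of submitted rates and average whatever is left."""
    n = len(submissions)
    k = n // 4  # quotes discarded from each end
    kept = sorted(submissions)[k:n - k]
    return sum(kept) / len(kept)

# An invented 16-bank panel: the 4 highest and 4 lowest quotes are dropped,
# so the global fixing rests on just 8 unverified opinions.
quotes = [0.50, 0.51, 0.52, 0.53, 0.54, 0.55, 0.55, 0.56,
          0.56, 0.57, 0.58, 0.58, 0.60, 0.62, 0.65, 0.70]
fixing = trimmed_mean_libor(quotes)
```

Note that the trimming only protects against a lone outlier: any quote that survives the cut moves the average by its full 1/n weight, which is precisely why coordinated nudging of submissions could work.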
…it is out of the control of any individual panel contributor to influence the calculation and affect the bbalibor quote.
You don’t need to be a banker or a quantitative or statistical genius, or an expert in sociology, or even particularly clever to figure out that this is a pretty sub-optimal way to calculate any sort of index, let alone one that has an impact on the pricing and outcomes of trillions of dollars worth of contracts…
In the 1980s when LIBOR was invented – and (lest the angry mob now try to throw the baby out) it should be said it was an important and good invention – this methodology might just have been acceptable as the “best practical solution available given the market and technological context.” Banks used to have to physically run their bids in Gilt auctions to the Bank of England (which is why historically banks were located in the City; tough to compete on that basis from the West End or Canary Wharf, at least without employing a few Kenyan middle-distance Olympians…) But you know what? And this is shocking I know… They don’t do it that way anymore!!!
So if LIBOR is important (and it is), how should we be calculating it in the 21st century? Here are a few ideas:
include all banks participating in the market – and not necessarily just those in London – how about G(lobal)IBOR??
collect and maintain (in quasi-real time) important meta-data for each contributing bank (balance sheet size and currency breakdown of same by both deposits and loans, credit rating, historical interbank lending positions, volatility/consistency of submissions, derivative exposure to LIBOR rates, etc.)
collect rates and volumes for all realized interbank trades and live (executable) bids and offers (from say 9-11am GMT each day)
build robust, complex (but completely transparent and auditable) algorithms for computing a sensible LIBOR fixing arising from this data; consider open-sourcing this using the Linux model (you might even get core LIBOR and then forks that consenting counterparties might choose to use for their transactions, which is ok as long as the calculation inputs and algorithms are totally transparent and subject to audit upon request1)
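To make that last point concrete, here is one (purely illustrative) shape such an algorithm might take: a volume-weighted average of actual executed interbank trades, trimming the tails by traded volume rather than by counting heads. Every name and parameter below is my assumption for the sketch, not a specification:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    lender: str    # (anonymized) contributor id
    rate: float    # executed interbank rate, in percent
    volume: float  # notional traded, in millions

def transaction_fixing(trades, trim=0.10):
    """Hypothetical 'newLIBOR' sketch: sort real trades by rate, discard
    the cheapest and dearest 10% of *volume*, and volume-weight the rest."""
    ordered = sorted(trades, key=lambda t: t.rate)
    total = sum(t.volume for t in ordered)
    lo, hi = trim * total, (1 - trim) * total
    weighted = kept = cum = 0.0
    for t in ordered:
        start, end = cum, cum + t.volume
        cum = end
        take = max(0.0, min(end, hi) - max(start, lo))  # volume inside band
        weighted += take * t.rate
        kept += take
    return weighted / kept
```

The point is not this particular formula but that, given real trade data and a transparent model, any number of calculation agents could publish, audit and independently recompute the same benchmark.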
This is not only possible, but in fact relatively trivial today. Indeed companies like the Climate Corporation*, Zoopla*, Metamarkets*, Palantir, Splunk (and dozens and dozens more, including newcomers like Indix* and Premise Data Corp) regularly digest, analyze and publish analogous datasets that are at least as big and complex as the newLIBOR I’m suggesting – and almost certainly far bigger.
Indeed, the management of this process could easily be outsourced to one – or better many – big data companies, with a central regulatory authority playing the role of guardian of standards (the heavy lifting of which could actually be outsourced to other smart data processing auditors…) In theory this “standards guardian” could continue to be the BBA (the “voice of banking and financial services”) but the political and practical reality is that it should almost certainly be replaced in this role, perhaps by the Bank of England; but given the global importance of this benchmark, I think it is also worth thinking creatively about what institution could best play this role. Perhaps the BIS? Or ISO? Or a new agency along the lines of ICANN or the ITU – call it the International Financial Benchmarks Standards Institute (IFBSI)? The role of this entity would be to set the standards for data collection, storage and computation and to vet and safekeep the calculation models and the minimum standards (including the power to subsequently audit at any time) required to be a calculation agent (kitemark.) Under this model, you could have multiple organizations – both private and public – publishing the calculation, and in principle if done correctly they should all get the same answer (same data in + same model = same benchmark rate.) A pretty basic “many eyes” principle to improve robustness and quickly identify corrupt data or models.
As my friend (and co-founder of Metamarkets and now Premise Data Corporation) David Soloff points out:
If nothing else, this week’s revelations show why it is right for British political figures, such as Alistair Darling, to call for a radical overhaul of the Libor system. They also show why British policy makers, and others, should not stop there. For the tale of Libor is not some rarity; on the contrary, there are plenty of other parts of the debt and derivatives world that remain opaque and clubby, and continue to breach those basic Smith principles – even as bank chief executives present themselves as champions of free markets. It is perhaps one of the great ironies and hypocrisies of our age; and a source of popular disgust that chief executives would now ignore at their peril.
Rather than join the wailing crowd of doomsayers, I remain optimistic. The solutions to this – and other similar issues in global finance – either exist or are emerging at a tremendous pace. I know this because this is what we do here at Anthemis. But I’m clear-headed enough to know that we have only a tiny voice. It would seem that our long-predicted Financial Reformation is starting to climb up the J-curve. I just hope that if Mr. Cameron does launch some sort of parliamentary commission, voices that understand both finance and technology are heard and listened to. Excellent, robust, technology-enabled solutions are entirely within our means; I’m just not confident that the existing players have the willingness to bring these new ideas to the table.
* Disclosure: I have an equity interest, either directly or indirectly in these companies.
1 There may be good reasons for keeping some of the underlying data anonymous, but I think it would be perfectly possible to find a good solution whereby the data was made available to all for calculation purposes but the actual contributor names and the associated price, volume and metadata were kept anonymous and known only to the central systemic guardian. Of course you’d have to do more than just replace the bank name with some static code; it would need to change dynamically, with different keys for different calculation agents, etc. – but all very doable I’m sure. You’d be amazed what smart kids can do with computers these days.
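On the dynamic-key point: one way those rotating codes might work is a keyed hash that varies by calculation agent and by day, so pseudonyms can’t be correlated across agents or over time. A minimal sketch, with all names and field choices being mine purely for illustration:

```python
import hashlib
import hmac

def contributor_pseudonym(bank, agent, date, master_key):
    """Derive a contributor code that is stable for one calculation agent
    on one day, but unlinkable across agents and across days. Only the
    central guardian, holding master_key, can rebuild the mapping (by
    recomputing the code for each known bank)."""
    msg = f"{agent}|{date}|{bank}".encode()
    return hmac.new(master_key, msg, hashlib.sha256).hexdigest()[:12]
```

Each agent sees consistent codes for the duration of its daily calculation window, yet two agents (or two days) share no common identifiers to collude over.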
Markets in compute power, much talked about by me and others, are now, it seems, finally here (from The Economist):
Fundamentally, SpotCloud works like other spot markets. Firms with excess computing capacity—operators of data centres, cloud providers, hosting firms—put it up for sale. Others, who have a short-term need for some number-crunching, can bid for it. Enomaly takes a cut of between 10% and 30% depending on the size of the deal. But there is an important difference: SpotCloud is what Enomaly calls an “opaque market”, meaning that the firms offering capacity do not have to reveal their identity. Thus selling computing services for cheap on SpotCloud does not cannibalise regular offerings.
You may have noticed I haven’t been posting much here lately. It’s not that I don’t have anything to say – probably just the opposite (!) – but I have been flat out from dawn until dusk working on a number of exciting new projects, including our own development (more on that in a few weeks.) One project that has been front of mind the past few weeks is a new company we are developing that is directly inspired by Paul Graham’s great advice to “solve problems that affect you directly”.
A bit of background. When I was in banking, one of the achievements I was most proud of was effectively using web technology to transform how (debt) capital was raised (at least in Europe*.) At DrKW, we built what for many years was the state-of-the-art capital raising platform, whose core product was our eBookbuilding platform (now in Commerzbank yellow!) It completely revolutionised what had heretofore been a disjointed, manual, somewhat ad hoc process into a seamless, collaborative, mostly painless one. Initially it met with enormous resistance from other (much bigger and more successful) banks and syndicate managers, who as ‘guardians of the temple’ jealously guarded their power, derived (in their minds) from the information asymmetry they enjoyed (vs issuers and investors.) However – and despite being at best a middling player in the fixed income new issues market – our disruptive technology was such a big improvement on the status quo that eventually the market adopted our standards (with everyone then rushing to build their own analogous platforms.) In the spirit of making sure these platforms could ‘play well together’ we even published our XML schema for new issues and invited all our competitors to contribute to it and use it. (Which had the effect of basically freaking out our competitors. They thought we were crazy – like Ali – because they didn’t have the slightest idea what it means to compete in a world of information abundance and platforms, but that story is for another day…)
Anyhow, when I became seriously and then professionally active in ‘venture capital’ – or, more generically speaking, in investing in private companies – the lack of technology available to manage workflows surprised me; I was particularly puzzled because ostensibly this was a world populated with technophiles, early adopters and people who ate disruption for breakfast (quite unlike the world of institutional capital markets). Further, there is much talk (and consensus) around the fact that it is hard/impossible to scale venture investing. And while I think this holds at some level, it struck me that a significant number of the gating factors limiting the ability to scale could be vastly improved. Not to infinity, but substantially – perhaps by an order of magnitude. Pulling out an example from my old career: when I started life as a bond trader 20 years ago (ack!), the number of bonds that a typical good trader could manage numbered in the dozens at best (and even then, you would find that a trader really traded 10 to 20 bonds 80% of the time and sort of went through the motions for the others, hoping most of the time not to trade.) Then came Bloomberg. And Excel spreadsheets. (And later bespoke pricing and analytic tools and platforms.) And all of a sudden, a trader could manage a book with hundreds of securities. There was still a degree of 80/20 but everything was an order of magnitude bigger.
I don’t know if our new initiative will definitely achieve that degree of change in the private investment market, but we are convinced that there is a better way and having a fit-for-purpose platform to help company management, non-executive directors and investors communicate, collaborate and manage their positions and responsibilities would be a huge step forward. It’s not that nothing currently exists, but I would say we are at the ‘excel spreadsheet’ phase to use my bond trading analogy – with many firms and people starting to use things like Google Apps or Basecamp and the like to better manage information flows and collaboration. But while this (and excel for traders) is (was) a good start, the real juice comes when dedicated, purpose-built platforms emerge. If you have a screw that needs driving, a hammer is better than nothing (or a rock) but a screwdriver is even better! (A power screwdriver better still!)
So we conceived of (what has been provisionally named) CiRX – the corporate director and investor relations information exchange:
CiRX is a purpose-built platform enabling private companies, directors and investors to communicate and collaborate more efficiently saving time, money and effort. By streamlining processes and connecting stakeholders in an intuitive and context-rich environment, CiRX offers a tailored yet consistent solution to the challenge of managing information and documentation flows, reducing administrative burdens and creating opportunities for a richer, more dynamic and flexible approach to corporate governance and strategic management.
Over the past few months, we have been developing the concept and the business model and have done a significant amount of macro research to identify the potential size of the market opportunity, and now we have started to take the next step and ‘talk/think details’ as they say. In order to support this next stage of development, as we are poised to start ‘cutting code’, we wanted to get more direct feedback from the community – of company executives and founders, non-executives, angel and institutional investors – to better understand how their experiences and perceptions were both similar and different to our own. To do so we created a short(ish) survey and have sent it to a number of our contacts across all these communities, but if we missed you and you are a company founder or non-exec director or investor in one or more private companies and you are interested in contributing your views, you can find the survey by clicking here. (We’ll probably leave the survey open for a couple of weeks, but if you are so inclined to complete it, we are excited to be presenting CiRX at mini-seedcamp London next week so it would be great to have as much feedback as possible before then.) Of course you are also welcome to share your views – good, bad and ugly – in the comments below.
* That e-bookbuilding (generic) never gained acceptance in the US (at least not while I was still in the market) is in my opinion a telling manifestation of the oligopoly of Wall Street (which gives us things like 7% IPO fees with the spooky consistency of North Korean election results) which absent the pressure of competition, allowed the dominant underwriters to resist this change tooth and nail. It was even more glaringly apparent when these same US firms operating in Europe adopted e-bookbuilding as strongly as everyone else once it was obvious it was an evolutionary winner…
You may have noticed that I haven’t posted much in the last couple months and given all the interesting things going on in the world it certainly wasn’t for lack of material. Breaking my arm obviously didn’t help increase my productivity (or make typing very easy) but it wasn’t the main reason for the silence. It’s much simpler than that: I was busy!
Busy investing in a whole bunch of super exciting and interesting new businesses. Busy working on the sale of ODL Group (where I was the lead independent non-executive director) to FXCM to create a true global leader in FX trading. Busy working with my partner Uday and FT Advisors on a number of interesting strategic advisory projects, in particular focused on the electronic and algorithmic trading space. Busy helping two of our portfolio companies raise follow-on financing. Busy working on our own corporate structure and capital raising where I hope to be able to communicate some exciting news in the not too distant future. Busy.
So what have we been investing in? Here is a quick rundown (in alphabetical order):
Babuki – 2008 seedcamp winner, launching soon (will update) with an innovative platform for social gaming
Blueleaf – investment information management and planning software “to help people like you see all their savings and investment accounts in one place; understand their financial information more completely, more quickly; securely share information and collaborate with spouses, family or advisors; save their data, even if they change financial institutions; and maybe most importantly, help them stay financially safe and secure.”
Timetric – builds services to make sense of time-series statistics, based on the Timetric Platform: a proprietary service for publishing, analysing, and performing calculations on very large quantities of time-varying statistical data. Have a look at this neat little demo website they have built for tracking equity portfolios.
Metamarkets – provides global, real-time media price discovery by aggregating billions of electronic media transactions in order to deliver dynamic price data, proprietary price and volume aggregations, and comprehensive analytic media market views to sell-side media principals.
[not yet closed - will update soon]
Over the next few weeks or so, I plan to do a proper write-up on each of these businesses and the reasons we think they have bright prospects. So watch this space.
Admittedly a very small holding (acquired via our investment in CohesiveFT) and with some mixed feelings (more on that below), but nonetheless an excellent result for an exciting and important technology and the team behind it, led by the one and only Alexis Richardson… Yes, today SpringSource (VMware) announced its acquisition of Rabbit Technologies – the company behind RabbitMQ, the world’s leading implementation of AMQP.
RabbitMQ was born of a JV between CohesiveFT (my partner Amy sits on their Board) and L-Shift and was spun out as an independent entity under Alexis’ leadership about a year ago. The mixed feelings I alluded to above are only because we were quite excited by the prospect of helping Rabbit grow as a standalone business, given their already excellent market share, the existing and extremely fast growing market for their product (messaging), the already strong brand and market adoption of RabbitMQ, and a number of successful open-source business model pioneers and exits to emulate. As we did not have the capital required to make this happen, we could not put a credible alternative on the table. To be fair, there were always a lot of moving parts and there is no guarantee that we could have put a better, workable deal forward, and clearly joining the VMware family is an awesome opportunity for the company and the team.
In any event, I’m really excited and happy for them and proud to be associated with them, even if only in a small way. Here’s hoping this is a home run deal for VMware! (And yes, having “Rabbit” in your name is one of our investment criteria…)
Today Markit Group announced that General Atlantic has invested $250 million, valuing the seven-year-old company at a whopping $3.3 billion. Founded by Lance Uggla, Kevin Gould and Rony Grushka in 2003 to address the growing need for quality data in the burgeoning credit derivatives market, the company followed with several years of unbelievably good execution and disciplined acquisitions, which have positioned it as a critical component at the heart of the trillion-dollar OTC derivative markets. The products they provide aren’t considered sexy (something that is often given all too much importance in this status-conscious industry) – but their data, valuations, indices, trade processing and other products and services are the plumbing that is key to the continuing operation of many financial markets. They are a great example of creating value by building a great platform and understanding how to monetize data. I had the good fortune to be a non-executive director from 2003 to 2006 and I can say without hesitation that this team is one of the best I’ve ever seen and fully deserves the success they have achieved. (Congratulations guys. Awesome, truly awesome.)
And I am certain there is more to come. Their primary constraint has been – and will likely continue to be – the physical/logistical limitations of growing as fast as they have, but each year they only improve, and when it comes to acquisitions, the company they most remind me of (in disciplined and deliberate execution) is Cisco. Besides, General Atlantic doesn’t invest in companies where they don’t think they can make 20-30% annual returns or more.
And yet many (most) people in the ‘start-up’/‘tech’ scene, whether in the UK or the US, have never (or only vaguely) heard of Markit. (For example, I counted only about 50 or so tweets referencing today’s announcement – fewer than for any start-up launching on TechCrunch…) Why is that? Obviously I can’t say for sure, but (in no particular order) I would guess the explanation lies in the following:
not venture capital funded; funding initially came from its cornerstone customers, the investment banks, and later from some very smart hedge funds
focused on the wholesale financial services industry (and not on consumer or media or other mass markets)
key products and services (and associated economics) unknown to those outside finance and even worse generally considered ‘boring’
management team laser focused on execution, not PR (although to be fair they had this luxury not needing to sell to the mass market)
and so folks like TechCrunch and VentureBeat don’t know or write about them (aka “if a startup isn’t listed in CrunchBase does it really exist?” syndrome)
Indeed for me, Markit is a poster child for the cognitive, cultural and expertise chasm that exists between ‘Wall Street’ and ‘the Valley’ (or the ‘City’ and the ‘Roundabout’, to use the less apt UK-centric metaphor.) They might as well be on different planets. In fact, bridging this divide is at the core of what we set out to do at Nauiokas Park and was the driver that led Paul Kedrosky and Tim O’Reilly to launch the Money:Tech conference in 2008 (which sadly didn’t survive the financial crisis and, quite frankly, was met with deafening indifference from the vast majority of the Wall Street side of the equation.)
And yet the opportunities available to those who can successfully bridge this gap are enormous. Well, anyway, that’s what we think. And the crisis in venture capital ostensibly caused by too much capital? I’m going to disagree with Paul and Fred and suggest it’s not too much money overall; rather it’s too much money concentrated with too few investors, focused on too few sectors, who end up all chasing the same deals. So to the LPs out there my message would be: don’t shrink the pool, enlarge the opportunity space. Oh, and try to make sure you’ve got exposure to the next Markit Group.
A couple of months ago, I had the privilege to have been invited to speak at eComm 09 in Amsterdam. I have posted on this previously but recently the video of my talk was posted and perhaps will make it easier to understand my accompanying presentation. If you can spare 20 minutes (there is an additional 10 minutes of q&a at the end) and are interested in understanding how Nauiokas Park defines our opportunity space, please have a look as it is probably the most succinct expression of the worldview we bring to investing and analyzing potential investment opportunities.
And here is the presentation again, in case you would like to follow along as you listen to the video:
Well-built developer platforms are the future of every industry. (-ReadWriteWeb)
Note: There is a small glitch around 7:40 where the video skips a few seconds; funnily enough (for the conspiracy theorists out there) this is exactly where I say that had ZSINs existed, the extent of the disasters that occurred in the mortgage securitization markets would have been at least an order of magnitude smaller…
A phrase popularized by the late Charlton Heston in his crusading role as the poster boy for the NRA. But I’m surprised it hasn’t yet been officially adopted by more old-economy industry groups as a rallying cry to marshal support to save and protect their dying business models. To the bitter end.
An Ontario court has shut the door on attempts to create new web sites to repackage real estate listings using data from the Multiple Listings Service system.
In a ruling released Monday, Mr. Justice David Brown of the Ontario Superior Court said Toronto real estate broker Fraser Beach did not have the right to provide broad public access to MLS data through a web site he helped create while working for BCE Inc. division Bell New Ventures in 2007.
The decision comes after the Toronto Real Estate Board (TREB) shut down several attempts in recent years to create new web sites allowing members of the public to sort MLS data – including an operation started by Mr. Beach.
That the Canadian Real Estate Association would want to protect its MLS data is entirely reasonable, indeed it is a very valuable dataset. However one would hope that they would take this as a wake-up call and start thinking very hard about developing a new business model around this data. One that reflects the modern realities of a fully connected, digitized economy. Perhaps they are. To be honest I have no idea. So acknowledging that this is pure unadulterated speculation, I suspect they aren’t. I suspect like the newspaper, music, bookselling, banking, etc. sectors before them, the main focal point of their efforts is to keep the bloody genie in the bottle. At least for long enough for the old hands to ride off into the sunset and let the next generation deal with it.
It’s a shame really, because on paper – as for most incumbents – not only do they have the most (everything) to lose when the paradigm shifts, but they are also by far the best positioned to maintain a leadership position so long as they adapt (in time.) Inertia, installed base and brand recognition take care of that. Basically they’ve got a strong hand. But time and time again it seems that these kinds of companies and institutions can’t help themselves but to overplay it. Taking another card while holding two Jacks kind of thing. Admittedly it would be hard work for someone to build up a competitive offering to the MLS from scratch, but I suspect not impossible. I don’t know what the public information access laws are like in Canada but if they are similar to those in the UK for instance, a smart entrepreneur might mimic the route taken by Zoopla and bootstrap prices starting from public sales records. And even if they do manage to maintain a data monopoly, they and their member agents will be faced with an increasingly angry client base who won’t readily accept being held hostage by secretive data trolls.
If I were a Canadian real-estate broker, I would be leading the charge to flip the MLS and traditional broker roles on their heads. Having read this excellent post on the future of my profession, I would understand that my customers are (mostly) not looking to do away with me but to get real value from my services and insights and conversely will become annoyed and resentful if they get the feeling they’re just paying a toll to a glorified data monkey.
The way a broker creates value in a world of abundance (vs a world of scarcity) is fundamentally different. Someone forgot to tell the record companies. Let’s not make the same mistake again. Save a real estate broker: free the data.
I thought I’d play a little markets jeopardy with the headline to this post. The question of course is: “what would happen if Google stopped mucking around and just came out and said it?” Said they were going to take their massive dataset, brilliant algorithms and (hire) all the smartest people in all the lands and offer a free service to “do anything anyone anywhere might conceivably want to do.” That should be enough to cast a pall over even the most profitable or promising companies. Sell everything (else) and buy Google, right?
Many of you are of course thinking no, not right: the premise is far-fetched (not to say ridiculous) and even if you accept it in the spirit of the thought experiment it so obviously is, the conclusion – that they take out every other competitor at the kneecaps – is not a given by any stretch of the imagination. And yet, when Google announced that they were going to launch a free property listing plug-in to enhance their UK maps product, the market reacted pretty much as if Google were indeed Merlin the Magician, able to take over any market at will just by waving its googly wand and unleashing its fierce intellect and sizzling technology on the hapless incumbents. In this particular instance, Rightmove’s (the leading UK property portal) shares collapsed on the news, trading down 10% on the day and c. 15% in all since the story broke. Now to be fair, having traded as low as 156p at the start of the year, RMV shares have had a pretty solid 2009, hitting a high of just over 600p and trading around 550p before the Google ‘news’ hit the market. And since investing (and especially trading) is not about picking the prettiest asset but picking the asset you think most others will find prettiest, I don’t blame any fund manager for selling first and asking questions later. And I have much sympathy for those who think that Rightmove’s market leadership is vulnerable in the medium term; only I don’t harbor much fear that this threat will come from Mountain View. The competitor that Rightmove’s shareholders should be keeping a close eye on isn’t Google, but Zoopla of course. (Reminder: we are investors in Zoopla.) Ah, but Zoopla has a silly name, it can’t be a real threat. Google however…
And it’s not just UK property where I think the mainstream markets and pundits breathlessly get it wrong about Google. In area after area they have proven not to be a very successful or threatening competitor and in other areas their entry has often been a boon for specialist competitors in the segment due to the legitimizing power Google brings to the table. They are able to (implicitly) validate new business models in ways a smaller, more specialist start-up could never dream of, and yet this market validation very often plays right into the hands of folks who, well, know what the hell they are doing.
Don’t believe me? Let’s take just a couple of areas where – if you believe the logic of the argument used to justify Rightmove’s downtrade – Google should be causing wholesale panic and disruption:
Financial Information: maybe I’m wrong but I don’t exactly see Thomson Reuters or Bloomberg shaking in their boots, and yet here is a sector that is tailor made for Google’s engineering, distribution and technology assets, and one where they have had years to refine the value proposition; and yet Google Finance remains essentially a working prototype of a back-of-the-napkin sketch of what a Google financial information portal could become. Umair challenged CEO Schmidt to take up this challenge a couple months ago but I’m not convinced it would be as easy as it looks.
News aggregators: Google News is all we need right? (Perhaps supplemented with Google Reader…) There’s no reason for sites like Digg or Daylife or the Huffington Post to exist. I mean what are these guys thinking: some of them even started after Google News went into public beta. Crazy. Except they actually work, they have customers willing to use them despite Google News existing. But really, how long can this last?
Advertising: I must be joking now. After all advertising is the one market Google owns; the market that gave them their billions that allowed them to hire all the smart (non-evil) people and enter and take any other market at will. Right? Well if you think so, have a look at this recent post from Paul Kedrosky. It’s why vertical search and specialist sites exist. It’s why you (usually) go to Amazon.com if you know you are searching for a book, and not necessarily via Google.
And I could go on. But the point of this post is not to say that Google are useless, yesterday’s game, past their prime. In fact my best Google-fanboy guess would be that they are far from the point of diminishing returns and structural foolishness. My point is rather that they are not – or at least not universally – the ‘destroyers of all economic worlds’; that as they grow to become a company of thousands of employees in dozens of locations they will inevitably have to deal with some of the structural pathologies that this involves, including rising mediocrity and products looking more like camels than horses. Oh yeah, and evil too. Yes, they are a fierce competitor and certainly there is some risk that they could destroy your business model and take your business with it. But this is far from certain. They are human. They make mistakes. They execute poorly. They don’t always (or even often) win. And best of all, once you’ve proven that you can beat them, they just might buy your company.
Using the tried and tested TED 20min format, it was a great opportunity for me to collect my thoughts into (what I hope was) a coherent overview of how I think technological and economic forces will shape the optimally adapted ‘industrial stack’ for the sixth paradigm. It’s a great summary of the prism through which we look at potential investment opportunities and I hope will help us articulate this more powerfully to entrepreneurs and prospective investors.
I’d love to hear any feedback (good, bad and ugly) from any of the eComm delegates who saw my presentation and hope to continue the conversation with you and others here. You can also follow me on twitter @nauiokaspark.
Thanks to Paul and Lee for inviting me and especially to those of you who took the time to respond to my call for input – it was tremendously valuable in helping me shape and refine my thinking and in building the presentation. Just a few years ago, assembling this kind of distributed brainpower would have been impossible, and I hope I never lose my ‘childlike sense of wonder’ at the boundless possibilities that technology enables.