What is a Frontier Tech Narrative?
When trying to understand how to build a lasting frontier tech business, stakeholders (investors, employees, customers) and founders alike often swing between ‘data-driven’ analysis and cultural/circumstantial analysis.
Together, these two pieces form common narratives. Narratives are the prominent stories within markets, tied to successes and failures: half data points around how these technologies were commercialized in the past (the novel GTMs, pricing structures, and unexpected hitches in developing brand new science), and half built off of the particular circumstances and culture they took place within.
For founders commercializing new technologies, whose growth and potential markets are often not as well understood as SaaS sales, studying these two in combination, and the common patterns they form, is important when building a lasting frontier tech business. By analyzing the history of frontier tech commercialization, stakeholders and founders can form educated hypotheses about how these patterns may play out in the future, or how they can be harnessed in their favor.
In a world where the state of the market a frontier tech company is building within could change dramatically within 5 years due to new technology or inflections in cost, the cultural/circumstantial portion of narratives is especially important.
Looking Back at Frontier Tech Narrative Successes & Failures
“The important takeaway is that whether true or not, narratives will spread rapidly, and before you can shift your narrative, it will already have proliferated into a non-trivial percentage of the increasingly loud echo chamber of tech and sheepish venture capital ecosystem.”
In 2019, my colleague Mike Dempsey outlined some of the ways narratives shaped the markets’ perception of deep learning-centric companies, the on-demand economy, network-centric companies, and more. Narratives’ framing effect on the way the market sees the world presents all the more reason to be aware of them or even shape the narratives proactively.
There are a few generalizable patterns and questions to answer in narrative formation, and looking at how other companies have successfully or unsuccessfully approached them can help add perspective.
1. “Everyone Believes in it but No One Knows What the Company is.”
Space often faces this challenge. Excitement about space travel has long run high, starting with the first wave of government-led space investment in the 1960s and sustained by a belief that accessible space travel is one of the purest representations of progress in engineering and scientific exploration. There was another peak in the early 2000s, driven largely by corporate investment from Boeing, Alcatel, Motorola, and others. The most recent wave was arguably prompted by SpaceX’s founding, but really kicked up in the mid-2010s with the shift to the ‘newspace’ era and mentality, especially as SpaceX launched its first commercial payload and published its launch pricing, introducing market transparency.
Aerospace has always been uniquely dependent on a number of complex variables: public-private partnerships, launch costs decreasing at a predictable rate, and capital-intensive technical challenges that allow uniquely few shots on goal, even compared to the rest of frontier tech. The newspace era brought small launch, smallsat manufacturing, and orbital transfer vehicles, as SpaceX, Rocket Lab, and others arrived with commercial success, fundamentally lowered the cost of rocket launch, and proved its repeatable reliability at scale.
Suddenly this next generation of businesses predicated on low launch cost and reliability seemed much more feasible. This led to a wave of private startup funding where there had been very little before: 6 companies funded for a total of $82M invested in 2012, 21 companies funded for a total of $1.1B three years later in 2015, and a record $8.9B in 2020 despite covid (CB Insights data here). Companies like Planet Labs, Spire, Terra Bella (fka Skybox Imaging, acquired by Planet Labs), and Spaceflight Industries all received funding within this new era.
Agriculture is another example: an industry long romanticized for its large scale despite its inefficiency, yet written off by investors for years. There have been attempts at A/B testing for better understanding of optimal growing conditions, middle-man platforms for purchasing from seed and grain vendors, and many IoT/robotics plays to automate various processes. Most companies tried to fit these businesses into SaaS frameworks, and the reigning narrative became that agtech was a very difficult place to build a company because farmers had low margins and their purchasing was often very relationship-driven.
Ultimately, Farmers Business Network (FBN), founded in 2014, came in with a much more effective go-to-market strategy (HBS case study). Their core insight was to acquire users by providing price transparency on seed, fertilizer, chemicals, and more, and then eventually move into a broader data analytics platform across weather, field temperature, soil performance, and seed performance, the territory older players had tried to dive into immediately. This multi-step, horizontal vision has enabled FBN to raise $870M in funding on the back of the massive market size they are going after.
Ambrook, founded in 2020, also disrupted the anti-agtech narrative by providing financial savings to acquire customers. The company realized that millions of dollars in federal grants, loans, and financial assistance were going unused because farmers didn’t know how to best access them. They ran paid ads offering a quick assessment that would match farmers with money they qualified for and help them navigate the process. Ultimately, they offer a suite of bookkeeping tools plus automated P&L and expense tracking.
Both FBN and Ambrook managed to flip the negative narrative, that farmers had low margins and were relationship-driven, into a positive one: easy customer acquisition wherever there was promise of improved margins, and very sticky customers over the long term.
When the dominant response to an emerging domain of technology is “everyone believes in it but no one knows what the company is,” it is key to focus on value accrual through clarity of go-to-market. It’s a challenging place to be because initial public support for the challenge you are tackling may seem high, but skepticism about your solution and the potential value your company can capture is also high. The upside of building in this space is that a clear go-to-market will set you apart incredibly quickly in a world where people already believe someone needs to be solving the problem you are.
2. “Everyone Believed in it but Saw too many Promising Companies Struggle.”
Since the 1980s, when the term virtual reality was coined by an Atari employee, there have been waves of companies attempting to build the future of scaled virtual or augmented reality. The most recent VR/AR boom cycle in the early 2010s included Magic Leap (raised over $3B), Google Glass (shut down in 2019), and Microsoft HoloLens, short-term failures that cemented the public narrative that the space was nearly impossible to tackle, even with the resources of giants like Google or Microsoft. Magic Leap tried to sell its first headset for $2,300, HoloLens for $3,000, and Google Glass for $1,500.
The primary challenge companies struggled with was cost-effectively scaling down the computational hardware needed to render video without lag. Lag-free rendering reduces nausea and other side effects, while lower hardware costs make it possible to sell devices at a price point accessible to the general consumer.
Oculus came along and disrupted this narrative slowly, and then all at once. The company was founded in 2012 and acquired by Facebook (now Meta) for $2 billion in 2014. Over this time it acquired some of the best hardware and gaming talent in the world and focused on scaling down costs to make Oculus more accessible to the consumer, in the face of many criticisms that VR just wasn’t usable for long periods of time and for that reason wasn’t here to stay.
Meta reported that in 2020 alone, 60+ games and apps on the Oculus headset made more than $1M in revenue, and 6 titles generated more than $10M. The company also focused intently on creating an ecosystem for developers of all sorts of apps, building out its tooling suite, and ultimately expanding into enterprise as well. The proliferation of headsets means applications now have distribution, which ultimately rewrote the narrative: virtual reality is here to stay, backed by superior hardware investment that produced an enjoyable VR ecosystem whose value will only compound as the applications built upon it gain greater distribution.
In addition, this long-term hardware investment from large players has drawn more market competition, with rumors of an Apple AR/VR headset coming at some point in 2022/2023, notable given Apple’s tendency to ship hardware only once it believes it will drive massively scaled adoption.
The world of education is one that technologists love to theorize about, and yet failed startups promising automated education litter the past. From more efficient classroom management software, to “up-skilling” technology, to AI assistants and tutors, to more general attempts at online schools, venture-backed startups in these areas have not had much success. The narrative around these companies was largely that technology would teach for us, rather than augment teaching capabilities, which drove companies into cycles of over-promising and under-delivering, and of trying to fit students into metricized boxes that software could more easily operate on.
Quizlet took the existing edtech narrative and flipped it on its head by focusing on building an excellent single feature. The company was founded in 2005 (when its founder was 15 years old) but only formally started taking venture capital a decade later. It focused on digitizing flash card creation and review, and on letting users discover and engage with other users’ flash card sets. The company hit unicorn status when it was valued at $1 billion amidst the pandemic in 2020. Even prior to the covid edtech boom, half of all high school students and a third of all college students in the US used the platform at least once a month in 2018. This laser focus set Quizlet apart: it spent years simply improving memorization with flash cards (rather than tackling the wider world of teaching as a whole) while bringing in more and more students to generate content and data.
On top of this, Quizlet built its most popular product, Quizlet Learn. The tool lets students state, at a higher level, what they would like to learn and by when; it then uses the data gathered across millions of students to employ machine learning and spaced repetition, quizzing across a variety of formats over time to test more robust understanding of various subjects.
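Quizlet hasn’t publicly detailed Learn’s scheduler, but spaced-repetition systems generally follow an SM-2-style rule: the gap before a card’s next review grows with each successful recall and resets on failure, with a per-card “ease” factor tuned by how easy the recall felt. The sketch below is purely illustrative (the function name and constants are my own, not Quizlet’s):

```python
# Minimal SM-2-style spaced-repetition scheduler (illustrative only;
# Quizlet's actual algorithm is proprietary and surely more sophisticated).

def next_review(interval_days: float, ease: float, quality: int) -> tuple[float, float]:
    """Return (new_interval_days, new_ease) after one review.

    quality: 0-5 self-reported recall quality; >= 3 counts as a success.
    """
    if quality < 3:
        # Failed recall: restart the schedule, lower ease (floor of 1.3).
        return 1.0, max(1.3, ease - 0.2)
    # Successful recall: nudge ease based on how easy recall felt,
    # then grow the interval by the ease factor.
    new_ease = max(1.3, ease + 0.1 - (5 - quality) * 0.08)
    new_interval = 1.0 if interval_days < 1 else interval_days * new_ease
    return new_interval, new_ease

# Three successful reviews push the card further out each time.
interval, ease = 0.0, 2.5
for q in [4, 5, 3]:
    interval, ease = next_review(interval, ease, q)
```

The key property, mirroring the “length of memory” outcome discussed above, is that well-remembered cards get reviewed exponentially less often, concentrating a student’s time on the material they are about to forget.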
Although educational startups faced a rough narrative to overcome, Quizlet played the long game by focusing on augmenting one important type of learning for millions of students. This single-feature focus enabled it to gain data at scale on measurable learning outcomes (length of memory, rather than an ambiguous ‘better understanding of a subject’) and to augment how students already work rather than trying to outsource all of learning to software at once.
I believe we’re currently seeing this pattern play out within quantum computing (my older views here). Much of the fundamental quantum computing research began around the 1980s, so frontier tech investors have long been exposed to the idea that this field of computing could solve problems that were previously too computationally heavy, such as breaking certain types of encryption or simulating complex molecules. But many foundational quantum hardware companies that originally received funding in the mid-2010s have struggled to scale the number of stable qubits as quickly as once thought possible, and, more generally, to find problem spaces and algorithms where today’s hardware can outperform classical computers. Consequently, full-scale quantum computing is further from commercialization than most people originally estimated.
In the last year or so, companies like Zapata Computing, Entropica Labs, 1QBit, and QSimulate have presented the next wave of quantum by framing themselves in the newer category of “near-term quantum solutions.” They generally focus on exploring problem spaces that may or may not be better addressed with the quantum hardware we have today. Some run quantum simulation on classical computers: they approach existing corporate optimization departments interested in quantum, spend a number of months developing algorithms for a problem that company already has, and deliver research on how the solutions compare. Others look to quantum-inspired or hybrid algorithms to deliver simulation speedups, especially for simulations in biology and chemistry.
Companies like these frame themselves as intermediate bridges across the challenges of hardware scale, while each angles to build proprietary research on which industries make the best early adopters given where the technology stands today.
When the dominant response to an emerging frontier tech business is “everyone believed in it but saw too many promising companies struggle,” it is key to understand whether the technical inflection point your business is predicated on might take longer to commercialize than the market realized (whether because of high capital needs or other time constraints), and to build a tightly focused product around that inflection point, or even break it into a smaller, nearer-term focus. You will likely face lower competition, and will be in a position of strength when the market comes back; but recruiting and fundraising will likely be harder at first, as some people will take a while to change their minds.
3. The Narrative is Solidified: “People Generally Don’t believe in it.” or “Everyone believes in it but thinks the Winner has Already Been Crowned.”
One great example of this common paradigm is the arc of autonomous driving. With the boom of companies emerging in the space in 2016, along with a broader boom in deep learning applications and an understanding of how important data was to building robust, highly effective models, it seemed like the industry had a reasonable path forward: collect as much annotated data as possible and dictate actions with a rules-based approach.
As time went on, however, and these players reached the last 10% of edge cases, progress slowed to the point where starting a codebase over from scratch became a better alternative than continuing a given approach. Players like Cruise, Waymo, and Tesla seemed to have been crowned winners, but have been slow to expand to many cities while also demonstrating a high level of autonomy.
The Wayve team spun out of Cambridge with novel Bayesian deep learning and reinforcement learning research that wasn’t reliant on rules, but instead integrated with an autonomous vehicle’s entire perception stack to assess the risk of a given situation. This allowed them to differentiate their approach through lower disengagement rates and a smaller data requirement. Correctly anticipating hurdles that became prohibitive later in the game allowed Wayve, despite entering the market later, to leapfrog both the competition and the prevailing narrative that autonomous vehicle companies had already been adequately invested in [Disclosure: Compound was an early investor in Wayve].
Genetically-informed D2C Healthcare
Personalized healthcare is an industry that has seen a ton of evolution in the past two decades, along with multiple scaled winners. On the simpler end of the service side, companies moved analogue services into the digital space (Talkspace, for example, which brought more flexibility to things like talk therapy [Disclosure: Compound was an early investor in Talkspace]); others built direct primary care (DPC) companies (Forward, One Medical, and more specialized versions like Tia for women’s health [Disclosure: Compound was an early investor in Tia]).
There have also been scaled winners in personalization innovation focused more on the data science and product side: Curology for data-driven personalized skincare formulation, Prose for personalized haircare formulations, Hims for skin care, hair loss, and sexual health products, and more.
In this space, however, I think genetically informed D2C treatment/product recommendation will produce yet another wave of highly successful personalized healthcare companies, especially as the cost of genetic testing drops significantly. Prairie Health (genetic testing integrated into broader care, using pharmacogenomics to better understand people’s responses to SSRIs), Adyn (aims to combine genetic testing with birth control prescription to better understand adverse reactions), and Luminate (skin patches that lift samples you send to their lab, where they identify genetic mutations caused by damage and use them to inform your best treatment going forward) are all great examples. Each of these players has positioned its GTM to reach potential customers at the extreme end of the risks genetic testing can help predict: adverse reactions to SSRIs (especially in cases of treatment-resistant depression), adverse reactions to birth control, and the worst skin conditions or damage in need of quick treatment. These companies create a wedge by taking advantage of the inflection point of price-accessible genetic testing and by addressing some of the most at-risk customers in each sector, which then allows them to leapfrog competitors and infiltrate their customer base over time.
When the dominant response to an emerging frontier tech business is “people generally don’t believe in it” or “everyone believes in it but thinks the winner has already been crowned,” it is key to articulate why a new technical inflection point enables your company to challenge the existing winners with a better product, OR why the market has expanded or been underserved and the existing players may not have kept up. The challenge here is convincing customers to pay attention when they think their problem may not be solvable, or when they don’t feel a pressing need to replace what they’re currently using. The positive is that once you have done this, there is a world of already-vetted customers and market sizing, and those customers become easier to onboard to a newer solution.
The Big Picture
Commercializing ground-breaking new technologies is no simple undertaking, and it would be thoughtless to claim that everything boils down to narratives, so I won’t. But no new discoveries are made in isolation, and the more astounding a technological inflection point is, the easier it becomes to abandon the continuous study of the world and beliefs the technology was formed within, losing sight of the greater narrative that inevitably affects the future.
Preemptively participating in these narratives and bending them in your favor, by forming stances and tactical responses, is one of the best ways to shape our rapidly evolving world.
As always — feel free to tweet or message me questions, thoughts, disagreements, or pitches on twitter or at firstname.lastname@example.org