We thought this was a really big idea. ... We worked for years with teams of scientists and engineers to miniaturize all of the technologies in the laboratory
Don Lucas was one of the early VCs in Silicon Valley. I knew him as someone who focused on building great companies in the long term. I was introduced to him by someone who had gone to college with my dad. He had a lot of questions. He began a very comprehensive diligence process. He hired a law firm to review our patents. He asked us to get an audit of our financials. He wanted copies of our contracts.
Elizabeth Holmes
look, if at some point you got lied to, it's because you agreed to get lied to
Pamela Meyer
when I was asked by editors who I'd worked for, I lied. I listed a handful of magazines that sounded likely, and I sounded confident, and I got jobs. I then made it a point of honour to have written something for each of the magazines I'd listed to get that first job, so that I hadn't actually lied, I'd just been chronologically challenged.
Neil Gaiman
I
There's this question that's been bugging me for a while: how much should you lie? I decided to ask around, and it turns out people have quite a few opinions on the subject.
I started thinking about this again recently when reading about the Lambda School saga, where it turns out that Lambda School might not have been fully upfront about its success rates or about how amazing it is, and that it lied about the exact way in which it made money from its students.
The story in summary is as follows. Lambda School is a venture-funded darling, with a mission to make the world a better place, and with the funding and talent that brings with it.
But it turns out they took some, shall we say, creative liberties with their spiel. The concurrent lawsuits filed by three of its former students allege three very specific problems.
They lied about the job placement rates
They misrepresented how they earn their revenue
They concealed a regulatory dispute in California while continuing to provide services
This wasn't much of a surprise (except to their students presumably). There have been rumblings of this in tech circles for several years (but not, it seems, amongst their students). All of this is pretty seedy, highly disheartening and kind of well predicted?
Going back all the way, the basic idea behind Lambda School is a straightforward one. Historically, kids pay money to a school. The school teaches them things. And with that education, the student lands a job. At Lambda they said: hey, with that job, as long as it hits a minimum salary, pay a percentage of that salary to Lambda for a few years.
Which ... sounds amazing? And that was indeed the general consensus. You align the incentives of the student with the institution and the end employer. It made perfect sense.
Except that if they're lying about job placement rates and misrepresenting regulatory disputes so they could illegally enrol students, that completely changes things. That's just a bigass lie.
So, is lying always bad? Depends on where you look. In politics, for instance, lying seems to have no appreciable effect. Even in the good old pre-2016 days it seemed more an annoyance than anything real.
At least in technology, there are plenty of egregious lies to go around. There have been some spectacular frauds.
Nikola, who rolled a truck downhill and used the footage to suggest the truck could drive under its own power
Theranos, who lied about seemingly pretty much everything about their product, except that they did stuff with blood, badly
Luckin Coffee lied about its revenues, expenses and losses (that's all?) to defraud investors
Ozy had an executive impersonate a YouTube executive on a call with Goldman Sachs, who were looking to invest $40m in the company. To the vampire-squid's credit, they were not fooled.
OpenSea had a senior executive who traded NFTs on insider information. Did he find out which apes were the prettiest beforehand? Who knows. But he definitely did front-run trades before they got listed.
App Annie's cofounder was charged with securities fraud, because the data company lied to its customers about how it used their data. Oops.
These we can definitely rule as cases of egregious lying.
II
So, rather than draw a conclusion, I note that there are three possible stories that can be told about this saga.
Story 1: it's not that bad
Lambda's argument is that this is all a lot of brouhaha over nothing. The thing is, a startup, at the best of times, is in a race against the money clock to make a particular future come about. They're usually haemorrhaging money and trying to grow and using every trick in the book.
Lambda School's external reports show that they did, in fact, have a high-ish employment rate (though not as high as they said), at least once upon a time. So maybe they're still misrepresenting, just not by as much as others claimed they were.
Now, this is from H1 of 2018, and for one particular cohort. If this startup is like any other startup I've seen, the numbers could've moved in either direction over the last couple of years, with a wide margin.
If you're growing at, say, 100% a year, and the curriculum is 225 days, then cohorts turn over at least 1.5x a year (365/225 ≈ 1.6). That's around five cohorts since then. The numbers today could be 30% or 80%, and the previous numbers wouldn't tell us much.
For instance, the report from 2018 talked about 71 students. The most recent report, from H2 2019, had 743 students, which is a 10x increase in a year and a half. At that speed you're lucky if the wheels stay on. And of this cohort, 70% graduated.
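As a sanity check on those figures, here's a quick back-of-the-envelope sketch (the student counts and curriculum length are from the reports above; the annualized rate is my own extrapolation):

```python
# Rough sanity check on the reported Lambda School figures.
students_h1_2018 = 71    # cohort size in the H1 2018 report
students_h2_2019 = 743   # cohort size in the H2 2019 report
years_elapsed = 1.5

growth = students_h2_2019 / students_h1_2018      # total growth over the period
annualized = growth ** (1 / years_elapsed)        # implied per-year multiple
cohort_turnover = 365 / 225                       # cohorts per year at 225 days each

print(f"total growth:    {growth:.1f}x")          # ~10.5x, the "10x" in the text
print(f"annualized:      {annualized:.1f}x/year") # ~4.8x per year
print(f"cohort turnover: {cohort_turnover:.1f}x/year")  # ~1.6x, i.e. "over 1.5x"
```

At nearly 5x a year, any given cohort's placement numbers tell you very little about the next one's.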
Now, story 2: this is just regular old exaggeration
Every startup tries this: promising features it hopes to be able to ship. When Steve Jobs did it he could create a reality distortion field that transformed what the world saw, and made things like copy-pasting seem like nuclear fusion. That was his gift.
Elon Musk, the patron saint of talking about the future he wants to see vs the future that's likely to happen, would probably concur.
Imagine if you figured out the answer to reducing the impact of financial crises by 40%. You can fix it, no more scrambling around once a decade to herald crazy solutions to once-in-a-lifetime problems. You can take the central bankers and junior bureaucrats through a carefully crafted curriculum that will make them smarter, wiser, and solve problems previously considered impossible.
But there's a problem. You need to make them all believe you. A 40% improvement is a lot! But there's stragglers and doubters and all sorts of complainers who make bringing them together hell.
So what do you do? Do you start by saying, "hey, 40% is better than nothing", or do you go "hey, this is revolutionary and crazy amazing and will definitely work"? Do you play fair cop with balanced views, or try to create a reality distortion field?
And, story 3: liars broke the sacred covenant and should pay the price
The world is filled with liars of all stripes. In fact, I can't help but think of the other saga about lying going through the courts right now. The turtleneck-wearing genius who promised to test for many, many diseases, all from a tiny single drop of blood. Turned out that was a lie too. A balder-faced lie, some might say. Scientifically impossible. Pants on fire.
In this world, all we can do is to punish the lies as soon as we find them. We gotta discourage them with fire and brimstone.
In this world Austen Allred and his myriad lies all need to be exposed for the moral failings that they are. His stories about having been homeless, his lies about graduation rates, his lying lies about the "random" success stories, his lies about actually providing the grads a proper education instead of duping them into signing ISAs.
This is also, at least partially, the media's story. Lies are bad and liars are worse. And that's where we find ourselves.
III
All three stories above have versions of lying in them. Sometimes it's "just" exaggeration, which is industry-standard and therefore not noteworthy, and sometimes it's outright lies regarding empirical facts. So how should we think about them?
Alex Danco's seminal piece on whether or not startup founders are allowed to lie covers this ground, treating the licence to pre-tell the truth as a sacred privilege offered to the chosen few in the Valley.
You probably don’t call it “lying”, but founders have to will an unlikely future into existence. To build confidence in everyone around you – investors, customers, employees, partners – sometimes you have to paint a picture of how unstoppable you are, or how your duct tape and Mechanical Turk tech stack is scaling beautifully, or tell a few “pre-truths” about your progress. Hey, it will be true, we’re almost there, let’s just say it’s done, it will be soon enough.
But he nails down what is problematic about lying later on in the article. He talks about Nikola's famed years of untruth that got them to roll a truck downhill and get SPAC-ed at $4B+ valuation, and how it came undone with a dumb lie about "HTML5 supercomputer".
See, this is what’s not allowed. You cannot just throw out preposterous line items like “The infotainment system is an HTML 5 super computer that lets us build our own chips.” This disqualifies you! It disqualifies you immediately! If you are caught saying anything like this in Silicon Valley your pre-telling privileges get revoked. And the reason why they get revoked is that saying something like this pretty much reveals that you are not, in fact, building the future. You’re just pretending.
And that, I think, reveals the real principle behind the rule. What’s important in Silicon Valley isn’t quite what’s true right now. It’s a different kind of truth; less about factual truth and more about authenticity. That’s why the blessing that VCs grant founders is so important: it’s a power, but also an acknowledgement of the burden of authenticity that founders have to carry.
Matt Levine, in an essay so hilarious it shouldn't be allowed to be read in offices, writes about the paradox of startups, that they have to convincingly sell a vision of something that doesn't exist and may never exist.
The pitch is, like, you put your arm around the shoulder of an investor, you gesture sweepingly into the distance, you close your eyes, she closes her eyes, and you say in mellifluous tones: “Can’t you see the trucks rolling off the assembly line right now? Aren’t they beautiful? So clean and efficient, look at how nicely they drive, look at all those components, all built in-house, aren’t they amazing? Here, hold out your hand, you can touch the truck right now. Let’s go for a drive.” That’s not true, but it’s a nice metaphor; the goal is to get the investor to see the future, so she’ll give you money today, so that you can build the future tomorrow.
He is talking about the bargain made between the founders and the investors, both private investors and, more recently, public ones, who buy into this delusion willingly. What is not written about is the delusion as spread to the employees or the customers, who never signed up for the same bargain. If I'm one of the first people to buy Tesla FSD and it doesn't work, I'd rightfully be pretty pissed off!
HBR similarly has another analysis of entrepreneurs and their flexible relationship with the truth.
It may be tempting to think that departures from the truth are just part of doing business—that we operate in a no-holds-barred capitalist arena in which all contestants are responsible for their own welfare and know the rules of the game. Unfortunately, such cynicism feeds on itself; when we encounter dishonesty or scandal, we become disillusioned and are more likely to engage in such behavior ourselves.
Ethan Mollick pointed out a really great journal article showing that angel investors, at least, didn't much care about exaggeration, and valued pitch preparation a lot more. Which at least tangentially indicates that exaggeration might be a useful tool in the kit.
Because investors are aware of entrepreneurs’ willingness and ability to exaggerate an interesting paradox emerges between what the actual impact of exaggeration is, and what the entrepreneur hopes it will be. Our findings suggest that although respondents penalize exaggeration that penalty does not result in the venture being viewed as illegitimate. By straddling the line between preparedness and exaggeration, entrepreneurs assume – correctly in some cases – that their statements are likely to be interpreted in a positive light
IV
All of these point to the fact that it's not the lie per se that's the issue, it's that sometimes the lies are clumsy. The key is to ensure your tale is as soaring as Matt Levine describes, while having as few of the dumb lie details as Alex Danco warns about. It's a fine line to tread.
What it mostly reminds me of, actually, is foreign policy. Specifically a doctrine in international relations called "strategic ambiguity".
A policy of deliberate ambiguity (also known as a policy of strategic ambiguity, strategic uncertainty) is the practice by a government of being intentionally ambiguous on certain aspects of its foreign policy. It may be useful if the country has contrary foreign and domestic policy goals or if it wants to take advantage of risk aversion to abet a deterrence strategy. Such a policy can be very risky as it may cause misinterpretation of the intentions of a state, leading to actions that contradict that state's wishes.
With this theory, the key is that nobody should be able to falsify (or even clearly pin down) your position on an issue. At best you're a Rorschach test for your audience, and otherwise your position is one everyone can plausibly interpret as not antagonistic.
If your position gets firmed up, though, there is no route left for strategic retreat and you're toast. This holds even when things are overwhelmingly clear, like whether Israel would intervene externally, or the UK regarding its missile strategy. Or, regarding the most recent geopolitical occasionally-hot potato:
To state it succinctly, the United States has never had an iron-clad security commitment to defend Taiwan. That policy of “will-they-or-won’t-they” has worked well for over 70 years and should continue to serve American interests.
... Moreover, the administration deliberately sought to “fuzz up” the security pact in such a way that the territories covered by the document were unclear.
(If this sounds like the sort of thing that five year olds do in the playground, I can tell you from personal experience that it very much is.)
What I think is happening with our Bloom friends is closer to this notion than anything else. There is an elaborate, well-choreographed dance from the founders that the investors accept. "We have the best team on the planet" and "there's nothing else like this on the market" are taken as statements that are, as Peter Thiel put it, to be taken seriously but not literally. But "our numbers are this high", when they're not, is unfortunately a bit too specific.
On the one hand it feels like there should be a clear line between a startup founder's routine puffery and outright lies that amount to securities fraud. On the other hand it's an extraordinarily fuzzy thing to nail down, as Matt Levine loves to point out again and again.
In our system, which deals with issues of morality and ethics by adversarial checks and balances, what is the optimal number of sleazeballs and frauds?
It's not zero, because then we'd be nowhere near the optimal growth trajectory. Stop doing things, stop trying to push for the future we'd want, and you'd make a ton of Type I errors and not just Type II. If we count our future selves (not to mention our descendants) at all, then we'd want to try to grow as much as we can for their sakes.
I don't think there is an easy answer, short of adding yet another regulatory agency burdened with teasing out the truth, on top of the existing regulatory and legal systems. And even that I'd only want to do if we think we're running far above the acceptable level of fraud.
But I'm not sure we know what that is. Most of our attempts at finding out act as if the only acceptable answer is zero, which doesn't help.
And the mistake that Trevor Milton, Nikola's founder, made was telling a lie that was unambiguous: calling his technology an HTML5 supercomputer.
If what you say is falsifiable (or in this case risible) you're toast!
The artistry of the lie seems not in whether what you're saying is true or false, but in whether it's even falsifiable.
V
There's one complication to the theory above that what you need to avoid are the outright lies. This is the mysterious case of senior executives who were found to have lied about their college credentials several years (sometimes decades) later and made to resign. Some examples:
Sandra Baldwin, former president and chairman of the US Olympic Committee, lied about having a doctorate from Arizona State
David Edmondson spent 11 years working his way up RadioShack until he was CEO, then had to resign because he had lied about his bachelor's degree
Scott Thompson, CEO of Yahoo had to resign because he said he had a computer science degree, but only had an accounting degree. Dan Loeb found this out and made noise, because presumably what Yahoo lacked was a computer scientist as CEO. As we know, this worked out well.
My favourite, Marilee Jones, was with MIT for 28 years as Dean of Admissions, before they realised she didn't have any college degrees.
The happy one here, for a change, is James Peterson, CEO of Microsemi Corp, who lied about his degree from Brigham Young, but whose company decided to keep him as CEO, presumably because his performance trumped a bachelor's degree.
Two things stand out from this list: 1) lying doesn't seem to have held these folks back all that much, the puritanical seppuku firings aside, and 2) there's a hell of a lot of it going around!
One analysis found that some 58% of hiring managers have caught applicants exaggerating or fudging details in their resumes. The key word there is "caught".
One might argue lying about your background or education isn't the same as lying about whether your product, you know, actually works. But they're both servants of the narrative you're going for. As Neil Gaiman said, you get work however you get work.
VI
So are we lying too much, or too little?
Despite the red-hot media rage to burn the liars, there seems to be no dearth of fibs. What this tells me, anyway, is that our insistence on the facts does occasionally get out of hand, and perhaps isn't all that optimal anyway.
What does this mean for Bloom, née Lambda? Or Nikola? Or the newly budding entrepreneur painting a picture of a prophetic future?
I think the broadest extension of this is perhaps to back away from the virtue ethicist view of "lying is bad" to a more (bounded) consequentialist view of "how much could you have foreseen". If we're playing with shades of grey and narratives then some lies turn out to be okay and necessary while others not. My heuristic has always been if what you're lying about is going to have pretty messy consequences, then maybe don't! But I can also see why that can't be a categorical imperative to everyone else.
Regardless of his charm, I'd be pretty pissed at the fact that Frank Abagnale pretended to be a doctor. I'm much less annoyed that he lied about going to law school since he passed the bar.
Alex Danco ends his essay talking about how it's all about not lying like an idiot. It's about being artful, conforming to an aesthetic. Saying you'll take over the world can be bad, even rolling a truck downhill is shady, but the worst lie is the HTML5 supercomputer. That lie says more about the inadequacy of the liar than anything else.
I think this is right, but it doesn't account for the entire corporate sector seemingly lying most of the time. These aren't artful dodges; they're easy lies that nobody bothers to check because they sound plausible.
My theory is that lying is more akin to playing a game. In this game you're co-opting the other player(s) to your point of view. To convince them of your purpose, your will, your innovation. You tell the narrative that's needed to convince them. As with any game, you can do it clumsily and badly, or do it well. And that seems more important than the specifics themselves.
(This isn't a knock on narratives. It's a side effect of the fact that when reducing the craziness of the world to a narrative, sometimes lies are useful, if not necessary. But more on this in another essay.)
Are we all begging to get lied to, as Pamela says? Are we all being patsies to each other? Or is a certain percentage of fibbing, even about facts, just the best way we've found to convince each other about a possible future?
If the lies are mostly gonna be inconsequential towards the greater goal, then they end up sliding, whether that's about a product that kills folks occasionally or your college degree or how great your program is.
The best we can do may be to focus on the narrative and its intent, and hope nobody gets killed en route. This may be the most contentious I've ended a post on, but I'm genuinely not sure how or where or even if we can draw the line.
Outright lies about facts, bad, depending on the facts. Outright lies about beliefs, kind of bad. Kind of lies about facts and misrepresentation, bad depending on the narrative. Kind of lies about what could happen vs what you think will happen, meh. In that grey area where you can credibly claim to believe in the exceptional is where you get to pull people to the future.
I hope when you lie next you too will do it in the service of the greater narrative and keeping the foreseeable consequences in mind. The marginal lie is the lie that tries to get at a truth of the world the only way it can, by bending the facts that stand in the way.
Thank you for this detailed article on an important topic. A few additions:
Some lies have more consequences than others. Our society actually values some lies (e.g. lies that allow everyone to save face and don't create any victim). I take it that you are focusing here on lies that carry meaningful consequences.
Lying is a behavioral habit to an extent: some of us lie more, some of us less. This matters a lot, because it may become part of your brand, and I doubt anyone wants to be branded as a liar.
Lying is putting your credibility at risk. Is it worth it? What's the next best available alternative? Could there actually be a better alternative? In my experience, there typically is, but feel free to try me.
If I remember correctly, a world-class liar once told me: "it's only a lie if you get caught". I also remember them telling me the difference between a great and an okay liar: "a great liar never lies".
There is a fundamental difference between talking about what was/is and what will be. Unless speaking with an accredited fortune teller, nobody should take anything one says about the future as anything but a dream that may or may not become reality.
An interesting anecdote I have comes from the perspective of an "engineer who worked closely with over a dozen startup founders": usually when I give probabilities, they end up being turned into ~100% or ~0% based on whether or not they fit the narrative.
"Yes, I think this NLP product can be done within a month or two, I'd say 50-60% chance"
"No, I don't think I can make it run on ARM, I mean, I'm not sure, but I'm like 2/3rds certain"
Somehow this gets turned into: "Within a month we'll have this NLP tool available for Android!"
This is actually fine, in the sense that a 1/6 chance of getting a nice product is actually OK. If you approach 3, 4, 5, 6, 10 ideas that way, chances are one will work.
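Assuming (hypothetically) that ideas succeed independently, "chances are one will work" is just the complement rule: the probability that at least one of n ideas with a 1/6 success rate pans out is 1 - (5/6)^n.

```python
# Chance that at least one of n independent ideas works,
# assuming each has the commenter's 1/6 success probability.
def p_at_least_one(n: int, p: float = 1 / 6) -> float:
    return 1 - (1 - p) ** n

for n in (3, 4, 6, 10):
    print(f"{n} ideas: {p_at_least_one(n):.0%}")
# 3 ideas: 42%
# 4 ideas: 52%
# 6 ideas: 67%
# 10 ideas: 84%
```

Past four or so tries you're more likely than not to have a winner, which is the founder's actual bet, even if each individual pitch got rounded up to ~100%.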
My question is really "why not say there's a 1/6 chance it will work?", after all investors/early-customers/employees understand that new products are risky business and adjust accordingly.
I assume it's a cultural thing where people don't deal well with probabilities, which leads to assigning a probability being read as "this guy doesn't think he can do it", because people only assigned probabilities to things that were basically impossible, and set the numbers way too high.
But it seems likely that the culture will shift away from that in the near future.