With Theranos, can we be done with disrupt culture now?

A chemistry is performed
so that a chemical
reaction
occurs and generates
a signal from the chemical
interaction
with the sample, which is
translated into
a result, which is then
reviewed by certified
laboratory personnel.

Okay, that’s not really free-verse poetry, but I thought maybe if I laid the words out as if they were, it would help convey something—anything. Nope. In fact those were the exact words used by the CEO of a $9-billion corporation to describe the technology behind its value proposition; and according to Maya Kosoff and Nick Bilton, both writing for Vanity Fair, those were the words that inspired Wall Street Journal reporter John Carreyrou to begin investigating the now-disgraced Theranos corporation to see if there was any substance behind all the hype. There wasn’t.

Two years ago, Theranos was a darling among tech start-ups—or at least with the tech press. With a Stanford-dropout founder, Elizabeth Holmes, becoming the world’s youngest “self-made” female billionaire, and its promise to upend the blood-testing industry, Theranos could not have been better scripted into the anthology of Silicon Valley’s “disrupt culture.” Unfortunately, Theranos’s story was too scripted; and last week, the SEC accused both Holmes and president Sunny Balwani of perpetrating a “massive fraud,” misleading investors “about the company’s technology, business, and financial performance.”

In fairness to the major venture capitalists of Silicon Valley (I never thought I’d write that clause), Bilton did note in his 2016 article that none of the big players actually invested in Theranos. Couple that with Holmes’s cryptically obtuse explanation (above), published in The New Yorker in December of 2014, and it’s hard not to wonder how the company rose to such heights without anyone, other than apparently Carreyrou, kicking the tires a little harder. But in that same 2016 article, Bilton answers this question rather pointedly, blaming the fecklessness of the Silicon Valley tech press. “They embraced Holmes and her start-up with a surprising paucity of questions about the technology she had supposedly developed. They praised her as ‘the next Steve Jobs,’ over and over (the black turtleneck didn’t hurt), until it was no longer a question, but seemingly a fact,” Bilton wrote at the time.

Granted, a lot of tech news is fairly innocuous. Reporters get invited to flashy launches, attend conventions like CES, and publish articles, blogs, vlogs, etc. about the latest gadgets that may or may not be in production six months down the road. Right or wrong, the debut of the vibrating denim shorts isn’t likely to be a matter of life and death, which cannot be said of a company like Theranos jumping into the medical industry without anybody asking a tough question or two.

Of course, none of this is surprising if you’ve followed reportage about internet giants like Facebook and Google. Until the fallout from the 2016 election—which is still falling out, by the way—revealed Russian hacks, mass data breaches, and caches of bogus news, it was pretty tough to get the mainstream press to say boo about these companies. “It’s a game of access,” wrote Bilton in 2016, “and if you don’t play it carefully, you may pay sorely. Outlets that write negatively about gadgets often don’t get pre-release versions of the next gadget. Writers who ask probing questions may not get to interview the C.E.O. next time he or she is doing the rounds. If you comply with these rules, you’re rewarded with page views and praise in the tech blogosphere.”

That sounds like a fair description of the atmosphere before November 2016. Even when it came to stories about harassment or serious crimes like sex-trafficking, it was rare to see the technology press insinuate that platform operators might bear some responsibility. The underlying theme that internet equals freedom (not to mention stock valuation), so don’t touch it, continued to at least color—if not dominate—the narrative. But now, that narrative has shifted, and last week’s press release by the SEC addresses the industry directly, stating, “‘The Theranos story is an important lesson for Silicon Valley,’ said Jina Choi, Director of the SEC’s San Francisco Regional Office. ‘Innovators who seek to revolutionize and disrupt an industry must tell investors the truth about what their technology can do today, not just what they hope it might do someday.’”

Indeed. With stories like Theranos, the shake-up at Uber, and the still-unfolding saga of revelations about voter data-manipulation via social media, the SEC’s sober warning resonates well beyond the investment community. Internet and other technology companies should tell the truth not just to shareholders but also to the public, who are all stakeholders. For too many years, we’ve accepted the premise that any form of restraint (i.e. rule of law) in cyberspace will “hurt the innovation.” Apropos of the SEC’s warning, though, internet platforms et al. should be required to define more clearly the “innovation” supposedly being stifled by certain restraints; and maybe—just maybe—it’s the tech press who should be asking some of those questions.

The Freedom to Unplug

Photo by the author.

Today, I live in a somewhat economically homogeneous community, but back in the 1990s, when we still lived in the financial mosaic of Manhattan, I made a note in a journal that people seemed to want to succeed in contemporary, technological society in order to win the reward of living more as organic beings separate from technology. Put another way, we live our lives and do our jobs by plugging into systems in order to earn the freedom to comfortably unplug from as many systems as we can. Why else do leisure-time pursuits so often involve dirt, water, sun, fresh foods, silence, conversation, and a general embargo on high-tech gadgets?

I was thinking about those days while reading an article in the NY Times by Nick Bilton titled “Steve Jobs Was a Low-Tech Parent.” Beginning with an anecdote about Jobs’s own kids not being allowed to use the iPad when it was first released, Bilton cites several examples of top executives in the computer tech industry who place some rather strict limits on their own children’s time spent with various devices. He wonders whether these digital executives teaching analog values to their kids might “know something the rest of us don’t,” but I’m not sure that’s quite right. It is tempting, of course, to call these tech-industry parents hypocrites for selling their wares to our children while sheltering their own, but I suspect that many of us know exactly what these parents know — that too much screen time is probably unhealthy. As such, I would not be surprised to learn that households headed by parents in the upper echelons of other industries are likewise rigorous about restricting iPads and such for their kids. I really think it’s about economics.

It should be stipulated here that post-Boomer parents do have an apparently endless supply of theories about child-raising. We Gen Xers knew two things as our first children were born: 1) that we had a rapidly increasing wealth of information being made available to us; and 2) that our parents were unflinchingly wrong about everything. (It’s a wonder we lived, really.) Perhaps the most extreme manifestation of these converging phenomena is the conspiracy of parents who still refuse to vaccinate their children, literally bringing hideous diseases back from extinction and representing one of the greatest failures of the so-called information revolution. Certainly, the data are less clear regarding the effects of tech toys on children than, say, pertussis; yet I haven’t encountered too many parents who don’t at least make conscious choices, pro or con, about how much screen time they feel is too much.

Back to economics, though: let’s face it — a contemporary middle-class household is a hectic environment, consistently pressured by the reality that many of life’s basic needs (e.g. medical care) continue to rise in cost, outpacing our ability to earn. Add a couple of kids and their divergent, asymmetrical, and at times unreasonable demands, and we rely increasingly on our own devices to achieve that elusive work/life balance they keep talking about in the magazines. The balance, of course, is the tricky part, isn’t it?

After all, it’s good news/bad news that we can read a client’s email during a dinner that got off to a late start because somebody had martial arts practice; but if you are in fact the client (or boss) in that equation, you are unquestionably freer to ignore that email and engage in conversation with your kids, just as low-tech Steve Jobs reportedly did. In turn, the parents’ freedom to unplug models the behavior they want to instill in the child for whom they have set related limits. But in today’s frenetic, middle-class household, patterns or rituals can be very difficult to maintain, and our many “helpful” devices and their apps are not designed in any way to complement human rhythms or cycles; they much prefer us multi-tasking, always on, and a bit jittery. At what point we become extensions of the technology rather than the other way around is an ontological question I won’t attempt to answer.

So, do these tech-industry parents mentioned in Bilton’s article imply a measure of responsibility on the part of manufacturers? Should we expect Apple to provide warning labels on iPads? Caution: Extended time playing Minecraft may make your child a pain in the ass at home and a lousy student. We probably shouldn’t hold our collective breath for that one or anything like it; and I don’t personally think it is the makers of these technologies who bear that responsibility any more than heavy metal bands are responsible for anti-social behavior in teens. Nevertheless, these digital tools/toys are unquestionably having both positive and negative effects on kids, and the most important feature for parents, regardless of the promises in every new release, will probably still be the Off button.