Remix Culture & Food?

It was while sneaking one of my guilty-pleasure foods, a small bag of Cool Ranch Doritos, that I read the New York Times article “Rethinking Eating” by Kate Murphy, in which she reports that Silicon Valley is getting into the food business.  Well, the sustenance business anyway.  I’m not sure food is the goal in any of the cultural, social, or personal connotative senses of that word.  But technologists getting into the sustenance game isn’t necessarily a bad thing: applying algorithmic genius to the task of creating nutritional, and maybe experiential, substitutes for animal-protein-based foods.  Certainly, your vegetarian friends will remind you that animal-protein foods come with myriad downsides, ranging from environmental impact to cruel treatment of the animals themselves to any number of potential health hazards for the eater.  At the same time, too much of the still-growing world population remains hungry, so it is not inconceivable that computer scientists mucking about in the world of algae and protein could carry on the legacy of Norman Borlaug, winner of the 1970 Nobel Peace Prize for developing high-yield “dwarf wheat,” credited with saving a billion lives.

As Murphy reports, “Instead of the go-to ingredients previously used in animal protein substitutes — soy, wheat gluten, vegetable starches — Food 2.0 companies are using computer algorithms to analyze hundreds of thousands of plant species to find out what compounds can be stripped out and recombined to create what they say are more delicious and sustainable sources of protein.”  No question, it’s an interesting area of research, and in all likelihood, this experimentation will yield some benefit the scientists aren’t even seeking.  Isn’t that part of the fun of science?

On the other hand, food scientists still trying to understand Food 1.0 have only just begun to seriously explore the microbial biodiversity of the human gut.  It is understood, for instance, that the innards of typical Western citizens are home to a more homogeneous microbiome than they likely were in the past, while societies still living and eating more “primitively” show signs of greater microbial diversity.  How exactly certain microbes benefit humans — and thus how their absence may be harmful — is a science still in its infancy, but researchers theorize that the increase in certain diseases in the developed world may be a manifestation of our unwittingly killing off symbiotic species of bacteria and the like.  And since research in this area is so new, I’m going to assume that the algorithms being tested in Food 2.0 labs cannot account for these myriad chemical interactions between man and his meat, as it were.

At a glance, the efforts of these food-tech entrepreneurs appear contrary to contemporary trends in culinary wisdom, which seek food sources unsullied by mass-production processes that often strip out the very elements our bodies need in the first place.  While the science of gut biodiversity is still nascent, the general consensus among the food-conscious is to follow the wisdom of experts like Michael Pollan, who advises (if I may paraphrase), “Eat food and enough of it, and don’t eat that which is not food.” (My Cool Ranch Doritos definitely belong in the “not food” category.)  In short, we don’t necessarily need to know what every microbe does so much as we need to understand, generally, that we should consume a fairly broad range of foods that are not over-sanitized, because different symbiotic microbes thrive on different elements in the diet.  This may be why the Western diet, a bit too rich in protein and sugar, has sustained certain microbes while killing off others.

And of course nobody needs me to tell them that, at its best, food feeds the soul, which may be much harder to factor into any equation than the probable influence of a single microbe.  So, it will certainly be interesting to see what comes from this new line of R&D, but historically, technology has a way of tasting like technology.  Anyone who has ever eaten a grocery-store tomato alongside a farm-stand tomato knows what I mean.  If this research leads to solutions that address world hunger and/or environmental and health hazards associated with current food production models, bring it on.  But if it’s a bunch of guys developing yet another way to treat food like a necessary evil that gets in the way of work or some other activity, that may not be progress for the human experience.

We like what appear to be ready solutions, like e-cigarettes, which are so far unregulated on the assumption that they’re safe and are, therefore, being marketed to kids with sugary flavors like snickerdoodle (yeah, I was surprised by that myself). So, perhaps these food-tech guys are onto something, but they certainly appear to be investing in the opposite of the proposal that suggests maybe we just stop poisoning the apples and eat the damn apples.

On Being a Luddite

If you say anything publicly critical of the Internet, there’s a good chance some technobrat on Twitter will call you a Luddite.  The simple irony is that the word Luddite in this case is being misused, which is something the would-be accuser might discover for himself were he to look it up on the Internet.  In an excellent article on Smithsonian’s website, for example, Richard Conniff confirms that we critics of Web utopianism are indeed following in the tradition of the original Luddites, who were highly proficient users of the machines they destroyed but critical of how those machines could be used to ill purpose by powerful interests.  If anything, the Luddite protest of 1811 by English textile workers has more kinship with the spirit of Occupy Wall Street (coincidentally of 2011) than with technophobia.

Of course, it is in the nature of the English language for word meanings to change with use, so there’s nothing linguistically wrong with calling someone a Luddite who you believe to be anti-technology.  But the revisionist history inherent in this particular semantic shift is relevant because there is a great deal wrong with both the intent and the implications of this over-used and unexamined pejorative.

To scrutinize technologies from a broad perspective, including matters of law and policy, is anything but an endeavor toward an anti-progress agenda. To the contrary, policy sometimes promotes technological advancement to the scorn of one set of interests, or constrains it to the frustration of another.  One need look no further than the brewing battle over gun regulation to see this in action right now. A high-capacity magazine is certainly a technological upgrade; but to some Americans, it is a useless improvement except in the hands of professional soldiers or mass murderers, while to other Americans, access to this technology is a matter of civil liberty.  We see this all the time: one man’s freedom is another’s industry run amok.  Either way, it’s a safe bet that no matter what policy emerges, neither position will be wholly satisfied.  Welcome to democracy.

A glance at the mundane technologies around us reveals that the advancement and adoption of progress almost always require a balance among free enterprise, civil liberty, and regulatory policy.  Even something as innocuous as the CFL bulb has raised libertarians’ hackles to near incandescence (yes, I went there) over the right to use whatever bulbs they want; but it’s a given that the incandescent filament variety will soon be phased out as a matter of both personal habit and public policy.  When color television was ready for mass consumption, the federal government instituted the standard we know as NTSC, which required that the color signal be transmitted in a manner that owners of black-and-white TVs could still receive it.  This was never the best quality the technology could have delivered — some engineers call it Never Twice Same Color — but it is an example of public policy constraining technology in order to avoid disenfranchising citizens from news and information based solely on their financial means. In case you’re just tuning in, that’s the government protecting free speech and access to it by keeping the technologists in check.

I was thinking about this balancing act while working on an old carriage house that sits on the property where I live.  As far as records show, this house belonged primarily to village doctors from the early 19th century up to at least the late 1960s, so the presence of the carriage house is explained by the fact that its early physician owners made house calls by horse and buggy.

Taking care of historic structures is an exercise in both the practical and the abstract. One must simultaneously consider past, present, and future in a way that is neither simple nor purely academic and sentimental. Fortunately for me, my best friend Craig is a professional conservator who works on historic buildings for the National Park Service; so unfortunately for him, this means I get his help wrangling with my old structures in exchange for beer.

Historic building conservators deal with philosophical questions as much as with the technical means of preservation; and the perennial question seems to be one of value (i.e. why something is preserved in the first place).  “Preserving old buildings for the sake of sentiment alone,” Craig says, “is like advocating technological progress solely for the sake of making something new.”  An apropos simile in this context.  What we preserve of the past implies the question of what we protect from the future. There is nothing inherently anti-progress about this question unless progress must exclusively mean to leap without looking.

Lining the ceiling of the ground floor of the carriage house are a number of ceramic knobs that were used in a wiring system known as Knob & Tube, outdated by the turn of the 20th century but still found in active use in some historic homes, much to the frustration of new buyers. It occurs to me that building code makes a pretty good example of public policy working hand in glove with technological advancement. While it isn’t a perfect system, the principle applies that soon after a more efficient, safer, better way to do something emerges, it becomes mandated by law. At the same time, we continue to learn by not entirely erasing the systems and structures that tell the stories of the past.

The last time I was personally called a Luddite was in reaction to my recent post about my kids and online piracy.  This is fairly typical of the techno-utopian response to those of us who believe systems like copyright remain something more than outdated ideas waiting to die like K&T wiring.  The more complex irony in this familiar ad hominem is that, just like puzzling the maintenance of a 150-year-old barn, we contemporary “Luddites” are actually taking a much more expansive view of the future than our detractors.  Perhaps this has something to do with the fact that, like our namesakes of 1811, we filmmakers, musicians, photographers, authors, designers, etc. are expert users of the technologies we’re not afraid to criticize.

Workspace of a Technophobe