U-All-No and How We Won the War

U All No, from the Hidden City blog's post about the inscribed brick smokestacks of the Philadelphia area. 

I spend a lot of time on the Amtrak, shuttling between New York and Philadelphia, and one of the many delights of that stretch of the Northeastern rail corridor is this smokestack on the outskirts of Philly: 

There is something hauntingly defiant about this disused smokestack. From its cacographic "U" to its punning reduction of "know" to "no," I've always been cheered by its persistent spouting of this little bit of near-nihilism up in the Northern fringe of the city. 

But what is it about?

"U All No" was an after-dinner mint produced by the Manufacturing Company of America. It turns out that they played a critical role in the US war effort during the First World War. 

I'm not sure when exactly the Manufacturing Company of America started making the mints, but the company registered their trademark for the words "U All No" on June 5, 1906.  

Candy was a big deal in the Progressive era, as sugar consumption among Americans spiked, and as temperance activists promoted candy-eating as a sober alternative to the temptations of demon liquor -- or even as a substitute for it, satisfying the same cravings. As A.C. Abbott, Pennsylvania's state health commissioner, put it: "The appetite for alcohol and the appetite for candy are fundamentally the same." (For more on this, check out Jane Dusselier's essay on candy-eating and gender in the collection Kitchen Culture in America.)  

In the wake of the 1906 Pure Food and Drug Act,  modern candy makers emphasized the scientific purity of their products. "U All No" mints even made the 1907 Good Housekeeping Pure Food "Roll of Honor." The magazine noted: 

"Made in a peculiarly cleanly manner, mostly by machinery, from cane sugar and ingredients chemically tested for purity and uniformity. This firm maintains a specially equipped laboratory, in charge of a graduate chemist of the University of Pennsylvania, where critical tests are made of every material entering into the candy."

However, the reason these mints helped win the war was not because of their ability to divert Americans from the intoxications of booze to the intoxications of sugar, nor because of their invigorating freshness, nor because of the lab-certified purity of their production.

It was all about the tins.

When the US entered the First World War, it faced the problem of transporting American-factory-built fuses and detonators 4,000 miles or more, over land and sea, to the front lines. Fuses are fragile and persnickety. Moist air can cause a detonator mechanism to malfunction. As William Bradford Williams put it, rather ghoulishly, in Munitions Manufacture in the Philadelphia Ordnance District (1921):

"A dampened fuse when placed in a projectile results in a 'dud,' and a dud never raised the mortality rate of the German soldiery."

The Manufacturing Company of America had faced a very similar problem when they contrived to deliver their mints as fresh as the day they were made to the post-prandial candy-cravers of these United States, leading to the development of a box that was "absolutely air-tight and moisture-proof.... hermetically sealed against light, water, dust and air."

Good enough to suit the needs of Army Ordnance, and deliver minty-fresh fuses and detonators to the front.

According to Williams, the Manufacturing Company of America allowed the government to take over the production line at the U-All-No plant, modifying the process to build tins large enough to fit detonators for "high-capacity drop bombs" and fuses for Livens flame-throwers. They continued to make mints, though, for our boys in the army. Quoth Williams: "A large part of the firm's U-All-No After Dinner Mint was taken over by the government to supply the insatiable demands of our boys overseas for a few of those delicacies to which they had become accustomed at home."

A U All No tin, in black and white.

Addendum: The Masticating Ape

As summer winds down, I've been catching up on some old America's Test Kitchen podcasts, including one from June 6 that adds a little monkey business to my earlier musings on Soylent and chewless foods.


In the podcast, Christopher Kimball interviews Harvard primatologist Richard Wrangham about the substance of his 2009 book, Catching Fire: How Cooking Made Us Human. (You can listen to the podcast here; the interview starts at the 16-minute mark.)

I'm not super wise to the latest theories in human evolution, but apparently the conventional argument is that meat-eating is key to explaining the emergence of modern humans. Wrangham argues that the shift to eating meat could not have occurred without cooking. Cooking is what makes the hunter-gatherer lifestyle possible, and all the things that go with it: bigger brains, gendered social structures, culture. A key part of his argument rests on the relationship between cooking and chewing.

Lucy taking a break from chewing, apparently.

According to Wrangham, chimpanzees typically spend six hours a day chewing, and then another couple of hours in a post-prandial lull, digesting. It's not just the actual procurement of food that requires energy, it's the consumption and assimilation of it. All that raw plant matter has to be broken down by time, effort, and big guts. You can see the evidence of this in the anatomy of our plant-eating Australopithecine ancestors (hi, Lucy!): they have great big jaws, big teeth, and big guts.

On the other hand, modern adult humans spend less than an hour a day chewing – a figure that remains consistent, Wrangham says, despite regional, cultural, and economic differences. Unlike our Australopithecine forebears and our living primate relatives, we have relatively small guts, like carnivores, and relatively large and fuel-hungry brains. This anatomical shift is in evidence about two million years ago, with the emergence of another of our ancestors, a species we call Homo erectus.

Homo erectus had small guts, like a carnivore, but did not have sharp carnivore-like teeth to tear meat off bones and consume it raw. Moreover, although meat was important to the Homo erectus diet, it would not have been consistently available year-round. But if Homo erectus meals varied seasonally between being meat-dominant and plant-dominant, their small guts and small jaws would not have been up to the task of extracting enough calories from plant matter.

Wrangham argues that cooking resolves these puzzles.  Cooking changes the material and chemical properties of food, which has two evolutionary advantages: it makes food softer, meaning that less time needs to be spent chewing and digesting, and it denatures proteins and breaks apart chemical bonds, making more calories and nutrients biologically available.

The days of our lives are numbered, as are the hours in each day. Less time spent chewing leaves "more time for other things – going to war, gamboling under a tree, writing poetry."   

To illustrate the increased caloric payload of cooked food, Wrangham describes an experiment with rats. One group of rats was fed hard pellets, the other group was fed the same pellets that had been aerated to make them soft and tender. Both groups of rats technically ate the same number of calories, but the rats eating soft pellets had 30 to 40 percent more body fat. (The correlation between softer foods and bigger bodies has also been observed in humans, and is cited as one of the possible explanations for increasing levels of obesity in the developed world.) 

Cooking was essential to the emergence of hunter-gatherer cultures because it indemnified against the inherent risks of hunting. Chimpanzees "love meat," Wrangham says, but rarely eat it, and only spend about twenty minutes a day hunting. Why? Because if they go out hunting all day, and fail, there is no way to make up that day's caloric deficit – there aren't enough hours in the day to hunt, fail, and chew leaves for six hours. Cooking meant that humans (men, according to this theory) could venture out and hunt all day, and even if they failed, they would still be able to consume and assimilate enough calories (dished out by women, or whoever else stayed near home base) to make up for the loss. 

For Wrangham, cooking is not only key to understanding the evolutionary history of the human species, it is also a uniquely human technology: harnessing external energy sources to improve and enhance the energy-providing qualities of food. Instead of using only our own biological, bodily resources and processes (chewing, digesting) to extract the energy and nutrients from food, cooking takes over some of the work that our hominid ancestors did with their gnashing teeth and their churning guts. 

So perhaps chewless foods – like Soylent, or like those Hugo Gernsback imagined in Ralph – are a brave new stage in evolutionary history, and perhaps our descendants will only use their dainty teeth as ornaments and mementos of a chewier, tougher-to-swallow past.

Are Teeth Necessary? Chewing on the Food of the Future

There's been a cluster of recent articles about Soylent, the Silicon Valley open-source pap that is supposed to be the perfect fuel for knowledge-workers' ceaseless sedentary labors. "What if you never had to worry about food again?" is Soylent's slogan, and the product promises to resolve all our nagging food anxieties. Not only: what's for dinner? But also: is it good for me? Will it make me fat? Does it wreck the environment or exploit migrant farm workers?  Will it get crumbs on my keyboard, and make me look conspicuously sad and slovenly as I eat yet another meal at my desk? Soylent is a powder (either purchased from the company or DIY) that, when mixed with water and oil, forms a nutritious beige slurry - allegedly capable of providing sustenance for hours of uninterrupted, untroubled, supremely focused labor.

But in all the chatter about the resultant mephitic farts and "the end of food," I haven't heard much said about how Soylent revises the actual mechanics of eating. It is a chew-less food, and this places it in a particular tradition of techno-scientific "foods of the future." The company's name, of course, is an explicit (either ironic or ill-considered) reference to the eponymous edible in the film Soylent Green, a nutritious wafer allegedly derived from algae, but which we all know by now is people. But other, earlier science-fictional precursors to this kind of all-in-one food product are perhaps better models for Soylent's particular material ideology.  

Gernsback demonstrating one of his many inventions, "The Isolator." "Outside noises being eliminated, the worker can concentrate with ease upon the subject at hand."

For instance: Hugo Gernsback's Ralph 124C 41+: A Romance of the Year 2660. First serialized in the 1910s, Ralph gets hazed as one of the worst novels to have ever made it into print, and I suppose most people read it as a historical curiosity rather than with genuine relish. (Gernsback is the Luxembourgian immigrant credited with creating "science fiction" as a pulp magazine genre, which was initially a sideline to promote his radio-and-electrical hobbyist mail-order emporium. He's the guy the "Hugo Awards" are named for.)

Ralph and Alice explore New York 2660 on tele-motor-coasters.

Ralph, the scientist-hero of the story, is one of literature's most dogged and unflappable mansplainers. A rudimentary damsel-in-distress plot serves as the occasion for him to take his lady-love, Alice, on a guided tour of future New York. Total weather control? Sleep-learning? Solar-powered generators wirelessly transmitting energy? "Alomagnesium" roller skates (er, "tele-motor-coasters") for smooth gliding over crack-less "steelonium" sidewalks? They've got all the mod cons. Earth circa 2660 is a place where the forces of nature have been entirely subdued, and where all matter (and ether) has been organized to facilitate a particular kind of human design: maximally efficient, maximally automated, where form always follows function, and where waste of all kinds is assiduously eliminated (eg, the lossless conversion of solar to electrical energy; the time we once wasted sleeping now a time for productive learning).

Rob Rhinehart, the creator of Soylent, is but a stripling of twenty-five, yet his fixations seem to spring directly from this Progressive-era obsession with maximizing efficiency and minimizing waste. The idea for Soylent occurred to him when he became frustrated by the time, labor, and expense necessary to feed himself adequately during the waning days of a failing start-up. An engineer by training, Rhinehart began to perceive food itself as inefficient, a poorly designed vehicle for the delivery of the chemical compounds that sustain life. As he puts it in Lizzie Widdicombe's fantastic New Yorker profile, "You need amino acids and lipids, not milk itself... You need carbohydrates, not bread." Fruits and vegetables? Sure, they've got vitamins and minerals, but as a matter of fact they're "mostly water." And so he did his research, streamlined life's necessities to a list of 35 essential vitamins and nutrients, and ordered the raw materials for his simplified, complete food off the Internet. It's got everything you need, nothing you don't.

For Rhinehart, food's inefficiencies begin at the source: agriculture. Farms, he explains, are "very inefficient factories" that require excessively strenuous and dangerous work from an impoverished underclass. Unlike slow-food advocates who prescribe a return to skilled, artisanal practices to restore dignity and meaning to farm work, Rhinehart believes that the solution is to increase mechanization and industrialization: "There’s so much walking and manual labor, counting and measuring. Surely it should be automated.” 

This is certainly a sentiment that Ralph would get on board with. Food in 2660 is grown in vast, machine-tended, accelerated-growth greenhouses, stimulated to rapid ripeness by artificial lights and electric currents. And when it's not grown, it's manufactured. Taking Alice on a tour of a synthetic food factory, Ralph proclaims: "Men of an inquisitive nature must have asked themselves the question for thousands of years, 'Why grow grass, let the cow eat the grass, digest it, and finally turn it into milk? Why not eliminate the cow entirely?'"  

But while I think Rhinehart would definitely be for eliminating the cow, he still concedes the social and emotional need for traditional meals, prepared with care, eaten in the company of others -- "recreational food," he calls this, arguing that Soylent actually makes these indulgences less fraught, heightens their pleasure and meaning, by taking the problem of mere sustenance off the table. Soylent provides everything you need, nothing you don't, so that when you do choose to chomp on larks and pavlovas, you needn't worry about ruining your diet. Your diet is taken care of.

In Ralph's world, on the other hand, the material consistency of food is as important as its nutritional composition. The future food in Ralph's world is exclusively chew-less. When Ralph escorts Alice to a "Scientificafé," he assures her, "I think you will prefer it to the old-fashioned masticating places." Crucially, the "scientific food" served at these restaurants is available exclusively in liquefied form. Chewing (or, as Ralph invariably puts it, "masticating") is just another inefficiency, one that technoscience has rendered no longer necessary.   

Let's accompany Ralph and Alice on their date at the Scientificafé, shall we? Before entering the dining room, they tarry in the Appetizer, "a large room, hermetically closed," where pages from humor magazines are projected on the walls. When Alice grows peckish, Ralph explains: "The air in here is invigorating, being charged with several harmless gases for the purpose of giving you an appetite before you eat -- hence its name!"

After being gassed into a proper state of hunger, they then proceed to the "main eating salon," white-and-gold luxe in international moderne style. There are no waiters, no attendants, and the room is silent save for a "muffled, far-off, murmuring music." The diners recline in leather armchairs, in front of a complicated silver board at whose side hangs a flexible tube capped by a silver nozzle, resting in disinfectant solution.

You feed through the tube. "Meat, vegetables, and other eatables, were all liquefied and were prepared with utmost skill to make them palatable." The silver board lists the day's offerings, diners push buttons to make their selections, and the food begins to flow. A red button controls the flow-rate, and other buttons and switches allow the diner to adjust the temperature, or add salt, pepper, and spices to the slurry. Between courses, the tube rinses itself out with hot water.

There's no need to labor over your meal with a knife and fork; no need to chew each bite until it can safely be swallowed. Ergo, the book's narrator concludes, "eating had become a pleasure."   

The only obstacle to the widespread acceptance of scientific food was getting people to overcome their repulsion at sucking their meals through tubes. "Masticating" is old-fashioned, and like all "inherited habits," difficult to shake. At first, Ralph explains, people rejected the new mode of eating, regarding it "with a suspicion similar to a twentieth century European observing a Chinaman using his chop-sticks." It seemed "unaesthetic," and "devoid of the pleasures of the old way of eating." But once people understood the physiological benefits -- how chew-less food "did away almost entirely with indigestion, dyspepsia, and other ills," how it made people "stronger and more vigorous" -- they abandoned their irrational, sentimental attachment to mastication.


For Ralph (and Gernsback), the chief virtue of "scientific foods" is not their refined flavor nor even their nutritional content, but their "digestibility." Many scientifically-minded Americans of the late nineteenth and early twentieth centuries considered dyspepsia (indigestion) to be a genuine health crisis -- "the great American plague," to quote Henry Finck, whose 1913 book, Food and Flavor: A Gastronomic Guide to Good Health and Good Living, makes the epicurean case for chew, chew, chewing food to a proper liquefaction. Chewing each mouthful - up to a hundred times - was seen as an essential component of physical and mental hygiene. In the words of health reformer Horace Fletcher, "nature will castigate those who don't masticate," a gospel that was promoted widely during this period, including at John Harvey Kellogg's famous sanitarium in Battle Creek, Michigan. 

Historian Christina Cogdell has chronicled the obsession with "smooth flow" in the Progressive era, showing how the Progressive virtue of frictionless efficiency manifested in different cultural realms: in concerns about the dangers of constipation, in the fad for streamlined design, and in eugenic policies and politics.

From Ladies Home Journal, 1934. Image courtesy Duke University Hartman Center for Sales, Advertising & Marketing History.

Constipation was understood to be "a disease of civilization," caused by excessive consumption of excessively rich or highly flavored foods, by impurities and contaminants, and by the habit of hastily "bolting down" food rather than civil, deliberate chewing.  But the consequences of constipation were more significant than any one individual's discomfort and bloating; they undermined the very health of the polity. To Progressive reformers, a stagnant colon was at the root of both moral and physical degeneracy, causing "autointoxication" that enfeebled, enervated, and exhausted the nation's citizens. Food should flow smoothly and at a consistent rate, as though down a factory assembly line, from mouth to anus. Dyspepsia, constipation, indigestion -- all of these things made us, as a society, less productive, less fit, less suited to meet the challenges and seize the opportunities of modernity.    

And though we've left Fletcherism and its gospel of mastication more or less in the past, functional foods like Soylent stage a sort of return to this dream of a food perfectly suited for frictionless productivity - a food designed for the steady satiation of needs without the distracting stimulation of appetites. By design, Soylent has no particular flavor - which Rhinehart sees as unnecessary ornament, a compromise of the compound's commitment to functionalism. (The New Yorker quotes him: “I think the best technology is the one that disappears.... Water doesn’t have a lot of taste or flavor, and it’s the world’s most popular beverage.”) On a steady diet of Soylent, Lizzie Widdicombe writes:

"As Rhinehart puts it, you 'cruise' through the day. If you’re in a groove at your computer, and feel a hunger pang, you don’t have to stop for lunch. Your energy levels stay consistent: 'There’s no afternoon crash, no post-burrito coma.' Afternoons can be just as productive as mornings." 

Who wouldn't want this? As a lady who sometimes (often) struggles to write, who owns not one but two copies of Getting Things Done (neither of which I've read beyond the first chapter, naturally), and who, on the regular, postpones lunch for as long as possible, because of the sluggish lull of afternoon lackadaisy that always succeeds eating - this sounds pretty excellent. Like putting on Gernsback's isolator helmet, and concentrating "with ease upon the subject at hand." And yet. And yet... Latent in this, I think -- and tracing back to at least some of those Progressive reformers, whose vigorous championship of rational design and smooth flow came from the most unimpeachable motives, produced monuments of exceeding beauty, but concealed some pretty ugly collateral -- is a suspicion of eating itself. A belief that food is somehow toxic, harmful, or impure -- and that our appetites and desires betray us rather than guide us toward well-being. That life's processes should be kept distinct from life's purposes, and to delight in one degrades the other. Who hasn't felt a pang of - something, maybe regret? - when encountering yet again the oft-cited fact that we spend a third of our lives in bed? Food is a pleasure, but only the most shameless gourmandiser might calculate the amount of time spent eating, thinking about eating, talking about eating, getting ready to eat, resisting and indulging, without somehow feeling at a loss. Well, "enjoy every sandwich."

Technology mediates all aspects of life in Ralph's world, from stimulating the desire to eat (that Appetizer room) to mechanizing the labor of chewing - once done by teeth, now done by liquefying machines. But Gernsback does not go so far as to imagine whether these new technological accommodations will result in bodily alterations, new human physiologies emerging adaptively in response to the technological reshaping of the edible world.

Other science fiction writers - HG Wells, JBS Haldane (in his exercise in speculative eschatology, "The Last Judgment," from 1927) - did take the opportunity to imagine future iterations of human beings as conspicuously toothless. In an 1893 article in The World, Wells argued that technoscience would make chewing obsolete, rendering teeth vestigial and maladaptive. He explained:

"Science gives [mankind] the knife and fork. There is no reason why it should not masticate and insalivate his food. Does it now digest it with all the pepsin compounds? Teeth will disappear....

In some of the most highly developed crustaceans, the whole alimentary canal has solidified into a useless cord, because the animal is nourished by the food in which it swims. The man of the year one million will not be bothered with servants handing him things on plates which he will chew, and swallow, and digest. He will bathe in amber liquid which will be pure food, no waste matter assimilated through the pores of the skin. The mouth will shrink to a rosebud thing; the teeth will disappear; the nose will disappear - it is not nearly as big now as it was in savage days - the ears will go away. They are already folded up from what they were, and only a little tip fast vanishing remains to show that ages ago they were long-pointed things which bent forward and backward to catch the sound of approaching enemies."

Wells imagined the man of the year one million as a toothless cranium, with huge saucer-eyes and teeny tiny limbs: 

HG Wells' own depiction of the man of the year one million.

According to Bee Wilson in her recent Consider the Fork, technologies have indeed changed our dentition, though not in the way that Wells presumed. The widespread adoption of the fork, she claims, made overbites endemic. What made teeth optional, she says, was not forks and knives but stew-pots. A stew, simmering for days, softened up all tough bits so that even the toothless could get their share of calories.

Will our species ever be able to leave this toothy period of our evolution behind? There's something tempting about imagining it. Teeth are expensive and uncomfortable to maintain, and thus a sterling status symbol: indicators not only of wealth, but of deserving wealth (because they display the fastidious rigor of our self-care, or our self-denying willingness to submit to pain and discomfort in service of straightness, conformity, regularity, and impeccable whiteness; compare with the derision reserved for grills and tooth-jewels, racialized bling that seems to signify money but not wealth). If the protestant ethic still holds (settle down, Max Weber), straight white teeth could be considered one of the hallmarks of the elect.

So keep smiling, dentists; you've got a million years or so before teeth go out of fashion.

  

 

Keep it Fresh, Keep it Real, Orange Juice

We don't tend to think of freshness as a flavor, at least not in the same way that we think of "orange" or "vanilla" as flavors. "Freshness" is supposed to indicate something about a thing's material condition, its temporality: its recentness to the world and to us. The life history of a fresh food is assumed to be reassuringly direct: there were few intermediaries, few machines intervening, as it made its way to us. Fresh foods are also by definition not stable -- nothing can be fresh forever -- and so always at risk of becoming not-fresh, stagnant, rotten, stale.

There's something uncanny about a fresh-seeming food that is really an old food -- like the changeless McDonald's hamburger in Super Size Me, or those legendary Twinkies from decades ago, still plump and gleaming in their wrappers -- something reflexively repulsive. It brings to mind succubus myths, old women who make themselves appear young and nubile to seduce enchanted knights. Those stories certainly deserve some full-strength feminist revisionizing, yet remain among the purest expressions of the grotesque in our culture.

At the turn of the 20th century, one of progressive reformers' most potent accusations against food manufacturers was that they hired chemists to rehabilitate and deodorize rotten meat and rancid butter, to restore them to the appearance of freshness. This is a deceptive practice -- akin to running back the odometer on a used car -- but pure food advocates also largely opposed chemical preservatives, which didn't run back the meter so much as slow its rate of progress. Part of their opposition came from the claim that these chemical additives were harmful, but I think some of the horror of it was that preservatives made the question of freshness beside the point. Some foods were fraudulent by passing themselves off as something they weren't: margarine for butter, glucose for maple syrup. What chemical preservatives were doing was faking freshness.  

The problem isn't so much that the food is rotten or dangerous, but that you can't tell the difference between fresh and not-fresh, and that difference matters to us. Time changes food; and food unchanged by time seems somehow removed from the natural world, indigestible.

Yet why does freshness matter so much? (We don't always favor new-to-the-world foods, of course. Sometimes time increases value: think of old wines, caves of teeming cheeses, dry-aged beef, century eggs).

What we call freshness is not an inherent condition of a food, but an interpretive effect. We read it from cues including color, taste, aroma, texture, as well as the contexts of consumption. This is what I'm arguing here: freshness is a cultural or social category, not a natural one.

As a case in point, consider the story of store-bought "fresh-squeezed Orange Juice," as described in the April 2014 Cook's Illustrated feature somewhat luridly titled:

The Truth About Orange Juice

Is the sunny image of our favorite breakfast juice actually just pulp fiction?

Cook's Illustrated -- one of my all-time favorite magazines, by the way -- assembled a panel of tasters to evaluate various brands of supermarket orange juice. With the exception of two low-cal samples, all the juices list only one ingredient on the label -- orange juice.

Nonetheless, as Hannah Crowley, the article's author, extensively illustrates, orange juice is a processed food: blended from different oranges, pasteurized, packaged, shipped across continents or over oceans, and required to remain shelf-stable and "fresh tasting," at least until its expiration date. Orange season in the US lasts three months. But we want orange juice all year long.

Part of the challenge of producing commercial name-brand OJ is consistency. How do you get each container of Minute Maid to taste the same as every other container, everywhere in the world, in May or in October? Coca-Cola, the corporate parent of Minute Maid and Simply Orange, uses a set of algorithms known as "Black Book" to monitor and manage production. As an article last year in Bloomberg Businessweek put it: "juice production is full of variables, from weather to regional consumer preference, and Coke is trying to manage each from grove to glass." In all, Black Book crunches more than "one quintillion" variables to "consistently deliver the optimal blend," the system's author told Bloomberg, "despite the whims of Mother Nature."
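
To make "grove to glass" a little more concrete, here is a toy sketch of blend optimization as a linear program: given a few juice lots, each with a cost and a measured flavor profile, find the cheapest blend that hits a target profile. Everything here -- the lot data, the target values, the choice of attributes -- is invented for illustration; Black Book itself is proprietary and certainly far more elaborate.

```python
# A toy blending problem, loosely in the spirit of "Black Book" --
# all numbers are invented placeholders, not real juice data.
from scipy.optimize import linprog

# Hypothetical lots: cost ($/L), brix (sweetness), acidity, ethyl butyrate (mg/L)
lots = [
    (0.40, 11.8, 0.75, 1.1),  # early-season Florida
    (0.35, 12.6, 0.68, 0.6),  # late-season Brazil
    (0.55, 11.2, 0.80, 2.4),  # fresh, high-volatile lot
]
costs = [c for c, *_ in lots]

# Equalities: blend fractions sum to 1; blend brix hits the 12.0 target.
A_eq = [[1, 1, 1], [brix for _, brix, *_ in lots]]
b_eq = [1.0, 12.0]

# Inequalities (A_ub @ x <= b_ub): acidity at most 0.78, and ethyl
# butyrate at least 1.0 (negated to fit the <= form linprog expects).
A_ub = [[acid for *_, acid, _ in lots],
        [-eb for *_, eb in lots]]
b_ub = [0.78, -1.0]

res = linprog(costs, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, 1)] * 3)
print(res.x, res.fun)  # cheapest feasible blend fractions, cost per liter
```

The real system, by Bloomberg's account, also folds in weather, crop yields, and regional preferences; the point of the sketch is only that "the optimal blend" is a constrained-optimization problem, solved and re-solved as the inputs shift.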

Sure, but how do you reproduce the experience of freshness? Preservation is not enough. In fact, the means used by OJ producers to arrest decay and rancidity in order to allow them to "consistently deliver" that optimal blend -- pasteurization and deaeration -- actually alter the chemical profile of the juice, in ways that make it taste less fresh. Pasteurization can produce a kind of "cooked" flavor; deaeration (which removes oxygen) also removes flavor compounds.

Freshness is an effect that is deliberately produced by professional "blend technicians," who monitor each batch, balance sweetness and acidity, and add "flavor packs" to create the desired flavor profile in the finished juice.  Flavor packs are described by Cook's Illustrated as "highly engineered additives... made from essential orange flavor volatiles that have been harvested from the fruit and its skin and then chemically reassembled by scientists at leading fragrance companies: Givaudan, Firmenich, and International Flavors and Fragrances, which make perfume for the likes of Dior, Justin Bieber, and Taylor Swift." The only ingredient on the label of orange juice is orange juice, because the chemicals in flavor packs are derived from oranges and nothing but oranges. Yet orange juice production also has something to do with the same bodies of knowledge and labor that made "Wonderstruck" by Taylor Swift possible. (There are in fact multiple class-action suits alleging that the all-natural claim on orange juice labels is inaccurate and misleading.)

In other words, this isn't just about "adding back" what has been unfortunately but inevitably lost in processing, restoring the missing parts to once again make the whole. The vats of OJ, in a sense, become the occasion for the orchestration of new kinds of orange juice flavors that conform not to what is common or typical in "natural fresh-squeezed orange juice" (whatever that may be), but to what we imagine or desire when we think about freshness and orange juice. As Cook's Illustrated puts it: "what we learned is that the makers of our top-ranking juices did a better job of figuring out and executing the exact flavor profile that consumers wanted." These flavors don't reproduce nature; they reproduce our desires. But how do consumers know what they want, exactly, and how do manufacturers figure out what this is?

I can't really answer either of those questions now, but I think one of the consequences is a kind of intensification of the flavor dimension of things. Consider: consumers in different places want different things when it comes to OJ. Consumers in the US, according to Cook's Illustrated, especially value the flavor of freshness. One of the volatile compounds present in fresh orange juice is ethyl butyrate, which evaporates rapidly and thus is correlated with the newness of the OJ to the world, so to speak. Simply Orange, Minute Maid, and Florida's Natural juices -- all juices "recommended" by the Cook's Illustrated tasting panel -- contained between 3.22 and 4.92 mg/liter of ethyl butyrate. But juice that's actually been squeezed at this moment from a heap of oranges contains about 1.19 mg/liter of ethyl butyrate. The equation here is not as simple as ethyl butyrate = fresh flavor, so more ethyl butyrate = megafresh flavor. (One of the exceptions among the panel's recommendations -- an OJ with an ethyl butyrate content more in line with that of fresh-squeezed juice -- was actually produced in a way that permitted seasonal variations, was not deaerated, had a much shorter shelf life, and depended on overnight shipping to make its way to stores.) But there is a kind of ramping up, somehow, that seems to both correlate with our desires and recalibrate them.


Dying at the Bench: The Hazards of a Chemical Career

Some days I seem to come across very few actual people in the parched wilderness of trade journals, biennial census reports of manufacturers, and bulletins of chemical societies  -- the archival terrain where I'm currently wandering. But of course, all of these things are full of people, even if they're very deliberately not raising their voices. It's a hazard to mistake all the statistical tables and formulas and price lists as things that have somehow shaken themselves free of human beings, that represent the effortless interactions of chemicals, the frictionless relations of markets.

But then, sometimes, I'll be on the trail of a name -- some minor analytical chemist, or some voluble manufacturer, who seems to hold a key or serve as a connection between things or ideas -- when, unexpectedly, I trip across the obituary and realize that I've been compiling a dossier on an actual person. Shaken, I realize that the person has taken on the same tone as the tables and graphs, has become one of my "historical actors," etiolated, unresistant, a pawn that I move around my paragraphs in service of my arguments. 

For the past couple of weeks, I've been researching methods for manufacturing synthetic vanillin around the turn of the twentieth century, especially processes that rivaled the patented techniques of the leading French and German manufacturers. And that's how I came across a small notice about Edward C. Spurge's premature death -- in the laboratory -- overcome by toxic fumes from his own chemical experiments. A reminder that, as with the fatal "dissection wounds" of nineteenth-century medical students, or Mme. Curie's radium-martyrdom, the pursuit of scientific knowledge can take its toll.

From The Niagara Falls Electrical Handbook, Being a Guide for Visitors from Abroad Attending the International Electrical Congress, St. Louis, MO, 1904. Published by the American Institute of Electrical Engineers.

E.C. Spurge was one of the first vanillin manufacturers in the US. Born in Essex in 1875 (or possibly 1874), a graduate of the Bloomsbury College of Pharmacy with a B.S. from London University, Spurge was a working chemist who specialized in what were sometimes called "fine chemicals." After putting in time with pharmaceutical and perfumery companies in England and Paris, he emigrated to the US in 1904. Two years later, he patented a method for synthesizing vanillin from isoeugenol (derived from clove oil), and founded the Ozone-Vanillin company in Niagara Falls around the same time to put his ideas into action.

Why Niagara Falls? The ozone-generating machines necessary for the process to work needed a reliable electric current, and Niagara Falls, the center of the electrical and electrochemical industries in the U.S., was just the place.

In the wake of the 1906 Pure Food & Drugs Act, the ambiguous status of synthetic vanillin -- chemically identical to the compound that gave "real" vanilla its prized odor and flavor, yet legally declared an adulterant of "vanilla extract," an "unlike substance" -- meant that, even while demand increased, the prestige of the chemical was questionable. A triumph of synthetic chemistry, but disparaged as a "coal-tar" flavor by many pure food advocates. The Ozone-Vanillin Company tried to distinguish itself from its competitors -- and define its position relative to genuine vanilla extract -- by emphasizing the immaculateness of its product.

Take this advertisement from a 1914 issue of Simmons' Spice Mill:


"Ozone-Vanillin is not an imitation of nature, but an absolute reproduction of the natural aromatic principles of the vanilla bean by the combination of the very same elements which have hitherto been found only as blended in Nature's own laboratory.

Our method of manufacture is an improvement upon approved methods, so that we obtain a snow-white and absolutely pure vanillin by a harmless electro-chemical process."

Snow-white and absolutely pure! 

But Spurge was not alive to see this advertisement run. He had died two years earlier, on November 6, 1912, "at the bench" -- in the company laboratory, felled by fumes of hydrocyanic acid while working on a series of experiments to present at an upcoming meeting of chemists. Hydrocyanic acid is a solution of hydrogen cyanide and water; hydrogen cyanide was the chemical that would later be used in Zyklon B. At the time of his death, Spurge was 37 years old. Several obituaries noted that he was survived by his wife, whom he had married earlier that year.

Spurge was a practical chemist, a manufacturing chemist -- not an academic chemist. The honorific that he appended to his byline, F.I.C. -- Fellow of the Institute of Chemistry -- indicated "professional competence," not "full training." Nonetheless, his professional identity and the success of his synthetic chemical business were tied up with research, with continued experimentation, as was his collegiality with fellow chemists.

Who found his body? In 1908 testimony to the House Ways and Means Committee on vanillin imports, Spurge argued that American manufacturers needed tariff protection because of the scarcity of professional chemists in the US; instead, there were intelligent but unskilled workmen, who needed to be trained. Did one of these "intelligent but unskilled men" find his boss's body, in a small room full of precise glassware and toxic fumes? What exactly was Spurge working on?  What did he hope to prove? And what about the fate of his vanillin factory, on the American side of the falls, catalyzed by ozone, "the cleanest and most agreeable oxidizing agent known"?

The first mention of using ozone to synthesize vanillin from isoeugenol that I've found dates back to 1895, when two French chemists, Marius Otto and Albert Verley, received a patent to cover this method of production. I also found a remark about a Parisian factory -- I assume Verley's -- producing several kilos of vanillin a day this way. But the ozone-generating machine did not work properly, the yield was inconsistent, profits drooped, and they soon were forced to cease production. Spurge's method was intended as an improvement upon this original electrochemical method, but although his company survived him, it did not outlast him for long. A 1923 article in the journal Chemical and Metallurgical Engineering reconsidered the processes used by Ozone-Vanillin, lamenting that "after expensive experiments, the method was abandoned, even as it seemed on the verge of success."

(Probably) Albert Verley, synthetic perfumer, student of Satie

Albert Verley, one of the men who held the original patent, has another claim to distinction: he was Satie's only composition student. According to one account, as a young man, Verley had dreamed of a career in music, but trained as a chemist; then a serious accident in the lab gravely damaged his right hand. (The hazards of a chemical career!) And so he parted from his piano, and instead devoted himself fully to chemistry.

He did well for himself as a manufacturer: he owned a factory outside of Paris that made synthetic perfume materials, including a renowned version of jasmine that he had developed. Satie's brother Conrad was a chemical engineer, and he may have been the one to make the introduction to the composer. Satie appears to have taken on this pupil mainly for money, not love, but Verley was not, apparently, without talent. Satie strongly recommended Verley's "strange piece" -- L'Aurore, which Satie had orchestrated -- in a 1916 letter to Varèse. Verley also composed a ballet inspired by Edgar Allan Poe, Le Masque de la Mort Rouge (The Masque of the Red Death), and launched the career of the young conductor Vladimir Golschmann by bankrolling a series of concerts of new music.

Spurge certainly knew of Otto and Verley's method for turning clove oil into vanillin with ozone. He probably first learned of it while working as a chemist at the Societe Anglais-Francais des Parfums Perfecciones, in Courbevoie, outside of Paris, the same town where Verley's operation was based. This must have been around the time, 1899, when Verley perfected his synthetic jasmine. What must it have been like, for Spurge, as a young man and a young scientist, strolling in the evening, outside of Paris, at the very coda of the nineteenth century, the suburban landscape faintly scented by the now-deathless odor of chemical jasmine?

 

IBM's "Cognitive Cooking" Food Truck

I'm not ashamed to admit that "Wait, Wait... Don't Tell Me!" is one of my main sources of breaking news, and that's where I first heard that Watson, IBM's own Jeopardy champ, is running a food truck at South by Southwest. Of course, I had to look into it...

A joint venture between IBM and the Institute of Culinary Education, the food truck is an exercise in what IBM (rather bloodlessly) calls "cognitive cooking" -- a street-food demonstration of the practical applications of their "cognitive computing" system, aka Watson. Would you like to read an advertorial about it in Slate? Here you go. And here's IBM's promotional website about the cognitive cooking project. 

This is how you use it. You have to input three things: the main ingredient, the cuisine (eg, Indian, Azerbaijani, Canary Islander...), and the type of dish (eg, burrito, bisque, sandwich). (At SXSW, the type of dish was left up to a Twitter vote, and I suppose the other variables were supplied by IBM.)  Watson then reviews the vast universe of possible combinations, modeling the flavor chemistry of each component and its interaction with other flavor compounds, as well as the potential taste appeal of the final dish and how novel the combination is. It outputs a set of recipes comprising 12 to 14 ingredients, each with a rating based on its assessment of flavor interactions, likeability, and surprise. Just like on "Chopped," you're judged not only on taste but also on "creativity." The goal is to come up with something that's both "weird" and "good."     
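
To make those two judgments -- flavor pairing ("will this taste good?") and surprise ("how novel is it?") -- a little more tangible, here's a minimal sketch of how a combination scorer might work. The compound sets, co-occurrence counts, and weighting are invented placeholders; this is emphatically not IBM's model, just the general shape of a food-pairing-plus-novelty scoring idea.

```python
# A toy scorer for ingredient combinations: pairing rewards shared
# flavor compounds, surprise rewards rarely co-occurring pairs.
# All data below are invented placeholders for illustration.
from itertools import combinations

# Hypothetical map: ingredient -> flavor compounds it contains.
COMPOUNDS = {
    "orange":   {"limonene", "ethyl butyrate"},
    "saffron":  {"safranal", "limonene"},
    "rice":     {"hexanal"},
    "turmeric": {"turmerone", "limonene"},
}

# Hypothetical corpus counts: how often pairs co-occur in known recipes.
CO_OCCURRENCE = {frozenset(p): n for p, n in [
    (("orange", "rice"), 120),
    (("saffron", "rice"), 300),
    (("turmeric", "rice"), 450),
    (("orange", "saffron"), 8),
    (("orange", "turmeric"), 2),
    (("saffron", "turmeric"), 60),
]}

def pairing_score(ingredients):
    """Shared-compound count over all pairs (the food-pairing hypothesis)."""
    return sum(len(COMPOUNDS[a] & COMPOUNDS[b])
               for a, b in combinations(ingredients, 2))

def surprise_score(ingredients):
    """Higher when the pairs rarely co-occur in the recipe corpus."""
    return sum(1.0 / (1 + CO_OCCURRENCE.get(frozenset((a, b)), 0))
               for a, b in combinations(ingredients, 2))

def rank(candidates):
    # "Weird" and "good": weight pairing and surprise against each other.
    return sorted(candidates,
                  key=lambda c: pairing_score(c) + 5 * surprise_score(c),
                  reverse=True)

print(rank([("orange", "saffron", "rice"), ("turmeric", "rice", "orange")]))
```

Weighting the two scores against each other is exactly the "weird" versus "good" trade-off: crank up the surprise weight and you drift toward turmeric paella territory.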

[An aside: What is it about the times we live in that makes cross-cultural comminglings the apogee of "weird" cooking? "Indian turmeric paella," are the first words out of the advertorial's mouth. "Peruvian poutine," "Swiss-Thai asparagus quiche," "Austrian chocolate burrito" are all dishes featured in the cognitive cooking recipe archive. Are these combinations really so strange, or unimaginable without cosmopolitan Watson to liberate us from our parochial attachment to thoroughbred cuisines? This is not, I think, simply a retread of the 90s vogue for "fusion," which sought a diplomatic accommodation between US appetites and "exotic" (usually Asian) ingredients and techniques. All the borders have come down; materials and methods can be freely recombined without tariffs or translations; culture is just another seasoning. Should we call this "world markets cuisine," globalism's dinner plate, neoliberal gourmandise?]     

IBM's challenge is to prove to all of us that Watson isn't just some better sort of Google, a more refined filter for sorting relevant from irrelevant, signal from noise. What IBM wants to demonstrate is that Watson can provide creative or unprecedented solutions, things that don't just work right but also "feel right." As the Slate advertorial puts it, "A system that can generate new things the world has never seen before is a significant step in cognitive computing."

This is actually a rather tall order, especially as IBM is always careful to insist that "cognitive computing" is not a replacement for human creativity (the brain is "the most creative computer of all," in their words) but a tool to enhance it. The decision to use food -- and, specifically, the creation of unusual flavor combinations -- as a debut showcase for this technology is thus very deliberate, and taps into a longer history. Sure, the marketing team has festooned this with all the right merit-badges -- hipster foodies and their food trucks, Twitter crowdsourcing, SXSW, "the cloud" -- to gain likes and influence retweets in those zones of social media where knowing what's "trending" counts as connoisseurship. But the problem of meshing these two kinds of information about flavor -- what IBM refers to as "chemoinformatics" (ie, its chemical behavior) and "hedonic psychophysics" (ie, our sensory experience of it) -- is something that has daunted the flavor industry since at least the mid-twentieth century.

I've just been reading the proceedings of the 1961 Flavor Chemistry Symposium, hosted by Campbell's Soup at their old HQ in Camden, New Jersey. This was one of the very first scientific conferences devoted to this chemical subfield. (The Society of Flavor Chemists, the first professional organization, had been inaugurated less than a decade earlier; the American Chemical Society wouldn't create a flavor chemistry division until six years later.) The papers from this conference make it clear how rapid progress has been in the field: more and more, the molecular structure of flavor compounds, their chemical precursors and interactions with other molecules during cooking and preparation, how they degrade, what influences them, and so on, are being quantified, verified, understood. As Carl Krieger, the director of Basic Research & Product Development at Campbell's, remarks at the kick-off of the conference, there was a new "realization that the mysteries of flavor can be solved."

Except. Except that "the physiology and psychology of taste, odor, and flavor" are still vast unknowns. Krieger ventures that only by making positive identifications of flavor chemicals "will it be possible to describe flavors in universally meaningful terms" (ie, by their chemical names) rather than the subjective terms of experience -- "metallic," "stale," "rancid" -- "which, I must confess, seem to me to be pure gibberish." Thankfully, Krieger concludes, their conference will not focus on the perception of flavors, but on their chemistry -- "something that I believe all of us feel is more amenable to direct experimental study."

Okay, that's all well and good for Krieger to say, but knowing what the flavor compounds are doesn't answer the million-dollar question: "Will people like it?" That's a big missing piece of the puzzle -- the gap between the chemoinformatics, so to speak, and the hedonic psychophysics. Flavor companies -- and the US government, especially the army -- labored to make flavor evaluation "objective," to standardize descriptive vocabularies, to train tasters and impanel consumers to supply their opinions before a product hits the market. But these studies always involved human beings, unruly instruments on their best days, and their subjective responses are, by definition, not generalizable -- do not produce the "universally meaningful terms" that Krieger claimed chemistry did.

And this, fundamentally, is what IBM claims is different about its "cognitive computing" model, and what it's trying to show with this food truck project. We're quite used to claims like "chefs can only consider combinations of two or three ingredients at a time; computers can contemplate quintillions" -- yes, computers can outfox even the foxiest human thinkers. This system doesn't just crunch numbers, it makes judgments about subjective sensations. As the IBM advertorial tells us, it "understands why thousands of different recipes are appealing, what people prefer." Here's the crux of the claim: "It understands, learns, and considers not just big data but also human perception."

These two things -- big data, human perception -- continue to be held at arm's length from each other. But isn't the promise of this technology, in fact, that it successfully converts human perceptions into data, data that the machine-system can "consider" and that are susceptible to the same tools and techniques that guide the collation and analysis of other forms of 'big data'? The dream realized here is that we will finally be able to bring subjective experience into the same tables that we use to calculate agricultural yields or profit margins.

What is supposed to make Watson different, I think, is that it claims to formalize the bodies of knowledge that have so far resisted formalization. Things like intuition. Experience. What we in the STS biz call "tacit knowledge" -- the kinds of things you learn by practice, by doing -- like how to make fine adjustments to instruments, or to hone a curve on the form of a chaise longue, or to add a new ingredient to a recipe. Not just the look of things, but what we felt at what we saw. But Watson enters a crowded field, because our "personal technologies" increasingly aspire to recognize and cater to our subjective preferences. Like when Netflix deduces your taste in movies, not merely spitting out a list of other black comedies, but synthetically tailoring for you an array of "Dark GLBT Comedies with a Strong Female Lead." Or the new music data venture that scans Twitter for early "flickers of excitement about a fledgling band," "the kinds of signs music scouts have always sought." The Watson system isn't just about helping General Foods design new crazy flavors of potato chips; IBM promises that the applications for cognitive computing are in all fields that rely on "design and discovery." This isn't a technology that competes with Google; it's technology that competes with technicians and so-called knowledge-workers -- designers, flavorists, A&R divisions, R&D folks -- highly skilled workers whose refined, intuitive knowledge of their fields is supplemented (or supplanted) by "cognitive computing."

But fear not! Our cherished celebrity chefs won't be driven to extinction by our new networked overlords. "Cognitive computing is a sous-chef working alongside seasoned professional chefs." Right, it's not Emeril's job that's at stake, but those of his unnamed assistants, who will surely still be required to slice and dice -- Watson, after all, doesn't have hands to get dirty -- but perhaps less entrusted with the fine adjustments and refinements, with the knowledge side of technical work. (Similar, for instance, to what Deborah Fitzgerald calls the "deskilling" of farmers after the introduction of hybrid corn.) Or maybe not. Maybe systems like this really do foster innovation, break down the barriers that have hitherto prevented us from dreaming up a Swiss-Thai quiche, an Indian paella.

I should wrap this up on a less lugubrious note. So I'll add that the consensus on the internet seems to be that Watson's food was pretty good and somewhat novel, though some were disappointed that it was prepared by humans and not robots. Brillat-Savarin said it, and I believe it: "The discovery of a new dish, which excites our appetite and prolongs our pleasure, does more for human happiness than the discovery of a star." The question, I suppose, is how you define "new," and what you mean by "discovery."

Green Appetites

I'm re-reading Regina Lee Blaszczyk's excellent The Color Revolution, a gorgeously illustrated history of how twentieth-century commodities got their colors, and how those colors were managed -- foretold, masterminded, coordinated -- by a new set of experts: men and women working for chemical companies like DuPont, across the fashion industries, or for manufacturers of products ranging from sedans to dinnerware.  Building on the work of World War I camouflage experts and early-twentieth-century color systems, expert color managers drew together scientific theories of color, consumer statistics, psychology, French couture, modern art (see, for instance, Georgia O'Keeffe's ads for Cheney Brothers' textiles), and considerable savvy about design -- to produce color palettes that enhanced the contentment of workers and stimulated the appetites of consumers.

No account of the backstage rigging and scrims of mass consumption would be complete without an appearance by Edward Bernays, Freud's nephew, founding genius of PR, and subject of this earlier post. Blaszczyk offers this incredible anecdote:

Invitation to the 1934 Green Ball, from the Edward L. Bernays papers, Library of Congress. From Blaszczyk, The Color Revolution, 161.

"Women wouldn't buy Lucky Strike cigarettes because they thought the dark green package clashed with their wardrobes. The chief executive refused to redesign the package, having spent millions of dollars advertising it. Enlisting the support of New York high society ladies, Bernays launched the Green Ball, a spectacular charity event at the Waldorf-Astoria, which made dark green the fashion sensation of 1934. His staff worked behind the scenes getting stores to promote green, mills to make green, and prominent women to wear green. The Green Ball evoked color as a status symbol, a fashion trend, and a money generator." (p. 160)

And all of this for Lucky Strike, which now, of course, has Op-Art red and white packages!

Here's an old ad for Luckies, pre-redesign, as reference:

A Lucky Strike advertisement featuring a doctor.



Meat Juice and Perfect Food

This alluring advertisement in the back pages of 1895 issues of The Manufacturer (a Philadelphia-area weekly industry newspaper from back in the day) caught my eye.

Meat juice extractor?! What is happening here! Luckily, I found an explanation in an earlier issue:

All yours for the low, low price of $2.50

"The use of meat juice for medicinal purposes is a growing one, and is recommended for the aged, also delicate invalids, and for invalids, in all cases where complete nourishment is required in a concentrated form. The meat to be operated upon should merely be thoroughly warmed by being quickly broiled over a hot fire, but not more than to simply scorch the outside, and then cut in strips. The yield should be about six (6) ounces from one (1) pound of round steak. Only tepid water may be added, as hot water will coagulate the meat juice. Season to taste. The machine being tinned, no metallic or inky flavor will be imparted to the material used. The dryness of the pulp or refuse can be regulated by the thumb-screw at the outlet." (The Manufacturer 7, no. 26 (1894), 10)

 

Nourishment in concentrated form for the aged, delicate invalids, and (unqualified, presumably indelicate?) invalids! This reminded me of something that my mother once told me about one of her own childhood spells as a delicate invalid; she grew up in a little town on the Argentine pampas during the 1940s and 50s. I called her up and asked:

Me: Mom, what was that thing you once told me about how you had to drink meat juice...?

Mom: Oh, yes, when I was very sick with hepatitis. Nona would make this. She put a piece of filet mignon in the machine, and it would squeeze it, squeeze it, and the juice would fill a bowl. And the filet mignon afterwards was like a cardboard.

Me: And you would drink this??

Mom: No, you did not drink it raw! You warmed it in a bain marie, with some salt and pepper. Swirl it, swirl it until it is hot - and then you drink it.

Me: What was the machine?

Mom: It was like a press - it had two flat plates, metal.

Me: Where was this meat press machine? In the kitchen? Did Nona buy it specifically to make this?

Mom: Yes, she bought it specifically. It was very common. At this point, meat in Argentina was very cheap. It took two filets to make five ounces of liquid. You know how expensive that would be here!?

The machine my mother describes doesn't seem exactly equivalent to the Enterprise Manufacturing Company's model - which appears to be more like a masticating juicer than a "press." But the two seem similar enough, and they share a common purpose: the domestic production of a special restorative diet for the enfeebled.  

But why meat juice? How did this become a therapeutic food?

There's a long tradition of prescribing aliment as a treatment for particular ailments. Galenic medicine used food to recalibrate the body's four humors, whose imbalances were thought to cause disease. There's also a long tradition in the West of associating meat-eating with masculine vim and vigor. Some of this back-story certainly shapes the widely held belief that meat is "strengthening" and "restorative." But a steak is materially different from its liquid runoff. How did people come to believe that the liquid squeezed out of meat contains the vital essence of the food, and not the substantial stuff that's left behind? 

Part of the answer to this question can be found in the South American Pampas of 1865 -- specifically, Fray Bentos, Uruguay, home of Liebig's Extract of Meat Company. (You can find another version of this story at the Chemical Heritage Foundation magazine.)

The company bears the name and the imprimatur of Baron Justus von Liebig (1803-1873), a Hessian, one of the pioneers of organic chemistry and of the modern chemical laboratory. Beginning in the Enlightenment, life processes (circulation, respiration, digestion) were investigated as physical and chemical processes, and one of the central questions for chemists was this: how does food become flesh? The answer to this was to be found not by alluding to some invisible vital force, but by careful analysis and quantification: calculating measurable changes in mass and energy, using tools like balances and calorimeters and conducting experiments with dogs and prisoners on treadmills. Chemists like Liebig engaged in a kind of nutritional accounting, identifying and quantifying the components of food that make life, growth, and movement possible.

This new way of thinking about food and bodies had consequences. It became possible to imagine a "minimal cuisine" - food that's got everything you want, nothing you don't. This was important and desirable for various reasons. The Enlightenment marked the emergence of the modern nation-state, which was responsible for the well-being of its population in new ways.  Industrialization displaced rural populations, creating desperate masses of urban poor who were not only pitiable, but were also potential insurgents. Modern wars and colonial ventures meant provisioning armies and navies. There was an urgent and visible need for food that was cheap, portable, durable, its nutritional and energetic content efficiently absorbed to fuel the calculable energetic needs of soldiers and workers.

I won't go into too much detail about the chemistry (you can find a substantial account of the history of nutritional chemistry here), but Liebig, in the 1840s, believed that (nitrogen-containing) protein was the key to growth; fats and carbohydrates did nothing but produce heat. In his monumental 1842 tome, Animal Chemistry, or Organic Chemistry in its Application to Physiology and Pathology, he analyzed muscle, reasoning that protein is not only the substance of strength but also its fuel. An extract that concentrated the nutritional virtues of beef muscle fibers, then, could be the perfect restorative food.

This led him to develop a formula for his meat extract -- a concentrated essence of beef that promised to solve the growing nutritional crises of modernity. Imagine how much simpler it would be to provision an army when 34 pounds of meat could be concentrated into one pound of virtuous extract, which could feed 138 soldiers! No more bulky chuckwagons or questionable rations of salt pork and hardtack! Plenty of concentrated food for the poorhouse! Moreover, Liebig certainly believed in the healing power of meat extract. When Emma Muspratt, the daughter of his close friend James, a British chemical manufacturer, fell ill with scarlet fever while visiting the Liebigs in Giessen in 1852, Liebig, desperate to restore the failing girl to health, spoon-fed her the liquid squeezed out of chicken. She survived.

However virtuous, Liebig's meat extract was too expensive to produce in Germany. In a public gesture that was only partly an act of self-promotion, Liebig offered his idea to the world, vowing to go into business with anyone who could make it happen. It would be nearly twenty years before someone took him up on it.

This brings us back to the South American pampas, where the missing ingredients in Liebig's formula could be found: cheap land, cheap cows, and ready access to Atlantic trade routes. A fellow German (or possibly Belgian), Georg Giebert, wandering the plains of Uruguay, noticed that the herds of grazing cattle rarely became anyone's dinner. Their valuable hides were tanned and turned into leather, but the carcasses were left to rot. Wouldn't it be great, Giebert wondered, if there were a way of using that meat, salvaging it by concentrating its nutritional value into an easy-to-export extract?

Entering into partnership with Liebig, Giebert established a vast factory at Fray Bentos, where the meat was crushed between rollers, producing a pulpy liquid that was steam-heated, strained of its fat content, and then reduced until it became a thick, mahogany goo that was filtered and then sealed in sterile tins. Extractum Carnis Liebig - Liebig's Extract of Meat - first hit Europe in 1865 and was initially promoted as a cure for all-that-ails-you. Typhus? Tuberculosis? Heavy legs? Liver complaint? Nervous excitement? Liebig's Extract of Meat is the medicine for you!

Then came the skeptics. Chemists and physicians could find very little measurable nutritional content in Liebig's Extract of Meat. Dogs fed exclusively on Liebig's extract swiftly dropped dead. As British medical doctor J. Milner Fothergill thundered in his 1870 Manual of Dietetics: "All the bloodshed caused by the warlike ambition of Napoleon is as nothing compared to the myriad of persons who have sunk into their graves from a misplaced confidence in beef tea."

But this did not sink Liebig's extract of beef or the factory in Fray Bentos. (It would take a salmonella outbreak in the 1960s to do that.)

As Walter Gratzer notes in his book, Terrors of the Table: The Curious History of Nutrition, Liebig changed his tactics in the face of his critics, downplaying the medical benefits of beef extract and instead arguing that its use was "to provide flavor and thus stimulate a failing appetite." "Providing flavor," then, was an essential functional component of the food. But this applied to more than just those with "failing appetites." Liebig's Extract of Meat was a success for decades not because it was consumed by "delicate invalids" and the enfeebled poor who needed cheap nutrition, but because ruddy Englishmen and other gourmands used it as an additive to increase the "savour" of their cuisine.

Beef extract provided what the 19th-century French gourmandizing scientist Brillat-Savarin dubbed "osmazome," and what we would call "umami": the glutamate richness that connoisseurs relished before science gave it a name. As Brillat-Savarin writes, "The greatest service chemistry has rendered to alimentary science, is the discovery of osmazome, or rather the determination of what it was."

And the Chemical Heritage Foundation reprints an ad for Liebig's from their collection which emphasizes the appeal of beef extract to the gourmet, rather than to the invalid:

"NOTICE: a first class French Chef de cuisine lately accepted an appointment only on condition of Liebig Company’s Extract being liberally supplied to him.”

Instead of becoming a "minimal food," fulfilling the nutritional needs of humans in the simplest and most efficient way, beef extract became a flavor enhancer - without, however, completely losing its hold on the health-giving and restorative benefits that it initially claimed. This is why the meat juice extractor was manufactured, and why my mother drank warm meat juice to recover from a bout of hepatitis. 

The question that haunts all of these investigations into minimal foods is the following: Is flavor a luxury, or is it a necessary component of foods? Some later nutritionists believed that the beneficial effect of meat extract was due in part to its flavor - or, more precisely, the effect the flavor had in "stimulating the appetite." In his 1922 nutritional textbook Dietotherapy (available free on Google), William Edward Fitch cites Pavlov's experiments as evidence that no substance is a greater "exciter of gastric secretions" than "beef tea."

As the blog for the (totally real, possibly not dystopian) "food" product Soylent puts it, "there is more to food than nutrition.... Even a product as minimal as Soylent must concern itself with the “hedonic” aspects of eating. These include, but are not limited to: appearance, taste, texture, and flavor / odor." (I'm definitely writing more about Soylent and flavor in a future post...)

Regardless of whether a food is nutritionally adequate, lack of flavor or poor flavor can be a problem. The argument that prison loaf is torture rests in part on its total absence of "hedonic" qualities. However, not only can flavor preferences be debated, but the importance of flavor itself can be called into question. Many nutritional experts at the turn of the twentieth century prescribed mild, bland diets as the best for health and well-being; "highly flavored" foods, they cautioned, were hazardous, a cause of both obesity and its attendant diseases as well as emotional instability. And in our own cleanse-obsessed era, an appreciation of the bracing flavor of green juice or the intense bitterness of turmeric is a sign of moral and physical enlightenment. Indeed, on the Soylent blog, the product's creators assure concerned readers that the inclusion of vanillin in the ingredient list is not to make "vanilla" Soylent, but rather to offset the "bitter and fishy" flavors of other ingredients. The stated goal is to make the flavor of Soylent "pleasant without being overly specific." 

And on that note... enjoy this gorgeous collection of Liebig extract of beef chromolithographed trade cards.

I Want I Need

I watched Part I of Adam Curtis' fascinating and prickly documentary series, The Century of the Self, last night -- a sort of sociopolitical whodunit, where the crime is neoliberal consumer capitalism, and the culprit is the government-industrial-psychoanalytic complex. Go watch it! Even if you don't agree with all its arguments (I certainly didn't), it has the real satisfaction of a good conspiracy yarn -- unmasking the secret coherence behind the structures of social life.

Also, it added another knot to my knotty pile of modern entanglements (e.g. Samuel Beckett chauffeured Andre the Giant to grade school). Did you know Freud's nephew was the Great Caruso's press agent! (And also, apparently, the agent for the Ballets Russes on their North American tour -- can you imagine seeing Nijinsky in Wichita in 1915?). 

A young Edward Bernays with an admirably dapper mustache.

So, Part I of the documentary is about this nephew of Freud, Edward Bernays, a U.S. citizen who coined the term "public relations" and who, through his consulting work, revolutionized the tactics and techniques of public persuasion. Before Bernays, the documentary claims, products were promoted based on their functional virtues -- buy these durable pants! Buy this suitable cutlery! It's made to last!

After Bernays, advertisers (and politicians, and anyone who wants to sell a bill of goods to the mass public) made a play for the emotions -- and especially the unconscious libidinal drives that were presumed to motivate our actions. This car will make you feel like a real man. Smoking these cigarettes will make you a liberated woman (literally, because you now have your own torch-like phallus). (Or perhaps: This car will make others see you as a real man. Smoking will tell the world that you're liberated, lady!)

In other words, where marketers previously appealed to people's "reason," after Bernays, they tried to tap into their unconscious, and fundamentally "irrational," minds. In part aided by Bernays' flacking for his uncle "Siggy's" books, these ideas about the irrational unconscious permeated culture far beyond the world of advertising. This theory seemed to be less about individuals than about the mentality of crowds, and, to its adherents, it pointed to a fundamental flaw in democracy itself. If the mass public is basically irrational, how can a democratic form of government persist without collapsing and cancelling civilization? 

For business, however, it represented an opportunity. The documentary quotes the recommendations of an analyst (from Lehman Brothers!) in the 1920s: "We must move from a need-based culture to a desire-based culture."

The implication is that needs can be met, but desires are never satisfied -- and only desire can drive the constant consumption necessary to avoid crises of overproduction and keep a mass-market economy ceaselessly humming along.

So. Here's where I come in. A central part of my dissertation project is about desire -- how flavor chemists and others in the flavor industry create chemical compounds that tempt our appetites and gratify our palates. Flavor chemists and food technologists are manipulating molecules, not deploying psychoanalytic tropes. But, explicitly or not, just like marketers of cars and clothes and cigarettes, they are charged with making their products -- irresistible. In other words, my story is about how food fully becomes a part of consumer culture by becoming delicious.

But the statement about transforming a need-based culture to one distracted by desire -- one of the primary indictments made by the documentary against Bernays and his fellow propagandists, a category in which Curtis pointedly includes Goebbels and the Nazi party -- presumes that there is a clear, bright line between desire and need. And that in manipulating people's desires -- stimulating insatiable appetites, arousing powerful emotions -- you also divert them from recognizing and acting upon their real interests.

This is, I think, the argument that Michael Moss makes in Salt, Sugar, Fat (I haven't read it yet) -- that food companies have gotten so skillful at servicing our desires (for salt, sugar, and fat) that they no longer create products that fill our (nutritional) needs.

But I believe that the line between desire and need isn't as simple as that, nor is the distinction between "authentic" desires and those that are "artificially stimulated" an entirely coherent or useful one. (Of course, the idea of an "authentic self" that "expresses itself" through things like consumer choices is one of the notions that Bernays et al. promulgated.) What is good for us, what is not, and who decides? How do we come to want what we want? What is the relationship of pleasure, or even happiness, to the fulfillment of our needs, the gratification of our desires? Possibly, advertising works on us in ways even now not entirely understood. Certainly, malnutrition is real, obesity is real, and the baleful effects of vast areas of the globe turned over to corn and soy monoculture are real. But Curtis' documentary stumbles, I think, in drawing an intractable binary between "active citizen" and "passive consumer."  

Listen, for instance, to this fragment of an interview with Bernays himself -- about selling the virtues of a "hearty breakfast" to the American public on behalf of his client, the Beech-Nut Packing Company, a food processor that sold canned and vacuum-packed foods.

The problem for Beech-Nut is that most Americans ate a light breakfast, which was a shame because the company wanted to sell more of its prepared breakfast foods. So, in order to change American habits, Bernays solicits the authority of a medical expert:

"We went to our physician and found that a heavy breakfast was sounder from the standpoint of health than a light breakfast because the body loses energy during the night and needs it during the day."

They then asked the physician whether he would write to 5,000 physicians and ask whether they shared his opinion. "Obviously," Bernays intones, "all of them concurred that a heavy breakfast was better for the health of the American people than a light breakfast."

Crucially, Bernays and his firm didn't run paid advertisements; instead, they publicized this "fact" in the media -- newspaper headlines across the country described the consensus of 4,500 physicians that heavy breakfasts -- bacon and eggs, notably -- were better for people's health and strength. Bacon sales went up, Bernays said - he had the numbers to prove it.

Beech Nut Packing Company c. 1946 Courtesy Penn State Special Collections

Which is this? Desire, or need? Or desire and need tangled up? Did Bernays believe this claim about bacon being good for you? Did the doctors who endorsed it believe it? Were Americans duped, or did they actively and conscientiously make a choice that they thought would improve their health and their children's health -- and fortify the nation's strength? In other words, was the choice to eat a heartier breakfast that of "passive consumers," duped by what we all agree (for the moment, at least, or some of us) is fallacious medical advice, or that of "active citizens," fulfilling a civic duty towards better health?

EDITED TO ADD: I've ruminated on this a bit more, and realized it's probably not the best example of what I'm trying to say. I'm not trying to say that consumer choice is a move commensurate with political action or real structural change, and this example shows how thoroughly immured the consumers are in the system Bernays is buttressing -- eating bacon and eggs not even for their own pleasure, but to fortify the state, egads. What I'm trying to say is that desire and need are not mutually exclusive, that consumers are not thoroughly passive, and that consumer culture produces not only new appetites, but new varieties of discernment, new sensibilities, maybe. And that desire and longing also have a place in a (more egalitarian) state.   

My other quibble with the documentary has to do with the historicization of the changes Curtis describes. I know that this kind of media makes its claims on viewers' attention by insisting that what it's showing us are the real turning points of history, man, but still. Perhaps the explicit invocation of the psychoanalytic/libidinal element is new to Bernays and his followers, but the evocation of consumer desire (in excess of mere need) predated him by at least a generation. The phantasmagoric allure of manufactured stuff begins in the nineteenth century -- the Crystal Palace exhibition, the Paris arcades, the department store -- if not before. Think of that unforgettable scene in Zola's Au Bonheur des Dames (1883) where the Countess de Boves, a respectable and somewhat austere member of the petty nobility, is found with yards and yards of the finest Alençon lace crammed up her sleeves:

"She would steal for the sake of stealing, as one loves for love's sake, driven by desire, in the neurotic sickness that her unsatisfied desire for luxury had earlier produced in her through the huge, crude temptation of the department stores."

Monsieur Mouret, who owns the department store Au Bonheur des Dames -- the Ladies' Paradise -- is, in Zola's novel, a visionary of spectacular displays, who arranges his store to showcase the inexhaustible plenitude of consumer goods. Fountains of shimmering silks in all colors, towers of different laces unspooling in puddles of white and cream, overcoats and china pots and umbrellas and children's hats. Everything is here, and so much of it, and constantly changing. A dynamic that highlights both abundance and evanescence. Zola describes the department store literally as a machine for selling, a machine whose product is desire.

Mint Chocolate Polar Seltzer: Awesome

I have an addition to the top Polar seltzer flavors list I made here. Friends! I implore you. Search your grocer's aisles for: Mint Chocolate Polar Seltzer. It is one of their "limited edition" winter flavors, and it is amazing. It reminds me of eating mint chocolate chip ice cream from Baskin-Robbins with my dad after he picked me up from ballet class, before he figured out that he was lactose intolerant. Memories!   

Mint chocolate polar seltzer, animal pants

Wine bottles and wine snobs

It's the new year, I'm taking a little break from imbibing spirituous liquors, and so have been reading a lot about wine (instead of just guzzling it). One of the things that I admire about wine snobbery is its claim to make time and place sensible to the palate: the terroir of the grape and its vintage. Reading up on the history of wine, I came across a nice example of how the emergence of wine connoisseurship depended on the most humble of technologies: the cylindrical glass bottle. 

(I'm basing all the below (mostly) on tidbits gleaned from the all-you-can-eat buffet of interesting facts that is the Oxford Companion to Wine (highly, highly recommended) under the entries: "bottles" and "aging.")

So -- the ancient Greeks and especially the Romans enjoyed old vintages, but for the thousand years after Rome fell, people in Europe mostly stopped drinking aged wines. This wasn't just because they lived in the dark ages and didn't know any better. Vineyard production had largely shifted to Northern Europe, and the kinds of wines that were customarily made there had to be drunk fresh, or else they got sour. So how was the European wine snob reborn in modernity?

Enter... the cylindrical glass bottle.

The thing contained is always somehow shaped by its container. What changed in the 18th century was: glass. Although glass existed in the ancient world (think of the Egyptian pulled glass bottle in the shape of a fish), the spread of new glass-making technology in the 17th century made it possible to produce glassware in commercial quantities. But before the 1730s, wine bottles were not the familiar cylinders that we hoist around today; instead they varied from bottle to bottle, and were usually squat or onion-shaped or bulbous. The Oxford Companion speculates that these were buried in beds of sand for storage. Then in the 1730s, this happened:  

"While it was known that some vintages of wine were better than others even in prehistory, their keeping and consequent maturing qualities were not realized until the introduction of binning, the storing of wine in bottles laid on their sides.... All this was achieved by the abandoning of onion-, bladder-, and mallet-shaped bottles in favour of cylindrical ones which stack easily."

Cylindrical bottles meant stackable bottles, stored in wooden bins in the cool dark subterranean cellars of urban wine merchants. This standardization of the container allowed for the biochemical processes of maturation to occur in the bottle, revealing a world of nuance and difference in the thing contained. Wine merchants didn't set out to find a way to bottle-age wine. It just happened. Maybe it happened in the hold of ships as wine was transported from one place to another (as was the case with vinho da roda, a kind of Madeira that had made a cross-Atlantic round trip through the tropics). But once it happened, bottle-aging became part of the process of production and consumption for many kinds of wine.

One of the best things about doing history is how it shakes your faith in straightforward causality. The closer you look, the less history seems like "one damn thing after another," and the more it seems like big messy clots of phenomena getting pulled into relationships -- and then suddenly everything has changed. So, if I were to claim that "cylindrical bottles made wine snobbery possible," it would not only be an oversimplification; it would violate (I think) the spirit of good history. Because it wasn't just cylindrical bottles that made modern connoisseurship possible, but the whole social and technical system in which they were enlisted and put to use: the wine merchants who needed a convenient storage solution for their increasingly crowded urban cellars, merchants who also kept systematic records, which allowed them to evaluate wines and value them differently -- and to discover that they could create value (and profit) with time. And none of that could have happened without customers -- the growth of a consumer economy and the emergence of a market for wine where people were willing to pay more for vintages and varietals that they perceived to be better or more prestigious. Which in turn depended on people who believed that money spent tastefully was money well spent. And there we have it: the bottle in the cellar is all tangled up in the story of the history of capitalism. 

Turning back to the Oxford Companion:

"Demand for mature wines transformed the wine trade. Aside from a few wealthy owners, most vine-growers could not afford to keep stocks of past vintages. Only merchants could do that, and their economic power and hold over the producers increased during the 18th and 19th centuries. This was most demonstrably the case in Bordeaux, Beaune, and Oporto, where merchants amassed huge stocks, vast fortunes, and powerful reputations."

A change in the shape of wine bottles -- and the new appetites that it makes possible -- is a crucial element in reshaping the agricultural and economic landscape of Europe, the set of social relations between merchants and producers. And out of this welter emerges the wine snob, fastidiously training his (or her) senses to discern the distinctions between vintages, to name those differences, to place a new kind of value on time, to enrich (if not prolong) the fleeting sensation of flavor.  

 

How to become an expert: Cigarette edition

I listen to a lot of "old time" radio - especially mysteries and detective shows - in part to satisfy my insatiable appetite for narrative while up to my sudsy elbows in the dishwater of history.  The other day, I heard an episode of "Mysteries in the Air," starring Peter Lorre, with his quavering syllables and his lightning-speed mildness-to-mania transitions.

The show was sponsored by Camel cigarettes, and the version I listened to kept the sponsor's message intact in the broadcast. Smokers are notoriously brand-loyal. They're not like consumers of other stuff, switching from Charmin to Quilted Northern on a whim or a spree. They'll ask for their pack of Luckies or Reds or Virginia Slims every time, without fail, no hesitation. You smoke what you've always smoked. But how do you get people to switch? How do you get people to believe that their choice is their own to make, and not somehow compulsory? Here's a complete transcript:

[Cymbal-clash] "Voice of God"-type voice, distorted as though through a PA speaker, intones: Experience is the best teacher.

"Average Joe": Remember the wartime cigarette shortage? Who doesn't! One thing about it though - smokers who went through it really learned a lot about cigarettes. They had first-hand experience with many different brands.

Dame: [Giggles] How true! Goodness, we certainly smoked whatever brands we could get in those days. I smoked so many brands I'm practically a walking encyclopedia about cigarettes. Well, I'm a Camel smoker now, and believe me, I know Camel is the cigarette for me because I've compared so many brands.

Joe: Yes, smoking whatever brands they could get during the wartime cigarette shortage made people everywhere experts on judging the differences in cigarette quality. That experience convinced a host of smokers that they preferred the rich, full flavor and cool mildness of Camels. The result:

PA-speaker Voice of God: More people are smoking Camels than ever before.

Joe: Experience really is the best teacher. Try a Camel yourself.

The ad is interesting to me because it tries to make a conditioned, manipulated, somewhat arbitrary choice -- the choice of what brand of cigarette to smoke -- seem like a reasonable one, made with deliberation and informed judgment. These people, we are told, are experts about smoking, walking encyclopedias. Hey, thanks to the war, you're an expert! The wartime cigarette shortage created a circumstance that never exists in civilian life - you had to smoke what you could get. This wasn't privation; it was a de facto tasting panel. You developed the capacity to judge the differences in cigarette quality. Informed consumer, you can now choose your brand based on the exercise of your newly cultivated expertise. You base your choice on taste, not habit or nostalgia, nor are you a puppet of advertisers. But it's not just individual judgment that's definitive here - there's a consensus. After all, "More people are smoking Camels than ever before." Does your judgment concur with the multitude, or is there something different or perhaps defective about your powers of discernment? 

In my own research into flavor and taste, I've become increasingly skeptical about the claims of sensory expertise, even as I recognize the capacity to refine sensory discernment. Objective Methods in Food Quality Assessment, a textbook published in 1987, describes the lengths to which sensory scientists go to create "objective" data about food preferences and sensibilities. The first chapter, with the perhaps over-insistent title, "Sensory Evaluation Can Be Objective," advises: "since humans are being used as measuring instruments, every effort must be made to control the effect of the environment on judgment." The testing room should be kept at slightly higher pressure than the exterior, in order to eliminate the introduction of non-relevant odors. The temperature and humidity should be rigidly controlled. Colored lights might be useful, to make color differences in foods invisible. In the author's laboratory, each taster sits in an individual "domed hatch," pressing a button to indicate readiness for a new sample. This way, any possible influence introduced by the technician who delivers the sample is eliminated. The taster is in a pod, isolated from all direct human contact, with a color-indeterminate cube of stuff to decide about.  
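The same logic of control extends to how samples are presented, not just to the room. Here is a minimal Python sketch -- my own illustration, not anything from the textbook -- of the standard blinding-and-randomization step: each sample gets an arbitrary three-digit code and a shuffled serving order, so that neither labels nor sequence can cue the human "measuring instrument."

    import random

    def blind_presentation(samples, seed=None):
        # Assign random three-digit blinding codes and shuffle the
        # serving order, so neither names nor position in the lineup
        # can tip off the taster.
        rng = random.Random(seed)
        codes = rng.sample(range(100, 1000), len(samples))
        order = list(zip(codes, samples))
        rng.shuffle(order)
        return order

    # Each panelist in their isolation booth gets the same samples,
    # but under different codes and in a different order.
    for panelist in ("P1", "P2", "P3"):
        print(panelist, blind_presentation(["sample A", "sample B", "sample C"]))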

Sensory science tries gamely to create "objective" data, staging tasting tests where all potentially corrupting stimuli are stripped away, and the individual is "independent" of outside influence, exercising only her or his own sensory judgment. That is, a situation that is nothing like actual consumption, where we look everywhere for cues about whether something is delicious, disgusting, valuable, cheap, good to like, bad to like. It's an impossible task - a dream of a science that believes it can exist outside of the social, with the laboratory as a space that maintains a cultural cordon sanitaire, sanitized from social factors. 

Which is not to say that one cannot prefer a brand of cigarettes or whiskey, or be a walking encyclopedia about tobaccos or wines or ice cream. Just that in a certain way, perhaps, our choices about taste are not only our own. 

The first fragrance insert?

According to this fascinating article, fragrance inserts in magazines -- the scented matte strip that, when unfolded, releases a waft of Coty's Chypre or White Flowers or whatever -- first appeared in the 1940s, with microencapsulation technology developed by the National Cash Register Corporation (soon after to play a big role in the history of computing).

Looking through old trade journals at the Hagley, I found an example of this technology in use three decades earlier. From the May 1912 issue of The American Perfumer & Essential Oil Review:

A scented fragrance insert for Rose Aldehyde C.

I did obligingly smell the circle, but alas, the odor of Rose Aldehyde C has been lost to time and history...

 

Polar Seltzer and the Evolution of Flavor

Hello! It's been a while since I updated this blog, but now that other distractions (ahem, marriage...) are behind me, I'm back to hiking up that Sisyphean mountain (Ol' Dissertation) and hopefully will have more stuff to post up here.

One way to tell the history of flavor additives is to track their changing uses. In the early days of flavor extract manufacture, in the last third of the nineteenth century, flavor additives typically came packaged in syrups or in alcohol-based solutions. Soda-fountain operators and bottlers, ice-cream makers, makers of fruit preserves, and other food manufacturers would (presumably) purchase the kind of extract (alcohol, glycerin, or sugar-syrup based) that seemed to suit their needs. Starting around the mid- to late-1940s, however, flavor additives were increasingly designed to operate as components of processed foods -- flavors manufactured to withstand the particular rigors of processing, distribution, and the expanded shelf (or freezer) life of mass-produced, mass-consumed goods. Using an expanding stable of chemicals and manufacturing methods (such as spray-drying), flavor additives were engineered to deliver a wider range of flavor experiences to consumers at the point of consumption. 

Polar seltzer has been around for 130 years, making carbonated beverages with "natural fruit flavors." I pick on them here because their seltzers are a great example of how flavor has changed even in cases where you'd expect the most continuity. 

I'm not a shill for the company, but, full disclosure, I am a long-time fan of their flavored seltzers. You can get them in NYC now, but until very recently, that was not the case. I remember calling their Worcester, MA headquarters four or five years ago and asking if I could find their vanilla seltzer anywhere in NYC; the lady on the phone told me I was out of luck, but asked me if I was interested in becoming a distributor. I seriously considered it for a moment, though I probably would have gotten high on my own supply, if you know what I mean. 

Anyways, many of the Polar Seltzer flavors are, as far as I'm concerned, flavor masterpieces - and illuminate some of the ways that flavor additives operate to produce their effects. Is there anything in the world less like fudge cheesecake than sugar-free, calorie-free sparkling water? Nonetheless, Polar's Fudge Cheesecake flavored seltzer (one of their limited-edition winter 2013 flavors) marvelously evokes ... something about deli-style fudge cheesecake. As you move to take a sip of the seltzer, you get the aroma of bakery fudge - a little tinny, like the chocolate side of a black-and-white cookie from the bodega - and then, after you've swallowed, the aroma that reaches your nasal cavity from the back of your throat subtly recalls cheesecake's creamy notes. It's all aroma, there's very little actual "taste" to it, but the aroma is masterfully constructed, and the bubbles of the seltzer actually seem to amplify the effect - releasing more of the volatile molecules into the air, where they do their work.

What's the right metaphor for this relationship, between the flavor and the thing itself? Polar's seltzer has almost nothing in common, materially, with fudge cheesecake. Even if they do share some of the same characteristic flavor molecules -- that is, if the chemicals Polar uses to flavor its seltzer are the same as those found in fudge cheesecake, which is in no way a given -- this is a material resemblance only on the scale of parts per million. Yet we (most of us) accept that the seltzer and the fudge cheesecake are somehow related through the medium of a volatile chemical mixture that we call flavor. 

The fragility of this volatile chemical mixture is evident if you do something rash like adding stevia to the beverage. I haven't tried this with the fudge cheesecake seltzer, but when you add stevia to vanilla seltzer, the flavor vanishes. The stevia must react with the vanilla compounds in some way, rendering them less volatile.

To conclude, a list of my favorite Polar seltzer flavors (in no particular order): 

  • Granny Smith
  • Vanilla
  • Toasted Coconut
  • Strawberry
  • Fudge Cheesecake
  • Ginger Lemonade
  • Georgia Peach

Print and Eat the Food of the Future

One of the best parts of the pseudo-Freudian space fantasy Forbidden Planet is when Robby the Robot obliges the poor space sailor who's been left to guard the ship, furnishing him with a heap of liquor. Robby scans and chemically analyzes the spaceman's bottle of whiskey, and then duplicates it... and duplicates it... and duplicates it... until he has a lovely pile of whiskey bottles -- at least until the invisible Monsters from the Id come and annihilate his fun.

All matter is chemicals, after all, and all chemicals are elements, and elements are just atoms, and atoms are everywhere, so why not? Anything can become anything else; stuff can be made out of no stuff.

The wait is over (maybe): why cook, when you can print your food and eat it? Sadly, there's no gracious Robby to butler our meal for us out of thin air. This is basically a modified 3D printer, the "revolutionary" technology that keeps threatening to transcend mere novelty, one of these days, maybe. 

I mock, but this IEEE Spectrum article on print-and-eat food is really fascinating. At first, 3D food printers were limited by the materials they used: a paste that hardened into different shapes, pretty much the edible equivalent of the standard 3D printer's plastic. (Yum!) 

But then a breakthrough: Daniel Cohen, a grad student at Cornell, had the idea to treat the printer's materials as a set of miscible components, the way the three RGB printer cartridges in a color printer can produce a full-color reproduction of a multi-hued image. That is, he proposed a standard basic palette of food materials, reimagining food's basic components as though there are edible equivalents to the primary colors, which can additively produce any hue in the visible spectrum. This itself is not a novel idea: sensory taxonomers from Linnaeus to the Arthur D. Little Consulting Company (and many more) have proposed systems that attempt to break the smellable-tastable world into irreducible elements. However, it's important to note that the color spectrum is a metaphor; it translates imperfectly onto the much different (chemosensory, multisensory) system of flavor perception.
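The seduction of the palette metaphor is easy to demonstrate in code. Here is a toy Python sketch -- entirely my own invention, with made-up numbers, and in no way Cohen's actual method -- that treats each base material as a vector of measurable properties and solves for non-negative mixing weights, since a printer can deposit material but never remove it:

    import numpy as np
    from scipy.optimize import nnls

    # Invented property vectors (sweetness, fat, protein, moisture)
    # for a hypothetical palette of printable base materials.
    palette = np.array([
        [0.9, 0.1, 0.0, 0.3],   # sugar paste
        [0.0, 0.8, 0.1, 0.1],   # fat gel
        [0.1, 0.1, 0.9, 0.2],   # protein slurry
        [0.0, 0.0, 0.0, 1.0],   # water
    ]).T  # rows = properties, columns = base materials

    # The food we want to imitate, described in the same property space.
    target = np.array([0.4, 0.3, 0.2, 0.5])

    # Non-negative least squares: find mixing weights >= 0 that best
    # approximate the target, like inks additively building up a color.
    weights, residual = nnls(palette, target)
    print(weights, residual)

The leftover residual is exactly where the metaphor strains: a few linear dimensions can approximate a color, but flavor perception is not obviously a low-dimensional linear space.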

Jeffrey Lipton, the article's author and an engineering student intimately involved in the development of commercial 3D-printing technology and its applications, is concerned with making the food printer's products not only palatable but desirable. The "uncanny valley" of "mushroom-shaped bananas" is too "artificial," and thus likely to be rejected by the "home cook." He also dismisses proposals to use 3D food printing as a sort of hedge against a Malthusian crisis (by making palatable foods -- like "steak" -- out of cheap or repulsive proteins -- such as insects) as off-trend: today's savvy consumers reject "highly processed foods." (Incidentally, in my research on the history of flavor additives, I've found this "socially useful" application cited by the flavor industry starting in the 1950s and 1960s -- the idea that synthetic flavor chemicals would help forestall a malnutrition crisis by making cheap nutritive substances (combinations of carbs-proteins-fats manufactured, perhaps, from industrial waste) edible and acceptable.) 

Instead of working from basic components, Lipton says, they've taken a "top down" (rather than "bottom up") approach with the printer, working with chefs to produce fried scallops shaped like space shuttles and Austrian cookies with writing on the inside. (How this addresses purported consumer desires for "less processed" foods is not really clear...) The most exciting result is a new form of fried corn dough, impossible to achieve without a 3D printer; the dough forms "a porous matrix that allowed the frying oil to penetrate much deeper into the food. The result was something delicately crispy and greasy, like a cross between a doughnut, a tortilla chip, and raw ramen noodles."

In this incarnation, the 3D printer becomes an exquisitely refined tool for the production of highly processed food. A tool that doesn't just replicate what already exists in the world from a basic color palette, the way a camera reproduces visible reality, but something that makes new, unforeseen things possible -- maybe. Can we use this to imagine and create new flavors, or just to dress up familiar things in fancy, unfamiliar, space-ship forms?  

 

The Bird Climate

Earlier this week, The New York Times reported on eBird, a network of bird observers using smartphones to collaborate on the vast project of making a global picture of bird populations. Launched in 2002 by the venerable Cornell Lab of Ornithology, the network has already compiled nearly 150 million reports of bird sightings, and the amount of data it receives each year continues to grow. Poignantly, eBird is also promoted as a way to prevent the diligent observations of disaggregated bird-watchers from being lost -- to science and, by extension, to eternity. 

Dr. John W. Fitzpatrick, director of the Cornell Lab of Ornithology, comments on this in the article:  

“People for generations have been accumulating an enormous amount of information about where birds are and have been.... Then it got burned when they died.”       

The eBird network saves this information from the fire, so to speak, by converting it into data - accumulated, centralized, and brought into sensible communion with other data.

The dynamics of this data, the constant addition of new information about bird sightings, and the scope of the eBird database distinguish it from previous efforts, such as the Audubon Christmas Day Bird Count, which also organized amateur birders, bird lovers, and pro ornithologists (initially in the Northeastern US, later across North America) for a one-day extravaganza of bird watching, identifying, and tallying. In contrast to this "static" one-day count of these moving objects, what eBird makes possible is a conception of birds as a phenomenon like climate -- global, interconnected, dynamic. If the Audubon Christmas Bird Count is the local bird weather report in various locations on a particular day of the year, eBird is the global bird climate: the patterns and moving fronts, with a concomitant capacity to make predictions about future local bird weather. The scientists who use the program even call their records of particular species a "heatmap." 
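The "heatmap" is, at bottom, a simple aggregation. A minimal Python sketch of the idea -- my own illustration, not eBird's actual pipeline -- bins individual sighting reports into a coarse latitude/longitude grid:

    from collections import Counter

    def grid_heatmap(sightings, cell=1.0):
        # Bin (lat, lon, count) reports into cells of `cell` degrees:
        # the aggregation that turns scattered checklists into
        # something climate-like.
        grid = Counter()
        for lat, lon, count in sightings:
            key = (lat // cell * cell, lon // cell * cell)
            grid[key] += count
        return grid

    reports = [(40.7, -74.0, 3), (40.2, -74.8, 1), (41.9, -87.6, 5)]
    print(grid_heatmap(reports))
    # Counter({(41.0, -88.0): 5, (40.0, -74.0): 3, (40.0, -75.0): 1})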

The birders who participate in eBird aren't just ordinary birders, they are -- in eBird's words -- "biological sensors," nodes in a technosocial network to produce knowledge of the bird climate.

But as in any case where bodies and machines come together, there are ticklish issues at the interface. Though humans may be the best bird detectors, they lack some of the qualities of machine parts: consistency, reliability, regularity, standardization. And so the biological sensors' information, entered via the eBird smartphone app, has to filter through other humans - the Cornell Lab of Ornithology - to be sanctified as data. The information has to pass through the experts. These experts may also avail themselves of machines: the Times reports that eBird's creators are trying to make up for the variations among its biological sensors by using "machine learning" to "train" their program to distinguish signal from noise, to flag and discredit false or misreported or misidentified sightings. And they are also curbing bad data the old-fashioned way: by sending scientists out to refine the capacities of the biological sensors, training non-scientist eBird users to make the correct calls.
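The Times mentions machine learning, but the oldest version of this signal-versus-noise step is rule-based, and a caricature of it is easy to sketch in Python. Everything below -- species names, latitude bands, months -- is invented for illustration; the real eBird filters are regional checklists maintained by expert reviewers:

    # Hypothetical plausibility rules: a latitude band plus the months
    # when the species is expected. Purely illustrative values.
    RULES = {
        "snowy owl": {"lat": (40.0, 75.0), "months": {11, 12, 1, 2, 3}},
        "scarlet tanager": {"lat": (25.0, 50.0), "months": {4, 5, 6, 7, 8, 9}},
    }

    def needs_review(species, lat, month):
        # Flag a report for human review if it falls outside the
        # species' expected range or season.
        rule = RULES.get(species)
        if rule is None:
            return True  # unknown species: always review
        lo, hi = rule["lat"]
        return not (lo <= lat <= hi and month in rule["months"])

    print(needs_review("snowy owl", 41.0, 1))   # False: plausible winter bird
    print(needs_review("snowy owl", 28.0, 7))   # True: flag for review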

One thing the article gets a bit wrong: the Times claims that prior to eBird, one-day counts were the only source of information about bird populations. The archetypal example of this is the Audubon Christmas Day Bird Count, which began in 1900. I'd also argue that bird banding, which was first used as a scientific method of tracking birds around the same time the Bird Count began, is another major source of information about bird populations, migration, and behavior. It's no coincidence that both the bird count and bird banding appeared at a similar time. Bird migration had long been a phenomenon of scientific interest, but at the turn of the twentieth century, organized networks of ornithological observers (proto-eBird) affiliated with institutions like natural history museums, national governments, or conservation groups made viable the vast data collection project entailed by the study of migration.

Illustration of birds dead on the pavilion below the Statue of Liberty's torch, from Duluth Daily News, November 8, 1887

There's yet another, lesser-known source of information about migration that was also used at this time: birds that collided with human-built structures. The Statue of Liberty's electric torch first blazed in 1887; the statue of William Penn which crowned Philadelphia's city hall (briefly, the world's tallest building) was floodlit in 1898. Under certain weather conditions -- drizzle, low cloud cover -- hundreds of migratory birds were killed on certain nights in collisions with these and similar monumental electric-lit structures in still largely gas-lit cities. As one 19th-century article describing the avian casualties at the Statue of Liberty put it, these were "victims of liberty and their love of light."

Moreover, these were not urban birds - sparrows and pigeons - but migratory birds passing along ancestral flyways, forest dwellers and waterbirds rarely seen in the city's vicinity.

What happened to the bodies of these birds? 

At a time when feathers for ladies' hats were a hot commodity, these bodies could have been plundered for their valuable plumes. Colonel Augustus Tassin, who was in charge of the Statue of Liberty grounds, did not allow this to occur. He told a newspaper reporter in 1887:   

“I have heretofore received many letters from all sorts of people offering to buy the birds which were killed in this way. But I believed they were public property, and that I had no right to dispose of them.... When I have collected about 200 specimens, I send them to the Washington National Museum, the Smithsonian Institution, and other scientific institutions, where I know that they are wanted.”

Indeed, the Smithsonian's 1888 Annual Report records the receipt of 260 birds of 40 species "in the flesh" from Tassin, recognized as one of the "more important accessions during the year." Government regulations required Tassin to record data about avian fatalities at the Statue of Liberty, which was technically a lighthouse and thus subject to this requirement. But the practice of scientific collecting at bird collision sites was adopted at other late 19th- and early 20th-century urban civic sites that saw similar mass fatalities.

The tale thus becomes a sort of redemption narrative: a conversion of meaningless death into meaningful data, and a reclamation of the specimens as public, scientific property rather than private commodities.  

Further, the data produced by bird collisions had certain advantages over information from bird sightings during migration. What you had were the real bodies of birds, material specimens. This allowed ornithologists to make note of things that a sighting cannot provide a clue to: the bird's final meal, its sex, its approximate age, its weight. At the turn of the 20th century -- a time when the issue of "scientific collecting" (killing birds for research) was drawing sharp scrutiny and criticism from emergent conservationist groups like the Audubon Society -- bird collisions provided specimens that illuminated the phenomenon of migration while evading the question of whether killing wildlife was justifiable on scientific grounds.

This practice continues to this day. Birds that die after colliding with buildings in New York, in Chicago, in Philadelphia, and in other urban areas are collected by bird collision monitors, bagged and tagged and incorporated into natural history collections, and also used to raise awareness about the vast fatal toll of glass and architecture on bird life in, above, and around cities and other places where people live and build shiny or disorienting things. (Not every collision is fatal; many of these groups also save and rehabilitate wounded birds.)

Which brings me back to Dr. Fitzpatrick's quote at the beginning of this post, that the collection of data is a way to prevent loss, to stave off the fire of oblivion.

Bird specimens lead productive afterlives in natural history collections, and continue to yield information about population genetics, historical ecology, behavior, and physiology, among other things. But making a bird a specimen entails loss - things that are discarded in the process of bringing the bird's body into conformity with the other bodies in the regimented drawers in the back rooms of natural history museums. Likewise, eBird certainly allows the birdwatcher to give her or his observations a rich and productive afterlife. But that shouldn't stop us from asking: what might be lost here? What does not pass into the eBird data set? And does that absence matter?


Kasugai Mangosteen Gummies, or, What is a New Flavor?

How do you describe a flavor to someone who has never tasted it before?

Most of us would probably first reach for an analogy: there's a reason "it tastes like chicken" is a cliché to describe things like alligator or rattlesnake or other "weird" meats. Almost everyone (in the US, at least) can be assumed to have eaten chicken; it's a cute way of downplaying the allure (or disgust) of the exotic. But this statement only works because we can't adequately explain what chicken tastes like. It, like most of the foods that we are familiar with, has become a cipher.

And really, what is a new flavor? Are there any really unprecedented flavors still out there?


As a case study, I offer this bag of Kasugai Mangosteen Gummy Candy, purchased for $3.59 at the Japanese bodega.

What is a mangosteen? I can tell you what it looks like if you've never seen one. It fits in the palm of your hand; it has a leathery purple peel capped by a crown of three or four tough green leaves; the fruit itself is segmented like an orange, milky-colored.  

But what does it taste like?

The package offers few clues:

"The Mangosteen has the perfect balance of sweet and sour taste, known as the 'Queen of Fruit'. Enjoy its delicious flavor in Kasugai Mangosteen Gummy Candy."

R.W. Apple confronted the problem of describing the taste of the mangosteen when he wrote about it for the New York Times in 2003. Apple is an enthusiast, a lover, an avid apostle for mangosteen. His readers, however, must be presumed largely ignorant of the fruit, its flavor, and its reputation. At that point, mangosteens were forbidden fruit in the U.S. Native to Southeast Asia, the fruit was host to a pernicious type of fruit fly that the USDA wanted to keep away from American crops.  

How does Apple confront the problem of describing the mangosteen's flavor? He writes: "I could tell you that the flavor reminds me of litchis, peaches and clementines, mingled in a single succulent mouthful, but words can no more describe how mangosteens taste than explain why I love my wife and children. Merely typing the name makes my mouth water. Whenever in my travels I spot a mound of those precious orbs in a marketplace, my heart pounds."

Does Apple tell us what a mangosteen tastes like? Instead of giving us a portrait of the flavor, he describes the effect it has on him and on other people; he provides us with the evidence of its value. A chef he knows bursts into tears at her first taste of mangosteen. Queen Victoria pledged to knight anyone who could bring her a mangosteen, ready to eat (no one was able to meet this challenge). Apple himself claims to prize a mangosteen above even a hot fudge sundae. Simply listing the things the mangosteen tastes like does not do justice to the experience of the fruit; what vouches for its deliciousness is its desirability, its valuation above all other fruits (of which it is the queen) and other delectable things.

When I read this article way back in 2003, the mangosteen seemed to me the most marvelous thing I could imagine. I wanted it as much as Rapunzel's mom wanted the cabbage from the witch's garden; I would trade a baby for one, no question. Robbie and I searched for a source online, coming across all kinds of other fascinating fruits, such as miracle berries - but no mangosteen. In Chinatown, we bought the mangosteen's co-regent, the spiky durian, "king of fruits," and one memorable evening, split it open and managed to eat only a few spoonfuls of its custardy flesh - which reeked of corpses, oniony sweat, and gasoline - before we threw it out with the trash.

Not long after, we did indeed find mangosteen, quite by chance. We were in Victoriaville, a small town in Quebec, for the annual festival of "musique actuelle"; the sweet smell of cow manure pervaded the landscape. Shopping for provisions at the chain supermarket in this unlikely locale, we discovered a pyramid of mangosteens displayed unassumingly beside bananas in bunches and fat green pears from Chile. We were with two American friends, who were singularly unimpressed by our discovery; they had traveled in Southeast Asia and dined on fresh mangosteen at outdoor markets. Robbie jumped up and down; I wept among the produce. We bought a half-dozen at a nearly extortionary price, and hurried to our rented house to tear open the purple hulls and taste the jewel-like white fruit inside.

But the flesh was livid grey and mushy, its flavor sour and musty. There was nothing delicate about it. No litchis, no tangerines, no alpine strawberries. The thick rinds left an unpleasant maroon residue underneath my fingernails, the color of old blood.  

We had mangosteen in Victoriaville, but we did not taste its flavor. This disappointing experience couldn't be the flavor of mangosteen, precisely because it was sour and soft and kind of gross. A mangosteen is by definition delicious, exceedingly delicious, the queen of fruits.    

Subsequently, I noticed that mangosteen began to feature in nutritional supplements and in energy drinks. Along with goji berries or acai, it was touted as a new "superfood" with an antioxidant payload that would annihilate the toxins of industrial living. (It's interesting that potency, enhancement, comes from elsewhere - either "exotic" parts of the world, or the past ("traditional knowledge") - realms that have "escaped" modernity.)

But these supplements don't promise the flavor of mangosteen. What they offer is some other virtue of the fruit, another way of having it without tasting it.

More recently, now that mangosteen (imported from Puerto Rico, or from Southeast Asia after irradiation against fruit fly pests) has become available for import, I've seen some sorry-looking specimens at supermarkets, for sale at an astronomical price per pound. The fruit seems hardly worth it: the purple husks dented, the bonny green crowns dingy, a rind of white fuzz where the fruit was separated from the tree. Evidently much the worse for wear from their long voyage from the antipodes. I have not splurged on any of these specimens.

So, what do the Kasugai mangosteen gummies taste like? And do they taste like mangosteen?

In three days, I have consumed more than half the bag, but the more gummies I eat, the less specific the flavor becomes. The gummies are sweet. They are a little sour. They are monotonal. Maybe a bit like pineapple?


Am I learning what mangosteen tastes like, what is meant by mangosteen flavor, by eating them? Or will it change my experience of "real" mangosteen, when the day finally comes that I get to eat a perfect fruit, at the peak of its flavor? Will it be like Picasso's portrait of Gertrude Stein, where, when he was told that it looked nothing like her, he replied, "Ah, but it will"? Will it just taste like Kasugai mangosteen gummy?