Sunday, September 28, 2008

Recipes (1): Mericonda


I'm posting a little ahead of schedule this time since today I'm off to Italy with my oldest son, Joseph, who turned seven last week. He has been there before but this is the trip that I have some expectation he'll remember, so we'll try to do the odd extra-special memorable thing - two days in Venice ought to cover that - and get him acquainted and re-acquainted with the people and the places that my family and I cherish the most. My mother's age and ill health lend some urgency to these endeavours, and we take that seriously, but of course we are also mindful that the young lad needs a plan of fun.

As luck would have it, much of my family's memory travels by means of food, and Joseph is not going to mind that one bit. I promise you it has got nothing to do with Marcel Proust and everything to do with the surname on my maternal grandmother's side: Magnoni, or 'big eaters'. My grandma (hereinafter nonna) married a Farina (or 'flour', equivalent to the Anglo-Saxon name Miller), and that took care of the need for ingredients.

Here she is, my nonna, at age 16.

It was her first ever photo and there wouldn't be another for close to fifty years. Of her own mother, who died in 1911 aged 39, no picture was ever taken, which of course was not at all unusual in those days. But it does put the notion of how you remember your ancestors in perspective. Nonna was barely literate, worked as a seamstress all her life, and left none of those documentary traces that nowadays we demand of ourselves and that society and decorum demand of us - my partner and I have taken roughly one squillion photos of Joseph alone, for instance. But she was a good person, a socialist back when it meant something, generous, a great cook, loved her whanau fiercely.

And boy, did she not travel. Close to her entire life was spent on this rectangle of land, measuring scarcely 15 miles across. Born in Pieve di Coriano, a long-time resident of Villa Poma, died in Quistello. This was her patch.


Armed with Google Earth and some family knowledge, or better still direct contact with the relevant old lady, you too can measure your Grandmother Range, and compare it to your own. The results may or may not surprise you. In the case of nonna, apart from the odd visit to her daughter's family in Milan (that's us) and a trip to Padua to thank Saint Anthony for helping her with a fibroma, the above little patch of the Po Valley, at the crossroads of Lombardy, Veneto and Emilia, the cradle of Parmigiano Reggiano, was her whole world.

I don't know how to explain that to Joseph, who at seven has already touched down on four continents and is reserving the option of one day becoming an astronaut. But I can tell him about the food, and even cook him one or two things, in a clumsy but well-meaning attempt to keep that memory and those connections alive. So, pausing only to acknowledge the formidable Islander over at Public Address, with whom I had a lovely email conversation on the topic of food and grandmothers just last week, here is one of nonna's favourite recipes, and mine. It's for mericonda, breadcrumbs and Parmigiano dumplings in a beef and chicken stock. Or, as my recently converted friend Giacomo calls it, 'boiled dough'. (Cheeky bugger.)

For the stock: half a chicken, 400g of chuck steak - or the piece that in Italian is referred to as the priest's hat, some butchers will know where to cut that - one clove of garlic, one onion, a celery stick, a carrot. Dump everything in when the water is still cold, bring to the boil then let simmer for three to four hours.

For the dumplings: two cups of breadcrumbs, one cup of grated Parmigiano Reggiano (don't even think of buying the fraudulent 'parmesan', you'd make an entirely different dish I'm afraid; Grana Padano will do in a pinch), three eggs. When the stock is ready, simply mix the dumpling ingredients together into a nice moist big lump. If it seems too dry and hard, mix in a tablespoon or two of the stock.
The dough
Back to the stock pot: salt to taste, remove the meat, chuck the veges in the nearest compost bin, bring back to the boil and use a two-handed potato masher or ricer with big round holes to press the lump into little worms about 3-4 cm long, dumping them directly into the pot.

Through the ricer
The ricers made in New Zealand are a bit too flimsy for the job, I find, so it may be safer to use a masher. It's essential that the holes are big, though, at least 2.5 mm. The dumplings cook in a couple of minutes.

The finished product
After the mericonda you serve the boiled meat ("il bollito"), with some sort of sauce or gravy and potatoes sliced and cooked in oil, with garlic and parsley. The wine that goes with this meal is a red Lambrusco, but I understand that most people abroad - not to mention in Italy - turn up their noses at the idea of a sparkling red. And a novelty wine it may be, but it is a key part of the experience so I'm going to gently but firmly demand that you use it. It is my memory, after all, although I would love nothing more than hearing about and trying out the family or ancestral recipes that others might have to offer.

If the concept of boiled dough fails to appeal, I have a couple of testimonials. Here's Joseph:


and here's young Lucia. The Magnoni genes are strong in this one.


To me a trip home means also reconnecting with the source of these memory foods, and every time I go I try to learn to make one or two of the old dishes. This time mum has promised to teach me how to bake pane ferrarese, a type of bread that defies the laws of physics by keeping for weeks in a steady state of deliciousness - I'll report back on that. But now the lad and I had better be off.


Cross-posted at Public Address, with thanks to Russell Brown.

Monday, September 22, 2008

O Time Your Pyramids


I pledged to Grunt after her comment in last week's post that I would lay off the pessimism long enough to talk about some of the very many inspiring examples of memory work that are to be found these days on and around the Web. But then the discussion of the contrarian view continued very productively in the comments and I thought it would be best to keep stringing these introductory posts together rather than taking a detour quite yet. Apologies and patience.

The take-home message from the Platonic half-volley was that a new memory technology - such as language, writing or the computer - displaces the modes of remembering that preceded it. And although we might take the view that the bargain involved is a good one, or even a very good one, a loss is a loss and needs to be accounted for. But there are other propositions that undercut the contemporary will to memory, and today I want to talk about one that appears initially to concede a major point, if not the whole argument, by allowing that we might one day in fact be able to remember everything - which is, arguably, what most memory technologies ultimately strive for. But then the sceptic goes on to ask: would such a thing even be desirable?

This particular counterargument hinges on two closely related questions: if you (singular) were able to remember everything, how could you then select what is important to you? And if we (plural) were able to access the totality of all possible meanings, how could we ever hope to find anything that is in fact meaningful to us? The former is the personal archive at the turn of the millennium, Bill Gates’ perfectly documented life, Gordon Bell and MyLifeBits. The latter is the sum of all knowledge - as dreamed of in the early age of print - and the projection of what the Web might soon become, the Googleian library of everything.


Jorge Luis Borges imagined this first. In two short stories, The Library of Babel and Funes, the Memorious, first published in 1941 and 1942 respectively, he conducted both thought experiments, and concluded that total recall, either in the form of an individual’s capacity to remember everything or of a truly universal library, is a most troubling gift.

Ireneo Funes, paralysed by a horse-riding accident that also gave him the ability to remember every instant of his life in the most minute detail, lives unhappy and neurotic, unable to forget himself long enough to fall asleep, and obsessing about the exactness of experience and the inexactness of language:
it was not only difficult for him to understand that the generic term dog embraced so many unlike specimens of differing sizes and different forms; he was disturbed by the fact that a dog at three-fourteen (seen in profile) should have the same name as the dog at three-fifteen (seen from the front).
Meanwhile the Library of Babel, indefinitely, maybe infinitely vast, perplexing, lit by a light (the search engine?) that is ‘at the same time insufficient and incessant’, contains every possible book, and hence
everything which can be expressed, in all languages. Everything is there: the minute history of the future, the autobiographies of the archangels, the faithful catalogue of the Library, thousands and thousands of false catalogues, a demonstration of the fallacy of these catalogues, a demonstration of the fallacy of the true catalogue, the Gnostic gospel of Basilides, the commentary on this gospel, the commentary on the commentary of this gospel, the veridical account of your death, a version of each book in all languages, the interpolations of every book in all books.
which sounds exhilarating enough. But in practical terms, the chances are overwhelming that the next book you’ll pick up, and the next one after that, and so on for the rest of your life, will be a jumble of orthographic symbols that neither you nor anybody you know could make any sense of, a point illustrated by the narrator’s remark that a book very much consulted in his zone is ‘a mere labyrinth of letters, but on the next-to-the-last page, one may read O Time your pyramids’.

Once again, as in the case of Plato v. writing, we are not asked to take either of these scenarios too literally. Borges was a prodigious scholar and a keen student of the ancient arts of memory; and by the time he went on to become the director of the National Library of Argentina, his first decree certainly wasn’t to get rid of the books. Both stories are thought experiments, limit cases (although there is at least one remarkable example of a real-life Funes), whose function is to sensitise us to the limits of memory and comprehension, educate us to think of our individual and interconnected minds, of our singular and plural cultural capacity, as resources that are vast but not limitless. Armed with this understanding, we may question for instance what Jake referred to last week as ‘the warehousing model’ of storing our digitally encoded memories, choosing instead to focus on preserving the richest content along with its dynamic connections. And we may reflect on the bits of our lives that are worthy of an archive, and those that are best let go and happily forgotten.

As Katherine Hayles points out, cybernetics has educated us to think of information as a thing without a body, weight, or material dimensions, as combinations of symbols that simply exist. But it is not so: the symbols need to be encoded somewhere - in a mind, perhaps through conversation or recitation, or on a hard drive, on a page, in the vault of an ancient Egyptian pyramid. None of these material supports will last forever, all of them need to be accounted for. Developing an ecology of memory involves thinking about the value of subtraction, of economy, of sometimes talking less and writing less (says the guy who just started a blog). In this regard, Borges’ work ought to be praised also for its extraordinary, exemplary, generous parsimony. Other writers would have killed for a tenth of his ideas, and to each of them they would have devoted a trilogy. In the original Spanish, Funes and The Library of Babel run to fifteen or so pages between them, or a little over five thousand words. That, too, is genius.



Jorge Luis Borges. ‘Funes, the Memorious.’ In Fictions, translated by Anthony Kerrigan. London: John Calder, 1965. pp. 99-106.
‘The Library of Babel.’ In Fictions, pp. 72-80.
Both stories are available on the Web in different translations.

Katherine Hayles. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature and Informatics. Chicago & London: The University of Chicago Press, 1999.
This is one of the books I'd take to a desert island, along with How to Survive on a Desert Island.

Tuesday, September 16, 2008

The Platonic Half-Volley

With apologies to Katherine Hayles.

Any discussion of memory and technology needs to account at some point for That Thing That Plato Once Said. There is no way around it, so we might as well deal with it right away.

Here’s the old chap, by the way, in a file photo taken by Raphael and posted at the Vatican.

Although my favourite version is the preparatory cartoon at the Pinacoteca Ambrosiana in Milan.

The Thing That Plato Said, or rather what he had Socrates say in one of his dialogues by way of a parable, is that the invention of writing would bring about the end of human memory: people would begin to rely on the written text instead of the contents of their own minds, externalise their memory to the point of atrophy. And further: the knowledge acquired from books would be degraded knowledge, not worth knowing. As it has been variously translated into English: the ‘conceit’ or ‘appearance of wisdom’, rather than wisdom itself.

This is clearly the mother of all contrarian points of view: with the benefit of twenty-four centuries of hindsight, we simply know that Plato was wrong, that writing – far from making fools of us all – has in fact enabled us to sustain and develop an ever more complex culture, as well as preserving the works of, yes, Plato himself, who unlike his teacher Socrates chose to write his thoughts down. But the parable still resonates because of the widespread feeling, not many years after the introduction of another paradigm-shifting communication technology, the computer, that we are also in the midst of a reconfiguration; and while most of us see the potential to do more things better, some are anxious that we may be in for a few unpleasant surprises, and worried that we cannot control or fully comprehend this new reality.

The relevant passage of the Phaedrus (274e-275b) has been the subject of much learned commentary, most famously by Jacques Derrida in 'Plato’s Pharmacy', an essay that spawned its own cottage industry of critiques of the critique; my particular take-home point there: Plato makes a whole lot of neat distinctions and assumptions that can be shown to be deeply contradictory. Walter Ong in Orality and Literacy and Eric Havelock in Preface to Plato offer other classic discussions of this perplexing invective (1). More recent treatments, post-advent of the Internet, can be found in Erik Davis’ Techgnosis and Darren Tofts’ Memory Trade, two books I couldn’t praise highly enough (that’s why I link to them, if you care to look to starboard).

But, as always, I am interested in a broader range of discussions in the fullness of the public domain, down (if you think that’s the way it goes) to the level of chatter in newsgroups and blogs such as this one. A Google search of the terms plato phaedrus memory computer returns 2,710 hits. Adding matrix brings the number down to a still respectable 675. Restricting the search to Google Scholar, to get rid of non-peer-reviewed schmucks such as yours truly, returns 1,220 and 223 documents respectively. Why did I add matrix? Because that film more than any other text is a sure marker of the angst that permeates the discourse of computing, subjectivity and the real, in which memory is most often embroiled (in fact, removing the search term memory doesn’t increase the numbers by much). And also, a bit cheekily perhaps, I’ve now added the magic word to this page. Web analytics tell me most people found this essay of mine because they were interested in The Matrix. And I like being read.

The numbers tell us that those four or five words are used together quite a bit, but not to what end. So I’ve done a random assay, taking the top result on every fifth page of the largest sample. And there are some false hits, predictably – a page of random quotations, another of theatre reviews in which the terms occur in unrelated pieces; others point to discussions that occur elsewhere, in university courses and suchlike. But the majority of the sample is representative of the enduring resonance of the Platonic argument, in spite of (or perhaps because of) how integral computers have become to the work of memory and knowledge. Thus for instance Tim over at SansBlogue links to an article by Nicholas Carr in The Atlantic entitled 'Is Google Making Us Stupid?', commenting on the reaction to it and adding a thoughtful analysis of his own; in ‘Britney? That’s All She Rote’, Jenny Lyn Bader of The New York Times seizes on Britney Spears’ latest oops moment - forgetting the lyrics she was meant to lip-sync at last year’s MTV awards - to draw a broader critique of a society in which ‘it’s gotten easy to forget to teach young people how to remember’; while in her blog, Girl Meets World, then-journalism student Amanda Cochran grounds the discussion in relation to the issues faced by her chosen profession, singling out for praise the following, exemplary contention of John Churchill’s in ‘What Socrates Said to Phaedrus’:
It is not that facts are not valuable. It is that in addition to possession of them, which is information, we need a sense of how they are connected, which is knowledge, a sense of how they came to be and how we came to know them, which is understanding, and a sense of what they mean for us, which is wisdom.
Adding matrix to the mix makes the sample veer towards the more metaphysically minded critiques, and the more extreme interpretations of the reconfiguration imagined by Plato; it also rarefies the air a bit, increasing the number of articles hosted by universities and subscription-only journals. But the general tenor of the texts remains broadly the same, and is consistent with my experience of researching this topic on the Web over the last several years. Back in 2000 as much as now, I would have expected to find more out-of-hand dismissals, like this one, by the anonymous author of a wiki that lists Plato’s argument as proof that the ‘new media are evil’ trope is ‘older than dirt’. What one finds instead for the most part is a readiness to engage, a willingness to explore the distinction between information and knowledge, between automatic recall and meaningful remembrance. Most commentators seem to have taken on board the lesson that Neil Postman derived from the Phaedrus in Technopoly: namely, that new technologies not only introduce new words, but change the meaning of old ones; and that it is incumbent upon us to understand what memory has come to mean, and how we can remember better, as opposed to just remember more. All the while knowing that, regardless of our protestations, we are operating from within the new paradigm, which imposes strict constraints on our agency and capacity for meaningful critique. As Ong writes:
One weakness in Plato's position was that, to make his objections effective, he put them into writing… The same weakness in anti-computer positions is that, to make them effective, their proponents articulate them in articles or books printed from tapes composed on computer terminals [this is 1982 – no Internet back then]. Writing and print and the computer are all ways of technologizing the word. Once the word is technologized, there is no effective way to criticize what technology has done with it without the aid of the highest technology available.(2)
Ong understood that ‘writing restructures consciousness’, and that once the word is technologized, that transition cannot be undone – except perhaps post-apocalyptically, and nobody wants that. This holds true for memory as well. We could research the Rhetorica ad Herennium and learn the ancient ways of memorising long speeches by building palaces in our heads, or practice the art of Renaissance mind theatres and wow the audience of the MTV awards with our flawless fake singing; but we cannot effectively revert to orality, really know what it was like to operate in a pre-literate world. The oral world of the Homeric bard or the Maori storyteller is gone. And the pre-computing world of our parents or perhaps grandparents is on its way out, too – witness the efforts of those who seek to equip every child on the planet with his or her own laptop, thinking perhaps that with technology alone will come education, employment, prosperity, and a safe and functioning society. But nothing is quite so simple.

There is a further complication, another reason why understanding and reforming memory is so urgent: a time of change is also a time to remember; we need to document the now. The twentieth century brought us another great invention, the motion picture. Well,
80 percent of all silent films made in the United States are gone without a trace. Fifty percent of films made in the nitrate era (that is, before 1950) have also perished. Among those extant, a significant portion is not well preserved. Given that the materials that have vanished were not well documented at the time of their creation, the full extent of this loss will never be known.(3)
The early documents of the Internet, the first hypertext narratives, the works of digital art and of amateur scholarship, are also in danger of being lost. Committed to digital memory in what must have seemed at the time a safe place, they are threatened by material decay of the storage media, software obsolescence, loss of the dynamic links that made them meaningful, and a host of other problems. The important work that needs to be done to rescue them from oblivion requires not only very able technologists, but also a willingness to question notions of information and wisdom, so that those documents can re-enter the public domain and be kept alive by intelligent connections – the same connections that are formed in the pre-literate minds of children who are learning to remember. The Thing That Plato Said, then, serves as a cautionary tale and a reminder, that we should look back, and all around us, and question just who it is who is empowered to produce knowledge and to decide what needs to be preserved, and in what way.





(1) I go over these responses to Plato in some detail in the introduction of my dissertation.
(2) Walter J. Ong, Orality and Literacy – The Technologizing of the Word (London and New York: Methuen, 1982), p. 79.
(3) Stephen G. Nichols and Abby Smith (eds.), The Evidence in Hand: Report of the Task Force on the Artifact in Library Collections (Washington: Council on Library and Information Resources, 2001), p. 5.
Other references
Derrida, Jacques. ‘Plato’s Pharmacy.’ Translated by Barbara Johnson. In Dissemination. London: Athlone, 1981, pp. 61-172.
Churchill, John. 'What Socrates Said to Phaedrus: Reflections on Technology and Education.' Midwest Quarterly 44:2 (2003). 11p.
Havelock, Eric. Preface to Plato. Oxford: Basil Blackwell, 1963.
Plato. Phaedrus. Translated by C.J. Rowe. Warminster: Aris & Phillips, 1984.
Postman, Neil. Technopoly: The Surrender of Culture to Technology. New York: Alfred A. Knopf, 1992.

Tuesday, September 9, 2008

The Trouble Started with a Google Search

From this and similar experiences, Garner realised that people with dementia frequently go back to old memories in order to make sense of otherwise incomprehensible situations.(1)

In today’s news, United Airlines shares suffered heavy losses after business portal Bloomberg passed off the airline's 2002 bankruptcy filing as 2008 news. According to the AP report picked up by the New Zealand Herald, the stock lost over a third of its value within five minutes of the story being posted on the Bloomberg terminal and fell as low as $3 (a 75% collapse in value) before the Nasdaq halted trading. Even after the story was clarified and trading resumed, the airline shares still closed over 11 per cent lower than the day before, with a loss of value in the vicinity of 175 million dollars US.

And yes, it all started with a Google search. As the AP explains
A staffer at investment newsletter Income Securities Advisors Inc. in Miami Lakes, Florida, entered the words "bankruptcy" and "2008" in the internet search engine, according to President Richard Lehmann. ISA does this all the time, searching for overlooked information about companies in trouble, he said.

Lehmann said the top item returned in the Google search was a story about United's bankruptcy filing that appeared on the South Florida Sun Sentinel website. The story referred to United's filing on Monday morning. United had filed for Chapter 11 protection on Monday, Dec. 9, 2002. Lehmann said that date was not on the story.
To cut a short story even shorter, the staffer posted a summary on the Bloomberg information service, without so much as checking another source or contacting the airline, and the rest is so much frantic history on the trading floor.

What does this story teach us? We’ve known for quite some time about the power of rumours to lay temporary waste to the financial markets. And lazy journalists (or, in this case, researchers) are not a fantastically new breed either. What seems designed to give us more pause is that somewhat arresting line in the AP story: The trouble started with a Google search.

Of course Google - the company, its technology - is no more responsible for this than automaker X will be when the next idiot full of booze drives off the road and into a tree in a car made by that company. Or is it? Without cars, booze or trees we’d have no accidents involving a combination of the three. And since a tree is unlikely to attack you, and alcohol by itself cannot propel you with the necessary speed to cause the damage, the car might just be the most dangerous ingredient of the lot. Of course one cannot ignore that a person was behind the wheel; but it's hard to deny that the vehicle is an enabling technology. Cars cause car accidents in the same way that guns kill people, which is after all why most societies impose regulations on both.

Google's lightning-quick access to the well of information too, it seems, can all but bring down an airline, although you still need a human idiot to punch a few keys to get you there. But is this really a new phenomenon? The rumour happened to be concocted via a lazy Google search and spread throughout the markets via the Web, but rumours existed before either the Internet or search engines.

True. But think for a moment about the pace of today’s events. In less than the time it takes to read this post, United Airlines shares lost a third of their value. Suddenly, that single piece of regurgitated memory from 2002 became today’s news, and it was everywhere. Almost eight hundred million dollars vanished, only to reappear in large part - and in different hands, one suspects - by day’s end. All because an old report resurfaced, popped up like a cork in the sea of information. Consider too that Google plans to extend its newspaper archives to cover the last 244 years, threatening to multiply the number of corks out there by several orders of magnitude. Perhaps the change is not one of degree, but something more radical.

Referring to Moore’s Law, which predicted in 1965 that the number of components on a chip would double every year, Stewart Brand has noted that ‘[a]ccording to a rule of thumb among engineers, any tenfold quantitative change is a qualitative change, a fundamentally new situation rather than a simple extrapolation’(2). At these speeds, the meaning of communication itself changes. And so too do its possibilities and dangers.
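The engineers' rule of thumb is easy enough to make concrete with a little arithmetic of my own (a back-of-the-envelope sketch, not anything Brand works through): if a quantity doubles every period, a tenfold change - Brand's threshold for a qualitatively new situation - accrues after log₂(10), or a little over three, doubling periods.

```python
import math

# A back-of-the-envelope sketch: if a quantity doubles every
# `doubling_period` years, how many years until it has grown tenfold,
# crossing the engineers' "qualitative change" threshold?
def years_to_tenfold(doubling_period: float = 1.0) -> float:
    """Years for a quantity doubling every `doubling_period` years to grow 10x."""
    return doubling_period * math.log2(10)

print(round(years_to_tenfold(1.0), 2))  # annual doubling, as in Moore's 1965 prediction
print(round(years_to_tenfold(2.0), 2))  # the later two-year restatement of the law
```

In other words, at an annual doubling rate a "fundamentally new situation" arrives roughly every three and a third years, which goes some way towards explaining why the ground keeps shifting under our feet.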







This weblog is about memory and technology. The term ‘technology’ has to be understood in the broadest possible sense - in a future instalment I shall argue for instance, and in very august company, that the alphabet itself is a technology. So most acts of committing to memory and indeed all acts of transmitting memory are technologically mediated by one or more languages, and a whole lot of apparatuses. The Internet is just another technology, in this regard, and it won’t be my sole concern; but it’s also one of the most topical and, let’s face it, interesting ones, as of right now. In discussing it, I shall indulge in many more stories like today’s, instances of dysfunction that undercut the experience that most of us have of using the Web and actually finding and exchanging all manner of useful information, without causing major financial collapses in the process.

There are two main reasons why I find these stories interesting. Scientists who study memory and the brain know that in order to figure out how we remember, it is useful to study individuals who can’t, be it as a result of a trauma or illness or genetic defect; similarly, I believe that developing memory practices commensurate to the sophistication of our communications technologies requires paying attention not only to the most exhilarating prospects on offer, but to the failures and the glitches, too. Secondly, and no less importantly, I believe that the anxiety and the concerns that these stories often touch upon or uncover are a most useful reminder of the material repercussions of our digital acts, and of the people and things and data that are left out, unable to remember or be remembered. A lot more on this, I’m afraid, in future posts.

One last thing: why the blog name, I hear nobody ask? I’m not going to go into that right now, except to credit the lovely artwork by Bert Warter, from the 1949 edition of Bruno Furst’s classic Stop Forgetting.




(1) Oliver James, 'My Mother Was Back. The Lights Were On', The Guardian, Saturday 2 August 2008.
(2) Stewart Brand, The Clock of the Long Now: Time and Responsibility (New York: Basic Books, 1999), p. 14.