[Screenshot: Google autocomplete predictions for "god is"]

For all of its chilly white space and algorithmic anonymity, it struck me recently that Google autocomplete is one of the more communal experiences I take part in on a day-to-day basis. You’re no doubt familiar with autocomplete: type in the beginning of a question or statement and Google will offer a set of predictive endings, based on search volume, to speed you toward the desired information. At first blush, this functionality appears to be nothing more than a kind of streamlining, an attempt to shave valuable milliseconds from billions of daily search processes — that is, nothing more than a striving after efficiency. But in the rush to get to the bottom of our own obscure or banal questions, it’s easy (and sometimes necessary) to ignore what these predictive searches really are: a kind of social cipher, an anthology of digital anxieties, a collective dreaming. Over the years, Google has become much more than a search apparatus to me; indeed, it’s now an essential extension of my ability to learn, grow, and function in creative and professional endeavors reliant on instantaneous access to information. Recently, however, it has become something else entirely: a weirdly, beautifully human window that is staunchly anti-informational. Poring over these predictive strings — like billions of messages in bottles — there is a sense that I’ve stepped outside the realm of analytical information and into the warm, beating, existential heart of the usually clinical Google experience.


[Screenshot: Google autocomplete predictions for "why do we"]


Let’s first state the ostensible function of autocomplete in no uncertain terms. Google describes it thusly:

Autocomplete predictions are automatically generated by an algorithm without any human involvement. The algorithm is based on a number of objective factors, including how often others have searched for a word.

The algorithm is designed to reflect the range of info on the web. So just like the web, the search terms you see might seem strange or surprising.

That last sentence resonates with my own experience — “the search terms you see might seem strange or surprising” — in that so often the desired search and the predictive strings are entirely at odds with one another. It’s also interesting to me that Google emphasizes that the algorithm is generated “without any human involvement”, despite the fact that, firstly, the algorithm was created and implemented by humans and, secondly, its logic is predicated on the profound and petty desires and questions of its human users. It is a tool made by humans, for humans, and populated by the capillaries of speculative human consciousness. What might initially come off as sterile and mechanistic is actually a kind of growing, learning organism — one which our fingerprints are all over. This is the side of Google that we, or at least I, rarely think about, much less closely consider: the creatureliness of its sleek and gleaming product.

Dustin Illingworth writes a monthly column for Full Stop on the intersections of technology and literary culture. Based in southern California, he is currently at work on a debut novel and a collection of literary and cultural criticism.


And yet the evidence is readily apparent with every search. It’s impossible to use Google without seeing what untold numbers of others have done before you, a psychic imprint and social barometer of incredible complexity. What does this do to the individual query? That likely depends on the individual doing the searching. There is, of course, the danger of a despairing white noise, the feeling that one’s individuality is somehow abrogated, as personal questions elide into the sterility of an unthinkably vast data set. It can be unsettling to see a specific, chronic problem — one that is intimately known, intimately one’s own — become subsumed beneath the monolithic collection of Google’s algorithm. But there is also, I feel, comfort to be found here, an expression of the communal Search within the search. It is altogether too easy to utilize Google in an imagined vacuum, a tool of disaffection, isolation, even narcissism. But in cultivating an awareness of the profoundly human face of its service, we bridge some kind of perceptual gap: Googling itself as a social fabric, a social act.

In this sense, Google can act as something of an empathy generator despite its best efforts to the contrary; indeed, while performing these recent searches in preparation for this piece, I’ve found myself thinking of Nathanael West’s criminally underread masterpiece Miss Lonelyhearts, in which the otherwise unnamed titular protagonist, a newspaper advice columnist, is shaken by the broadly troubled, sometimes agonized letters of his readers: nearly incoherent texts comprising a kind of collective existential moaning. This is not to say that Google is used primarily by individuals in torment; however, it is perhaps fair to say that Google serves to mediate longings, weighty and trivial alike. Moreover, if one is creative and curious enough, one can provide Google with existential prompts with which to peer into a shared, searched psyche: “why do i”, “how do i”, “god is”, “god isn’t”, “why can’t i”, “when will i” — each predictive ending provides a miniature window into our broadest fears, our secret shames, our lusts and fantasies. It seems to me that these predictions comprise a kind of secret cultural text, hidden in plain sight, a tome that only shows itself with the proper prompting, one that reveals the outlines of a fraught and questing collective character.

But so, too, can it expose the troubling tenacity of our bigotry. After all, Google is not a utopian vision of a digital commonality, or at least not entirely; it’s also a snapshot of a given historical moment: its monstrosities and cultural abscesses. Prompt Google with two harmless words — “women should” — and its algorithm, built by an unimaginable volume of like searches, will exhibit the pervasiveness of our oft-denied, oft-ignored sexism:

[Screenshot: Google autocomplete predictions for "women should"]


Or, equally repulsive, try prompting Google with “gays are” — and see a nation’s unbridled fear and hatred manifested in a handful of predictive strings:

[Screenshot: Google autocomplete predictions for "gays are"]


These are merely two examples of Google’s algorithm creating a kind of mirror, albeit a shattered one; or, rather, a fun-house reflection in which our own cultural grotesqueries are presented in all their regrettably distorted reality.

The latent humanity present in the Google experience, though, seems, finally, as varied as its users: equally compelling, heartbreaking, and horrifying, its patchwork portrait is only as complete, and as incomplete, as its icily effective algorithm. Who and what are we? What do we want? Who do we love and who do we hate? Where are we going? It turns out that in Google, as in life, the answers are often best found in the searches themselves.
