Short on Time

“I share enthusiasm for the emancipatory potential of technoscience to create new meanings and new worlds, while at the same time remaining highly critical. And this involves redefining genuine inventiveness as not just about speed and novelty, but about challenging the assumptions that permeate our scientific discourse. To put it simply, it means thinking about social problems first, and then thinking of technical solutions, rather than the other way around. For example, crunching big data and then looking for applications for it, as if crunching big data isn’t a political act in itself. But we can’t do this while the people who design our technology and design what’s made are so unrepresentative of society.”

Prof. Judy Wajcman; London School of Economics, Dept. of Sociology

One of the books on my reading list is “Pressed for Time,” about the proliferation of supposedly time-saving technology and how, ironically, these advancements leave us feeling even shorter on time. While researching the book, I came across this lecture from the London School of Economics that, if you have an hour (which you probably do), I highly encourage you to watch.

Prof. Wajcman walks through the principles of her book and her key themes and findings. Her main point is that technology itself is not the culprit. It is certainly part of the societal change we’re in the midst of, but our sense of accelerating, diminishing time has much more to do with the social constructs and relational expectations we’ve placed on each other culturally.

I pulled out this quote not because it encapsulates the book, but because it cuts to the core of some of my personal disillusionment with modern definitions of “innovation” and “inventiveness.” She is spot on that the tendency to laud anything novel as valuable is far too common and (more importantly) gravely misguided. Newness does not equate to value, yet we are quick to assign value and “goodness” to things that are new, simply by virtue of their novelty. This is dangerous because it distracts us from the things we could be creating that actually have real social value and the potential to solve difficult, systemic societal ills. In economic terms, it’s the opportunity cost of the time we spent on the novel thing.

The other piece that resonates is her comment on speed. In the context of her talk, she brings up speed in the sense of “X innovation helps you do Y activity faster,” but I’d offer another type of speed that we equate with inventiveness and ingenuity, one that doesn’t deserve those titles: the speed that comes from reusing or re-applying something originally intended for another purpose. The new application of a non-novel technology is often rationalized with faulty hindsight and tagged as “innovative” because it came about quickly, when really it is good execution of an otherwise unapplied idea or functionality. Often, these things are a natural next step on a vector of development that the “innovators” and the doe-eyed tech media blindly ignore due to confirmation bias.

For example, Uber is often heralded as a massive disruptor of the taxi industry. That stance is easy to agree with if you look at Uber solely within the context of the technological advancements that industry has seen over the last twenty years. Hybrid cars and credit card readers were the major margin-improving advancements, and mobile payments and bookings were not a logical next step to arise from that milieu. But if you look at Uber as an extension of the sharing economy, built on the same principles and technological capital as companies like Craigslist, TaskRabbit, and ZipCar, it is absolutely a logical (i.e., not particularly inventive) next step: applying the sharing economy to the cab industry.

Whether you agree with my sentiment here or not, the podcast/video is a treat, including the second speaker, Genevieve Bell from Intel, an anthropologist with a fascinating and compelling take on the limitations of algorithms. She very articulately discusses the challenge of thinking about what I call the “anti-algorithm,” or “things you’d hate” recommendation engine. Her most salient point is that algorithms rely upon the past to create experiences that match those which have already occurred.
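To make that last point concrete, here is a toy sketch of my own (not from the talk, and every name and data point in it is invented) of a bare-bones content-based recommender in Python. It scores candidates purely by how much they resemble what a user has already consumed, which is exactly why its output can only echo the past; inverting the ranking gives a crude version of the “things you’d hate” anti-algorithm.

    # Toy sketch: a recommender bound to the past. All data is hypothetical.
    from collections import Counter

    # Hypothetical viewing history and candidate catalogue, tagged by genre.
    history = ["heist-thriller", "heist-thriller", "spy-thriller", "sitcom"]
    catalogue = {
        "Another Heist Movie": "heist-thriller",
        "Yet More Spies": "spy-thriller",
        "Slow Nature Documentary": "documentary",
        "Experimental Opera": "opera",
    }

    past_tastes = Counter(history)

    def score(title: str) -> int:
        # Score a candidate purely by how often its genre appears in the past.
        return past_tastes[catalogue[title]]

    # Conventional recommender: rank by resemblance to what has already occurred.
    recommendations = sorted(catalogue, key=score, reverse=True)

    # Crude "anti-algorithm": invert the ranking to surface what the past
    # predicts you would hate.
    anti_recommendations = sorted(catalogue, key=score)

    print("You might like:", recommendations[:2])
    print("You would 'hate':", anti_recommendations[:2])

Even the anti-algorithm here is still anchored to history; it just reads the same past data backwards, which is part of what makes Bell’s thought experiment so tricky.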
