Thinking Inside the Box or How Computers Shut Down Thought

20 Oct

Let’s start with the caveat.

Only fools and media studies students truly believe that ‘the medium is the message’ in any absolutely literal sense.

Thus not all telephone calls are, primarily, assertions about the centrality of telephones in contemporary society.

That would be silly. Many actually are what they appear: invitations to tea, requests for sex or the loan of a power-drill, narratives of bus-bound teenage boredom, or someone singing ‘Happy Birthday’. Not all newspaper articles are ‘about’ the power of newspapers – if they were, they would be too tedious to form part of that power. Not all universities produce research and teaching programs solely designed to benefit elite groups connected to State or corporate power. (Honest). Nor, likewise, are all emails, blog entries or pieces of music (etcetera, etcetera) created through the use of computers actually merely about computers. Or about the internet. Or about html. Or about whatever you happen to think is the medium in this instance.

Caveat over. Here comes the intentional bit of the message.

Computers – and more especially the fetishisation of the use of computers in communication, argument, research and the formation of our sociable selves – serve to shut down thought in a lot of ways. Most of these, we’ll note in passing, are inimical to poorer people, and to people who favour greater material equality.

 

The Boxiness of Computers

The literal point of my title is significant. So far, computers have hardly reached the public as anything other than boxes. Pretty-coloured boxes. Boxes with holes for wires of many kinds. Boxes with screens, and boxes with screens connected to them. Boxes on desks. Boxes in vast rooms full of similar boxes. Boxes in spaceships and in submarines and shops. Maybe someday they’ll be something essentially different. But for now, really, they’re pretty much all boxes.

Which makes it amusing that it has only been since the information technology revolution (or should that be ‘counter-revolution’?) that the phrase ‘thinking outside the box’ has become ubiquitous. Our thought has become defined by the limits of our boxes, and so we pretend to escape by claiming to think outside them.

At least in advertising and recruitment slogans.

In fact, we have become used to living inside the box.

Whereas students in earlier eras could work on essays and learning on loose sheets of paper, outdoors and in company whenever the weather allowed, student life around the wealthier parts of the world now revolves around ‘study bedrooms’. Internet connections, rather than prior reading or a pile of physical books, supply the sources of quotations. And access to learned journals often exists solely or primarily in electronic form. Thus informal students not enrolled at an educational institution face serious extra obstacles to their learning and thinking, especially if they can’t pay their Internet Service Provider.

So who’s going to write the defining new text on the tendency to inherited and growing inequality under capitalism? Goodbye, Karl Marx, penniless exile and bloody revolutionary. Hello Thomas Piketty, Paris School of Economics professor and global publishing phenomenon.

In the more significant case of lesser thinkers (after all, lesser thinkers are the great thinkers of tomorrow) the essential boxiness of computers matters more still. Scratch ‘boxiness’, to be precise. That implies three dimensions. I actually interact visually with a computer solely by looking at a single two-dimensional plane. In reading a book, I can flick over the page to check how an argument continues, as well as flicking back to see where I’ve just been. Indeed, as a skilled book reader I can flick through a book at hundreds of pages a minute to find something new and interesting in it even though I have not decided in advance what I am looking for and nobody is trying to sell me anything. I will also pick up some of the information that I flick past, even though it has not been preselected to appeal to me.

That is to say, I can use it to actually think ‘outside the box’. Never mind coming up with ‘innovative’ answers to systemically predefined problems, I’ve got a mechanism for generating new problems. And that mechanism is still myself.

A connected limitation of the computer screen is affecting me right now (on first writing, that is. I’ll also return later to drafting issues that arise from computer usage). I’d like to check whether my argument flows nicely throughout what I’ve written so far. In the eras of foolscap or typewriter, I’d be able to see the whole thing at a single glance. Whereas now, at each moment of writing, I have the choice of scrolling, squinting or surging onwards whilst blithely ignoring the mess behind me.

But if I was really playing it safe, I’d restrict my writing to whatever would fit on a single page.

 

 

The Twitter Mind

Or less.

Perhaps, for example, I’d say it in 140 characters. Or by providing a link to what someone else has said or thought.

The only trouble with this is that it would restrict me to saying things based upon values shared with my message’s recipient. I love simplicity, but simple things that are genuinely worth saying usually carry certain difficulties, or require definition, context or serious explanation.

With that willingness to explain and investigate and expand, you can get stunning bits of simplicity like this, carrying with it the power to understand and change the universe:

E = mc²

Without it, you get the most popular tweet since the creation of twitter:

If only Bradley’s arm was longer. Best photo ever.

I shit you not.

Economy of expression is not just for idiots and advertisers. There are serious arts that depend on it. Among them are haiku, listenable music, and Noh theatre. (And profound political commentary in the clothes of sarcasm, as manifested in the preceding paragraph).  But brevity does not, in itself, actually have any value. It is only worth communicating something briefly if that thing is big in its implications, and if communicating it in a compact form signals some kind of linguistic prowess.

After all, if brevity was the sole aim of the communication, it would already have been achieved by saying nothing at all.

This, however, would be at odds with the core ideological value of internet use. We must appear, always, to be there.

 

The Illusions of Being Connected

It is not inherent in the nature of the internet, or of computers, to force us to check back incessantly for messages whilst we are seeking to do other things, even to the point of neglecting what we first switched the computer on for. Nor is it in the nature of the technology itself to make us censor or misrepresent our own thoughts in order to court popularity. But most of us, if we are honest, have done all of these things at some point under the impression that the technology was forcing its nature upon us.

I’d like to suggest that this is because of two connected illusions. The first is the illusion of global relevance. In this illusion everybody we know online, wherever they are and whatever they are doing, is linked to us. And not merely linked, but linked in a way that we could potentially act upon. As long as we treat that action as urgent. As long as we check in. This illusion particularly feeds upon social networks and blogs that feature a ‘like’ button (more on this later), but is also built upon the pervasive contemporary idea of instantaneous communication changing the very nature of who we are. Whole ‘philosophies’ have been built upon the idea that the world is speeding up – even though there has been no significant increase in several generations in our speed of speech, cutting hair, writing, typing, reading, cooking, walking to the shop or travelling by train from Fort William to Gravesend. Even increases in the record speeds of running and swimming amongst the very fastest people on Earth move in tiny increments, and leave room for the sneaking suspicion that the true champions of the world lived millennia before ‘Sports Science’ was even a thing.

The second illusion is that we exist as part of an online community, and that this community is vital and new and exists by constant renewal. This illusion is fostered by the word ‘community’ as applied by social networking sites. In fact, only a minority of these online ‘communities’ end up creating communities of action or interest; most merely reflect ones that already function in the wider world. That is, they function just as minute books, hilltop meetings and whist drives once did – whilst substituting petition signing and the ‘sharing’ of other people’s pictures for the action that emanated from those face-to-face meetings. The result is that simply passing on a picture of a pretty sunset with a facile message about how nice it is to be nice has become the most popular form of conscious political engagement.

Thus every political activist (and would-be political activist) has witnessed online ‘communities’ that appear huge, until somebody needs to act outside the virtual world. Political communities of engagement are often doing less than similar communities would have done in earlier eras of instantaneous communication technologies (radio, telegraph, smoke signal, shouting through the neighbours’ back door, a smack in the mouth) and doing it slower. The events of the Arab Spring that brought the ‘twittersphere’ to global attention only happened because a very small number of people were willing to go first. And because one Tunisian man was driven to kill himself, very publicly, in a show of opposition to the ruling system. If you’ve read about Vietnam, you’ll know no internet is required for this to happen.

 

Don’t Write, Just Publish

The blogosphere is filled with reflections upon blogging. A few (like this entry, hopefully) have a broader aim than simply seeking attention. But most are the writings of people who perceive writing to carry status, but have not yet written anything. Entries therefore talk redundantly about how many people like the writers’ entries, and how they feel they’re doing ever so well for a new blogger. (Or a “relatively new” blogger, after a few months or years). These typically alternate with short, bad poems about feelings.

 

You Know The Ones

They

Look

Like

This.

And

express

shallow

positivity.

With meaningless

indentations.

Under pictures of photoshopped fitties posing.

As pixies.

 

You’ll be aware of a wide variety of bad art and writing that is more common on the free parts of the internet than in commercially published work. What unites these in the way that interests me right now, however, is that they share the quality of regularity. They have been published not from either inspiration or perspiration, or even because of the diktats of painful conscience, but solely in order to be published. The ‘writer’ or ‘artist’ has encountered the advice that they should publish regularly in order to build an audience. And to validate themselves, they need that audience. Without it, they do not exist even to themselves. For these people – as with those who treat ‘social networks’ in the same way – the medium actually is the message.

Alongside the impetus to regularity is the absence of an editor. This can be a good thing. Other things being equal, more information is conveyed in more words than in fewer, and editors of most publications are in the unfortunate situation of restricting skilled writers through word limits that will not allow them to deal accurately with their subject area.

Conversely, it means that bloggers and internet users more broadly are much more likely to simply finish and hit ‘publish’ or ‘send’. After which they are free to regret the decision, if they have been foolish enough to include anything specific or interesting enough that it could be regretted.

Stephen King, in On Writing: A Memoir of the Craft, suggests that privacy is an important element of creating a story, and recommends using a trusted reader to test what works and what doesn’t. Not simply listening to what they say about it, but watching them as they read, and taking note of how they are really reacting to what is written. Do they laugh? Cry? Yawn? Storm out in a rage because of what happened to their favourite character? Forget to smoke their fag and burn their fingers? And where in the story does that reaction happen? We can only do this if we trust that reader and know them minutely. And if they know us intimately enough to make criticisms and suggestions that will anger and irritate us, even as they make our work better.

Another blogger, unless we are stalking them via CCTV or taking over their computer’s webcam (which would be wrong, btw) can’t help us in the same way. Not even if they’re commenting on what we have termed a ‘draft’.

By allowing someone we don’t know to see it we are already publishing.

 

The Anti-Social Network

Most of us like to be liked.

This is especially true of people who claim not to care, who are thus generally asserting a claim that they deserve to be liked by people who – inter alia – are cleverer, prettier, sexier, better-dressed, richer, funnier, more literate, better-spoken, more street, better-travelled, cleaner, funkier, higher up in the world of perfume, garden furniture or crack-cocaine sales, whiter, blacker, holier, more talented, luckier, harder, more pacific, more enlightened, closer to Paris Hilton, richer in facebook friends, or more versed in using popular computer operating systems, espresso machines or flamethrowers.

Online though, such qualifications about who we wish to be liked by just disappear. The ‘real world’ and the virtual world are almost entirely at odds here.

In offscreen life it’s a bit weird (and tending towards the behaviour of a serial stalker, see above) to spend too much time telling multiple people you ‘like’ them. None of us want face-to-face friends whose sole conversational repertoire is ‘I like you’, or even ‘I like that’. As a result we have developed codes of non-verbal and verbal cues that can signal liking, disliking and nuanced neutrality of an infinite variety of possible types.

The most important of these codes, of course, is language. With its linked codes of irony, intonation, accent etcetera. Another is music. Another is art. Another is gesture. Another is money and how, who and what we choose to spend it on.

In social networks, however, it has become possible to avoid this essentially and unavoidably complex world most of the time. We are safe within a two-dimensional world. Our reactions to communication are essentially divided between liking and silence. Of course, it is possible to ‘comment’. But, overwhelmingly, comments tend to be brief, and to be far fewer in number than simple ‘likes’. The exception, of course, is abusive comments, which tend to be a little like a badly punctuated oversized ‘hate’ button.

It’s also theoretically possible to use social networks more creatively, and attempts to do this have become commonplace. Some suggest that respondents share a song or video or painting that’s meaningful to them according to a semi-randomly allocated letter. This is nice, and an avenue that might introduce many people to good (or bad) art or film or music – but it is not creative in the same degree as writing a song, making a film or picking up a paintbrush. For these activities you might be well advised to switch off your computer. Or, at least, your social network.

 

The ‘Productivity’ Counter-Argument

If you’ve ever thought about the amount of time wasted by people looking at facebook / pictures of kittens / surveys of ‘which superhero were you in a previous life’ on the internet, you’ll also have encountered the argument that this apparent timewasting is actually productive. The rationale is that it prevents people from concentrating too hard, becoming tired and stale etc. This is likely even to be backed with a link to academic research – possibly the first time that you have seen a reference used by this particular referrer.

Notwithstanding the academic link you’ve seen, this is horseshit. For (at least) five reasons, with the first being the most important.

i. The people who sent you the link to that article never do anything to a deadline.

So what if it’s anecdotal evidence? It’s just as true for you as it is for me. Let’s face it, even the people who wrote the article probably submitted it late.

ii. There are kittens in the real world too.

The relevant experiment would not simply compare productivity whilst being able, or not being able, to use social networks or computer-based entertainment alongside doing work. It would also compare both with productivity whilst taking breaks away from the computer. And with productivity achieved whilst being able to go and play with real kittens / puppies / real ale dispensing machines at the employer’s expense. And with the productivity achieved whilst doing the job the worker in question dreamed about getting when they were twelve years old and thought anything was possible.

iii. Bollocks to productivity. Take a break.

Or, to put it more politely… When an individual’s break from a task is to devote themselves to a similar set of activities redefined as leisure, they have not truly had a break at all.

Looking at your email – again – is not actually absenting you from work. You’re still there. And, in fact, you’re looking at your email partially because it looks, even to yourself, just like you’re working. Even, or especially, if you’re ‘working from home’.

And the fact that you look like you’re working means that everybody else you work with has to work just a little harder. Or at least, work a little harder at looking like they are.

 

 

But I need the computer…

If you really are working, however, there’s still a pretty good chance you need to use a computer, or a phone that is essentially a computer. The internet is, though over-hyped, as effective a research tool as some of those who hype it claim. And the computer has taken over the functions of card indexes and typists, as Nicholson Baker and Francis Fukuyama respectively have noted. And a host of other routine activities besides.

None of these shifts have left our ways of thinking and acting untouched.

i. “Just Google it”.

I was in a workshop recently for legal observers. These are people who serve protesters by collecting information about the (mis)conduct of the police. It’s a good thing, and they could use your help. (In the UK, they’re at http://greenandblackcross.org/) One of my fellow workshoppers suggested that shorthand would be a useful skill in this context, and said that she used it herself.

I asked her in a break if it would be possible to run a session on shorthand, or get some teaching about it. I write painfully slowly, despite being a relatively competent typist, so any help would be good.

You’ve already guessed the reply.

This person is frighteningly articulate in several languages, and is not afraid (obviously) to engage in anti-establishment thought and activity, even when that activity is dangerous or career-threatening. So this is not simply a conservative perspective on how learning happens, or a refusal to engage, or lazy reproduction of a buzzphrase. It is so much the current mainstream of thought that it has thoroughly infected the counterculture.

The new zeitgeist locates information online automatically, even (or especially) when that information is more easily available elsewhere. Or is in the head of the person being asked for the information. And namechecks and advertises a big, nasty, invasive data-mining conglomerate as a central component of that strategy.

(nb. The word ‘just’ in this phrase, which is the way I have most often heard it, looks initially like a harmless add-on. It isn’t. It’s a device for avoiding debate by making the instruction that follows it appear simple and easy. Imagine the sales impact Nike’s ‘Just Do It’ slogan would have had without the little ‘Just’ in there. ‘Do It’ is not a slogan that will gently persuade me to buy flash trainers under the impression that I will thereby become superfit and supersexy and enjoy hypermarathons in the rain. There is an episode of Mad Men where the slogan ‘Just Taste It’ is accidentally deconstructed at a corporate presentation, with the payoff coming out as an ill-tempered, bitchy, ‘Taste It Already!’ It lays bare the bullying undertone of ‘just’ beautifully. I like to think the writer, at the moment they completed that scene, had a mental image of Nike shoes being binned on street corners across America as their wearers woke up to the con.)

 

ii. Online journals

The best contemporary information and research is not on the free parts of the internet. I noted this earlier, but it bears repeating. And repeating. University tuition fees are not primarily there to ensure high-quality teaching or assistance or research laboratories or halls of residence or the right to go to the ball. They are now there to get the future rulers of the universe past the paywalls.

You may, if you lack imagination, want a ‘reference’ for this claim. Instead, let me propose an experiment.

Take any subject you are not currently a high-level expert on, and on which you know research is currently being conducted. Find out everything you can about it for several days, using only the facilities available to visitors without a library card or any student research privileges. Use your local public library. Do not photocopy or print out anything still under copyright. Do not hack (or pay your way into) paywalled sites. Do not ask anyone for help.

Now cross out everything that you learned from dangerously unreliable sources.

These include, amongst others:

Newspapers (or social media reposts) reporting on ‘scientific research’ without providing a clear overview of the study in question, including the journal it was published in and reactions within that journal and from other experts in the field.

Statistics presented without context. An example would be unemployment figures in Britain over a thirty year period that fail to indicate that methods of measurement have changed during that time.

Irrelevant arguments from authority. These can be awkward to spot, given they are usually disguised as relevant arguments from authority. But a prime, and commonplace, example, goes like this: person inherits (or marries and then divorces) a very large amount of wealth, and it becomes significantly larger over time. Their opinions on economic policy then gain tremendous authority. There are several sources of irrelevance here. One is that they may have been much less successful at increasing their wealth than others who inherited a similar amount. Rich people evade a lot of tax (or, to be precise, have people, businesses, foundations and advisors to evade tax for them) so you’ll never really know if this is the case.  Another is that wealth inequality has been increasing consistently for over thirty years – increasingly favouring capital as opposed to earned income. (You want a reference for that, of course. Once again, have a gander at the work of Thomas Piketty and his Capital in the Twenty-First Century). A third is that knowledge of how to make one person richer is not the same as how to make a whole society or country richer – and may be entirely irrelevant.

Anything that you suspect is selling something without being open about it. On that note… Have I mentioned sufficiently thus far that my primary long-term aim in writing is to help, in a very small way, support radically egalitarian social movements?

Most religious texts, at least in terms of their open intentions. The central mechanism of religion is ‘faith’. You don’t necessarily need a book, or any source at all, for that. You simply decide to believe and keep deciding to believe. Therefore conventional Biblical or Qur’anic ‘knowledge’ is not knowledge of anything except what it says in the book in question. On the other hand, there is useful incidental historical information about lots of things contained in religious texts. In The Camel and the Wheel Richard Bulliet uses Biblical quotations (among other things) to demonstrate that wheeled transport existed throughout the Roman-occupied Middle East, prior to its 1,500-year disappearance before the superior efficiency of the camel.

Very out-of-date material on things that change a lot. A first edition of Webdesign for Utter Wazocks printed in 2004, for instance, probably wouldn’t help you learn much (except about the history of patronising book titles, perhaps). Bus timetables. Less obviously, practical maths books published in Britain before decimalisation won’t help you manage your money, the writing of history shifts as new discoveries are made about the past, and written German has fewer letters than it used to.

Almost any material that depends on the absence of evidence for its central argument. An example might go something like “Chemtrails must be real, because nobody can find out anything about them”. nb. This is different from having evidence (or reasonable and rationally stated grounds for suspicion) that clearly relevant evidence has been suppressed or avoided. Although there is a connection, in that you usually need a lot of good evidence before you can convince anyone that someone has suppressed evidence, especially if the person or organisation suppressing it is a powerful one.

Now look at what you have left. Unless you have been supremely lucky with the architecture, connectivity, budget, opening hours and the book ordering staff of your local library, it ain’t much. And it may well be nothing at all.

This is the access to good information that working-class people now have.

 

iii. Card Indexes

Nicholson Baker’s The Size of Thoughts contains a long exploration of the death of the card index. The thing that struck me most powerfully about it was his note that it was perfectly normal for there to be half-a-dozen errors in transferring the information on a single catalogue card to an electronic file. Digitisation means we don’t have to burn the books and newspapers and pamphlets that are the best sources of knowledge about our recent history – we’ve made it infinitely easier to simply lose them.

Baker notes that some of the worst offenders in irrational and rushed digitisation were senior librarians seeking the status that being ‘up-to-date’ would give them. This may have paid off in career terms for some. But the prestige of the profession as a whole has suffered from the attempt to make it ‘funky’ and ‘modern’.

The twentieth century ascribed a mythic power and dignity to libraries, which no desperately contemporary ‘Learning Resource Centre’ will ever touch. Imagine Buffy the Vampire Slayer with a library that has white walls, shelves a maximum of six feet tall, office style lights and four foot high posters by the door featuring E-list celebrities reading books to convince you it’s cool.

You can’t, can you?

Buffy correctly identified and drew on an important cultural idea in using a stereotypically old-style library. This is that real knowledge is the province of serious and dedicated people who are willing to forego baubles and shiny pretty things and funny videos of kittens to reach their own version of enlightenment. This may sound religious, or dull, but it’s neither. It’s Einstein working out relativity in breaks from the patent office, Nye Bevan burying himself in the paperwork that built the NHS (if you’re American and stressed about paying your medical bills, please follow this up), Edith Piaf doing three-day rehearsals, Ginger Baker in a Soho basement with a mind exploding with African drums, Django learning Bach from listening outside churches, John and Paul taking a bus right across Liverpool to meet a guy who knew one chord they didn’t, Jimi twanging the strings in a guitar shop and singing the notes back to himself all the way home, the Rza watching endless martial arts movies to find a new philosophy for Hip-Hop, Steve Vai building a shed for his recording studio, Cindy Cashdollar playing guitar in a closet from the age of eleven, every legendary footballer (if you’re American, you call it soccer, and here follows the reason you’ll never win the World Cup) working so hard on the training field or the street or the backyard that they looked natural by the time the manager saw them, or Tarantino immersing himself in movies and working in a video shop in order to make movies.

In fact, it’s nearly everyone who ever did something worth mentioning. At that moment in their life they weren’t working out how to make friends or count up thumbs up signs or Spotify plays. They were where they were, and they were doing what they were doing.  They weren’t trying to be anywhere else, or do anything else.

 

iv. Typists

Francis Fukuyama’s acknowledgements in The End of History and the Last Man explicitly don’t include what was then (1992) ‘the usual’ thanks to a typist. Instead he thanks the company that made his computer.

As The Cat Empire sang about watching reality tv, “that’s a bit strange, if you ask me”.

But he was noting an important shift in the structure of work in that comment, even though this wasn’t the main thrust of his book. By separating research from secretarial skills, academics (and the beneficiaries of thinktank and Ford Foundation grants, like Fukuyama) became essentially disconnected from working-class people. If you don’t need the help of a working-class person in preparing your manuscript for publication, you can comfortably say anything you like about them, as well as keeping them from gaining the means of taking your job or debating with you on equal terms.

Consider the plot of Mad Men, set in the early 1960s. Peggy, a working-class New Yorker, arrives fresh from Miss Deaver’s Secretarial School at the offices of advertising agency Sterling Cooper. Within a few short years, despite the rampant sexism of the office, her sharp intellect has been noticed and she has become the go-to copywriter for campaigns targeting women. With a salary to match, although it’s still not the salary some less gifted male colleagues are pulling in.

Now consider the plot of Mad Men, shifted to the early 1990s. In the pilot episode Peggy, a working-class New Yorker, arrives fresh from getting a first-class degree at Brooklyn College to take a temp job as a typist. She’s ambitious, hardworking and desperate to pay off her college loans. Although she doesn’t have the Ivy League connections the copywriters and account men boast, she gets noticed for her attention to detail, sharp intellect, and astonishing typing speed. Everyone agrees they’ve got to find a place for her.

In episode two the CEO announces that the copywriters will henceforth do their own typing. Peggy is let go.

Final Thoughts

There’s an infinite number of areas I’ve not addressed here, from the lossiness of music files, to Amazon’s abuses of its workers’ time, to the fascination otherwise sane people have with the possibility of getting rich from selling worthless attic tat on eBay (or worthless hippy tat on Etsy), to the Scots referendum resulting in a ‘No’ vote despite the widely agreed ‘Yes’ victory in online media discussion, to the generally superior efficiency of single-function machinery over multi-functional machinery, to pub closures, to the spread of the myth of multi-tasking, to politicians avoiding the poor by using the digital realm as their main point of public contact, to the unnecessary and intrusive start-up noise Apple machines make, to the disturbing willingness of traditional businesspeople to lie down and die before what they are convinced is an unstoppable digital flood, to the peculiar idiocy of remote drone operators who fail to understand that their enemies have the rest of history to take revenge. Maybe I’ll get round to some of them sometime.

But, for now, I think you’ve got the general picture.

Computers are useful tools. But by carelessly presuming that we have to fit with them – rather than that they are there for us to use – we are doing damage to ourselves, and our society. Some of that damage may be irreversible, and the worst of it is likely to fall upon the working class and the poor. It’s time to stop thinking inside the boxes.
