Illustration: Anja Sušanj

Born in the 80s, educated in the 10s, living in the now, Sami Juhani Rekola tries to recalcitrantly keep bios and himself undefined by using fancy umbrella terms like post-medium thinker, hands-on explorer, and precarity art worker. Current areas of interest include mythical love, enlightenment, digital doppelgängers, prima vista performances, and AI as future experience design.

After all the materiality of the world has been grasped, the next natural step of the Great Devourer - capitalism - is to go deeper into the immaterial within the confines of our minds. As we go about our days, we seep valuable information that can be monetized directly and made into codes of control, redirection, and upkeep. A new frontier has been unhinged by cracking our minds open, where the laws are anything but settled. Is the immaterial world beyond our ethical discussions and legislative apparatuses too ethereal to be real, or can a reasonable amount of responsibility be demanded everywhere?

Imagine a people, a population with hardly any previous experience of the internet, social media, the dark side of web rhetoric, and the sinister traps that inhabit the online nooks and crannies. Imagine the vulnerability, the sincere belief in progress, and the possibilities in that belief. Imagine the responsibility and the power of knowing beforehand how the platforms work and how media becomes the massage. Knowing how people will react to fake news before they know it is fake. Knowing how to “plant the seed of doubt”. Knowing how to straw-doll an opponent publicly, without anyone noticing the faulty rhetoric plainly visible to digi-natives. Imagine understanding memes before anyone else. Knowing what nudging can cause. Knowing how to invoke mass action through hate and a simple common enemy, with no one there to point it out as cheap information-war tactics and sheer condemnable racist hate speech. Imagine giving out cellphone deals with preloaded Facebook to a nation of people new to social media while favoring its use by exempting it from data charges, thus making it possible to spread fake news and gain political momentum to unleash a genocide moments later. Nobody knew how to distinguish faux news appearing on social media from the traditional, sourced, and critically reviewed news following the ethical guidelines of journalism, and how could they?

Afterwards, it’s easy to spot these blind spots in the responsibility chain, the loopholes in the system through which Myanmar was flooded with anti-Rohingya propaganda by the Myanmar military in the years following 2013. Though there are resources to battle against fake accounts that spread volatile and life-threatening material on a platform such as Facebook, these resources aren’t spread out evenly among its users, as it takes roughly the same amount of work for each language to get the machine learning algorithm for removing hate speech functioning efficiently. Maintaining Burmese Facebook responsibly(ish) would’ve taken more Burmese-speaking moderators and algorithms to sift out malicious content specific to that language, but with fewer monetizable users in that language group, I can see why that wasn’t made a priority. Now, remember the way we frustratedly click “Yes, I accept” or “Consent” as if that annoyance were the only price we pay for “free services” provided by companies actually driven by profit rather than the professed, altruistic desire to merely connect people.
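
To make that asymmetry concrete, here is a minimal sketch in Python of why responsible moderation scales with languages rather than with revenue. It is a hypothetical toy, not Facebook’s actual moderation stack: the classifier, the function name, and the commented-out data variables are all invented for illustration. The structural point is simply that a hate-speech model is only as good as the labeled examples and moderators available in that specific language.

```python
# A toy per-language hate-speech flagger (illustrative sketch only).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

def train_flagger(labeled_posts):
    """labeled_posts: (text, is_hateful) pairs in ONE language,
    labeled by human moderators who can actually read that language."""
    texts, labels = zip(*labeled_posts)
    model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
    model.fit(texts, labels)  # learns word patterns of this language only
    return model

# The work repeats for every language, whatever its ad revenue:
# english_model = train_flagger(english_labeled_posts)   # hypothetical data
# burmese_model = train_flagger(burmese_labeled_posts)   # hypothetical data
# A model trained on English tells you nothing about Burmese posts.
```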

One might go online with a false premise: “I can freely enter, explore and use the world of information and its cornerstone websites without their effective intervention on me, like the omnipotently hyperconscious and objective truth-seeker I am”. One might believe the hype of an interconnected world, of a flat internet and a no-strings-attached open-source mentality pervading the digital world. Even though those online zones do exist, both historically and currently, the most public and commonly used digital areas are also, as they are in the physical realm, the most contested and commercially utilized zones. The city centers, malls, and public squares, the main pedestrian walkways of our cities: all eyes are on these square meters as everything there can be monetized. Repetition creates a field ripe with information, just hanging there as a side-product of human behavior. We are flamboyantly and extravagantly producing it. The way people act in commercial spaces, everything about it: when they do it, how they do it, who they do it with, how long they procrastinate when doing so, where they look, what they want, what they remember and what memories they associate with it, whom they socially connect to this place and so on. It’s all eyes on everything. And this goes for the digital world too.

Against a near-infinite memory and calculating capabilities surpassing those of my human race, I feel more naked online. More of me gets seen, and layers that I’ve trusted to stay hidden surface. The ultimate male gaze (as most of the code gets written by 20-35-year-old white men in Silicon Valley) strips me down to my behavior, my associations, my private search words, the faces of my public photo albums, the quickness with which I comment on something, the timelines and maps I provide and fill in. My body gets on the operating table to show what it is when I don’t see it clearly myself; my body is a story about to get written in various ways, a heap of raw material I don’t own. This flesh is interesting, but not in a fulfilling, reciprocal way, only as something to be used in ways I do not know of or that I understand only partially. This body has value, as it has agency; it’s like a host body, ready and willing to be taken as an offering. Are we living in ultimate submission to the machines of the attention economy, or should we call it fair trade? Users produce a tiny bit of monetizable information for the platform offering them valuable shortcuts across the vast digital ocean. Isn’t it just practical to have the excess information seeping from them produce something useful and good? One man’s trash, another man’s treasure, right?

Yes, I accept

To start at the beginning: we are born into our human existence under the supervision and with the aid of technological devices; we might already be 3D-scanned thoroughly before the moment of birth, which is also often monitored and documented by various apparatuses. We are signed into various logs and forms that begin our way into the plethora of digital surveillance and tracking systems. We might have already started to exist inside these systems as a social media post regarding the pregnancy, like a proto-hello to the world. For a while, we existed as a label attached to the mother within certain juridical systems regarding subsidies, housing applications, and guidance centers.

The growing mass of increasingly accurate, valuable, and personalized data gets more additions: weight, height, standardized evaluations, school performance, etc. During elementary school, many children have also started their own in-and-out-of-body travels using tablets, cellphones, and other devices, modifying the reward systems in their brains with gushes of dopamine towards the requirements and modes of use that these machines propose. Progressing neural rhizome entanglement, if you will. Children learn to learn with the aid of touchscreen interfaces, and they learn to feel true anxiety and complete fulfillment when playing games. They learn to coexist with machines as a part of their daily life, and they will hear and feel more and more of this technological normalcy in their environment as they grow up to be agents in the world.

A certain externalization of processes is at play: remembering, thinking, searching, studying; all sorts of cognitive expansions and entertainment happen at least partly online, so it’s safe to say that we are internally connected to the data streams of the digital globe. The cyborgian dimension of human existence is silently embraced as a limbic part of it. The reality we are to accept is that of both flesh and machine. We are and have been cyborgs for a long time. Therefore, aren’t we co-builders of these machines? Shouldn’t we have a say and get our cut?

We intertwine and mix with the platforms we use: search engines, cloud and streaming services, mail storage, and social media. We use them in our processes of becoming the personae we seek to be, as if they were tools lying on a toolshed table. Except they aren’t. A tool is something one can pick up from its resting place to fulfill a function momentarily, to finish a task faster or more efficiently, only to be returned once the task has been completed. A simple subject-object relationship, that is, one that doesn’t lure us into using the tool more often than we’d like to. Instead of this clear dichotomy, we ebb and flow into the digital like the porous creatures we are. We terraform but are also terraformed in the process. We become the landscape like the sunset becomes a reservoir of human nostalgia. The tools transform us as we use them.

Oh, they know me so well, as I’ve taught them well. Everything I get advertised online seems vaguely familiar; it has the stench of my own psyche in it, with a feeling of distance and distortion, a trend or a new angle applied perhaps, but it’s somehow mine that’s now getting sold back to me. My online behavior casts a shadow that is now looking back at me, as if I were the one in the shade. Maybe I want them to crack my mind open and fill it back up, like a twisted sort of introspection, funhouse mirrors? However, this computed disintegration doesn’t feel consensual. If I am transilluminated completely, isn’t it already violence, hacking me into bits and digits like this?

The thing is, consensuality is an important matter between bodies. It needs to be there in gestures, in conversation, in enthusiasm, and in reciprocity. Suppose my body and its freedom to choose are being deliberately targeted by bodies (that work for companies driven by profit) using the latest psychological magic tricks known. Shouldn’t the interface between physical and digital platforms then be a negotiated space between bodies? Clicking “Consent” or “Agree” should only be valid until the next point of negotiation, the next limit, or the next friction that comes up. A “yes” is not an eternal waiver of rights; it can be overturned at any given moment, and this should be respected inside the digital sphere as well.

Shadow work

Should an unbiased online search be a basic human right? Or are these just companies providing a service with which they can leverage, barter, and pressure our bodies in exchange for their valuable reactions? We are always a few steps behind technological development and the conversations that its exponential speed would require. This will always be the case, as our capacity for foresight is limited, especially when dealing with technologies that can overturn, reimagine or recontextualize everything we know. The masses of information that can be utilized from the shadows cast by us and our online actions are beyond our comprehension. We make possible the creation of accessible maps of human trigger points and vulnerabilities, personality traits and dysfunctions, political razor’s edges, and things we can be lured into doing. We hand them all out on a plate. We emit the excess of what it is to be us, and it’s precisely that which gets tracked, resourced, and mined. There’s profit emerging from us, and though it isn’t consciously produced by us, it’s a kind of metawork we provide and can be enticed into doing more of. Even to such a degree that we need to fiddle with addictive content and hang out on social platforms while driving a car. The interface can quickly open into a crossing between bodies, revealing the entanglements of blood, bowel, and wire.

In these post-peak attention economy times, we don’t really have more time to give, only more attention. We have already divided up our time bank so that there’s enough of everything. And there are enough providers on each imaginable level of commerce to keep up a “healthy competition”. So from a consumerist point of view, it all boils down to how we then divide that attention among the ones desperately in need of it. That is where the true business inside the consumer lies. But in all fairness, can consciousness be put up for sale in this way? Does ”my inner space” belong to me, or is it a contested place, a marketplace, a zone up for grabs like the visual cityscape: exteriors of cars, insides of public buildings? Can secret auctioning take place right there, in that instinctive twitch you just made?

We are conscious only part of the time we are physically awake. We can receive a thought, stay with it, rethink it, question it, lose it and get back on it. We have our moments of spacing out and then think/feel more in blips on our way to moments of temporary coherence and continuity. Dots become lines of thought that become topographies of our reality. What we focus on emerges from the haze, crystallizing, solidifying, and becoming separate from the mass of oneness. That which didn’t exist a moment earlier becomes reachable, moldable, and rephrasable. We point the light at something to see what it is. A sense of control, of free will, and a felt tangibility through commanding the window of attention. There’s a certain trust there that it’s the “I” that makes the decisions. We are so used to experiencing agenthood within our minds. Even though my concentration slips to other things and I procrastinate, it’s my very own laziness. It’s my focus, my attention, my reality, it’s what I make of it.

My body, my history, my reactions: there’s an automated, everlasting personalized feed of private content concealed in these three. Everyday behavior draws loops and cycles out of everyone. We are the puppets in a slowly unraveling show, always getting temporary pleasure from roughly the same things: our preferences and tastes reached, surprises and experiences chased. We are, each in our own personalized way, still ragdolls reacting to stimuli in positive, neutral, or negative ways. The first question is: is that puppet show for sale, or deeply private, only to be shared willingly? The second question is: is the agenthood over the window of attention for sale, or also deeply private, ours to move around? Do we want to allow invasive/persuasive technologies to steal our minds into their metawork megafactories without us consenting to it?

There’s something cheap about attacking the most vulnerable spots of the human body. Even in street fights, there are rules: don’t go for the eyes or the genitals. Even as fierce beasts, we fight leaving a certain area of us unprotected out of trust in this deal; there’s consent in there. Of course, I will blink or shy away from attacks against these private areas; it’s a reactivity that’s deep at the core of my being, a brain-stem flinch that I’m helplessly trying to avoid. Eyes and genitals: are these, in fact, the most direct, accessible yet unpatrolled highways headed deep inside us, like hidden tunnels going under the ocean, shortcutting through the plains of me, and are they being utilized secretly to produce value for someone? I hear a bright ping from my cell phone, and my focus constricts around it. I have to take my phone out and see once again the reassuring tiny numbered red dot on a messaging app icon. I press my finger on the glass as if it were a mirror, reaching toward two hands and thinking: “woof”. Like the Pavlovian dog I am. I hear, I see, I drool, I want, I need, I eat.

Causality of the Code

But the ingredients need to be cheap in order to get the maximal profit out of the final product, eh? The rules of capitalism apply. You find free workers, automate the process and sell it back/forward for profit. You make it look like there’s no paywall, tradeoff, or hidden agenda. You use knowledge about human psychology to be most efficient in what you do: contextualizing big data and selling it as usable information to all interested buyers. You make people in your company believe their skills are used for good, creating wealth, flow states, and career moves up the ladder. Somebody makes a fun and popular five-factor personality quiz on social media to produce voter profiles to be sold in order to turn the swing states. Somebody invents the Infinite Scroll to make every scrolling feed oddly unsatisfying but still highly addictive, like pulling the lever of a slot machine. The users have to be kept tightly at it. Somebody codes an algorithm to automatically control the masses of data while teaching itself through machine learning to be more efficient (but in a way nobody can quite explain). This demi-god then recommends things: like a tender of shadows, herding us like hollow data creatures. Somebody creates a platform for people to anonymously meet, greet and retweet with people outside of their bubble, but also creates a biased recommendation algorithm that narrows them down to mostly seeing polarizing content that mobilizes/utilizes/monetizes people most efficiently; it pushes their buttons to push buttons to get more people pushing buttons.
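
The button-pushing loop described above can be caricatured in a few lines of code. What follows is a hypothetical sketch, not any platform’s real recommender: the Post fields, the engagement scores, and the example texts are invented. It only shows how a feed behaves when predicted engagement is the sole ranking objective.

```python
# A caricature of an engagement-ranked feed (illustrative sketch only).
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    predicted_engagement: float  # the model's guess at clicks, comments, dwell time

def rank_feed(posts: list[Post]) -> list[Post]:
    # No term for truth, well-being, or civic value -- only predicted reaction.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

feed = rank_feed([
    Post("calm, nuanced local reporting", predicted_engagement=0.08),
    Post("outrage about a simple common enemy", predicted_engagement=0.61),
    Post("a friend's holiday photos", predicted_engagement=0.23),
])
# The outrage leads the feed, and the infinite scroll beneath it guarantees
# there is always one more button to push.
```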

Connecting people ain’t so bad, but connecting them to what? Connecting perfectionist pre-teens too early with a society that has a body-image issue and homogeneous metrics of success, and abandoning them there? Oh, they’ll manage just fine connected to a dystopian machine that grinds our unconscious actions into useless data, transmuting it a moment later into a powder of gold: information contextualized to fit a need. Connecting whom with what? It’s known that a tiny percentage of social media users can make up most of the posts. People with the deepest, darkest fantasies can also be the most active individuals on a platform. A few influential characters can start riots and movements with a single call to arms. The stronger the virus, the more viral it will be. Polarization and simplification get the common man’s juices flowing. Is it responsible to connect volatile groups of individuals with each other and let them spread the fire? Is freedom of speech also the freedom to become desensitized, desocialized, and detrimental together?

Words of the digital realm reach bodies of the physical world quite quickly. It becomes a matter of consent again. If it isn’t the platform’s responsibility, then whose is it? The chain of responsibility has to start somewhere along the line. One proposal, coming from inside the tech scene itself, is to make all social media action traceable by requiring a mandatory identity verification process for every account. Fundamentally problematic is the fact that the companies need to produce profit for their investors, which as a premise pretty much dictates business strategies and the company’s aims, the ethical compass, so to speak. I’d say it is fair to expect that intellectual capital should be shared consensually and not stolen. What, then, is a fair profit for someone providing streaming, social networking, and search functions? How much attention is enough in the trade? And how far have we already gone, meaning how much of the free-market leeway can be taken back into our own hands? How many billions of dollars of profit will be willingly handed back in this struggle for a crack-free mind?

As the worker isn’t necessarily a factory worker anymore, nor is a factory a factory building per se, we need to become conscious of (and become resilient inside) this new paradigm of work that doesn’t look like work anymore. The content and the look of the frame might change, but in the capitalist economy there always seems to exist this attempt to reformulate or recontextualize someone else’s work into an automated process that, with the least amount of resources and extra input, benefits you the most (as quickly as possible). In this case, for example, the cognitive frontier was up for grabs: seeing (the monetizable potential in excess human behavior), claiming (it as if it were a gold mine), and reframing (it as something other than home invasion, invasion of privacy, and theft). And it took some time before people within these huge social media companies realized that they were creating something they couldn’t fully control anymore and that their actions would affect most of the world. If a person has trouble letting go of their phone after spending a relaxing four hours with it, it’s time to start questioning whether “the fight is fair” and where the consent is.

There’s talk of taking the digital infrastructure towards well-being and productivity, with empathy towards the user and a desire to serve their needs. Quite a few former employees of big tech companies have started wanting to create fewer distractions in an already hectic and overloaded culture, having realized their responsibility as more and more longitudinal studies come out about what constant distractions do to work culture, not to mention the brains and bodies that are affected. Though this is just a small movement from within and might contain a commercial catch of its own, it’s something. There’s a public eye now on how diversely power is distributed in politics, governance and the economy. The same will slowly take place within tech firms as they see that more views are better than fewer views (pun intended). The algorithms and AIs of the future are hopefully partly queer and mostly loving, as are we. In the meantime, if you want to level the playing field, you can try saying no to notifications and social media accounts, and using monochrome settings, encrypted messaging apps, VPNs, and privacy-honoring search engines, email providers, and cloud services on your devices. Or just try staying conscious of where your window of attention lies, who’s trying to move it, where, and why. What’s for sale?

What complicates these discussions is the sluggish pace at which the traditional democratic ways of governing tackle them. There’s enough action in the tech bubble for those on the inside to stay flabbergasted, let alone those trying to make sense of it from the outside. Legislation can’t regulate what it doesn’t reach (in time). As new innovations and ever-accelerating technological upheavals are most likely to blast the known boxes of bureaucratic taxonomy into a new order, it won’t get any easier in the future. Is there a way to stay aware of power, to feel the digital’s reach and its heavy destiny-drawing hand on other people and bodies? Could the underlying code be diversified, so it will see that which I am not? Could the somatics of the digital be as soft as possible, as if care or nurture were embedded in it? There’s always a Trojan Horse, so could responsibility be there, unhinging itself at night and touching all that it sees in the inner courtyard: this is my responsibility, and that too?