Identity in the Age of Technology: The Cost of Convenience and the Need for Protection

Were future generations to ask us what the greatest deception of our time has been, one might not hesitate to say it was the idea of impartiality when decisions are left to machines. They would laugh in our faces at how ridiculous the idea was, and they would be right to laugh. For what is a machine but the perpetuation of the same biases and discrimination we have committed for generations, yet wholly alienated from what it means to be human? We decided to believe that a machine, which does not feel and does not understand the nuances of our humanity, is best suited to make choices about our overly complex human lives. We were sold the idea that it would be convenient and impartial, since it lacks the biased foundations we carry as human beings. But what feeds these decisions, the machine's outcomes, if not our own skewed choices? What tells the machine that the best person to hire is the one with the male-sounding name, better suited than their female counterpart? Did the engineer consider the darkness of a man's skin when creating a product that dispenses soap? I would like to believe they merely overlooked non-white skin when designing a simple hygiene product, but has the melanin in our skin not been treated as a crime before?

Consider the way upper-middle-class white women clutch their purses when someone looks even two shades darker than they are. Is this not simply the fear that those whom they once owned as property might decide to take revenge? Are we to think that creating a police state is not the extension of a monopoly on violence, and that it serves everyone rather than only those of a higher class? Not only does this work against the self-preservation of minorities, it is an imposition on society as a whole. It is as if we had forgotten George Orwell's 1984 and Michel Foucault's reading of the Panopticon.

Technology has allowed us to stretch our sense of identity to a point we could never have imagined before: part of our identity now resides in cyberspace, online. Not only is this phenomenon a novelty, it also brings new challenges for our society, some of which governments have started addressing with data protection laws. And yet, were we to tell a Victorian child that part of who we are as persons now takes the shape of zeroes and ones, wouldn't that child drop dead in front of us?

We can lead a life online that is different from our usual day-to-day life in person. We are now defined not only by our memories and physical appearance but also by our actions on the Internet. Society has accepted this as another aspect of our identity, one we have adopted and integrated into who we are and how we identify ourselves. Our online identities have had unforeseen consequences: those ones and zeroes are now considered our data, a part of us essential to who we are as human beings, and it must be protected from the corpocratic world, from those who use our data indiscriminately to increase their profits. They create heatmaps of their websites based on where we hover our pointers, build geographic maps from our locations, enable features that track our activity on other websites without our explicit consent, and bury the information that matters to users in Terms and Conditions written in legalese and sophisticated words that escape the average person.
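
As a concrete illustration of the hover-heatmap mechanism mentioned above, here is a minimal sketch in Python with invented data: client-side scripts report pointer coordinates, and the site owner aggregates them into a coarse grid. Real analytics pipelines are far more elaborate, but the principle is the same.

```python
from collections import Counter

# Illustrative sketch: bucket reported pointer positions into a coarse
# grid so that frequently hovered regions of a page stand out.
CELL = 50  # pixels per heatmap cell (arbitrary, for illustration)

def build_heatmap(events):
    """events: iterable of (x, y) pointer coordinates in page pixels."""
    counts = Counter()
    for x, y in events:
        counts[(x // CELL, y // CELL)] += 1
    return counts  # grid cell -> number of hover samples

# Three made-up pointer samples near the top-left of a page.
print(build_heatmap([(10, 12), (30, 44), (400, 90)]))
```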

At this point, the reader should be convinced of technology's impact on the metaphysics of our identity, of the economic issues that arise as corporations use our data to identify us, target us with ads and sell our information to other companies, and of the need for legal regulation. However, the story is far from over when discussing such a technological revolution. We have taught current technologies to complete tasks both mundane and incredibly complex: from compound interest calculations to facial recognition used to detect outlaws.

The latter example should worry the reader. We know for a fact that governments have decided to use this technology in the name of security. One may even agree with such measures, calling them a low cost to pay for a greater return. However, we know of multiple instances where the outputs have been far from desirable: false positives, especially against minorities. We know facial recognition is biased in favour of cisgender white males, understandably so, as most of the software is trained on datasets dominated by that demographic. One hopes this was not a deliberate choice by those who developed such remarkable automation but an unconscious decision to use whatever data was at hand and deemed good enough for training. That, however, does not exempt them from the repercussions of their artificial intelligence. Or are we to accept a constant state of police surveillance, the over-policing of minorities and other coercive enforcement? One must not forget that the watchful eyes of the robots that inhabit our homes sell convenience in the service of corrective measures, the normalisation of deviant behaviour, as Foucault would phrase it. The existence of such devices grants significant power to the creators who harvest our information and push consumerist behaviour as the non-deviant behaviour of capitalism (in Foucault's time it would have been sexually diverse people who were to be normalised, but capitalism has since understood that it benefits from marketing sexually diverse lifestyles and has switched to monetising them instead).
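
To make the claim about false positives against minorities concrete, here is a minimal sketch with entirely invented numbers showing how one might compare false positive rates across demographic groups; independent audits of face recognition systems report disparities of exactly this kind.

```python
# Invented example counts: (false positives, true negatives) per group.
results = {
    "group_a": (2, 998),
    "group_b": (35, 965),
}

for group, (fp, tn) in results.items():
    fpr = fp / (fp + tn)  # false positive rate = FP / (FP + TN)
    print(f"{group}: false positive rate = {fpr:.1%}")
```

A single overall error rate can hide this gap entirely, which is why audits break results down by group rather than reporting one aggregate figure.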

Exclusion from the labour market is another real consequence we must consider with the introduction of automation into the hiring process. We have already noted that female (and non-binary) candidates tend to be passed over in favour of their male counterparts because of biases in the workplace. The same issue is replicated in the Black community, where cisgender men have been screened out of hiring processes because the schools they attended reveal that they come from predominantly Black areas. Intersectional feminists would agree that this is not only an issue that oppresses women in the classical sense but one that also harms those who do not fit the gender binary or are not white.
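
As a hedged illustration of how such screening works, consider the sketch below with entirely invented data: a rule that never mentions race, only the school attended, still excludes qualified candidates when the school is a close proxy for a predominantly Black area.

```python
# Invented candidate pool; "school" stands in for any proxy attribute.
candidates = [
    {"name": "A", "school": "Northside High", "qualified": True},
    {"name": "B", "school": "Eastview High",  "qualified": True},
    {"name": "C", "school": "Northside High", "qualified": True},
]

# The rule below never refers to race, only to the school attended,
# yet if Northside High sits in a predominantly Black area it acts
# as an indirect racial filter.
excluded_schools = {"Northside High"}

shortlist = [c for c in candidates if c["school"] not in excluded_schools]
print([c["name"] for c in shortlist])  # equally qualified A and C are dropped
```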

Another implication of technology's advancement is its impact on the labour market. Technology is here and will not go away. It has profoundly altered the way we create the products we need for our bare existence. Nowadays, most Western societies can go to bed and wake up knowing that, should there be a tomorrow, there will be a plate of food on the table. The ethics of the logistics and distribution of these resources is a topic that deserves further discussion, not least for its impact on the Global South. And given the comparatively low cost of such logistics and the immense benefit to humanity, wouldn't this act of "charity" become an obligation, when literal tonnes of food go to waste every day?

We must also consider the displacement of labour. As tasks are automated, we should expect the labour needed to produce goods to decrease. Would this not widen the already vast wealth gap between the poorest and wealthiest in our society? Is one to believe that this phenomenon should not be examined through the lens of the Frankfurt School?

Deception has been the name of the game, and we are but pawns in it. We were impressed when machines started printing out what we would expect from another human, yet aren't their language models an instance of what we have known for years as the Chinese Room Argument? Nevertheless, we express awe when we see that they can replicate our decisions. Machines whose only function is to optimise the utility of our society do not see the entire picture of our humanity and deliberately send our marginalised communities further into oblivion. We must remember that the inputs they take and the mathematical models they were given, or built, are an incomplete picture of our identity as a society. Those mathematical representations are skewed and biased; they do not fully represent our society or even the phenomena they attempt to predict.
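
To spell out the point about optimising aggregate utility, the sketch below, with invented numbers, shows how a system judged only by an overall average can look excellent while performing far worse on a marginalised subgroup; the average simply washes the harm out.

```python
# Invented figures: per-group accuracy of some decision-making model.
groups = {
    "majority": {"size": 900, "accuracy": 0.97},
    "minority": {"size": 100, "accuracy": 0.70},
}

total = sum(g["size"] for g in groups.values())
overall = sum(g["size"] * g["accuracy"] for g in groups.values()) / total

print(f"overall accuracy: {overall:.2%}")  # ~94%, looks impressive
for name, g in groups.items():
    print(f"{name}: {g['accuracy']:.0%}")  # 97% vs 70%
```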

One is not advocating that we abandon all progress and development in the tech world. The opposite is true: we need to make technology more humane and accessible, and to understand that it is not an end in itself but a means to a better state for our humanity. We now exist as both digital and non-digital beings in an immensely complex world; the rising interest from the economic and political spheres is only the beginning, and we must follow this development, guided by different schools of thought, to understand the consequences of the (so far) unregulated development of technology.
