The Road to Responsible Technology and the Case of Türkiye:

Challenges and Solutions for Civil Society

Handan Uslu

 

The extent to which human will is absolute remains a topic of debate. Marxists assert that our will is shaped by our class standing, while Freud's psychoanalytic theory contends it's influenced by the unconscious. In today's world, technology has evolved into a supersystem that molds our will by reshaping the way we consume information, build relationships, and interact with both ourselves and the world at large. Major tech corporations pose significant threats to human rights and democratic principles, with their influence often rivaling that of sovereign governments. Within this framework, one must question: Can individuals recognize and counteract the manipulations and constructed consents they are subject to? Are there organized entities capable of providing critical advocacy and shaping policy recommendations? And, crucially, does the government possess the requisite knowledge, data, and tools to execute its regulatory duties effectively? A dedicated platform where we can critically assess and challenge these technological powerhouses is notably absent. As Turkey's premier entity focusing on responsible technology, the Observant seeks to usher in changes in cultural, legal, and social perspectives on technology through enlightenment, capacity enhancement, and advocacy.

Consider this scenario: we post a job advertisement on Instagram, and 85% of the people who see it are men. In the United States, ads concerning social rights such as housing and employment are subject to restrictions on micro-targeting: targeting these advertisements by gender, age, or location is prohibited, so you cannot show an engineering job ad only to men. Why doesn't Instagram apply this rule globally? And if Instagram won't take on that responsibility, why is there no regulation on this in Turkey?

As civil society, if we open our eyes, we can detect many violations and instances of discrimination in the ordinary course of life. Although technology threatens democracy, human rights, and individual well-being, it has so far escaped scrutiny from Turkish civil society. With the "responsible technology" perspective that we introduced to Turkey, we believe we can build the legal, technical, and social mechanisms needed for a human-centered internet ecosystem.

A lack of skepticism toward technology isn't confined to civil society. Most of us trust that algorithms are "objective mathematical models" and therefore won't make the mistakes humans do. Technology is complex, and it captivates us. We struggle to understand the algorithmic and technical workings of tech products. What Google offers us is just a blank, white page.

The state, expected to protect competition if not the individual, doesn't have access to the data and knowledge held by tech companies. Yet, understanding complex algorithmic systems is crucial to address the societal impacts of technology and to protect individuals and markets. The imbalance of information among citizens, tech companies, and lawmakers is one of the main reasons we can't hold tech companies accountable.

Technology dictates its own paradigm to consolidate its power. Big-tech has established cultural hegemony over Turkish civil society and cultivated the civic approach it desires: actors in the information and media fields often don't, or can't, step outside the narrative boundaries set by major tech companies. Current interventions ignore the fact that tech companies intentionally design algorithms, platforms, and systems that propagate misinformation.

However, disinformation, hate speech, and populism are not problems external to algorithms; algorithms actively promote them. Tech companies seize people's attention to keep them glued to screens, which requires sensational, manipulative, and polarizing content. If Facebook changed the algorithm that promotes disinformation and stopped pitting users against one another in a click economy, fact-checking platforms would lose some of their purpose.

Behind these approaches lies big-tech's greatest fear: regulation. To evade regulation, big-tech supports views that frame problems as isolated and unrelated to it. A macro-level perspective that interrogates algorithms and critically addresses product designs, policies, and systems is not in its interest. This approach has also permeated civil society: even in our reflexive reactions to issues like cyberbullying and digital violence, we target only the bully or the perpetrator and forget big-tech's role in enabling digital violence.

Questioning will benefit all of us. As civil society, our criticisms and approach should target systemic, structural problems. Amnesty International reported that Facebook's algorithms amplified hate speech and calls for massacre, fueling the violence against the Rohingya in Myanmar. We agree that collaborations with arms or tobacco manufacturers should be avoided; we should apply the same ethical stance to tech companies.

There's a need for independent organizations overseeing the accountability of tech companies. Here at the Observant, we offer a critical lens on tech companies and essentially assume a supervisory role. Our concern isn't just producing informative content; we delve deep into unexplored topics and make sure our investigations are actionable. Political micro-targeting, digital profiling operations, and content moderation problems in Turkish forums are among the issues we were the first to address in Turkey from a civil society and human rights perspective.

We aim to raise awareness among individuals about responsible technology, and we work to ensure they understand the algorithmic violations they face or will face. Through partnerships and training with institutions, we equip advocates with the necessary tools to work in their fields, providing creative data extraction and monitoring training to decode technological hegemony.

So, who are we? As the Observant, we were once the people who developed these technologies. As former tech company employees, we're familiar with big-tech's foresight, competitiveness, and policies. We aim to bring this proficiency and agility to civil society and advocacy. We are engineers who have advocated from within institutions for fairness, justice, and equality. We know both the internal political games of tech companies and the legal sanctions they fear the most. Our activities and narratives are shaped by our critical stance towards tech companies.

In Turkey, we are expanding the boundaries of digital research, which has largely been supported and narrowly defined by big-tech. Our methods and tools aim to grasp product dynamics, formulate policies, identify human rights violations, catch offenders, and examine the algorithmic systems that lead to hate speech and disinformation. Rights violations are not simple problems that engineering or data science alone can solve; they must be approached from a human rights advocacy perspective, through social science paradigms. The Observant monitors tech companies with both the technical capacity and the conceptual framework needed to reveal their societal impacts. It reconnects ethics and technology.

The path ahead is challenging, but with fresh approaches, new solutions are achievable.

Just remember, stars shine brighter in the darkness.

 
