What is the most compelling policy issue in the current (or future) converged media environment?

Adriana Lamirande
2 min read · Mar 9, 2020


Originally written for The Fletcher School’s International Communication course, November 2019.

The difficulty and lack of consensus in conversations about regulating content stem from its inherently close relationship to freedom of speech. Democracies are under more pressure than ever to tackle everything from domestic terrorist threats incubating on fringe forums to the flourishing of neo-Nazi platforms. Media and digital platforms play a large part in shaping our views and communities, and there is something to be said for protecting vulnerable users from racist or sexist inflammatory expression that can produce acts of discrimination, persecution and physical violence IRL. We should strive to secure the rights of all members of society, and ensure there is due process to protect them from human rights violations or crimes.

In creating a policy to tackle hate speech online, it will be imperative that we move away from traditional modes of policymaking and toward a living, breathing document that can be reevaluated nimbly given the pace of innovation. While it is difficult to set moral or ethical standards around what kind of speech and content is acceptable online, certain norms will be necessary if we are ever to form a concrete policy, and that will mean making difficult choices about what is and isn't lawful to share online in text, image or video.

The internet has reignited the conversation around prohibitive content regulation, so civil society and government authorities should work with tech companies to determine what counts as hate speech and what should be done in response. They should also push companies to grapple with their reliance on algorithms and automated software, and demand more oversight of the potential costs of filtering, personalizing and optimizing certain content. Many companies have found themselves inadvertently removing or censoring content that falls outside their jurisdiction, simply because there is no due process. This sets a dangerous precedent for corporate superpowers to censor whomever they choose, a role they do not want and should not be able to arbitrate. This system of ad hoc self-regulation is neither effective nor sustainable, so we should work together to change it.
