Social Media Firms have a ‘Duty of Care’
Platforms are not mere passive conduits: they are well aware of the users and content on their services and straddle the continuum from distributors to publishers
Published in the Hindustan Times
Social media platforms took off in the 2010s, with users across the globe signing up in the hundreds of millions, even billions. By offering easy ways to connect with others and to share and consume content, social media democratized free expression and made information accessible to everyone. By engaging directly with events and discourse around the world, users could feel like citizens of a borderless world. Social movements leveraged these online tools to drive political and social revolutions. Charismatic leaders deployed innovative social media campaigns to acquire power. It appeared that there was much to celebrate.

In parallel, social media was upending established information ecosystems. Traditional news media lost their gatekeeping power over news and information and were further weakened by the shift to digital advertising, with an increasing share of advertising revenue going to major social media platforms. Over time, the negative effects of social media have come to the fore. The volume and velocity of information flow have come at the cost of quality. Malicious users and organizations learned to leverage these very same tools to sow division, fear and confusion, and to undermine the integrity of democratic processes. Harmful content, misinformation and disinformation exploded. Just as the WHO declared COVID-19 a global pandemic, it also warned of an ‘infodemic’.
Social media platforms’ response to the impact of their own product, especially in developing countries, has been anemic. Platforms counter that these social and political ills have always existed in society and are now merely manifesting online. This equivocation is enabled by the legal immunity platforms enjoy for content posted on their services, on the premise that they are mere intermediaries facilitating the posting and sharing of content by third-party users without any editorial control. On this view, a social media platform is on a par with an ordinary ISP, which merely carries packets of data from one point to another over the internet without any cognizance of the content of the data itself. This argument, however, is increasingly untenable. Platforms are not mere passive conduits: they are well aware of the users and content on their services and straddle the continuum from distributors to publishers.
As private corporations, platforms have the right to decide what content they will host and distribute. Accordingly, all platforms have extensive terms of service and content guidelines that define what may and may not be posted. Facebook, for instance, bans nudity and pornography, while Twitter permits them. Further, platforms moderate content they privately determine to be unsuitable, implying that they are aware of the content they host and that its distribution is a choice. This point is underscored by the differential treatment of the same content on different platforms. A good example is President Trump’s tweet and Facebook post of May 2020, directed at the protests following George Floyd’s death, containing the statement “...when the looting starts, the shooting starts…”. Twitter restricted the tweet by placing a public interest notice on it for breaking the platform’s rules, while stopping short of outright removal, citing public interest. Facebook, on the other hand, chose to take no action on the same content posted to its platform, citing a commitment to free expression and public interest. Platforms are also known to “deplatform” certain users (notably President Trump and, more recently, many Russian state handles during Russia’s invasion of Ukraine) to showcase their political distance from the deplatformed user. In doing so, platforms prohibit distribution of the entirety of the deplatformed user’s content, not just the violating content.
Further, platforms have become increasingly interventionist with content. To maximize user engagement (time spent on the platform) and to retain and grow their user base, platforms began curating the distribution of content by pushing new content into user feeds. The *selection* of content to amplify is a key editorial function, even when it is performed by an algorithm rather than by humans. The internet has driven the transition from a content-scarce economy to a content-surplus, attention-scarce one. Amplification of user content, and the mass engagement that follows, therefore has obvious political and commercial impact, in proportion to the increased distribution. This is evident from the mainstreaming of individuals and narratives that might otherwise have remained on the fringes of public consciousness. Amplification is thus an intervention in the socio-economic and political processes of society, and platforms must be held responsible for it. Finally, platforms have started paying for original content to increase user stickiness, completing their transition into media corporations. Facebook, for instance, has pledged USD 1 billion for creator content in 2022; YouTube established a $100 million creator fund; TikTok has earmarked a $200 million “creator fund” for US creators; and Instagram and Snapchat too offer financial incentives for original content.
It is clear that social media platforms play an active political role in our public discourse rather than merely providing dumb, passive technological infrastructure for user interactions. Initially, platforms embraced their impact on societies and political systems by positioning themselves as harbingers of democracy and pro-people movements, especially during the Arab Spring. However, this pro-David positioning is no longer credible, as platforms have become increasingly aligned with power rather than ordinary people in areas of conflict: organized political entities have mobilized online (in India, through dedicated political party “IT cells”), overwhelming individuals and less-resourced people’s movements, and platforms have complied with government take-down requests targeting dissident speech. Most importantly, platforms have themselves become interventionist repositories of political power and must be held accountable for their choices. It is no longer tenable to argue that platforms bear no responsibility for the content they host. Instead, platforms have a “duty of care” in proportion to the harms posed by the content they host, along with liability linked to their distribution choices.