With the tragic events in Texas and Buffalo, we are once again thrown into a national dialogue on how to deal with domestic terrorism. These incidents raise at least five major issues: the public health concerns reflected in firearms law; the applicability of “red flag” laws; mental illness; the loneliness and grievance felt by the young white men who predominate among shooters; the ideologies and conspiracy theories rooted in white supremacy; the safety of our children and of Black, brown and Indigenous communities; and, not least, the role the Internet and social media play in this complex mix. For the sake of brevity, let’s focus on the last point.
Content moderation policies and practices are at the forefront of debate in the United States. Suffice it to say that neither the Democrats nor the Republicans have hit the mark: these discussions have reached an impasse and stalled as subjects of new legislation in Congress. But for the partisan divide that plagues the United States, it wouldn’t have to be this difficult. A simple, clear law should be able to win acceptance among citizens, and even ready acceptance from social media companies, despite their knee-jerk allergy to any kind of government regulation.
How about this draft?
- All platforms must comply with existing First Amendment law, including by reporting posts that create a “clear and present danger” to the health and safety of individuals or communities, including persons and physical property.
- All platforms must maintain a clearly marked and functional link on their main home page through which users can report illegal activity.
- All provisions of this Act operate in accordance with Section 230 of the Communications Decency Act of 1996 (CDA).
Three points stand out in this draft. The first is the First Amendment’s “clear and present danger” exception. That exception, like the federal and state bans on child pornography and obscene materials, is settled law. Even apart from this proposed law, private companies are not bound by the First Amendment, but neither are they exempt from existing law. This provision simply makes that point explicit.
In the same vein, creating a mechanism for users to report such activity allows the community to participate in its interactions with social media companies. The security vulnerabilities and privacy breaches that many now endure helplessly can begin to be addressed with this basic and essential ability to channel complaints. Moreover, there is precedent for this approach. The Digital Millennium Copyright Act of 1998 (DMCA) created a similar mechanism for content owners to notify Internet service providers of potential infringements. It would be easy enough for lawmakers to take a page from the DMCA playbook.
None of these provisions deviates from Section 230 of the CDA. Indeed, one could argue that these rules require nothing more of social media companies than compliance with laws that already exist apart from the proposed law. In fact, the same could be said of the Stop Enabling Sex Traffickers Act (SESTA) and the Allow States and Victims to Fight Online Sex Trafficking Act (FOSTA). Congress nonetheless passed both in 2018, showing that in some cases it is important for legislation to make explicit what should not need saying, especially when a seemingly clear proposition, like content moderation, gets caught up in culture-war rhetoric that mistrusts lawmakers as well as law enforcement.
By the same token, nothing in this proposed law prevents a platform from establishing its own policies for content management. Law is the floor of expectations our government sets, below which no company may go without facing redress. Policy, derived from the ancient Greek word for citizens, reflects the higher expectations set by the community of a platform’s users. Once someone hits the accept button on a site’s terms, that user effectively becomes a member of that community.
Much remains to be done in the United States to combat domestic terrorism. Let’s start with the low-hanging fruit: a common-sense content moderation law grounded in the laws we already have.