A new piece of bipartisan legislation aims to protect people from one of the sneakier practices that tech companies employ to subtly influence user behavior. Known as "dark patterns," this dodgy design strategy often pushes users toward unwittingly giving up their privacy and allowing a company access to their personal data.
While this particular piece of legislation might not generate much buzz in Congress, it does point to some regulatory themes that we're likely to hear more about as lawmakers build support for regulating big tech.
The bill, embedded below, would create a standards body to coordinate with the FTC on user design best practices for large online platforms. That entity would also work with platforms to outline what sort of design choices infringed on user rights, with the FTC functioning as a "regulatory backstop."
Whether the bill gets anywhere or not, the FTC itself is probably best suited to take on the issue of dark pattern design, issuing its own guidelines and fines for violating them. Last year, after a Norwegian consumer advocacy group published a paper detailing how tech companies abuse dark pattern design, a coalition of eight U.S. watchdog groups called on the FTC to do just that.
Beyond eradicating dark pattern design, the bill also proposes prohibiting user interface designs that cultivate "compulsive usage" in children under the age of 13.
"For years, social media platforms have been relying on all kinds of tricks and tools to convince users to hand over their personal data without really understanding what they are consenting to," Senator Warner said of the proposed legislation. "Our goal is simple: to instill a little transparency in what remains a very opaque market and ensure that consumers are able to make more informed choices about how and when to share their personal information."
The full text of the legislation is embedded below.