“Are we creating a future we can control?”
As technology rapidly changes, it offers the potential for incredible improvements to public safety, but also the risk of serious harms. How should private companies, governments, and the public address concerns such as the loss of privacy, the perpetuation of racial injustice, or the prospect of widespread government intrusion and surveillance?
Microsoft President Brad Smith recently visited NYU Law for a Policing Project-hosted discussion of these issues, which Mr. Smith and Carol Ann Browne explore in their new book Tools and Weapons: The Promise and the Peril of the Digital Age.
Focusing on surveillance technologies, our director Barry Friedman engaged with Mr. Smith on a wide range of issues that Mr. Smith and Ms. Browne discuss in their book. Much of the conversation explored the need for sound and transparent regulation, including lessons from the history of tech regulation. Mr. Smith drew an analogy to the introduction of radio, noting that regulation and public concern arrived in the third decade of its existence. That is roughly where the internet stands today, 24 years after its popularization.
The conversation also explored the notion of internal regulation – the idea that private sector companies should take steps to limit the privacy and civil liberties risks of their products. Mr. Smith argued that it is important for companies to ask themselves hard questions about what they will do and how they will regulate it, as well as what they won’t do, whether by keeping certain products off the market or by declining to sell them to non-democratic actors. He added that tech companies bear a heightened responsibility in this space because their technologies and data have the potential to put lives at risk.
Although Microsoft has taken a strong stance on internal regulation – for example, pushing back on law enforcement requests for customer data in the past – Professor Friedman noted that smaller companies or those with less commitment to robust oversight may not be as judicious when they receive subpoenas and warrants. Mr. Smith said this illustrates the need for robust and transparent laws governing this behavior. “I don’t think companies can be a substitute for law or government, nor do I think they should be,” Mr. Smith explained.
When discussing external regulation, Mr. Smith observed that, at present, it can be difficult to get the government to pass the legislation needed to protect against the harms of new technology. But he expressed optimism about regulation through the courts (e.g., requiring warrants, excluding evidence) and about using coalition building, strategizing, and collaboration to gain traction for these efforts. “It’s not about whether you’re optimistic or not, but whether you’re determined. In order to have meaningful change we have to have determined people,” he noted.
Finally, the discussion turned to facial recognition and what lies ahead. While Microsoft has laid out its operating principles for implementing AI and machine learning, Mr. Smith said big concerns remain, including bias, discrimination, loss of privacy, and risks to democratic freedoms. This is, of course, one of the focus areas of the Policing Project, and an issue we feel passionately about. Our work will continue to push for greater transparency and public debate around the adoption of new policing technologies, including facial recognition.