
The Open Data Institute posted a video titled “Regulating for responsible technology – is the UK getting it right?” as part of its ODI Fridays series last week. The talk was presented by Jacob Ohrvik-Scott, a researcher at Doteveryone, a UK-based think tank that promotes ideas on responsible tech.

In the video, Ohrvik talks about the state of digital regulation, the systemic challenges faced by independent regulators, and the need for an Office for Responsible Technology, an independent regulatory body, in the UK. Let’s look at the key takeaways from the video.

Ohrvik started off by talking about responsible tech and the three main factors it involves:

  • the unintended consequences of its applications
  • the kind of value that flows to and from the technology
  • the kind of societal context in which it operates

Ohrvik states that many people in the UK have been calling for an internet regulator to carry out various digital-safety related responsibilities. For instance, the NSPCC (National Society for the Prevention of Cruelty to Children) called for an internet regulator to make sure that children are safe online. Similarly, the Digital, Culture, Media and Sport Committee called for an ethical code of practice for social media platforms and big search engines.

Given that so many voices were calling for an independent internet regulator, Doteveryone decided to come out with its own set of proposals. It had previously carried out a survey of public attitudes towards, and understanding of, digital technologies. One of the main things respondents emphasized was greater accountability from tech companies; people were also supportive of the idea of an independent internet regulator.

“We spoke to lots of people, we did some of our own thinking and we were trying to imagine what this independent internet regulator might look like. But…we uncovered some more sort of deep-rooted systemic challenges that a single internet regulator couldn’t really tackle,” said Ohrvik.

Systemic challenges faced by an independent internet regulator

The systemic challenges presented by Ohrvik are the need for better digital capabilities among regulators, society’s need for an agency that can restore its trust in tech, and the need for evidence.

Better digital capabilities

Ohrvik cites the example of Christopher Wylie, the whistleblower in the Cambridge Analytica scandal. As per Wylie, one of the weak points of the system is its lack of tech knowledge: he was asked a lot of basic questions by the Information Commissioner’s Office (the UK’s data regulator) that a database engineer would consider elementary, which is indicative of the broader challenges faced by the regulatory system.

Tech awareness among the public is important

The second challenge is that society needs an agency that can help restore its trust in tech. In the survey Doteveryone conducted, when people were asked for their views on reading terms and conditions, 58% said that they don’t read them, 47% felt they have no choice but to accept terms and conditions on the internet, and 43% said there’s no point in reading them because tech companies will do what they want anyway. This last group of respondents especially signals a wider trend today, where the public feels disempowered and cynical towards tech.

This is also one of the main reasons why Ohrvik believes that a regulatory system is needed to “re-energize” the public and give them “more power”.

Everybody needs evidence

Ohrvik states that it’s hard to get evidence around online harms and some of the opportunities that arise from digital technologies. This is because:

  • a rigorous, longitudinal evidence base is needed
  • getting access to the data for the evidence is quite difficult (especially from a large private multinational company that doesn’t want to engage with government), and
  • it’s hard to look under the bonnet of digital technologies: thousands of algorithms and complexities make it difficult to make sense of what’s really happening.

Ohrvik then discussed the importance of having a separate office for responsible technology if we want to counteract the systemic challenges listed above.

Having an Office for responsible technology

Ohrvik states that the Office for Responsible Technology would do three broad things: empowering regulators, informing policymakers and the public, and supporting people to seek redress.

Empowering regulators

This would include analyzing the processes that regulators have in place to ensure they are up to date, and recommending the necessary changes to the government so the right plan can be put into action. Another main requirement is building up the digital capabilities of regulators, in a way that lets them pay for tech talent across the whole regulatory system, which in turn would help them understand the challenges related to digital technologies.

ODI: Regulating for responsible technology

Empowering regulators would also help shift their role from being reactive and slow towards being more proactive and fast-moving.

Informing policymakers and public

This would involve communicating with the public and policymakers about developments in tech regulation. It would also offer guidance and undertake longer-term engagement to promote positive change in the public’s relationship with digital technologies.

ODI: Regulating for responsible technology

For instance, a long-term campaign centered around media literacy could be conducted to tackle misinformation. Similarly, a long-term campaign could be run to help people better understand their data rights.

Supporting people to seek redress

This is aimed at addressing the power imbalance between the public and tech companies. It can be done by auditing the processes, procedures, and technologies that tech companies have in place to protect the public from harm.

ODI: Regulating for responsible technology

For instance, spot checks could be carried out on the algorithms or artificial intelligence used to detect harmful content. Complaint-handling and moderation processes could also be checked to make sure they’re working well, so that when certain processes fail the public, redress is straightforward. Spotting harms at an early stage in this way helps people and makes the regulatory system stronger.

In all, an Office for Responsible Technology is indispensable for promoting the responsible design of technologies and anticipating their impact on society. By working with regulators to develop approaches that support responsible innovation, such an office could foster a healthy digital space for everyone.

Read Next

Microsoft, Adobe, and SAP share new details about the Open Data Initiative

Congress passes ‘OPEN Government Data Act’ to make open data part of the US Code

Open Government Data Act makes non-sensitive public data publicly available in open and machine readable formats

Tech writer at the Packt Hub. Dreamer, book nerd, lover of scented candles, karaoke, and Gilmore Girls.