Digital secretary Jeremy Wright announces board of Centre for Data Ethics and Innovation

© Crown Copyright

Wright also unveiled the first projects that will be led by the centre


Digital secretary Jeremy Wright has revealed the board members and early projects of the world's first Centre for Data Ethics and Innovation, the government's attempt to ensure that the UK leads the development of data-driven technologies that will benefit society.

Fertility and genetics expert Robert Winston, former Ofcom Chair Dame Patricia Hodgson and Sage Group VP of AI Kriti Sharma are among the expert advisers that will shape the centre's work under chairman Roger Taylor, the founder of healthcare data firm Dr Foster.

"Roger is a successful entrepreneur and passionate advocate for using data to improve lives, and I know that he will do an excellent job," Wright said at the Parliament and Internet Conference in Portcullis House, Westminster. "And as is clear today, the board will include many other world-renowned experts and leaders in their fields.

"The board will bring their immense and varied experience to tackling some of the greatest policy issues of our time."

The other board members are StarCount CEO Edwina Dunn, Digital Ethics Lab director Professor Luciano Floridi, Ethics Incubator founder Dr Susan Liautaud, Select Committee on AI member Kate Rock, Bishop of Oxford Dr Steven Croft, ASI Data Science director Richard Sargeant, Chief Inspector of Probation Dame Glenys Stacey, and Alan Turing Institute AI programme director Dr Adrian Weller.


Wright also disclosed details on the role and objectives of the centre. Its first projects will explore how data shapes online experience and investigate algorithmic bias, but its wider remit is to provide an ethical underpinning to the development of emerging technologies. Wright pointed to AI as a special area of focus. 

"The UK has the opportunity to be a world leader of this, and the world already looks to the UK to take a leadership role," he said. "But what's particularly special about AI in my view, is that unlike the many ways of technological development that have preceded it, we've got a real opportunity to develop the ethical structures that should go with those technological advancements in parallel with the development of the technology.

"And if we don't do that, not just will that be an ethical failures, but I happen to think that it will set back the development of AI purely in a development of technology sense. And the reason I say that is that AI requires data. If you are going to make the most of AI, you need to have a ready flow of data, and that data will be best available to you if people are giving it willingly, and they will only do that if they believe that the ethical safeguards - what's going to happen to their data - are properly protected. And that's what I think the CDEI will be able to do."

Data trust pilots

Wright also confirmed that the government's Office for AI will now work with the Open Data Institute (ODI) to explore the future potential of "data trusts", partnerships that allow multiple organisations to quickly and safely share data, such as local councils sharing food waste data with a recycling startup.

A pilot project will establish whether a legal structure that provides independent third-party stewardship of data will be useful to manage and safeguard data.

A further project will create a prototype data trust with City Hall and the Royal Borough of Greenwich, focusing on whether real-time data from IoT devices and sensors could be shared with the tech sector to create new digital solutions to city challenges.

"In 2018 we have become much more aware of who has access to data - data about ourselves, our family, our friends and our work," Jeni Tennison, CEO at the ODI, explained in a press release. 

"While we see many benefits from the use of data, such as being able to find local exercise classes using data from leisure centres thanks to OpenActive, or plan a train journey quickly and easily with an app using route and timetable data, there has also been misuse and harm, as we saw in the case of Facebook and Cambridge Analytica. Data trusts are a potential new way to help realise the benefits while preventing the harm. We're keen to explore them to find out where they might be useful."

Government plans taking shape

Plans to create a Centre for Data Ethics and Innovation were first announced by Wright's predecessor Matt Hancock in April.

In the government's 2017 Industrial Strategy, AI and data was identified as one of four areas in which the UK could lead the world in technology and potentially add £232 billion to its economy by 2030 - the equivalent of 10 percent of GDP.

The centre was established to ensure that any development is met with the appropriate governance, to advise the government on specific policies and regulations, and to set out best practices for the use of data.

"The rise of AI-driven products and services have posed new questions that will impact us all," said Wright. "Is it right to use technology to to be able to determine someone's likelihood of reoffending? Is it right to use a programme to make hiring decisions? Is it right to have an algorithm that dictates who should be saved in a car crash?

"This is not science fiction, but real questions that require clear and definitive answers from policy makers. That's why we recently established our Centre for Data Ethics and Innovation. The centre is a world-class advisory body to make sure that data and AI delivers the best possible outcomes in support of its innovative and ethical use. It's the first body of its kind to be established anywhere in the world and it represents a landmark moment for data ethics in the UK and internationally."

"Recommended For You"

How Google is looking to ensure AI development is ethical and fair Government publishes Open Data Institute plans