Computing, encompassing the use of digital technology, data and information, now touches almost every aspect of human activity. The Digital Age has rocked traditional legal and regulatory frameworks across many sectors of the economy. It has disrupted government and is challenging accepted rights of access to information.

Elizabeth Tydd, NSW Information Commissioner, spoke at the Objective Collaborate conference in Sydney about the role of transparency and openness in the Digital Age, and the importance of governments being accountable for decisions that are increasingly influenced by algorithms.

The Commissioner explored a range of issues that are impacting legal rights in the digital context and the risks posed to those rights. She discussed key developments in the landscape for information governance, the value of promoting access to information and some contemporary solutions being applied to uphold these rights and deliver the benefits of technology in ways that ensure we ‘leave no one behind’.

An increasing reliance on algorithms
Emerging technologies, including artificial intelligence (AI), machine learning, data sharing and information security, raise critical legal, regulatory and ethical questions that require innovative responses.

Governments around the world are harnessing the benefits of digital technology and in doing so they are engaging with established industry participants in largely unregulated environments. In many cases the risks are not yet fully understood.

From loan approvals, to recruiting, to legal sentencing, to university admissions, private and public sector organisations are increasingly reliant on algorithms to support decision-making. These decisions can have significant impacts on the lives, livelihoods and privacy of individuals – as well as demographic groups or even whole communities.

Digital transformation has brought tremendous optimism for solving some of the world’s most difficult problems such as poverty, disease and violence. The challenge is to ensure we avoid creating new problems in the process, and that we maintain mechanisms for integrity and accountability.

‘Algorithms can create barriers to information access, that lead to social or economic inequality,’ says Elizabeth Tydd, NSW Information Commissioner, speaking at Objective’s recent Collaborate conference. ‘For example, a social housing tenant, or their social housing landlord, may not be able to challenge a subsidy decision if the algorithm used to calculate that subsidy is inaccessible to them.’

‘Algorithms may also reflect the bias of programmers or deficiencies in the underlying data,’ she continues. ‘For example, women typically have lower superannuation balances. If super balance is taken as a variable in a predictive algorithm that responds to higher superannuation balances, the results may be skewed against women.’
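The superannuation example can be made concrete with a toy model. The sketch below is purely illustrative: the scoring function, weights and figures are invented for this article and do not describe any real decision system. It simply shows how a score that rewards an input with a known demographic disparity will reproduce that disparity, even when the applicants are otherwise identical.

```python
# Hypothetical illustration of how a single skewed input variable can
# bias a predictive score. The weights and data are invented for the
# example; they do not describe any real decision system.

def approval_score(income, super_balance):
    """Toy linear score that rewards higher superannuation balances."""
    return 0.5 * income + 0.5 * super_balance

# Two applicants with identical incomes. Statistically, women in
# Australia hold lower average superannuation balances.
applicant_a = {"income": 70_000, "super_balance": 150_000}  # higher balance
applicant_b = {"income": 70_000, "super_balance": 90_000}   # lower balance

score_a = approval_score(**applicant_a)
score_b = approval_score(**applicant_b)

# Identical earning capacity, yet the lower-balance applicant scores
# worse purely because of the disparity carried by the input variable.
assert score_a > score_b
```

The point is not that the variable is malicious, but that without transparency about which variables the algorithm uses, the affected person has no way to identify or challenge the source of the skew.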

The transparency challenge
Ms Tydd emphasises the importance of promoting access to information to ensure that we leave no one behind in the Digital Age. In particular, transparency around algorithms themselves – understanding how a decision was made and on what information it was based – is critical for protecting or exercising legal rights.

Technical transparency – revealing the source code, inputs, and outputs of the algorithm – may help build trust by showing that a decision-making process is fair. Yet organisations may regard their algorithms as highly valuable intellectual property and they are not legally obliged to reveal their source code.

If source code can’t be revealed, one option is to give regulators or auditors the authority to step in and adjudicate, reassuring consumers that the process is fair. Ms Tydd emphasised that transparency remains crucial under government services contracts. Algorithms may also be opaque for other reasons: they can be so complex that even their creators find them notoriously difficult to understand.

Many experts have called for rules to make the inner workings of algorithms transparent. For example, the EU’s General Data Protection Regulation (GDPR) is tackling ‘intentional concealment’. Although regulations can’t resolve the technical challenges of algorithmic transparency, they can mandate that people be able to demand the data and decision-making variables behind the algorithmic decisions made about them.

This is just one of the ways that information governance and open governance can support the wider use of technology whilst protecting people’s rights and entitlements.

Building public trust and the need for ethical governance
‘As algorithms become increasingly complex and inaccessible, we need to build public trust in how we manage information,’ says Ms Tydd. These sentiments were echoed by the NSW Minister for Customer Service, Victor Dominello, at the NSW AI Thought Leaders’ Summit. ‘If we don’t resolve governance around AI, then we are asking the future to jump off the plane without a parachute,’ he said.

People are inundated with information, but many find it challenging to evaluate the reliability of that information or its sources. They may have limited understanding of how their personal information is used and shared by algorithms, devices or organisations.

Information governance has an important role to play. At Objective, we have been investigating the concept of open governance, to support organisations in balancing both privacy and security with openness and transparency. A robust governance framework defines what types of records should be created to document the development and operation of an algorithm, and the characteristics, quality and provenance of its supporting data. It also establishes who should have the right to access those records, how long they should be kept and when they should be disposed of, taking account of privacy, intellectual property and other rights and obligations, including broader public interests.
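One way to picture such a framework is as a set of machine-readable record rules. The sketch below is only an illustration of the idea described above; every record type, access role and retention period here is invented, not drawn from any actual governance standard.

```python
# A minimal sketch of how the record-keeping rules described above might
# be captured in a machine-readable governance schema. All field names
# and values are invented for illustration.

from dataclasses import dataclass

@dataclass
class RecordRule:
    record_type: str       # what kind of record must be created
    documents: str         # which aspect of the algorithm it evidences
    access: list           # roles entitled to access the record
    retention_years: int   # how long the record must be kept
    disposal: str          # what happens after the retention period

framework = [
    RecordRule(
        record_type="algorithm-design-log",
        documents="development and change history of the model",
        access=["internal-audit", "regulator"],
        retention_years=7,
        disposal="secure destruction",
    ),
    RecordRule(
        record_type="training-data-provenance",
        documents="characteristics, quality and provenance of input data",
        access=["internal-audit", "regulator", "affected-individuals"],
        retention_years=10,
        disposal="archive then destroy",
    ),
]

# An 'open governance' approach would publish this schema itself (not
# the underlying records) so stakeholders can see what is kept, by whom
# it can be accessed, and for how long.
for rule in framework:
    print(rule.record_type, rule.retention_years)
```

Publishing the rules rather than the records is what allows openness and privacy to coexist: people can verify that appropriate records exist and who may see them, without the records themselves being exposed.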

These records support transparency of algorithms and accountability for AI-enabled decisions. ‘Open governance’ proposes that organisations should make their information governance frameworks transparent to stakeholders. This enables people to make informed choices. It reassures them that their interests are recognised and protected, contributing to trusted relationships, which are essential for effective government and successful business outcomes.

Building trust relies on high levels of transparency and practising sound digital ethics as part of a robust information governance framework. ‘To ensure legitimacy and trust, governments must maintain clearly established democratic principles and uphold the public interest in the application of these technologies,’ confirms the Commissioner. ‘We must ensure we leave no one behind.’

You can download this Insight Paper to learn more about Objective’s research into information access and building a culture of openness, or contact us directly to discuss solutions that help you put principles into practice.

Sources

Elizabeth Tydd, NSW Information Commissioner, ‘Information governance key developments: Promoting access to information and ensuring that we leave no one behind in the Digital Age’, presentation at the Collaborate conference hosted by Objective Corporation, Sydney, 2019.

Harvard Business Review, ‘We need transparency in algorithms, but too much can backfire’.

Forbes Magazine, ‘What will come after the information age?’.