The chief executive of Alphabet and Google made it plain: artificial intelligence needs to be regulated. It is too important not to, Sundar Pichai wrote in a Financial Times opinion piece.
“The only question is how to approach it,” said Pichai succinctly.
That’s what the Office of the Privacy Commissioner of Canada (OPC) is grappling with as well.
The privacy commissioner has waded into the discussion with a consultation paper, “Proposals for ensuring appropriate regulation of artificial intelligence,” setting out 11 proposals that would amend the federal Personal Information Protection and Electronic Documents Act (PIPEDA).
“We are paying specific attention to AI systems given their rapid adoption for processing and analysing large amounts of personal information,” said the OPC in its consultation document. “Their use for making predictions and decisions affecting individuals may introduce privacy risks as well as unlawful bias and discrimination.”
The OPC goes further. It maintains that AI presents fundamental challenges to “all foundational privacy principles as formulated in PIPEDA,” and that PIPEDA itself “falls short” in its application to AI systems. A case in point is the data protection principle of limiting collection. Rather timidly, the OPC says that the principle “may be” incompatible with the basic functionality of AI systems, since data is the fuel that drives the technology.
AI also challenges the aim behind the purpose specification principle. Under this principle, organizations are expected to provide an explanation of the purpose of collection, usually through a notice. The other key element of the principle is that organizations should be “limiting use and disclosure” of personal information to the purpose for which it was first collected.
AI calls into question the practicality of the principle. In some cases, organizations may not know ahead of time how AI will use the information in the future, points out an Australian privacy regulator that has examined AI and privacy.
“There is a risk of excessive data collection beyond what is necessary ‘just in case’, using overly broad collection notices and privacy policies in an attempt to ‘catch-all’,” noted the Office of the Victorian Information Commissioner in a 2018 report. “This kind of practice allows organizations to claim technical compliance with their privacy obligations, but it is disingenuous and inconsistent with the underlying goal of the collection limitation principle.”
Against this backdrop, the OPC developed 11 rather prescriptive proposals, many of which, legal experts point out, have “direct parallels” to the European Union’s General Data Protection Regulation (GDPR).
“The prescriptive nature of the proposals, coupled with rules directed to a specific technology, are a shift in the OPC’s traditional positioning of PIPEDA as ‘principles based’ and ‘technology neutral,’” noted Laila Paszti, of counsel with Norton Rose Fulbright Canada LLP. “The risk of a rules-based approach, if adopted legislatively, is the loss of the current flexibility under PIPEDA and an accompanying risk of overly circumscribing AI system development and deployment.”
Many of the proposed changes to PIPEDA could significantly affect how organizations conduct business, particularly with respect to the collection and use of personal information through AI and machine-learning processes, lawyers from Bennett Jones LLP point out. On top of that, they note, several of the proposals could be applied by Parliament “so as to have a more general effect.”
The comment period for the proposals ends on March 13, 2020.