Privacy commissioner launches consultation on artificial intelligence

The chief executive of Alphabet and Google made it plain: artificial intelligence needs to be regulated. “It is too important not to,” wrote Sundar Pichai in a Financial Times opinion piece.

“The only question is how to approach it,” said Pichai succinctly.

That’s what the Office of the Privacy Commissioner of Canada (OPC) is grappling with as well.

Continue reading “Privacy commissioner launches consultation on artificial intelligence”

Blockchain operationalization

The federal government is cautiously wading in. Enticed by the promise of blockchain to simplify the management of trusted information in a secure fashion, the Canada Border Services Agency and the Port of Montreal, the country’s second-biggest port, are now testing a blockchain-enabled digital solution to see whether it will streamline freight shipping. Several provincial governments, too, are swayed by the prospect of delivering government digital services better, beginning with British Columbia, which launched OrgBook BC, a new blockchain-powered online search tool that lets users verify whether businesses and organizations are legally incorporated in the province.

Continue reading “Blockchain operationalization”

Cultural change is the biggest challenge law firms face in keeping up with technology

An overwhelming majority of law firm leaders believe technology will have the greatest impact on law firms over the next five years but are deeply concerned that cultural changes may prove to be a barrier in keeping up with new technology, according to a new report.

The global legal industry is at a tipping point, and there is an urgent need for law firms to consider the longer-term impact of technological change on their strategic and competitive market position, suggests a report by accountancy and business advisory firm BDO LLP. The report, entitled Law Firm Leaders Survey, polled the managing partners and senior partners of 50 international and United Kingdom law firms.

Law firm leaders believe that greater client demand, generational change and legal market consolidation will all affect the business of law over the next five years, but four out of five cited technology as the factor that will have the biggest impact. Yet while technology is considered a strategic priority by 94 per cent of the law firm leaders surveyed, it is a top strategic priority for only six per cent of them.

The practice of law, however, has been largely shielded from technological developments over the past fifty years, suffering little more than glancing blows. While the way legal professionals process and share information has evolved with new technologies, primarily the emergence of personal computers, email and the Internet, those developments did not fundamentally transform the practice itself.

“Technology’s impact on the legal sector is not new, but the degree of change over the next five to ten years is likely to be very different,” said the report.

Nearly one in five law firm leaders believe that artificial intelligence is the technology that will change their law firms. Some think it may replace the work of lawyers while others believe it will shed a significant layer of work and revenue from law firms. This in turn could bring about changes to the resourcing mix at law firms, the law firm model and its financial structures. Other law firm leaders would not go so far, believing the impact AI will have is unpredictable.

Artificial intelligence aside, nearly one in five law firm leaders believe that technology could lead to greater efficiency and productivity, but not necessarily a disruption to the business model. One in ten said that technology could lead their firms to offer new services to clients and it could change the way they are delivered. Technology, these law firm leaders believe, will enable or drive them to offer a wider range of services to clients, and provide better and deeper legal analysis.

“New technologies are likely to replace some of the routine work which is currently undertaken by junior lawyers,” said one law firm leader. “In turn, this will have an impact on the shape of the law firm of the future.”

But almost half (49 per cent) of law firm leaders said cultural change is the “greatest” challenge law firms face in keeping up with new technology. There is an interesting disparity, however, between global and UK firms. Twice as many global law firm leaders believe that cultural change amongst the partnership – as opposed to across the firm – is the main impediment. UK heads were three times more likely to say the challenge lies within the firm rather than at the partnership level.

Investment in technology, too, preoccupied global law firm leaders. Nearly 30 per cent of them viewed new investment funding as their greatest challenge, “perhaps because of the scale of investment many are looking to make in new technology.” The nature of the partnership model was another factor cited.

“In this new world where technology and changing client demands are causing firms to reconsider how legal services are delivered, is it feasible that law firms can continue to provide legal services in the same way they have done for decades?” asked Matthew White, international practice leader, professional services group at BDO, rhetorically. “Law firm leaders must accept that if they want to maintain competitive advantage they will have to be much bolder in their approach to overcoming these disruptive market changes.”

Legal profession concerned about algorithmic bias

Algorithms, the sets of instructions computers use to carry out tasks, have become an integral part of everyday life, and they are making inroads into the law. In the U.S., judges in some states can use algorithms as part of the sentencing process, and many law enforcement officials use them to predict when and where crimes are likely to occur. They have been used for years in law firm recruitment. And with advances in machine learning they are also being used to conduct legal research, predict legal outcomes, and determine which lawyers win before which judges.

Most algorithms are created with good intentions, but questions have surfaced over algorithmic bias at job-hunting websites, credit reporting bureaus and social media sites, and even in the criminal justice system, where sentencing and parole decisions appear to be biased against African Americans.
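The mechanics of this kind of bias are simple to sketch. In the hypothetical Python fragment below (all names and data are invented for illustration), a scoring “algorithm” learns nothing but historical hiring rates, and so faithfully reproduces the historical preference, with no malicious intent anywhere in the code:

```python
# Hypothetical illustration: an algorithm trained on biased historical
# decisions reproduces that bias. All data below is invented.
from collections import defaultdict

# Historical hiring records: (neighbourhood, years_experience, was_hired).
# Past decisions favoured neighbourhood "A" regardless of experience.
history = [
    ("A", 2, True), ("A", 1, True), ("A", 3, True), ("A", 0, False),
    ("B", 2, False), ("B", 4, False), ("B", 3, True), ("B", 5, False),
]

# "Training": learn the historical hire rate per neighbourhood.
totals = defaultdict(lambda: [0, 0])  # neighbourhood -> [hires, applicants]
for neighbourhood, _, hired in history:
    totals[neighbourhood][0] += int(hired)
    totals[neighbourhood][1] += 1

def score(neighbourhood: str) -> float:
    """Predicted hire probability, learned purely from past decisions."""
    hires, applicants = totals[neighbourhood]
    return hires / applicants

# Two equally qualified candidates receive very different scores.
print(score("A"))  # 0.75 -- inherits the historical preference
print(score("B"))  # 0.25
```

The point of the sketch is that nothing in the code mentions the protected characteristic as a goal; the bias arrives entirely through the training data.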

And the issue is likely to gain traction as machine learning and predictive coding become more sophisticated, particularly since with deep learning, which learns autonomously, algorithms can reach a point where humans can often no longer explain or understand them, said Nicolas Vermeys, assistant director of the Cyberjustice Laboratory in Montreal.

AlphaGo is a case in point. When AlphaGo, Google’s artificial intelligence system, defeated Lee Sedol, the 18-time world champion, at the ancient Chinese board game Go, it was not just another demonstration of a computer beating a human at a game. Go has simple rules but profound complexity, with more possible positions than there are atoms in the universe, leading some to describe it as the Holy Grail of AI gaming. The feat was remarkable because AlphaGo was not taught how to play Go. It learned how to play, and win, by playing millions of games, using a form of AI called deep learning, which relies on neural networks that allow computer programs to learn from experience. More than that, the victory showed that computers can now display something resembling intuition, which was long thought to be uniquely human.
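The difference between being programmed with a rule and learning it can be shown at toy scale. In this hypothetical Python sketch, a single perceptron, the simplest ancestor of the neural networks behind deep learning, is never told the rule for logical AND; it only sees labelled examples and adjusts its weights until it reproduces them:

```python
# Hypothetical sketch of learning from examples rather than explicit rules.
# A perceptron infers weights for the logical AND function; nobody codes
# the rule itself. Deep learning stacks millions of such learned units.

def step(x: float) -> int:
    """Threshold activation: fire (1) if the weighted sum is positive."""
    return 1 if x > 0 else 0

# Training examples: (input_a, input_b) -> expected output of AND.
examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]  # learned weights
b = 0.0         # learned bias
lr = 0.1        # learning rate

for _ in range(20):  # a few passes over the examples suffice here
    for (a, bit), target in examples:
        pred = step(w[0] * a + w[1] * bit + b)
        error = target - pred
        w[0] += lr * error * a
        w[1] += lr * error * bit
        b += lr * error

# The perceptron now reproduces AND, but its "knowledge" is just numbers.
print([step(w[0] * a + w[1] * bit + b) for (a, bit), _ in examples])  # [0, 0, 0, 1]
print(w, b)  # learned weights and bias: numbers, not an explanation
```

Even at this tiny scale, the model’s “knowledge” is a handful of learned numbers; at the scale of millions of weights, as Vermeys notes below, humans can often no longer explain the result.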

Another example is Deep Patient. The brainchild of a research group at Mount Sinai Hospital in New York, it is a machine learning tool trained to detect illness using the records of approximately 700,000 patients. Deep Patient turned out to be good at detecting hidden patterns in the hospital’s data that indicate when people are becoming ill. It also appears to be remarkably good at anticipating the onset of schizophrenia, a disease that is very difficult for physicians to predict. Yet the researchers behind Deep Patient do not yet understand why it is good at predicting schizophrenia, or how it works.

“We have no idea how algorithms arrived at their decision and therefore cannot evaluate whether the decision has value or not,” said Vermeys, whose research institution is studying the issue of algorithmic bias. “There is a risk to relying completely on machines without necessarily understanding its reasoning.”

No human is completely objective, and the same holds for algorithms because they are written by programmers, noted Ian Kerr, a law professor at the University of Ottawa and the Canada Research Chair in Ethics, Law and Technology. Programmers operate on premises and presumptions that are not tested by anybody else, Kerr added, which produces results shaped by those premises and presumptions, and that in turn gives rise to bias.

On top of that it is very difficult to challenge such decisions because “whoever owns the algorithms has trade secrets, isn’t likely to show you the source code, isn’t likely to want to talk about the secret source and what makes the algorithm work,” said Kerr. “What justifies the algorithm is its success or perceived success which is very different from whether or not it operates in biased ways.”

Aaron Courville, a professor with the Montreal Institute for Learning Algorithms, shares those concerns. “We are really in a phase where these algorithms are starting to do interesting things, and we need to take seriously the issues of responsibility,” said Courville.

Europe is taking a serious look at these issues. Under the European Union’s new General Data Protection Regulation (GDPR), automated individual decision-making that “significantly affects” users will be restricted, argue Bryce Goodman of the Oxford Internet Institute and Seth Flaxman of the University of Oxford’s Department of Statistics in a paper. Expected to come into force in 2018, the GDPR will also effectively create a “right to explanation,” according to the authors. In other words, users will be able to ask for an explanation of an algorithmic decision made about them.
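What such an explanation might look like is easiest to see for a deliberately simple model. In this hypothetical Python sketch (the weights, features and applicant values are invented for illustration), a linear scoring model can at least report each factor’s contribution to its decision, a rudimentary form of the explanation the GDPR contemplates, and something genuinely opaque models cannot readily provide:

```python
# Hypothetical sketch: for a simple linear scoring model, a per-feature
# contribution breakdown is one rudimentary form of "explanation".
# Weights and applicant values are invented for illustration.

weights = {"income": 0.5, "debt": -0.8, "years_employed": 0.3}
applicant = {"income": 4.0, "debt": 2.0, "years_employed": 3.0}

def decide(person: dict) -> tuple:
    """Return the raw score and the approve/deny decision."""
    total = sum(weights[f] * person[f] for f in weights)
    return total, total > 0.0

def explain(person: dict) -> list:
    """Per-feature contributions to the score, largest magnitude first."""
    contributions = {f: weights[f] * person[f] for f in weights}
    return sorted(contributions.items(), key=lambda kv: -abs(kv[1]))

score, approved = decide(applicant)
print(approved)            # True: score = 2.0 - 1.6 + 0.9 = 1.3
print(explain(applicant))  # income +2.0, debt -1.6, years_employed +0.9
```

For a deep learning system, no such direct decomposition exists, which is exactly the tension Kerr describes.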

“This is where Europe and the U.S. go wild in their disagreements,” explained Kerr, who has also written about the issue of a right to explanation. “Europe starts with this principled approach that makes sense. If a decision is about me and it has sort of impacts on my life chances and opportunities, I should be able to understand how that decision was made. It invokes large due process concerns.

“The due process idea is that no important decision should be made about me without my own ability to participate. I have a right to a hearing. I have a right to ask questions. So all of these kinds of rights are kind of bound up in this notion of the duty to an explanation. And the hard thing is that an algorithm isn’t in the habit of explaining itself, which means that if that kind of law prevails then people who use algorithms and design algorithms will have to be a lot more forthcoming about the mechanisms behind the algorithm.”


Further reading:

Machine bias: There’s software used across the country to predict future criminals. And it’s biased against blacks by ProPublica, an American independent, nonprofit news organization.

Chief Justice John Roberts is a Robot by University of Ottawa law professor Ian Kerr.

And for the technologically inclined:

Mastering the Game of Go with Deep Neural Networks and Tree Search by David Silver, the lead researcher on the AlphaGo project.

Artificial intelligence: Law firms are a hard sell

Fernando Garcia is looking forward to the day when he can get his hands on Beagle, an automated contract analysis system powered by artificial intelligence that reads contracts in seconds, highlights key information visually with easy-to-read graphs and charts, and gets “smarter” with each reviewed contract. Also on his bucket list is an offering by another Canadian legal tech start-up, Blue J Legal, which also uses AI to scan legal documents, case files and decisions to predict how courts will rule in tax cases. At a time when the majority of in-house counsel are under intense pressure to shave costs and run a lean team, such powerful tools are a godsend. “There’s always that pressure to do more with less so when a tool comes along that can provide more efficiency, more risk mitigation, and can let you do your job better and focus on providing value-added, it is a strategic advantage,” noted Garcia, general counsel, government affairs and corporate secretary with Nissan Canada Inc. “It’s going to fundamentally change our job.”

Continue reading “Artificial intelligence: Law firms are a hard sell”