Good afternoon, ladies and gentlemen. It's a pleasure to be here to speak to you today. I've worked with Michael for many years, so it's wonderful to be here with him to speak on these important issues.
What struck me in what you will be doing—I'm just going to read it out—is that your committee is to “undertake a study of digital government services, to understand how the government can improve services for Canadians while also protecting their privacy and security”.
That is so vitally important. That's why I want to address something I created years ago called privacy by design, which is all about abandoning the zero-sum models of thinking that prevail in our society. Zero-sum just means that you can have a positive gain in only one area, security, always to the detriment of the other area, privacy, so that the two total to a sum of zero.
That either-or, win-lose model is so dated. What I would like you to embrace today is something called positive sum. Positive sum just means that you can have two positive gains in two areas at the same time. It's a win-win proposition.
It was started years ago. I did my Ph.D. at the U of T when the father of game theory, Anatol Rapoport, was there. We used to discuss this. I always remember saying, “Why do people embrace zero-sum?” I am the eternal optimist. I would much rather deliver multiple wins than an either-or compromise. He said, “It's simple, Ann. Zero-sum is the lazy way out, because it's much easier just to deliver one thing and disregard everything else.”
I want you to do more, and I think you want to. You want to deliver privacy and security as well as government improvements that can improve services to Canadians.
My privacy by design framework is predicated on proactively embedding much-needed privacy-protective measures into the design of your operations and your policies, for whatever new services you want to develop and whatever you want to do in terms of data utility, but we do that along with privacy and security. It's a multiple-win model. It's privacy and data utility and services to individuals. You can fill in the blanks, but it's “and”, not “versus”. It's not one to the exclusion of the other. But how do you do both?
I know that I only have 10 minutes and I've probably used up five, so I'm going to keep the rest short.
In the privacy world, there's a key concept called data minimization. It's all about de-identifying data so that you can benefit from the value of the data to deliver much-needed services in other areas of interest to Canadians and individuals without forfeiting their privacy. When you de-identify personally identifiable data, both the direct and indirect identifiers, then you free the data, if you will, from the privacy restrictions, because privacy issues arise and end with the identifiability of the data. If the data are no longer personally identifiable, then there may be other issues related to the data, but they're not going to be privacy-related issues.
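The idea of stripping both direct and indirect identifiers can be sketched in a few lines. This is a minimal illustration only, not any government's actual pipeline: the record fields, the generalization rules, and the salted token for longitudinal linkage are all assumptions made for the example.

```python
import hashlib

# Hypothetical record layout for illustration; field names are assumptions.
record = {
    "name": "Jane Doe",          # direct identifier
    "sin": "123-456-789",        # direct identifier
    "postal_code": "K1A 0A6",    # indirect (quasi-) identifier
    "birth_date": "1980-06-15",  # indirect (quasi-) identifier
    "service_used": "passport renewal",
}

def de_identify(rec, salt="per-dataset-secret"):
    """Remove direct identifiers and generalize indirect ones."""
    out = dict(rec)
    # Drop direct identifiers entirely.
    out.pop("name", None)
    out.pop("sin", None)
    # Generalize quasi-identifiers so records are harder to re-link.
    out["postal_region"] = out.pop("postal_code")[:3]  # forward sortation area only
    out["birth_year"] = out.pop("birth_date")[:4]      # year only
    # A salted one-way token, only if longitudinal linkage is truly needed.
    out["record_token"] = hashlib.sha256(
        (salt + rec["sin"]).encode()
    ).hexdigest()[:16]
    return out

print(de_identify(record))
```

The resulting record keeps its analytic value (region, year, service) while the fields that make it personally identifiable are gone or coarsened.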
Data minimization and de-identification will drive this goal of having what I call multiple positive gains at the same time, making it a win-win proposition. I think it will make governments more efficient. You will be able to use the data that you have available and you will always be protecting citizens' personal information at the same time. That's absolutely critical.
I am happy to speak more. I can speak on this issue forever, but I want to be respectful of my time restrictions. I will gladly turn it over to you and answer any questions that you may have.
All right. Great. I don't think my wife is listening in.
Voices: Oh, oh!
Dr. Michael Geist: Good afternoon, everybody. My name is Michael Geist. I'm a law professor at the University of Ottawa, where I hold the Canada research chair in internet and e-commerce law and am a member of the Centre for Law, Technology and Society.
My areas of speciality include digital policy, intellectual property and privacy. I served for many years on the Privacy Commissioner of Canada's external advisory board. I have been privileged to appear many times before committees on privacy issues, including on PIPEDA, Bill , Bill , the Privacy Act and this committee's review of social and media privacy. I'm also chair of Waterfront Toronto's digital strategy advisory panel, which is actively engaged in the smart city process in Toronto involving Sidewalk Labs. As always, I appear in a personal capacity as an independent academic representing only my own views.
This committee's study on government services and privacy provides an exceptional opportunity to tackle many of the challenges surrounding government services, privacy and technology today. Indeed, I believe what makes this issue so compelling is that it represents a confluence of public sector privacy law, private sector privacy law, data governance and emerging technologies. The Sidewalk Labs issue is a case in point. While it's not about federal government services—it's obviously a municipal project—the debates are fundamentally about the role of the private sector in the delivery of government services, the collection of public data and the oversight or engagement of governments at all levels. For example, the applicable law for that project remains somewhat uncertain. Is it PIPEDA? Is it the provincial privacy law? Is it both? How do we grapple with some of these new challenges when even determining the applicable law is not a straightforward issue?
My core message today is that looking at government services and privacy requires more than just narrowly examining what the federal government is doing to deliver services, assessing the privacy implications and then identifying what rules or regulations could be amended or introduced to better facilitate services that both meet the needs of Canadians and provide them with the privacy and security safeguards they rightly expect.
I believe the government services of tomorrow will engage a far more complex ecosystem that involves not just the conventional questions of the suitability of the Privacy Act in the digital age. Rather, given the overlap between public and private, between federal, provincial and municipal, and between domestic and foreign, we need a more holistic assessment that recognizes that service delivery in the digital age necessarily implicates more than just one law. These services will involve questions about sharing information across government or governments, the location of data storage, the transfer of information across borders, and the use of information by governments and the private sector for data analytics, artificial intelligence and other uses.
In other words, we're talking about the Privacy Act, PIPEDA, trade agreements that feature data localization and data transfer rules, the GDPR, international treaties such as the forthcoming work at the WTO on e-commerce, community data trusts, open government policies, Crown copyright, private sector standards and emerging technologies. It's a complex, challenging and exciting space.
I would be happy to touch on many of those issues during questions, but in the interest of time I will do a slightly deeper dive into the Privacy Act. As this committee knows, that is the foundational statute for government collection and use of personal information. Multiple studies and successive federal privacy commissioners have tried to sound the alarm about legislation that is viewed as outdated and inadequate. Canadians understandably expect that the privacy rules that govern the collection, use and disclosure of their personal information by the federal government will meet the highest standards. For decades we have failed to meet that standard. As pressure mounts for new uses of data collected by the federal government, the necessity of a “fit for purpose” law increases.
I would like to point to three issues in particular with the federal rules governing privacy and their implications. First is the reporting power. The failure to engage in meaningful Privacy Act reform may be attributable in part to the lack of public awareness of the law and its importance. Privacy commissioners played an important role in educating the public about PIPEDA and broader privacy concerns. The Privacy Act desperately needs a similar mandate for public education and research.
Moreover, the notion of limiting reporting to an annual report reflects a bygone era. In our current 24-hour, social media-driven news cycle, information that touches on the privacy of millions of Canadians can't be permitted to remain outside the public eye until an annual report can be tabled. Where the commissioner deems it in the public interest, the office must surely have the power to disclose in a timely manner.
Second is limiting collection. The committee has heard repeatedly that the Privacy Act falls woefully short in meeting the standards of a modern privacy act. Indeed, at a time when government is expected to be the model, it instead requires less of itself than it does of the private sector.
A key reform, in my view, is the limiting collection principle, a hallmark of private sector privacy law. The government should similarly be subject to collecting only that information that is strictly necessary for its programs and activities. This is particularly relevant with respect to emerging technologies and artificial intelligence.
The Office of the Privacy Commissioner of Canada, which I know is coming in later this week, recently reported on the use of data analytics and AI in delivering certain programs. The report cited several examples, including Immigration, Refugees and Citizenship Canada's temporary resident visa predictive analytics pilot project, which uses predictive analytics and automated decision-making as part of the visa approval process; the CBSA's use of advanced analytics in its national targeting program with passenger data involving air travellers arriving in Canada; and the Canada Revenue Agency's increasing use of analytics to sort, categorize and match taxpayer information against perceived indicators of risks of fraud.
These technologies obviously offer great potential, but they also may encourage greater collection, sharing and linkage of data. That requires robust privacy impact assessments and considerations of the privacy cost benefits.
Finally, we have data breaches and transparency. Breach disclosure legislation, as I'm sure you know, has become commonplace in the private sector privacy world, and it has long been clear that similar disclosure requirements are needed within the Privacy Act. Despite its importance, it took more than a decade in Canada to pass and implement data breach disclosure rules for the private sector, and even after that long wait, we're still waiting for the equivalent at the federal government level.
Again, as this committee knows, data indicate that hundreds of thousands of Canadians have been affected by breaches of their private information. The rate of reporting of those breaches remains low. If the public is to trust the safety and security of their personal information, there is a clear need for mandated breach disclosure rules within government.
Closely related to the issue of data breaches are broader rules and policies around transparency. In a sense, the policy objective is to foster public confidence in the collection, use and disclosure of their information by adopting transparent open approaches with respect to policy safeguards and identifying instances where we fall short.
There has been a recent emphasis on private sector transparency reporting: large Internet companies, such as Google and Twitter, have released transparency reports, and they've been joined by some of Canada's leading communications companies, such as Rogers and Telus. Remarkably, though, there are still some holdouts. For example, Bell, the largest player of all, still does not release a transparency report in 2019.
Those reports, though, still represent just one side of the story. The public's understanding of the world of requests and disclosures would be even better informed if governments also released transparency reports. These need not implicate active investigations, but there's little reason that government should not be subject to the same kinds of expectations on transparency as the private sector.
Ultimately, we need rules that foster public confidence in government services by ensuring there are adequate safeguards and transparency and reporting mechanisms to give the public the information it needs about the status of their data and appropriate levels of access so the benefits of government services can be maximized.
None of that is new. What may be new is that this needs to happen in an environment of changing technologies, global information flows and an increasingly blurry line between public and private in service delivery.
I look forward to your questions.
Thank you both for attending before this committee.
The study of digital government is a huge topic. We began it last year and then back-burnered it, because of the Cambridge Analytica, Facebook and AggregateIQ study.
I was fascinated when I spent some time last year with Prime Minister Juri Ratas of Estonia. He showed me the card, the chip it contains and the fact that it's basically cradle-to-grave data. They've had a couple of breaches and glitches with their chip manufacturer, but it's a fascinating concept.
I'd like to ask both of you this. The Estonian digital government model was built in a fledgling democracy after the collapse of the Soviet Union, with a still-compliant society that accepted the decision of its new government leaders to democratically impose this new digital government on the population. In our context, our wonderful Canadian Confederation has seen, through 150-plus years, democratic challenges to government, with skepticism and cynicism in many ways with regard to significant changes in government, and referenda on any number of issues. I'm just wondering, for any government, whether federal, provincial, regional or municipal, in any of the contexts, how practical the pursuit of a single card with a chip à la Estonia is for Canada and Canadians.
Dr. Cavoukian, would you like to go first?
Forgive me; I was shaking my head. Estonia is highly respected, no question. I personally would not want to go with one card with one chip that contained all your data. That's a centralized model that is just going to be so problematic, in my view, not only now but especially in the future.
There are so many developments. You may have heard of what's happening in Australia. They've just passed a law that allows the government there to have a back door into encrypted communications. Why do you encrypt communications? You want them to be secure and untouched by the government or by third parties, unauthorized parties. Australia has passed a law that allows it to gain back-door access into your encrypted communications and you won't know about it. No one can tell you about it. It is appalling to me.
Personally, I am not in favour of one identity card, one chip, one anything.
Having said that, I think we have to go beyond the existing laws to protect our data and find new models, and I say this with great respect. I was privacy commissioner of Ontario for three terms, 17 years. Of course we had many laws here and I was very respectful of them, but they were never enough. It's too little too late. Laws always seem to lag behind emerging technologies and developments. That's why I developed privacy by design. I wanted a proactive means of preventing the harms from arising, much like a medical model of prevention. Privacy by design was unanimously passed as an international standard in 2010. It has been translated into 40 languages and it has just been included in the latest law that came into effect last year in the European Union called the General Data Protection Regulation. It has privacy by design in it.
The reason I'm pointing to this is that there are things we can do to protect data, to ensure access to the data, digital access by governments when needed, but not across the board, and not create a model of surveillance in which it's all in one place, an identity card, that can be accessed by the government or by law enforcement.
You might say that the police won't access it unless they have a warrant. Regrettably, to that I have to say nonsense. That's not true. We have examples of how the RCMP, for example, has used what are called Stingrays. These devices impersonate cellphone towers so police can access the cellphone communications of everyone in a given area when they're looking for the bad guy. Of course, if they have a warrant, I'd say to them, “Be my guest, by all means. Go search for him.” Did they have a warrant? No. They did this without anyone knowing, but CBC outed them, and they finally had to come clean that they were doing this.
With the greatest of respect and not to say anything negative about Estonia, that's not the direction I would want us to take here, one of greater centralization. I would avoid that.
I didn't resign lightly. I want to assure you of that.
Sidewalk Labs retained me as a consultant to embed privacy by design—my baby, which I've been talking to you about—into the smart city they envisioned. I said, “I'd be very pleased to do that, but know that I could be a thorn in your side, because that will be the highest level of privacy, and in order to have privacy in a smart city...”. In a smart city, you're going to have technologies on 24-7, with sensors and everything always on. There's no opportunity for citizens to consent to the collection of their data or not. It's always on.
I said that in that model we must de-identify data at source, always, meaning that when the sensor collects your data—your car, yourself, whatever—you remove all personal identifiers, both direct and indirect, from the data. That way, you free the data from privacy considerations. You still have to decide who's going to do what with the data. There are a lot of issues, but they're not going to be privacy-related issues.
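The "de-identify at source" idea means the stripping happens on the sensor itself, before anything is stored or transmitted. Here is a minimal sketch under stated assumptions: the reading fields, the zone/count payload, and the rotating per-session salt are all invented for illustration and are not Sidewalk Labs' or anyone's actual design.

```python
import hashlib
import secrets

# A per-session random salt, rotated periodically and never stored, so device
# identifiers a sensor might see (e.g. Wi-Fi MAC addresses) cannot be
# re-linked across sessions. Illustrative sketch only.
SESSION_SALT = secrets.token_hex(16)

def at_source(reading):
    """De-identify a sensor reading before it ever leaves the device."""
    return {
        # Keep only what the service needs: coarse location and a count.
        "zone": reading["zone"],
        "pedestrian_count": reading["pedestrian_count"],
        # Replace the device identifier with an unlinkable token; the precise
        # GPS fix is simply never forwarded.
        "token": hashlib.sha256(
            (SESSION_SALT + reading["mac_address"]).encode()
        ).hexdigest()[:12],
    }

raw = {"zone": "intersection-12", "pedestrian_count": 4,
       "mac_address": "aa:bb:cc:dd:ee:ff", "gps": (43.6426, -79.3871)}
clean = at_source(raw)
print(clean)  # no MAC address, no precise GPS fix
```

The design point is that the identifying fields never exist downstream, so no later policy decision by a data trust or vendor can re-expose them.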
I didn't have any push-back from them, believe it or not. I didn't. They agreed to those terms. I said that to them right at the initial hiring.
What happened was that they were criticized by a number of parties in terms of the data governance and who was going to control the uses of the data, the massive amounts of data. Who will exercise control? It shouldn't just be Sidewalk Labs.
They responded to that by saying they were going to create something called a civic data trust, which would consist of themselves and members of various governments—municipal, provincial, etc.—and various IP companies were going to be involved in the creation of it. But they said, “We can't guarantee that they're all going to de-identify data at source. We'll encourage them to do that, but we can't give any assurance of that.”
When I heard that, I knew I had to step down. This was done at a board meeting in the fall. I can't remember when. Michael will remember. The next morning, right after the meeting, I issued my resignation, and the reason was this: The minute you leave this as a matter of choice on the part of companies, it's not going to happen. Someone will say, “No, we're not going to de-identify the data at source.”
Personally identifiable data has enormous value. That's the treasure trove. Everybody wants it in an identifiable form. You basically have to say what I said to Waterfront Toronto afterwards. They called me, of course, right after my resignation, and I said to them, “You have to lay down the law. If there is a civic data trust, or whoever is involved in this, I don't care, but you have to tell them that they must de-identify data at source, full stop. Those are the terms of the agreement.” I didn't get any push-back from Waterfront Toronto.
That's why I left Sidewalk Labs. I'm now working for Waterfront Toronto to move this forward, because they agree with me that we need to de-identify data at source and protect privacy. You see, I wanted us to have a smart city of privacy, not a smart city of surveillance. I'm on the international council of smart cities—smart cities all around the world—and virtually all of them are smart cities of surveillance. Think of Dubai, Shanghai and other jurisdictions. There is no privacy in them. I wanted us to step up and show that you can create a smart city of privacy. I still believe we can do that.
Thank you both for attending.
To begin I want to clarify a bit of a misconception in some of the questions from Mr. Kent with respect to the e-ID in Estonia. It is not a mini computer that centralizes all personal information. In fact, the very foundation of the Estonian digital government is decentralization. The digital ID is an identity card that allows them to access the system, but it's not storing mountains of personal information.
What I really want to get at, and I think the usefulness of this study, is to ask how we can apply the idea of privacy by design to digital government so that we can actually improve services for Canadians.
At the outset I would note that according to Estonia's public information, nearly 5,000 separate e-services enable people to run their daily errands without having to get off their computer at home. As a Canadian who wants better service out of his government, I want that. How do we alleviate privacy concerns from the get-go so we get better service?
If we look at the Estonian model, we have a digital ID. We have a separation of information between departments using X-Road and blockchain technology. Then we have transparency in the sense that when a government employee accesses my information, I can see who did it and it's time-stamped as to when they did it. If you add those layers of detail into a digital government system, is that sufficient to address privacy concerns? Are there other things we should be doing if we're looking to digital government?
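The time-stamped access transparency described above can be sketched as a tamper-evident audit log, where each entry is hash-chained to the previous one so retroactive edits are detectable. This is a toy illustration loosely inspired by the Estonian model as described here, not its actual X-Road implementation; the class, field names and ID formats are assumptions.

```python
import hashlib
import json
from datetime import datetime, timezone

class AccessLog:
    """Minimal hash-chained log of who accessed whose record, and when."""

    def __init__(self):
        self.entries = []

    def record_access(self, official_id, citizen_id, purpose):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "official": official_id,
            "citizen": citizen_id,
            "purpose": purpose,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        # Chain each entry to its predecessor; "hash" is computed over the body.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Re-derive every hash; any after-the-fact edit breaks the chain."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if e["prev_hash"] != prev:
                return False
            if hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest() != e["hash"]:
                return False
            prev = e["hash"]
        return True

    def accesses_for(self, citizen_id):
        """What a citizen would see: who accessed their record, and when."""
        return [(e["official"], e["timestamp"], e["purpose"])
                for e in self.entries if e["citizen"] == citizen_id]

log = AccessLog()
log.record_access("clerk-042", "citizen-7", "address update")
log.record_access("clerk-311", "citizen-7", "benefits review")
print(log.accesses_for("citizen-7"))
print(log.verify())  # True
```

The citizen-facing view (`accesses_for`) and the integrity check (`verify`) are the two properties the question points to: visibility into every access, and confidence the record of those accesses hasn't been quietly altered.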
I'll start with Dr. Cavoukian and then Dr. Geist.
I think I understand. We're skating a little further ahead.
Let's say we start today. Estonia's benefit is that they were a new country and they were starting fresh, so they wouldn't have to convert. They're a small country with a small number of people, relatively speaking. Let's say we start today. It seems to me that the first step we have to do.... I agree with Ms. Cavoukian that we can keep the silos, and I agree that it's the safer approach. What Estonia does is they have a backbone so you can come in and go here or you can come in and go there, but it's not all one big database.
I also believe it will be a lot easier to build out from our existing silos, as opposed to trying to do it.... I'm in agreement with that, but it seems to me that if we're going to do it, we have to start off with a digital link, okay? Let's say I'm Frank Baylis and I just showed up on the system. “Okay,” it says, “prove to me you're Frank Baylis.” Right now, it says to type in my SIN, that number, and that's pretty easy to rip off, right? Whereas if it says, “Let's get a scan of your eyes” or “Let's get some biometrics” and some questions asked and all of that, it seems to me that, to your point, you could have privacy and security.
I think that was the first statement you made, Ms. Cavoukian. Can we not start there and have an agreement on that before we get into all the other stuff?
I'll pass it back. I cut you off. I'm sorry.
One of the questions we've raised in opposition over the years is about giving police more tools, because if you give police tools, they use them. My colleague Mr. Erskine-Smith suggests that if we get everybody's data and information, government can help them by sending information to them.
I've been in opposition for 15 years and I've seen government often use those resources to say, “Hey, have we told you about our great climate change plan? Have we told you about the great child tax benefit?” To me, if you had everyone's data, the power you would have to send that out in the months leading up to an election is very disturbing.
I represent a rural region in which a lot of people have real difficulty obtaining the Internet, and yet seniors are told, “We're not taking your paper anymore. You're not filling this out. You're going to have to go online.”
We're forcing citizens to become digital. When citizens are forced to use digital means to deal with government but may not want to hear back from it, what protections do we need in place to limit the ability of government to use that massive amount of data to promote itself in ways that would certainly be disadvantageous to other political parties?