Good morning, Mr. Chair and members of the committee.
Thank you for the opportunity to appear before you to discuss the 2019-2020 main estimates.
With me today are Deputy Commissioner, Compliance, Brent Homan; Deputy Commissioner, Policy and Promotion, Gregory Smolynec; and Deputy Commissioner, Corporate Management, Daniel Nadeau.
In the time allocated, I will discuss some of our plans for the coming year and how we expect to make use of the new funding announced in the recent federal budget, which comes in recognition of growing demands on my office.
Our annual resources have been approximately $24 million in recent years. We hope to have your support in maintaining that funding.
We plan to use the additional resources to enhance our ability to deliver on our mandated obligations in the face of the exponential growth and complexity of the digital economy. Technological advancements are multiplying rapidly and often adversely affect Canadians' privacy. It has been an ongoing challenge to keep pace with those advancements and to protect Canadians in the way they deserve.
We require program integrity funding to enhance our capacity to protect the privacy rights of individuals and achieve meaningful results for Canadians.
Of course, we welcome the recent federal budget announcement of additional resources for my office as a positive step. It would allow us to take concrete steps towards implementing our proactive vision for privacy protection. I am using the conditional here because even though the federal budget has allocated sums to my office, we will only have access to those funds once Treasury Board has given its approval.
Part of the funding included in the federal budget is temporary, to help us deal with a backlog of complaints. While we have undertaken initiatives such as the increased use of early resolution and the revamping of our investigative processes, we have nevertheless struggled to respond to complaints in a timely manner.
We currently have a backlog of more than 300 complaints older than a year. The new funds would allow us to reduce the backlog to approximately 10% of its size by 2021. We would also be in a much better position to achieve our goal of meeting service standard targets in 75% of cases.
Ultimately, however, we think the best solution in the enforcement area is to modernize legislation, in part to give the OPC greater authority to manage its caseload according to risk. We need the discretion to focus our efforts on those cases with the greatest impact on Canadians.
The new funding will also be used to process privacy breach reports. Since mandatory breach reporting requirements came into effect in November 2018, our office has seen the volume of reports increase to more than five times what it was when reporting was voluntary. At present, we can only superficially respond to the vast majority of private sector breach reports to our office.
New resources would enable the OPC to more thoroughly review 40% of private sector breach reports to our office and 15% of public sector breach reports.
A third activity for which the federal budget provides additional funding is in the area of public education and guidance. The number of privacy issues for which parliamentarians, businesses and individuals require our advice and guidance is multiplying at a rapid pace.
In the past five years, requests for advice to Parliament have risen considerably, and this trend is expected to continue. Calls from various parliamentary committees are up 41% from five years ago, and in 2017-18 alone we made 34 parliamentary appearances and submissions. Clearly, privacy is becoming a very important issue for parliamentarians.
In the coming year, we will remain responsive to parliamentarians' requests for advice on the privacy implications of bills and studies, and we will seek to contribute to the adoption of laws that improve privacy protection.
New resources would also help increase our capacity to inform Canadians of privacy issues relevant to new technologies, their rights and how to exercise them. As well, we would be better positioned to guide organizations on how to meet their privacy obligations.
With current capacity, we can produce a maximum of three new pieces of guidance a year. Following our consent consultations a few years ago, we developed an ambitious plan for much-needed guidance on a wide range of important issues, guidance that stakeholders asked us to produce. Topics to be covered over the next few years include biometrics, the Internet of Things, social media and de-identification, among others.
As well, existing advice and guidance needs to be updated to ensure that our website continues to be a trusted and comprehensive source for both organizations and individuals. There are over 150 guidance pieces on the OPC's website, approximately 40% of which are five years old or more.
Another important area for our office in providing guidance involves our advisory services to both industry and government. The new funding would help support our work with industry proactively in an advisory capacity to better understand, advise and help mitigate any privacy impacts at the design stage of their services.
Finally, I would add that, to have a meaningful impact on awareness and understanding of privacy rights and obligations, guidance needs to be complemented by sustained and effective communication and outreach. We would therefore like to increase our capacity to conduct public education and outreach activities.
Of course, as you've heard me say before, our federal privacy laws require a number of very urgent reforms. As our recent Facebook investigation so starkly illustrated, we have reached a critical tipping point at which privacy rights and democratic values are at stake.
I look forward to discussing those issues in an hour or so.
In conclusion, keeping pace with the rapid changes in technology is going to be an ongoing challenge for our office. We will continue to make optimal use of the resources given to us to carry out our mandate and to have a greater impact on the privacy rights of Canadians. The recently announced funding is an important interim measure and a positive step towards achieving our targets as we await much-needed legislative modernization.
Thank you, Mr. Chair. My colleagues and I look forward to your questions.
I think what really surprised our committee when we started to delve into this case with the Cambridge Analytica scandal was how complex it was and how difficult it was for our parliamentary committee to get answers. You're dealing, basically, with dark data handled by people who work in a very different realm from ours as legislators.
Christopher Wylie had stated that he felt that the U.K. ICO was very unprepared when it came to taking on Cambridge Analytica, because it did not have the experience of knowing how these players operated. Fortunately, the ICO in the U.K. did an excellent report.
I put it to you: in the changing world of surveillance capitalism, and particularly of the data mercenaries, some of whom we brought here, do you have the resources necessary to actually play in that milieu, to have the technical people, the people who know how hard drives are being misused and how data is being moved around? It's a very different realm from anything we've dealt with in the past.
I approached this incrementally.
The first point of order is that the law needs to change so that we have the right tools, the right legal tools, to ensure compliance by corporations. That won't happen immediately, but my hope is certainly that within a very few years this will happen. Then, at that point, there needs to be a discussion around the resources necessary to make that system work.
With the sums reserved for us in the federal budget, I think there's a.... I asked for more, but we received a not inconsequential sum of money to bridge us toward this new legislation, which I hope will be adopted within a couple of years. I'm fairly optimistic.
Do we have all of the tools we need, including resources? No, and choices have to be made. You're right to point out that, as with any other regulator, because of the exponential changes to technology and the digital economy, we have many issues and companies to monitor and look at, and we need to make choices. We cannot go after every problem, even serious ones, but the resources that were given in the budget will certainly make an important difference. Let's have a discussion about what shape the new legislation should take, and then we can talk about the necessary resources.
As a comparator, I would say that with the new funding our size would be similar to that of large European data protection authorities, but much smaller than the U.K. Information Commissioner's Office. What is the right size is a question for discussion.
You have a statement from me on Facebook. I'll use it liberally.
As to the conclusions of our study, we found that Facebook violated privacy on a number of counts, including the rules on obtaining meaningful consent.
We studied two groups of Facebook users. The first was made up of users who installed third-party apps. As far as they were concerned, Facebook counted on the privacy policies of app developers to see to it that users' privacy would be respected. However, when we dug a little to see if those policies had any substance, we found that Facebook did not in fact verify whether app developer policies protected privacy properly. That is one example we found of Facebook's lack of responsibility.
Facebook has direct obligations under PIPEDA, the Personal Information Protection and Electronic Documents Act. When that company discloses information to a third party application, it is unacceptable that Facebook counts on the other company's privacy policies to respect its own obligations, which are independent. There is, consequently, a breach of privacy in that instance.
The other type of user we studied included the friends of Facebook users who installed third-party apps. When people joined Facebook, according to Facebook, they consented to the disclosure of their own information when friends installed third-party apps. The friend of the user was thus considered, according to Facebook, to have given consent to some unknown action that could take place years later, for unknown purposes. That is the very opposite of informed consent. One of our conclusions was that informed consent was not obtained.
Ultimately, our final conclusion was that Facebook breached one of the PIPEDA principles, which is that companies that collect and use personal information are responsible for the management of that information. We feel Facebook's main transgression is that it shifted its responsibility onto the users or the third-party app developers it dealt with.
Facebook even challenged our conclusions. Among other things, and in a fundamental way, it challenged our assertion that when a user uses a third-party app, Facebook discloses information to that app. According to Facebook, the transfer of information from Facebook to the third-party apps was not a disclosure on its part. It characterized this as making information available at the request of its users.
Once again, we see that Facebook is sloughing off its responsibilities. It claims that it is up to others to be careful, whereas we are of the opinion that Facebook has a legal responsibility to obtain informed consent if information is disclosed.
Among the matters we will be submitting to Federal Court is this fundamental issue: does the fact that Facebook transfers information to third-party apps constitute a disclosure under the law, or not? We believe it is quite clear that the answer to that question is yes.
Another thing I would insist on is the difference between Facebook's actions and its statements. It says that it wishes to adopt a position favourable to protecting privacy and that it wants to work with governments and regulatory agencies to better protect the privacy of its users. All of that is good, but in reality we saw exactly the opposite. We presented our conclusions to it, along with recommendations to ensure the company would comply with federal legislation. In the final analysis, after discussions that lasted a few weeks, Facebook rejected our legal conclusions as well as our recommendations.
That is exactly the opposite of the official position Facebook puts out, which is that it wants to work to ensure the protection of privacy with the regulatory bodies.
Very briefly, Facebook, in our view, violated PIPEDA with respect to consent. We think the main violation is its lack of accountability. PIPEDA's first principle is that companies have a legal obligation to be accountable for the way in which they handle the personal information they collect. They did not comply with that fundamental obligation. At the end of the day, they rejected, first, our findings and, second, our recommendations. I think it is untenable that the law is such that this is our current state of affairs.
A company should not be able to say to a regulator, after the regulator has done serious work to look at the practices of the company, “Thank you very much, but we disagree. We don't think we are disclosing information to third party applications. We think they are making that information available at the request of our users, therefore we, Facebook, think that you're incorrectly applying PIPEDA.”
It is completely unacceptable and untenable that as a regulator I am in that position and that my decisions are not binding on the company. That's the plea that I'm making to you. I know you have agreed with our office in the past that we need stronger enforcement powers to make sure that companies do comply with the law. I have to, in this forum, underline how unacceptable it is that we at the OPC are in that situation as we speak and that we have to go to court to ensure that this company is under an order to comply with the law.
I would say there are at least two important solutions. One is to make sure the regulator has the right powers, and in that basket would fall binding orders, penalties and proactive inspection powers that I've discussed in this committee before and can expand on if we have more time.
But I will move to another part of the solution, which I think is to ensure that we have rights-based legislation. Facebook and Cambridge Analytica demonstrated the link between privacy protection and the exercise of other fundamental rights, in this case democracy. But there's also a link between privacy protection and other fundamental rights: equality, for instance, in the employment context; freedom to go on the Internet to develop as a person and look for issues of interest without the fear of being monitored by corporations. A clear link was demonstrated in Cambridge Analytica, but it's just one example of the clear link between privacy protection and the exercise of fundamental rights.
I think this shows that, in addition to giving powers to the regulator, the new legislation has to be framed as perhaps principles-based, as PIPEDA is, but also rights-based, and recognize that privacy protection is linked to the exercise of other fundamental rights. We're all at risk if privacy is not protected. We would not only lose our privacy, but other rights would also be at risk.
I have a more general question regarding the consent, of Canadians in this case, that applications require. On Facebook, a page appears saying that, in order to continue, you have to read what is written there and click the “I accept” button to give consent.
Last week a witness told us that we should regulate that and that more information should be given to Canadians about the consent forms drafted by Facebook or third-party apps. By giving our consent, we are also giving it to apps that don't even exist yet.
Very often Canadians don't even read the conditions. They are in such a hurry to access the app that they accept automatically. Facebook's defence is that Canadians agreed and that this protects it.
Is there a better way to inform Canadians? When they click the “Accept” button in an app, they are indeed entering into a type of contract. Is the type of acceptance Facebook requires to protect itself against potential legal proceedings valid?
In the course of an investigation, we examined Facebook's current policies. We concluded that that consent was not valid, because users were consenting to something that might only occur years later. Obviously, they can't know what will happen years later. Even if there is a lovely legal text that is 50 pages long, if it does not inform Canadians about the use that will be made of their information, that consent is not valid.
As you know, in January of this year, we published guidelines to get companies to develop clearer privacy policies. That is part of the solution, but it is not the whole solution. That is why I advocate the adoption of a privacy protection law that goes beyond important principles such as consent. Consent is important, but it does not solve everything. We need a law that defines privacy in sufficiently general terms.
Protecting one's privacy does not end with giving or withholding consent. Consent is a means. Having the right to privacy means being able to communicate with our friends on social media without worrying that some company is constantly monitoring our activities. Cambridge Analytica used our information to try to influence our political opinions and our vote. We have to define the right to privacy in a sufficiently general way, and that goes beyond consent.
I have before me a bill that was tabled in a previous Parliament. It defined the right to privacy, among other things, as the right to be free from all surveillance.
Any new law on the protection of privacy, in the public or private sectors, should begin with that. What is the right to privacy? Is it tied to the idea of granting consent? No, it is not limited to that. The right to privacy is the right to one's own physical privacy. It is the right to be free from all monitoring, the right to be free from having one's private communications intercepted by the state or by private companies. That is where the definition should lie. After that, procedures or mechanisms like giving consent come in to protect the respect of privacy, but protecting privacy is not limited to consent.
Both parts of the documentary talk about fake news, as well as Cambridge Analytica.
Obviously, the questions that were put to Facebook's former executives lead us to understand that there was extreme naïveté among all of the executives, so much so that no one was aware of the consequences or the legal repercussions of using the information they collected from people.
At the same time, one of the important defences Facebook used rested on the famous American safe harbour rule. Under that rule, a business cannot be faulted for the type of actions we are discussing here, insofar as the nature of the company means that it cannot be caught under the terms of the law.
Does that mean that the structure of Facebook or the nature of its activities allows it to take advantage of a type of legal void and consequently, to not get caught?
Or is that hypothesis not applicable because of the way the company is defined, the way its activities are defined and the service it claims to provide?
Does that exclude it from any legal proceedings?
My follow-up question is that I'm wondering how much of Facebook's behaviour is due to a perceived weakness in our position. One of the things I find very disturbing in all of this—and this comes out of another committee I serve on—is that the government is moving all of our advertising out of local Canadian newspapers and radio and pushing a large share to Facebook. I have to wonder about the laughs they might be having in their head office, knowing they're fighting with you over PIPEDA while, at the same time, the government is pushing taxpayers' money to them.
Then I look through your departmental results and I look at how we, as a government, take privacy rather less seriously. I'm looking at a National Post report. When a National Post reporter asked a question of PSPC and DND, within an hour and a half they got a threatening phone call from the president of Irving, threatening a lawsuit because PSPC and DND gave away private information for the third time. The government gave away private information to a corporation.
I look at your departmental results and at the percentage of government organizations that are informed and guided to protect privacy rights. The target is for only 60% of government organizations to actually follow our own rules. I wonder whether Facebook looks at this and just thinks that if you aren't serious about privacy anyway, why should they be? Oh, and by the way, at the same time they'll take taxpayers' money from us.
I'm outraged, frankly, at the stupidity of it all. At the same time that you're fighting against them, the government's handing them cheques, and at the same time—
I don't think we will see an international treaty any time soon, either.
If different countries adopt similar or interoperable laws, the multiplication of those legislative and regulatory measures could lead to results. What's encouraging, as you know, is that Europe has already passed regulation that provides some solid privacy protection. It came into effect about a year ago.
I'm encouraged by the fact that serious discussions are now being held in the American Congress about adopting privacy protection legislation. When our trading partners in Europe, to the south of us and in several Asian countries go in the same direction, there is reason for greater optimism, but it's taking a long time.
Ultimately, the adoption of stricter laws in various countries, some of which are Canada's trading partners, should improve things.
In the current legislation, the first principle is referred to as “accountability” in English and “responsabilité” in French. As I understand it, the term “accountability,” in an accounting sense, means a “reddition de comptes.” In other words, it means that the responsible party must report to its constituents on how it has carried out its responsibilities.
In the current PIPEDA, the term “accountability” is rendered in French as “responsabilité.” Responsibility stops short of accountability. It is already a lot, but it stops short. Responsibility consists of adopting procedures to implement the other principles of the legislation, including consent, openness and access.
The company or organization fulfills its obligation to take responsibility by adopting procedures. However, the company or organization isn't accountable, either to users or to the regulatory agency, when it comes to demonstrating that the procedures implement the PIPEDA principles.
One consequence of the Facebook case—and there are other signs of this—is that companies can no longer simply be trusted. The principle of accountability is important. Companies must take responsibility; the entire privacy burden mustn't be placed on users. It would be unrealistic to think otherwise.
The case of Facebook, for example, clearly shows that the fact that the legislation imposes a responsibility doesn't mean that the obligation will be fulfilled, hence the need to require accountability. Companies must be required to show that they've adopted procedures to implement the principles of the legislation, while providing proactive inspection powers.
Under one model, companies would provide a report to the regulatory agency on the procedures that they've adopted. The reports would be similar to the privacy breach reports, which have been a legal obligation since November.
Imagine legislation where companies must provide a report to the regulatory agency on the procedures that they've implemented to fully comply with the PIPEDA principles. The regulatory agency, which has limited resources, would review the reports and note issues in certain places. It would inspect the companies, and perhaps it would find violations and penalize the companies.
If responsibility were to lead to true accountability, in the accounting sense, it would eventually have an impact on the entire industry, because companies wouldn't want to run the risk of being inspected and penalized.
I want to follow up on your comments about accountability, because I'm trying to think of a similar situation where there is such a corporate lack of accountability. Facebook has an enormously successful platform. It's used all over the world. It's making unprecedented money. It has no competition, and yet, in this past year, a U.K. parliamentary committee found that it behaved like a “digital gangster,” and the privacy commissioner in New Zealand found it to be morally bankrupt. Facebook was denounced by the UN for complicity in the Myanmar genocide.
It would seem to me that normal corporate practice would be to get on a goodwill tour and start to fix the problems and reassure people, yet Mr. Zuckerberg ignored his appearance at the International Grand Committee, and now we have your report coming out.
Facebook said, “Thanks, but we don't want to spend any money to actually comply, so we'll just pretend you don't have jurisdiction over us.” You referred to its policy as an empty shell. I'm trying to figure out what is fundamentally wrong with Facebook.
Is it the corporate culture, which I'm not asking you to venture in on, or is it that its fundamental business model, like the fundamental business model of surveillance capitalism, is based on ignoring the privacy rights of citizens, and it simply will not change a business model that has worked extremely well for it, even if it is breaking the law of Canada and numerous other jurisdictions?