We're no longer suspended. Thank you very much, colleagues, for getting that little bit of business taken care of.
We're now pleased to get back on track: pursuant to Standing Order 108(3)(h)(i), we are resuming our study of the Privacy Act.
We're happy to have with us this morning the following witnesses: Teresa Scassa, full professor from the University of Ottawa and Canada research chair in information law; David Lyon, who is joining us by video conference, professor at Queen's University; and Lisa Austin, associate professor, University of Toronto, faculty of law, in the David Asper Centre for Constitutional Rights.
Thank you very much for taking the time to join us, and thank you for your patience as we dealt with a little bit of business at the start of our committee meeting. We just finished up our review of the access to information legislation, and now we're going to continue on with our review of the privacy legislation.
We're going to ask each of you to do about a 10-minute presentation. Then we'll proceed to rounds of questions and hopefully use up the full two hours.
Based on the order that they appear on my sheet, we'll begin with Teresa, please.
Thank you, Mr. Chair, and thank you for the opportunity to address this committee on the issue of the reform of the Privacy Act.
I have had a chance to review the commissioner's recommendations for Privacy Act reform and I am generally supportive of these proposals. I'm going to be focusing my remarks today on a few specific issues that are united by the theme of transparency.
Greater transparency with respect to how personal information is collected, used, and disclosed by government enhances privacy by exposing practices to comment and review and by enabling appropriate oversight and accountability. At the same time, transparency is essential to maintaining public confidence in how government handles personal information.
The call for transparency must be situated within our rapidly changing information environment. Not only does technology now enable an unprecedented level of data collection and storage, but enhanced analytic capacity has also significantly altered the value of information in both public and private sectors. This increased value provides temptations to overcollect personal information, to share it, to mine it, or to compile it across departments and sectors for analysis and to retain it beyond the period required for the original purposes of collection.
In this regard, I would emphasize the importance of the recommendation of the commissioner to amend the Privacy Act to make explicit a “necessity” requirement for the collection of personal information, along with a clear definition of what “necessary” means.
The goal of this recommendation is to curtail the practice of overcollection of personal information. Overcollection runs counter to the expectations of the public, who provide information to government for specific and limited purposes. It also exposes Canadians to enhanced risks of negligence, misconduct, or cyberattack, which can result in data breaches.
Data minimization is an important principle that is supported by data protection authorities around the world and reflected in privacy legislation. The principle should be explicit and up front in a reformed Privacy Act.
Data minimization also has a role to play in enhancing transparency. Clear limits on the collection of personal information serve transparency goals; overcollection, by contrast, encourages the repurposing of information, improper use, and over-sharing.
The requirement to limit collection of information to specific and necessary purposes is tied to the further requirement on government to collect personal information directly from the individual, where possible. This obviously increases transparency, as it makes individuals directly aware of the collection.
However, there are many exceptions to this general rule. These exceptions include circumstances in which information is disclosed to an investigative body at their request in relation to an investigation or the enforcement of any law, or when it's disclosed to government actors under court order or subpoena. Although such exceptions may be necessary, they need to be considered in the evolving data context in which we find ourselves.
Private sector companies now collect vast stores of personal information, and this information often includes very detailed core biographical information. It should be a matter of great concern, therefore, that the permissive exceptions in both PIPEDA and the Criminal Code enable the flow of massive amounts of personal information from the private sector to government without the knowledge or consent of the individual.
Such requests or orders are often, although not always, made in the course of criminal or national security investigations. The collection is not transparent to the individuals affected, and the practices as a whole are largely not transparent to the broader public and to the office of the Privacy Commissioner.
We've heard the most about this issue in relation to telecommunications companies that are regularly asked or ordered to provide detailed information to police and other government agents. It should be noted, however, that many other companies collect personal information about individuals that is highly revelatory about their activities and choices. It is important not to dismiss this issue as less significant because of the potentially anti-social behaviour of the targeted individuals. Court orders and requests for information can and do encompass the personal information of a large number of Canadians who are not suspected of anything. The problem of tower dump warrants, for example, was recently highlighted in a case before the Ontario Superior Court of Justice. The original warrant in that case sought highly detailed personal information on about 43,000 individuals, the vast majority of whom had done nothing other than use their cellphones in a certain area at a particular time.
Keep in mind that the capacity to run sophisticated analytics will increase the attractiveness of obtaining large volumes of data from the private sector in order to search for an individual linked to a particular pattern of activity.
Without adequate transparency regarding the collection of personal information from the private sector, there is no way for the public to be satisfied that such powers are not abused. Recent efforts to improve transparency—for example, ISED's voluntary transparency reporting guidelines—have focused on private sector transparency. In other words, there has been an attempt to provide a framework for the voluntary reporting by telecommunications companies of the number of requests they receive from government authorities, the number they comply with, and so on. However, not only are these guidelines entirely voluntary, but they are limited to the telecommunications sector, whereas disclosures may be sought from any private sector company.
They also only address transparency reporting by the companies themselves. There are no legislated obligations on government actors to report in a meaningful way, whether publicly or to the Office of the Privacy Commissioner of Canada, on their harvesting of personal information from private sector companies. I note that the recent attempt by the OPC to audit the RCMP's use of warrantless requests for subscriber data came to an end when it became clear that the RCMP did not keep specific records of these practices.
In my view, a modernization of the Privacy Act should directly address this enhanced capacity of government institutions to access the vast stores of personal information in the hands of the private sector. The same legislation that permits the collection of personal information from private sector companies should include transparency reporting requirements when such collection takes place. In addition, legislative guidance should be provided on how government actors who obtain personal information from the private sector, either by request or under court order, should deal with this information. Specifically, limits on the use and retention of this data should be imposed.
It's true that the Criminal Code and PIPEDA enable police forces and investigative bodies under both federal and provincial jurisdiction to obtain personal information from the private sector under the same terms and conditions, and that reform of the Privacy Act in this respect will not address transparency and accountability of provincial actors. This suggests that issues of transparency and accountability of this kind might also be fruitfully addressed in the Criminal Code and in PIPEDA—the reform of which this committee is also considering—but this is no reason not to address it in the Privacy Act. To the extent that government institutions are engaged in the indirect collection of personal information, the Privacy Act should provide for transparency and accountability with respect to such activities.
Another transparency issue raised by the commissioner relates to information sharing within government. Technological changes have made it easier for government agencies and departments to share personal information, and they do so on what the commissioner describes as a massive scale.
The Privacy Act enables personal information sharing within and between governments, domestically and internationally—in specific circumstances for investigations in law enforcement, for example, or for purposes consistent with those for which it was collected. Commissioner Therrien seeks amendments that would require information sharing within and between governments to take place according to written agreements in a prescribed form. Not only would this ensure that information sharing is compliant with the legislation, but it would also offer a measure of transparency to a public that has a right to know whether, and in what circumstances, information they provide to one agency or department will be shared with another, or whether and under what conditions their personal information may be shared with provincial or foreign governments.
Another important transparency issue is mandatory data breach reporting.
Treasury Board Secretariat currently requires that departments inform the OPC of data security breaches, but the commissioner has noted that not all comply. As a result, he is asking that the legislation be amended to include a mandatory breach notification requirement. Parliament has recently amended PIPEDA to include such a requirement. Once these provisions take effect, the private sector will be held to a higher standard than the public sector unless the Privacy Act is also amended.
Any amendments to the federal Privacy Act to address data security breach reporting would have to take into account the need for the commissioner and for affected individuals to be notified when there has been a breach that meets a certain threshold for potential harm, as will be the case under PIPEDA.
The PIPEDA amendments will also require organizations to keep records of all breaches of security safeguards, regardless of whether they meet the harm threshold that triggers a formal reporting requirement. Parliament should impose a requirement on those bodies governed by the Privacy Act to keep and to submit records of this kind to the OPC. Such records would be helpful in identifying patterns or trends within a single department or institution, or across departments or institutions. The ability to identify issues proactively and to address them either where they arise or across the federal government can only enhance data security, something which is becoming even more urgent in a time of growing cybersecurity threats.
I'm going to stop my comments there.
Thank you very much, Mr. Chair.
Thank you very much for inviting me to participate in what I think is an important initiative. The Privacy Act is out of date, and Canadians urgently need a new and strong law that speaks to the tremendous technological changes and political economic shifts that have occurred since the 1980s.
In general, I am in agreement with and grateful for the proposals made by the Privacy Commissioner. At the same time, I should make it clear that I am not a lawyer, nor do I have any legal expertise. I speak as a university professor working in the social sciences. I direct the Surveillance Studies Centre at Queen's University.
My last book was Surveillance after Snowden. The large-scale team project I direct at the moment is called Big Data Surveillance. The book that I'm currently working on is The Culture of Surveillance. I mention these simply to give you some sense of the angle from which I am coming and from which I speak, which is the broad context of this act rather than the details.
Let me start by pointing out that there's a publication our research team brought out a couple of years ago. It's called Transparent Lives: Surveillance in Canada. It's a highly accessible study of the trends in surveillance today. I commend it to the committee. You can get it from any good bookstore, or it is downloadable online.
It is also available in French, under the title Vivre à nu: La surveillance au Canada.
This book encapsulates the key issues about surveillance in the 21st century and gives, for anyone who would like to see it, a comprehensive background on the need for a changed privacy law.
The trends that it examines, and for which it offers Canadian examples, include the rapid pace of increasing surveillance, the role of security concerns in prompting surveillance, the blurring of public and private sectors—Snowden's disclosures make this very clear—the ambiguity of personal information, the growth of mobile and location-based surveillance, the embedding of surveillance in everyday environments—sometimes discussed as the Internet of Things—the growth of biometrics, and social surveillance on Facebook, Twitter, and other media.
The Privacy Act is premised on some rather fixed ideas about personal information in terms of who collects it and where, if at all, it travels. Today, fluidity rather than fixity is the order of the day. Words such as “databases” define the old document, and this suggests silos in contrast to the multiple conduits through which data flow today. Information was seen then as pertaining to those specific sites, and sharing information could only happen under certain circumstances.
There still, of course, need to be limits on this practice, as we've just heard, and it has to be acknowledged at the same time that information sharing today exists on a scale that wasn't dreamed of in the 1980s, a scale that would be very difficult to quantify, let alone control.
It also occurs across boundaries assumed by the distinction between government activities and commercial ones in the two main federal laws of 1982 and 2004. The easy traffic in each direction between these domains was never envisioned in the 1982 act, and this is a key issue to be confronted in any review.
At the same time, surveillance can and does happen without there being any obvious handles for identifying personal information. The very category of personal information is badly blurred today. Once you could have imagined that this category would cover such matters as name, address, telephone number, and perhaps some official identifier such as the social insurance number. Today, licence plates captured by highway cameras count, and, although this is controversial, so do the IP addresses of computers.
Moreover, one can be identified through facial recognition. The facial recognition software routinely used by Facebook, for example, doesn't even require a Facebook account to function. Indeed, it's relatively straightforward to identify people even when no obvious identifying information is provided. A Montreal study recently showed that 98% of individuals could be positively identified from birthdate, gender, and postal code alone, without names and addresses being known.
The post-Snowden debate over whether metadata around phone and Internet messages counts as personal data is another example. Metadata is supposedly only contextual, and is sometimes dismissed, misleadingly, as phone book-like information rather than content, but metadata is frequently more revealing, not less.
The two items I have mentioned concern the socio-technical and political-economic changes that have occurred over the past 40 years. I wish now to turn to matters of research and education, on which the commissioner also speaks.
On the one hand, much more research is required to properly understand the momentous changes that have occurred since the 1980s. It must be stressed that these are both socio-technical and political-economic changes and cannot safely be reduced to technical and legal categories.
For a number of years the commissioner has overseen a very successful program of funded research under the contributions program. Given the magnitude of the issues and their centrality to matters from national security to domestic life, however, much more is needed if the law governing the uses of personal data is to be kept up to date in a way that genuinely addresses all whose lives are touched by surveillance of any kind, which is everyone.
This research program could be expanded under the act as a background to the revision of the Privacy Act, but it could also be widened by requests for surveillance and privacy research by the Tri-Council or by the Royal Society of Canada for a dedicated report on surveillance and privacy law in Canada. I suggest that such study is needed before the law can be revised.
On the education front, it is clear that much has to be done here, and this too could be coordinated by the Privacy Commissioner with an expanded brief.
In the 1980s, computing still meant primarily what were called “mainframes”, and the era of personal computing—not to mention the popular diffusion of distributed systems, mobile devices, and the cloud—was yet to flower. In that decade, if you wished to connect with others, or with what would emerge in the 1990s as the Internet, you had to use a cumbersome system of plugging your land-line phone handset into rubber sockets (I don't know if anybody remembers that; the device was called an acoustic coupler) to create a very uncertain modem data link.
Today computer devices and networks have proliferated in ways that demand fresh approaches to what I think should be called “digital citizenship suitable for all ages”. All Canadians need to know their rights, understand the issues, and engage actively and in an informed way. This is not a minority option. This is not something on the side. This again could be initiated by the commissioner. It could accompany the new law and could refer to the work of many other agencies where such matters are central, and in my little brief I've put some references for you.
While I believe all the above are essential components of a revised privacy law, it seems to me that the nature of the debate also has to shift, to consider carefully the underlying ethical direction that should be encouraged: one that enables the most just and fair uses of digital media and personal information and exploits the great potential of digital technologies for the best purposes.
The very notion of privacy, of course, has undergone considerable change since the 1980s. These are not minor or peripheral matters and cannot be addressed in merely technical or legal ways. It's not only that privacy in some narrow sense might be violated by the misuse of these powerful technologies, but rather that our opportunities to live as free and fulfilled human beings are enhanced or curtailed by surveillance, whether by government or corporation.
As Eric Stoddart argues, much monitoring and tracking today is the surveillance of others. We would do well to consider how surveillance could be harnessed for human flourishing, which would be surveillance for others.
Thank you very much.
I thank you for inviting me to appear before you today. I appreciate the opportunity. I have prepared a written submission for your committee. It's currently being translated and will be distributed to you. My comments will be a summary of that submission. I welcome your further questions.
The basic point I want to stress to you today is that Privacy Act reform must take account of the Canadian Charter of Rights and Freedoms and its protections for privacy. We should not think that compliance with the Privacy Act means compliance with the charter, and we should not think that strengthening the Privacy Act's adherence to fair information principles means that it's thereby consistent with the charter's protection for privacy.
It's crucial that we understand this, for we're now in an era when the government collects large amounts of information about individuals and shares it both within government and with other governments, including foreign governments. This is not just for the provision of social services but for law enforcement and national security purposes, as both of the prior witnesses stressed. Indeed, when the former government introduced Bill C-51 and the new Security of Canada Information Sharing Act, Canadians were told that because the Privacy Act applied and the Privacy Commissioner would provide review, there would be an appropriate balance between protecting the privacy of citizens and ensuring national security. This is an illusion, and it's a dangerous one.
The Privacy Act is quasi-constitutional legislation, that's true. The Supreme Court has said that multiple times. However, it should not be equated with the constitutional protection of privacy rights. The Privacy Act is based on what have come to be known internationally as “fair information principles”. Its basic model is a response to the growth of the administrative state and its accompanying information practices. An individual seeking government services in a social welfare state context has an interest in receiving those services. The administration of those services requires personal information to be collected and processed, so the individual interest in relation to this personal information is not about preventing its collection, use, or disclosure, but in preventing the overcollection of personal information or its subsequent uses or disclosures for different purposes, as well as in ensuring that the information is accurate. The central individual entitlement is to have access to the information the state holds about oneself, and to correct it for inaccuracies. This law was never really meant to apply to the context of law enforcement and national security in any robust way, and many of its exceptions capture those uses.
In contrast, the constitutional protection of privacy in Canada has developed largely in relation to section 8 of the charter, although privacy has also been protected through section 7. Its central paradigm is its search and seizure context, where the state seeks information in relation to law enforcement investigations. Here the individual interest lies completely in opposition to the state interest. It is a coercive relationship. The central individual entitlement is to have state access protected through the warrant requirement and the reasonable and probable grounds standard. These are two different frameworks, but they need to be integrated if we think the Privacy Act has anything to say to the increasing information practices the government employs in the context of law enforcement and national security. Charter review should be built into a strengthened Privacy Act review, particularly in this context.
In light of this, I have four recommendations to offer you. Again, they are outlined in the written submission.
First is an interpretive principle. We recommend that the Privacy Act should include a reference to privacy rights protected by the Canadian Charter of Rights and Freedoms. Put a reference to it in the purpose section to allow for arguments to be made in reference to the Charter of Rights and Freedoms.
Our second recommendation is that government information practices should be reviewed for compliance with charter rights. The necessity standard that the Office of the Privacy Commissioner of Canada is advocating is not adequate. It's better than what we have, and it's good in many contexts, but it's not adequate.
Why do I say that? Charter rights can be at issue with the collection, use, or disclosure of personal information. The charter is engaged not simply when personal information is collected, used, or disclosed, but when there is a reasonable expectation of privacy. The Supreme Court of Canada has repeatedly held that information collected by the state for one purpose can retain a residual reasonable expectation of privacy in relation to other purposes, including disclosure to foreign states.
Engaging in something like a necessity test modelled after the Oakes test for section 1, which is what the Privacy Commissioner advocates, is not going to be adequate in this context. Why? The section 8 reasonable and probable grounds test, which is the basic standard, is not a test that says the state gets access to information if it is necessary for a law enforcement purpose; it's a test that says that “...law enforcement goals hold sway only at the point marked by the probable effectiveness of reaching that goal.” This idea of probable effectiveness is not part of the section 1 jurisprudence to date.
It's actually quite unclear when a breach of either section 7 or section 8 of the charter can be upheld under section 1 of the charter. That's because there is an internal balancing in section 8, as well as one in section 7, and courts are loath to uphold breaches of those sections under section 1, so we should not be quick to regularize some kind of section 1 analysis until we actually import the charter privacy protections, particularly in the context of state use of this information for law enforcement and national security purposes.
Therefore, we recommend that the use or disclosure of personal information for law enforcement investigative or national security purposes should be subject to a review that reflects the protection of an individual's charter rights under sections 7 and 8, and not simply be reviewed on a necessity standard.
Our third recommendation is that the Office of the Privacy Commissioner be empowered to undertake charter review of government information practices. Charter review of these practices should not be a burden placed on ordinary Canadians, who would first have to discover information practices that are difficult for them to see and understand, and then challenge them in court, in a country facing an access to justice crisis. Instead, we should build charter review into the Office of the Privacy Commissioner's function.
However, it's also important that this be reviewed on a standard of correctness in the courts. It should not be built into an administrative process such that the courts are then reviewing charter complaints on a reasonableness standard. It should be correctness.
Therefore, we recommend that the exemptions, particularly those under sections 7 and 8 of the Privacy Act for uses and disclosures of personal information without consent, should be subject to charter review conducted by the Privacy Commissioner, subject to judicial review on a standard of correctness.
Our fourth recommendation is that you strengthen the obligation of accuracy under the Privacy Act.
Inaccurate information can have grave consequences for fundamental rights and freedoms. This is one of the tragic lessons from the Arar commission. Currently the obligation of accuracy is in subsection 6(2) of the act. It applies to uses of personal information, but it should apply to disclosures as well, not just uses. It is also currently confined to administrative purposes, and it should be broadened to cover all the purposes for which information is used.
I think that the act should also be modernized to recognize what academics are increasingly terming “algorithmic responsibility”—that is, the idea that the issue is not just the accuracy of the information that's collected, used, or disclosed, but the accuracy of information processing methods used by the government.
In an era of big data, an era when vast amounts of information are being collected and analyzed in different ways, we need to be concerned about the accuracy of those methods of analysis. We need to be concerned that they're not building in biases, for example, or other forms of inaccuracy. Therefore, we recommend that subsection 6(2) of the act be amended to impose an obligation to ensure the accuracy of any personal information that is used or disclosed by the institution for all purposes. The obligation of accuracy should also apply to methods of information processing.
I'll end my comments there.
I would say that the written agreements are a start. Again, I would want charter compliance built into them, because some of this information sharing can raise charter issues, and these need to be flagged early on.
The charter jurisprudence is clear in saying that just because one government institution has information that it has collected for one purpose doesn't mean it can use it for subsequent purposes; sometimes a charter issue is flagged, and there needs to be charter compliance. That can also happen with sharing it with foreign states.
Section 8 was triggered in the Wakeling decision, although there was a disagreement on whether the provisions in the Criminal Code were reasonable. In the end, they were found to be reasonable.
The written agreements are a start, then, but you need the charter review of the information sharing, because some of it will raise charter issues, but not all of it, hopefully. You thus need to build it in at the beginning.
I would also say that whenever some of this information is shared, particularly with foreign governments, the accuracy issue is enormous, so building in an obligation of accuracy is important.
I don't see how the current obligation of accuracy actually applies, because it's about use for administrative purposes. If you're sharing this information for national security purposes or for transnational law enforcement purposes, it seems to me it's not part of that, but it's crucial that accuracy be built in. You could, through regulations, specify perhaps what that might mean in particular circumstances, but I think it's an absolutely crucial amendment.
It's a great question. I haven't finished writing the book yet, but what we're working on is looking at the ways in which.... Well, it's in contrast with the situation in the 1980s, when these kinds of issues were still seen as relatively discrete in that they didn't apply to everyone. In what I'm calling a surveillance culture, people have a kind of surveillance imaginary, a sense of what's going on, and engage in practices that relate to surveillance, whether it's avoiding certain kinds of surveillance or actively participating in them or complying or negotiating or whatever.
In talking about surveillance culture, I'm trying to draw attention to the fact that there's no point in talking about a surveillance state anymore, or even a surveillance society, although those are important concepts. We have to think about the ways in which people in everyday life interact in numerous ways, and increasingly, with all kinds of surveillance.
Of course, I'm understanding surveillance in the broad sense of any kind of activity or experience of gathering and analyzing personal information for all kinds of purposes, whether they be for influence, control, management, or whatever. I'm working with a fairly wide definition of surveillance that, again, was not envisaged by those who were writing the Privacy Act in the 1980s. I'm thinking of situations, for example, where people are engaged with social media and are actually very aware of the kinds of risks that they take in certain kinds of communication, certain kinds of web-browsing, and so on and so forth.
That culture of surveillance, which is developing in many different aspects, actually has an effect on the ways in which surveillance is carried out and privacy is maintained. For all that some say privacy is less of a matter of interest to younger people who are using social media, in fact you discover that there's a very sophisticated and complex understanding of privacy. This relates both to the big issues of the charter, for example, and to the small issues, such as which particular party you do or do not want your own communications to be open to.
Therefore, I'm thinking of something that is developing in Canada and in other countries that affects our understanding of what it is to be enjoying privacy, our understanding of what it is to be under surveillance, and how those understandings and those practices make a difference to the ways in which surveillance actually works—to its very efficacy—and also to privacy.
I think one of the important issues around how we store and protect information is that it also has charter dimensions to it.
The recent jurisprudence in the Supreme Court of Canada has been very strong on the idea that you need safeguards around information. For example, when there's an analysis of the reasonableness of a law in the context of a charter privacy issue, there's an increasing discussion on the question of safeguards, in that if you don't safeguard the information properly, that can be the charter breach.
The gravity of that issue is that it's not some sort of technical, administrative element to the Privacy Act. There are serious charter issues in not safeguarding that information properly that the courts are starting to really pay attention to.
My own view is that we haven't built in enough on the technical side of the review process. We still seem to be thinking about it much along the lines of what David Lyon has been talking about, seeing personal information as if it's discrete information collected in a kind of paper environment that's shared in filing cabinets, but these are information systems. They're technical systems. It's software. It's algorithms. The whole issue of safeguarding has an incredible technical side to it as well. Getting the legal standards right, whether it's in the legislation or in regulations, is important, and getting the oversight right is important, but there's a whole technical side to that too. I think we're not building enough technical expertise into the review process.
As to what that looks like particularly, I don't have an answer for you, but I think we need to really understand the fluidity that David Lyon is talking about. The practical expression is that these are software systems. These are algorithms that we're talking about. These aren't social security numbers in a paper file in a filing cabinet. It's a highly technical environment.
I'd be happy to comment on cross-border data flows.
This doesn't seem like a Privacy Act issue per se, but I do think we should really understand the issue, again from a kind of constitutional perspective. As a Canadian, if you are physically in Canada and you're living here and residing here, but your data goes to the United States, their position is that you are a non-resident alien—we're in Canada, so we're not resident in the United States—so the fourth amendment of the U.S. Constitution, which provides for protection of privacy, does not apply at all.
There's a lot of Canadian jurisprudence that says that once you're dealing with what happens in a foreign state, it's their rules that apply, not ours. When you put your data in the U.S., you are doing what I call plunking your data into a constitutional black hole. There's no constitutional right there.
What should we be doing? Data localization is one response to that dynamic. I think it's an unrealistic response to think that this is a solution in the long term. Another response, though, given the size of Canada and the size of our economy, is to negotiate a bilateral agreement with allies like the U.S. to say that when Canadian data is in the United States, you protect us to the same extent that you protect your own citizens.
I would actually go further and say you need to protect us according to our own standards in the Canadian charter, because Canadian charter standards of privacy are better in relation to data in most of these contexts than the American constitutional standards. Why? It's because the Americans still buy into what's called the third party doctrine. They say that if you share information with a third party, such as a telecommunications provider, there's no longer a reasonable expectation of privacy. You've given it up in relation to the state.
It's a crazy doctrine. We've never agreed with it in Canada. The Supreme Court of Canada has denounced it for more than 20 years.
It's crucial, I think, that we actually negotiate and say, “If you want access to our data for any kind of law enforcement or for national security, it's the Canadian charter that applies.” That mimics what the MLAT process tries to accomplish in having the constitutional rights of the data bearer apply, and we need to find a way to do that. I think that's the way forward, but I think it's a treaty that needs to be negotiated.
I would emphasize the importance of two levels of breach reporting, similar to what's been done with PIPEDA.
When the PIPEDA amendments come into effect, you're going to have a first level of breach reporting when breaches reach a certain threshold of harm, and that triggers an obligation to notify both the Privacy Commissioner and individuals who may be facing that potential for harm. That's one level, and it's a tremendously important one, because it's not just reporting the breach but also trying to mitigate harm and notify those individuals who may be affected.
The second level that's in PIPEDA, one which I think is quite interesting, is a requirement for organizations to document all breaches, whether they reach that threshold or not, including breaches in which the information ultimately didn't end up in anyone's hands. I think that kind of record-keeping and reporting to the Privacy Commissioner doesn't necessarily have to be made open to the broader public—that decision would have to be made—but it could be just reporting to the Privacy Commissioner.
I think it's important because this goes to another thing, which is trying to identify those security practices that are weak and need to be improved within government. If the Privacy Commissioner has access to this information, it gives a chance to see whether this is a common problem across government that should be addressed or whether it's a particular department that hasn't adequately trained its staff on certain privacy measures. It allows a more proactive approach to try to address security problems that become visible through this level of reporting.
I would encourage having those two levels so that it's not just harm that triggers notification, but that there's another level where any breach should be reported in order to try to diagnose problems and address them before they become more significant.
My understanding is that this was a recommendation pertaining mostly to the question of order-making power. The Newfoundland model was a hybrid model, and the hybrid model had much to recommend it over an order-making power.
I would say that I don't have a firm view on that particular debate, except that I lean heavily towards the order-making power. I would encourage you, in thinking that through, to take the perspective of the individual rights holder here in terms of privacy, and ask which is going to be better for them in terms of which of these models puts more of a burden on the individual to go to court to vindicate their rights rather than have it dealt with in this other process. We have an access to justice crisis here, and putting burdens on individuals to take it up in court when they are supposed to have these robust rights is, I think, unrealistic. Recommendations from the past that have focused on courts just don't take that into account. That's one thing.
The other thing is that the debate seems to involve a lot of hand-waving and anecdotal evidence. We have multiple jurisdictions in Canada that have different ways of doing this. In Ontario there's order-making power. In B.C. there's order-making power. If there are questions about whether that changes the dynamic by shifting away from an ombudsman model or whether it makes for a more contentious relationship with the government, certainly there are jurisdictions you can get evidence from. This could be a more factually based inquiry. You can take a look at what's going on in those jurisdictions and find that out.
The only other thing I would say is that in these charter contexts that I'm extremely concerned about, having a strong stick is good, because in these charter contexts, the individual is in a conflicting relationship with the state, whereas in the more administrative context, where the state's administering a social program, there's not that strong conflict. There's some conflict, but it's not that fundamental conflict.
I do think that from that perspective, order-making power has a lot to say for it, but I don't have a definitive view.
There are things that we have already been talking about.
In a sense, they can be talked about in terms of technological changes and the new kinds of means of finding out about individuals for one purpose or another. There are things I mentioned in terms of the trends toward a greater use of biometrics, and sensors being embedded in buildings, streets, vehicles, and so on. A lot of it sounds like coming to terms with the technological changes that are already occurring. That seems to me to be crucial.
On the other hand, I've been trying to stress the ways in which the very idea of privacy has altered since the 1980s, when the act was originally conceived. It seems to me to be essential that we bear that in mind as well. This ties into, and is completely consistent with, what Lisa Austin is saying about the need for charter compliance here.
It seems to me that the notion of privacy was once conceived in a very atomistic and individual way, and it had to do with very specific harms that could be identified. In today's situation, we have to think about a much broader range of issues that have to do with democratic participation and human rights, so the very notion of privacy, it seems to me, needs to be expanded.
It's both things: it's coming to terms with the real technological changes—and again, big data is a huge issue here—on the one hand, and it's also understanding how the notion of privacy has itself evolved into a much more social and participatory matter than was thought of in the Privacy Act originally.
Well, it was more than that. They wanted to know every cellphone transmission that had gone through the tower. In addition, they wanted the subscriber information linked to those cellphone numbers from the companies, they wanted credit card and billing information, and they wanted to know who those 43,000 people who had just been in that part of the city were calling.
After Rogers and Telus pushed back, the police narrowed the scope of their warrants, saying, “Never mind. This is all we want. Now don't take us to court.” They tried actually to get the case thrown out on the basis that they had narrowed the scope of their warrants and therefore the charter issues weren't raised. The court decided to hear it anyway.
It's a very strong decision. In it the court is basically saying that we need guidelines for judges who are issuing these types of orders. The police need to be very careful about what they're searching for. We shouldn't be allowing fishing expeditions. The information sought went way beyond what was required. There should be a different approach to it.
The other thing that the judge said at the end of his decision, about an issue that had been raised by Telus and Rogers, was that once all of this information is in the hands of police after these search warrants are issued and the police collect the information, there are no rules in the Criminal Code, PIPEDA, or any statute as to what happens to that information. Is it kept forever? Is it used for other purposes? Is it just stored in a database somewhere, where there might be a data breach of credit card information and other data?
The judge said this is not for us; this is for Parliament to deal with. The court can't create guidelines around that.
This is an issue if police are going to be collecting huge volumes of information. What happens to it and what are the guidelines around disposal of that information once the purpose for its collection disappears?
That's a complex challenge. Right now there are class action lawsuits already under way against the federal government for negligent handling of personal information, for data breaches. Civil recourse and class action lawsuits are going to become more common, so that is one way in which people can have their day in court.
Professor Austin has talked about charter recourse, and there is charter recourse that's available. In some cases it can be brought by the affected individuals. We were just speaking about a case in which it was brought by telecommunications companies that felt that too much data was being sought from them, and that is not the only case in which companies have pushed back. There are these other recourses that are outside the Privacy Act.
In terms of the Privacy Act itself, one concern is exposing the government to liability. If you create obligations or standards that are set in very strong terms in the legislation, that may increase the risk of liability for the government.
In part, the model has also been one of attempting to improve compliance and improve practices within government around personal information. On one level, that's been the ombudsman model. Now the commissioner is seeking additional recourse, an additional means for citizens to insist on compliance with their rights.
Whether that involves just getting a court order for recommendations to be enforced and practices to be changed, or whether it also includes a right to damages, is not entirely clear. You can have a recourse that leads to a court order and a change in practice without having a recourse in damages. Whether damages are required is something to consider.