Thank you very much for allowing the Canadian Civil Liberties Association to appear before the committee today.
We were founded in 1964 to protect the rights and freedoms cherished by Canadians and entrenched in our Constitution.
Too often these days, privacy is characterized as a barrier that someone can decide to erect or to take down. An institution or a group might want to build it higher, chip away at it, or smash it completely, depending on its assessment of privacy as a value. The barrier metaphor, which we see increasingly in the media and other conversations about encryption, health information, and national security, to name a few, is a confrontational, unproductive, and arguably ineffective way to think and talk about privacy.
CCLA suggests that, particularly in the context of this much-needed conversation about Canada's federal privacy legislation, we need to talk about privacy as a human right. A rights-based approach, of course, doesn't remove all conflict, because all of our charter-protected rights and every right that is enshrined in international law exist in tension with other rights. However, it does provide us the motivation to engage on the level of first principles, so when we begin to specify what privacy protection actually looks like in Canada, we are all operating from a common understanding that it matters, not just to individuals but to us as a society, nationally and globally. A commitment to privacy as a human right can help us navigate the dramatic changes we have seen since the act came into effect more than 30 years ago.
Technology hasn't just changed the ways we can collect and use data; it has also changed our societal attitudes toward information. The potential of large collections of information—big data—to reveal useful patterns and previously hidden secrets is regularly heralded by both private and public sector bodies. Government both collects information itself and potentially has access to the ever-increasing stores of information in the hands of the private sector. At the same time, what we hear from people when they call and speak with us at CCLA is that citizens are afraid of technologies and processes they don't understand being used, without their knowledge, in ways that can have serious consequences for their life chances. This legitimate fear is undermining public trust in bodies, including governments, that collect, manage, and store information.
In this data-rich environment filled with data-hungry actors and fearful citizens, it is increasingly important that Canada's privacy law be revised to be strong, flexible, and well-grounded. It needs to encompass contemporary and future uses of personal information and, most important, it needs to engender trust in Canadians.
All of our recommendations are made with these overarching concerns in mind. I am going to rocket through 10 points, most of which are going to be familiar to you because they are in agreement with the submission to this committee, in March, by the Privacy Commissioner of Canada and subsequent witnesses. I am going to be extremely brief, but I am happy to clarify during questions.
First, we must ensure that there is a necessity and proportionality standard put into the legislation, to be applied when deciding whether to collect information, whether to keep it, and whether to share it. Is it needed? Is it proportionate? By that I mean, would it withstand a charter challenge? This standard will encourage data minimization and guard against the well-known tendency to over-collect data and store it for too long, just in case it might be useful sometime in the future.
Requirements for information sharing agreements need to be clarified in this legislation as well. This is particularly vital since the passage of the Anti-terrorism Act, 2015, which greatly expanded the scope of information sharing between government departments. At the time Bill C-51, as it then was, passed, the reassurance that Canadians were given in relation to these new sharing powers was that the Office of the Privacy Commissioner of Canada would have review and oversight. Regardless of the changes that are or aren't made as a result of the ongoing national security consultation, we believe revisions to the Privacy Act can provide much-needed safeguards and transparency for all information, for any purposes, in Canada. There need to be openness and transparency regarding the way information sharing happens, the extent of that sharing, and the explicit safeguards that we assume will be put in place to ensure that sharing is proportionate and that privacy risks have been properly assessed and mitigated. Of course, this holds true for sharing domestically and with foreign governments.
Transparency reporting requirements, in that same vein, need to be clarified and established. In particular, that is the case for lawful access requests made in a law enforcement context to private sector bodies that hold information about individuals. These reports provide valuable public information that can foster and inform public debates and decisions about privacy and, going back to my earlier comments, enhance trust in government institutions. Citizens deserve, and many want, an understanding of the nature and frequency of requests by law enforcement bodies for their personal information when it happens without their consent or knowledge. CCLA has always argued that the ability of law enforcement to make these requests should be limited, as per Spencer, but to the extent that these requests are allowed, with or without a warrant, a strong transparency regime is necessary to ensure the public is properly informed.
In keeping with the theme of enhancing public trust in the way government collects and uses personal information, CCLA would also recommend that privacy impact assessments be mandatory when government departments create new or expanded programs that might affect Canadians' privacy. The assessments need to be submitted to the OPC, Office of the Privacy Commissioner, for review during the design and planning phase while there is still time to mitigate any privacy risks. At the conclusion of the process, appropriate summaries should be made public so that citizens can see that this process has happened.
In a similar vein, we suggest that there should be consultation with the OPC when drafting legislation and regulations that affect the privacy of Canadians. Again, that should happen before the bills are tabled. This recommendation is directly relevant to my preamble, where I asked for privacy to be talked about as a human right. Having a process in place where privacy interests can demonstrably be shown to have been taken into consideration in the development of new legislation gives privacy rights the appropriate weight and is consistent with international trends.
We would also encourage government institutions to lead the way in cybersecurity by adding a specific obligation in the act for them to provide the appropriate level of both technological and processual protection to data collected, whether it is in transit, at rest, during use, in storage, or at the time of destruction. We recommend the federal government take a proactive approach to making sure the data its institutions hold is protected to an exemplary standard. We believe this can be achieved, in part, by revisions to the Privacy Act. Of course, more information will come out about that in the cybersecurity review.
We would like to see breach reporting made mandatory in law rather than just policy. Government institutions should have to report breaches beyond an agreed-upon threshold to the OPC and notify individuals in a timely manner. The threshold needs to be clearly defined in the legislation, much the way it was done in similar amendments to PIPEDA.
Even if breaches fall short of the standard that is agreed upon for mandatory breach reporting, government institutions should be required to keep records of all breaches for possible review by the OPC. Knowing that they are accountable for doing so will be a strong motivator for needed data security and improved data stewardship.
The record-keeping requirements need to be sufficiently robust so that the commissioner can look at them and make sure that the assessments about whether or not a breach meets the threshold are happening properly.
We would like to see order-making power given to the Privacy Commissioner. It was with interest that we noted he now agrees. More information sharing and collection means that more potential harm can come from excesses. There need to be consequences in proportion to the risks, which means that the commissioner needs expanded powers to make sure the fullest protection of the revised law can be brought to bear in a timely and effective manner.
Last, we recommend regular review of the act every five years. I don't think that requires elaboration in this changing environment.
Once again, thank you very much for allowing us to appear.
Thank you very much, Mr. Chair, for the invitation to participate in your meeting. I'm going to share some thoughts about technologies that are just around the corner and that I believe will have a profound impact on how we think about privacy. My goal is to help us understand them so that, as much as possible, our laws can be ready for what's coming next.
I am a professor in the Faculty of Environmental Design at the University of Calgary, as well as an adjunct professor of computer science. I'm a research fellow of our Centre for Military, Security and Strategic Studies and of the Canadian Global Affairs Institute here in Ottawa. I've spoken at all the major hacker conferences, like DEF CON, Black Hat, and one with the intriguing name of Hackers on Planet Earth, so I try to keep track of what both the good and the bad hackers are up to.
I'm also pretty sure that I taught Canada's first course in information security in 1974. Back then it was simple: lock your computer room doors, choose good passwords, and don't put confidential stuff in the trash. Today, it's much more complicated.
Consider a 2015 project called “The Face of Litter”, sponsored by Hong Kong Cleanup. Workers collected discarded chewing gum and cigarette butts on that city's streets and sent them to Parabon NanoLabs, a privately held Delaware corporation. Parabon used DNA phenotyping to create an approximate digital portrait from each sample. A week later, on passing the scene of the crime, the spitter saw an eerily familiar face on a video screen, a DNA-driven self-portrait.
Now, how could they do this? There was plenty of saliva left on those discarded items to do DNA analysis. In fact, it requires only one nanogram. Certain traits like eye colour, hair colour, and facial shape are easy to work out. Ancestry can be analyzed. Stir in machine intelligence and real-world knowledge—gum chewers are more likely to be 18 to 34, and cigarette smokers older—and you get a very creepy scenario whereby biodata is used not to identify someone specifically, but to infer things about the person. This challenges our long-held definitions of personally identifiable information and personal health information.
In my 2014 book, Technocreep: The Surrender of Privacy and the Capitalization of Intimacy, I suggest that a store might grab a few skin cells when you type in your PIN and send them off for analysis. The next time you visit that store, you might see a pop-up asking if you knew you were pre-diabetic and saying, “Here's a special coupon just for you.” While to my knowledge no store is doing this yet, we have seen retail outlets in the U.S. and the U.K. use facial recognition to identify shoplifters, VIP customers, and known litigious individuals. Banks such as HSBC are already using facial recognition for client identification, and several Canadian banks are doing biometric trials.
Your biometric data, be it your voice, face, or DNA, might well be covered by the Privacy Act and under PIPEDA's definition of personal health information, though those definitions will need to be updated as technologies emerge, but does this legal protection do the average person any good? In practice, many customers would not notice an obscure clause authorizing the use of their biometric data in the retail or banking environment. It could be buried in the terms and conditions document, which hardly anyone reads. Some people might even give consent to the use of their biometrics, hoping to save money, get better service, or obtain useful health information.
I believe that citizens may not fully understand all the implications of collecting, storing, and exchanging their biometric data, as well as secondary uses and cross-correlation of biometric and other databases. We need laws that mandate full disclosure and a process to ensure real compliance, which would mean more than just guidelines on the OPC website. Even today, overt public surveillance cameras are supposed to carry proper signage. In my experience, most carry no signage, and nobody does anything about it.
Then there's the time problem. Fifty years ago, a criminal could leave blood at a crime scene with relative impunity since, aside from determining blood type, it didn't hold much information. Today, law enforcement is solving long-dormant cold cases through DNA analysis of old samples.
We cannot predict what future data analysts will extract from our biological and biometric data, except to say it will be more than they do today. Experts also suggest that quantum computers will be able to retroactively decrypt decades of data that we currently believe is secure. There's a wonderful phrase that describes all this: beware of time-travelling robots from the future.
I do detect a growing unease in the Canadian public. When I talk to people about biometric identifiers that can identify you, from ear shape to heart rhythms to your unique body odour, their ears perk up. Recently I was approached by Costco's magazine for an article on the downsides of biometric identification. I explained how fingerprints can be stolen and put on a fake finger with a 3-D printer. A hacker named Starbug even captured the fingerprints of the German defence minister from a high-resolution photo of her hand.
Even more troubling is the belief that biometrics are infallible, which they are not. They have error rates that vary depending on parameters set by the designers. The first-generation NEXUS terminals used at Canadian borders would sometimes fail to uniquely match a person from the eye biometrics they obtained.
Illinois and Texas have passed specific commercial biometric privacy laws, and article 9(1) of the European Union's forthcoming General Data Protection Regulation puts specific restrictions on the use of genetic data, and of biometric data where it is processed to uniquely identify a person. Canadians need a similar level of protection, and these laws provide a starting point for us.
Another area that needs serious thought is behavioural biometrics. In Technocreep I review Progressive Insurance's Snapshot device, which people install voluntarily to try for a discount on their car insurance. It records how much they drive, when they drive, and how hard they hit the brake. I suggested that it might be a sensible choice for some people, especially since it didn't track where they drove. Then Desjardins insurance brought out the Ajusto app, which uses your smartphone to create a driving-quality score. Unlike Snapshot, this system knows exactly where you are, and even how well you respect the speed limit.
Right now, systems of this nature are opt-in, and the companies take pains to tell consumers that even bad driving will not raise their rates. However, there is certainly the possibility of driving monitors and even wearable fitness monitors becoming de facto mandatory in order to obtain insurance at a reasonable rate. Insurance is, after all, about spreading risk and charging risk-based premiums.
In opposing the long-overdue genetic privacy bill for Canada, Jacques Y. Boudreau, chair of the committee on genetic testing for the Canadian Institute of Actuaries, argued that an essential element for insurance to work properly is equal access to information by both parties. There is clearly tension brewing between our right to keep information private and commercial interests.
We spend a lot of time worrying about how an authorized data collector uses our data. However, a flood of data breach examples, from the Sony hack to the DNC emails to the Ashley Madison fiasco, proves that our personal data can fall into the wrong hands with devastating consequences. People whose email addresses appeared on the Ashley Madison client list have received blackmail threats, suffered workplace repercussions, and in three reported cases have committed suicide. A further complication here is that, due to the lax design of the system, people could appear on that list without having actually signed up.
While there are hacking-related Criminal Code provisions such as mischief in relation to data and unauthorized use of computer, these do not directly address the privacy implications of hacking. Of course, many perpetrators are never caught, but some are. There should also be consequences for the entity that manages the data if they did not take reasonable precautions to secure it.
Therefore, I support effective data breach notification in both the public and private sectors, as well as enhanced mechanisms, including order-making powers, to enable the Privacy Commissioner to preserve public confidence. I also support regular review of our privacy laws at least every five years.
I will close by revealing that you've been listening to a cyborg, a human being with a new technological body modification. I had an RFID chip implanted in my hand at this year's DEF CON conference. Right now it gives me only one superpower: I can open my door at the university without fumbling for my ID card. In the near future, devices will be available to give people telephoto vision, super-acute hearing, and enhanced mental powers.
Canada's first privacy laws date from the era when information was kept on paper, and we dragged them into a world where our data lives in cloud networks somewhere on the planet. Our next challenge, one that will keep us busy for a long time, is dealing with the implications of the data being us, an intimate part of our humanity.
Thanks so much for your attention. I look forward to your questions.
Surveillance is always scary.
I'm back here to testify given my involvement for over four decades in privacy matters and advocacy. My privacy advocacy work began with a local civil liberties group dealing with the growing use of social insurance numbers as an identifier. As an investigative researcher, I dug up information on the problems with increasing use of computer matching of personal information, and yes, back then, I was a witness testifying on the limits of Canada's proposed privacy act and on secretive data sharing.
Now the privacy issues are even more complex in this digital age and are threatening given the widespread legal and illegal sharing of and access to personal data, metadata mining, data profiling, and massive surveillance.
Throughout, I've never wavered in the belief that Canadians need more access to and control over their personal information and better information about intrusions. Canada cannot continue, three and a half decades later, to have weak privacy legislation. The focus on limited access to one's own records, to the detriment of regulating the state's and the private sector's relentless intrusions into the lives of Canadians, has left us with inadequate safeguards.
There is not much in the current Privacy Act and PIPEDA that puts a stop to online snooping, data mining, and biometric identity matching or that addresses and restricts the growing use of secretive newer surveillance technologies like Stingray cellphone listening devices or prevents the increasing sharing of Canadians' personal data with foreign authorities.
No Canadian minister or prime minister has stepped in to demand better privacy protection or to propose remedies against the means of secret mass surveillance trawling that Edward Snowden revealed.
No Canadian prime minister has put in place regulatory restrictions that, for instance, deal with the handling of increased amounts of Canadian personal data housed or transmitted through the United States and potentially captured under its Patriot Act or subjected to other foreign entity intrusions.
The government's recent discussion paper on police security powers does not alleviate civil liberties and privacy protection concerns. Statements by Mr. Brison, including before this committee, that more, not fewer, records must be exempt under national security do not calm those concerns. Brison went on to say that the Information Commissioner, or for that matter the Privacy Commissioner, would have limited review of and access to such security records, so the Trudeau government's opening moves are far from reassuring.
What we do need is a greatly strengthened data protection act. Let me briefly turn to 10 areas in which improvements can be made.
My first recommendation for improving the legislation is in agreement with the testimony of a previous witness, Lisa Austin. We must begin by framing further restrictions on privacy invasion in terms of, and in line with, Canada's Charter of Rights and Freedoms, so first and foremost a new act's purpose clause must recognize privacy protection as a constitutionally protected right.
My second recommendation is that a basic rewriting of the privacy legislation create a whole new, predominant part one that emphasizes transparent and enforceable obligations and restrictions on data sharing, matching, profiling, and tracking.
If a privacy act is to become, as it should be, a data protection act rather than simply a limited and outdated access to personal information act, there must be provisions added for tougher and clearer regulation and restrictions on personal information sharing.
While the Privacy Commissioner calls for prompt mandatory reporting of public sector personal data breaches, he only advocates some selective notification of those affected and minimal transparency, and he sets out no enforceable binding order or penalty powers for his office despite the fact that such breaches occur fairly regularly. I'll explain that more.
My third recommendation is threefold. First, individuals should be given mandatory rights of consent, on a timely basis, for government collection and use of their information. Second, there should be fewer exemptions, exempt banks, and delays, so individuals can more fully and promptly obtain their information. Third, all agencies, including the prime minister and his office, should be covered.
My fourth recommendation, which former privacy commissioner Jennifer Stoddart suggested, is that unrecorded information, such as personal biological samples, including DNA and iris scans, be covered. Data gathered from radio frequency identification chips, or now by Stingray collection, needs to be explicitly covered by public and private sector privacy legislation.
My fifth recommendation is that officials' salaries and perks, and private sector violations, no longer be considered personal information but be made public. For example, exact bonus payments received must be made public. The name of the company, as in the case of the bank fined $1.1 million by FINTRAC, or of companies and individuals found to be tax haven offenders, must also be made public.
My sixth recommendation is that the Privacy Commissioner have order-making power. Commissioner Therrien now agrees on this point, but enforcement powers and stiffer penalties for privacy invasion would still be needed to effectively restrict privacy invasions and regulate transborder data flow. His office would need wider investigative powers to review such matters as questionable transborder data flows and metadata collection. It is not simply a matter of order powers.
My seventh recommendation, in agreement with Commissioner Therrien, is that both he and all Canadians need an expanded legislative right to go to court, including in cases of improper collection and use of personal data. Courts are now able to hear only cases about access to blocked individual personal files. It would also help if individuals and groups bringing such privacy violation cases to court were given resources to sue the government. It is important to note that individuals and groups may still find commissioner orders too limited and want the courts to provide greater privacy protection than those orders offer.
My eighth recommendation is that oversight of access to information and of privacy be separate. Joining the two acts so closely together limits the opportunity to more fully develop their separate and, at times, conflicting public interests. One is about proactive disclosure, transparency tools, and accountability practices; the other is about restricting privacy invasions and enhancing data sovereignty. It is time to untie privacy legislation from access to information legislation.
My ninth recommendation is that, in order to have an effective data protection act, the House privacy committee must consider bold changes to the Privacy Act in conjunction with improving the Personal Information Protection and Electronic Documents Act, PIPEDA. The threats under both acts are similar, the remedies the same, and the object the same: Canadians want more control over what personal data third parties, from police to marketers, can access.
My 10th and final recommendation is for greater transparency—no surprise—when it comes to the public knowing about the use of privacy invasion powers. Canadians remain largely unaware of the systems and means authorities are using to conduct surveillance that can affect them. Little is known about the cost of surveillance or about the budget and expenditures law enforcement and security forces have in this regard. We remain in the dark about how frequently and where, for instance, Stingray equipment is used and the cost involved. We remain unclear about what laws or authorities allow surveillance. I can think of dozens of such laws.
The committee, in addition to conducting periodic reviews of privacy legislation, should have a subcommittee tasked with reviewing and questioning laws that broadly allow privacy invasion and intrusions.
Let me end with an example of how the public is kept in the dark about the way Canada's system of surveillance operates. Recently I uncovered data showing that, between 2014 and early 2016, the public safety minister and his officials issued licences to the RCMP, CSIS, and the CSE at National Defence, which in turn allow unnamed private companies to possess and sell surveillance equipment, possibly malware, Stingrays, or other surveillance equipment or components, to unnamed buyers. This is done under the cover of a section of Canada's Criminal Code.
Documents obtained indicate, for instance, that CSIS has trusted, long-standing relations with certain surveillance companies and that, in one instance, a ministerial licence granted was backdated. We do not know, then, what kind of surveillance occurs, and there is no known reporting requirement, as there is under wiretap legislation.
The point, Mr. Chairman and members of the committee, is that a minister of the Crown oversees this surveillance arrangement far from public scrutiny. His or her first concern is not to champion privacy protection for the public. I and others are offering suggestions for Canada to move beyond weak privacy protection legislation and lax regulation, in order to protect citizens.
I thank you very much.
Mr. Chair and members of the committee, good morning. My name is Tamir Israel, and I am staff lawyer with CIPPIC, the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic at the University of Ottawa's Centre for Law, Technology and Society and the Faculty of Law. CIPPIC is a public interest legal clinic that works to advance the public interest in policy debates arising at the intersection of law and technology.
I wanted at the outset to thank you for inviting us to testify before you today, as well as for undertaking this important review of the federal Privacy Act, a central component of Canada's privacy, transparency, and accountability framework.
Since the introduction of the Privacy Act in the late 1970s, the policy landscape surrounding data protection has evolved dramatically, driven by tectonic shifts in the technical capability and general practices surrounding the collection and use of personal information. The federal Privacy Act has simply not kept pace with these dramatic changes, a reality that hinders its ability to continue to achieve its objectives, in light of heightened incentives and technical capacities to collect and keep personal information at unprecedented scales. The nature of the objectives incentivizing state data practices has rapidly evolved over the years since the adoption of the act, which initially focused primarily on regulating data practices animated by administrative purposes.
Today's privacy challenges are driven by a far more diverse set of incentives. The era of data-driven decision-making, colloquially referred to as “big data”, increasingly pushes state agencies to cast wide nets in their data collection efforts. Additionally, more often than not, the act is applied in review of activities motivated by law enforcement and security considerations that are far removed from the administrative activities that animated its initial introduction.
Finally, data sharing between domestic and foreign state agencies now occurs on a more informal, and often technologically integrated, basis than could have been envisioned in the late 1970s.
The Privacy Act is in drastic need of modernization, and to that effect, CIPPIC has reviewed and largely endorses the recommendations made by the Office of the Privacy Commissioner of Canada to this committee with respect to changes necessary to ensure today's data protection challenges are met. We will elaborate on a few of these, as well as on some additional recommendations that we have developed in our comments today. In addition, in our written comments, which will eventually make their way to the committee, we provide some legislative language suggestions, which we hope will help guide your review of this act.
The remainder of our opening comments focus primarily on discussing and highlighting specific recommendations designed to enhance proportionality, transparency, and accountability, as well as address shortcomings that have arisen from specific technological developments.
Before turning to these broader themes, however, our first recommendation addresses the Privacy Act's purpose clause, which we believe should be updated to explicitly recognize the objectives of the act: to protect the right to privacy of individuals, and to enhance transparency and accountability in the state's use of personal information. Express recognition of these purposes, as is done in provincial counterparts to the Privacy Act, will assist in properly orienting the legislation around its important quasi-constitutional objectives, and will help to secure its proper and effective application if ambiguities arise in the future, as they surely will.
Necessity and proportionality are animating principles that have become central to data protection regimes around the world, but are absent from the aging Privacy Act. It's important to explicitly recognize these principles in the act, and to adopt additional specific measures that are absent from its current purview, but are nonetheless essential to ensuring private data is collected in a proportionate manner.
As a starting point, first, the Privacy Commissioner's recommendation for explicit recognition of necessity as the standard governing data collection practices should be implemented. Necessity is a formative data protection concept and provides important context for assessing when data should or should not be collected, used, or disclosed. The existing standard, which requires only that data practices relate directly to an operating program or activity, is simply too imprecise in the age of big data, where organizations are increasingly encouraged to collect data that has minimal clear, immediate connection to current objectives.
Second, the Privacy Act imposes no explicit limitations on how long data can be retained once it is legitimately collected. The lack of any explicit obligation to adopt reasonable retention limitations can mean that data is kept well beyond the point where its utility has expired, greatly increasing the risk of data breach and of inappropriate uses. The lack of an explicit retention limitation requirement can even lead to the indefinite retention of data that has only a very short window of utility, seriously undermining the proportionality of a particular activity.
As an example, our clinic, along with Citizen Lab at the Munk School of Global Affairs, recently issued a report examining the use of a surveillance tool called a cell site simulator. These devices operate by impersonating cellphone towers in order to induce all mobile devices within range to transmit certain information that is then used to identify or track individuals or devices. The devices operate in a coarse manner: for each individual target they are deployed against, the data of hundreds or thousands of individuals within range will be collected. The non-target data collected is only immediately useful for identifying which datasets belong to the legitimate target of the search and which do not, an objective that could be accomplished within 24 to 48 hours of collection. However, because the underlying collection of these thousands of non-target datasets is legitimate, the datasets might be kept indefinitely. These large datasets can then be reused at any point in the future and, subject to ancillary statutory regimes such as the Security of Canada Information Sharing Act, which was recently adopted via former Bill , can be shared across a wide range of other agencies.
Including an explicit retention limitation provision would not only mandate state agencies to adopt clear retention policies, but would also allow the commissioner to address unreasonable retention in a principled manner. This, in turn, will reduce the risk of data breach and generally increase the proportionality of data collection practices.
Third, we would recommend adopting into the Privacy Act an overarching proportionality obligation that would apply to all collection, retention, use, and disclosure of personal information by government agencies. This would be comparable to its counterpart currently found in subsection 5(3) of PIPEDA. As you have heard from other witnesses, the Privacy Act increasingly provides an important avenue for ensuring that charter principles for the protection of fundamental privacy rights are fully realized. An overarching proportionality or reasonableness obligation modelled on subsection 5(3) of PIPEDA would provide an avenue for assessing charter considerations across all data practices. It would also give the Privacy Act a measure of flexibility, allowing it to keep pace with technological change by providing a general principle against which unanticipated future developments can be measured.
In addition to these proportionality measures, there are clear gaps in the Privacy Act's current transparency framework and further opportunities to enhance the openness of state practices, which in turn will encourage accountability and public confidence.
At the outset, we encourage the adoption of the Privacy Commissioner's recommendation for a public policy override to the act's confidentiality obligations. This would allow important information regarding anticipated privacy activities to enter the public record in a timely manner.
Second, the Privacy Act should be amended to include statistical reporting obligations for the various electronic surveillance powers in the Criminal Code. As Mr. Rubin mentioned, statistical reporting obligations were once a hallmark of electronic surveillance regimes and are attached to certain electronic surveillance activities, such as wiretapping, but those activities have largely been superseded by other electronic surveillance activities that carry no comparable statistical reporting obligations.
One investigation conducted by the Privacy Commissioner's office recently found that law enforcement agencies themselves did not have a clear picture of the scope of their own practices in relation to the collection of subscriber information from telecommunication companies. Understanding the nature and scope of state surveillance practices is all the more important in light of the tendency for rapid change in practices in this sphere. Imposing a statistical reporting obligation in the Privacy Act that applies across the spectrum of electronic surveillance powers would therefore provide an important transparency mechanism.
Finally, the adoption of a general obligation on state agencies to explain their data practices would greatly enhance transparency. While the act currently obligates government agencies to explain to individuals the purposes for which their personal information is collected and used, it lacks a general obligation to explain agency practices.
One modelled on PIPEDA's openness principle would be beneficial. If this concept is adopted, it should address the challenges raised by algorithmic non-transparency, which would entail an obligation to explain the logic of any automated decision-making mechanisms adopted by the state.
We have some suggestions on accountability and compliance measures that I will submit in writing and you folks can review at a later time.
I did want to very quickly touch on a couple of recommendations we have that address very specific technological developments that have led to gaps in the Privacy Act.
We would recommend updating the definition of “personal information” so that it is aligned with the comparable definition under PIPEDA. The current definition only applies to personal information that is recorded, whereas many modern data collection and use practices never actively record any personal information, but can still have a very salient privacy impact.
In addition, we would endorse the Privacy Commissioner of Canada's recommendation to adopt an explicit obligation to adopt reasonable technological safeguards, as well as individual breach notification obligations.
Finally, and very briefly, we would also endorse the Privacy Commissioner's recommendation to formalize the privacy impact assessment requirement, as well as recommend an avenue for facilitating public input into the process so that discussions of privacy-invasive programs can occur with public input at the formative stages.
Thank you. Those are my comments for today.