Welcome to meeting number 20 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.
Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Monday, December 13, 2021, the committee is resuming its study of the use and impact of facial recognition technology.
Today's meeting is taking place in a hybrid format, pursuant to the House order of November 25, 2021. Members are attending in person in the room and remotely by using the Zoom application.
I think everyone here is probably fairly familiar with how this works, so I won't go into more detail. If you're on Zoom, please be sure to unmute yourself when you begin to speak, and select the official language that you wish to receive, or simply the floor feed if you prefer.
This is a resumption of the testimony we were receiving from the RCMP and the Toronto police that was cut very short due to votes both before and after our committee meeting began a week ago last Thursday.
With that, I'm going to dispense with opening remarks and go straight to our questioning. We are also monitoring what is going on in the House. There is a notice of time allocation. If it is moved and we end up having a vote this morning, then we will deal with that when it happens. I think we'll have quite a bit of time for questions to resume with these witnesses.
Facial recognition technology has been used on only three occasions. On two occasions it was with the child exploitation centre that I'm in charge of, where we were able to identify victims of this horrible crime and put safeguard measures in place to protect the victims, who were located in Canada. On the third occasion it was used to track an offender, a fugitive who was abroad.
There have been no prosecutions using this technology. It's simply been used for identification on two different files with our child exploitation centre. One was when a person from outside the country was trying to exploit two children in Canada to perform sexual acts. We were able to identify the victims and provide safeguards to protect the victims from the person who was trying to offend.
Another situation in which it was used was an international case. There was a file from 2011 on a victim who could not be identified through traditional means. The entire international community had been trying to find this victim for about nine to 10 years and was unsuccessful. We were able to use facial recognition technology, within our scope, to identify this victim, who was in the States. We reached out to the Americans, and they were able to confirm that this person had in fact been charged and convicted in the States, based on their information on those charges.
I guess the importance of the facial recognition is that the international community had continued to look for this victim for nine to 10 years and was unable to do so. We were able to use facial recognition to identify this victim. In fact, a court process had been completed in the United States of America, and he was convicted on that American charge. It had nothing to do with what we did in Canada.
When it was initially rolled out, our members started to utilize it on those three cases only.
A lot of members were testing the technology to see if it worked. They ran searches on their own pictures and their own profiles to see if the technology worked. They also ran media searches: they took photographs of celebrities and ran them through Clearview to see if it worked.
In fact, by testing this technology, we realized that it wasn't always effective. There were certainly some identification problems, and that's why we use it only as a tool in the tool box and do not rely on it, because you do need that human intervention to identify who the victim is. It is not always correct. It was absolutely critical that we did have that human intervention when we utilized it.
Many of the queries were testing the program. The only three cases were the three cases that I just spoke of.
For the Toronto Police Service, during testimony on April 28, you acknowledged that the Toronto Police Service uses FRT in limited circumstances. Is the use of facial recognition technology in an investigation disclosed to either the court or the individual over the course of an investigation after an arrest?
Part of the Clearview AI issue was that we didn't have a proper assessment process, so we're in the process of putting one in place. We've had consultations on the board policy that looks at AI/ML, and we're in the process of drafting the procedure that will sit underneath it.
Essentially, it starts with a determination of what the benefit of the technology might be that would drive us to even look at it. Then there's a set of flags, tied to various risk factors that we determined through the consultation we ran on the public policy. Those flags would route the technology into a separate process, ultimately leading to public consultation on that specific technology and a risk assessment to determine whether it goes forward.
Sure. There is extreme risk, which is something we would not do. It would be banned. Then there's high, medium, low and very low risk. The reason we needed more strata was to account for AI/ML applications we're getting that are baked into existing, fairly simple and non-controversial types of applications.
The board policy calls for all of our technology to be posted and to be evaluated under this frame. We are not going to be transparent about the very low risk and low risk, because we expect there will be a great number of them and the load on our service would be very high.
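The flag-and-tier triage described above can be sketched in code. The tier names follow the testimony, but the specific flags, weights and thresholds below are hypothetical illustrations, not the Toronto Police Service's actual criteria.

```python
# Hypothetical sketch of a flag-based risk triage for new AI/ML technologies.
# Tier names follow the testimony; flags and weights are invented for illustration.

RISK_TIERS = ["very low", "low", "medium", "high", "extreme"]

# Example risk flags drawn from themes mentioned in testimony.
FLAGS = {
    "charter_rights_impact": 3,   # could affect Criminal Code / charter rights
    "biometric_data": 2,          # processes faces or other biometrics
    "algorithmic_policing": 3,    # directs resources toward communities
    "third_party_data": 1,        # relies on externally scraped data
}

def triage(technology_flags):
    """Map a set of raised flags to a risk tier (illustrative only)."""
    score = sum(FLAGS[f] for f in technology_flags)
    if score >= 6:
        return "extreme"   # banned outright
    if score >= 4:
        return "high"      # full public consultation and risk assessment
    if score >= 2:
        return "medium"
    if score >= 1:
        return "low"
    return "very low"      # posted, but no individual consultation

print(triage(["charter_rights_impact", "biometric_data"]))  # high
```

Higher-tier technologies would then be surfaced to the board and the public for consultation, while the very low and low tiers pass through without individual review, matching the load concern described in the testimony.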
Good morning, Mr. Chair and honourable members of the committee. Thank you for this opportunity to speak to you today.
We've made significant progress in the implementation of the national technology onboarding program, which is the main mechanism for meeting all of the recommendations from the OPC. Every technology will be assessed not only from a privacy aspect but also from a bias, ethics and legal perspective before being used in any operation or investigation going forward.
As per the recommendation, we have until June 2022 to implement the program, so we still have a bit of time. We're working very hard right now to complete that. There's a slight risk that not all the training will be given by that time frame, and we may have a capacity issue, because we're having challenges with recruitment of additional resources within the program. However, the key foundation pieces for that program will be in place by June 2022.
With regard to the key pillars of the national technology onboarding program, one is stakeholder outreach and partnership, which includes the training. There's also a policy review in development to identify all gaps in existing policy and to modify and update it. There's a technology assessment portion, where we built a full intake process through a series of questionnaires. We're also implementing a technology inventory for awareness and oversight. The last component is public awareness and transparency.
Can you explain to me why the Privacy Commissioner's report released on June 10, 2021, mentions that the RCMP confirmed that it had purchased two licences to use Clearview AI services in October 2019, and that its members had also used Clearview AI services since then?
At this time there are two lawsuits against the RCMP. According to the legal provisions, we may not be able to provide you with all the information requested. We will certainly provide you with as much as we can.
I insist that the information be sent to us, because we need it to do our parliamentary work properly, especially since my question was not answered accurately and I had to insist a lot. I want you to know that we will insist.
I have another question for the RCMP officials.
Earlier, you told us that facial recognition was only used on three occasions. As with the contract, I guess we have to take your word for it.
If a situation arose that you would consider urgent, how ready would you be to deploy this type of technology again very quickly and at very short notice?
You have suggested that you would use it in urgent circumstances. Tomorrow morning, would you be prepared to deploy this technology again in an emergency?
At the present time, I'm unable to use that technology, which is very unfortunate, because there are victims at risk in Canada under the child exploitation side of the house. I cannot attempt to identify them, because I don't have the technology.
You talk about an urgent file; that is the most urgent file, in my eyes. There are victims in Canada who are being exploited by people—
I'm waiting on a decision from our national technical operations, our NTOP process, to do that assessment. Once I have that assessment from there and I'm told that I can use it, I will continue to look for victims of child exploitation. Until I get that process completed from my NTOP people, I cannot use it, and victims are at risk today.
With other officers, when they access CPIC, and this was a point of contention I had with your superior.... When they surreptitiously access CPIC to gather information that has not been lawfully granted.... There is a code of conduct within the RCMP. Was the investigating officer who pursued this technology ever investigated for a violation of the code of conduct?
—presented completely, accurately, fairly and fully.
Is this not something you are familiar with?
For instance, when you're seeking to get a warrant, sir, are you not aware that as an RCMP officer, when you're presenting evidence for a warrant ex parte, you must present it even if it casts a negative light on you and that you would still have to present it? Are you familiar with that principle?
Would you not agree that even in the testing technology, your officers who were testing celebrities and other people, invariably using this technology, would have drawn in facial profiles from hundreds of thousands, if not millions, of people who are under no lawful investigation?
Director Sage, am I hearing in your testimony today that any ground-level investigator can, either through procurement...?
What we're hearing in other services, quite frankly, is that they're using trial services, trial subscriptions, on AI and different types of technology to test their capabilities, because there aren't existing frameworks in place.
I want to make sure that I get this on the record.
My friend from the Bloc asked earlier for those documents. In advance of receiving them, are you prepared to provide us with the name of the person who signed the licence? That wouldn't be anything subject to any kind of solicitor-client privilege.
Mr. Boileau, if it helps you, we're talking about the duty of candour. This is a parliamentary committee that you're before. There have been instances in these committees when we have asked witnesses to swear an oath. Are you aware of the seriousness of the committee in which you're testifying today?
I'm going to ask that you please provide that to the committee. I think it's pertinent to our study.
When you're using that in the force right now, you're saying that you identify mug shots and then use that for.... Is it for evidence? Is it to identify criminals? Could you please explain exactly how that is being used?
Sure. You might have an image from an event, and you have a person, who is usually a suspect you're trying to identify. That would be handed over to FIS, and they would look at the situation and ensure that it meets our criteria, that it's a significant enough crime and the right type to meet the criteria we've set up. At this point, they would run the image against our Intellibook system, and it would result in a ranked order of matches, some of which might be relatively good and some of which might be poor. There will be an assessment by the FIS technician as to any of those being viable, and that would be presented back to the investigators.
If none of the matches was sufficiently strong, then there would be no result returned. The investigator would then have to corroborate that identity through other means. Facial recognition is not considered an identification; it's a suggestion of where to look.
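The workflow just described, ranked candidate matches against a mug shot gallery with a threshold below which no result is returned, can be sketched as follows. The embeddings, booking identifiers and threshold here are synthetic placeholders; the Intellibook system's actual face-matching model is proprietary and is not reproduced here.

```python
import numpy as np

# Illustrative sketch of the described workflow: a probe image is compared
# against a mug shot gallery, candidates come back in ranked order, and
# nothing is reported unless a candidate clears a review threshold.
# Embeddings and names are synthetic.

rng = np.random.default_rng(0)
gallery = {f"booking_{i:04d}": rng.normal(size=128) for i in range(5)}

def rank_candidates(probe, gallery, threshold=0.6, top_k=3):
    """Return gallery entries ranked by cosine similarity to the probe.

    Candidates below `threshold` are dropped; if none remain, the caller
    gets an empty list -- "no result returned" -- and the investigator
    must corroborate identity through other means.
    """
    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    scored = [(bid, cosine(probe, emb)) for bid, emb in gallery.items()]
    scored.sort(key=lambda x: x[1], reverse=True)
    return [(bid, s) for bid, s in scored[:top_k] if s >= threshold]

# A probe close to one gallery entry yields a ranked shortlist for a human
# examiner; an unrelated probe usually yields nothing at all.
probe = gallery["booking_0002"] + rng.normal(scale=0.1, size=128)
print(rank_candidates(probe, gallery))
```

The human assessment step in the testimony sits on top of this: even a high-similarity candidate is only a suggestion of where to look, never an identification on its own.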
In terms of trying to understand why the Privacy Commissioner is finding fault in this—and this is what we're trying to investigate—if you had a crime scene and you had fingerprints, can you use them the same way that you're using FRT right now?
I guess the difference on this one that we're looking at—and I'm going to go back to some other earlier testimony—is that when we have this FRT system, it's identified that we see up to 35% error rates in identifying, for instance, Black females versus white females.
When it comes to that identification, you stated in past testimony that you have a human who looks through that data, but are we still seeing that? Your testimony—I'm just going to get you to confirm that—was that the technology you're using was the least biased. Is that correct?
Okay. I guess the difference between a fingerprint, as you were saying, in a crime scene and this technology is that this one has proven to be inherently biased, or to have some bias, whereas a fingerprint would not have a bias, correct?
What about the body cameras that I know are used by some police services, including by Toronto police, I believe? How are those images used with the police service? Do those images ever get into your database?
They don't go into the mug shot database. That's a separate digital evidence management system that holds all the video from body-worn cameras. Body-worn camera data would generally not be used; the circumstances wouldn't arise. There is no connection between the body-worn cameras and the Intellibook system, no automated connection.
The only way to do a facial recognition off a body-worn camera image would be to lift the still, export it and then bring it into the Intellibook system through the process I described. That would be highly unusual, because if you're interacting face to face with someone, you don't usually need to then determine their identity through that kind of means.
You were talking about how the force is currently trying to develop policies and procedures surrounding facial recognition technology. Can you talk to us about that process? Who's involved in that process? Is it just sworn officers, or do you have advisers from outside the police force, maybe people with ethics backgrounds, who can help develop these frameworks and these ethical questions that should be included?
The process was initiated by our board in response to Clearview. The scope of it is slightly larger. It's looking at all AI and ML technologies, not just facial recognition. There are other technologies that have different but similar types of problems. We're looking at all of those.
We had an open consultation to specific groups—law societies, privacy groups, ethics groups and technology specialists—and then we had an open consultation that was open to any members of the public. We went through a round of that on the policy. Now we're expecting to do a similar round on the procedure, which sits underneath the policy and directs the service members.
What sort of outcome are you looking for? Are you looking for an ethical framework whereby you have a certain number of questions you have to ask before using any new technology? Can you describe a little bit about the outcome that you're hoping to get out of the process?
Sure. I think part of the problem we've got that triggered this conversation is that we have insufficient visibility and guidance to frontline officers on how they should approach new technologies. What we're looking to do is create a framework that allows us to filter and surface to our board and to our public the types of technologies that we intend to use and why we intend to use them, and then have a discussion in the full light of day on those technologies.
With the Intellibook program—I think you've already covered this, but this is just so that we're extra clear—an officer will pull up a list of potential suspects, and then it's really just a clue. It's not a piece of evidence that would be used in a court of law if a photo comes up in the Intellibook system.
If we're talking about facial recognition, when we have an unknown subject in a violent crime or involved in a significant issue, and sometimes when we have an unknown witness, these technologies can be helpful. They're very much limited by the scope of our mug shot database. We don't pool that with other police services; it's only our city of Toronto mug shot database.
Some of the flags are the procedural Criminal Code and charter rights. Something that could violate those in any way is certainly a flag. Not facial recognition, but something called algorithmic policing that might direct resources to different communities is a flag, because it can reinforce biases—
I utilize the technology to rescue victims and have not gone down the offender side of these investigations. My use is simply to identify victims only so we can provide safeguarding measures to the victim and then start the investigative process required to identify an offender.
I would only use it for a victim at this point in my world. I feel the needs of a child—
The NTOP process assesses the risks and the ethical issues, including a privacy assessment, on that technology. Once that is done, if I'm able to use it, I will use it. If I can't, I can't and I don't.
You said you're unfamiliar with the person who served in your position before you. Quite frankly, I think that's unbelievable, by the way. You're stating here today that you don't know the person who served in your position before you.
I'll save you the embarrassment, sir, because quite frankly, when I talk about the duty of candour and full and frank disclosure to this committee, what I have is a significant trust issue.
I reference that your service, sir, first denied the use of this technology and has, in initial claims, rejected the findings of the Office of the Privacy Commissioner. It has not, in my view, demonstrated the ability to have the kind of candour and frankness with civilian oversight bodies such as the House of Commons to provide basic information for Canadians who are concerned about their civil liberties.
Mr. Sage, you have quite frequently referenced “I” and “my” and “victims”. This study is not about you, sir.
Through you, Mr. Chair, to Mr. Sage, are you familiar with the RCMP using these technologies in divisions outside of your own?
Has the technology ever been used to provide supplementary information that would have become lawful evidence for the granting of warrants? In other words, could this information be used with three degrees of separation in order to get lawful warrants?
Just to backtrack a bit, we had the Privacy Commissioner here. His report on Clearview Technologies said the RCMP did not satisfactorily account for the vast majority of searches it made. The RCMP disagreed with the Privacy Commissioner's conclusion that they violated the Privacy Act concerning Clearview AI.
Does that position still stand today at the RCMP, after the hearings this committee has undertaken over the past month?
We presently have an employee working at the Office of the Privacy Commissioner. We've done a work exchange. As we develop our new ways forward, we have a member located within their office, and we are asking for one of their employees to join our office in order to strengthen that knowledge and relationship. We do have a member there presently to help us.
Wouldn't having somebody from the Privacy Commissioner's office seconded to the RCMP indicate that the RCMP is quite concerned they are offside with the Office of the Privacy Commissioner, who is definitely the most knowledgeable person on the Privacy Act and the protection of personal information?
If we can learn from another agency in any way, we do. We encourage that. That's why we encourage having a member from their office located with us, so we can be integrated and produce a much better product at the end.
One of the concerns I have with facial recognition technology relates to all the false positives, people who were erroneously identified and targeted, often based upon their race. If it's so bad about giving us so many false positives, shouldn't we also be concerned, then, that it's wrong and giving us false negatives? Are people who should have been identified slipping through the system, especially with respect to things like child exploitation and missing persons?
When we look at Clearview's technology and their unlawful scraping of images from the Internet, shouldn't we think that potentially this would lead to more harm than good when dealing with things like child exploitation?
When it comes to child exploitation, we realize that Clearview AI is not always correct. That's why we have a human intervention piece in there. It is absolutely critical to have a member actually view the results to see if they are true.
We did, in fact, test it ourselves, and we did find that false negatives were coming out of the program, so we're fully aware of that. If there is a better technology, that would be fine; however, you always need that human interaction and that human review process to take place. As we propose, in the future we will always have that, and it's absolutely critical.
Facial recognition technology is simply another tool in the tool box. It cannot be operated on its own, independently of any other processes. The human process is absolutely critical. The technology simply gets us to identify the victim in a quicker fashion. Traditional ways can then take over, but it will always be used with human interaction.
I must admit right away that I have a bias. I am not in favour of the use of facial recognition technology, but I was open-minded enough to listen to the evidence. It shocks me that my colleagues' questions, which are quite simple, are not being answered.
Mr. Sage, can you explain to me why you cannot answer these questions directly?
Yes, definitely, and that's through our national technology onboarding program, whereby every technology will be assessed from all those facets that were named previously, from a privacy, ethics, bias and legal perspective, and before they're used in an operation or investigation.
Mr. Sage, do you or do you not share the Privacy Commissioner's opinion that the RCMP's use of Clearview AI technology represented mass surveillance and a clear violation of the Personal Information Protection and Electronic Documents Act?
Well, it is a point of order, because I do believe that if you look at our rules of procedure in Bosc and Gagnon in chapter 20, there is an expectation put upon the witnesses who appear before a committee to answer all questions put by committee members, fully and truthfully. I do see that some of the answers we are receiving today have been very much limited. I would suggest that witnesses should exercise their responsibilities to this committee, and that those of us around the table have parliamentary privilege and do expect complete answers. Giving one-word answers and being dodgy is not fulfilling the work of the committee.
It's noted, Mr. Bezan. I, as the chair, don't want to be in the position of judging the responses that come from our witnesses. You are absolutely correct that witnesses do have an obligation, when they appear at our committee, to be truthful and to answer to the best of their abilities. I don't want to get into a debate about the quality of the answers as chair, but your point is noted.
I see that Ms. Khalid has a point on the same point. Go ahead.
I just want to remind members of the committee that you have ruled on this point of order that it's not really a point of order. When we have found the answers of witnesses to be lacking in the past, we've invited them to provide further responses in writing. I think that we should do the same in this instance and not really get bogged down in the minutiae of it right now, and rather continue with our questioning.
I think what we're seeing here, Mr. Chair, is a huge gap between the way in which the RCMP views its role in public safety and the way in which our committee, as an elected civil society group, views its role. I want to get specific, because the language does matter when we talk about things like mass surveillance, and that's why I can appreciate my colleague's frustration that the answers have not sufficed.
In the investigation of the RCMP's use of Clearview AI, the Office of the Privacy Commissioner found that the company's technology allowed law enforcement to match photographs to a database of three billion images scraped from the Internet—three billion.
Mr. Sage, would you not agree that three billion images would constitute, quite rightly, mass surveillance?
Mr. Chair, this is, again, what our public safety institution is doing indirectly when it cannot do it directly. Clearview AI's technology is used to identify people by matching photographs against their database of three billion images. That's just a fact.
In fact, according to the Office of the Privacy Commissioner, only 6% of the searches recorded by Clearview appear linked to NCECC victim identification, and approximately 85% are not accounted for at all by the RCMP.
Given this context, what was the purpose of the searches conducted by the RCMP's staff? Would you not agree that a 6% hit rate, with 85% of searches unaccounted for, would constitute mass surveillance and an unlawful and unwarranted gathering of information on the general public?
But isn't testing it a surreptitious gathering of information?
Let me ask one last question, Mr. Chair. With respect to the practice of street checks and racial profiling—the analog version of this, which the RCMP is still, at least to my knowledge, using actively across the country—at least that process would have some framework of accountability. Is it your testimony here today that in “testing” this, you can use that phrase to perhaps justify the gathering of this information without legal frameworks?
As a point of clarification to our witnesses, there's been some request for further information, so I would simply ask—and I hope with the agreement of other members of the committee—that the documents that have been asked for be provided by June 1. I think that would be a very reasonable request.
Director Sage, could you describe for me Project Arachnid? Do you have any involvement in that? I note on their website that it specifically states, “Project Arachnid does not use or rely upon facial recognition technology. It uses hashing technology — which is technology that assists in matching a particular image or video against a database of known child sexual abuse material.”
Mr. Sage, could you outline your familiarity with Project Arachnid and explain exactly what it is?
Yes, I am aware of it. It's a program that the Canadian Centre for Child Protection runs out of Winnipeg for the child exploitation centre there. It is not using facial recognition technology, and I confirmed that with the director of the program. They use a hash search; a hash is essentially the DNA of a photograph. The program crawls the Internet based on the DNA of that image. When you have an image, it creates a hash, and the matching is based on that. They do not use facial recognition technology at all.
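The hash-based matching described here can be illustrated with a minimal sketch. Project Arachnid uses perceptual hashing (such as PhotoDNA), which tolerates minor image alterations; the cryptographic SHA-256 hash used below, chosen for simplicity, matches only byte-identical files, so this is an analogy rather than the actual technique.

```python
import hashlib

# Minimal illustration of hash-based matching: compute a digest of an
# image's bytes and check it against a database of known digests.
# Real systems use perceptual hashes; SHA-256 here is a simplification.

known_hashes = {
    hashlib.sha256(b"known-image-bytes-1").hexdigest(),
    hashlib.sha256(b"known-image-bytes-2").hexdigest(),
}

def is_known(image_bytes):
    """True if the image's hash matches the database of known material."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(is_known(b"known-image-bytes-1"))  # True
print(is_known(b"some-other-image"))     # False
```

The key point from the testimony survives the simplification: matching a fingerprint of an image against a database of known fingerprints involves no face analysis at all.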
Certainly I think it's this committee's wish to find that right balance to make sure that law enforcement has the tools needed to deal with those who commit heinous crimes, while ensuring that the rights of Canadians are respected and that challenges with racial bias and things like FRT are called out.
I would ask this question, as well, to our witness from the Toronto Police Service. Are you aware of Project Arachnid, and has that been used with any Toronto Police Service investigations?
To follow up on the quality of the answers we've been getting, they seem to be intentionally evasive from some of the witnesses. I remind witnesses that at committee, you can be held in contempt of Parliament if you aren't fully co-operating or it is found that you haven't been fully co-operating. I'll take your counsel on this, Mr. Chair, that we'll allow the witnesses' testimony to stand.
Based on some of the conversations we've had in the past, potentially we need to have a more senior member of the RCMP here, such as Commissioner Lucki herself. That is something we should consider.
I also want to reiterate that the documents that have been requested by committee members should be provided by June 1 so that we can take them into consideration in doing our work on this study.
I want to go back to IntelCenter Check.
The witnesses were saying they haven't heard about it before, yet IntelCenter advertises this product as terrorist facial recognition software using open-source images of terrorists from the Internet, and the RCMP appears in its procurement documents. That suggests that not only is the RCMP using it, but possibly CSIS and possibly the Department of National Defence as well.
As has been said before, we can't do indirectly what we're prohibited from doing directly under charter rights in surveilling Canadians. To the RCMP: are you using any FRT other than Clearview, which is right now not available in Canada? Again, there is the issue around the IntelCenter FRT database.
I can comment on the IntelCenter software services. This software was acquired on an internal trial basis only. It was not tested or used in any national security investigation or other operational capacity.
In March 2018, it was identified that the IntelCenter service software was not approved for operational use, and its use by E Division was discontinued.
Thank you, Mr. Chair. Thank you to our witnesses for being here today.
My question is to the RCMP. We've had a considerable number of witnesses come in and talk about how many agencies are using FRTs. At a previous committee meeting, Mr. Boudreau said that the RCMP does not use any new FRTs.
Which old FRTs do you use, and do you share the data gathered with any provincial agencies?
We heard last week from the National Council of Canadian Muslims that police agencies, specifically in British Columbia, have been using FRTs at rallies, gatherings or protests. Is there any evidence of that?
If we're talking about transparency, how can the RCMP ensure greater transparency around its use of artificial intelligence technology, such as facial recognition software, going forward? How can we learn more about these FRTs and transparency within the RCMP?
You asked two questions. They're about the past and the future.
In the past, Paul Boudreau, my superior, queried all the detachments and RCMP units across the country, and he responded accordingly.
Moving to the future, all of that would go through the NTOP process. Any software that RCMP units across Canada ask to use needs to go through that process. If it's allowed, we use it. If they say no, we don't. That process includes a privacy assessment as well. We will only use what is approved through the NTOP process.
Good afternoon. It's the national technology onboarding program, whereby all technology leveraged for operational or investigation purposes will be assessed from a privacy, ethical, bias and legal perspective before being deployed in operations anywhere.
On the public awareness and transparency piece, it is built in as part of our communications strategy to release the categories of technology that the RCMP will be leveraging in the future.
They said that police agencies have been using FRTs at, let's say, common gatherings, a protest on an issue, or a community rally on something. They said that police have been surveilling those rallies and using FRTs.
The surveillance might be visible with body-worn cameras, static cameras, etc. There may be operational reasons for those to be deployed. Whether they're used for FRT would not be visible to the people who are at those events.
I don't understand whether they suspect that FRTs were used or they have some sort of evidence that they were used.
Yes, I think that what we're doing now, going through the NTOP process to do that full review, is a good thing. It's as important as a privacy assessment. That was not done back then, and I wish it had been done.
I think it's a good process, and we've learned from that. We can now implement a better process. I wish the NTOP process had been in place back then. It wasn't, but we've learned from that. We've moved forward to create a great process to ensure that the privacy and rights of Canadians are maintained.
It was 2019, and not 2018, when they were purchased. The decision was left to the officers on the ground level. They made it, and I believe, probably, inappropriately. There could have been a better way, but they were working in an environment that didn't have the NTOP process. That process, which is in place now, would prevent things from going awry as you described.
I am pleased to hear that, finally, you recognize the facts that the commissioner has mentioned. I'm going to take advantage of this moment of candour, Mr. Chair, to make a motion.
I note that from the beginning we have had few answers to our questions. So I would like to formally request by way of a motion that no later than June 1, documents be tabled by the RCMP. I would like to have any contracts and licensing agreements, unredacted, that have been entered into in the last five years with the company Clearview AI, as well as any ethical analysis that has been conducted by the RCMP prior to the use of such technology, or, if none exists, confirmation that no analysis has been conducted.
I want to be clear. When we talk about the use of IntelCenter, we had the collection of a database, or at least access to a database, of 700,000 images of what this company called “terrorists”. Who they were, how they were determined to be terrorists and the accuracy of the company's information were basically impossible to assess.
The RCMP didn't reveal why or how they used this system. We've heard earlier testimony, Mr. Chair, that it stopped, and quite rightly so.
My question through you to Mr. Sage is this: Does that information remain within the intelligence files of the RCMP or other security agencies? We know much of this information is shared through systems like CPIC.
I want to state for the record, before Mr. Sage is relieved of his very unfortunate duty of being before us here today, that the original person who was supposed to be here, his supervising director, was not here.
I want to be clear for the record. Mr. Sage, I'm going to put this question to you one last time: Are you familiar with Marie-Claude Arsenault? Is that the retired person who is your predecessor, yes or no?
I just want to state for the record, Mr. Sage, that it is my perspective that you afforded your predecessor more consideration in their right not to be named in a situation that is really public information in a public forum than the billions of people who have had their images compiled and analyzed by this AI technology.
I want to also acknowledge while we're here that the interim director was Dr. Roberta Sinclair. Is it correct that she was the acting director general?
Again, Mr. Sage, noting that you are new to this, I'm to understand that you weren't in this department prior to this. You were somewhere in Alberta. I respect that. I'm not going to double down on you.
The challenge we have in providing this type of new technology to our security frameworks, our intelligence agencies and our police is that there's very little oversight and willingness to share basic information and to have that duty of candour.
Mr. Chair, I'll leave that comment there because I don't want Mr. Sage, who was unfortunately put on the hot seat today, to leave here thinking that this was by any means personal. It was not. The person who he reported to who was here last time....
We've heard from my good colleague, Mr. Bezan, that we will be duly putting a motion. I'll just do it right now, Mr. Chair. I move to have the commissioner, Brenda Lucki, appear before this committee for the purpose of getting answers.
No, we would have 15 minutes, with five, five, two and a half, and then two and a half.
Bells are likely to go in about 20 minutes. Time allocation has been moved in the chamber, but it appears that we are just beginning the 30-minute debate period on that. We might not even have bells. I don't know. We'll see.
We're getting a little bit irregular here. I would be happy if there's unanimous consent to proceed that way. We can do another round of questions and then deal with both motions.
Actually, Mr. Chair, we agreed to finish the round of questions that was already in progress, so that the motion could be written and translated. It's a matter of minutes. We could debate the NDP motion and then debate our motion. I think the timing would be appropriate.
Well, we're of course free to have as many meetings as we want. Our original motion on meetings spoke of minimums, not maximums. I don't believe we are under a maximum, but we certainly have agreed to have three more meetings. That to me would be—
Ms. Iqra Khalid: [Inaudible—Editor]
The Chair: I would consider it one of the three in terms of minimums, yes.
I too have no objection to inviting Ms. Lucki to this meeting. I think it's important to do so.
I don't know if my colleague across the way would be...or if maybe there could be a general understanding. I know there's no such thing as a “friendly”. I would just like the people who are really responsible for this to be invited to the committee. If that's even a retired officer, I wouldn't mind having that person back.
I'm glad that Mr. Green mentioned that this is not personal to Mr. Sage at all—not at all—but I just want some more answers. Like Mr. Green, I did a quick Internet search. Within two minutes, I found out the name of Madam Arsenault.
I just want to make sure we have the right people before us who can answer these questions. Otherwise, I'm afraid we're going to get the runaround again.
If no one else wishes to speak to Mr. Green's motion, I'll call the question.
All those in favour of inviting Commissioner Lucki to committee?
(Motion agreed to [See Minutes of Proceedings])
The Chair: Are we ready to discuss Monsieur Garon's motion? The motion has been distributed. Everybody should have it in writing now.
Monsieur Garon, what you have distributed in writing is a clearer iteration of what you had dropped on the table. It is not precisely the same. I might ask you to withdraw what you had orally moved and allow the motion as distributed to be the text of your motion.
I would have preferred that we proceed in reverse order, Mr. Chair, that is, that we pass the motion and withdraw what I asked for verbally afterwards, but I agree to withdraw my previous requests and that we debate my motion.
I think it's clear among everyone in this committee that there is more information that we require. It's definitely needed, and we are well within our rights as a committee to request the presentation of documents, contracts and so on.
My concern is in the request for them to be “unredacted”. While I can appreciate that we really are pushing here to get the transparency we need, it's a precedent that we have to consider in terms of other committees. If it's tabled here, then it will have impacts elsewhere.
There are times when contracts and information do need to be redacted. I've had my own experience on the foreign affairs committee, where we had initial documents presented to us that were redacted. Once we reviewed them, we asked for additional clarifications.
We always have to be mindful of security concerns and of privacy concerns of corporations and so on, and also the precedent. If we always ask for unredacted documents, then witnesses will not necessarily co-operate.
I think that if there is a precedent to be set, it is the deference that we show to our security apparatuses, including CSIS, our military and police. As parliamentarians, we have privileges. There is lots of jurisprudence on which we have done lock-ups and had access to unredacted documents for that purpose. I don't think it would prejudice any other committees in the work they do.
What we've seen here, in my opinion, time and time again, is a clear unwillingness to adhere to what I have called the “duty of candour”. Having accountability on this technology would, I imagine, be a part—a significant part, hopefully—of the legislative recommendations that would come out of this study.
What we heard today was an unwillingness to be frank and concise in answering very basic questions, so I would ask that they be—I would require that they be—unredacted. There shouldn't be anything overly sensitive, unless, of course, it's contrary to the testimony that has been provided to this committee through witnesses, in which case it would open up a whole other subset of challenges that we would face.
However, for the purposes of this, Mr. Chair, I would be willing, if it suits the government side, to have a lock-up requirement within this committee so that we would have direct access to the documents. They would not be made public, but we would retain our long-standing traditions in the Westminster system for parliamentary privilege to send for documents, people and any other evidence as required by committee.
I absolutely agree with the sentiments of this committee. I think it is important for us to have a clear, open, transparent process on how policing is conducted within our country, but I also take note of a number of things that Mr. Sage has said and done—and a number of other witnesses—with respect to public safety and the safety of witnesses and victims.
I am in agreement with the motion presented by Monsieur Garon. I think that we should make some concessions here, such that if matters of public or individual safety or matters of national security exist within the documents we are requesting, they should indeed be redacted.
The second point I'll make on the wording of the motion before us is that we're asking for any “ethics analysis”, which I find is pretty unclear language. I would prefer it if we could request any “charter analysis” that was done, or “constitutional analysis”. I think that makes it a little more clear.
I'd like to hear members' views on the two points I've just outlined.
I agree with my colleague Ms. Khalid. We are looking for a degree of transparency here and to understand what has transpired in terms of the contracts. We all want to be able to move forward with a clear set of recommendations.
This technology isn't going away, and I'm sure that the TPS, the RCMP and many other policing services in the country understand that FRT is out in the world, and we really need some clarity on how to wrangle it in, including on the contracts that are signed with our security services in order to know what safeguards and guardrails need to be in place in such contracting in the future.
That said, I would caution our colleagues here in terms of understanding the scope of privacy laws and security concerns when we do ask for these documents of what's at play. We should always proceed with caution, while at the same time getting the documents that we require to have a fulsome understanding of what is at hand.
Mr. Chair, the very existence of this motion stems from the fact that the RCMP witness, Mr. Sage, explicitly refused to be transparent, explicitly refused to answer our questions, and explicitly refused to give us any information. He even refused to admit that the contracts we are trying to obtain today exist. So, in the circumstances and in the context of this public contract, I think it is entirely appropriate to ask for the documents as they are. As parliamentarians, we will accept our responsibilities, including any obligation of confidentiality.
I'd like to come back to the question of co-operation. I understand that sometimes requesting such unredacted documents could be seen as potentially discouraging potential co-operation from witnesses. However, in this case we are dealing with a public official who refuses to co-operate with members of Parliament. I think it is important that the committee have access to the documents as they are, i.e., unredacted.
I'd hoped that the member would address the two points I had outlined with respect to the unredacted piece and the ethics analysis piece.
Perhaps I will just move an amendment to the main motion to remove the word “unredacted” from the motion itself and then replace the words “ethics analysis” with “a charter analysis”. Those are the two amendments I would seek to the main motion.
Just to explain, it's always the committee's prerogative, if the documents requested and received from the RCMP are not satisfactory, to go back and request them again or see how we can conduct ourselves after the fact.
At this point I really think we should go forward with these two amendments, Mr. Chair.
That's the trick. The only way to do that would be to move them one at a time. Ms. Khalid is moving them together in one amendment. That will be the question for the committee, unless she would like to withdraw that amendment—
I want to seek clarification from members, then, on how we will deal with these unredacted documents, and perhaps we can come to an agreement as to how we will protect the potential sensitivity of these documents. I'd like to hear from my colleagues on that.
Mr. Chair, to Ms. Khalid's comment, I believe that any documentation, anything submitted to the committee, is always owned by the committee and handled by the committee. It is not necessarily turned public unless it's attached to reports we release down the road. I believe that this would be held in confidence and only available to and under the control of the committee members themselves.
Similar to my colleague, I think that respecting confidentiality, at least for an unredacted review, would be valuable for security and privacy law considerations.
I'd also like to encourage us to ask.... I'm curious as to whether or not the Privacy Commissioner, in their own analysis of Clearview AI, had an opportunity to review the documents themselves. It would be safe to assume they may have.
Perhaps that could be part of the consideration as we do this review, because the Privacy Commissioner should have had, in their own review of this situation and this file, a look under the hood, as they say, at the contracts.
Chair, very respectfully to all members on this committee, I would really like to set the terms clearly before we move to a vote on anything. It's been noted in the past, when things have not been clear, that we've seen actions happen to the detriment of members and to the public as well.
Can we please set out clear terms for how we are going to be reviewing these documents and how these documents will be received before we go to a vote?
Again, I'll respond from the chair just to point out that it's up to members of the committee to propose anything. If there's an amendment to be proposed, someone must propose it. Otherwise I'm going to go to the vote on the main motion.
Look, I'm comfortable moving that we receive the documents in camera. We also, as a committee, have the right, once we review them, to disclose them publicly if we feel that's the will of the committee. However, for the initial onset, I'm certainly willing to move a motion that we receive the documents in camera. I certainly look forward to the government side, having considered this amendment, supporting the main motion.
Mr. Green, just on one point there, the documents will be received by email or as a physical copy. They won't be received in a meeting. Your amendment perhaps would be that they be reviewed or debated or discussed in camera.
My apologies. Thank you for that clarification, Mr. Chair. Yes, it's for the consideration of this committee to decide whether we want to move forward in an in camera capacity, given the sensitivities, or in a public forum, given the public interest.
Thanks, Chair. Just for further clarification, who exactly would receive these documents? Would it be just members of the committee? Would staff have access to them? Would House personnel have access to them?
Thank you, Mr. Chair. I'd like to just give a suggestion based on my previous experience dealing with the arms export documents over at the Department of Foreign Affairs. The classified documents were provided with an access code for committee members only. There is a particular way of doing it online. Staff did not have access to them. It was only the members of the committee who could take them under review.
I'm certain that our clerk could take reasonable steps to ensure the security.
We will vote on the amendment.
(Amendment agreed to [See Minutes of Proceedings])
The Chair: All those in favour of the main motion?
(Motion as amended agreed to [See Minutes of Proceedings])
The Chair: Bells are not ringing yet. We still have our witnesses. We do not have time to complete a full round. I think that perhaps at this point, unless there are objections, I'll release our witnesses and conclude the meeting.
Are there any objections by anybody who's dying to get an extra question in? No.
That being the case, my thanks to our witnesses, Mr. Sage, Mr. Boileau, Mr. Stairs and Mr. Séguin. Thank you very much for appearing today.