HUMA Committee Meeting


STANDING COMMITTEE ON HUMAN RESOURCES DEVELOPMENT AND THE STATUS OF PERSONS WITH DISABILITIES

COMITÉ PERMANENT DU DÉVELOPPEMENT DES RESSOURCES HUMAINES ET DE LA CONDITION DES PERSONNES HANDICAPÉES

EVIDENCE

[Recorded by Electronic Apparatus]

Thursday, May 4, 2000

• 1109

[English]

The Chair (Mr. Peter Adams (Peterborough, Lib.)): Colleagues, I call the meeting to order. We're meeting today pursuant to Standing Order 108(2), a study of HRDC grants and contributions.

I will introduce our witnesses in a moment. I suggest that we start immediately with the witnesses and then proceed to discuss the future work of the committee and the notices of motion, which are listed on our agenda, around 12:30 p.m. So my objective would be to conclude at about 12:30 p.m. and then proceed to other business.

• 1110

Colleagues, you know that today the question in our interim report that we are trying to address is, what do private sector and community funders expect in the way of administration and accountability? Are there private sector or community lessons that can be applied to the public sector? That is the particular part of our report we're trying to deal with.

We're extremely grateful to have with us today at the moment, anyway—we hope that another may arrive—Dr. Tom Brzustowski. Tom is the president of the Natural Sciences and Engineering Research Council of Canada, the acronym for which is NSERC. Tom, we greatly appreciate your being here. Also, representing Private Foundations Canada, we have the president and CEO, Julie White. We welcome both of you here today. It's very kind of you to come.

I think at the very least through the media you have some sense of the exercise we're going through here. Normally the way we proceed is that the witnesses make their presentations first, and then we ask questions of them collectively. If that's okay with you, we will proceed. Have you any thoughts as to who might go first?

Ms. Julie White (President and CEO, Private Foundations Canada): I'd be happy to lead off.

The Chair: So it's Julie White of Private Foundations Canada. Julie, we do welcome you here. If you would proceed, we'd be glad to hear what you have to say.

Ms. Julie White: Thank you very much. I'm delighted to be here.

I'm representing Private Foundations Canada, which is a new organization that was incorporated in 1999 to represent the interests and strengthen the capacity of the private foundation sector. Our 23 sustaining members represent about half of the assets of private foundations, or about $2 billion, and they make grants in the area of $100 million a year. I'll be drawing on the experience of our members and also on my own experience working in the public foundation sector as CEO of the Trillium Foundation prior to assuming this role.

What I'm going to talk about in my eight minutes is the notion of what constitutes due diligence in the foundation sector. I'm going to start by talking about the due diligence related to fiscal responsibility, and then I'm going to talk about the other aspect of that, which is the due diligence related to the program impact itself.

Fiscal responsibility related to grant-making is really a matter of ensuring that the money is spent well. There are a number of ways of looking at that. Certainly, we would set up systems and supports that would start right from the very beginning of the application process, asking the right questions of the organizations themselves around budgets, audited statements, and their track record. We looked for references. At the Trillium Foundation we checked three references, looking at how they had managed money before and what their relationship had been with other funders. We set up internal systems. We had a process of external reviewers as well as internal reviewers, and we designed a series of decision tools that enabled people who may not have had a financial background to assess the financial stability of the organizations we supported. We also provided training internally and to the other reviewers.

Occasionally we would require or ask grantees to have a sponsor that could manage the funding for them. We would look at the amount that was requested, we would assess whether that was the correct amount, and then we would set up systems that would enable us to ensure that the money was spent as intended. Some of those systems would also include regular reports, and there were sanctions around reports. I know that some of my members withhold a certain amount of money until the final report is in. If it's a long-term grant, the interim payments are conditional upon receiving reports as well. So what we've tried to do is to set up a kind of early-warning system so that if there's any potential mismanagement in terms of either the amount of money being spent or how it's being spent, or if adjustments need to be made, we're brought into that at a fairly early stage.

The other aspect is the program impact. This is both harder and more important. This is really ensuring that the funds are spent wisely. There are two reasons for that: one is that money changes things. It changes things for organizations, it changes things for communities, it changes expectations, and it changes systems. I think those of us who are giving money away at the foundation level are all too aware of the degree to which we can change organizations, and we need to be fully aware of what that kind of impact is.

• 1115

I can give you a couple of examples. It's not just the money itself; it's how the money is distributed. For years the Trillium Foundation required matching funds. They were very large grants, so the matching funds were quite significant. This was in the early 1980s. By the mid-1990s the charitable sector in Ontario had gotten so heavily into Nevada tickets in order to raise that kind of money that I personally think the Trillium Foundation is almost responsible for the proliferation of charitable gaming. Yet it came from the very well-intentioned purpose of supplementing income.

So when we develop granting programs, we really need to take a look at the broader impact, not just the grant itself but the program itself and all of the conditions and requirements we impose.

Another example is the fact that the fundraising consulting industry has expanded enormously because of the increasing requirement for sophisticated proposals and difficult analysis. Again, we need to take a look at that broader impact.

The second part of that is the opportunity cost. For every dollar we give to one place, we're not giving it someplace else. It's important for all of us, particularly those of us who have a fixed amount of money to give away, to look both at what we can do and what we're not able to do as a result of what we do.

The next part of that is around implementation. Implementing the program is really the hard part. It starts with aligning all the systems, from the articulation and understanding of the program itself, making sure that it's grounded and well understood and that it is in fact related to what you are trying to achieve. Then there's the process of application. Are we asking the right questions, and are we gathering the right kinds of information? We look at the kinds of requirements. At the Trillium Foundation, every time we set up a new system we would ask ourselves, why are we doing this? For every new thing we asked for or required, we took one off, because it's so easy to just keep adding things that in fact may not lead at all to a good decision or a good outcome.

Finally, around that category, what constitutes success? It surprises me how often we forget to say, what does success look like? How will we know if we've got there? What does failure look like, and how will we know if we've got there?

In addition to the things we require externally, there are certain kinds of things that need to be set up internally to make sure these things work. An organization that's going to do good, responsible, accountable grant-making needs to be focused on learning, learning about the impact both at the macro level and from the organizations it funds. It needs to have the ability to respond to and incorporate that learning on a regular basis. It just doesn't help for somebody to be learning over here if it doesn't have any institutional impact.

There needs to be a tolerance for ambiguity and a respect for the uniqueness of individual efforts. It's a rare program that spreads out the same in every community, and it's important to recognize that opportunities and assets vary from community to community.

The next part of that is around requiring flexibility in the system, requiring skilled and knowledgeable staff and external reviewers, spreading the net widely in terms of getting broad input, and having a system that is judgment based. In the end, good grant-making isn't a science; it's an art. It requires good judgment, and that judgment has to be based on qualified, trained, and knowledgeable people who are involved in that decision.

My last point is that a good relationship with grantees is essential. If a grant is going badly or there's a problem in the organization, and there's a good relationship between the funder and the grantee, that grantee will talk to you about it, and you can engage in mutual problem-solving. You can catch things at a very early stage. You can save initiatives.

That also applies when a grant is going badly and it gets some public attention. That happens in all sectors; it's not limited to government by any means. It happens in the foundation sector as well. Those of us working in the foundation sector take pride in the fact that we do risky grant-making, that we're out there trying new things. By the very definition of a risky grant, sometimes they go wrong or people don't like them. Certainly I've been subject to grants that have had various levels of public support. The better your relationship with the grantee and the more you understand what's going on, the less likely you are to be sideswiped in the media or in public.

• 1120

Those are my official comments.

The Chair: Thank you very much indeed. We appreciate that.

Tom Brzustowski.

Dr. Thomas A. Brzustowski (President, Natural Sciences and Engineering Research Council of Canada): Thank you, Mr. Chair and members of the committee. I realize now I'm sitting in a very squeaky chair, so if you see me move suddenly, it will be to change chairs.

Thank you for the opportunity to speak to you today. I hope my remarks will be useful to you.

The way the chair outlined the question the committee is dealing with is very interesting. In fact it's quite appropriate for NSERC to be here, because the private sector and the community are involved in what we do, but in a way that may surprise you and may not be a totally direct answer to the question.

I'm well aware that the circumstances under which NSERC operates are quite unique to the support of university research. I must admit also that I know so little about the details of the operations of HRDC that I'm not in a position to suggest which of our practices might be lessons that are transferable. I judge that the committee members will of course make that assessment themselves.

[Translation]

I must stress that I am well aware that the circumstances under which NSERC operates are unique to the support of university research. I must admit, also, that I know so little about the operations of HRDC that I am not in a position to suggest which of our practices may be of the greatest interest to you. That said, I am very pleased to discuss with you how NSERC manages its funds.

[English]

First please remember that NSERC funding goes to universities. It goes to established institutions, where it is held in trust for the professors who are the successful applicants for grants.

The universities have a capacity for responsible management of public funds, and they are a key element in our system. The money we provide can be spent only according to some very clear, well-publicized, published rules, available in print, available on our website, and supported through information visits to the universities. In addition, we check university practices from time to time to ensure that the rules, the conditions of our grants, are in fact being followed.

About half of our funds are used to support people, and they're mainly young people. They're university students at both the senior undergraduate and the post-graduate level, they're post-doctoral fellows, and they're other research personnel. The rest is used for the actual current expenses of doing research: the operation and maintenance of equipment, consumable supplies, computing, what have you.

NSERC funds cannot be used—and I underline this, cannot be used—to contribute to the salaries of the professors who get the grants. They cannot be used by the universities for overhead expenses, such as heat, power, and light in the labs and the various administrative costs of doing research, including—and this is perhaps a bit ironic—the costs of the administration our rules impose on them. That's how it is.

As a result, we believe NSERC pays for only about 60% of the costs of the research done with our support, even when the principal investigator's, the professor's, salary is not included. The rest is found by the universities from other sources, and I'm sure they could speak volumes about how easy or how difficult that is.

The selection process is very important to us. We want to fund only excellent research, so the decision process must be as independent and objective as possible, and to achieve that, we use the peer review system.

• 1125

Peer review both minimizes the risk of making bad investments and maximizes the quality of what we're likely to get in return. All funding recommendations, every single one, whether for or against funding, in all of our programs, come from these volunteer specialist peer review committees in the applicants' fields. They have studied the detailed applications, and in addition they have received third-party expert referee reports on the applicants' records, their previous contributions, and the current proposals. We consider only those applications that come from applicants who qualify under the eligibility rules.

The committees recommend which proposals to fund based on the highest standards of academic excellence, the importance of the proposed work to advancing knowledge, and a number of other criteria, which differ from program to program, but they're all published; they're all specific. Staff oversee the process to ensure committees adhere to our policies and apply the selection criteria consistently.

Mr. Chairman, the text was distributed, but I think I should depart from it here to give members a feeling for the scale of the operation.

The amount distributed in grants and scholarships is about $500 million in the current budget year. The number of university professors, the investigators who are supported, is close to 8,000. The number of students, research staff, and post-doctoral fellows supported is close to 13,000. About 3,000 decisions on applications for four-year grants are made each year, and these are made by 25 discipline committees, all volunteers, of maybe 10 to 15 people each, including Canadian academics but also scientists from industry, some academics from outside the country, and the occasional government scientist. These are the peer review committees. They serve as volunteers, and they are absolutely a key element in the way we make our decisions.

The committee recommendations become funding decisions when the president of NSERC approves them in writing, and one can think of only two reasons for withholding approval. One would be if staff noted a serious irregularity in the process, and the other would be if—and this is more recent—an application approved for funding did not have a satisfactory outcome from an environmental assessment review in those areas where there might be an impact on the environment. Beyond that, the recommendations would be accepted. Let me just say there may be delays to make sure ethical requirements involving human subjects and requirements for the treatment of animals in research are complied with, but these generally have been only delays.

The Auditor General noted in a recent report on innovation, and I quote paragraph 19.77 from last fall's report:

    Scientific merit well evaluated. The merit of the research proposals and the quality of the researchers were well evaluated. NSERC uses a system of project review committees consisting of researchers and industry representatives. In addition, projects are submitted to external referees who provide a written evaluation of each proposal. The use of external referees greatly expands the expertise available within the panel and thus provides a vital source of information in the decision-making process.

I should add that on many occasions those external referees are from outside Canada, and there are many thousands of them. The participation rate, the response rate, to our requests for reviews is very high; between 60% and 70% of those people who get five papers to review and an application actually respond. This is typical of the way that community works around the world. This is not a favour done uniquely to Canada.

Now let me move on very briefly to monitoring of NSERC-funded research. We consider monitoring an essential part of the administration of our programs, and we have a number of processes in place to do that, including site visits, information sessions with university researchers and administrators whenever any rules or policies are changed, provision of advice regarding eligibility of expenses, annual verification of eligibility, and so on. The philosophy behind this, because we're a small organization, in one place, is we try hard to help the grantees and their universities to manage their grants responsibly. We feel the approach has been successful.

• 1130

Again, let me depart from the text, Mr. Chairman. I would say we operate at a level of 99%-plus of success in these measures. This is my own impression. I'm aware each year of something in the order of two dozen appeals against decisions—some of the appeals against positive decisions, when people feel they haven't had enough money—and perhaps an equal number of suggestions that there are weaknesses in the system and the university. But I'd say 99%-plus is a successful operation, for the reason that we work with the researchers and their universities to help them manage these things responsibly.

We do annual monitoring, and it differs from program to program. For example, in our programs of partnership with industry, we make sure the research is on track, the industrial participant is living up to commitments, and the researcher is progressing properly. If we find this is not the case, then ultimately a project can be terminated, even though a decision to fund it had once been made. So we do monitor progress.

Again, I should underline that when I talk about the involvement of industry, I mean they are involved as our partners in sharing the risk of university research—that is, putting their own money on the table. They do not receive a cent of NSERC money. Industry is our partner in funding university research.

In our research grants program, our biggest program, we deal with perhaps between 2,600 and 3,000 applications for four-year grants each year. About one-third of these applicants do not get funded as a result of the review in these discipline selection committees, and always for the same reasons: in the opinion of the committee, they haven't been productive enough, they're not up to doing the work they propose, or the work might not be interesting in advancing knowledge; it could be too routine. The reasons are always the same.

That means if one-third don't succeed, two-thirds do. That might seem like a very high success rate, but one has to remember this is a system that's been in place for many decades, and many individuals have been through two or three competitions before any particular competition, so the ones not likely to succeed aren't there. So there are new applicants coming in, and they get very careful attention.

We unfortunately are able to give the ones we fund only about half of the money they ask for—a little bit less than that, actually. We simply can't afford to give more. But it has turned out over the years that this particular practice still allows some progress to be made, and it builds a broad base of competence in research. And there are other programs to which people might apply. Had we more money, we would of course fund these people better and allow them to make greater progress towards the goals that are approved, but this seems a reasonable approach to making use of scarce public resources.

I've mentioned the visits to the universities and post-award administration. We also have evaluation and performance reporting. We have a formal audit and evaluation process in place to meet our accountability requirements to Treasury Board and to Parliament, of course. Our departmental performance report describes this. Everything is posted on our website.

Let me summarize by saying we think we have achieved a balance between cost-effectiveness in administering our programs and being accountable in the use of public funds. Typically, in the more complicated university-industry partnership projects, we spend between 1% and 2% of the project value on monitoring costs. Within the research grants program it's about 1%. That means we spend a great deal of money on projects worth maybe $2 million or $3 million, such as the networks of centres of excellence, and very little on things worth $10,000 or $15,000. But that's a conscious decision based on risk assessment and on an attempt to minimize the overhead costs and maximize the amount of money actually voted for research support that goes to support research.

• 1135

We operate with 4% of our total budget going into administration. We're very stretched by that; we think that's too little. We're very thin, and the result often is that there are delays in providing a response to our applicants. We would like to do better than that. We will try to get a little more money for administration, particularly because such things as the new requirements of the Environmental Assessment Act and the new policy on the ethics of research involving human subjects add complexity.

We're feeling very pressed. We recognize that 4% has become a bit of a standard, but I would venture the opinion that the gold standard should be a little more than 4%.

Mr. Chairman, I've tried to make four points here: our money goes to the universities; the rules on how the money can be spent are widely known, widely understood, and we provide a great deal of help to those who administer the money to interpret the rules properly; we count on objective peer review and expert assessment in making our decisions; and we do monitor the spending with spot checks based on risk assessment.

Let me stop there and thank the committee for their attention. I'll be happy to offer whatever additional comments I can. Thank you.

The Chair: Tommy and Julie, my thanks to you for two very clear presentations. I really appreciate your addressing the interests of the committee.

Colleagues, bear in mind that we have until 12:30. My list begins with Paul Crête, then John Godfrey, Rey Pagtakhan, Raymonde Folco, Bryon Wilfert, and Larry McCormick.

[Translation]

Paul Crête.

Mr. Paul Crête (Kamouraska—Rivière-du-Loup—Témiscouata—Les Basques, BQ): I would like to thank both of you for your presentations. My question is for the president of the Natural Sciences and Engineering Research Council of Canada.

In your presentation, talking about the selection process, you said that the Auditor General noted in a recent report on innovation the rigour of the process. Could you elaborate on that somewhat, because we certainly did not hear the same type of comments from the Auditor General about Human Resources Development Canada. The information you are going to give us might provide us with some inspiration.

I would like you to do it in connection with the following sentence, which can be found in the summary at the end of your submission:

    I believe that at NSERC we have achieved a balance between cost effectiveness in administering our programs and accountability.

I would like you to elaborate on the impact of funding from the following perspective. Within the framework of programs where grants are provided for job creation, we must first think of developing something more original than what already exists. We have to know whether jobs were actually created, how long they were maintained, and what is going to happen afterwards. Only a minority of enterprises renew their participation in the program, while in your case the researchers are likely to be compelled to get involved again. They have an intrinsic motivation due to the fact that you will remain a potential funding source. So I would like you to share with us your experience at that level, keeping in mind that this is a different context.

Dr. Thomas Brzustowski: Thank you for your question. If I may, I am going to answer part of it in English.

University research is an ongoing process, the outcomes of which are attributable to highly qualified people in advanced science and technology. Spin-offs of discoveries and inventions can be found everywhere, in all sectors of the economy. There are hundreds of examples of new enterprises that have been created as a result of university research.

• 1140

At NSERC, we have a sample of 111 enterprises that we can trace back to their first research grant, 20 or 30 years ago. It's an ongoing process.

[English]

I'll start with the whole situation of the impacts. It is very difficult to trace and at the same time very easy to trace. Let me start with the very easy answer.

One of the main products of university research, where there is research done but there's also the education of people at an advanced level, is the flow into society of people who have advanced degrees in all the areas of knowledge that are being advanced around the world. For NSERC, over the last 22 years, that number is around 50,000 individuals who have obtained Masters or PhD degrees in natural sciences and engineering. The vast majority of them have gone on to work in the Canadian economy. Some have started businesses. We have some of that.

The easy part of the answer is to say that if it were not for university research and science and engineering of the kind supported by NSERC, we would not have any companies working and competing at the leading edge of their technologies in the field today. I make that as an assertion. I would welcome anybody proving me wrong on that. I stand by that. That's the easy part.

The difficult part is to say how many jobs were created. How many jobs were saved across the whole economy by improvements in productivity arising from innovations that have their roots in research?

As I mentioned, we have a sample—and we believe it is a very conservative sample—of 111 companies that in total employ about 7,000 well-educated people, with sales, from those who are prepared to report sales, of over $1.2 billion a year that we can trace back to specific investments in university research. These may be companies, for example, arising out of research in mathematics that sell scheduling software for transit systems around the world or scheduling software for the use of crews for airlines all around the world.

[Translation]

In saying that, I have in mind Entreprises Giro in Montreal and Ad Opt Technologies.

[English]

These are difficult things to trace. We can trace them in a sample. We can't trace them easily through the whole economy.

I'll stop my answer there, Mr. Chair.

The Chair: Thank you very much.

John Godfrey.

Mr. John Godfrey (Don Valley West, Lib.): I'd like to pick up on a couple of themes that Julie White mentioned, which are expressions that may be anathema to bureaucrats, notions like a tolerance for ambiguity and a pride in risky grant-making. I don't know whether that has any place in the culture of the Government of Canada, but it does strike me as being the key to creativity and innovation. What we're trying to understand in terms of what you might have to help us with is what matters most in these processes. I guess it might be the case with Dr. Brzustowski as well.

One of the deficiencies that have turned up is incomplete files. Are there ever incomplete files in either of your lines of work? If you make the kind of value judgment that incomplete files matter less at the end of the day than a tolerance for ambiguity, creative pride, and risk-taking, that's one issue. How do you balance the need for proper controls and due diligence and honouring the money that has been put up to the client, so to speak, with the need to get on with it? How does that reconcile itself? That's sort of a subsidiary issue.

• 1145

I suppose the final way in which one thinks of this is with risk management and risk-taking. Presumably what you actually have in common in very different fields is innovation, which can only occur through risk-taking, and the process has to be managed and it has to tolerate failure.

So if you can focus both on...well, it's really the equation between risk and bureaucratic nicety, if I may put it that way, and how you can help us with that.

The Chair: Could the chair interject before he forgets the first question about the incomplete files? With respect to NSERC, I understand some professors are quite forgetful, and I wonder if that ever affected the files.

A voice: [Inaudible—Editor]

A voice: That was a confession, I think.

Ms. Julie White: Paperwork is a challenge in any organization in any sector. Certainly when you're operating as a lean organization and you're trying to focus your attention on what matters, it's often the paperwork that slides. Having said that, I think it's really important to have good files, and not just for accountability, although that's really important; it is also the way we learn. The most frustrating thing is when an application comes in with a really good idea that sort of resembles something that might have happened in another grant you made, and you pull out that file and the evaluation is missing, or it doesn't really track it. So I think you need to try to balance keeping the files and all the documentation up to date with the intent of the grant itself.

Again, the challenge is to make sure you're asking the right questions. You can have a complete file of everything you've asked for, but it still might not answer your question; it still might not provide you with the protection you need in terms of ensuring that you've given to the right organization, that they're spending it properly, and so forth. So the first thing is to make sure your systems and requirements are actually valid and that you've asked for the minimum amount you need. We use a concept called minimum specs: what are the minimum specs on this that you absolutely have to have?

Having said that, on the issue of risk-taking and ambiguity, again, that's where the judgment comes in. Anyone who's ever reviewed a request for funding will know the syndrome of going through reams and reams of stuff and just seeing the glimmer, just seeing something in there that's an interesting idea—it may be a small part of what's actually coming in—and being able to follow that, to work with the application, to flesh that out, to turn that into something else, to recognize its positioning in that particular community, that particular organization, particular opportunities or challenges or needs.... It may not fit completely into the program guidelines, and I guess that's where it starts to get a little bit risky. You might have a program that says we're looking at this kind of box, and it fits a little bit outside the box, but the intent, the overall purpose of it, may in fact be the way...and there may be a huge opportunity to learn.

I think balancing the risk around that is in the analysis. It's in making sure you have the support of the community, making sure you've done external reviews, making sure the decision was based on something substantial. It may still not work, but there's a big difference between funding something, trying a new idea that doesn't work—and maybe you have learning you can use someplace else—and giving a grant to an organization that uses it for an entirely different purpose.

So I think the risk management is about the quality of the decision and the process for the decision, and then what you do with the realization that it's not working and the kind of transparency that you need to apply there.

Mr. John Godfrey: Tom, I guess it's a question of which kinds of organizations have the greater tolerance for ambiguity and risk-taking and whether we're all in different cultures with different tolerances. What's your take on this?

Dr. Thomas Brzustowski: I think you'll find my answers a little different. It comes back to the way I described our process. Let me challenge the assertion that somehow we don't live in ambiguity in government; at times I think we live knee-deep in ambiguity.

• 1150

But the issue of the balance between completeness of files and risk-taking is actually resolved in a very neat way in our organization. NSERC staff are very concerned that files should be complete. When they make spot checks at the universities, they want to make sure that for money paid out there are invoices, that there was proper authority for any payment, that accounts were closed properly, and that the money went where it was directed. That's their job.

The people who take the risk in the quality of what is being done are on the expert review panels. They take the risk at the time when the applications are being assessed. They judge what was done four years later or at the end of the project. They evaluate on very firm terms—not just by themselves, but with the help of all these people around the world—what was achieved with the money.

Perhaps that's a fortunate split of responsibilities. It doesn't put any individual in tension between too much risk-taking and not enough accountability.

The one element that I think we have to be concerned about is the quality of the service we give to the research community, and that includes not demanding too much of them in their role as peer reviewers and giving them the time to do research. I think our vulnerability is there.

Ms. Julie White: I just want to make a supplementary comment.

Your last clarification and comment struck a chord. I don't think there's any question that foundations are in a position to take more risks than government. There's no question about that. I think that's the importance of the foundation sector, that it is independent money, that it can move quickly, that it can fund idiosyncratically, and that it can make some mistakes that government can learn from. So I'm not applying that same standard to government.

The Chair: Thank you.

Rey Pagtakhan.

Mr. Rey D. Pagtakhan (Winnipeg North—St. Paul, Lib.): Thank you, Mr. Chair. If I may comment on ambiguity, I would like to make a distinction between intended ambiguity and unintended ambiguity. Intended ambiguity would have a very negative impact, because you do not know the basis for the intent. So the presence of ambiguity certainly could be an alarm signal, to my way of thinking. I don't think we should tolerate ambiguity on the basis of a blanket principle.

Am I correct in understanding you on that, Ms. White?

Ms. Julie White: When I'm talking about ambiguous situations, I'm talking about where we're going into new territory where there are some built-in contradictions or where there haven't been a lot of precedents set. I think we should always try to take a look at clarifying what it is we're trying to do, and the information we get back should be as clear as is available. But that's not always the case.

Mr. Rey Pagtakhan: Perhaps it's what we call unintended ambiguity. In other words, we should look at it and ask, “Why is this? Does it make sense?”, and so on, and not just accept any ambiguous set of data before us on the basis that, yes, I am tolerant of that. Otherwise, we are risking quality control.

Dr. Brzustowski, I'm really intrigued and fascinated by your 25 committees—with an average of 10 or so members each, roughly 250 people—delivering about half a billion dollars a year at a success rate of 99.9% in terms of quality and product in the peer review. Obviously the membership of these committees is composed of highly motivated people.

My question to you, for the benefit of the committee as we deliberate on what we would like to recommend for the evaluation of reports, is, how did this process start? What motivated this group of experts to be volunteers? What is keeping them in terms of the attitude of the mind, the culture of quality, and the search for ethics of excellence?

Dr. Thomas Brzustowski: This is a splendid question, and I welcome the opportunity to talk about the peer review system.

• 1155

You do have the numbers right. I mean, we believe that we receive somewhere between 60 and perhaps 80 full-time equivalents of volunteer time a year from the research community in this work. If you were to visit NSERC during the month of February, when the research grants program is holding its competition, you would see all the conference rooms at 350 Albert Street jammed with people with piles of paper, their briefcases, and, alongside them, computers in which they're entering their comments.

Now, how did it start? It was possible for us to do it because it has always been part of the culture of the international scientific community—the notion that before something is published, before something is funded, it is put to the test of criticism from one's peers. Is it far out? Is it likely to succeed? Is it a valid result? This has been part of the culture for many, many years. Essentially, we plugged into that culture; we didn't create it.

The process that NSERC operates under is a process that has been evolving for about 40 years, the last 22 of those with NSERC as an independent organization and the rest of the time as the office of grants and scholarships at the National Research Council. They were split in 1978. So there has been this evolution.

Another fact that is very, very important in the research community is that researchers are intellectual risk takers and innovators within the context of their own discipline, but they want the context in which they do their research to be stable and predictable. So you've got that.

What do these volunteers get out of it, “it” being a week in Ottawa in February? This is not one of the great attractions of the world. Some of them bring their skates, but not all of them do. And it has meant several weeks of reading the materials in preparation for that. What they get out of it is a number of things. They get a sense that it's a better chance to learn what is going on in their field than they would have by simply reading the literature. It gives them an advanced glimpse of where the field is heading. There's a lot of value in learning.

They get to know some of the leading experts in the country in these fields. They enjoy a sense of responsibility to their colleagues. They sort of take turns. You know, it's my turn now to do this for three years or two years, then it'll be somebody else's turn. So it's a measure of responsibility.

Let me add one more thing to it, which is a real challenge to us. In each of these groups we try to achieve a balance between small universities and big, between sub-areas of the field, between the language groups in this country, between the regions of Canada, between young and old, between men and women. To try to do this with 10 or 15 people is very difficult, but what the committee sees is that there's a constant attempt to do this and everybody appreciates it. So this is one of the challenges to our staff in seeking nominations for these committees, to get those balances, all of those things in balance with the common factor of expertise in the subject.

So you're putting your finger on something that is unique because of the culture of the research community, not just in Canada but around the world, not just recently but for decades and decades.

Mr. Rey Pagtakhan: What you have said is a message to be passed on to the minister for the Treasury Board, who is in charge of the government's volunteer program, which is a priority these days.

My last question, through you, Mr. Chair, is in terms of monitoring. Obviously you alluded to annual reports and to site visits, not only to publications. My question is, how many of the site visits are spot visits and how many are scheduled visits that the research groups or universities know in advance are to take place? And what proportion of these research projects have resulted in publications?

Dr. Thomas Brzustowski: Let me answer the question in this way. The site visits that I've been talking about are visits in which the spot checks are administrative, focused on the universities' administration and fiscal management systems. So they in fact do not involve the researchers. Their progress is monitored in different ways, which I'll describe in a second.

• 1200

Remember, our philosophy is not to catch them doing the wrong things but to help them do things right continuously. So we give a year or two's notice in advance, saying these nine universities will be visited over the next year, or these 18 over the next two years, that kind of indication. In terms of the actual progress of the projects, it splits into two. For the research grants that support basic research for a period of four or perhaps even five years, it's the peer review process that really assesses progress, and it assesses it at the end of that time by publications, by the number of graduate students finished, by the results achieved, by the discoveries made.

We give the freedom to researchers. After all, research is learning that which isn't yet known, so you can't specify it too much. We say, okay, you've decided to set out in this direction, but you've made a discovery here that has identified this path as more important than that path. They should have the freedom, but they're accountable to their peers at the end of that time. If they haven't been productive, if they haven't done excellent work, if they've taken a risk in which they've failed, then their grant suffers the next time and their career suffers, because the universities depend on NSERC grants in judging their people.

However, when we have project research, that is, solving industrial problems that can't be solved with existing knowledge—if they could be, that would be consulting—then we have the definition of the problem, we have a project plan, we have a management structure, we have milestones, we have a predicted cashflow, we have partners involved. Those things are monitored on a much more continuous basis, and if the research is not advancing, then projects have been closed down.

This is in the area of university-industry partnerships, and our staff have to be much more deal makers than in the other programs, than it is perhaps customary for public servants to be. This is one of the areas where our 4.1%, or whatever it is, expenditure on monitoring and running programs really stretches us. We have to put the time into those projects.

But they are partners. There's industry at the table. They've been involved in designing the project. They know what the milestones are. They know when they have to make their payments. They know when they expect the results. So it's a much easier thing to monitor.

[Translation]

The Chair: Raymonde Folco.

Ms. Raymonde Folco (Laval West, Lib.): Ms. White and Dr. Brzustowski, I consider that, in some way, you are privileged witnesses about matters relating to Human Resources Development Canada and to what this committee is working on. I am going to explain why.

Given the fact that you represent two organizations which, although they are completely independent from the government, are more or less working in the same direction as that department, I think that you could enlighten us considerably.

The question that I want to ask you is about control or, as Dr. Brzustowski would say, monitoring, more specifically internal monitoring versus external monitoring. How do you see the relative importance of internal monitoring within your own organizations, since I guess neither of you is prepared to answer my question in connection with the department? If you are indeed not prepared to do so, you could tell us about internal monitoring within your organizations. How do you carry out that monitoring? When do you do it? Do you consider it important to have external controls, and, if so, when and why should they be exercised? If you could link your answers to the way things work in the Department of Human Resources Development in that regard, I would certainly appreciate it.

Dr. Thomas Brzustowski: I find your question difficult, but I am still going to try to answer it.

• 1205

First, NSERC must report to the Treasury Board in the same way as any government department.

[English]

We're governed by exactly the same accountability rules, exactly the same requirements for reporting. We report to Parliament through the Minister of Industry. So in terms of the internal controls within government on the way we use money, as far as I know we are being treated in the same way as is any department of government.

However, I would describe it this way: that we respond also to an external community, the researchers. You have described them as internal. I would say they are external, because they maintain a constant pressure on us to be as effective as we can be in informing the community of any policy changes, any rules, of getting the maximum amount of money through parliamentary appropriations to meet their needs, of carrying their message and, above all, of maintaining the standards of excellence in all the grants we give out.

Viewed from the point of government, you may see them as internal to the NSERC function. Viewed from the point of NSERC, they are an external pressure in addition to the external pressure of the Treasury Board and all the departments of government.

Ms. Raymonde Folco: Excuse me for interrupting. I just want to clarify. Perhaps there was an error in translation. I did not refer to researchers as either internal or external controls. But that's fine, thank you very much.

The Chair: Julie White.

Ms. Julie White: Thank you.

I'd like to talk about this from the point of view of recognizing that for most foundations we have two systems that are separate but complementary. One is monitoring and one is evaluation, and they overlap, but they actually require separate things. The monitoring is really ensuring that what has been promised is being carried out.

That starts off at the beginning, where we establish what it is that we require. We put that in a contract form that's quite specific; it's usually in the form of a letter, not a big contract. Then we have a form and standards of reporting and things that we do if things start to slide or if we get different kinds of information that we require.

Complementary to that is the evaluation, which is to look at what the actual impact is. It may be that they're carrying out what they said they would, spending the money the way they said they would, but it's not working anyway. Or maybe it is, and the reverse is true. You could almost look at it as a grid.

How we do evaluations can vary depending upon the size of the grant. For a small grant, it might be a little thing. It might be a simple way of taking a look at the impact. For a more complicated one, there are two components to it. One is evaluating what it is the organization wants to do and its impact, and often we make sure that's built into the design of the program right at the beginning. It's not something you do after the fact. It's something you build in so that you're gathering the correct information and analysing it as you go forward.

Sometimes with a grant, though, what we want to learn from it isn't necessarily what the organization wants to learn from it, because we're tracking something else or it's part of a bigger program or there is a piece that we're particularly interested in. In those cases, foundations will often add to the funding to cover that, or they will bring in an external evaluator, or they will do cluster evaluations. For example, at Trillium we did a number of cluster evaluations where we took a look at all the organizations that were involved in addressing food security. We took a look at all the organizations that were involved in economic development. So what we were trying to do there was assess not just the impact of the individual grant but the body of work as well.

My final point on this is that in addition to evaluating the grant, we're always evaluating the program, whether or not our assumptions were correct, whether or not the program was correct. That includes the impact of the body of grants as a whole as well as the individual grants, but also some of the fundamental assumptions.

We would do that in a number of ways. Sometimes it would be—we did this all the time, and many other foundations do this as well—to survey the applicants and the grantees to talk about what they learned, what could have been different in the program, what worked and didn't work from the point of view of what they got from us. The other way is to go to the community and talk to the community about the impact of the program or the impact of the particular project we funded.

• 1210

There are other funders. It's rare today to find any one organization as the sole funder. Usually there are a number of funders, and it's really important for funders to work together, share information, and critique each other's programs.

So there are a number of levels here, from the individual grant to the overall program, from the monitoring to the evaluation. You could look at it as a grid with a number of strategies in there that should be complementing and supporting each other.

The Chair: Raymonde Folco.

[Translation]

Ms. Raymonde Folco: Thank you very much. I found it extremely interesting to hear what you said, but I can see that neither of you agreed to draw a link between what you do within your own organizations and what is done in the Department of Human Resources Development. Either you don't want to make such a link, or you just cannot. Could you make it now, as a second part of your answer?

[English]

The Chair: I think there should be very short responses, witnesses, because we have to move on. We're due to finish in twenty minutes.

Dr. Thomas Brzustowski: I can't do it. I don't know enough about how HRDC works.

The Chair: Okay. Julie.

Ms. Julie White: I'll try it. I think mistakes happen in any sector. Being on the outside, with access only to the media, it's impossible to know what the systemic issues are.

The Chair: Thank you very much.

[Translation]

Raymonde, I thank you very much.

[English]

Colleagues, I have Bryon Wilfert, Larry McCormick, Maurice Vellacott, and then I'm going to finish it if I can. Bryon Wilfert.

Mr. Bryon Wilfert (Oak Ridges, Lib.): Thank you, Mr. Chairman.

We on our side of the floor would like to thank you for providing us with some insights as to how your organizations deal with the issue of grant funding and processes.

One of the issues we are tackling of course is accountability and transparency, and I was fascinated that this is one area, particularly when you're dealing with small amounts, as in some cases you were talking about in your presentation, Mr. Brzustowski.... You talked about one to two percent in terms of monitoring, and the fact that even if it's only a few thousand dollars, obviously you're not going to put the resources there, or $100,000, depending on.... You have a budget, you said, of about $500 million allocated.

I would be curious to know, in terms of accountability and transparency, when you have a problem where you have a fund, you're going to fund something, you have an agreement, and you learn that agreement in fact is not being carried out in the way you expected, how do you respond? What mechanisms are there, and are those mechanisms in fact announced up front to the grantee?

Dr. Thomas Brzustowski: In the area of university-industry partnerships—that's the area where the agreement is clearest, most unambiguous—if the research isn't proceeding, if anybody isn't living up to the terms that were approved when the project was approved, first this is pointed out, and then an attempt is made to get it back on stream. But ultimately—and it doesn't take too long—progress payments are withheld. And if a project shows no sign of being redeemable in those terms, it's simply terminated. It just happens on that basis.

Now, on the research grants, take the field of mathematics. Research grants there are very small—maybe $12,000 or $15,000 a year. But every penny of that grant is explained beforehand in the grant selection committee, and how it was spent afterwards has to be explained in terms of the results achieved after three or four years. Along the way, that grant is subject to the same spot checks for proper procedures, and the university has a much larger one.

Mr. Bryon Wilfert: Ms. White, in terms of third parties, with HRDC we often deal with the issue of risk when you're dealing with the provinces, which often play a significant role, or other third parties in the sense that it's not going directly from us to them. How have you managed to do it? Did you deal with third parties? If so, how do you deal with that in terms of accountability?

• 1215

In terms of the issue we're looking at, there have been some suggestions that rather than have a project officer do the evaluation, you might have more of a community committee. One of you mentioned the role of committees, and I don't know how that would work without feeding it to your friends. But that could happen. Let's assume you had four or five people from the community and the involvement of the officer. That might, in my view, be helpful also in terms of more accountability, because if it's on the shoulders of one person.... The chances are that the committee may also drop in on the project, or certainly do more follow-up, which may not happen under the present circumstances. So maybe you could comment on that as well.

Ms. Julie White: I'd like to comment on your last part first.

The notion of community committee is one that is used very widely in the foundation sector. Keep in mind that foundations are in fact charities, so they do have a charitable board. So you have an independent body. It's not staff making the decision necessarily. They may have some discretionary authority, but it does go to the board for final review.

The Laidlaw Foundation, for example, has a number of different programs—such as the children at risk program—where they have experts in the field. The staff do the work-up, write it up, and submit it to the committee, and the committee makes a recommendation to the board. This is a very common process. It's kind of a check and balance, because it ensures that the due diligence is done in terms of the paperwork and the questions and the references and all that kind of thing that is very difficult for volunteers to do. At the same time, it draws upon the expertise and the knowledge, whether it's local expertise or issue expertise. It really depends upon whether the grant is a community-based grant or an issue-based grant.

I think those are excellent mechanisms, and I've also seen them work in government departments where people have been brought in to do reviews. The final decision is a political one, but again with very strong external input, and not just from staff.

On the issue of third parties, I'm not entirely sure I understand what you mean by that. The parallel in the foundation sector is that, up until the last few months actually, foundations that wanted to support an emerging organization, or an organization that didn't yet have its charitable status, would fund a third party. They would fund, let's say, the Y, which would then provide the funding to a shelter for battered women.

The Chair: Excuse me. The sense was whether it was a third party that also had responsibility under the grant.

Mr. Bryon Wilfert: So you may have a partner. For example, we may go in with a partner who is also a funder, and for whatever reason they may not follow through. Then the question is what happens as the other partner—“we” being the government.... How do you make up for that portion that's no longer being covered? Or certain decisions were made based on the role of that other partner, they're no longer there, and guess who gets the fallout? Do you have that kind of situation, where you have maybe more than one partner involved?

Ms. Julie White: Funding partner.

Mr. Bryon Wilfert: Yes, funding partner in terms of the grantee.

Ms. Julie White: That happens in the foundation sector. Often the decision is made to fund matching grants, to give an example of that. The matching fund might fall apart, or the government might change and certain money that was expected is gone. Does the foundation continue to fund it? Again, that's part of the evaluation. It's done on a case-by-case basis, depending on whether the project can go ahead without it.

One of the advantages of foundations—and I have seen this happen—is that where a project is considered quite important to the foundation, the other funder may pull out for whatever reason and the extra funding is provided. So that will happen, but again it really depends on whether or not it's viable.

Mr. Bryon Wilfert: Your comments have been instructive. It's unfortunate we didn't have more members here today to hear them, because I think it's useful. Thank you.

The Chair: Larry McCormick, then Maurice Vellacott, and then we'll conclude.

• 1220

Mr. Larry McCormick (Hastings—Frontenac—Lennox and Addington, Lib.): Thank you, Mr. Chair, and thanks to the witnesses for being here.

The doctor mentioned that they had only about 4% going into administration, and that this might not be the most positive part of their programs. It sure sounds a lot like HRDC to me, where you have to make decisions. When we, the government, make decisions to make cutbacks, for whatever reasons—and usually they're expected to be apparent—it can come back to haunt us. So I'm glad to hear you mention things like that.

Charities, Ms. White, have independent people who sit on an independent board, but here we're talking about politics. I believe every member of Parliament here is serving for the right reason. I know that 98% of all members are working fools, in street talk. It's politics.

In the case of charities, when you're involved with these foundations, do those people get lobbied, either directly or indirectly, to any great degree as to how they should look at the applications for funding, from wherever they come?

Ms. Julie White: Lobbied by government or just lobbied?

Mr. Larry McCormick: Well, no, lobbied by whomever. I just think everyone gets lobbied. I know some doctors or professors would say it never happens in their field of expertise. There are other people who differ.

Ms. Julie White: Sure. It varies so much across the sector. You can have a small family foundation, or someone living next door to someone who has applied for a grant, and they know you, so they take you out to lunch. Oh, sure, that kind of thing goes on all the time.

Sometimes it can be helpful, in the sense that there might be information to be provided. It also points to the support that's happening in the community. So I think you need to take that kind of thing into consideration.

The advantage of foundations is that we don't have to worry about being re-elected or being equitable or even being fair. I mean, we can do exactly what we think is important to the particular program or initiative, and that's just an entirely different opportunity.

Mr. Larry McCormick: Thank you, Ms. White. I think I've surprised the odd person in my riding when I've told them of the importance of lobbyists as educators, and how much I've learned from some of these people. We can still make our own decisions.

Doctor, I have just come from another committee where we were talking about GMOs and food and so on. I think part of that will be education. I'm wondering if you could take the opportunity to tell us how we, as a government, could do an even better job of selling the importance of research to the taxpayers. I just thought I'd give you that opportunity.

Dr. Thomas Brzustowski: That's a wonderful question. Thank you very much for posing it.

Let me make the assertion that in the area of health research, the Canadian public is absolutely convinced of the causal link between more health research and better health. The evidence I cite for that is that collectively we give $200 million or $300 million a year to health research charities. So that's clear.

There is, in my opinion, no accepted causal link between more research in science and engineering and greater prosperity for the country. We're a country that has relied on its natural resources, and some of that attitude still continues.

I think if government could in fact help people understand, by providing illustrations, by supporting good examples, and by demonstrating, as is done in many other countries around the world, that there is that link, particularly in this new global, knowledge-based economy, between more knowledge and more highly qualified people and greater prosperity, then that would be very helpful.

Just for the record, I'd like to touch on the issue of lobbying. I do want to say that at NSERC there is no political involvement in the decision process. The acid test for this is that when somebody occasionally tries to appeal, after exhausting our internal appeals, a negative funding decision by the political route, I'm pleased to say that ministers have steadfastly shut the door on those. So that doesn't exist.

I'm lobbied all the time when I visit universities. It doesn't do anybody any good, because I don't make the decisions. It's the discipline committees.

• 1225

The Chair: Maurice Vellacott, and then we'll conclude.

Mr. Maurice Vellacott (Wanuskewin, Canadian Alliance): Thank you, Mr. Chair. It's a relief to be back in this committee. I just wanted to remark on this. I came from the health committee, and I do want to commend you, Mr. Chair, for your civility, respectfulness, and politeness toward all the committee members here.

The Chair: Is this lobbying?

Voices: Oh, oh!

Mr. Maurice Vellacott: No, it's not. Truly, it's not. I just realized the contrast. We differ, we disagree on things, but in health committee, where there was a motion to depose Lynn Myers as chair, I was taken aback by some of the snide and sarcastic remarks and so on made by the chair. I wanted to commend you for.... You sometimes absorb the fury of either our side or your own members there.

The Chair: Maurice, praise from the opposition is often the kiss of death—

Voices: Oh, oh!

The Chair: —so please continue. You have the regular time.

Mr. Maurice Vellacott: Obviously. Right.

First, my apologies for being late. That's my explanation.

To both our presenters today, on the matter of peer review, I'm just wondering how we can apply that to the whole issue of HRDC grants and contributions. I'm wondering if you would speculate and surmise in terms of....

I notice here you talk in terms of there being lots of volunteer hours and involvement, and that being the reason. I'm a little different from my colleague here, but I understand that's why you were able to do it at 4% of the amount of grants distributed. I don't question it. I think it's done responsibly and accountably and so on. But because you have so many volunteers involved, they have an interest in it, they want to support it, and so on.

I guess I'm wondering if there is any potential for doing that in the area of grants and contributions—for example, in the Transitional Jobs Fund, which is about giving employment to people and so on.

I think we have compassionate Canadians who would want to see some of these good things go forward, but could we bring businesses from that sector—and I guess this is for both of you, Julie and Tom—into a peer review type of system? Maybe there are problems with that, but could we do that, to get some of the benefit of the sectors you work in, for this area of grants and contributions? I think it might be worth trying. I just offer it.

You might suggest that there would be problems in that the business community would say, well, it's a competitor. But if it's truly not going to be picking a winner and disadvantaging them in any great way, maybe they would be cooperative and say, hey, that's a bogus program, so don't get into funding that business. This is better. We want some percentage of government moneys to be used for creating employment, so let's share with this group.

Do you surmise that there could be any transfer of the principles here to the dispensing of tax moneys through peer review?

Go ahead.

Ms. Julie White: I come at this a little bit differently from Tom, in that the use of volunteers and volunteer review, in my experience, is about the quality of decision-making. It really doesn't save you any money. It's expensive to work with volunteers. I think there's a bit of an illusion that volunteers are free labour, but there are travel costs, the provision of information, training. A whole lot of things go along with working with volunteers.

Having said that, the qualitative benefit, I think, outweighs that cost. But it is something you do need to factor in. It's not like getting free staff.

Mr. Maurice Vellacott: I understand that, but obviously there are some advantages. I mean, they're giving hours and hours of expertise after having served in that area. But, yes, you pay honorariums, or possibly travel expenses, accommodations, and so on.

Ms. Julie White: Yes. The management of volunteers is more time-consuming and expensive than you might think.

Mr. Maurice Vellacott: Yes, I understand that.

Ms. Julie White: Having said that, I have seen government programs use advisory boards and committees very well, and I think it does add to the quality. I think it's a perfectly transferable process. Most people I know who have participated in that process do so with a great deal of enthusiasm, wanting to provide influence and expertise to the decision process at the government level, while understanding that the final decision is made somewhere else. The more that people are involved in reviewing and inputting....

I would also say it's not just about reviewing particular grants but also about reviewing the program and being part of the program design. So I would encourage that.

Mr. Maurice Vellacott: Because at some point those peer review people may say that this program needs to go, that it's gone the way of the dodo bird and it's not relevant any more. It would be within their purview to do that.

Ms. Julie White: Exactly.

Mr. Maurice Vellacott: For sure.

Dr. Thomas Brzustowski: Let me just add that there is one thing that would be more difficult in that case—not insurmountable, but more difficult than what we've faced. That is, we plugged into an existing culture of peer review and participation by volunteers, and in the case you described, one might have to create it.

• 1230

Mr. Maurice Vellacott: You'd have to create it.

Dr. Thomas Brzustowski: Yes.

Mr. Maurice Vellacott: Right.

The Chair: I have to interrupt this, Maurice, even though I did appreciate the comments at the beginning.

I would like to thank Dr. Tom Brzustowski, the president of NSERC. Tom, we greatly appreciated this. You can see the members were very interested.

As well, to Julie White, president and CEO of Private Foundations Canada, we greatly appreciated your taking the time. We realize this is something your organizations get no direct benefit from, but it's very important, I think, for those who deliver and receive the HRDC grants.

To repeat my earlier remarks, I particularly appreciated the way you addressed, at the outset, the specific concerns of our HRDC inquiry. Thank you both very much.

Colleagues, I'll briefly suspend the meeting. We'll be discussing future business in a moment.

We are adjourned.