
STANDING COMMITTEE ON INDUSTRY, SCIENCE AND TECHNOLOGY

COMITÉ PERMANENT DE L'INDUSTRIE, DES SCIENCES ET DE LA TECHNOLOGIE

EVIDENCE

[Recorded by Electronic Apparatus]

Thursday, November 22, 2001

• 0924

[English]

The Vice-Chair (Mr. Walt Lastewka (St. Catharines, Lib.)): Pursuant to Standing Order 108(2), we are continuing our study on the three federal granting agencies, peer review funding, and the Canada Research Chairs program. I'd like at this time to welcome our two witnesses, Dr. Jacquelyn Thayer Scott and Dr. Alan Winter.

I apologize to them for the delay of the meeting. I should also say that we've had some late evenings, or early mornings, this week, so I understand some of the reasons. But we'll carry on the meeting, and I would ask that we begin with the testimony. I should remind the committee that we must stick to the 11 o'clock deadline, so we're going to go as crisply as we can.

• 0925

Who will begin? Dr. Winter? We'll go in the order shown on the agenda.

Dr. Alan Winter (President and Chief Executive Officer, New Media Innovation Centre, Council of Science and Technology Advisors): Thank you, Mr. Chairman. I appreciate the opportunity of being invited to speak this morning.

The topic we're going to discuss is peer review. My name is Alan Winter, as the chair has said, and I'm the president and CEO of the New Media Innovation Centre in Vancouver, or NEWMIC, as it's more commonly known. I'm here today, however, on behalf of the Council of Science and Technology Advisors, the CSTA, of which I've been a part since its inception.

The CSTA was created in 1998 to provide the federal government, specifically the cabinet committee on economic union, with external expert advice on internal S and T issues in the federal government requiring strategic attention. I think your own report deals with this in identifying something like $6.3 billion in annual expenditures. That's in the A Canadian Innovation Agenda for the Twenty-First Century report. The CSTA was formed really as a direct result of the 1996 S and T strategy, “Science and Technology for a New Century”, which called for greater government reliance on external advice. The CSTA consists primarily of representatives from the science advisory bodies that provide external advice to science-based departments and agencies, and it draws these advisers into a single body to improve federal S and T management by examining issues that are common to departments and highlighting synergies and joint actions.

The CSTA provides advice to government, but we're not mandated to be a watchdog per se. So I'll focus my remarks specifically on CSTA's findings with respect to peer review and ensuring excellence in federally performed S and T. I think there has been a handout of slides, so I'll just mention when I'm changing.

The next slide is on excellence. In each of its reports CSTA has emphasized the critical importance of sound science to support the roles of government and inform policy decisions. In that context, the CSTA has stressed the importance of S and T excellence and mechanisms for ensuring that excellence.

The CSTA's first report, which was called SAGE, Science Advice for Government Effectiveness, focused on science advice in government decision-making and called for government science advisory processes that include due diligence procedures for assuring quality and reliability, including scientific peer review.

The second report, which was called BEST, Building Excellence in Science and Technology, identified the roles and responsibilities of government in performing S and T and its capability to deliver on those roles. It also identified excellence as critical to public and stakeholder confidence in the credibility of government S and T.

As an aside on that report, in case it's seen as just asking for more money for S and T, that's not the case. In fact, that report dealt very much with making sure that within the federal government there is, first, alignment with the various departments before S and T is performed internally, and second, collaboration with external bodies doing R and D, in other words, the linkages that are required. Only after those two criteria are satisfied should S and T be performed in the federal government. But once that decision is made, the S and T has to be of excellent quality, because the government uses it as the basis for advice. According to that particular report, S and T must be of the highest quality.

Public investment in science demands scrutiny to ensure that public dollars are not wasted on frivolous or flawed procedures. The expert opinions the government deploys help ensure that the science is conducted objectively and used objectively in the development of policy. In this context, the report specifically identifies the importance of regular and appropriate expert review.

• 0930

On the next slide, Science and Technology Excellence in the Public Service, or STEPS, was really the third study the CSTA did. This report identifies the characteristics of federally performed S and T and provides a framework to stimulate scientific excellence. So this was really quite a logical progression for the CSTA. The first report asked how science is used within government for providing advice; the second report really said what S and T should be performed in government; and the third said that if you're going to do S and T, it's got to be excellent, and here are some of the ways of making sure that happens.

So this framework reflects the unique characteristics of excellence that distinguish government S and T from S and T performed in other sectors. It's built on a foundation of conditions essential for excellence, and the four pillars were identified as quality, relevance, transparency and openness, and ethics, to define the elements of federal science and technology excellence. Those pillars become extremely important. So the federal S and T must be of high quality, it must be appropriate to the nature of the S and T conducted, it must be relevant to the roles and priorities of government, it must be conducted with the degree of openness called for in a democratic nation, and it must be pursued in accordance with the ethics of society.

Now there is an argument about peer review. Peer review really applies primarily to the first item, the quality of the science and technology. Peer review can be applied to the others, such as relevance, but it doesn't always require the same people. Peer review has traditionally been understood as the way of ensuring that the excellence of the science is reviewed.

So the STEPS report, in summary, identifies mechanisms to measure excellence in the conduct and management of federal S and T. The report draws attention to the use of the traditional and more recently developed mechanisms available to measure and foster excellence, but calls upon government to implement expert review as the centrepiece of review.

The next slide deals with peer review itself. Due diligence procedures for assuring quality and reliability must be built in to ensure excellence in science and technology. In this context, the most widely accepted measure of scientific quality is peer review. There are many methods of peer review, but they're all based on the premise that the quality of scientific work is best judged by experts—and I would add, independent experts—in the field. The integrity of the review process requires the selection of qualified reviewers, whether internal or external to the organization, who possess the appropriate credentials in respect of the expertise they are expected to contribute to the review and who are independent of the science and technology being assessed. In other words, they must avoid such things, obviously, as conflict of interest, there must be a balance of opinions, and in many cases, in fact increasingly, there has to be an international component of peer review.

Peer review is a key aspect of science that supports rigorous methodology, skepticism, transparency, professional independence, and accountability—and I would add to that a lack of ego. I've been a founding director of two Networks of Centres of Excellence and a founding director of CANARIE, the federal organization Dr. Scott has been a member of as well. I think peer review is not necessarily an ideal way to go at any time, because it's a human endeavour, but of all the systems we have, it is one that, when used properly, can be extremely valuable as one input to a decision process.

On the next slide there are a number of considerations concerning peer review. These are things to consider, I think, as we look at the way we decide on science in Canada.

First, innovation is a touchstone of progress in science, but that means science must be tolerant. Tolerating unconventional ideas is important. It's critical that review remain open to new ideas, while weeding out science that isn't credible.

• 0935

Second, I think we must strive for stable scientific processes of checks and balances with a system that encourages and is receptive to scientific breakthroughs. Peer review, which is really the test of convincing one's peers that one's view should prevail, has served us well over the years, and it has often prevented us from going down unproductive paths. But if conservatism leads to bias or lack of openness to new ideas, the core value of innovation is lost.

Third, the diversity of approaches to peer review of science and technology raises fundamental questions as to who should be considered a peer and exactly what constitutes peer review, particularly in the context of government science and technology. While a traditional approach to scientific peer review has employed persons qualified in the appropriate discipline, government science is often multidisciplinary in nature, and thus requires a broader range of expertise to be included. This is an obvious case within government. Peer review is really only one input. If you can imagine in a department a deputy minister making a decision on investment in a science and technology area, obviously, that deputy minister would want input as to the excellence of the science, but that's not the only reason we would invest in Canada in a particular scientific endeavour. There are many other strategic reasons for carrying out that particular investment in science and technology.

On the next slide we deal a little bit with peer fatigue. The potential for peer fatigue is quite critical. This is particularly true in a country like Canada, which, compared to the States, has a limited number of qualified experts who can serve on review panels. Countries such as Sweden, Australia, and New Zealand make substantial use of international experts to ensure the independence of their review processes and to counteract peer fatigue. Certainly, I'm aware of international experts participating in Canadian reviews, and I think that's an increasing trend.

To ensure excellence in science, the decision-makers and the public all require validation that data analysis is first rate, but peer review is only one method, a method focused on assessing quality primarily. Determining relevance is a different issue. Often there are many other factors that have to be included in that review. My background has been primarily in industry, and when we look at industry relevance, we want to make sure industry people conduct that particular review, for example.

Expert review relies on collective judgment. Quantitative measures are receiving increasing recognition. In fact, I'm part of a group that has developed a system called ProGrid, which deals to some extent with the consistent application of review to such intangible factors. For example, in a peer review you might ask about the quality of a team. To some extent that is quantitative, but it is also qualitative. In those cases you want to be consistent across different teams. There is increasing use of tools for this type of activity.
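A minimal sketch of the kind of structured scoring grid Dr. Winter alludes to: each proposal is rated on the same weighted criteria and scale, so that intangible factors such as team quality are assessed consistently across teams. This is not the actual ProGrid tool; the criteria, weights, and four-point scale below are illustrative assumptions only.

```python
# Illustrative sketch only -- not the actual ProGrid system.
# Assumed criteria, weights, and a 1-4 rating scale are used to show how a
# fixed grid keeps the review of intangible factors consistent across teams.

CRITERIA = {            # criterion -> weight (weights sum to 1.0)
    "team quality": 0.4,
    "scientific merit": 0.4,
    "relevance": 0.2,
}
SCALE = range(1, 5)     # each reviewer rates each criterion from 1 to 4

def score_proposal(ratings):
    """Average each criterion over reviewers, then combine with the weights."""
    total = 0.0
    for criterion, weight in CRITERIA.items():
        values = ratings[criterion]
        if not all(v in SCALE for v in values):
            raise ValueError("rating is off the agreed scale")
        total += weight * sum(values) / len(values)
    return round(total, 2)

# Two proposals, each rated by three reviewers on the same grid.
proposal_a = {"team quality": [4, 3, 4], "scientific merit": [3, 3, 4], "relevance": [2, 3, 3]}
proposal_b = {"team quality": [2, 3, 3], "scientific merit": [4, 4, 3], "relevance": [4, 4, 4]}
print(score_proposal(proposal_a), score_proposal(proposal_b))
```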

In conclusion, first, I think it is commonly acknowledged that science is generally performed in the public interest and in the interest of the greater good. Scientists are proud of their contribution to the body of knowledge and to the advancement of science.

Second, peer review has its benefits. It helps to ensure that science and technology is of a high quality. At the same time, it's important that quality is defined in terms that are appropriate to the nature of the science and technology conducted.

The third point is that peer review is only one input into a strategic decision to invest in science and technology.

Fourth, conducted properly, peer review focuses the right peers on review and ensures that S and T is excellent. However, no system involving human judgment can be without some controversy from time to time, so it is crucial to ensure that peer review is conducted in a consistent and transparent way.

There is no one perfect method. To paraphrase Winston Churchill, in a democracy and an imperfect world, peer review is still one of the best ways we've come up with to evaluate the quality of science and technology.

Thank you.

The Vice-Chair (Mr. Walt Lastewka): Thank you very much, Dr. Winter.

Dr. Scott.

Dr. Jacquelyn Thayer Scott (President and Vice-Chancellor, University College of Cape Breton): Thank you, Mr. Chairman.

• 0940

I'm here today on behalf of the Prime Minister's Advisory Council on Science and Technology, of which I've been a member since its inception. The ACST, like Dr. Winter's group, was formed in 1996 as an outgrowth of the federal science and technology strategy, and we were designed to provide the federal government with some external expert advice from business and higher education on external issues of policy importance to Canada's development of a knowledge-based economy. Periodically, we provide requested policy advice to the cabinet committee for economic union, and in one case also to the cabinet committee for social union as well. Our reports generally are also made public.

Our most recent report may be of some interest to you. It's called Creating a Sustainable University Research Environment in Canada, and it was published on our website, which is http://acst-ccst.gc.ca.

Other related reports we have submitted and published that are germane to the topic this morning include Public Investments in University Research: Reaping the Benefits, in which the ACST examined the question of commercialization of university research and the ways in which Canada could obtain greater benefits from its investment in university research.

Another report, Stepping Up Skills and Opportunities in the Knowledge Economy, which I had the pleasure and honour to chair, included a number of recommendations for strengthening partnerships among universities and colleges and other sectors of the economy to ensure that research leads to enterprise opportunities for Canadians, including those who live, learn, and work in non-metropolitan regions and subregions of Canada.

The third report of some interest is Reaching Out: Canada, International Science and Technology, and the Knowledge-based Economy, in which the ACST recommends ways of increasing participation by Canadian researchers and firms in major international science and technology endeavours and ways to increase planning and coordination of government policies and activities in this area.

While most of my remarks will be directly related to the ACST advice that bears on the issues at hand, from time to time and by way of example I may also wear my hat as president of a smaller university college in an economically depressed subregion. I'll do my best to distinguish between those two roles in my commentary.

Let me turn for a moment to the recommendations we made in our latest report on creating a sustainable university research environment, because these are very much germane to the discussion of peer review. These include that the federal government should begin funding the indirect costs of university research in proportion to the amount of funding for the direct costs of research that it provides to universities through the Canadian Institutes of Health Research, the Natural Sciences and Engineering Research Council of Canada, and the Social Sciences and Humanities Research Council. Hospitals and research institutes that receive granting council funding for the direct cost of research should also receive funding for indirect costs.

Second, it is most important to begin this funding soon and to increase it to appropriate levels over time. We recommend that funding for indirect costs reach 45% of the level of funding for direct costs provided by the granting councils. This is based on a significant review using three different models that attempted to determine what the appropriate level should be, and it's very consistent with what competitor countries are providing in that regard. Funding for indirect costs, we suggest, could begin at the level of 40% and increase over a three-year period.

We further recommended that these funds should be delivered directly to the universities in a manner similar to the Canada Research Chairs arrangement, rather than through the granting councils or any transfer arrangement between the federal and provincial governments.

Further, this approach, based on a three-year rolling average of funds received from the granting councils, the development of a set of institutional research objectives, a plan for achieving them, and a reporting regime indicating progress in reaching these goals by each university, provides a good model for an indirect cost program. The Canada Research Chairs program might very well furnish the delivery mechanism.

We also recommend that universities should receive funds in a way that accounts for economies of scale. We recommend a smooth, progressive range of rates, from about 95% of direct cost funding for those receiving smaller amounts of granting council research funds to 40% of direct cost funding for the universities gaining the largest amount of funds from that source. In estimating the cost of such a program, we assume it will initially be based on a rolling average of the research funds each institution received from the granting councils for the years 1995-96 to 1997-98, the most recent data available at the time of the report. This would give a cost of between $250 million and $260 million in the first year. Costs rise to slightly less than $450 million as the rate increases to 45% and as the increased funding for the granting councils announced in recent budgets is incorporated into the rolling average.
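A minimal arithmetic sketch of the indirect-cost mechanism described above: a three-year rolling average of granting council funding, with a smooth rate declining from about 95% for the smallest recipients to 40% for the largest. The report's actual rate curve is not reproduced in this evidence, so the log-scale interpolation and the sample dollar figures below are illustrative assumptions only.

```python
# Hypothetical sketch of the scale-adjusted indirect-cost formula described
# in the testimony. The log-linear rate curve and sample figures are assumed
# for illustration; the ACST report's actual formula is not given here.

import math

RATE_SMALL = 0.95   # rate for institutions with the smallest council funding
RATE_LARGE = 0.40   # rate for institutions with the largest council funding

def rolling_average(funds_by_year):
    """Three-year rolling average of granting council funds (needs >= 3 years of data)."""
    return sum(funds_by_year[-3:]) / 3

def indirect_rate(avg_funds, min_funds, max_funds):
    """Smooth, progressive rate: high for small recipients, low for large ones."""
    t = (math.log(avg_funds) - math.log(min_funds)) / (
        math.log(max_funds) - math.log(min_funds))
    return RATE_SMALL + t * (RATE_LARGE - RATE_SMALL)

def indirect_grant(avg_funds, min_funds, max_funds):
    """Indirect-cost grant = rate x rolling-average direct funding."""
    return indirect_rate(avg_funds, min_funds, max_funds) * avg_funds

# Example: an institution averaging $2M a year in council funding, in a system
# whose smallest and largest recipients average $0.5M and $300M respectively.
avg = rolling_average([1_800_000, 2_000_000, 2_200_000])
print(round(indirect_rate(avg, 5e5, 3e8), 3))    # about 0.83
print(round(indirect_grant(avg, 5e5, 3e8)))      # about $1.66M in indirect costs
```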

• 0945

We also recommend that this program should be permanent, that it should be reviewed after five years to ensure that funding levels are neither too high nor too low, and that the concerns of all Canadian universities are being addressed.

In making our recommendations, for both substantive funding and appropriate means for delivering it, we are keenly aware of the limitations of a single standard of excellence applied to all Canadian universities regardless of their mandate or the state of development of their research and innovation infrastructure. Similarly, in many of the recommendations we made in Stepping Up Skills and Opportunities in the Knowledge Economy we recognized that smaller communities, especially those in more rural or remote regions, face different challenges from those of more richly endowed metropolitan areas with complex knowledge-based infrastructure.

While some smaller universities may see themselves primarily as liberal arts colleges providing excellent teaching and learning environments, other smaller universities and colleges have a community-based mandate for subregional economic, social, and cultural development, in addition to their direct education and training roles. Such institutions often are newer, 30 years old or less, were founded in times of federal and provincial fiscal constraint, and do not have the advantages of age and geography to assist them in developing large endowments through individual alumni and corporate donations.

Canada's well-developed granting council operations serve extremely well the standards of excellence achieved and desired by major comprehensive research institutions with well-developed research faculty complements and infrastructure. These institutions are very important to Canada's present and future, and recent federal investments in this model are both needed and appreciated. The standards of excellence applied in peer review within these systems focus heavily on the research track record of the individual researcher, international comparators, where those are available, and incremental evolution of research findings. It is both human and objective that when such standards are applied, they most often result in confirmation and expansion of well-developed infrastructure in major comprehensive research institutions or specialist niches in second-tier, mid-sized institutions.

By contrast, research activities in smaller institutions with subregional development mandates are generally less well developed. There are a few exceptions to that, but not many. Typically, they begin with involvement in some form of sectoral economic activity within their subregion that may support a very limited number of mid-sized or larger employers, for example, forestry, mining, agriculture, or regional manufacturing. It may be difficult for such institutions to move into other niche areas that will focus on assisting SMEs or helping to diversify the base for their local economy. As smaller institutions, they have less well-developed, if any, research and development offices, commercialization facilities, and partnerships with national or international institutions or organizations. Their capacity to develop any of these aspects of research and innovation may be further limited by the financial resources of the province in which they are located.

By way of extreme example, I mention my own institutional case in Nova Scotia. UCCB is a young institution located in a poor province where business and governmental activity is highly centralized in the provincial capital metropolitan area. In an island community of 150,000 to 160,000 people we have two private sector companies employing more than 500 workers. In the past decade provincial operating funding support for UCCB has dropped from nearly 70% of our operating budget to 36% last year. During that same time period student bursary assistance was eliminated by the province, as were provincial contributions to any capital building projects of universities. Until the last provincial budget, uniquely among the provinces, there was no provincial funding available to meet the matching requirements of any of the new federal investment programs. There is now a $20 million fund intended to provide matching funds for all provincial universities for all programs. Think about that figure for a moment.

Under such conditions, how does a smaller institution, in our case with a specific and legislated mandate for economic and social development, develop and augment its research and innovation infrastructure? With great difficulty, and not without the help of the federal government, and not by depending solely upon a single peer review standard that is heavily biased towards historical performance and/or the financial and research infrastructure capacity to support research at internationally competitive levels.

• 0950

As an aside, we have managed over the past decade to significantly and continually improve our capacity, through some means I'd be happy to share later with any members of the committee who are interested. We hope to continue our progress on goals that are important to our communities and students, but there are significant barriers to doing so. Some of those barriers continue to be related to economic geography—few large business entities, smaller communities, more remote locations—but others are related to our mandate. We specifically are focused on multidisciplinarity and curricular innovation, we combine both college and university programs, we are focused as well on economic diversification and sustainable growth of SMEs in our region.

Some of what we do is considered in traditional peer review circles as too new or too innovative—no track record of its having been done before. These are comments I've read frequently in peer review commentary we get back from some of our granting applications. We've been told in some peer review settings that we are too ambitious for a small, subregional institution.

Another and newer barrier is the self-reinforcing nature of newer federal investment. One's track record at CFI, for example, depends in some significant part on one's track record with the granting councils. Increasingly, as the granting councils try to also support CFI investments, the opportunities to break into that virtuous circle become increasingly constrained for outsiders, even though the results of such convergence are, in themselves, good and proper outcomes for well-developed, major, comprehensive research institutions.

Let me be very clear, both as an ACST member and as president of a smaller institution. I am not criticizing in any way, shape, or form recent federal investments in research and innovation in Canada's universities or the peer review dispensation of these. These were and are needed, and our major institutions, which benefit most from these, require that support. What I am saying is that if Canada pursues only that single model and standard, it will be nearly impossible for newer community-based institutions in rural and remote regions, with access to fewer local or provincial and territorial fiscal resources, to develop and serve their economic and social mandate for research and innovation.

I am saying, as chair of a skills panel report, that subregional communities that cannot develop the intellectual and physical infrastructure that is crucial for a knowledge-based economy will not survive and thrive independently. I'm also saying, as an ACST member and as president of a smaller institution, that the single standard peer review model is well embedded and not likely to change. It may be softened at the edges by special programs for young scholars or by sectoral initiatives that may opportunistically allow for participation by some smaller institutions, but those elements will be very marginal to how most existing moneys are dispensed. There's nothing wrong with that, but what is required to address that other need is a combination of the following.

First, we need a parallel and high-level federal institutional grant for indirect costs to universities that aspire to fulfil their potential in researching issues related to their mandate, with such investment being based on sound institutional plans that have measurable results, but wherein the standards of excellence may be more appropriate to developmental activity, as opposed to evolutionary activity. This, indeed, is what underlies our recommendation as the ACST in the Creating a Sustainable University Research Environment report.

Second, there is a need for increased federal investment in sectorally based innovation, utilizing and broadening the role of Canada's industry sector councils, as recommended in the Stepping Up report.

Third, we need direct investments in selected subregional institutions with compatible mandates that are focused on strategies that will result in the creation and sustainability of local SMEs becoming globally competitive, as set out in a number of specific recommendations in the Stepping Up report, which, while not yet adopted by Canada, have been adopted successfully in other peripheral economic regions and are now being expanded in those jurisdictions.

Thank you.

The Vice-Chair (Mr. Walt Lastewka): Thank you, Dr. Scott.

We will begin questioning with M. Paquette.

[Translation]

Mr. Pierre Paquette (Joliette, BQ): Thank you very much. I would like to thank you for your presentations, and particularly you, Ms. Scott.

You addressed the issue that we, at the Bloc Québécois, are interested in: how to reconcile the development of a system focussed on excellence with the necessity, for both economic and social development reasons, to have high-level research institutions in all regions of Canada.

• 0955

In your conclusion, you proposed a certain number of elements. Unfortunately, I did not quite understand the third one. You first spoke of a program in parallel with the one based on peer review—focussed on excellence—which would address the issue of indirect research expenses. You also spoke of having more sectoral programs, because, as you mentioned, many of these universities develop on the basis of a regional industry or activity.

I did not understand your third element. I would like you to repeat it. I would like to ask Mr. Winter what he thinks about it. Is it a solution to the problem when you have to choose between excellence and access to research for everyone, because teaching is linked with the development of research? I always draw a parallel with professional sports. If there were no draft system in professional sports, only the best teams with resources could have good players, and in the end, there would be no more game because there would no longer be the required balance within the league to make the product interesting. This is somewhat our goal. We want to find out how we can have a system that is equivalent to the draft system in professional sports so that scientific and technological research can develop in all regions of Canada.

I would like you to elaborate on these three points, and then, Mr. Winter, you could tell us what you think about it, since you carry the “ball” of excellence.

[English]

Dr. Jacquelyn Thayer Scott: I'm very interested to hear Alan's commentary on this, but your remarks remind me of another way of saying this: where you stand depends on where you sit. It's not that any one standard of excellence is wrong, but there are different and equal standards of excellence that are more appropriate to different contexts. It would be the same, for example, if we set out the task of determining what characteristics make an excellent MP. You might have different kinds of standards set by the business community from those set by a community that has lower socio-economic capacities and so on. It's not a question of which standard is better or worse, it's that there are different standards that are appropriate to the context.

That really is what the ACST has been saying in at least two of its reports, that we've been doing very good things related to strengthening the good things that already exist in Canada's well-developed universities in metropolitan regions, but now we need to have separate programs that are equally excellent, but with different standards for the excellence appropriate to other communities that are at a different developmental stage.

Dr. Alan Winter: Thank you for the question. I come from an industry background, so I may not have the broad perspective that answers all of these particular questions, but we are faced with some of the same issues. Some of those issues deal with how we attract the very best people to industry or to universities when perhaps we are faced with competition from the States. That's one of the things we have to deal with. We've learned in that process to do a number of things.

You do want to know what is globally competitive. You really do want to know what is excellent. That is only one part of the decision. That knowledge of what is globally competitive has to come through some sort of review, for example, a peer review, as Dr. Scott says.

The second thing is that this is not necessarily the only input into making a decision. There are many other strategic reasons for making a decision. To some extent, that requires focus. So from an industry point of view, if I want to attract somebody into my company in Vancouver, I have to make sure we are seen to be some of the best in the world at what we do, though it may be in a small area.

• 1000

Also, we have to provide sometimes the incentives, as you were saying about the draft, to allow that decision to be made, hoping there will be a critical mass of science and technology or development that then builds on itself, so it's not something that will stay there forever.

I think the federal government, in the cases the CSTA has looked at, is also in a competitive situation for attracting scientists, for example, into different parts of Canada, not just into the Ottawa area but also into some of the regions. And again, from a federal government science point of view, there has to be focus in some of these areas, such that in any one place it is seen to be excellent.

So I guess the summary is that there is focus required, that peer review is only one input into this particular process, because many of these decisions are strategic investments, and we have in this particular process to make sure we are globally competitive.

[Translation]

Mr. Pierre Paquette: Do I still have a little time? Something bugs me quite a bit about peer review. As you said in your presentation, we must ensure that scientific innovation will be supported and that conventional approaches will not necessarily be favoured.

I studied at university for a long time, unfortunately; I dragged things out a little. I was therefore able to see that there are trends. At a certain time, it was the environment. I have known many professors of sociology, economics, and so on, who suddenly became interested in the environment because the money was there. They gave up a whole series of extremely important dimensions of thought.

There are also the paradigms. I have a degree in economics; I even taught it for a few years in college. In economics, a few years ago, being Keynesian was completely outdated; as for being a Marxist, well, I need not talk about it. You had to be a monetarist. The Chicago school dominated. Fortunately, this week, the Nobel Prize in Economics was awarded to three neo-Keynesians. Maybe the paradigm will change, but I am convinced that in the 1980s and early 1990s, if you were not a monetarist, you had no chance of getting a research award.

Another phenomenon seems to be developing. Even as problems become increasingly complex, our scientific disciplines, particularly the human sciences, have unfortunately become somewhat segmented. We are witnessing a necessary return to interdisciplinarity. Is our peer review system able to take into consideration the complexity of research? Poverty, for example, is both an economic and a social problem which has an impact on health and on our concept of society.

What guarantees that peer review is able to avoid the pitfalls of trends and conservatism in science, which in the end prevent specific problems from truly being addressed? I do not know if you have an opinion on this, Mr. Winter and Ms. Scott. I would be very interested in hearing what you have to say.

[English]

Dr. Jacquelyn Thayer Scott: I'll give you a very practical and recent illustration of the point you make, and I think it illustrates again the points I was making.

Part of my recreational reading on the plane last night was a 35-page document of peer review comments on a recent grant application of ours that was turned down. In the case of the external peer reviewers, two of them were highly enthusiastic and recommended that the project be fully funded, one professed that he didn't have the expertise to make a judgment, and the other was opposed. This outcome meant that it had to be kicked to another level for making a decision.

The particular area in which we had requested the support was one that local consultation had determined to be important in our region. The academic community supported it, the local community supported it, a number of first nations communities in our area supported it, and what was really incredibly encouraging for us, we had raised more than $300,000 in subregional private money to support and complement this. That's tough to do in our area.

• 1005

The critique by the external reviewer who had recommended the project not be funded was that it was too innovative and the researcher did not have a strong track record in this particular area. That's absolutely an accurate and correct outcome of what I would call standard peer review. He's right, there isn't an historical track record in this, because it's a new area, and it's an area that the private sector has indicated, by laying money on the table, is important in our particular region. The individual who applied for the grant has been invited internationally, to Australia and New Zealand, Africa, to speak on this particular topic. She's been featured in both national and international higher education publications for the work she's been doing in this area. But quite accurately, the panel above, for whatever reasons—and standard conservatism said, if we had to choose, in a resource-constrained system, we'd go for proposals that were more evolutionary, where there were the traditional hallmarks of peer review, international comparators, historical track record, those sorts of things—made the correct decision based on that model. But that particular standard was applied to peer review as if this were a well-developed area or in a place where there was a lot of research infrastructure.

If a different standard had been applied, still requiring excellence, a different outcome might have been reached. If we had the sort of situation recommended in the ACST's latest report, having been turned down through the traditional system, we could use some of those dollars that are given to us for indirect support to pursue what has been determined to be a niche area that needs to be built in our institution and that one of these days will have that track record, will have that international credibility, and so on.

That's the constant dilemma. If we do what we are mandated to do, which is multidisciplinary, innovative, things that haven't been done by government programs in respect of subregional development, we will always fail with the other system. That doesn't mean the other system is wrong. It's right for a whole bunch of people, it's wrong for us. That's why we need the parallel tracks.

The Vice-Chair (Mr. Walt Lastewka): Thank you.

Dr. Alan Winter: Dr. Scott has really answered through a very good example. I'll just make two comments that back it up.

Innovation, by its very nature, as I mentioned, must allow the ability to come up with unconventional ideas, and we have to be very careful in peer review to allow those unconventional ideas to somehow surface. That may be through parallel tracks or through other methods.

The second comment is, you're absolutely right, as science and technology get more and more complex, with more and more factors, so that it's maybe not just quality, but also relevance and so on, then peer review has limitations, and those limitations are often to do with the fact that no one person can cover all the factors you're trying to bring into a decision. So my view is that peer review is only one input into a decision, it should not be the only input.

[Translation]

Mr. Pierre Paquette: I would like to thank the witnesses and apologize, as I must now go to the House. Thank you.

[English]

The Vice-Chair (Mr. Walt Lastewka): Thanks for coming and helping to start the meeting.

Mr. Pierre Paquette: Thank you.

The Vice-Chair (Mr. Walt Lastewka): Mr. Bagnell.

Mr. Larry Bagnell (Yukon, Lib.): Thank you for coming.

I realize this session's on peer review, but you could go outside this in answering my first question, which is related to anything on your council.

My riding's the Yukon, and I was glad to see Dr. Thayer Scott talk about the remote areas and solutions to that. I'm not sure we get as much science input from the federal government as possible, especially since the closing of Science Culture Canada, which, in a more limited capacity, was very useful. Do you have any comments on that, first of all, whether there's enough science investment in the north? We could do research in the north on the north and sell it to other northern nations. It would more logically be done there. A lot of northern research, actually, is done outside Canada or in the south. Do you have any concrete suggestions on the existing system that would make that possible?

• 1010

Dr. Jacquelyn Thayer Scott: I have to answer personally, not as a member of the ACST, because we haven't been asked to consider that question, so I honestly do not know what the council would recommend.

My own view is that many of the problems that are faced in the north are similar to problems that are faced in mid-Canada, particularly in relation to economic and social issues. You're quite right, traditionally, we haven't put as much emphasis on that, in large part because most of our institutions are not in that band. There are some interesting developments over the past few years that could perhaps be better supported, the circumpolar exercise, and the universities around the world that have formed a network concerned with issues in the north.

I think it would also be possible for the federal government, under the kinds of rubrics we've recommended, particularly in the skills panel report, to better develop the capacities of Yukon College and Arctic College to partner with folks who can help them developmentally on a particular track. I know we, for example, at UCCB work with Arctic College regarding faculty development and students in bachelor of science programs. We have a strong program that utilizes the traditional ecological knowledge as well as western models of science in that. There's no program that helps either us or Arctic College with those costs. That's something we feel is important to do and we try to do. We could do a lot better job at it a lot more quickly if there were the kinds of selective investments in developing subregional knowledge-based infrastructure that we recommended in the skills panel report. Those are more needed in the mid-north and the far north, which doesn't mean they aren't needed in some parts of the south, but it certainly is critical to the development of those areas, I would agree with you.

Dr. Alan Winter: I'm afraid I can't add very much to that on behalf of the CSTA, because the CSTA has not been asked to look at that either. But I can make a personal comment.

For a number of years I was director of engineering for Telesat Canada and had the opportunity to travel through the north at that time. I realized that a lot of the push we've had in Canada in communications has come from innovations that had been required because of our coverage of the north. So some of the ideas that had been included in trials and tests in Yukon and other places got fed back into the Canadian space program, for example. I think some of the investments the federal government proposes to make in, for example, communications must be crucial in making sure we're able to carry out research and development science and technology activities in the north as well.

Mr. Larry Bagnell: If you're going into future considerations, I can think of two examples. As you know, the northern pipeline is a huge issue in Canada right now, and apparently, 10 years ago, when they were doing the physical research on the pipeline, they actually did some of it in France, of all places. We have the cold, that's where the pipeline's going to go, why wouldn't... Another one is rime icing on windmills. That problem is not solved in any northern nation. If we did that research on our windmills, we could sell it to the Scandinavian countries, etc.

My second question is related to improving peer review, and is primarily for Dr. Winter. Your talk was great, but it was a lot of motherhood statements about the way things should be, which was fine. If we're given the task of trying to improve it, I have a couple of questions on how we might do that, to push you on a couple of your points and get some concrete suggestions.

You both made the point, I think, that peer review should be only one input in making a decision. I'm not that familiar with the process, but peer review seems to be, from what I've heard so far, the only input in some of these decisions. Is that true? How would you recommend that be changed?

• 1015

Dr. Alan Winter: I think Dr. Scott needs to answer that in more detail, because the ACST looks at external research, some of the granting councils, and so on. At the CSTA we look at the internal government S and T, and in that area, I think, generally, peer review is not the only input into a decision, because the departments have, perhaps, other reasons to want to carry out research in a certain area. They certainly want to know what is excellent research, and whether it should be done internally or externally, but the decision to go ahead and invest in a certain area is generally within the purview of the departments, their budgets, their strategic reasons, and so on. So I'm not sure it's as much an issue there as it may be in the external research for universities and for the granting councils.

Dr. Jacquelyn Thayer Scott: That would certainly be the case. There are some categories in the granting councils that have strategic areas of focus, where the relevance of the program to the strategy is important, as well as the application of the peer review process.

As well, there are some interesting new developments in the CIHR. It is still to be seen how that plays out, but they have included external community members who are not part of the scholarly community as part of their institute advisory boards, which is an interesting piece.

As far as remedying the situation goes, generally, I think you would notice a fairly significant change simply by implementing the recommendations of the two ACST reports related to this topic on the needs of parallel systems. I think you'd begin to see a lot more innovative activity across the spectrum that has different, but equally valuable, standards of excellence applied in the evaluation and measurement of this.

Mr. Larry Bagnell: Dr. Winter, you also made the good point that a downside might be that it would inhibit the unconventional, which it is important to keep. In fact, you can probably give anecdotes from history. If all the people on the peer review committee had thought the world was flat or they all believed in evolution, it would have prohibited research in another area. What suggestion would you make for the granting councils that would help avoid that problem, if you think that problem is actually occurring with certain applications?

Dr. Alan Winter: It's a good question, because I think, to some extent, the programs need to be set up to recognize that there are unconventional areas. Again, the CSTA doesn't particularly look at the granting councils, but from a personal point of view, for example, when I was in industry running a research centre, we did a peer review to find out what excellent research there was, but that was one of probably ten inputs in that particular case. In that case we always kept, for example, 10% of our budget for what we called unconventional ideas, or ideas that perhaps would have great payoff, but were not the most conventional way to go about doing the science and technology. That is not so much an issue of how you do peer review, it's more an issue of how you set up a program.

With the New Media Innovation Centre, which we have now opened in Vancouver and which is about 50% funded by industry and 50% funded by government, we try to say that if you have standard product development, for example, that gets reviewed in a certain way. But if you're dealing with longer-term research, where there are significant areas of interest to a number of players but in fact there isn't a centre of excellence for that topic, we should be able to take a percentage of our budget, again, and fund that.

• 1020

So I think there are ways of doing it, but I think the ways generally deal with how you set up a program, and then how you take peer review as one input, not all the input, in making decisions about those particular investments.

Dr. Jacquelyn Thayer Scott: I would reinforce what Dr. Winter said. It's how you set up a program. But I don't think you will significantly change that system. Let me again give an illustration from various peer review panels I've sat on myself.

For the past 20 or 30 years there's been a very lively debate within the scholarly community over the validity of various models of quantitative, as against qualitative, research. You have people from different perspectives submitting different kinds of proposals. It's the luck of the draw. You may hit a panel that's composed of a lot of quantitativists when you've got a qualitative proposal. It doesn't matter what the problem is, doesn't matter what the solutions look like, it's going to fail on methodological grounds. That kind of thing is always going to exist, because arguments within the academic community are never resolved over a short period of time, they're lively debates that take decades. So to that extent, while you might make some marginal or incremental difference in design of program—and I certainly agree with Dr. Winter's suggestion on that—you're not going to see major change in that system.

As I say, it does the things it does very well, but if you want some other things to happen, there has to be a parallel and other system that is not controlled in the same kind of way, that has different standards of excellence and measurement applied to it, because it's a system that's taken a long time to build, and it's not highly amenable to revolutionary change.

The Vice-Chair (Mr. Walt Lastewka): Okay, Mr. Bagnell?

[Translation]

Mr. Drouin, you have the floor.

Mr. Claude Drouin (Beauce, Lib.): Thank you, Mr. Chairman. I also thank our witnesses for their presentations.

Ms. Scott, in your presentation, you said that partnerships between universities, colleges, and the private sector should be promoted. I would like you to explain this, because at home, in the Beauce, we have a partnership that works very well and greatly helps the region in terms of industrial development. I would like to hear your comments on this. I would also like you to give us a concrete example, if possible.

Secondly, we have heard a lot about indirect costs since the beginning. My question is rather hypothetical. Assuming the same amounts of money are maintained, will the government have to reduce its grants to include indirect costs in them, or will it have to try, as you do with your partnership, to find other ways of ensuring that maximum funds will be allocated to research?

[English]

Dr. Jacquelyn Thayer Scott: Those are very interesting questions. Thank you. I'd be delighted to comment on the partnerships in the Beauce, because that's something we've looked at and tried to emulate. It's a very successful region in doing that, and there are a lot of social cohesion factors as part of the success that sometimes don't exist in other regions, but can be built.

I do think the partnerships are the way to go. We're another place that has focused on partnerships with the private sector. But I would say, with respect, that again it depends a bit on where you're located. Quebec has one of the outstanding track records in the country in regional development of its universities and its various regions. All provinces do not make the same kinds of investments and decisions. Similarly, Quebec's science and technology council is highly respected across the country for many of the reports and things it has done, and it has indeed looked at some of these kinds of issues.

So yes, it's great to emulate it, but there is some strong governmental support for those private sector, public sector partnerships to happen, and that's required in other places too where initially, there are not many larger private sector entities to really participate in that kind of way.

With regard to your other question on indirect costs, I'd like to say that neither one of those alternatives is very attractive to me. If you cut the direct costs of research, there's no net gain. They're going to have to steal from the indirect money to pay the direct costs, or not as much work is going to get done. So I don't think it's a satisfactory solution.

• 1025

Is it possible to look for other supports? Yes, but the capacity to do it is highly differentiated across the country. A few years ago we did an interesting little map, which I'd be happy to dig out for anybody who's interested, for some of the MPs in the Atlantic region. We looked at the map of Canada and did a scattergram of where federal investments in research take place, within the NRC and so on, how the granting council moneys get distributed, and where the headquarters of the 500 leading Canadian-owned organizations are. It's a very interesting little exercise, three overhead transparencies. Guess what? You put them together and it's one transparency. It's no accident that it's easier to raise private sector money in Calgary or Toronto or Ottawa or Vancouver or Montreal. That's where people see you, people know you.

We were having an interesting exchange, the chair and I, before we began this. I was saying we get very much involved with recruitment of firms and development of private industry in Cape Breton, and one of the biggest comments we get, whether we're talking to a venture capitalist or to someone who might locate a firm there, is, do I have to take more than one plane to get there? In the chair's case it's, do I have to cross two bridges to get there? But my point is that if you're right next door to where the money is, if you can drop in and see one another in the Petroleum Club or the Halifax Club or whatever over lunch, it's a lot easier to make that fundraising request than if you're in some far-flung place where they don't know you as well. Even if they do know you, by virtue of assiduous marketing and so on, there's not the same cachet in dealing with you. If you're going to get involved, make very large donations, and be widely appreciated at a large university, would you rather be recognized by Goshen College or by MIT, to use U.S. examples? The same applies in Canada.

So it's a tougher stretch for regional institutions to do that, and I think some of your Quebec colleagues in some of the regional university systems have found that to be difficult as well. They've had the advantage of the provincial support. That doesn't necessarily exist in all other provinces, and for sure, it doesn't exist in the province I'm in. So the ability to look elsewhere for matching funds and partnerships, while it must be encouraged and while the institution should demonstrate that it's made the effort, is very widely differentiated.

[Translation]

Mr. Claude Drouin: Thank you. I appreciate your comments.

You know that the same problem exists in Quebec, although Quebec is recognized for its considerable efforts in this area. Remote regions are often left behind, and this is where Quebec must innovate and find new ways to meet its objectives so that these regions may develop normally.

Thank you very much, Ms. Scott.

Dr. Winter, you said that peer review was not always appropriate. In one of the answers you gave, you also said that peer review, if I am not mistaken, did not necessarily consider the regions, the dynamics and the problems some of them have. Did I get it right? I would like to understand the circumstances in which these reviews would not always be appropriate. Could you elaborate on this so I can understand what alternative or improvement you propose to these reviews?

[English]

Dr. Alan Winter: My experience comes, as I mentioned, from the CSTA viewing of the federal government expenditures and from industry and dealing with some of the organizations like the Networks of Centres of Excellence. I think to some extent, we allow ourselves to assume that peer review actually makes the investment decision, and if there's any one theme I have in this it is that it's only one input for the decision itself. So in answer to your question, if what is being reviewed is scientific excellence, then it's important to have peers who understand that scientific excellence. Sometimes I think we go beyond that, to go by some of the peer reviews I've seen personally, where we ask the peer review committee to essentially make the investment decision. That, of course, means it is not just on scientific excellence, it may be on relevance, it may be on a judgment as to whether this is the right research to be done for this at this time.

• 1030

So I think it's very important, first of all, to be very clear about who makes the investment decision and what organization does that. Second, what is being asked of the peer review committee? If it is scientific excellence, then that should be its purview. In fact, instead of checking off a box that says invest or not invest, the question should be, what is the level of excellence in this particular case? It should be recognized that there may be several factors that go into making an investment decision.

That, I think, is why you asked the question about regional investment. Regional investment decisions may be based on several other factors. As Dr. Scott has pointed out, you may have to seed an investment in a certain area in a remote region that you hope will develop into a significant scientific venture, or perhaps a company. That may take several years. So you make a strategic investment based on that particular set of circumstances.

I think those were the two things I was trying to bring out: first, that it's important to understand who makes the decision for any investment, whether that be inside the government or outside; and second, what is being asked of the peer review committee, and whether people are being chosen to answer that particular question rather than a very broad range of questions.

[Translation]

Mr. Claude Drouin: Thank you, Mr. Winter. I have another short question for you.

Did you say that, when peers review a case, they simply have to check the appropriate box, or must they explain why a project is refused? I ask the question because, when the explanation for a refusal is recorded on file, it is often possible to argue that this or that element of the project was not considered, or was not adequately considered, and so to demonstrate that the refusal was not justified. If one only has to check a box indicating the project is refused, there is no way out. What I understood from your presentation is that there need to be peers who know the project and have thorough knowledge of the area involved. Is that what you said? Did I get it right, Mr. Winter?

[English]

Dr. Alan Winter: Perhaps I didn't explain myself well. If we're being asked to evaluate scientific excellence, then obviously the peers have to understand that area of science. They may not understand all the other factors. In many cases where there's an international panel, for example, the panel may know nothing about the particular Canadian place where the work is carried out, but they do know, on a global basis, what the science is. What they're being asked to do is judge the level of excellence against an international, global, competitive scientific view of this particular topic. That's more what I meant.

Peer review has no magic. There are other ways of evaluating projects. In fact, in the venture capital arena, which I'm sure many on the industry committee are involved in, peer review may be only one of the questions asked. Is this technology really good? Is this something we should invest in? On the other hand, the decision to invest in a company involves many other factors. Is the marketing system there? What's the access to global markets? Therefore, if you want an answer in any one area, I think you have to pick the people you're going to ask to deal with that particular area, not necessarily the whole broad range of factors that go into an investment decision.

[Translation]

Mr. Claude Drouin: Thank you, Mr. Chairman.

• 1035

[English]

The Vice-Chair (Mr. Walt Lastewka): Thank you very much.

Since we have some time, I'm going to ask some of my favourite questions. First, I'm going to ask Dr. Winter, because of his industry background.

Over the last 20 or 30 years industry has improved its quality, management, and leadership by having good benchmarks and standards and by being rated. The programs that were developed were meant to improve not only the quality coming out of industry, but the systems as well. My question to you, sir, is this: since those programs helped industry to eliminate waste, duplication, and much that wasn't value-added, what similar program do we have in research?

Dr. Alan Winter: Again, this is more a personal comment than one from the CSTA. My experience in industry would parallel what you have mentioned. Over time we have probably become better in manufacturing, in the quality of products, and in the research and development that goes into them. But the improvements, to some extent, have more to do with the processes for development than the processes for research, if you see what I mean. Many of the standards that started to be applied on the manufacturing side, as you know, are being backed up, if you like, into the development areas, and for that matter into the research areas.

The Vice-Chair (Mr. Walt Lastewka): Right.

Dr. Alan Winter: I think to some extent we can help small companies by providing, if you like, some of the discipline in making good decisions with a limited investment in, say, product development. Some of that involves helping to deal, as you mentioned earlier, with how you get a very good proposal to go forward, how you describe what it is you want to do under research and development, and how you get approval for that, whether that approval is internal to the company or comes, for example, from a venture capitalist. Those are some of the areas we deal with.

The Vice-Chair (Mr. Walt Lastewka): The implementation of those programs in industry, as you said, backed up into the development stage and has now backed up into the research stage. As a result, better research is coming to many industries because of the systems that follow the research: the development, the actual marketing, and so forth.

Dr. Alan Winter: Right.

The Vice-Chair (Mr. Walt Lastewka): That discipline has shown that large companies are being challenged by many small companies. In large companies and corporations you can hide the waste, hide the duplication, and hide many non-value-added activities because of their size. I'm not saying that is exactly paralleled in large and small universities, but I see a direct correlation. I see many large universities getting much more money, grants, and research dollars to do their work because of their history and so forth, but I'm not sure how they're being benchmarked for excellence, because excellence can also come from a smaller university.

Do we have anything even being thought about in that way? Dr. Scott.

Dr. Jacquelyn Thayer Scott: I don't know that I can give a complete answer on that, but I can certainly ask our ACST staff to get back to you with a more comprehensive reply.

• 1040

The principal thing I would say we have at the moment is the attempt to measure commercialization based upon expenditure and a few little indices like that. But as for an equivalent program such as you're describing, equivalent to what industry has, I don't know that such a thing exists, and I'm thinking internationally as well. The U.S. has a whole separate set of accrediting agencies, and some of their national funding agencies do their own kind of accreditation separately from the peer review process. That's not as often the case in smaller countries, because it's a pretty heavy-duty investment on the U.S.'s part to do that.

In some jurisdictions, perhaps in Alberta, where they keep performance indicators and a few other things like that, they begin to approach that, but I'm not aware of any comprehensive system. I'll ask our staff to get back to you on that.

The Vice-Chair (Mr. Walt Lastewka): I'd just like to see some of that correlation. When those principles were applied to large businesses and large corporations and then started to bubble up into the small businesses, we found out that the small businesses really challenged them, because there was excellence in the smaller companies: they could make decisions more quickly, they didn't have a bureaucratic system, and they didn't have all the extra items a large corporation has. As a result, they have been more effective.

I'm just wondering whether there is a correlation with smaller universities. I would appreciate any feedback on that.

Mr. Bagnell, do you have any more questions?

Mr. Larry Bagnell: No.

The Vice-Chair (Mr. Walt Lastewka): Mr. Drouin? Ms. Torsney? No.

I would like to thank the witnesses for their excellent presentations and some really good discussion. You've added some different aspects from what we've had in the past, and I think that's one of the reasons the questioners wanted to take that extra time, to get some satisfaction and be able to compare a few notes and challenge our next steps.

So I want to thank you very much. Dr. Scott, I am sure you can catch your one flight to Cape Breton as soon as possible. Thank you for being with us today.

The meeting is adjourned.
