STANDING COMMITTEE ON INDUSTRY, SCIENCE AND TECHNOLOGY
COMITÉ PERMANENT DE L'INDUSTRIE, DES SCIENCES ET DE LA TECHNOLOGIE
EVIDENCE
[Recorded by Electronic Apparatus]
Tuesday, October 16, 2001
The Chair (Ms. Susan Whelan (Essex, Lib.)): Pursuant to Standing Order 108(2), consideration of the peer review process, we are very pleased to welcome for our first meeting the Canadian Institutes of Health Research, Mr. Mark Bisby, the director of the research portfolio; the Natural Sciences and Engineering Research Council of Canada, Elizabeth Boston, the director of research grants; the Social Sciences and Humanities Research Council of Canada, Ned Ellis, the vice-president of programs; and Dr. René Durocher, the executive director of the Canada research chairs program.
I would propose that we begin with opening statements, unless there has been a different agreement, in the order I just listed, and then we will move to questions together. If that's okay, we will begin with Mark Bisby.
Mr. Mark Bisby (Director, Research Portfolio, Canadian Institutes of Health Research): Thank you very much for the invitation to appear.
The Canadian Institutes of Health Research, CIHR, is the federal government's primary agency for supporting health research at universities, hospitals, clinics, and research institutions across Canada. Its objective is to excel, according to internationally accepted standards of scientific excellence, in the creation of new knowledge and its translation into improved health for Canadians and more effective health services and products, and to strengthen the Canadian health care system.
• 1530
CIHR began operations in June 2000, replacing the former Medical Research Council and the National Health Research and Development Program of Health Canada. It consists of 13 institutes, each of which is headed by a scientific director, advised by an institute advisory board composed of researchers and other stakeholders. Overall strategic direction of the organization is provided by a governing council, whose chair, Dr. Alan Bernstein, is also the president of CIHR.
CIHR's current grants and awards budget is $452 million, invested in approximately 3,300 research projects across Canada, also supporting 2,100 young Canadians undergoing training in research and about 600 career investigators.
We fund two types of research, generally. The first comes through what we would call open competitions for grants and awards, which support the best research, as determined by peer review, in any area of health research. The second type of research activity we fund we call strategic or thematic research, providing opportunities for support of research in an area deemed to be of particular importance to the health of Canadians or to represent a topical opportunity to make research advances, because of the evolution of health sciences.
The decision about which of the many urgent health issues require attention through these thematic initiatives is made largely by the 13 institutes, which, independently or in collaboration, use their strategic research budgets to finance this strategic type of research. Currently, about 20% of the CIHR budget supports strategic research, but this proportion will grow to somewhere between 30% and 40% of the budget as the institutes, which only began operations in February 2001, mature and develop.
Peer review is the process by which CIHR selects proposals for funding. In the 2000-2001 competition cycle CIHR had 50 peer review committees, which involved a total of 730 volunteer reviewers. These reviewers each give up between 4 and 6 weeks of their time to read proposals, to write detailed reports on them, and to meet as committees, usually twice a year. A typical peer review committee meets for two days at each session and considers around 50 proposals. Through a process of seeking consensus, the committee arrives at a numerical rating for each proposal.
I would like to say a few words about the criteria the committees use in evaluating the applications they see. There are essentially five.
The first is significance, the potential of the proposal to generate advances in knowledge about the biological, behavioural, environmental, social, or cultural factors that influence human health and the health of populations, or its impact on the organization and management of the health care system.
The second criterion is the approach the researchers intend to use, the feasibility and effectiveness of the methods they intend to use in their project.
The third criterion is novelty, the innovation one sees in the concept, design, or implementation of the project, or the way in which it will disseminate its research results.
The fourth criterion is the team of researchers, their relevant previous experience, their training, their productivity, and their previous achievements in research.
Finally, there is the environment, the adequacy of the research infrastructure that will support the investigation.
Those are the five criteria: significance, approach, novelty, team, and environment. A handy acronym is SANTE, santé, health.
The rating each application receives is critical in determining whether or not it gets funded. While about 70% of the 5,000 proposals that are reviewed each year are considered worthy of funding, the budget that is available allows only about 30% to be funded. The CIHR's rating scale, which goes from 0 to 5, currently requires about a 4, or an excellent rating, for a proposal to be sure of being funded.
The peer review system at CIHR is undergoing a period of evolution. We've introduced a dozen new committees in the past year to deal with increased application pressure and to accommodate CIHR's broadened mandate, and we're creating about 15 ad hoc committees to review the strategic or thematic initiatives I mentioned. To try to find the best fit between a specific proposal and the team of reviewers that evaluates it, we'll be forming clusters of similar peer review committees, which will meet at the same time, allowing for a flexible committee membership, better tailored to the review of a wide range of proposals. This cluster organization should also stimulate a more rapid evolution of committee mandates to anticipate, rather than react to, changes in the directions of health research.
• 1535
We also now include non-researcher members on peer review committees in the thematic and strategic part of our funding, where other experience and expertise can add important additional dimensions and perspectives to the review of applications. For example, we include community members on committees that review proposals in the area of aboriginal people's health.
CIHR has an obligation not only to support excellent health research, but also to ensure that there is robust health research capacity in all regions of Canada. The peer review system does not generally consider regional distribution as one of its criteria for evaluation, and we have developed other processes to try to ensure that there is good regional distribution of CIHR funds.
First, we have widened our eligibility terms, so that researchers who are not connected with a university can now apply for CIHR funding.
Second, we operate a regional partnership program in several of the smaller provinces to build their health research capacity. That is a partnership with local funding sources to support additional research in those provinces.
Third, we've started a development grant program to universities that have identified health research as a priority in their Canada research chair strategic plans, but have traditionally not been strong in health research. Our objective in this development program is to combine CIHR's seed money synergistically with Canada research chair positions, and so develop specific peaks, areas of health research excellence, in these smaller institutions.
I will stop there.
The Vice-Chair (Mr. Walt Lastewka (St. Catharines, Lib.)): Thank you very much.
We will now go to Natural Sciences and Engineering Research Council of Canada, Elizabeth Boston. Please begin.
Ms. Elizabeth Boston (Director, Research Grants, Natural Sciences and Engineering Research Council of Canada): Thank you, and good afternoon, everybody.
[Translation]
On behalf of NSERC, I thank you for your invitation to appear before the committee. We appreciate this opportunity to discuss with you the benefits and the challenges of the peer review process.
[English]
The president of NSERC, Tom Brzustowski, has asked me to bring his apologies for not being able to attend this meeting. He is chairing a meeting of NSERC's council in Edmonton that had been planned for over a year and couldn't be changed on very short notice.
NSERC is a federal granting agency that makes investments in Canada's capability in the natural sciences and engineering. We also invest in the highly qualified people who are needed to create new knowledge for the benefit of all Canadians. NSERC's budget is about $600 million a year.
As to the subject of today's discussion, peer review is a very well established system, used around the world by many funding organizations. It ensures that only excellent research is funded and that all applicants are treated fairly. In essence, it's the assessment of research proposals or research contributions by impartial experts in the field.
At NSERC we generally use peer review in the following manner. First, a researcher submits an application for funding. The content of the application will vary according to the program, but it always includes details about the proposed research, the past experience and qualifications of the applicant, as well as budgetary information. NSERC then sends the application out for review by several international experts in the field, both inside and outside Canada. Every year we consult more than 10,000 experts worldwide.
The application and all the reviews are then sent to a selection committee composed of experts who have agreed to donate their time to the process. They evaluate the application against the program criteria, which also vary, but always include the quality of the proposed work and the qualifications and past achievements of the applicant. The committee then recommends whether the application should be funded, and if funded, the size and duration of the grant.
• 1540
If the application is unsuccessful or is funded at a level significantly less than requested, then the committee also provides feedback to the applicant, outlining its reasons. The applicants also have access to the international reviews that were provided in writing, and they can ask NSERC staff for advice in preparing future proposals.
Peer review isn't a perfect system, and it has its critics, but like democracy, it's the best system we have. Like many agencies around the world, we believe peer review is the best way of selecting the highest quality applications from thousands of competing proposals.
NSERC has some policies and procedures in place to deal with the issues that can sometimes arise. First, we have guidelines to manage and monitor conflicts of interest, and over the years we've found that our committees and our reviewers behave with great integrity. After all, they stake their reputations on the integrity of the process, and they have to use it themselves as applicants. We continually strive to ensure that all our processes are open, fair, and easy to understand.
Sometimes errors can occur in the peer review process as well, and if an applicant feels their application wasn't assessed appropriately, they can use our appeal process to request a review of the decision. NSERC then requests an independent review by a senior researcher who was not involved in the original decision, and staff make a final decision based on this adviser's report.
NSERC has no policy on regional distribution of funds, and in fact, the regional results are the sum of many individual funding decisions. But we are aware that small and some medium-sized universities in some regions of Canada aren't succeeding as well as they would like. NSERC senior management takes this very seriously, and last year we visited 16 universities in the Atlantic and prairie provinces to investigate the issue a bit further. We found that there are indeed barriers in some universities to research productivity, which can in turn significantly affect their ability to compete in an excellence-based peer review system. But the universities actually told us not to lower our standards. Rather, they need targeted resources to bring their research facilities and capacity to a high level, to allow them to compete on a level playing field, and a relatively modest program to provide flexible infrastructure support could go a long way to achieving this goal. This advice has been made available to Industry Canada and the federal regional economic development agencies.
In closing, I'd like to say that the peer review system is well respected by the great majority of the academic community. This is evidenced by the huge volunteer effort from thousands of participants, who dedicate about 80,000 hours a year to the process in NSERC alone. This dedication and commitment, as well as NSERC's policies and procedures, ensure that the system maintains its high standards of equity, quality, and accountability.
Thank you.
The Chair: Thank you very much.
We will now turn to the Social Sciences and Humanities Research Council of Canada, Ned Ellis.
[Translation]
Mr. Ned Ellis (Vice-President, Programs, Social Sciences and Humanities Research Council of Canada): Thank you, Madam Chair.
Members of the committee, I am pleased to be with you to assist in your important study of the principles of the peer review process in the federal granting councils.
I am Vice-President of Programs with the Social Sciences and Humanities Research Council of Canada. The peer review process is obviously part and parcel of my job, as well as that of all my program officers.
During my remarks today, I will focus on three points: first, I will give a brief overview of the peer review system at SSHRC and underline aspects that have not been raised by my colleagues; second, I will describe the affirmative measures taken by SSHRC; and third, I will discuss the evolution of the peer review system.
First of all, in Canada, peer review is already more than 40 years old, and it is a process that is recognized all over the world. Given scarce resources, it remains the best system for distributing public funds. Above all, it is a system that ensures a high degree of excellence in funded research. It is the most independent, the most transparent, and the most objective granting process.
Each year, 4,000 outside reviewers and 300 committee members contribute their work on a voluntary basis.
[English]
I've seen this work. It's incredible the effort that people in the academic community put in to look at each proposal in detail and come prepared to discuss and to judge them.
• 1545
It's important how the committee members are chosen. At SSHRCC, committee members are first proposed by the officers who work in my division. They get the names by looking in SSHRCC's database, which now has over 14,000 curricula vitae, and by going through lists of names that have been suggested by the universities. What's really important here are the criteria we use, because we know exactly what you're interested in, and they're the same things we're interested in, quite frankly. When choosing, they look in particular for regional representation, for male and female balance, for small institutions, and—something we're very proud of at SSHRCC—we also look for a working comprehension of the two official languages. That doesn't mean that everyone on a committee is perfectly bilingual, but they are able to read and to understand.
[Translation]
Moreover, each application is sent to two experts in the relevant field of study: one expert is selected by the researcher and the other is selected from the SSHRC database. These reviewers undertake a detailed analysis of the project and assess its quality and relevance.
[English]
At SSHRCC we have adopted a number of measures that, although they're certainly not adopted exclusively for this reason, over time have had the effect of improving certain distributions of our grants. For example, we have a program called the SIG program, which is SSHRCC institutional grants, and these go to all institutions to help them build their research base. The money tends to go for smaller projects with, perhaps, newer researchers who are just getting into the field and starting to get their feet wet, so that they can become experienced researchers and make application to the council.
We also have a specific program of aid to small universities. It's not a huge program, but it's really well subscribed. The universities each send in a proposal for an area where they would like to put some focus. So we ask them for a small strategic plan, something that has helped them a great deal in preparing to deal with the chairs program, I'm sure.
In certain programs sometimes we will introduce particular clauses, in the community-university research alliance program, for instance, which was, of course, community-based and had a huge demand. What was nice about that program was that we specified that institutions could only get one community-university research alliance, which meant that it was much more widely spread across the country. If we hadn't had that, frankly, a few institutions would have received quite a few of those.
As to new researchers, it used to be, up until a few years ago, that the new researchers were in the smaller universities. So we have a special scoring system in our standard research grant, which doesn't give them a great break, but gives them something of an extra advantage that they wouldn't have normally in the scoring criteria. Now, of course, new researchers are coming from all institutions, and they're coming in increasing numbers.
Finally, when we do what we call grant information seminars, we focus on the small universities. This is where we put our efforts, quite frankly, because we're looking to help those that need help.
[Translation]
Peer review is not a perfect system. In the case of SSHRC, 70% of proposals are not funded, which means that many people are likely to be displeased.
In spite of the attempts made by SSHRC to ensure regional representation, provincial policies play an important role in creating a favourable climate for their universities and promoting their success in the competitions held by the federal granting councils. Moreover, smaller institutions must take on challenges that larger universities do not face.
[English]
From personal experience, the thing I would tend to focus on there is having a critical mass. That is one factor that's really important. You only have to look at the high-tech sector in the west end of Ottawa to realize how important it is to have other institutions or other organizations that are in the same business, so you've got more people to talk to, more people who encourage you to have ideas or with whom you can easily form alliances. Some universities are a little isolated that way, it's difficult for them to do.
• 1550
The absence of large numbers of master's and doctoral students, particularly in some of the smaller universities that are not able to offer doctoral programs, is a problem. It helps a research proposal a tremendous amount to have that kind of talent at your side, and it is something the review panels look for, because they like to see students being trained.
The third thing, quite frankly, is infrastructure. Some universities just do not have the same research structure. The really obvious element is the quality of the library, but the less obvious one is the quality of the research grants office, which plays a really important role in getting people to apply and helping them with their proposals, so that they are of a higher quality. Some have a very poor research grants office, and some really have little or no SSHRCC presence.
I was warned not to talk about money, and I'm not going to ask for money for SSHRCC, but I did want to put one point on the table. When Marc Renaud was here before this committee, he did talk about money. When he came back to the SSHRCC council, there was a fair discussion. The council decided to put more money into basic research grants. The effect of this is that smaller universities are not really that far off. They're excellent, they're just not quite as excellent as some of the larger universities.
We have three categories of recommendations that are made by committees. One is recommended and funded. The second is recommended, but unfortunately SSHRCC does not have the money. The third category is not recommended at all. That second group, recommended but not funded, is where you find a lot of the small universities and a lot of the universities over which there are regional concerns. We decided at that council meeting to beg, borrow, and steal from other programs. We ended up slowing some down, we ended up putting some off for a year, in order to finance more of that middle category. The greatest percentage increases were in Saskatchewan, Manitoba, New Brunswick, and Prince Edward Island. It shows that they're not that far below. They're just
[Translation]
one notch above.
[English]
Without evaluation by peers, I would be hard pressed to come up with another system that would guarantee excellence. I'm struck by the fact that we now have at SSHRCC a number of joint initiatives with government departments and voluntary organizations. When I joined three years ago, it was hard for SSHRCC to get these other groups to do business. Now they're beating down the door. One of the biggest selling points we use with them, quite frankly, in the wake of HRDC, is the fact that we do have a neutral, external, high quality system that allows them to pick excellence from right across the country. As a result, we're doing business with almost every government department in Ottawa now. They see us as a good and neutral way of doing this. We're also doing business with organizations like Thérèse F. Casgrain and the Kahanoff Foundation, for exactly the same reason.
As a closing note, I've experienced both systems. I was director of research in Canadian Heritage for five years before I joined SSHRCC. This system of peer review is far superior to the kind of system I had, where essentially, I and a couple of other people were picking who won and who lost, and I certainly did not have the expertise to do that.
[Translation]
Madam Chair, I thank you for giving me the opportunity to make this presentation on peer review process as it is applied within SSHRC. I will be pleased to answer any question. Thank you.
[English]
The Chair: Thank you very much, Mr. Ellis.
Mr. Rajotte, please.
Mr. James Rajotte (Edmonton Southwest, Canadian Alliance): Thank you, Madam Chair. And thank you very much for coming in today. I certainly appreciated the presentations. We can appreciate how difficult it is for you to set up a process for determining between the applications as they come.
I do want to touch upon the large and the smaller universities and the regional distribution. It's been an issue in this committee for some time about smaller universities not getting their fair share. Perhaps you could just define the difference between a large university and a small university. I don't know if you all use the same standard. And do you know what percentage goes to large versus small? That would be the first topic.
• 1555
For the second, in respect of regional funding, it seems there are different approaches between the agencies—
The Chair: Excuse me. I have made an error here. I assumed that Dr. Durocher and Mr. Ellis were together. We have one more presentation before we go to questions. I apologize.
Dr. Durocher.
Dr. René Durocher (Executive Director, Canada Research Chairs Program): Thank you very much.
[Translation]
Madam Chair, ladies and gentlemen, I am pleased to meet with you today.
First of all, let me give you a brief overview of the Canada Research Chairs Program. The program was created in 2000, and its purpose is to establish 2,000 research chairs in Canadian universities by 2005. The program has a budget of $900 million.
[English]
The primary objective of the CRC program is to enable Canadian universities, together with their affiliated institutes and hospitals, to achieve the highest level of research excellence and to become world-class research centres. It is a very special program compared to what you've heard from my three colleagues, because it is only one program. They have a lot of programs they can play with, they can compensate, they can do different things with their programs. The CRC program is unique. There is only one program, but it's huge, it's ambitious.
The program is open to Canadian universities; it is institution-based. Nominations for chairs must be submitted by universities, not by individuals, as is the case for grants, for instance. There are two types of chairs. Tier one chairs, worth $200,000 a year, are to attract and retain today's research stars, that is, experienced individuals acknowledged by their peers as international leaders in their research fields. Tier two chairs, worth $100,000 a year, are to attract and retain future research stars, those acknowledged by their peers as having the potential to lead their research fields in the future. Tier one chairs last seven years and can be renewed; tier two chairs last five years and are renewable once. There will be a total of 2,000 chairs allocated to 61 universities throughout the country: 45% in the science and engineering sector, 35% in the health sector, and 20% in social sciences and humanities.
The allocation of research chairs is based on a university's track record in obtaining research grants from the three federal granting agencies. For many people this is the principal benchmark for research excellence, which is the fundamental design principle of the program. Once again, it's a bit special, because you don't invest in 2,000 researchers, paying $900 million, without having one very important objective, that is, research excellence, building world-class research centres in this country. It's really a strategic investment.
But still the people who designed this program were conscious of the small universities. That is why in the program there's provision of 120 chairs for the small universities. These are special chairs for small universities. But they can also obtain regular chairs. Looking at the figures we have now, we can expect that the small universities, those who receive less than 1% of the funds from the granting agencies—that's the definition of small universities—will receive 120 chairs as a special allocation, plus at least 120 chairs as a regular allocation. So they will have 240 or 250 chairs.
• 1600
The CRC program provides unique opportunities for smaller universities to attract or retain world-class researchers. Without this program, it would be much more difficult for these small universities to retain or attract these researchers. In fact, through the CRC program, excellence is being promoted across the country in all universities, whether small or large.
Another feature of this program is that the Canada Foundation for Innovation is investing $250 million alongside the chairs. As you know, CFI covers 40% of the cost of a project, so another $350 million will be found by the universities from their endowment funds or paid by their provincial governments. So $900 million is invested in the chairs, but there will also be $600 million invested in equipment and infrastructure. It's really a big program, and once again, it serves small and big universities.
The review process of this program has the same principles as in the other councils, with some difference because of the nature of the program. We have what we call a college of reviewers. It's a bank of 1,200 experts, the best we could find in various fields of research, who have generously agreed to assist in the deployment of these 2,000 research chairs. Excellence in research, wide experience, and sound judgment are the prime considerations in the selection of individual college members.
Each nomination is reviewed by three experts. If a consensus is reached, there will be a positive recommendation to the steering committee to award the chair. If there is no consensus, it goes to another committee composed of 15 members from all three sectors, who meet face to face two or three times a year in Ottawa and look at all the cases where there is no consensus. They make a recommendation to the steering committee.
The steering committee includes the presidents of the three granting agencies, NSERC, CIHR, and SSHRCC, the president of CFI, and the deputy minister of Industry Canada. There is no appeal system in this, but—
[Translation]
and that is important—if the nomination is rejected, both the university and the nominee can review the assessments that were made and may decide to submit the application again, at which point we will send the file to three new experts in order to have
[English]
a fresh look on this second review.
[Translation]
What is the situation after a year or a year and a half? The program actually started last September. It had been created back in May, but it took some time to organize and the first applications were received in September. So, after 14 months, we have granted 455 chairs. There were 527 submissions and 455 of them have been accepted.
So the rate of success after this first year is pretty high, 86%, and we're glad about that, because the universities knew it was very important to send their best candidates or to recruit the best candidates available in the world. They knew that excellence was the criterion on this program.
One thing we can see after the awarding of these 455 chairs is that 82% of the chairs are going to the retention of our best researchers. That is important. There is very strong competition for these outstanding researchers, and there are some in our Canadian universities.
We have received internal nominations from the universities. We expected that especially at the beginning of the program, it would be mainly internal recruitment. But when the program was established, there was a lot of discussion about the so-called poaching issue. Some were very much afraid that large universities would raid smaller universities to get their best researchers. That has not happened after 500 submissions. Only 6%, 29 nominations approved, involved a transfer from one Canadian university to another, and very often it was from a large university to a small university or between two medium-sized universities. So there was no raiding from large universities.
What is also very interesting is that 10% of the nominations, 47 out of 455, involve recruitment from outside Canada. It's approximately half: 50% expatriates coming back, 50% foreigners. That's very good. We hope that in the future we'll increase the number recruited from abroad.
The last point I want to mention is that the Canada research chairs program is committed to the following reporting requirement. A review of the operation and structure of the program will be conducted during the third year, a comprehensive evaluation in the fifth year. The objective of the third-year review is to identify potential adjustments that will improve the likelihood of obtaining desired outcomes. We've started this evaluation. It is a very good opportunity to be here with you, because we're very sensitive to your suggestions and what you might think about this program. In this evaluation we will look at a large number of issues, and it will probably be completed by September. If there are changes to the program, we will inform the universities, and we'll go on with this.
We cannot cover every issue in a ten-minute presentation, but I'll be pleased to answer your questions.
Thank you.
The Chair: Thank you very much, Dr. Durocher. Again, I apologize.
We're going to go back to Mr. Rajotte. Mr. Rajotte.
Mr. James Rajotte: Thank you, Madam Chair.
I'll just give you an opportunity, then, to address the first question I had, which was whether you have some collective definition of large and small universities and whether you know the percentage that goes to small versus large.
Dr. René Durocher: In the case of the chairs program, as I said, the definition of a small university is one that receives less than 1% of the funds given by the three granting councils. By the way, it's good for universities, because with that kind of definition, it means a good number of universities will receive some of these 120 special chairs.
Mr. James Rajotte: Under this definition, how many large universities are there in Canada?
Dr. René Durocher: There are all kinds of definitions—comprehensive universities, universities with a faculty of medicine, and so on—but by our definition, there are 34 small universities, and the rest are considered large universities. But there are other ways of looking at things.
Mr. Ned Ellis: For instance, at SSHRCC we don't use exactly the same definition. We're all dealing, of course, with our own faculties here, so it matters how large an institution is in your terms. For us a small university has less than 250 faculty in the social sciences and humanities, medium-sized is 250 to 500, and large is 500 and over. Using that definition, there are 12 universities in Canada that are large, 16 that are medium sized, and about 50 that are small.
Mr. James Rajotte: Do you know the proportion of grants given to each level?
Mr. Ned Ellis: Yes. Seventy-four per cent of our money goes to the top 15 institutions. That means that 26% of our money goes to the other 60. That's not, of course, dollars per faculty. The larger institutions have a lot larger faculties as well.
Ms. Elizabeth Boston: I could comment for NSERC. I think what you're hearing is that we don't have a common definition of what a small university is. At NSERC we have no definition of a small university. What we find is that universities may be in different situations. One may be small in respect of numbers, but have universities around it with which it can collaborate quite well, and that creates a critical mass in that area. In other places there may be a university that has quite a large faculty in the natural sciences and engineering, but is quite isolated geographically. We find there are some similarities in situation between those kinds of universities.
So we don't have a broad definition of small, large, and medium, and I don't have any numbers to give you. We can provide a breakdown of NSERC funding by universities, if you require it later on.
Mr. James Rajotte: Thank you.
Mr. Mark Bisby: At CIHR our situation is a little complicated, because we don't just provide funding to universities, we provide funding to hospitals and to hospital-based research institutions. In fact, some of the largest health research institutions in the country are hospitals, like the Hospital for Sick Children in Toronto, for example.
We've used the less than one per cent definition of a small university in respect of its health research capacity. Our definition of small university includes York University, for example, which in undergraduate enrolments is one of the largest in the country, but it's not strong in health research. Our large universities would generally be those that have medical schools, in the same group as the NSERC large universities and the SSHRCC large universities as well.
Mr. James Rajotte: The second issue is related, using the granting councils almost as vehicles for regional development. This must be very difficult to do, because not only must you try to select excellence, you must then try to ensure that regions are developed.
It seems NSERC has one sort of policy and some of the other agencies have other policies. I'm wondering if you could expand on why you choose one policy over another. Mr. Ellis, you had probably the most explicit criteria for trying to engage in regional development through your agency, but then, Ms. Boston, you talked about not having an explicit program, but actually sending out teams to research why these universities in certain regions are not doing as well as others. Perhaps you could explain why they are two differing approaches.
Mr. Ned Ellis: Maybe I can go first.
I didn't want to give the impression that our purpose in doing those things was actual regional distribution, but they do tend in fact to have a positive regional distribution. What we are after in all cases—and it's a mandate, what we're supposed to do—is research excellence. We're always looking for that excellence.
• 1615
That said, we probably, at one point at least, dealt with more universities than the other two large councils, though CIHR is fully engaged across the board now, and NSERC does the same thing. But our experience with small universities is longer. And the one thing I've noticed in doing visits to small universities is that while research excellence is what we're about, and it's extremely important for the country and moving ahead, it's not necessarily the only reason to have a university. I've heard people tell me, particularly students in some of these institutions that might not do so well in research, that this was the thing that kept them in the region where they were born or grew up, the fact that they can have a tertiary education there. The importance of that role in regional development, not just through the economic spinoffs, but in keeping some smart minds around, I think is also something that is really important to take into consideration. There's nothing wrong with being a teaching institution.
The Chair: Ms. Boston.
Ms. Elizabeth Boston: The issue of how funds are distributed regionally is a more recent one at NSERC, and we've just started to look into it in more detail, which was the reason for the fact-finding mission we conducted last year. NSERC is in the business of funding research excellence, and you often find that research excellence goes along with things like graduate student programs and very good research infrastructure in the universities. What we found was that in some areas of the country these things aren't as readily available as in other areas of the country.
It's not really NSERC's mandate to go out and create equality among all the different universities in the country, because we want to fund the best research where we find it. What we've identified is the need for all universities to be able to compete on a level playing field, if you like, and they need help. We've identified this issue, and what remains to be seen is how we can deal with that, whether we can work with the regional development agencies to really help the universities get up to similar standards in research infrastructure, which will allow them then to compete in national programs such as NSERC.
Mr. James Rajotte: Does that mean that when you talk about a relatively modest program, you talk about working with a regional base?
Ms. Elizabeth Boston: Yes, that's what I mean.
The Chair: Mr. Bisby.
Mr. Mark Bisby: At CIHR we think regional development is very important, because we believe that it's better for the health of Canadians to have research capabilities widely distributed in Canadian communities. When we've been looking at what some of the small universities have been proposing to us as their plans for these development grants that we've just started, I've been impressed by the extent to which those local universities that don't have medical schools are beginning to engage the local health professional community in their plans. That means that health professionals in those smaller communities are also going to begin to be engaged in research, where they haven't been before. If health professionals are engaged in research, they stand a better chance of knowing about the latest treatments, the latest approaches, and so on, so I think it's a very beneficial thing. That's why we've launched this development program, that's why we have our regional partnership program, which has been in existence for over five years now, that's why we've widened the eligibility rules, so that people not associated with universities can apply.
I didn't mention that we also have something called a university delegate program, and there are 35 universities currently involved in that. These university delegates are CIHR's representatives in each of the universities, and they're fully informed about CIHR's policy procedures. They are like branch plants, if you like, of the CIHR that operate in the various universities and can be focal points for the local research community to organize and learn about CIHR.
The Chair: Thank you very much.
Dr. Durocher, do you have anything to add to that?
Dr. René Durocher: Yes. I think the chairs program tries to build capacity for research in all the regions, all the provinces of the country, and all types of universities. For instance, you have a small university like Acadia in Nova Scotia, or the University of Northern British Columbia, or Université du Québec à Rimouski. To receive six or seven chairs means a lot for building research capacity in these small universities, because the top researchers who go there will be very stimulating for all the others, they will build a team around them, and so on.
• 1620
So we're trying to build research capacity everywhere in the country, in each of the regions. But of course, the chairs program cannot solve all the problems. There are in our country—and we know it very well—some rich provinces and some poor provinces. There are consequences in their university systems and in their research capacity. For instance, compare Saskatchewan with Alberta, and you'll see there's a difference. But we're trying to help Saskatchewan. We do, of course, distribute a lot of chairs in Alberta, because they deserve them—they've got good researchers and they're able to attract and retain top people. But we're helping Saskatchewan, we're helping Manitoba. It's more difficult for them, but once again, the other councils are also working to help them build their capacity.
The Chair: Thank you.
Thank you very much, Mr. Rajotte.
Ms. Torsney, please.
Ms. Paddy Torsney (Burlington, Lib.): Thank you.
Dr. Boston, it struck me, as I was listening to your presentation, that it all seems pretty well and great, but at the end of the day, the system supports the people who have been in the system, while young researchers may not be getting the same breaks. At some point the science could be terrific science, but is it really what we are supposed to be addressing? Is it really leading us to the things that we need to be looking at and preparing us for the future, five and ten years down the road? Is there not potential that while all these things are in place and it looks great, the outcomes could just be not what we are expecting as a country?
Dr. Elizabeth Boston: I think you've raised two issues. One is, how do we deal with new people coming into the system? The second is, is the knowledge being generated of real use to the country?
On the new applicants issue, we have a real challenge there. One of the really great things happening at the moment is that we're seeing a huge increase in the number of new people coming to request funding. These are in areas such as information and communications technologies. We see a very large growth in those areas, and I think that's a direct response to some of the needs we're seeing in the country at the moment.
We have always treated new applicants in a slightly different way than we treat the established researchers. We have guidelines in place for our grant selection committees, to ensure that every year they fund at least 50% of the new applicants applying to their particular committee. In that way we guarantee that there's some kind of regeneration in the research community. One of the problems we're facing right now is that the number of people coming into the system is much greater than the number of people who are leaving the system. We're seeing a lot of growth, and we're having difficulty coping with that at the moment.
But we are doing our best, and last year our success rate for new applicants coming in was about 70%. We actually had to move some money around between our programs in order to achieve this, because we didn't want to have the universities hiring very good new people with great new ideas, and then not be able to get them off to a really good start with their research programs. That's very discouraging, and not good for Canada either.
So we do have ways. In our criteria we put relatively less emphasis on the track record of the person, relatively more emphasis on the ideas and the research program that they're putting forward.
As to whether the knowledge being generated is of real use to the community, NSERC has a number of different programs. The one I've been talking about primarily is the one I'm involved in, the research grants or discovery grants, which fund basic research programs—curiosity-driven, if you like to call it that. What we feel is that by letting the researchers decide on the research they want to conduct and funding the best of it, we're giving Canada a real knowledge base from which to draw the innovations of the future. You need a very wide foundation of knowledge to be able to achieve new ideas that will result in products and processes and things for the new economy.
We have other programs at NSERC, which I'm sure you know about, like the research partnerships program, which aims to bring university researchers and industrial partners together, so that they can develop these ideas and really respond to the needs of the country. So we have a mix of programs, which allows us to really get useful results out, while maintaining this base of research.
Ms. Paddy Torsney: It's pretty amazing that you have 80,000 hours a year devoted to the process. I'm sure the other councils have the same kind of amazing volunteers. This little chart with all the different committees and the different researchers is very impressive.
Is there a forced renewal process in all the councils, so that we don't get the same people reviewing the same applications again?
Ms. Elizabeth Boston: Yes. It's a three-year term, and approximately one-third of the committee turns over every year. Given most grants run for four years, you're guaranteed to get a completely new committee when you come up for renewal. So we try to remove biases in the system that could occur like that.
Mr. Ned Ellis: I think it would be the same for all the councils, and maybe even a little more than one-third, because you get people going on a sabbatical, or perhaps switching institutions, or wanting to make an application themselves, in which case they disqualify themselves, of course. So it's about 40% actually, when you get all the changes in.
Dr. René Durocher: On the conservatism of the system you were mentioning, in the chairs program there's something very wonderful, because by definition, there will be 1,000 tier one chairs for senior people and 1,000 tier two chairs for junior people. It means that we're taking younger people, and sometimes we have submissions from people who have had their PhDs for two, three, four years—they're really young. We're taking a risk, but it's over five years. If they're good, they will be renewed, if they're not, that's all. It's really wonderful to have a program giving 1,000 chairs to younger people. Even talking about the older ones, these top researchers, at 50 or 55 years old they're not has-beens. They're still very innovative, and they have around them 10, 12, 15 people who are post-graduate students, so they keep young even at this age.
Ms. Paddy Torsney: That's right. We certainly wouldn't want to be the committee that slandered seniors.
My last question is specifically to Mr. Bisby. From time to time there are clearly emerging issues in our nation, especially in health care. There was AIDS 20 years ago, and autism seems to be very hot right now. There seem to be different times when suddenly someone is saying, oh my God, this is serious, we're seeing incredible numbers, whether it's foetal alcohol syndrome, or whatever else. How do you account for that? Are the researchers the leading edge, having already applied for this five years before we're even seeing it on the radar, or is there suddenly a push on certain areas? I can think back to my environment committee and the matter of pesticides, for instance, wondering what's going on there. How do you guys deal with that?
Mr. Mark Bisby: It was really for that reason that the 13 institutes of CIHR were created. As I mentioned, each one has an institute advisory board. There are about 15 to 20 people on each one of those boards, about half of them scientists, and obviously, we try to find the best scientists across the whole range of disciplines the institute's interested in. The other half are stakeholders of various kinds, people in government, people from industry, people from the voluntary health associations, like the Alzheimer Society. They try to figure out collectively what the emerging health issues are that they should encourage research in by providing a special funding opportunity. That's what the strategic part of our budget is about.
For example, there are two competitions we have in place at this moment. One is called “Financing Health Care in the Face of Changing Public Expectations”—very relevant—the other one is entitled “Neurodevelopment and Early Life Events”, and that alludes to two of the issues, foetal alcohol syndrome and autism, that you mentioned specifically. That's the way we're trying to respond to these emerging health issues.
We are beginning to get a team together to talk about bioterrorism, Canadian responses, what kind of resources we have in Canada to deal with that emerging threat to Canadians' health.
The Chair: Mr. Bergeron.
Mr. Stéphane Bergeron (Verchères—Les-Patriotes, BQ): Thank you, Madam Chair.
I believe that we all agree that one of the main reasons why we have undertaken this study and you are appearing before us today—and I do thank you for being here and for your presentations—is because we are trying to reconcile two principles that are hardly reconcilable as they are not based on the same realities. On the one hand, a number of colleagues have realized at one point that there were some regional disparities that they had a hard time understanding and, on the other hand, we have in place a granting system that is based on peer review, that is recognized internationally and that has been in place for a number of years.
I believe that it is Ms. Boston who, in her presentation, made reference to democracy. Sir Winston Churchill said that democracy is probably the least objectionable of all political systems. We can probably say as much in the case of the peer review granting system which, I must admit, reflects accurately the level of investments made in research, including by the provinces.
However, as has been shown in your presentations as well as in the questions that have been raised up to now, one of the results of that system is to put at a disadvantage the smaller universities, although you are using quite different definitions of what constitutes, according to you, a small university, especially the newer universities, the universities that are more regional in character.
I would like to ask a number of questions regarding this whole issue, and my first question is for Mr. Durocher. You mentioned a moment ago the case of some universities that are quite dynamic even though they may be rather modest in size, such as Acadia University.
However, if we look at a number of universities that are comparable in terms of size and the various levels of teaching, if you take for example Mount Allison University in Sackville, New Brunswick, Acadia University, in Nova Scotia, and Bishop's University, in Quebec, I observe that the first one has been granted five chairs, the second one, Acadia University, has had two chairs, and Bishop's University, one chair. For universities that are quite comparable in terms of size, in terms of levels of teaching, how can you explain such disparities in the number of chairs?
Dr. René Durocher: I fully understand your question, but I would like to make a small correction. I don't know where your figures are coming from, but Acadia will have seven chairs.
Mr. Stéphane Bergeron: How many? Four?
Dr. René Durocher: Seven, which is comparable to what Mount Allison has had. Bishop's had one chair.
Mr. Stéphane Bergeron: There you go.
Dr. René Durocher: And there are other cases. I can explain the case of Bishop's, for example. Bishop's claims to be a liberal arts college. There is no research tradition at Bishop's. A special chair has been granted to that university, but it has never put a major emphasis on research. It is an excellent small university that provides an excellent education.
There are other universities, such as Acadia, UNBC and the Université du Québec in Chicoutimi, that want to develop fully their research dimension, while still remaining small universities. There are different policies from one university to the other. Researchers from Bishop's do not make many applications for grants, while in the Université du Québec or in Acadia, researchers are encouraged to make applications for grants and to do some research in order to better prepare undergraduate students for graduate studies. And indeed, they are very well prepared. Students have perhaps a better chance of succeeding at the graduate and post-graduate level coming from Acadia than if they come out of Université de Montréal. Even smaller universities want to do some research. So there are distinctions to be made, even between smaller universities.
In fact, the whole university system is relatively complex. There are large urban universities such as York, UQAM and Simon Fraser that have 40,000 or 35,000 students. They are large universities. They have 35, 38, 32 chairs. It may seem outrageous, but it is not so. It is because in these large urban universities, there is no school of medicine, when 35% of the chairs are in medicine. So they don't have any. They have a few of them in health. They don't have any engineering school, when 40% of the chairs are in natural sciences and engineering, a great number of them being in engineering.
• 1635
So, if you compare the humanities and the social sciences at York and UQAM with those at McGill and the Université de Montréal, you will see that comparisons are not meaningful, because they are not the same type of universities.
You also have large universities such as the University of Manitoba and the University of Saskatchewan, but they are located in poor provinces. They have a medical school and an engineering faculty, but some provinces, be it Alberta, Quebec or Ontario, have invested a lot in research through provincial funding. Others have not been able to do so. It is not their fault; they did not have the resources to do so.
That has created all sorts of inequalities, but one thing is sure—I go to all universities; there is only one province which I have not visited yet—we do need some diversity within our university system. More importantly, what we need is that all our universities, from the smallest to the largest, be excellent and give an excellent education to people.
Mr. Stéphane Bergeron: Basically, in a general way, I rather tend to agree with what you are describing. As I was saying earlier, the system that is presently in place reflects the level of investment that was made, including and especially by the provinces. That being said, when you compare cases such as Mount Allison University, Acadia and Bishop's, the answer that you gave appears to me at first blush to be quite plausible, quite acceptable, but it raises another question in my mind. It is somewhat like the chicken and the egg situation.
Are universities interested in investing in research when they already have an infrastructure and the accompanying funding, while another university, whose research infrastructure is rather poor, will be less interested in promoting this kind of work among its faculty? The perverse effect of that situation is that there is a risk that professors who are teaching in that university and who are interested in doing some research may leave and go teach elsewhere, in another university. This exacerbates the problem. It becomes a vicious circle for these universities.
Dr. René Durocher: You are quite right. Mr. Brzustowski, the Chair of NSERC, who visited the Atlantic Provinces, Mr. Renaud, the Chair of SSHRC, and myself have all arrived at the same conclusion, namely that there is an aspiration to development and excellence, but that universities do not always have the means to pursue that goal. So the granting councils and the federal and provincial governments must build that research capacity in all universities. However, some of them want to keep their liberal arts college tradition; they do not want to focus on research. They represent a small minority. In my view, most universities want to go into research. Professors in smaller universities can be just as good, just as bright as those who teach in larger universities, except that they must teach six hours more a week, they must give more supervision to their students, and they don't have any teaching assistants as they do in larger universities. They are not being given time to do research. Obviously, for them, it is difficult to do any research, but the federal government cannot subsidize teaching and undergraduate studies. It is up to the provinces to do so.
I believe that the situation is not desperate because, contrary to the situation in the United States, there is not an enormous gap between the very large universities and some very bad universities. In Canada, all universities are good, but there are obviously some that are better in some areas. In different areas, the other universities are better.
[English]
The Chair: Thank you. Merci, Monsieur Bergeron.
Mr. Bagnell, please.
Mr. Larry Bagnell (Yukon, Lib.): I'm from north of 60 in the Yukon, and there are no universities north of 60. I'd be interested in how you adjust your rates for that half of the country where there are no universities, but that's not my main question.
I used to work at Industry Canada, but I've been away a while, so I'm not sure exactly how you work. But I have no problem, I think the peer system is great for getting excellence, and I appreciate all the efforts you've made regionally and to get young people in it.
I only have one question, and it will be short, because Paddy sort of asked it, but someone may want to add to it. Because you have such a huge proportion of Canada's research money, how is the connection made with the desires of the taxpayers and the research they think should be done? You've each got a huge area where you could do research. Of course, people's needs change, the needs of these taxpayers, whether it's reflected by government departments, by us as their elected representatives, by the government, or by the NGOs, which one of you mentioned.
• 1640
For instance, I mentioned last night at 1:30 in the House that we should be doing more research in social sciences and humanities on the root causes of terrorism. On the science side, at one of the meetings last weekend a non-flammable jet fuel was mentioned, which wasn't a priority before September 11.
So my question concerns how you direct the taxpayers' money towards the topics within your area that they think are priorities for research to be done on.
Mr. Mark Bisby: I've already mentioned our institute advisory boards, which are a key way that CIHR wants to be responsive to what Canadians think are important issues. In addition to having—though I don't know any ordinary Canadians—ordinary Canadians on those boards, we also have representatives of the voluntary health agencies, like the Heart and Stroke Foundation. We also have a number of co-funded partnerships with those voluntary agencies, where we decide what the objectives of a particular initiative are in collaboration. I think that's one very good way, because Canadians do vote, in that sense, with their dollars, the money they contribute to the Canadian Cancer Society, the Heart and Stroke Foundation, the Arthritis Society, the Canadian Diabetes Association, and so on.
I think by forming those partnerships, we are beginning to address those issues, but the fundamental thing is that in the decision-making process in CIHR we involve the broadest cross-section we can of the Canadian public.
The Chair: Mr. Ellis.
Mr. Ned Ellis: Thank you very much.
I think there are two things that are important here. As Mark mentioned earlier, the councils tend to have two kinds of programs, one of which is what we call researcher-driven, which is where the researchers make proposals on basically anything they want within disciplines, while the other councils tend to have something that is more directed or strategic, where the councils define the areas. In that context, we do a survey every three or four years. It's not an immediately responsive kind of thing, and I think that's one of the problems really, responsiveness to timely events, like a September 11 kind of thing. We survey government departments, provincial departments, universities, voluntary organizations for their ideas as to what we should be focusing on in the strategic areas. And we follow through on that, the council makes choices on that.
The other thing we do is with our programming structures. So there are community-university research alliances. It was really quite a change for the university community, but the community organization really had to be involved with the research right from the beginning, and the committee would not even look at an application unless they knew that the community organization was really in charge of defining the research questions. That led to some different kinds of research, believe me, and some questions where people would say, well, that might not be the most sophisticated research technique, but that's what the client wanted and that's what the client got.
The other thing we do similarly to all the councils is with this partnership thing and voluntary organizations. Indeed, government departments, with all the warts and bumps that are attendant thereon, are, we hope, responsive to the needs of the people as well, and we do a lot of partnerships with them.
The Chair: Ms. Boston.
Ms. Elizabeth Boston: There are a number of ways at NSERC by which we reflect the priorities of the Canadian people. NSERC works with a number of partners, not only on the social aspects, but also on the economic aspects. Many of our partners are industry or industry-based organizations.
In our partnership programs we have something called the strategic projects program, which does identify priority areas, such as environmental technologies, information technologies, on which we will request applications from the community. Those areas are reviewed every three to four years. But that's not one of the most responsive ways of doing things, and I think that's something we've become more and more aware of in recent years.
• 1645
We are proposing to introduce something called NSERC innovation platforms, which you may have heard of. It's not something we've been able to fund yet, but we're going to try to partner with similar kinds of organizations that have identified the need to accelerate research in a particular area, nanotechnology, photonics, or some of these areas that are really important for the future economy of the country. We hope that by partnering with these organizations, we can accelerate the rate of research and train more people in these areas, because we will be responding to industry's need and the country's need to generate more research, and therefore more innovation, in that area by establishing these platforms. So this is something that's new for NSERC, and we're hoping to launch one in nanotechnology quite soon. There's a line-up of other areas, such as greenhouse gases, bioinformatics, quantum computing, that we think are very important to the future of the country.
The other program, which nobody has mentioned, is the Networks of Centres of Excellence, which is a tri-council plus Industry Canada program. That is a program through which the government does set research priorities. Applications are requested in certain strategic areas, and then research is funded in those. A recent example, which was quite responsive and timely, was one we funded on water quality and safety.
So I think there are a number of mechanisms that we all have.
The Chair: Thank you very much, Mr. Bagnell.
Mr. Strahl, please.
Mr. Chuck Strahl (Fraser Valley, PC/DR): I have a couple of things, again following along similar lines.
I've heard it said that the one consistent finding in reviews of the peer review system is that it does take an awful lot of the political interference out of the system, which is probably pretty good, because none of us knows anything about nanotechnology, so it's just as well somebody else is making the judgment calls on that. On the other hand, as has been mentioned, there are some times when there's a judgment call. The industry minister, for example, says he wants to spend a couple of billion dollars on connecting the universe to the broadband Internet, so there's a political decision, because he thinks that's in the best interests of the country.
Do you experience political interference, or is it just that you're given a mandate and then you're left alone? Do you get influence from on high?
Some hon. members: Oh, oh.
Dr. René Durocher: No, I can say frankly that there's no interference in the process. Of course, discussing it with you is a political aspect, and that's right in a democracy: you're elected, you're responsible, we're accountable. So that's a kind of political discussion. But inside the process, certainly not. The government was pretty much involved in defining the chairs program. It wanted to target something, it consulted, it did all kinds of things, but now that the program's in place, there's no interference whatsoever—the program would lose its credibility.
Mr. Chuck Strahl: Yes, I agree with that. And while you're answering that, if in our community we see a need, whether it be the health industry or something else, is there a place for us to intercede?
Dr. René Durocher: Yes, sure.
Mr. Chuck Strahl: I don't want to interfere, because I don't want to force you to make a bad decision, and excellence should be the point. On the other hand, we might want to drop a line to somebody, and we'd like to know who. I have 17 aboriginal communities, for example, in my riding. If I see a growing need, or they do, they probably will get to you, but perhaps I could pass that along as well, just as a point of interest.
Mr. Ned Ellis: That's a tremendous idea, quite frankly.
I'll answer your question in order, and then I'll come to the third part.
The presidents tend to do a lot of consulting around town, and they talk to a lot of members of Parliament of all stripes and a lot of ministers. The general case of a minister or a member of Parliament perhaps talking to Marc Renaud and saying, this is of interest to me—and he always makes sure he asks, because he's very sensitive that way—has an effect. That does get passed on and discussed at council.
• 1650
As for interference on individual cases, I've only been there for three years, but they all go through my hands, and I haven't had a sniff of that.
Mr. Chuck Strahl: Thank you. That's good to know. Nowadays nobody wants to get accused—any of us, or the minister—of improperly interfering, but it is nice to know that we can still make our community concerns known to somebody and that's not considered a cardinal sin.
The other question I had was this. I'm not sure of the mandates of private universities compared with public universities. I don't know how many private universities there are, and I don't know if any of them are involved in research, but when you allocate funds, does that have an impact on how you do it? Say it's a private facility, as opposed to a publicly funded university, can anyone apply, or is it only public universities?
Dr. René Durocher: In the chairs program all universities were considered, and the criterion was research.
Mr. Chuck Strahl: Okay. That's the same?
Mr. Mark Bisby: Generally speaking, the issue of public versus private universities isn't a factor in Canada. There may be one starting up in Toronto, but I think that's it. All the rest are public.
Mr. Chuck Strahl: There's Trinity Western out in B.C., but it's not primarily a research facility.
Mr. Mark Bisby: No, that's right.
Our terms and conditions do allow funding of people who are employed outside the university setting, but they have to be associated with the not-for-profit sector. For example, a researcher from a pharmaceutical company couldn't apply to us for funding. We have to make sure there's financial accountability, ethical accountability, and so on before we'll accept an institution as eligible for funding that's not a university.
The Vice-Chair (Mr. Walt Lastewka): Dr. Boston, do you have a comment?
Ms. Elizabeth Boston: It's very similar for NSERC. We have eligibility rules for an institution to receive funds from us. I don't think there are any private universities involved in our system. We've just opened our eligibility criteria to include colleges under certain circumstances as well. The main criterion for receiving NSERC funding is that the institution has to have a research mandate and that they're willing to support their researchers by providing the facilities they need to be successful. There are a number of other criteria that they have to fulfil.
Mr. Chuck Strahl: The last question I had was touched on. Somebody mentioned 10,000 international experts who are brought to bear on some of these cases. That's probably wise, again, because you're trying to get that research excellence. On the other hand, there may be something that is more Canadian-based in interest. In other words, you might say, internationally this may not be a big deal, but in Canada it could be a really big deal. You send it out for perusal or peer review, and somebody says, well, we don't need a lot of research on that, there's plenty of it down here in Boston. We might want to say, yes, that's the trouble, there's way too much in Boston, we need a whole lot more here in Toronto, or someplace.
So how do you make sure your international experts don't say, that's just not of interest to me, and soft-pedal it?
Ms. Elizabeth Boston: In the NSERC system we have two ways of getting input. One is from all these international people. We call those external reviews, because they're generally written reviews that then go to a committee, which makes the final recommendation. The committees that receive all this input and receive the applications are primarily Canadian. There may be one or two international members on those committees. So the input tends to get filtered through a Canadian viewpoint. They may receive recommendations from international people who say, oh well, there's tons of this going on in Amsterdam, or somewhere like that, but generally there's enough knowledge around the table that it can be filtered out, so they can look at it from a strategic Canadian point of view and make the appropriate recommendation.
Mr. Chuck Strahl: I guess that'd be important, as well. Everybody feels the same? Okay, that's good. Thank you.
The Vice-Chair (Mr. Walt Lastewka): Thank you, Mr. Strahl.
Mr. St. Denis.
Mr. Brent St. Denis (Algoma—Manitoulin, Lib.): Thank you, Mr. Chairman. Thank you for being here.
• 1655
No doubt we as a country, you as front-line practitioners in delivering tax dollars through our education institutions for the benefit of the country and pure and applied science, must at times compare ourselves to other countries. Within, say, the OECD nations, is there, at the international level, any periodic comparison made, either by the OECD itself or by another body, of how our system compares to other countries? Are we ranked in relation to other countries by any method? Is there a Maclean's evaluation of the way we spend taxpayers' dollars?
We're moving in the right direction, I'm sure, and it gets better every year, no doubt, but is there a way for there to be an outside objective comparison of how we spend our dollars and get a bang for that buck with how other countries do it?
Mr. Mark Bisby: I'll have a go at that one. It's not simple.
There are OECD league tables, yes, which express how countries are doing in respect of the proportion of their gross national product they invest in research and development, the proportion of government funding that contributes to that, and so on. I think Canada is about fifteenth on that table at the moment.
The issue, though, is one of productivity. It's not just investment, as you say, it's bang for the buck. And that isn't really, as far as I'm aware, done consistently or regularly in health research, which is the only area I can talk about. There was a survey done by an English academic about three or four years ago now looking at the bang for the buck, the productivity, research publications per dollar invested. Canadian researchers are number one. That's partly because our grants are so small, so I don't want to push one side of that equation too hard. But in fact, they were very productive researchers in the way they used the public funds invested.
Ms. Elizabeth Boston: Maybe a less quantitative, more qualitative answer is that every four years NSERC conducts a reallocation exercise, where we try to look at the balance between the funding of all the disciplines with NSERC funds. Part of that process is a submission from each discipline saying why they would need more money to do more of the things they want to do, to produce more excellent research. Those submissions are sent out to international reviewers. The last time we did this, the main message we received from the international reviewers, very high quality people, was that Canada was really extremely efficient in how it spends its money, so that the bang for the buck we get is extremely high in quality, and in quantity, I would think, as well. I'm sure there are league tables and all kinds of things for that, but I don't have that information at my fingertips right now.
The Chair: Mr. Ellis.
Mr. Ned Ellis: Briefly, the biggest problem we have in measuring output is deciding what to measure. You can certainly measure numbers of publications, which are valid within the academic world, but for instance, if CIHR develops a new cure, it may be 30 years before you really know the value of that. In the case of SSHRCC, when a discovery is made that results in a change in government policy, it may be some time before anybody sees the value. Nobody will ever measure it, and nobody will ever know how much came from that discovery and how much came from somebody's gut reaction to something they heard on the street. These kinds of things are really difficult to measure.
We don't do much travelling, but a lot of people come to see us, and when they come to see us—I don't know if they're telling us the truth—they universally seem very impressed with the system we have here in Canada. Generally, they're here with a very earnest desire to learn more about how we do it. I think that's a good indicator.
Dr. René Durocher: And each country has its own profile. For instance, in the matter of the chairs, in the U.K. they decided to establish 50 very prestigious chairs, and here we went for 2,000. In joking, I could say it's almost un-Canadian. My God, 2,000! The Brits have 50. I was in the States talking with people, and I realized why they don't have a U.S. chairs program—because they don't need it. They receive a lot of money for indirect costs of research, and they have large endowments. So when they negotiate with the top researchers, they can offer them $150,000 U.S. a year, plus $500,000 or $600,000 for equipment. They don't need a program, but we do need to compete with them. So each country has its profile.
As for policy, from talking with French people, with Belgians, and with many kinds of people, I think they consider Canada very innovative in policies. And as for productivity, if I'm not wrong—I should check—I think that Canadians produce 4% of scientific literature. For a small country like Canada it's really impressive. We're not that good with patents per capita, but there are indicators, and that's a thing we should develop more in our universities. There are people working on this.
The Vice-Chair (Mr. Walt Lastewka): Thank you, Mr. St. Denis.
We'll now go to Mr. Penson.
Mr. Charlie Penson (Peace River, Canadian Alliance): Thank you.
Given that there are pretty precious dollars available for research, what would be the reason that we have to cycle some of that money through regional development agencies, and then back to your organizations? Wouldn't it be better just to have that money flow directly to your organizations, without the bureaucracy it has to go through in the regional development agencies? Wouldn't that serve you better?
Dr. René Durocher: That's a political question.
The Vice-Chair (Mr. Walt Lastewka): Are you talking about a budget?
Mr. Ned Ellis: It's probably easiest for me to take a first swing while my colleagues think of their own answers.
There really isn't a lot of money that SSHRCC deals with that goes through that kind of route. It tends to be more on the equipment and buildings and facilities side, and that's not something we're involved with a lot.
Mr. Charlie Penson: I think you did suggest that money started flowing from other departments.
Mr. Ned Ellis: Yes, most definitely.
Mr. Charlie Penson: It's really the same question, I guess.
Mr. Ned Ellis: Okay.
Mr. Charlie Penson: Rather than have each department go through it, why isn't it just dedicated to your agencies to begin with?
Mr. Ned Ellis: No department, of course, is obliged to deal with us at all. They tend to have their own research efforts and their own research shops. Some of them choose to partner with us as a more effective way of getting their research done, because they like the evenness of the peer review system and the fact that we get in contact with everybody, not just perhaps their small stable of researchers. Certainly, for them there are advantages in our having a machinery that works. For us there are advantages, because we know that this is a priority, and to the extent that those priorities have been worked through the system and come out through that department, we feel we're responding to a real need.
Mr. Charlie Penson: That raises the next question. When you're putting projects out for peer review, you're trying to get the best value in research, it seems to me. But we put in a lot of qualifications. There has to be a regional aspect, it has to be a certain size of university, certain money is dedicated to that. Doesn't that tend to lose some of the value in trying to get good research? Doesn't it negate the peer review process, in that they're looking for the best value for money spent in respect of research?
Mr. Ned Ellis: One small point of precision is that it's really excellence, rather than value for money, and that's an important distinction.
Mr. Charlie Penson: What is that distinction?
Mr. Ned Ellis: It's the best possible proposals that are chosen.
Mr. Charlie Penson: But you said it's not value for money, it's excellence. Explain what that means.
Mr. Ned Ellis: The best possible proposals are chosen within a budget. They put in a proposal and a budget, and the best possible proposals are the ones that are selected.
Mr. Charlie Penson: How do they evaluate that?
Mr. Ned Ellis: They evaluate that according to the track record of the person proposing and the quality of the proposal that's being made.
Mr. Charlie Penson: Is that the same in all cases?
Mr. Ned Ellis: Yes.
As for the other factors, which are important—I think I was probably the one who made most reference to this—we try to make sure there is the best possible chance for everyone to have an equal shot at that excellence. So to the extent that we can within our own means, we try to make sure there are people from all the regions on the committees—and indeed, we do that more rather than less—the same for people on these committees from small universities, the same for French and English, and the same for women and men. So we try to give the best possible chance for excellence to come out from all regions and from all sizes of universities.
Mr. Charlie Penson: But is that realistic?
Mr. Ned Ellis: It's working pretty well.
The Chair: Okay?
Mr. Charlie Penson: Yes. I don't necessarily agree, but I understand what he's saying.
The Chair: Thanks very much, Mr. Penson.
Mr. Volpe, please.
[Translation]
Mr. Joseph Volpe (Eglinton—Lawrence, Lib.): Thank you, Madam Chair.
I felt like being somewhat the Devil's advocate. Obviously, some other members have already begun to play that role.
[English]
but they've been very diplomatic. So I wonder if you'd just allow me a couple of observations, and then maybe you can address them.
Let me start off, without making you feel uncomfortable, by saying that a few years ago, when people were sounding out the reaction of parliamentarians about whether we should give more money to research and to projects that would foster excellence, I was happy to add my name to the list. However, given the nature of my job, I hate to do it without having made some observations along the way.
I'm going to be something of a devil's advocate, because one of the issues that was always raised, and the one we're supposed to be studying, is, of course, the peer review. The reason peer review is such an important component of research, as at least I used to be told, is because scientists, whether they're medical scientists, social scientists, or some other sort, are by nature skeptics, and they don't like each other. So what they're going to do is be watchdogs, so that the public's money is appropriately spent and nobody is going to be able to put forward a project that will escape the scrutiny of the most hostile individual in the world. I'd like your comment on that, because I'm wondering who is going to watch the watchdog.
Second, keeping in mind that we're trying to understand a concept of peer review for the purposes of our study—and I'll be provocative in my language—your response has been very political and danced around the question of value for dollars. All these moneys that have been attributed to your institutions over the course of the last several years and for many more years to come have been an investment by the Canadian people. And the investment, as I hear you say in your responses, has been for excellence, but excellence for its own sake. All the academics, I'm sure, around the world would agree with you, Dr. Durocher, this is a great academic's place. It's where excellence is supposed to thrive, and so we get moneys to try to achieve excellence. But I think Canadians also have a different definition of value. Value equals the output. What's the outcome? What do we get back?
So there are two components to the question. If you actually discover something that's commercially viable, do you get a piece of the action? If not, in heaven's name, why not? Second, can you point to any such advantages?
I guess I'm a bit like an American senator who a couple of years ago—this was when I was on the health committee discussing this sort of thing—viewed with some skepticism a request for additional funds. As you know, of course, the Americans spend a lot more money per capita and as a lump sum than we do on this. And he said, you know, I don't understand why we have to give you more money; we've been giving just the cancer component of the NIH $2 billion a year since the early 1970s, when Nixon declared war on cancer, and the only thing we've discovered so far is that early diagnosis helps, and everything else is probably attributable to adjuvant therapy. Maybe he was being a little too cynical, but you get my drift.
• 1710
I guess the third thing is one that was first raised by Mr. Strahl. Where is there a role, if there is one, for the political process to say, listen, we have to direct our studies, our energies somewhere? We've got all these brilliant brains working. They're developing excellence. Where do you finally get all this adrenalin? Where's the rush going to go? I hear you saying that you have advisory committees that will give you a sense of where the public wants to go. That's a very polite way for a scientist to tell a politician, it's not the politician's or the political system's responsibility to direct scientific research, i.e., give us the money and bug off.
That having been said, over to you.
Mr. Mark Bisby: Lovely. I don't know where to start. I won't try to address them all.
The issue of the war on cancer is interesting, because cancer rates are falling now, and early diagnosis is a very important part of it. It's partly due to better therapies, and that's a definite by-product of research. The war on cancer took longer, and it revealed that cancer was a much more complicated disease than people knew before they went into it. But there are some gains being achieved there.
Who watches the watchdogs? I'll try to address that one and leave some of the other, more difficult ones to my colleagues. At CIHR we have an oversight committee that is composed of other scientists and a couple of members of the public, and their job is to look at the peer review system, look how it's working, look at the kinds of recommendations and decisions that are coming out of it. In addition, our council, constituted by Governor in Council appointments, again represents a wide range of Canadians. It's not just a bunch of scientists sitting around the table. Our council has the last word in what gets funded, how much is devoted to each program, and so on. So there are at least two layers within the organization that watch what the peer review system is up to.
The Chair: Ms. Boston.
Ms. Elizabeth Boston: I'll have a stab at all three, but they may be partial answers.
First, like CIHR, NSERC does have a multi-stakeholder council, which is, if you like, our board of directors, that carefully monitors everything we do. We also have checks and balances during the peer review process. For example, we have members of the community who come and watch our committees in action to make sure things are happening fairly and there aren't inequities in the system.
We also regularly evaluate all our funding programs as well, to make sure they are meeting their objectives. In fact, we have a major evaluation of our biggest program, the research grants program, under way at the moment, where we're going out to a wide range of stakeholders, not just the researchers themselves, but a broad range of people, to get their input on whether this program is doing what it really should be doing.
As for value for money, bang for the buck, if you like, are we getting what we want out of our investment? I think that was your question.
Mr. Joseph Volpe: I want to see whether the hockey players ever score a goal or whether they spend all their time at practice.
Ms. Elizabeth Boston: I think there are certainly two things that come out of NSERC funding. One is the knowledge we need in order to draw out new ideas. Because NSERC devotes a large portion of its budget to basic research, a lot of those developments take a long time to evolve. But we do have evidence of small to medium-sized spinoff companies being formed out of universities. We have our partnerships programs that enable university researchers to get together with industry to develop their ideas further, if there is potential.
But the other, and maybe the more important, outcome of NSERC funding is the highly trained people we produce. These are the people who go on to work in universities, in industry, and in government. And they're the people who will have ideas of the future. They will be the people who create new companies and other real innovators. I think it's been identified that in order for Canada to move ahead of other countries in this area, it's the people and their skills that we really need. A huge portion of the NSERC budget goes into supporting masters and graduate students in their advanced training. That is one of the real outcomes, and it's quite a measurable outcome as well.
I think I'll stop there and let others carry on.
The Chair: Mr. Ellis.
Mr. Ned Ellis: It's quite a devil's advocate sort of question.
In general, this commercialization business is something fairly new to SSHRCC. We don't have a lot of experience with it. Marc Renaud is famous for saying that when he's in a room with Tom Brzustowski and Alan Bernstein, they each have something physical to hold up to show what the money has produced, and we don't. So it is a little harder with the commercialization. But we are increasingly seeing potential for that, particularly in the arts and new technology areas, which seem to be growing quite a bit. A lot of research, of course, is being done in those areas. So that's something we're going to have to pay attention to. We haven't up until now.
On the issue of who watches the selectors, we do quite a bit on that front. We have council members, of course, who are other academics who watch. We also have staff, and staff are really neutral in this process. We are not captives of our community. We're the ones who know that five years from now we'll still be there defending that process, so we're pretty strong on it.
I can't speak for the other disciplines, but social scientists and humanists are famous for eating their own young. These committees are really tough. This is not a group of people gladly clapping each other on the back. I've seen some committees fight for 10 minutes over whether a computer really costs $2,000 or whether you couldn't get one for $1,495 if you went to Future Shop. They're really tough on each other, they're not easy at all. Whenever we have Networks of Centres of Excellence, our people seem to me to be tougher on their own kind than pure scientists are on theirs. That could be a reflection of the fact that our average success rate is 30%, not 68%, so the choices are a little bit tougher. But they do a pretty good job of policing themselves, at least in our domain.
We would welcome direction and input from all members of Parliament, and certainly from this group. We're quite happy to have it. It's as simple as a letter to the president of SSHRCC, quite frankly. We would really appreciate that.
The Chair: Thank you.
Dr. Durocher, did you want to add anything?
Dr. René Durocher: I mentioned that the steering committee is composed of the three presidents of the granting agencies, the president of CFI, and the deputy minister of Industry Canada. So I am watched very closely.
The Chair: Okay.
Thank you very much, Mr. Volpe.
Ms. Torsney, you had one last question?
Ms. Paddy Torsney: I'd like to take the issue of political interference from the flip side.
In the 1993 campaign and between 1993 and 1997 we had all kinds of horrifying examples of wasteful spending of government money in various research initiatives: how dare we spend money on blueberry studies in Atlantic Canada, research into the artistic benefits of feminism, or whatever it was? I would argue that it's political interference to make people scared to grant money to things that perhaps look unpopular at the time. I wonder how you make sure we are funding things that are interesting and relevant and could lead to good research, even when opposition parties in particular mock the very grants you give and you know you're under incredible scrutiny. How do you still manage to fund things that are important and relevant?
The Chair: Mr. Ellis.
Mr. Ned Ellis: I think it's true of all the councils that we tend to have two streams of programs. The one is more directed, the strategic type of stuff we've been talking about quite a bit here. On the other hand, we have what we call at SSHRCC standard research grants—I think at NSERC they call them just research grants. These are the topics that are chosen by the researchers. So they will make a proposal on what interests them. Then committees choose from those on the basis of merit.
Actually, we're the ones, SSHRCC, who every year have three or four really embarrassing examples that come through. They tend to be publicly embarrassing, but they're actually quite defensible in their own right. The title, or whatever, will throw people off and become a cause célèbre. We're quite proud of it. You do need both.
Ms. Paddy Torsney: You absolutely do.
Mr. Mark Bisby: I think the peer review actually helps us in that process. In other words, it's not Ned's, or my, or somebody else's bright idea that gets funded. It's been through a committee of experts. There are reasons that can be advanced as to why this is being funded. The committees have been able to point those out. So I think that actually helps us.
• 1720
To follow up on something Ned Ellis said, I wouldn't want to characterize the strategic funding we do as being the applied research, and the investigator-initiated or curiosity-oriented research as not being applied. Two grants we've given recently, one to do with how you treat ankle injuries—it doesn't sound very exciting—the other to do with what kind of pacemaker works better, have the potential to save the health care system, if people take up the information contained in these studies, respectively, $15 million and $50 million a year in costs. So even areas where people's curiosity leads them have huge applied benefits as well.
The Chair: Dr. Boston.
Ms. Elizabeth Boston: We also have the research grants program, which funds basic research. It really is a “from the bottom up” process, but because we're the Natural Sciences and Engineering Research Council, there is a lot of incredibly relevant, but very basic engineering research being done that could be of relatively short-term benefit, balanced against some of the more pure sciences. So we see a nice balance in that program, and I don't think there are really any concerns about the dangers of funding something that may not look very popular.
In fact, some of the feedback we got in the mid-1990s about publicizing grant titles and having people publish those in the newspapers made us also realize that we have to do a better job of explaining to the Canadian public some of the reasons we make these decisions. We ask researchers to write a summary of their research that explains in very basic and easily understood terms the research they're conducting, so that we can explain to people why this is important, even if the title is completely meaningless to them.
Ms. Paddy Torsney: Just for the record, I think if you didn't occasionally fund things that looked controversial, you wouldn't be funding good investigative research in this country.
The Chair: I wanted to add a couple of things before we close.
Dr. Durocher, you made an interesting comment earlier, in response to Mr. Bergeron's question, about how the gap in Canada was not as wide as the gap in the United States. I have to tell you that I agree with you, and that's a good thing. I'm a little worried that we are heading in that direction through the way the CFI is providing funds for research and creating research facilities in Canada. On that point, I've received correspondence from people who actually think it would be a good thing if Canada were to have, as an example, five really renowned research facilities, and too bad for the rest. That's not exactly what the letter says, but it's pretty close. So I'm a bit concerned that there is some movement within Canada—I'm not sure where it's coming from—to direct our research in that way, so that instead of avoiding the gap we see in the United States, we will come to have that gap ourselves. I'm not sure what that means in the end for your program or for research.
Dr. René Durocher: At least with the Canada research chairs program, I think every university is benefiting. Even if you receive only six chairs, you're benefiting greatly from this program. Does this widen the gap between large and smaller universities? I don't think so. What we want is for all our universities, small and large, to become better, to increase their capacity in research, and to have better researchers, top researchers. But there are one million people in Saskatchewan and eleven million people in Ontario, so you won't have equality there. It's difficult.
There's a gap between provinces, between institutions. We're not trying to widen this gap, but it's there. It existed without the chairs, and it will exist after the chairs. So I hope the chairs are not aggravating the situation, but the program helps everybody, every institution. Small institutions that receive 15 chairs, like Windsor, for instance, just because I have it in mind—
The Chair: It's not enough.
Mr. Joseph Volpe: Too much.
Dr. René Durocher: It's important, because one tier one chair is worth $1.4 million. To receive $200,000 a year would take an endowment of $4 million. So for a small university, getting 10 or 15 chairs is like being given an endowment of $x million and being told, okay, you can take $200,000 a year for each of the chairs. It's enormous. But of course, some have more chairs than others—that's life.
The Chair: I understand that.
Mr. Ellis.
Mr. Ned Ellis: Very quickly, the job is so huge, the challenge is so large that as far as we're concerned, everybody has to put their shoulder to the wheel. I personally believe that the variety is extremely important and that we have to continue to have variety. Just as in every other place where we have variety, it leads to creation, and creation is what this is all about. So it's really important that there be a wide range of these institutions.
Are they getting worse? I don't think so. My own personal sentiment is that the disparity is, as René said, maybe not as bad as some people feel, and certainly not as bad as in the States. I don't think it's going in a worse direction.
The Chair: When it comes to the peer review process, I understand that people volunteer their time and they spend a lot of hours on this, but as much as they are probably tired from it, there's also the application fatigue that goes with writing all the applications, sometimes being disappointed.
You can correct me if I'm wrong on this, but do the peer review panels make the actual final decision when it comes to the granting councils? What I hear with regard to the CFI is that after the peer review, the CFI board sits down and makes the final decision. Then you end up with this potential conflict again. You say you can't sit on the peer review panel if you've got an application in there, but somehow you seem to be able to sit on this board, and that's okay. I have difficulty with that.
Mr. Ned Ellis: Everyone probably operates slightly differently. At SSHRCC the process depends on the size of the awards being made. Below a certain amount—at SSHRCC that's $500,000 per project, so it's a very large amount—the committee is technically making a recommendation to council, and the council delegates to the president, Marc Renaud in our case, the ability to accept or reject the committee recommendations. In practice, he does not reject or change committee recommendations; they are accepted. The power is there, but it's not exercised.
The Chair: But when you're not accepting all of them, when you're making recommendations, and you're choosing at the board level because there are so many good applications, it has the potential to look as if there's a conflict.
Mr. Ned Ellis: At SSHRCC, at any rate, that conflict doesn't exist, because the review committee will make recommendations. They know what the budget is for that competition, and they'll make their recommendation in this way.
The Chair: Is that how NSERC and CIHR work?
Mr. Mark Bisby: Yes. Our board members know only what ratings applications received; they do not know which rating is attached to a particular application. So they're completely blind on that point, and they make aggregate decisions rather than decisions on individual applications.
Dr. René Durocher: And the chairs program is very different, because we judge the merit of one individual. If a nominee is refused, the university still keeps the chair and may propose somebody else. It's not a matter of money; it's simply the merit of the nominees.
The Chair: There is one last comment I have—the bells are ringing, so we have to go and vote.
We didn't have a chance to get into this fully, but I have heard several times over the past few years that there is almost a hierarchy amongst different levels of scientists or different types of scientists, and that engineers often feel they're way down in the pecking order of how things are approved. I'm just going to leave that with you, and if you have any comments, you can get back to me.
I will also tell you that coincidentally, I serve on a task force, and we were out visiting one of the veterinary colleges the week before September 11. They were talking about how they've been turned down through the granting councils on a number of occasions, even though they fit within the mandate, I think, of NSERC. Yet they have the facilities to look at disease and biological warfare—this was the discussion the week before September 11. I just raise that because there seem to be groups out there that fall through the cracks of the three granting councils, and I don't know how we better address that. If it doesn't happen through peer review, I'm not really sure how they can be captured, because if there's no one with an animal science background sitting on the panel, for example, they feel as if they've been discarded.
Ms. Elizabeth Boston: I have a comment on that. We've been in a fairly extensive discussion with the people from the veterinary colleges, and it's an ongoing thing. So we're aware of those problems and we're trying to address them.
The Chair: Okay.
I want to thank you all. This has been a very interesting discussion. There are more questions, but I apologize, we do have to go and vote. We look forward to meeting with you again in the future. Thanks very much.