EVIDENCE

[Recorded by Electronic Apparatus]

Friday, March 14, 1997

.0916

[Translation]

The Chair (Ms Sheila Finestone (Mount Royal, Lib.)): Good morning and welcome, everyone. My name is Sheila Finestone. I am Chair of the Standing Committee on Human Rights and the Status of Disabled Persons of the House of Commons and the member of Parliament for Mount Royal.

[English]

This is the last of our six consultations across Canada. We've been in Vancouver, Calgary and Toronto, and we were in Fredericton yesterday. We are here today for the end of the formal interchange on the implications for federal and provincial jurisdiction with respect to privacy and the privacy rights of individuals.

In examining the right to privacy in this high-tech world, I must say it's wonderful to be back in this beautiful city of Montreal.

[Translation]

I wish to thank all of you for being with us today. Throughout our round table discussions in Ottawa last fall, in September, October and November, we heard from people from all walks of life and from people with expertise in various sectors of society. Given everything we've heard and read about new information technologies, the committee decided to look at their impact on the private lives of individuals, the protection of privacy and human rights.

It's interesting because here in Quebec, the issue of smart cards is being looked into right now. That ties into the discussions we heard on subjects that are of great personal interest to us. This is important, and I think this is a good time to examine it very closely indeed.

[English]

We've adopted a modified structure for the standing committee of the House. Normally we would have witnesses around a table and we would hear individual presentations. We felt that since this is such an important subject we wanted direct input in a very informal way on the issues that touch each and every one of us in a very personal way.

[Translation]

The procedure of our committee is somewhat different. It's important to get a feel for what various people are thinking and to understand the values we seek to protect as individuals, national and pan-Canadian values on issues that affect our personal lives, our soul and our behaviour.

The invasive and changing forms of today's technology lead us to ask the following questions. Who is watching me? Who knows what about me? To what extent do they need to know? Where is the balance between social and economic needs, such as crime and fraud prevention, health care and services, and business practices, and our right to protect our private lives?

.0920

Where do we strike the balance? This examination is very timely, given the news we've heard in recent weeks: the cloning of Dolly, the Scottish sheep, the children created accidentally in Belgium, the Quebec National Assembly's examination of smart cards and the alleged sale of personal information by public servants.

We've learned that public servants were selling medical, tax and other information on the black market for amounts varying from $25 to $120. Have there been fines? What is the procedure? Once such information is revealed, how can it be recovered?

These are the questions we have, and it's interesting, because there were press clippings not only in La Presse, in Montreal, but also in the Globe and Mail. In Vancouver, there was a whole series of questions about the setting up of information clinics, a system that will save hospitals $20 million. But there was no mention of a protocol for the protection of privacy.

[English]

There's a foreign-owned system called Phamis that's going in through B.C. Tel and is very effective. It's going to save $20 billion. That's very significant in the health service sector for this hospital, but what happens to your private life, and where is the protocol for protection and privacy?

Again, this is an issue for the privacy commissioner in Quebec, who will be joining us shortly. What do you do with these people who have, in a sense, invaded the privacy of individuals and sold this information? It was very interesting in Vancouver: the first cartoon we saw was about photo radar, on video.

[Translation]

There was reference to a means to protect ourselves from video cameras. I don't know if they knew we were coming, but that was the news of the day.

[English]

The other thing that was very fascinating, I think, and that all of us would be interested in, was in The Gazette. It was about a history teacher who found out that he can trace his family back 9,000 years to Cheddar Man in the museum in London. DNA linked this family's history.

[Translation]

That means it's important for us to find out what we need to know in our evaluation of DNA. If we can be found 9,000 years later, do we have the right to go into it more closely? Do we have to examine a certain level of protection more closely? I don't know, but we'll see what you think about these issues. I believe they're important.

[English]

The right to privacy, as we all know, does not stem from one source. It's drawn from international law, constitutional law, federal and provincial legislation, judge-made law, professional codes, ethics and guidelines. It's called a ``patchwork of privacy''.

At the international level, several important human rights documents contain guarantees of the right to privacy. I call them the ``Magna Carta of humankind''. One is the Universal Declaration of Human Rights of 1948, which was co-drafted by a Canadian. I'm very pleased to acknowledge that John Humphrey lived in Montreal, in my riding. Andy Scott says he was born in New Brunswick, so in our bilingual and multicultural world, we share him. The other is the International Covenant on Civil and Political Rights of 1966, to which Canada is a signatory.

[Translation]

In Canada, there is currently no overall mechanism for the protection of privacy. Quebec is the only place in North America where private sector practices regarding personal information are appropriately regulated. In Europe, for example, the fair information principles of the European Union and OECD countries apply to all personal information, regardless of its medium or the form in which it is accessible, and whether it is gathered, held, used or distributed by other persons. In Europe, everyone has a right to be free from interference with their private and family life, their home and their correspondence.

.0925

There are no similar rights in Canada, although sections 7 and 8 of the Canadian Charter of Rights and Freedoms, which set out the legal guarantees regarding life, liberty and security of the person, as well as protection against unreasonable search and seizure, have been interpreted by the courts as applying to privacy. The one exception is Quebec.

[English]

It's important to note, however, that the federal departments of Justice and Industry are currently working on a framework of legislation with respect to data protection, and it is expected by 2000. This legislation is expected to include the Canadian Standards Association's code of ethical practices.

A consultation paper will be released shortly and I hope you will be interested in receiving that paper, reacting to the content, and looking at it with respect to the question of human rights and the right to privacy.

The concept that privacy is a comprehensive, pervasive human right held around the world has, I believe, been accepted. It's a broad and ambitious right. It's a universal concept, and yet it's not an inalienable right.

[Translation]

Some experts define it as the right to have a space of one's own, to communicate privately, not to be watched and to have one's bodily integrity respected. For the average citizen, it's a question of power, the power that each individual exercises over personal information concerning him or her. It's also the right to remain anonymous.

It's a very fundamental right, and the question that arises is the following: what is the value of privacy in our high-technology society? There is no doubt that new technologies present great advantages for us all. Yes, there are things that are important and better because of them, but to what extent is the price of these advantages our privacy? Is the price too high? Are compromises inevitable?

[English]

So where do we draw the line? Privacy is a precious resource. Once lost, whether intentionally, inadvertently or through unconsidered consequences, it can never be recaptured.

[Translation]

As members of the Standing Committee on Human Rights and the Status of Disabled Persons, we approach this squarely from the viewpoint of human rights to measure the negative or positive effects of new technology on our right to privacy.

[English]

Canadians have never approved of peeping Toms or unauthorized wire-tapping, and our criminal law reflects this. Does this same disapproval extend, for example, to hidden video cameras in the workplace, to DNA data banks or to citizen identity cards?

We've asked you to come here for the reason we've been holding this series of informal round tables: to test the pulse of Canadians, to take a look at and to listen to what the values and views of Canadians are so that we can, in an enlightened way, suggest the kind of direction federal, provincial and territorial ministers might look at with respect to these issues.

[Translation]

The committee is aware that the issue of privacy and new technology is a very broad one. That is why we have decided to concentrate our investigation on these three types of intervention and intrusion. Ms Valerie Steeves will present the points on which our attention will be focused today. We hope to use these case studies to raise public awareness of what is at stake here.

.0930

The proceedings of this committee are televised. We are on CPAC, and we invite the Canadian public to watch these programs, which will begin on Sunday. The round tables will also be broadcast. Anyone who watches is invited to table their views with our clerk. We will try to present to the House of Commons a portrait of the vision Canadians have of their values in this regard. How much is enough? How far should the state and the private sector be allowed to go on this issue? To what extent should we be protected?

[English]

We're not going to resolve the issues, but we are certainly going to look forward to what you have to say.

Valerie, I would ask you

[Translation]

to direct us. But before we undertake this, I would like to introduce Mr. Wayne Cole, our Clerk; Mr. Roger Préfontaine, our Deputy Clerk; Mr. Bill Young, our researcher, who is an analyst at the Library of Parliament; and Jean-Yves Durocher, our media consultant.

The members of the committee, who are members of Parliament in the House of Commons, will introduce themselves.

Mr. Andy Scott (Fredericton - York - Sunbury, Lib.): My name is Andy Scott and I'm the Member of Parliament for Fredericton - York - Sunbury in New Brunswick, and vice-chair of the committee.

[English]

Mr. Sarkis Assadourian (Don Valley North, Lib.): I'm Sarkis Assadourian, member of Parliament from Don Valley North, in Toronto.

Ms Jean Augustine (Etobicoke - Lakeshore, Lib.): I'm Jean Augustine, member of Parliament for Etobicoke - Lakeshore.

[Translation]

Mr. John Godfrey (Don Valley West, Lib.): My name is John Godfrey and I'm the member of Parliament for Don Valley West in Toronto, the city where John Humphrey lived and worked for a long time.

The Chair: Now there's a man who really shared his life, didn't he?

Mr. Maurice Bernier (Mégantic - Compton - Stanstead, BQ): My name is Maurice Bernier and I'm the Member of Parliament for Mégantic - Compton - Stanstead and vice-chair of the committee. When I hear Ms Finestone tell the story of the man who found a relative from 9,000 years ago, I say to myself that for historians, it's interesting to think that we might be discovered and recognized in 9,000 years. As politicians, however, we want to be discovered in the next nine weeks, which will lead us to the election.

Having said that, I wish you all a very good day.

The Chair: Thank you, Maurice. You're always in a good mood and you make interesting comments.

Valerie, please.

[English]

Ms Valerie Steeves (Committee Facilitator): Thank you, Mrs. Finestone.

[Translation]

The Chair: I'm sorry. Valerie Steeves is a law professor at the University of Ottawa, where she heads a project on technology and human rights.

Ms Steeves: In order to put our discussions in a social and personal context, the committee is presenting three case studies regarding video surveillance, genetic testing and smart cards.

As you know, the point of these stories is to illustrate both the advantages and drawbacks of these new technologies. We hope that by discussing the impact that they have on the lives of individuals, we will have a better idea of what respect for privacy means to Canadians and the way that we, as a society, can strike a balance between the advantages provided by new technologies and maintaining our fundamental social values, notably the importance of privacy.

[English]

Participants here today represent a broad cross-section of Canadian society. Across the country we've had representatives from advocacy groups, banks and insurance companies, general business associations, crown corporations and disability organizations. We've had educators, government workers, genetic researchers, health workers, human rights groups, multicultural organizations, labour unions, police officers, lawyers, media, technology firms, telecommunications companies, cable companies, and you.

.0935

[Translation]

In order to better examine the perspectives open to us, we will begin by discussing case studies in small groups. Each group will be facilitated by experts in privacy rights and will include at least one committee member.

Once the case studies are examined, we will come back into a plenary session to discuss the issues that were raised. In order to start off the plenary discussion, we will ask members of the committee to summarize the main points raised by their respective groups. We will then give the group facilitators an opportunity to add their own comments and concerns, and then we will open the debate to the entire assembly.

[English]

We're looking forward to an open and free-flowing exchange of views between you, the participants, the experts, and the committee members about the meaning of privacy in the technological age.

It's my privilege this morning to introduce the four people who will be facilitating the small group discussions.

[Translation]

Marie-Claude Prémont is an assistant professor at the Faculty of Law at McGill University, a lawyer and chemical engineer. As part of an international co-operation network between the European Commission and Canada, she was a consultant on the use of smart cards in the health care system. She works with the Centre de recherche informatique et du droit in Namur, Belgium, and is also a consultant for the Province of Quebec regarding government data banks.

Pierrot Péladeau is currently a guest researcher at the Institut de recherche clinique in Montreal. In that capacity, he has collaborated on research projects on the ethical, legal and social problems raised by the use of genetic information. He is also the Vice-President, Research and Development, at Progesta Communications Inc., a firm specializing in the management of personal information that has helped more than 500 organizations implement personal information protection programs. He is also the editor-in-chief of Privacy Files, a professional journal specializing in the issues raised by the use of personal information.

[English]

Lewis Eisen is a lawyer and computer consultant who also trains lawyers to use the Internet in both Canada and the United States. He is the author of The Canadian Lawyers Internet Guide and the president of the Canadian Society for the Advancement of Legal Technology.

[Translation]

Marie Vallée is an analyst on issues of policy, regulation, telecommunications, protection of personal information and the information highway. She has worked for the Fédération nationale des associations de consommateurs du Québec for over six years and is very active on issues of personal information protection. Among other things, she participated in the consultation that led to the passage of Bill 68 in Quebec. She's a member of the CSA technical committee that developed a prototype code of ethics for the protection of personal information, and she represents consumers before the CRTC on telecommunications issues.

[English]

You'll notice that your name tags are colour-coded. That's one of the ways we're going to be dividing into groups today. If you have a blue name tag, you'll be meeting with Marie Vallée and Lewis Eisen. If you have a yellow or green name tag, you'll be meeting with Pierrot Péladeau and Marie-Claude Prémont.

When you get into your small groups in a moment, your facilitator will start by asking which of the three case studies you'd like to start with. Our time together for these discussions is short, so please remember that the case studies are really just there to provide you with a starting point. Feel free to spend as much or as little time as you wish on any of them, to draw linkages between the three, and above all to voice your concerns about how these new technologies affect your sense of privacy.

.0940

We'll be reconvening for the town hall shortly after 11 a.m. Just before we break into our small groups, Mrs. Finestone will be suspending the formal meeting. When she does, if we get into our small groups quickly, it will maximize our time together for our discussions.

Thank you very much.

The Chair: Thank you, Valerie.

Under normal circumstances each of you would be presenting a brief or exchanging your point of view, so we would have individual presentations. I hope this new format, which we're experimenting with, will be of interest to you. It certainly will be fruitful for us.

I thank you very much. The meeting is suspended to the call of the chair.

.0941

.1114

The Chair: I see a quorum. The meeting shall resume.

This has been a very instructive and informative session. I found it very dynamic to have an exchange amongst informed people from different sectors rather than hearing all the individual presentations.

[Translation]

For my part, I have learned a great deal and I would ask the rapporteurs to tell us about the substance of the discussions.

.1115

I'd like to tell you that we have here an abridged version of the final report on the evaluation of the Quebec experimental project on microchip medicare cards, as well as a document regarding the pilot project for health care cards conducted in the Rimouski area. Throughout Canada, be it in Vancouver, Calgary, Fredericton or Toronto, questions are being raised about this smart card and the experiment conducted in Rimouski. I would invite those who wish to obtain a copy of these documents to contact our clerk, Mr. Wayne Cole.

[English]

For any of you who wish to have the report, I have it in English and in French. The assessment of the smart card for Quebec patients, the manner in which it has been assessed, and the evaluation that has been done with respect to access to information will be of great interest to many of you watching on the web site. Please note that you can obtain it through the clerk of the committee.

As well, outside the room following the meeting there will be a demonstration of the access cards. Those of us who saw the CANPASS card and the INSPASS card in Vancouver will be able to see, in this demonstration, the access card and the information you can retrieve and who can retrieve it.

At this point I'll call on Maurice.

[Translation]

Mr. Maurice Bernier: It's my pleasure to attempt to summarize very succinctly what was said in our workshop. Very interesting things came out of it since we had with us people who are very interested and well informed about this whole issue of privacy and new technologies.

Of course, I will not repeat what every person said, but I will try to underscore certain points. Since we will have a plenary session in a few moments and all participants are still with us, they will also have an opportunity to respond if I misinterpreted anything or forgot certain points.

Although we did not conduct an in-depth examination of the two proposed case studies concerning genetic testing and the smart card, we did discuss the whole issue of information that concerns us as individuals, which is really the basis for the discussion when you make the link with privacy. As Ms Finestone pointed out at the beginning of our meeting, essentially, the point is to find out who knows what about us, how that information is used and how we can control it. Our discussion therefore centred on those issues.

The greatest danger, which all participants identified as something to avoid at all costs, is the concentration of all information about us as individuals in a single place, with a single database holder, be it the government or the private sector, or in a single technology, a sort of super card that could be used for any purpose. We should be in a position to assure individuals who do not want their files to circulate that there is no danger of that happening and that their files will not be disseminated anywhere on the planet through the Internet.

.1120

That was the first issue on which there was consensus in our workshop. We also stated that the government should be a model in this area, because if we can't trust what our governments do, we have a serious problem on our hands.

The Rimouski experiment in the area of health care, which everyone in this room is probably familiar with and which used the smart card, took place over a period of a few years and produced very conclusive results. We discussed it and tried to identify the basic element of this experiment in the use of new technologies. We decided that one has to know why there is a need to introduce a new technology and what the objectives are before looking at how the technology should work. Before we consider the tool as such, we should ask ourselves what we plan to do with it.

The basic principle in the Rimouski experiment, which we would like to see everywhere, is the need to maintain the trust that must exist between consumers or individuals and the government or whatever institution is involved, be it in the private or the public sector. How can we maintain that bond of trust? We set out a certain number of principles, first and foremost transparency. The individual must understand what it's all about exactly, what he or she is getting into and what will be done with this information.

The second basic principle is informed consent. We've heard about that all week and we've come back to that point very specifically. We've noted one very interesting aspect of the Rimouski experiment, namely that at all times, individuals should be able to say that they do not want certain pieces of information in their file or sent to a specific person. In other words, the individual should have power over the information that circulates.

We also discussed security mechanisms. Before using a given technology we should be aware of the means at our disposal to ensure that information is kept safely, otherwise, we should find such means.

We also stressed another basic principle on which we all agreed: information concerning us as individuals belongs to us. The information held by our insurance company does not belong to it, no more than any information about us held by the Department of Revenue. It belongs first and foremost to us, and once that is established you can make all the subtle distinctions that you wish.

We also agreed on the need for legislation or a legal framework to establish those basic principles. We would like to have a legal document on which we can really rely. When any form of new technology is introduced or implemented, particularly in the private sector, we consider that an assessment must be carried out. The Rimouski experience is a good example which shows that this can be done and is being done in the public sector. We would like the private sector to do likewise.

In conclusion, I would just like to say that the point on which everyone particularly focused is the absolute need to make Canadians aware of the emergence and impact of new technologies, and to ensure that they are continually well informed. Awareness and information can be considered the key to successfully introducing any new technology.

.1125

I must apologize to those people whose statements I may have misrepresented. You will be able to rectify the situation later. Thank you very much.

The Chair: You have been quite accurate, Maurice. The reason we ask members to serve as rapporteurs is to ensure that you agree with what we report and that your ideas are accurately reflected in our reports. If we overlooked a key point, please bring it to our attention during our discussions.

[English]

Andy Scott, you monitored that table. Is there anything you want to add?

Mr. Andy Scott: No, I'm looking forward to the experts and the interventions by the rest of the town hall. Thank you.

The Chair: Thank you. John Godfrey, please.

[Translation]

Mr. John Godfrey: This is the first time in my life that I have been part of a blue group. We discussed two issues, video surveillance and smart cards. The blue group was a group of philosophically minded people obsessed with definitions. They wanted to clearly define the problem and even the concept of privacy. My report will be given in the language of the participants; if a participant spoke English, I will report his observations in English, and vice-versa. This will be a typically Montreal mixture.

Five major themes emerged: definitions, social values, principles, a debate on good or bad technology, and lastly suggested solutions. We will begin with definitions.

[English]

Of all of the groups I've participated in, I think this one was the most determined at the beginning of the conversation to make some distinctions that I think are quite important for our conversations.

The first distinction or definition was between privacy and confidentiality, with privacy being defined as ``Who knows what about us?'' and confidentiality being more ``What use shall be made of information that is known about us?''. A second useful observation or definition was that privacy is a complex term because it is not a kind of free-standing right but is often associated with other more established rights - for example, it's sort of an associated or preconditional right.

The right to free assembly can be chilled or damaged by excessive knowledge about you, say through video surveillance. If you know there are going to be television or RCMP cameras picking you out as an individual, depriving you of your anonymity, that might reduce your inclination to assemble, or indeed your inclination towards free speech. So that's one of the sources of its complexity.

A third source of complexity is the distinction - and I must say, I think I heard it more fully developed in this group than elsewhere - between ``privacy'' and ``anonymity'', with anonymity being defined as putting yourself in a certain situation and not being identified.

In other words, if you wish to go to a store, presumably not a convenience store or dépanneur, and buy by cash, you do not leave an electronic trace of the transaction. But of course anonymity has this curious distinction of being almost an urban right; it's only available in communities that are large enough for you to get lost in or crowds that are large enough for you to get lost in.

There are two other aspects to anonymity that the group brought out. There is a reverse problem of anonymity, that is to say the anonymity of those who control the data banks - and we don't know who they are - or who control the cameras - and we don't know where they are or who they are. So that's one of the bad sides of anonymity.

.1130

Another bad side is, of course, that if nobody knows who they are, that can be in a sense a source of social breakdown. Anonymity and privacy are not the same thing, and they're kind of determined by societal conditions. I think that's something that we as a committee will have to keep in mind.

With regard to social values, I think the first thing we said here was that in a sense the public is ignorant of what all these technologies are about and of their potential and doesn't care, and that's a problem that will have to be resolved in some fashion or other. This state of blissful ignorance is only true until something comes along to shake people's confidence, such as the sale of data by civil servants in Quebec. At that point people start to go on the talk shows and to show some concern.

Another issue, which I really raised in another fashion previously, is the whole question of the collectivist values being in a sense undermined by our loss of privacy. The example, which was a rather interesting one, was where we could count on a video camera doing something for us - that is to say, there's an accident but the video camera has seen it so I can walk away. In other words, there's some kind of reliance on Big Brother or the anonymous person at the end of the video to do the job that you as a citizen ought to have done, which is an interesting observation that I don't think we've heard before.

Moving on to principles,

[Translation]

the fundamental question is what problem we are trying to resolve through new technologies and whether it can be resolved by other means. I believe that this is, if I may use the expression, the burden of proof required here.

Another fundamental question concerns the expectations of ordinary people with respect to privacy. Is it necessary to resolve these issues before introducing new technology and, if so, how can that be done? Technologies are being developed at frightening speed, whereas I would suggest that the law is moving forward at a more stately pace.

[English]

A third principle, which I think we've heard a lot about in our previous outings, is the whole question of what is the primary purpose for which these technologies are introduced and how do we prevent them being used for secondary purposes - the whole notion of function creep. For example, one might agree that having cameras in the auto tunnels of Montreal is a good thing for accidents, floods, hijackings, for all the various and exciting things that happen to your tunnel, but how do we prevent the misuse of information about your licence plates and comings and goings?

Moving on to the next theme of technologies, which really got more developed, I think, during our discussion of smart cards, the point was made that right now we're in a very strange situation with the smart card technology and data collection. We're at the worst possible moment, because they're huge systems that are accessible to people who we don't know about. But in a sense there is help along the way if we choose it, which is also technological help, through public key encryption so that your smart card doesn't necessarily have to store data, although it can. It can simply store the key, which allows you and no one else to get at the data. Of course the smart card has this dual possibility, and we have to decide which way we want it to go. Basically, technologies in general or smart cards in particular are tools that can be used either appropriately or inappropriately.
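To make the public key idea described above concrete, here is a minimal sketch, assuming Python and the third-party cryptography package rather than any particular card system; the record contents and names are purely illustrative. The card holds only the private key, while the central database stores a ciphertext it cannot read on its own.

```python
# A minimal sketch (not any real card system) of a smart card that stores
# only a key, while the encrypted record lives in a central database.
# Assumes the third-party "cryptography" package.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

# Key pair generated when the card is issued; in this model the private
# key never leaves the card.
card_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
card_public_key = card_private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The hospital encrypts the record under the card's public key and keeps
# only the ciphertext; on its own it holds nothing readable.
record = b"blood type: O-; allergy: penicillin"
stored_ciphertext = card_public_key.encrypt(record, oaep)

# Only the card holder can turn the ciphertext back into the record.
recovered = card_private_key.decrypt(stored_ciphertext, oaep)
assert recovered == record
```

In practice a hybrid scheme would be used for records of any length (a symmetric key encrypts the record, and only that key is encrypted to the card), but the principle is the one described above: without the card, the stored data cannot be read.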

Sometimes I call this the National Rifle Association's point of view - that is, it's not guns that kill people, it's just people who kill people, but still there is something in that point about technologies in themselves being neutral.

The point again is that right now we're at this strange point where things such as medical records are being in a sense poorly kept. The security on them is very bad. A terminal goes on in the morning and because it's so boring to have to boot it up every time, you just leave it on, and anybody who passes by that particular terminal can punch up anybody else's information. You don't really know who is going to take care of it.

.1135

Again, the help to guard against abuses can be on its way, both through encryption and through a better system of tracking who has been accessing your information. But that raises other issues, such as whether or not there should be an override in certain cases in which the public good is at stake. For instance, should a trauma physician be able to override your right to privacy if you come in unconscious on a stretcher, as long as you can find out that he has been at your files - and that, of course, is if you have the good luck of coming to?
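As a rough illustration of the audit-trail and override idea raised here, the following sketch, in plain Python with hypothetical names and fields, records every access, including an emergency "break-glass" read, so the patient can later see who has been at the file.

```python
# A hypothetical sketch of the audit-trail idea: every read, including an
# emergency "break-glass" override, is logged so the patient can later see
# who has been at the file. Names and fields are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class MedicalRecord:
    patient: str
    data: dict
    access_log: list = field(default_factory=list)

    def read(self, clinician: str, has_consent: bool, emergency: bool = False) -> dict:
        if not (has_consent or emergency):
            raise PermissionError("no consent and no declared emergency")
        # The log entry is written before the data is returned, so even an
        # override leaves a trace the patient can review afterwards.
        self.access_log.append({
            "who": clinician,
            "when": datetime.now(timezone.utc).isoformat(),
            "override": emergency and not has_consent,
        })
        return self.data

record = MedicalRecord("J. Tremblay", {"blood_type": "O-"})
record.read("Dr. Roy", has_consent=False, emergency=True)  # logged as an override
print(record.access_log)
```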

Finally, in terms of solutions, again, solutions are being forced by the problems. For example, one problem, which has to do with fundamental human rights, is in welfare cases, where we are keeping all kinds of information...people's lack of informed consent, swapping the information around. Even if you're a powerful person, what legal recourse would you have against that? Should that legal recourse be high up in the legal structure? Should it be a constitutional right as opposed to a legislative right that can be countermanded, overrun or contradicted by another law? How can you prevent the kind of situation you have in Quebec, where you have a charter for privacy rights on the one hand, but it can be overruled by a simple desire to save money on the other? Why didn't the charter stop the fraudulent sale of data by civil servants? The question is what the penalties will be for them, if any.

Another challenge to regulation is the international situation, the data havens - the Cayman Islands of data, if you like. Whereas you might be able to regulate something within your own geographic sphere, how can you control the Internet? The example given was that of the international 411 service. You can track anybody in the world if that person is in a data bank or a phone book. If there is a desire to keep this material offshore - it is offshore already - how do you regulate it? There is, of course, the additional problem of no authentication of whether the data is true or not.

Obviously, one crucial action for both the general public and all the other players is the whole notion of education. I think this is a theme we have heard right across the country, as well as in Mr. Bernier's presentation.

The second notion was that of support for privacy-enabling technologies. That support would take several forms, including policies to encourage such things as encryption; tax incentives; and penalties for people who do not go along with this.

A third avenue is policies or codes that are enforced by trades and professional associations. There was a certain amount of skepticism in our group, though, because those exist already and don't seem to have prevented abuses.

A fourth option - and I think all of these options are thought to be connected, not to the exclusion of any other - is laws that are clear, effective and enforceable.

I think this last option is a good note to end on: government services that adapt to changes in technology, that reflect social needs, and that incarnate the very values we want to preserve in a free and democratic society.

Merci.

The Chair: Thank you very much. We're having a discussion about guns, because you used the analogy of whether or not guns are really neutral. Is this technology that we are discussing neutral technology? Can it ever be used for a good purpose? Before we make any further decisions on neutrality, perhaps the question that ought to be asked here is whether or not that is a good way. Is technology the best way? I guess that's the question. What is neutral, and how do you ensure that neutrality?

Sarkis and Jean, you were both at that table. Would you care to add anything?

Ms Jean Augustine: I think he did a super job.

.1140

Mr. Sarkis Assadourian: With the help of the computer, it was fantastic.

The Chair: Thank you very much.

Before I then go to the floor, I have to tell you that Mr. Fortin is here. As you probably all know, he has been in charge of the whole Rimouski experiment. I'm sure he'll be able to indicate, certainly to the audience watching, to those who will be reading the record and to all of us here what he's put into place in the pilot project that responds to a number of the issues you raised. You might then wonder whether that's really enough, or you might think that's good enough or that's great. It may be very interesting to see your response given the list of questions you raised.

[Translation]

I would like us to begin with the comments of the experts. I would like them to express to us their concerns, interests and viewpoints, and the observations they heard in their discussions around the table.

Ms Marie-Claude Prémont (Faculty of Law, McGill University): A number of very important subjects were raised during the two workshops. I would like to focus on one which was perhaps less visible in the reports given. It concerns the right to privacy and the protection of personal information.

It might seem that we are participating today in a committee considering individual rights, but in my view this is not just an individual right. The right to privacy also implies the protection of certain collective rights or certain social entitlements. This is particularly obvious when you are talking about a subject of the kind our group focused on, that is, medical services. Our medical services structure currently has an insurance system which works very differently from private insurance and is based on a different philosophy. Some people have referred to it as the socialization of health care costs, which is at the very heart of our health care system.

For example, as regards the protection of personal information in the area of health care, which is in fact very sensitive information, the impact of implementing technologies such as the smart card or the networking of information will be quite different, because they will be integrated into a system based not on individual but rather on social considerations.

When we talk about personal information, we tend to think that it means information about us to which we do not want anyone else to have access. But in fact this is not the case. In the area of health care, the important information is often information we do not yet hold: the information which the physician has and must provide to us. Someone other than us has information that we want to obtain in order to maintain our good health. If we had a private health system, where our contribution to the health care plan was in proportion to the risk we represent, we might perhaps be hesitant about consulting a doctor. Therefore, even the creation of health-related information is jeopardized. As you can appreciate, the problem of protecting privacy is closely linked to the particular social context and to that fundamental issue of collective rights, which means in this particular case the right to participate in a group insurance plan.

The simple message I wanted to get across here is that, regardless of what I suggest, you should not consider that the right to privacy is only an individual right. It is also a collective right. It involves social entitlements and collective rights referred to in many pieces of legislation.

.1145

The Chair: Ms Prémont, if, as we are told, the drug insurance plan is being privatized, what should be done as regards both prevention and treatment with respect to drugs which are an integral part of society's health care system?

Ms Prémont: It must certainly be realized that all of this has an impact. If we take the example of Quebec's drug insurance plan, which is a joint plan since part is covered by the social insurance system and part is privatized, we see that this will have an impact on the way you will exercise your rights as a person insured under the plan. There is also an impact with respect to the management of information.

This is not a unique situation. We have experienced this kind of thing in other areas, for example automobile insurance. Therefore, it is not just health care that is involved. Why, in our societies, have we moved from a private automobile insurance system to a public one? There are reasons for that. There is the issue of social entitlement behind all this. There are consequences concerning the management of personal information. Will you or will you not report that you have had an accident? Will you hesitate to do so because the insurance on damage to property may have an impact on your premiums? However, as regards personal injury, you know that you are entitled to compensation and that such a claim will not increase the premiums you have to pay to the Société d'assurance automobile du Québec.

We must realize that, in the area of personal information and the right to privacy, the very creation and circulation of such information are closely linked. So what should be done? There is no single answer, but it must be recognized that this is a relevant question, and the issue of protecting personal information is closely linked to the particular social context involved.

The Chair: Thank you.

Ms Marie Vallée.

Ms Marie Vallée (Fédération nationale des associations de consommateurs du Québec): I would like to thank Mr. Godfrey. I think he has done a very good job. However, I must point out to you that there were very many lawyers in my group, and that is no doubt why the definitions were studied so carefully.

I will try to convey to you the feelings of consumers with respect to new technological developments and possible dangers regarding the protection of personal information. A few surveys have been conducted in Canada and elsewhere. We also conducted one. We wanted to know whether people were aware of the situation and had any experience in this regard.

We were not surprised to discover that people didn't know that so much data was collected about them, and also that they had not really had any bad experiences. However, as people do not know what is happening, they are not aware of the problems involved. I therefore think that the point raised about the need to increase the level of knowledge and do a lot of educational work is a very important one.

Furthermore, people are increasingly accustomed to having instant access to a wide range of services, provided by the government or the private sector, to what we might refer to as convenience. This has become a kind of trade-off. In order to have rapid access to money or services, we have allowed many technologies to invade our lives without really being able to assess the long term consequences. We now realize that such an invasion has perhaps gone too far.

What can we do to regain control, or at least retain some control? This will have to be looked at carefully. A theme we hear more and more about is the concept of a "tool box", that is, the technological tools, the codes of conduct and the legislative tools we will have to use to resolve the various aspects of the problem.

I would like to raise another point to which Mr. Bernier referred, namely the degree of trust we can have in our governments.

.1150

I live in Quebec and for a long time I boasted about living in a jurisdiction where personal information was very well protected, in both the public and the private sector. But I became very disillusioned last spring when I saw the Quebec government introducing a series of acts allowing data matching between the various government data banks.

I thought that in Quebec we had two charters of rights, one federal and the other provincial. In the provincial charter, protection of privacy is explicitly stated, and yet we have seen our government, on the grounds that it is trying to catch thieves and save money, completely or almost completely ignoring the Quebec charter. So can we trust our governments to introduce means of protection if, when it suits them, or when they think they can perhaps save money, they tell us that the charter of rights does not apply in a particular case for a particular reason?

That is the main question I am asking now. I think that a piece of legislation setting out general principles would certainly be a useful instrument, regardless of how quickly or slowly a particular technology develops. But, I will repeat once again, if our governments do not respect the charters, what is the point of such legislation?

[English]

The Chair: Thank you very much.

The question of cross-information access is certainly one we've been looking at for a long time and we've been listening to and hearing about across this country. I'm glad you raised it again.

[Translation]

Pierrot Péladeau, please.

Mr. Pierrot Péladeau (Progesta Communications Inc.): Thank you, Madam Chair.

I will deal with three points. First, I would like to talk about the relationship between human rights and information technologies. Second, I will talk about the role of the technologies themselves. Third, I will try to resolve or respond to the question raised a few minutes ago.

Given that we are here before the Committee on Human Rights, I think we have to be clear about human rights. We are not dealing just with the concept of the right to privacy, or the failure to respect privacy, when we are talking about the use of personal information or information technologies. That is what comes immediately to mind: we immediately think of collecting, storing and communicating information, and we focus on that.

I explored the thesaurus and found some 300 terms related to human rights, and I was able to make a connection with about 150 notions associated with fundamental human rights. Beyond the collection, storage and possible communication or non-communication of information, what matters is that such information is used to make decisions regarding people's individual or collective lives. Such information is collected and used to make decisions, with or without our participation.

These decisions concern the right to health care, education, work, social security, the right to vote, etc. Some 150 such areas were identified. To come back to what Marie-Claude Prémont said, I therefore believe that the focus should be placed on those areas where there are really problems.

Also, and this may appear to be a little provocative, I am writing a book which is in fact entitled Pour en finir avec la vie privée (Finishing with privacy). I think we have to get away from that spontaneous reflex and look at where the real problems are.

There are other issues. I have already mentioned the approximately 150 issues concerning rights, but there are also issues concerning social values, economic issues, which are obviously important, as well as administrative issues which also have to be looked at.

I would just like to give you the example of the health issues mentioned in our workshop. I myself work in this area and, as part of a far broader program, I have interviewed 20 people in charge of plans to network health-related information. These people circulate such information, and I asked them what social, legal or ethical problems they had encountered. They answered immediately and spontaneously with words such as "confidentiality" and "privacy", and the like. It goes no further than that.

.1155

At the present time, we know what the technological and legal solutions are and they can be implemented. Generally speaking, they are identified and implemented.

When in my interviews I tried to continue the discussion and focus on the real problems encountered, the people concerned told me about the enormous organizational and professional problems involved. There were many problems between patients and doctors, for example, or professional problems between members of the public, and it was found that 15% of the projects were in danger because of social, legal or ethical problems that were not related to privacy. Many of these projects failed. In one recent case, it was stated that the $100,000 invested would be money down the drain because it didn't work.

Therefore, we have to look at all the issues involved because the problem is an overall one. As Marie-Claude Prémont pointed out earlier, the notion of privacy leads to a series of questions which are perhaps more fundamental than we thought and which must be looked at. All these points are included in the report on human rights and information technologies.

Second, I would like to talk about the role of technology. It must be understood that information technology does not mean only collecting information and managing files. It concerns the management of relationships between individuals and organizations, relationships between individuals themselves and relationships between organizations themselves. It is through the rules governing the encryption of information used in smart cards or automatic tellers that technology makes it possible to determine whether something is permitted or not. Therefore, certain rules will be embedded in the hardware and software, and that is far more effective than a piece of legislation.

You can always adopt legislation or rules which can just gather dust on a shelf, but when you put rules in an automatic teller or smart card, it is very difficult to ignore them, and that is what makes this technology both great and dangerous. In fact, once this costly infrastructure is installed, not much can be changed. However, that is no longer really the case because progress is being made very quickly and it will be easier and easier to change the rules.
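As a small, hypothetical sketch of what it means for a rule to live in the software rather than on paper, consider the following; the roles and permitted fields are invented for illustration and are not drawn from any real card system.

```python
# A hypothetical illustration of a rule embedded in the software itself.
# The permitted fields per role are invented for this example: a field
# outside the permitted set simply never leaves the function, which is
# the sense in which such rules are harder to ignore than a written text.
ALLOWED_FIELDS = {
    "pharmacist": {"allergies", "prescriptions"},
    "ambulance": {"blood_type", "allergies"},
}

def disclose(card_data: dict, requester_role: str) -> dict:
    """Return only the fields the embedded rule permits for this role."""
    permitted = ALLOWED_FIELDS.get(requester_role, set())
    return {k: v for k, v in card_data.items() if k in permitted}

card = {"blood_type": "O-", "allergies": "penicillin",
        "prescriptions": "insulin", "psychiatric_notes": "..."}
print(disclose(card, "pharmacist"))  # {'allergies': 'penicillin', 'prescriptions': 'insulin'}
print(disclose(card, "ambulance"))   # {'blood_type': 'O-', 'allergies': 'penicillin'}
```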

Therefore, there is an important point which I consider to be fundamentally a democratic issue rather than a matter of individual rights, insofar as the rules inserted in a smart card, an automatic teller machine or elsewhere on the information highway are rules which will govern our relationships. That is what we call legislation.

In a democratic society such as ours, laws must be the subject of a public debate and be adopted with full knowledge of what is involved. People must be able to make decisions.

My friends, the members of the Committee, I may have a piece of bad news for you. I think that in the past, you were the people who took the legislative decisions whereas now such decisions are taken by technicians or engineers. The situation may have to be corrected.

This leads me to my third point, that is, who makes the decisions and how they are made. There has been a lot of talk about the role of organizations such as the Privacy Commission and the Canadian Human Rights Commission; there was also talk about bringing together groups of experts to discuss problems.

Personally, I think that there is a fundamental problem. I am an expert, a technocrat and I believe that I have a role to play, as do all experts. The particular commissions also have a role, but as the issues are complex they should not be considered just as technical issues to be looked at by the technician, lawyer or ethics specialist.

I often see in discussions that everything is reduced to our particular speciality. But this is a global problem, a complex issue involving a series of interests and problems. I think we have to come back to the idea of a jury, that is, average members of the public to whom it should be proven that a project makes sense.

I think that our elected representatives could play that role. I have taken part in a number of parliamentary committees in recent weeks, both in Quebec City and Ottawa, and I realized that our elected representatives were ready. I saw in fact that they were able to ask the right questions more easily and more quickly than certain experts. Therefore, certain public projects could be placed in the hands of our elected representatives.

.1200

This brings me to the question raised by Marie Vallée. I have also seen those same elected representatives adopting legislation which is completely unacceptable. I think that there is certainly a problem of educating our elected representatives, but there is also the issue of formalizing this type of responsibility. As I said earlier, I think that there is a role for our elected representatives but also one for other kinds of jury. It could be a kind of public hearing or a committee of wise people made up of members of the public such as taxpayers, consumers or patients, who would make a judgement on the range of issues involved.

I really believe that if we do not do that the danger is that we may abdicate the legislative aspects of information technology to engineers and experts. That would be a sad day for democracy. That's all I want to say for the moment.

[English]

The Chair: I have the feeling, Pierrot, that John Humphrey would love to have heard what you had to say, and seeing as how he belongs to at least three of us around the table, we have opted for this open consultation process. I think that's fundamentally what you were alluding to.

It was the view of this committee that we had to take an unusual step. As you know, this is not a normal procedure for a standing committee of the House, but rather than having each of you individually come and testify and bring us dossiers - and I thank you very much for the briefs you have been presenting to l'Assemblée nationale - for us it was an exchange between experts who had differing points of view and differing interests that would enable us to be informed legislators.

Perhaps the day may come, Monsieur Péladeau, when we might find this newer format even in the development of legislation, because we're at that point of having change in legislative procedures in Ottawa, and bills do go before committees immediately after first reading. They go out in white papers.

You will be receiving the white paper on the joint initiatives of the Minister of Justice and the Minister of Industry, along with CSA -

[Translation]

How do you say that in French?

An honourable member: L'ACNOR.

Mr. Péladeau: The Canadian Standards Association, better known as the CSA.

The Chair: I see, but it is the CSA for me.

[English]

You know, therefore, that the Canadian standards act will be included, but of course we will want your input. So there has been an evolution in process and a recognition that, with all due respect to bureaucrats in this room, at many levels, you do need the input of

[Translation]

the average member of the public so that we can go down to the grassroots. It's not because there's more common sense there, but rather because it helps in terms of common sense.

[English]

Mr. Eisen.

Mr. Lewis Eisen (President, Canadian Society for the Advancement of Legal Technology): Thank you, Madam Chair. I had about 20 points written down. The first speaker stole my first 5, and Pierrot has very adequately covered the last 15. I've nothing to add.

The Chair: Does that mean we have had a consensus from you? Are you in agreement, or are there any points you wish to discuss?

Mr. Eisen: I have nothing left to add. Let me chip in during the group discussion, if I may.

The Chair: Very good. Thank you very much.

[Translation]

You have the floor, ladies and gentlemen. Do you want to take the mike?

[English]

Would you please introduce yourself for the Minutes of Proceedings and Evidence.

Mr. Michel Kabay (Director of Education, National Computer Security Association): I am the director of education for the United States National Computer Security Association. I live and work in Canada and I'm a Canadian citizen, but I do work for an American organization.

I've been sketching out some brief thoughts on issues this committee has not addressed, at least in the session we've heard this morning. I have a warning for the committee: there are areas outside governmental and organizational attacks on privacy that we ought to be thinking about. Conventional discussions typically deal with government and organizational policy, and sometimes with illegal acts or acts that are against those policies, but there are other areas. We have attacks on privacy that come from criminal individuals and criminal organizations. Examples such as industrial espionage - and even external international espionage - must be part of the consideration of this committee.

.1205

It is the opinion of most information security specialists, and I speak as one, that the state of computer and network security in today's environment is abominable. Generally we have absent or poorly developed or ineffective and unenforced security policies that make it difficult to implement the best wishes for protection of personal privacy. In the absence of effective security policies, privacy is one of the victims.

I would argue that the Government of Canada ought to be looking at a national information policy where we would have high-level support to improve the conditions of information security and thereby aid the endeavour to protect personal privacy.

I would like to see high-level support for information security as a strategic requirement in all levels of social enterprise, including government organizations, non-profit organizations, education and the rest.

It seems to me that we ought to have widespread support, both from government directly and through government incentives to private foundations, for improved research, especially into the human factors affecting proper security and therefore - I repeat - into protection against the widespread abuse of existing information systems by authorized personnel, people who are authorized to get access to data but then break the rules that are supposed to protect us.

It would make sense to increase the use of civil law, and I appeal to the legal profession to work out better approaches to punishing organizations that fail to exercise what is generally called their ``fiduciary responsibility'' to protect private data and other kinds of data as well, in order to see us moving towards a general acceptance that we must not continue to be irresponsible in how we handle other people's information.

Finally, I would like to see government support for extensive educational programs. The NCSA has been among the first security associations to carry out educational programs in elementary schools, middle schools and high schools in Montreal. I have personally been responsible for changing the curricula for all the computer courses in one of the school boards so that they all include consideration of ethical behaviour when using computers and networks.

We are breeding children who have never been given any guidance in what is appropriate and what is not appropriate in dealing with other people in cyberspace. It's not surprising that for many children, criminal hackers are viewed as the Robin Hoods of today. They don't understand the issues. They do not see other human beings as being on the other end of the computer connection.

In conclusion, Madam Chair, I propose that the government look very seriously at forming a policy group to look at full government support for improving the security of information in our country.

Thank you.

The Chair: Thank you very much. I'm going to call on my colleague, Andy, but I want to add one point.

With respect to your presentation on education programs, I can tell you that I welcome the observations you've made. We had so many spheres of concern with respect to privacy presented to us at the round tables in October and November, amongst which were the issues of pornography, hate, bias, anti-Semitism, racism. We determined that would be left to another time and to the justice and criminal code issues, but I'd like Andy to respond on the other areas you touched on.

Mr. Andy Scott: Thank you very much, Madam Chair.

.1210

It's important to note a couple of things, and I would add to what Madam Chair said about welcoming the intervention and the warning and alerting us to a broader range of issues we might be considering.

It's important to mention that this undertaking is not being done in isolation. The justice committee and the industry committee are engaged in the preparation of a white paper. People who are involved in that process in fact have been part of this exercise this week across the country.

Our interest in this was born of the fact that we wanted to make sure there was a balance brought so it wasn't simply a technical exercise. Human rights and values should be brought to bear on the inquiry.

I think it's safe to say that as we had our hearings earlier to set the parameters of the discussions, the people who presented intuitively believed that Canadians would find some of the intrusions upsetting, perhaps startling, but there wasn't an outcry because there was a certain level of ignorance about what was going on, such that we felt the need to explore this very much at a values level.

So if it appears that our exercise is somewhat value-driven, that's quite deliberate. We recognize that there are all kinds of technical issues that need to be attended to at the same time and welcome your reminding us of that.

The Chair: Thank you very much.

Would someone else care to approach the microphone?

[Translation]

Mr. Melançon please.

Mr. Marcel Melançon (Collège de Chicoutimi): I am a teacher and researcher in bioethics at the Collège de Chicoutimi and the Université du Québec à Chicoutimi. For anyone who doesn't know where Chicoutimi is, I would just say that it is one of the capitals of our kingdom, which was washed last summer by the flood.

There are three points of particular importance to me which I would like to focus on briefly. First, legislators should provide special protection for a specific kind of information, that is, genetic information regarding individuals, members of the public and families at risk. Genetic information is different from ordinary medical information in that it affects other family members.

To put it clearly and briefly, I might consent to the tracing of my DNA or to giving DNA, but if I do so, the other members of my family are indirectly involved without realizing or agreeing to such a step.

There is another point which I would like to raise following the comments of Ms Prémont and Ms Vallée, which, in certain respects, were complementary. With a view to ensuring privacy, we should maintain the gains we have made as a society. In the case of Quebec alone, I'm referring here to the Commission d'accès à l'information. We shouldn't be working to have the Commission dismantled or rendered powerless. In other words, in the area of privacy we have developed a mechanism for defending ourselves which should be preserved and strengthened.

Third, the workshop focused on the example the government should set and the model it should follow. A government, just because it was elected by the people, may not do what certain private companies do, that is, cross-reference files and so on. Ms Vallée referred to this earlier. Therefore, the government must set an example concerning privacy even when, as is the case at present, it is haunted by the need to deal with the deficit.

I also believe that a government should develop information and education programs on the means it uses to ensure privacy. There have been a lot of campaigns concerning the referendum and other issues, as well as publicity campaigns. Perhaps a campaign should also be organized to explain to people that the signing of a form does not necessarily constitute informed consent.

.1215

In conclusion, I think that the only way for a government to protect the people who elected it is to make sure that when in office it does not cut itself off from the grass roots. The only way of protecting society is still to ensure that each and every one of its members is protected.

Thank you, Madam Chair.

[English]

Mr. Sarkis Assadourian: May I ask a question? You referred to DNA. DNA is a new science, basically. Well, it's not a new science - it's two or three decades old - but everybody is talking about DNA now. We even use DNA for prosecution in court in cases of murder or whatever the case may be.

A couple of weeks ago we were told about cloning. Correct me if I'm wrong, but by cloning you can get DNA identical to the other clone or whatever you want to mix or match. What kind of impact will that have on DNA science or the study of DNA genetics?

The Chair: The question was for you.

[Translation]

Did you understand the question?

Mr. Melançon: No I didn't understand.

The Chair: The question deals basically with the cloning of Dolly, because we know that from now on, the same DNA can be found in two different bodies. Will that have an impact on research?

[English]

The question is, will it have an impact on the research because you have two identical DNAs, in mother and daughter or something?

Mr. Sarkis Assadourian: No, I want to know the impact on the identification.

[Translation]

Mr. Melançon: I think that a geneticist, Mr. Louis Dallaire, in this case, would be better able to answer that question.

I myself believe that the cloning of human beings should not be allowed. Identical twins are a natural phenomenon. In Ontario there was the case of the Dionne quintuplets, who were in fact five clones. However, can clones or twins be recreated in a laboratory? I don't think so. I believe that, in the case of human beings, we should close the door immediately to the introduction of such a process. In fact, Bill C-47, which is at second reading, strictly forbids that.

Now, for the rest of the question which is specifically scientific, I would refer to Dr. Dallaire if he would be so kind as to respond.

The Chair: Doctor, please proceed.

Dr. Louis Dallaire (Department of Pediatrics, Hôpital Sainte-Justine): I would ask you to repeat the question.

[English]

The Chair: Do you want to state your question again please, Sarkis?

Mr. Sarkis Assadourian: DNA has been used recently in science to prove or disprove a case or an individual's character. Now, with cloning, there's a chance or possibility that they could clone you, they could clone me, they could clone any one of us here. There could be two, three, or five of us being charged for the same crime or having the same identity wherever we go. What impact will that have on DNA as an instrument to positively identify a human being?

Dr. Dallaire: I don't understand what you're asking. What impact do we have on DNA? What do you mean by that?

[Translation]

Mr. Maurice Bernier: I think that the gentleman means that DNA research can be very positive, for example if it is used in legal proceedings to identify criminals. However, if it is used for cloning or any other discovery, would the effect be to prohibit or stop...

[English]

Mr. Sarkis Assadourian: May I make a point, Madam Chair?

I'm told - and again, correct me if I'm wrong - that DNA could be matched with a 15-million-to-one ratio. I may be wrong. I was following the Simpson trial, by the way. That's where this DNA thing really got explosive. If you could have another identical Simpson somewhere else, it could be argued that this is not proper identification by DNA, because he was cloned somewhere else in the world.

.1220

[Translation]

Dr. Dallaire: Agreed. I understand your question. The chances of having two identical individuals are one in one billion. If you clone an individual, the DNA will be the same, of course. You're right, to a point.

Now we're very far away from cloning human beings. We're still working on animals. As far as we're concerned, there's no way we're going to be cloning humans. I imagine that legislation, in any case, will prohibit us from trying to clone human beings.

The Chair: Bill C-47 provides for denying the right to do research in that area.

Dr. Dallaire: Absolutely. Some research could be carried out, such as introducing human genes into an animal which could then be cloned. We could then extract protein of human origin from the animal, which could be used for the treatment of hereditary diseases. That is being looked at and is conceivable. It's being done with bacteria, to a point. Actually, that's how insulin is produced.

The Chair: Ms Lippman.

[English]

Ms Abby Lippman (Individual Presentation): Thank you, Madam Chairman. I can add some further clarification.

I work at McGill University, but I'm really here to speak as - as someone in our group referred to me - a simple citizen today. I'll answer your technical question later.

I've learned a lot sitting in the group today. The fact that I've learned a lot and have been thinking about privacy issues really makes me very conscious of the fact that education and public discussion is necessary.

I would like to propose that some kind of structure be set in place to adapt models that are already being used elsewhere. In the United States right now there's a man named Rick Sclove who is trying to hold public hearings about technology, where simple citizens get together, are given a problem they know nothing about, and have access to experts to tell them what they need to know. My notion of regulation is becoming more and more this idea of citizen courts, call them what you will, where experts are resource people rather than the experts or technocrats making the decisions because the issues are said to be too complicated. I think people are much more easily able to see through the issues. The model has been adapted and is now being widely used in the Scandinavian countries for very complicated scientific issues.

I think this is a good area, because it has so many ramifications to bringing in people to learn. I get concerned as a citizen when I hear about education because it usually means that if I just learn what someone wants me to learn, I'll do what they want me to do and I'll understand it and I'll be happy with the technology. So I think that the education has to come from the ground up rather than from the top down.

The other thing I have to say as a citizen is that what I've learned was in terms of the uses. Someone pulled out their hotel card, and then somebody commented on all the information that could be in the little bar strip on it. As I sat there, I realized that I had never thought of that. Now I really don't want to use one of those things when someone can know where I was today and whether or not I was really here when I wasn't in my office.

The last comment I want to make addresses, Madam Chair, the point you made at the beginning. I was delighted to hear you raise the issue as to whether or not technology is neutral. I come from the school that says no technology is neutral. A well-known Canadian researcher in the Maritimes said that ``technology comes in with a valence'', which means a charge on it. The gun is a good example. People say it could be used for good or for ill. If I'm in the room having an argument with somebody and I know a gun is in a cabinet, that argument has definitely changed whether or not that gun is ever touched. The mere presence of the gun has changed the territory. The mere availability of DNA in a data bank changes how we accept people's testimony in a court of law. The mere availability of the computer affects how we interact with people. I think there is nothing neutral, which is why I want citizens, simpler ones than me, talking about these things.

Thank you for letting me come today.

The Chair: Thank you very much. You certainly support Mr. Péladeau's point of view very significantly.

Yes, sir, please.

[Translation]

Mr. David Masse (Chait Amyot, Lawyers): Do you have time to listen to another intervention?

The Chair: We do have the time and we do have the need for your intervention.

Mr. Masse: I'm David Masse. I'm a Montreal lawyer whose specialty is trade law and I'm in private practice. I'm also the president of the Association québécoise pour le développement de l'information juridique (Quebec Association for the Development of Computers and the Law).

.1225

I only have a few words to add. The new technology is extremely promising if well used. Of course, there are major risks for privacy, and we're all very conscious of them, especially in the case of reprehensible content on open networks like the Internet and so forth.

The point I would like to make is that the technology of data communication on open networks is very powerful and very efficient economically speaking, and it could in particular allow our public authorities to rationalize their information flows in a major way. There is perhaps not a single area of government that would not significantly benefit from the implementation and use of open network data communications technology.

Let's take the legal area, which I know very well. At this point, all across the country, our courts are working far too slowly and inefficiently. We're getting to a point where the only people whose interests are probably represented within the administration of justice are either the very rich, the very big corporations, or the very poor. We're dealing with a middle class that is absolutely not being served by the legal system at this point.

This is happening because, at least in part, the legal process is too slow, way too slow, and that forces the lawyer, who is a very well paid professional, to prepare the file several times. If the preparation of a file costs $2,000 every time, and if a trial is spread over seven years and requires reopening the file each time, then it's clear the costs are multiplied by seven.

One of the possible solutions would be to dematerialize the judicial process so that several of the processes could happen at the same time and that, once the file is dematerialized, it could then go ahead in the hands of several professionals simultaneously. This would multiply the efficiency of the administration of justice by a factor of ten. It's not the only solution, but it's one solution to be considered.

The only way to establish a data communications infrastructure on a major scale, in other words statewide, province wide or countrywide, is to establish data communication over an open network. The problem with open networks is of course that of...

The Chair: I'm sorry. Maybe I've sort of lost your train of thought, but I have a problem with your last comments. Yes, justice delayed is justice denied. There's no question about that. But how do you figure on communicating the data to the concerned parties? I think I missed the substance of your intervention.

Mr. Masse: That's because a distinction must be made between "open networks" and "open data". An open network is a means of communication, something like the telephone networks we all know so well. It's an open and efficient network. Set the question of openness aside for a moment: it's a means, a communications technology, that has nothing to do with the data being carried.

If an open network is set up without any care being given to the safety of the data, then the data will be as open as the network. The technology to solve that problem already exists and was mentioned - data encryption - not just to solve it, but to eliminate a risk that already exists for data contained in very important data banks.
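
To make that distinction concrete - and this is only a minimal modern sketch of the general principle Mr. Masse describes, not a description of any system discussed at these hearings - data can be encrypted before it ever touches the open network, so that the openness of the transport layer does not imply openness of the data. The sketch below uses Python with the third-party cryptography package; the record contents are purely illustrative.

```python
# A minimal sketch: encrypt a record before sending it over an open network,
# so interception on the network reveals only ciphertext.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# In practice the key would be exchanged or derived securely (for example via
# a public-key handshake); here it is generated locally for illustration only.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"patient-id=12345; allergy=penicillin"   # hypothetical sensitive data
ciphertext = cipher.encrypt(record)                 # safe to carry on an open network

# Only a holder of the key can recover the original record.
assert cipher.decrypt(ciphertext) == record
```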

I know that people quite naturally and quite spontaneously have concerns about privacy. I'm aware that these concerns could turn into a major obstacle to setting up open networks. So if people don't fully understand the stakes in the matter of open communication networks, then we may risk having to deal with objections that are mainly based on privacy considerations, however ill-founded they may be. These objections could prevent us from taking advantage of promising technology for all the bad reasons.

.1230

Basically, then, my conclusion is that we should encourage our lawmakers to consult the experts in the field - and there are many who have come before this committee today - so that whatever decisions are made are the right ones. We should benefit from what open networks promise while, of course, avoiding having to deal with major problems in the area of privacy. Madam Chair, those are my thoughts on the matter.

The Chair: I would like to thank you and I must admit that I will very carefully read the transcript of the debate. I will be giving due consideration to your presentation because I still have questions in that respect. We can get in touch at some point.

Mr. Masse: Madam Chair, I had the honour of giving a presentation in San Francisco at the end of January on the matter of protecting data on an open network. For people who'd like to read it, I can give you the URL right away; it's http://chait-amyot.ca/docs/pki.html. When you read the minutes of the committee, you'll be able to consult a document of some 50 pages on the website that will provide an answer to most of your questions.

The Chair: We thank you indeed.

[English]

I see two hands, sir.

[Translation]

Mr...

A voice: Paul-André Comeau.

The Chair: No, it's not Mr. Comeau.

[English]

I know Mr. Comeau is here. I've seen him.

Mr. Sunny Handa (Individual Presentation): Madam Chair, I work at the Faculty of Law at McGill University, where I teach courses in computers and the law, and in copyright and information technology.

I want to begin my address by saying that I agree with David Masse, who spoke about the new technologies we've heard a lot about today. I think this committee needs to further study and understand what some of these technologies are all about. The public key infrastructure that David talked about is a very important new technology that may provide solutions to many of the problems being raised, and it may forestall a need for legislation. I've written about the subject as well, and there are very few people in Canada currently conversant with this subject; there are more in the States. That's my first point.
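
As a purely illustrative aside - a minimal sketch of what public key cryptography does, not of the specific public key infrastructure Mr. Handa and Mr. Masse refer to - the core operations a PKI builds on are signing with a private key and verifying with the corresponding public key; certificate issuance and trust chains, which a real PKI adds, are omitted here. The sketch assumes Python's third-party cryptography package, and the document contents are hypothetical.

```python
# A minimal sketch of the sign/verify step underlying a public key
# infrastructure (PKI). Assumes the "cryptography" package.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.exceptions import InvalidSignature

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

document = b"consent form, version 1"  # hypothetical content

# The signer keeps the private key; anyone holding the public key can verify.
signature = private_key.sign(
    document,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

try:
    public_key.verify(
        signature,
        document,
        padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH),
        hashes.SHA256(),
    )
    print("signature valid: document unchanged and signed by the key holder")
except InvalidSignature:
    print("signature invalid: document altered or signed with a different key")
```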

The reason I wanted to speak today was that we've talked about the scope of a privacy right. We've talked about it in our group meetings and we heard about it from the experts, and we've also talked about solutions to the problems encountered with privacy. We've spoken about voluntary codes, about the technological support for the private sector in developing solutions to the problem. We've talked about education.

My point is that we've heard this all before. We've seen it in the Information Highway Advisory Council report and we've seen it in many other government reports. But none of them really addressed the scope or the nature of a privacy right.

We heard from Madame Prémont that this must be a balanced right, that it has to balance individual and collective interests. I'm not sure that's necessarily or overly helpful. Everything in Canada is a balanced right. There are no absolutes. We know that our charter of rights is balanced by its section 1. We all know that free speech can be balanced, and we've especially seen it in the context of Quebec.

So what is the scope of this privacy right that we keep talking about? This is a committee on human rights. Is it a human right, and what does it mean to be a human right? We've talked about the International Declaration of Human Rights. What does that mean in terms of the applicability of this in a court of law, for example? Is it optional, is it a voluntary right that may be subject to free riding, for example?

.1235

I don't think what the public necessarily wants is a legislative right that can be overridden in the public interest - we've seen the Quebec charter overridden. Should it become a federal charter right? We all know the difficulties that would be encountered in trying to pass it as a charter right.

Can someone on the committee comment on what they've heard so far across Canada in terms of how high this right should be? Where should the bar be set for a privacy right?

Thank you.

The Chair: First of all, I want to thank you very much. Valerie Steeves is going to respond to one part of your question.

I can tell you that in general, and I won't say it's unanimous across the country, there is a recognition that it's a fair balance between individual human rights - privacy being an individual human right - and the right of the collective to protect themselves in many ways, and to develop new means of looking after health issues, and all kinds of research developments. So there is that balance, precarious as it may be in some instances, between the individual and the collective right.

We've heard, almost unanimously across the land, that in the view of many, privacy is the first of the human rights. Then there are the balances that have to play out. It is only within very clearly defined parameters that you would abridge the individual's right to control, to have access to, and to determine where his or her personal information shall be placed and who shall have access to it.

You wanted to add something about that piece of research being done.

Ms Steeves: Yes. As a point of information, as I believe both Mr. Scott and Mrs. Finestone indicated earlier, the work of this parliamentary committee is in many ways complementary to other federal initiatives in the area.

One of the most positive of those, I think, is the federal government's pilot PKI project for internal matters within the federal government. The hope is that through developing this complex bank with appropriate safeguards and identification practices, they will be able to develop a model for PKI that they would be able to use in the private sector as well.

One of the strengths I've found, going across the country and watching these consultations and the work of this committee, is precisely that it does look at the issue as a human right. Industry Canada is looking at private sector involvement and practices, which is valuable, and Justice is looking at hate crimes and the application of those types of offensive-content rules to this new kind of communication technology.

The perspective of human rights allows us to see the balance between the competing social values. In order to really deal with this technology in an effective way, I think that's an essential part of the package.

I would certainly agree with Mrs. Finestone that the public has indicated there is a certain primacy to this type of inquiry. Although it's not an absolute right, in the sense that there is a balancing act, there's a real concern that we had better start doing this balancing act now, while we're proceeding with these other initiatives.

The Chair: Thank you. Andy.

Mr. Andy Scott: Thank you, Madam Chair.

The Chair: Excuse me, Andy. I'm going to ask each one of you, in response to that question - because I think it's a very fundamental question that was put to us - what, in your view, were the main things you heard, and how would you wish to answer that question?

Mr. Andy Scott: Well, I'd like to answer that question too, but I wanted to speak to another point that was made.

The Chair: Okay.

Mr. Andy Scott: To some extent I'm responding to the challenge Ms Lippman made to find even simpler citizens. Speaking on behalf of all of those, one of the things we've heard about all week has been around the question of education and the fact that no one knows what's going on. The citizenry is therefore in a weak position to look after its own interests, because they really don't know what those interests are. I've bought into that all week, listening to that, and hearing people speak of education, and realizing there is a need for people to arm themselves in their own defence.

Perhaps what's necessary is to turn this on its head and recognize - and I think it has been mentioned, but I have not been alert enough to recognize it until this moment - that in fact what we need to do is educate the people who do know about what's going on about what the wishes of the population are.

.1240

In reality, I think we do have a very strong sense that people feel they are being violated in some way. I don't know if they know specifically or technically how that's happening. They're not coming forward to say that with the intensity that they might because they're intimidated. They're worried about making fools of themselves, so I'm speaking from that perspective. The reality is that the public needs to feel the confidence to educate the people who do know what's going on so that their values are brought to bear on this.

Because ultimately, this isn't a technical question. Ultimately, this is a question of fundamental values. It may even be less technical than the questions of human rights. I'm sure most Canadians can't speak with a great deal of confidence even about what human rights are. It's more intuitive than that.

I believe that our obligation as legislators is to somehow reach into the collective wisdom of the country, the citizenry, to find out what it is that people believe their laws should reflect. This is one area where there's an urgency in that exercise because it's taking off on its own. It's taking off without the guards and the checks and balances that people believe are necessary.

It really isn't so much incumbent upon the citizenry to understand all this technical stuff, because they're not going to. It's incumbent upon all the people who know all this technical stuff to pay particular attention to the intuitive sense that Canadians will try to express about what it is they want known about themselves and what they want the country to be capable of doing.

I think that's what we have to get to. It's going to be very difficult, I accept that, and I'm not certain it's even achievable, but it certainly has to be the goal. I do believe that is where this enquiry was born. It was born of simpler citizens. I welcome the suggestion that we need to make sure to remember that. I'm particularly grateful for the fact that my head has been turned around with respect to who needs to be educated.

[Translation]

The Chair: You'd like to say something?

Ms Prémont: I'd like to answer Sunny's comment. We're from the same faculty and Sunny is completing his doctorate.

The Chair: I'd like to witness that exchange. It looks very interesting.

Ms Prémont: You'll see that we encourage a divergence of views. Sunny is completing his doctoral thesis at the faculty, and I think he may even have completed it already.

If I understood Sunny correctly, he's suggesting we shouldn't waste our time and should just skip over the stage of balancing diverging interests, because the arguments concerning fundamental rights already take that into consideration and therefore we should have no further concern with the matter.

I quite agree with the person preceding me who said that it's quite unacceptable to think that we should implement a given technology without knowing what its impact on society might be.

But we should also realize that this is a recursive loop. It's not just technology that has an impact on society. We have to examine how that society is structured, and the very structure of that society will have an influence on how technology is used and the impact it will have.

Sunny seemed to be saying that we don't have to be concerned with the divergence between individual rights versus collective rights. I fundamentally disagree on that, because the balance he's referring to often focuses on Charter arguments.

We know that, in principle, the Charter protects individual rights. However, if we analyze the historical effect of the Charter, we may notice some day that we missed the boat in certain respects when it comes to protecting the gains we have made as a community, gains which may not be strictly protected by the Charter. In other words, the Charter is not all-encompassing and isn't the only protection we have. We really must have a fundamental debate on other conflicting social values.

[English]

The Chair: I think a judge-made law that indicates the shared jurisdiction....

[Translation]

Pierrot.

.1245

Mr. Péladeau: To follow up on the picture our elected representatives have drawn of their cross-country trip and previous consultations, I'm quite happy to see that these elected representatives see themselves as lawmakers who will have to

[English]

tap into the common wisdom of the population.

[Translation]

That's what has to be done. In a few weeks, you won't be lawmakers listening to people anymore, you will have become people selling your respective visions of what Canadian society must be or whatever solutions you'll be suggesting for our problems. During the election campaign, which is also a campaign to sell visions, you'll be confronted with this question within your own parties and programs because you have been made aware. With that in mind, I hope you will all be reelected so that we can pursue this debate later on.

When we talk about society's accomplishments, that is one of them - a small one, but not negligible. It would be nice if, in the context of the campaign, you mentioned the interest your respective parties take in matters of privacy and the implications of technology. It would also be good, when you talk about specific solutions to social problems, to make sure that those solutions do not themselves pose a threat or a problem where these matters are concerned.

So I submit that you should go forth and spread the good word, and this confidence, within your respective teams. You will be confronted very, very concretely. All kinds of things are being discussed today concerning solutions to the problems we're facing, among them the protection of personal information and privacy, as well as the new technologies that will be at the heart of the solutions being proposed. I think you'll soon be facing the question of how to balance all this out.

The Chair: I think you're right. The industry and justice initiatives are coming, and the big corporations are very interested.

Mr. Maurice Bernier: Trying to summarize in a few minutes everything that went on this week is a feat in itself. I often say that in the area of human rights, just as in the area of handicapped persons - our committee is responsible for both - if there's anything we should be leery of, it's taking any given situation for granted. In other words, nothing is ever definitively settled in that area. Maybe that's good news for the lawyers or whoever may be involved in the debate, but chances are this will be going on for a long time yet.

When we talk about new technologies, the population in general shows a sort of fatalism. All kinds of new technologies are coming on the scene and the thought is that it's part of our daily lives. More often than not, our instant reaction is to let all this be imposed upon us without question, without wondering exactly what the goal of this new technology might be. There are always very good reasons for suggesting new technology to us whether it's economic efficiency or speeding up the flow of information or the fact that we'll know everything, that we'll have all the information on everybody and so forth.

Another thing that's taken for granted is the fact that when you have legislation or a monitoring organization it's as though, for many, the problem was almost if not already totally solved. That's dangerous. Just think about the example given by Ms Vallée. Mr. Comeau is here today and perhaps he could repeat it for us.

.1250

Even though we may have legislation and organizations, if we don't remain vigilant, it's always tempting for a government or a corporation to use new technology for all kinds of reasons.

I'm ending this brief and very enriching trip with the impression that, as Mr. Péladeau was saying before, we should continue the debate and find the tools and convince ourselves of how necessary it is to inform people and make them aware in order that they be more and more vigilant.

The Chair: Thank you very much, Mr. Bernier.

John and then Mr. Fortin and Mr. Comeau, I hope. We'll also have a few words from Mr. John Godfrey. Mr. Comeau, I hope you'll have a little time to spare for us.

[English]

Mr. John Godfrey: I was almost tempted to use the flip chart because I found the alternative was a wonderful collection of mixed metaphors, but I think I'll just go with the wonderful collection.

As I try to pull together what we've learned over the week, not just at this meeting, the first metaphor that comes to mind is of two moving targets. One is the rate of technological evolution of the things that we're studying, which is very fast. The other is the rate of evolution of both the awareness and the changing social values of our society, which is moving at a different speed.

Our challenge is not necessarily to somehow yoke the two speeds together, but to reduce the differential. That's where we act as some kind of change agents - you can see the metaphors going wild here - and that's the challenge. There are two different speeds and yet we have to do something if we're going to preserve a democracy with a sense of control and participation.

Because this is a very dynamic model with two different speeds of variables, if you like, part of our difficulty is that we're using still photography - that is, we capture in a still photograph the current situation of charters and other laws that are passed in other jurisdictions and we capture in a still photograph abuses that are current or.... This is the worst of all times in the sense of the complexity of data not being matched by its protection. But if we get hung up on trying to create a regime for the future that is based on the still photographs, we would ignore the dynamism of the two things we're worrying about, social values and technology.

The solution would seem to be a type of dynamic field theory, if I can put it this way, which recognizes that there is a whole set of interlocking mechanisms that have in some sense to be crowned by a charter of general principles that will apply to technological evolution, however it goes, and that will give some sense of eternal fundamental values so that we at least can get some sense of being ahead of the curve instead of always behind it.

Whether it's likely to be a constitutional principle or what place it has in the hierarchy of law is to be debated amongst us, but below that will come all of the other things that have to fall into place, like associational codes and, of course, provincial legislation and international legislation.

When you lay it out, it has to interconnect, make sense and be dynamic, in order to keep getting ahead of that technological curve. It has to have some kind of feedback mechanism that allows us to monitor constantly what it is that the public is concerned about and wants and understands the nature of the trade-offs they're willing to accept....

That's where I'm going to stop.

The Chair: Thank you very much, John.

I'd very much like to hear from Mr. Fortin and from Mr. Comeau.

I think what you've underscored is that although the goal posts may be moving apart much faster than we can catch up to them in the technological sense, there are still fundamental ethical and moral issues that reflect Canadian values. This is what we're really trying to get at.

Mr. Fortin, please, and then Mr. Comeau.

[Translation]

Dr. Jean-Paul Fortin (Régie d'assurance-maladie du Québec): My name is Jean-Paul Fortin, I'm a medical doctor specialized in public health and I'm a professor at the Département de médecine sociale et préventive at Laval University, in Quebec City. I work for the Direction régionale de santé publique de Québec. I'm also responsible for the research team that developed, implemented and evaluated the health card project in Rimouski.

.1255

First of all, I'd like to take this opportunity to point out how essential I find the matter of privacy. It's essential not only to prevent certain things from happening, but also to allow other things to happen.

I'll try to give you a little demonstration along the lines of the logic you used for our work here this morning. So I'll start off with a case history to emphasize a certain number of fundamental principles that guided our process and led us to a solution that seems to be accepted by, and acceptable to, all users - whether in the field, be they health professionals or the population at large, or at the level of the responsible organizations, such as the associations and professional federations of doctors, the pharmacies and the Commission d'accès à l'information - as well as all the people involved in the process.

I'd like you to see how, based on such a concern, things can be built.

The health card project was built for the purpose of improving the quality of health services and their users' health. The flow of information was only a means to an end which was completely different.

It is extremely important to look at things objectively and, in particular, to emphasize that this could not be used for monitoring purposes. The objective was to target the 95% or 96% of people who do not defraud the health system.

I have no problem with monitoring our systems, but we shouldn't forget the 95% or 96% of our population that is not defrauding the system. We were looking for an approach that would improve the population's health while, at the same time, decreasing the health care costs of the system.

We started from the premise that better practice can help to decrease health costs. We thought we could decrease costs through better practice and the condition to attain that objective was to have full knowledge of what I will call the nature of the beast.

What do I mean by the nature of the beast? A health system is going to work well if there's a relation of trust. When you see your doctor, you have to have a relation of trust with that doctor. You're the one who is going to decide what you'll tell him. Otherwise, the system won't work and you won't necessarily be enjoying optimal care.

There also has to be trust in the system between health professionals and also between health professionals and the State.

One of the major concerns of health professionals in the matter of this health card was whether the card would be used to monitor professional practices. As soon as the answer was found to be in the negative, the participation of both doctors and the population in general was unexpectedly high, and this participation led to rather impressive results. The doctors and pharmacists confirm that this serves to decrease drug interactions.

What does that mean for us? It means that we're touching on a factor that causes 16% to 18% of hospitalizations in Quebec. Maybe we could save money somewhere if we had a strategy leading to an improvement in the quality of services. For that, this trust has to exist.

As our work progressed, we verified that. Each time we afforded them comfort, we were playing the game. That's why, at the end of the day, we managed to find solutions that were both acceptable and useful.

That's why I'm saying that protection can be an asset to help us attain a goal leading to interesting conclusions.

How did we do it? First of all, we thought it was important. The population and the professionals confirmed that it was important.

Before we got the project underway, we'd be looked at with a little smile that said: "Just look at what purists those people are. Let's not go overboard with confidentiality". Five years ago, that's how it was in North America.

Today, in North America and in Europe, those who weren't concerned with that question, in our field, now have to backtrack and start their homework all over again. That reality does exist somewhere. So, that's what we believed in.

.1300

Secondly, we figured that the strategy to attain the goal wasn't only technological. The strategy has to get people to show trust. So people have to be informed; they have to be able to talk; there must be others to listen to them and there has to be dialogue between those building the project and those who are going to have to live with it.

So there was consultation. There was unbelievable transparency throughout the whole process. In the field, we gathered together ordinary people, health professionals as well as the professional organizations and the Commission d'accès à l'information at the central agency level. I should point out that at the outset our goal was to convince the access to information commission. We figured: "The day we manage to convince the Commission d'accès à l'information, that's when we'll have a trump card to convince the population that the project is acceptable".

Five years ago, people sort of smiled at us. They told us not to go to the Commission d'accès à l'information because we'd find more problems there than solutions but we figured that it was important for us. I can tell you today that, looking back at it, it was one of our more brilliant ideas. It's an asset that has helped us come up with solutions acceptable to the population.

So we were transparent. We spoke to people and we listened to them. We designed structures and established basic principles and procedures and, based on that, we managed to show some progress. The basic principle is the consent of the population. To our minds, the essential point for the individual is the ability to say no, the ability to refuse that information be made available.

Let's get back to your situation. If you wake up with a sexually transmitted disease tomorrow morning, you're going to want to choose the doctor you will want to share this information with. That has to continue. Otherwise, you start getting into a process where things are being hidden and this is even a danger where health care is concerned.

We absolutely have to go that way. Consent, together with the concept of the patient volunteering and controlling the information, is an extremely important element. To get a consensus, you have to take on the burden of proof. We have to manage to convince the population of that. The population must be well informed.

Our own citizens asked us if this would be used by employers and insurance companies. If that had been the case, they would not have participated. We had to build in this element. At the outset, we were even being asked whether the health system would be using this to audit our practice or the health care given. We answered that it would only be used for the management of operations.

In time, people went along and believed in it, and we found acceptable solutions. Today, our purpose is to propose a dual-purpose card, a microchip health insurance card that would start being used in January 1998. With that card, it would be possible to see whether a given individual is eligible for services. There's also the possibility of writing clinical information onto it.

However, it must be recognized that each purpose has its own characteristics and must be governed by its own rules. What does that mean in practice? You have the obligation to present your health insurance card to identify yourself, but you'll always have the right to give or withhold consent to clinical information being written onto it. The concepts of consent and voluntary participation will come into play. The purpose must be very well defined any time the information is looked at, and it must be possible for the patient to control this information.

Within the scope of that logic, coming in with a multi-service card tomorrow morning is unacceptable to us. I'm still trying to convince my mother that if she has health information on her card and a police officer stops her because she's driving too fast, then the police officer won't be able to read the information on that card. I still have not managed to convince her. So if we want those cards to be used for specific purposes, especially in the area of health where information is particularly sensitive, then in no way can it be accepted that the card could be used for any other purpose except the one it was intended for.
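
To illustrate the access rule Dr. Fortin describes - eligibility data usable for identification, clinical data released only with the patient's consent and only for a health care purpose - here is a purely hypothetical sketch. The field names, the single consent flag and the purpose labels are assumptions made for illustration; they are not the design of the Quebec card or of the Rimouski project.

```python
# Hypothetical sketch of a dual-purpose card record and its access rule:
# eligibility information serves identification, while clinical information
# is released only with patient consent and only for a health care purpose.
from dataclasses import dataclass, field

@dataclass
class HealthCard:
    holder_id: str                          # used to confirm eligibility for services
    eligible: bool
    clinical_notes: list = field(default_factory=list)
    consent_to_clinical_use: bool = False   # the patient can refuse at any time

    def read(self, purpose: str) -> dict:
        data = {"holder_id": self.holder_id, "eligible": self.eligible}
        # Clinical data is never released for a non-health purpose
        # (for example a traffic stop), and even then only with consent.
        if purpose == "health_care" and self.consent_to_clinical_use:
            data["clinical_notes"] = list(self.clinical_notes)
        return data

card = HealthCard("QC-000-000", eligible=True,
                  clinical_notes=["allergy: penicillin"])
print(card.read("health_care"))   # no consent yet: eligibility only
card.consent_to_clinical_use = True
print(card.read("health_care"))   # consent given: clinical notes included
print(card.read("traffic_stop"))  # other purposes never see clinical data
```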

The Chair: Mr. Fortin, I think you have answered many of the concerns that were expressed during our meetings in other Canadian cities.

.1305

The cross-referencing of data on a single identity card, despite all the protection in the world, was a concern for everyone. On the back of any Visa, American Express, or any other card, it says that the data can be cross-referenced and information lists can be drawn up using that information. People don't want all that information winding up in the same place and being so easily accessible anymore.

Your comments today have answered the concerns expressed in that respect. We had a discussion on the time, energy and money required to find the cheats. A medical card like yours, properly supported, ensuring the anonymity that's needed and requiring consent, is very important.

For the benefit of our colleagues and the people in the audience, I would like to point out that Mr. Fortin brought along the whole "gimmick" and you'll be able to use it and see how it could work.

Are there any further questions?

[English]

Mr. Sarkis Assadourian: I have one quick question.

You mentioned that a health card or any card should be used for a specific purpose only. How do you guarantee that? How do you guarantee that it's going to be used for that purpose only?

[Translation]

Dr. Fortin: The strategy which, in my opinion, might be used as a base for the possibility of guaranteeing... First of all, I'm not an expert in all of the aspects, but I know that the debate must be public, open and transparent and that people will have to know exactly what can be done and what cannot. I think that's an important element.

Rules and principles must be established and people or groups must be put in charge of some aspects of the process. So there are conditions to it. The tools that already exist must also be used for the purpose they were created for. Answers have to be available at the time questions are raised. That way, you will progressively create a consensus and people will tell us how far we can go with the tools available.

[English]

Mr. Sarkis Assadourian: The other point was to ask for a summary of what we learned this week. I think all three points we discussed - video, genetic testing, and the smart card - come together in one definition, if that's possible, or for one use. They are necessary evils in our society. It depends on how evil you want to be. You need the smart card to live. You need the video to live. One you use more than the other. Video can be in your house, but you can use the smart card or genetics for your health. But the question is, how are you going to control it and who are you going to trust?

When we were in Alberta, Madam Chair, you remember that everything was based on industry. That's because typically three groups will use this information in these cases: first, there's the government, of course, then industry, and then individuals. In all three cases, you have to have proper controls and make sure the information is used for the purpose it was gathered for.

That's the key question. We all need to use technology. We use it more today than we did last year. We're going to use it more next year than we did this year. But basically, it's an evil and we have to control it. We all need the same evil, but we have to control it.

That's the summary. I put it very briefly.

The Chair: I think I'm going to call on you again.

[Translation]

Dr. Fortin: Mr. Comeau could certainly tell you more than I could.

[English]

The Chair: If you're finished with your questions for Mr. Fortin, I'm going to ask Mr. Comeau to come to the microphone.

[Translation]

Dr. Fortin: I'd just like to add one little point. The end uses can be divided into specific areas. By that I mean that even in the area of health, use will vary according to level. For the relation between the patient and a professional, the name is important; at the planning level, names are not needed.

.1310

So according to the level and the use to be made of the information, rules can apply: you have access to a name or not, or to an identification number or not. As for the data banks, they must be put to very specific uses.

Presently, technology, especially in the case of the smart card, does allow for the establishment of anonymous data banks. The data is in the bank, but not the names. You can't make a link between the person and the information. Technology is slowly coming up with solutions, but it's important to realize that the technology has to remain part of a far broader context, a process in which people will be able to progress. The day we arrive at a committee consensus on a solution, we'll have made a lot of progress.
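
As a purely illustrative sketch of the "anonymous data bank" idea Dr. Fortin mentions - records stored without names and without a direct link back to the person - one common technique is to replace the identifier with a keyed pseudonym before a record enters the bank, with the key held outside the bank. The function name, key handling and record fields below are assumptions for illustration, not the actual design of the project discussed.

```python
# Hypothetical sketch: store records under a keyed pseudonym so the data bank
# holds no names and cannot, by itself, link the information back to a person.
import hmac
import hashlib

SECRET_KEY = b"kept-outside-the-data-bank"   # illustrative; managed separately

def pseudonym(health_number: str) -> str:
    # A keyed hash gives a stable but non-reversible identifier.
    return hmac.new(SECRET_KEY, health_number.encode(), hashlib.sha256).hexdigest()

# What planners would see: regions and diagnosis codes, but no names or card numbers.
anonymous_bank = {
    pseudonym("QC-123-456"): {"region": "Rimouski", "diagnosis_code": "E11"},
}
print(list(anonymous_bank.keys())[0][:16], "...")  # an opaque pseudonym only
```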

The Chair: Thank you very much.

Mr. Comeau, please.

[English]

Jean, I want to ask whether you want to make your observations on what you noted before Mr. Comeau speaks or after.

Ms Jean Augustine: I think I can make my comment right now and be brief.

I saw the demonstration out there and I was really impressed by it. Over and over, what we heard was a concern not so much about the primary use as about the secondary use: what can be done with the information after I've given it, and how can I retrieve it? Do I own it? Who owns it once I've given it?

I think it was important for a number of the people we heard from among the public. They have a right to say no, they don't want to give this information, and a right to have some control over it once it's given. So there was a lot of discussion about the power positions that sometimes seem to be involved in the collection of the information.

We also heard a good deal about the state versus economic and financial aspects, and private enterprises, etc. There was a difference between the comfort level of individuals giving information to the state for the general public good versus other kinds of information collection.

We heard a good deal about voluntary codes and about the rather loose legislation presently attached to various ministries - Justice, the Criminal Code, and so on. There was some concern about the mixture of information, about the U.S. litigation processes, and so forth. Over and over I got the message that people were looking for, not necessarily legislation, but some strong indications, guidelines or policy direction - some way in which we can control who owns what and who does what with the information collected. The onus should be on those who would make secondary use of the information, and there is the individual's right to the ownership of information that belongs to the individual.

The Chair: Thank you very much, Ms Augustine.

[Translation]

Mr. Comeau, I think you've shown up at the right time. For most, it was comforting to know that the power of control, access and so forth was in their own hands. We saw a card made by the new company, CANPASS, which had that kind of inscription. You not only have the right to decide to use it yourself, but also to grant the system access or not. Your fingerprint or palm print alone won't be enough to give access to data.

.1315

Is it possible that all this could find its way into the hands of each and every one of our fellow Canadians? As Mr. Fortin has already explained, it's very important, but I don't have that control yet and I don't know if it would be a good idea to have it. We are in the process of measuring and evaluating everything that's been expressed. Some will want to have a jury made up of ordinary people. I'd like to have your opinion on that as the commissioner for access to information. We've already had the pleasure of hearing your comments previously.

Mr. Paul-André Comeau (President, Commission d'accès à l'information du Québec): Thank you, Madam Chair. I'd like to apologize. I came early this morning, but I had to leave. I shall not answer the question raised by Mr. Bernier as the Sûreté du Québec has an investigation underway concerning the events that were in the news a few days ago.

The Chair: Everywhere in Canada, Mr. Comeau.

Mr. Comeau: There you are. However, I would like to elaborate a little on the debate and get back to what I consider to be GOHS - your good old horse sense, or the basics, in English. Two and a half years ago, I was at a seminar at the political science department of the University of Montreal where your colleague, Stéphane Dion, was also present. I shocked quite a number of academics when I said that we'd have to stop looking at technology for what it is in itself and look at it in conjunction with man.

My comment was based on a fact that seemed self-evident to me. At the time, the Internet was described as the place par excellence for conviviality, the most perfect means of access to culture, information and so forth, while the Internet was actually becoming a vast commercial gimmick. At that point, we were so busy buying into a certain number of clichés and myths surrounding the Internet that the major problems of security, identity vetting and confidentiality, which today are everyone's concern, were simply being forgotten.

I think that we have to go back to basics. Fifteen days ago I read Aristotle on the Internet. When you think of it, Aristotle did his writings a few thousand years ago. We have his writings and we still use them. Does technology mean that man has changed and that human nature will change? I think that we have to be very realistic and ask ourselves these questions before we determine the fundamental ramifications of technology.

We have to come back to something. You know, a few thousand years ago, a very simple commandment was invented: "Thou shalt not kill". We still have prisons and police forces in the world that deal with this issue. Human nature does not change. I do not see any major changes since humanity first began. Of course, there have been some refinements - there was the introduction of law, charters, etc. - but the fundamentals of human action, or human behaviour, remain the same. We must come back to this and not allow ourselves to get caught up in this whirlwind where technology is dictating how we should think. I am a little bit worried by this tendency to always think in terms of change. Our methods do change, of course, but the reasons and objectives underlying human activity do not vary.

I would like to make a very simple comparison. Today, messages and data are transmitted through a binary system, wireless or otherwise, by satellite or by some other means. What difference is there between this and the fact that, at one time in Africa, our ancestors, when they began walking upright, used the tam-tam to send the same kinds of messages? I think that we have to come back down to earth about this as well and really examine the issues in terms of man's requirements and motivations, and not in terms of the technology itself. Technology is a means.

This is all that I wanted to say. Unfortunately, this is all that I have to offer for reasons that you may guess. Thank you.

The Chair: Mr. Comeau, my analysis - and this does not mean that I am right - of what I have heard since last September is that, despite the new technologies, despite all the limitations or the validity of one thing or another, there are certain human values that form the very foundation of a society such as ours. These human values vary from one society to another around the world.

.1320

Here in Canada, we feel solidarity with our fellow citizens and we all have a mutual responsibility for one another's welfare. The same things bind us together. I'm not talking from a political perspective. I'm referring to the way that we feel and the way that we conduct ourselves.

With all of the problems that we are facing, should we not go beyond this and come up with a privacy charter, a charter which would indicate that, as human beings, we have certain rights to freedom and to self-control, and that we have the right to be autonomous? We have, if you like, the right to be present.

We are talking about self-control. That is why I began with this question. Are there any others who would like to ask Mr. Comeau some questions?

Mr. Maurice Bernier: We could ask Mr. Comeau so many questions or get involved in so many exchanges. However, I would simply like to tell him that earlier I wasn't asking him a question, but making a general comment. I'm in complete agreement with what he has just said. I share this point of view completely. The only thing that we must avoid is thinking that one day the situation will be resolved. Human beings being what they are... That is all.

The Chair: Mr. Comeau, I kept you for the end because we have a tremendous respect for the role that you play and for your vision of society. We already stated this in October or November, and nothing has changed since then. When Mr. Fortin tells us that you were consulted about the smart card, we feel safe. Thank you.

Mr. Comeau: Thank you, Ms Finestone. I would also like to thank Mr. Fortin, because we worked very well together.

The Chair: Does anyone else have anything to add?

I would like to thank you from the bottom of my heart for sharing information and for the time that you have given to us. I can assure you that the committee will take this very seriously, in the interest of our fellow citizens.

[English]

Its mild reluctance to come in here notwithstanding, I must also thank the business community for its enabling help and thoughtful direction. I think the business community has understood that its members are also individuals in society.

As part of human rights, individual rights are a serious consideration in all our undertakings. I think it is quite wonderful that we have ended up here, in my city, talking about human rights and privacy. I hope we will find some kind of answer that will be a charter or guideline for the future, because I don't think this is the end of the conversation. I think this is just the beginning, and the input of everyone right across this country has been enormously helpful.

Let me remind you once again that you can join us on the web site. You can also send us a brief or your views, which will be received with the greatest of pleasure. At the beginning of April, we will start drafting a first view of the thoughts and expressions of Canadians from across this country, and that report will, I hope, be tabled by the end of that month.

With that, this session is closed.
