EVIDENCE

[Recorded by Electronic Apparatus]

Thursday, March 6, 1997

.0919

[English]

The Chair: Good morning, ladies and gentlemen.

[Translation]

Good morning, everyone.

[English]

I see a quorum and I therefore call this meeting to order.

[Translation]

My name is Sheila Finestone and I am the Chair of the House of Commons' Standing Committee on Human Rights and the Status of Disabled Persons.

.0920

[English]

I want to welcome you all to the first of six public consultations that the Standing Committee on Human Rights and the Status of Persons with Disabilities will be holding across Canada. We'll be examining the state of our right to privacy in today's high-tech world.

It's quite fascinating that we are meeting today, as this past week has indicated that there is an ongoing and growing concern about an area that very few of us thought the public was sensitive to. I would bring to your attention, for when you go home, today's issue of The Globe and Mail, which talks about the potential for sale on the black market of confidential information on private citizens. There is a potential for abuse by people in the field.

Secondly, you will see "All-purpose ID might be in the cards", which was on the front page of The Gazette yesterday and continued with a full article on the back page, with all the credit cards. It gives us some idea of the extent of the concerns and speaks of the potential for one universal ID card and the implications for that.

[Translation]

The newspaper Le Devoir carried two articles recently dealing with Quebec's plans for a citizen's card and for a central registry on Quebec residents.

[English]

I would suggest that all of you here today are sharing in citizens' vigilance, concern, and heightened awareness about the rights to privacy, the implications of new technology, and just where we are going in this very modern and new society that's evolving right in front of our eyes.

Our committee decided that, given all we've heard from the round tables we held starting in September and all we've read about new information technologies and their practices, we want to know the implications for our personal lives, our privacy, and human rights.

When you look at the invasive evolving forms of new technology today, you can't help but ask, who's watching me, who knows what about me, and how much do they really need to know? Where is the balance between the competing social and economic interests, such as crime prevention, fraud, health care, and business interests, and what is our right to protect our individual privacy? Is there a need for an ethical framework and an obligation to ensure informed consent in the use of all these new technologies?

I suggest to you the right to privacy does not stem from any one source. It is drawn from international law, constitutional laws, federal and provincial legislation, judge-made law, professional codes of ethics, and guidelines. The result is often what is referred to, here in Canada anyway, as a patchwork of privacy protection in this country.

At the international level, several important human rights documents contain guarantees of the right to privacy. Look at the Universal Declaration of Human Rights of 1948. A Canadian, Mr. John Humphrey, was very much involved in the writing of that declaration. And we should remember that Canada had a very important place in the development of the Magna Carta of humankind and in the International Covenant on Civil and Political Rights of 1966, to which Canada is a signatory.

There is no comprehensive privacy protection currently in Canada. Quebec is the only place in North America where there is comprehensive regulation of private sector personal data practices.

In Europe, for example, the fair information principle of the European Union and of the OECD countries applies to all personal information, whatever the nature of its medium and whatever the form in which it is accessible, collected, held, used, and distributed by other persons. In Europe everyone has the right to respect for his or her private and family life, home, and correspondence.

There is no such express right in Canada, even though sections 7 and 8 of the Canadian Charter of Rights and Freedoms, which apply to search and seizure and the right to life, liberty, and security of the person, have been interpreted through the courts to apply to privacy. As I said, the exception is Quebec.

.0925

The concept that privacy is the most comprehensive of all personal rights is, I think, held around the world as a broad and ambitious right, a universal concept. The question is, should it be an inalienable right? Privacy is a core human value that goes to the very heart of preserving human dignity and autonomy. I think most of us would agree that the right to privacy is something that is of paramount importance in our lives.

[Translation]

Some experts define it as the right to have one's own space, to conduct private communications, not to be monitored, and to have one's physical integrity respected. For the average citizen, this is a question of power; the power that each person has over his or her own personal information. It is also the right to remain anonymous.

[English]

The question then becomes, what is privacy worth in today's high-tech society? There's no doubt that emerging technologies offer valuable advantages, efficiencies, and conveniences to all of us and can save money in government circles, but do the benefits offered by new technologies come with a privacy price tag? Is this price too high?

Is this trade-off inevitable? Where and how do we draw the line? Privacy is a precious resource. Once lost, whether intentionally or inadvertently, it can never be recaptured.

[Translation]

As members of the Standing Committee on Human Rights and the Status of Disabled Persons, we are resolute in taking the human rights approach to measure the negative and positive impacts of new technologies on our right to privacy.

[English]

So this human rights approach is fundamental to the exercise we are undertaking across Canada.

I suggest to you that Canadians have never approved of peeping Toms or unauthorized wire-tapping, and our criminal laws reflect this. Does the same disapproval extend, for example, to hidden video cameras in the workplace, to DNA data banks, or citizen identity cards?

In order to exchange ideas with Canadians on these issues, this committee is holding a series of broad-based town hall meetings, starting today in Ottawa. We will continue next week in Vancouver on Monday, Calgary on Tuesday, Toronto on Wednesday, Fredericton on Thursday, and Montreal on Friday. We have a very diverse group, which will be present at each one of these sessions.

To focus discussions at these town hall meetings, we have decided to look at three basic types of privacy-intrusive activities by using fictitious but reality-based case studies involving specific technologies: video monitoring, genetic testing, and smart cards. We hope to use these case studies to raise public awareness about the issues, risks, and benefits of advancing technologies and to initiate an open and frank debate about the promise and the peril to our privacy, as a human right, in this era of modern, evolving technologies.

In conclusion, while we do not expect to definitively resolve all of the issues raised by these scenarios, we do hope that with your input and that of Canadians from all sectors of society across this vast country, we will come away with some concrete recommendations for action in these areas.

To this end, we look forward to hearing your ideas, your concerns and proposals.

[Translation]

I would like to thank you all for being here today. Our proceedings will be guided by our consultation coordinator, Valerie Steeves, a human rights professor at the University of Ottawa.

[English]

Actually, you're in charge of a special section at the University of Ottawa.

Before I turn this meeting over to Valerie, I would like the members of this committee to present themselves.

.0930

I will start with the vice-chairmen of this committee, please.

Mr. Scott (Fredericton - York - Sunbury): Andy Scott, member of Parliament for Fredericton - York - Sunbury.

[Translation]

Mr. Bernier (Mégantic - Compton - Stanstead): My name is Maurice Bernier, and I am the member for Mégantic - Compton - Stanstead. I would like to welcome all the witnesses who came here today, in spite of the awful weather. Thank you for being here.

[English]

Mrs. Hayes (Port Moody - Coquitlam): Sharon Hayes, member of Parliament for Port Moody - Coquitlam.

Mr. Godfrey (Don Valley West): John Godfrey, member of Parliament for Don Valley West in Toronto.

Mr. MacLellan (Cape Breton - The Sydneys): Russell MacLellan, member of Parliament for Cape Breton - The Sydneys.

Ms Augustine (Etobicoke - Lakeshore): Jean Augustine, member of Parliament for Etobicoke - Lakeshore.

Mr. Assadourian (Don Valley North): Sarkis Assadourian, member of Parliament for Don Valley North.

The Chair: Thank you very much.

Valerie, would you like to take over, please?

Ms Valerie Steeves (Consultant to the Committee): Thank you, Mrs. Finestone.

I'd like to take this opportunity to welcome you to the parliamentary committee's public consultation on privacy rights and new technologies.

[Translation]

Welcome to the public hearings on privacy and new technologies. We would like to thank you for your participation. Your contribution to our work is extremely valuable.

[English]

Your participation today is particularly important because we're going to begin to explore the concept of privacy as a human right. We're all aware that new technologies are quickly changing our daily lives, but too often our discussions about technology fail to address the end result of this process. It's the committee's hope that this consultation will encourage an ongoing dialogue between Canadians about what those changes mean to us as a society.

We're hoping to explore two things here today. The first is: what does privacy mean to Canadians and how can we protect it in the face of new technologies? The second is: just how far is too far when it comes to trading off our privacy for the benefit of these new technologies?

In order to provide our discussions with some personal or social context, the committee is presenting you with three case studies. They deal with video surveillance, smart card technology, and genetic testing.

As you know, these case studies are stories that attempt to illustrate both the benefits and the detriments of these new technologies. We hope that through discussing the impact of these technologies on the personal lives of the individuals in the case studies, we'll begin to understand what privacy means to Canadians and how we as a society can seek to balance the benefits of these technologies with our underlying social values, including our commitment to privacy.

The consultation process itself consists of two parts. First, we'll be dividing into small groups to discuss the case studies. Each small group discussion will be facilitated by an expert in the field of privacy rights. Each small group will also include at least one member of the committee, who will be participating in the discussion with you.

Once we've had the opportunity to explore the case studies in our small groups, we'll reconvene the general meeting and have a town hall discussion about the issues they raise.

To start the town hall, we'll be asking the committee members to report back and summarize the major concerns that were discussed in their small groups. We'll then give the small group facilitators a chance to raise any comments or concerns of their own. Then we'll open the discussion to the floor.

We're looking forward to an open and free-flowing discussion between you, the participants, and the committee members about the meaning of privacy in a technological age.

I'm privileged to announce the five people who will be facilitating the small group discussions today.

I'll start with Laurence Kearley, who is sitting at the end of the table. Larry graduated from Osgoode Hall Law School in 1974 and he was called to the bar in 1975. Throughout his career, he has both taught and practised various forms of administrative law. In 1996 he obtained his master's degree in international law from the University of Ottawa, specializing in the protection of privacy and in technology issues. Larry has written numerous articles and papers on privacy issues and he's currently co-writing a book on privacy and technology.

Kate White has an extensive specialized background in the area of risk assessment, risk communication, and the use of the Internet in electronic diplomacy.

.0935

Through her values-based company, Black and White Communications Inc., her leading-edge initiatives in the field of communications encompass things such as electronic democracy, women in development issues, bringing the marginalized to the Internet, and human rights.

Kate is also the president of the Centre for Open Media Studies International, an electronic communications think tank associated with Carleton University.

Kate publishes in both academic and popular journals and speaks on issues of human rights, women in development, electronic democracy, risk, and risk perception.

Next is Pierrôt Péladeau.

[Translation]

At present, Pierrôt is research fellow at Montreal's Clinical Research Institute. In that capacity, he participates in research projects on the ethical, legal and social problems associated with the use of genetic information and with the computerization of health information on Canadians. He is also Vice-President for research and development at Progesta Communications Inc., a company specializing in the management of personal information. To date, the company has helped more than 500 organizations implement personal data protection programs. Pierrôt is also Chief Editor of Privacy Files, a professional journal that focuses on all the issues raised by the use of personal information.

[English]

Next is Geoffrey Gurd. He has a PhD in communications from the Université de Montréal. He currently teaches in the department of communications at the University of Ottawa. Prior to that, he taught at a university in Pittsburgh for four years before returning to Canada.

His research interests include questioning identity in the technological age, the rhetoric of cyberspace culture, new management theories, and discourses about the future of work and technology.

Maybe you haven't noticed, but your name tags have been colour-coded. Each participant has been randomly assigned to a small discussion group.

If you have a yellow name tag, then you'll be participating in a discussion with Kate White. If you have a blue name tag, then your discussion will be led by Larry Kearley. If you have a green name tag, then you'll be meeting with Pierrôt Péladeau. If you have a beige name tag, you'll be meeting with Geoffrey Gurd. If you have a pink name tag, you'll be meeting with me.

When you do get into your small groups, your facilitator will start by asking the group which of the three case studies you would like to start with. Our time together is very short, and I'd really like to remind you that the case studies are really to provide you with a starting point for your discussions. Feel free to spend as much or as little time as you want on each of the case studies. Also feel free to draw linkages between the three different fact situations and to voice your concerns about how the new technologies impact on your sense of personal privacy.

Does anyone have any questions before we start?

We will be reconvening for the town hall at 11 a.m. sharp, so the sooner we get into our small groups, the more time we will have to dedicate to our discussions.

I should be available throughout the morning if you do have questions.

The Chair: Ladies and gentlemen, I just want to advise you that this is an official meeting of the standing committee. The meeting will be suspended and then reconvened at 11 a.m. sharp for the second half of the town hall session.

.0939

.1104

The Chair: I see we have a quorum, so I will reconvene the meeting of the standing committee.

I believe you know that these meetings are being televised across the country. Just in case you didn't know, I want you to know. I want informed consent. Not the small groups, because

[Translation]

according to the Standing Orders of the House of Commons, we cannot bring people together into small groups like this without applying very formal procedures.

.1105

[English]

So the small groups will not be televised per se, although there will have been a picture of the fact that we have moved into small groups and are now back in a plenary session.

The second phase of the town hall will commence. I would like to ask the animators if, at the end of the session, they could stay for about 20 minutes. We'd like to do an evaluation before we start our cross-country tour. Thank you very much.

Madame.

Ms Steeves: Thank you, Mrs. Finestone.

To start off the town hall I'd like to ask the MPs who participated in small group discussions to give us a brief summary of the two or three major points raised in their discussions.

I'll start with Mrs. Hayes.

[Translation]

Mr. Bernier: How much time do we have, Madam Chair?

The Chair: About three minutes, or more if necessary.

[English]

Mrs. Hayes: Thank you, Madam Chair.

I enjoyed the discussion in our group. We did have time to look specifically at two of the case studies. We first went to the smart card scenario and then moved to the genetic testing case study.

I took reams of notes, actually, page after page, because it seemed a lot of ideas had to be brought together.

Generally speaking, I think we went beyond the case studies in the sense of the linkages made...[Technical Difficulty - Editor]

One concept that recurred was the issue of trust - who you trust in the handling of information, whether industry or government. Even the volunteer sector was suggested as being part of the number of groups that would be interested in information on individuals. Where was that level of trust, and who should have that?

.1110

In terms of the limits of control of information, as information is passed from one point to another, whether it be genetics or smart card, at what level are controls possible? What levels are safe, and what levels go beyond the privacy issues? That was a question that was brought forward.

I question also the different rules, perhaps, of these different sectors. Do the same rules apply to government as to private sector as to volunteer? Certainly, then, how practical is enforcement within each of these sectors?

There was a paradigm mentioned that I should put forward - I thought it was quite interesting - on the balance between risk management and the potentiality. "Potentiality" is actually information that in itself is not dangerous, it's just information that's there. But depending on who has it and what level it goes to, the risk management of those different levels is something government has to look at.

Finally, on whether legislation is necessary, I think there was a feeling that legislation should be looked at. There was also a strong feeling expressed that this legislation, if it were to come, should be very specific, not broad-based. I think there was some discussion and disagreement there. So that's something that needs to be pinned down.

Certainly, that's an issue that has to be decided, whether it should be very specific - again, in terms of genetics - to, for instance, cloning or whatever, or it should be more broad-based on the whole issue. I don't know if there was agreement, but certainly that's an issue that has to be discussed.

Thank you.

The Chair: Thank you very much.

Ms Steeves: Mr. Scott, perhaps you could give us a brief summary of the discussion in your group.

Mr. Scott: I'm sure the cameras will find me, which seems somehow topical, doesn't it.

Mr. Godfrey: As long as they don't see through you.

Voices: Oh, oh!

Mr. Scott: And he's on my side.

Our group's sentiments might be summarized this way: it's not too early to be discussing this, and perhaps it's too late.

We touched on the scenarios, but only touched on them. In reality, all participated with personal experience and perhaps some second-hand experience in terms of the impact of these technologies. In fact, I think it's safe to say our group was unanimously fearful that we've gone too far and we need quickly to respond.

Some of the highlights of the fear had to do with the fact that very often the most vulnerable communities, those least able to resist this invasion of privacy, probably get hit first. Perhaps they are dependent on the state and therefore can't resist the information collection. If they do resist it, they no longer qualify for some benefit or another.

Given the emphasis we've put on informed consent, who are we leaving behind when a quarter of the population doesn't read? What is the psychological impact of the quantification of our identities?

Basically, there was no resistance to the view that there's an urgent need to attend to the human rights values that are not being attended to currently in a momentum that is headed in the other direction, chasing technology. We need to act urgently. We need to get the public more educated and probably realize more of what invasion of privacy exists.

.1115

We would recommend that perhaps a better balance might be struck in the context of the discussion itself. We were without the other side. Some of us struggled to advocate on behalf of the devil, but we would have preferred that he'd been there.

Thank you.

Ms Steeves: Thank you, Mr. Scott.

Mr. Bernier.

[Translation]

Mr. Bernier: The discussions in our workshop were very lively indeed, as they probably were in the other workshops as well. A great many ideas arose from the discussions. If I were to summarize the main idea into one sentence, I would say that the new technologies will have to allow for and adapt to our right to privacy, and not the other way around.

As the first speaker said, what we are trying to avoid is having to make a choice between privacy and new technology. Everyone in the workshop agreed on that. It's not a question of making concessions. The right to privacy must be respected, but we must also be open to new technology. That is the main idea that arose from our discussions.

Whenever new technologies are developed or applied, we have to see why they are being used and what their purpose is. What are we trying to do with them? Why are they there? If the answers are clear at the outset, we can then determine which technologies we must use, and how far we can go with them.

And as the participants of other workshops pointed out, we have realized that it is a question of balance among a number of factors: the need to protect privacy, for greater or less security, for more information, for more rapid or complete access to information, and - very popular at all levels of government nowadays - for reducing the cost of services provided. Ontario's penitentiary system was given as an example; prison wardens are being replaced by closed-circuit surveillance cameras. Costs have gone down, but many other problems have arisen. So I would say that we must always try to find a balance between privacy and other needs.

We could perhaps come back to several other points later, but here is the last observation that came out of our workshop. Governments, both federal and provincial - let's not forget jurisdictional issues, which apply in this area too - must act quickly and decisively to control the application of new technologies through more statutes and regulations. The entire workshop agreed that more legislation was necessary.

The Chair: Thank you.

[English]

Ms Steeves: Thank you very much.

Mr. Godfrey.

Mr. Godfrey: I have the tough one because of course I have to wait and justify myself in front of Valerie Steeves, who is our facilitator.

We touched on all three topics. Perhaps we were a little heavy on the closed-circuit television off the top, but there were in fact some specific themes and some general themes. What is obviously specific to closed-circuit television - and not to any of the others - is the sense of physical space, the idea that issues like place matter enormously in a way that they obviously don't for the other two examples. There was a sense of what the reasonable expectations for the workplace are versus being outside on a street; versus being in an apartment building, where you might want security in a garage; versus your own home. Those are issues that I think are unique to television cameras, and they have to be addressed.

What is common is the notion of expectation; that is to say, whether it's going to the hospital or whether it's for fraud, what would a person reasonably expect the primary purpose of this activity to be? In other words, if it's something you didn't anticipate - such as it being sold to a commercial outfit - that is obviously a major concern.

.1120

A third theme that is common to our three cases is the notion of commercialization, and it seems to be the place where things are going too far. Particularly in the case of closed-circuit television, with the individual's suicide bid being broadcast on television, there was no public good to it. It was strictly for commercial purposes.

Commercialization is something that comes through in all the cases, and it leads to more profound questions, such as who owns and controls the technology? It's obviously easier, in some sense, to control public technology than to control private technology.

All of this led in the first case to general agreement - and this is true of all the cases - for clear principles and ground rules.

Fifth, the citizen should have some reasonable expectation of privacy, and there should be a burden of proof on the violator of the privacy to show there was a reasonable public or common good. The citizen should have some recourse when people do go too far. That recourse should absolutely be there and should make people nervous about doing things they ought not to, just as wire-tapping legislation should at least make people nervous.

This session ended with an observation by our facilitator and others that there are some analogies we can use that might be helpful. Examples are the whole notion of unreasonable search and seizure and the fact that tort law may be helpful and worth pursuing as an example of a way of defending ourselves.

Under the genetic heading, the thing that came through first of all was the whole notion of finding things out by accident, which is also common to closed-circuit television. I'm speaking here of finding out by accident that someone has a disease. Again, there are analogies with what we already have for blood testing - finding out by accident that somebody may have HIV and then deciding what to do with that information.

The group felt fairly strongly that one should never find things out by accident. There should always be a clear sense of what you're looking for and what the reasonable expectation of the person should be.

As well, there was this notion of the right to know and the right not to know, which is specific to the medical situation. Specific to the case of DNA testing is the fact that right now it's a relatively small group of people who by accident find themselves potentially having their rights violated. Presumably this group will grow bigger as the technology becomes more commonplace, but they may need some special protection right now, because by accident their privacy is vulnerable.

A general theme that came from all of them was this overriding theme of the public good versus the whole sense of private harm.

Finally, on the smart cards, common themes again are commercialization and a great concern about the erosion of privacy through things such as the record of spending patterns. One of the answers seems to be that the perfect defence to an invasion by technology is a technological defence, such as encryption of smart cards, but it was strongly pointed out in our group that if left to their own devices, commercial enterprises such as Mondex and so on have no interest in preserving anonymity. So there has to be some kind of guarantee or protection against the commercial invasion of privacy, because it will not happen automatically if left to commercial interests.

The whole issue of primary versus secondary purposes for which information is collected, the whole issue of informed consent for data linking so that you know it might happen to you, and the whole issue of reverse onus and burden of proof came back.

A fundamental distinction between data matching and the other two is that there is no possibility of an accident that might benefit the person or society. That is, it's not like spotting an accident on closed-circuit television or inadvertently finding a gene that may alert you to some problem you have. Data matching doesn't have that quality to it. It's quite specific. It's just plain trolling. When you try to put things together, you're just looking for trouble.

.1125

Finally, to pull it together, David Lyon, who drove four hours from Kingston, summarized it this way. The big common theme for all three technologies is an obsession with risk management, a drive to try to reduce risk, whether it's in the field of health, security, or financial fraud. It's trying to make society more predictable, but the problem is that it's based on lots of false assumptions - about the likelihood of somebody coming down with a disease just because their genes indicate a predisposition, for example. It ignores not only the human factor but human rights in the most profound sense. The onus should always be on the protection of the individual in society.

Ms Steeves: Thank you very much, Mr. Godfrey.

I'll now turn it over to Ms Augustine.

Ms Augustine: Thank you.

Our facilitator was Geoffrey Gurd and our group had a lively discussion. We started with the topic of genetic testing and did touch on all of the various case studies and the various topics of the case studies presented.

We began with the statement, which carried through our entire discussion, that the technology is benign. By the end of the session, I think we agreed that maybe it was more neutral than benign. I'm not too sure whether we did agree on whether it was benign or neutral, or what those two words precisely mean in terms of how the technology is used, who uses the knowledge, and what future use is made of that knowledge.

We went on to say that it is the knowledge that presents a dilemma for us. Who owns the information about us? I think some of us in the group were surprised to know that we are not owners of ourselves or of our body parts, etc.

We went on to talk about what is in the law, what legislation we have at present that does protect what protocols there are. For example, Bill C-104, which is recent legislation, has a protocol that calls for the destruction of whatever has been collected on an individual if the person being investigated is not the individual who has committed whatever the offence is. We talked about that for a while in the discussion around the DNA and the collection of fluid and hair and all the pieces and how that operates.

We did move to individual versus collective. We said it was important that aggregate information be collected in order to draw conclusions. Someone alluded to David Foot's Boom, Bust and Echo; if that information had not been collected at the time, the aggregate data that allows us to make projections and predictions would not have been available.

We went on to discuss the cost of surveillance versus public safety. I think a couple of the other presenters did mention that in some way. One of the individuals in our group was very familiar with the issue of real time monitoring, such as the control and access to buildings, etc. We talked about the social and ethical values and rules there should be or could be for manufacturers of the system versus the managers of the system. In our view, the manufacturer of the system or those who build those systems are not operating with the same set of ethical rules or codes as the managers of the system. That's where the question of control comes in.

.1130

We spent a good deal of time on the whole issue of control - who controls, under what circumstances, and where does information get stored? We mentioned in passing that in Manitoba, for example, there is a central registry for cancer patients. What's the trade-off between the research value of this and the concern for general societal uses?

We went back to the issue of ownership. It kept going back and forth throughout the discussion. One individual said that when they define ownership, they view it from the perspective that if they give, they therefore are able to take back. If they can give and take back, it means they own the data about themselves.

That brought us into the business of perishable and non-perishable; the decisions made on verification of systems; accurate and current information; whether extrapolations are made from information that maybe is not current or accurate; and the upsides and downsides of all of this for abuses and malpractices, etc.

We went back to the issue of informed consent and asked: if you did sign something giving consent to do this, is that giving carte blanche, and how do we ensure that informed consent is in some way regulated?

Finally, we ended with the role or the pull of industry and industry policies - voluntary codes and self-regulation - and felt that a self-regulatory, voluntary approach is preferable. At the same time, it's important that government has a role in that regard and therefore sets regulatory minimum standards.

We spent some time on education of the public. We feel it is public education and consumer knowledge that will give us the kind of public that would respond to invasion, or to the new technology, and its infringement on a whole series of issues. So we agreed that we should educate the public, that the public has rights. They need to know to protect themselves.

Finally, we went on to speak about the need to formulate a legislative framework to govern relationships between the individual and commercial interests. In Canada, with the exception of Quebec, it was said that privacy laws are about 10 years behind Scandinavian countries.

Ms Steeves: Thank you very much.

Mr. MacLellan, I know you were participating in Mr. Scott's discussion group. Do you have any comments you would like to add?

Mr. MacLellan: No, but when we have our meeting afterwards I'll make some comments. I think Mr. Scott handled it very well.

Ms Steeves: Mr. Assadourian, would you care to add any comments?

Mr. Assadourian: Thank you very much.

I think my colleague, Sharon Hayes, did a good job. I would also like to thank Kate. She did a fantastic job.

I have one overall concern on this issue of privacy and smart cards or genetic testing. A few months ago we passed in the House of Commons a bill for DNA. It took us half an hour, I think, to pass the bill. I think it's the fastest we ever passed a bill, if I recall correctly. I think we should take time to reflect on what we're doing now.

I'm not saying we should not pass a bill to control or to influence the smart card or the genetic or video procedures we do here in the country, but we have to study it. My concern is that this field is changing so fast, just like computers, that what was correct six months ago may not be correct today.

That's the overall concern I have. I'm saying this because we have to pass a bill, but we have to make sure we do our homework before we pass it. It's very sensitive. I think it's a very important issue.

The final point I want to emphasize, as it was reported, is that it's a matter of trust. Who do we trust - the government, the industry or the citizen? That has to be decided, I think, at the end of the day.

Thank you.

Ms Steeves: Thank you very much.

We're going to take a few moments to see if the facilitators have any brief comments they'd like to add before we open the discussion to the floor.

Perhaps, Kate, I can start with you.

.1135

Ms Kate White (President, Black & White Communications Inc.): First of all, I think our comments have been somewhat addressed already, so I'll try to move through any additional comments really quickly.

Much to my surprise, we had a fair amount of consensus in our group even though we had some wide-ranging opinions. We recognized the social cooperative schemes at play in any of these issues. That meant recognizing that employment was affected and industry was infected - affected, rather; a little Freudian slip there - but our notions of capitalism and democracy were affected as well in these issues we were speaking about today, and these were of rare importance.

It was mentioned by my colleague just now, and also earlier, that the soul of the issue does seem to be one of trust. Who do we trust to make these decisions? Ultimately, the last part of what we dealt with was how to handle them. We talked about legislation, about enforcement and regulation, of putting resources to find out when our human rights in these issues were breached, and who had a responsibility there.

In the final analysis, there was a sense that legislation was imperative - now. Time is absolutely of the essence, and we should act. I would also say there was a sense that it shouldn't be some grand, sweeping legislation but rather focused, specific legislation in cooperation with other social activities. Education was certainly one we touched on. This went hand in hand with legislation.

I think the other thing we recognized off the top was that there are enormous benefits to technology, and a potential for great good. It was from this that we also recognized lots of individual and collective fears around this monolithic sense of understanding technology and how it could be used.

This notion of potentiality has come out a lot, and I think it's important. One of the things we looked at is the notion that if I was tested as having a predisposition for breast cancer, maybe the state wouldn't have helped me get my way through university and I wouldn't be here now. I don't know whether I have a predisposition, but it's an important point to consider.

I also thought of Albert Einstein, who did his very best work at 26. Maybe we shouldn't have schooled him, either.

So there's a sense of fearfulness. I think by articulating this fearfulness we then move forward in understanding how both the state and other civil society organizations have a role in this.

Thank you.

Ms Steeves: Thanks a lot, Kate.

Larry, do you have any comments you'd like to add?

Mr. Laurence Kearley (Senior Legal Adviser, Documentation, Information and Research, Immigration and Refugee Board): Mr. Scott covered very well everything we discussed. I'd like to emphasize a few points, though, mostly because they're near and dear to my heart.

We weren't very sure whether we were representative, in our group, of the population at large, because we agreed on just about everything. We'd like to think that everybody out there would agree with us, but we're not so sure about that.

I found it to be a very emotional experience, in a sense. During the course of the hour and a half we ranged from despair to cynicism to optimism about the situation as it is now. I'm not sure where we ended up. I think we were cautiously optimistic that maybe it really wasn't too late, but I'm not sure we're convinced of that.

One of the issues that seemed to really take a fair amount of our time was consent, informed or otherwise, and whether we really have the ability to consent to the collection of the data that goes into these data banks. If you want the things society is offering, do you just have to consent, and should we think more about that?

In terms of public education and awareness, we were sure, of course, that if everyone could hear the horror stories we were swapping, everyone would agree with what we were saying. I think we do believe that. If the average person among the general public - relatives, colleagues - became more aware... Lord knows my family and friends are sick of hearing me preach, but I'm sure they change their opinions when they do find out what's going on.

The question we started off with and maybe ended up with, too - and this is similar to the cloning issue everyone is racing to deal with now, whether we should clone sheep, monkeys, humans - is that just because we can do something, does that mean we should do it?

.1140

We need a public discussion, we said, of the values at stake. Do we, as a society, want to use an available technology, or do we prefer to give up some of the economies and efficiencies for the human value and right of privacy? That's something that really needs to be discussed.

A parliamentary committee, and Parliament itself, is a perfect place to discuss these competing values. We felt fairly strongly about that. Thank you.

Ms Steeves: Thank you, Larry.

Pierrôt.

[Translation]

Mr. Pierrôt Péladeau (Progesta Communications Inc.): Essentially, I have nothing to add to the points already raised. I will therefore not extend the meeting needlessly.

There will perhaps be other forums where I can express my personal views, but to my mind, I'm here as a resource person. I will just raise one important point that has not been mentioned. Participation - at least in our workshop - was both lively and well thought out, everyone showed great respect for the opinions of others, and this should be highlighted. That was all that I wanted to add, Madam Chair.

The Chair: Thank you.

[English]

Ms Steeves: Thank you, Pierrôt.

Geoffrey, do you have any comments to add?

Mr. Geoffrey Gurd (Department of Communications, University of Ottawa): Yes. First of all, I want you all to notice that Val selected facilitators with beards.

Ms White: Excuse me.

Some hon. members: Oh, oh!

Mr. Gurd: The males.

Ms Steeves: I would like to make it clear that she does not have a genetic predisposition for a beard.

Some hon. members: Oh, oh!

The Chair: It's a different hormonal structure.

Mr. Gurd: Gene summarized our meeting in quite a bit of detail. I found a couple of things to be intriguing about it, besides the fact that we didn't have enough time.

First, we realized that the regulation of crime and criminal elements in society is fairly elaborate and detailed, but when we come to the commercial part of our lives, there's not as much there.

I had a sense that we wanted to look at that a little bit more. We should at least bounce it out a bit more, because one of the undertones of our discussion - I hear this from the other reports as well - is that both technology and commercial imperatives are driving these questions and making them very problematic for us.

Otherwise, I want to stop there and hear more from the floor, shortly.

Ms Steeves: David Lyon, unfortunately, was delayed from joining us because of the snowstorm.

David is the head of the sociology department at Queen's University. He has researched and written extensively on the social impact of information and communications technologies.

David, do you have any comments you'd like to add before we move to the floor?

Professor David Lyon (Department of Sociology, Queen's University): I was late, and I apologize for that.

Certain things did strike me about the case studies both in the group discussion and the discussion that has been happening here this morning.

It seems to me that it's important to move away from the sense that this is a kind of residual personal or individual matter. We talk about privacy, and it immediately seems to be just an individual thing.

I think we're rather talking about a question that is a social one. We're asking questions about what kind of society we want, rather than just residual individual privacy.

One of my comments was already quoted to the effect that what we're looking at is a situation in which there is a huge drive toward risk management. Risk management has become the crucial feature in so many of these situations, and certainly in all three of the case studies we were looking at.

This involves the identification, and all the paraphernalia thereof, of attempts to predict, and therefore to pre-empt or prevent, certain kinds of things from happening. This is usually, of course, with very plausible reasons given in specific cases.

What that leads to is this desire for surveillance, this obsession with garnering as much personal data as possible within machines that are geared to discrimination and creating categories of persons. Whether those are the commercial ones or the genetic disposition ones or the propensity to fraudulent activity, whatever they are, we're creating categories of persons that don't necessarily represent the actual persons described within those data categories.

Against that huge power, in all these three examples, we have talk about regulation, law, the extension of privacy law, and so on.

.1145

It seems to me that there is a difficulty here, because privacy law, as we have it in Canada, very much puts the onus on the data subject to protect herself or himself. That seems problematic to me if the question we're talking about is a social issue. What kind of a society do we want?

That's why occasions like today's are very important indeed. We need rather urgent, concerted action and attempts to raise awareness about what is going on.

Informed consent is meaningless if most people don't know how their data is being gathered. It's irrelevant to the question. We're talking about a much bigger kind of issue, it seems to me. Therefore, an informed ethical and policy debate with a view to trying to have the protection of persons built into the systems that we establish, it seems to me, is exceedingly apropos.

Ms Steeves: Thank you very much.

Ladies and gentlemen, we'll ask for your comments and concerns to be added to this discussion. I would ask you, for the purposes of assisting the person who is going to be preparing the transcript, to state your name before you make your comments.

Dr. John Williams (Director, Ethics, Canadian Medical Association): My name is John Williams.

I was particularly interested in the genetic testing issue. It raised one question about the whole issue of privacy that I think cuts across all three issues. There are different problems with regard to privacy in at least three areas: gathering information, accessing information, and using information.

Genetic testing raises particular problems in the latter two.

With access to information, it could be argued that people other than the individual concerned have a right to that information because it's also information about themselves. These are questions that have to be explored.

With regard to the use of information, there are real problems here, as illustrated in the case study about potential discrimination.

So I think when the committee is studying privacy, it has to decide how far its extent will reach. Will it concentrate on simply the privacy of information concerning the individual in the gathering and perhaps the access to it, or will it go downstream a little bit to look into the ways in which discrimination must be prevented? This is particularly with regard to genetics, because obviously, this provides a great potential for discrimination against individuals.

Does this mean that, in addition to some sort of privacy legislation or framework, the current legislation on human rights, in forbidding certain types of discrimination, needs to be tightened up? For example, there may be a specific provision within even the charter such that discrimination on the basis of genetic status should be included among the other categories of reasons for not discriminating.

Ms Steeves: Thank you very much.

Mr. Ken Rubin (Individual Presentation): I'm Ken Rubin.

I have perhaps a concern, challenge, or question for the committee, Madam Chairman.

I heard reference to the fact that Parliament spent half an hour on the DNA bill. I see we're spending a little more time today on discussing some of the issues in more of an abstract form. Then you're going across the country.

But I'd like to know this. Say the committee, before they write a report, is going to investigate and call as witnesses those who are currently planning things in secret, such as a national ID card in Human Resources or with the provinces. Are you going to call the head of the Communications Security Establishment agency and the national defence persons responsible for the spy satellites? Are you going to call the banks, who have certain modern technology - I think the committee should hear about this - to invade our privacy? Are you going to revisit the DNA bill now that Parliament seems to have had some second thoughts, at least on the electronic monitoring aspects?

.1150

I think it's a serious question. It's one thing to hear discussion.

My first presentation on privacy in front of a parliamentary committee was in 1978. Yes, the technology has changed, but I would hope that the committee would call the appropriate witnesses, grill them, and come to some understanding, because that's where the public education has to take place.

Thank you.

The Chair: Thank you very much, Mr. Rubin. We all know you on this committee for your extensive writings. They are very thoughtful, I might say. Certainly they were an incentive to make sure you were included here to help us examine some really interesting, troubling, and challenging issues - there are no clear conclusions yet - before us.

I can suggest to you that I can't say exactly where we will go when we finish this report. This report will hopefully be tabled during the spring. We will see what findings we come up with, and certainly your observations will be very key.

In fact, this committee, following this meeting, will be doing an evaluation of what we heard and of any changes we'd like to see in the process. We'll also take into account some of the material issues that were brought to our attention. When we come back, we will certainly take into consideration your observations. We'll decide whether we will act in that regard at that time.

I think we should hear the next intervener and certainly allow the panel to respond to the observations made by Mr. Williams, who was in the group I was monitoring.

Mr. Peter Brandon (President, Sysnovators Limited): Thank you, Madam Chair. My name is Peter Brandon.

I'd like to echo David Lyon's comment. I made the same point in the discussions in our committee that I think there are two concepts that are important perhaps to the way the committee may wish to frame the whole issue of privacy and the interrelationship between the privacy of the individual and of society in general.

One concept is viewing information that we generally associate with privacy as potentiality. Think about it as information about oneself. It could be obtained through genetic testing or through a sensor, for example, that exists at an intersection. Say your automobile has a chip in it. As you go through the intersection, the fact that you're going through the intersection is recorded.

There are all of these types of information. Say you've been taped in a public or private place. That is a piece of information, in whatever medium or format, that represents a potentiality.

Take genetic code. The results of a genetic test may indicate that you have the potential for breast cancer or an increased potential for heart disease.

By passing through the intersection, this may indicate that you have the potential of cheating on your wife, because you should not be there at that particular time.

It's the potentiality. It's not realized until somebody actually takes that piece of information and does something with it. So we have the concept of potentiality. It may be helpful to frame the whole concept of privacy as being something to do with our data sphere, our metadata, our information about ourselves, and to view that as a potentiality.

As David Lyon very aptly suggested, our society, at all levels, whether we're talking about government, the volunteer sector, or merchants' business, is obsessed with risk management. Risk management is a fundamental fixture of society today.

.1155

Really, we could frame the whole issue of privacy in terms of the relationship between the potentiality that information about ourselves represents and risk management in society. If I'm an insurance company, I will manage and reduce my risk if I know more about my policy holders. Therefore, I have the potential of performing better, more successful risk management if I know more about the potentiality of the people around me.

Society presumably is safer if one collects information, via video cameras placed in public places, about potentialities of someone committing an offence such as breaking into a car.

I think it's important that the debates and the discussions and the thinking of this committee be framed around the fundamental forces, the fundamental dynamics, between privacy as potentiality and risk management in society. We're talking about the whole range of societal effects, primary and secondary, that arise out of that relationship between potentiality and risk management.

I think it's very important that we not get too hung up on technology and that we not make too many assumptions about smart cards, for example. Think about it. When you cross the border using a smart card and information about your crossing is recorded, that is fundamentally no different from filling out a card at the border and the card being taken and punched into a computer and turned into a record in a database. Fundamentally there's no difference; the only difference is the efficiency with which that particular act is accomplished.

The technology really provides an efficiency-enhancing thing as opposed to changing the paradigm. It doesn't change anything. It simply makes the act of turning what you do into a database record faster, more efficient, cheaper, and therefore easier to do for a government agency or for the private individual or the private organization. So I think it's important that we not get hung up on technology.

The other thing is that there is a fundamental assumption that if we create one smart card, all the information will be held there. That's not the case. We may put in place things like escrow agencies that hold various pieces of the information about ourselves, with the smart card merely containing an index to that particular information and a way to obtain permission to get that information.

The bottom line is that I think we shouldn't get hung up on technology. We should not regulate technology, but really we should talk about risk management versus potentiality. Thank you.

Ms Steeves: Thank you very much, Mr. Brandon.

Would anyone on the committee like to respond to any of the comments?

Mr. Gurd: Yes. I think we should be worried about technology even though it's a moving target and even though under things like copyright law we can't seem to nail it down.

There's a book called Connections, by two U.S. authors who have looked a lot at electronic mail in organizations. The key part of their book is that they look at the primary and secondary effects of technologies. The primary effect of technologies is that we expect machines to make things happen faster, more efficiently, at less cost, etc. Those are things we intend to reap benefits from within a short period of time.

What we can't predict and what we can't plan for is what are called the secondary effects of technologies, the secondary effects of things like the use of the Internet and the development of virtual communities and what the virtual communities do or don't do with themselves. Technology does change paradigms, but we don't see it right away. Sometimes these things happen over a longer period of time and we can be asleep while it's happening. I think we need to think about technology in terms of primary intentional effects and also secondary, indirect, long-term effects, because I think those happen as well. They're just harder to identify so we tend not to talk about them.

.1200

Ms Kathleen Connors (President, National Federation of Nurses' Unions): My name is Kathleen Connors and I'm here on behalf of the National Federation of Nurses' Unions.

I'm glad there are some women coming to the mikes, because it is International Women's Week.

As I began looking at the issues that were going to be examined as part of this consultation process, to me the question became who is controlling and has access to and benefits from the information we're gathering in an increasing manner? I'm really pleased that we've moved beyond the individual aspect of the issue to the collective, because I come from a collective. I work always promoting the collective, and I think it is incumbent...

If I can throw out another challenge to the committee and to the Government of Canada, it's that Canadians elect politicians to work in the collective interests of us in the constituency. So I think it's very important not only that you examine the issues that have been percolating in this country and across the world for many years, but we also count on and call upon you to act on it. Many European countries already have legislation putting in place protection for the collective, and we really would hope that this is the sort of move we have here.

The issue of trust was raised, and while there are differing amounts of trust of what government can and should do or what business can and should do or what other groups can and should do, we still live in a democracy. We still hope there is integrity of the democratic process, and we would hope that the best interests of the collective are taken into consideration.

As a nurse, as someone who advocates for those who can't for themselves, I'm counting on you to advocate for us.

Much of the information I provide about myself I want protected. In this area of risk management, I'm proud to say that we've come from a very public health care system. It's a system of social safety nets that has an increasing number of holes, but because that public protection has been there in health care, protection has to be there around this sort of issue as well. So I throw that challenge to you.

As a collective we don't want a report gathering dust. We want some legislation with some teeth and funding for the enforcement to protect the collective. Thank you.

Ms Steeves: Thank you for your comments.

Mr. Assadourian, would you like to respond to that?

Mr. Assadourian: No, I was going to make a comment overall when they finish.

Ms Linda Rheaume (Director of Special Projects, Civil Liberties Association, National Capital Region): My name is Linda Rheaume. I'm with the Civil Liberties Association. My president couldn't dig himself out, and the reason women are here is because they take the bus.

I appreciate the philosophical discussions we've had, but I am a very practical kind of person. I have here the ears of federal MPs, and I'm going to direct myself to those issues that I think could be rectified very quickly, hopefully by the people who are sitting here today.

First of all, the length of time for turnaround on inquiries for information about oneself is far too long. It seems to me that Mr. Phillips does not have the proper clout to deal with those civil servants who are malingering, or whatever. I believe three years to bring forward material is far too long, and I believe it would be a revenue generator to fine those civil servants who do not meet the deadlines.

I was told by a gentleman today that the RCMP has a 30-day turnaround, never mind whether it's overseas or whatever. It's quite possible that they have other things more important to do and that we might want to revisit that 30-day deadline for the RCMP.

I thank the chair for setting up this kind of forum so that I could hear my colleagues on these issues.

.1205

When government contracts out, as they are doing, the contracts do not necessarily contain the restriction that the federal legislation must be adhered to with regard to the information. Alternatively, if a person, say, five years later, wants the information that's been gathered by that firm on behalf of the government, they may find it's not accessible. I believe that should be written into the contract. Given the cost factors and so on, maybe you couldn't say in perpetuity, but there should be something.

I have two other things that have to do with individuals.

It's my understanding that if you're a federal servant and you have a voice mailbox, your employer can dip into it and listen to what appointments you might have with your doctor, your mistress, or whomever. Unless you're off on leave or sick, I don't believe your personal voice mailbox should be accessed by your employer.

I also want to bring to your attention the fact that Sun Life Assurance Company has the contract for disability insurance for federal civil servants. They also have the contract for the medical plans. Therefore they know whether a person is taking their medication or not. I believe there should be, as there were before, two different firms: one that has the disability contract and another that handles the personal information about your medication.

Thank you.

Ms Steeves: Thank you very much for those comments.

The Chair: It sounds to me as though you'd like to build a fire wall for that information. Is that what you're saying, with Sun Life?

Ms Rheaume: Yes.

The Chair: Thank you very much.

Ms Steeves: Mr. Assadourian.

Mr. Assadourian: I notice we have a very impressive list of witnesses. So far we've heard from witnesses that we provide information to, but we haven't heard witnesses from the RCMP, for example, and others who use this information, such as the Canadian Association of Chiefs of Police, the privacy commissioner, and the justice department. If there's a chance we could hear them before we conclude the day, I would appreciate that.

Ms Steeves: Actually there was a delegate from the RCMP around. There he is.

That's a challenge to stand up and get in line at the mike, I guess.

The Chair: Actually he was a very active participant in the round table.

Mr. Assadourian: Can we get him...?

The Chair: Mr. Assadourian, that's an interesting and important observation, for which I thank you. We will review the invitation list, because these sectors of our own structure were definitely to be with us, and if they're not going to be here, they will be found in Toronto or Montreal. If we're still short, we will call them before the committee when we get back.

Mr. Assadourian: Thank you.

Ms Steeves: Ms Moll.

Ms Marita Moll (Head of Research and Technology, Canadian Teachers' Federation): My name is Marita Moll. I'm here with the Canadian Teachers' Federation.

I live in Ottawa and I've been in this building several times before. This morning when I came in, I went through three levels of security. I was shocked and appalled, almost to the point of tears, to think this is what has happened to my society, this is what has happened to my community. I was very pleased to hear David Lyon saying that these privacy, surveillance, and monitoring issues are issues of society and how we deal with the different tensions within our society.

The issues of privacy and technology are moving very quickly, ladies and gentlemen. We've been talking about this for two or three years at the parliamentary level. All of these things are questions of resources.

The allocation of resources is a question of priorities. I would suggest to you that this is a very top priority and that resources be allocated to deal with this question and to put together whatever legislation Canadians, as citizens, decide they want. That is not being done right now.

I have a specific question. Two years ago now, the Information Highway Advisory Council suggested that certain legislation be put in place. Has anything been done about that? Can anyone tell me where we are on those particular issues?

The Chair: Nancy or Susan, would you tell us as much as you can share, please?

Ms Susan Alter (Committee Researcher): I think we have somebody in the audience. I don't want to put Stephanie Perrin on the spot, but she could probably bring you up to date very aptly on that, because she is with Industry and I think she's tracking what's been happening in that milieu.

The Chair: Stephanie, would you join us at the table, please?

If anyone's been involved in the whole question of privacy and the private sector, they will know Stephanie has been the driving force behind Industry Canada and the justice department taking a very important look at the Canadian Standards Association's model code and the issue of business and privacy.

Stephanie, please.

.1210

Ms Stephanie Perrin (Special Policy Adviser, Long-Range Planning and Analysis, Department of Industry): Thank you.

As you may know, a commitment was made in the response to the Information Highway Advisory Council's report to government on what was required to protect citizens. There was a recommendation from that private sector council that privacy legislation be tabled and that it be based on the Canadian national standard developed, through the Canadian Standards Association, for privacy - that is, the CSA model code.

The response to that on May 23 of last year was that indeed Justice and Industry ministers would return to Parliament with a plan to protect privacy. At the moment, we are anticipating releasing a consultation paper discussing that. I can't give you a date at the moment, unfortunately. We are in discussions with our provincial colleagues in preparation for an information highway ministers' conference, in which privacy is one of the key elements on the agenda. We are basically working through what the provincial response to this will be. As you may know, legislation for privacy in Canada is inherently a split jurisdiction.

So that's what's happening, in a nutshell. I'd be happy to answer questions.

The Chair: In light of the urgency that was outlined in the question placed before you, I think you could say that it is within the next year. I believe the minister did indicate that by 1998 something would be done. Is that accurate?

Ms Perrin: Minister Rock made the commitment in September at the privacy commissioners' conference that it would be on the books by 2000. We all know how long that takes, so I think you can work that back. We are hoping to get that consultation paper out soon so that we can get going in earnest on the kind of debate that has been called for around the table about what legislation ought to look like.

Ms Moll: As a quick supplemental, Madam Chair, I'm wondering why we haven't had access to that. I would like to have known before the session started that there was some information that perhaps would have made a difference to our discussions.

The Chair: I think I tried to frame the discussion, as did the papers that went out to you, that our approach is that of human rights and the implication of privacy on human rights. Industry and Justice are looking at commercial rights.

Ms Steeves: David Lyon.

Mr. Lyon: I'm slightly out of sync here. I was responding to the potentialities and collectivities comments.

The question of potentiality is so important because those potentialities, sometimes referred to as data shadows or data images, may not relate directly to actual situations or to who we think or know we are, yet they have very real consequences for the way management, coordination, and control occur in very many areas. That's why the potentialities are so important even though they may not reflect accurately what we think the actual situations are.

The way they relate to the collectivities is that we believe we live in a participatory and democratic society, where mutual trust is assured, because we deal with each other as people who have disclosed things to each other within relationships of trust, things appropriate to each situation. That's why it's quite different from a residual question of privacy. It's a social question.

So the potentialities and collectivities comments, it seems to me, go very well together.

Ms Steeves: Thank you very much.

Mr. Long.

Mr. Murray Long (Individual Presentation): I'm a self-employed privacy adviser. I also came in on the bus this morning. I'm not sure what that says about me, or how my bank or insurance company may choose to use that information.

The Chair: It's not just a matter of genes or hormones?

Mr. Long: I've been videotaped, too, so it may get passed back to my bank. Who knows?

I want to comment on two things. One is the whole question of genetic testing, which, from my point of view, is not just a privacy issue but also a very fundamental ethical issue.

.1215

The fact that this is predictive medical research, as I think David Lyon mentioned, doesn't necessarily state that someone's going to do something, or going to have something. It simply states you have a precursor or some kind of indicator that something might happen to you. This information has to be governed with a set of ethical guidelines that ensures it can't be used for frivolous purposes.

We all know that in the marketplace, insurance companies are very risk-averse, as are many other economic institutions. The fact that this information can be gathered, indicating that someone stands outside the norm, not that they're going to have a problem, only that they might have a problem... Those kinds of theories about who's going to have a problem, or those risk factors, end up being used against people.

The major problem there is that an institution is going to make decisions based on their own best interests. It's very hard for the individual to challenge that and to win. It's very important that the use of genetic research be governed by a set of strong, ethical guidelines that prevent frivolous uses. I think some of the economic spin-off uses end up being rather frivolous, and deserve a very broad social debate.

The second point I want to mention is that we do need to have framework legislation. It's absolutely imperative. It's not a panacea, but along with public education, it'll go a long way.

The other point is that this framework legislation can't simply be imposed by the federal government. So much of the business activity in this country is not governed by federal jurisdiction but resides under the purview of the provinces. It's therefore very important that the provinces work together, in concert with the federal government, to establish harmonized framework legislation that will apply across all economic boundaries and govern every aspect of our economic and personal lives.

A good example of that is the Internet. You never know where that data is travelling. You could be dealing with a data centre in Dayton, Ohio, or L.A., or it could be based in Moncton. We need, therefore, at least inside Canada, some overall framework legislation that ensures we don't have data havens and ensures we have some common standard so that every citizen knows their personal information is governed by a common set of guidelines.

Thank you.

Ms Steeves: Thank you very much.

Mr. Evert Hoogers (National Union Representative, Canadian Union of Postal Workers): Thank you.

I'm a worker from a very automated and monitored environment, the post office. As a result of that, I have personal knowledge of how intrusive electronic monitoring and CCTV technology can be. I've had that knowledge for many years. I've spoken about it for many years and I've worked within my union on it for many years. I also have a little bit of knowledge about how, if workers have a difficulty and they band together, they are able to do something about it. For example, my union was able to get rid of closed-circuit television as an investigative device in the post office 10 years ago.

I'm kind of fortunate, because I'm a worker who has a strong union that's prepared to fight on these issues. I'm fortunate also because I work for a federally regulated agency. The Privacy Act does, to some extent, cover us. But what concerns me is that there are millions of workers out there who every day are suffering intrusions into their personal privacy that become potentially greater with each new technology.

What I'm about to say really tells me a little bit about why control of technology is so important. Workers from these agencies way back in 1982 told the Labour Canada task force on microtechnology and employment that closed monitoring of workers was a gross intrusion of privacy and ought to be prohibited. This is now 1997. In 15 years, absolutely nothing has been done.

Government agencies recognized there was a problem then. Everybody here today recognizes there's a problem. We're talking about it, having very interesting discussions indeed. I'm glad to hear there are some developments. Perhaps by 2000 something may be in place around other areas.

.1220

I appeal to this committee to recognize that you as a government committee not only must make it clear where you stand on the issue of invasion of privacy in the workplace but you also must insist that legislation be developed very quickly to ensure that this does not continue. Lots can be done. It's not a question of problems being so complex and so difficult and so philosophically difficult to deal with that we can't do anything.

So there is much that can be done. I urge this committee to ensure that it happens.

Ms Steeves: Thank you very much for those comments.

We have approximately eight minutes left in the consultation process. Mr. Péladeau has some comments to make, as does Professor Crelinsten. I'll then ask if there are any more comments from the floor or from the committee members.

[Translation]

Mr. Péladeau: I had asked to speak towards the end of the meeting, so that I could give everyone a chance to express their opinions. I am here to speak as an expert. I would like to raise certain points after the discussion.

This committee focuses on human rights. The use of personal information is not associated with only a single human right - the right to privacy. At the University of Ottawa, I had a chance to browse through a book entitled Human Rights Thesaurus. I was looking at all the possible uses of personal information, and came across 150 human rights concepts that could be affected. I'm talking about fundamental areas such as the right to health, the right to education, the right to vote, and the right to move about freely.

What I am giving you here is a very personal view. Privacy is perhaps our window onto these issues, but it is a fairly narrow window. We have to look at all the human rights issues raised by the use of personal information, including data collection and surveillance. That brings us back to what we were discussing before, the balance between individual rights and social, or community, rights.

And there is another issue at stake here. I study projects in the field: for example, I am working on one concerning the computerization of health data, and I am examining some 40 computerization projects. These projects are aimed at computerizing and networking all health data, so that it is available from all parts of the world. When we talk about the privacy issue, the first thing that springs to people's mind is protecting privacy and personal information. But when we go a bit deeper, we see that this is not a privacy issue, it is a social issue. It's an issue of values. We see that 15% of projects have failed, essentially because of issues involving powers, conflicting values or ethical conflicts. This is much broader.

In conclusion, I would say that there is a very great deal at stake and that this implies very broad questions of values and powers. We must approach this with a clear eye to the future. Today's society is committed to technological development and we have to ask where we want to go in the future, what kind of society we want to build and how technology can serve us.

To maintain this perspective, certain specialized agencies have an important role to play, including the Privacy Commissioner and the Information Highway Advisory Council. In the final analysis, beyond all this technocratic expertise - I'm a technocrat and an expert - one of the best approaches would be, in my opinion, a forum such as a parliamentary forum, where citizens have easier access to this expertise, where they can express themselves more easily and where broad discussions can be held on all the issues involved.

Those are the comments I wanted to make. Thank you.

[English]

Ms Steeves: Thank you very much.

Professor Crelinsten.

Professor Ronald Crelinsten (Faculty of Criminology, University of Ottawa): Thank you. I want to pick up on the last comment as well, and to commend the committee. The concept of risk management has been raised, but you are looking at risk to human rights. I think this is a central issue. This is one of the major risks.

A quick point about Europe and counter-terrorism. A large database in Wiesbaden, Germany, was built to collect data on potential terrorists. It is now being used to track refugees.

.1225

The concept of function creep, in the third case study I think, is a crucial one and it highlights the big difficulty in policy making. We're talking about an election this year. You have to make policy very slowly, usually incrementally, and this goes for function creep as well. These things play out over a long time.

Someone said technology doesn't matter, but it does change, and I'm glad Geoffrey Gurd said that technologies do change, though you don't know for a long time. Paradigms take a long time to change. How do you develop policy when you don't know what's happening down the road?

One particular case was dated 2004, but I read it and I say it's happened already. Your scenarios have happened already; they are not the future. That's why we are all telling you it's urgent to do something now.

In the range of policy instruments you have available, you have to deal with the private sector because we live in an atmosphere of privatization, especially in criminal justice. We've seen that. How do you get the private sector to self-regulate? Do we need sanctions for misuse of information? Do we create fire walls, as the chair mentioned? All of these have to be done together.

I wish you the best of luck. It's very hard to plan the future when you're working in a short timeframe. Thank you.

The Chair: There are two more interveners.

Ms Marnie McCall (Director, Policy Research, Consumers' Association of Canada): My name is Marnie McCall. I'm the director of policy research for the Consumers' Association of Canada.

I was glad to hear Mr. Rubin and Mr. Hoogers both up here. In Mr. Rubin's case, I think he said the first time he was here was 1978; Mr. Hoogers talked about a recommendation from 1982. The Consumers' Association first made a recommendation to the federal government in 1973. This is a long time - it is a very long time. Very little has happened.

The Privacy Act was a step forward. As we've heard in all of the groups, I think, the privacy commissioner only has advisory power, not enforcement and sanctioning power. I think people feel we've had enough. We've had 12 or 13 years now of the Privacy Commissioner. I think we can say now that the advisory power is not sufficient and that his power should be increased.

The Consumers' Association has been calling for adoption of the Canadian Standards Association model code as the minimum standard, either sectorally or in framework legislation. We support framework legislation. To be very practical about it, I think our legislation is going to be driven by international trade with the EU. We are going to have to comply with the EC directive, so we might as well get on with it.

I guess what I really want to stand here and say is that it's very important; it's crucial to consumers. I'm with the cautiously optimistic view that it's not too late, but I think if we wait very much longer things will be completely beyond our ability to regain them.

On a personal note, I would like to know how I get my driver's licence number out of the computer downstairs, because I also have a problem with the way security is done in this very building. I think that's something you folks ought to be aware of. You probably don't have to go through the system because you work here.

In case you're not aware, you come in downstairs to do parliamentary business. You go through a metal detector, as with the airlines. You put your briefcase, your purse, your coat, etc., through the X-ray machine. Then you go to the desk and you have to show ID, and it has to be punched into the computer. I certainly don't object to showing my driver's licence and saying that I approximately resemble the picture on it so I can go upstairs to this committee. I do not like the fact that my driver's licence number lives in the House of Commons security computer.

I come back to the point of the woman who said if I can give my information and I can get it back, then I feel I own my information. I've given my information here; I don't think I can get it back. I really don't think it's germane to a safety issue how many times I come here. You can see my face on my driver's licence. You can see that I look somewhat like that and my name is on a list. I think that's sufficient.

.1230

But that's not the main point here today. We do need privacy framework legislation and we need it soon. Let's not be pushed into it by the EC; let's think it out and do it ourselves.

The Chair: I want to thank you for that information for my members of Parliament. We will be doing a site visit of the House of Commons later on today, after Question Period. It will be of interest to note your observations. We'll take a look at what you've told us with respect to the computerization of your driver's licence and any other issues that might come to our attention at 3:30 p.m. Thank you very much.

We have one last intervener and then there are a couple of comments, because we have obligations to close this session.

[Translation]

Mr. André Thouin (Departmental Privacy and Access to Information Coordinator, Royal Canadian Mounted Police): My name is André Thouin and I work for the RCMP.

An interest was expressed in the involvement of law enforcement agencies. Ms Rheaume mentioned a 30-day time limit. Let me point out that this period is provided for by law and applies to everyone. It causes serious problems for institutions of the magnitude of the RCMP because it is difficult to answer the large number of requests received.

The discussions held this morning regarding the installation of video cameras in public places captured my attention. I wonder if that's really the problem we're trying to solve. We talked about a social problem. We have many concerns regarding security. If cameras are installed everywhere, it is because the public feels insecure. So the issues that have been raised involve more or less, at least for a law enforcement agency, the tools with which it can work. From a police standpoint, the key is to have the tools we need to work in a context that will allow us to protect the public interest by enforcing the law and fighting crime.

The solution lies in proper regulation and proper control of the tools. Another very important point is consistency: police information must be used for police purposes and must be controlled and regulated. We will thus avoid grey areas, abuses and mishaps.

The Chair: Thank you very much for your presentation. You will certainly have support in terms of security. But we're wondering if security has increased. That's one question that arose after the Jeep burst into our corridors.

What do you do with the videos you have filmed? Do you keep them? Do you have the right to use them? How do you dispose of this information and how do you classify it? How can we find our way through all that?

[English]

If I go too fast are you going to send me a ticket?

[Translation]

Mr. Thouin: That's what I'm wondering also. We're talking about control and regulation of their use. That use has to be compatible with the purposes for which the information was gathered, and the information must be used to better protect people and to eliminate such situations.

The Chair: Your work is done under some accountability framework, isn't it?

Mr. Thouin: That's right.

The Chair: Thank you. We should be relieved to know that such a framework exists. We will be examining this issue this afternoon.

[English]

Ms Steeves: Stephanie, did you have any comments to add?

Ms Perrin: I was just going to comment that there seems to be considerable interest in possible legislation. If we have permission to get the list of people here, we could send you the consultation paper when it appears. Otherwise you could slip me your cards, but it would be easier if I could just send it out to everyone on the list.

.1235

As was indicated earlier, there are definitely things that can be done, but it is not a trivial matter to figure out how to craft this legislation and put in place the policies of implementation that will ensure things actually get done. So we need all the help we can get.

The Chair: Perhaps we might use negative optioning here, which I have some discomfort with, but notwithstanding, each of you has heard the invitation from Industry Canada as put forward by the director general. I believe I'm right. Oh, well, I've just made you ADM.

Some hon. members: Oh, oh!

The Chair: Only those of us who are on the inside understand how silly all this conversation is anyway.

In the negative option, if you don't want to receive the consultation paper or the white paper, please advise the clerk. Otherwise you will be put on the list. Informed consent is being requested. Thank you.

I believe a number of members wish to make a few comments, but first our panel ought to be asked if they have some final comments.

Did you want the members to make their comments first?

Ms Steeves: I believe Mrs. Hayes has some comments and Mr. Bernier. No?

Mrs. Hayes.

Mrs. Hayes: My mind boggles with what I've heard. Certainly some of the concepts have been very challenging.

As legislators, we have to keep in mind that there's consistency in the right of privacy and how it applies not only to the three sectors we've suggested but particularly to how government deals with Canadian citizens.

When we talk about the collective rights and the potentiality, what I would like the committee to think about, if we move to privacy legislation, is whether or not that is consistent with our treatment, say in criminal legislation, of things such as the public knowledge of potential harm or criminal activity and what our responsibility is in criminal legislation.

It was mentioned that in Europe there are privacy rights for family life, home, and correspondence. That takes us into government's legislative rights within the family. That would be property rights, discipline of children, and parent versus child rights.

Previously we talked about employment equity and so on, and within that we have commercial viability and commercial rights to business plans, compared to government legislation and regulation of that.

All of these things are privacy, and it all deals with how government interacts with different parts of society. So these are also issues.

The Chair: Thank you very much. That's a big, long discussion that this committee will get into when we get to writing our report.

Russell, did you want to say something?

Mr. MacLellan: Not right now.

The Chair: Andy? Okay.

I would just like to remind everyone here, as I introduce our clerk to you, that anyone who has any further information, questions, or observations they'd like to share can send them to the clerk and they will become part of the committee's record and part of the considerations we will take into account when we start to draft our report.

I do want you to meet the clerk of the committee. He's Mr. Wayne Cole.

You've already met our coordinator, Valerie.

I do this all the time. I know them all so well, I forget their names.

Nancy Holmes, Susan Alter, and Bill Young handle research, policy, and all kinds of things. They're from the Library of Parliament. I can tell you they're very key to the undertaking we have before us.

I want to take this particular moment to thank every one of you here. Those of you who have participated have come a long distance in a snowstorm. We're not supposed to have snowstorms now. I do appreciate the input. On behalf of my colleagues, I can tell you we have found a broadening of the perspective and the scope of the undertaking.

.1240

Please don't expect miracles to happen tomorrow. That would be most unwise, unfounded, and unrealistic. We have been given some very strong direction and some very good food for thought that will be taken into account as we move across the country and hear from the regional differences in our land. That will be quite interesting, because the mix is quite a good mix.

So we'll go along with listening to risk management, potentiality, human rights, and primary and secondary effects, which is a summation of what I essentially heard.

John, do you want to comment? Did I do that all right? Thank you very much.

The meeting is suspended to the call of the chair. Thank you.

We will now meet in camera, please.

[Proceedings continue in camera]
