[Recorded by Electronic Apparatus]
Wednesday, March 12, 1997
[English]
The Chair (Ms Sheila Finestone (Mount Royal, Lib.)): Welcome.
[Translation]
I want to welcome you all to Toronto. We are holding today our fourth hearing to determine the impact of new technologies on the right to privacy. We are examining which aspects of the new technologies are enhancing our privacy and which ones are diminishing it.
[English]
I'm glad to see Ann Cavoukian here, because she participated in one of the four round tables we held in September, October, and November with respect to what's happening with privacy rights in the light of new technology. Is the human rights dimension taken into account?
Following these round tables, so many different issues were raised that the committee determined we had better narrow this sphere down and look at the implications of this new technology. What is happening in our personal lives to our privacy and human rights?
When you look at the invasive, evolving forms of new technology today, you can't help but ask - and this is what we heard - who's watching us? Who's watching me? Where is the balance? How much do they really have to know about me?
There must be a balance between competing social and economic interests - such as crime and fraud prevention, less costly health care, and business marketing practices - and our right to protect our individual privacy. These are the positive things that can come with new technology. Is there a need for an ethical framework and is there an obligation to ensure informed consent? And just what does informed consent mean in the use of all these new technologies?
Most of you who are interested in this field and have been watching what's going on are quite familiar with the amount of recent interest in the whole field. I guess maybe the cloning of Dolly and the supposed four-year-old twins in Belgium, which proved to be an unfounded story, have heightened interest, but there are other issues that have bothered us a great deal and that the world is looking at.
It's been of interest to me and to the committee that the newspapers, both in French and in English, have been covering this black market in Quebec, where there has been a sale of private information - supposedly secured information - at a cost of $25 to $120. Why is this taking place? Who wants this information? And why do they have access to this information? Where are the fire walls? What's protecting people?
I already mentioned the live baby and the visions of the future. These are all in La Presse, Le Devoir, The Gazette, The Globe and Mail, and The Toronto Star, if you've been following this.
Then we arrived out west and found that B.C. Tel has a new program called Phamis Technology that's going to work to install large clinical information systems at the Vancouver Hospital. It's foreign-owned. It will go into effect. It outlines a very interesting $20 million-a-year saving for the Vancouver Hospital, putting all kinds of data and information together on one sheet, but there is absolutely not one word about a protocol to protect privacy of the individual patient in regard to the medication they're receiving and the kinds of tests that have been done.
The focus is that they can drive costs down and bid on a per-patient or per-bed basis. So they have great new technology that they can sell around the country or around the world, but what happened to you and me and our privacy rights in this focus? Maybe it's there, but it certainly doesn't seem to be highlighted.
The right to privacy, as you probably know, does not stem from any one source. It's drawn from international law; constitutional law; federal and provincial jurisdictions, because it is shared jurisdiction; judge-made law; and professional codes, ethics, and guidelines. The result is often referred to in Canada as a patchwork of privacy protection in this country.
At the international level, as a historic background, several very important human rights documents contain guarantees to the rights of privacy. For example, there's what I call the Magna Carta of humankind, the Universal Declaration of Human Rights of 1948, which by the way was co-drafted by Eleanor Roosevelt and a Canadian, John Humphrey, who passed away last year. As well, there's the International Covenant on Civil and Political Rights of 1966. Canada is a signatory to these major conventions and covenants in which privacy is defined.
There is no comprehensive privacy protection currently in Canada. There are codes of practice and codes of ethics, but there is no comprehensive legislation. Quebec is the only place in North America where there is comprehensive legislation and regulations on personal data practices with respect to the private sector.
In Europe, for example, the fair information principle of the European Union and the OECD countries applies to all personal information, whatever the nature of its medium, whatever the form in which it is accessible, and whether it's collected, held, used, or distributed by other persons. That's an important factor.
Here in Canada, it's important to note that the Minister of Justice and the Minister of Industry have taken the Canadian Standards Association's code of ethical practice and are in the process of drafting a very comprehensive, data-based bill for business practices. It will impact on the private sector. This is still not in the House; it has been promised for the year 2000.
In Europe everyone has the right to respect for his or her private life, family life, home, and correspondence. There is no such expressed right in Canada, even though sections 7 and 8 of the Canadian Charter of Rights, which apply to search and seizure and the right to life, liberty, and the security of person, have been interpreted through the courts to apply to privacy. And there are some privacy rights under the Criminal Code. Is that right?
A voice: There are some protections for interception of -
The Chair: That's right, interception of self-owned communication or communications.
The concept of privacy - and we've heard this everywhere - is the most comprehensive of all human rights. That's the attitude held throughout the world. It's a broad and ambitious right, a universal concept, but not an inalienable right. It's a core human value that goes to the very heart of preserving human dignity and autonomy.
Most of us would agree that the right to privacy is of paramount importance to each and every one of us as an individual.
[Translation]
Some experts define the right to privacy as the right to have a space of one's own, to have private communications, to not be under surveillance and to have one's integrity and one's body respected. For the ordinary citizen, it has more to do with power, the power to control the private information which concerns us. It is also linked to the right to anonymity.
[English]
So the question now becomes, what is privacy worth in today's high-tech society? There's no doubt that these emerging technologies offer very special advantages - and they're valuable advantages - as well as efficiencies and conveniences to all of us. But do the benefits offered by new technologies come with a privacy price tag? Is this price too high? Is this a trade-off that's inevitable? Where and how do we draw the line?
This is what we're going to be asking you to define for us and to outline for us, for privacy is that precious resource that once lost, whether intentionally or inadvertently, can never be replaced or recaptured.
[Translation]
As members of the Standing Committee on Human Rights and the Status of Persons with Disabilities, we are resolutely approaching the subject from a human rights perspective in order to assess the positive and the negative effects of new technologies on our right to privacy.
[English]
I know Canadians have never really approved of peeping Toms or unauthorized wiretapping, and as I said before, our criminal laws reflect this. Does this same disapproval extend, for example, to hidden video cameras in the workplace, or to DNA data banks, or to citizen identity cards?
In order to exchange ideas and views and to raise this issue among Canadians - because I don't think many Canadians realize how much their life is impacted upon - we're holding this series of broad-based town hall meetings. It started in Ottawa. We were in Vancouver two days ago, in Calgary yesterday, and here today. We're in Fredericton tomorrow and in Montreal on Friday. Then we hope to have a report, which we will table in the House of Commons before the ring of the bell.
To focus discussion in these town hall meetings, we made a decision that we would look at three specific areas. We've given you an outline of three case scenarios. They are fictitious, but they are reality-based.
For the audience out there watching on CPAC, you can access the web site. You can read the scenarios, with our guests today, who agreed to join with us and are experts, and you can follow along so that you too can start to think about this issue and perhaps let us know through our web site what you think. Or you can be in touch with our clerk to bring to his attention, and through him to all of us, your points of view so we can write and raise these issues more effectively.
The three areas we are going to talk about are video monitoring, genetic testing, and smart cards. What are the risks and what are the benefits of advancing technologies? We want to initiate this open and frank debate about the promise and the perils to our privacy and human rights in this area.
We're not going to resolve all the questions; we don't even have all the questions. We're looking to you to enlighten us with your perspective, and for that I'm going to turn the hearings over to the coordinator of these hearings, Valerie Steeves, who is a professor of law at the University of Ottawa Human Rights Centre, and she's leading their technology project.
However, before passing the chair to Valerie, I would ask the committee members to present themselves, please. I think today we'll start with the opposition.
[Translation]
Mr. Bernier, the floor is yours.
Mr. Maurice Bernier (Mégantic - Compton - Stanstead, BQ): My name is Maurice Bernier and I am the member for Mégantic - Compton - Stanstead and vice-chair of the committee.
The Chair: Thank you.
[English]
Unfortunately, our Reform Party candidate is not here.
Mr. Scott.
Mr. Andy Scott (Fredericton - York - Sunbury, Lib.): My name is Andy Scott, the member for Fredericton, New Brunswick, and vice-chair of the committee.
Mr. Sarkis Assadourian (Don Valley North, Lib.): I am Sarkis Assadourian, the member of Parliament for Don Valley North, just north of here, in the great city of Toronto.
The Chair: No commercials.
Mr. Sarkis Assadourian: Mayor Lastman asked me to do that.
Ms Jean Augustine (Etobicoke - Lakeshore, Lib.): I am Jean Augustine, the member of Parliament for Etobicoke - Lakeshore, and a member of this committee.
The Chair: And who gave us a real tour as we came in off the plane.
Mr. John Godfrey (Don Valley West, Lib.): I am John Godfrey, the member of Parliament for Don Valley West.
The Chair: Bill Young is from the Library of Parliament research bureau. He's the adviser, counsellor, etc., to our committee. Nancy Holmes shares with Bill the responsibility for guidance from a research perspective. And Wayne Cole is, of course, our clerk.
I'm going to leave it to our facilitator, Valerie, to introduce our special guests.
Ms Valerie Steeves (Committee Facilitator): Thank you, Mrs. Finestone.
As Mrs. Finestone indicated, and as you know, we have provided you with three case studies precisely to provide some kind of personal or social context for our discussions here this morning. These case studies are stories that attempt to illustrate both the benefits and the detriments of these new technologies. It's our hope that through discussing the impact of these new technologies on the lives of the people in the studies, we'll begin to understand two things: first, what privacy means to Canadians; and second, how we as a society can best seek to balance the benefits that attract us to these new technologies against our underlying social values, and our commitment to privacy in particular.
As participants here, you represent a broad cross-section of Canadian society. Among you today are representatives of advocacy groups, banks and insurance companies, general business associations, disability organizations, educators, government workers, genetic researchers, health workers, human rights groups, multicultural organizations, police officers, lawyers, media, technology firms, telecommunications companies, and cable companies.
To best explore the very diverse perspectives that you bring to the table in this dialogue, we're going to begin the consultation process by dividing into small groups to discuss these case studies. Each of these small group discussions will be facilitated by an expert in the field of privacy rights. In each group, there will also be at least one committee member participating in the dialogue.
Once we've had the opportunity to explore the case studies in our small groups, we'll reconvene the general meeting and have a town hall discussion about the issues the groups raise. To start the town hall, I will ask the committee members who participated in the discussions to summarize the main points that were raised by you in the small groups. We'll then give the group facilitators an opportunity to add any comments and concerns of their own, and then we'll open the discussion to the floor. We're looking forward to a very open and free-flowing exchange of views between the participants, the experts, and the committee members about the meaning of privacy in a technological age.
It is my privilege, as Mrs. Finestone indicated, to introduce the four people who will be facilitating the small groups this morning. The first is Ann Cavoukian, the assistant commissioner with the Privacy Office of the Information and Privacy Commissioner of Ontario, where she's responsible for the protection of privacy and for ensuring that government organizations comply with the requirements of the Freedom of Information and Protection of Privacy Act. This involves activities ranging from resolving complaints and issuing investigation reports to research exploring threats to privacy posed by information technology, genetic testing, and video surveillance - which is convenient for us. She's sort of bang-on with regard to the major focus at this conference.
Ann joined the commission in 1987 during its start-up phase. Prior to that, she headed the research services branch of the Ministry of the Attorney General, where she was responsible for conducting empirical research on the administration of both civil and criminal law. Ann speaks extensively on the importance of privacy protection, and she's recently written a book on privacy called Who Knows: Safeguarding Your Privacy in a Networked World. It's co-authored with Don Tapscott.
Next I'd like to introduce Rita Reynolds. Rita is a privacy advocate who has corporate responsibility for implementing access and privacy legislation for the municipality of Metropolitan Toronto. She's held that position since 1990. Not only does Rita have an extensive knowledge of privacy principles, she also has considerable expertise in both designing and implementing policy systems that ensure these principles are upheld in practice. We're very grateful that she's here to contribute her expertise this morning.
The next facilitator is Frank White. Frank is the former director of the Ontario government's freedom of information and privacy branch. Frank administered the consultations on and development and implementation of the Ontario provincial and municipal freedom of information and protection of personal information laws. In January 1997 he established a consulting business that provides advice on information access and privacy in both the public and private sectors. Mr. White is a graduate of the University of Maryland and York University.
Next is Liz Hoffman. Liz is the ombudsperson at Ryerson Polytechnic University. She's currently a member of the federal government's Information Highway Advisory Council. She's also a director on the Canadian Network for the Advancement of Research, Industry and Education, also known as CANARIE. She's a member of the national advisory committee for the Community Access Program, and for the national advisory committee for the Office of Learning Technologies. Liz is also the chair of the board of the Coalition for Public Information, which is a public interest coalition concerned about a number of issues related to the information highway, including privacy.
The way in which we've divided you into small groups has been somewhat random, but we've tried to create groups that will bring a diverse range of perspectives to the table. You'll notice that your name tags are colour coded. The blue group will be meeting with Ann Cavoukian; the red group will be meeting with Frank White; the yellow group will be meeting with Liz Hoffman; and the green group will be meeting with Rita Reynolds.
You'll get into your small groups in a moment. We've asked your facilitators to start by asking you which of the case studies you'd like to start your discussion with. The one thing we've noticed is that our time is very short in these consultations, so please feel free to spend as long or as short a time as you wish with any of the three studies. Feel free to draw linkages or to jump back and forth. Basically, voice your concerns about how these new technologies impact your personal views and sense of privacy.
We will be reconvening for the town hall shortly after 11 o'clock. Just before you get up, I'm going to ask Mrs. Finestone to suspend the meeting - in other words, drop the gavel. Once she does, because time is so short, I'd encourage you and ask you all to get into the small groups as soon as possible, in order that we can hear from you and take your views into account on these issues.
Thank you.
The Chair: Thank you very much, Valerie.
Grab your coffee, go sit down, and we'll see you later.
We're suspended.
The Chair: Ladies and gentlemen, first of all, I can tell you that the intensity of the discussions was such that I don't think people realized how strong the feelings are with regard to privacy rights.
I'll turn this back to Valerie Steeves, and we will proceed with the second half of the meeting.
Ms Steeves: As I said, we're going to start the town hall by asking the MPs to report back on the major points that were raised in their small groups. We'll start with Ms Augustine.
Ms Jean Augustine: Thank you, Valerie and Madam Chair.
My group, led by Ann Cavoukian, was really an interesting mix of individuals, who brought to the table different expertise, from the media to legal people to FOI individuals and organizations. So our discussion was an extremely interesting one.
We started with the OECD code, which lays out privacy principles and some remedies in that respect.
Our first discussion was around the issue of video monitoring. I will go quickly through this simply because we repeated in some areas some of the basic issues - for example, who controls what's done; who has knowledge; is there posting of notices and signs that an individual is being monitored; what kind of advance notice is given; and if advance notice is given and if the individual through the posting and the signs does know that he or she is being videoed and it's done in plain view, then there is reasonable expectation that one's privacy is not infringed upon.
We went on to talk about devices, such as telescopes and lenses. One individual's input was that all these devices are really an extension of the human eye, and therefore that was not considered to be an invasion of privacy, although I'm not too sure that was the general sentiment of the entire group.
We went on to discuss surveillance by the state as opposed to businesses that would want to survey or monitor our actions. We did agree that surveillance by the state has to be justified and that to some extent it was no business of the state to monitor certain activities of individuals. The concern again was around the collection of the information, and we went around collection, storage, access, and destruction of the material collected.
We tried to differentiate between the purpose of the collection for crime prevention as opposed to the safety aspect and at the same time differentiate between the primary and secondary use of the information collected, because we felt that really the problem in the case study, especially, dealt more with the usage that was made of the material collected.
We talked about the deterrent factor, the protection of property versus the protection of people. One individual was very firm about the fact that cost should not be part of the discussion, such as private business would use video monitoring in order to protect their businesses because this was more cost-effective than other means. We felt that this was not part of the discussion.
We went on to refer to the photo radar and the protocols that are involved in that, who does it, what access, and the sale and other things that could be done with photo radar. We went on to say that this is a technology that has x number of guidelines attached to it. We emphasize the technologies here, but we must balance that technology with societal values.
We then went on to the issue of smart cards and the improper use of information for secondary purposes. We spent a good deal of time on the issue of linkages and we went on to talk about the fact that medical information mixed in with other kinds of information could be problematic.
The discussion then turned to wrong information and mistakes and how to go about correcting them: ensuring that the individual has the right of access, the right to the information, the right to correct the information, the right to say no without withdrawal of the service, the assurance that the questions being asked are not tied to the response the individual gives, and the individual's right to give that information.
We spent a good deal of time around the issue of consent, the issue of negative consent, positive actions, tracing, difficult-to-track individuals, to give them a negative or positive option. Then we spent some time on the issue of massive sensitization of privacy issues, ensuring that the public does know about the public education and training as an essential part of the general information that the public should have in order to make some decisions.
Genetic testing - And I know this is long, but I have a few more minutes. We started out by talking about the tactics of the insurance companies to use material in such a way that one is disadvantaged. We had a fairly long discussion about genetic information and testing done on children that could be changed at some different point; the information and the safeguards, etc., there are; the fact that once genetic testing is done, such information tells about whole families and therefore the need to safeguard it. We discussed the issue that we can create entire classes of people as a result of genetic information that's given.
We went on to talk about ownership, who owns the information - the information is yours, the file is the medic's or the doctor's - the option to destroy. The sample is the property of the facility, the individual is the owner of the information, and if the information is not identifiable then that information could be used.
We ended with recommendations on what we would like to do. Those recommendations ranged from what we do about violations of the principles and what kinds of sanctions we should have, to provinces legislating and giving the right to sue for invasion, better education and training, government publications with guidelines widely distributed, laws around trading in information that violates our privacy, and the principle of privacy to deal with high tech.
One individual said that maybe if we started with principles, then we could talk about remedies to harm and breach of trust for more serious offences. We ended with the Bank Act and some of that information and violations and some restrictions that should be within general legislative processes.
This ended our group session. The time was too short. There was so much more we could have done in any one of the specific areas. Again, I think I can thank the individuals who were with us for their input. They were extremely knowledgeable people. We shared cards and I'm sure we'll continue to talk on the subject.
Ms Steeves: Thank you very much, Ms Augustine.
Mrs. Finestone, could you provide a summary of your group discussion, please?
The Chair: Thank you very much.
We had a very exciting and dynamic group with Rita Reynolds. There certainly was no lack of intervention, and I think a lot of what you have heard from Jean you will find refocused in the nature of our discussion.
The issue really collapsed into the fact that with respect to videos, smart cards and genetic testing, there were some commonalities to be found in all instances, and that we needed some clear and overarching guidelines and, in the end, legislation. I'll come to that.
The question was asked about whether there was specific injury where DNA tests were involved and what the overall implications were where there might be gross violation of that information and there was no recourse. Yet there's the whole issue of no choice, where you have to inform, particularly with respect to insurance companies. The focus was on the potentiality, then the reality, and then the potential for intrusion, because there was no umbrella legislation. You are in a situation of reverse onus and that's not acceptable in a democratic society.
The discussion centred around medical information versus genetic or psychiatric information: is there a difference in that particular sphere? The sense was that you need very special protection, which has to be kept very separate, under separate types of clear, identifiable protocols, because this is very private, very personal, very intrusive information.
In regard to the question of clear information and informed consent, the observation made by some at the table was that a much better term would be "meaningful consent", because what does informed consent mean? And what does meaningful consent mean? It is broader and refers to the potential for collective use and to the factors of dissemination, the whole question of distribution and the vital importance of anonymity.
As for the question of when to test the predictive capacity - but there was no provision or remedy possible - where should the firewall be and how high should it be? If there is curative therapy potential versus the inevitability of illness and death - Where the predictive capacity is inherent in the test but not always realized, there is a potential impact of the environment that has to be taken into account.
It becomes a difficult decision as to how much you tell, and particularly when to tell, but certainly the "who" to tell is the person who is involved, and no one else. I think there was a strong feeling that there had to be walls around who to tell, and that would be the person, the individual. Then the social responsibility and ethical responsibility remain with the individual. I will come back to that because some differences were discussed.
In this whole area, you have to realize that genetic information is different because it has significant familial impact. You have to know there is control over that information and you have to know the serious consequences of the implication for insurance. As one gentleman pointed out, for anything that is printed on a database you should make sure you know that such a thing as control doesn't exist; it eventually comes into the public milieu.
There needs to be protection from improper use.
Informed consent means meaningful consent.
Cross-linking as a commodity is not acceptable.
The individual's right to know and to correct is inherent in whatever we undertake to do.
In the broad context, it is really an offence to gather information. As an example, police need warrants or some form of official sanction for what they're going to collect. And we should have the same kind of protection for what we collect in the private sector, because you have inherent protections under section 1 of the charter when you have an infringed right. So you really should have some kind of control.
Non-government issues should be subject to proof and approval when they want to do it, not just because they want to do it for commercial interests, business interests or other cross-referenced information.
Even the item on today's front page of The Globe and Mail under the headline of "parking tickets trip up the suspects" involved data matching. Do they have those rights? I think that there's a differentiation between where the rights are in the community's interest versus in the individual's interest.
So in the broad context, regarding the right to know and protection, a protocol must be followed. In practice, how should we police it? We briefly discussed the use of the electronic tag. The reason you'd want the electronic tag in order to police it is so that you can go back to where the violation took place. There should be substantive sanctions so that other industries or individuals will not follow the same process.
There's the principle of the fairness doctrine and a penalty if it is not used ethically. We have to remember that under human rights legislation we have to render justice on time, but we have a serious case of justice delayed. So you need definite and strong overall and overarching legislation, because presently what we have is toothless. Therefore, you either have to have an effective human rights commission with proper sanctions and proper rules and regulations or you need strong legislation with guidelines and with penalties that are strong and enforceable.
It was a very interesting and very strongly held view that there should be no forced disclosure leading to personal disclosure and that the insurance companies had to have some controls around them, because they said privacy is controlled by those who want to violate it. Therefore, you need physical and electronic control and clear records, and government has a responsibility to protect the individual. It's not to be industry-driven. The major issues were strong legislation that was enforceable with penalties and no forced disclosure.
Public education is a vital aspect, and within that public education the individual's personal right to control who gets access. Third-party rights should be clearly defined.
On the question of protocols versus sectoral legislation or harmonizing both, there was a discussion around what were government responsibilities, what were private sector responsibilities, and the feeling was that in the government sphere it was not just federal legislation but also provincial and territorial and that in the private sector they absolutely had to be covered.
On the question of ethical issues, risk management, risk assessment and predictability, there was a discussion about the real father who came out in the Huntington test and who should tell the real father so that his children would not have to live under the shadow of being the inheritor of the Huntington tag. It was an interesting discussion. In the end, they felt that it was up to the father to decide but that there was another paradigm - i.e., you could go and test the children if the father didn't want to own up, in a sense.
The question of the disabled and discrimination came up, and it was felt that they should be covered and that the legislation should be broader than just privacy.
There is a problem with no standard information act as well, especially with privatization. I forget what that was about.
On videos, we discussed fairness principles and not prejudicing rights. On the whole question of surveillance, it was felt that surveillance is okay in certain circumstances but its use was what was in question, because cross-referencing data for different purposes should be very clearly identified. Search warrants and wiretaps are the normal way, and that's still in place. The issue boiled down to the difference between individual rights versus community rights, and issues that are justifiable in a democratic society under section 1.
Last, but not least, I guess, was the general sense that there should be a philosophic principle that is values-based and that the legislation needs to be strong and not subject to technological change, that once you put the values in place on that philosophic principle, you can really have laws that will be overarching and protective.
Right now we have toothlessness. You need enforcement, and agencies need personnel to be able to enforce. The public good must be taken into consideration. The only route is new legislation. Control should be in the hands of the individual, and access is inherent there. You need meaningful consent - which implies informed consent about how the information is used, who uses it, and how you maintain personal control - and it covers videos, smart cards, and genetic information. I think that's the overview, but certainly what came out was that you have to have proper penalties and proper recourse to act as a deterrent.
It was interesting, again, that they came back to the person - those who suffer the breach are the ones who object - and to why government has the responsibility to protect, given the economic value of the data, which is really what is wanted.
That's it.
Ms Steeves: Thank you, Mrs. Finestone. Mr. Godfrey.
Mr. John Godfrey: Our group, the red group - I like the colour - concentrated mostly on genetics, and then went on to smart cards. Our discussion focused on the following themes: societal changes; technological changes; process issues; fundamental principles; issues specific to DNA and genetics; and finally, suggestions concerning legislation and recommendations to the committee.
Starting with the societal changes, I think it was recognized by our group that society has a very ambiguous view of these technological changes. Many people approve of these advanced technologies if they help combat crime or fraud, or produce a better health card. We voluntarily make a contract or recognize that we give up certain of our rights when we go to the airport and agree to be searched, or recognize the validity of a RIDE program, or apply for a credit card, or confess our stories on Geraldo, but there is within that a common societal sense that there are some things that are supposed to be absolutely inviolate, things like medical records.
On the technological side, it was also recognized that whether it was in the medical field or in the field of commerce, there were certain advantages to technological advances - for instance, being able to know through DNA testing that there is a predisposing gene for cystic fibrosis can actually help one take proper measures - and that one should never neglect those positive aspects.
Similarly with smart cards, the fact is that we are now carrying 10 cards that can be read by the common human eye if somebody steals our wallet. If all of those cards can be summarized on one that is encrypted and only accessible by us through biometrics, that actually means technology can give us greater protection from the bad guys. So that was not to be lost.
What we felt was in common with all of these things was that all the cases deal with process issues. That is, what was the original intention of the activity? For example, in the case of the genetics, the intention was to be admitted to hospital, not to take part in a research program. That was what you thought you were doing. So we have to be very careful about what the rules are when we participate with these technologies. That's what really distinguishes all of these cases.
What are the four principles that come out of this? The first of these principles is the right to know. It's an overriding principle for all of the cases: what do they know about us; how do I correct that information if it's wrong; how do I challenge that information?
A second principle that arises as well is the principle of reasonable expectation. It comes out of the whole notion of process as well. If I'm being admitted to hospital, why would I expect if I'm lying on a stretcher they're going to be doing research for some other totally different purpose? But those reasonable expectations need protection by the rules. They need protection against ``function creep'' where things kind of move along in ways you didn't anticipate.
The third principle relates to the first two: the notion of informed consent, without which nothing can happen. In the specific cases of the smart card relating to the woman who's applying for employment assistance, or unemployment assistance, or whatever they call it, the difficulty with that specific example was that it really wasn't informed consent. She had no option if she wanted to get her money, as indeed people who apply for social assistance have no option but to allow that information to be shared. That's not informed consent, and yet that's a fundamental principle.
Finally - and all of these principles are interrelated - is the notion of control, not only the right to know, the right of informed consent, but the right to control that information, to see it almost as a property right, that the information belongs to us, not to somebody else.
Those are the principles. Then I guess the next issue is whether there is something specific to the DNA case, to the genetics case. I think our group felt that there was. It was partly because it's so intrusive. It is so predictive or potentially predictive, or so subject to misinterpretation or misapplication that perhaps it needs a separate consideration. It doesn't just fall under normal categories of privacy in technology issues.
As a subset of that - and I think I picked this up as well - perhaps there is something specific about the way in which insurance companies need to be regulated with regard to what they do with their information. Given the fact they're already dealing with it in a very untechnological way, such as asking for family histories, perhaps we need to have a very specific regime for insurance companies as opposed to banks, for example - although who can tell the difference these days. Sorry about that, David.
Then the suggestion concerning legislation is that we as a committee might begin - and I think this is very useful - to survey the effects of existing legislation, not only legislation that purports to deal with privacy but legislation that inadvertently deals with privacy. There are many situations in which the federal government is involved in things: the Canadian labour act right now, which inadvertently, in a sense, gives unions the right to find out information about other individuals who are not union members, against their will.
We discovered in the House of Commons, by the way, when people were coming into our meeting, that they had to surrender their driver's licences to be swiped to find out whether they were security risks. There are all sorts of things we're doing, but we should be looking first at the effects of legislation.
Secondly, we should recognize that it is not by legislation alone that we're going to solve our problems. It will be a blend of solutions, involving some self-regulation, some codes of professional conduct, but somehow or other we have to produce out of this a kind of unified field theory that allows all of us to increase and enhance our privacy.
Ms Steeves: Thank you, Mr. Godfrey.
[Translation]
Mr. Bernier.
Mr. Maurice Bernier: Valerie gave me only five or six minutes to summarize the discussion in my group. It's always a challenge to speak after such a good speaker as John.
[English]
Ms Steeves: Don't be intimidated; you're very good.
[Translation]
Mr. Maurice Bernier: The discussion in my group was skillfully facilitated by Ms Hoffman. An issue which was not mentioned in the other groups was raised several times in ours. Although diametrically opposed views were expressed, it was always done with a lot of ease and with the utmost respect for others.
One of the participants in our group was constantly questioning the assumption that there are problems in this country concerning privacy. He wondered if we were not making up problems or asking ourselves the wrong questions. His remarks always brought back the participants to the basic issues. This was very productive and very interesting. We discussed more specifically two of the three case studies which we were given, the smart card and genetic tests.
I'll start by talking about the smart card. Everyone obviously recognizes the advantages of this new technology from the point of view of efficiency and cost. If such cards become readily accessible to most people, they can better control the information about them that governments and private companies hold. It is easier to make sure the information concerning us is right, or to correct it if needed, if it is on a smart card rather than in a file at a bank or in a department. The problems concerning the smart card mentioned in our group are the same ones that were mentioned by the other groups.
I would like to draw your attention to one issue. We not only wondered whether it would be better to put all the information pertaining to us on one card; there was in fact a consensus among the participants of our group to recommend against putting all the information concerning us on one single card, the information that the government has on us as citizens as well as the information that is in the hands of the private sector. We strongly recommend that there be more than one card for citizens. Whatever technological controls can be implemented, it is not in our interest that a card containing all the information concerning us fall into the hands of somebody else, even if we are not politicians. I can't say this was a formal recommendation of the group, but I think that was the intent.
We also mentioned the fact that we mustn't confuse the issue of privacy with the issues of security, efficiency or cost, since these are totally different issues. It is obvious that if we pit new technology and the need for more security against privacy, it is privacy that will be sacrificed in the decision process. We must maintain the protection of privacy as a principle, while obviously keeping an open mind toward new technologies.
We all agreed that genetic testing had some advantages although we also identified some problems concerning the storage of this information. Who has access to this information? And as other groups mentioned, what is being done with it?
We could ask ourselves the same questions from other perspectives. We asked one of our participants, who is a lawyer, whether there is a remedy in law if we feel our rights have been violated through the use made of genetic tests. As was mentioned at the beginning, we all agreed that, lacking a comprehensive law, it was almost impossible to have our rights acknowledged. To defend these rights would be very difficult and very costly; the proper remedies wouldn't be accessible and, in the end, our privacy wouldn't be protected. Furthermore, lacking a comprehensive law, and of course any case law, it would be hard to assess the monetary value of the impact of an intrusion into the private lives of people.
I will conclude, as my colleague did, by saying that people agree that, whatever the legal framework, if there is no public education, if people don't know what is happening or what the impact of new technologies is, we will never get ahead. In other words, one does not go without the other. Education and information are critical.
Thank you very much. I apologize to the participants if I didn't convey their thoughts properly. They will be able to correct me in a few minutes.
The Chair: Don't worry.
Ms Steeves: Thank you, Mr. Bernier.
[English]
Mr. Scott, do you have any comments you'd like to add before we turn to the experts?
Mr. Andy Scott: No, that's okay, thank you.
Ms Steeves: We'll just take a few minutes to hear from our experts and then we'll open the discussion to the floor.
Ann, would you like to start, please?
Ms Ann Cavoukian (Assistant Commissioner, Privacy, Office of the Information and Privacy Commissioner, Ontario): Thank you.
I thought I would touch upon a couple of the issues that arose at a meta-level that ran through all three scenarios. They were essentially three issues: the secondary use of information versus the primary, principal use; the issue of data linkage; and the capturing of information in personally identifiable form. I'm going to touch upon each of these.
We ended on the note that really the issue of technology is itself a bit of a red herring. There are basic, fundamental principles that need to be addressed in terms of protecting personal information and protecting privacy, and this ran through all the scenarios. That's where we started to draw back to the issues that arose.
On secondary use, very simply the information is collected for a primary purpose for which the individual, the data subject, should be advised at the time of collection. Any other uses subsequent to that that are unintended, that can't be expected, are called secondary uses. If you think of the examples we have, in all three of them, a lot of the problems arose out of secondary uses of the information that were unintended at the time of the data collection and that the data subject, the individual to whom the information relates, could not anticipate and could not have had any knowledge or awareness of. That's problem number one.
The second problem, data linkages, comes right from the first problem. There was a general acceptance that government in some instances does have to have some data linkages. It makes sense for law enforcement and other purposes; it is an appropriate concept. The problem arises when you get into the public versus the private sector. The public-private shift creates a great deal of problems.
But even before getting into that, there's the whole notion of what are acceptable data linkages. Well, if you advise people you're going to be doing this, you tell them. This is very important. It hinges on the difference between informed consent and the concept of notice. I think it was mentioned that when someone applies for welfare benefits, you tell them all the uses you're going to make of the information collected on them. Are you really giving him or her a choice? Is it informed consent? Arguably not. However, this is the model that exists throughout the public and private sectors.
If you go to a bank to apply for a mortgage, you're going to have to give a lot of information; otherwise they're going to deny you the service. So the whole notion of denial of service, yes, hinges on you giving the institution the information it wants. Maybe it's more the concept of notice, advising people what you're going to do with the information and what information you're going to collect, versus informed consent. It's a very difficult question.
The third issue was the capturing of personally identifiable information. If you didn't capture information in personally identifiable form, that would eliminate a number of the problems. When you think of the genetic testing example, if the individual's DNA information had been used for research, a very valid purpose, but had been used for research in a non-identifiable way, there would have been no problem. It was the fact that the information was linked in a personally identifiable form to the individual by his or her name that created the problem. That takes us to anonymizing data wherever possible. Anonymity and the anonymization of data through various technologies, privacy-enhancing technologies, leads to the elimination of a lot of these problems.
I'll end with the one thing that I think hasn't been touched on, that was sort of an anomaly, and that was the concept of royalties for your personal information, the question of whether you should be given some modest remuneration for the use of your personal information, such as your name and address.
Telemarketers use this information. It's the raw material that fuels a multi-billion-dollar industry. Shouldn't you, the individual whose information is used, get some modest form of remuneration, perhaps through a royalty payment scheme? It is very doable. You know, micropayments can be done through the Internet and other systems. Shouldn't we contemplate that? That was one of the intriguing issues we came up with.
The issue of privacy revolves around control: the ability to choose what information you impart and the uses made of it. We ended with a call for broad-based legislation with strong sanctions and methods of enforcement associated with it, the final note being to educate, train, and heighten awareness on privacy issues in every possible forum.
Thank you.
Ms Steeves: Thank you very much.
Rita.
Ms Rita Reynolds (Manager of Corporate Access and Privacy, Municipality of Metropolitan Toronto): Thank you.
Our group focused our discussion mainly on genetic testing, but we felt the group's views applied to all the case studies. Similar to the previous comments, we felt very strongly that the issue of meaningful consent, as distinguished from informed consent, was important. We felt that the person should be very much involved in the decision-making process about any testing they may undergo, any uses the information may be put to, and the controls on the disclosure of that information. Not just at the time the information is collected: they should have rights to that information over time, to have access to it if they wanted access, but at the same time to be able to make decisions about whether or not they would view the information.
Specifically, anonymity was an issue: the importance of being able to remain anonymous in a free and democratic society. The person has a right to remain anonymous in most circumstances. That didn't mean that video surveillance is inappropriate in all circumstances, but that there should be specific controls. Again and again our group came back to the issue of individuals' personal control over their own information, over the collection of the information, and over the restrictions that needed to be placed on both the government and the private sector in the information they have authority to collect.
There was a concern very strongly expressed about the fact that existing privacy legislation - federally, provincially, municipally - has no teeth, and that rather than trying to go back and build greater strength into these laws, what is needed is overarching umbrella legislation that would give very clear protections to individuals over the collection of genetic information, video surveillance, biometric technologies and the like, and that as much control as possible should remain in the hands of the individual.
There was a very strong consensus about the urgency of this new legislation, that it should be brought in as quickly as possible and that it should contain real sanctions, real penalties against governments and private sector institutions. There particularly seemed to be a very strong concern about insurance companies because of their ability to affect your future, your economic viability, and a feeling that the individual has the right to expect that government would exert its authority to provide protection to individuals. It should not be left to the private sector or to government institutions to voluntarily, in their own different ways, express these values as they see appropriate. There needs to be - and I think Mr. Godfrey expressed it very well - a unified field across Canada. This is what's really required.
Ms Steeves: Thank you very much, Rita.
Frank.
Mr. Frank White (Individual Presentation): I'd like to make a personal observation first. If you take a look at all the provincial laws, our national privacy law, the OECD guidelines, the Canadian Standards Association privacy code, the European Union directive, you will see that we're not talking about principles that are particularly different.
I think the solution to a lot of this is there already. It has been investigated and legislated and standards have been developed. If you look at those, there's a very common thread, so I don't think we have to make any giant leap in terms of what the principles are.
I start out with that because I don't think this issue should be discussed and debated ad infinitum. I think the solution's out there and we just have to take that solution to the next step and make it work in different settings. I think we actually need some kind of national standard that tells individuals, be it in the private or public sector, what those standards are so they can work toward them.
Right now, we have very much a patchwork of privacy laws and privacy expectations in place. I think what we need, as Rita said, is that overarching standard, and then within that standard we have to work out how our privacy should be protected.
I say that because some situations may be very appropriate for a statutory regime. It seems, particularly in the group I was with, that DNA testing was very much an issue and very much a concern in terms of individuals and their privacy protection, as opposed to, perhaps, your address in the telephone book and other already public information. Maybe there should be some standards on how that could be used, but there shouldn't be anything that's legislated.
I think we'd have to bring this patchwork together under some kind of umbrella and then try to determine how best we can provide privacy protection in a variety of ways.
Underlining that, I think what the group I was with felt was critically important is that there has to be some meaningful enforcement of whatever those rules are. That can be anything from a self-audit to offence provisions in legislation, but it has to be something meaningful, and probably something that has some type of outside oversight if it is going to be meaningful.
The other item that underpins all of this is public awareness and training. There are just so many ideas of what the privacy expectations are or are not that I think it really is incumbent on different levels of government and the private sector to free up some money to focus on education for privacy.
Lastly, and I thought it was a really excellent suggestion from our group, someone asked: why don't you put all of this in plain language, too?
Thank you.
Ms Steeves: Thank you, Frank.
Liz.
Ms Liz Hoffman (Ombudsperson, University of Toronto): My comments are going to be a little different, in that I would like to address process rather than content.
As chair of the board of Canada's Coalition for Public Information and a member of the Information Highway Working Group, I would like to commend the federal government, especially members of the Standing Committee on Human Rights and the Status of Persons with Disabilities, for consulting with the public on this very important issue.
The opinions on this issue of privacy are incredibly diverse, depending on where one is coming from regarding the issue and depending on how we are affected by the issue.
I would like to encourage all of you who are watching this program to become involved in the discussion related to your privacy and privacy issues in this country, issues such as who owns what information and informed consent.
Once again, I take away with me today the valuable information that is gained when the public is consulted, along with how important it is for us - if our goal is to successfully head into such a rapidly changing future - to create forums for ourselves, to talk, to listen, to share our concerns and opinions, and to work together in attempting to resolve the areas of concern.
Thank you, Madam Chair, for the opportunity your committee has provided to all of us today.
The Chair: Thank you.
Do you think it will have some impact, if I may be so forthright as to ask?
You're looking at legislation. You're on that committee. Some of the things you've heard today indicate that industry and its interests seem to have primacy over our personal information and are rather demanding about us revealing that information. Are you bringing those points of view to bear as that information is being formulated into legislation?
Ms Hoffman: Madam Chair, it's one of the reasons why receiving invitations to forums such as the one you're holding today is very beneficial: the shared information then continues to be taken into other forums.
Certainly the feeling of the federal government's Information Highway Advisory Council, as expressed in its report provided to the public last year, was that there needs to be more protection provided to the public than that provided by the Canadian Standards Association code, which we all commend and speak of very positively.
The weakness in that code is the issue of compliance: sufficient sanctions are not provided for private sector players that do not comply with the code. We feel a higher level of protection needs to be provided to the public. What is being assessed at this point is the possibility of framework legislation that would cover that weakness.
The Chair: I think that gives us a sense of comfort, because certainly in every exchange we've been involved in, certain industry fields have been targeted as impinging on our personal right to certain protections. I won't name the industries right now because you've already heard that, but I think you ought to take that very seriously into consideration. For example, we feel that DNA is perhaps far more personal - you can't recover that stuff once it's out there. I think we also feel very strongly that there are certain industries that need some particular attention as well, sectoral as well as overarching.
Ms Steeves: Thank you very much, Liz.
Now I'd like to open up the dialogue to you, the participants. For the purposes of the record, we ask you to please state your name when you do come forward to the mike.
Mr. Jim Drennan (Chief Executive Officer, Ontario Provincial Police Association): Thank you very much, Madam Chair and members of the committee.
My name is Jim Drennan and I'm with the Ontario Provincial Police Association in Barrie, Ontario.
I want to couch my comments in terms of what's going on today in British Columbia. I'm going to focus more on the criminal aspect - which you can well expect from our position - in light of Clifford Olson's ``faint hope clause'' hearing that is going on there right now.
The Chair: And you know it took up all the media's time -
Mr. Drennan: I want to address the issue of DNA testing. And I will do so not only on our association's behalf but I think on behalf of police officers and justice officials and certainly on behalf of victims of crime across this country.
I hope you will take this plea with you, or address this question in the legislature when you return to Ottawa. It is our belief, as the Canadian Police Association stated just recently, that the dragging out of the legislation on DNA testing and DNA data banks, and the delay in its introduction, has been due more to financial issues than to issues relating to data collection or the privacy of data collection. We think that is a serious problem for the citizens of this country. Hearings have been held across Canada, and Canadians have stated outright that they support the idea of a DNA data bank.
I want to address this issue. I see Mr. Andy Scott has left the room, but it was raised in our group that we should be really concerned about the lack of control over data once it appears in a DNA data bank. That's unfortunate. There's a misconception among the members of the committee or the public if that's what they think happens.
We already have legislation in this country under the Identification of Criminals Act that allows for fingerprinting and photographing of offenders even before they're convicted. What we are simply asking for on behalf of victims and members of the justice community is that you take back a message about the barriers that are already part of the existing legislation on the Identification of Criminals Act and apply them to the DNA data bank legislation to get it introduced on behalf of the people I've spoken about.
I hope you'll take that message back. I certainly thank you on our behalf, the police community, for allowing us to speak today. It is a very important issue. Thank you.
Mr. John Godfrey: Just a question. I think I know the answer to this, but it would be useful to remind us. When you take that information before someone is found guilty of anything, if they're not found guilty of anything, do you destroy that information? Presumably you do the same for DNA?
Mr. Drennan: Yes.
The Chair: Thank you. I think that is a very important piece of information. That request will be honoured, and we'll get back to you. Thank you.
[Translation]
Mr. David Nicholson (Early Intervention Officer, Ontario Nurses' Association): My name is David Nicholson and I practice labour law from the union side in Toronto.
I would like to add a comment concerning all types of monitoring in the workplace, including video monitoring. Our group didn't really have the time to study this issue, but I could summarize what was said about it by stating that an employee has at least the right to be told that he will be under constant monitoring. I believe employees' rights in this field should be a lot greater than they are now. An employee has the right not to be watched and monitored constantly at work, eight hours a day.
[English]
I really think there should be a recognition that even when you're at work, there is still a privacy factor at play. At the very late end of the 20th century, when we're at work, there is a recognition that we are not completely 100% under the control of our employers. There are times when one really has the right to scratch, blink or do a lot of other things that are not the business of anybody, including the employer.
I would hate to think - and frankly, it sends shivers down my spine - that even I as an employee of a union would or could be watched every minute of the day.
I think that a 1984 kind of concept should be spoken to and enshrined, if not in a federal employment standards type of legislation, then as a basic human right in the human rights legislation as well.
Thanks.
[Translation]
The Chair: I sincerely thank you for your comments about video monitoring in the workplace. We will surely study this question more thoroughly. It is not the first time it has been raised. We have to find a balance between the right of a company to greater efficiency and productivity on the one hand and, as your story showed, the employee's honesty on the other. When the young woman who was leaving her job five or six times a day to go have a cigarette was not honest enough to admit it, that was costing her employer money. Are decisions taken based on video monitoring alone, or is it one element among others in the decision process? Did the company really tell its employees they would be monitored? Did these employees understand it? I believe we have to examine the case law in this field more thoroughly.
Does somebody want to make a comment?
[English]
Mr. John Godfrey: I have a clarifying question, which is simply this.
[Translation]
The Chair: In French please.
Mr. John Godfrey: Is collective bargaining the only way for unionized employees to defend themselves against video monitoring or can they do it through other means?
Mr. Nicholson: There are more non-unionized employees than unionized ones. What happens if, during a job interview, a candidate is told that the company's policy is to monitor its employees constantly, eight hours a day? If a candidate objects to this, he won't be hired. A law is necessary.
[English]
The Chair: It's a condition of hire? Is that what you just said?
Mr. Nicholson: I can see that happening. If what we're saying is it's just a matter of being informed, what happens then is you go for a job interview and they say ``Look, here's our policy at this company. We're going to have a camera trained on you eight hours a day. If you don't like it, you don't have to work here.''
The Chair: I thought that was only people in the casinos.
Mr. Nicholson: Speaking on that, it makes some sense, in certain areas and if certain needs arise, to have some amount of video surveillance. But for no good reason at all, for every location at the workplace, and for eight hours a day? No.
Mr. John Godfrey: I know the postal workers, for example, through their collective bargaining, were able to get some of the video surveillance out of the workplace. That was just because there was negotiation and a contract.
Are there other ways you're aware of that the labour movement has been using to defend itself? For example, can they go to the courts? Or is there a gap there that we should be looking at filling?
[Translation]
Mr. Nicholson: Yes, I believe there is a serious flaw in this regard.
The Chair: Mr. Bernier wants to add something, and we will end the meeting afterwards.
Mr. Maurice Bernier: The issue Mr. Nicholson is raising comes back to what we were saying earlier and to what I mentioned in my presentation. When we pit an individual's right to privacy, on the one hand, against economic costs and the right of employers to make a profit, on the other, it is obvious that it is the individual's right that will be sacrificed.
In response to what Mr. Godfrey was saying, it is true that the problem can be solved in a piecemeal fashion through collective bargaining, but I think the concerns raised today underline the importance of a comprehensive legal framework. In other words, video monitoring must be the last option to be considered, and it must be up to those who want to use it to prove that it is.
[English]
The Chair: Thank you very much for bringing that to our attention.
[Translation]
Mr. Nicholson: I would like to add a comment.
[English]
The Chair: Would you please approach the mike, the next person who wants to speak?
[Translation]
Mr. Nicholson: For example, an Ontario act forbids employers from using lie detectors.
[English]
The Chair: Thank you very much for the information.
Do we have time for two or three more interventions?
A voice: We have 20 minutes left.
The Chair: Oh, we have 20 minutes; we have all kinds of time.
Ms Patty Bregman (Executive Director, Advocacy Resource Centre for the Handicapped): Hi. My name is Patty Bregman. I'm a lawyer at ARCH, which is a legal clinic that deals solely with disability rights issues in Ontario.
This is an issue of great concern to people with disabilities, because frequently, in order to receive benefits, information is requested by employers for the purpose of accommodation, by insurance companies, and by government agencies.
One of my suggestions comes out of something that I think all of the groups raised, which is a concern about not only data protection but data collection. It might be helpful if in discussions we start talking about data collection and protection in order to make sure people understand there are two parts to this puzzle.
Our biggest concern is that there is no control now on what kinds of information people can request. We in our office get calls almost daily from people saying ``They requested this information, and if I don't provide it, I won't get the benefit''. Not only do employers want to know that a person needs an accommodation; they say they need to know every drug the person has ever taken. So it's critical for this committee to deal with the issue of data collection.
Provincially, for example in Ontario, there are restrictions on what can be collected and there is oversight of what can be collected. We really need to make sure that's part of the solution.
Tagging onto what Ann said about primary and secondary uses, we need to make sure we narrowly define ``primary use'', because the provincial government in Ontario, for example, is talking about integrating the health care system, so information will go to every health care provider in the province about everybody. That, to me, is not an acceptable primary use. That's what they've suggested, and thanks to our very strong Privacy Commissioner in Ontario, they are currently drafting legislation.
Although I should tell you, I worked for the Krever commission dealing with the confidentiality of health records in 1978, and I'm still waiting to actually see something come out of that.
My second point has to do with loss of benefits. Once you've assumed the information can be collected, it's very important that nobody lose a benefit or entitlement without an opportunity to respond. We see this as a due process issue. You should not be cut off welfare and you should not lose drug benefits unless you are first notified that this is going to happen, given copies of the information, and given an opportunity to respond, correct, and argue your point. Otherwise, again, people will not feel comfortable giving information.
The third thing is literacy and how we communicate notice. Many people with disabilities may not be able to read. Certainly people with visual impairments can't read a written notice. We have to make sure that whatever kind of interaction is done in terms of collecting information is done in a way people can understand. We don't have a right to sign language interpreters in every setting. We need to make sure that, again, you take this into account.
Finally - this particularly relates to technology, and it just needs to be on the table - international issues are critical. Our information can be stored out of the jurisdiction or out of the country, and there are very few ways to protect people when that happens. First of all, usually you don't know it, and secondly, I'll give an example from Krever, where we were dealing with a private company that collected information without people's consent. The head office was based in Atlanta. We could not compel that head office to come here and be accountable. So what we need to do is make sure that whoever in Canada controls that information is accountable and won't be able to shift it off and say, ``It wasn't our problem. We contracted with a company. The company is based in the U.S.'' I know Ann Cavoukian has done a lot of work on this.
Those are my main -
The Chair: Tell me, did you happen to look at the latest legislation of the OECD and the European Community? If you have, could you tell us, or could you let me know at least, if you believe it would cover the issue of discrimination you've talked about and access to information of secondary -
Ms Bregman: To be honest, I haven't looked at it in enough detail to really comment at this point. Ann may be able to give you some advice on that.
The Chair: I was just going to say that.
Ann, have you?
Ms Cavoukian: Are you speaking of the directive on data protection?
The Chair: Yes. Ann, do you have some response to the observations that were just brought to us?
Ms Cavoukian: The draft directive on data protection is what's being referred to here. It's a directive that was developed by the European Union, the EU, for member countries. It's been in the process for about five years, and it was approved last year. It's a very broad-based requirement that will require EU countries to harmonize their privacy laws within three years, by 1998, so that they all have a common standard of data protection, and that standard of data protection is very high. If we had that here, I would be delighted.
The Chair: We have accessed that information from Atlanta; it was needed by the Krever commission.
Ms Cavoukian: If we had that here and the United States also had it -
The Chair: Yes, reciprocal.
Ms Cavoukian: - then they would both be bound, and we would have access to it and there would be reciprocal requirements. But unless everybody's covered by it or follows it, the United States is under no obligation to follow that directive.
The Chair: So put the information in Paris.
Ms Cavoukian: Yes, it's a much better place to put it, in the hands of an EU country.
I also want to comment on the first point that Patty raised, which is an excellent point. People regularly confuse the notion of privacy and data protection with the notion of confidentiality, and it's a very important distinction. Privacy and confidentiality are not one and the same; in fact, they're quite distinct.
Privacy relates to a broad scope of protection, starting with the collection of personal information and placing restrictions on what should be collected. Because, as Patty says, once you get the information collected, then it's far more difficult to prohibit the uses of that information.
Confidentiality is only one means of protecting privacy, and it applies when you actually have the information. After you've collected it, then confidentiality means safeguarding it from unauthorized access and use. But it becomes far more difficult to ensure confidentiality, especially in this day of electronic records, data linkages, and network communications. So the first protection is don't collect it. That's what privacy is about.
Ms Steeves: Thank you very much.
Would you like to respond to that?
Mr. White: I'd like to make a quick comment about the European Union directive. It will require that if data is transferred out of a European Union country, the country it is transferred to must have equivalent data protection.
That's fine if it's transferred to a Canadian government institution, let's say. If it's transferred to, say, an insurance company in New York, then you contractually arrange with that insurance company to provide an equivalent level of data protection in New York.
The problem is with auditing that: how do you audit, a lot of the time? It's almost impossible to audit in another country. So again, it's not necessarily the contractual provisions but how you follow up, audit, and make sure those requirements are being followed.
Ms Steeves: Thank you.
The Chair: I'd like to say that was one of the issues we looked at when the German subway contract was given to the United States, and they used special protections that had to be accorded. So thank you for reminding us of that.
Ms Catherine A. Johnston (President, Advanced Card Technology): I'm Catherine Johnston from the Advanced Card Technology Association. On behalf of our members, we're truly grateful for the opportunity to participate, and I look forward to going back to the board and telling them that very meaningful discussions took place today.
We represent those pesky smart, optical, and capacitive cards that we spent so much time talking about this morning. I was very impressed that our contact here had a CANPASS with him on an optical card.
I'd very briefly like to talk about the technologies that none of us in this room could possibly dream of today, technologies about which our children will likely sit in a room 20 years from now having these same types of discussions.
It is not totally a technology issue. Technology does facilitate the movement of data, and we always have to be cognizant of that. But the reality is that the principles of privacy never change; it's public perception that changes.
I would imagine, looking around the room, that most of us grew up in houses where the front door was never locked when we came home from school, because someone was there. In these days, I lock my car inside my garage, which has a garage door opener attached to the house, which has a monitoring system on it, because I perceive the threats as being very different from those my parents faced when I was a child.
So what we really need to come to grips with is a process whereby we can reassess the threats and use technology, not to facilitate the bad guys but to keep them out of our privacy.
Technology providers, today and in the future, need to know the rules. They need to know them, and they have to build them into their applications and technologies. They need to know that the public won't accept them if they don't do this. They need to know that there are penalties by law if they fail to do it adequately. If we can do that, then we are going to put our children in a much stronger position.
Thank you very much.
The Chair: That's a very fine summation. I must remember why I have that garage door the way it is. Frankly, I still don't remember to lock the door all the time.
Professor Calvin Gotlieb (Department of Computer Science, University of Toronto): I'm Calvin Gotlieb, professor emeritus in the department of computer science of the University of Toronto. I'm a member of the Information Highway Working Group and the Canadian Information Processing Society.
What I have to say has been discussed in several other forums, but I wouldn't presume to say that it has been agreed to. First of all, several of us in several places are pleased to see that the CSA code is being taken as a starting point for legislation. We think there will be an improvement there. We agree with what was said earlier that it has to be augmented by better compliance.
Once we have legislation, it's important that the legislation be implemented and effective. We have good cause to worry about that. Very often, in most legislation, privacy is connected with freedom of information.
Look at Bill C-43, the government's freedom of information act. Take that side of it and listen to what is happening in the Somalia case or in the Krever commission about making free access to information. Then you realize that legislation is only the first step; implementation and effectiveness are important.
I happen to be a contrarian with regard to privacy. Some of you may know that I have written a paper called ``Privacy: A concept whose time has come and gone''. I think it's a very complicated thing. It's a perception, and it's always changing. It's a very hard problem.
I think we ought to focus on the problem we can handle, which is confidentiality, which, as Ms Cavoukian said, is in fact only part of the privacy problem.
There are at least two techniques that we can bring to bear in this, one very old and one very new. One is contracts. In fact, through contracts, we can get the royalties she wants. That has been explored. Contracts work. They've been here for a long time. Incidentally, to make them work, you need courts, so we've got problems there.
The other technique is encryption, which has been mentioned. I have a mathematical background, and I put a lot of faith in encryption, more than I do in courts or legislation.
Thank you.
Ms Steeves: Thank you very much. We have time for perhaps one or two short comments. Ann would like to step in and make a short comment.
Ms Cavoukian: I have to say that one area Calvin and I agree on completely is encryption. We'll be looking increasingly toward encryption technologies in a growing networked electronic world to protect the information that our laws will not be able to protect, for jurisdictional and a variety of other reasons. The laws are necessary, there's no question, as they impart a certain message, but as for encryption and the variety of means by which to encrypt, we're in the pioneering days. You'll have point-and-click encryption hopefully by 2000, maybe sooner. That will go a long way toward protecting privacy and confidentiality.
Ms Steeves: Liz, would you like to respond to that?
Ms Hoffman: I think the other issue that Calvin brings up in this situation is the variety of views related to privacy: what people across the country feel is important to keep private or confidential, and what they don't care about people having access to. I think that has caused an added tension or added diversity in this issue as we try to establish legislation or protection. How do you apply the legislation when in fact the public has such varied opinions on what should be private and what should not?
This is another reason why I think it is key that we are doing this with opportunities for the public to come forward. It is so that we have that diversity of opinion, versus a few people in a back room trying to decide or assess what the reality is for the public.
Ms Steeves: Thank you.
Ms Mairi S. MacDonald (Individual Presentation): Good afternoon. I'm Mairi MacDonald. I'm a lawyer in private practice in Toronto. I'm also the vice-chair of the media and communications law section of the Canadian Bar Association.
Even though I don't have the official sanction of my section to say this, I think I'm on fairly safe ground - this is picking up on what Ms Hoffman has been saying and also answering the question that Ms Augustine asked our group this morning - in saying what legislators coming out of this section can do.
One of the things legislators can do and that you can do as a committee - I would strongly recommend and ask you to do this - is to ensure that the need for speed in passage of any legislation dealing with this wide range of rights and issues does not overwhelm the need for public education and consultation about the actual legislation that gets put forward.
It has been extremely interesting this morning to discuss case studies and the sorts of issues that might be raised by the case studies you put in front of us. I'm really glad you've done that, but at the end of the day, you will get more specific and focused comments if people have had an opportunity to review the proposed legislation and think about how that affects their own rights, the rights of their clients, the people they represent or the people they know, and then give you input on something focused and specific.
So I'm just asking that when the government goes through the process of tabling legislation on these matters, you recommend there be a long and serious public consultation process.
Thank you.
The Chair: I'd like to just answer that with a thought. I know how useful the information has been that has come from the bar associations. I've done copyright law with the help of your bar association and others.
But I also know that there is a very important role that you have to play on a paper that will be coming out. There will be a white paper on the changes that are required for private sector protection.
I'd like you to look at it in terms of privacy and individual human rights, not just for your clients or the industries you often represent. Let's look at it in terms of the individual in the public and make sure that privacy issues and human rights are part and parcel of the white paper.
Could you promise me to do that if I promise to do what you asked?
Ms MacDonald: I can only promise for my committee, which is media and communications law, but yes, I could certainly make that promise. I can also make sure that your message gets through to the Canadian Bar Association headquarters, which will collect and make coherent recommendations and responses to this that will come from a whole wide range of sections of legal practice. So yes, I promise.
The Chair: Thank you very much.
Ms Steeves: Thanks, Mairi.
We have a very short period of time left. There are two people left at the mike. Perhaps if you have some brief comments -
Mr. Tim Fletcher (Hamilton - Wentworth Regional Police): I'll keep it brief. My name's Tim Fletcher. I'm with the Hamilton - Wentworth Regional Police.
I guess the banks and insurance companies and what not have taken quite a hit today. It's important to remember that these are not supposed to be non-profit organizations; they have a responsibility to protect their shareholders and the corporations themselves. Therefore, they feel bound to seek an awful lot of this information because they have a fiduciary duty to do so.
The legislation we're all talking about I think has to be addressed to remove this duty from them so they don't feel duty-bound to collect it. They can say to their shareholders that they're not allowed to collect this information regardless of the benefit it would bring to them. That would then protect them from lawsuits or accusations of neglect of duty.
It would go even to the point at which we could say to these corporations - this is whatever they may be, even information resellers - that not only are you prohibited from collecting this information, but you are not even allowed to ask the person for voluntary consent. That's if we want to protect it that much.
If you submit a loan application that contains blank spaces, the bank manager looks at you and says that maybe you're not going to get this loan because you're not fully disclosing, even though they know the legislation says they're not really allowed to ask this. If you leave a blank on the form, they're going to look at you. So don't even allow them to put it on the form in the first place. It therefore doesn't become subjective. That's a thing we have to look at for basic collection points. Don't even allow them to collect it if it should not play a role in the final decision.
The Chair: I think I remember your very enlightened interventions. I think you were specifically referring to genetic information, were you not?
Mr. Fletcher: No, it applies in any case where information should not be collected. Again, we come back to the overarching principle: whether it's video surveillance or genetics or smart cards, the same principles apply in all cases. So if we don't want it collected, make it so the corporation can answer to its shareholders: they're not even allowed to ask, so they're absolved of that duty.
Ms Steeves: Thank you for your comments.
We have time for one very short comment.
Mr. Elliott Goldstein (Individual Presentation): From a very short person.
Madam Chair and members of this committee, my name is Elliott Goldstein. I'm a lawyer. I have written a book on visual evidence, which deals with video surveillance in the workplace and in the home.
Many of my clients are companies in the security industry who sell equipment, who install it, whose employees actually watch the monitors the cameras are connected to. The feedback I get from my clients is that they're very confused about the myriad of laws that apply to them. Different provinces have different types of legislation.
If the committee's going to recommend any legislation in the future, whether it's changes to the Criminal Code or even amendments to the charter or to any type of provincial or federal legislation, we ask that they deal with two issues.
The first issue is any restriction with respect to the collection of the information - that is, where the surveillance can be conducted, under what circumstances. This deals with the private sector, of course.
The second issue is what use, if any, the videotapes can be put to after they are collected if they are not going to be used for court purposes. If they're turned over to the police, they become the property of the court, and the judges regulate the admissibility of the evidence and its dissemination to the media. If not, my clients want to make sure the videotapes can be used for educational and training purposes, and they want to know what standards and guidelines there are with respect to the subsequent use of the video surveillance information: how long do they have to keep it, can they erase it, and so on.
Thank you very much.
Ms Steeves: I'd like to thank you all for your participation today. It's been a really enlightening time and the dialogue's been fantastic. Thank you.
Mrs. Finestone.
The Chair: Thank you.
Mr. Fletcher, please know that I've turned over the documents, so they will be part of the record. We will see that they're circulated to the committee.
Thank you, everybody. I hope we are able to meet the task you've put before us and the very broad range of information you've shared. See you in Fredericton.
The meeting is adjourned.