Ontario IPC issues guidance on police use of facial recognition and mug shots

European Parliament passes landmark AI Act on March 13

UK AI regulation bill receives second reading

AI Notetakers – the risks and benefits

UN adopts AI resolution which focuses on safety

Ontario school boards sue makers of Facebook, Instagram, Snapchat and TikTok

Tennessee Elvis Act – “replication of voices” by AI

Australian government proposes to implement AI changes

Podcast – Ontario IPC discusses facial recognition

Draft American Privacy Rights Act introduced

“Unlocking Health Care: How to Free the Flow of Life-Saving Health Data in Canada”: An appeal from Canada’s Public Policy Forum

“Unlocking Health Care: How to Free the Flow of Life-Saving Health Data in Canada”: An appeal from Canada’s Public Policy Forum

The Canadian Public Policy Forum’s (PPF) recent report, “Unlocking Health Care: How to Free the Flow of Life-Saving Health Data in Canada,” has received a lot of media attention since its release last week. It includes important recommendations on the need to ensure personal health information is digitally accessible to all patients and health care providers in a timely, privacy-protective and secure manner.

For context, it’s the third report in a three-part study that sought to address what it described as “the shortcomings in Canada’s precious health-care systems.” The first report dealt with accessibility to health services. The second report focused on the delivery of primary care.

Our office is a proponent of secure, privacy-protective health information systems that enable appropriate retention, access, use and disclosure of personal health information. We appreciate that there is great value in interoperable digital health records for all residents of Canada. Moreover, we know that privacy is not a barrier to innovation.

Long before the PPF wrote about the value of privacy-protective digital health innovations, Canadian privacy authorities passed a resolution entitled “Securing Public Trust in Digital Healthcare,” calling for concerted effort, leadership and resolve in implementing a modern, secure and interoperable digital health communications infrastructure.

The PPF paper is noteworthy because it explained how better health outcomes for Canadians can be achieved with “a high functioning, data- and digital rich system.” It used a series of case studies to illustrate how technology can support timely and efficient health care delivery.

The paper also provided examples of how antiquated communications systems can frustrate timely and appropriate health care services. Not surprisingly, the PPF included a number of examples relating to the use of fax machines to communicate. In one example, a physician who works in a family clinic and in a number of hospitals in their province explained how test results intended for them are regularly faxed to the wrong clinic or office.

The PPF paper noted that one consequence of these types of incidents or privacy breaches is the psychological impact on patients and physicians, who are left spending time chasing down faxes. It stated:

Canada’s continued reliance on phone calls with no return number, paper letters and fax machines impede critical referrals and prescriptions, potentially lifesaving acts of care. Our seeming inability to move beyond outmoded forms of communication delays vital treatments and extracts a psychological toll on patients and the people caring for them, who often must chase down a misdirected or overlooked fax. We cannot state it strongly enough: lives depend on this information.

The paper reported that in one study, e-referral systems shaved 21.4 days off wait times for Canadian orthopaedic surgeons compared with paper-based referrals, such as faxing.

Our office has had its own experience that illustrates how a paper-based system poses a serious risk to privacy. Since 2018, our office has opened approximately 84 files and issued 18 investigation reports involving misdirected faxes. Many of the reports involved multiple misdirected faxes.

For example, Investigation Report 045-2021, et al, involved 23 misdirected health records originating from four different trustees, and Investigation Report 164-2023, et al, involved 86 misdirected health records. As there is no requirement to report breaches to our office, we cannot know how many privacy breaches misdirected faxes have actually caused in Saskatchewan.

In Ontario, the former Information and Privacy Commissioner noted in his 2021 Annual Report that his office received 4,848 breach reports related to misdirected faxes.

The PPF paper concluded with twelve recommendations on how governments and partners can work towards making all health records digitally accessible by 2028. While all of the recommendations are presented as key to achieving this goal, I was particularly pleased to see one recommending that Canada prioritize national safeguards for the collection, analysis, sharing and use of health data.

According to the report, this includes ensuring that the privacy and security of health data are preserved in a way that maximizes the benefits for individuals and for the community at large.

The report also recommended a commitment to being paperless and interoperable, with seamless user access, by 2028, starting with eliminating the transmission of medical information by fax machines in 2024.

Finally, it recommended that e-consultations, e-referrals and e-prescriptions between all clinical service providers should be made available through fully interoperable digital health platforms.

We commend the PPF for its work and encourage you to read the report.

For any questions, contact intake@oipc.sk.ca

Cyber Security Threats – How Can You Prepare and What to Do After

Cyber security threats are a growing issue as technology and digital information continue to evolve. These incidents are malicious attempts to steal or destroy data or disrupt computer systems, and they can result in a breach of personal information or personal health information. Some common security threats include malware, phishing, and ransomware.

What steps can an organization take to reduce the risk of a cyber security incident and any potential breaches that may come from it? The following are some things to consider:

  • Keep your software and systems updated regularly.
  • Use strong passwords and change them frequently to limit the risk.
  • Use security software and a firewall to protect your network and data.
  • Use multi-factor authentication for your accounts.
  • Back up your data regularly (a minimal backup sketch follows this list).
  • Train yourself and your staff on basic cyber security principles and how to spot suspicious activity.
  • If you use an outside information technology provider or information management services provider (IMSP), be sure to have agreements in place for regular monitoring of security threats and updating of any security software.
  • Develop and follow cyber security policies and procedures.
  • Have a cyber incident management plan in place so that managing the attack can begin immediately and staff will know their role.
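
On the backup point above, the following is a minimal sketch in Python, using only the standard library and hypothetical folder names, of how a timestamped copy of a working folder might be taken. It is illustrative only; a real backup routine would also need encryption, off-site copies and periodic restore testing.

    import datetime
    import pathlib
    import shutil

    # Hypothetical locations; substitute your organization's actual paths.
    source = pathlib.Path("working_records")
    backup_root = pathlib.Path("backups")

    # Timestamp each backup so earlier copies are never overwritten.
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    destination = backup_root / f"working_records-{stamp}"

    shutil.copytree(source, destination)
    print(f"Copied {source} to {destination}")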

A cyber security incident has occurred – now what?

Implement your cyber security incident management plan which may include things like the following:

  • Identify potential evidence, preserve it, and ensure nothing is lost or damaged.
  • Isolate your network from the Internet and activate your incident response plan.
  • Take note of who was present in your organization before, during, and after the incident.
  • Appoint a point of contact for law enforcement officers to speak to directly and gather information about the incident.
  • Document the report number provided to you by law enforcement.
  • Anticipate law enforcement may need access to your equipment to analyze the technological components of the cyber incident. The police will work with you to collect evidence while minimizing the impacts to your business and recovery efforts.
  • Provide logs, employee statements, emails, and other similar items as potential evidence.
  • Produce a list of key contacts within your organization for law enforcement.
  • Communicate the incident to staff, business associates, clients, and partners.
  • Review your cyber security policies and ensure your staff receive training.
  • Consider purchasing anti-malware and anti-virus software for your network and devices.
  • Enhance your data security with protective measures (e.g., firewalls, virtual private networks, encryption).
  • Prepare your organization for the possibility of testifying in court.

Source: Government of Canada (November 2021). Have you been a victim of cybercrime? https://www.cyber.gc.ca/en/guidance/have-you-been-victim-cybercrime

Our office has issued some investigation reports involving this topic:

Investigation Report 009-2020, 053-2020, 224-2020

Investigation Report 398-2019, 399-2019, 417-2019, 005-2020, 019-2020, 021-2020

Investigation Report 370-2022

Investigation Report 098-2021

Some resources available for information on these types of incidents:

Chatbots and Security

Ransomware

Ransomware – What Everyone Should Know

Security and Phishing Presentation

S2 – Episode 7: Unmasking digital threats: How to guard against cyber crime

 

Raising Awareness of the Facts about Fax

The ongoing use of traditional fax machines to send personal information and personal health information by government institutions and trustees continues to raise privacy concerns. My office and Canada’s other privacy commissioners and ombudspersons called for a concerted effort to phase out the use of traditional fax machines in a September 2022 resolution, which can be found here. We understand that developing this plan will require broad consultations and additional resources. However, we continue to urge organizations to address this problem on an urgent basis. Public trust and confidence in organizations’ ability to protect Saskatchewan residents’ personal information and personal health information hang in the balance.

In the meantime, we continue to receive complaints and breach reports involving misdirected faxes caused, in part, by human error. Staff may enter a number in the fax machine incorrectly, fail to comply with policies that require the use of pre-programmed fax numbers, or rely on fax numbers found through unverified sources, such as Google. These errors are often caused by inattention or by a lack of awareness or training on applicable policies. Our office issued an investigation report in November 2022 involving two Saskatchewan Health Authority employees who each entered an incorrect fax number. One fax was sent to a town instead of a public health office; the other was sent to the Parole Board of Canada’s office instead of a physician.

Trustees should be aware that the shift from traditional fax machines to digital fax solutions is not sufficient, by itself, to reduce privacy risks. This was shown in Investigation Report 164-2023, et al, which involved 12 different trustees and numerous misdirected faxes. In most cases, the trustees used digital faxing systems. The breaches occurred when staff sent faxes intended for one physician to a different physician with the same last name. In some cases, the faxes were misdirected because the employee involved did not receive clear direction on the recipient. In other cases, the fax was misdirected because of errors in the physician directory or because the employee chose the wrong physician from a drop-down list in the directory.

In September 2020, my office issued guidance on safeguards to prevent misdirected faxes, titled Faxing PI and PHI. While plans are being developed to discontinue the use of traditional fax machines, every effort must be made to ensure that appropriate safeguards are in place to prevent faxes from going astray. We encourage all organizations to revisit this guidance.

To help ensure that staff are aware of their need to comply with existing policy and to exercise caution when faxing, we have developed a poster that you can download and place in key areas.

Remember that a policy is not enough! Creating a privacy-sensitive culture requires organizations to raise awareness of privacy risks and to provide appropriate training.

For any questions, contact intake@oipc.sk.ca

Privacy Matters

Advocate’s Report on Independent Schools

The Saskatchewan Advocate for Children and Youth issued her investigation report regarding independent schools in December 2023. She made 36 recommendations, a number of which relate to access and privacy. The Freedom of Information and Protection of Privacy Act (FOIP) and The Local Authority Freedom of Information and Protection of Privacy Act (LA FOIP) deal with the collection, use, and disclosure of personal information, and the protection of that information, by government institutions or local authorities. With that in mind, I note the following recommendations in the Advocate’s report:

External Accountability and Participation Rights of Young People

Recommendation 4: The Government of Saskatchewan amend The Registered Independent Schools Regulations to recognize the right and entitlement of all pupils of sufficient maturity to immediate access to all procedures established by the board of a registered independent school for the purposes of investigation and mediation of any differences or conflicts with the independent school.

Recommendation 5: The Government of Saskatchewan amend section 35 of The Registered Independent Schools Regulations to recognize the right and entitlement of all pupils of sufficient maturity attending, or having previously attended, registered independent schools to independently access their own records.

Recommendation 6: The Government of Saskatchewan amend section 35 of The Registered Independent Schools Regulations to protect the right of all pupils under 18 years of age from disclosures of information that would constitute an unreasonable invasion of the pupil’s privacy.

Recommendation 7: The Ministry of Education amend the Registered Independent Schools Policy and Procedures Manual to reflect changes made to The Registered Independent Schools Regulations related to access to records and protection of privacy, as recommended in this report.

Recommendation 8: The Government of Saskatchewan amend section 148 of The Education Act, 1995 to recognize the right and entitlement of all pupils of sufficient maturity to immediate access to procedures established by the board of education or the Conseil scolaire fransaskois for the purposes of investigation and mediation of any differences or conflicts with the school.

Data on Learning Output

Recommendation 25: The Ministry of Education review its processes of collection, entry, storage and tracking of data from registered independent schools on learning outputs, make improvements to ensure the accuracy of data in Ministry records and develop policy and procedures on these processes.

It is clear independent schools serve the needs of parents and their children. The more transparent an independent school is, the more parents and students will know what is happening in their school. At the same time, the better protected student information is, the more comfortable parents and students will be regarding their personal information. Finally, the easier it is for parents or students to access their information, the more comfortable they will be that the independent school is collecting the proper information and making sure that information is accurate.

Regular school boards in Saskatchewan have these obligations as they are subject to the rules of collection, use, disclosure and protection outlined in LA FOIP.

I have written to the Minister of Education requesting that his government make independent schools local authorities under LA FOIP. I am hopeful that the Minister will give serious consideration to doing so.

 

Is De-identified Information Personal Information?

Now and then, our office receives requests for review where a public body (a government institution, local authority or health trustee) has denied access pursuant to subsection 29(1) of The Freedom of Information and Protection of Privacy Act (FOIP), subsection 28(1) of The Local Authority Freedom of Information and Protection of Privacy Act (LA FOIP) or subsection 27(1) of The Health Information Protection Act (HIPA), on the basis that the records contain personal information or personal health information. Therefore, I thought it may be helpful to explore whether de-identified information is personal information.

To qualify as personal information, the information must: 1) be about an identifiable individual; and 2) be personal in nature. Information is about an “identifiable individual” if the individual can be identified from the information (e.g., their name is provided) or if the information, when combined with information otherwise available, could reasonably allow the individual to be identified. To be “personal in nature” requires that the information reveal something personal about the identifiable individual.

One of the most effective ways to protect the privacy of individuals is strong de-identification. Proper de-identification techniques, combined with re-identification risk management procedures, remain among the strongest and most important tools for protecting privacy.

“De-identification” is the general term for the process of removing personal information from a record or data set.

“De-identified information” is information that cannot be used to identify an individual, either directly or indirectly. Information is de-identified if it does not identify an individual, and it is not reasonably foreseeable in the circumstances that the information could be used, either alone or with other information, to identify an individual.

Subsection 2(1)(d) of HIPA defines “de-identified personal health information” as personal health information from which any information that may reasonably be expected to identify an individual has been removed. This is important as subsection 3(2)(a) of HIPA provides that HIPA does not apply to “statistical information or de-identified personal health information.”

The goal is to reduce the risk that information can be re-identified once it has been de-identified. The following table shows four states of data with decreasing probability of re-identification:

State – Description
1. Identifiable data – The data contain directly identifying variables or sufficient quasi-identifiers that can be used to identify the individual.
2. Potentially de-identified data – Manipulations have been performed on the directly identifying variables, but attempts to disguise the quasi-identifiers may be insufficient. The data may not be fully de-identified, may be partially exposed, and may still represent a re-identification risk.
3. De-identified data – An objective assessment of re-identification risk has been done, concluding that all directly identifying variables have been adequately manipulated and quasi-identifiers adequately disguised to ensure an acceptable level of re-identification risk.
4. Aggregate data – Summary data, such as tables or counts, in which there are no identifying variables or quasi-identifiers.
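
To make the progression in the table more concrete, here is a minimal sketch in Python (assuming the pandas library and entirely hypothetical column names and values) of moving from identifiable data to de-identified and then aggregate data:

    import pandas as pd

    # Hypothetical records; all names, numbers and values are illustrative only.
    records = pd.DataFrame({
        "name":          ["A. Smith", "B. Jones", "C. Chan"],
        "health_number": ["123456789", "987654321", "555000111"],
        "birth_date":    ["1985-03-14", "1985-07-02", "1992-11-30"],
        "postal_code":   ["S4P 3Y2", "S4P 1B9", "S7K 0J5"],
        "visit_reason":  ["flu", "flu", "fracture"],
    })

    # Remove directly identifying variables (name, health number).
    deidentified = records.drop(columns=["name", "health_number"])

    # Disguise quasi-identifiers: full birth date becomes birth year,
    # full postal code becomes the forward sortation area (first three characters).
    deidentified["birth_year"] = pd.to_datetime(deidentified.pop("birth_date")).dt.year
    deidentified["fsa"] = deidentified.pop("postal_code").str[:3]

    # Aggregate data (state 4 above): counts only, with no identifying variables left.
    aggregate = deidentified.groupby("visit_reason").size().rename("count")
    print(deidentified)
    print(aggregate)

Removing and generalizing fields in this way does not, by itself, reach state 3 in the table; an objective assessment of re-identification risk, taking into account what other information is available, is still required.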

For further explanation regarding de-identified information, please refer to our resources available on our website: IPC Guide to FOIP – Chapter 6 and IPC Guide to LA FOIP – Chapter 6.

Public bodies may find the following recent review reports issued by our office helpful on this topic:

  • IPC Review Report 060-2023 – in this Review Report, at paragraph [19], the Commissioner found that the “claim numbers” assigned to individuals by Saskatchewan Government Insurance (SGI) were personal information pursuant to subsection 24(1)(d) of FOIP. However, once those claim numbers were redacted, the personal health information attached to them, such as the reasons for doctor appointments, became de-identified information and was releasable.
  • IPC Review Report 063-2023 – in this matter, the Ministry of Health denied access to a spreadsheet of 18 columns pursuant to subsections 29(1) of FOIP and 27(1) of HIPA. However, the Commissioner found that once a few columns of personal information were redacted pursuant to subsection 29(1) of FOIP, the remaining data in the spreadsheet were sufficiently de-identified and were releasable.

Hopefully, the above will assist you in successfully de-identifying personal information or personal health information. For any questions, please contact our office at intake@oipc.sk.ca.

 

How has the Pandemic’s Seismic Shift to Increased Remote Work Affected Privacy?

A recent poll by IPSOS, a global market research and public opinion specialist, revealed that approximately 36% of people who worked from home during the pandemic anticipate a return to their offices on a regular basis in the near future. This means that more than half of those employees expect a work-from-home environment to be their new normal, at least in the short term. For many organizations and institutions, the transition toward work-from-home structures means employers have had to create additional privacy policies and procedures or amend existing ones. As more work-from-home options are created for employees, concerns grow about safeguards for both client/customer privacy and employee privacy, including the proper use and storage of client personal information, the wider scope of employee monitoring, and the security of work devices and internal work communications.

From purchase order patterns to credit card information to home addresses, clients’ personal information is collected, used and disseminated by organizations/institutions regularly. Employee records such as payroll information, attendance reports, formal and informal personnel files, resumes, records of web-browsing, discipline notes, sick notes and evaluations also circulate within workspaces as needed.

To protect clients’ and customers’ personal information, organizations should require employees to use strong remote access controls such as multi-factor authentication, a Virtual Private Network (VPN) with end-to-end encryption and a secured Wi-Fi network. In addition, staff should obtain permission before installing any non-approved software or apps. If employees are required to use their personal devices for work (a practice that is not recommended), employers should remind them of safe privacy practices. Regarding home workspaces, the Information and Privacy Commissioner of Ontario advises employers to remind their staff to set up private workspaces where private information, conversations and meetings can be properly safeguarded. For more tips, see our blog, Working from home.
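
As one small, hypothetical illustration of protecting client information stored on a remote device, the sketch below uses the third-party Python "cryptography" package to encrypt a note with a symmetric key. In practice, the key would come from the employer's key-management process rather than being generated ad hoc, and controls such as the VPN and multi-factor authentication mentioned above address the rest of the picture.

    from cryptography.fernet import Fernet  # third-party "cryptography" package

    # In practice the key comes from the employer's key-management process,
    # not generated on the spot on a personal device.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    note = b"Client A - renewal discussion, phone 306-555-0142"  # hypothetical content
    ciphertext = fernet.encrypt(note)

    # Only someone holding the key can recover the original note.
    assert fernet.decrypt(ciphertext) == note
    print(ciphertext[:32])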

The Office of the Privacy Commissioner of Canada instructs organizations to consider the privacy rights of their employees when monitoring their work at home. This is important because, if not implemented properly, digital surveillance of employees can lead to heightened stress levels, reduced autonomy and creativity, and other mental health effects. Regarding safeguarding employee personal information in a work-from-home context, the Information and Privacy Commissioner of Ontario encourages organizations to remind staff that office and security policies also apply when working from home and that staff need to report any security breach incidents immediately.

Furthermore, organizations should create a longer-term work-from-home strategy that includes monitoring and evaluating the effectiveness of access and privacy practices in a remote context, updating corporate files, and keeping records securely. As more employers shift toward increased remote work arrangements and the use of monitoring technologies, privacy authorities have called on governments to develop or strengthen laws to protect privacy. They have also asked employers to be more transparent and accountable in their workplace monitoring practices.

 

Access, Privacy, Children and Joint Legal Custodians (updated)

Commissioner Kruzeniski’s blog Who Signs for a Child? (updated) described the rules under The Freedom of Information and Protection of Privacy Act (FOIP), The Local Authority Freedom of Information and Protection of Privacy Act (LA FOIP), and The Health Information Protection Act (HIPA) applicable to legal custodians.

The Commissioner explained that subsections 59(d) of FOIP, 49(d) of LA FOIP and 56 of HIPA give a legal custodian the right to sign on behalf of their child. He added that, depending on the terms of any applicable court order or agreement, one or both parents could sign for the child.

Since publishing that blog, our office conducted a privacy investigation involving two parents who had signed an Interspousal Agreement that included a provision that they would have joint custody of the children of the marriage. The parents disagreed about whether one of their children’s information should be disclosed to a stepparent.

In Investigation Report 083-2022, the Commissioner found that where two legal custodians, with equal rights and responsibilities under an Interspousal Agreement, disagreed, the wishes of one legal custodian could not prevail over the wishes of the other.

This raised the question “How does the head of a local authority or institution, or trustee manage access to information requests or consents to collection, use or disclosure involving children where their joint legal custodians disagree?”

There is no requirement for a head or trustee to canvass the views of every legal custodian to satisfy themselves that the custodian making a request or signing on behalf of a child is doing so with the agreement of the other. However, where a head or trustee is aware that one of the joint legal custodians does not agree with a request or consent provided by the other, they should not rely on the direction of only one legal custodian.

When determining whether legal custodians have equal rights and responsibilities, heads and trustees will need to consider subsection 3(1) of The Children’s Law Act, 2020 (CLA) which provides:

3(1) Unless otherwise ordered by the court and subject to subsection (2) and any agreement pursuant to subsection (3), the parents of a child are joint legal decision makers for the child, with equal powers and responsibilities.

If there is a court order or agreement between the parties, the legal rights and responsibilities of the parents will be determined by the applicable order or agreement.

The rights of joint custodial parents were considered in a recent review of an access decision. In the Commissioner’s Review Report 175-2022, a joint custodial parent (Applicant) sought access to their 17-year-old child’s personal health information from eHealth Saskatchewan (eHealth). There was a court order in place which stated that the Applicant and the child’s father had joint custody of their child. Based on a review of the court order, the Commissioner found that the Applicant was the child’s legal custodian. However, unlike the circumstances in Investigation Report 083-2022, in this case, there was no evidence before the Commissioner or eHealth that the child’s father objected to the disclosure of the child’s personal health information. Therefore, the Commissioner found that there was no obligation on eHealth to canvass the views of the child’s father in making its decision on whether to grant the Applicant access to the requested records.

However, it should be noted that the Commissioner found that the child was a mature minor and was capable of exercising their own rights and powers. Therefore, he found that the Applicant did not have the ability to exercise their child’s right of access pursuant to subsection 56(d) of HIPA.

 

 

 

Principles for Responsible, Trustworthy and Privacy-Protective Generative AI Technologies

Artificial intelligence is transforming the business world, including hiring, auditing, accounting and forecasting processes (Khoury, Richard, “Artificial Intelligence in Canadian Industry”). It is expected to improve health care and change the way it is delivered, such as by increasing diagnostic accuracy, improving treatment planning and forecasting outcomes of care (CMPA, “The Emergence of AI in Healthcare”). Governments are exploring the use of AI to improve teaching and learning and to support innovative education systems. In the education sphere, AI is reported to have the capability of creating personalized learning experiences and optimized curricula.

To address the exponential growth in the use and development of generative AI solutions and tools, Canada’s federal, provincial and territorial privacy regulators have released joint guidance for responsible, trustworthy and privacy-protective generative AI technologies.

The guidance describes “generative AI” as follows:

a subset of machine learning in which systems are trained on massive information sets – often including personal information – to generate content such as text, computer code, images, video, or audio in response to a user prompt. The content is probabilistic, and may vary even in response to multiple uses of the same or similar prompts.

The guidance is intended to help organizations developing, providing or using generative AI apply nine key Canadian privacy principles.

It reminds the reader that they may have further obligations, restrictions or responsibilities under other laws, regulations or policies. It includes an important note regarding the need for extra caution to identify and prevent risks to vulnerable groups, including children and groups that have historically experienced discrimination or bias.

Highlights from the nine principles are:

  1. Legal authority and consent – identify and document the legal authority for the collection, use, disclosure and deletion of personal information during the training, development, use or decommissioning of a generative AI system. Where consent is the legal authority, ensure it is valid.
  2. Appropriate purposes – in many Canadian jurisdictions, this means that personal information should be collected, used or disclosed for purposes that a reasonable person would consider appropriate in the circumstances.
  3. Necessity and proportionality – consider whether the use of a generative AI system is necessary and proportionate particularly where it may have a significant impact on individuals or groups of individuals. Use anonymized, synthetic or de-identified data rather than personal information where the latter is not required to fulfill the identified appropriate purposes.
  4. Openness – be open and transparent about the collection, use and disclosure of personal information and the potential risks to privacy. Organizations using generative AI systems should advise affected parties how the system will be used to make a decision or take an action, and about the potential outcomes and safeguards in place.
  5. Accountability – be accountable for compliance with privacy legislation and principles and make AI tools explainable.
  6. Individual access – develop procedures that enable individuals to meaningfully exercise their right of access to personal information collected about them during use of the system and to personal information contained in the AI model.
  7. Limiting collection, use and disclosure – limit the collection, use and disclosure of personal information to what is needed to fulfill the explicitly specified, appropriate, identified purpose. Use anonymized or de-identified data where possible (a minimal prompt-redaction sketch follows this list).
  8. Accuracy – ensure personal information used to train generative AI models and entered into a generative AI prompt is as accurate, complete and up-to-date as necessary for the purposes.
  9. Safeguards – put in place safeguards to protect personal information and mitigate potential privacy risks.
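
As one narrow, hypothetical illustration of principles 3 and 7, the following Python sketch strips a few common direct identifiers from a prompt before it is submitted to a generative AI system. The patterns and example values are assumptions for illustration only; a real deployment would need far more thorough detection and, wherever possible, should avoid putting personal information into prompts at all.

    import re

    # Illustrative patterns only; real systems need much broader coverage.
    PATTERNS = {
        "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
        "PHONE": re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),
        "NINE_DIGIT_ID": re.compile(r"\b\d{9}\b"),  # e.g., a health services number
    }

    def redact(prompt: str) -> str:
        """Replace matched identifiers with placeholders before the prompt leaves the organization."""
        for label, pattern in PATTERNS.items():
            prompt = pattern.sub(f"[{label} REMOVED]", prompt)
        return prompt

    raw = "Summarize the chart notes for jane.doe@example.org, phone 306-555-0199."
    print(redact(raw))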

For further information, please see this news release and Commissioner Philippe Dufresne’s remarks to the Privacy and Generative AI Symposium on December 7, 2023.

If you or your organization is considering developing or using generative AI systems, you may wish to contact our office for general feedback. For more information about our consultation process, please see this Consultation Request Form.

 

 

Canadian privacy regulators launch principles for the responsible development and use of generative AI

Federal, provincial and territorial privacy authorities have launched a set of principles to advance the responsible, trustworthy and privacy-protective development and use of generative artificial intelligence (AI) technologies in Canada.

The authorities introduced the principles during an international symposium on privacy and generative AI that was hosted in Ottawa by the Office of the Privacy Commissioner of Canada.

While AI presents potential benefits across many domains and in everyday life, the regulators note that there are also risks and potential harms to privacy, data protection, and other fundamental human rights if these technologies are not properly developed and regulated.

Organizations have a responsibility to ensure that products and services that are using AI comply with existing domestic and international privacy legislation and regulation.

The joint document lays out how key privacy principles apply when developing, providing, or using generative AI models, tools, products and services. These include:

  • Establishing legal authority for collecting and using personal information, and when relying on consent ensuring that it is valid and meaningful;
  • Being open and transparent about the way information is used and the privacy risks involved;
  • Making AI tools explainable to users;
  • Developing safeguards for the protection of privacy rights; and
  • Limiting the sharing of personal, sensitive or confidential information.

Developers are also urged to take into consideration the unique impact that these tools can have on vulnerable groups, including children.

The document provides examples of best practices, including implementing “privacy by design” in the development of the tools and labelling content created by generative AI.

Related Content:

Joint Guidance: Principles for responsible, trustworthy and privacy-protective generative AI technologies

For more information:

Julie Ursu, Manager of Communication
Telephone: 306-798-2260
Email: jursu@oipc.sk.ca

Privacy Savvy Children and Youth

Various studies on how Canadian children and youth use technology have reached similar conclusions. One study found that about 90% of young people aged nine to 11 have at least one social media account. The same study found that about 80% of young people aged nine to 17 have their own smartphone, with many having received their first phone by age 11. Other studies have found that young people spend two or more hours online every day.

As children and youth become more tech savvy, though, are they also becoming more privacy savvy? Social media or other internet activity may appear free, but participating almost always comes at a cost to personal privacy.

Any online presence comes with its own set of privacy concerns or risks, regardless of age. For children and youth, however, the risks can be greater. Besides the possibility of accessing harmful or inappropriate content, children and youth may also face risks to their privacy and safety. They may share more online than they intend to or should. They may also use apps that reveal their location, which can let anyone know exactly where they are. This can make them easy targets for predators or others who mean them harm.

There is also the fact that once you put something online, it is very difficult – sometimes impossible – to remove it or to take it back. This can lead to reputational harm and, if the information you put out is used against you, to heightened feelings of anxiety and depression.

Being privacy savvy means having the practical knowledge needed to make good decisions or judgements about your online privacy. Online privacy means protecting your personal information and knowing what trail of personal information you leave behind. Personal information is anything directly related to your personal life, such as your name, date of birth, home address, telephone number, list of contacts, where you go to school, etc.

Parents can start helping their kids become privacy savvy online by teaching them the fundamentals of internet privacy and what happens to their personal information when they go online. Many online resources for this exist, including the following:

Don’t let your kids just be tech savvy – to keep them and their personal information safe, also teach them to be privacy savvy.