Forget Me Not: Business Challenges Over Rights to Erasure & Threats to AI

Hello, and welcome to today's webcast, brought to you by Compliance Week, IBM, and CGOC. I'm Kyle Brass with Compliance Week, and I'll be your host. Today's webcast is "Forget Me Not: Business Challenges Over Rights to Erasure and Threats to AI."

Before we hear from our presenters, let me review the agenda. We're scheduled to go for one hour. After the presentation we will have a question-and-answer session; your questions will be kept confidential and anonymous, so please don't be shy. You can ask your questions at any time using the "ask a question" function on the left-hand side of your screen, and I'll pose them to our guests at the end of the presentation. After the Q&A, I'll wrap up the webcast.

This webcast also offers CPE credit for all attendees. Please disable your pop-up blockers in order to access the exam. To obtain your CPE credit, wait until the webcast is completely over and I sign off; at that point the final examination will be presented automatically in a separate window. If you have trouble viewing the CPE test or receiving the CPE certificate, please send an email to info@complianceweek.com. I'll repeat these instructions at the end of the webcast, so if you missed anything, stay tuned.

A few other administrative details: at any time during the presentation, listeners can download the slides from the drop-down menu on the left-hand side of your screen. There you will also find the feedback form for the webcast; we welcome your thoughts, as we are always looking to improve your experience. If you wish to increase the slide size, you can hit the "view slide full-screen" button at the top right of your screen. And lastly, a Help button is located in the upper right-hand corner of your screen for assistance.

I'd like to welcome today's speakers. It's my pleasure to introduce Rachel Marmar, counsel at Davis Wright Tremaine. Rachel focuses her practice on the assessment of legal obligations and risks throughout the information lifecycle, including retention requirements, privacy, cybersecurity, defensible disposal, and the use of electronic data in litigations and investigations. We also have Andrew Saxton, senior director at FTI Consulting. Andrew is in the privacy and security practice of FTI; he specializes in the development and management of global data privacy risk and compliance programs across the financial services, TMT, healthcare, and life sciences sectors. And we also have Colleen Alemany, senior manager of the Business Information Security Office at IBM. Colleen is a seasoned product manager who works directly with product teams, driving security and privacy practices across 300 products with different deployment models; she spent the bulk of 2017 into 2018 focused on GDPR readiness of IBM products. It's great to have you all with us, and with that, I will turn it over to Rachel to get the presentation started.

Great — thank you all for joining us today.
It's really great to be here. Can everybody see the slides okay? All right — we were having some computer issues — all right, cool. So, we're here today to talk about the right to deletion, which in some ways is a proxy for talking more generally about some of the new concepts we're seeing in privacy law today. I get to speak to the theoretical piece, and then we're going to launch into some of the more practical implementation implications of what we're seeing.

I wanted to start by setting the groundwork for this dramatic shift in privacy law, particularly in the US but also globally. You're probably familiar with the GDPR, which went into effect last year, and the California Consumer Privacy Act, which was passed in June of last year and goes into effect in January 2020. These have stirred up a lot of discussion, particularly with US policymakers, about what privacy laws we should have and what principles should underpin them. I don't know how closely anyone here follows what's going on in Washington and the various hearings in Congress — if you do, wow, you're probably even dorkier than I am.
But we are seeing some coalescence around a number of basic privacy principles. I put up here, as examples, Google's privacy principles, which they released last fall, and the National Telecommunications and Information Administration's principles — those were put out last October, I believe, and sent out for notice and comment. These are just two examples, but really most of the privacy principles we're seeing come along these same lines. We're seeing people ask that organizations provide transparency, usually in the form of privacy notices, and provide individuals with some level of control — there's a lot of debate over whether it should be opt-in or opt-out, or where individuals can meaningfully have control, but that's usually an underpinning in some way. We're seeing suggestions around limitations on use and data minimization, concerns about the security of the information, and the development of individual rights. Specifically, if you look at all of the state consumer privacy bills introduced last year — we'll get into some specifics on the trends there — everyone has a right to access, a right to deletion in some cases, a right to portability (sometimes worded clunkily), often a right to correct data about the person, as well as a right to opt out.

While these would be dramatic changes in US law, they're not as dramatic in Europe. The GDPR has bigger fines and is therefore causing people to pay a lot more attention than they did in the past, but these concepts, particularly around access and the right to be forgotten, have existed in Europe for some time. In the US, this is the first time we're seeing a lot of these concepts, and seeing an attempt to implement them on a broad scale.
Generally, we're seeing that the regulatory picture across the world is growing in complexity beyond the GDPR. You have a number of countries looking at new data protection laws as well as data localization: we saw India and Brazil move toward new data protection laws last year, and China and Russia are both taking steps to try to keep their data within their borders. We're seeing a lot of places become more conscious of data breach notification — we called out Canada here, but even though all US states already have data breach notification laws, I believe something like 12 of them have introduced new such laws, or amendments to them, just in the first quarter of 2019. People are taking very seriously the obligation of companies to let individuals know when there has been unauthorized access to their data.
On the US side, we saw the California Consumer Privacy Act (CCPA) passed last June, and that has really set off a firestorm. Depending on what day you look, there are about 12 other states that have introduced copycat CCPA bills. There's also the state of Washington, which I carve out separately, in part because the Washington Privacy Act is much further along in the legislative process than any of these other bills, and also because it's not a CCPA copycat — it's actually modeled after the GDPR and has a much different structure than the CCPA. The other states looking at comprehensive consumer privacy regulation are, in large part, just taking the CCPA; sometimes they're cleaning it up so the same rights aren't repeated three times and the ordering is a little clearer, but it's the same concept and structure.
There have also been extensive discussions within Congress and the federal agencies regarding the prospect of a federal law. I think there was a lot of optimism last fall, after the elections, that a federal privacy law was inevitable — not only do consumer privacy advocates think it's potentially a good thing, but the tech industry now wants it, because they don't want a situation where 50 different states each pass completely different consumer privacy laws. However — and reasonable minds can disagree here — I tend to be fairly pessimistic on the prospect of a federal law. There have been a lot of discussions and probably half a dozen hearings over the last month or so, with different draft bills being floated from both sides of the aisle, and there is no agreement whatsoever on anything. It's important to remember that we have a divided Congress; we have a president who, at best, nobody knows what he thinks about data privacy or whether he's willing to expend political capital to get something done; and there's a real fundamental divide in Congress over preemption, where a lot of senators and congressmen don't want to vote for a law that preempts the states if it's going to be weaker than the CCPA. That leaves people asking: do we want anything at all, or do we want the CCPA, which is a very burdensome law?

So it's an exciting time to be a privacy lawyer — there's a lot going on and a lot to keep track of, and a lot of this reflects real shifts in the conversation over the last year. Whereas previously the idea of a federal law was fairly remote and the idea of widespread consumer rights was unheard of, the combination of the GDPR coming into effect and various events in US data security and data privacy last year has really changed that landscape and caused the dialogue to shift.
In particular, I want to talk about the principles underpinning the new discussions, particularly the consumer rights and the right to deletion. In evaluating both what new proposals are really asking and how organizations should look at consumer privacy, I think it helps to understand the rationale. If you look at the right to deletion, it actually comes from the idea of allowing individuals to protect themselves against government surveillance. An Oxford professor has done extensive research on the history of this right, and basically it came about because the Nazis kept extensive records on particular ethnic groups, which they then used to carry out ethnic cleansing in the 1940s. Even after the Nazis were defeated, the Stasi in East Germany kept records on people, and that was dangerous because people didn't know what was in those records and couldn't protect themselves against government action based on them. So, as Europe learned from some of the mistakes of the mid-20th century, in came the right to erasure.

The right to erasure gives the data subject the ability to reach out to an entity that holds data about them and demand that the entity erase that data. Under the GDPR, an entity does not have to respond in all situations; rather, there are particular circumstances, and if the request fits into one of those circumstances, the entity is required to delete. I'm not going to read all of them, but basically: if it was consent-based processing and the subject has withdrawn consent, they can say, "Get rid of my information." The other situation that comes up frequently is an objection to processing where the processing is based on legitimate interests and there are no overriding legitimate grounds — in that situation the entity is going to have to delete the data. The circumstances in which an entity does not have to delete data are still very broad, and I think this is one of the most misunderstood aspects of the right to deletion today: it does not afford individuals complete control, and it does not mean that anybody can force an entity to delete any information about themselves.
We then saw the right-to-deletion concept in the GDPR get copied over to the CCPA last summer. Again, this right is limited. The CCPA flipped the structure on its head: rather than saying you have to delete only in these situations, the CCPA says you have to delete unless an exception applies. A couple of things to note about the CCPA. The right to deletion only requires the deletion of data collected directly from the consumer. This has never made any sense to me from a privacy-policy perspective — perhaps some of my fellow panelists have thoughts here — but if it's about a person, if it's an observation about a person, if it's information purchased about a person from a data broker or a credit bureau, I don't see why that would be treated any differently. Then again, I tend to believe that the right to erasure is not that useful in protecting privacy, so I probably should point that out.

There are nine different exceptions under which a company can get out of deleting a California resident's information. I'm not going to go through all of them; again, like the GDPR's, these are very broad, and some of them I think you can drive a truck through. One question I get all the time is, "But what if I need this information because I have to keep accounting records for my taxes, or I have to keep HR records for a certain number of years?" You can do that — California law very clearly says you can keep information to comply with legal obligations. There's also an exception that lets you keep data for solely internal uses that are reasonably aligned with the expectations of the consumer, based on the consumer's relationship with the business. While there are some limits there, it helps to remember that the consumer's expectations are often set by the company — in its communications, in the privacy policy, and in the general nature of the way the company does business with the consumer. So upfront disclosures can save a company from having to do a lot of deletion down the road.
The right to deletion has been around in Europe for some time, and we are still seeing court battles over its scope. I think this is an interesting fight, because when you go back to where this started — keeping the government from being able to use information about people in an adverse way — you can see, when we look at the current court battles, just how far we've come from that initial intent. Google has gotten into some very high-profile litigation in the EU over the right to be forgotten. In 2014, the European Court of Justice, the high court that interprets the GDPR and other EU laws, ruled that a Spanish man did have a fundamental right to have Google remove two newspaper notices from the search results for his name. The situation was that the man had experienced some bankruptcies many years earlier, and Google's results were surfacing the bankruptcy notices published in a local newspaper — Google was basically indexing the information from that newspaper and showing links to it if you searched for this person. The court said that the newspaper did not have to take the notices down, because as a journalistic outlet it had a legitimate interest in publishing the information and keeping it up, but that Google did not have a legitimate interest in linking to information that was inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which it was processed.

Google is continuing to push the right-to-be-forgotten fight. The 2014 case was only about whether Google had to delete the search results from its search engines in the EU. There's a new case with a similar fact pattern — this time an old criminal conviction — where Google deleted the information from its EU search engines, but if you ran the search in the US it would still show up. We're still waiting for a decision on that.

That's a little bit about the underlying theory behind the right to deletion. It's a really complicated concept for organizations to apply, because most systems are not set up to delete line-level data or to apply data retention at the level of an individual consumer's request. So I'm going to turn it over to Andrew to talk about some of the more practical implications of these new laws.
Thanks, Rachel. As Rachel indicated, it's quite an onerous and often confusing set of requirements, specific to the right to be forgotten, and operationalizing it within any large global organization becomes an extremely difficult endeavor: understanding just where the data sits, how to actually execute an appropriate deletion, how to balance legitimate retention requirements against these deletion requests, and then — the core of our discussion here — when it comes to genuinely valuable uses of our data, for instance applied artificial intelligence use cases, how to balance those uses against whether we actually do, or should, delete the data. These sorts of questions are really tying organizations in knots, but ultimately, when we think about the GDPR and the CCPA, they are very much increasing the demand on organizational resources to action and answer these questions. As Rachel mentioned, that is sending shockwaves through US state and federal legislatures, but it's also changing the way we think about our data.

As I mentioned, what often comes out of the complications of these requirements is a premature — in very many cases, a premature — deletion of data. These premature deletions of personal data pose certain regulatory retention risks, but they also pose real risks to broader innovation and even to the AI functionality within organizations and some of their enterprise or consumer-facing systems. What we find in our experience nowadays is that, many times, where there is an immediate, fast-acting scramble to respond to these deletion requests, the output is a black hole of personal data that would otherwise be useful — and used — to appropriately inform our artificial intelligence systems on how to operate.
Specifically, I'm talking about this within the context of AI, so let me define what that is. AI is really the study and development of systems capable of intelligent behavior, and the way to think about AI is either as an approach or as an application. Machine learning, for instance, is probably one of the more common approaches to actually delivering AI-type functionality. In general, machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example, and learn by analogy; these learning capabilities can improve the performance of an intelligent machine over time, and machine learning mechanisms form the basis for adaptive systems. The point is that even though "AI" gets thrown around a bit, machine learning is one approach to AI, and it gives systems the ability to learn, improve, and optimize themselves without direct human involvement or explicit programming on a human's part. With that said, when we consider the really relevant use cases around machine learning, it requires upfront time and, in many instances, real personal-data capital — that's the way I think about it — to properly train a system to execute tasks appropriately without that explicit instruction. And understand that, while there are specific exemptions from responding to deletion requests in the case of fraud and certain other circumstances, moving forward and committing these deletions before really thinking them through and putting proper programmatic controls in place can have a real impact.
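To make the "learning by example" point concrete, here is a minimal, purely illustrative Python sketch — synthetic data, an invented two-feature "transaction" layout, and scikit-learn defaults, not anything from the speakers' systems — showing a classifier whose test accuracy improves as it is trained on more labeled records:

```python
# Minimal illustration of "learning by example": a classifier improves as it
# sees more labeled records. Data and feature names are invented for this sketch.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Synthetic "transactions": two numeric features; label 1 = flagged, 0 = normal.
X = rng.normal(size=(2000, 2))
y = ((X[:, 0] + 0.5 * X[:, 1]) > 1.0).astype(int)  # toy rule the model must learn

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for n in (50, 500, len(X_train)):  # more labeled training examples, better accuracy
    model = LogisticRegression().fit(X_train[:n], y_train[:n])
    print(f"trained on {n:>5} examples, accuracy = {model.score(X_test, y_test):.3f}")
```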
Think specifically about anti-money-laundering, fraud, and OFAC sanctions-screening capabilities, primarily within the financial services sector. Many of our banks and payment-processing organizations throughout the world have a number of machine-learning-capable systems performing money laundering detection, fraud detection, and even detection of OFAC-designated individuals. Given the volume of transactions at many global financial institutions, it is impractical to have even a room full of hundreds and hundreds of people reviewing potentially fraudulent transactions, so many institutions are deploying well-equipped, machine-learning-capable systems that, over time, learn fraudulent behavior, detect it, and train themselves on what the new criminal typologies may be — and it takes machine learning capabilities to grow, optimize, and change over time as a result. If we were to go in — again, without appropriate programmatic controls in place — and begin conducting large-scale deletions of the personal data used to recognize the name, address, and geolocation of known fraudsters, or to recognize personal data attributes specific to specially designated individuals on the OFAC sanctions list, we would be corrupting the pool of data that would otherwise inform our fraud engines and our money-laundering detection engines, and we would potentially be impacting our bottom line as a financial institution. So when it comes to putting appropriate programmatic controls in place to really just stop, think, and balance the risks of deletion against the benefits of deletion — in terms of broader CCPA and even GDPR regulatory compliance — it pays to have those controls in place.
Another example is borne out of healthcare. Many organizations are now deploying applied artificial intelligence and machine learning capabilities to handle back-office medical coding and billing. Again, if we were to commit large-scale deletions of personal data assets, we would potentially be spoiling the pool of data that informs those machine learning capabilities and allows us to more accurately code and bill our insurance providers and healthcare providers based on a set of attributes. So, once again, having the appropriate controls in place to understand what is being deleted before we go and delete it, and to understand the downstream implications of that deletion, is essential. Those are just a few use cases and examples.
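As a rough illustration of the kind of programmatic control being described — checking a deletion request against exemptions before anyone touches the data — here is a hedged sketch. The exemption categories, record fields, and return messages are assumptions made up for the example; whether a given exemption actually applies in a real case is a legal judgment, not something this logic decides.

```python
# Sketch of a "deletion gate": check a request against exemptions before acting.
# Exemption categories and record fields are illustrative assumptions only.
from dataclasses import dataclass, field

@dataclass
class Record:
    subject_id: str
    purposes: set = field(default_factory=set)   # e.g. {"marketing", "fraud_detection"}
    legal_hold: bool = False                     # retained to meet a legal obligation

EXEMPT_PURPOSES = {"fraud_detection", "aml_monitoring", "sanctions_screening"}

def handle_deletion_request(record: Record) -> str:
    """Return the action to take for one record, with a reason that can be logged and defended."""
    if record.legal_hold:
        return "RETAIN: subject to a legal retention obligation"
    blocked = record.purposes & EXEMPT_PURPOSES
    if blocked:
        return f"RETAIN (partial): still needed for {sorted(blocked)}; delete other uses"
    return "DELETE: no exemption applies"

print(handle_deletion_request(Record("u-123", {"marketing"})))
print(handle_deletion_request(Record("u-456", {"marketing", "fraud_detection"})))
```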
I've been talking about this idea of programmatic risk management and programmatic controls, and these graphics are really just here to raise the right questions. Within the enterprise risk management model on the left side of the screen — where we talk in terms of a third line, a second line, and a first line of defense — how do our machine learning and artificial intelligence capabilities, whether consumer-facing or internal enterprise systems, fit in? How do we properly rationalize, understand, and balance our emerging-technology risk against our broader enterprise risk management framework? For instance, within the third line of defense: how are we informing our internal audit function to properly oversee and monitor any specific risk that may be pervasive within our AI or machine-learning toolsets and capabilities, and how are we including in that monitoring and oversight a discussion of improper deletion — perhaps excessive deletion, to the point that we are breaking the tools' capabilities and functionality? Similarly, within the second line of defense: if we're building out a privacy program, a broader anti-money-laundering risk management program, or any other second-line risk management program, how are we properly advising the product developers, engineers, and technologists who reside in the first line? How are we guiding those first-line actions and ensuring that, if a deletion request were received, the front line knows how to understand and interact with that request, and knows when it is appropriate — or perhaps inappropriate — to make a deletion? It's that second line of defense that needs to establish the standards and the appropriate procedures.
That includes an understanding of what is valid and what is not. Then, when it comes to the first line of defense: how are we properly informing and educating our broader workforce to interact with these unique tools and, more importantly, to understand the broader data privacy risk implications that exist? Quite frankly, and specific to deletion, we need to let them know that simply hearing about scary rules like the GDPR and the CCPA doesn't mean personal data is a toxic asset in all instances. There are many instances in which personal data provides value, but that value needs to be balanced against appropriate protection and an appropriate understanding of what works and what doesn't in terms of the controls and execution that need to take place. On the right side of the screen, you see an example of the sort of program that might exist to manage privacy risk generally; within those pillars, you can imagine different sub-pillars that would need to include certain programmatic controls relating specifically to internal systems, many of which would potentially utilize machine learning and AI capabilities. So it's a broader emerging-risk management approach that we need to apply. With that, I'll pass it over to Colleen.

Thank you, Andrew.
I'll just move forward. So — the right to erasure and how it all plays in. Extending on what Andrew outlined: amid today's fast-paced tech innovations coming out left, right, and center, how do we handle or manage the right to erasure? On this slide we're mapping out a typical use case. All of us as consumers — and consumers generally — have a strong desire to have everything at our fingertips, everything done for us as programmatically as possible. We've seen that through some of the innovations in the apps that track what we do for exercise and how many steps we take every day, as a very simple example. One of the cases — really, many of the cases — we observed over the course of preparing for GDPR readiness was how to handle that type of consumer demand alongside, or in parallel with, what producers are putting out there to let consumers ingest all this technology. Building on AI, from the perspective of collecting the personal information needed to actually produce those outcomes or benefits, what we've seen to date is that it is typically underpinned by the need to track the behavior of a consumer — in order to detect the pattern of that particular consumer in a one-to-one model, or perhaps to provide a broader-based suggestion in a one-from-many consumer model. So it begs the question: are we looking at personal information in a single model or in an aggregated model, and then how do we handle the right to be forgotten in that scenario?
So, what should producers consider? Let's take a step back before we get into whether consent is required. From a producer's standpoint, in the absence of very explicit, detailed controls in the GDPR on this point, they need to look at their whole product area holistically and provide consistent guidance, and then look at how to handle consent. When we're talking about consent, the first piece is to look at whether the capability is essential to the service: is the artificial intelligence being surfaced, the suggestions, or the behavior modeling essential to the service? If the service is one where the end user would reasonably expect that it involves tracking, then there is a strong argument that the tracking is necessary for the service. If that is the case, you're looking at reliance on contractual necessity, so consent would not be required — but to reinforce that, and to provide transparency, you would make it clear anywhere the service is described, whether in the description of the service or in the material provided at the time of consumption.

Building on what Andrew said earlier: in the case where there is no consent in place and we are then faced with a right-to-erasure request, I think at that point we are asking whether the service would be rendered less valuable if we went ahead and executed against that erasure request. If consent was not required, are we obligated to actually delete? That is the question that remains on the table. So understanding the consequences, as Andrew pointed out, is going to be key to whether you actually take action on a right-to-erasure request when that information is part of your behavioral tracking and your ability to provide machine learning or artificial intelligence back to the consumer.
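One way to picture the legal-basis reasoning described above is as a simple decision flow. The sketch below is only an illustration of that flow under two assumed inputs — whether the tracking is essential to the service and whether it was disclosed up front; it is not legal advice and not how any particular product makes this call.

```python
# Simplified sketch of the legal-basis reasoning described above.
# The two inputs and the returned conclusions are illustrative assumptions, not legal advice.

def legal_basis_for_tracking(tracking_essential_to_service: bool,
                             disclosed_up_front: bool) -> str:
    if tracking_essential_to_service and disclosed_up_front:
        # The user would reasonably expect the tracking as part of the service itself.
        return "contractual necessity (separate consent not required)"
    if tracking_essential_to_service:
        return "contractual necessity, but improve the disclosures in the service description"
    # Tracking is a nice-to-have feature layered on top of the core service.
    return "consent required (and withdrawal may trigger an erasure obligation)"

print(legal_basis_for_tracking(True, True))
print(legal_basis_for_tracking(False, True))
```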
Lastly, a defined discipline is required in order to map out where you fall in this space with regard to the collection of personal information. It starts early on, with understanding the fundamental purpose of the collection and whether the rule of minimization has been taken into consideration: are you just inhaling all the data you can and holding it in your back pocket in case you need it to build the pattern, or are you precise about exactly what information you need to build the patterns that provide the intelligence you're looking for? That is very key early on. Secondly, you obviously need to understand whether the service can run without this intelligence — is it an additional feature, a nice-to-have, or does the service really hinge on the ability to provide the intelligence? Then you go further, understanding how, and whether, you could act on an erasure request and what the ultimate impact to the service would be if you did. And, at the end of the day, have you considered whether, in a one-to-many model, the aggregation can be anonymized in some form? Must it continue to include the personal information in order to provide the intelligence, or can you make anonymization an early, upfront requirement — effectively acting on the deletion in advance of any specific request to delete, so there is no way to map the data back to the individual? A defined discipline, with all of that scoping done early on, positions you to have the proper legal basis — or contractual basis — to produce the artificial intelligence without having to rely on consent or act on a requirement to erase.
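As a toy illustration of the "aggregate early, so there is nothing to map back to the individual" idea, the sketch below builds a one-to-many pattern (average step counts per age band) while dropping the identifier at ingestion. The schema and values are invented for this example, and in practice small cohorts would need additional care before being treated as truly anonymous.

```python
# Sketch: build the "one-to-many" aggregate without retaining a way to map back
# to an individual. The schema (user_id, age_band, steps) is an assumption.
from collections import defaultdict

raw_events = [
    {"user_id": "u-1", "age_band": "30-39", "steps": 9120},
    {"user_id": "u-2", "age_band": "30-39", "steps": 6410},
    {"user_id": "u-3", "age_band": "40-49", "steps": 7800},
]

# Keep only what the pattern actually needs (minimization): drop the identifier
# at ingestion and keep per-cohort totals. Nothing stored here ties back to u-1.
totals, counts = defaultdict(int), defaultdict(int)
for event in raw_events:
    totals[event["age_band"]] += event["steps"]
    counts[event["age_band"]] += 1

cohort_average = {band: totals[band] / counts[band] for band in totals}
print(cohort_average)   # {'30-39': 7765.0, '40-49': 7800.0}
```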
That takes us to the last slide, on how an organization should respond. At a very high level — and then I'll cue Rachel in — it's really fundamentally about getting all the protocols and practices in place, documenting your retention, and executing against what you say the retention schedule will look like. Rachel, go ahead.

Thank you. In terms of practical advice, my best advice for everybody is that the best defense to a right-to-erasure request is a good offense. As we've talked about, the idea that a consumer can hold an organization's feet to the fire and force it to go into its technical systems and define retention periods that fit that consumer, rather than the business, is not rooted in history — and I'm not entirely sure what privacy benefit it provides to the consumer, given that the organization already had the data and was able to do what it wanted with it in the first place, and given that, for a private organization in the US, at least at present, there isn't a huge threat of government action on the basis of that information.
So, first of all, what I recommend organizations do is spend some time taking a good look at your records retention schedule and what your business records retention periods are. This doesn't just mean sending your retention schedules to your outside counsel and saying, "Hey, does five years sound good to you?" — although, hey, if anybody wants it, that type of review is available. Records retention is really about the implementation; it's not just about what's written on the paper. You have to start with what's written on the paper, but if you say you're going to delete something after five years and you don't — if you aren't actually, on a regular basis, going out into your organization, working with people to identify what information falls into that five-year bucket, separating it from what falls into the four-year bucket or the "actually, we don't need this" bucket, and setting up a combination of policies and technology systems so that this can be done regularly — then having a retention schedule isn't going to help you. When you go out into the organization, it becomes a much more complicated task of working with business teams to get people to understand their responsibilities around records retention and to articulate reasonable periods for when they might need the information.
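Here is a minimal sketch of what "actually executing the schedule on a regular basis" could look like in code, assuming a single SQLite table, invented column names, and made-up five-year and four-year buckets: a scheduled job deletes whatever has aged out of its bucket.

```python
# Minimal sketch of enforcing a retention schedule on a regular basis.
# Table name, columns, and the retention buckets are assumptions for illustration.
import sqlite3
from datetime import datetime, timedelta

RETENTION = {"marketing": timedelta(days=5 * 365),   # "five-year bucket"
             "support":   timedelta(days=4 * 365)}   # "four-year bucket"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, category TEXT, created_at TEXT)")
conn.execute("INSERT INTO records (category, created_at) VALUES ('marketing', '2012-01-15')")
conn.execute("INSERT INTO records (category, created_at) VALUES ('marketing', '2024-06-01')")

def purge_expired(conn, now=None):
    """Delete rows whose age exceeds their category's retention period."""
    now = now or datetime.utcnow()
    deleted = 0
    for category, period in RETENTION.items():
        cutoff = (now - period).strftime("%Y-%m-%d")
        cur = conn.execute("DELETE FROM records WHERE category = ? AND created_at < ?",
                           (category, cutoff))
        deleted += cur.rowcount
    conn.commit()
    return deleted

print(purge_expired(conn), "expired record(s) purged")   # the 2012 row ages out
```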
So how does having a records retention schedule protect you against requests for deletion? A couple of ways. Number one: most organizations — really, I can't think of anyone I've worked with who wouldn't fall into this category — aren't doing a great job of getting rid of data right now, because of the way technology has developed over the last 20 years and the fact that we're creating so much data that it ends up in places we don't even know about. Getting rid of information means that when somebody comes and says, "Tell me what you have, give me access, delete my information," you won't have it there and won't have to respond in the first place — which is a bonus and significantly reduces your operational burden. It also helps from a data security perspective, because if you don't have the data, you don't have to worry about how to protect it. Finally, if you have a good records retention schedule and you have thought through your business justifications, it becomes very easy, when you get a request, to see whether it fits into one of the exceptions under the law. If your processing in the EU is based on legitimate interests, you know whether there is a business justification that merits keeping the data around; and from a regulatory perspective, where you've done a really robust records retention exercise and can demonstrate as an organization that you truly understand where data is used and have been thoughtful about how you set your retention periods, I think that creates some per se legitimacy for them. Similarly, in California, if you have thought through what you need the data for and why, it becomes much easier to argue that this is an internal use and it's reasonable to the consumer — because we're a business that does X, they knew our business does X, and here's the piece of paper where we've outlined all the data we need to do X.
After you do that clean-up exercise, you really need to dig into what the protocols will be when these requests come in. The time to think about whether and how to delete the information is not January 2nd of next year, when you receive the request — it's now, for a couple of reasons. One is that your intake pathways are going to take time to develop. Particularly if you are a large organization, you may have a form on your website, but requests aren't always going to come in through the website: they're going to come in through a customer service phone line, through email, and through other activities where people interact with company employees face to face. The worst thing that can happen is that a rogue customer service representative says, "Oh yeah, I'm just going to try to fix that for this person," and does something that is harmful to the business or undermines an argument you're trying to make in another consumer case — that you need this data, or that you can't do this for X, Y, Z reasons.
The other reason you need to establish your protocols up front is that data deletion in a system with complex data governance, tables, and schema — and those are all really big words, I know, for a lawyer to use — is not about one particular consumer and one particular field. When I sit with clients to talk about the CCPA, that's often the first thing they tell me: if I delete this user ID, this person's name here, then 17 other functions fail to work. Maybe some of them can be adjusted, and maybe some of them don't need to work, but unless you've had that conversation up front, you don't know how deleting something from a database is going to impact your organization.
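To illustrate the "delete this user ID and 17 other functions fail" problem, here is a small sketch with an invented two-table schema: a hard delete is blocked by a foreign key, and one pattern teams sometimes weigh instead is scrubbing the identifying fields in place so downstream references keep working. Whether that approach satisfies a particular request is a legal question, not a purely technical one.

```python
# Sketch: foreign keys make naive row deletion break downstream tables, so one
# pattern is to scrub identifying fields in place. Schema is invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, email TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER NOT NULL
                         REFERENCES customers(id), total REAL);
    INSERT INTO customers VALUES (1, 'Jane Doe', 'jane@example.com');
    INSERT INTO orders VALUES (10, 1, 59.99);
""")

try:
    # A naive hard delete fails because orders still reference the customer row.
    conn.execute("DELETE FROM customers WHERE id = ?", (1,))
except sqlite3.IntegrityError as exc:
    print("hard delete blocked:", exc)

# Scrub-in-place: remove the identifying values but keep the row, so order history
# (and the downstream functions that depend on it) keeps working.
conn.execute("UPDATE customers SET name = NULL, email = NULL WHERE id = ?", (1,))
conn.commit()
print(conn.execute("SELECT * FROM customers").fetchall())   # [(1, None, None)]
```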
You have 45 days to respond to consumer requests under the CCPA, and only 30 under the GDPR. While that sounds like a lot, I can tell you that in practice the time goes really quickly. For most of my clients, it often takes a good week for a request to reach legal in the first place. It may take additional time to confirm the person's identity — and under the CCPA, at least, the clock is running against your organization, not the consumer, while that identity confirmation process plays out. From there, if you haven't already started to look into the data, it's going to take a lot of time to figure out what you can and can't do.
It's also really important that requests be responded to consistently, because if you are put in a position where you have to defend a denial of a deletion request to a regulator, the last thing you want is the regulator looking at all of your other deletion requests and responses and saying, "Wait, here are three other situations where you did do this." You need to make sure you've thought through: this is what we're willing to do, this is what we're not willing to do, and this is where we think we have the strongest legal argument for not doing it — and make sure the same people are using that guide to make those decisions every time. It's not easy; it will take you from now until January of next year, even if you are a small organization. But given that these rights have become commonplace in discussions of consumer privacy, this is really important. Andrew, do you want to add to that?
Yeah, this is Andrew, and first of all I agree with everything you're saying, especially when it comes to having an appropriate offense and really understanding who in fact is the initial touch point in the organization for receiving, inventorying, and triaging these requests — because it's often those individuals who can either accelerate the process the most or hamper it the most. Providing appropriate education for anyone who might receive any sort of request, on how best to handle it, is really your first and best step to handling deletion requests broadly. Then, importantly, when it comes to the right to deletion, at least under the CCPA, businesses also have to instruct their downstream service providers to delete the data; and with respect to the GDPR, data controllers need to take reasonable steps to inform other controllers that they know to be processing that data. So, whether you have 30 or 45 days, you need a full accounting of all the third parties you would potentially have to contact in order to effect the appropriate deletion. As controllers — as organizations that maintain decision-making rights over that data — the onus is on you to effect that downstream deletion just as quickly as you effect your internal system deletions. And in the case of internal enterprise systems utilizing AI, we often see clients nowadays essentially outsourcing these AI capabilities to well-known vendors developing toolsets that rely on AI and machine-learning capabilities to execute whatever it is they're trying to execute. If a deletion request comes in, it's on you as the organization to have essentially a service-provider action plan: how you reach out to those vendors, how you actually execute that deletion, and how you receive confirmation back from them.
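As a rough sketch of that "service provider action plan" bookkeeping — tracking which downstream vendors have been notified and which have confirmed deletion, against the response deadline — here is an illustrative snippet; the vendor names, fields, and the 45-day default are assumptions made up for the example.

```python
# Sketch of tracking downstream deletion requests against the response deadline.
# Vendor names, fields, and the 45-day default window are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class DeletionRequest:
    subject_id: str
    received: date
    deadline_days: int = 45                      # 45 under the CCPA, roughly 30 under the GDPR
    notified_vendors: set = field(default_factory=set)
    confirmed_vendors: set = field(default_factory=set)

    @property
    def due(self) -> date:
        return self.received + timedelta(days=self.deadline_days)

    def outstanding(self, all_vendors: set) -> set:
        """Vendors that still owe a deletion confirmation."""
        return all_vendors - self.confirmed_vendors

vendors = {"crm-vendor", "analytics-vendor", "ml-platform-vendor"}
req = DeletionRequest("subject-42", received=date(2020, 1, 2))
req.notified_vendors |= vendors
req.confirmed_vendors.add("crm-vendor")

print("due by:", req.due)
print("still outstanding:", req.outstanding(vendors))
```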
All right — thank you both. I think from there we have time for a couple of questions. I know your next slide was geared a bit more toward questions, so I'm just going to ask a couple that have flowed in, and then we'll make sure we go through the rest of the slides quickly. The first question I'll ask: what if the personal information is publicly available, such as email addresses posted on resumes, on job boards, or on a LinkedIn profile?

I'm happy to speak to that one.
Under the GDPR, that's all still personal data: even if it comes from a government resource, if it is linked or reasonably linkable to a person, it's personal data. Under the California Consumer Privacy Act, it falls outside "personal information" if it comes from a government source and is used in the way intended by that source — so it is still personal information if it comes off a random internet site, but you can get out of the definition if it's from a government source. Among the other draft CCPA-style laws I've seen, the definitions vary a little, but for the most part, if it's coming from LinkedIn or some other random place on the internet, it's most likely still going to be personal information.

Thanks, Rachel. Now I'll get to one more — oh, I'm sorry, I'll yield to Andrew first.

Yeah, just really quickly.
To the extent that you as an organization are inventorying or creating essentially your own homegrown database on the back end — just to add to Rachel's point — you need to adequately safeguard and protect that data and adequately inform those individuals. That is essentially the effect of going out and scraping publicly available data, for whatever purpose you may be using it.
Thanks, Andrew. I'll get to one more question we have here: how can upfront disclosures help protect businesses from required deletion?

I'm happy to take that one, since I made the statement that inspired it. Under California law, part of the evaluation of whether you have to delete is whether you're within the reasonable expectations of the consumer, and upfront disclosures can help set what that expectation is. If you have a privacy policy that says, "Dear consumer, we sell you shoes and we only use your information to sell you shoes," it's harder to say, "Well, the consumer should have known I was going to use information about them to evaluate what shoes to produce next spring." If you define what you do to the consumer as selling shoes and providing the highest-quality, most desirable shoes, and you explain upfront that you use their data to help with that analysis, then you've set the expectation in a different way. It's a somewhat academic argument, because let's be honest: we all know people aren't reading privacy policies. But the FTC, for some reason, thinks consumers do, and being able to make that argument might help with regulators.

Thank you. On that note, I know you have a couple more slides in the queue, so I'll turn it back to you if you want to run through those quickly before we wrap up.
Sure — I'm not sure who's supposed to take this one, but I think we just put some of the existing laws into the appendix to make this a little less dry. On the current federal landscape, HIPAA is a big part of that, and it's one of the few places where individuals have rights of access, although not rights to deletion. For the previous US landscape, we put in a little summary of what has changed between a few years ago and now — I think we covered that pretty heavily already — and there's some information about CGOC in here somewhere, too.

Okay — thank you all so much.
On that note, I'm going to wrap up. Once again, if you would like a copy of the slides that were just presented, you can download them from the drop-down menu on the bottom left-hand side of your screen. Thank you so much for a very informative session. Again, today our speakers were Rachel Marmar, Andrew Saxton, and Colleen Alemany. I'd like to give special thanks to IBM and CGOC for making this webcast possible. This webcast has been recorded and will be available later today to Compliance Week subscribers on our website under the webcast tab, which also contains a library of additional CPE webcasts. If you would like to learn more about subscribing, please contact us at info@complianceweek.com. Once again, to obtain your CPE credit for this presentation, please disable your pop-up blockers in order to access the exam; the webcast will close automatically, and the final examination will be presented in a separate window. If you have trouble viewing the CPE test or receiving the CPE certificate, please send an email to info@complianceweek.com. This concludes our webcast. Thank you again for joining us, and have a great day.
