Richmond Journal of Law and Technology

The first exclusively online law review.

Is Legal Action the Best Way to Curtail Cyberbullying? Instagram Just Offered Another Option


By: Etahjayne Harris

Instagram, one of the most popular social networks worldwide, has over 500 million monthly active users as of September 2016.[1] The photo-sharing app enables users to take and upload pictures and videos and to share them publicly or privately on the app, as well as on a variety of other social networks, such as Facebook and Twitter. For teens, “Instagram is much more than a medium to share photos on—it’s an extension of their identities.”[2] Instagram co-founder Kevin Systrom has said that the app was created to “make it easy for people to share their lives in a beautiful way.”[3] While Instagram’s intended use and purpose are positive, the app has unfortunately been used as a tool for cyberbullying. Teens may be particularly susceptible to cyberbullying because of the significant role social media plays in their lives.

The National Conference of State Legislatures defines cyberbullying as “the willful and repeated use of cell phones, computers, and other electronic communication devices to threaten others.”[4] A recent study by the Cyberbullying Research Center (CRC) found that over 25% of middle school and high school students have been cyberbullied in their lifetime.[5] While cyberbullying can take place on any number of social media platforms, such as Facebook, Twitter, or Snapchat, it has been argued that cyberbullying on Instagram is especially bad “because it’s a very public platform that people use to post photos of themselves—inviting everyone and anyone to judge their appearances in the comment sections.”[6] For teens whose social media presence is closely tied to their self-identity, the effects of cyberbullying on that identity are particularly worrisome.

You may wonder, amid these distressing statistics, whether Instagram has taken any measures to mitigate its cyberbullying problem and whether there are legal consequences for cyberbullying. On September 12, 2016, Instagram implemented an update that allows its users “to block ‘inappropriate comments’ on their posts and set filters for specific words.”[7] This gives users more control over which comments appear on their pictures, beyond simply deleting unwanted comments or blocking specific users. In a statement released the same day, Instagram co-founder Kevin Systrom said that the app is working toward “keeping Instagram a safe place for self-expression.” The update gives users a chance to push back against cyberbullies and inappropriate comments generally. But while it gives users more control, what are the legal remedies for victims of cyberbullying?
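For readers curious about the mechanics, the short Python sketch below illustrates, in simplified form, how a user-defined word filter of this kind can work: a comment is hidden if it contains any word on the user’s blocked list. This is a hypothetical illustration only; Instagram has not published its implementation, and the blocked-word list and function names here are invented for demonstration.

import re

# Hypothetical, user-chosen list of words to filter (not Instagram's actual defaults).
BLOCKED_WORDS = {"ugly", "loser", "stupid"}

def is_hidden(comment: str) -> bool:
    """Return True if the comment contains any word the user has chosen to filter."""
    words = re.findall(r"[a-z']+", comment.lower())
    return any(word in BLOCKED_WORDS for word in words)

# Example: the first comment is shown, the second is hidden.
for comment in ["You looked great at the game!", "What a loser"]:
    print("hidden" if is_hidden(comment) else "shown", "-", comment)

Real-world systems are more sophisticated, but the basic idea of matching incoming comments against a user-maintained list before they are displayed is what gives users the added control described above.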

Nearly every state has enacted some form of student cyberbullying statute.[8] To be considered cyberbullying, information technology must be used to “deliberately threaten, harass, or intimidate another person.”[9] Under many state statutes, schools may be required to specifically address and correct cyberbullying through their policies.[10] The swift spread of social networking apps like Instagram and Facebook has been met with an increase in cyberbullying litigation in both federal and state courts.[11] A frequent issue in applying these student cyberbullying laws is determining whether a school is responsible for protecting students from off-campus online harassment.[12] To date, there is no clear answer.

In spite of these student cyberbullying laws, finding a legal remedy for cyberbullying is complicated by the fact that cyberbullying encompasses such a wide range of behavior. A claim is not necessarily viable simply because the victim was offended by a comment on an Instagram post; a claim is generally viable only if the alleged conduct violated a criminal statute, violated a state student cyberbullying law, or constituted a traditional civil tort.[13] Finding legal relief is further complicated by the fact that a cyberbully can post or comment anonymously on social media apps like Instagram, making it difficult to trace the origin of harassing comments. So while there are legal remedies in place for teen victims of cyberbullying, those remedies may be difficult to obtain. For now, it appears that the most feasible way to combat cyberbullying on Instagram is to stop it in its tracks by using the new comment filter option.

 

[1] See Instagram, Number of monthly active Instagram users from January 2013 to June 2016 (in millions), Statista, http://www.statista.com/statistics/253577/number-of-monthly-active-instagram-users/ (last visited Sept. 13, 2016).

[2] Nina Godlewski, If you have over 25 photos on Instagram, you’re no longer cool, Tech Insider (May 26, 2016), http://www.techinsider.io/teens-curate-their-instagram-accounts-2016-5.

[3] Michael Noer, A Conversation with Instagram’s Co-Founder Kevin Systrom, Forbes (Apr. 9, 2012), http://www.forbes.com/sites/michaelnoer/2012/04/09/a-conversation-with-instagrams-co-founder-kevin-systrom/#6c6ae93d5d64.

[4] Cyberbullying, National Conference of State Legislatures (Dec. 14, 2010), http://www.ncsl.org/research/education/cyberbullying.aspx (last visited Sept. 13, 2016).

[5] Sameer Hinduja and Justin Patchin, Cyberbullying Victimization (Feb. 2015), http://cyberbullying.org/2015-data.

[6] Elise Moreau, What is a Troll, and What is Internet Trolling, About Tech (Feb. 25, 2016), http://webtrends.about.com/od/Internet-Culture/a/What-Is-Internet-Trolling.htm.

[7] Brett Molina, Instagram Update Lets Users Filter Comments, USA Today (Sept. 12, 2016), http://www.usatoday.com/story/tech/news/2016/09/12/instagram-update-lets-users-filter-comments/90260384/.

[8] See Gary D. Nissenbaum and Laura J. Magedoff, Potential Legal Approaches to a Cyberbullying Case, The Young Lawyer Vol.17, No. 9 (Aug. 2013), http://www.americanbar.org/publications/young_lawyer/2012-13/july_august_2013_vol_17_no_9/potential_legal_approaches_to_a_cyberbullying_case.html.

[9] Id.

[10] See id.

[11] See id.

[12] See id.

[13] See id.

Image Source:

http://cdn2.hubspot.net/hub/176530/file-18530001-jpg/images/cyberbullying.jpg

Apple’s Latest Lawsuit Arises From the iPhone Defect “Touch Disease”


By: Will MacIlwaine

On September 7, Apple held its annual fall event. The event featured the introduction of the Apple Watch 2, as well as the iPhone 7, which is the first iPhone not to include a headphone jack.

While the fall event might suggest that things are continuously heading in the right direction for the innovative company, a defect associated with the iPhone 6 and 6 Plus devices suggests otherwise. In the United States District Court for the Northern District of California, three iPhone users have filed a class-action lawsuit against Apple for the defect that has been coined “Touch Disease.”[1]

Inside the affected iPhone models are chips that allow a user’s finger and the screen to interact.[2] For some users, these chips are not correctly secured to the logic board of the phone, and fail as a result of the consumer’s normal use of the device.[3] The plaintiffs in this case claim that Apple concealed this defect, which “causes the touchscreens on the iPhones to become unresponsive and fail for their essential purpose as smartphones.”[4] The defect also causes a gray flickering bar to appear at the top of the device’s screen.[5]

All three plaintiffs, Todd Cleary, Jun Bai, and Thomas Davidson, have experienced the “Touch Disease” issue.[6] All three were told that they could pay an additional fee of over $300 for a replacement phone.[7]

The plaintiffs note that the previous iPhone 5S design included a metal shield to protect the device’s logic board, allowing the phone to better accommodate reasonable use by the consumer.[8] Additionally, the iPhone 5c design used an “underfill” mechanism to reinforce the chips at issue and protect them from normal wear and tear.[9] According to the plaintiffs, the iPhone 6 and 6 Plus models carry neither of these protective features.[10]

The plaintiffs claim that Apple has done nothing to remedy the defect, even though it has had knowledge of the issue for some time.[11] Apple gained this knowledge, according to the plaintiffs, through the records of customer complaints, repair records, and various customer claims.[12] The plaintiffs state that Apple’s lack of action is a result of “unfair, deceptive and/or fraudulent business practices.”[13] They even go as far as to argue that, had they not relied on Apple’s representations regarding the quality of the product, they would have paid less for the iPhones, or would not have purchased them at all.[14]

In their argument, the plaintiffs first posit that Apple engaged in unfair and deceptive acts in violation of the California Consumers Legal Remedies Act (“CLRA”) by knowingly and intentionally concealing from customers the fact that the iPhones were faulty.[15] In a general sense, the plaintiffs argue that Apple misrepresented the product, representing that the phones had certain characteristics, uses, or benefits that they did not have.[16] They also argue that Apple misrepresented the quality of the product and advertised the phones with the intent not to sell them as advertised.[17] Because Apple is in a better position to know the true state of the touchscreen defect, the plaintiffs contend that Apple owed customers a duty to disclose the issue.[18]

Further, the plaintiffs argue that Apple acted fraudulently because they were deceived into purchasing phones they would not have bought had they known about the defect.[19] Their other claims for relief include negligent misrepresentation, unjust enrichment, breach of implied warranty, and violations of several warranty acts.[20]

The “Touch Disease” only became an issue about six months ago.[21] Further, many users are just now starting to experience this problem on the iPhone 6 and 6 Plus, two years after the release of these models.[22] It’s difficult to believe that Apple knew this was going to be an issue when it originally advertised and eventually released the phones in September 2014.[23] Apple, while a powerhouse in the technology industry, cannot predict the future, and it’s certainly plausible that the company had no idea that the affected phones would have this kind of issue more than a year and a half after release. If that’s the case, there does not seem to be any “unfair” or “deceptive” action taken by Apple.[24]

On the other hand, if Apple continued to produce and sell these iPhone models after becoming aware of the defect, and had reasonable knowledge that the defect affected a majority of the phones in that line, ignoring the issue could be seen as deceptive and as misrepresenting the quality of the product.

In reality, though, Apple’s general warranty on its iPhone line is a one-year limited warranty.[25] After that, it is the consumer’s responsibility to pay for necessary repairs. All three plaintiffs in this case had been in possession of their phones for at least a year and a half before beginning to experience this issue.[26]

The analysis would be a bit more complicated if a user experiencing this issue were still within the one-year warranty period, because Apple’s warranties do not include coverage for “defects caused by normal wear and tear or otherwise due to the normal aging of the Apple Product.”[27] The main question would then be whether the defect was really a result of normal wear and tear or, as the plaintiffs claim, a design defect.[28] If the plaintiffs were still within the warranty period, their claims might have more merit; but because the plaintiffs, like many of the individuals having this issue, only started to experience it around the two-year mark of owning their phones,[29] this does not seem to be the main subject of potential litigation. In this case, it does not seem likely that the court will side with the customers bringing these claims.

It is important to reiterate that this is a class action: the plaintiffs seek to represent a nationwide class of iPhone 6 and iPhone 6 Plus users. That being said, this could become a real problem for Apple if anything actually comes of these claims. Be on the lookout for a response from Apple in the coming weeks.

 

[1] See Don Reisinger, Apple Is Being Sued Over the iPhone ‘Touch Disease’, Fortune (Aug. 30, 2016), http://fortune.com/2016/08/30/apple-touch-disease-lawsuit/ (last visited Sept. 13, 2016).

[2] See id.

[3] See Class Action Complaint ¶ 27, Davidson v. Apple, Inc., No. 5:16-cv-4942 (N.D. Cal. filed Aug. 27, 2016).

[4] See id. ¶ 1.

[5] See id. ¶ 21.

[6] See id. ¶¶ 8-10.

[7] See id.

[8] See Class Action Complaint, supra note 3, ¶ 28.

[9] See id.

[10] See id. ¶ 30.

[11] See id. ¶ 2.

[12] See id. ¶ 35.

[13] See Class Action Complaint, supra note 3, ¶ 4.

[14] See id. ¶ 38.

[15] See id. ¶ 57; Cal. Civ. Code § 1770 (a)(2),(5),(7),(9) (2016).

[16] See Class Action Complaint, supra note 3, ¶ 57; Cal. Civ. Code § 1770 (a)(5) (2016).

[17] See Class Action Complaint, supra note 3, ¶ 57; Cal. Civ. Code § 1770 (a)(9) (2016).

[18] See Class Action Complaint, supra note 3, ¶ 60.

[19] See id. ¶ 82.

[20] See id. ¶¶ 15-19.

[21] John Matarese, Apple iPhone ‘Disease’ Making Touch Screens Useless, WCPO Cincinnati (Sept. 12, 2016), http://www.wcpo.com/money/consumer/dont-waste-your-money/apple-iphone-disease-making-touch-screens-useless.

[22] See id.

[23] See id.

[24] See Class Action Complaint, supra note 3, ¶ 4.

[25] See Your Hardware Warranty, http://www.apple.com/legal/warranty/products/ios-warranty-document-us.html#2013328_201623 (last visited Sept. 14, 2016).

[26] See Class Action Complaint, supra note 3, ¶¶ 8-10.

[27] See Your Hardware Warranty, supra note 25.

[28] See Class Action Complaint, supra note 3, ¶ 30.

[29] See Matarese, supra note 21.

Photo Source:

http://siliconangle.com/files/2016/08/person-woman-hand-space-1080x675.jpg

Identity Misappropriation on Dating Apps: Did You Right Swipe the Right Person?


By: Tatum Williams

It is easy to confirm a Tinder profile’s authenticity when the person is someone you know. When this happens, and it often does considering the app boasts an estimated 50 million users,[1] there is a certain assurance beneath the initial awkwardness of swiping across a familiar face: the person behind that profile is who they say they are.

With the slogan “It’s like real life but better,” Tinder’s innovative design allows users to connect with nearby people using location-based software and basic Facebook information, such as mutual interests and friends.[2] Tinder and similar apps, such as Hinge, require this social authentication primarily to put users’ minds at ease about the likelihood of encountering a fake profile.[3] But that reassurance is not always warranted. Of those who use dating apps and online dating services, roughly 54% felt that other users seriously misrepresented themselves in their profiles.[4] Evidence of that is the modern phenomenon of “catfishing”: the intentional deception of another through the use of a fake profile, typically in hopes of achieving a romantic connection.[5] Catfishing has exploited the rise of social media and dating apps, and it is now even easier for someone to fabricate a dating profile.[6] Needless to say, the prevalence of dating apps has facilitated deceptive behavior, given the availability of anonymous communications.[7]

More often than not, misrepresentations of this nature are harmless. While some people may exaggerate or lie about their preferred television shows or movies, others execute more strategic lies, often pertaining to their age, weight, height, personality traits, interests, monetary status, career aspirations, and even past relationships.[8] Most people are familiar with these kinds of misrepresentations and would agree that these lies, though wrong, are not worthy of serious legal ramifications.[9] Or are they?

This raises the question of whether the legal considerations and ramifications associated with dating apps have evolved as quickly as the apps themselves. Unfortunately, the answer is no; dating app users have a low likelihood of success in holding a dating app liable for any harm the user experiences from his or her interactions with other app users.[10] Most dating apps have combated these potential claims by disclaiming all warranties and representations with regard to other users in their terms of use agreements.[11] And the immunity does not stop there: the Communications Decency Act protects apps from liability based on content posted by users, such as the aforementioned catfishing scams or other misrepresentations.[12] Under the Communications Decency Act, “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[13]

Tinder is relatively new—it was only created in 2012[14]—so few studies have examined the number of fake profiles currently in existence. That said, the rapid growth of its user base only increases the potential for misrepresentation and catfishing. Despite the protections dating apps and other online dating services receive, several states have made lying on these platforms a criminal offense.[15] This would seem to suggest that the trend of remaining unpunished for misrepresentation or catfishing is diminishing.[16] But until cyberspace governance evolves to the level that dating apps have, it appears that users’ best defense is to swipe wisely.

[1] See Alexis Kleinman, The Typical Tinder User Spends 77 Minutes Tinding Every Day, Huffington Post (Oct. 31, 2014), http://www.huffingtonpost.com/2014/10/31/77-minutes-tinder_n_6082468.html.

[2] See Jessica L. James, Mobile Dating in the Digital Age: Computer-Mediated Communication and Relationship Building on Tinder (May 2015) (unpublished Master of Arts thesis, Texas State University) (on file with author).

[3] See Lindsay Hildebrant, Media and Self Representative Perceptions: Deception in Online Dating (May 2015) (unpublished undergraduate Honors thesis, Pace University) (on file with Pforzheimer Honors College, Pace University).

[4] See James, supra note 2.

[5] See Krystal D’Costa, Catfishing: The Truth About Deception Online, Scientific American Blog (April 25, 2014), http://blogs.scientificamerican.com/anthropology-in-practice/catfishing-the-truth-about-deception-online/#.

[6] See Keith Wagstaff, Hook, Line and Tinder: Scammers Love Dating Apps, NBC News (April 11, 2014, 5:36 PM), http://www.nbcnews.com/tech/security/hook-line-tinder-scammers-love-dating-apps-n77256.

[7] See Geelan Fahimy, Liable for Your Lies: Misrepresentation Law as a Mechanism for Regulating Behavior on Social Networking Sites, 39 Pepp. L. Rev. 2 (2012).

[8] See id.

[9] See id.

[10] See Greg Mitchell, Digital Dating: Legal Matters to Consider Before Swiping Right, Missouri Lawyers Help Blog (Feb. 12, 2016), http://missourilawyershelp.org/digital-dating-legal-matters-to-consider-before-swiping-right/.

[11] See id.

[12] Doe v. MySpace, Inc., 528 F.3d 413, 416 (5th Cir. 2008).

[13] See Fahimy, supra note 7.

[14] See James, supra note 2.

[15] See Fahimy, supra note 7.

[16] See id.

Photo Source:

http://www.hercampus.com/sites/default/files/2015/11/02/tinder.jpg


Digital Direction for the Analog Attorney – Data Protection, E-Discovery, and the Ethics of Technological Competence


Cite as: Stacey Blaustein et al., Digital Direction for the Analog Attorney – Data Protection, E-Discovery, and the Ethics of Technological Competence in Today’s World of Tomorrow, 22 Rich. J.L. & Tech. 10 (2016), http://jolt.richmond.edu/v22i4/article10.pdf.

 Stacey Blaustein,* Melinda L. McLellan,** and James A. Sherer***

 

I.  Introduction

 [1]       Over the past twenty years, the near-constant use of sophisticated technological tools has become an essential and indispensable aspect of the practice of law. The time and cost efficiencies generated by these resources are obvious, and have been for years.[1] And because clients expect their counsel to take full advantage,[2] savvy attorneys understand that they must keep up with ever-evolving legal technologies to stay competitive in a crowded marketplace.[3]

[2]       With increased globalization and exponential growth in the creation, collection, use, and retention of electronic data, the challenges to all lawyers—especially those who may not have tech backgrounds or a natural aptitude for the mechanics of these innovations—are multiplying with breathtaking speed.[4] Nevertheless, many attorneys are either blissfully unaware of the power and potential danger associated with the tools they now find themselves using on a daily basis, or they are willfully avoiding a confrontation with reality. For lawyers, technological know-how is no longer a “nice to have” bonus; it is now an ethical obligation. Where competent client representation demands a minimum level of tech proficiency, however, many lawyers come up short with respect to this fundamental component of their professional responsibilities.[5]

[3]       What types of privacy and data security threats do various technologies pose to attorneys, their firms, their clients, and the legal profession in general? What rules and regulations govern how attorneys may make use of technology in their practice, and how might clients seek to impose restrictions around such use when it comes to their corporate data? Must attorneys gain mastery over the intricate mechanics of the technological resources they employ, or is basic knowledge sufficient? How can we weigh the potential risks and rewards of cutting-edge, emerging digital products and electronic resources about which clients—and indeed, even the lawyers themselves—may understand very little? These are just a few of the questions that arise when we consider the issue of technological competence in the legal profession and corresponding ethical requirements.

[4]       To begin to answer these questions, we look to the applicable Model Rules issued by the American Bar Association (“ABA”), various state-level professional ethics rules that incorporate the Model Rules, associated ethics opinions and guidance issued by the states, state and federal court decisions, and guidelines issued by sector-specific agencies and organizations.[6] Our focus in this investigation concerning lawyerly “technological competence” will be on privacy and data security risks and safeguards, e-Discovery-related challenges, and the potential perils of various uses of social media in the legal sphere.

 II.  The Threat Landscape: Law Firms as Prime Targets

[5]       In recent years, the volume and severity of attacks on electronically-stored data, and the information systems and networks that house that data, have increased exponentially. The modern-day “threat environment” is “highly sophisticated,” and “massive data breaches are occurring with alarming frequency.”[7] For attorneys, such perils implicate multiple ethical and professional responsibilities with respect to how they handle data, including the duty to protect the confidentiality of client information and the obligation to provide “competent” representation.

[6]       Unfortunately, law firms can provide a proverbial back door for hackers seeking access to a company’s data, as attorneys often are custodians of a veritable “treasure trove” of valuable client information “that is extremely attractive to criminals, foreign governments, adversaries and intelligence entities.”[8] Some hackers even focus their efforts primarily on law firms, especially those firms collecting vast amounts of data from corporate clients in the course of E-Discovery or corporate due diligence.[9] Corporate secrets, business strategies, and intellectual property all may be found in a law firm’s collection of its clients’ data.[10] In some cases, the interceptors may be looking for competitive information relevant to merger negotiations, or trying to suss out evidence of as-yet unannounced deals for insider trading purposes.[11]

[7]       A 2015 report estimated that 80% of the biggest 100 law firms have experienced some sort of data security incident.[12] And as is the case with so many companies that suffer a breach, law firms that have been hacked may not know about it for a considerable period of time. Moreover, unlike other industry sectors subject to various reporting requirements, law firms generally do not have a statutory obligation to publicly report cybercrimes that do not involve personally identifiable information.[13] Lack of obligations notwithstanding, a recent report indicated that “[t]he legal industry reported more ‘cyber threats’ in January [2016] than nearly any other sector,” topped only by the retail industry and financial services.[14]

[8]       Although these reported “threats” might not necessarily result in data compromises, the fact that the legal industry frequently is among the most targeted for data theft should concern attorneys.[15] Anecdotal evidence of actual and attempted interference with law firms’ data security systems abounds as well. In 2014, a report indicated that communications between lawyers from the law firm of Mayer Brown and officials with the Indonesian government were intercepted by an Australian intelligence agency that had ties with the U.S. National Security Agency (“NSA”).[16] And the managing partner of the Washington-area offices of Hogan Lovells LLP recently noted that her firm “constantly intercept[s] attacks.”[17]

[9]       The message to law firms seems clear: first, if “you’re a major law firm, it’s safe to say that you’ve either already been a victim, currently are a victim, or will be a victim.”[18] Second, “[f]irms have to make sure they are not a weak link…which at its most basic level means their standards for protecting data need to be at least equivalent to those of the companies they represent.”[19]

[10]     It seems inevitable that client expectations and demands with regard to their legal service providers’ security will continue to evolve and expand. One commentator recently predicted that in the future “clients across the board will demand firms demonstrate they’re prepared for all shapes and sizes of cybersecurity breaches,”[20] while another prophesied that “in the name of risk management and data leakage prevention, a large financial industry corporation will challenge their outside counsel’s [Bring Your Own Device] program.”[21] Indeed, according to a 2014 report in the New York Times:

Banks are pressing outside law firms to demonstrate that their computer systems are employing top-tier technologies to detect and deter attacks from hackers bent on getting their hands on corporate secrets for their own use or sale to others….Some financial institutions are asking law firms to fill out lengthy 60-page questionnaires detailing the [law firm’s] cybersecurity measures, while others are demanding on-site inspections….Other companies are asking law firms to stop putting files on portable thumb drives, emailing them to non-secure iPads or working on computers linked to a shared network in countries like China and Russia.[22]

[11]     In short, lawyers, law firms, and other legal services providers cannot afford to be complacent when it comes to cybersecurity.

A.  Lawyering in the Cloud

[12]     Firm adoption of cloud services is on the rise, especially among boutiques and solo practitioners that previously lacked the resources to compete effectively with larger law firms when it came to technology and data storage.[23] At first, the added value of cloud services created a perception that “nirvana had arrived” in terms of leveling the playing field for smaller firms.[24] Notwithstanding the apparent advantages of the cloud, attorneys were quick to identify concerns associated with the technology and its supporting practices, including “increased sensitivity to cyber-threats and data security.”[25] Some commentators opted for a cautious and conservative approach, noting that the “legal profession has developed many safeguards to protect client confidences,” and that the use of cloud hosting, among other practices, fell on a continuum where, as “an individual attorney gives up direct control of his or her client’s information, he or she takes calculated risks with the security of that information.”[26]

[13]     There is hope for attorneys drawn to the advantages of cloud services, but vigilance and diligence are required. As noted in tech law guidance from March 2014, “[u]sing the cloud to hold data is fine, so long as you understand the security precautions.”[27] Security concerns have put a damper on adoption rates, and the development of attorney-specific cloud services lags behind other industries. This reluctance is unsurprising given the slow rate of technological advancements within the profession generally,[28] and a deserved reputation that the tendency of firms is “to be technology followers, not leaders.”[29] That said, lawyers do seem to be embracing the cloud to some extent,[30] with the majority utilizing cloud solutions in some capacity,[31] even if implementation is mostly through “sporadic action and adoption among firms and law departments.”[32]

[14]     With respect to professional obligations, this type of implementation may not require specific technological expertise on the part of the attorneys. New York State Bar Association Opinion 1020, which addressed ethical implications of the “use of cloud storage for purposes of a transaction,” determined that compliant usage “depends on whether the particular technology employed provides reasonable protection to confidential client information and, if not, whether the lawyer obtains informed consent from the client after advising the client of the relevant risks.”[33]

[15]     Further, New Jersey Opinion 701 addresses the reality that it is

[N]ot necessarily the case that safeguards against unauthorized disclosure are inherently stronger when a law firm uses its own staff to maintain a server. Providing security on the Internet against hacking and other forms of unauthorized use has become a specialized and complex facet of the industry, and it is certainly possible that an independent [Internet Service Provider] may more efficiently and effectively implement such security precautions.[34]

[16]     Opinion 701 does include an additional caveat, that

[W]hen client confidential information is entrusted in unprotected form, even temporarily, to someone outside the firm, it must be under a circumstance in which the outside party is aware of the lawyer’s obligation of confidentiality, and is itself obligated, whether by contract, professional standards, or otherwise, to assist in preserving it.[35]

 B.  E-Discovery Tools

 [17]     To begin with, federal judges are unconvinced that many of the attorneys appearing before them understand how to make proper use of the technologies and related strategies associated with E-Discovery. A recent report, “Federal Judges Survey on E-Discovery Best Practices & Trends,”[36] compiled some of the judges’ concerns, noting first “the typical attorney…does not have the legal and technical expertise to offer effective advice to clients on e-discovery.”[37] Some of the judges’ comments were quite blunt, with one noting that “[s]ome attorneys are highly competent; but most appear to have significant gaps in their understanding of e-discovery principles.”[38]

[18]     Legal ethical rules and related opinions and scholarship provide guidance for what attorney E-Discovery competence should look like. At least one author has made the connection between professional responsibility and technological savoir-faire, noting that:

There is growing recognition across the country that the practice of law requires some degree of competence in technology. In the forum of litigation, competence in technology necessarily equates with competence in e-discovery. It is only a matter of time before ethics bodies across the nation call for competence in e-discovery.[39]

[19]     The opinions of courts and bar associations may carry the most weight, but a number of influential professional and industry groups also have offered useful commentary on technological competence. For example, competence is

…highlighted in the very first rule of legal ethics, according to the American Bar Association[’s] Rule 1.1 of the ABA Model Rules of Professional Conduct,” which “specifically recognized the need for technological competence through a significant change in August 2012 that formally notified all lawyers (and specifically those in jurisdictions following the Model Rules) that competency includes current knowledge of the impact of e-Discovery and technology on litigation.[40]

[20]     This guidance predated and perhaps presaged a number of state and federal reactions to technology and the impact of these developments on the practice of law, especially within the realm of E-Discovery. Delaware amended its Lawyers’ Rules of Professional Conduct as they related to technology in 2013;[41] North Carolina[42] and Pennsylvania[43] did the same shortly thereafter.

[21]     California’s relatively recent Formal Opinion No. 2015-193 (the “California Opinion”) addresses a number of issues associated with attorney ethical duties vis-à-vis E-Discovery. Although advisory in nature, the California Opinion states “attorneys have a duty to maintain the skills necessary to integrate legal rules and procedures with ‘ever-changing technology.’”[44] That reads broadly, but the California Opinion has been interpreted to indicate that, because E-Discovery arises “in almost every litigation matter, attorneys should have at least a baseline understanding of it.”[45] Specifically, the California Opinion begins with the premise that E-Discovery requires an initial assessment of its inclusion at the beginning of a matter.[46] If E-Discovery will be a component of a matter,

[T]he duty of competence requires an attorney to assess his or her own e-discovery skills and resources as part of the attorney’s duty to provide the client with competent representation. If an attorney lacks such skills and/or resources, the attorney must try to acquire sufficient learning and skill, or associate or consult with someone with expertise to assist.[47]

[22]     Other commentators have noted that the California Opinion focuses on “nine (9) core competency issues” which would offer “solid guidelines for attorneys…to maintain competency and protect client confidentiality in the era of eDiscovery.”[48] One author notes that one of these core competency issues and its related directive, that of performing data searches, stretches across the entirety of the E-Discovery process “occurring at each of these steps, from preservation and collection to review and redaction.”[49]

[23]     Soon after the California Opinion was decided, Magistrate Judge Mitchell Dembin issued a Southern District of California decision that addressed “counsels’ ethical obligations and expected competency” in HM Electronics, Inc. v. R.F. Technologies, Inc.[50] The HM Electronics case focused both on specific steps the attorneys should have taken (such as implementing a legal hold and doing the legwork necessary to certify discovery responses as true) as well as behavior actively detrimental to the case (instructing client personnel to destroy relevant documents).[51] Of note in Judge Dembin’s excoriation of the misbehaving attorneys is his statement that “a judge must impose sanctions for a violation of the Rule that was without substantial justification.”[52] One article suggests that part of the problem may be simply that “counsel and clients alike…fail to take seriously judges’ expectations for how they conduct themselves throughout the discovery process.”[53]

[24]     New York attorneys followed the California Opinion with interest, first noting that it merely presented “the standard tasks one should engage in and competently execute to properly collect and produce responsive ESI [Electronically Stored Information] to the opposing party.”[54] A 2009 S.D.N.Y. opinion had chastised attorneys who would otherwise disclaim experience, warning that it was “time that the Bar—even those lawyers who did not come of age in the computer era” understood E-Discovery technologies and their application.[55] A recent article indicated that there is “an ample basis to discern a framework for ethical obligations, derived from ethics rules, court rules, and sanctions decisions in the e-discovery context” based in part on the history of New York courts as “leaders in the advancement of e-discovery law.”[56]

[25]     But such a “framework for ethical obligations” might not even be necessary where competence is the ethical rule at issue. Competence “requires that lawyers have the legal knowledge, skill, thoroughness, and preparation to conduct the representation, or associate with a lawyer who has such skills”[57] and that supervision is appropriate to ensure that the work of others “is completed in a competent manner.”[58] The issue of supervision came up in another advisory opinion, Ethics Opinion 362 of the District of Columbia Bar, which indicated that retaining an e-Discovery vendor that provided all of the E-Discovery services was both impermissible (as the unauthorized practice of law on the part of the vendor) and a circumstance in which the attorney engaging such a vendor was not absolved from understanding and supervising the work performed, no matter how technical.[59]

 1. Metadata in Electronic Files

[26]     A very basic threat to client confidentiality (as well as the secrecy of counsel’s strategy) is the existence of metadata embedded in electronic files exchanged between the parties or produced as evidence. Most frequently this threat exists in the form of automatically created information about a file, including changes made to the file, that can be recovered and viewed by a third party if not removed (or “scrubbed”) prior to disclosing the file. This “application metadata” can include information about the document itself, the author, comments and prior edits, and may also detail when the document was created, viewed, modified, saved or printed.[60] In addition to the fact that access to metadata can provide opposing parties with everything from revealing insights to damning evidence, there’s also a “real danger” that “application metadata may be inaccurate.”[61]

[27]     Further, disputes related to metadata regularly arise in the E-Discovery context. Indeed, one of the “biggest challenges in electronic discovery” concerns “[u]nderstanding when metadata is relevant and needs to be preserved and produced.”[62] To cite just one example, the concurring opinion in State v. Ratcliff noted that judges must determine whether submitted evidence contained more than the information visible on the face of the document, or whether metadata was included as well, where the distinction “is critical, both on an ethical and adjudicative basis.”[63]

[28]     Accordingly, understanding and managing metadata has become a baseline requirement for technological competence when dealing with client data and attorney work product. Numerous products exist to help save lawyers from themselves when it comes to accidental disclosure of metadata, including software applications that may be integrated into email programs to prevent documents from being sent outside the network without first passing through a scrubbing filter. And the e-filing portal in many jurisdictions “contains a warning reminder that it is the responsibility of the e-filer to strip metadata from the electronic file before submitting it through the portal.”[64] Reliance on these tools, however, may not suffice for long as the sophistication and complexity of issues related to the creation and manipulation of metadata continue to evolve.
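To make the discussion concrete, the short Python sketch below, which assumes the third-party python-docx library and a hypothetical file name, shows the kind of application metadata (author, last editor, timestamps, revision count) that travels with a Word document, and how the text-valued fields can be blanked before a file leaves the firm. It is an illustration of the concept only, not a substitute for the dedicated scrubbing tools described above, and it does not remove tracked changes or comments embedded in the document body.

from docx import Document  # third-party package: python-docx

def report_metadata(path: str) -> None:
    """Print core document properties that a receiving party could inspect."""
    props = Document(path).core_properties
    for field in ("author", "last_modified_by", "created", "modified", "revision"):
        print(f"{field}: {getattr(props, field)}")

def scrub_metadata(src: str, dst: str) -> None:
    """Blank the text-valued core properties and save a clean copy."""
    doc = Document(src)
    props = doc.core_properties
    props.author = ""
    props.last_modified_by = ""
    props.comments = ""
    doc.save(dst)

# Hypothetical file names, for illustration only.
report_metadata("draft_settlement_memo.docx")
scrub_metadata("draft_settlement_memo.docx", "draft_settlement_memo_clean.docx")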

III.  Overview of U.S. Data Privacy and Information Security Law

 [29]     The sectoral approach to privacy and data security law in the United States often is described as “a patchwork quilt” comprised of numerous state and federal laws and regulations that apply variously to certain types of data, certain industries, the application of particular technologies, or some combination of those elements. These laws may be enforced by a variety of regulators, with state Attorneys General and the Federal Trade Commission often leading the way.[65] Plaintiffs’ lawyers also are prominent actors in this space, bringing an ever-increasing number of class action and other civil suits alleging violations of privacy rights, data protection laws, and information security standards.

[30]     Although there are no federal or state privacy statutes specifically applicable solely to lawyers, numerous data protection laws and regulations may apply to attorneys in their role as service provider to their clients or in other contexts. The obligations associated with these laws often implicitly or explicitly demand that lawyers handling client data (1) have a thorough understanding of the potential privacy and security risks to that data; (2) assess and determine how best to secure the data and prevent unauthorized access to the data; and (3) supervise anyone acting on their behalf with respect to the data to ensure the data is appropriately protected at all times.

[31]     Below we describe a few of the privacy and data security laws that tend to come up frequently for lawyers and impose requirements on their handling of client data that may involve technological competence. This discussion is by no means exhaustive, as technology touches upon virtually every aspect of data protection regulation and information security counseling by attorneys in the field. Cross-border data transfer restrictions, data localization requirements, cybersecurity standards and information-sharing obligations, and regulatory action around the use of biometrics and geolocation technologies are just a few examples of areas where a lawyer must have an understanding of the underlying technology to effectively assist clients.

 A.  HIPAA – Business Associate Agreements

[32]     The Health Insurance Portability and Accountability Act of 1996 (“HIPAA”) is the most significant health privacy law in the United States, imposing numerous obligations on “covered entities” and the “business associates” of those covered entities to protect the privacy and security of “protected health information” (“PHI”).[66] As required by HIPAA, the Department of Health and Human Services (“HHS”) issued two key sets of regulations to implement the statute: the Privacy Rule[67] and the Security Rule.[68]

[33]     Although attorneys and law firms are not themselves considered covered entities directly subject to HIPAA’s requirements,[69] when attorneys obtain PHI from covered entity clients in the course of a representation, the law firm may be subject to certain HIPAA Privacy Rule requirements[70] in its role as a business associate.[71] The Privacy Rule and the Security Rule apply to a covered entity’s interactions with third parties (e.g., service providers) that handle PHI on the covered entity’s behalf.[72] The covered entity’s relationships with these “business associates” are governed by obligatory contracts known as business associate agreements (“BAAs”) that must contain specific terms.[73] With respect to technological competence specifically, for example, the BAA requires the business associate to implement appropriate safeguards to prevent use or disclosure of PHI other than as provided for by the BAA, and states that the business associate must ensure that any agents/subcontractors that receive PHI from the business associate also protect the PHI in the same manner. And attorneys who “hold HIPAA data or [other PII] may be governed by state or federal law beyond the scope of the proposed rules, which is noted in the new comments”[74] to ABA Rule 1.6, discussed further below.

B.  GLBA Safeguards Rule Requirements

[34]     Pursuant to the Gramm-Leach-Bliley Act (“GLBA”), the primary federal financial privacy law in the United States, various federal agencies promulgated rules and regulations addressing privacy and data security issues.[75] For example, the Safeguards Rule requires financial institutions to protect the security of personally identifiable financial information by maintaining reasonable administrative, technical, and physical safeguards for customer information.[76] To comply with the Safeguards Rule, a financial institution must develop, implement, and maintain a comprehensive information security program, and that program must address the financial institution’s oversight of service providers that have access to customers’ nonpublic personal information (“NPI”).[77]

[35]     Again, although a law firm is not a financial institution directly subject to the GLBA, when it acts as counsel to a financial institution, GLBA requirements may apply to its handling of NPI received from that client. To the extent a financial institution’s law firm will have access to such NPI in the course of the representation, the financial institution-client must take reasonable steps to ensure the law firm has the ability to safeguard such data prior to disclosing it to the firm, and require the firm to contractually agree (in writing) to safeguard the NPI. Assuming such data will be stored electronically (a safe assumption in virtually all cases), it is incumbent on the law firm to understand the potential data security risks and how to prevent unauthorized access, use, transfer, or other processing of their clients’ NPI.

 C.  State Data Security Laws

[36]     At the state level, there are numerous laws and regulations regarding the protection of personal information (and other types of data) that apply to all entities that maintain such data, including lawyers, law firms, and other legal service providers.

[37]     A number of states, such as California, Connecticut, Maryland, Nevada, Oregon, and Texas, have enacted laws that require companies to implement information security measures to protect personal information of residents of the state that the business collects and maintains.[78] These laws of general application are relevant to attorneys and law firms with respect to the personal information they maintain—both client data and data relating to their employees. Typically, these laws are not overly prescriptive and include obligations to implement and maintain reasonable security policies and procedures to safeguard personal information from unauthorized access, use, modification, disclosure, or destruction (though most do not offer a definition or description of what is meant by “reasonable” security). Some laws, such as California’s, impose a requirement to contractually obligate non-affiliated third parties that receive personal information from the business to maintain reasonable security procedures with respect to that data.[79]

[38]     Massachusetts was the first state to enact regulations that directed businesses to develop and implement comprehensive, written information security programs (“WISPs”) to protect the personal information of Massachusetts residents.[80] These regulations apply to all private entities (including law firms) that maintain personal information of Massachusetts residents, including those that do not operate in Massachusetts; they also list a number of minimum standards for the information security program.[81] The Massachusetts regulations are relatively prescriptive as compared to other similar state laws of this nature, and they include numerous specific technical requirements.

[39]     These requirements apply to law firms directly, but they also apply to law firms as service providers to businesses that maintain personal information of Massachusetts residents. A compliant WISP must address the vetting of service providers, and the contract must include provisions obligating the service provider to protect the data.[82]

IV.  Applicable Ethical Rules and Guidance

[40]     The myth of the Luddite[83] or caveman[84] lawyer persists, even if this type of anachronism is, in fact, an ethical violation waiting to happen.[85] But even attorneys who “only touch a computer under duress, and take comfort in paper files and legal research from actual books”[86] must deal with technology.[87] The adequate practice—or perhaps simply “the practice”—of law does not exist without technology, and there is no longer a place for lawyers who simply “hope to get to retirement before they need to fully incorporate technology into their lives.”[88]

[41]     “Really?” goes the refrain. “Why can’t I just practice the way I always have, without [insert mangled, vaguely-recognizable technology portmanteau] getting in the way?”

[42]     Well, for one thing, to the extent attorneys rely on the protections of privilege to serve their clients, said attorneys must understand how the confidentiality of their communications and work product may be compromised by the technology they use. Technologies introduce complexity that, in turn, may affect privilege—especially when “many lawyers don’t understand electronic information or have failed to take necessary precautions to protect it.”[89] But how much understanding, exactly, may be required to competently represent clients in matters concerning E-Discovery, or data security, or even privacy? At many organizations, “[p]rivacy issues get handled by anyone who wants to do them” because the subject matter area is understaffed or ignored.[90] The key technological issues relevant to E-Discovery versus data privacy may be somewhat different, but the “solutions” companies find are eerily similar: the practitioners that are actually doing the work are often those who have been delegated the work, whose “expertise” is somewhat home-grown and may, in fact, not really represent true technological competence at all.[91]

[43]     What, then, are the requirements for expertise? Perhaps a pragmatic approach is best. Certainly, practitioners who use technology—again, likely all of them—must take some well-defined, initial steps toward acquiring the appropriate skill set. This might be as straightforward as the lawyer familiarizing herself with the relevant technologies at issue. Although it may sound a bit too easy, “just being well-versed enough to understand the issues is a big plus.”[92] That being said, “those considering a career in cybersecurity or privacy will need to spend time developing some level of technical expertise.”[93] In short, the answer is “it depends” and “no one really knows – yet.” In this relatively new space, actual decisions and definitive standards for “technological competence” are thin on the ground. Below we will examine some of the relevant rules and guidelines to consider.

 A.  Recent Guidelines in the Ethics Rules

[44]     Most attorneys do not have specialized training focused on a particular technological field. Certainly the vast majority do not hold themselves out as experts in cybersecurity, cloud-based storage, social media, biometrics, or any of a variety of related disciplines. However, even in the absence of expertise, there are some basic ethical rules that provide a framework for determining a practitioner’s professional duties and obligations with regard to technology—specifically, rules pertaining to competent client representation, adequate supervision, confidentiality, and communications.[94]

1.  Competent Client Representation (Model Rule 1.1)

[45]     As discussed briefly above, almost four years ago the American Bar Association formally approved a change to the Model Rules of Professional Conduct to establish a clear understanding that lawyers have a duty to be competent not only in the law and its practice, but also with respect to technology. As detailed below, the amended rule contemplates changes in technology and eschews specifics. Rather than a paint-by-numbers approach, ABA Model Rule 1.1 puts the responsibility on attorneys to understand their own—and their clients’—needs, and how new technologies impact their particular practice.

[46]     ABA Model Rule 1.1 states that:

A lawyer shall provide competent representation to a client. Competent representation requires legal knowledge, skill, thoroughness and preparation reasonably necessary for the representation.[95]

[47]     ABA Model Rule 1.1 was amended in 2012 through the addition of Comment 8, which provides as follows:

To maintain the requisite knowledge and skills, a lawyer should keep abreast of changes in the law and its practice, including the benefits and risks associated with relevant technology, engage in continuing study and education and comply with all continuing legal education requirements to which the lawyer is subject.[96]

[48]     Some note that Rule 1.1 “does not actually impose any new obligations on lawyers;”[97] neither does it require perfection.[98] Instead it “simply reiterates the obvious, particularly for seasoned eDiscovery lawyers, that in order for lawyers to adequately practice, they need to understand the means by which they zealously advocate for their clients.”[99] One article noted, in fact, that Comment 8 was evidence of “the ABA’s desire to nudge lawyers into the 21st century when it comes to technology.”[100] It did, however, caution that it was “a very gentle nudge.”[101]

 [49]     Nudge or not, that message has resonated across the United States. In the four years since that amendment was approved and adopted by the ABA, twenty-one states have adopted an ethical duty of technological competence for lawyers.[102] Many of the states that have not formally adopted the change to their rules of professional conduct may still explicitly or implicitly acknowledge this emerging duty: to be competent in technology, to have a basic understanding of the technologies their clients use, and to keep abreast of such changes, including a required awareness of regulatory requirements and privacy laws.[103]

2.  Supervision (Model Rules 5.1 and 5.3)

[50]     ABA Model Rule 5.1 also bears on a lawyer’s duties regarding technology insofar as duties aided or supported by technology are performed by someone other than the attorney. This responsibility extends to immediate as well as remote support staff, with ABA Model Rule 5.1 requiring that “[l]awyers must also supervise the work of others to ensure it is completed in a competent manner.”[104] This attempt at establishing “the principle of supervisory responsibility without introducing a vicarious liability concept”[105] has led to considerations regarding inexperience generally,[106] but the implications for technological applications should be clear—an associate or paralegal is much more likely to use technology to support legal work[107] than she is to make a representation before a court or like body.

[51]     ABA Model Rule 5.3 also sets forth responsibilities of partners and supervising attorneys to non-lawyer assistants. This set of ethical considerations further reinforces the responsibilities attorneys have to apply sufficient care in their practice when outsourcing supporting legal work to inexperienced non-professionals, and to ensure that confidentiality is maintained with outsourcing staff.[108] This is not just a matter of supervising specific tasks. It also contemplates knowing which tasks are appropriate for delegation, both within the firm and to third-party vendors. For example, if a delegate of the attorney uses technology to begin an engagement, it’s possible that such an arrangement could be viewed as “establish[ing] the attorney-client relationship,” which may be prohibited under ABA Model Rule 5.5.[109]

3.  Duty of Confidentiality (Model Rule 1.6)

[52]     ABA Model Rule 1.6 states that it is critical that lawyers do not reveal confidential or privileged client information.[110] When information was kept in an attorney’s head, or perhaps committed to a sheet of paper, historical precedent on how to comply with this duty may have been helpful. In the “world of tomorrow,”[111] looking to the past for answers makes little sense, especially in those instances where the attorney is unclear as to how information is stored, accessed, maintained, or utilized.

[53]     Model Rule 1.6 also codifies the duty of confidentiality, which resides at the core of every attorney’s role and serves as one of the attorney’s most important ethical responsibilities. Model Rule 1.6 generally defines the duty of confidentiality as follows: “A lawyer shall not reveal information relating to the representation of a client unless the client gives informed consent, the disclosure is impliedly authorized in order to carry out the representation or the disclosure is permitted [elsewhere].”[112]

[54]     This rule is broad. It encompasses any client information, confidential or privileged, shared with or accessible to the attorney, and is not limited to confidential communications. Further, the duty may be relinquished only under the most onerous of circumstances.[113] A lawyer therefore shall not reveal information relating to the representation of a client unless the client gives informed consent, the disclosure is impliedly authorized in order to carry out the representation, or the disclosure is otherwise permitted by the rules.

[55]     In 2000, the Advisory Committee looked into its crystal ball and considered ESI on various platforms, in different repositories, in various forms. It then added Comment 18 to Rule 1.6, requiring reasonable precautions to safeguard and preserve confidential information. Comment 18 states that, “[A] lawyer [must] act competently to safeguard information relating to the representation of a client against … inadvertent or unauthorized disclosure by the lawyer or other persons who are participating in the representation of the client or who are subject to the lawyer’s supervision.”[114] Indeed, “[p]artners and supervising attorneys are required to take reasonable actions to ensure that those under their supervision comply with these requirements.”[115]

[56]     In addition to the ABA’s commentary, state and local professional organizations have issued guidance. Returning to the California Bar’s Formal Opinion 2015-193, that opinion establishes a roadmap of sorts, in the form of a checklist, to assist lawyers in meeting their ethical obligation to develop and maintain core E-Discovery competence in the following areas:[116]

  • Initially assessing E-Discovery needs and issues, if any;
  • Implementing, or causing the client to implement, appropriate ESI preservation procedures (“such as circulating litigation holds or suspending auto-delete programs”);[117]
  • Analyzing and understanding the client’s ESI systems and storage;
  • Advising the client on available options for collection and preservation of ESI;
  • Identifying custodians of potentially relevant ESI;
  • Engaging in competent and meaningful meet and confers with opposing counsel concerning an E-Discovery plan;
  • Performing data searches;
  • Collecting responsive ESI in a manner that preserves the integrity of the ESI; and
  • Producing responsive, non-privileged ESI in a recognized and appropriate manner.

[57]     But this technological competence inherent in the Duty of Competence represents only one third of the ethical duties that govern an attorney’s interaction with technology. This ESI and litigation skills checklist does not address “the scope of an attorney’s duty of competence relating to obtaining an opposing party’s ESI;”[118] nor does it consider the skills required of non-litigation attorneys, which must be inferred from the rule.

[58]     In addition, the State Bar of California’s Standing Committee on Professional Responsibility and Conduct, Formal Opinion 2010-179 states that “[a]n attorney’s duties of confidentiality and competence require the attorney to take appropriate steps to ensure that his or her use of technology in conjunction with a client’s representations does not subject confidential client information to an undue risk of unauthorized disclosure.”[119]

[59]     In reference to the duty of confidentiality, the New York County Lawyers’ Association’s Committee on Professional Ethics examined shared computer services among practitioners in Opinion 733, noting that an “attorney must diligently preserve the client’s confidences, whether reduced to digital format, paper, or otherwise. The same considerations would also apply to electronic mail and websites to the extent they would be used as vehicles for communications with the attorney’s clients.”[120] The New York State Bar’s Committee on Professional Ethics Opinion 842 further stated that, when “a lawyer is on notice that the [client’s] information…is of ‘an extraordinarily sensitive nature that it is reasonable to use only a means of communication that is completely under the lawyer’s control,…the lawyer must select a more secure means of communication than unencrypted Internet e-mail.’”[121]

4.  Communications (Model Rule 1.4)

[60]     ABA Model Rule 1.4 on Communications also applies to the attorney’s use of technology and requires appropriate communications with clients “about the means by which the client’s objectives are to be accomplished,” including the use of technology.[122]

[61]     Construing all of these Model Rules and comments together, it is clear that attorneys who are not tech-savvy must (1) understand their limitations; (2) obtain appropriate assistance; (3) be aware of the areas in which technology knowledge is essential; and (4) evolve to competently handle those challenges, or (5) retain the requisite expert assistance. This list applies equally to data security issues, such as being aware of the risks associated with cloud storage, cybersecurity threats, and other sources of potential harm to client data, and can easily be extended to include awareness and understanding of domestic and foreign data privacy issues.

[62]     The ethical obligations to safeguard information require reasonable security, not absolute security. Accordingly, under such rules and the related guidance in the proposal from the ABA Commission on Ethics 20/20,[123] the factors to be considered in determining the reasonableness of a lawyer’s security efforts include:

(1) The sensitivity of the information;

(2) The likelihood of disclosure if additional safeguards are not employed;

(3) The cost of employing additional safeguards;

(4) The difficulty of implementing the safeguards; and

(5) The extent to which the safeguards adversely affect the lawyer’s ability to represent the client.[124]

As New Jersey Ethics Opinion 701 states, “[r]easonable care however does not mean that the lawyer absolutely and strictly guarantees that the information will be utterly invulnerable against all unauthorized access. Such a guarantee is impossible.”[125]

B.  Ethics and Social Media

[63]     When considering their ethical duties with respect to technology, lawyers today must confront a host of challenges that would have been almost unimaginable even ten years ago. The rise and proliferation of social media as a daily part of most people’s personal and professional lives has created one such challenge.[126] Numerous courts have addressed—and continue to address—attorney duties with respect to social media in the context of spoliation motions when social media evidence has been lost, destroyed, or obfuscated due to negligence, or in accordance with attorney advice.[127] In addition, given the novelty and complexity of the issues, and in the interest of consistency, state bar associations have begun to address issues associated with attorney use of, counseling on, and preservation of social media.

[64]     The Association of the Bar of the City of New York’s Committee on Professional and Judicial Ethics, in Formal Opinion 2010-2, provided some helpful guidelines on attorney access to social media, stating that “[a] lawyer may not use deception to access information from a social networking webpage,” either directly or through an agent.[128] While focused on behaviors that attorneys and their agents should not undertake when developing a case, the opinion does note that the “potential availability of helpful evidence on these internet-based sources makes them an attractive new weapon in a lawyer’s arsenal of formal and informal discovery devices,” and also offers up “the Court of Appeals’ oft-cited policy in favor of informal discovery.”[129] Simply put, the duty is twofold: an attorney must both be aware of social media and know how to use social media to provide effective representation.

 2.  State Bar Association Guidance

[65]     State bar associations are becoming increasingly involved in providing guidance on social media and its implications for the practice of law. For example, in 2014, the New York and Pennsylvania State Bar Associations and the Florida Professional Ethics Committee issued guidance on social media usage by attorneys and addressed the obligations of attorneys to understand how various platforms work, what information will be available to whom, the ethical implications of advising clients to alter or change social media accounts, and the value of ensuring adequate preservation of social media evidence.

i.  New York

[66]     The Social Media Ethics Guidelines of the Commercial and Federal Litigation Section of the New York State Bar Association provide specific guidance for the use of social media by attorneys.[130] Guideline 4, relating to the review and use of evidence from social media, is divided into four subparts, all of which provide specific and pertinent guidance to attorneys:

  • Guideline No. 4.A: Viewing a Public Portion of a Social Media Website, provides that “[a] lawyer may view the public portion of a person’s social media profile or public posts even if such person is represented by another lawyer. However, the lawyer must be aware that certain social media networks may send an automatic message to the person whose account is being viewed which identifies the person viewing the account as well as other information about such person.”[131]
  • Guideline No. 4.B: Contacting an Unrepresented Party to View a Restricted Portion of a Social Media Website, provides that “[a] lawyer may request permission to view the restricted portion of an unrepresented person’s social media website or profile. However, the lawyer must use her full name and an accurate profile, and she may not create a different or false profile to mask her identity. If the person asks for additional information from the lawyer in response to the request that seeks permission to view her social media profile, the lawyer must accurately provide the information requested by the person or withdraw her request.”[132]
  • Guideline No. 4.C: Viewing A Represented Party’s Restricted Social Media Website, provides that “[a] lawyer shall not contact a represented person to seek to review the restricted portion of the person’s social media profile unless an express authorization has been furnished by such person.”[133]
  • Guideline No. 4.D: Lawyer’s Use of Agents to Contact a Represented Party, “as it relates to viewing a person’s social media account,” provides that “[a] lawyer shall not order or direct an agent to engage in specific conduct, or with knowledge of the specific conduct by such person, ratify it, where such conduct if engaged in by the lawyer would violate any ethics rules.”[134]

ii.  Florida

[67]     In Advisory Opinion 14-1, the Florida Bar’s Professional Ethics Committee confirmed that an attorney could advise a client to increase privacy settings (so as to conceal content from public view) and to remove information relevant to foreseeable proceedings from social media, as long as an appropriate record was maintained and the data preserved, and no rules or substantive laws regarding preservation or spoliation of evidence were broken.[135]

iii. Pennsylvania

[68]     In 2014, the Pennsylvania Bar Association issued a Formal Opinion that included detailed guidance regarding an attorney’s ethical obligations with respect to the use of social media. Among other guidelines, the Opinion specifically stated that:

  • Attorneys may advise clients about the content of their social networking websites, including the removal or addition of information;
  • Attorneys may connect with clients and former clients;
  • Attorneys may not contact a represented person through social networking websites;
  • Although attorneys may contact an unrepresented person through social networking websites, they may not use a pretextual basis for viewing otherwise private information on social networking websites; and
  • Attorneys may use information on social networking websites in a dispute.[136]

3.  ABA Model Rule 3.4

[69]     Finally, although ABA Model Rule 3.4 on Fairness to Opposing Party and Counsel does not directly address social media, the principles behind the rule apply in the social media context. The Rule provides that an attorney shall not “unlawfully obstruct another party’s access to evidence or unlawfully alter, destroy or conceal a document or other material having potential evidentiary value” nor shall the attorney “counsel or assist another person” to undertake such actions.[137]

C.  Guidance on Duties Related to Cybersecurity

[70]     As we discussed above in Section II, attorneys face a complex threat landscape when it comes to security concerns related to the protection of their clients’ data.[138] Although the scope of an attorney’s ethical obligations in this regard remains somewhat unclear, there are several sources of guidance relevant to how lawyers are expected to manage cybersecurity risks.

[71]     One such source that squarely addresses the issue is the Resolution issued by the ABA’s Cybersecurity Legal Task Force. The Resolution contains a detailed Report explaining the ABA’s position regarding the growing problem of intrusions into computer networks utilized by lawyers and law firms, and urges lawyers and law firms to review and comply with the provisions relating to the safeguarding of confidential client information.[139] As the ABA noted in its Report, defending the confidentiality of the lawyer-client relationship and preserving privilege in communications and attorney work product are fundamental to public confidence in the legal system.[140] Attorneys are directed to (1) keep clients reasonably informed as set forth in the Model Rules of Professional Conduct, as amended in August 2012 and adopted in the jurisdictions applicable to their practice; and (2) comply with other applicable state, federal, and court rules pertaining to data privacy and cybersecurity.[141] The ABA further urges respect for and preservation of the attorney-client relationship during the pendency of any actions in which a government entity aims to deter, prevent, or punish unauthorized, illegal intrusions into computer systems and networks used by lawyers and law firms.

[72]     The comment to ABA Model Rule 5.7 states, perhaps somewhat axiomatically, that when “[a] lawyer performs law-related services or controls an organization that does so, there exists the potential for ethical problems.”[142] This, combined with Model Rule 1.6’s requirement that attorneys safeguard and protect client information, suggests further potential duties associated with cybersecurity.[143] As one author notes:

Fulfillment of a law firm’s duty to maintain client confidences in today’s world of cyberattacks requires much more than legal knowledge and legal skills. It requires sophisticated computer knowledge and skills far beyond legal practice. That is why cybersecurity experts should be used to assist in any law firm’s client’s data protection efforts.[144]

Indeed, “[t]raining in security, including cybersecurity should be a part of every lawyer’s education. It is especially important for lawyers who do electronic discovery.”[145]

[73]     On a related subject, in Formal Opinion 2015-3, the New York City Bar Association issued guidance indicating that lawyers do not violate their ethical duties by reporting suspected cybercrime to law enforcement.[146] If an attorney has performed “reasonable diligence” to determine whether a prospective client is actually attempting fraud, the opinion says, then the attorney is free to report.[147] The Opinion continued, highlighting the lack of duty associated with individuals who are not actually clients, stating that an

attorney who discovers that he is the target of an Internet-based trust account scam does not have a duty of confidentiality to the individual attempting to defraud him, and is free to report the individual to law enforcement authorities, because that person does not qualify as a prospective or actual client of the attorney.[148]

V.  Conclusion

[74]     It goes without saying that we live (and work) in interesting times. Cloud technology offers convenience, flexibility, cost savings—and a host of potential security issues that existing “hard-copy world” rules aren’t fit to address. The details of top-secret corporate transactions are now hashed out on collaborative virtual platforms that may be vulnerable to damage, destruction, or unauthorized access. And the increasing ubiquity of social media makes it ever more likely that lawyers and clients alike may post information without appreciating the potential legal ramifications. New technologies have the capacity to enrich our personal lives and enhance our professional lives, but they also create complex and novel challenges for lawyers already subject to a web of ethical duties concerning competence and confidentiality.

[75]     Given the speed with which this dynamic area is changing, the issues raised in this piece may well feel dated within months of publication as the next new product or service revolutionizes another fundamental aspect of human interaction and connectivity. Nevertheless, in this article we have outlined some of the many challenges facing attorneys operating in a threat-laden high-tech landscape, taken a look at the ways in which existing and emerging ethical rules and guidelines may apply to the practice of law in the digital age, and opened a door to further conversation about all of these issues as they continue to evolve.

 

 

* Stacey Blaustein is a Senior Attorney – Corporate Litigation with the IBM Corporation.

** Melinda L. McLellan is Counsel in the New York office of Baker & Hostetler LLP.

*** James Sherer is Counsel in the New York office of Baker & Hostetler LLP.

 

[1] See Roger V. Skalbeck, Computing Efficiencies, Computing Proficiencies and Advanced Legal Technologies, Virginia State Bar – Research Recourses (Oct. 2001), http://www.vsb.org/docs/valawyermagazine/oct01skalbeck.pdf, archived at https://perma.cc/8YWX-YAHF.

[2] See Ed Finkel, Technology No Longer a ‘Nice to Learn’ for Attorneys, Legal Management, Association of Legal Administrators (Oct. 2014), http://encoretech.com/wp-content/uploads/2014/10/Technology-No-Longer-a-Nice-to-Learn-for-Attorneys_ALA-Legal-Management_Oct2014.pdf, archived at https://perma.cc/HUT3-672F.

[3] See, e.g., Evan Weinberger, Fintech Boom Prompts Lawyers to Add Tech Know-How, Law360 (Sep. 4, 2015, 6:05 PM), http://www.law360.com/articles/692081/fintech-boom-prompts-lawyers-to-add-tech-know-how, archived at https://perma.cc/WVE8-UPGP; see also Allison O. Van Laningham, Navigating in the Brave New World of E-Discovery: Ethics, Sanctions and Spoliation, FDCC Q. 327(Summer 2007), http://www.thefederation.org/documents/V57N4-VanLaningham.pdf, archived at https://perma.cc/9L48-MPLU.

[4] See Frank Strong, Beautiful Minds: 41 Legal Industry Predictions for 2016, LexisNexis LawBlog (Dec. 17, 2015), http://businessoflawblog.com/2015/12/legal-industry-predictions-2016/, archived at http://perma.cc/BG5W-R4DB.

[5] To further complicate matters, for attorneys and law firms practicing in the financial technology area such as payment, online lending, bitcoin and other virtual currencies, these lawyers need to be competent in “fintech”, financial technology, another outgrowth of the expertise in technology requirement. See Evan Weinberger, Fintech Boom Prompts Lawyers to Add Tech Know-How, Law360 (Sep. 4, 2015, 6:05 PM), http://www.law360.com/articles/692081/fintech-boom-prompts-lawyers-to-add-tech-know-how, archived at https://perma.cc/L76C-FZRL.

[6] See infra Part III (explaining that agencies such as the FDA have issued guidance in their arena, such as Postmarket Management of Cybersecurity in Medical Devices).

[7] Report to the House of Delegates, ABA Cybersecurity Legal Task Force Section of Sci. & Tech. Law 1, http://www.americanbar.org/content/dam/aba/administrative/house_of_delegates/resolutions/2014_hod_annual_meeting_109.authcheckdam.pdf, archived at https://perma.cc/KQT3-AFAJ.

[8] Ellen Rosen, Most Big Firms Have Had Some Hacking: Business of Law, Bloomberg (Mar. 11, 2015, 12:01 AM), http://www.bloomberg.com/news/articles/2015-03-11/most-big-firms-have-had-some-form-of-hacking-business-of-law, archived at https://perma.cc/YDR6-ZUV8.

[9] See Melissa Maleske, A Soft Target for Hacks, Law Firms Must Step Up Data Security, Law360 (Sep. 23, 2015, 10:09 PM), http://www.law360.com/articles/706312/a-soft-target-for-hacks-law-firms-must-step-up-data-security, archived at https://perma.cc/6V7K-2WB4.

[10] See id.

[11] See Susan Hansen, Cyber Attacks Upend Attorney-Client Privilege, Bloomberg Businessweek (Mar. 19, 2015, 2:56 PM), http://www.bloomberg.com/news/articles/2015-03-19/cyber-attacks-force-law-firms-to-improve-data-security, archived at https://perma.cc/29A5-MUNG.

[12] See Rosen, supra note 8.

[13] Id.

[14] Mark Wolski, Report: Legal Industry Was Heavily Targeted with Cyber Threats in January, Bloomberg BNA (Mar. 9, 2016), https://bol.bna.com/report-legal-industry-was-heavily-targeted-with-cyber-threats-in-january, archived at https://perma.cc/ZCR9-2WRX.

[15] See id.

[16] James Risen & Laura Poitras, Spying by N.S.A. Ally Entangled U.S. Law Firm, N.Y. Times, Feb. 15, 2014, http://www.nytimes.com/2014/02/16/us/eavesdropping-ensnared-american-law-firm.html, archived at https://perma.cc/F8M4-TEQ7.

[17] See Rosen, supra note 8.

[18] See Hansen, supra note 11.

[19] Blake Edwards, Verizon GC: Law Firms Prime Targets for Hackers, Bloomberg BNA (Feb. 4, 2016), https://bol.bna.com/verizon-gc-law-firms-are-prime-targets-for-hackers/, archived at https://perma.cc/F6WU-N6FW.

[20] Strong, supra note 4.

[21] Id.

[22] Matthew Goldstein, Law Firms Are Pressed on Security for Data, N.Y. Times (Mar. 26, 2014), http://dealbook.nytimes.com/2014/03/26/law-firms-scrutinized-as-hacking-increases/, archived at https://perma.cc/Q77A-8BN3.

[23] See N.Y. City Bar Comm. on Small Law Firms, The Cloud and the Small Law Firm: Business, Ethics and Privilege Considerations 2 (Nov. 2013), http://www2.nycbar.org/pdf/report/uploads/20072378-TheCloudandtheSmallLawFirm.pdf, archived at https://perma.cc/A8EG-AH7E.

[24] Id.

[25] Strong, supra note 4.

[26] Patrick Mohan & Steve Krause, Up in the Cloud: Ethical Issues that Arise in the Age of Cloud Computing, 8 ABI Ethics Comm. News L. 1 (Feb. 2011), http://www.davispolk.com/sites/default/files/files/Publication/a2e048ea-3b12-45fe-a639-9fc2881a4db8/Preview/PublicationAttachment/0f8af440-1db0-4936-8d0d-a1937a0e6c8f/skrause.ethics.clouds.feb11.pdf, archived at https://perma.cc/SW3C-FYT5.

[27] Sharon D. Nelson & John W. Simek, Why Do Lawyers Resist Ethical Rules Requiring Competence with Technology?, Slaw (Mar. 27, 2015), http://www.slaw.ca/2015/03/27/why-do-lawyers-resist-ethical-rules-requiring-competence-with-technology/, archived at https://perma.cc/6HNN-UCDZ.

[28] Ed Finkel, Technology No Longer a ‘Nice to Learn’ for Attorneys, Legal Management, Association of Legal Administrators (Oct. 2014) http://encoretech.com/wp-content/uploads/2014/10/Technology-No-Longer-a-Nice-to-Learn-for-Attorneys_ALA-Legal-Management_Oct2014.pdf, archived at https://perma.cc/TW7N-4WP5.

[29] Leslie Pappas, The Security Concerns Holding Up One Firm’s Cloud Usage, Bloomberg BNA (Jan. 22, 2016), https://bol.bna.com/the-security-concerns-holding-up-one-firms-cloud-usage/, archived at https://perma.cc/Z4LJ-H83Q.

[30] See Casey C. Sullivan, Is It Time for a Law Firm Cloud Computing Security Standard?, FindLaw (Feb. 18, 2016), http://blogs.findlaw.com/technologist/2016/02/is-it-time-for-a-law-firm-cloud-computing-security-standard.html, archived at https://perma.cc/78HF-KKX4.

[31] See Jonathan R. Tung, Survey: Law Departments Are Warming Up to the Cloud, FindLaw (Feb. 18, 2016), http://blogs.findlaw.com/in_house/2016/02/survey-law-depts-are-warming-up-to-the-cloud.html, available at https://perma.cc/M89M-LC3M.

[32] Strong, supra note 4.

[33] N.Y. State Bar Ass’n Comm. on Prof’l Ethics, Op. 1020 (Sept. 12, 2014), http://www.nysba.org/CustomTemplates/Content.aspx?id=52001, archived at https://perma.cc/8MPU-62BR.

[34] N.J. Advisory Comm. on Prof’l Ethics, Op. 701 (2006), https://www.judiciary.state.nj.us/notices/ethics/ACPE_Opinion701_ElectronicStorage_12022005.pdf, archived at https://perma.cc/2H5Y-UYWX.

[35] Id.

[37] Aebra Coe, Judges Lack Faith in Attys’ E-Discovery Skills, Survey Says, Law360 (Jan. 28, 2016), http://www.law360.com/articles/751961/judges-lack-faith-in-attys-e-discovery-skills-survey-says, archived at https://perma.cc/5UJB-D2YX.

[38] Id.

[39] Bob Ambrogi, California Considers Ethical Duty to Be Competent in E-Discovery, Catalyst Blog (Feb. 27, 2015), http://www.catalystsecure.com/blog/2015/02/california-considers-ethical-duty-to-be-competent-in-e-discovery/, archived at https://perma.cc/2FXD-8KM4.

[40] Karin S. Jenson, Coleman W. Watson & James A. Sherer, Ethics, Technology, and Attorney Competence, The Advanced eDiscovery Inst. (Nov. 2014), http://www.law.georgetown.edu/cle/materials/eDiscovery/2014/frimorndocs/EthicsIneDiscoveryBakerHostetler.pdf, archived at https://perma.cc/TFR6-VZNG.

[41] See Order Amending Rules 1.0, 1.1, 1.4, 1.6, 1.17, 1.18, 4.4, 5.3, 5.5, 7.1, 7.2, and 7.3 of the Delaware Lawyers’ Rules of Professional Conduct, Del. R. Prof’l Conduct (2013), http://courts.delaware.gov/rules/pdf/dlrpc2013rulechange.pdf.

[42] See N.C. State. Bar Rules of Prof’l Responsibility & Conduct R. 1.1 (2014), http://www.ncbar.com/rules/rules.asp?page=4, archived at https://perma.cc/7R44-4JAG.

[43] See Notice of Proposed Rulemaking, 43 Pa. Bull. 1997 (Apr. 13, 2013), http://www.pabulletin.com/secure/data/vol43/43-15/652.html, archived at https://perma.cc/WS5G-MHKQ.

[44] Bob Ambrogi, California Finalizes Ethics Opinion Requiring Competence in E-Discovery, Catalyst Blog (Aug. 6, 2015), https://www.catalystsecure.com/blog/2015/08/california-finalizes-ethics-opinion-requiring-competence-in-e-discovery/, archived at https://perma.cc/V7NV-QCWW.

[45] Id.

[46] See id.

[47] State Bar of Cal. Standing Comm. on Prof’l Responsibility & Conduct, Formal Op. 2015-193 (2015), https://ethics.calbar.ca.gov/Portals/9/documents/Opinions/CAL%202015-193%20%5B11-0004%5D%20(06-30-15)%20-%20FINAL.pdf, archived at https://perma.cc/8GWJ-BVJ2.

[48] Adam Kuhn, The California eDiscovery Ethics Opinion: 9 Steps to Competency, Recommind Blog (Aug. 11, 2015), http://www.recommind.com/blog/california-ediscovery-ethics-opinion-9-steps-to-competency, archived at https://perma.cc/2X2K-FCRQ.

[49] Id.

[50] H. Christopher Boehning & Daniel J. Toal, E-Discovery Competence of Counsel Criticized in Sanctions Decision, New York Law Journal (Oct. 6, 2015), http://www.newyorklawjournal.com/id=1202738840840/EDiscovery-Competence-of-Counsel-Criticized-in-Sanctions-Decision#ixzz42wNK34Ms, archived at https://perma.cc/4BMP-T76U.

[51] See generally HM Elecs., Inc. v. R.F. Techs., Inc., 2015 U.S. Dist. LEXIS 104100 (S.D. Cal. Aug. 7, 2015) (arguing the invalidity of the steps that the defendants took in order to certify discovery as true).

[52] Boehning & Toal, supra n. 50.

[53] Id.

[54] Samantha V. Ettari & Noah Hertz-Bunzl, Ethical E-Discovery: Core Competencies for New York Lawyers, New York Law Journal (Nov. 2, 2015), http://www.kramerlevin.com/files/Publication/60607051-f018-43b7-8a3c-7d43b4ff6e50/Presentation/PublicationAttachment/1e570a52-e27d-425f-a75b-9e25811df796/NYLJ%20Article-EDiscovery%2011.2.15.pdf, archived at https://perma.cc/F3R8-UWM6.

[55] William A. Gross Constr. Assocs., Inc. v. Am. Mfrs. Mut. Ins. Co., 256 F.R.D. 134, 136 (S.D.N.Y. 2009).

[56] See Ettari & Hertz-Bunzl, supra n. 54.

[57] See Ettari & Hertz-Bunzl, supra n. 54 (citing New York Rules of Professional Conduct (N.Y. Rule) 1.1.5).

[58] See Ettari & Hertz-Bunzl, supra n. 54 (citing N.Y. Rule 5.1(c)).

[59] See generally D.C. Comm. on Legal Ethics, Formal Op. 362 (2012), https://www.dcbar.org/bar-resources/legal-ethics/opinions/opinion362.cfm, archived at https://perma.cc/TXA5-26ZG (discussing the permissibility of non-lawyer ownership of discovery service vendors).

[60] See generally The Sedona Conference Working Group, Best Practices Recommendations & Principles for Addressing Electronic Document Production, The Sedona Principles: Second Edition, June 2007, at 60, 61 https://thesedonaconference.org/publication/The%20Sedona%20Principles, archived at https://perma.cc/UU5K-V8KQ (explaining the composition and functionality of metadata).

[61] Id. at 4.

[62] Id.

[63] State v. Ratcliff, 849 N.W.2d 183, 196 (N.D. 2014).

[64] See Christian Dodd, Metadata 101 for Lawyers: A 2-Minute Primer, Law360 (Oct. 15, 2015, 4:30 PM), http://www.law360.com/articles/712714/metadata-101-for-lawyers-a-2-minute-primer, archived at https://perma.cc/3VCT-TJRB.

[65] See Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 Colum. L. Rev. 583, 587 (2014).

[66] See Health Insurance Portability and Accountability Act of 1996 (HIPAA), 42 U.S.C. §§ 1320d to 1320d-8 (2007) [hereinafter HIPAA].

[67] See Standards for Privacy of Individually Identifiable Health Information, 65 Fed. Reg. 82,462 (Dec. 28, 2000) (codified at 45 C.F.R. pts. 160, 164).

[68] See Security Standards, 68 Fed. Reg. 8333, 8334 (Feb. 20, 2003) (codified at 45 C.F.R. pts. 160, 162, 164).

[69] The health plan within an organization, such as a law firm’s employee health plan, may itself be a “covered entity” for HIPAA compliance purposes, but a firm generally is not, itself, a covered entity. See, e.g., HIPAA, supra note 66.

[70] See John V. Arnold, PRIVACY: What Lawyers Must Do to Comply with HIPAA, 50 Tenn. B.J. 16, 17 (Mar. 2014).

[71] See Lisa J. Acevedo et. al., New HIPAA Liability for Lawyers, 30 GPSolo, no. 4, 2013, http://www.americanbar.org/publications/gp_solo/2013/july_august/new_hipaa_liability_lawyers.html, archived at https://perma.cc/F88Y-U928.

[72] See Standards for Privacy of Individually Identifiable Health Information, supra note 67; see Security Standards, supra note 68.

[73] Both the Privacy Rule and the Security Rule dictate certain terms that must be included in a BAA.

[74] See Nelson & Simek, supra note 27.

[75] See 15 U.S.C. §§ 6801–6809 (2012).

[76] See 16 C.F.R. §§ 314.2, 314.3(b).

[77] See 16 C.F.R. § 314.4(a-c).

[78] See, e.g., Cal. Civ. Code § 1798.81.5 (Deering 2009); Conn. Gen. Stat. § 42-471 (2010); Md. Code Ann., Com. Law §§ 14-3501 to 14-3503 (LexisNexis 2009); Nev. Rev. Stat. § 603A.210 (2009); Or. Rev. Stat. § 646A.622 (2009); Tex. Bus. & Com. Code Ann. §§ 72.001–72.051 (West 2009).

[79] See Cal. Civ. Code § 1798.81.5 (Deering 2009).

[80] See 201 Mass. Code Regs. 17.01–17.05 (2008).

[81] See id.

[82] See id.

[83] See Debra Cassens Weiss, Lawyers Have Duty to Stay Current on Technology’s Risks and Benefits, New Model Ethics Comment Says, ABA Journal Law News (Aug. 6, 2012, 7:46 PM)

http://www.abajournal.com/news/article/lawyers_have_duty_to_stay_current_on_technologys_risks_and_benefits/, archived at https://perma.cc/WPZ4-2DYH.

[84] See Unfrozen Caveman Lawyer, Saturday Night Live Transcripts, http://snltranscripts.jt.org/91/91gcaveman.phtml, archived at https://perma.cc/M7GB-DGJZ (“Sometimes when I get a message on my fax machine, I wonder: ‘Did little demons get inside and type it?’ I don’t know! My primitive mind can’t grasp these concepts.”) (last visited Apr. 5, 2016).

[85] See Megan Zavieh, Luddite Lawyers Are Ethical Violations Waiting to Happen, Lawyerist.com (last updated July 10, 2015), https://lawyerist.com/71071/luddite-lawyers-ethical-violations-waiting-happen/, archived at https://perma.cc/6V4W-94J7.

[86] Lois D. Mermelstein, Ethics Update: Lawyers Must Keep Up with Technology Too, American Bar Association – Business Law Today, Business Law Today (Mar. 2013), http://www.americanbar.org/publications/blt/2013/03/keeping_current.html, archived at https://perma.cc/T8CF-ZWND.

[87] See Blair Janis, How Technology Is Changing the Practice Of Law, GP Solo, http://www.americanbar.org/publications/gp_solo/2014/may_june/how_technology_changing_practice_law.html, archived at https://perma.cc/23P5-PGM7 (last visited Apr. 5, 2016).

[88] Kevin O’Keefe, We Need Laws Requiring Lawyers to Stay Abreast of Technology? LEXBLOG: Ethics & Blogging Law (Mar. 28, 2015), http://kevin.lexblog.com/2015/03/28/we-need-laws-requiring-lawyers-to-stay-abreast-of-technology/, archived at https://perma.cc/8DR5-XK43.

[89] Attorney-client Privilege: Technological Changes Bring Changing Responsibilities for Attorneys and Legal Departments, Corporate Law Advisory, http://www.lexisnexis.com/communities/corporatecounselnewsletter/b/newsletter/archive/2014/01/06/attorney-client-privilege-technological-changes-bring-changing-responsibilities-for-attorneys-and-legal-departments.aspx, archived at https://perma.cc/XQ53-P3MF (last visited Apr. 5, 2016).

[90] Daniel Solove, Starting a Privacy Law Career, LinkedIn Pulse (Aug. 27, 2013), https://www.linkedin.com/pulse/20130827061558-2259773-starting-a-privacy-law-career?forceNoSplash=true, archived at https://perma.cc/G78L-DM2X.

[91] See Peter Geraghty & Sue Michmerhuizen, Think Twice Before You Call Yourself an Expert, Your ABA (Mar. 2013), http://www.americanbar.org/newsletter/publications/youraba/201303article11.html, archived at https://perma.cc/HJK7-RSLG .

[92] Solove, supra note 90.

[93] Alysa Pfeiffer-Austin, Four Practical Tips to Succeed in the Cybersecurity and Privacy Law Market, ABA Security Law (Dec. 9, 2015), http://abaforlawstudents.com/2015/12/09/four-practical-tips-to-succeed-in-the-cybersecurity-and-privacy-law-market/, archived at https://perma.cc/AH9A-JCTU.

[94] See David G. Ries, Cybersecurity for Attorneys: Understanding the Ethical Obligations, Law Practice Today (Mar. 2012), http://www.americanbar.org/publications/law_practice_today_home/law_practice_today_archive/march12/cyber-security-for-attorneys-understanding-the-ethical-obligations.html, archived at https://perma.cc/N4VM-N4NG.

[95] Model Rules of Prof’l Conduct R. 1.1 (2014).

[96] Model Rules of Prof’l Conduct R. 1.1 cmt. 8 (2014) (emphasis added).

[97] Jenson, Watson & Sherer, supra note 40, at 2.

[98] See James Podgers, You Don’t Need Perfect Tech Knowhow for Ethics’ Sake—But a Reasonable Grasp Is Essential, ABA Journal (Aug. 9, 2014), http://www.abajournal.com/news/article/you_dont_need_perfect_tech_knowhow_for_ethics_sake–but_a_reasonable_grasp, archived at https://perma.cc/CB3P-R7YL.

[99] Jenson, Watson & Sherer, supra note 40, at 2.

[100] Kelly H. Twigger, Symposium, Ethics in Technology and eDiscovery – Stuff You Know, but Aren’t Thinking About, Ark. L. Rev. (Oct. 16, 2014), http://law.uark.edu/documents/2014/10/TWIGGER-Ethics-in-Technology-and-eDiscovery.pdf, archived at https://perma.cc/LTG8-7AYU.

[101] Id.

[102] These states are: Arizona, Arkansas, Connecticut, Delaware, Idaho, Illinois, Iowa, Kansas, Massachusetts, Minnesota, Nebraska, New Hampshire, New Mexico, New York, North Carolina, Ohio, Pennsylvania, Utah, Virginia, West Virginia, and Wyoming. See Robert Ambrogi, 20 States Have Adopted Ethical Duty of Technological Competence, Law Sites (Mar. 16, 2015), http://www.lawsitesblog.com/2015/03/11-states-have-adopted-ethical-duty-of-technology-competence.html, archived at https://perma.cc/B5TF-D6NJ (last updated Dec. 23, 2015) (listing 20 states not including Nebraska); see also Basic Technology Competence for Lawyers, Event Details, Nebraska Bar Assoc. (Apr. 6, 2016), https://nebar.site-ym.com/events/EventDetails.aspx?id=788239&group=, archived at https://perma.cc/SMU6-58TU (“[T]he need to be aware of and have a working knowledge of technology…is ethically required of all lawyers.”).

[103] Ann M. Murphy, Is It Safe? The Need for State Ethical Rules to Keep Pace with Technological Advances, 81 Fordham L. Rev. 1651, 1659, 1665–66 (2013), http://ir.lawnet.fordham.edu/cgi/viewcontent.cgi?article=4876&context=flr, archived at https://perma.cc/V69A-EETR.

[104] Samantha V. Ettari & Noah Hertz-Bunzl, Ethical E-Discovery: What Every Lawyer Needs to Know, LegaltechNews (Nov. 10, 2015), http://www.kramerlevin.com/files/Publication/d7dec721-693a-4810-a4b9-32dfe9c1864b/Presentation/PublicationAttachment/018a444a-d7de-46b2-bc16-506cff88d346/EDiscovery-Legaltech%20News11.10.15..pdf, archived at https://perma.cc/4YMR-XL9U (referring to Model Rule of Prof’l Conduct 5.1).

[105] American Bar Association, A Legislative History: the Development of the ABA Model Rules of Professional Conduct, 1982-2005 560 (2006).

[106] Jeffrey P. Reilly, Rule 5.1 of the Rules of Professional Conduct: What Must Corporate General Counsel Do? Association of Corporate Counsel, Baltimore Chapter FOCUS 2Q12 5–6 (2012), http://www.milesstockbridge.com/pdf/publications/ReillyACCArticle.pdf, archived at https://perma.cc/G26J-NTJE.

[107] See Jennifer Ellis, What Technology Does a Modern US Lawyer Generally Use in Practice?, Quora (Mar. 22, 2014), https://www.quora.com/What-technology-does-a-modern-US-lawyer-generally-use-in-practice, archived at https://perma.cc/4FX4-2UV7.

[108] See Model Rules of Prof’l Conduct R. 5.3.

[109] Frances P. Kao, No, a Paralegal Is Not a Lawyer, ABA Bus. Law Today, (Jan./Feb. 2007), https://apps.americanbar.org/buslaw/blt/2007-01-02/kao.shtml, archived at https://perma.cc/3J2N-ELPA.

[110] See Model Rules of Prof’l Conduct R. 1.6.

[111] See Jon Snyder, 1939’s ‘World of Tomorrow’ Shaped Our Today, Wired (Apr. 29, 2010, 8:00 PM), http://www.wired.com/2010/04/gallery-1939-worlds-fair/, archived at https://perma.cc/D5V4-36R5.

[112] Model Rules of Prof’l Conduct R. 1.6.

[113] See Saul Jay Singer, Speaking of Ethics: When Tarasoff Meets Rule 1.6, Washington Lawyer (May 2011), https://www.dcbar.org/bar-resources/publications/washington-lawyer/articles/may-2011-speaking-of-ethics.cfm, archived at https://perma.cc/A7E4-DSH6.

[114] Model Rules of Prof’l Conduct R. 1.6 cmt. 18.

[115] David G. Ries, Cybersecurity for Attorneys: Understanding the Ethical Obligations, Law Practice Today (Mar. 2012), http://www.americanbar.org/publications/law_practice_today_home/law_practice_today_archive/march12/cyber-security-for-attorneys-understanding-the-ethical-obligations.html, archived at https://perma.cc/59Q2-55Q4.

[116] See State Bar of Cal. Standing Comm. on Prof’l Responsibility and Conduct, Formal Op. 2015-193, 3–4 (2015) [hereinafter Cal. Ethics Op. 2015-193] (discussing what an attorney’s ethical duties are in the handling of discovery of electronically stored information).

[117] Ettari & Hertz-Bunzl, supra note 104.

[118] Cal. Ethics Op. 2015-193, supra note 116, at fn. 7.

[119] State Bar of Cal. Standing Comm. on Prof’l Responsibility and Conduct, Formal Op. 2010-179, 7 (2010) (discussing whether an attorney violates the duties of confidentiality and competence she owes to a client by using technology to transmit or store confidential client information when the technology may be susceptible to unauthorized access by third parties).

[120] N.Y. Cnty. Lawyers’ Ass’n Comm. on Prof’l Ethics, Formal Op. 733, 7 (2004) (discussing non-exclusive referrals and sharing of office space, computers, telephone lines, office expenses, and advertising with non-legal professionals).

[121] N.Y. State Bar Ass’n Comm. on Prof’l Ethics, Formal Op. 842 (2010) (discussing using an outside online storage provider to store client’s confidential information).

[122] Model Rules of Prof’l Conduct R. 1.4 (1983); see also 204 Pa. Code § 81.4 (1988), http://www.pacode.com/secure/data/204/chapter81/chap81toc.html, archived at https://perma.cc/6FG5-9VP3 (incorporating ABA Model Rule 1.4 into Pennsylvania’s Model Rule 1.4).

[123] See ABA Comm. on Ethics 20/20, Introduction and Overview (Feb. 2013), http://www.americanbar.org/content/dam/aba/administrative/ethics_2020/20121112_ethics_20_20_overarching_report_final_with_disclaimer.authcheckdam.pdf, archived at https://perma.cc/D2ZY-NYEU.

[124] Model Rules of Prof’l Conduct R. 1.6(c) cmt. 18 (1983).

[125] Opinion 701 also highlights, if inadvertently, the challenges attorneys face when trying to modify existing practices to fit new technologies. As part of the inquiry underpinning Opinion 701’s guidance, the opinion notes that “nothing in the RPCs prevents a lawyer from archiving a client’s file through use of an electronic medium such as PDF files or similar formats.” This note is nearly laughable when read in the context of current practice, as it suggests that attorneys were (or are?) concerned about whether PDF files are appropriate for retaining paper documents. N.J. Advisory Comm. on Prof’l Ethics, Formal Op. 701 (2006), https://www.judiciary.state.nj.us/notices/ethics/ACPE_Opinion701_ElectronicStorage_12022005.pdf, archived at https://perma.cc/EV9H-BN3T.

[126] See Brian M. Karpf, Florida’s Take on Telling Clients to Scrub Social Media Pages, Law 360 (Sept. 15, 2015, 4:33 PM), http://www.law360.com/articles/702288/florida-s-take-on-telling-clients-to-scrub-social-media-pages, archived at https://perma.cc/NZ3W-FHPS.

[127] See id.

[128] N.Y.C. Bar Ass’n Comm. on Prof’l. Ethics, Formal Op. 2010-2 (2010), http://www.nycbar.org/ethics/ethics-opinions-local/2010-opinions/786-obtaining-evidence-from-social-networking-websites, archived at https://perma.cc/JT9K-2EGV (discussing lawyers’ obtainment of information from social networking websites).

[129] Id.

[130] Mark A. Berman, Ignatius A. Grande & James M. Wicks, Social Media Ethics Guidelines of the Commercial and Federal Litigation Section of the New York State Bar Association, The New York State Bar Association (June 9, 2015), http://www.nysba.org/socialmediaguidelines/, archived at https://perma.cc/4ZSN-BXT4.

[131] Id.

[132] Id.

[133] Id.

[134] Id.

[135] See Fla. State Bar Comm. on Prof’l Ethics, Proposed Op. 14-1 (2015), http://www.floridabar.org/TFB/TFBResources.nsf/Attachments/B806500C941083C785257E730071222B/$FILE/14-01%20PAO.pdf?OpenElement, archived at https://perma.cc/DK9W-A44Z.

[136] Pa. Bar Ass’n. Comm. on Ethics, Formal Op. 2014-300, 2 (2014), http://www.americanbar.org/content/dam/aba/events/professional_responsibility/2015/May/Conference/Materials/pa_formal_op_2014_300.authcheckdam.pdf, archived at https://perma.cc/G6EY-PBFF.

[137] Model Rules of Prof’l Conduct R. 3.4 (1983).

[138] See supra Part II.

[139] See ABA Cybersecurity Legal Task Force, Resolution 118, 2 (August 2013), http://www.americanbar.org/content/dam/aba/administrative/law_national_security/resolution_118.authcheckdam.pdf, archived at https://perma.cc/UQ44-3Q2C.

[140] See id. at 4.

[141] See id. at 16.

[142] Model Rules of Prof’l Conduct R. 5.7, cmt. 1 (1983).

[143] See Model Rules of Prof’l Conduct R. 1.6.

[144] Ralph C. Losey, The Importance of Cybersecurity in eDiscovery, E-Discovery Law Today (May 9, 2014) http://www.ediscoverylawtoday.com/2014/05/the-importance-of-data-security-in-ediscovery/, archived at https://perma.cc/P64J-NYQ7.

[145] Ralph C. Losey, The Importance of Cybersecurity to the Legal Profession and Outsourcing as a Best Practice – Part Two, e-Discovery Team (May 18, 2014), http://e-discoveryteam.com/2014/05/18/the-importance-of-cybersecurity-to-the-legal-profession-and-outsourcing-as-a-best-practice-part-two/, archived at https://perma.cc/W3HW-AHCC.

[146] N.Y.C. Bar Ass’n Comm. on Prof’l Ethics, Formal Op. 2015-3, 4–5 (2015), http://www2.nycbar.org/pdf/report/uploads/20072898-FormalOpinion2015-3-LAWYERSWHOFALLVICTIMTOINTERNETSCAMS.pdf, archived at https://perma.cc/6BHV-V2YC.

[147] Id. at 1.

[148] Id. at 6 (emphasis added).

"Demand Response" In 2016


By: Ryan Suit,

If you have ever wanted to be paid to do less than you are doing right now, then you might be a fan of “demand response.” Demand response refers to the concept of paying electricity consumers to not use electricity at certain times.[1] Currently, most demand response participants are large commercial industries, and residential participation is still small. But as advancements in technology allow more people and businesses to take part in demand response, paying consumers to not use electricity will have a lot of benefits for consumers, the electric grid, and the environment.[2]

Demand response entails less energy being consumed, which means less energy needs to be produced. First, this decreases costs to consumers because they use less energy.[3] Second, demand response makes the grid more reliable because there is a lower likelihood of overloading the grid.[4] Third, it decreases the amount of carbon dioxide produced by electricity generators that use fossil fuels.[5] Many fossil-fuel power plants are both inefficient and expensive to operate, so they are only turned on when demand for electricity is at its peak. Demand response can eliminate the need for these types of plants, and therefore prevent them from producing carbon dioxide pollution, because it balances the supply of and demand for electricity by using less energy rather than producing more.

The big question for demand response is, “Who gets to regulate it?” Section 201 of the Federal Power Act gives the Federal Energy Regulatory Commission (FERC) the power to regulate interstate transmission of electricity and wholesale electricity sales.[6] While FERC was given the authority to regulate interstate electricity and wholesale rates, states were left to regulate intrastate electricity sales and retail sales to end-users.[7] Almost one year ago, the Supreme Court held in Learjet that federal natural gas laws do not preempt state laws that regulate any phase of natural gas production.[8] By holding that federal laws did not preempt state regulation of an energy industry, Learjet signaled that the Supreme Court might be in favor of allowing states to regulate demand response.[9] Earlier this year the Court clarified its stance on demand response in FERC v. EPSA.[10] In that case, the Supreme Court confirmed that FERC may require firms that transmit energy on the grid to accept bids from demand response companies.[11] This makes demand response a much more viable competitor to energy generators, and also creates more of the benefits described above: lower costs, greater reliability, and less pollution. Though state regulatory commissions still have the ability to limit demand response by prohibiting customers in their states from participating in demand response markets,[12] the holding in FERC v. EPSA may indicate that this “veto” power could be taken away, and FERC might soon have more power to expand demand response programs.

Demand response is becoming more than just an energy concept, and is gaining more traction both economically and legally. More lawsuits dealing with demand response are likely to be litigated in the near future, but the Supreme Court’s recent rulings show that demand response could be in your near future as well.

 

 

[1] See Joel Eisen, FERC v. EPSA and the Path to a Cleaner Energy Sector, 40 Harvard Environmental Law Review 1 (2016).

[2] Id. at 2.

[3] Id.

[4] Id.

[5] Id.

[6] Federal Power Act § 201

[7] Id.

[8] See Oneok, Inc. v. Learjet, Inc.

[9] Ashley Davoli, Demand Response: The Consumer’s Role in Energy Use, Rich. J.L. & Tech. (April 26, 2015) http://jolt.richmond.edu/index.php/demand-response-the-consumers-role-in-energy-use/.

[10] See Federal Energy Regulatory Commission v. Electric Power Supply Assoc., 136 S.Ct. 760 (2016)

[11] Id. at 763-764.

[12] Id. at 779-780.

Smith et al v. Facebook, Inc. et al: Plaintiffs Allege Facebook is Mining Private Medical Information to Generate Profit


By: Quinn Novak,

 

If you Google the American Cancer Society and search through its website for information about breast cancer, do you have a reasonable expectation of privacy? Or do you expect that someone is monitoring your activity and collecting your medical searches? Winston Smith believed he had privacy when searching those types of medical websites for cancer information. Smith did not realize that Facebook was collecting his private medical information from well-respected cancer organizations[1] and using that private health data to create marketing profiles, targeting him with tailored advertisements based on his private information.[2] When Smith discovered this, he initiated a class action lawsuit against Facebook and seven healthcare organizations, including the American Cancer Society, the American Society of Clinical Oncology, and the Melanoma Research Foundation.[3]

Smith filed the complaint on March 15, 2016 in federal court in San Jose.[4] The case was assigned to Magistrate Judge Nathanael M. Cousins.[5] The three plaintiffs, including Smith, allege that the named defendants violated the Health Insurance Portability and Accountability Act of 1996 (HIPAA), federal Wiretap Acts, and several state statutes.[6] Under HIPAA, medical data is private: it should be difficult to acquire, and companies are not allowed to gather or share medical information without the patient’s express authorization.[7] Plaintiffs argue that Facebook and the named healthcare organizations violate HIPAA because users have no idea that their information is being gathered and because Facebook does not disclose in its data and privacy policies that it tracks, collects, and intercepts users’ sensitive medical information and communications.[8]

Although it is evident that Facebook is harvesting cancer data to generate profit through targeted advertising,[9] it is unclear whether the medical website owners know that Facebook is using their data.[10] If the healthcare organizations were aware that Facebook was collecting their users’ data, however, the plaintiffs claim that the organizations should have disclosed their relationship with Facebook to their users.[11]

Although Smith seeks class certification, damages, restitution, and a permanent injunction from all eight defendants, a Facebook spokesperson stated that the “[l]awsuit is without merit and we will defend ourselves vigorously.”[12] In rebuttal, a representative from plaintiffs’ counsel, Kiesel Law LLP, stated, “When you’re searching private medical information, you don’t realize it’s being sent to Facebook,” and asserted that there is a reasonable expectation of privacy for these types of searches.[13] Fortunately, not all medical websites allow Facebook to track their users’ communications; the Mayo Clinic and Johns Hopkins Medicine websites, for example, do not allow Facebook to mine their data through the use of cookies.[14] So, for now, if you need to search for medical information about cancer and you don’t want Facebook to keep track of that information, use one of the numerous protected websites. Otherwise, the next time you log onto Facebook, you can reasonably expect to see advertisements across your newsfeed catering to your cancer-related medical needs.

 

 

[1] See Bethy Squires, Facebook is Mining Private Data from Cancer Organizations, New Lawsuit Alleges, Broadly (Mar. 18, 2016, 4:15 PM), https://broadly.vice.com/en_us/article/facebook-is-mining-private-data-from-cancer-organizations-new-lawsuit-alleges.

[2] See Nicholas Iovino, Facebook Mines Data Off Cancer Sites, Users Say, Courthouse News Service (Mar. 16, 2016, 7:05 PM), http://www.courthousenews.com/2016/03/16/facebook-mines-data-off-cancer-sites-users-say.htm.

[3] See Carrie Pallardy, Lawsuit Claims Facebook Mined PHI from Websites of Cleveland Clinic, MD Anderson Cancer Center & More for Advertising Profit, Becker’s Health IT & CIO Review (Mar. 23, 2016), http://www.beckershospitalreview.com/healthcare-information-technology/lawsuit-claims-facebook-mined-phi-from-websites-of-cleveland-clinic-md-anderson-cancer-center-more-for-advertising-profit.html.

[4] See Smith et al v. Facebook, Inc. et al, PacerMonitor (Apr. 1, 2016, 12:07 AM), https://www.pacermonitor.com/public/case/10970091/Smith_et_al_v_Facebook,_Inc_et_al [hereinafter PacerMonitor]; see Neil Versel, Suit Claims Facebook Mines Private Cancer Data, MedCity News (Mar. 23, 2016, 1:21 AM), http://medcitynews.com/2016/03/facebook-cancer-data/.

[5] See PacerMonitor, supra note 4.

[6] See Versel, supra note 4.

[7] See Squires, supra note 1.

[8] See id.; see Iovino, supra note 2.

[9] See Iovino, supra note 2.

[10] See Squires, supra note 1; see also Versel, supra note 4 (stating that it is unclear whether cancer institutes named in the suit are aware of Facebook’s practices).

[11] See Pallardy, supra note 3.

[12] See Iovino, supra note 2.

[13] See Squires, supra note 1.

[14] See id.

 

Photo Source: http://www.valuewalk.com/wp-content/uploads/2015/02/Facebook-lawsuit.jpg

Are Your Legal (Or Illegal) Undertakings Really Anonymous?

 


By Celtia van Niekerk,

When Silk Road was developed, it became a haven for illegal activity. Masked by the cryptic underworld of the dark web, many people thought that their online activities were finally free from the prying eyes of law enforcement.

The developer of Silk Road, Ross Ulbricht, was one such person.

He created Silk Road, a website where narcotics were freely sold—an Amazon of the underworld. To buy and sell narcotics, users turned to Bitcoin, a digital currency that enabled them to conduct their activities in secrecy… or so they thought.

The Bitcoin network relies on a shared public ledger called the blockchain. The blockchain records every transaction, and from those records the balance of each wallet is calculated.[1] This process is made secure through cryptography. The difficulty for law enforcement is that a user’s true identity is kept secret: instead of using a real name as one would at a bank, a user creates a code that serves as his or her digital signature on the blockchain.[2] But while bitcoins themselves are anonymous, spending them starts a forensic trail that may lead right back to you.[3]
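
To make the pseudonymity point concrete, the sketch below is a deliberately simplified illustration, not Bitcoin’s actual address-derivation algorithm (which uses ECDSA keys, SHA-256 plus RIPEMD-160 hashing, and Base58Check encoding). It shows the general idea: the identifier that appears on the public ledger is derived by hashing key material rather than by registering a name, which is why the ledger can be fully public while the owner’s real-world identity stays hidden.

```python
# Simplified sketch only: the on-ledger identifier is derived from key
# material, not from a name, so the ledger is public but pseudonymous.
import hashlib
import secrets

private_key = secrets.token_bytes(32)                    # stand-in for a wallet's private key
public_key = hashlib.sha256(private_key).digest()        # stand-in for public-key derivation
address = hashlib.sha256(public_key).hexdigest()[:40]    # hash used as the pseudonymous ID

print("on-ledger address:", address)
```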

Graduate students at Penn State were the first to crack the cryptography wall—by isolating some Bitcoin addresses, they were able to identify other addresses and eventually map the IP addresses associated with over 1,000 Bitcoin addresses.[4] But this is easier said than done—once bitcoins mix with those of other users, the trail is harder to follow, as Bitcoin is designed to blur the link between the IP address and the transaction.[5] According to Sarah Meiklejohn, a computer scientist, once you catch someone buying an illegal product off a website such as Silk Road, the blockchain serves as a history of all of their criminal activity.[6]
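
The “history of criminal activity” observation can be illustrated with a toy example. The sketch below uses invented addresses and transactions, not real blockchain data or any published tracing tool, to show why a public ledger creates a forensic trail: once a single address is tied to a real person, investigators can walk the recorded transfers outward from it.

```python
# Toy ledger with invented data: each entry is (sender_address, receiver_address, amount).
ledger = [
    ("addr_A", "addr_B", 2.0),
    ("addr_B", "addr_C", 1.5),
    ("addr_C", "addr_D", 1.5),
    ("addr_X", "addr_Y", 0.7),   # unrelated transfer, never reached from addr_A
]

def trace_forward(start, transactions):
    """Follow funds from a known address to every address reachable downstream."""
    reached = {start}
    frontier = [start]
    while frontier:
        current = frontier.pop()
        for sender, receiver, _amount in transactions:
            if sender == current and receiver not in reached:
                reached.add(receiver)
                frontier.append(receiver)
    return reached

# Suppose addr_A has been linked to a suspect (e.g., through an IP address);
# the public ledger then exposes every downstream address the funds touched.
print(trace_forward("addr_A", ledger))   # {'addr_A', 'addr_B', 'addr_C', 'addr_D'}
```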

Some have contended that the federal government may issue its own cryptocurrency, similar to Bitcoin, that would require users to verify their real-world identities.[7] But a move like this may have no effect on the popularity of Bitcoin, which offers its users more anonymity. One thing is certain: Bitcoin is not as anonymous as once believed, leading law enforcement to take notice.

 

 

[1] Bitcoin, How does Bitcoin Work? https://bitcoin.org/en/how-it-works (last accessed March 14, 2016).

[2] John Bohannon, Why Criminals Can’t Hide Behind Bitcoin, Science (Mar. 9, 2016).

[3] Elliot Maras, How Bitcoin Technology Helps Law Enforcement Catch Criminals, CCN.LA (Mar. 10, 2016).

[4] Id.

[5] Supra note 2.

[6] Supra note 3.

[7] Supra note 2 (statement from Bill Gleim, head of machine learning at Coinalytics).

 

Photo Source: http://media.coindesk.com/2013/08/how-can-i-buy-bitcoins-630×382.jpg

A Litigator's Guide to the Internet of Things

Peyton Publication Version PDF

 

Cite as: Antigone Peyton, A Litigator’s Guide to the Internet of Things, 22 Rich. J.L. & Tech. 9 (2016), http://jolt.richmond.edu/v22i3/article9.pdf. 

Antigone Peyton, Esq.*

I. Introduction

[1]       Maybe you’ve heard about the Internet of Things (IoT). It’s the network of physical objects (or “things”) that connect to the Internet and each other and have the ability to collect and exchange data. It includes a variety of devices with sensors, vehicles, buildings, and other items that contain electronics, software, and sensors. Some IoT objects have “embedded intelligence,” which allows them to detect and react to changes in their physical state.[1] Though there is no specific definition of IoT, the concept focuses on how computers, sensors, and objects interact with each other and collect information relating to their surroundings.[2]

[2]       In 2009, the number of “things” connected to the Internet surpassed the number of people worldwide.[3] That was just the beginning of the IoT movement.[4] In fact, some industry experts estimate that there will be up to 50 billion connected devices by 2020.[5] The LinkedIn “Internet of Things Community” is 12,000 members strong, and it’s growing every day.[6] Lawyers need to understand how this explosive growth in the IoT market is going to change their practice in the courtroom.

[3]       From a litigator’s perspective, there are benefits and risks associated with IoT evidence. These connected objects, combined with big data analytics, can make cases simultaneously clearer and more complicated. The IoT movement also challenges litigators to roll up their sleeves and think creatively about how all these connected objects can tell a story. The key evidence that blows the case wide open may be right in front of your face, flying through the interweb, waiting patiently in a client’s smart phone app, or sitting on their fitness device.

[4]       For instance, and as this paper explores, IoT information can be used to track suspects’ movements at the time a crime occurred and provide evidence of an alibi. It can be used to attack the credibility of witness testimony and show how a vehicle was (or wasn’t) functioning properly when an accident occurred. As with all evidence we might use in the courtroom, lawyers, juries, and judges need to understand how IoT data should be interpreted and its limitations.

[5]       Lawyers also need to talk with clients about the smart objects they interact with and which objects might have information that is potentially relevant to litigation. The data those objects collect might reflect a client’s physical injury and diminished capacity, indicate the physiological response to a sexual harassment incident, or provide evidence of a former employee’s unauthorized access to company systems to steal data. Consider the narrative that can be created once you obtain the right IoT data from a client or opponent. You can’t consider the options, however, until you ask the right questions.

[6]       It’s time to hone your technical competence and start thinking about how IoT will forever change the way you prepare and try your case! This is the litigator’s guide to the Internet of Things.

II. The Internet of What?

[7]       The basic premise behind IoT is that everyday objects can be turned into “smart” devices that operate better, are more efficient, and communicate with their human masters and other objects. These objects are programmed to communicate via apps, text messages, browsers, and other tools. They tend to communicate using embedded sensors and wired and wireless communication protocols and systems, including Wi-Fi, Bluetooth, and a variety of specialized IoT protocols.[7]
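
As a rough illustration of that pattern, the sketch below shows a “smart” object packaging a sensor reading as structured data and reporting it to a collection server. The endpoint, device identifier, and payload fields are invented for illustration; real devices use a range of transports (Wi-Fi, Bluetooth, MQTT, proprietary cloud APIs) rather than this exact call.

```python
import json
import time
import urllib.request

COLLECTION_URL = "https://example.com/iot/readings"  # placeholder endpoint

def report_reading(temperature_f):
    """Send one thermostat reading to the (hypothetical) manufacturer's server."""
    payload = {
        "device_id": "thermostat-42",     # invented identifier
        "recorded_at": int(time.time()),   # Unix timestamp of the reading
        "temperature_f": temperature_f,
    }
    request = urllib.request.Request(
        COLLECTION_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status  # the reading now lives on someone else's server

# report_reading(68.5)  # each call adds another record a litigator might later request
```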

[8]       Imagine a refrigerator that tells you when you need more milk,[8] or a home thermostat that can be adjusted remotely using an app on your mobile device and learns your behavior patterns relating to your home climate.[9] Or a networked house that connects power outlets to sound systems, TVs, smoke detectors, security cameras, coffee pots, and the homeowner through a software app.[10] These homes already exist,[11] and more are coming online every day.

[9]       This increased connectivity includes objects outside the home. Workers and service professionals are connecting remotely and communicating with their company’s business equipment and office systems via mobile devices.[12] Consumers are buying networked cars,[13] and walking around with wearable fitness and health technologies strapped to their arms and embedded in their clothes that track their vitals and activity levels.[14] Bikers are using apps and devices to track their workouts and film their surroundings.[15] Google Glass wearers are creating and recording information as they travel and they are communicating with the Internet using voice commands.[16] All of these connected technologies create interesting information about their users and have some level of situational awareness.

III. The Connected State 

A. Connected Toys

[10]     There are a surprising number of everyday objects found in homes that are recording information and transmitting it offsite. One creepy example of the IoT revolution is Mattel’s talking Barbie.[17] Mattel’s connected Barbie can talk with your child through an embedded microphone and a Wi-Fi connection that’s engaged when you hold down a button on her belt.[18] When someone talks to “Hello Barbie,” the conversation is recorded and sent to a server back at the company that makes the voice recognition technology powering Barbie.[19] There, speech recognition software (think of a Barbie version of Siri) interprets the child’s statements and sends back a pre-programmed response.[20] That’s right, the doll talks back to the child. Mattel’s partner, ToyTalk, stores all of the children’s conversations and the conversations of others who interact with the doll.[21]

[11]     Whether ToyTalk is controlling the object or its behaviors or listening to the people or other objects that its products interact with, these activities are important to lawyers investigating potential sources of relevant evidence in the litigation context. Perhaps a lawyer might send a subpoena to ToyTalk seeking the audio records from its client’s Hello Barbie doll for use in a domestic abuse case. And Hello Barbie is not an outlier—there are a number of connected toys popping up on store shelves. It’s rarely, if ever, explained to the consumer where the conversations these toys record and transmit are being stored, how that information is being used by the manufacturer or a partner company, and how it might be collected for use in litigation.
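
For discovery purposes, the salient point is that the vendor ends up holding a time-stamped archive of recordings tied to a particular device. The sketch below is purely hypothetical (it is not ToyTalk’s system), but it shows the shape such a store might take and how a date-bounded request could be answered from it.

```python
from datetime import datetime

# Hypothetical vendor-side archive of recordings, keyed by device identifier.
# Every entry below is invented for illustration.
recordings = {
    "doll-0042": [
        {"recorded_at": datetime(2016, 2, 1, 17, 5), "audio_file": "rec_001.wav"},
        {"recorded_at": datetime(2016, 3, 2, 18, 30), "audio_file": "rec_002.wav"},
    ],
}

def records_between(device_id, start, end, archive=recordings):
    """Return one device's recordings within a date range, roughly the
    shape a response to a narrowly drawn subpoena might take."""
    return [
        entry for entry in archive.get(device_id, [])
        if start <= entry["recorded_at"] <= end
    ]

hits = records_between("doll-0042", datetime(2016, 2, 1), datetime(2016, 2, 29))
print(len(hits))  # -> 1
```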

[12]     Some enterprising companies, including several rent-to-own companies that ran into a bit of trouble with the FTC, put spyware (called Detective Mode) on their rental laptops that would turn on the built-in cameras if the customer failed to make timely payments.[22] The spyware could also track the user’s location, disable the computers, and add a fake software registration popup window that captured a user’s registration information and transmitted it back to the rental store, which would use it to track down renters and collect money.[23] Detective Mode also gathered data about whoever was using the computer and transmitted it to the software manufacturer every two minutes, which then sent the data to the rent-to-own store.[24] Because the software collected private data—including user names and passwords for e-mail accounts, social media websites, and financial institutions; Social Security numbers; medical records; private e-mails; bank and credit card statements; and webcam pictures of children, partially undressed individuals, and intimate activities at home—the FTC put a stop to the practice.[25] While these rental laptops are not considered IoT objects, similar spyware can be loaded onto any object with a chip, a camera, and access to the Internet and used to collect highly sensitive information.

B. Wearable IoT Devices 

[13]     Wearable IoT devices include a wide range of medical devices and health and fitness products, including casual wearable fitness devices (like the Apple watch) and connected pacemakers and insulin pumps.[26] Wearable fitness devices, including smart watches and smart clothes, now monitor geolocation as well as heart rate, pulse, calorie consumption, sleep patterns, and other biological data.[27] Most wearable devices monitor very sensitive personal and health data. The devices constantly store data that users unconsciously create while going about their day. Wearables also transmit that data to the manufacturer and other entities for analysis and to share the information with the user so they can track their health and fitness over time.[28] Without a doubt, this data can be used in a court of law.

[14]     The information wearable fitness and health devices collect can be highly relevant in determining, for example, where an individual was at a particular time and whether they have been “disabled” or injured as a result of a particular accident. A personal injury lawyer might be interested in the data collected from their client’s wearable fitness device. For instance, the data obtained from a Fitbit device[29] has been used as evidence of an individual’s diminished physical activity resulting from a work-related injury in a Canadian personal injury case.[30] The plaintiff used her Fitbit data to show that her post-injury activity levels were lower than the baseline for someone of the same age and profession to prove she deserved compensation for the injury.[31] With the help of a startup analytic company that aggregates Fitbit data and prepares analytical reports, her lawyers contrasted her personal data with the general population’s health and wellness data (from other Fitbit devices) to make their case.[32]
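
A stripped-down version of that comparison might look like the sketch below: the claimant’s post-injury step counts are averaged and expressed against a population baseline for someone of her age and profession. The numbers and the baseline are invented, and real analytics firms use far richer data and proprietary models.

```python
# Invented post-injury daily step counts pulled from a wearable device.
post_injury_steps = [3100, 2800, 3500, 2600, 3300, 2900, 3000]

# Invented population baseline for a person of the same age and profession.
POPULATION_BASELINE_STEPS = 8000

def activity_ratio(daily_steps, baseline):
    """Express average daily activity as a fraction of the population baseline."""
    average = sum(daily_steps) / len(daily_steps)
    return average / baseline

ratio = activity_ratio(post_injury_steps, POPULATION_BASELINE_STEPS)
print(f"Post-injury activity is {ratio:.0%} of the population baseline")
# A headline figure like this (here roughly 38%) is the kind of number such
# an analytical report might offer in support of a diminished-activity claim.
```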

[15]     Prosecutors and defense counsel seeking incriminating or exculpatory evidence can also use wearable device data. In a case alleging rape in Pennsylvania, the Fitbit data contradicted the statements of the alleged victim by showing that at the time of the crime, she was awake and walking around, even though she claimed she was attacked while asleep.[33] She now faces misdemeanor charges because the Fitbit data contradicted her story.[34]

[16]     Some wearables, like Google Glass, transmit location information, take photos and videos, and perform web searches. Imagine if a person who witnessed a crime while wearing this device took pictures of the perpetrator and the scene after the crime occurred.[35] Unlike fixed surveillance technology, humans tend to look at whatever is interesting or important, so technology like Google Glass might help them record valuable eyewitness evidence. The device may contain evidence like photos and geolocation information, along with time stamps, that police may use to investigate and prosecute crimes and civil litigants may use to pursue their cases.

[17]     However, there are downsides to a person’s voluntary collection of sensitive health information using a wearable device. Insurers and employers seeking to deny injury and disability claims can just as easily use wearable devices to support their own litigation claims and positions. It is generally seen as illegal for employers and insurers to force people to use the wearable devices.[36] But if individuals decide to collect this information on their own, device manufacturers or companies that store or report wearable device data might receive a subpoena for it, assuming the consumers don’t have it.

[18]     The fact that wearable device data may have evidentiary value should come as no surprise, given the fact that evidence from other self-tracking devices has already been used in court. Courts already use data from GPS devices and biking apps in cases involving bike accidents.[37] Police routinely use surveillance technology like Automatic License Plate Readers (ALPR) mounted on police cars, or on objects like road signs and bridges, to photograph thousands of plates per minute and track motorist movements.[38] Private companies also collect license plate photos and geotagged images and sell that data to law enforcement, insurers, and financial institutions.[39] They consider this analogous to taking photographs in public and disseminating the information, an activity protected by the First Amendment.[40] This is one part of a larger trend toward surveillance of private citizens’ activities. While this type of surveillance usually occurs without consent, wearable tracking is voluntary.

[19]     One issue raised by wearable evidence involves the reliability of the data and the analyses performed on it. The software that analyzes wearable data interprets the wearer’s daily activities and compares that data to predetermined baselines and standards set by the manufacturer. For example, Fitbit monitors sleep patterns, decides how many hours a user sleeps, and determines the quality and efficiency of that sleep.[41] The wearer is compared to the “average” sleeper (as determined by the manufacturer’s algorithm).[42] That information might be useful for an employer defending itself against a worker’s compensation claim, particularly if the sleep analysis reveals that the worker was considered “sleep deprived” by the data analysis at the time of the accident. So regardless of her personal optimal sleep duration or the outside forces that might have impacted her sleep the night before the accident occurred, she would be categorized and measured against a population baseline.
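
To illustrate how such a categorization works, the toy sketch below labels a night’s sleep against a single fixed baseline, regardless of what is normal for the particular individual. The seven-hour figure, the thresholds, and the labels are all invented; actual devices rely on proprietary algorithms and richer signals.

```python
# Invented manufacturer baseline: the "average" sleeper gets seven hours.
BASELINE_HOURS = 7.0

def classify_sleep(hours_slept, baseline=BASELINE_HOURS):
    """Label one night's sleep against a fixed population baseline."""
    if hours_slept < baseline * 0.75:
        return "sleep deprived"
    if hours_slept < baseline:
        return "below average"
    return "adequate"

# A worker who feels fully rested on five hours is still flagged,
# because the algorithm measures her against the population, not herself.
print(classify_sleep(5.0))  # -> "sleep deprived"
```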

[20]     Other wearable devices collect different data, function differently, and use different algorithms and standards to analyze data and report trends and health information in comparison to the general population.[43] All of this means that before wearable evidence is used in a case, you need to understand what it means and the limitations inherent in the analysis of that data. This information should be clearly explained to the fact finder by someone who knows the IoT device that collected the data and the analytic method or methods it uses to interpret that data. Perhaps the IoT revolution will give rise to a whole new class of “experts” who interpret wearables data and the analytics engines in a courtroom setting.

C. Connected Cars

[21]     Another category of IoT technology relates to connected transportation. Today, many cars have sophisticated software that connects the user to many remotely managed features, including real-time navigation, mapped points-of-interest, dash-based Internet search, streaming music, and mobile device app connectivity.[44] IoT implicates a wide variety of technologies involved with running and monitoring connected cars, including connected control systems, Event Data Recorders (EDRs), and other vehicle telematics.[45] Vehicle control software may use proximity sensors to identify collision risks and automatically engage the brakes, survey blind spots and report objects, and park a vehicle without driver assistance. Automakers are turning vehicles into smartphones using connection technology that controls the entertainment and navigation systems, enables phone calls, and provides a Wi-Fi hotspot. Further, a number of well-known tech companies are currently testing driverless cars and intend to offer self-driving cars in the near future.[46] These cars will be connected to the Internet, and they will transmit all kinds of data relating to the vehicle and its passengers’ activities.

[22]     Particularly in light of the Volkswagen emissions scandal,[47] the connected control systems on vehicles are of great interest to the public and regulatory bodies. Additionally, an insurance carrier might seek records reflecting the information an auto manufacturer collects through a connection with an in-dash entertainment system and the data relating to car speed and braking that resides in the vehicle control system. Was the driver checking her email while driving 70 miles an hour before she rear-ended another car? And a class action lawyer might find the data housed on EDRs useful in a class action lawsuit relating to certain safety issues involving the physical components of vehicles or the software that runs them.

[23]     Some vehicles have safety features that include automated calls in case of emergencies, and in at least one reported incident, a hit-and-run accident was foiled when the fleeing driver’s car called the police after impact.[48] The car had synced to the driver’s phone using Bluetooth, and because the emergency call feature was enabled, it gave police the vehicle’s GPS location and opened the line so the driver could talk with the police.[49] When the call connected, the owner told the police that her car had not been in an accident, but the dents in the front of her car and her airbags told a different story when the police showed up at her house later.[50]

[24]     At least one rental car agency is already putting cameras in navigational devices installed in its fleet of cars, and the user cannot disable the camera.[51] While the agency reports that these cameras are not currently active, it is clearly moving toward the day when customers (and the entire interior of a car) will be visible to its representatives if a service call is made using the navigational device.[52]

IV. e-Discovery of IoT Information

[25]     Lawyers and clients should prepare for IoT-related e-discovery issues. IoT objects will present many challenges in the e-discovery context. There are limitations on wearable devices and other IoT objects and the information they collect; however, the technology is becoming more sophisticated, accessible, and shareable every day. And when information is shared among multiple objects—a watch, a smartphone, and a cloud computing system—the preservation issues are complex. Also, some IoT data is ephemeral and never really stored for future use or access. The Federal Rules of Civil Procedure provide some flexible guidance for dealing with this technical revolution, and counsel against “a limiting or precise definition of electronically stored information.”[53] Yet companies that store data from IoT devices will need to develop processes for preserving, collecting, and producing it when the duty arises—whether it’s the consumer’s duty or their own.

[26]     The legal regimes that govern the capture, processing, use, and ownership of object data are important when determining whether we—or our clients—have a duty to protect data generated from IoT activities (keep it secure and confidential) or preserve and produce it in a litigation. Often, consumers will expect that their wearable device data is “off limits,” and they are surprised to learn that it can be used in certain types of cases. The sooner litigators identify the important IoT data clients and their customers generate and the objects they interact with every day, the better off everyone will be when evaluating the legal risks and obligations to secure and produce that information.

[27]     Additionally, as IoT finds its way into the courtroom, judges will be asked to analyze the complex possession, custody, and control issues encountered in the IoT context. These questions may involve an analysis of the relative cost and burden associated with owner-focused or manufacturer-focused production options. For example, if an owner must jailbreak her device and hire an expensive expert to collect data off her wearable device, but the manufacturer can export her data with relative ease, courts should consider such practical realities when deciding their relative obligations. Moreover, access controls, privacy restrictions, and contractual obligations play a role in determining the appropriate process for engaging in e-discovery of IoT data.

[28]     One of the practical problems relating to collection of IoT information is that device manufacturers each collect data in their own way, and the analytic platforms that collect and aggregate IoT data do the same thing. Raw data residing on IoT objects may not be preserved or collected without undertaking significant efforts at a significant cost. The manufacturers don’t build these objects with the purpose of making it easy to collect information from them directly. This makes it particularly difficult to develop standard processes for preserving, collecting, reviewing, and producing information from a wide variety of IoT objects using their APIs or built-in data reporting and download features. It also makes it hard to aggregate data from different devices and standardize it to obtain big data metrics using data collected from all wearable devices of a particular class. Given these issues, the cost associated with using this type of data could be prohibitive relative to the value of a case and the damages at stake. This is a prime area in which companies and e-discovery vendors can innovate and create a strong market for flexible services and solutions involving IoT device data.
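
To picture the standardization problem, consider how two manufacturers might export the same kind of daily summary with different field names, types, and timestamp formats; any cross-device review first requires mapping them into one schema. The records and field names below are invented for illustration.

```python
from datetime import datetime, timezone

# Invented exports: two vendors describing the same kind of daily summary.
vendor_a_record = {"ts": 1458691200, "steps": 5400, "hr_avg": 72}
vendor_b_record = {"recordedAt": "2016-03-23T00:00:00Z", "stepCount": "5400", "heartRate": 72.0}

def normalize_a(record):
    """Map vendor A's export into a common review schema."""
    return {
        "timestamp": datetime.fromtimestamp(record["ts"], tz=timezone.utc),
        "steps": int(record["steps"]),
        "heart_rate": float(record["hr_avg"]),
    }

def normalize_b(record):
    """Map vendor B's export into the same schema."""
    recorded = datetime.strptime(record["recordedAt"], "%Y-%m-%dT%H:%M:%SZ")
    return {
        "timestamp": recorded.replace(tzinfo=timezone.utc),
        "steps": int(record["stepCount"]),
        "heart_rate": float(record["heartRate"]),
    }

combined = [normalize_a(vendor_a_record), normalize_b(vendor_b_record)]
print(combined)
# Only after this kind of mapping can records from different devices be
# reviewed, produced, or aggregated on a consistent basis.
```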

[29]     Undoubtedly, more lawsuits involving IoT data are coming, as more lawyers and litigants realize that the data is discoverable, relevant, and useful as evidence that can support their case. Litigators and clients should understand how IoT objects work, what information they collect, where it is stored, how long it is stored, and who is obliged to keep it safe. Only after we understand how the system works, can we make strategic decisions about legal risks, e-discovery options and obligations, and appropriate use of IoT data in court. It will be interesting to see how the market responds to the challenges that will arise when parties start engaging in IoT discovery.

V. IoT Object As Witness 

[30]     As wearables and other IoT objects find their way into the courtroom, litigators must figure out how we will use IoT information as “witness” evidence. Did we ever imagine that the objects gathering information about us could be used against us? Will judges and juries treat it like forensic evidence, and give it the same weight and credibility as scientific analysis or the results reported by an expert witness? Not unlike scientific researchers or forensic experts, wearable technologies collect data, interpret it, and reflect it in reports that provide information about the user activity and experience.

[31]     It will be particularly interesting to see what happens when a witness’s sensory experiences (sight, sound, taste, etc.) clash with the “experience” reported by their wearable device and how the fact finder reconciles these competing stories. For example, if a biker testifies that they were traveling down a hill towards an intersection at about 15 miles per hour, but their wearable device or Strava[54] app reports the speed down the slope at 25 miles per hour (due to complicated three-dimensional GPS readings and reporting algorithms), which “witness” will the jury credit more? Both systems for reporting experiences are fallible and fraught with errors. But if litigators prioritize IoT data-driven evidence over eyewitness statements or expert analysis, then we must ensure that the algorithms used to analyze IoT data are understood and their imperfections are disclosed. As one commentator noted, if we think of devices as partial witnesses, we must understand that they carry biases and have a worldview, based on their relationship with their environment.[55]
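
Part of the gap in that hypothetical can come from nothing more exotic than how speed is derived from noisy GPS fixes. The sketch below estimates speed from two consecutive latitude/longitude points using the haversine formula; a few meters of positional error, or a different smoothing window, can move the reported figure by several miles per hour. The coordinates are invented.

```python
import math

def haversine_meters(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    earth_radius = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    d_phi = math.radians(lat2 - lat1)
    d_lambda = math.radians(lon2 - lon1)
    a = math.sin(d_phi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(d_lambda / 2) ** 2
    return 2 * earth_radius * math.asin(math.sqrt(a))

# Two invented fixes logged one second apart on a descent.
fix_a = (37.77490, -122.41940)
fix_b = (37.77497, -122.41932)

seconds_between = 1.0
speed_mph = haversine_meters(*fix_a, *fix_b) / seconds_between * 2.23694
print(f"Device-reported speed: {speed_mph:.1f} mph")
# A rider's honest estimate of 15 mph and a device figure in the 20s can
# both be "true" given GPS noise and the algorithm doing the interpreting.
```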

[32]     There is a significant risk that IoT object information, for instance, the Fitbit data and its sleep analysis,[56] would carry more evidentiary weight than the owner’s own experience and view of her sleep patterns or alertness at the time an injury occurred. As with forensics results, there is a significant risk that judges and jurors will conclude that device data doesn’t lie or have an imperfect memory. Yet there is an interpretive activity lurking behind the scenes. When wearable object data is collected and interpreted by analytics companies using proprietary algorithms, counsel, judges, and juries will need to understand what’s happening under the hood, whether the results reported are reliable, and what evidentiary weight they should be given. The interpretive tools used to report IoT data are often highly subjective or an imperfect fit for a number of users because of their crude analysis methods or the individual’s health status and biology. This is but one area where possibilities are far ahead of the law on witness-style testimony from things connected to the Internet.

[33]     Only time will tell whether this type of IoT information is seen as objective and unbiased evidence in the courtroom. If we can’t demonstrate that IoT evidence meets the requirements for introduction of scientific or forensic evidence, then it may be excluded.[57] If introduced, it may be given too much weight in light of its significant limitations. A balanced approach is needed.

[34]     Courts will also have to figure out how the Fifth Amendment protects the right against self-incrimination when the incriminating evidence involves user data created by an IoT object. And the Sixth Amendment provides the Constitutional right to confront a witness that will provide evidence against the accused in a criminal prosecution.[58] How would a witness confront her wearable device or the companies that think they know the best way to interpret the data it collects? This raises fundamental philosophical questions regarding the witness who must be available for “confrontation.” Is it you, your device, the manufacturer, the service provider that collects and analyzes your data, or the company that provides the algorithms used to interpret it? The case law is going to be messy and inconsistent as courts start considering the obstacles presented by use of IoT evidence in the courtroom and sorting the Constitutional issues out.

[35]     Additionally, as more IoT objects are used in litigations, people’s relationships with their wearables are likely to change. How will they react after learning that the connected IoT objects they interact with can be used as an involuntary informant? Perhaps the day is coming when eyewitness testimony will become almost irrelevant and will be replaced by the information our objects provide about our location, health, conscious state, and activities at any given time. But while IoT can reveal truths, those truths must be understood in context, in all their fallible or limited glory.

VI. Litigating in an IoT World

[36]     Some have called IoT a third major revolution—one built on the industrial revolution and the Internet revolution.[59] Lawyers and their clients are becoming more reliant on IoT to manage, monitor, and control their objects, to interact with one another, and to work on the substantive aspects of their jobs. Regardless of the source, the information that IoT objects collect and share provides litigators with rich new evidence stores that should be explored for interesting information that impacts their cases.

[37]     A tech-savvy lawyer knows how to get the right evidence in the right format from her client or opponent. The fact that IoT raises a number of novel and interesting legal issues and practical complexities means that tech-savvy lawyers, with a good grasp of the basic issues, will be well positioned to provide thoughtful and constructive advice. This guidebook provides some basic information regarding IoT technologies, legal issues, and practical concerns that should be considered. But it needs to be applied to the real world, for each client and case, and in the context of each connected collection of objects, companies, and people. The IoT movement is your opportunity to continue your self-education journey, and learn more about the implications of IoT on lawyering in the Information Age.

 

 

* Antigone Peyton is the founder and CEO of Cloudigy Law PLLC, an intellectual property and technology law firm located in McLean, Virginia. Antigone is an unabashed technophile focused on intellectual property litigation and cutting-edge legal and emerging technology issues, particularly those involving social media, patents, trademarks, copyrights, and trade secrets. Antigone is a frequent speaker and writer covering technological competence, IP, social media, and e-Discovery issues. You can find her on Twitter (@antigonepeyton) or on SnapChat (assuming you know what it is and how to use it).

[1] See Embedded Intelligence – Connecting Billions of Smart Sensors Into the Internet of Things, ARM Holdings, http://ir.arm.com/phoenix.zhtml?c=197211&p=irol-embeddedintelligence, archived at https://perma.cc/3HWX-QBWW (last visited Mar. 23, 2016).

[2] The “things” or “objects” in the IoT generally do not include desktop or laptop computers, smartphones, and tablets.

[3] See Dave Evans, Cisco Internet Bus. Solutions Grp., The Internet of Things: How the Next Evolution of the Internet Is Changing Everything 3 (2011), http://www.cisco.com/c/dam/en_us/about/ac79/docs/innov/IoT_IBSG_0411FINAL.pdf, archived at https://perma.cc/HDF9-NM6T.

[4] See Accenture, The Internet of Things: The Future of Consumer Adoption (2014), https://www.accenture.com/t20150624T211456__w__/us-en/_acnmedia/Accenture/Conversion-Assets/DotCom/Documents/Global/PDF/Technology_9/Accenture-Internet-Things.pdf, archived at https://perma.cc/JKG7-UT4P.

[5] See Evans, supra note 3, at 3. IDC’s Digital Universe study reports that by 2020, there will be 200 to 300 billion connected IoT objects. See The Digital Universe of Opportunities: Rich Data and the Increasing Value of the Internet of Things, EMC2 (Apr. 2014), http://www.emc.com/leadership/digital-universe/2014iview/internet-of-things.htm, archived at https://perma.cc/86RJ-786G; see also Data Set to Grow 10-fold By 2020 As Internet of Things Takes Off, ComputerWeekly.com (Apr. 9, 2014, 1:00 PM), http://www.computerweekly.com/news/2240217788/Data-set-to-grow-10-fold-by-2020-as-internet-of-things-takes-off, archived at https://perma.cc/KGW9-K7DF.

[6] See Internet of Things Community, LinkedIn, https://www.linkedin.com/groups/4662022/profile, archived at https://perma.cc/2CPN-EAXX (last visited Mar. 23, 2016).

[7] Current IoT products are communicating through a variety of communication platforms and standards, including new home automation standards produced by Google (Brillo/Weave) and Apple (HomeKit) that connect each company’s devices in a proprietary communication network.

[8] See Michael Gowan, LG Smart Fridge Spots Spoiled Food, Orders Groceries, NBCNews.com, http://www.nbcnews.com/id/50364798/ns/technology_and_science-tech_and_gadgets/t/lg-smart-fridge-spots-spoiled-food-orders-groceries/#.VvNWzmQrL6c, archived at https://perma.cc/6JXM-ZUY7 (last updated Jan. 4, 2013, 12:46 PM) (explaining how LG’s smart refrigerator connects to the Internet, allowing users to remotely access the refrigerator content list, keep track of their grocery list, and identify out-of-date products stored in it).

[9] See Bernard Marr, Google’s Nest: Big Data And The Internet of Things In The Connected Home, Forbes (Aug. 5, 2015, 10:52 AM), http://www.forbes.com/sites/bernardmarr/2015/08/05/googles-nest-big-data-and-the-internet-of-things-in-the-connected-home/#5a41706b58a1, archived at https://perma.cc/F2SQ-F867 (discussing the Nest thermostat and the usage data uploaded from individual devices via the Internet, which allows Nest to understand energy usage trends across community microcosms and around the world).

[10] See, e.g., A Smart Home Solution That Lives in the Cloud, Comcast, http://corporate.comcast.com/news-information/news-feed/the-future-of-the-home-bringing-the-power-of-the-cloud-to-home-management, archived at https://perma.cc/CR59-UF3J (last visited Mar. 23, 2016) (describing the Xfinity Home technology, which allows users to monitor and control security cameras, smoke detectors, thermostats, lights, and motion sensors through web browsers or Internet connected devices); see also Marr, supra note 9 (discussing how Google is building infrastructure for smart homes of the future that are fully networked by its own devices).

[11] See Daniel H. Wilson, Smart House: Your So-Called Sci-Fi Life, Popular Mechanics (Sept. 30, 2009), http://www.popularmechanics.com/technology/gadgets/a4109/4216434/, archived at https://perma.cc/R3LT-HH69.

[12] See Angela Moscaritolo, Your Printer Can Now Order Ink for You, Thanks to Amazon, PCMag.com (Jan. 19, 2016, 11:35 AM), http://www.pcmag.com/article2/0,2817,2498102,00.asp, archived at https://perma.cc/9HRR-R6MT.

[13] See Brendan O’Brien, The Cloud-Connected Car Drives IoT Monetization, TechCrunch (Oct. 20, 2015), http://techcrunch.com/2015/10/20/the-cloud-connected-car-drives-iot-monetization/, archived at https://perma.cc/7NJJ-VV8K.

[14] See James Stables, Best Fitness Trackers 2016: Jawbone, Misfit, Fitbit, Garmin and More, Wareable (Mar. 7, 2016), http://www.wareable.com/fitness-trackers/the-best-fitness-tracker, archived at https://perma.cc/HF2M-BJU9.

[15] See Elisha Hartwig, 5 Apps to Map Your Bike Route, Mashable (Sept. 11, 2013), http://mashable.com/2013/09/11/bike-route-apps/#nIAaDJ1kfEqZ, archived at https://perma.cc/E8D9-HTHM.

[16] See Matt Swider, Google Glass Review, TechRadar (Feb. 20, 2015), http://www.techradar.com/us/reviews/gadgets/google-glass-1152283/review, archived at https://perma.cc/6NW4-FLK3.

[17] See Lee Moran, Mattel Unveils Talking Hello Barbie Doll, Which Will Have Conversations with Kids, N.Y. Daily News, http://www.nydailynews.com/life-style/mattel-unveils-barbie-talk-kids-article-1.2119732, archived at https://perma.cc/QLJ9-PHRQ (last updated Feb. 18, 2015, 8:18 AM).

[18] See James Vlahos, Barbie Wants to Get to Know Your Child, N.Y. Times Mag. (Sept. 16, 2015), http://www.nytimes.com/2015/09/20/magazine/barbie-wants-to-get-to-know-your-child.html, archived at https://perma.cc/BPV7-HDTD.

[19] See Ashlee Kieler, Mattel Unveils Hello Barbie, a Doll That Can Hold a Conversation, Consumerist (Feb. 17, 2015), https://consumerist.com/2015/02/17/mattel-unveils-hello-barbie-a-doll-that-can-hold-a-conversation/, archived at https://perma.cc/24CB-PK44.

[20] See id.

[21] Mattel and ToyTalk responded to these concerns by confirming that the recorded conversations will not be used to advertise or market products to children, further noting that parental consent is required to set up a Hello Barbie account. Also, interestingly, parents can listen to their child’s recorded conversations and delete all recorded conversations. Additionally, ToyTalk states that it will only use the recordings to improve its speech recognition technology. See Privacy Policy, ToyTalk, https://www.toytalk.com/legal/privacy/, archived at https://perma.cc/Z8K8-2DRS (last updated Jan. 11, 2016). Mattel does seem to obtain data that it can use to market other products, and it does so with a parent’s consent when they use Mattel’s websites and apps. See Mattel Online Privacy Statement and Children’s Privacy Statement, Mattel, http://corporate.mattel.com/privacy-statement-shared.aspx, archived at https://perma.cc/QSV6-SXAV (last updated Apr. 9, 2014).

[22] See Press Release, Fed. Trade Comm., FTC Halts Computer Spying (Sept. 25, 2012), https://www.ftc.gov/news-events/press-releases/2012/09/ftc-halts-computer-spying, archived at https://perma.cc/R5XS-6DPR; see also David Kravets, Rent-to-Own Laptops Secretly Photographed Users Having Sex, FTC Says, Wired (Sept. 25, 2012, 6:11 PM), http://www.wired.com/2012/09/laptop-rental-spyware-scandal/, archived at https://perma.cc/NQV4-6HQP.

[23] See Kravets, supra note 22.

[24] See Complaint at 3–4, FTC v. Designerware, LLC., Kelly, & Koller (2012), https://www.ftc.gov/sites/default/files/documents/cases/2012/09/120925designerwarecmpt.pdf, archived at https://perma.cc/96PJ-YVVP.

[25] See id.; see also Kravets, supra note 22.

[26] See Accenture, supra note 4, at 3–4 (noting some reports indicate that over 28% of consumers will own wearable IoT technology by the end of 2016).

[27] See, e.g., Fitbit App, Fitbit, https://www.fitbit.com/app, archived at https://perma.cc/5WER-PS9L (last visited Mar. 23, 2016).

[28] See Murray Grigo-McMahon, My Data, Your Data, Our Data, Qlik (July 6, 2015), http://global.qlik.com/us/blog/posts/murray-grigo-mcmahon/my-data-your-data-our-data?SourceID1=SocialChorus&__2hqwt_=2hqwt, archived at https://perma.cc/V479-N3CU.

[29] Fitbit is an extremely popular wearable fitness tracker.

[30] See Kate Crawford, When Fitbit is the Expert Witness, The Atlantic (Nov. 19, 2014), http://www.theatlantic.com/technology/archive/2014/11/when-fitbit-is-the-expert-witness/382936/, archived at https://perma.cc/AW5G-5NY2.

[31] See id.

[32] See id.

[33] See Brett Hambright, Woman Staged ‘Rape’ Scene with Knife, Vodka, Called 9-1-1, Police Say, Lancaster Online (June 19, 2015, 2:57 PM), http://lancasteronline.com/news/local/woman-staged-rape-scene-with-knife-vodka-called–/article_9295bdbe-167c-11e5-b6eb-07d1288cc937.html, archived at https://perma.cc/YY5M-QEXF.

[34] See Kashmir Hill, Fitbit Data Just Undermined a Woman’s Rape Claim, Fusion (June 29, 2015), http://fusion.net/story/158292/fitbit-data-just-undermined-a-womans-rape-claim/, archived at https://perma.cc/2J6W-BYAT.

[35] See Kashmir Hill, Google Glass Will Be Incredible for the Courtroom, Forbes (Mar. 15, 2013, 5:02 PM), http://www.forbes.com/sites/kashmirhill/2013/03/15/google-glass-will-be-incredible-for-the-courtroom/#604082cd36eb, archived at https://perma.cc/2QCU-NAYZ.

[36] See Adam Satariano, Wear This Device So the Boss Knows You’re Losing Weight, Bloomberg (Aug. 21, 2014, 1:26 PM), http://www.bloomberg.com/news/articles/2014-08-21/wear-this-device-so-the-boss-knows-you-re-losing-weight, archived at https://perma.cc/GS3Y-KXF6.

[37] See Patrick Brady, Prosecution Rest in LA Road Rage Case. Defense Will Call Witnesses Monday, VeloNews (last updated Nov. 3, 2009, 7:00 PM), http://velonews.competitor.com/2009/10/news/prosecution-rest-in-la-road-rage-case-defense-will-call-witnesses-monday_99537, archived at https://perma.cc/87G9-KLSP.

[38] See Conor Friedersdorf, An Unprecedented Threat to Privacy, The Atlantic (Jan. 27, 2016), http://www.theatlantic.com/politics/archive/2016/01/vigilant-solutions-surveillance/427047/, archived at https://perma.cc/NL4V-AJKA (discussing how one private company has taken approximately 2.2 billion license-plate photos to date, and each month it captures and permanently stores nearly 80 million more geotagged images).

[39] See id.

[40] See David Sirota, Companies Test Their First Amendment Right to Track you, Or. Live, http://www.oregonlive.com/opinion/index.ssf/2014/03/companies_test_their_first_ame.html, archived at https://perma.cc/VZ8R-K7QS (last updated Mar. 8, 2014, 7:10 AM).

[41] See What Should I Know About Sleep Tracking?, Fitbit, https://help.fitbit.com/articles/en_US/Help_article/Sleep-tracking-FAQs, archived at https://perma.cc/KB2D-MZMW (last updated Mar. 7, 2016).

[42] See id.

[43] The wearable fitness device market includes Nike Fuelband, Fitbit, Withings Pulse, and Jawbone Up, among others. A number of companies have also developed fitness apps that interact with these wearable devices and collect the user data they create. Fitbit lists over 30 apps that are compatible with the Fitbit device. See Compatible Apps, Fitbit, https://www.fitbit.com/partnership, archived at https://perma.cc/P2L9-TNE4 (last visited Mar. 25, 2016).

[44] See, e.g., Cisco Connected Transportation, http://www.cisco.com/c/en/us/solutions/industries/transportation.html, archived at https://perma.cc/548E-TPXF (last visited March 25, 2016).

[45] An EDR is “a device or function in a vehicle that records the vehicle’s dynamic time-series data during the time period just prior to a crash event (e.g., vehicle speed vs. time) or during a crash event . . . intended for retrieval after the crash event.” 49 C.F.R. § 563.5 (2015). Telematics refers to data collection, transmission, and processing technologies for use in vehicles.

[46] See Alice Truong, Tesla Just Transformed the Model S into a Nearly Driverless Car, Quartz (Oct. 14, 2015), http://qz.com/524400/tesla-just-transformed-the-model-s-into-a-nearly-driverless-car/, archived at https://perma.cc/J439-T5JZ; Cadie Thompson, There’s One Big Difference Between Google and Tesla’s Self-driving Car Technology, Tech Insider (Dec. 5, 2015, 12:00 PM), http://www.techinsider.io/difference-between-google-and-tesla-driverless-cars-2015-12, archived at https://perma.cc/RED9-CQTZ; Feann Torr, Next-gen Audi A8 Drives Better Than You, Motoring (Oct. 22, 2014), http://www.motoring.com.au/next-gen-audi-a8-drives-better-than-you-46963/, archived at https://perma.cc/UFL4-76FG; Tom Risen, Report: Uber, Lyft Poised to Win on Driverless Cars, U.S. News & World Rep. (Nov. 13, 2015, 4:05 PM), http://www.usnews.com/news/articles/2015/11/13/report-uber-lyft-poised-to-win-on-driverless-cars, archived at https://perma.cc/3HPV-8RJ9.

[47] See Russell Hotten, Volkswagen: The Scandal Explained, BBC News (Dec. 10, 2015), http://www.bbc.com/news/business-34324772, archived at https://perma.cc/8YKM-6W5W.

[48] See Kashmir Hill, Florida Woman’s Car Calls Police After She Flees the Scene of an Accident, Fusion (Dec. 7 2015, 11:46 AM), http://fusion.net/story/242193/womans-car-calls-police/, archived at https://perma.cc/JDE6-SDST.

[49] See id.

[50] See id.

[51] See Kashmir Hill, Hertz Puts Cameras in Its Rental Cars, Says It Has No Plans to Use Them, Fusion (Mar. 13, 2015, 1:46 PM), http://fusion.net/story/61741/hertz-cameras-in-rental-cars/, archived at https://perma.cc/85TF-DDUM.

[52] See id.

[53] Fed. R. Civ. P. 34 advisory committee’s note to 2006 amendments.

[54] Strava is a running and cycling GPS tracker. See generally Strava, https://www.strava.com/about, archived at https://perma.cc/9F99-EWPY (last visited Mar. 21, 2016).

[55] See Crawford, supra note 30.

[56] See What Should I Know About Sleep Tracking?, supra note 41.

[57] See Fed. R. Evid. 702.

[58] See U.S. Const. amend. VI.

[59] See Harish Nivas, How Internet of Things is the Next Big Industrial Revolution, IOTWorm (Jan. 23, 2016), http://iotworm.com/internet-of-things-next-big-industrial-revolution/, archived at https://perma.cc/JD9Y-T64S.
