Richmond Journal of Law and Technology

The first exclusively online law review.

CRISPR/Cas-9 Patent War Comes to a Close, For Now

By: Sarah Alberstein

UC Berkeley and MIT’s Broad Institute have been battling over the patent to the coveted CRISPR/Cas-9 technology since 2012.[1] CRISPR/Cas-9 technology can be used to “silence mutated organismal DNA, replace it with correct sequences, or both in conjunction…[and] to sustain and lengthen the lifespan of…bacterial cultures by protecting them from viral attack…minimiz[ing] hassle and time spent re-growing cultures following a viral attack while maximizing efficiency for the researcher.”[2] What’s more, unlike previous gene-editing technologies, CRISPR/Cas-9 “makes it possible to observe specific effects of a particular gene and thus allows for more precise data collection and observation” while minimizing downstream mutation.[3] The implications of CRISPR/Cas-9 are immense. For example, CRISPR/Cas-9 has the potential to cure previously incurable diseases, like Alzheimer’s and HIV, remove malaria from mosquitoes, develop new drugs, alter livestock and agricultural crops, and develop new cancer treatments.[4] It is no surprise that there would be controversy over who owns and controls the patent for this powerful technology.

As of April 2018, the U.S. Patent and Trademark Office “had issued 60 CRISPR-related patents to nearly 20 different organizations.”[5] However, there is one patent in particular that has been hotly contested: the patent that covers the use of CRISPR/Cas-9 to edit DNA in mammals.[6] In May 2012, UC Berkeley filed a patent application for the use of CRISPR/Cas-9 to “edit genes in various types of cells.”[7] In December 2012, the Broad Institute and MIT filed a patent application for the use of CRISPR/Cas-9 to “modify DNA in eukaryotic cells.”[8] In April 2014, the USPTO granted the Broad Institute its December 2012 patent, which UC Berkeley subsequently contested as being too similar to UC Berkeley’s May 2012 application.[9] In February 2017, the USPTO ruled in favor of the Broad Institute, stating that the Broad Institute’s December 2012 patent was not an obvious extension of UC Berkeley’s May 2012 application.[10] In June 2018, the USPTO granted UC Berkeley a patent for the use of CRISPR/Cas-9 to edit single-stranded RNA, and a patent for the use of CRISPR/Cas-9 to edit genome regions 10 to 15 nucleotides long.[11]

Finally, in September 2018, the U.S. Court of Appeals for the Federal Circuit upheld the USPTO’s ruling in favor of the Broad Institute’s December 2012 patent for the use of CRISPR/Cas-9 in editing eukaryotic cells.[12] As a result, the Broad Institute has the rights to “commercialize products developed by using the CRISPR/Cas-9 system to make targeted changes to the genomes of eukaryotes – a group of organisms that includes plants and animals…cover[ing] a wide swath of potential CRISPR/Cas-9 products.”[13] While the results of this case seem to indicate that the patent war over CRISPR/Cas-9 technologies is coming to a close, there is still some movement within the industry. UC Berkeley could appeal the Federal Circuit’s decision to the U.S. Supreme Court, which, given the zeal each institution has exhibited during this patent dispute, is not outside the realm of possibility.[14] Moreover, the CRISPR/Cas-9 technology landscape is ever-evolving. Already, researchers “have discovered new enzymes to replace Cas-9, and modified the CRISPR/Cas-9 system to manipulate the genome in many ways…”[15] It seems, then, that there are many technological advancements and patent disputes to come.

 

[1] Jessica Kim Cohen, UC Berkeley and Broad Institute’s Legal Dispute Over CRISPR Ownership: A Timeline of Events, Becker’s Health IT & CIO Report (June 21, 2018), https://www.beckershospitalreview.com/data-analytics-precision-medicine/uc-berkeley-and-broad-institute-s-legal-dispute-over-crispr-ownership-a-timeline-of-events.html.

[2] Sarah Alberstein, CRISPR/Cas-9: The Ethics of Implementation, Grounds: Virginia Journal of Bioethics (Aug. 3, 2016), https://issuu.com/vabioethics/docs/vol._1__iss._1_final.

[3]Id.

[4] Mark Crawford, 8 Ways CRISPR-Cas9 Can Change the World, ASME (May 2017), https://www.asme.org/engineering-topics/articles/bioengineering/8-ways-crisprcas9-can-change-world.

[5] Cohen, supra note 1.

[6]Id.

[7]Id.

[8]Id.

[9]Id.

[10]Id.

[11]Id.

[12] Heidi Ledford, Pivotal CRISPR Patent Battle Won by Broad Institute, Nature (Sept. 10, 2018), https://www.nature.com/articles/d41586-018-06656-y.

[13]Id.

[14]Id.

[15]Id.

Image Source: https://www.publicdomainpictures.net/en/view-image.php?image=42718&picture=dna

Is SOPIPA the Answer to Student Privacy in the Age of Mobile Technology?

By: Zaq Lacy

The debates surrounding the use of technology in the classroom have raged for many years, arguably beginning with the introduction of the blackboard in 1801 and evolving as society has advanced.[1] Regardless of what position you take on whether technology is beneficial, there can be no doubt that it has become pervasive in K-12 classrooms across America, integrating into many facets of the classroom and school that directly interact with students.[2] With the level of sophistication of today’s technology, the rapid expansion of that technology’s use by students, and the sheer amount of information being transmitted, these students’ privacy is at risk in three ways: illegal data collection, susceptibility to criminal activity, and identity theft caused by hacking.[3] Many of these students are under the age of 13 and are supposed to be protected by the Children’s Online Privacy Protection Act (“COPPA”), a federal statute enacted in 1998 that was designed to restrict the collection of personal information from children online.[4]

Unfortunately, a combination of ambiguities and confusion in COPPA’s language[5] and a lack of enforcement by the Federal Trade Commission[6] has resulted in a failure to protect children, particularly those using technology in school.[7] Despite the glaring flaws in COPPA and other current federal laws dealing with student privacy, Congress has made it clear that it will not take steps to remedy this situation, leaving it up to individual states’ legislatures to address the rising concerns.[8] California paved the way with its Student Online Personal Information Protection Act (“SOPIPA”) in 2014,[9] which is regarded as the “most successful and strict piece of privacy legislation” and is the template for a number of other states’ attempts at bolstering protections for students.[10] It was written to fill the gaps left in federal privacy laws and was the first to target “operator[s] of [I]nternet web site[s], online service[s], online application[s], or mobile application[s],” and it applies to any educational technology (“edtech”) company that reaches California K-12 students, regardless of whether the company is based outside of California.[11]

SOPIPA places a number of restrictions on what information edtech companies can collect and what they can do with the information they have collected, including a ban on selling data for commercial purposes.[12] It also imposes affirmative obligations on such companies, requiring that they maintain and enforce appropriate security procedures to prevent “unauthorized access, destruction, use, modification, or disclosure” of student data, and that they delete any such data upon request.[13] These are major steps forward for student privacy, but SOPIPA is still considered an imperfect solution.[14] Among other things, SOPIPA does not appear to adopt the federal definition of de-identified data, which companies may still use for commercial purposes.[15] Additionally, questions over enforcement could prove troublesome, particularly where teacher awareness of SOPIPA standards regarding free edtech products is concerned.[16] Despite this, SOPIPA answers a number of the issues left unaddressed by federal law.

Recognizing the potential of SOPIPA, numerous other states have introduced similar legislation, and fifteen other states adopted variations of this law in 2015, adjusting it to fit their needs.[17] Even though there are still some kinks to work out, it is clear that SOPIPA is paving the way to stronger protections for our students’ data privacy when using technology at school.

 

[1] See Michael Horn, New Research Answers Whether Technology Is Good or Bad for Learning, Forbes.com (Nov. 14, 2017, 8:28 AM), https://www.forbes.com/sites/michaelhorn/2017/11/14/new-research-answers-whether-technology-is-good-or-bad-for-learning/#69fbd08f19d7.

[2] See Zaq Lacy, Is Classroom Technology Making Student Privacy Obsolete?, U. of Rich. J. of L. and Tech.: Blog (Nov. 9, 2018), https://jolt.richmond.edu/2018/11/09/is-classroom-technology-making-student-privacy-obsolete/.

[3] See Alexis Peddy, Note, Dangerous Classroom “App”-titude: Protecting Student Privacy from Third-Party Educational Service Providers, 17 B.Y.U. Educ. & L. J. 125, 128 (2017).

[4] 15 U.S.C. § 6501(1) (2012).

[5] See Peddy, supra note 3, at 130.

[6] Id. at 135.

[7] Id. at 136.

[8] Id. at 142.

[9] Student Online Personal Information Protection Act of 2014, Cal. Bus. & Prof. Code §§ 22584-22585 (Deering 2014).

[10] See Peddy, supra note 3, at 147.

[11] Dylan Peterson, Note, Edtech and Student Privacy: California Law As a Model, 31 Berkeley Tech. L.J. 961, 973 (2016).

[12] See id. at 973-74.

[13] See id. at 975-76.

[14] See id. at 983.

[15] See id. at 992.

[16] See id.

[17] See Tanya Roscorla, More States Pass Laws to Protect Student Data, Govtech.com (Aug. 27, 2015), http://www.govtech.com/education/k-12/What-States-Did-with-Student-Data-Privacy-Legislation-in-2015.html?utm_source=related.

Image Source: http://blog.identityautomation.com/6-things-schools-can-do-to-ensure-student-data-privacy

Spyflying and Spydiving on the Spyhopping Orcas

By: Paxton Rizzo

The Southern Resident Orcas, or Killer Whales as they are more commonly known, are among the most critically endangered marine mammals in the United States.[1] Currently, their population is at its lowest in three decades, with only seventy-four individuals remaining.[2] Since 2005, the Southern Resident Orcas have been on the endangered species list[3] and are protected by the Endangered Species Act.[4]

Three distinct pods of orcas make up the clan referred to as the Southern Resident Orcas: J, K, and L pods. Each pod has its own distinct dialect.[5] These pods fall into a specific category of orca known as Resident Orcas and are differentiated from the other types (Transient and Offshore) in three ways: they do not migrate as much; they have unique dialects among pods and communicate frequently; and they hunt primarily fish.[6] The Southern Resident Orcas’ diet consists mainly of salmon (80%).[7] They spend most of the warmer months hunting salmon in the Puget Sound between Canada and the United States, and in the winter they have been found as far north as Alaska and as far south as Monterey, California.[8] Being tied to a specific area or habitat is an element in the Southern Residents’ classification as endangered.[9] Many factors of the Southern Resident Orcas’ population and environment place them under the Endangered Species Act, such as the pollution of the water and their food source and the depletion of their primary food source by man-made mechanisms such as dams.[10]

Since the orcas were classified as endangered in 2005, conservation efforts, though underway even before then, have increased, and from the beginning technology has been used to learn about and understand the orcas. Until a few years ago, a common tool was the satellite tracker.[11] The tracker would be tagged onto an orca’s dorsal fin, piercing its skin, allowing researchers to track how far the orca traveled in a day, week, or month and where exactly it went.[12] In 2016, researchers were trying to learn where the orcas went in the winter so they would be better able to protect them by expanding the area[13] protected for the orcas under the Endangered Species Act.[14] On a tagging mission, a mistake was made, and several weeks later the tagged whale succumbed to a bacterial infection.[15] After that, researchers felt a need to find better ways to monitor the orcas.[16]

Today, researchers use a variety of devices to monitor and track the orcas, such as passive acoustic monitors, digital acoustic tags, and aerial drones.[17] Unlike the earlier satellite tags, the digital acoustic tags attach by suction cups and track both the movements of the orca and the sounds it makes and hears; three studies are underway that will use this technology to learn about the Southern Residents’ time in their summer habitat.[18] Aerial drones allow researchers to view the orcas from above and take pictures of them.[19] Using this method, researchers have been able to observe how the orcas’ weight fluctuates.[20] Seeing the orcas from above gives researchers a better angle for gauging an orca’s weight than the previous method of looking at them from the side, where their figure is harder to observe.[21] This method of tracking the orcas’ weight has been especially helpful in determining which orcas are pregnant and which may be sick.[22] That gives researchers the opportunity to respond quickly in any attempt they may launch to save an orca.[23] Most notably, this year, when observing orca J50 (affectionately known as Scarlet), researchers noticed that, though she had always been small, her fat stores were depleting quickly.[24] Researchers were able to react by giving her medication and attempting to get her food to eat.[25] They had come up with other creative plans to try to save her when it was determined, after she had not been seen for a while, that she was likely dead.[26]

Currently, the data collected from the technology used to track and monitor the orcas, as well as stool samples,[27] is informing a governor’s task force in Washington State. The task force will soon release recommendations on what changes and long-term solutions need to be made and implemented to try to save the Southern Resident Orcas.[28]

 

[1] See Southern Resident Orcas, Endangered Species Coalition, http://www.endangered.org/campaigns/southern-resident-orcas/ (last visited Nov. 10, 2018).

[2] See Drones Helping Scientists Track Orca Health, king5.com, https://www.king5.com/article/tech/science/environment/drones-helping-scientists-track-orca-health/281-599245989 (last visited Nov. 9, 2018).

[3] See Southern Resident Orcas, supra note 1.

[4] 16 USCS § 1531 (LexisNexis, current through PL 115-269, approved 10/16/18).

[5] See Southern Resident Orcas, supra note 1.

[6] See Charles Q. Choi, New Killer Whale Species Proposed, Live Science (Apr. 26, 2010, 3:18 AM ET), https://www.livescience.com/9893-killer-whale-species-proposed.html.

[7] See Southern Resident Orcas, supra note 1.

[8] FAQ About the Southern Resident Endangered Orcas, The Whale Museum, https://whalemuseum.org/pages/frequently-asked-questions-about-the-southern-resident-endangered-orcas (last visited Nov. 10, 2018).

[9] See 16 USCS § 1533 (current through PL 115-269, approved 10/16/18).

[10] See id.; see also Southern Resident Orcas, supra note 1.

[11] See Craig Welch, Orca Killed by Satellite Tag Leads to Criticism of Science Practices, National Geographic (Oct. 6, 2016), https://news.nationalgeographic.com/2016/10/orca-killed-by-satellite-tag-l59/.

[12] See id.

[13] See id.

[14] See 16 USCS § 1533(b)(2) (current through PL 115-269, approved 10/16/18).

[15] See id.

[16] See id.

[17] See Spotlight on the Southern Resident Killer Whale – An Interview with NOAA Fisheries Biologist Lynne Barre, NOAA Fisheries (Feb. 13, 2018), https://www.fisheries.noaa.gov/video/spotlight-southern-resident-killer-whale-interview-noaa-fisheries-biologist-lynne.

[18] See Using DTAGs to Study Acoustics and Behavior of Southern Resident Killer Whales, Northwest Fisheries Science Center, https://www.nwfsc.noaa.gov/research/divisions/cb/ecosystem/marinemammal/dtags.cfm (last visited Nov. 10, 2018).

[19] See Drones Helping Scientists Track Orca Health, supra note 2.

[20] See id.

[21] See id.

[22] See id.

[23] See id.

[24] See Lynda V. Mapes, Orca J50 Presumed Dead but NOAA Continues Search, The Seattle Times (Sept. 24, 2018, 7:57 PM), https://www.seattletimes.com/seattle-news/environment/orca-j50-declared-dead-after-search-southern-residents-down-to-74-whales/.

[25] See id.

[26] See id.

[27] See Spotlight on the Southern Resident Killer Whale, supra note 17.

[28] See Task Force Unveils ‘Potential Recommendations’ to Save Killer Whales, king5.com, https://www.king5.com/article/news/local/task-force-unveils-potential-recommendations-to-save-killer-whales/281-597973069 (last visited Nov. 9, 2018).

Image Source: https://www.king5.com/article/tech/science/environment/drones-helping-scientists-track-orca-health/281-599245989

What Even is Machine Learning and Why Should Law Students be Wary?

By: Eric Richard

Machine learning is a complex concept. Depending on who you ask, you might get any number of answers. And depending even more on your level of familiarity with technology, you might not get any that make the concept seem any less complex or any more concise. However, at its most basic definition, machine learning is essentially the process of getting a computer to “learn” as a human does.[1] Now, I know that there are many out there who would raise an objection to this oversimplification, but for anybody outside the technology industry or lacking a degree in computer science, that is the easiest way to sum it up. Machine learning can be likened to the painstaking process of teaching an ignorant child what to do in certain situations in response to observations or real-world interactions.[2] The difference, maybe semi-obviously, is that a machine learns through algorithms, as opposed to how a child might learn through negative or positive reinforcement.[3] The trickiest part of the process, needless to say, is exactly how to get a machine to learn.[4]

Once the “how” has been hurdled, the concern for law students might start to become a little clearer. In the words of Kai-Fu Lee, the former head of Google research in China, the “replacement is happening now.”[5] Routine office work is being done more and more by machines rather than people.[6] You know who does a lot of routine office work, such as filing and research, for law firms? Newly hired, fresh-out-of-law-school associates. While Lee and others feel that this replacement is akin to a white-collar-worker “doomsday scenario,” there are others who feel it might not be a bad thing.[7] With “low level” work delegated to machines, all attorneys, not just the fresh ones, will have more time for the more difficult work.[8] But how do we get to that point? How does a machine learn to do work that, until recent years, has required someone with years of scholastic and professional legal training?

The answer, quite humorously, might be a game. A recent project between David Colarusso, director of Suffolk University Law School’s Legal Innovation and Technology (LIT) Lab, and the Stanford Legal Design Lab has attempted to solve the “how” of machine learning with a game born from legal questions posted by thousands of people on Reddit.[9] The game is simple enough. It presents a fact pattern to the player, followed by a question.[10] The question usually asks the player to identify what type of legal issue or segment of law can be spotted in the fact pattern.[11] The goal of the question-and-answer format is to teach a machine how to “issue spot.”[12] If a machine is shown a sentence or a fact pattern with the words “wife” and “kids,” then odds are it is going to identify an issue associated with family law, even though the concern might be a speeding ticket instead.[13] Herein lies the problem with a machine attempting to issue spot on behalf of an attorney looking for precedent related to a given fact pattern. For law students, however, the game itself is the concern. With creative solutions such as a match-making game presenting a way for machines to become better at performing jobs traditionally reserved for fresh-out-of-school lawyers, the market and opportunity for current law students might be dwindling little by little every day.
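The “wife”-and-“kids” pitfall described above can be sketched in a few lines of code. This is a deliberately crude bag-of-words classifier written for illustration only; the training examples, labels, and function names are all invented here and are not the LIT Lab’s actual dataset or model.

```python
from collections import Counter

# Toy labeled "fact patterns," standing in for the crowd-sourced
# Reddit questions the article describes. Entirely hypothetical data.
TRAINING = [
    ("my wife and I are divorcing and fighting over the kids", "family"),
    ("who gets custody of the kids after a divorce", "family"),
    ("I got a speeding ticket on the highway", "traffic"),
    ("can I contest a parking ticket", "traffic"),
]

def train(examples):
    """Count how often each word appears under each label."""
    counts = {}
    for text, label in examples:
        counts.setdefault(label, Counter()).update(text.lower().split())
    return counts

def classify(counts, text):
    """Score each label by overlapping word counts (crude bag-of-words)."""
    words = text.lower().split()
    scores = {label: sum(c[w] for w in words) for label, c in counts.items()}
    return max(scores, key=scores.get)

counts = train(TRAINING)
# A speeding-ticket question that happens to mention "wife" and "kids":
query = "I was driving my wife and kids and got a speeding ticket"
print(classify(counts, query))  # → family
```

Because “wife,” “kids,” and common filler words dominate the overlap counts, the speeding-ticket question scores higher under “family” than “traffic,” which is exactly the kind of mislabeling the human-played game is meant to train away.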

 

[1] See Daniel Faggella, What is Machine Learning?, tech emergence (Oct. 29, 2018), https://www.techemergence.com/what-is-machine-learning/.

[2] See id.

[3] See id.

[4] See id.

[5] Will Knight, Is Technology About to Decimate White-Collar Work?, MIT Tech. Rev. (Nov. 6, 2017), https://www.technologyreview.com/s/609337/is-technology-about-to-decimate-white-collar-work/.

[6] See id.

[7] See id.

[8] See Bernard Marr, How AI and Machine Learning Are Transforming Law Firms and the Legal Sector, Forbes (May 23, 2018), https://www.forbes.com/sites/bernardmarr/2018/05/23/how-ai-and-machine-learning-are-transforming-law-firms-and-the-legal-sector/#3475e1a532c3.

[9] See Jason Tashea, New game lets players train AI to spot legal issues, A.B.A. J. (Oct. 16, 2018), http://www.abajournal.com/news/article/new_game_lets_players_train_ai_and_close_the_justice_gap/?utm_source=maestro&utm_medium=email&utm_campaign=tech_monthly.

[10] See id.

[11] See id.

[12] See id.

[13] See id.

Image Source: https://www.thelawyer.com/clifford-chance-trials-video-game-test-trainee-applications-2/

Electronic vs. Paper Voting: A Legal Battle in Georgia

By: Scottie Fralin

Earlier this week, I cast my vote in the midterm election on a paper ballot. In Georgia, paper ballots have been replaced entirely by Direct Recording Electronic voting machines (DREs), which have no paper trail by which to verify or audit the recording of each elector’s vote.[1] DREs employ computers that record votes directly into the computers’ memory.[2] Some DRE systems are also equipped with a printer, which voters can use to confirm their choices before committing them to the computer’s memory.[3] Most states use paper ballots, and some use both paper ballots and DREs with mechanisms to ensure a paper trail.[4] The only states that use DREs without a paper trail and no accompanying paper ballot are Delaware, Georgia, Louisiana, New Jersey, and South Carolina.[5] Colorado, Oregon, and Washington use neither paper ballots nor DREs, and instead vote by mail.[6] The vast majority of states use a combination of paper ballots and DRE systems with a paper trail.[7] In those states, the ballot is typically retained after scanning in case verification or a recount is required.[8] Apparently, manufacturers of DRE voting machines have been so secretive in the past about how the technology works that they have required election officials to sign non-disclosure agreements preventing them from bringing in outside experts who could assess the machines.[9]

The skepticism surrounding electronic voting machines is well-founded, as computers can be vulnerable to viruses and malware. In fact, civil rights groups and voters in Texas and Georgia have filed complaints alleging that electronic voting machines inexplicably deleted some people’s votes for Democratic candidates or switched them to Republican votes.[10] In August of 2017, the Georgia Coalition for Good Governance filed suit against Brian Kemp, claiming that the DRE voting system in Georgia is insecure and unverifiable and compromises the privacy and accuracy of their votes.[11] The Coalition claimed that the Defendants’ continued use of the DRE system violated their constitutional rights.[12] Though the court denied the Coalition’s motions for preliminary injunctions, it advised the Defendants that further delay in dealing with the vulnerability of the state’s DRE systems is not tolerable, because damage to the integrity of a state’s election system undermines public confidence in the electoral system and the value of voting.[13]

As the court said in Curling v. Kemp, “advanced persistent threats in this data-driven world and ordinary hacking are unfortunately here to stay.”[14] Therefore, especially given the upcoming 2020 elections, if a new balloting system in Georgia is to be launched, it must “address democracy’s critical need for transparent, fair, accurate, and verifiable election processes that guarantee each citizen’s fundamental right to cast an accountable vote.”[15] The Georgia case went up on appeal to the Eleventh Circuit, where state officials argue that the district court judge should have dismissed the suit on the grounds that it violates the government’s entitlement to immunity and improperly subjects the state to suit and discovery.[16] The Coalition argues that granting the state’s request to dismiss the suit would have a chilling effect on voters and voting-rights groups.[17] Despite federal Judge Amy Totenberg’s decision not to replace Georgia’s DREs just weeks before midterm elections, most commentators suggest that by 2020, Georgia’s voting systems will include some form of backup.[18] The public outcry and bad publicity surrounding Georgia’s DREs and their attendant risks are surely something to watch. It might just be a matter of time before legislatures or courts of other states follow suit and call for an overhaul of election equipment to ensure ballot security.

 

[1] See Curling v. Kemp, No. 1:17-CV-2989-AT, 2018 U.S. Dist. LEXIS 165741, at *7 (N.D. Ga. Sep. 17, 2018).

[2] See Ballotpedia, Voting Methods and Equipment by State, https://ballotpedia.org/Voting_methods_and_equipment_by_state.

[3] See id.

[4] See id.

[5] See id.

[6] See id.

[7] See id.

[8] See Jeremy Laukkonen, Which States Use Electronic Voting? Lifewire, https://www.lifewire.com/which-states-in-united-states-use-electronic-voting-4174835 (last updated Nov. 1, 2018).

[9] See Jessica Schulberg, Good News for Russia: 15 States Use Easily Hackable Voting Machines, HuffPost (July 17, 2017), https://www.huffingtonpost.com/entry/electronic-voting-machines-hack-russia_us_5967e1c2e4b03389bb162c96.

[10] See Christian Vasquez & Matthew Choi, Voting Machine Errors Already Roil Texas and Georgia Races, Politico, https://www.politico.com/story/2018/11/05/voting-machine-errors-texas-georgia-2018-elections-midterms-959980 (last updated Nov. 6, 2018).

[11] See Curling v. Kemp, No. 1:17-CV-2989-AT, 2018 U.S. Dist. LEXIS 165741, at *15 (N.D. Ga. Sep. 17, 2018).

[12] See id. at *15.

[13] See id. at *57.

[14] See id.

[15] See id. at *57-58.

[16] See Kayla Goggin, Georgia Officials to Appeal Paper Ballot Ruling to 11th Circuit, Courthouse News Service (Sept. 20, 2018), https://www.courthousenews.com/georgia-officials-to-appeal-paper-ballot-ruling-to-11th-circuit/.

[17] See id.

[18] See, e.g., Mark Niesse, Federal Judge Weighs Throwing Out Georgia Electronic Voting Machines, The Atlanta Journal-Constitution (Sept. 12, 2018), https://www.ajc.com/news/state–regional-govt–politics/federal-judge-weighs-throwing-out-georgia-electronic-voting-machines/mzhkkHVRl1caitey2igxXI/.

Image Source: https://www.myajc.com/news/state–regional-govt–politics/plan-scrap-georgia-electronic-voting-machines-moves-forward/Tw9ib1BzBJPUfuPrY2N5VI/

Data Breaches: The New Normal

By: Sarah Alberstein

It seems that data breaches are all over the news these days, but what exactly is a data breach? According to Norton Security, a data breach is a “security incident in which information is accessed without authorization.”[1] In 2016, the most common types of information stolen in data breaches were “full names, credit card numbers, and Social Security numbers.”[2] For consumers in an ever-evolving technological landscape, the risk of having such personal information stolen can be alarming. That alarm is only solidified by what seems to be a steady increase in such breaches.

There were 1,300 data breaches in 2017.[3] By July of 2018, there were already over 600 data breaches.[4] What’s more, almost 50% of the breaches in 2018 were “of businesses related to retail, tourism, transportation, utilities, and other professional services that most of us use on a regular basis.”[5] Some of the businesses affected include Macy’s, Adidas, Sears, Kmart, Delta Airlines, Best Buy, Saks Fifth Avenue, Lord & Taylor, Under Armour’s fitness app, Panera Bread, Forever 21, Whole Foods, Gamestop, Arby’s, Ticketfly, and Facebook.[6] Given the frequency of these breaches and the types of industries impacted, it seems that the odds of having your data stolen are relatively high.

There have been some legislative efforts to combat data breaches and to make consumers more aware when such breaches occur. Beginning in 2010, individual states began enacting Security Breach Notification Laws, which require “private or governmental entities to notify individuals of security breaches involving personally identifiable information.”[7] Security Breach Notification Laws typically include provisions describing which entities must comply with the law, what constitutes personal information, what constitutes a breach, notice requirements, and any exemptions.[8] Now, in 2018, all 50 states have enacted Security Breach Notification Laws.[9] Additionally, all 50 states have “computer crime laws” that target crimes committed using a computer, and some states are individually strengthening their data breach laws by requiring businesses managing personal data to implement additional security practices like security training and periodic audits, and by centralizing statewide cybersecurity oversight.[10]

Despite this, companies may still attempt to cover up breaches, keeping consumers in the dark. In 2016, the ride-hailing service Uber experienced a “major data breach…that exposed the personal information of 57 million people.”[11] This information included names, cellphone numbers, and email addresses.[12] Rather than notifying its users, Uber paid the hackers a $100,000 ransom to conceal the breach.[13] Uber did not provide public notice of the breach until a year later, in 2017.[14] In September 2018, Uber agreed to pay a staggering $148 million in a settlement with all 50 states and the District of Columbia, and it has promised to develop a new data security policy.[15]

While there is legislation in place, and companies seem to be held responsible for data breaches, there are some things individual consumers can do on their own to protect their data. These include reviewing a company’s privacy policy before providing your information, using complex, secure passwords, monitoring your bank accounts, checking your credit reports, installing security software, backing up your files, and occasionally wiping your hard drive.[16] It seems that the legal landscape is constantly playing catch-up with the advancement of technology, but hopefully legislation like the Security Breach Notification Laws and the efforts of individual consumers will bring a sense of security to the technological Wild West.

 

[1] What is a Data Breach?, Norton, https://us.norton.com/internetsecurity-privacy-data-breaches-what-you-need-to-know.html.

[2] Id.

[3] Rebecca Nanako Juchems, Enough is Enough: 2018 has Seen 600 too Many Data Breaches, Medium (July 24, 2018), https://medium.com/@AxelUnlimited/enough-is-enough-2018-has-seen-600-too-many-data-breaches-9e3e5cd8ff78.

[4] Id.

[5] Id.

[6] Dennis Green & Mary Hanbury, If you Shopped at These 16 Stores in the Last Year, Your Data Might Have Been Stolen, Business Insider (Aug. 22, 2018, 5:49 PM), https://www.businessinsider.com/data-breaches-2018-4#arbys-16; David Bisson, The 10 Biggest Data Breaches of 2018…So far, Barkly Blog (Jul. 2018), https://blog.barkly.com/biggest-data-breaches-2018-so-far.

[7] Breach of Information, National Conference of State Legislatures, http://www.ncsl.org/research/telecommunications-and-information-technology/overview-security-breaches.aspx; Security Breach Notification Laws, National Conference of State Legislatures, http://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx.

[8] Security Breach Notification Laws, National Conference of State Legislatures, http://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx.

[9] Id.

[10] See Pam Greenberg, Taking Aim at Data Breaches and Cyberattacks, National Conference of State Legislatures (Nov. 2017), http://www.ncsl.org/research/telecommunications-and-information-technology/taking-aim-at-data-breaches-and-cyberattacks.aspx.

[11] Dan M. Clark, $5.7M Slated for Pa. in Uber Data Breach Settlement, The Legal Intelligencer (Oct. 25, 2018, 2:40 PM), https://www.law.com/thelegalintelligencer/2018/10/25/5-7m-slated-for-pa-in-uber-data-breach-settlement/.

[12] Id.

[13] Id.

[14] Id.

[15] Id.

[16] Rebecca Nanako Juchems, Enough is Enough: 2018 has Seen 600 too Many Data Breaches, Medium (July 24, 2018), https://medium.com/@AxelUnlimited/enough-is-enough-2018-has-seen-600-too-many-data-breaches-9e3e5cd8ff78; What is a Data Breach?, Norton, https://us.norton.com/internetsecurity-privacy-data-breaches-what-you-need-to-know.html.

Image Source: https://wattswebstudio.com/blog/

Ethical Dilemmas of Using Artificial Intelligence

By: Brandon Larabee

Some of the ethical dilemmas of using artificial intelligence to address criminal justice issues are familiar to anyone who watched “Person of Interest.” The CBS science-fiction show revolved around the efforts of a team of human beings and “The Machine” — an artificial super-intelligence — to stop crimes before they could happen.

In the real world of criminal justice and the legal system, though, problems not anticipated by “Person of Interest” are cropping up as algorithms are used to predict criminal behavior. Where The Machine was relentlessly rational and unfailing (unless interfered with), real-world machines are increasingly facing questions about whether they produce outcomes just as biased as the humans who build them.

As with many controversies in the public sphere, a counter-backlash is brewing. Writing recently for Wired, Noam Cohen argued that algorithms (and the computers that crunch the numbers) could as easily be sources of justice as of injustice. Cohen highlighted reporting by The New York Times that eventually led some New York City district attorneys to be more lenient with low-level marijuana offenses.[1]

“But imagine if we turned that spigot of data and incisive algorithms toward those who presume to judge and control us: Algorithms should be another important check on the system, revealing patterns of unfairness with a clarity that can be obscured in day-to-day life,” Cohen writes.[2]

That argument, though, comes amid a sustained pushback against efforts to use algorithms and predictive technology to do everything from making bail decisions to assisting in sentencing to deciding where police should focus their enforcement efforts.

New York City, for example, established an Automated Decision Systems Task Force to start looking at how the city uses its powerful data tools.[3] Activists have criticized a Los Angeles Police Department program that uses computer programs to choose surveillance targets because the data input into the system creates a “racist feedback loop.”[4] The COMPAS algorithm, used to create recidivism scores for judges to consider during sentencing, has been accused of bias against people of color.[5]

There are defenders of algorithms beyond Cohen. Sharad Goel of Stanford University told Nature: International Journal of Science that, in the journal’s words, discrepancies between error rates for whites and people of color “instead reflect the fact that one group is more difficult to make predictions about than another.”[6]

“It turns out that that’s more or less a statistical artifact,” Goel said.[7]

That might come as cold comfort to an offender being sentenced based on a flawed formula: The formula is working against him or her because it has a problem predicting what people of the offender’s race will do, not because it’s biased per se.
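Goel’s “statistical artifact” can be illustrated with a toy simulation (a hypothetical sketch, not the actual COMPAS model): give two groups the exact same, perfectly calibrated risk score and the same decision threshold, but different underlying base rates of reoffending, and the false-positive rates still come out different.

```python
import random

random.seed(0)

def false_positive_rate(mean_risk, n=200_000):
    """Share of non-reoffenders flagged high-risk, for a group whose true
    reoffense probabilities center on mean_risk. The score used for flagging
    is each person's true probability (i.e., perfectly calibrated), and the
    0.5 threshold is identical for every group."""
    false_positives = negatives = 0
    for _ in range(n):
        # each person's true reoffense probability, spread around the group mean
        p = min(1.0, max(0.0, random.gauss(mean_risk, 0.25)))
        reoffends = random.random() < p
        flagged = p > 0.5
        if not reoffends:
            negatives += 1
            if flagged:
                false_positives += 1
    return false_positives / negatives

# Same score, same threshold -- but non-reoffenders in the
# higher-base-rate group are flagged far more often.
print(false_positive_rate(0.55))
print(false_positive_rate(0.35))
```

Nothing in the scoring rule treats the groups differently; the gap in error rates falls out of the base rates alone, which is the sense in which unequal error rates can be a statistical artifact rather than proof that the formula itself is biased.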

Those inclined to seek a compromise have started to float ideas meant to answer the questions of bias while still using the data algorithms produce to (one hopes) improve society. One idea is simply to accept that, by their very nature, algorithms are “biased” — so the public should have more information and more input into what goes into the formulas.[8]

At least one avenue for a possible resolution seems to be closed for now. The U.S. Supreme Court faced a decision last year about whether to take the case of Loomis v. Wisconsin, a frontal assault on the use of COMPAS in sentencing decisions.[9] But the court passed.[10]

 

[1] Noam Cohen, Algorithms can be a tool for justice — if used the right way, Wired (Oct. 25, 2018, 1:23 PM), https://www.wired.com/story/algorithms-netflix-tool-for-justice/.

[2] Id.

[3] Mayor de Blasio announces first-in-nation task force to examine automated decision systems used by the city, NYC.gov (May 16, 2018), https://www1.nyc.gov/office-of-the-mayor/news/251-18/mayor-de-blasio-first-in-nation-task-force-examine-automated-decision-systems-used-by.

[4] George Joseph, The LAPD has a new surveillance formula, powered by Palantir, The Appeal (May 8, 2018), https://theappeal.org/how-walmart-is-helping-prosecutors-get-10-year-sentences-for-shoplifting-7d868e8b38b8/.

[5] See Sara Chodosh, Courts use algorithms to help determine sentencing, but random people get the same results, Popular Science (Jan. 18, 2018), https://www.popsci.com/recidivism-algorithm-random-bias#page-2.

[6] Rachel Courtland, Bias detectives: The researchers striving to make algorithms fair, Nature: International Journal of Science (June 20, 2018), https://www.nature.com/articles/d41586-018-05469-3.

[7] Id.

[8] Matthias Spielkamp, Inspecting algorithms for bias, MIT Technology Review (June 12, 2017), https://www.technologyreview.com/s/607955/inspecting-algorithms-for-bias/amp/.

[9] Adam Liptak, Sent to prison by a software program’s secret algorithms, N.Y. Times (May 1, 2017), https://www.nytimes.com/2017/05/01/us/politics/sent-to-prison-by-a-software-programs-secret-algorithms.html.

[10] Loomis v. Wisconsin, SCOTUSBlog, http://www.scotusblog.com/case-files/cases/loomis-v-wisconsin/.

Image Source: https://deadline.com/2016/06/person-of-interest-finale-jonah-nolan-interview-x-files-batman-taraji-p-henson-greg-plageman-1201775530/

Is Classroom Technology Making Student Privacy Obsolete?

By: Zaq Lacy

In many schools around the country, classroom technology made its debut in the early to mid-1980s, in the form of Apple II computer labs and the infamous (but so very nostalgic) words, “You have died of dysentery,” thanks in large part to the vision of Steve Jobs and his collaboration with the Minnesota Educational Computing Consortium (MECC) to “save the world by putting computing power in the hands of every kid in America.”[1] Today, the technology available to enhance the learning experience touches nearly every aspect of the classroom, from e-texts,[2] to a litany of third-party applications that pair social media with cloud-integrated collaboration tools,[3] to biometric identification systems used to pay for lunch.[4] This technology offers previously unheard-of precision in real-time assessment, allowing teachers to evaluate learning processes as well as responses.[5] Moreover, the technology available today has significant benefits for the classroom and students.[6] Despite those benefits, however, there is increasing concern over the privacy of our students.[7]

The Family Educational Rights and Privacy Act (FERPA)[8] protects the privacy of student education records[9] but has become as obsolete as the technology that existed when it was passed 40 years ago.[10] As tech companies produce ever more sophisticated software, and classroom integration becomes progressively pervasive, so too does their ability to gather information on users grow. Such companies have accumulated immeasurable information on students’ school activities,[11] causing some states, such as California, to take legislative steps to address the growing problem,[12] which some attorneys consider the number one legal issue for schools and new educational technology companies.[13] California’s Student Online Personal Information Protection Act (SOPIPA) has served as a model for a number of state legislatures, 15 of which passed similar laws in 2015.[14] Despite the progress being made, officials still acknowledge that technology is likely to continue to develop faster than legislation, which will create new problems in the future.[15] So, for now at least, our students are living with privacy protections that are so three years ago.

 

[1] See Matt Jancer, How You Wound Up Playing The Oregon Train in Computer Class, Smithsonian.com (Jul. 22, 2016), https://www.smithsonianmag.com/innovation/how-you-wound-playing-em-oregon-trailem-computer-class-180959851/.

[2] See Online Textbooks, Fairfax Cty. Pub. Sch., https://www.fcps.edu/online-textbooks (last visited Nov. 2, 2018).

[3] See Kathy Dyer, The Ultimate List- 65 Digital Tools and Apps to Support Formative Assessment Practices, NWEA.org (Jan. 9, 2018), https://www.nwea.org/blog/2018/the-ultimate-list-65-digital-tools-and-apps-to-support-formative-assessment-practices/.

[4] See Natasha Singer, With Tech Taking Over in Schools, Worries Rise, N.Y. Times (Sept. 14, 2014),  https://www.nytimes.com/2014/09/15/technology/with-tech-taking-over-in-schools-worries-rise.html.

[5] See Alvin Vista & Esther Care, Education Assessment in the 21st Century: New Technologies, BROOKINGS.edu (Feb. 27, 2017), https://www.brookings.edu/blog/education-plus-development/2017/02/27/education-assessment-in-the-21st-century-new-technologies/.

[6] See Jared Keengwe & Grace Onchwari, Technology and Early Childhood Education: A Technology Integration Professional Development Model for Practicing Teachers, 37 Early Childhood Educ. J. 209, 210 (2009); see also Effects of Technology on Classrooms and Students, U.S. Dep’t. of Educ., https://www2.ed.gov/pubs/EdReformStudies/EdTech/effectsstudents.html#change (last visited Nov. 2, 2018)

[7] See Singer, supra note 4.

[8] 20 U.S.C. § 1232g (2018); 34 C.F.R. § 99.31 (2018).

[9] Family Educational Rights and Privacy Act, U.S. Dep’t. of Educ., https://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html (last visited Nov. 2, 2018).

[10] See Singer, supra note 4.

[11] See id.

[12] Student Online Personal Information Protection Act of 2014, Cal. Bus. & Prof. Code §§ 22584-22585 (Deering 2014); Early Learning Personal Information Protection Act, Cal. Bus. & Prof. Code §§ 22586-22587 (Deering 2017).

[13] See Matthew Johnson, The Top Five Legal Issues for Edtech Startups and Schools, Edsurge.com (Apr. 16, 2016).

[14] See Tanya Roscorla, More States Pass Laws to Protect Student Data, Govtech.com (Aug. 27, 2015), http://www.govtech.com/education/k-12/What-States-Did-with-Student-Data-Privacy-Legislation-in-2015.html?utm_source=related.

[15] See id.

Image Source: https://www.smithsonianmag.com/innovation/how-you-wound-playing-em-oregon-trailem-computer-class-180959851/.

Are Major League Baseball Pitchers Playing with a Chip on Their Shoulders?: A Look into The Use of Wearable Technology in Major League Baseball

By: Mariah Bayless Davis

The advancement of technology has given individuals a false sense of familiarity with people they have never met. Social media users are often motivated to overshare experiences and information about themselves to feel a sense of community on platforms such as Twitter and Facebook. While sharing what you ate is normal in today’s society, sharing your personal health information may not be something one is forthcoming about, and rightfully so. The Health Insurance Portability and Accountability Act allows patients to guard their medical records and control the use of such personal health information.[1] But what protection exists for data that is not deemed “personal health information”? Information such as biometric data collected from wearable technology is not specifically regulated by federal law.[2] That might not be an issue for a regular person who uses a Fitbit to track their daily steps, but what happens when an athlete uses wearable technology for rehab and training purposes? Barbara Osborne, a professor of sports law at the University of North Carolina, commented on the murky waters that puddle at the intersection of public athletes and privacy: “once [athletes’] private biometric data is considered a term of employment, the contents of that data are no longer considered protected health information under law.”[3] How do athletes grapple with being public figures while simultaneously wanting to improve their skills in private with the help of wearable technology? With the introduction of the Motus mThrow sleeve in 2014,[4] Major League Baseball and the Major League Baseball Players Union attempted to navigate the unpaved roads of wearable technology in professional sports.

In professional sports, injuries are inevitable. Injuries can lead not only to poor performance by a team but also to wasted money in a league like Major League Baseball, where contract money is guaranteed. In 2015 alone, the league reported $420 million in pitchers’ salaries “wasted” on the disabled list and pointed its finger at elbow injuries as the main culprit.[5] The sport of baseball is so plagued by elbow injuries that the surgery to fix them is known as a “Tommy John procedure,” named after the first player to get the surgery.[6] As the founder of the American Sports Medicine Institute, Dr. James Andrews has had plenty of experience researching elbow injuries and performing elbow surgeries. When asked why he was so passionate about finding a preventative measure to lessen elbow injuries, he said, “I’d like to put myself out of business [one day].”[7] By researching the cause of elbow injuries, one could potentially develop a piece of technology to curb the popularity of Tommy John surgeries, which “increased by 700 percent [from 2004 to 2010].”[8] That was Joe Nolan’s goal when he and his company, Motus, developed the mThrow Pitcher Sleeve.[9] The mThrow sleeve looks like a common compression sleeve that you could buy at Dick’s Sporting Goods. However, the Motus technology built into the sleeve is what could help eliminate elbow injuries caused by the force of pitching.

To collect data on arm speed and release point, the sleeve stretches from the mid-forearm to just under the shoulder, and nestled inside a slot on the sleeve is a small sensor.[10] As the pitcher throws, the sleeve collects data and measurements pertaining not only to shoulder rotation but also to the angles of the elbow and shoulder.[11] Knowledge about those angles sheds light on the stress being put on the ulnar collateral ligament, which, if weakened without correction, can lead to an elbow injury.[12] Many companies that analyze the mThrow data for players and teams say that the technology is used mainly to “train players to withstand fatigue, rehab them faster and better, and hopefully prevent them from having surgery at the beginning.”[13]

Although the general consensus around the league is that athletes’ biometric data collected from the mThrow sleeve would be used for good purposes, some athletes are skeptical. Andrew Miller, an All-Star pitcher who most recently played for the Cleveland Indians, shared his thoughts: “you don’t want a team to treat you differently in some sort of contractual thing because they don’t think you’re not getting enough sleep or sleep poorly…it’s just a matter of how you work [data] in and who do you give access to and in what form?”[14] Alan Milstein, an attorney who lectures on sports and bioethics, shares the same sentiment as Miller. He states that the use of the data might actually deter athletes from voluntarily using the technology: “if the purpose is to find out, ‘Geez, is this guy really worth it? Should we sign him to another year? Nah, he looks like he’s really failing. Let’s get rid of him,’ then it’s no longer in the athletes’ best interest to have the team be able to monitor every aspect of their health.”[15] Both Miller and Milstein question the regulations and parameters governing not so much the collection of the data as the use of it. A new Collective Bargaining Agreement was introduced in 2017, and while the agreement provided clarity regarding the technology and the data that comes from it, the regulations might increase skepticism.

Attachment 56 of the 2017-2021 Collective Bargaining Agreement summarizes the manner in which new biometric technologies are evaluated and approved, while also making an effort to establish regulations regarding a player’s privacy when it comes to his data, who has access to it, and for what purpose.[16] In 2016, before Major League Baseball updated the Collective Bargaining Agreement, the only people who had access to the biometric data from the sleeve were the pitcher himself, the agency that represents him, and Motus.[17] As set out in the new agreement, “the Club representatives permitted access to a player’s wearable data are: General Manager, Assistant General Manager, Field Manager, Team Physician, Certified Athletic Trainer, Strength and Conditioning Coach, Rehabilitation Coordinator, and an individual hired by a Club to manage the use and administration of wearable technology.”[18] The agreement further provides that the data cannot be exploited or used for any commercial purposes.[19] Although not explicit, the agreement also prohibits the use of a player’s biometric data as a tool during salary arbitration: explicitly invoking the data during salary arbitration would mean disclosing it to outside parties not permitted to access it.[20] Initially, the league and players’ union sought to make the rules surrounding wearable technology clearer, but the regulations leave a loophole that Club representatives could exploit.

During a salary arbitration, Club representatives and the athlete present their cases before a panel of arbitrators. While the athlete has his agent there to represent his interests, the Club usually sends two individuals on its behalf: the General Manager and the Assistant General Manager.[21] Those two individuals are also among the Club representatives permitted access to wearable data. Although the GM and Assistant GM may not explicitly use this information during a salary arbitration, the mere knowledge of it can give rise to anchoring bias: a person’s natural tendency to rely too heavily on the first piece of information received when making a decision.[22] Is it natural, or even possible, for someone to use data for one purpose but then not let that same data influence their decisions for another purpose? If the General Manager and Assistant General Manager of a Club can continue to exploit this loophole to use data in salary arbitrations, that could lead to another problem: abuse of power. As one of the first professional sports leagues to regulate the ownership and fair use of such advanced data,[23] MLB will act as a case study for the National Football League, where conversations regarding commercial use of football players’ data are already happening.

 

[1] See Jen Booton, Widespread Wearable Use Could Fundamentally Change Professional Sports, Sport Techie (Aug. 28, 2018), https://www.sporttechie.com/widespread-wearable-use-could-fundamentally-change-pro-sports/.

[2] See id.

[3] Id.

[4] See Will Carroll, The Sleeve That Could Save Baseball: Exclusive Look at New MLB Technology, Bleacher Report (July 2, 2014), https://bleacherreport.com/articles/2097866-the-sleeve-that-could-save-baseball-exclusive-look-at-new-mlb-technology.

[5] See Tom Goldman, What’s Up with Those Baseball Sleeves? Lots of Data, and Privacy Concerns, NPR (Aug. 30, 2017), https://www.npr.org/2017/08/30/547062884/whats-up-those-baseball-sleeves-lots-of-athletes-data-and-concerns-about-privacy.

[6] See id.

[7] See Carroll, supra note 4.

[8] Id.

[9] Id.

[10] See Tom Goldman, What’s Up with Those Baseball Sleeves? Lots of Data, and Privacy Concerns, NPR (Aug. 30, 2017), https://www.npr.org/2017/08/30/547062884/whats-up-those-baseball-sleeves-lots-of-athletes-data-and-concerns-about-privacy.

[11] See id.; see also Will Carroll, The Sleeve That Could Save Baseball: Exclusive Look at New MLB Technology, Bleacher Report (July 2, 2014), https://bleacherreport.com/articles/2097866-the-sleeve-that-could-save-baseball-exclusive-look-at-new-mlb-technology (explaining the extent of the technology and what the sleeve can record).

[12] See Carroll, supra note 11.

[13] See Goldman, supra note 10 (explaining how the technology could be used to prevent elbow injuries, rather than only treating them after the fact).

[14] Mike Vorkunov, Innovation vs. invasion of privacy: MLB wearable technology battle looms, USA Today (Sept. 21, 2016), https://www.usatoday.com/story/sports/mlb/2016/09/21/innovation-vs-invasion-privacy-mlb-wearable-technology-battle-looms/90783188/.

[15] See Goldman, supra note 10 (citing the lack of parameters as a reason athletes might not take advantage of the positives the technology offers).

[16] Stephanie Springer, An Update On Wearable Baseball Technology, FanGraphs (Aug. 7, 2018), https://www.fangraphs.com/tht/an-update-on-wearable-technology/.

[17] See Vorkunov, supra note 14.

[18] 2017 Major League Baseball Collective Bargaining Agreement, attach. 56 (Dec. 1, 2016).

[19] See id.

[20] See id. (explaining that the data shall not be disclosed by a Club to any party other than those persons listed as permitted).

[21] Eric Stephen, Salary Arbitration: A Necessary Evil, True Blue LA (Feb. 17, 2014), https://www.truebluela.com/2014/2/17/5379764/salary-arbitration.

[22] S. Mohammed, The Hidden Trap of “Anchoring Bias” in Decision Making and The Leadership Lesson From “Moneyball” Movie, Medium (2018), https://medium.com/@shahmm/the-hidden-trap-of-anchoring-bias-in-decision-making-and-the-moneyball-movie-79aa7295f21d.

[23] See Springer, supra note 16.

Image Source: https://www.overthemonster.com/2018/2/6/16979376/mlb-starting-pitcher-rankings-chris-sale
