Richmond Journal of Law and Technology

The first exclusively online law review.

Electronic Health Records: Federal Guidelines for Managing Cybersecurity Risks

By Jessica Otiono

 

Electronic Health Records (EHRs) use modern technology to allow the electronic entry, storage, and maintenance of digital patient data.[1] This data includes records from providers, such as demographics, test results, medical history, history of present illness, and past and current medications.[2] Over the past two decades, the use of information technology in the delivery and management of healthcare led to the adoption of EHRs, which provide an efficient way to share healthcare records among healthcare professionals and give patients easy access to their own records.[3]

However, this ease of access is met with cybersecurity threats and data privacy challenges.[4]  The sensitive, patient-care-centered nature of EHRs makes them susceptible to cyber-attacks.[5]  This is because they contain Personal Health Information (PHI), which cyber attackers sell for profit on the dark web.[6] Cyberattacks on EHRs occur in diverse ways. Some of these attacks include:

 

    1. Phishing – Phishing attacks are the most rampant cybersecurity threat in healthcare. Phishing is the practice of infecting a seemingly harmless email with malicious links.[7]  The most common form of phishing attack is email phishing.[8]
    2. Malware/Ransomware – Ransomware is a type of malware that disables access to computer systems and files until a ransom is paid.[9]  Ransomware may infect a computer system through a phishing email containing a malicious link.[10]
    3. Distributed Denial of Service (DDoS) – A DDoS attack floods a website or computer network with internet traffic to overwhelm it and impair its performance and availability.[11] Cybercriminals employ bots to submit an excessive number of requests.[12] DDoS attacks used together with ransomware form one of the most destructive cybersecurity attack combinations.[13]

In dealing with cybersecurity threats to EHRs, federal compliance laws such as the Health Insurance Portability and Accountability Act (HIPAA) and the Health Information Technology for Economic and Clinical Health (HITECH) Act were enacted to protect the privacy and data security of PHI that is stored electronically.[14] In addition, the HIPAA Privacy Rule establishes “national standards to protect individuals’ medical records and other individually identifiable health information….”[15]

The HIPAA Security Rule also establishes appropriate safeguards to ensure the confidentiality, integrity, and security of electronic protected health information.[16] The Security Rule provides administrative, physical, and technical safeguards for managing healthcare data privacy.[17]  Some of these safeguards include: i.) establishing a security management process in which the covered entity must implement policies and procedures to prevent, detect, contain, and correct security violations;[18] ii.) appointing a designated security official who is responsible for the development and implementation of the policies and procedures mandated by the Security Rule;[19] iii.) implementing policies and procedures to address security incidents when they occur;[20] iv.) creating policies and procedures for responding to an emergency that damages computer systems containing EHRs;[21] v.) establishing safeguards for workstation security;[22] vi.) implementing audit controls for information systems;[23] and vii.) implementing measures to protect against unauthorized access to electronic protected health information transmitted over an electronic communications network.[24]

In addition, the HITECH Act establishes the Breach Notification Rule. This rule requires healthcare providers and other HIPAA covered entities to promptly notify individuals of a data breach (within 60 calendar days from the day the breach is discovered), and to notify the Secretary of the U.S. Department of Health and Human Services (HHS) and the media in cases where the breach affects more than 500 individuals.[25] Breaches affecting fewer than 500 individuals must be reported to the HHS Secretary annually, no later than 60 calendar days after the end of the year.[26] As healthcare delivery technology continues to evolve, cyber-attacks on EHRs continue to occur. It is therefore imperative that healthcare providers and other key players implement policies that align cybersecurity and patient safety initiatives. These measures will protect patient safety and privacy while ensuring continuity in the delivery of high-quality healthcare by mitigating disruptions.

 

[1] Electronic Medical Record in Healthcare, U.S. Dept. Health Hum. Serv. 1, 3 (2022), https://www.hhs.gov/sites/default/files/2022-02-17-1300-emr-in-healthcare-tlpwhite.pdf.

[2] Id.

[3] Liu Hua Yeo & James Banfield, Human Factors in Electronic Health Records Cybersecurity Breach: An Explanatory Analysis, Perspectives In Health Info. Mgmt. (Mar. 15, 2022), https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9123525/.

[4] Id.

[5] Greg Kill, Top 5 Cybersecurity Threats to Electronic Health Records and Electronic Medical Records, Integracon (Apr. 28, 2018), https://integracon.com/top-5-cybersecurity-threats-to-electronic-health-records-and-electronic-medical-records/.

[6] U.S. Dept. Health Hum. Serv., supra note 1, at 6.

[7] Edward Kost, Biggest Cyber Threats in Healthcare, Upguard (Aug. 8, 2022), https://www.upguard.com/blog/biggest-cyber-threats-in-healthcare.

[8] Id.

[9] Cyber Attacks: In the Healthcare Sector, Ctr. Internet Sec., https://www.cisecurity.org/insights/blog/cyber-attacks-in-the-healthcare-sector (last visited Sept. 14, 2022).

[10] Id.

[11] Hardik Shah, Top 10 Cybersecurity Challenges in the Healthcare Industry, Global Sign (May 5, 2022), https://www.globalsign.com/en/blog/10-cybersecurity-challenges-healthcare.

[12] Id.

[13] Id.

[14] Yeo & Banfield, supra note 3.

[15] See The HIPAA Privacy Rule, HHS.gov., https://www.hhs.gov/hipaa/for-professionals/privacy/index.html (last visited Sept. 14, 2022).

[16] The Security Rule, HHS.gov.,  https://www.hhs.gov/hipaa/for-professionals/privacy/index.html (last visited Sept. 14, 2022).

[17] Ryan L. Garner, Evaluating Solutions to Cyber Attack Breaches of Health Data: How Enacting A Private Right of Action For Breach Victims Would Lower Costs, 14 Ind. Health L. Rev. 127, 139  (2017).

[18] Id.; 45 C.F.R. § 164.308(a)(1)(i) (2017).

[19] Id. § 164.308(a)(2).

[20] Id. § 164.308(a)(6)(i).

[21] Id. § 164.308(a)(7)(i).

[22] 45 C.F.R. § 164.310(c) (2017).

[23] Id. § 164.312(b).

[24] Id. § 164.312(e)(1).

[25] 45 C.F.R. §§ 164.400-414 (2009).

[26] Id.

Image Source: https://www.aranca.com/knowledge-library/special-reports/valuation/healthtech-decoded

Employer Productivity Tracking in the Wake of Work From Home

By Paige Skinner

 

The COVID-19 pandemic brought several unprecedented changes, including a large-scale transition toward employees working from home. Having employees in the office was no longer viable for many employers, so they moved to a remote work format.[1] With this move came the question of how productive employees could be without the built-in supervision of working from the office. Due to this concern, many employers looked for ways to track employee productivity.[2] They found an answer in software that can be downloaded onto employee devices, such as InterGuard and ActivTrak.[3] InterGuard allows employers to track employees’ locations, how much time they spend idle on their devices, and their perceived level of productivity, and it can also secure employer data if an employee is terminated.[4] ActivTrak is similar in that it can provide information on employee behavior, determine how efficiently an employee is working, and help employees set goals and balance their workloads.[5] These are just two of the numerous programs that employers can use to track their employees while they work from home. Employers can use the information they collect from software programs such as InterGuard and ActivTrak in several ways, including to secure the company’s data and to assist in making personnel decisions.[6]

Many employees may not even be aware that their employers have installed productivity tracking software on their work computers, phones, or tablets.[7] However, those who are aware have expressed concern over their employers tracking them this way.[8] Many argue that installing this technology feels like an invasion of their privacy.[9] These concerns naturally raise the question of the legality of software programs like InterGuard and ActivTrak. Concerned employees may be happy to learn that there is legislation that aims to protect them in these situations. The main statute that governs how employers may track their employees at home is the Electronic Communications Privacy Act (ECPA) of 1986.[10] The ECPA allows employers to “monitor employees in the workplace, including both written and verbal communications, for any legitimate business purpose,” and to utilize other methods of monitoring if they receive employee consent.[11] A legitimate business purpose means anything that is in furtherance of the employer’s business or mission.[12] This purpose can include an employer obtaining video footage, monitoring calls made through company phones, or tracking internet usage to ensure productivity.[13] A logical inference can be made that ensuring employee productivity correlates directly with a legitimate business purpose, as a business cannot be successful without productive employees. Because employers can easily link tracking employee productivity to furthering their business interests, it appears as though many of the lengths they go to in tracking employees at home are, in fact, legal.[14] While this may seem like an invasion of privacy from an employee’s perspective, it is seen as a necessary tool for employers.[15] One employer went so far as to claim that “economic ruin” was in store for his company if his employees turned to remote work, and that tracking employee productivity was therefore essential to prevent failure.[16]

As technology advances and remote work continues to become the norm after mandatory COVID-19 pandemic restrictions, productivity tracking software will likely continue to soar in popularity.[17] As the technology becomes more refined, employers should make it a priority to comply with the ECPA and maintain transparency with their employees to ensure not only employee productivity but also employee morale.

 

 

[1] Tatum Hunter, Here are all the ways your boss can legally monitor you, The Washington Post (Sep. 24, 2021, 7:01 AM), https://www.washingtonpost.com/technology/2021/08/20/work-from-home-computer-monitoring/.

[2] Id.

[3] Skye Schooley, 5 Tools for Tracking Your Remote Staff’s Productivity, Business.com (Sep. 20, 2022), https://www.business.com/articles/11-tools-for-tracking-your-remote-staffs-productivity/.

[4] Id.

[5] Id.

[6] See id.

[7] Lindsay Lowe, What is ‘tattleware’? How employers may be tracking you at home, Today (Feb. 23, 2022, 9:12 AM), https://www.today.com/news/news/can-companies-track-workers-from-home-what-to-know-rcna17316.

[8] Id.

[9] Id.

[10] David C. Wells, Legal Considerations When Monitoring Remote Employees, EmploymentLawFirms, https://www.employmentlawfirms.com/resources/remote-employee-monitoring-laws.html#:~:text=At%20the%20federal%20level%2C%20the,for%20any%20legitimate%20business%20purpose (last visited Sep. 23, 2022).

[11] Id.

[12] See Managing Workplace Monitoring and Surveillance, SHRM, https://www.shrm.org/resourcesandtools/tools-and-samples/toolkits/pages/workplaceprivacy.aspx (last visited Sep. 23, 2022).

[13] Id.

[14] Wells, supra note 10.

[15] See Hunter, supra note 1.

[16] Id.

[17] Lowe, supra note 7.

Image Source: https://neuroleadership.com/your-brain-at-work/stop-the-surveillance

Framing Privacy Policies: A Competition Law Perspective

By Shravya Devaraj and Rohit Gupta*

 

I. INTRODUCTION

In 2019, the German Bundeskartellamt (Federal Cartel Office, hereinafter “German FCO”)[1] rendered the first decision[2] linking data protection with competition law. The German FCO made two main observations that have driven competition authorities to reconsider the factors influencing anti-competitive behavior in digital markets. First, it held that voluntary consent for processing information from third parties could not be assumed merely because consent is a prerequisite to accessing the services of facebook.com. Second, data collected by Facebook (now Meta)-owned services like WhatsApp and Instagram cannot be combined with Facebook data without the users’ voluntary consent. These observations bear an uncanny resemblance to the recently announced Competition Commission of India (hereinafter “CCI”) investigation against WhatsApp.[3] The 2021 privacy update[4] by WhatsApp negates the “voluntary” consent requirement by predicating access to the service solely on the acceptance of its new privacy policy. Further, the update introduced the combining of data collected through WhatsApp with other Facebook companies for marketing and advertising.

Contrasting the two decisions, the German FCO was guided by Article 6(1)(f) of the General Data Protection Regulation (GDPR),[5] whereas the CCI investigation is spearheaded by WhatsApp’s conduct in unilaterally denying consumers an “opt-out” option, which constitutes a potential abuse of dominance. In this piece, we aim to analyze the contours of antitrust scrutiny within the realm of privacy policies, specifically analyzing the role of consent in the CCI investigation.

II. DEFINING THE RELATIONSHIP BETWEEN PRIVACY POLICIES AND COMPETITION LAW

Digital platforms collect and monetize data through a direct subscription model (e.g., Spotify), by using collected data to tailor products directly to users (e.g., Amazon), or by selling targeted advertisements[6] (e.g., Facebook and Google Search). Social media companies like WhatsApp and Instagram also monetize by selling advertisements. Since these products are free platforms, they are called zero-price platforms.[7] The companies use the data they collect when users access their services to generate inferences about consumer preferences and behavior.

In competitive markets, companies compete fiercely for data and use this data to improve the quality and efficiency of goods and services. Since access to zero-price platforms is predicated on the data collected by companies, a lack of data can prevent companies from offering goods and services at competitive levels. This makes these companies less likely to survive in data-driven markets, leading to decreased competition. Courts[8] have previously recognized data privacy as a significant factor of quality, and hence an important parameter in analyzing anticompetitive behavior.[9] The value of data collected by zero-price platforms is not limited to the ad-tech industry; it extends to a company’s potential to use the data to innovate and its ability to raise barriers to entry for new companies entering the market.

Further, zero-price platforms have forced competition regulators to revisit whether the absence of “data” from the definitions in the competition law framework should preclude the basis for such investigations. For instance, the Competition Law Review Committee in India found that including data within the definition of price to tackle digital markets was unnecessary, since the current definition of price, which encompasses “every valuable consideration, whether direct or indirect,” is wide enough to cover any kind of consideration that has a bearing on a service or product.[10]

The relationship between privacy policies and competition law is not mutually exclusive. In digital markets, data substitutes for price, and the value and contribution of user data to companies’ market prowess are undeniable. For instance, Japan adopted guidelines treating the collection of personal data without consent as a violation of the Japan Anti-Monopoly Act.[11]

III. PRIVACY AS A COMPONENT AFFECTING COMPETITION

A growing sense of reckoning for consumers’ privacy and data protection has influenced conscious privacy policy frameworks. Firms compete by increasing the level of privacy protection through data minimization, storing personal data for shorter time periods (storage limitation), providing clear, precise, and understandable privacy policies (transparency), deploying privacy-enhancing technologies (PETs) (data security & privacy by design), and implementing protective privacy features by default.

For instance, Google introduced alternatives to third-party data collection, such as Federated Learning of Cohorts (FLoC).[12] However, the Competition and Markets Authority (the UK’s competition regulator, hereinafter “CMA”) launched an investigation into these privacy policy changes by Google, called the Google Sandbox investigation.[13] The CMA is primarily concerned that the changes would result in anti-competitive practices, whereby Google would retain the ability to track individual web users on Chrome while preventing third parties from doing the same through the effective implementation of the Privacy Sandbox Proposals. The CMA’s close involvement and interventions to ensure that Google’s proposals to implement the Sandbox Proposal on Android do not distort market competition mark a departure from competition regulators’ usual ex-post regulatory actions.[14] Another example is Apple’s change to its privacy policy, which has made it harder for third-party apps to collect data by introducing an enhanced notice-and-consent mechanism[15] based on user opt-in while exempting Apple’s own apps from the requirement, leading to potential self-preferencing conduct, especially since third-party applications continue to pay Apple a 15-30% fee.[16] Hence, privacy policies have become an essential component influencing competition law determinations.

In India, the WhatsApp investigation has presented an opportunity for competition regulators to determine the influence of privacy on anti-competitive behavior. The recent CCI Telecom Report[17] also presented illustrations of abusive conduct, including low privacy standards. Such conduct may involve a lack of consumer choice, lower standards of data protection that could indicate exclusionary behavior, and the leveraging of a data advantage across various services.

IV. ROLE OF CONSENT THROUGH THE CCI INVESTIGATIONS

Unlike Europe, India does not have exhaustive data protection guidelines. The current Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (hereinafter “SPDI Rules”)[18] and the Information Technology Act, 2000[19] (hereinafter “IT Act”) are insufficient to provide a legal basis for initiating antitrust scrutiny from an exploitative-privacy-policy angle. They provide only broad compliance requirements for the collection of sensitive personal data. Instead, the WhatsApp CCI investigation finds that data sharing by WhatsApp with Facebook amounts to a degradation of non-price parameters of competition. This conduct prima facie amounts to the imposition of unfair terms and conditions upon the users of WhatsApp.

The CCI previously investigated WhatsApp in 2016[20] and 2020.[21] The 2016 investigation resembles the current investigation since it also involved a change in WhatsApp’s privacy policy.[22] The change in policy allowed data sharing between WhatsApp and Facebook, though it allowed users to opt out of this data sharing within thirty days. In 2016, the CCI decided there was no abuse of dominance, since the ‘opt-out’ made data sharing optional and there were legitimate purposes for sharing the data with Facebook-owned companies.[23] The legitimate purposes included using the data to improve user and product experience and overall cybersecurity. However, the German FCO rejected these same justifications,[24] finding that Facebook’s interest in processing data according to its terms and conditions did not ultimately outweigh users’ interests.

Some researchers argue that the 2016 and 2020 CCI investigations established ‘implicit’ and ‘explicit’ user-choice standards for determining unfairness:[25] the implicit element is the user’s ability to opt out of data sharing without limiting their access to the service, while the explicit standard involves taking away additional choices that would otherwise have been available to users, resulting in an unfair imposition on them. The 2021 policy contravenes both of these standards and thus provides a sufficient basis for the CCI to decide against WhatsApp. However, establishing anti-competitive behavior solely on the regulator’s user-choice standard of consent limits the Commission’s analysis of the repercussions of exploitative privacy policies to consent-based findings. This allows privacy policies with far-reaching implications, such as those that impose unreasonable time limits for storing data or that collect data for vague purposes, to slip through the cracks, especially if they merely fulfill the voluntary consent requirement.

V. CONCLUSION

In India, the scope for including privacy considerations is limited in the competition law legislative framework. The need of the hour is implementing robust data protection principles that have been envisaged in the 2021 Data Protection Bill. Further, addressing the collection of data that contributes to the unequal bargaining power of big tech companies might require an explicit inclusion of such provisions. Hence, instead of data protection standards playing catch up with the competition regulator’s findings, a clear framework for handling data with guidelines on formulating privacy policies will address the lacunae in the existing privacy law framework. Besides directing companies towards adopting better privacy policies, it would also facilitate anti-competitive behavior analysis. Having recognized the intersection of privacy policies and competition law, this article offers insights into the current CCI investigation and the impact of framing privacy policies on anti-competitive behavior. There are significant international differences in approaches to data protection and competition policy, and competition authorities worldwide differ in their mandate and the scope of their competition laws. Thus, applying global best practices in framing privacy policies will harmonize the application of legislative provisions specific to jurisdictions.

 

* Shravya Devaraj and Rohit Gupta are final year law students at West Bengal National University of Juridical Sciences, Kolkata.

[1] The German FCO is Germany’s national competition regulatory agency.

[2] Bundeskartellamt v. Facebook, Case KVR 69/19 (June, 2020).

[3] Re: Updated Terms of Service and Privacy Policy for WhatsApp Users, Suo Moto Case No. 01 of 2021.

[4] We updated our Terms of Service and Privacy Policy, WhatsApp (Jan. 2021), https://faq.whatsapp.com/5623935707620435/?locale=en_US.

[5] Processing shall be lawful only if and to the extent that at least one of the following applies:

processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child.

[6] Lina M. Khan, Amazon’s Antitrust Paradox, 126 Yale L.J. 710 (2017).

[7] Maurice Stucke, Allen Grunes, Big Data and Competition Policy, Oxford University Press (2016).

[8] Case No. AT.40684.

[9] CMA investigates Facebook’s use of ad data, (June 4, 2021), https://www.gov.uk/government/news/cma-investigates-facebook-s-use-of-ad-data.

[10] Ministry of Corporate Affairs, Report of Competition Law Review Committee, (July 2019), https://www.ies.gov.in/pdfs/Report-Competition-CLRC.pdf.

[11] Japan Fair Trade Commission, The Guidelines for Exclusionary Private Monopolization under the Antimonopoly Act, (2009).

[12] Federated Learning of Cohorts (“FLoC”), https://privacysandbox.com/intl/en_us/proposals/floc.

[13] Competition & Market Authority, Investigation into Google’s ‘Privacy Sandbox’ browser changes, (2021).

[14] Case Number 50972, Decision to accept commitments offered by Google in relation to its Privacy Sandbox Proposals.

[15] Legal Process Guidelines: Government & Law Enforcement, Apple, https://www.apple.com/legal/privacy/law-enforcement-guidelines-us.pdf.

[16] Epic Games, Inc. v. Apple Inc., 559 F. Supp. 3d 898 (N.D. Cal. 2021).

[17] CCI Workshop on Competition Issues in the Telecom Sector in India (February 2021).

[18] Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, (2011).

[19] The Information Technology Act, (2000).

[20] In Re: Shri Vinod Kumar Gupta and Whatsapp, Case No. 99, (2016).

[21] In Re: Harshita Chawla and Whatsapp Inc., Facebook Inc., Case No. 15, (2020).

[22] In Re: Shri Vinod Kumar Gupta and Whatsapp, Case No. 99, ¶ 14 (2016).

[23] In Re: Shri Vinod Kumar Gupta and Whatsapp, Case No. 99, ¶ 15 (2016).

[24] Bundeskartellamt v. Facebook, Case KVR 69/19, (June, 2020).

[25] Centre for Internet & Society, The Competition Law Case Against WhatsApp’s 2021 Privacy Policy Alteration (March 2021), https://cis-india.org/internet-governance/blog/the-competition-law-case-against-whatsapp2019s-2021-privacy-policy-alteration.

Image Source: http://www.nlujlawreview.in/integrating-data-protection-and-competition-law-the-why-the-how-and-the-way-forward/

Protecting Data Privacy in the Post-Dobbs Era

By Taylor M. Sorrells

Since the recent U.S. Supreme Court decision in Dobbs v. Jackson Women’s Health Organization, ending the federal constitutional protection of abortion rights,[1] there has been legal uncertainty surrounding how to protect personal data held by third-party apps and software.[2] Specifically, some commentators are concerned that prosecutors in states that have criminalized abortion will access electronic data such as healthcare records, geolocation data, phone call and text message records, and financial statements to prosecute those suspected of obtaining abortions.[3]

The U.S. Constitution provides very limited protection for data privacy. While the Fourth Amendment offers general protections against unreasonable searches and seizures by government officials,[4] most seizures of data in the abortion context will not be barred by the Fourth Amendment, so long as the prosecutor seeking to access the data obtains a search warrant.[5] Further, the Supreme Court has recognized that, in some cases, law enforcement can subpoena data from third parties without triggering the requirements of the Fourth Amendment, thus requiring no warrant at all.[6]

While the constitutional protections for data privacy are rather weak, several federal privacy statutes may offer some protection for those seeking reproductive healthcare.[7] However, there is only limited potential for current federal privacy laws to protect the majority of individuals seeking abortions in states that have criminalized the procedure because most of these laws include law enforcement exceptions, which enable third parties to disclose data to law enforcement officials without consumer consent.[8]

In an executive order dated July 8, 2022, the Biden administration, seeking to strengthen federal protections for abortions, ordered the Federal Trade Commission (FTC) chair to “consider actions, as appropriate and consistent with applicable law . . . to protect consumers’ privacy when seeking information about and provision of reproductive healthcare services.”[9] Under the FTC Act, the FTC has broad authority to address consumer privacy violations by diverse entities.[10] Similarly, President Biden instructed the Secretary of the Department of Health and Human Services (HHS) to strengthen existing privacy protections for individuals seeking reproductive healthcare.[11]

While the order signals the Biden administration’s interest in preserving data privacy rights within the reproductive healthcare context, for abortion advocates, it will not likely provide sufficient protection because of the law enforcement exceptions that are written into most of the statutes from which these executive agencies derive their authority.[12] Further, the Health Insurance Portability and Accountability Act (HIPAA), which HHS administers, has a privacy rule which specifically allows abortion providers to share information with law enforcement, and some state laws require sharing under certain circumstances.[13] Because of these weaknesses in current federal law, commentators believe that absent a change in law or a new rulemaking, patients’ sensitive data will remain at risk.[14]

Post-Dobbs, members of Congress have proposed several bills intended to preserve reproductive health data rights.[15] One of the strongest bills proposed so far is the My Body, My Data Act, which would require the FTC to enforce a national privacy standard for reproductive health data collected by third-party apps, cell phones, and search engines.[16] The bill has been introduced in both the House and Senate, but with the Senate deeply divided on the issue of abortion, its future remains uncertain.[17]

Without stronger federal privacy laws, in most cases, prosecutors in states that have criminalized the procedure will be able to use suspected abortion-seekers’ electronically stored data as evidence against them in criminal prosecutions. For many commentators, new federal laws governing data protection have been long overdue.[18] However, in the post-Dobbs era, abortion advocates’ calls for increased data privacy protection have taken on a new urgency.

Keep an eye on this evolving area of the law.

 

[1] 142 S. Ct. 2228 (2022) (overruling Roe v. Wade, 410 U.S. 113 (1973), and Planned Parenthood v. Casey, 505 U.S. 833 (1992)).

[2] Abby Vesoulis, How a Digital Abortion Footprint Could Lead to Criminal Charges—And What Congress can do About it, Time (May 10, 2022, 4:20 PM), https://time.com/6175194/digital-data-abortion-congress.

[3] Id.

[4] U.S. Const. amend. IV.

[5] Riley v. California, 573 U.S. 373, 377 (2014) (holding that police must obtain a warrant before searching a seized cell phone).

[6] United States v. Miller, 425 U.S. 435, 446 (1976) (upholding the warrantless subpoena of bank records); Smith v. Maryland, 442 U.S. 735, 745–46 (1979) (ruling that law enforcement did not need a warrant to access phone records).

[7] See, e.g., Health Insurance Portability and Accountability Act of 1996, Pub. L. No. 104-191, 110 Stat. 1936 (imposing federal privacy requirements upon entities within the healthcare industry); Gramm-Leach-Bliley Act, Pub. L. No. 106-102, 113 Stat. 1338 (1999) (requiring entities within the banking industry to protect customer data); Stored Communications Act, 18 U.S.C. §§ 2701–2712 (1986) (addressing data privacy within electronic communications); Privacy Act of 1974, 5 U.S.C. § 552a (safeguarding data privacy within federal agencies).

[8] Chris D. Linebaugh, Cong. Rsch. Serv., LSB10786, Abortion, Data Privacy, and Law Enforcement Access: A Legal Overview, at 3 (2022).

[9] Exec. Order No. 14076, 87 FR 42053 (2022).

[10] See Federal Trade Commission Act of 1914, 15 U.S.C. §§ 41-58.

[11] Exec. Order No. 14076.

[12] Linebaugh, supra note 8.

[13] Allie Reed & Christopher Brown, Abortion Privacy Push Pits Biden Against Criminal Laws in States, Bloomberg Law (Aug. 3, 2022, 5:45 AM), https://www.bloomberglaw.com/bloomberglawnews/health-law-and-business/XEHPIBOK000000?bna_news_filter=health-law-and-business#jcite.

[14] Id.

[15] E.g., Stop Anti-Abortion Disinformation Act, S. 4469, 117th Cong. (2022) (directing the FTC to prescribe rules prohibiting disinformation in advertisements for abortion services); Health and Location Data Protection Act, S. 4408, 117th Cong. (2022) (prohibiting data brokers from selling and transferring certain sensitive data).

[16] H.R. 8111, 117th Cong. (2022).

[17] Nik Popli & Vera Bergengruen, Lawmakers Scramble to Reform Digital Privacy After Roe Reversal, Time (July 1, 2022, 12:44 PM), https://time.com/6193224/abortion-privacy-data-reform.

[18] See, e.g., Cameron F. Kerry, Why Protecting Privacy is a Losing Game Today—and how to Change the Game, Brookings (July 12, 2018), https://www.brookings.edu/research/why-protecting-privacy-is-a-losing-game-today-and-how-to-change-the-game.

 

Image Source: https://www.nbcnews.com/tech/security/abortion-clinics-providers-digital-privacy-roe-overturn-rcna30654

 

AI: Artificial or Artistic Intelligence?

By Austin Wade-Vicente

“We see this technology as an engine for the imagination,” emphatically stated David Holz, creator of the popular online AI art generation program Midjourney.[1] It is the same program that tabletop-game creator Jason Allen used this past month to win first place in digital art at the Colorado State Fair, ahead of 20 other artists.[2] Allen’s above-pictured “Théâtre D’opéra Spatial” is undoubtedly an appealing work of art, but, in the era of these AI-generated masterpieces, who can legally claim ownership of this blue-ribbon piece?

Growing Pains: Sales Tax on Retail Transactions Involving Cryptocurrency in Virginia

By Owen Giordano

 

In just over a decade, cryptocurrency has radically altered our society’s notion of currency.[1] With a growing number of United States (US) citizens holding onto cryptocurrency, many states are at an impasse as to how they should collect tax on the transactions made with the medium.[2] However, under Virginia’s sales tax statute, the state is allowed to levy a tax on transactions.[3] The scope of this tax is broad and allows taxation on transactions done with currency or through bartering (i.e., property for property).[4] As such, retail sales transactions involving cryptocurrency are likely taxable in the state of Virginia, and that retailer would be obligated to collect the tax on Virginia’s behalf.[5] This development would lead to multiple issues for retailers in Virginia.

Foundational knowledge of this topic is necessary before discussion. To start, cryptocurrency refers broadly to a decentralized, digital currency.[6] This medium lacks any sort of centralized oversight that traditional currencies have, with cryptocurrency transactions recorded on a digital register known as a blockchain.[7] A blockchain is best described as a “ledger” that records and tracks the transactions of all assets (tangible and intangible) done with a specific cryptocurrency.[8] Further, a cash equivalent refers to any form of investment security that can be readily liquidated (turned into cash), such as checks.[9] Put bluntly, a cash equivalent is any medium that works like cash in a transactional setting. Finally, and importantly, this blog assumes that, because cryptocurrency is a medium that can be readily liquidated and is ultimately designed as a substitute for cash, Virginia designates cryptocurrency as a cash equivalent.[10]

Through both the relevant sales tax statute and the choice to label cryptocurrency as a cash equivalent, transactions conducted with cryptocurrency would assuredly be taxable, and retailers would be obligated to collect the associated sales tax.[11] While this creates various possibilities, the decision is not without issues.

To begin, cryptocurrency on paper seems more portable than tangible currency due to the medium’s digital (and thus intangible) nature.[12] The closest analogy would be making an expensive purchase via check or card payment, in that those mediums save consumers from the hassle of carrying thousands and thousands of coins or dollars. However, the ability to “carry” cryptocurrency has higher barriers to access than carrying cash. As a digital currency, the use of cryptocurrency requires some sort of digital device with internet access to engage in a transaction.[13] Conversely, access to such devices and services is not needed for all-cash transactions because of cash’s tangible nature. While over ninety percent of US adults have access to the internet, adding such hurdles makes cryptocurrency a less efficient medium than tangible currencies.[14] One could argue that this issue applies to check or card payments as well. While true, the tangible nature of check payments does not impose the additional requirement of constant access to digital devices or the internet. Simply put, the current business infrastructure in Virginia (and the country at large) remains lacking for transactions involving cryptocurrency. Therefore, to fully benefit from the use of cryptocurrency, investment in blockchain technology, as well as the ability to access said platforms more readily and freely, is needed.

Secondly, there is the question of cryptocurrency’s acceptability as a form of payment. Cryptocurrency is notable for its oscillating value.[15] This is partly due to the medium’s decentralized nature, as there is no regulatory body helping to stabilize the medium.[16] The unstable value contributes to businesses’ apprehension toward accepting cryptocurrency, as a profit could turn into a loss within the span of a day.[17] In the realm of sales tax collection obligations, the oscillating values raise concerns as to when the tax should be collected and what a retailer should do about its tax collection obligation when there is a swift and dramatic change in the value of a cryptocurrency.[18] As such, Virginia would need to develop policy to address this key issue.
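To make the timing concern concrete, consider a purely hypothetical arithmetic sketch in Python. The 5.3% rate, coin prices, and sale amount below are illustrative assumptions, not a statement of Virginia law: the tax owed is fixed in dollars at the moment of the sale, but cryptocurrency collected to cover it can lose value before the retailer remits the tax.

    sale_price_usd = 100.00
    tax_rate = 0.053                              # illustrative sales tax rate (assumption)
    tax_due_usd = sale_price_usd * tax_rate       # $5.30 owed to the state, fixed at checkout

    coin_price_at_sale = 50.00                    # hypothetical USD value of one coin at checkout
    coins_held_for_tax = tax_due_usd / coin_price_at_sale    # 0.106 coins set aside

    coin_price_at_remittance = 35.00              # hypothetical value when the tax is remitted
    value_when_remitted = coins_held_for_tax * coin_price_at_remittance   # about $3.71

    shortfall = tax_due_usd - value_when_remitted # retailer must cover roughly $1.59 out of pocket
    print(round(shortfall, 2))

A retailer that immediately converts the collected coins to dollars avoids this particular risk, which is one reason guidance on the timing of collection and conversion would matter.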

To conclude, cryptocurrency offers potential as a cash equivalent. Its promise of decentralization and minimal regulatory interference offers many valid reasons for its adoption and use in transactions. However, due to the medium’s novel nature, there is still much planning and development needed to support its use. As such, should cryptocurrency qualify as a cash equivalent, investment in the appropriate technological infrastructure would be necessary for Virginia to reap the benefits this medium offers.

 

[1] “Cryptocurrency” is used in this blog post to refer to the broad concept of cryptocurrency, rather than to a particular type of cryptocurrency.

[2] Eswar S. Prasad, Are Cryptocurrencies the Future of Money?, EconoFact (Oct. 19, 2021), https://econofact.org/are-cryptocurrencies-the-future-of-money#:~:text=Cryptocurrencies%20have%20captured%20the%20public,end%20from%20a%20societal%20perspective; Cryptocurrency Sales and Use Tax by State, The Bureau of National Affairs, Inc. https://pro.bloombergtax.com/brief/cryptocurrency-tax-laws-by-state/  (last updated: Nov. 22, 2021).

[3] For the purposes of this paper, a “transaction” concerns the retail sale of a taxable good or service. Other types of transactions, such as exchanges for cash equivalents or non-tangible goods, exist, with different states adopting different views on the taxability of such transactions. See Casey W. Baker et al., U.S. State Taxation of Cryptocurrency-Involved Transactions: Trends & Considerations for Policy Makers, 75 Tax Law. 601, 625-26 (2022); Va. Code § 58.1-603 (authorizing sales tax).

[4] Va. Code §§ 58.1-602, 58.1-603 (definitions, authorization of a sales tax for both transactions using currency or property).

[5] Va. Code § 58.1-612.

[6] Kate Ashford, What Is Cryptocurrency?, Forbes, https://www.forbes.com/advisor/investing/cryptocurrency/what-is-cryptocurrency/ (Jun. 6, 2022).

[7] Id.

[8] What is Blockchain Technology?, IBM, https://www.ibm.com/topics/what-is-blockchain#:~:text=Blockchain%20for%20Dummies%22-,Blockchain%20overview,patents%2C%20copyrights%2C%20branding (last visited: Aug. 18, 2022).

[9] James Chen et al., Cash Equivalents, Dotdash Meredith, https://www.investopedia.com/terms/c/cashequivalents.asp#:~:text=Cash%20equivalents%20are%20the%20total,are%20the%20most%20liquid%20assets (last updated: Nov. 27, 2020).

[10] See Eswar S. Prasad, Are Cryptocurrencies the Future of Money?, EconoFact (Oct. 19, 2021), https://econofact.org/are-cryptocurrencies-the-future-of-money#:~:text=Cryptocurrencies%20have%20captured%20the%20public,end%20from%20a%20societal%20perspective (contemplating the effects of using cryptocurrency in a transaction setting, in a manner similar to most cash equivalents); Paulina Likos & Coryanne Hicks, The History of Bitcoin, the First Cryptocurrency, U.S. News & Report, L.P. (Feb. 4, 2022), https://money.usnews.com/investing/articles/the-history-of-bitcoin (mentioning Bitcoin’s, a cryptocurrency, use in transactional settings in a manner similar to cash equivalents); Nathaniel Popper, Bitcoin Has Lost Steam. But Criminals Still Love It, N.Y. Times (Jan. 28, 2020), https://www.nytimes.com/2020/01/28/technology/bitcoin-black-market.html (criminals using cryptocurrencies in transactional settings, further comparison of the medium to cash equivalents).

[11] Va. Code §§ 58.1-602, 58.1-603, 58.1-612.

[12] Kate Ashford, What Is Cryptocurrency?, Forbes, https://www.forbes.com/advisor/investing/cryptocurrency/what-is-cryptocurrency/ (Jun. 6, 2022).

[13] The Basics about Cryptocurrency, State University of New York at Oswego, https://www.oswego.edu/cts/basics-about-cryptocurrency (last visited: Aug. 28, 2022).

[14] Internet/Broadband Fact Sheet, Pew Research Center, https://www.pewresearch.org/internet/fact-sheet/internet-broadband/.

[15] Eswar S. Prasad, Are Cryptocurrencies the Future of Money?, EconoFact (Oct. 19, 2021), https://econofact.org/are-cryptocurrencies-the-future-of-money#:~:text=Cryptocurrencies%20have%20captured%20the%20public,end%20from%20a%20societal%20perspective

[16] Id.

[17] Ryan Haar, You Can Buy More Things Than Ever With Crypto. Here’s Why You Shouldn’t, NextAdvisor, LLC (May 3, 2022), https://time.com/nextadvisor/investing/cryptocurrency/should-you-use-crypto-like-cash/ (noting that only 20% of people would use cryptocurrency as a cash substitute).

[18] Va. Code § 58.1-612.

 

Image source: https://time.com/nextadvisor/investing/cryptocurrency/should-you-use-crypto-like-cash/

AI Cannot Get Patents…Yet

By Grayson Walloga

The recent decision in Thaler v. Vidal held that an artificial intelligence (“AI”) could not obtain a patent for its creations.[1] Thaler’s AI, DABUS, generated patentable inventions without any direct contribution from Thaler himself. He attempted to secure patent protection on his AI’s behalf for two such inventions in 17 jurisdictions all across the world.[2] The United States Patent and Trademark Office (PTO) denied these patents and claimed that a machine does not qualify as an inventor.[3] Thaler brought his case to court, but the court ended up siding with the PTO. He appealed his case, but the Court of Appeals for the Federal Circuit affirmed the lower court’s decision.[4]

In its analysis, the court noted the specific language used in both the Patent Act and the Dictionary Act. The Patent Act defines an inventor as “the individual or, if a joint invention, the individuals collectively who invented or discovered the subject matter of the invention.”[5] Since this act failed to provide a definition for “individual,” the court looked to the Dictionary Act, which observed a distinction between individuals and non-human entities such as corporations, associations, and societies.[6] Additionally, the Supreme Court had defined “individual” in prior cases as something that “ordinarily means a human being, a person.”[7]

Thaler attempted several different arguments for why his AI should be allowed to get a patent. He pointed out that DABUS already had a patent in another country.[8] The South African Patent Office granted the AI a patent for its application relating to a “food container based on fractal geometry.” [9] This shocking action by South Africa, however, had little effect in the United States apart from serving as a conversation starter. The Court of Appeals for the Federal Circuit explained that this did nothing for DABUS’s patent application in the United States because “[t]his foreign patent office was not interpreting our Patent Act.”[10] Australia went in a different direction following the South African patent grant.[11] Justice Jonathan Beach of the Federal Court of Australia ruled that AI fell within the scope of “inventor,” but it could not be an applicant or a grantee of a patent.[12]

Thaler tried to convince the skeptical American court that “inventor” should include AI because it would encourage innovation and public disclosure.[13] The court once again dismissed his claim as mere speculation that lacked a basis in any relevant text.[14] Thaler’s contention may be irrelevant in deciding what the Patent Act says, but it remains a good policy question for possible legislative change. The promise of a patent may have little effect on an AI’s motivation to create new things, but the same cannot be said of the person who created that AI.[15] Inventors could create something like DABUS and use it to help them invent new and useful technologies – resulting in more innovation for society.[16] The court did not completely stamp out Thaler’s hope for more innovation. Its decision was only meant to clarify the definition of inventor under the Patent Act. It did not suggest that inventions made by human beings with the assistance of AI are not eligible for patent protection.[17]

But, of course, most people have a fearful outlook towards AI.[18] Many believe that AI could replace them in their jobs or that AI will be relied upon too much in the future. The world would hardly have need of a Thomas Edison toiling away in some lab running experiments all day. An AI could handle all the calculations and simulations so long as its creator properly sets the parameters. The main obstacle for the inventor who wants a patent but uses an AI’s assistance would be the standard for obviousness under the Patent Act. Perhaps an AI generates some formula for success after analyzing scores of data. Would that still be considered obvious even though it might be impractical for an expert in that field to do the very same?[19] If more inventors start using AI, would the obviousness standard be relative to AI or still just to normal human experts? Businesses continue to accelerate their AI adoption plans, which indicates that these questions will not go away anytime soon.[20] But those of us who did not miss the point of the Terminator franchise can at least take solace in knowing that the decision in Thaler v. Vidal means AI cannot get patents…yet.

 

 

 

[1] Thaler v. Vidal, Appeal No. 2021-2347 (Fed. Cir. Aug. 5, 2022).

[2] Utkarsh Patil, India: South Africa Grants A Patent With An Artificial Intelligence (AI) System As The Inventor – World’s First!!, Mondaq (Oct. 19, 2021), https://www.mondaq.com/india/patent/1122790/south-africa-grants-a-patent-with-an-artificial-intelligence-ai-system-as-the-inventor-world.

[3] Thaler v. Vidal.

[4] Id.

[5] 35 U.S.C. § 100(f) (2012).

[6] Thaler v. Vidal.

[7] Mohamad v. Palestinian Auth., 566 U.S. 449, 454 (2012).

[8] Thaler v. Vidal.

[9] Patil, supra note 2.

[10] Thaler v. Vidal.

[11] Patil, supra note 2.

[12] Id.

[13] Thaler v. Vidal.

[14] Id.

[15] See Ryan Abbott, The Artificial Inventor Project, WIPO Magazine (Dec. 2019), https://www.wipo.int/wipo_magazine/en/2019/06/article_0002.html.

[16] Id.

[17] Thaler v. Vidal.

[18] How Americans think about artificial intelligence, Pew Research Center (Mar. 17, 2022), https://www.pewresearch.org/internet/2022/03/17/how-americans-think-about-artificial-intelligence/.

[19] See Derek Lowe, AI, and the Patent System, Science (June 8, 2022), https://www.science.org/content/blog-post/ai-and-patent-system.

[20] Joe McKendrick, AI Adoption Skyrocketed Over the Last 18 Months, Harvard Business Review (Sept. 27, 2021), https://hbr.org/2021/09/ai-adoption-skyrocketed-over-the-last-18-months.

Image source: https://thenextweb.com/news/why-ai-systems-should-be-recognized-as-inventors

Epic Apple Fight: Round 2 & 3

By Drew Apperson

 

In a 2020 blog post, I summarized some of the issues between Fortnite developer Epic Games, Inc. and the respective app markets of Apple and Google. The feud led to the then-pending lawsuit between Epic Games, Inc. and Apple, Inc. concerning allegedly anticompetitive policies of the App Store.[1] In September of last year, the United States District Court for the Northern District of California ruled in Apple’s favor.[2] However, Apple and Google are far from being in the clear.

Dismayed by the District Court’s holding, “[n]early 40 law, business and economics academics” filed an amicus brief in the Court of Appeals for the Ninth Circuit this past January “arguing the [district court] judge wrongly accepted Apple’s justifications that restrictions on third-party app distribution are necessary to protect users.”[3] The brief hit on various flaws it saw in the District Court’s analysis, such as:

[T]he court could have concluded that, on balance, Apple’s restraints were anticompetitive. Short-circuiting the analysis at an earlier stage prevented the court from assessing the ultimate competitive effects under the Rule of Reason, as courts have done for the past 45 years. The court erred in not balancing harms and benefits.[4]

Meanwhile, the federal legislature has been working to combat anticompetitive policies in app markets. The United States Senate introduced S. 2710, the Open App Markets Act, on August 11, 2021. The Congressional Research Service’s bill summary describes the bill as follows:

The bill prohibits a covered company from (1) requiring developers to use an in-app payment system owned or controlled by the company as a condition of distribution or accessibility, (2) requiring that pricing or conditions of sale be equal to or more favorable on its app store than another app store, or (3) taking punitive action against a developer for using or offering different pricing terms or conditions of sale through another in-app payment system or on another app store.

A covered company may not interfere with legitimate business communications between developers and users, use non-public business information from a third-party app to compete with the app, or unreasonably prefer or rank its own apps (or those of its business partners) over other apps.[5]

Just last week at the Global Privacy Summit in Washington, D.C., Apple CEO Tim Cook was reported as “slamming” the proposed legislation, arguing that unvetted apps would have profound consequences, such as developers circumventing Apple’s privacy rules and putting users at risk.[6] The bill for the Open App Markets Act cleared the Senate Judiciary Committee earlier this year.[7]

As the threat of private suits continues, and as the Congressional bill continues to progress, Apple’s App Store and Google’s Play Store will likely remain under the microscope for the foreseeable future.

 

[1] Drew Apperson, An Epic Apple Fight, Rich. J.L. & Tech. Blog (Dec. 25, 2020), https://jolt.richmond.edu/2020/12/25/an-epic-apple-fight/.

[2] Epic Games, Inc. v. Apple Inc., No. 4:20-cv-05640-YGR, 2021 U.S. Dist. LEXIS 172303 (N.D. Cal. Sep. 10, 2021).

[3] Bryan Koenig, Apple Can’t Hide Behind Privacy In Epic Fight, 9th Circ. Told, Law360 (Jan. 27, 2022, 7:17 PM), https://www.law360.com/articles/1459311.

[4] Brief of Amici Curiae: Law, Economics, and Business Professors in Support of Appellant/Cross-Appellee at 38, Epic Games, Inc. v. Apple, Inc., No. 21-16695 (9th Cir. Jan. 27, 2022).

[5] S.2710 – 117th Congress (2021-2022): Open App Markets Act, S.2710, 117th Cong. (2022), https://www.congress.gov/bill/117th-congress/senate-bill/2710.

[6] Ben Kochman, Apple CEO Claims Antitrust App Store Laws Will Hurt Privacy, Law360 (Apr. 12, 2022, 8:27 PM EST), https://www.law360.com/ip/articles/1482972/apple-ceo-claims-antitrust-app-store-laws-will-hurt-privacy.

[7] S.2710 – 117th Congress (2021-2022): Open App Markets Act, S.2710, 117th Cong. (2022), https://www.congress.gov/bill/117th-congress/senate-bill/2710.

Image source: https://wordpress.org/openverse/image/7edd602f-4fd2-4e31-8ace-dd29151a663b

 

How Will the I.R.S. Tax Staking Rewards? Examining Jarrett v. United States

By Merritt Francis

 

In prior blog posts, I’ve written on blockchain technology, its future implementation into the private and public sectors, and NFTs.[1] Blockchain technology is a decentralized, distributed ledger that records transactions. It may be helpful to think of blockchains like a bank ledger, where computers record the transactions rather than individuals. Blockchains are “decentralized” because there is no central authority over the data stored on them; transactions are recorded by a peer-to-peer network run by participant “nodes.” Blockchains are “distributed” because all transactions are viewable by the public. Once a transaction is recorded on a blockchain, it is unmodifiable.
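As a rough illustration of why a recorded transaction is so hard to modify, the following toy sketch (written in Python purely for illustration; it is not any real blockchain’s implementation, and the transactions are invented) chains blocks together by hashing each block’s contents along with the previous block’s hash, so altering an earlier entry invalidates every hash that follows it:

    import hashlib
    import json

    def block_hash(prev_hash, transactions):
        # Hash the previous block's hash together with this block's transactions.
        payload = json.dumps({"prev": prev_hash, "txs": transactions}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    # A toy three-block "ledger" (hypothetical transactions).
    genesis = block_hash("0" * 64, ["alice pays bob 5"])
    block_2 = block_hash(genesis, ["bob pays carol 2"])
    block_3 = block_hash(block_2, ["carol pays dan 1"])

    # Tampering with the first block's transactions produces a different hash,
    # which no longer matches the "prev" value baked into every later block.
    tampered = block_hash("0" * 64, ["alice pays bob 500"])
    assert tampered != genesis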

Transactions are recorded on a blockchain by participant “nodes,” which are computers that complete consensus algorithms, allowing the transaction to be recorded on the blockchain. Once an individual submits a transaction on a blockchain, it is recorded only after a participant node satisfies the blockchain’s respective consensus algorithm. There are two main examples of consensus algorithms blockchains use today: proof of work (PoW) and proof of stake (PoS). I will first explain the PoW and PoS consensus algorithms, then examine the tax dispute Joshua Jarrett found himself in after engaging with a PoS blockchain.

PoW (Proof of Work) Consensus Algorithms

Bitcoin and Ethereum, two widely adopted blockchain technologies, employ a proof of work (PoW) consensus algorithm.[2] Under a PoW consensus algorithm, participant nodes (crypto miners) download the full chain and repeatedly run candidate blocks through a mathematical function until the required output is found. Once a node has satisfied the PoW consensus algorithm, it is rewarded by “mining” a block, which is the act of adding a valid block onto the blockchain. For example, the hash for Bitcoin block #660000, mined on December 4, 2020, is 00000000000000000008eddcaf078f12c69a439dde30dbb5aac3d9d94e9c18f6. The block reward for that successful hash was 6.25 BTC.[3]
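To make the mining step more concrete, here is a minimal proof-of-work sketch in Python. It is not Bitcoin’s actual mining algorithm (real miners hash binary block headers against a far harder difficulty target), but it captures the basic idea: keep trying nonces until the hash of the block data starts with enough leading zeros.

    import hashlib

    def mine(block_data, difficulty=4):
        # Search for a nonce whose SHA-256 hash of (block_data + nonce)
        # begins with `difficulty` leading zeros.
        target = "0" * difficulty
        nonce = 0
        while True:
            digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
            if digest.startswith(target):
                return nonce, digest
            nonce += 1

    nonce, digest = mine("toy block containing some transactions")
    print(nonce, digest)

Each additional leading zero makes the search roughly sixteen times harder (a 1-in-16 chance per hexadecimal digit), which is why redoing the work for an already-mined block, and for every block after it, quickly becomes infeasible.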

PoW makes it extremely difficult to alter any aspect of the blockchain, since such an alteration would require re-mining all subsequent blocks.  Unfortunately, however, PoW is bad for the environment because it uses up an immense amount of computing power.  Ethereum’s PoW consumes 73.2 TWh (terawatt-hour) annually, which is the energy equivalent of a medium-sized country like Austria.[4]  Tesla suspended vehicle purchases using Bitcoin due to climate change concerns.[5]  Elon Musk elaborated on the decision in a tweet, saying “Cryptocurrency is a good idea on many levels and we believe it has a promising future, but this cannot come at great cost to the environment.”[6]

PoS (Proof of Stake) Consensus Algorithms

In response to the PoW blockchains’ negative externalities, the proof of stake (PoS) consensus algorithm was created as an environmentally friendly alternative. PoS “doesn’t rely on expensive hardware using vast amounts of electricity to compute mathematical puzzles,” like bitcoin’s PoW system.[7] Under a PoS consensus algorithm, owners of a cryptocurrency offer their coins to nodes as collateral, and those nodes have a chance to validate new blocks.[8] Offering coins as collateral to participant nodes is the process of “staking” your cryptocurrency.[9] Coin owners with staked coins become “validators,” which entitles them to cryptocurrency rewards for each new block their node validates.[10] This is what happened in Jarrett v. United States.[11]
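The details vary across PoS chains, but the core selection mechanic can be sketched in a few lines of Python: the chance that a given validator is chosen to add the next block is weighted by the amount of cryptocurrency staked with it. The validator names and stake amounts below are hypothetical.

    import random

    # Hypothetical validators and the amount of cryptocurrency staked with each.
    stakes = {"validator_a": 5000, "validator_b": 3000, "validator_c": 2000}

    def pick_validator(stakes):
        # Stake-weighted random choice: validator_a is chosen about half the time here.
        validators = list(stakes.keys())
        weights = list(stakes.values())
        return random.choices(validators, weights=weights, k=1)[0]

    chosen = pick_validator(stakes)
    print(f"{chosen} validates the next block and earns the staking reward")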

Jarrett v. United States of America

During 2019, a Nashville couple, Joshua and Jessica Jarrett, received 8,876 tezos (XTZ) in staking rewards.  The tezos coins were worth $9,407 when the Jarretts received them, and they reported $9,407 as income and paid the related taxes.[12]

On July 31, 2020, the Jarretts filed an amended tax return demanding a $3,793 refund from the IRS. Under IRS Notice 2014-21, virtual currencies are considered property for federal tax purposes.[13] And, pursuant to § 1001(a) of the IRC (Computation of gain or loss), the gain from the sale or other disposition of property is the excess of the amount realized over the adjusted basis.[14] As such, the Jarretts argued the virtual currency they received as staking rewards did not amount to taxable income, because property is only taxed when it is sold or otherwise disposed of, not when the property is created.[15]
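The competing treatments can be illustrated with a short, back-of-the-envelope calculation in Python. The $9,407 figure is the value the Jarretts reported; the later sale price and the zero basis below are hypothetical assumptions used only to sketch the parties’ positions, not IRS guidance.

    # Value of the 8,876 XTZ when received as staking rewards in 2019.
    value_at_receipt = 9_407.00

    # Treatment under which the Jarretts originally filed:
    # the fair market value of the rewards is income in the year received.
    income_in_2019 = value_at_receipt

    # The Jarretts' amended-return theory: newly created property is not income
    # until it is sold or otherwise disposed of, so tax is deferred until a sale,
    # with gain computed under I.R.C. § 1001(a).
    hypothetical_sale_price = 12_000.00   # assumed later sale price, for illustration
    adjusted_basis = 0.00                 # assumed basis in self-created property
    gain_on_later_sale = hypothetical_sale_price - adjusted_basis

    print(income_in_2019, gain_on_later_sale)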

The United States Department of Justice ordered the IRS to issue a $3,793 refund to the Jarretts, which they received on February 14, 2022.  The Jarretts, however, refused to accept the refund because the IRS failed to acknowledge the true rationale for issuing the refund.[16]

The rationale behind issuing the refund would provide precedent for other stakers to properly file their income taxes in the future.  So, the Jarretts sought a formal ruling from the United States District Court for the Middle District of Tennessee.  In response, the United States filed a motion to dismiss arguing the Jarretts’ action was moot because the United States fully refunded the claimed overpayment.[17]

In its February 28, 2022 motion to dismiss, the United States stated that “Mootness is ‘the doctrine of standing set in a time frame: The requisite personal interest that must exist at the commencement of the litigation (standing) must continue throughout its existence (mootness).’”[18]  Because the Jarretts received a full refund, the United States argues, their action for a refund is moot.

Taxpayers who engage in staking their cryptocurrencies will likely have to continue to speculate as to how the IRS will approach taxing staking rewards.  One thing is clear: it is time for the I.R.S. to release guidance on the matter.

 

[1] Merritt Francis, You Bought a JPEG File for $69.3 Million – What Are You Allowed To Do With It?, Richmond J. L. Tech. (2021), https://jolt.richmond.edu/2021/10/20/you-bought-a-jpeg-file-for-69-3-million-what-are-you-allowed-to-do-with-it/; Merritt Francis, Blockchain as Best Practice: The Benefits of the Criminal Justice System Implementing Blockchain Technology, Richmond J. L. Tech. (2021), https://jolt.richmond.edu/2022/01/06/blockchain-as-best-practice-the-benefits-of-the-criminal-justice-system-implementing-blockchain-technology/.

[2] Jake Frankenfield, Proof of Work (PoW), Investopedia (July 22, 2021), https://www.investopedia.com/terms/p/proof-work.asp.

[3] Id.

[4] Proof-Of-Work (POW), Ethereum (Mar. 28, 2022), https://ethereum.org/en/developers/docs/consensus-mechanisms/pow/.

[5] Lora Kolodny, Elon Musk says Tesla will stop accepting bitcoin for car purchases, citing environmental concerns, CNBC (May 12, 2021), https://www.cnbc.com/2021/05/12/elon-musk-says-tesla-will-stop-accepting-bitcoin-for-car-purchases.html.

[6] Id.

[7] Rachel-Rose O’Leary, The Creator of Proof-of-Stake Thinks He Finally Figured It Out, CoinDesk (Sept. 13, 2021, 4:21 AM), https://www.coindesk.com/markets/2018/09/07/the-creator-of-proof-of-stake-thinks-he-finally-figured-it-out/.

[8] Id.

[9] Id.

[10] Id.

[11] Jarrett v. United States of America, Docket No. 3:21-cv-00419 (M.D. Tenn. May 26, 2021).

[12] Id.

[13] I.R.S. Notice 2014-21.

[14] I.R.C. § 1001.

[15] Jarrett, Docket No. 3:21-cv-00419.

[16] Id.

[17] Id.

[18] Id.

Image source: https://sftaxcounsel.com/taxation-of-crypto-staking/
