Public Comments for 01/27/2025 Communications, Technology and Innovation
HB1796 - Corporations; creates a regulatory framework for decentralized autonomous organizations.
I am a Director with the Virginia Blockchain Council. We believe it is important that this bill pass, and that it is important to have a legal structure and framework for DAOs, for several reasons. Guidelines for these structures to operate allow accountability and enforcement, and allow DAOs to be recognized by the courts when dealing in contracts. There are also great use cases for this structure, including but not limited to underserved communities that can benefit from having access to DAOs for community-driven initiatives like managing public resources or funding social projects. We also believe that Virginia believes in innovation and in keeping its citizens at the forefront of change, giving the State a competitive edge. We support this bill.
I am against this bill which aims to create a regulatory framework for decentralized autonomous organizations (DAOs) in Virginia.

Regulatory Complexity: Introducing a new legal entity structure like DAOs adds complexity to Virginia's corporate law landscape. This could complicate the regulatory environment, making it harder for businesses and regulators to navigate, especially given the unique nature of DAOs which operate via blockchain technology and smart contracts.

Legal Ambiguity: The integration of smart contracts into corporate governance introduces legal ambiguities. Smart contracts, being self-executing, might not align well with traditional legal frameworks, potentially leading to disputes over interpretation, enforcement, and liability that current laws are not equipped to handle.

Security and Fraud Risks: DAOs, due to their decentralized nature, are susceptible to security breaches and fraud, as seen in past incidents like The DAO hack. The regulatory framework might not sufficiently address these risks, leaving investors and participants vulnerable without clear legal recourse.

Member Rights and Responsibilities: The bill's provisions on member rights within DAOs could be challenging to enforce in a decentralized setting where traditional corporate governance structures do not apply, potentially leading to confusion over accountability, decision-making, and profit distribution.

Dissolution Challenges: The process for dissolving a DAO, as outlined, might not account for the technical challenges of unwinding blockchain-based entities, where assets might be locked in smart contracts or distributed across a global network, complicating asset recovery and closure.

Innovation vs. Regulation: While DAOs represent innovation in corporate structures, premature or overly rigid regulation could stifle this innovation. The technology and operational models of DAOs are still evolving, and a fixed legal framework might not adapt well to future developments.

Delayed Effective Date: The delay until January 1, 2026, might seem beneficial for preparation, but it could also mean that by the time the framework is implemented, the technology or market dynamics of DAOs might have significantly changed, rendering parts of the regulation obsolete or inadequate.

Global Nature of DAOs: DAOs often operate on a global scale, and state-specific regulations might not effectively govern entities that can exist and operate outside traditional jurisdictional boundaries, potentially reducing the effectiveness of Virginia's regulatory oversight.

Precedent for Other States: By being one of the first to regulate DAOs, Virginia might set a precedent that other states follow, potentially leading to a patchwork of state laws that could complicate the operation of DAOs across the U.S., affecting their scalability and interoperability.

I oppose this legislation due to concerns over regulatory complexity, legal ambiguity, security risks, the challenge of defining member rights in a decentralized context, dissolution issues, the balance between innovation and regulation, potential obsolescence, the global nature of DAOs, and the precedent it might set for other jurisdictions. A more cautious, flexible approach allowing for the evolution of both technology and law might be more appropriate.
HB2021 - Fair Voice Purchasing Act; established, penalties.
The disAbility Law Center of Virginia supports choice for people with disabilities. HB 2021 enhances choice for all, by making voice purchasing an affirmative decision for the purchaser, rather than an often unknown default feature. We support HB 2021.
TechNet Remarks HB 2021
HB2043 - Consumer Data Protection Act; user-generated content protected, civil penalty.
Please see attached.
Because so many Internet services are free to the user, it is easy to forget that the provider has a business plan that probably involves the user's data or content. This bill will help protect users and make them more aware of what's going on. Therefore, I support HB2043.
TechNet's comments on HB 2043 are attached.
HB2046 - High-risk artificial intelligence; development, deployment, and use by public bodies, report.
I support HB2046. We need this well thought out approach to regulating artificial intelligence before we find ourselves faced with a host of unintended consequences. I particularly like the recognition that some of the material AI has been using is highly biased or otherwise inaccurate. Garbage in, garbage out is still an important rule in computing. Please vote for HB2046.
TechNet Remarks HB 2046
HB2094 - High-risk artificial intelligence; definitions, development, deployment, and use, civil penalties.
Dear Chair Torian, Vice Chair Sickles, and Distinguished Members of the Committee,

On behalf of the Electronic Transactions Association (ETA), the leading trade association representing the payments industry, we appreciate the opportunity to share our opposition and broad concerns with HB 2094. ETA and its members are supportive of efforts to promote responsible use of artificial intelligence (AI) tools and systems. Our industry has long been at the forefront of developing and implementing safeguards to ensure AI is used responsibly and does not result in unjustified differential treatment. ETA's members' use of AI occurs within the confines of one of the most highly regulated industries, while adhering to the principles of explainability, privacy, risk management, and fairness within existing legal frameworks, including the Equal Credit Opportunity Act (ECOA), which governs both traditional and AI-assisted lending practices, and state privacy laws.

Currently, the list of activities in the definition of consequential decisions uses the term “(iv) a financial or lending service,” which ETA believes is overbroad and is likely to include low-risk AI uses that greatly benefit consumers. Therefore, ETA believes that financial services should be removed from the list of consequential decisions. Doing so will enable companies to take a risk-based approach, consistent with multiple sections of this legislation, and avoid burdensome requirements for low-risk AI uses, such as using AI to categorize expenses for tax or other financial planning purposes or connecting people to financial experts. It will also avoid redundancies because our members already adhere to strict state and federal regulations.....
Thank you for your consideration of the attached comments from the Business Software Alliance regarding HB 2094.
Please see attached.
TechNet's written remarks on HB 2094 are attached.
The R Street Institute strongly urges your opposition to HB2094 as outlined in the attached long form testimony.
My name is Stefan Padfield, and I am the Executive Director of the Free Enterprise Project, which is part of the National Center for Public Policy Research. In that capacity, I spend a significant amount of my time researching corporate diversity, equity, and inclusion programs, otherwise known as DEI programs. I also routinely engage with corporations about those programs. In light of that experience, I am here today to comment on the definition of “algorithmic discrimination” in the High-Risk Artificial Intelligence Developer and Deployer Act, and specifically the proposed exclusion from that definition of “the expansion of an applicant, customer, or participant pool to increase diversity or redress historical discrimination.” In his 2019 book, "How to Be an Antiracist," Ibram X. Kendi infamously asserted that: “The only remedy to past discrimination is present discrimination.” This discrimination in the name of anti-discrimination is integral to critical race theory and DEI programs. It is rooted in an utterly divisive world view that separates people into oppressed and oppressor classes based on the color of their skin, and which has accordingly been quite properly described as neo-Marxist. Calling this discrimination “anti-discrimination” does not make it legal, as the U.S. Supreme Court ruled in the 2023 case of Students for Fair Admissions v. Harvard, wherein the Court struck down affirmative action programs in colleges and universities because, among other things, it is impossible to discriminate in favor of one group based on race without discriminating against other groups on the basis of race, as the opening quote from Ibram Kendi quite brazenly acknowledges. This discrimination in the name of anti-discrimination, which can fairly be called neo-racist in addition to being neo-Marxist, has also been rejected by the American people as immoral. 
This can be seen in the rash of giant corporations that have recently walked back or entirely eliminated their DEI programs in response to having those programs exposed publicly or in order to avoid having those programs exposed publicly. These include Amazon, Meta, McDonald's, American Airlines, Boeing, Caterpillar, Ford, Harley-Davidson, John Deere, Lowe's, Toyota, Tractor Supply Company, and Walmart. And yet this illegal and immoral discrimination in the name of antidiscrimination appears to be precisely the goal of defining “algorithmic discrimination” in the High-Risk Artificial Intelligence Developer and Deployer Act as excluding “the expansion of an applicant, customer, or participant pool to increase diversity or redress historical discrimination.” This discrimination in the name of antidiscrimination has always been illegal and immoral, but it captured many of our institutions in the upheaval following the BLM riots of 2020 and under the cover of misleading labels like “anti-racism”, “diversity”, “equity”, and “inclusion”. But the truth of what is actually being done under those banners is ultimately rejected by mainstream Americans whenever and wherever it comes to light. In fact, the 2024 election results can be seen as in part driven by the country’s rejection of the left’s obsession with smearing this great nation as systemically racist, which underlies the discrimination in the name of antidiscrimination under consideration here. I urge you not to make this illegal and immoral discrimination a part of this legislation. Thank you.
TCAI supports passing HB 2094 because of its strong protections against discrimination.
TCAI supports recommending HB2094 for consideration by the full House.
HB2124 - Synthetic digital content; definition, penalty, report, effective clause.
I am against this bill which expands defamation laws to include synthetic digital content and introduces new penalties, considering the following legal precedents and concerns:

First Amendment: The expansion of defamation laws to synthetic content could infringe on free speech, as the Supreme Court in New York Times Co. v. Sullivan (1964) set a high bar for defamation claims to protect free expression. This bill might lower that threshold, potentially chilling legitimate speech.

Vagueness Doctrine: The definition and application of "synthetic digital content" might be too broad or vague, risking violation of the void for vagueness doctrine, as seen in Grayned v. City of Rockford (1972), where laws must be clear to avoid arbitrary enforcement.

Criminalization of Speech: By making the use of synthetic content for fraud a Class 1 misdemeanor, the bill treads into the area of criminalizing speech, which could conflict with United States v. Stevens (2010), where the Court struck down a law for overbroad criminalization of speech.

Double Jeopardy: Imposing a separate penalty for using synthetic content in fraud might raise double jeopardy concerns, similar to issues discussed in Blockburger v. United States (1932), where multiple punishments for the same act are scrutinized.

Civil Liability Expansion: Allowing individuals depicted in synthetic content to sue for damages could lead to an expansion of civil liability, potentially conflicting with Hustler Magazine v. Falwell (1988), where the Court protected parody and satire under the First Amendment, which might extend to some forms of synthetic content.

Precedent of Overregulation: This bill might set a precedent for overregulation of digital content, similar to concerns in Reno v. ACLU (1997), where broad internet regulations were found unconstitutional for stifling free speech.

Work Group Overreach: The Attorney General convening a work group to study enforcement might lead to recommendations for further regulation, echoing the regulatory overreach criticized in Lochner v. New York (1905), where excessive government intervention in private matters was rebuked.

Privacy and Publicity Rights: While seeking to protect individuals, this bill might inadvertently complicate privacy and publicity rights, as seen in Zacchini v. Scripps-Howard Broadcasting Co. (1977), where the right of publicity was upheld, potentially leading to conflicts in how synthetic content is regulated.

I oppose this legislation due to potential First Amendment violations, vagueness in law, criminalization of speech, double jeopardy issues, expanded civil liability, the risk of regulatory overreach, and complications with privacy and publicity rights, advocating for a more nuanced approach that balances protection against fraud with freedom of expression.
TechNet's written remarks on HB 2124 are attached.
HB2250 - Artificial Intelligence Training Data Transparency Act; transparency and disclosure requirements.
Please see attached.
TechNet Remarks HB 2250
TCAI strongly supports HB 2250, which meaningfully provides transparency in AI development.
TCAI is proud to submit a letter of strong support for this important legislation that promotes transparency in AI development without being onerous.
HB2268 - Emerging Technologies, Cybersecurity, and Data Privacy, Division of; established.
Please vote for HB2268 to establish a Division of Emerging Technologies, Cybersecurity, and Data Privacy. The need is obvious. I particularly like that public engagement is built into this bill.
HB2411 - Consumer Counsel, Division of; expands duties, artificial intelligence fraud and abuse.
I am against this bill which expands the duties of the Division of Consumer Counsel to include programs against AI fraud and abuse, based on legal precedents and concerns:

Scope of Authority: This expansion might exceed the Division's traditional scope, similar to issues in Whitman v. American Trucking Associations, Inc. (2001), where the delegation of legislative power was scrutinized. The Division's role might not be suited for tech-specific issues like AI fraud.

Regulatory Overreach: By establishing a statewide alert system for AI fraud, this could be seen as an overreach, echoing concerns from Lochner v. New York (1905) about government intervention in areas outside its expertise, potentially stifling innovation or creating unnecessary bureaucracy.

First Amendment: Programs targeting AI content might infringe on free speech, as seen in United States v. Alvarez (2012), where the Court protected false statements under the First Amendment, questioning how broadly 'fraud and abuse' could be interpreted without impacting legitimate speech.

Vagueness: The terms 'fraud and abuse' related to AI might be too vague, risking a violation of the void for vagueness doctrine from Connally v. General Construction Co. (1926), where laws must provide clear standards to avoid arbitrary enforcement.

Privacy Concerns: Implementing an alert system could lead to privacy issues, reminiscent of Katz v. United States (1967), where the expectation of privacy was upheld. Monitoring AI use might require surveillance that could infringe on personal privacy.

Preemption by Federal Law: Given the federal interest in regulating technology and fraud, this state initiative might face preemption issues, as in Arizona v. United States (2012), where state laws conflicting with federal authority were invalidated.

Resource Allocation: Expanding the Division's duties might divert resources from its core functions, akin to the concerns in Massachusetts v. EPA (2007) regarding agency focus, potentially reducing effectiveness in traditional consumer protection areas.

Legal Liability: The Division might face increased legal liability or challenges in defining what constitutes AI fraud, similar to the complexities in Central Hudson Gas & Electric Corp. v. Public Service Commission (1980) regarding commercial speech regulation, where clarity and justification are required.

I oppose this legislation due to concerns over the scope of authority, regulatory overreach, First Amendment implications, legal vagueness, privacy issues, potential federal preemption, resource misallocation, and increased legal liability, suggesting a more focused role for the Division or alternative solutions that respect legal boundaries and consumer rights.
HB2462 - Unauthorized use of name, portrait, etc.; digital replica, civil liability, statute of limitations.
Please see the attached.
HB2462 is needed to protect our individual right to control our own likeness. I do not want to discover that my face has been added to something embarrassing. I ask you to support HB2462.
TechNet Remarks HB 2462
HB2541 - Information Technology Access Act; digital accessibility, definitions, procurement requirements.
I am against this bill which proposes changes to the Information Technology Access Act regarding digital accessibility, considering legal and practical concerns:

Federal Preemption: The bill might conflict with federal standards like the Americans with Disabilities Act (ADA) and Section 508 of the Rehabilitation Act, as discussed in PGA Tour, Inc. v. Martin (2001), where state laws must not undermine federal accessibility standards, potentially leading to legal challenges.

Vagueness and Overbreadth: The definition of "information and communications technology" could be too broad or vague, risking violation of the void for vagueness doctrine from Connally v. General Construction Co. (1926), where laws must be clear to avoid arbitrary enforcement, especially in defining what constitutes digital accessibility for all disabilities.

Administrative Burden: Designating a digital accessibility coordinator in each covered entity adds significant administrative overhead, similar to the concerns raised in Massachusetts v. EPA (2007) regarding agency resource allocation, potentially diverting focus from other critical functions.

Compliance Costs: Implementing comprehensive digital accessibility policies could impose substantial costs on entities, echoing economic concerns from Lochner v. New York (1905) about government regulations that might unfairly burden businesses, particularly smaller entities or those in less populated areas.

Delayed Implementation: The staggered effective dates might lead to inequity in accessibility, where some populations receive benefits later than others, potentially conflicting with the equal protection principles discussed in Brown v. Board of Education (1954), albeit in a different context.

Enforcement Challenges: Ensuring compliance across various entities could be logistically challenging, akin to issues in Wyoming v. Oklahoma (1992) regarding state enforcement capabilities, where uniform application might be difficult, leading to inconsistent accessibility standards.

Privacy Concerns: The role of digital accessibility coordinators might involve collecting and managing personal data related to disabilities, raising privacy issues similar to those in Carpenter v. United States (2018), where the handling of personal information was scrutinized.

One-Size-Fits-All Approach: A uniform policy might not cater to the diverse needs of different disabilities, potentially not meeting the individualized assessment standard set by Olmstead v. L.C. (1999), which emphasized tailored accommodations.

I oppose this legislation due to potential conflicts with federal laws, vagueness in definitions, increased administrative burdens, compliance costs, delayed and inequitable implementation, enforcement challenges, privacy issues, and the risk of a one-size-fits-all approach not adequately addressing the spectrum of disabilities, advocating for a more nuanced, resource-conscious, and privacy-respecting approach to digital accessibility.
HB1624 - Consumer Data Protection Act; social media platforms; addictive feed prohibited for minors.
Prior to my retirement, I was Staff Director of a major subcommittee of the US House of Representatives, and my immediate boss was Congressman Jon Porter, who represented Las Vegas, Nevada. Jon used to tell me, "Ron, I never go into those places because I figured out a long time ago how they pay for them." He was saying, of course, that casinos are NOT money losers for the house. Every time lawmakers seem to want new laws that govern behavior, they send us deeper into the swamp of moral rot and decay, not to mention teaching our children to use drugs and engage in promiscuous behavior, all of which are destructive to our neighborhoods, cities, states, and most important, our Nation. Encouraging "striking it rich" and "I'll beat the odds" activity causes individuals and families to sacrifice limited incomes on the pipe dream that they will be "more lucky, next time." Do NOT impose this new scourge on us--NO CASINO in Fairfax County. Ron Martinson 703 354-3997
Comments Document
The Consumer Choice Center expresses concern with VA HB1624 which requires that social media networks identify their users to classify those under 18 years of age and require parental consent if said platforms provide what the legislation broadly declares are “addictive feeds”. The bill also restricts social media firms from offering alternative products to minors. The goal of protecting children online and steering them toward healthy uses of technology and social media is an important and noble goal that we also champion. However, due to the language in this bill and the effects it would have on practically all users of social media, the measure would cause more harm than good. HB1624 would have a worrying impact on the ability of anyone – minor or adult – to freely use certain social platforms and participate online. Please see the attached letter for more information on our policy concerns with the proposed legislation. Stephen Kent Manassas, VA Consumer Choice Center
Comments Document
Contains our feedback on the proposed legislation, based on our expertise from dealing with this issue in other states.
Comments Document
Dear Chair Hayes, Delegate Thomas, and Members of the Committee,

We at Stop Child Predators write to you today to express our opposition to House Bill 1624, which would fail to protect children in Virginia, infringe on Virginians’ freedom of speech, and disenfranchise parents from their right to make decisions on matters regarding their own children.

Social media can be beneficial for teenagers when it comes to age-appropriate friendships and exploring their hobbies and interests. Parental involvement and monitoring is critical to ensuring teenagers remain safe online and learn healthy online habits before they become adults. It should be parents who decide what tools and resources are appropriate for their children. House Bill 1624’s definition of what is “addictive” is incredibly vague, and the legislation fails to offer resources to encourage parents or teenagers to implement time limits or screen time online. In fact, the algorithms House Bill 1624 would prohibit keep our children safer by filtering unwanted content for children, providing parents another resource to better protect their children.

Governor Youngkin was absolutely right to issue an executive order to protect kids online because something must be done. He is showing strong leadership for Virginia in this area, but legislators should empower parents as they take his direction. Additionally, this bill is destined to be held up in court on constitutional questions related to free speech, where similar bills in California and New York have faced temporary injunctions as the courts make a final ruling on the bills’ constitutionality. Virginia should not follow California and New York’s lead by passing House Bill 1624 before the California courts issue their final ruling regarding the constitutionality of this legislation.

If House Bill 1624 is passed, it will likely waste taxpayer dollars defending an unconstitutional bill that will do nothing to stop child predators or protect children online in the meantime. This is a far cry from what Virginia parents need. As an advocacy group that has focused on stopping child predators for nearly 20 years, we encourage the Virginia legislature to consider legislation that puts parents in the driver’s seat and makes real headway against predators by providing for municipal, state, and federal law enforcement, more prosecutors, more judges, and more support for victims. There are many helpful solutions that would keep kids safe online without violating free speech. House Bill 1624 will only waste time and money while our teenagers are in desperate need of real solutions.

Respectfully,
Stop Child Predators
Comments Document
Chairman Hayes and members of the committee: My name is Max Gulker, and I am a senior policy analyst at Reason Foundation. Thank you for the opportunity to offer our analysis of House Bill 1624 (HB 1624).

Along with the many heavily debated pros and cons of online age restrictions and parental consent in general, HB 1624 opens the door to negative unintended consequences due to the way that it is written. HB 1624 attempts to define which features and services offered by social media platforms are potentially risky for minors. Concerns over the personalized algorithms social media uses to recommend content (“addictive feeds”) stem from some highly publicized recent studies about kids and screen time, as well as features of algorithms some worry have addictive properties. Using this definition, HB 1624 breaks down the activities performed by social media algorithms and provides a list of features for which minors must have verifiable parental consent. Algorithms recommending, displaying, and moderating content provided by others based on information obtained from the user would require minors to have parental consent, though the bill sets forth several exemptions.

The framework defining “addictive feeds” in HB 1624 (with dozens of terms defined and several exemptions set forth) may do a good job of describing the ways we use social media algorithms today. But a similar effort five or ten years ago would likely have looked very different, and the framework of HB 1624 will likely be obsolete in ways we can’t predict five or ten years from now. This will reduce the effectiveness of HB 1624. Enforcement will grow more complicated, and new features and activities provided by social media will lead to new arguments about what is and isn’t covered in the bill. As compliance and enforcement grow more complicated, the playing field will tilt toward larger and more experienced firms and away from small firms and new entrants.
The “addictive feeds” framework, particularly if adopted by more states, could ultimately distort future innovation, once again in unintended and undesirable ways. Because the bill concerns itself with the specific nuts and bolts of how social media operates in 2025, innovative platforms may stick to old processes because they make compliance easier, even when new processes or features could be of greater value or even safer for users.

Finally, the same factors that make the “addictive feeds” framework problematic (high-profile and rapidly changing technology) raise questions about whether the research on potential harms will stand the test of time. The studies noted above are both recent and controversial. Our understanding of yesterday’s media revolutions (recorded music and television, to name just two) and their possible harms has changed with time. Our understanding of any harms from social media, and the sources of those harms, will no doubt change as well.

These potential consequences of adopting the “addictive feeds” framework are but one small part of the complicated set of debates over kids and social media. But they suggest that the “addictive feeds” framework is the wrong approach for an industry defined by rapid and unpredictable technological change. The potential unintended consequences from this method of requiring parental consent, in our view, outweigh the potential benefits.
Comments Document
Please see attached.
Oppose this bill because it is DEI and similar divisive legislation that lowers standards and expectations, and creates separation.
Comments Document
TechNet HB 1624 remarks
Comments Document
Writing in regard to Virginia House Bill 1624.
Comments Document
See attached file.
Comments Document
See attached file.
Comments Document
The Virginia legislature should learn from California's recent mistake of passing an unconstitutional law restricting online speech and avoid repeating it with HB 1624. Instead of wasting taxpayer money on likely unconstitutional legislation, Virginia lawmakers should focus on legal and effective solutions to promote online safety for children, such as educational initiatives that empower parents. These efforts would better serve Virginians and respect their free speech rights.
Comments Document
The Virginia legislature should learn from California's recent mistake of passing an unconstitutional law restricting online speech and avoid repeating it with HB 1624. Instead of wasting taxpayer money on likely unconstitutional legislation, Virginia lawmakers should focus on legal and effective solutions to promote online safety for children, such as educational initiatives that empower parents. These efforts would better serve Virginians and respect their free speech rights.
Comments Document
We have serious constitutional concerns about this bill's restrictions on personalized content for minors on social media platforms. The Supreme Court has consistently upheld strong First Amendment protections for both sharing and receiving online content, most recently in Moody v. NetChoice in 2024, where the Court explicitly recognized that social media feeds are protected expression. Additionally, just last year, California enacted a law similar to HB 1624. But California is currently prohibited from enforcing it. The case challenging California’s law will soon be heard by the Ninth Circuit. At the very least, if the Committee is not convinced of HB 1624’s unconstitutionality, it should wait until the California litigation is resolved before enacting HB 1624.
I hope I understand this bill correctly. I believe it is protecting our kids from being exposed to inappropriate content such as weed, cigarettes, and sexual material? If that’s the case, then why are Democrats legalizing cannabis? Why lower expectations and standards in schools, because that is what keeps happening under the far-left Democrat Party? I don’t want to see these inappropriate things myself. Nothing is safe for our children and young adults. This begins with the family and following the Rule of Law.
Comments Document
TechNet's written remarks on HB 1624 - addictive feeds - are attached.