Public Comments for 02/02/2026 Communications, Technology and Innovation - Communications Subcommittee
HB83 - Virginia Information Technologies Agency; powers of the CIO; creation of Cyber Civilian Corps.
No Comments Available
HB310 - Artificial Intelligence Workforce Impact Act; established, report.
No Comments Available
HB669 - Impersonation of certain licensed professionals by chatbot; definitions, notice, civil liability.
No Comments Available
HB713 - Fostering Access, Innovation, and Responsibility in Artificial Intelligence Act; established.
Last Name: January Organization: Chamber of Progress Locality: McLean

On behalf of Chamber of Progress, a tech industry association supporting public policies to build a society in which all people benefit from technological advances, I respectfully urge you to oppose HB 713, which would establish a new artificial intelligence (AI) regulatory framework that raises significant concerns around liability, disclosure, and enforcement without clearly addressing the gaps in existing law.

HB797 - Va. Information Tech. Agency; artificial intelligence, independent verification organizations.
Last Name: January Organization: Chamber of Progress Locality: McLean

On behalf of Chamber of Progress, a tech industry association supporting public policies to build a society in which all people benefit from technological advances, I respectfully urge you to oppose HB 797, which would impose a broad, mandatory third-party artificial intelligence (AI) verification regime that adds regulatory complexity without clear evidence of improved consumer protection or public safety.

HB1186 - School board policies; prohibition on use of AI chatbots for certain student instructional purposes.
Last Name: McAvoy Organization: Fairfax County Public Schools Parents for Intentional Technology Locality: Fairfax Station

Good morning. Fairfax County Public Schools Parents for Intentional Technology was formed to encourage intentional and evidence-based utilization of technology tools in education. The Brookings Institution recently (Jan 2026) released a premortem research report on AI in education. Its conclusion states, "At this point in its trajectory, the risks of utilizing generative AI in children’s education overshadow its benefits." Decisions made for our students at critical phases of development and learning should not be made based on the current hype cycle of a new technology. Recent history teaches us that slow, deliberate, and intentional implementation of tested technologies is prudent. Starting in 2012, and amplified by the COVID pandemic, the vast majority of students were accessing instruction by means of a school-issued computer or tablet. Recent research shows us that the more money a school invests in technology, the lower the achievement of its students. According to a 2023 UNESCO report, there is very little independent evidence that educational technologies are benefiting our students. Uninformed and unregulated use of AI chatbots in education is irresponsible. There is no evidence to suggest it is an appropriate tool for learning. We must not make educational decisions based on the marketing messages of technology companies. Fairfax County Public Schools Parents for Intentional Technology asks you to please follow the guidance of the Brookings Institution: the current risks of utilizing generative AI in our classrooms outweigh any potential benefits. Please pass HB1186.

Last Name: Suttmiller Locality: Chesapeake

I am an educator who has been in the classroom for 25 years. Since AI was introduced, I have seen a huge increase in students' ability to cheat. They use ChatGPT to find answers to tests and writing assignments, from short answer to essays. By using AI to do the thinking for them, students have essentially become numb to the excitement of learning. They have little to no experience of what it is like to struggle in order to learn a new concept, or of the satisfaction of finally achieving it. They have also lost the ability to come up with original ideas. There is basically no critical thinking, analysis, or evaluation taking place in the classroom when AI is present. Students must be engaged with real-life learning experiences and simulations for learning to be impactful. For these reasons, I urge the House Committee to pass HB1186.

Last Name: Clement Locality: Fairfax County

I've been teaching in Virginia's public schools for over thirty years. I have not seen much that is as bad for student learning as AI tools are. It is not that these tools do not have the potential for good. Obviously they do. It is that overwhelmingly, students do not use tech tools in an ideal way. They do not use tech tools to enhance their learning. Students laugh at the way adults think they will use a new tech tool. They use these tools to circumvent work. It's human nature -- but even more so for a young person. The prefrontal cortex is not fully developed in young people. Our kids are simply not capable of using powerful tools like this in consistently productive ways. First of all, we have been down this road before. We once encouraged students to bring their own phones and other devices to school. We know now that was such a disaster that we have banned phones in school. We will be dealing with that learning loss for years. However, we learned our lesson with smartphones in the classroom. We have the power to avoid this same mistake when it comes to AI tools. Second, one of the things we (teachers) hear about every new tech tool is that we (schools) have to teach students how to use them, because they are part of life, and they will be seeing these tools in the work world and so on. However, AI tools are very easy to use. That's the point of them. You don't need any training to use them -- just ask whatever is on your mind (or whatever problem your teacher has given you), and the problem is solved. It might not be correct, and the student will have learned either nothing or the wrong thing, but the assignment will be done. We do not need to train students on how to use these tools. We need to train students on how to think and solve problems on their own. Third, one need only read the peer-reviewed science that exists on this topic to realize the broken thinking that is pushing for AI use in schools. The 2025 M.I.T. study entitled "Your Brain on ChatGPT" is a staggering indictment of the effect of AI tools on cognition. Wess Trabelsi of the Ulster (NY) Board of Cooperative Educational Services has done a meta-analysis of peer-reviewed studies of the effects of AI tools in education. The results make it clear that any benefits of AI use in the classroom pale in comparison to the burdens inflicted on student learning. It would be irresponsible to know what is in these studies and not do what we can to prevent the certain damage to student learning that will accompany AI use in schools. Please support HB1186. Thank you.

Last Name: Marinaccio Organization: Center for Responsible Technology Locality: Alexandria City

As a business owner, tech developer, Virginia resident, and, most importantly, a father, I rise in support of HB1186. A 2025 study from Microsoft and Carnegie Mellon shows that knowledge workers who use chatbots suffer from decreased critical thinking and a “deterioration of cognitive faculties.” If adults are harmed, we should not gamble with our most valuable resource—our children. We have been told that as our world becomes more reliant on technology, we need to be trained in it ever earlier. But paradoxically, we are getting worse at tech. Over the past decade, scores on the ICILS (an international digital literacy test) have actually fallen, not improved. I see more and more job applicants with fewer real hard skills and less diligence, fortitude, and resilience. It is absolutely critical that we no longer pander to the wants and lies of Big Tech, but rather place our children’s needs as the guiding principle for crafting rigorous educational standards with great care and discernment. Historically, Virginia has produced some of the brightest minds in our country. Though we may not always agree on everything, it is through lively debate and an exchange of ideas that we have excelled. Virginia students should not be required to outsource their thinking to entertainment bots sold as "artificial intelligence." Focus on the real aim of education, human intelligence, and avoid the risk of destroying the cognitive potential of an entire generation. Thank you.

HB1294 - Use of artificial intelligence-based tools; covered artificial intelligence, disclosure of use.
Last Name: Blair Organization: Policing Project Locality: Brooklyn

The attached document provides written testimony in support of HB 1294 - requiring the disclosure of AI used by law enforcement.

HB1295 - Law enforcement; artificial intelligence inventory, civil action.
Last Name: Blair Organization: Policing Project Locality: Brooklyn

The attached document provides written testimony in support of HB 1295 - requiring law enforcement agencies to release an annual list containing basic information about their AI tools.

Last Name: Belcher Locality: Richmond County

I am a sworn law enforcement officer in the Commonwealth of Virginia, and I respectfully submit this testimony in strong opposition to House Bill 1295. While framed as a measure to increase “transparency,” this bill functions as a litigation factory aimed squarely at law enforcement agencies. If enacted, it would have serious negative consequences for public safety, the integrity of investigations, and officers’ ability to perform their duties without undue interference. HB 1295 would require law enforcement agencies to publicly inventory and describe nearly all modern investigative software, including routine tools. Its broad, vague definition of “artificial intelligence” sweeps in systems used daily in policing. Agencies would be compelled to disclose capabilities, limitations, and operational details, exposing internal procedures that could be exploited by those seeking to evade detection or manipulate investigations. The bill also creates a private right of action allowing any resident to sue for technical noncompliance—even with no harm, no citizen encounter, and no violation of rights. Automatic attorneys’ fees paid by taxpayers, no intent requirement, no safe harbor, and no materiality standard mean agencies could face endless lawsuits over minor omissions or paperwork defects, diverting critical resources from public safety duties. HB 1295 is not about protecting rights. It governs law enforcement through litigation, chills lawful investigations, slows adoption of technologies that enhance policing, and effectively turns courts into regulatory agencies. Officers may hesitate to use legitimate tools for fear of triggering lawsuits, undermining crime prevention and delaying critical investigative work. The incentives created by this bill are misaligned with the public interest. Instead of increasing accountability, it prioritizes litigation and administrative oversight over effective policing. Agencies will be forced to devote significant personnel time and resources to compliance reporting and defense of meritless claims, reducing their capacity to respond to serious crimes and protect communities. Enacting HB 1295 would have real-world consequences: fewer resources for investigations, slower responses to criminal activity, and a workforce discouraged from using lawful tools due to fear of liability. It creates a disincentive to adopt technological advancements that make communities safer, all in the name of ill-defined “transparency” better achieved through internal policies, public records laws, and proper oversight—not fee-shifting litigation. For these reasons, I respectfully urge the members of this committee to oppose House Bill 1295. This is bad policy; it introduces perverse incentives and threatens public safety in Virginia. Law enforcement should be accountable, but accountability must be reasonable, practical, and rooted in actual harm—not the creation of a litigation factory designed to regulate policing through lawsuits.

HB1521 - Digital innovation and infrastructure; establishing rights in digital property and technology resources; requiring risk management policies for critical infrastructure facilities controlled by critical artificial intelligence systems; providi...
No Comments Available
End of Comments