Dueling AI bills raise questions about Lamont’s ties to Oak HC/FT

A proposed artificial intelligence (AI) bill brought forward by Gov. Ned Lamont appears to treat companies that are financially backed by Oak HC/FT, the equity investment firm co-founded by Annie Lamont, more leniently than an AI bill put forward by Sen. James Maroney, D-Milford.

Senate Democrats, led by Maroney, who is considered an AI expert, have been at odds with the governor over proposals to regulate AI since Senate Bill 2 was brought forward last year.

The governor’s bill largely places the regulatory power over AI into the hands of the Office of Policy and Management’s (OPM) chief data officer, who would review state agency data and publish any that would be useful “for artificial intelligence systems, machine learning, and other statistical means of data analysis to create economic opportunity and support state economic development goals, through private businesses, nonprofit organizations and other entities that will use such data.”

OPM’s chief data officer would then develop policies to ensure AI-driven decisions don’t lead to unlawful discrimination or reveal certain personally identifiable information.

Maroney’s bill, on the other hand, contains specific regulations for private businesses that carry compliance costs and has drawn pushback from a number of groups and companies in the AI industry, who argue that the bill will impose considerable burdens on small businesses.

The legislation would require developers of “high-risk artificial intelligence systems,” defined as AI systems that make consequential decisions or communicate with consumers for the purpose of providing information, to disclose to users foreseeable risks and known harmful uses, documentation of data used to train the system and known limitations, and documentation on how the system was evaluated to mitigate discrimination.

The bill would also direct anyone who integrates a high-risk AI system into a product offered to consumers to “use reasonable care to protect consumers from any known or reasonably foreseeable risks of algorithmic discrimination” resulting from its use. Other provisions would direct companies or services that use AI to notify consumers of when, how, and why it has been used in their product.

AI developers would also have to make technical documentation for their models accessible. It would further direct developers to disclose any documentation to the attorney general upon request.

“The requirements for conducting impact assessments, maintaining risk management programs, and documenting AI decision-making processes will impose substantial costs on small businesses in Connecticut,” Chris Davis, vice-president of public policy at the Connecticut Business and Industry Association, wrote in testimony on the bill. “Many small businesses simply do not have the resources or time to dedicate to comply with the law and may choose not to use beneficial artificial intelligence to create efficiencies that improve productivity increase tax revenue, and grow our economy.”

Maroney’s bill, voted out of the General Law Committee on March 24, contained an “off the shelf AI exemption” that would exempt companies using AI models they did not develop in their products — Maroney gave the example of a company using an AI program to screen resumes — from many of the requirements for deployers.

But that exemption wouldn’t apply to companies like Komodo and Notable, both of which are healthcare AI tools, or Feedzai, an AI fraud detection tool. All three companies develop their own tools and platforms, and all three are Oak HC/FT investments, according to the firm’s website. In total, Oak HC/FT currently has 19 AI companies, mostly in the healthcare and financial sectors, in its portfolio.

While both bills propose an AI regulatory sandbox, which exempts participating businesses from having to comply with regulations for a certain period of time, Lamont’s proposed bill more directly focuses on businesses in sectors where Oak HC/FT is invested.

The governor’s bill would direct the Department of Economic and Community Development (DECD) to consult with the commissioners of the Department of Banking, Office of Health Strategy, Department of Public Health, and Insurance Department to develop a regulatory sandbox program.

The governor’s plan was supported in public testimony by DECD Commissioner Daniel O’Keefe, who stated the sandbox would focus on “applications in insurance, finance, and health services,” areas where Oak HC/FT’s investments are concentrated.

Much of O’Keefe’s testimony focused on proposals in Lamont’s bill, recommending the General Law Committee consider those over the provisions in Maroney’s bill. Like Lamont, DECD has been at odds with Senate Democrats and proposals contained in this and last year’s proposed AI bills.

The governor’s bill further directs Connecticut Innovations, a state-owned venture capital arm focused on biotech and IT, to create an Artificial Intelligence and Quantum Technology Fund and invest in AI companies that either are located in the state or that move significant portions of their operations to the state.

Alison Malloy, the director of investments for Connecticut Innovations and niece of former Gov. Dannel Malloy, has ties to Annie Lamont; the two are co-founders of the Tidal River Network, which promotes female investors.

Julia Bergman, a spokesperson in the governor’s office, denied that the differences between the two bills could be interpreted as a conflict of interest.

“There is no concern at all about whether these proposals create a conflict of interest, because not only do they not present a conflict, but the Governor and First Lady have been continuously transparent since the first day of the governor’s first term by sharing all recusals related to Oak HC/FT,” Bergman said.

She attributed the differences in the bills to “different approaches” to regulating the rapidly changing AI industry.

“The two proposals represent different approaches to regulating an industry that is still rapidly evolving and one that is facing potential federal and state regulations that have nothing to do with Oak investments. The Governor believes the federal government could provide a national framework for regulation versus a patchwork of rules across states. He also wants to ensure innovation can thrive and that Connecticut is seen as a welcoming place for businesses. More importantly, he wants Connecticut’s future workforce to be trained on using AI and be prepared to fill the job of today’s economy and tomorrows,” Bergman added.

Generally, Lamont has argued that being the first state to regulate an emerging technology could prove difficult and might slow Connecticut’s potential to house a booming AI sector, and that a regulatory approach coming from the federal level or a coalition of states would be preferable to a patchwork of individual state regulations.

O’Keefe made similar arguments both in his written testimony on Maroney’s bill and in comments made during the public hearing. Speaker of the House Matt Ritter, D-Hartford, has also signaled that he may agree with the governor’s position.

Gov. Lamont’s ties to Oak HC/FT became a political issue for the governor during the COVID-19 pandemic, when state COVID testing contracts were awarded to Sema4, a company backed by Oak HC/FT.

The Office of State Ethics, however, determined there was nothing unethical about the arrangement, and while Lamont pledged that any profits derived from Sema4 getting those contracts would go to charity, the company fell on tough times following the end of the pandemic.

Maroney’s bill directs the DECD, along with OPM’s chief data officer and the Connecticut Technology Advisory Board, to create an AI regulatory sandbox program “to facilitate the development, testing and deployment of innovative artificial intelligence systems in the state.”

The program would encourage developers to promote “safe and innovative systems; balance consumer protection, privacy, and public safety,” and it would “provide clear guidelines for developers to test artificial intelligence systems” while they are exempt from regulations. The bill indicates the sandbox would apply to the use of AI across “various sectors, including, but not limited to, education, finance, health care and public service.” DECD would determine who to include in the program.

Maroney also said that the most important part of the bill is the definition of high-risk AI, and that the updated language the committee approved has narrowed that definition to remove some “unintended consequences.”

The General Law Committee also voted to advance Lamont’s AI bill on March 24, the same day it advanced Maroney’s bill. Both bills now await action on the Senate calendar.

While the Senate passed Maroney’s AI bill last session, Ritter did not call it for a vote in the House and it died on the calendar when the legislature adjourned sine die. That dynamic may again prevent Maroney’s bill, which currently sits on the Senate calendar, from advancing this year.
