The White House has rescinded the previous administration's October 2023 executive order on AI regulation and oversight, one of many such reversals since the new administration took office.
The executive order broadly called for the development of standards and best practices to address various aspects of AI risk, such as detecting AI-generated content and authenticating official communications. It directed government agencies to study matters such as how AI could affect the labor market and how agencies collect and use commercially available information. It also emphasized the development of new technologies to protect privacy rights and bolster cybersecurity, training on AI discrimination, and the release of guidance on how different agencies should use AI.
The AP noted that many of the items in the original executive order had already been fulfilled, including numerous studies on matters such as cybersecurity risks and AI's effects on education, workplaces and public benefits, so there was not much left to repeal in the first place.
One key provision that is now gone, however, is the requirement that tech companies building the most powerful AI models share details with the government about how those systems work before they are released to the public. Opponents of the executive order had long argued it would reveal trade secrets and hamstring US tech companies.
When the EO was first signed, Alex Hagerup, co-founder and CEO of payment automation and insights solutions provider Vic.ai, said the executive order was a good first step, as its scope was ambitious and comprehensive, though he expressed concern that the order had a lot of moving parts and might be difficult to maintain. Asked how he felt about the repeal, he did not seem especially troubled, noting that players in the AI space should be relying on a collaborative framework rather than a top-down bureaucratic approach anyway.
“AI has the potential to be one of the most transformative forces in modern finance, and fostering innovation in this space is critical. The previous Executive Order was a step toward structured oversight, but any AI regulation must strike a balance—protecting against risk while ensuring we don’t stifle progress. The decision to rescind the order underscores the importance of a more adaptive, market-driven approach to AI governance. Rather than relying on rigid top-down mandates, we need a collaborative framework that evolves with the technology. Responsible AI development doesn’t mean excessive bureaucracy; it means accountability, transparency, and engagement with the businesses actually building these solutions. At Vic.ai, we believe the future of AI in accounting—and across industries—depends on fostering innovation while ensuring AI remains a force for accuracy, efficiency, and trust,” he said.
Pascal Finette, co-founder and CEO of technology consulting firm be Radical, said at the time that the order seemed to have been crafted by people who did not really understand AI in the first place, and that many of its provisions appeared motivated more by fear and worry, pointing to language that implies AI is a weapon that must be controlled. He added that the order felt far-reaching and somewhat reactionary given its focus on foundational models, and that regulation would be much more easily applied at the application level. Overall, though, he was not very concerned there would be any direct impact on the accounting solutions space, as most vendors do not create their own models but instead rely on those built by other companies that may or may not fall under the executive order.
When asked what he thought about the repeal, he repeated that many of the original provisions did not seem well thought out, so it was good that some of the more ill-conceived aspects were being cancelled, though the move leaves open questions about sustainability and responsibility.
“I’d say (with probably anything Trump says or does), it’s too early to tell. On one hand, I think it’s good and useful that we removed this somewhat ill-advised policy; on the other hand, there are huge question marks around the responsible and long-term sustainability of AI and its impact on society and businesses,” said Finette.
Aaron Harris, chief technology officer at practice management solutions provider Sage, said at the time the executive order was signed that it was an important step forward, given the rapid proliferation of AI technologies. He felt then that the order set the appropriate tone for AI development, as it emphasized safe and responsible use. Today, he said there is a need to nurture and support AI innovation while recognizing that SMEs need to feel confident they are working with technology partners who adhere to safe, ethical AI development practices. Harris added that the Trump administration's decision to cancel the executive order does not change this fundamental relationship.
“AI remains one of the greatest opportunities of our time. And as AI evolves, it’s expected that governmental policies and regulations around AI will as well. At Sage, our stance remains that AI practices must be ethical and responsible. We are committed to building AI technology for the future that is safe, transparent and trustworthy. As the regulatory landscape evolves, our mission remains clear: to innovate responsibly and empower businesses without compromising ethical standards. In the U.S., I am optimistic that the current administration will continue to create opportunities to evolve and accelerate AI innovation — improving lives and driving economic growth — while staying true to our duty of upholding the highest standards of ethics and trust,” said Harris.
Amy Matsuo, regulatory insights leader at KPMG, noted in a statement that the repeal might lead to increasing divergence between state and federal regulators. She also pointed out that while there is nothing yet to replace the executive order, companies should still expect some regulatory focus on their AI ambitions.
“As expected, the new Administration has repealed the previous Administration’s 2023 AI Executive Order, but did not immediately initiate a series of net-new AI actions. To drive US leadership in AI, the new Administration is reportedly looking to expand data center and energy capacity and encourage innovative model development and application. Companies should expect regulatory focus on critical security, national security and sensitive data. However, increased divergence with state and global AI- and privacy-related regulatory activity will increase (with a flurry of 2025 state bills already in motion), resulting in a continued regulatory patchwork as well as likely expanded state AG actions,” said Matsuo.
The news comes around the same time that the administration also announced a $500 billion investment in AI technology.
“The $500 billion investment is pretty nuts—I’m not sure if you saw the comparisons, but it’s a multiple of the cost of the whole Apollo program (in today’s dollars),” said Finette.