Tax ID numbers, Social Security numbers, net income, etc. CPAs manage a tremendous amount of valuable information for themselves and for their clients. Keeping it safe is a serious responsibility.
Practices are increasingly turning to artificial intelligence to help with data management and security, but paradoxically, that technology can pose security risks of its own. How can a CPA practice use AI tools effectively while remaining responsible for client information that cybercriminals regularly try to access? That’s where the Federal Trade Commission’s Safeguards Rule, whose updated requirements took effect in 2023, comes into play.
Using AI to streamline operations
While AI can perform mundane tasks such as drafting emails and providing customer service via chatbot, its greatest value is in processing large amounts of information and making it accessible to humans.
Artificial intelligence has numerous use cases in accounting. AI can be used to analyze and categorize client receipts, learning to identify questionable or duplicate entries. It can research and summarize information from disparate sources in far less time than a human could, while freeing up an accountant’s time to develop insights and make decisions about the data. AI can review and analyze historical data and create budget forecasts.
When complicated tax questions arise, AI can carry out detailed legal research to identify pertinent legislation and regulations. It can also be used to automate tax return preparation. The list is essentially endless.
Risks to look out for
A tool as powerful as AI comes with risks, however. One of the biggest risk areas associated with AI in accounting is confidentiality. Any client information an AI tool processes, analyzes, or summarizes becomes subject to that tool’s own cybersecurity vulnerabilities. Users need to weigh the value AI brings to a particular application against the possibility of exposing sensitive information.
Users also need to remember that AI is not infallible. It has been shown to produce results that are incorrect or biased. It’s important to view AI output with a critical eye, watching for responses that don’t make sense or that perpetuate biases or stereotypes. Often, these kinds of results can be avoided with well-crafted prompts. Guides and training programs for writing effective AI prompts are beginning to pop up across the internet.
Responsibilities under the FTC Safeguards Rule
As a business that stores personally identifiable information about its clients, a CPA practice must follow federal regulations concerning cybersecurity. In the cybersecurity arena, the Federal Trade Commission has jurisdiction over what it defines as financial institutions, i.e., “companies that offer consumers financial products or services like loans, financial or investment advice, or insurance.” Accounting practices fall squarely under this definition and thus must comply with the FTC’s Safeguards Rule. The rule contains nine main requirements, including designating a Qualified Individual to oversee the firm’s information security program, conducting a risk assessment, regularly testing systems for vulnerabilities, and monitoring service providers’ cybersecurity compliance.
Forming the foundation of an accounting practice’s cybersecurity program is a Written Information Security Plan. This overarching document spells out what the firm would do in the event of a security breach: who makes final decisions, who must be contacted and how, and how the breach would be contained. For CPAs, having a WISP is critical, because they must certify on their application for a Preparer Tax Identification Number that they have one in place. Without a PTIN, a CPA cannot prepare federal tax returns for clients. Accounting firms that do not maintain an up-to-date WISP and comply with the other Safeguards Rule requirements risk having their PTINs revoked.
Making the transition to AI
Experts offer a number of tips to help practices make the transition to AI.
- Adoption doesn’t have to happen all at once. Practices can try out AI a bit at a time, using it for one application and then adding more as staff adjust to it. Products from different AI providers can be tested and compared.
- Using clean data is vital. AI cannot make good reports from bad data, so it’s important to follow good data management practices.
- Training is key. AI is constantly changing, and to make the most of it, associates need ongoing training.
Balancing the rewards and risks of AI tools is critical. Use the Safeguards Rule as a guide to ensure FTC compliance and manage risk.