Companies believe that AI regulations are increasing, and they view compliance with regulatory requirements as a priority, but few organizations have automated processes that will be key to meeting new regulations, according to the findings from the 2023 AI Regulations Readiness study from Verta Insights.
The study, which collected feedback from more than 300 organizations, examined trends in how organizations are preparing for upcoming AI regulations like the EU AI Act and the American Data Privacy and Protection Act (ADPPA).
The research looked at the extent to which organizations have automated AI governance and model documentation processes to understand how prepared they are to meet various compliance requirements called for in different proposed laws.
The results suggest relatively low levels of maturity around automating these processes. Close to 90% of companies have little or no automation in place for the governance or documentation that they will need to rely on to ensure regulatory compliance. This includes capabilities like bias detection, model explainability and transparency.
This relatively low level of maturity contrasts with another finding from the research, where we asked participants to rate their level of confidence that their organization would be able to complete an Algorithm Impact Assessment as specified by the proposed ADPPA. More than one-quarter (28%) of respondents were highly confident about their organization’s ability to meet this requirement, while nearly half (49%) were somewhat confident.
Participants also provided an estimate of how many person-hours they believed it would require for their organization to complete the assessment for one model. The average across all participants: 40 hours for one assessment covering one model or model version.
Verta Insights Perspective
The levels of automation around governance and documentation processes that underpin regulatory compliance don’t support the optimistic projection of 40 hours needed to complete an algorithm impact assessment for one model. We frequently hear from industry that this process would take from several weeks to several months for a single model.
Organizations typically have not standardized or centralized documentation around models. It’s very often a manual, ad hoc process. Moreover, documentation for third-party or open-source models may be difficult or impossible to obtain.
As a result, when regulations come into force, many organizations will have to go through a multi-quarter, largely manual effort to retroactively compile the documentation needed for compliance. They’ll be relying on data scientists who likely worked on the models in question months or quarters ago, if those individuals are even with the organization anymore.
The teams best positioned to meet the new requirements will have preemptively put tooling and processes in place to centralize their model inventory and standardize documentation. This kind of tooling can also deliver operational benefits such as streamlined handoffs between stakeholder teams, better visibility into model performance, improved governance and lower risk.
For more detail on the requirements in the proposed EU AI Act, including timelines, penalties, compliance challenges and more, download the EU AI Act Overview and Article 10 and 11 fact sheets from Verta.