As enterprises adopt digital as a way of life, hyper-automation is a key cog in the wheel that turns high-touch processes into low-touch and, finally, no-touch processes. The advent of Robotic Process Automation (RPA), which automates repetitive tasks, has led to exponential savings. Today, data science and analytics are infusing intelligence into automation, ushering in an era of hyper-automation.
The role of automation in improving efficiencies, quality, productivity, safety, and reducing waste has been recognised by a variety of industries for decades. This entire journey of automation is leading to a proliferation of digital workers in the enterprise (also known as bots) that support humans or autonomously execute tasks.
With this increase in digital workers, compliance audits against industry standards, traditionally executed on humans and processes, have now become a necessity for the digital workforce, a.k.a. bots. That said, enterprises can benefit immensely from hyper-automation, which enables optimisation and modernisation of operations at scale.
For example, an organisation was recently facing issues with vendor payments, duplicate payments, and incorrect supplier debits, in spite of traditional automation and multiple human gating points. A traditional sampling-based audit could not establish the extent of the problem. The organisation therefore considered hyper-automation (machines performing the audit with 100% coverage, as against sampling) in a bid to minimise human involvement and cut losses.
New risk from new technology
Just as hyper-automation can change how an organisation does business, it also creates new challenges in the control environment. Internal audit can help the organisation understand, evaluate and assess the new governance, risk management and control mechanisms associated with hyper-automation using cognitive systems.
Organisations that deploy hyper-automation need to establish a mechanism to monitor and govern risks. They need to define key risk indicators (KRIs) for their bots, which serve as "early warning systems". This capability to audit digital workers/bots will be at the heart of addressing the compliance, consistency, integrity and security of operations.
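The article does not prescribe specific indicators, but the early-warning idea can be sketched in a few lines of Python. The KRI names and thresholds below are hypothetical illustrations, not a recommended set:

```python
from dataclasses import dataclass

# Illustrative key risk indicators (KRIs) for a single bot; the
# fields and thresholds are hypothetical, chosen only to sketch
# how an "early warning system" might flag a drifting bot.
@dataclass
class BotKRIs:
    error_rate: float           # fraction of failed transactions
    avg_latency_ms: float       # mean processing time per item
    exception_queue_depth: int  # items escalated for human review

THRESHOLDS = {
    "error_rate": 0.02,
    "avg_latency_ms": 5000,
    "exception_queue_depth": 100,
}

def early_warnings(kris: BotKRIs) -> list[str]:
    """Return the names of any KRIs that breach their thresholds."""
    breaches = []
    for name, limit in THRESHOLDS.items():
        if getattr(kris, name) > limit:
            breaches.append(name)
    return breaches
```

In practice the thresholds would come from the governance framework and be tuned per bot; the point is only that breaches become machine-checkable signals rather than findings discovered after the fact.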
The key is to establish a governance framework suited to an "automation-driven" environment. Here are a few of the base principle areas that need to be covered to run Bot Audit-as-a-Service:
- Bot password encryption - Passwords and other sensitive information should be encrypted both at rest and in motion
- Open source compliance - Ensure no high-risk open-source libraries are used in the product that would compromise the security and compliance of the bot
- Open source vulnerability - Ensure there are no known security vulnerabilities in the open-source components used
- GDPR compliance - Effective mechanisms to ensure PII data is handled securely throughout the bot's operation, in compliance with GDPR requirements
- Cognitive bot identification - Validate the presence of a cognitive component in a bot to differentiate an RPA bot from a cognitive bot. This has significant implications: cognitive bots can 'degenerate' over time, producing inaccurate results that would go unnoticed unless such bots, once identified, are metered, governed and audited separately
- Information security and vulnerability checks for bots
- Explainability, transparency and fairness - Auditing the explainability, transparency and fairness of AI-based bots is also crucial. Consider international payments processing: in a screening process for suspect payments, any complex link to a 'blacklisted' entity results in the payment being stopped. The compliance team will need a clear, white-box explanation from the machine on why it decided to flag the "suspect payment". Auditing whether the machine can offer such explainability is therefore key to successful adoption and sustainability
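A couple of the base principles above lend themselves to automated checks. The sketch below is a minimal, hypothetical "virtual auditor" in Python covering two of them (credential encryption and PII leakage into logs); the manifest fields, check names and regex patterns are illustrative assumptions, not part of any real Bot Audit-as-a-Service product:

```python
import re

# Crude illustrative patterns for PII that should never appear in
# a bot's logs; a real auditor would use far richer detection.
PII_PATTERNS = [
    re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),        # US SSN-like number
    re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),  # email address
]

def check_password_encryption(manifest: dict):
    """Base principle: credentials encrypted at rest and in motion.
    The manifest keys here are hypothetical."""
    if not manifest.get("credentials_encrypted_at_rest", False):
        return "Credentials not encrypted at rest"
    if not manifest.get("credentials_encrypted_in_transit", False):
        return "Credentials not encrypted in transit"
    return None

def check_pii_in_logs(log_sample: str):
    """Base principle: PII handled securely, never leaked to logs."""
    for pattern in PII_PATTERNS:
        if pattern.search(log_sample):
            return "Possible PII leaked to logs"
    return None

def audit_bot(manifest: dict, log_sample: str) -> list[str]:
    """Run all checks against one bot; an empty list means no findings."""
    findings = [check_password_encryption(manifest),
                check_pii_in_logs(log_sample)]
    return [f for f in findings if f is not None]
```

Because every check yields a named finding rather than a pass/fail verdict, the same output can feed both an auditor's report and the governance dashboards discussed next.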
While building governance dashboards for bots that run on heterogeneous technologies across the landscape, organisations must ensure that the digital workers comply with the organisation's standards and policies on a routine basis. Essentially, a set of 'virtual auditors' audit the 'worker bots' against the base principles listed above. Such reporting provides considerable peace of mind to various personas, including the CISO, IT and business leaders in the enterprise.
A deep revolution in customer experience, innovative products and services, capacity improvements, cost management, and new business models is underway at the hands of hyper-automation. Admittedly, navigating the complexities of hyper-automation becomes challenging as scale increases exponentially. Machine-based "checker bots" that audit digital workers at scale and reinforce human auditors may be the only scalable way to ensure adequate probes are in place to identify potential issues.
Hyper-automation requires new considerations for governance and controls to manage risk. Whether your hyper-automation program is just beginning or is already up and running, internal controls through 'Bot Audit-as-a-Service' can give businesses confidence that they operate in a secure, explainable, transparent and fair manner.

(The author is Vice President and Head - AI & Automation Ecosystem, Wipro Limited.)