Takeaways from Rutgers Business School’s 38th World Continuous Auditing & Reporting Symposium

I attended the Rutgers Business School 38th World Continuous Auditing and Reporting Symposium on November 4th & 5th, 2016 on the Rutgers campus in Newark, NJ.  This was the 4th of these symposiums I’ve attended, and all were very worthwhile.  The symposium was once again sold out, and there were attendees watching the webcast from all around the world.  These are my takeaways from the two days.  I invite comments from other attendees or the presenters to correct any errors and add information you feel is important that I left out.

38th World Continuous Auditing & Reporting Symposium

Dr. Miklos Vasarhelyi (Miklos) hosts the symposium and led off the first day with an update on developments in continuous auditing (CA) and continuous monitoring (CM).  Miklos also spoke with pride about the Rutgers Accounting Web that now includes more than 800 video hours of accounting classroom instruction, an accounting research directory and numerous other resources.  All accountants and compliance professionals should have this bookmarked.

A panel discussed the Rutgers AICPA Data Analytics Research Initiative (RADAR).  The three components of the project are:

  1. Multi-dimensional audit data (MADS): “This project will propose an outlier prioritization methodology to identify a sample that is more likely to be problematic in performing substantive tests of details.”  The goal is to develop methods to identify and remove high-risk transactions from the population, subject these to detailed testing, and develop a framework to justify reduced scrutiny/testing of the remaining population, which has a much lower risk of error/non-compliance.
  2. “Sandbox Project: The sandbox project proposes to look at a range of audit analytics including:
    1. Process mining,
    2. Text mining,
    3. Continuous control monitoring, and
    4. Risk-based prioritization of controls to be tested.”
  3. “Visualization: This project will address the understanding of the basic axioms of visualization in the audit process as well as its integration with audit analytic methods.”

The panel discussed using visualization to identify outliers such as journal entries posted after 9 PM, of a certain dollar amount, or posted by certain individuals.  They discussed defining the critical path of transactions and using audit analytics to identify transactions that deviate from this critical path.  They also discussed applying analytical tools to ERP activity logs to identify unusual transactions for testing.  The overall goal is to improve the efficiency and effectiveness of the audit process.  They emphasized that a new framework resulting from this process must replace current methods rather than adding audit testing on top of the current process.  All panel members agreed that the current audit process is lengthy, expensive and in need of improvement, but new methods will only gain acceptance if they reduce effort and cost.
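
To make this concrete, here is a quick rule-based sketch of the kind of outlier flagging the panel described (late-night postings, large amounts, postings by particular users).  The column names, threshold and user list are my own illustrative assumptions, not anything the panel specified.

```python
import pandas as pd

# Hypothetical journal entry extract; column names and values are made up for illustration.
entries = pd.DataFrame({
    "je_id":     ["JE001", "JE002", "JE003"],
    "posted_by": ["asmith", "sysadmin", "bjones"],
    "posted_at": pd.to_datetime(["2016-10-03 14:22", "2016-10-03 21:47", "2016-10-04 09:05"]),
    "amount":    [1250.00, 50000.00, 987.34],
})

HIGH_RISK_USERS = {"sysadmin"}      # e.g., IDs that should not normally post entries
AMOUNT_THRESHOLD = 10000            # illustrative materiality cut-off

flags = pd.DataFrame({
    "after_hours":  entries["posted_at"].dt.hour >= 21,          # posted after 9 PM
    "large_amount": entries["amount"] >= AMOUNT_THRESHOLD,        # unusually large value
    "risky_user":   entries["posted_by"].isin(HIGH_RISK_USERS),   # posted by flagged individuals
})

# Entries with any flag go to detailed testing; the rest form the lower-risk remainder.
outliers = entries[flags.any(axis=1)]
print(outliers)
```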

This process will likely take years to arrive at new auditing standards that are supported by all parties, including the SEC and PCAOB, but all were encouraged that the process is underway under the leadership of the AICPA with the support of the major audit firms.  Patience is required, but progress is happening.

Michael Cangemi, author of Managing the Audit Function, presented 2016 FERF research: Data Analytics and Financial Compliance: How Technology is Changing Audit and Business Systems.  He made the point that continuous monitoring should apply to the entire population and should be timely, but cautioned against being scared off by the notion of “continuous”: the tools should monitor continuously, while review and action can be periodic.  He encouraged the audience to get started with monitoring tools, expand as you learn, evolve your process, and continuously improve.

Michael mentioned that one of the main concerns of senior management and the Board is the rising cost of compliance.  He noted that total compliance/audit costs have increased more than 100% since the 2002 SOX Section 302/404 requirements took effect and, per an FEI survey, increased 3.2% and 3.4% in 2014 and 2015, respectively.  Management seeks a return on investment (ROI) beyond assurance, while auditors view assurance as the main goal, so there is a disconnect.

Michael discussed highlights from the FEI Research study on Data Analytics & Financial Compliance as follows:

  1. “Audit quality is the primary goal;
  2. Detection and recovery of duplicate payments is an “easy win” with analytics;
  3. Analytics can be used to identify risk;
  4. Some auditors wish to partner with the business, others feel they need to operate independently with their use of analytics;
  5. There is a shortage of staff trained and experienced with the current data analytics tools.”

Michael noted that the public accounting firms are hesitant to explore and expand the use of data analytics until they are confident that the PCAOB will accept these methods as appropriate and adequate audit evidence that replaces the traditional methods of auditing.

Michael called attention to the staffing challenge related to the use of analytics:
“Internal Audit & Public Accounting need people who:

  1. Know how to audit,
  2. Understand work processes, and
  3. Have expertise in technology or an interest in learning to use new software solutions.”

He concluded by stressing that auditors need to fight for resources, both analytic tools and staff who can leverage them, and push forward to incorporate analytics in both the business process and the audit process.  Only by challenging the status quo can we move the profession forward to more timely, effective and efficient oversight using analytics.

William Bible, Deloitte partner, presented “Blockchain Technology – Disrupting the World of Accounting”.

He described blockchain as a fusion of:

  1. Peer-to-peer networking
  2. Public-key cryptography
  3. Proof of work

William noted that each of the above is computationally taxing, and even more so taken together; however, the computational power of current systems makes the widespread use of blockchain possible and even routine.
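
As a rough illustration of why proof of work is computationally taxing, here is a minimal sketch (my own example, not from the presentation) that searches for a nonce whose SHA-256 hash begins with a given number of zeros; each additional zero multiplies the expected work by 16.

```python
import hashlib

def proof_of_work(block_data, difficulty=4):
    """Search for a nonce whose SHA-256 digest starts with `difficulty` hex zeros."""
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1

nonce, digest = proof_of_work("example block payload")
print(nonce, digest)  # raising `difficulty` makes the search exponentially more expensive
```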

He went on to discuss that some key aspects of blockchain are:

  1. Everyone has access to all transactions but, due to encryption, individual transactions are only available to those w/ the private key.
  2. Blockchain is a continuously growing database of transactions.
  3. Each transaction has a unique identifier and cannot be changed once recorded.  The assigned hash value puts the transaction in sequence with all prior and subsequent blocks, so any attempt to modify a record is detected when the network validates the chain of hashes (a minimal sketch of this appears after this list).
  4. Blockchain ledgers are:
    1. Immutable,
    2. Transparent,
    3. Redundant,
    4. Distributed and
    5. Timestamped
  5. Blockchain is distributed, not centralized.  No central party needs to perform validation the way a centralized database requires (e.g., a bank, credit card network, broker, insurer, or rewards program).
  6. Blockchain enforces one standard for all parties. Data standardization helps:
    1. Financial statement preparation
    2. Auditing technique development
    3. Tools and analytics development
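
To illustrate the immutability point above, here is a minimal hash-chain sketch (my own toy example, not Deloitte’s material): each block stores the hash of its predecessor, so editing an earlier record breaks every later link when the chain is revalidated.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents, which include the previous block's hash."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

# Build a tiny chain: each block records the hash of its predecessor.
chain = []
prev = "0" * 64
for txn in ["pay vendor A 100", "pay vendor B 250", "refund customer C 40"]:
    block = {"txn": txn, "prev_hash": prev}
    prev = block_hash(block)
    chain.append(block)

def chain_is_valid(chain):
    """Recompute every hash; any edit to an earlier block invalidates all later links."""
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

print(chain_is_valid(chain))          # True
chain[0]["txn"] = "pay vendor A 999"  # attempt to alter history
print(chain_is_valid(chain))          # False – the network would reject this chain
```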

Not all blockchains are created equal:

  1. “Permissionless” – Open to all parties.  Bitcoin is an example.
  2. Permissioned – Set up by a consortium of parties who specifically grant permission to join.

William concluded by noting that blockchains have several features that make them ideal for certain applications.  He expects the use of blockchains, especially “permissioned” ones, to continue to expand in the future.

Seth Rosensweig, a PwC partner, presented “The Audit Analytics (R)evolution”, discussing their practice focused on analytics and the possibilities to “change the paradigm”.  They would like to transition their staff from:

  • Reactive to Proactive
  • Siloed to Linked
  • Data Supported to Data Driven
  • Static to Agile and Adaptive

He discussed an “Analytics Maturity Model in Internal Audit”.

Seth’s five E’s for the Analytics Revolution:

  1. Enable – build technology as a capability (not an add-on) – e.g., unstructured text analytics for lease accounting (contract extraction), integrating optical character recognition.
  2. Embed – automated analytic apps (“robot auditor”) – process model discovery – electronically test the “truth” of the process a transaction needs to follow (a minimal conformance-check sketch follows this list).
    Risk assessment analytics – continuous risk assessment – profile the data and look for risky or unexpected results.
  3. Empower – define analytics-related roles and performance objectives.
  4. Enhance – training on an “analytics mindset”; staff should know how to build pivot tables and navigate Tableau, and some team members should know how to use SAS.
  5. Execute – conduct CAATs w/ a business feedback loop.
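
Here is a bare-bones illustration of the process model discovery idea in item 2: check each transaction’s event trail against the expected process path and route non-conforming traces for review.  The path, event names and log below are assumptions I made up for the example.

```python
# Expected process path and event log are illustrative assumptions.
EXPECTED_PATH = ["purchase_order", "goods_receipt", "invoice", "payment"]

event_log = {
    "TXN-1001": ["purchase_order", "goods_receipt", "invoice", "payment"],
    "TXN-1002": ["invoice", "payment"],                                     # skipped PO and receipt
    "TXN-1003": ["purchase_order", "invoice", "goods_receipt", "payment"],  # steps out of order
}

def conforms(events, expected=EXPECTED_PATH):
    """A trace conforms only if its events match the expected path exactly, in order."""
    return events == expected

exceptions = [txn for txn, events in event_log.items() if not conforms(events)]
print(exceptions)  # ['TXN-1002', 'TXN-1003'] would be routed for auditor review
```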

John Gomez, CEO of Sensato Cybersecurity Solutions, presented “Cybersecurity Risks: Myths, Fallacies and Facts”.  He noted that most breaches go undetected for 265 days on average, and that this dwell time has grown over the years from roughly 15 days.  Given a 265-day dwell time, internal control procedures like requiring password changes every 90 days obviously don’t help much.  John said that if the attacker figured out a password once, they will re-run the same approach to figure it out again…or they have moved on, hold an administrative password, and no longer need a user password.

John went on to indicate that encryption, another internal control, doesn’t matter as much as many compliance professionals think: once the attacker has your credentials, they have the same rights you do.  Encryption is not an end-all, be-all.

Monitoring data activity to detect breaches is appropriate, but John cautioned against taking too much comfort that this procedure will detect an attack.  Attackers do not take huge amounts of data at once because they know this would lead to detection.

John discussed the disturbing migration from hackers to attackers (well-funded and deadly serious).  He classified attackers as follows:

  1. Criminals – profit motivated; EAS (Espionage As a Service): they post to online sites, “we can get this data if anyone wants it”, then execute a statement of service for those who contract them to obtain the data.  It’s ransomware as a business.
  2. Spies – nation states – highly sophisticated and resourced
  3. Terrorists – most dangerous, based on ideology.

John described the “Attacker Methodology” as follows:

  1. Mission planning
  2. Intelligence gathering
  3. Assess vulnerabilities
  4. Infiltration
  5. Exploitation
  6. Exfiltration
  7. Mission review

He cautioned to bear in mind that attackers do not have a timeline.  They have as much time as they decide to devote.

John also advised looking for domains adjacent to your own (similar names with slight misspellings) and gaining control of them; it is a good idea to register these adjacent domains yourself.  He gave the example of “wellpoint.com” versus “we11point.com”.
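
As a quick, hypothetical sketch of how you might enumerate adjacent domains worth registering or watching, the snippet below substitutes common character look-alikes; the substitution table is my assumption, not John’s tooling.

```python
from itertools import product

# Common character look-alikes an attacker might use in an adjacent domain.
LOOKALIKES = {"l": ["l", "1"], "o": ["o", "0"], "i": ["i", "1"]}

def adjacent_domains(name):
    """Generate look-alike variants of a domain name (excluding the original)."""
    choices = [LOOKALIKES.get(ch, [ch]) for ch in name]
    variants = {"".join(combo) for combo in product(*choices)}
    variants.discard(name)
    return variants

print(sorted(adjacent_domains("wellpoint.com")))
# includes 'we11point.com' – the kind of domain worth registering or monitoring yourself
```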

John reminded us that attackers collaborate by nature.  Cyber levels the playing field.  A common person with knowledge can have the same capability as the largest military.

John gave the following recommendations:

  1. You must have relevant, timely data security and privacy policies.
  2. Executives must understand the risks and support the efforts with needed resources.
  3. Every organization needs a one- to three-year cybersecurity plan.
  4. Deploy honeypots in your network.  This is a low-cost/high-return technology to detect/deflect attackers.

Nigel Cannings of Credibility Analysis International (CAI) and Intelligent Voice presented “Giving Voice to Compliance”.  He discussed ways to analyze live and recorded telephone calls to identify indicators of fraud and other issues.  The tools either analyze the raw audio or transcribe it to text for analysis.

Nigel noted that G.711 is the standard way of transmitting voice over telephone networks.  The standard was developed in 1972, and given the limitations of technology and storage at the time, a key focus was reducing the amount of signal data (8 kHz sampling with 8-bit logarithmic encoding) to allow adequate processing on the systems then in place.  As a result, G.711 provides a very low-quality signal, which complicates analysis and accurate conversion to text.

Nigel went on to describe some use cases for this technology.  One application is to “flag” potential rogue stock traders.  A second application is to analyze insurance claims reports and insurance applications.  They use 47 different markers, analyzed in a neural network with machine learning, to identify calls with “suspicious language” for further analysis, and they continuously improve their detection algorithms to reduce false positives.  The goal is to analyze live calls, identify the calls (the majority) that have no indicators of fraud and speed up the processing of those transactions to improve customer service, and “flag” the calls (the minority) that have some indicators of fraud for a greater level of scrutiny and follow-up.

Brad Ames, Director of Internal Audit at HP, presented “Monitoring Appropriateness of Changes to Automated Controls”.  He pointed out that many application controls are configured in the same tables (HP has 43 application controls that all reference the same SAP table T030).  He therefore recommends monitoring changes to that table; by ensuring all changes to it are appropriate, you address all 43 application controls at once.
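
As a sketch of what monitoring that one table could look like, the snippet below compares a hypothetical change-log extract against approved change tickets; the field names and ticket scheme are my assumptions, not HP’s or SAP’s actual structures.

```python
# Hypothetical change-log extract for a key configuration table (e.g., SAP T030),
# compared against approved change requests; field names are illustrative assumptions.
change_log = [
    {"change_id": "C-101", "field": "account_mapping", "changed_by": "jdoe", "ticket": "CHG-553"},
    {"change_id": "C-102", "field": "account_mapping", "changed_by": "xlee", "ticket": None},
]

approved_tickets = {"CHG-553", "CHG-560"}

unapproved = [c for c in change_log if c["ticket"] not in approved_tickets]
for change in unapproved:
    # In practice this would trigger an e-mail to the change's authorizer asking for an explanation.
    print(f"Follow up on {change['change_id']} made by {change['changed_by']}")
```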

Brad also recommends monitoring that compares GL accounts to accounting standards.  If there is a change to an account in the GL system, send an e-mail to the authorizer of the change, obtain an explanation, assess the explanation, and document/file this oversight activity.  He describes such monitoring as very efficient because the review is timely, so the requestor still remembers the reason for the change.  In addition, the quick turnaround to request an explanation sends a message that all changes are monitored, which reduces the risk of unauthorized changes.

Brad further recommended to “trend the transaction flow through the GL A/C”.  Essentially, set expectations for the types, sources, volumes, and dollar amounts of activity in each GL account and then monitor the activity.  Identify activity that differs from expectations, “flag” it for review, and request an explanation (an e-mail received back w/ the business justification).  For example, a posting whose source code indicates the payroll system, made to an account not identified as appropriate for payroll postings.
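
A minimal sketch of that source-to-account expectation check follows; the account numbers and source codes are made up for illustration.

```python
# Expected GL accounts per source system; a posting outside its expected set is flagged.
EXPECTED_ACCOUNTS = {
    "PAYROLL": {"6000", "6010", "6020"},
    "AP":      {"2000", "2010"},
}

postings = [
    {"doc": "D-1", "source": "PAYROLL", "account": "6010", "amount": 54000},
    {"doc": "D-2", "source": "PAYROLL", "account": "4500", "amount": 1200},   # unexpected account
]

flagged = [p for p in postings
           if p["account"] not in EXPECTED_ACCOUNTS.get(p["source"], set())]
for p in flagged:
    print(f"Request business justification for {p['doc']}: {p['source']} posted to account {p['account']}")
```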

Eric Au, Leader – Analytics Innovation Group at Grant Thornton (Canada), presented “How Professional Services are being Revolutionized with AI”.  Eric made the following points:

  1. Anomaly detection in finance is an appropriate use of artificial intelligence (AI);
  2. His team works w/ MindBridge to identify ways to use their AI to push the audit profession forward;
  3. Journal entry testing is one area they see as an important target for this application because JE testing requires a lot of judgement, thereby requiring an experienced (i.e., expensive) auditor.  They see that there is a “mental cost” as humans proceed through review of many JEs.  This “mental cost” can lead to reduced scrutiny as the auditor proceeds.  The machine doesn’t become fatigued and therefore the level of scrutiny remains consistent.
  4. To properly execute these complex tasks requires an auditor to understand not only the item under review but also what is around that item (the context).  AI has this contextual potential.
  5. Risk is many shades of grey, so evaluation of risk should be on a continuous scale.
  6. If you can home in on the risky transactions, you can not only do a better job but also save the time previously spent looking at many transactions (i.e., the typical random sample) that are not risky.
  7. K-means clustering – finds connections and groupings to cluster data sets; you must group before assessing which clusters are of concern (see the clustering sketch after this list).
  8. Machine learning is targeted at learning to identify anomalous transactions over time.
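
Here is a small, self-contained sketch of the k-means grouping step Eric described, using scikit-learn on synthetic journal-entry features (amount and posting hour).  The features, cluster count and “small cluster” rule are my illustrative assumptions; real data would normally be standardized first.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic journal-entry features: [amount, posting hour].
rng = np.random.default_rng(0)
routine = rng.normal(loc=[1000, 14], scale=[200, 2], size=(200, 2))   # typical daytime entries
unusual = rng.normal(loc=[50000, 23], scale=[5000, 1], size=(5, 2))   # large, late-night entries
features = np.vstack([routine, unusual])

# Group first, then judge which groups look anomalous.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
labels = kmeans.fit_predict(features)

sizes = np.bincount(labels)
suspect_clusters = np.where(sizes < 0.05 * len(features))[0]   # small clusters -> auditor review
print("Clusters to review:", suspect_clusters, "cluster sizes:", sizes)
```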

Jun Dai, a PhD student at Rutgers Business School, presented “Imagineering Audit 4.0”.  Jun referenced the German Trade and Invest initiative Industrie 4.0, which is focused on industrial IT built on the internet of things, as a basis and motivation for a similar future state of auditing she calls “Audit 4.0”.  Jun describes it this way: “Audit 4.0 will piggyback on technology promoted by Industry 4.0 to collect financial and non-financial information, and analyze, model, and visualize data for the purpose of providing effective, efficient, and real-time assurance”.  For example, data from machine sensors related to the quantity of inputs, energy used, processing time and other factors can be used to validate (e.g., recalculate based on yield formulas) the amount of finished goods inventory produced by a process.
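
As a toy example of the sensor-based validation Jun described, the sketch below recalculates expected finished-goods output from a raw-material sensor reading and compares it to the quantity booked in the ERP.  The yield rate, tolerance and figures are invented for illustration.

```python
def expected_output(raw_material_kg, yield_rate=0.92):
    """Expected finished-goods output per an assumed bill-of-material yield formula."""
    return raw_material_kg * yield_rate

sensor_raw_material_kg = 10500    # input quantity reported by machine sensors
reported_output_kg = 9100         # finished goods booked in the ERP
tolerance = 0.02                  # 2% variance threshold before flagging

expected = expected_output(sensor_raw_material_kg)
variance = abs(reported_output_kg - expected) / expected
if variance > tolerance:
    print(f"Flag for review: expected ~{expected:.0f} kg, reported {reported_output_kg} kg")
```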

Jun went on to discuss several graphic models (see link to presentation) that use modeling of the business activities/processes to define expected outcomes and then use continuous monitoring audit software to confirm that actual activity agrees with what the model expects.  All unexpected activities are treated as exceptions and reviewed for error or impropriety or, if valid, used to adjust the model’s expected outcomes.

There was much more presented at the symposium than I have included here.  I met some great people, learned a lot, and came away with some great ideas to improve my work.  Continuous auditing, continuous monitoring and data analytics are enablers of more timely, effective and efficient compliance.