
Elements of a Comprehensive Trademark Licensing Audit

Internal Audit departments are focused and staffed to effectively address the risks and compliance requirements (e.g., SOX, FCPA, PCI, GDPR) facing their entity.  Internal Audit plans emphasize documenting, testing and improving the major business processes and supporting business change initiatives.  Trademark licensing audits do not fit this focus because they are contract compliance audits of the relevant books and records of a business partner.  There is no business process documentation for planning, nor is there any testing of internal controls.  The focus of these audits is to ensure that the information provided for sales, purchases, advertising, payments, etc. is complete and accurate, and then to analyze this information for compliance with the terms and restrictions of the agreement.  The testing is substantive because the licensee is only required to supply information specific to the agreement.  The licensee may also need to provide some high-level information about their overall business to support agreeing the information provided to their financial records on a sample basis.  The “thinking on your feet” necessary to obtain evidence that the information provided is complete and accurate requires either co-sourcing with specialists or a senior/seasoned member of the audit team.

The most important risk of trademark licensing is potential damage to the brand.  Aspects of product quality and the “fit” of product with the brand image are beyond the scope of a financial audit.  Factory/social compliance risks are likewise beyond the scope and require proof from the licensee of qualified factory social compliance audits or the licensor co-sourcing for such audits.  The scope of the financial licensing audit should include the identification of sales in territories or channels (e.g., clubs, off-price, deep discount) that will degrade the brand, and which violate or exceed limits in the agreement.

Other trademark licensing risks are driven by the clauses in the contract.  Some clauses may make sense to lawyers and management on paper but are difficult, if not impossible, to abide by operationally.  One common example is a clause requiring that any sales of product to the licensor be at the lowest price previously sold to any licensee customer, or even a certain percentage off that lowest price.  It is not practical for a licensee to program their sales order system to meet such a requirement, which means the licensee must manually research prices prior to filling an order for the licensor.  Is it likely the licensee will incur the cost of the time required to manually determine such prices?  Identifying such challenging agreement clauses and developing effective audit tests are critical for a comprehensive trademark licensing audit.
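To make this concrete, here is a minimal sketch of how an auditor might test such a lowest-price clause against the licensee’s invoice line-item data.  The file name and columns (sku, customer, unit_price) are hypothetical, and a real test would also respect the chronology implied by “previously sold”:

```python
# Sketch of a lowest-price clause test against a hypothetical invoice
# line-item extract with columns: sku, customer, unit_price, invoice_date.
import pandas as pd

lines = pd.read_csv("invoice_lines.csv", parse_dates=["invoice_date"])

# Lowest price at which each SKU was sold to any customer other than the licensor.
floor = (lines[lines["customer"] != "LICENSOR"]
         .groupby("sku")["unit_price"].min()
         .rename("lowest_price"))

# Join licensor purchases against that floor and flag lines priced above it.
licensor = lines[lines["customer"] == "LICENSOR"].join(floor, on="sku")
violations = licensor[licensor["unit_price"] > licensor["lowest_price"]]
print(violations[["sku", "invoice_date", "unit_price", "lowest_price"]])
```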

A critical planning aspect of a comprehensive trademark licensing audit is considering where intentional or unintentional understatement of royalties, or overstatement of required spend (e.g., advertising), by the licensee could remain undetected.  For example, shipments directly from factories to customers are commonplace and increasing in frequency in many industries.  This is particularly true of consumer products and especially sales to major retailers, who have much more efficient supply chains than their vendors and can save time and money importing the products using their own logistics.  In today’s highly efficient supply chains, the company that designs and sells the product is now akin to a costly “middleman”.  Such activity presents significant risks of royalty understatement for licensors.  The licensee never takes physical possession of the product and therefore may not be able to invoice customers using their normal/structured process.  In most ERP systems, shipment triggers invoicing and ensures a “clean” sales cutoff.  Because they are not shipping the product, a different workflow, perhaps even manual invoicing, is required for these direct shipments.  Alternate workflows introduce the risk of intentional/unintentional understatements of sales.  An example of a licensee overstating required expenditures is including payments for product positioning (e.g., end caps) or co-operative advertising on schedules supporting required advertising spend.  Some agreements permit these as includable items, but many do not.  Most agreements require advertising in national magazines, digital advertising and mailers but do not permit including advertising support to customers.  A comprehensive trademark audit defines the specific advertising requirements and develops testing to ensure only such advertising is included on the schedules provided by the licensee to support this spend.  Some agreements require the licensee to remit any shortfall in advertising spend to the licensor, whereas others require that these amounts be added to the required spend for a future year.  Either outcome is a significant benefit to the licensor.
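One way to probe the direct-shipment risk is to cross-match an independent record of factory shipments (e.g., freight forwarder or customs data) against the sales the licensee reported for royalties.  A minimal sketch, with hypothetical files and columns:

```python
import pandas as pd

# Hypothetical inputs:
#   factory_shipments.csv: po_number, ship_date, units, customer (freight/customs data)
#   royalty_sales.csv:     po_number, invoice_no, net_sales (licensee's royalty reporting)
shipments = pd.read_csv("factory_shipments.csv")
reported = pd.read_csv("royalty_sales.csv")

# Shipments with no matching reported sale are candidates for unreported
# (and un-royaltied) revenue from the alternate invoicing workflow.
merged = shipments.merge(reported, on="po_number", how="left", indicator=True)
unreported = merged[merged["_merge"] == "left_only"]
print(unreported[["po_number", "ship_date", "units", "customer"]])
```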

Licensee miscoding of product is another risk leading to understatement of sales/royalties.  Some licensees build the brand coding into the SKU number, whereas others use sequential SKU numbers and another field in the product master file to indicate brand.  Given that all the reporting comes from the licensee’s systems, the auditor has no basis to rely on reports as complete and accurate.  In the former example, perhaps the reports/queries do not capture branded SKUs in certain product categories.  In the latter example, perhaps the brand designation code was left blank or input incorrectly, so that sales of the branded product are excluded from the reports used to calculate royalties or provided to the auditor.  Complexity of coding expands further for licensees that ship and invoice multiple brands on a consolidated invoice.  Product coding challenges are typically not known until the auditor is in the field at the licensee’s offices.  A keen understanding of processing and reporting is required to quickly develop audit tests to establish the completeness of sales given such product coding challenges.
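A quick field test for the blank/incorrect brand code scenario is to scan the product master for items that look branded but are not coded to the brand.  A minimal sketch, assuming a hypothetical brand “ACME” and columns sku, description, brand_code:

```python
# Sketch: test brand coding in the licensee's product master.
import pandas as pd

items = pd.read_csv("product_master.csv")

# SKUs whose description mentions the licensed brand but whose brand
# field is blank or coded to some other brand.
looks_branded = items["description"].str.contains("ACME", case=False, na=False)
miscoded = items[looks_branded & (items["brand_code"].fillna("") != "ACME")]

# Any sales of these SKUs would be missing from royalty reports built
# by filtering on brand_code alone.
print(miscoded[["sku", "description", "brand_code"]])
```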

We know that excellent business reporting is always a challenge, mainly due to cost and resources.  Many functions require system reporting to effectively manage their responsibilities, and they must compete for such resources.  As we might expect, reporting built to determine sales owed in royalties to an outside party does not rise to the top of many licensees’ priority lists.  Certainly, front line reporting to manage and maximize sales should be a primary goal, and both the licensee and licensor benefit from effective sales management.  An ERP system report with a heading for the SKU/product and then a listing of units and dollars by customer with totals is helpful for managing sales but not very helpful for the auditor, because the SKU, customer, units and dollars are not on the same line.  The reporting challenge is greater with some of the more nuanced clauses of the agreement.  One example is royalty rates tiered by sales level.  Different royalty rates for different product types are another.  A third is different sales minimums by product category, territory, gender, etc. or maximum sales percentages to certain channels (e.g., off-price).  The auditor typically must accumulate sales data files by invoice line item, with all the appropriate information on each line, to then analyze compliance with all agreement clauses.  In some cases, the auditor can develop an elegant solution in Excel, whereas in other cases the auditor must be willing to grunt through hours of cut/paste to arrive at the complete data file required for compliance analysis.  Experience allows the auditor to determine a solution quickly and get to it.  A bias for action is critical.  Each audit is different, and an auditor likely will not return to the same licensee for years.  There is no time to search for the best solution, just a solution that works.
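Once the line-item file is assembled, recalculating a tiered royalty is mechanical.  A minimal sketch, with made-up tier breakpoints and rates:

```python
# Sketch: recompute a tiered royalty from total net sales. The breakpoints
# and rates are hypothetical; real agreements define their own tiers.
def tiered_royalty(net_sales: float) -> float:
    tiers = [(5_000_000, 0.05), (10_000_000, 0.06), (float("inf"), 0.07)]
    royalty, lower = 0.0, 0.0
    for upper, rate in tiers:
        if net_sales > lower:
            royalty += (min(net_sales, upper) - lower) * rate
        lower = upper
    return royalty

# Example: $12M of net sales -> $5M @ 5% + $5M @ 6% + $2M @ 7%
print(tiered_royalty(12_000_000))  # 690000.0
```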

The skills and experience to complete comprehensive trademark licensing audits on tight budgets require co-sourcing for most companies.  The contract analysis, test development and data analytic skills required for comprehensive trademark licensing audits are unique.  Co-sourcing must be approached with caution, however, because many providers offer these services but simply do not have the skills.  Use the points above in your discussions when interviewing potential co-sourcing partners.  These are typical challenges of an audit, and the provider should be conversant with quick and clear strategies to handle them.  Choosing the right partner is the best way to leverage compliance to identify recoveries, ensure control, and deliver value to your entity.


About the Author

Glenn Murphy, the co-founder of BestGRC and founder of GRC Management Consulting LLC, primarily focuses on empowering entities to leverage their compliance activities through the BestGRC “cloud” software, his consulting work, publications, and the “Leverage Compliance” blog.  In addition, Glenn provides licensee compliance audits in conjunction with Licensing Compliance Group and IT Governance/Cybersecurity Assessments in conjunction with Ra Security Systems.  Find Glenn’s full profile at http://www.linkedin.com/in/glenntmurphy/, follow him @GlennMurphyGRC and subscribe to the Leverage Compliance blog at http://www.bestgrc.com/blog/

Why Internal Audit Should Oversee Trademark Licensing Audits

I performed thirteen trademark licensing audits in the past six months for five different corporations.  These audits were contracted and overseen by Internal Audit (IA) for only one of these five clients.  Upcoming audits we have scheduled or are scheduling with three other clients are similarly contracted by functions other than IA.  All these clients have an IA function.  There are several reasons why IA should oversee trademark licensing audits, so why is co-sourcing for these audits left to others?  IA leaders must communicate the organizational necessity of their contracting and overseeing the trademark licensing audit program.

The Licensing Department is typically a small group in the enterprise.  For this reason, the function may be partly overlooked when preparing full process and controls documentation.  Royalty income from trademark licensing is typically not subject to the internal control procedures used for other sources of revenue.  If invoices are issued for royalties, these are typically for guaranteed minimum royalties or estimated royalties using manual invoicing either within or outside of your ERP system.  Internal control activities should include the oversight and analysis performed by the internal licensing accounting team to arrive at these invoiced/recorded revenues as well as tracking/recording other obligations of the licensee.  Do they analyze and recalculate all the royalty statements, submissions for advertising and other reporting provided by the licensees?  Do they reconcile all payments and follow-up on late/missing payments?  Is there sound internal communication of exceptions (e.g., sales to Warehouse Clubs outside the license agreement) from the licensing team to the accounting team so that they expect/accrue/invoice these additional royalty receivables?
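Where these oversight routines exist, much of the recalculation and reconciliation lends itself to automation.  A minimal sketch of the kind of recomputation the licensing accounting team might run; the files, columns and flat 5% rate are hypothetical:

```python
# Sketch of the licensing accounting team's recalculation routine.
import pandas as pd

stmts = pd.read_csv("royalty_statements.csv")  # licensee, period, net_sales, royalty_reported
pays = pd.read_csv("royalty_payments.csv")     # licensee, period, amount_paid

stmts["royalty_expected"] = stmts["net_sales"] * 0.05  # flat 5% for illustration
recon = stmts.merge(pays, on=["licensee", "period"], how="left")

# Flag statements that under-calculate the royalty or are unpaid/short-paid
# for follow-up with the licensee.
issues = recon[(recon["royalty_reported"] < recon["royalty_expected"]) |
               (recon["amount_paid"].fillna(0) < recon["royalty_reported"])]
print(issues)
```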

There are two aspects to the controls over the licensing function.  The first is the internal processes of the licensing/accounting team to oversee and record the current activity; the second is the audit of licensee books and records to validate complete and accurate reporting and compliance with the licensing agreement.  To the extent that non-compliance is identified by licensee audits and additional amounts are due, there is a shortfall in the overall internal control process, because such recoveries are recorded in an accounting period subsequent to the period in which they are earned.  This risk of understatement is inherent to licensing due to reliance on complete information from an outside partner.  To reduce this risk to an acceptable level, the trademark licensing annual risk assessment (see my prior blog), which determines the frequency of audit of each licensee, should be a key control.  Those licensees deemed higher risk should be audited more frequently to ensure complete and accurate revenues are captured during or close to the appropriate financial reporting period.  Performance of the licensee audits by IA, either directly or through a co-sourcing partner with IA setting the parameters and scope, becomes an extension of the Internal Audit plan and a service that IA provides to the business.  Oversight of licensee audits by a business function requires IA to document and test the oversight performed by that function.  Direct oversight of licensee audits by IA is a more timely and effective approach.
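Such a risk assessment can be reduced to a simple scoring model that drives audit frequency.  The factors, weights and thresholds below are hypothetical illustrations, not a prescribed methodology:

```python
# Sketch: a simple licensee risk score driving audit frequency as a key control.
def audit_frequency_years(royalties: float, past_findings: int,
                          new_systems: bool, direct_ships: bool) -> int:
    score = 0
    score += 2 if royalties > 1_000_000 else 1   # materiality
    score += 2 * past_findings                   # history of under-reporting
    score += 1 if new_systems else 0             # ERP conversion risk
    score += 1 if direct_ships else 0            # alternate invoicing workflows
    # Higher-risk licensees are audited more often (every 1-3 years).
    return 1 if score >= 5 else 2 if score >= 3 else 3

print(audit_frequency_years(2_500_000, 1, False, True))  # -> 1 (annual audit)
```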

The importance of the independence of the IA function, and especially its reporting to the Audit Committee, cannot be overstated.  My experience as a Chief Audit Executive (CAE) for a company with more than 30 licensees highlights this distinction.  The licensee audit co-source partner was overseen by the Finance function for many years before IA added this oversight to our responsibilities in the early 2000s, partly as an extension of oversight in response to SOX requirements.  Many of the licensee audits ended in negotiated settlements prior to our oversight.  We brought the rigor and structure of an IA approach, in coordination with the same co-sourcing partner, which eliminated many of the “gray areas” in the findings.  More importantly, the licensees recognized IA as an independent function with the weight of reporting to the Audit Committee and knew that there was little room for negotiation of amounts due.  The result was incremental recoveries of several hundred thousand dollars each year and a more responsive attitude on the part of licensees to address the issues that led to the findings.  Following IA leadership of licensee audits, the only negotiated settlements related to contract terms that were truly ambiguous, and we tracked such issues to a future contract addendum to clarify the ambiguity.

Our experience was that IA oversight and follow-up led to a higher level of licensee contract compliance over time.  The number and amounts of recoveries decreased, which was our goal.  The decrease in recoveries directly reflected an improvement in the completeness and accuracy of licensee reporting and payment.  This meant that the accounting for licensee activity likewise improved, as did the timeliness of cash flow from licensing.  Adding the oversight of trademark licensing audits to your audit plan is one more way to leverage compliance to improve business process and profitability for your company.

About the Author

Glenn Murphy, the co-founder of BestGRC and founder of GRC Management Consulting LLC, primarily focuses on empowering entities to leverage their compliance activities through the BestGRC “cloud” software, his consulting work, publications, and the “Leverage Compliance” blog.  In addition, Glenn provides licensee compliance audits in conjunction with Licensing Compliance Group and Cybersecurity/NIST/Penetration Tests/SOC for Cyber/SOC 2/3 Assessments in conjunction with Ra Security Systems.  Find Glenn’s full profile at http://www.linkedin.com/in/glenntmurphy/, follow him @GlennMurphyGRC and subscribe to the Leverage Compliance blog at http://www.bestgrc.com/blog/

Does Internal Audit Devote Enough Resources to Managerial Accounting?

Sarbanes-Oxley compliance (SOX) is more routine than in the past but continues to be a major distraction for the internal audit profession.  Continued PCAOB scrutiny, with published Inspection Report Findings, continues to pressure the external audit firms performing 404 reviews, who in turn require expanded SOX compliance activities at their clients to earn a “clean” opinion.  More recently, cybersecurity and data governance have garnered a lot of attention and resources.  Perhaps lost in these distractions is the appropriateness, accuracy and availability of the information management needs to effectively direct and oversee the strategic, financial and operational objectives of the business.  Certainly, compliance professionals can “walk and chew gum at the same time”, but is scrutiny of the most important business information “taking a back seat”?

Financial accounting reports follow generally accepted accounting principles (GAAP).  The users of financial accounting results are mostly external parties to the organization (investors, analysts, banks, government).  Financial accounting is of some value for management, mainly to retrospectively compare their results to others in their industry using the same baseline accounting principles (GAAP).  Managerial accounting provides organizational leaders with the accounting information to run the business, assess current performance, and forecast future performance.  This information is critical to directing operations, setting near term funding needs, and communicating goals to internal stakeholders.

Much of the Management Discussion and Analysis (MD&A) section of financial filings, and the information discussed on the earnings call, is the information management uses to assess performance.  This information falls more in the realm of managerial accounting.  How deeply do we audit this information?  Do we only make sure there is evidence that it is prepared and reviewed with appropriate sign-offs, or do we audit the accuracy of the information?  Do we question the appropriateness of the information versus other managerial accounting measures that perhaps give better insight into the performance of the firm?

Management accounting helps management discern between value-added activities and those activities that do not add value.  Financial accounting concerns only the proper classification of the expenditures related to either activity type.  While it is certainly helpful to understand the cost of activities, it is much more important to identify and eliminate wasteful activities, and to choose optimal value-added activities over those that add less value.  Regardless of the core mission of the business, whether profit or not-for-profit, profits are necessary to sustain the mission over time.

How does internal audit refocus on ensuring the information management needs for decision-making is available and accurate?  Are internal auditors equipped to assess management reporting?  The answer is to simply expand the tools and techniques we use for all internal control activities.  The COSO Framework is as appropriate for managerial accounting and reporting as for all other critical business functions.  As the use of “Non-GAAP” measures in financial filings expands and the need for immediate information for management to make informed decisions grows, the reporting risks increase, and the assessment of the processes and controls that ensure accurate and timely managerial accounting information should become an important part of the annual internal audit plan.  Auditors frequently speak about earning a “seat at the table”.  The best way to earn this “seat” is to provide assurance to key decision makers regarding the accuracy and appropriateness of the information they receive, which is typically managerial accounting information, not financial accounting information.  Look at your risk-assessment and audit plan process to make sure there is an appropriate identification of, and focus on, managerial accounting information and controls.  If necessary, refocus your process to ensure you effectively leverage your compliance activities to optimize your service to all key business constituents.


About the Author

Glenn Murphy, the co-founder of BestGRC and founder of GRC Management Consulting LLC, primarily focuses on empowering entities to leverage their compliance activities through the BestGRC “cloud” software, his consulting work, publications, and the “Leverage Compliance” blog.  In addition, Glenn provides licensee compliance audits in conjunction with Licensing Compliance Group and Cybersecurity/Penetration Tests/SOC for Cyber/SOC 2/3 Assessments in conjunction with Ra Security Systems.  Find Glenn’s full profile at http://www.linkedin.com/in/glenntmurphy/, follow him @GlennMurphyGRC and subscribe to the Leverage Compliance blog at http://www.bestgrc.com/blog/

Takeaways from Rutgers Business School’s 38th World Continuous Auditing & Reporting Symposium

I attended the Rutgers Business School 38th World Continuous Auditing and Reporting Symposium on November 4th & 5th, 2016 on the Rutgers campus in Newark, NJ.  This was the 4th of these symposiums I’ve attended, and all were very worthwhile.  The symposium was once again sold out, and there were attendees watching the webcast from all around the world.  These are my takeaways from the two days.  I invite comments from other attendees or the presenters to correct any errors and add information you feel is important that I left out.


Dr. Miklos Vasarhelyi (Miklos) hosts the symposium and led off the first day with an update on developments in continuous auditing (CA) and continuous monitoring (CM).  Miklos also spoke with pride about the Rutgers Accounting Web that now includes more than 800 video hours of accounting classroom instruction, an accounting research directory and numerous other resources.  All accountants and compliance professionals should have this bookmarked.

A panel discussed the Rutgers AICPA Data Analytics Research Initiative (RADAR).  The three components of the project are:

  1. “Multi-dimensional audit data (MADS): This project will propose an outlier prioritization methodology to identify a sample that is more likely to be problematic in performing substantive tests of details.”  The goal is to develop methods to identify and remove high risk transactions from the population, subject these to detailed testing, and develop a framework to justify reduced scrutiny/testing of the remaining population, which has a much lower risk of error/non-compliance.
  2. “Sandbox Project: The sandbox project proposes to look at a range of audit analytics including:
    1. Process mining,
    2. Text mining,
    3. Continuous control monitoring, and
    4. Risk-based prioritization of controls to be tested.”
  3. “Visualization: This project will address the understanding of the basic axioms of visualization in the audit process as well as its integration with audit analytic methods.”

The roundtable discussed using visualization to identify outliers such as journal entries posted after 9 PM, of a certain dollar amount, or posted by certain individuals.  They discussed defining the critical path of transactions and using audit analytics to identify transactions that are outliers to this critical path.  They also discussed applying analytical tools to ERP activity logs to identify unusual transactions for testing.  The overall goal is to improve the efficiency and effectiveness of the audit process.  They emphasized that a new framework resulting from this process must replace current methods rather than add testing on top of the current process.  All panel members agreed that the current audit process is lengthy, expensive and in need of improvement, but new methods will only gain acceptance if they reduce effort and cost.
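A minimal sketch of such rule-based outlier flags applied to a journal entry extract; the file, columns and list of expected posters are hypothetical:

```python
# Sketch: flag journal entries posted late at night, in round or unusually
# large amounts, or by unexpected users, then prioritize by flags tripped.
import pandas as pd

je = pd.read_csv("journal_entries.csv", parse_dates=["posted_at"])

flags = pd.DataFrame({
    "late_night":   je["posted_at"].dt.hour >= 21,
    "round_amount": je["amount"] % 10_000 == 0,
    "large_amount": je["amount"].abs() > je["amount"].abs().quantile(0.99),
    "odd_poster":   ~je["posted_by"].isin(["jsmith", "mlee"]),  # expected posters
})

# Entries tripping multiple flags go to the top of the detail-testing queue.
je["risk_score"] = flags.sum(axis=1)
print(je.sort_values("risk_score", ascending=False).head(20))
```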

This process will likely take years to arrive at new auditing standards that are supported by all parties, including the SEC and PCAOB, but all were encouraged that the process is underway under the leadership of the AICPA with the support of the major audit firms.  Patience is required, but progress is happening.

Michael Cangemi, author of Managing the Audit Function, presented 2016 FERF research: Data Analytics and Financial Compliance: How Technology is Changing Audit and Business Systems.  He made the point that continuous monitoring should apply to the entire population and should be timely, but urged the audience not to be scared off by the notion of “continuous”.  The tools should monitor continuously, but the review and actions can be periodic.  He encouraged the audience to get started with monitoring tools, expand as they learn, evolve their process, and continuously improve.

Michael mentioned that one of the main concerns of senior management and the Board is the rising cost of compliance.  He noted that total compliance/audit costs have increased more than 100% since the 2002 SOX Section 302/404 requirements took effect and, per an FEI survey, increased 3.2% and 3.4% in 2014 and 2015, respectively.  Management seeks a return on investment (ROI) beyond assurance.  There is a disconnect because the auditors view assurance as the main goal.

Michael discussed highlights from the FEI Research study on Data Analytics & Financial Compliance as follows:

  1. “Audit quality is the primary goal;
  2. Detection and recovery of duplicate payments is an “easy win” with analytics;
  3. Analytics can be used to identify risk;
  4. Some auditors wish to partner with the business, others feel they need to operate independently with their use of analytics;
  5. There is a shortage of staff trained and experienced with the current data analytics tools.”

Michael noted that the public accounting firms are hesitant to explore and expand the use of data analytics until they are confident that the PCAOB will accept these methods as appropriate and adequate audit evidence that replaces the traditional methods of auditing.

Michael called attention to the staffing challenge related to the use of analytics, noting that “Internal Audit & Public Accounting need people who:

  1. Know how to audit,
  2. Understand work processes, and
  3. Have expertise in technology or an interest in learning to use new software solutions.”

He concluded by stressing that auditors need to fight for resources, both analytic tools and staff that can leverage these, and push forward to incorporate analytics in both the business process and the audit process.  Only by challenging the status quo can we move the profession forward to more timely, effective and efficient oversight using analytics.

William Bible, Deloitte partner, presented “Blockchain Technology – Disrupting the World of Accounting”.

He described Blockchain as a fusion of:

  1. Peer-to-peer networking
  2. Key Cryptography
  3. Proof of work

William noted that each of the above is computationally taxing, and even more so taken together; however, the computational power of current systems makes the widespread use of blockchain possible and even routine.

He went on to discuss that some key aspects of blockchain are:

  1. Everyone has access to all transactions but, due to encryption, individual transactions are only available to those with the private key.
  2. Blockchain is a continuously growing database of transactions.
  3. Each transaction has a unique identifier.  The assigned hash value puts the transaction in sequence with all the prior and subsequent blocks, so you cannot modify a prior record/transaction, because the network validates all transactions using the hash values (see the sketch after this list).
  4. Blockchain ledgers are:
    1. Immutable,
    2. Transparent,
    3. Redundant,
    4. Distributed and
    5. Timestamped
  5. Blockchain is distributed, not centralized.  No single central party needs to perform validation, as is required with a centralized database (e.g., bank, credit cards, broker, insurance, rewards points).
  6. Blockchain enforces one standard for all parties. Data standardization helps:
    1. Financial statement preparation
    2. Auditing technique development
    3. Tools and analytics development

Not all blockchains are created equal:

  1. “Permissionless” – Open to all parties.  Bitcoin is an example.
  2. Permissioned – Set up by a consortium of parties who specifically grant permission to join.

William concluded by noting that blockchains have several features that make them ideal for certain applications.  He expects the use of blockchains, especially “permissioned” ones, to continue to expand in the future.

Seth Rosensweig, partner at PwC, presented “The Audit Analytics (R)evolution”, discussing their practice focused on analytics and the possibilities to “change the paradigm”.  They would like to transition their staff from:

  • Reactive to Proactive
  • Siloed to Linked
  • Data Supported to Data Driven
  • Static to Agile and Adaptive

He discussed an “Analytics Maturity Model in Internal Audit”.

Seth’s five E’s for the Analytics Revolution:

  1. Enable – build tech as a capability (not an add-on) – e.g., unstructured text analytics for lease accounting (contract extraction), integrating Optical Character Recognition.
  2. Embed – Automated Analytic Apps – Robot Auditor – Process Model Discovery – test out the “truth” – test electronically the process a transaction needs to follow.
    Risk Assessment Analytics – Continuous Risk Assessment – profile the data and look for risky or unexpected results
  3. Empower – define analytics related roles and performance objectives.
  4. Enhance – training on an “analytics mindset”: staff should know how to build pivot tables and navigate Tableau, and some team members should know how to use SAS.
  5. Execute – conduct CAATs w/ a business feedback loop.

John Gomez, CEO of Sensato Cybersecurity Solutions, presented “Cybersecurity Risks: Myths, Fallacies and Facts”.  He noted that most breaches go undetected for 265 days on average; this duration has increased over the years from 15 days.  Given a duration of 265 days, internal control procedures like requiring password changes every 90 days obviously don’t help.  John said that if the attacker figured the password out once, they will re-run the same approach to figure it out again…or they have moved on to an administrative password and no longer need a user password.

John went on to indicate that encryption, another internal control, doesn’t matter as much as many compliance professionals think, because once the attacker has your credentials, they have the rights you do.  Encryption is not an end-all, be-all.

Monitoring data activity to detect breaches is appropriate, but John cautioned not to take too much comfort that this procedure will detect an attack.  Attackers do not take huge amounts of data at once because they know this will lead to detection.

John discussed the disturbing migration from hackers to attackers (well-funded and deadly serious).  He classified attackers as follows:

  1. Criminals – profit motivated; EAS (Espionage As a Service) – they post to online sites, “we can get this data if anyone wants it”, then execute a statement of service for those who contract them to obtain the data.  It’s ransomware as a business.
  2. Spies – nation states – highly sophisticated and resourced
  3. Terrorists – most dangerous, based on ideology.

John described the “Attacker Methodology” as follows:

  1. Mission planning
  2. Intelligence gathering
  3. Assess vulnerabilities
  4. Infiltration
  5. Exploitation
  6. Exfiltration
  7. Mission review

He cautioned to bear in mind that attackers do not have a timeline.  They have as much time as they decide to devote.

John also advised to look for domains adjacent to your own (a similar name, misspelled) and gain control of these; it is a good idea to register the adjacent domains yourself.  He gave an example of “wellpoint.com” and “we11point.com”.
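A minimal sketch of generating such “adjacent” domain candidates by substituting lookalike characters.  The homoglyph table is a small, hypothetical sample; real tooling covers many more substitutions and also checks registration status:

```python
# Sketch: generate lookalike-domain candidates worth registering or monitoring,
# per the wellpoint.com / we11point.com example.
import itertools

HOMOGLYPHS = {"l": ["1", "i"], "o": ["0"], "e": ["3"], "i": ["1", "l"]}

def adjacent_domains(name, tld=".com"):
    # Each character can stay as-is or be swapped for any of its lookalikes.
    choices = [[c] + HOMOGLYPHS.get(c, []) for c in name]
    for combo in itertools.product(*choices):
        candidate = "".join(combo)
        if candidate != name:
            yield candidate + tld

variants = sorted(set(adjacent_domains("wellpoint")))
print(len(variants), variants[:5])  # the full set includes 'we11point.com'
```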

John reminded us that attackers collaborate by nature.  Cyber levels the playing field.  A common person with knowledge can have the same capability as the largest military.

John gave the following recommendations:

  1. You must have relevant, timely data security and privacy policies.
  2. Executives must understand the risks and support the efforts with needed resources.
  3. Every organization needs a one to three-year cybersecurity plan.
  4. Deploy Honeypots in your network.  This is a low cost/high return technology to detect/deflect attackers.

Nigel Cannings of Credibility Analysis International (CAI) and Intelligent Voice presented “Giving Voice to Compliance”.  He discussed ways to analyze live and recorded telephone calls to identify indicators of fraud and other issues.  The tools either analyze the actual audio or translate the audio to text for analysis.

Nigel noted that G.711 is the standard way of transmitting voice.  This standard was developed in 1972.  Given the limitations of technology and storage in 1972, a focus of the standard was to reduce the signal/data enough to allow adequate processing on the systems in place.  As such, the G.711 standard provides a very low quality signal, which complicates analysis and accurate conversion to text.

Nigel went on to describe some use cases for this technology.  One application is to “flag” potential rogue stock traders.  A second application is to analyze insurance claims reports and insurance applications.  They use 47 different markers and analyze in a neural network with machine learning to identify calls with “suspicious language” for further analysis.  They continuously improve their detection algorithms to reduce false positives.  The goal is to analyze live calls, identify the calls (the majority) that have no indicators of fraud and speed up the processing of those transactions to improve customer service for most customers, and “flag” calls (the minority) that have some indicators of fraud and subject these to a greater level of scrutiny and follow-up.

Brad Ames, Director of IA at HP, presented “Monitoring Appropriateness of Changes to Automated Controls”.  He pointed out that many application controls are configured in the same tables (HP has 43 application controls that all reference the same SAP table T030).  He therefore recommends monitoring the changes to that table; by ensuring all changes are appropriate, you address all concerns for these 43 application controls at once.

Brad also recommends monitoring to compare GL accounts to accounting standards.  If there is a change to an account in the GL system, send an e-mail to the authorizer of the change, obtain an explanation, assess the explanation and document/file this oversight activity.  He describes such monitoring as very efficient, because the review is timely and the requestor will still remember the reason for the change.  In addition, the quick turnaround to request an explanation sends a message that all changes are monitored and thereby reduces the risk of unauthorized changes.

Brad further recommended to “trend the transaction flow through the GL A/C”.  Essentially, set expectations for the types, sources, volume, and dollar amounts of activity for each GL account and then monitor the activity.  Identify activity that differs from expectations, “flag” it for review, and request an explanation (an e-mail received back with the business justification).  An example would be a posting with a source code indicating the payroll system to an account that is not identified as appropriate for payroll postings.
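A minimal sketch of this expectation-based monitoring; the account/source mapping and file are hypothetical:

```python
# Sketch: expected posting sources per GL account, with exceptions flagged
# for explanation, per Brad's "trend the transaction flow" recommendation.
import pandas as pd

EXPECTED_SOURCES = {
    "6000-SALARIES": {"PAYROLL"},
    "4000-REVENUE":  {"BILLING"},
    "1000-CASH":     {"BILLING", "AP", "PAYROLL", "TREASURY"},
}

postings = pd.read_csv("gl_postings.csv")  # columns: account, source, amount

def unexpected(row) -> bool:
    return row["source"] not in EXPECTED_SOURCES.get(row["account"], set())

# e.g., a PAYROLL-sourced posting to 4000-REVENUE lands here, and an
# explanation is requested from the poster by e-mail.
exceptions = postings[postings.apply(unexpected, axis=1)]
print(exceptions)
```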

Eric Au, Leader – Analytics Innovation Group at Grant Thornton (Canada), presented “How Professional Services are being Revolutionized with AI”.  Eric made the following points:

  1. Anomaly detection in finance is an appropriate use of artificial intelligence (AI);
  2. His team works with MindBridge to identify ways to use their AI to push the audit profession forward;
  3. Journal entry testing is one area they see as an important target for this application because JE testing requires a lot of judgement, thereby requiring an experienced (i.e., expensive) auditor.  They see that there is a “mental cost” as humans proceed through review of many JEs.  This “mental cost” can lead to reduced scrutiny as the auditor proceeds.  The machine doesn’t become fatigued and therefore the level of scrutiny remains consistent.
  4. To properly execute these complex tasks requires an auditor to understand not only the item under review but also what is around that item (the context).  AI has this contextual potential.
  5. Risk is many shades of grey, so evaluation of risk should be on a continuous scale.
  6. If you can home in on the risky transactions, you not only do a better job but also save the time previously spent looking at many transactions (i.e., the typical random sample) that are not risky.
  7. K-means Clustering – finds connections and groupings to cluster data sets.  The data must be grouped before assessing which clusters are of concern (see the sketch after this list).
  8. Machine learning is targeted to learn to identify anomalous transactions over time.
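A minimal sketch of the clustering step using scikit-learn; the features and the choice of k are hypothetical, and here random numbers stand in for real journal entry attributes:

```python
# Sketch: cluster journal entries with k-means before deciding which
# clusters merit scrutiny.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Stand-in features: e.g., log amount, posting hour, lines per entry.
X = rng.normal(size=(500, 3))

km = KMeans(n_clusters=5, n_init=10, random_state=0)
labels = km.fit_predict(StandardScaler().fit_transform(X))

# Small clusters far from the bulk of activity are candidates for review.
counts = np.bincount(labels)
print({cluster: int(n) for cluster, n in enumerate(counts)})
```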

Jun Dai, a PhD student at Rutgers Business School, presented “Imagineering Audit 4.0”.  Jun referenced the German Trade and Invest initiative Industrie 4.0, which is focused on industrial IT using the internet of things, as a basis and motivation for a similar future state of auditing she calls “Audit 4.0”.  Jun describes, “Audit 4.0 will piggyback on technology promoted by Industry 4.0 to collect financial and non-financial information, and analyze, model, and visualize data for the purpose of providing effective, efficient, and real-time assurance”.  For example, data from machine sensors related to quantity of inputs, energy used, processing time and other factors can be used to validate (e.g., recalculate based on formulas) the amount of finished goods inventory produced by a process.
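A minimal sketch of that sensor-based recalculation; the yield formula, tolerance and figures are made up:

```python
# Sketch: recompute expected finished-goods output from sensor data and a
# bill-of-materials yield, then compare to reported production.
def expected_output(raw_material_kg: float, kg_per_unit: float,
                    scrap_rate: float) -> float:
    return (raw_material_kg / kg_per_unit) * (1 - scrap_rate)

sensor_inputs_kg, reported_units = 10_000.0, 4_500
expected = expected_output(sensor_inputs_kg, 2.0, 0.04)  # -> 4800 units
if abs(reported_units - expected) / expected > 0.02:     # 2% tolerance
    print(f"exception: reported {reported_units}, expected {expected:.0f}")
```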

Jun went on to discuss several graphic models (see link to presentation) which used modeling of the business activities/processes to define expected outcomes and then use continuous monitoring audit software to confirm that actual activity agrees to that expected by the model.  All unexpected activities are treated as exceptions and reviewed for error, impropriety or, if valid, used to adjust the model’s valid expected outcomes.

There was much more presented at the symposium than I included here.  I met some great people, learned a lot, and came away with some great ideas to improve my work.  Continuous auditing, continuous monitoring and data analytics are enablers of leveraging compliance.

Takeaways from Rutgers Business School’s 35th World Continuous Auditing and Reporting Symposium

I attended the Rutgers Business School’s 35th World Continuous Auditing and Reporting Symposium on November 6th & 7th, 2015 on the Rutgers campus in Newark, NJ.  This was the 3rd of these symposiums I’ve attended, and they have all been very educational and thought provoking.  I met attendees who traveled from the Netherlands, Brazil, Germany and even…Kansas.

http://raw.rutgers.edu/35wcars

Dr. Miklos Vasarhelyi (Miklos) hosts the symposium and led off the first day with an update on developments in continuous auditing (CA) and continuous monitoring (CM).  The AICPA recently published an update to their “red book” (1999) which Miklos refers to as the CM “pink book” and he recommends it as a worthy addition to our research libraries.

Miklos informed us about the recently developed Rutgers Audit Analytics Certificate Program, which is intended to help auditors update their skills and thereby drive the profession forward.  The courses are offered online.

He also spoke with pride about the RAW Digital Library which hosts videos of lectures.  Initially intended to provide a resource for Rutgers Business School (RBS) students that missed a class or wanted to review a concept explained in class, RBS decided to make these videos available to anyone that wants to explore topics.  The RAW Digital Library had more than 80,000 unique visits during September 2015.

A point made by Bob Cuthbertson during his presentation of CaseWare IDEA is that they generally acquire new clients that are looking to solve “real problems” that arose in their enterprise.  They do not see a lot of audit/compliance departments looking to implement CA and CM to improve their internal controls, improve the efficiency and effectiveness of their processes, take their operational and compliance activities to the next level, or significantly reduce the occurrence of the “real problems” (e.g., scandal, fraud, material weakness) that Bob mentioned initiated the search for a solution.  Audit/compliance departments are slow to adopt software tools unless there is an issue/reason.  This sentiment was echoed during numerous presentations over the two days by data analysts, software solution providers, and internal and external auditors.  This failure to embrace CA/CM was obviously a point of frustration voiced during Q&A sessions.  The audience this symposium attracts is more analytical and IT-forward than internal auditors and compliance professionals in general.  Unfortunately, the generally conservative nature of many internal auditors (and their colleagues in finance) has inhibited the adoption of CA/CM in most organizations.  The same situation exists related to the use of technology for GRC where, according to John Wheeler of Gartner, “75% of companies (worldwide) are not using technology to integrate GRC.”  The true advent of a CA/CM revolution requires auditors to be much more forward thinking and to fight for the budget dollars to transform the compliance function in their organizations.

Bob Cuthbertson from ACL highlighted developments in visualization for CA/CM, including script tools from a scripts store that are akin to “apps”.  These support quick, sophisticated analysis without investing the time needed to create the scripts.

Bob made reference to Gartner in describing the stages of Data Analytics:

  1. Descriptive – What happened?
  2. Diagnostic – Why did it happen?
  3. Predictive – What will happen?
  4. Prescriptive – What should I do?

Patrick Taylor from Oversight Systems gave a very entertaining presentation that emphasized a major difference in their approach as compared to many other data analytics/monitoring application suppliers.  They offer a “turnkey” solution for analyzing certain categories of transaction data, meaning they take responsibility for using the proper statistical/analysis method for the data type to discover anomalies.  They are selling expertise with an accompanying cloud application.  I also see the majority of future cloud growth taking this approach of selling solutions, more so than increasing availability, security and access by hosting applications.

A government panel discussed the migration from paper to electronic submission of information to the government and the validation and fraud prevention opportunities this migration presents.  They also discussed the move to transparency in data related to government spending, citing USA Federal Spending and State of Ohio Spending as two leaders.  Some Rutgers PhD candidates are conceptualizing a range of apps to analyze government data – providing a user-friendly interactive tool for users to analyze government information databases – as part of their research work.  The ultimate vision is a legion of concerned citizens who will identify questionable spending and improve governance over it over time.  Who will do this?  Well, just consider the time people volunteer making Wikipedia better, translating old books into electronic versions or making genealogy records more accessible.

For auditors in the public sector, the IIA recently released its Common Body of Knowledge report.  “The purpose of this initiative is to gain a global perspective and better understanding of stakeholders’ expectations of internal audit’s purpose, function, and performance.  Stakeholders include members of executive management, board and audit committee members, and C-suite executives excluding chief audit executives who were included in the practitioner survey.”

Jon Spivey and Lorenz Schmid from PwC discussed the data driven audit and stressed consideration of the following Megatrends:

  1. Demographic shifts – aging in developed countries, population growth in developing countries, many more women in the workplace;
  2. Shifts in Global Economic Power – realignment of economics – the BRICS will have GDP twice that of Western countries in the not too distant future;
  3. Accelerating Urbanization – migration from suburbs to the cities;
  4. Climate Change & Resource Scarcity – increasing demands for energy and water;
  5. Technological Breakthroughs – massive expansion of data, “90% of the data in the world today is only 2 years old.”

PwC has observed that clients are demanding a data driven audit.  They note that auditors of the future must understand databases, at least Access and SQL, to do their jobs.  Basically, SQL is the Excel of the future in their view.

Dr. Rajendra Srivastava discussed the SeekINF tool for searching online SEC filings.  This is best explained in this online manual.

Dr. Hans Verkruijsse (Hans) and Dr. Angelique Koopman discussed “Process Mining & Framework for Continuous Monitoring”.  Dr. Koopman defined process mining as “using data mining to understand the true process”.  She discussed elephant paths, or the human tendency to shortcut a process.  She then demonstrated a software tool that uses system event logs to show the path of transactions through activities (e.g., the activities to process an invoice).  The tool shows which transactions follow the desired path, which Hans describes as the “happy path”, and which bypass it or take another path (the elephant path).  Those transactions not on the “happy path” can then be audited, which will identify configuration issues that allow override of system preventative controls, or process issues for which additional control procedures are required.  The visualization such software provides quickly allows the user (e.g., auditor) to understand the “true process” (see above) as compared to the process described by the process owner.
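A minimal sketch of the underlying conformance check: rebuild each case’s trace from the event log and compare it to the modeled “happy path”.  The file, columns and activity names are hypothetical:

```python
# Sketch: check event-log traces against the modeled "happy path".
import pandas as pd

HAPPY_PATH = ["create_po", "approve_po", "receive_goods", "match_invoice", "pay"]

log = pd.read_csv("event_log.csv", parse_dates=["timestamp"])
traces = (log.sort_values("timestamp")
             .groupby("case_id")["activity"].apply(list))

# "Elephant path" cases deviate from the modeled sequence, e.g., payment
# without a three-way match; these go to the auditor for review.
elephants = traces[traces.apply(lambda t: t != HAPPY_PATH)]
print(elephants)
```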

Hans offered the following definition:  CM + Continuous IA = Continuous Data Level Assurance with a goal to:

Identify the elephant path –> put in controls to prevent this path at data level –> all on “happy path”

Dr. Mieke Jans reinforced that process mining starts with event logs to discover the real process as compared to the documented or desired process.  She noted the move toward the XES structure for event logs.  Based on the XES structure, there are 3 categories of decisions auditors must make:

  1. Which process instance to follow?
  2. Which activities on that process instance to capture (auditor needs to make this decision)?
  3. Which attributes (extra characteristics) to store?

These decisions impact the resulting process mining data available for audit.

Dr. Daniel O’Leary (Dan) presented issues of privacy related to big data, which was very thought-provoking.  He noted that all data has a purpose; however, moving this data to other purposes can create privacy problems.  He gave the example of Zest Finance and their mantra of “All data is credit data”.  They mine all kinds of data from the web to help their customers make “better credit decisions”.

Dan went on to describe the concept of the “Big Data Lake” – compiling different types of data into one place which could lead to piecing data together in ways that give rise to privacy issues.  The combination of traditionally available data combined with the expansion of location data and the coming explosion of internet of things data can, in Dan’s thinking, allow for tracking and/or exposing more about ourselves than should be ethically permissible.

Yoan Widjaja and Sheetal Gour, members of the CM development group within the IA function at Dell, presented their experiences from the field in getting traction for CM initiatives.  A CM project related to discounting and pricing at Dell provides dashboards, reporting and other analysis tools for the operational teams in the pricing function to rather easily identify potential errors, fraud and abuse for investigation.  Their team develops these tools internally, mainly using SQL.  As noted earlier related to IA and the general resistance to change, they have had a similar experience with the operational teams.  Even though this group does the work to understand the process and develop tools for these functions with no financial charge-back, they still encounter significant resistance among many operational areas to supporting a project and/or accepting the tools as their own at the end of the project.  This has been such a pervasive problem that they now have the function sign a Statement of Work at the planning stage of the project, so all parties understand their roles and the function being assisted agrees to use the tools resulting from the project.  It is hard to believe there is such resistance to essentially free tools that improve the process (in this case, eliminate errors/abuse and improve margins), but this was a consistent underlying theme throughout the symposium.

I kept two documents open on my computer during this symposium: one to take notes of presentation content, and the other for inspiration/“out of the box” ideas that came to me as I saw approaches to certain problems that inspired ideas to improve our software or client services.  I can’t say I typically do this at other conferences.  The majority of attendees are forward thinking and looking for improvement, much more so than at other seminars or symposiums I attend.  There is a real energy in the crowd that makes it a great place to be.  There is also a frustration that arises because, while the attendees are like-minded in their efforts to improve companies/departments/services/applications, the overall progress of CA/CM has been slowed by a resistance to change, and perhaps a resistance to transparency, among our business leaders and colleagues.  This includes resistance to change from many CAEs and CFOs/Controllers.  This frustration came somewhat to a head during the external audit panel, where several attendees expressed frustration that the same topics are talked about year after year with little progress.  This frustration was misplaced.  Certainly, the external audit firms have significant talent to bring to CA/CM; however, this is not an effort that can or should be led by external parties.  CA/CM must be led from the inside, with appropriate resources contracted/licensed to establish the routines/reports/dashboards.  We have to keep the faith.  We need to keep pushing our organizations in the direction of automation to highlight potential problems and to document actions taken to resolve not only that instance but hopefully preclude future occurrences.  We have to move beyond the days in which a majority of controls take place a week or two after a period close.  The pace of change in modern business can no longer tolerate that.  We have to find ways to convince our leaders to improve the process before a problem occurs.  CA, CM and data analytics are the road to leveraging compliance.

About the Author

Glenn Murphy, the co-founder of BestGRC and founder of GRC Management Consulting, primarily focuses on empowering entities to leverage their compliance activities through the BestGRC “cloud” software, his consulting work, publications and the “Leverage Compliance” blog.  Find Glenn’s full profile at http://www.linkedin.com/in/glenntmurphy/ , follow him @GlennMurphyGRC and subscribe to the Leverage Compliance blog at http://www.bestgrc.com/blog/