Data Mining Journal Entries for Fraud Detection: A Pilot Study

Roger S. Debreceny§
Shidler College of Business
University of Hawai‘i at Mānoa
roger@debreceny.com

Glen L. Gray
College of Business and Economics
California State University at Northridge
glen.l.gray@csun.edu

Our thanks for the important support of a software vendor in providing data, to Neil Herman for data extraction and coding, and to the Shidler College of Business at the University of Hawai‘i at Mānoa for financial support.

§ Contact author:
2404 Maile Way, Honolulu, HI 96822

First Draft - Please Do Not Quote - All Comments Welcome

June 2009

Abstract

Fraud detection has become a critical component of financial audits, and audit standards have heightened the emphasis on journal entries as part of fraud detection. This paper canvasses perspectives on applying data mining techniques to journal entries. In the past, the principal impediment to research on data mining of journal entries has been access to journal entry data sets, which may explain why the published research in this area is a null set. For this project, we had access to journal entry data sets for 29 different organizations. Our initial pilot tests of the data sets yielded interesting preliminary findings. (1) For all 29 entities, the distribution of first digits of journal dollar amounts differed from that expected under Benford's Law. (2) Regarding last digits: unlike first digits, which are expected to follow a logarithmic distribution, last digits are expected to follow a uniform distribution. Our tests found that the distribution was not uniform for many of the entities. In fact, eight entities had one digit whose frequency was three times greater than expected. (3) We compared the number of accounts associated with the five most frequently occurring last-three-digit combinations. Four entities had a very high occurrence of the most frequent three-digit combinations involving only a small set of accounts; one entity had a low occurrence of the most frequent three-digit combinations involving a large set of accounts; and 24 entities had a low occurrence of the most frequent three-digit combinations involving a small set of accounts. In general, the first four entities would probably pose the highest risk of fraud, because the pattern could indicate that a fraudster is covering up or falsifying a particular class of transactions. In future work, we will apply additional data mining techniques to discover other patterns and relationships in the data sets. We also plan to seed the data sets with fraud indicators (e.g., pairs of accounts that would not be expected to appear together in a journal entry) and compare the sensitivity of different data mining techniques in detecting these seeded indicators.

I.
Introduction

This paper explores emerging research issues related to the application of statistical data mining technology to fraud detection in journal entries. The detection of fraud, and particularly of financial statement fraud,[i] has become an increasingly important component of the financial statement audit over the last decade. A number of important financial statement frauds have involved fraudulent journal entries or managerial override of controls utilizing journal entries within computerized accounting information systems. These journal entries have often involved well-known forms of financial statement fraud, including inappropriate revenue recognition, inappropriate capitalization of expenses and a wide variety of inappropriate accruals. Given likely fraudster responses to known patterns of fraudulent journal entries, such as "top level" so-called manual journal entries, and the enormous volume of journal entries in typical computerized accounting information systems, it is questionable whether direct auditor assessment of small samples of journal entries will effectively and efficiently detect likely patterns of fraudulent activity. Automated auditor analysis of journal entries has been increasingly mandated by auditing standards in the U.S. and internationally. Some degree of direct computerized analysis of journal entries is now part of the toolkit of audit teams on major audit engagements. There is, however, very little knowledge of the efficacy of this important class of audit procedures. Although there are large bodies of literature on data mining in other domains, a broad search of the audit literature did not locate any research on the data mining of journal entries.[ii] Yet auditing standards require that auditors consider fraud in their financial audits, and those standards specifically require that auditors examine journal entries. Based on the successful application of data mining in other domains, data mining would appear to hold the potential to improve both the effectiveness and the efficiency of auditors in their analysis of journal entries and fraud detection. In this paper, we set out the underlying issues that will guide effective and efficient data mining of journal entries. We review the standards from auditing regulators and guidance from the professional audit community. We explore the potential for statistical data mining of large sets of journal entries. We then test, in a pilot study, the statistical properties of journal entries and make first steps toward data mining of such entries. We acquire a set of journal entries for 29 entities, consider the essential elements of the journal entries, explore their statistical properties, concentrating on their dispersion from known distributions, and identify some preliminary patterns within the journal entries. The paper makes an important contribution to the literature on data analysis, data mining, and fraud detection within journal entries. The remainder of this paper proceeds as follows: the next section provides general background material, specifically addresses the role of journal entries in committing fraud, and draws lessons from recent frauds that used journal entries. The section also summarizes the responses of standard setters to the heightened fraud risk environment since the late 1990s. In the third section, we explore the issues involved in data mining journal entries.
We discuss both the technical and the statistical properties of journal entries and how data mining can leverage the economic relationships embedded in the account combinations represented in the journal entry. In the fourth section, we introduce our data set. In the fifth section, we discuss our initial exploration of the statistical properties of the journal entries in our data set. In the final section, we draw conclusions and point to a research agenda.

II.
Background

Over the last several years, there has been an increased emphasis on the detection of fraud as a key element of the financial statement audit. In 2000, the AICPA's Public Oversight Board's Panel on Audit Effectiveness pointed to a variety of necessary reforms to ensure the long-term viability of the audit (POB 2000).
The significant frauds that involved manipulation of financial statements and disclosures in the late 1990s and the early part of this century gave added impetus to a fundamental shift in the conduct of audits. A number of fraud schemes involved "top level" journal entries that were designed to make relatively simple adjustments between classes of accounts such that the financial statement results would show an improved position at the margin. As a response to significant financial statement frauds over the last decade or so, a number of changes have been made to auditing standards and the regulatory environment governing the profession of auditing. The promulgation in 2002 by the Auditing Standards Board (ASB) of SAS 99 (Consideration of Fraud in a Financial Statement Audit) and the enactment of the Sarbanes-Oxley Act (SOX) by the U.S. government were central events. SAS 99 significantly increased the responsibility of auditors to address potential fraud as an integral part of financial audits (ASB 2003; CAQ 2008). For example, SAS 99 requires the direct assessment of journal entries for fraud risk. The International Auditing and Assurance Standards Board (IAASB) followed SAS 99 with similar language in ISA 240 (IAASB 2007a). While individual frauds have been substantial and the range of fraud techniques employed broad, the proportion of frauds within the broader population of audit clients is minuscule. In support of the preparation of financial statements and accompanying notes and other disclosures, audit clients employ sophisticated information systems that generate vast quantities of electronic evidence. Finding evidence of fraud within this information milieu is challenging. Employing data mining has the potential to improve the efficiency and effectiveness of audit teams in the conduct of fraud-related audit tasks. Modern accounting information systems increasingly record transactions in the general ledger at the atomic level. It is common for entities to have several hundred thousand journal entries in a given accounting period. Managers intent on committing fraud may also choose to conceal fraudulent transactions within other transactions in so-called "jumbo" entries. These factors make data mining of journal entries to detect fraud a challenging exercise. The remainder of this section takes a closer look at financial statement fraud involving journal entries and the response of standard setters to heightened risk from fraud.

Financial Statement Frauds Involving Journal Entries

As introduced in the previous section, the focus of this paper is data mining of journal entries within computerized accounting information systems. This potential class of substantive tests is designed to support the auditors' assessment of material misstatements in the financial statements arising from fraud. While misappropriation of assets is important, detection of financial statement fraud is of greater concern: this latter type of fraud has a greater probability both of giving rise to a material misstatement and of being committed by upper management. We are particularly concerned with the use of journal entries to facilitate the fraud and techniques to discover those entries. Within recent examples of financial statement fraud, the reallocation of operating expenses to capital expenditure and other aspects of the fraud at WorldCom Inc. are perhaps the most egregious.
WorldCom provides a very useful model of how financial statement frauds will, in most cases, involve many adjusting journal entries. The WorldCom fraud was relatively straightforward, primarily involving adjustments from expense accounts to capital expenditure accounts. As the special report to the Board of Directors on the fraud (Beresford et al. 2003) shows, there was no justification in accounting principles or practice for these material adjustments. The amounts were large and well known within the corporation. Some of these adjustments were even the topic of conversation between accountants in international operations and local auditors (Woods 2002). The WorldCom journal entries provide a useful example of the way in which frauds involve multiple signals that must be identified in the aggregate. We can discern seven important characteristics of the adjustments and of the use of journal entries at WorldCom. First, the fraud involved straightforward and inappropriate accounting reallocations. These included transfers from flows to stocks. For example, significant transfers were made from what was effectively a suspense expenditure account, "Prepaid Capacity Costs," to the "Construction in Progress" account, which was treated as capital expenditure (Beresford et al. 2003). Second, the fraud also involved accounting treatments designed to influence disclosure rather than recognition. For example, line costs were transferred to accounts that rolled up into "Selling, General and Administrative Expenses" (SG&A). These adjustments did not change reported profits, but did change the allocation between gross and net profit disclosures (Beresford et al. 2003). This change in disclosure was clearly designed to influence the conclusions of analysts on WorldCom's financial performance. Third, many of the suspicious journal entries were ill-concealed, with large adjustments in rounded amounts that would be obvious to the most casual of inspections. A WorldCom employee, the Director of Expense/COGS Operations, observed entries that "jumped off the page" within minutes of electronically searching the General Ledger (Beresford et al. 2003). Fourth, there were a large number of inappropriate or, at best, questionable journal entries. The special report noted that "[w]e found hundreds of huge, round-dollar journal entries made by the staff of the General Accounting group without proper support..." (Beresford et al. 2003, emphasis added). Fifth, inappropriate journal entries were often accompanied by failures in documentation and breaches of normal internal controls. Sixth, the adjustments were almost universally carried out at the corporate level. In many cases, however, these "top side" adjustments made at the corporate level required adjustments at operating divisions and international operations. Finally, many individuals and groups within the corporation quickly became aware – or should have been aware – of the implications of fraudulent entries passed at headquarters, not least as a result of sweeping up after the aforementioned "top side" adjustments (Beresford et al. 2003). Perhaps the most interesting aspect of the WorldCom case from the perspective of this paper is the statement that "WorldCom personnel also repeatedly rejected Andersen's requests for access to the computerized General Ledger through which Internal Audit and others discovered the capitalization of line costs" (Beresford et al. 2003).
There might have been a very different outcome to recent US corporate history had Andersen more vigorously pursued electronic access to the General Ledger. The WorldCom case was particularly egregious and, as the special report to the Audit Committee clearly describes, the potential red flags for the auditors were many and varied. Nonetheless, many of the same red flags were repeated in other examples of financial statement fraud. The Cendant Corporation fraud, which pre-dated the WorldCom fraud, reads almost as a word-for-word transcription of the same red flags. A 1998 report to the Audit Committee of the then Cendant Corporation noted that, in what "shows to have been a carefully planned exercise," a large number of "unsupported journal entries to reduce reserves and increase income were made after year-end and backdated to prior months; merger reserves were transferred via inter-company accounts from corporate headquarters to various subsidiaries and then reversed into income; and reserves were transferred from one subsidiary to another before being taken into income" (Willkie Farr & Gallagher and Arthur Andersen LLP 1998). Perhaps what distinguishes the Cendant case from WorldCom was the wide range of accounts and accounting treatments involved in the fraud at Cendant. Hundreds of journal entries were required to achieve the desired impact on net income. The fraudulent entries impacted revenue, cash, accounts receivable and deferred revenue. From a somewhat different perspective on the Cendant case, Wallace (2000) noted that control violations within Cendant were highly disaggregated. Auditors and others charged with discovering these violations may need to aggregate these disaggregated transactions in order to see the broader picture and be in a better position to identify the control violations. A similar picture of journal entries at the heart of financial statement frauds can be drawn in many other exemplars of the last decade, including HealthSouth (Weld et al. 2004) and Xerox, Enron and Adelphia (DeVries and Kiger 2004).

Response of Standard Setters to Heightened Risks from Fraud

The heightened recognition of the importance of financial statement fraud in the 1990s led to an increased emphasis on fraud amongst auditing standard setters internationally. The then Public Oversight Board (POB) of the AICPA provided some of the most influential guidance on how auditing should respond to this heightened risk environment in the 2000 report of its "Panel on Audit Effectiveness" (POB 2000). The Panel conducted reviews of working papers, which the Panel termed "Quasi Peer Reviews," for a significant number of audits. In addition, the Panel reviewed the SEC's Accounting and Auditing Enforcement Releases (AAERs) over the previous two-year period, which was a particularly active period in SEC enforcement. When reviewing the response of auditors to high levels of fraud risk, the Panel made particular note of the failure of audit teams to assess "non-standard" journal entries. In some fifteen percent of cases, auditors did not have a sufficient understanding of the client systems for preparing such entries. In nearly one third of other cases, the audit teams did not undertake substantive tests of non-standard journal entries (POB 2000). The Panel recommended that the ASB "develop stronger and more definitive auditing standards to effect a substantial change in auditors' performance and thereby improve the likelihood that auditors will detect fraudulent financial reporting" (POB 2000).
The Panel made a series of detailed and integrated proposals, the most important of which for the purposes of this paper was that the audit contain a "forensic-type fieldwork phase." This was not designed to turn the audit into a forensic investigation, which would dramatically change the character of the audit. Rather, the proposals were designed to bring selected forensic techniques to the financial statement audit. Unsurprisingly, given its findings, the Panel made specific recommendations on direct examination of "non-standard" journal entries. The Panel noted that "[a]ll or virtually all entities record non-standard entries. These entries can provide an avenue for management to override controls that could lead to fraudulent financial reporting. Consequently, auditors need to design tests in the forensic-type phase to detect non-standard entries and examine their propriety. This aspect of the forensic-type phase affects not only the extent of testing, but also its timing, because such entries can be recorded at various times during the year" (POB 2000). The response of the ASB was SAS 99, "Consideration of Fraud in a Financial Statement Audit" (ASB 2003).[iii] This standard notes that a material misstatement in financial statements can arise from fraudulent financial reporting, defined as "intentional misstatements or omissions of amounts or disclosures in financial statements designed to deceive financial statement users," and from misappropriation of assets (ASB 2003). SAS 99 requires that the auditor undertake a variety of analytical and planning tasks and substantive audit procedures to support the detection of errors arising from fraudulent financial reporting. The standard makes particular note of the role of journal entries and other adjustments in the conduct of financial statement fraud. SAS 99 imposed a considerably enhanced set of requirements on the auditor. The standard required auditors to "design procedures to test the appropriateness of journal entries recorded in the general ledger and other adjustments (for example, entries posted directly to financial statement drafts) made in the preparation of the financial statements" (ASB 2003). SAS 99 provides detailed guidance on the selection of entries and adjustments, requiring the auditor to assess the risk of misstatement from fraud, the effectiveness of controls over journal entries, and the nature and complexity of entries and accounts. The standard identifies markers of fraudulent entries, including: "entries (a) made to unrelated, unusual, or seldom-used accounts, (b) made by individuals who typically do not make journal entries, (c) recorded at the end of the period or as post-closing entries that have little or no explanation or description, (d) made either before or during the preparation of the financial statements that do not have account numbers, or (e) containing round numbers or a consistent ending number" (ASB 2003). Auditors are cautioned to pay particular attention to non-standard entries and to other adjustments such as consolidation entries. Finally, SAS 99 makes a number of explicit requirements for the auditor to undertake substantive tests of the detail of controls and transactions. The standard notes that fraudulent journal entries are likely to occur around the closing process and that, consequently, testing should concentrate on entries posted in the period leading up to the fiscal year end or during the preparation of the financial statements.
Indicative tests of the journal entries data set include:

• Non-standard journal entries
• Entries posted by unauthorized individuals, or by individuals who, while authorized, do not normally post journal entries
• Unusual account combinations
• Round numbers
• Entries posted after the period end
• Differences from previous activity
• Random sampling of journal entries for further testing
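Several of these indicative tests lend themselves to simple automation. The following is a minimal screening sketch, not any firm's actual implementation: it assumes a pandas DataFrame of journal entry lines with hypothetical column names ('amount', 'posted_date', 'user_id', 'je_id') and flags three of the markers listed above.

```python
# Minimal screening sketch; column names are illustrative assumptions.
import pandas as pd

def screen_journal_lines(lines: pd.DataFrame, fiscal_year_end: str) -> pd.DataFrame:
    """Flag journal entry lines that match simple SAS 99-style markers."""
    flags = pd.DataFrame(index=lines.index)
    # Round-dollar amounts: non-zero multiples of $1,000.
    flags["round_amount"] = (lines["amount"].abs() % 1000 == 0) & (lines["amount"] != 0)
    # Lines posted after the fiscal year end.
    flags["post_period"] = pd.to_datetime(lines["posted_date"]) > pd.Timestamp(fiscal_year_end)
    # Lines posted by users who rarely post (here, fewer than 10 lines in the year).
    flags["rare_poster"] = lines.groupby("user_id")["je_id"].transform("count") < 10
    return lines[flags.any(axis=1)]

# Usage sketch: suspicious = screen_journal_lines(je_lines, "2008-12-31")
```

In practice the flagged set would feed further pattern analysis rather than being investigated entry by entry.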
The detailed requirements of SAS 99 were a considerable augmentation of those of its predecessor, SAS 82. Because of the detailed requirements of SAS 99, a major thrust of audit firms has been to develop technologies, policies and procedures designed to enable them to fulfill these requirements. More recently, the ASB has also addressed the question of journal entries in the so-called "risk standards." These include SAS 109 on understanding the entity and its environment (ASB 2006a) and SAS 110 on audit procedures (ASB 2006b). SAS 109 requires that the auditor assess the manner in which information is moved to the general ledger from other systems, how system and non-standard journal entries are created and controlled, and the role of consolidation and close processes (ASB 2006a). These requirements are incremental to those in SAS 99, arguably ensuring that the auditor develops a sophisticated understanding of the close process and the roles played by general ledger journal entries. The Center for Audit Quality has also provided guidance on the processes involved in selecting, acquiring, testing and analyzing journal entries for fraud detection (CAQ 2008). The International Auditing and Assurance Standards Board (IAASB) has taken a somewhat more nuanced approach to the audit of journal entries. In 2003, the IAASB revised its fraud standard, ISA 240, at least in part as a response to the increased requirements of SAS 99. ISA 240 requires that the auditor conduct substantive tests to address potential management override of controls by testing "the appropriateness of journal entries recorded in the general ledger and other adjustments made in the preparation of financial statements" (IAASB 2007a). Factors identified by the Board for consideration in the selection process are broadly similar to those in SAS 99, including assessment of the risk of misstatement from fraud, controls over journal entries and adjustments, and the nature and complexity of the evidence environment. The Board also provides a checklist of markers of potentially fraudulent journal entries, again similar to SAS 99. Similarly, the audit risk standard, ISA 330, notes that the auditor should examine "material journal entries and other adjustments made during the course of preparing the financial statements" (IAASB 2007b). Taken together, SAS 99, the two new risk standards (SAS 109 and 110), and ISA 240 considerably increased the requirements on the auditor to assess the controls over journal entries, to assess the fraud risk environment as it affects the creation of different classes of journal entries, and to conduct substantive tests. While prior to SAS 99 auditors might have inspected suspicious journal entries in exceptional circumstances, assessment of journal entries and adjustments is now a standard element of the audit, at least for larger and higher-risk clients. The proportion of public company audits that are subjected to full analysis of journal entries is significant and growing. This change in the nature and extent of substantive tests has come at the same time as considerable changes in the information technology environment in which journal entries are processed. The following section addresses this new technology environment.

III.
Understanding the Properties of Journal Entries

In this section, we address how to build a systematic understanding of a set of journal entries that may contain deliberate deception or other signals of financial statement fraud or misappropriation of assets. As we will discuss in more detail later in the section, there is sparse research on the interrogation or data mining of journal entries. Given the parlous state of research, it is perhaps necessary to turn to first principles and build a new research agenda and a new program for extracting knowledge from journal entries. An important element of any program of data mining is developing knowledge of the properties of the subject of the investigation. As with many subjects of data mining, journal entries have a number of attributes that must be assessed both individually and together. Figure 1 shows a simplified data model of a typical journal entry. There is journal entry header information that uniquely identifies the JE. Some of that header information is entered by the user, but most of it is automatically assigned by the software. The entry detail has one database record for each line in the body of the JE. In database terminology, there is a one-to-many relationship between the entry header information and the entry detail information. Since any JE, at a minimum, would be expected to have at least one debit and one credit, we would expect at least two entry detail records for each entry record. In addition to the JE itself, sometimes a batch (or group) of JEs is submitted and processed together. In that case, there would be batch data elements similar to those listed at the top of Figure 1.

Insert Figure 1 about here
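To make the one-to-many structure of Figure 1 concrete, the following is a minimal sketch of the data model in Python. The field names are our own illustrative assumptions; they are not a schema prescribed by any particular accounting package.

```python
# Minimal sketch of the journal entry data model in Figure 1.
# Field names are illustrative assumptions, not a vendor schema.
from dataclasses import dataclass, field
from datetime import date
from typing import List

@dataclass
class EntryLine:
    account: str        # chart of accounts code
    debit: float = 0.0
    credit: float = 0.0

@dataclass
class JournalEntry:
    je_id: str          # header key that uniquely identifies the JE
    posted_by: str      # user or system identifier
    posted_on: date
    description: str
    lines: List[EntryLine] = field(default_factory=list)  # one-to-many detail

    def is_balanced(self) -> bool:
        """A JE should carry at least one debit and one credit and net to zero."""
        has_debit = any(line.debit for line in self.lines)
        has_credit = any(line.credit for line in self.lines)
        net = round(sum(line.debit - line.credit for line in self.lines), 2)
        return has_debit and has_credit and net == 0.0
```

A batch of JEs would add a further level above the header, with batch-level elements similar to those listed at the top of Figure 1.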
The Technology Environment

The effective and efficient data mining of journal entries requires a comprehensive understanding of likely markers (red flags) of fraudulent entries or adjustments, the statistical properties of journal entries, and the technological environment in which the client transacts the journal entries. In the following paragraphs, we address the last of these considerations. There are a number of important technological and policy issues to consider before deliberating on the most effective and efficient form of data mining of journal entries. First, to what extent are journal entries passed in subsidiary ledgers and only summarized in the General Ledger? Second, what automated controls exist over the passing of journal entries? Third, how does the client process accounting estimates, consolidations, adjustments, and other fine-tuning into journal entries? Fourth, what are the implications of bespoke and third-party analytical and consolidation software applications in generating so-called "mega" journal entries? We address each of these issues in turn in the following discussions.

Granularity in Processing of Journal Entries

The General Ledger within the accounting information system is the final repository for the economic impact of all economic events that affect an organization and, by extension, the financial statements. The coupling of the General Ledger to the original processing of the economic event is an important factor in the design of data mining solutions. General Ledger systems typically process data that may have arisen from a variety of transaction processing systems, including sales, purchases, logistics, maintenance, and manufacturing sub-systems. The General Ledger could receive transactions from those systems at different levels of aggregation. In many older and legacy accounting information systems, data come in at a high level of aggregation. There may be a single journal entry each month that captures the multitude of transactions processed by a particular sub-system. For example, a sales system may transmit a single entry each month that aggregates postings to revenue, cost of goods sold, accounts receivable, and so on. Automated drill-down from the General Ledger to the original transaction is usually difficult or infeasible in such systems. Data mining at the General Ledger level is then not necessarily particularly productive. Conversely, non-standard journal entries, such as closing adjustments that may be markers of financial statement fraud, are likely to be relatively obvious to the auditor of these more traditional or legacy accounting information systems. In these systems, non-standard journal entries can be observed among a relatively small number of journal entries, and they are likely to have clearly identified markers that can be observed by the data mining application. On the other hand, in many client instantiations of ERP [iv] systems, the General Ledger records transactions at the atomic level. It is not uncommon for clients to have several hundred thousand journal entries. Certain transaction classes may be aggregated, but these are exceptions to the rule; examples include low-value retail sales and individual monthly billings for telecommunications clients. Nonetheless, the GL is typically a very rich information environment. Further, many clients maintain closely coupled transaction processing systems. Drill-down is increasingly feasible from the GL to, for example, sales processing and customer relationship systems. Whether data mining techniques can identify potentially fraudulent journal entries in a population of hundreds of thousands of journal entries is still an unanswered empirical question.
Identifying non-standard journal entries within a large population of journal entries is also particularly challenging.

Adjusting Journal Entries

Adjusting journal entries for estimates to accounts (e.g., the estimate for the bad debt allowance) are a focal point for financial statement fraud and require a highly developed approach to data mining. There is enormous variation in how clients calculate the many estimates used in financial reporting. Many adjusting journal entries result from spreadsheet analyses of accruals, reserves, impairments, etc. These spreadsheet analyses are an important source of control risk (Janvrin and Morrison 2000; Panko 2006; PricewaterhouseCoopers 2004). In other cases, adjusting entries might result from stored procedures triggered in business intelligence data warehouses or from programs written in SAP's ABAP language. The logic of the journal entry will rest in the supporting application and require further inquiry or assessment. When assessing what may seem to be questionable journal entries, the auditor must consider the risks associated with the particular method of generating the journal entry.

Classes of Journal Entries and Adjustments

There are several classes of General Ledger journal entries. The bulk of journal entries within General Ledger systems are so-called "system" entries. These entries will normally be posted at the conclusion of some phase in a business process, such as the acquisition of inventory or the delivery and billing of goods or services. Such journal entries are posted under the control of the application software. In most cases, these system entries are an appropriate reflection of the nature of the business process in the accounting system and do not represent fraud. However, some of these entries may be the result of systematic fraud. For example, financial statement frauds that entail inappropriate revenue recognition may well involve system transactions that have been fraudulently entered into the accounting information system at the direction, intervention or insistence of some level of management within the enterprise. A second class of journal entries comprises the so-called "manual" or "topline" journal entries to the General Ledger (CAQ 2008). As with system entries, the vast majority of these transactions are entirely appropriate. For example, a number of manual entries result from analyses conducted by enterprise staff. Entries to adjust allowances for doubtful debts to reflect debtor payment histories at the end of the period, rollbacks of inventory to adjust standard costs to the lower of cost or market, and adjustments to account for impairment of the valuation of acquired goodwill are all examples of manual journal entries that normally will be valid and appropriate. Equally, any of these entries could be inappropriate and constitute evidence of fraud. A third class of journal entries is comprised of so-called "mega" journal entries. These are entries pushed to the General Ledger from analytical and consolidation systems. These systems include custom applications for managing entity-specific accounting accruals and estimates. Examples are applications for managing exposures, such as the valuation of financial instruments, warranty provisions and capital leases. Consolidation and rapid-close applications, such as those from Hyperion,[v] are a second example of systems that give rise to "mega" journal entries.
These mega entries will typically be few in number and impact upon many accounts – at worst, a single journal entry transferred from Hyperion might post to several hundred accounts in the General Ledger system. Journal entries arising from these systems are likely to be of high value from a fraud perspective, given that they arise by definition not from a particular internal or external atomic transaction but from a review of an accrual or adjustment. Employing known markers of fraudulent entries may be highly debatable when the auditor views so many economic assumptions for accruals or adjustments through the lens of such a highly aggregated single journal entry. A challenge also arises from the disparate locus of control between the General Ledger and the analytical and consolidation systems. These systems may or may not have their own systems of controls and transaction logs. Hyperion, for example, has sophisticated built-in controls and logs. The controls over custom, in-house developed systems are likely to vary widely. Nonetheless, controls over these accruals and adjustments are split between the general ledger and the analytical system, making data mining potentially difficult.

Understanding the Statistical Properties of Journal Entries

At the most elementary level, we can see a set of journal entries as independent members of a population of events, corresponding to a known or expected distribution. Unfortunately, research into the statistical properties of journal entries appears to be a null set. An extensive literature review did not identify a single paper, other than the discussions of applying digital analysis (the significant digit law) that we address in the next paragraphs. There is no literature that models the statistical properties of populations of journal entries. Nor is there a literature that takes exemplar databases of journal entries and tests the statistical properties of those databases. This is indeed surprising, given the public policy importance of financial statement fraud and the centrality of these assessments to the value-adding characteristics of audit firms. The literature on audit sampling is of only limited value to the discussion of data mining journal entries. When assessing journal entries, the problem is not generating a representative sample, since the auditor has the complete population available in electronic form. Rather, the task is to identify those journal entries that are anomalous and potentially indicative of fraud. What distribution will represent such a population? Digital analysis is a generic term employed in the forensic accounting and auditing profession for investigations of leading digits within populations of interest. The Significant Digit Law,[vi] which is at the heart of digital analysis, shows that leading digits in a variety of populations are not uniformly distributed (Hill 1995). To the contrary, they follow a logarithmic distribution. As Hill (1995) notes, empirical evidence for this law has been found in a wide range of natural, as distinct from artificially constructed, populations. An alternative, Bayesian approach has been proposed by Ley (1996) and supported by evidence from extensive simulations by Geyer and Williamson (2004). Digital analysis has been employed to detect fraudulent patterns of data in operations management (Hales et al. 2008), scientific publishing (Diekmann 2007) and earnings management (Guan et al. 2006; Skousen et al. 2004).
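For reference, the significant digit law states that, in conforming populations, the probability that the first significant digit equals $d$ is

$$P(D_1 = d) = \log_{10}\!\left(1 + \frac{1}{d}\right), \qquad d = 1, 2, \ldots, 9,$$

so a first digit of 1 is expected roughly 30.1% of the time, while a first digit of 9 is expected only about 4.6% of the time.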
Digital analysis has been strongly recommended by Nigrini and others as a vital, even essential, tool in fraud detection (Nigrini 2000; Nigrini and Mittermaier 1997). While there are clear challenges with the practical application of digital analysis (Cleary and Thibodeau 2005), it remains a very useful tool for the detection of possible fraud within a large data set. While the first digits of journal entries are of considerable interest in the detection of fraud, the final digits are also of considerable interest. Are there journal entries with significant numbers of zeroes or other indications of fraud, as suggested in SAS 99? We are particularly interested in the third to sixth digits from the right, as indicative of thousands to millions of dollars. Beyond the first three digits from the left, we expect digits to appear with approximately equal probability. Detection of unusual patterns in the right-most digits can employ traditional parametric measures such as goodness of fit, skewness and kurtosis. Cheng and Hall (1998) note, however, that such tests "may be influenced almost as much by the validity of the particular parametric model (e.g. by the weight of the tails of the fitted distribution) as by the hypothesis of homogeneity." An alternative approach is suggested by Hartigan and Hartigan (1985). Their "dip" statistic measures the maximum difference between the empirical distribution function and the best-fitting unimodal distribution function, providing a test for departures from unimodality. Later in this paper, we employ the revised algorithm of the dip test devised by Cheng and Hall (1998). The dip test allows us to see patterns in the right-hand digits that might indicate fraud.
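As an implementation aside, the revised dip test of Cheng and Hall (1998) is not available in standard Python statistical libraries. The sketch below therefore substitutes a plain chi-square goodness-of-fit screen for uniformity of a chosen digit position; it illustrates the idea but is not the test we employ later in the paper.

```python
# Illustrative substitute for the dip test: a chi-square screen for uniformity
# of a chosen digit position (position 1 = the ones digit of the dollar amount).
import numpy as np
from scipy.stats import chisquare

def digit_at(amounts: np.ndarray, position: int) -> np.ndarray:
    ints = np.abs(amounts).astype(np.int64)   # whole-dollar part only
    return (ints // 10 ** (position - 1)) % 10

def uniformity_screen(amounts: np.ndarray, position: int):
    observed = np.bincount(digit_at(amounts, position), minlength=10)
    return chisquare(observed)  # null hypothesis: all ten digits equally likely

# Usage sketch: screen the ones digit of all amounts of $1,000 or more.
# stat, p = uniformity_screen(amounts[np.abs(amounts) >= 1000], position=1)
```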
Understanding the General Ledger Structure

Each journal entry must be interpreted in light of the chart of accounts. The structure of the chart of accounts for the general ledger is specific to the particular entity. The concept of a journal entry with "unusual account combinations" requires matching the conceptual understanding of such combinations to the client's chart of accounts. Typically, the auditor develops an internal taxonomy to represent a generic chart of accounts. The generic chart of accounts allows analysis of the journal entries in terms of unusual patterns of activity. These patterns include abnormal volumes of transactions to particular classes of accounts, transactions to classes of accounts at atypical times in the closing cycle, and journal entries made to unusual combinations of accounts. The auditor, and particularly those from audit firms that have centralized data collection and analytical functions, must map these standard templates or taxonomies to the client's chart of accounts. There is clearly a significant time cost in matching the hierarchy of the client's chart of accounts to the standard taxonomy. Whenever the client modifies its chart of accounts, the data collection team must adjust the mapping of the modified chart of accounts to the audit firm's generic chart of accounts taxonomy. There is a potential role for XBRL GL in providing more sophisticated mapping of the client's general ledger to the generic taxonomy. The mapping may include not just the date, account and transaction amount but also information on controls and data sources. The significance of the double-entry accounting system as the foundation for an n-dimensional matrix representation of the value-adding activities of companies is well known (Ijiri 1975; Ijiri and Kelly 1980; Leech 1986; Mattessich 1964, 2003; Sampson and Olan 1992). At a minimum, each journal entry involves at least two accounts as well as a time dimension. In the case of system journal entries, any one transaction may simultaneously post to several hundred accounts. The accounts involved within a journal entry form part of an accounting taxonomy. The journal entry line amounts must be interpreted in relation to this taxonomy as well as to the amount, statistical properties and temporal characteristics of the transaction. In addition, each transaction is not an entity unto itself; each transaction has to be interpreted in light of all the other transactions that impact an individual account or group of accounts. Taken in totality, this set of attributes provides a very rich population for data mining software. Yet how auditors can apply matrix techniques to the analysis of journal entries is still highly tentative and speculative (Arya et al. 2000; Sampson and Olan 1992). Much additional research is required to better explore how matrix representations could be integrated into journal entry analysis.

Putting It All Together

In summary, the questions that affect the design and application of data mining to journal entries in the audit are:
• What are the sources of the journal entries? How do those sources influence data mining for all enterprises? For the particular enterprise?
• Are there unusual patterns in the journal entries between classes of accounts?
• Does the class of journal entry influence the nature of the journal entry? For example, do adjusting journal entries carry a greater probability of fraud?
• Is there evidence of unusual patterns in the amounts of the journal entries, either in the left-most digits (Benford's Law) or in the right-most digits (Hartigan and Hartigan's dip test)?
• How can we triangulate and combine these various possible drivers of fraud in the journal entries to allow directed data mining?
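As a purely illustrative answer to the last question, one could standardize a set of indicator scores and combine them into a single ranking. The indicator names and the default equal weights below are our own assumptions, not a validated fraud model.

```python
# Purely illustrative triangulation sketch: combine standardized indicator
# columns into one risk ranking. Indicators and weights are assumptions.
import pandas as pd

def composite_risk(indicators: pd.DataFrame, weights: dict = None) -> pd.Series:
    """indicators: one row per journal entry, one numeric column per indicator
    (e.g., benford_deviation, round_amount, post_period, rare_poster)."""
    z = (indicators - indicators.mean()) / indicators.std(ddof=0)
    if weights is None:
        weights = {col: 1.0 for col in indicators.columns}
    score = sum(z[col] * w for col, w in weights.items())
    return score.sort_values(ascending=False)  # highest-risk entries first
```

Directed data mining would then begin with the highest-ranked entries, with the weights themselves an obvious subject for future research.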
Investigating Populations of Journal Entries

We now move to the first stages of assessing the questions we set out in the previous section. One of the major difficulties for researchers in the financial fraud domain is obtaining access to real-world internal accounting data to test various hypotheses and models. For this paper, we were fortunate to have access to a large database that included data sets of journal entries for a wide variety of organizations. Specifically, an anonymous software vendor provided journal entries for 36 organizations. For two of those organizations, we have journal entries for two years. The software vendor removed any identifying information from the files prior to their transfer to us. Unfortunately, we did not have access to the opening trial balances for these organizations. The organizations were from both the for-profit and not-for-profit sectors. The underlying accounting information systems were all different. Eight of the data sets covered periods of less than twelve months and were excluded from the analysis below. One organization's journal entry data set was incomplete and was also eliminated from the analysis.

Mapping Individual Charts of Accounts

One of the challenges in analyzing this journal entry data was that each organization had its own chart of accounts. Using the disparate charts of accounts would have made cross-sectional analysis very difficult. As such, the first step in preparing the data for analysis was to create a comprehensive standard chart of accounts. Each organization's chart of accounts was then mapped to the standard chart of accounts. Table 1 shows descriptive statistics for the 29 charts of accounts of the organizations we studied. Most of the organizations use a relatively small number of accounts, with a small number of organizations having complex charts of accounts.

Insert Table 1 about here

We constructed a master chart of accounts with a "five-four" structure: the first five digits designate the primary account and the next four digits designate the sub-account. In most cases, the four digits correspond to a particular sub-account for one of the entities in the sample. Conversely, the master accounts (five digits) are part of the logical structure of the master chart of accounts. There are 1,672 accounts (five-four) in the master chart of accounts, with 343 primary (five-digit) accounts. The resulting database that we used for our subsequent analysis had a total of 496,182 line items across the 29 organizations. There is considerable variation in posting to the various accounts. Table 2 shows the number of transactions per primary account.

Insert Table 2 about here

There are ten accounts with more than 10,000 transactions each, including the usual suspects of Accounts Receivable (38,714 transactions), Accounts Payable (44,916 transactions), and Salaries and Wages (44,158 transactions).

Descriptive Statistics

Table 3 shows some basic descriptive statistics for the 29 organizations. The first observation is how widely the statistics differ across organizations. The first column lists the total number of journal entry line items within the fiscal year. The highest number was nearly 154 thousand and the lowest was less than one thousand. The dollar values of the journal entry lines also varied widely. The maximum journal entry line item, for the ChiEta entity, was $362,478,016. The smallest maximum was $34,929, for Pi.
Insert Table 3 about here

The relatively large maximum entries probably indicate that summary journal entries transferred information between accounting modules. For example, perhaps only one journal entry was used to transfer summary information once a month from the accounts payable module to the general ledger. Summary versus detailed journal entries will be part of the auditor's risk analysis and the subsequent development of the audit program. For example, if only monthly summary journal entries are used, then the auditor is going to focus on the source modules (e.g., accounts payable). From a data mining perspective, it is probably better if details are transferred to the general ledger, because the general ledger will then be essentially one large, comprehensive database. If, on the other hand, the details are kept in each module, then each module is its own isolated database. Being able to data mine across modules can be important in an audit. For example, a common search is to find any vendor addresses (stored in the vendor master file) that are the same as employee addresses (stored in the employee master file) (CAQ 2008). Matches could mean that an employee has set up a fake vendor that is subsequently receiving checks from the company. The number of journal entries and the number of line items per journal entry vary widely. Table 4 shows the number of individual line items that make up the various journal entries for each entity. The first column shows the number of distinct journal entries in the year (N). We then show the mean, standard deviation, and minimum and maximum number of line items per journal entry. Some organizations have journal entries with very large numbers of line items (e.g., Beta, Chi and Zeta). These are examples of so-called "mega entries," where transactions come either from stored automatic journal entries that reverse prior-period adjusting journal entries or from transfers of data from subsidiary systems.

Insert Table 4 about here
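Statistics of the kind reported in Table 4 are straightforward to compute once the line items sit in a single table. A minimal sketch, assuming a pandas DataFrame with hypothetical 'entity' and 'je_id' columns:

```python
# Sketch of Table 4-style statistics: line items per journal entry, by entity.
# Column names ('entity', 'je_id') are illustrative assumptions.
import pandas as pd

def lines_per_entry_stats(lines: pd.DataFrame) -> pd.DataFrame:
    per_entry = lines.groupby(["entity", "je_id"]).size().rename("n_lines")
    return per_entry.groupby(level="entity").agg(["count", "mean", "std", "min", "max"])

# A possible "mega entry" screen: entries with an unusually large line count,
# e.g. lines.groupby(["entity", "je_id"]).size().loc[lambda s: s > 100]
# (the 100-line cutoff is an arbitrary illustration).
```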
Statistical Analysis of Journal Entries

In this section, we undertake statistical analysis of our set of journal entries.

Digital Law

Digital analysis (also called the first-digit law or Benford's Law) is a statistical technique regularly discussed in the professional guidance on fraud detection in general (Benford 1938; Nigrini 2000; Nigrini and Mittermaier 1997; Tackett 2007) and on journal entries in particular (CAQ 2008). Digital analysis predicts that the first digit of a set of numbers will have the distribution shown in Table 5.

Insert Table 5 about here

Table 6 shows the number of journal line items for each organization with a particular first digit of the dollar amount, the actual distribution (Act%) of those digits, and the variance from the expected distribution (Diff = Act% - Expected%). We show the chi-square statistic and p-value for each entity. For every one of the 29 entities in the study, the chi-square test indicates that the observed pattern of first digits differs from that expected under Benford's Law. If we assume that Benford's Law should apply to a population of journal entries, then the variations in the table indicate many red flags that need further investigation by the auditor. For example, why is the number of 5's considerably greater than expected at Beta? The more interesting question becomes: how is the auditor going to investigate that? Beta has 40,614 journal entry lines. Is the auditor going to pull every journal entry containing a line whose dollar amount starts with 5? That would be 5,902 journal entry lines. Instead, the auditor would want to determine ways to analyze patterns in those lines efficiently. For example, do 5's show up more frequently for specific account numbers, for specific combinations of account numbers, for specific employee IDs posting the journal entries, for specific time periods (e.g., near the end of quarters, the end of years, or just after the start of the next quarter or year), or in other patterns that the data mining software discovers? Benford's Law builds on certain assumptions about the underlying data, including that there is no systematic assignment of the numbers. So, as an alternative explanation, it may be that journal entries violate one or more of those assumptions. This critical question needs further research.

Insert Table 6 about here
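The computation behind a Table 6-style test can be sketched in a few lines. The sketch below, assuming a NumPy array of journal entry line amounts, derives each amount's first significant digit and compares the observed digit counts with the Benford expectations using a chi-square goodness-of-fit test.

```python
# Sketch of a first-digit (Benford) test of the kind summarized in Table 6.
import numpy as np
from scipy.stats import chisquare

BENFORD = np.log10(1 + 1 / np.arange(1, 10))  # P(first digit = 1..9)

def first_digits(amounts: np.ndarray) -> np.ndarray:
    a = np.abs(amounts[amounts != 0]).astype(float)
    # Scale each amount into [1, 10) so that its integer part is the first digit.
    return (a / 10 ** np.floor(np.log10(a))).astype(int)

def benford_test(amounts: np.ndarray):
    observed = np.bincount(first_digits(amounts), minlength=10)[1:]  # digits 1..9
    expected = BENFORD * observed.sum()
    return chisquare(observed, f_exp=expected)

# Usage sketch: stat, p = benford_test(je_lines["amount"].to_numpy())
```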
Last Digits

Professional guidance also discusses journal entries that contain "round numbers or a consistent ending number" (CAQ 2008). Journal entries with these characteristics have abnormal distributions of last digits. Unlike the first digit, which is expected to follow a logarithmic distribution, the last digits are expected to follow a uniform distribution. As a test of uniformity, Table 7 shows the distribution of the fourth digit for each organization for all dollar amounts greater than $999. By this position, we expect a uniform distribution of the integers (the same number of 0's, 1's, etc.). Table 7 shows the distribution was definitely not uniform for many of the entities. For example, at Beta the fourth digit was zero for some 19% of journal entry line items (10% was expected). Many of the entities had some fourth-digit value occurring significantly more often than expected, ranging up to 58% for XiNu. Eight of the 29 entities had one fourth-digit value occurring three times more often than expected. However, there could be situations in organizations that make some numbers appear more often. For example, an appliance company might price plasma TVs at $1,598, $1,998, and $2,498. In that situation, 8's would be expected to appear more frequently in the population.

Insert Table 7 about here

Auditor investigation of journal entries with particularly high levels of rounded or other unusual patterns cannot, however, rely purely on those patterns as a screen, as the number of entries would be too large to investigate. These patterns have to be considered in conjunction with the number of accounts involved. Figure 2 illustrates these relationships. The extent of abnormal patterns, such as rounded journal entries, is on the vertical axis. The number of different accounts involved in these unusual patterns is on the horizontal axis. If an entity is in Quadrant A (high-small), there is a high proportion of unusual journal entries, with a relatively small number of accounts involved in those journal entries. Quadrant B (high-large) also has a large number of journal entries with abnormal patterns, but with a large number of accounts to which these entries post. Quadrant C (low-small) has both low abnormality and a small number of accounts. Quadrant D (low-large) has relatively low levels of abnormal journals but a large number of accounts.

Insert Figure 2 about here

If an entity is in Quadrant A, there is significant potential for fraud, but only a small number of accounts is involved. We believe the investigation cost is relatively low, as it is likely that there are few patterns of transactions to identify. Conversely, Quadrant B is more difficult to investigate, as there are more accounts involved and, we believe, a larger set of patterns in the transactions. To understand the large differences in the level of abnormal patterns in the journal entries displayed in Table 7 above, and to see whether these journal entries follow the patterns shown in Figure 2, we undertook additional analysis. We selected the last three digits (to the left of the decimal place) of each line item in each journal entry. In line with the previous discussion, we expected these last three digits to be uniformly distributed. We viewed the line items in each journal entry from three perspectives: (1) all line items; (2) only line items greater than $1,000, to eliminate minor journal entries; and (3) only journal entries totaling at least $1,000. We counted the total number of line items within journal entries under each of the options. We then identified the number of line items with the five most common sets of last three digits and took this as a proportion of the total line items. We also counted the number of accounts posted within the journal entries making up the line items with the five most common sets of last three digits. Figure 3, Figure 4 and Figure 5, below, show the three variants of journal entries as compared with the number of accounts.

Insert Figure 3, Figure 4 and Figure 5 about here

There are interesting patterns in these three figures. Several entities have very high levels of abnormally frequent "last three digits." Four of our 29 entities have 30 to 60% of their transactions made up of just the top five last-three-digit patterns. With a uniform distribution, we would expect any five three-digit numbers to represent only 0.5% of transactions. Interestingly, each of these entities employs a maximum of only 40 accounts within these "top five" transactions.
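A minimal sketch of this concentration measure, assuming a pandas DataFrame of line items with hypothetical 'amount' and 'account' columns, is as follows. It returns the two coordinates used in Figure 2: the share of line items carrying the five most common last-three-digit patterns, and the number of distinct accounts those line items post to.

```python
# Sketch of the Figure 3-5 measure: top-five last-three-digit concentration
# and the number of distinct accounts involved. Column names are assumptions.
import pandas as pd

def top5_last3_concentration(lines: pd.DataFrame):
    whole_dollars = lines["amount"].abs().astype(int)  # digits left of the decimal
    last3 = whole_dollars % 1000
    top5 = last3.value_counts().head(5).index          # five most common patterns
    in_top5 = last3.isin(top5)
    share = in_top5.mean()                             # proportion of all line items
    n_accounts = lines.loc[in_top5, "account"].nunique()
    return share, n_accounts
```

Under uniformity, any five patterns should cover only about 0.5% of line items, so shares of 30 to 60% combined with few accounts would place an entity in Quadrant A.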
Effectively, in our sample, all the entities that had strongly abnormal transaction patterns used only a relatively small number of accounts. These four entities can be placed in Quadrant A (high-small). No entities were in Quadrant B, one was in Quadrant D (low-large) and by far the most, 24, were in Quadrant C (low-small). In general, all else being equal, the four firms in Quadrant A probably pose the highest risk of fraud for the auditors. These firms have a very high number of round-number or consistent-ending-number transactions posted to just a few accounts, which could indicate that a fraudster is covering up or falsifying a particular class of transactions, such as posting fictitious sales.

Unusual Temporal Patterns

The primary objective of data mining is finding unusual patterns (or outliers that do not fit a pattern) in the data. These unusual patterns constitute the red flags that the auditor would subsequently investigate. One red flag that an auditor may investigate is unusual patterns in journal entry activity. End-of-quarter and end-of-year journal activities are usually of particular interest. The auditor's concern is that management will make inappropriate journal entries to improve or manage their performance numbers (e.g., net income) prior to closing their books for their quarterly filings (Form 10-Q for public companies) and the more closely followed annual report (Form 10-K for public companies), which undergoes a formal certified financial audit. By far the most common forms of financial fraud center on revenue recognition. For example, a company may book a sale in one year that actually occurred in the following year. In the most egregious form of revenue recognition fraud, management books completely fictitious sales. These forms of revenue recognition fraud could result in a variety of journal entries to book those sales over and above the normal journal entries, which would therefore increase the overall number of journal entries posted during the period of the fraud (e.g., the last month of the year). As Figure 6 illustrates, ferreting out unusual patterns can be challenging because defining normal is itself a challenge. For example, increases in journal entry activity would be expected in the last month of the fiscal year, as a variety of one-time normal closing journal entries and accruals would be posted. Graphs on the left side of Figure 6 show journal entry line item volume for each month, and graphs on the right side show the average dollar value of each journal entry line for each month. Visually comparing the organizations in the figure, it would be hard to define what is normal. [The three organizations in Figure 6 were selected to illustrate the wide differences among the 29 organizations in our database.] For the 29 organizations for which we have 12 complete months of journal entries (including those shown in Figure 6), only two organizations had their highest volume in the last month, and only one organization had its highest average dollar value in the last month. Does this mean that there was potential revenue recognition fraud at those organizations and no potential for revenue recognition fraud at the others? Of course not, on both counts. But it does illustrate that the auditor cannot visually cherry-pick potential problem areas. It will take deeper data mining to isolate unusual patterns and transactions.
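The aggregation behind Figure 6 is simple to reproduce for a single entity. A minimal sketch, again with hypothetical 'posted_date' and 'amount' columns:

```python
# Sketch of the Figure 6 aggregation: monthly line-item volume and average
# dollar value per line for one entity. Column names are assumptions.
import pandas as pd

def monthly_profile(lines: pd.DataFrame) -> pd.DataFrame:
    month = pd.to_datetime(lines["posted_date"]).dt.to_period("M")
    grouped = lines["amount"].abs().groupby(month)
    return pd.DataFrame({"volume": grouped.size(), "avg_value": grouped.mean()})

# Usage sketch: profile = monthly_profile(je_lines)
# profile["volume"].idxmax() identifies the month with the highest volume.
```

VI.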
VI. Conclusion

Fraud detection has become an increasingly important element of the financial statement audit. There is clear evidence of the importance of journal entries in the conduct of financial statement frauds over the last decade, with WorldCom among the most egregious examples. It is hardly surprising, then, that a key element of recent professional developments in increasing the fraud detection requirements of the financial statement audit has been significantly heightened requirements to assess the controls over journal entries and to conduct substantive tests of them. Unfortunately, research on data mining journal entries from a fraud detection perspective is essentially a null set. In this paper, we canvass a number of perspectives on such data mining.

The nature and form of the population of journal entries posted to the general ledger in computerized accounting information systems is a function of several technological and entity-level characteristics. In a modern ERP system, journal entries will be highly granular, even atomic. In more traditional accounting information systems, general ledger journal entries may be highly aggregated, with the general ledger receiving summarized journals from subsidiary systems. These summarized journal entries will capture information with a very different profile than in an ERP system. Journal entries will also flow from a variety of other systems and business processes, including consolidation systems, automated or semi-automated general ledger processes, and manual entries. Data mining approaches must be sufficiently flexible to accommodate these different data structures and flows.

There is a clear and pressing need for research on a variety of interrelated areas in data mining journal entries. Data mining journal entries must bring together five characteristics, viz. (a) amount, (b) chart of accounts code, to establish the impact on the general ledger, (c) source of the journal entry, (d) control characteristics surrounding the individual journal entry, and (e) opening and, by extension, closing general ledger balances. The biggest impediment to research in data mining of journal entries is getting access to one or more real-world journal entry data sets. For this project, we had access to 36 different data sets, of which 29 were appropriate for our initial analysis. There are potentially many more data mining techniques that could be applied to this data set; however, our initial pilot test of the 29 sets of journal entries did bring up some interesting preliminary findings, including:
• For all 29 entities we tested, a chi-square test indicates that the distribution of first digits of journal dollar amounts differs from that expected under Benford's Law. If, on one hand, we assume that Benford's Law should apply to journal entries, these variations mean the auditors would have a tremendous number of red flags to investigate. On the other hand, Benford's Law builds on certain assumptions about the underlying data, so further research is needed to explore whether, or how, journal entries violate one or more of those assumptions. (A sketch of this test, together with the last-digit test in the next point, appears at the end of this section.)
• Professional guidance recommends identifying journal entries that contain round numbers or a consistent ending number. Unlike first digits, which are expected to follow a logarithmic distribution, last digits would be expected to follow a uniform distribution. Our test found that the distribution was definitely not uniform for many of the entities. Eight of the 29 entities had one last (fourth) digit occurring three times more often than expected. However, there could be situations within organizations that make some numbers appear more often, and these would have to be identified by the auditors.
• Since investigating false positives can be expensive, auditors will have to develop and select audit methodologies appropriate to the characteristics of the journal entries. We compared the number of accounts related to the five most frequently occurring last-three-digit combinations. Of the 29 entities, four had a very high occurrence of the top-five three-digit combinations involving only a small set of accounts, one had a low occurrence involving a large set of accounts, and 24 had a low occurrence involving a small set of accounts. In general, all else being equal, the first four firms probably pose the highest risk of fraud for the auditors, since they had a very high number of round-number or consistent-number transactions posted to just a few accounts, which could indicate that a fraudster is covering up or falsifying a particular class of transactions.
• In terms of general patterns of transaction volumes, there did not appear to be any. We expected to see increases at quarter end or year end, but we did not find consistent examples of this in our 29 entities.

Our initial analysis of the 29 journal entry data sets only begins the potential analysis of these data sets. In the future, we expect to apply many more data mining techniques to discover other patterns and relationships in the data sets. We also want to begin seeding the data sets with fraud indicators (e.g., pairs of accounts that would not be expected in a journal entry) and compare the sensitivity of the different data mining techniques in finding these seeded indicators.
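As a minimal sketch of the two digit tests reported entity-by-entity in Tables 6 and 7, assuming SciPy is available and amounts arrive as plain numbers (function and variable names are ours):

    import math
    from collections import Counter
    from scipy.stats import chisquare

    def digit_tests(amounts):
        # Chi-square goodness-of-fit tests on journal entry amounts:
        # first digits against Benford's law, last (units) digits of the
        # whole-dollar amount against a uniform distribution.
        firsts, lasts = Counter(), Counter()
        for amount in amounts:
            digits = str(int(abs(amount)))
            if digits == "0":
                continue                     # no significant digit to test
            firsts[int(digits[0])] += 1
            lasts[int(digits[-1])] += 1

        n = sum(firsts.values())
        benford = [math.log10(1 + 1 / d) for d in range(1, 10)]
        first_test = chisquare([firsts[d] for d in range(1, 10)],
                               [p * n for p in benford])
        last_test = chisquare([lasts[d] for d in range(10)],
                              [n / 10] * 10)
        return first_test, last_test   # each unpacks to (statistic, p-value)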
VII. References

Arya, A., J. Fellingham, and D. A. Schroeder. 2000. Accounting Information, Aggregation, and Discriminant Analysis. Management Science 46 (6): 790-806.
ASB. 2003. Statement on Auditing Standards No. 99: Consideration of Fraud in a Financial Statement Audit. New York: Auditing Standards Board, American Institute of CPAs.
———. 2006a. Statement on Auditing Standards No. 109: Understanding the Entity and Its Environment and Assessing the Risks of Material Misstatement. New York: Auditing Standards Board, American Institute of CPAs.
———. 2006b. Statement on Auditing Standards No. 110: Performing Audit Procedures in Response to Assessed Risks and Evaluating the Audit Evidence Obtained. New York: Auditing Standards Board, American Institute of CPAs.
Benford, F. 1938. The law of anomalous numbers. Proceedings of the American Philosophical Society 78: 551-572.
Beresford, D. R., N. d. Katzenbach, and C. B. Rogers. 2003. Report of Investigation by the Special Investigative Committee of the Board of Directors of WorldCom, Inc. Clinton, Miss.: WorldCom, Inc.
CAQ. 2008. Practice Aid for Testing Journal Entries and Other Adjustments Pursuant to AU Section 316. Washington, DC: Center for Audit Quality.
Cheng, M.-Y., and P. Hall. 1998. Calibrating the excess mass and dip tests of modality. Journal of the Royal Statistical Society, Series B 60 (3): 579-589.
Cleary, R., and J. C. Thibodeau. 2005. Applying Digital Analysis Using Benford's Law to Detect Fraud: The Dangers of Type I Errors. Auditing: A Journal of Practice & Theory 24 (1): 77-81.
DeVries, D. D., and J. E. Kiger. 2004. Journal entries and adjustments - your biggest fraud danger. Journal of Corporate Accounting & Finance 15 (4): 57-62.
Diekmann, A. 2007. Not the First Digit! Using Benford's Law to Detect Fraudulent Scientific Data. Journal of Applied Statistics 34 (3): 321-329.
Geyer, C. L., and P. P. Williamson. 2004. Detecting Fraud in Data Sets Using Benford's Law. Communications in Statistics: Simulation & Computation 33 (1): 229-246.
Guan, L., D. He, and D. Yang. 2006. Auditing, integral approach to quarterly reporting, and cosmetic earnings management. Managerial Auditing Journal 21 (6): 569-581.
Hales, D. N., V. Sridharan, A. Radhakrishnan, S. S. Chakravorty, and S. M. Siha. 2008. Testing the accuracy of employee-reported data: An inexpensive alternative approach to traditional methods. European Journal of Operational Research 189 (3): 583-593.
Hartigan, J. A., and P. M. Hartigan. 1985. The dip test of unimodality. Annals of Statistics 13 (1): 70-84.
Hill, T. 1995. A statistical derivation of the significant-digit law. Statistical Science 10 (4): 354-363.
IAASB. 2007a. International Standard on Auditing 240: The Auditor's Responsibility to Consider Fraud in an Audit of Financial Statements. In Handbook of International Auditing, Assurance, and Ethics Pronouncements. New York: International Auditing and Assurance Standards Board, International Federation of Accountants, 268-313.
———. 2007b. International Standard on Auditing 330: The Auditor's Procedures in Response to Assessed Risks. In Handbook of International Auditing, Assurance, and Ethics Pronouncements. New York: International Auditing and Assurance Standards Board, International Federation of Accountants, 398-419.
Ijiri, Y. 1975. Theory of Accounting Measurement. Sarasota, Fla.: American Accounting Association.
Ijiri, Y., and E. Kelly. 1980. Multidimensional accounting and distributed databases: their implications for organizations and society. Accounting, Organizations and Society 5 (1): 115-123.
Janvrin, D., and J. Morrison. 2000. Using a structured design approach to reduce risks in end user spreadsheet development. Information & Management 37 (1): 1.
Leech, S. A. 1986. The Theory and Development of a Matrix-Based Accounting System. Accounting & Business Research 16 (64): 327-341.
Ley, E. 1996. On the peculiar distribution of the U.S. stock indexes' digits. American Statistician 50: 311-313.
Mattessich, R. 1964. Accounting and Analytical Methods. Homewood, Ill.: Richard D. Irwin.
———. 2003. Accounting research and researchers of the nineteenth century and the beginning of the twentieth century: an international survey of authors, ideas and publications. Accounting, Business & Financial History 13 (2): 125-170.
Nigrini, M. J. 2000. Digital Analysis Using Benford's Law: Tests & Statistics for Auditors. Second ed. Vancouver: Global Audit Publications.
Nigrini, M. J., and L. J. Mittermaier. 1997. The use of Benford's Law as an aid in analytical procedures. Auditing: A Journal of Practice & Theory 16 (2): 52-67.
Panko, R. R. 2006. Spreadsheets and Sarbanes-Oxley: Regulations, Risks, and Control Frameworks. Communications of the AIS 2006 (17): 2-50.
PCAOB. 2002. AU Section 316: Consideration of Fraud in a Financial Statement Audit. Public Company Accounting Oversight Board. Cited 20 October 2006. Available from http://pcaob.org/standards/interim_standards/auditing_standards/index_au.asp?series=300&section=300.
POB. 2000. Report and Recommendations of the Panel on Audit Effectiveness. New York: American Institute of Certified Public Accountants, Public Oversight Board.
PricewaterhouseCoopers. 2004. The Use of Spreadsheets: Considerations for Section 404 of the Sarbanes-Oxley Act. New York, NY: PricewaterhouseCoopers LLP.
Sampson, W. C., and M. J. Olan. 1992. An Innovation in Auditing: Matrix Summaries of Journal Entries. Abacus 28 (2): 133-141.
Skousen, C. J., L. Guan, and T. S. Wetzel. 2004. Anomalies and Unusual Patterns in Reported Earnings: Japanese Managers Round Earnings. Journal of International Financial Management & Accounting 15 (3): 212-234.
Tackett, J. A. 2007. Digital analysis: A better way to detect fraud. Journal of Corporate Accounting & Finance 18 (4): 27-36.
Wallace, W. 2000. Reporting practices: potential lessons from Cendant Corporation. European Management Journal 18 (3): 328-333.
Weld, L. G., P. M. Bergevin, and L. Magrath. 2004. Anatomy of a Financial Fraud: A Forensic Examination of HealthSouth. CPA Journal (October): 44-49.
Willkie Farr & Gallagher, and Arthur Andersen LLP. 1998. Report to the Audit Committee of the Board of Directors of Cendant Corporation, August 24. Cited 1 September 2007. Available from http://www.secinfo.com/dsvrn.71H3.htm.
Woods, L. 2002. In Confidence [Email], 26 April. Cited 10 May 2007. Available from http://news.findlaw.com/hdocs/docs/worldcom/worldcom062602cstemail.pdf.
[Figure 1: Data Structure of Journal Entry.vii The figure depicts the journal entry data structure: batch-level fields (Description, Reference, Source, Authority, Approval, Audit Number, Language), entry header fields (Creation Date, Creator, Identifier), and entry detail fields (Sign, Line Number, Comment).]
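To make the structure in Figure 1 concrete, a minimal sketch of the records in Python follows; the account and amount fields on the detail line, and all type choices, are our assumptions, since the figure conveys field names only:

    from dataclasses import dataclass, field
    from datetime import date

    @dataclass
    class EntryDetail:
        # One line item within a journal entry.
        line_number: int
        account: str          # assumed: chart of accounts code
        amount: float         # assumed: line item dollar amount
        sign: str             # debit or credit indicator
        comment: str = ""

    @dataclass
    class EntryHeader:
        # A journal entry: header plus its line items.
        identifier: str
        creation_date: date
        creator: str
        details: list[EntryDetail] = field(default_factory=list)

    @dataclass
    class Batch:
        # A batch of journal entries with its control attributes.
        description: str
        reference: str
        source: str           # originating system or business process
        authority: str
        approval: str
        audit_number: str
        language: str
        entries: list[EntryHeader] = field(default_factory=list)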
[Figure 2: Patterns of Journal Entry Values vs. Number of Accounts Involved. Vertical axis: deviation from expected (extent of abnormal patterns); horizontal axis: number of accounts involved.]

[Figure 3: Proportion of Journal Entries to Number of Accounts, All Items.]

[Figure 4: Proportion of Journal Entries to Number of Accounts, Line Items Greater Than $1,000.]

[Figure 5: Proportion of Journal Entries to Number of Accounts, Journal Entries Greater Than $1,000.]

[Figure 6: Monthly distributions of journal entry line volume and dollar volume for a sample of organizations. Panels: Beta, Eta, MuXi; left column: line item volume per month, right column: average dollar value per line item per month.]

Minimum     43
Maximum     1,036
Median      107
Average     164

Table 1: Active Accounts in Organizational Chart of Accounts

Minimum              1
Maximum              44,916
Median               86
Mean                 1,401
Standard Deviation   4,784

Table 2: Transactions Per Five-Digit Account in Master Chart of Accounts

Entity     Number of Line Items   Total Line Items $(000)   Maximum Line Item $
Beta       40,617                 $240,221                  $2,927,854
Chi        153,800                $60,889,933               $250,650,816
ChiEta     18,572                 $13,396,011               $362,478,016
ChiNu      2,421                  $27,374                   $653,316
ChiPi      4,871                  $2,712                    $495,667
Delta      29,866                 $78,716                   $393,500
Eta        689                    $215,237                  $12,000,000
EtaNu      2,318                  $43,517                   $489,822
EtaPi      1,445                  $23,463                   $685,613
Gamma      4,433                  $38,823                   $618,214
Kappa      7,244                  $97,421                   $576,281
KappaXi    7,210                  $41,261                   $464,904
MuXi       11,531                 $1,730,387                $24,223,476
Nu         3,182                  $6,533                    $425,000
Omicron    5,303                  $195,516                  $12,893,261
Phi        8,410                  $19,229,705               $549,332,992
PhiPsi     38,329                 $41,455                   $663,000
Pi         1,426                  $2,140                    $34,929
PiNu       6,998                  $13,940                   $324,012
Psi        3,258                  $2,569                    $46,710
Rho        4,579                  $4,529                    $80,000
Sigma      1,378                  $840                      $19,367
Tau        1,377                  $863                      $15,353
Theta      1,739                  $1,516                    $41,785
Upsilon    4,524                  $9,337                    $129,566
Xi         30,174                 $674,415                  $9,016,084
XiNu       2,781                  $154,551                  $1,637,364
XiRo       32,554                 $5,381,983                $38,741,784
Zeta       62,638                 $11,094,568               $38,232,288

Table 3: Descriptive Statistics for Organizations
Entity     N       Mean   Std Dev   Min   Max
Beta       1,097   37     48        2     642
Chi        5,275   29     132       2     1,375
ChiEta     2,915   6      8         2     66
ChiNu      502     5      7         2     34
ChiPi      2,422   2      1         2     32
Delta      3,576   8      16        2     318
Eta        144     5      7         2     46
EtaNu      400     6      5         2     32
EtaPi      484     3      2         2     14
Gamma      541     8      13        2     118
Kappa      991     7      8         2     66
KappaXi    2,325   3      4         2     45
MuXi       2,790   4      4         2     36
Nu         536     6      10        2     65
Omicron    578     9      20        2     387
Phi        1,433   6      6         2     35
PhiPsi     9,215   4      3         2     85
Pi         360     4      4         2     44
PiNu       1,086   6      9         2     67
Psi        859     4      6         2     76
Rho        826     6      8         2     58
Sigma      491     3      3         2     45
Tau        412     3      2         2     23
Theta      484     4      5         2     34
Upsilon    864     5      6         2     63
Xi         552     55     73        2     450
XiNu       986     3      3         2     21
XiRo       2,637   12     36        2     525
Zeta       4,682   13     42        2     736

Table 4: Line Items Per Journal Entry

Digit   Probability       Digit   Probability
1       30.1%             6       6.7%
2       17.6%             7       5.8%
3       12.5%             8       5.1%
4       9.7%              9       4.6%
5       7.9%

Table 5: Expected digit distribution under Benford's law.

[For each entity, the full table reports counts, actual percentages, and deviations from the Benford expectation for first digits 1-9; the chi-square statistics are:]

Entity     Χ²        p        Entity     Χ²        p
Beta       2,911.7   0.000    Phi        21.8      0.005
Chi        272.0     0.000    PhiPsi     275.6     0.000
ChiEta     49.9      0.000    Pi         20.3      0.009
ChiNu      44.5      0.000    PiNu       86.2      0.000
ChiPi      618.3     0.000    Psi        50.9      0.000
Delta      180.5     0.000    Rho        28.0      0.000
Eta        46.0      0.000    Sigma      84.6      0.000
EtaNu      37.0      0.000    Tau        24.4      0.001
EtaPi      24.4      0.001    Theta      47.7      0.000
Gamma      38.9      0.000    Upsilon    72.1      0.000
Kappa      81.2      0.000    Xi         394.2     0.000
KappaXi    42.7      0.000    XiNu       20.6      0.008
MuXi       134.2     0.000    XiRo       326.0     0.000
Nu         523.3     0.000    Zeta       222.5     0.000
Omicron    18.9      0.015

Table 6: Observed first digit distributions in journal entry database.

[For each entity, the full table reports counts, actual percentages, and deviations from the uniform expectation for last digits 0-9; the chi-square statistics are:]

Entity     Χ²        p        Entity     Χ²        p
Beta       881.8     0.000    Phi        468.2     0.000
Chi        606.8     0.000    PhiPsi     4,901.3   0.000
ChiEta     613.3     0.000    Pi         32.5      0.000
ChiNu      756.7     0.000    PiNu       71.1      0.000
ChiPi      44.65     0.000    Psi        377.3     0.000
Delta      334.1     0.000    Rho        168.7     0.000
Eta        449.4     0.000    Sigma      53.1      0.000
EtaNu      3,483.9   0.000    Tau        98.1      0.000
EtaPi      39.8      0.000    Theta      47.3      0.000
Gamma      1,651.4   0.000    Upsilon    68.1      0.000
Kappa      956.6     0.000    Xi         221.4     0.000
KappaXi    926.7     0.000    XiNu       4,774.4   0.000
MuXi       2,986.2   0.000    XiRo       705.6     0.000
Nu         28.3      0.000    Zeta       1,802.9   0.000
Omicron    49.3      0.000

Table 7: Observed fourth (last) digit distributions in journal entry database.
i The Auditing Standards Board of the AICPA defined financial statement fraud in SAS 99 as: “Misstatements arising from fraudulent financial reporting are intentional misstatements or omissions of amounts or disclosures in financial statements designed to deceive financial statement users where the effect causes the financial statements not to be presented, in all material respects, in conformity with generally accepted accounting principles (GAAP)” (PCAOB 2002). There is a fine line between earnings management and financial statement fraud, but locating that line is beyond the scope of this paper. We confine the discussion to deliberate and intentional material misstatements, typically undertaken by one or more members of senior management.
ii This lack of published literature does not mean that the audit firms are not doing any journal entry data mining. Quite the contrary: the firms are deploying data mining technology, but what they are doing is proprietary and, as such, rarely gets published for public consumption.
iii SAS 99 is now an interim audit standard of the Public Company Accounting Oversight Board.
iv Enterprise Resource Planning (ERP) is a class of enterprise-wide application software for tracking financial transactions, logistics, sales, purchases, inventory, etc.
v Hyperion is now part of Oracle Corporation.
vi This is commonly known as Benford’s Law, after Benford (1938), who independently derived relationships first observed decades earlier by Newcomb.
vii Based in part on the XBRL GL taxonomy and the SAP R/3 general ledger journal entry.