Comprehensive Analytic Maturity Assessment

A Guide to the Approach and Execution of an Analytic Maturity Assessment

Written by Michael L. Gonzales, PhD, Director of Research and Advanced Analytics
Foreword by Wayne Eckerson, Eckerson Group

Dr. Michael Gonzales, Stream Integration, and Wayne Eckerson, Eckerson Group, have teamed up to take the art of assessing a client's business intelligence (BI), analytics, and data management capabilities to a new level using state-of-the-art software, cloud services, and statistical techniques. This document represents the heart and soul of this effort.

2015-02-06

Preface

I've conducted analytic assessments for many years and have seen first-hand their power to help companies understand their strengths and weaknesses and develop a strategy for improvement. More recently, I've teamed up with Michael Gonzales, PhD, of Stream Integration to conduct analytic assessments with joint clients. Michael also has a long history with assessments, conducting them for many large corporations, including The Gap, the US Postal Service, General Motors, Canada Post, and Dell. Together, we plan to take the art of assessing a client's business intelligence (BI), analytics, and data management capabilities to a new level using state-of-the-art software, cloud services, and statistical techniques. This document represents the heart and soul of this effort.

Assessments to Benchmarks. The real value of an assessment is to give executive leaders a quick and visual representation of how the company compares to its peer competitors. In other words, the best analytic assessments are industry benchmarks. Michael and I, with support from the Eckerson Group, are embarking on a journey to create the world's biggest database of analytic assessment data, which subscribers can access and analyze using a self-service cloud application. This will enable organizations to continually evaluate their progress against an industry maturity model and a community of peers.

Open and Repeatable. This open and online approach represents a new, and refreshing, departure from the analytic assessments offered by most consulting firms, which are typically black-box projects. You pay a lot of money and get results, but you cannot repeat the assessment yourself, and therefore cannot evaluate your progress over time, unless, of course, you pay the consultancy to repeat the assessment. The assessment described in this document uses a white-box approach with knowledge transfer at the heart of the process. We urge clients to learn to conduct the assessment themselves, either online or on paper, so that they can continue to get value out of the assessment long after we are gone.

We hope you find this document valuable and that it kick-starts an assessment process in your organization.

-Wayne Eckerson, Eckerson Group

Contents

Introduction
The Value of Assessment
  Performance Metrics
  Establish a Repeatable Process
  Create a Roadmap
  Gain Organizational Collaboration and Buy-In
The Assessment Team
Assessment Design
  Understand the Scope of the Assessment
  Assessment Information Channels
  Identify Survey Questions
  Segment Questions to Match Participants
Executive Interviews
  Structured Interview Instruments
  Selecting and Scheduling Interviews
  Conducting an Interview
  Techniques for Analyzing Short Answer and Open Ended Questions
  Techniques for Analyzing Executive Survey Responses
SME Survey
  Creating the Survey
  Identify SME Participants
  Techniques for Analyzing Survey Responses for SMEs
End User Survey
  Creating the Survey
  Identify End User Participants Using Sampling
  Conducting Pre-Survey Workshop
  Executing the End User Survey
  Techniques for Analyzing End User Survey Responses
Project Plan
Appendix A – Conducting Surveys and Analyzing Responses
Appendix B – Sample of Executive Structured Interview
Appendix C – Sample Survey Questions
Appendix D – Sample End User Survey
Appendix E – Sample Project Plan
Appendix F – References

Tables and Figures

Table 1. Assessment Measures
Table 2. Maturity Model
Table 3. Team Roles and Responsibilities
Table 4. Executive Structured Interview Questions
Table 5. Sample Plan Weekly Summary
Table 6. Likert Scales
Table 7. Sample Survey Questions
Table 8. Sample End User Survey
Table 9. Sample Project Plan
Figure 1. Analytic Maturity Level Model
Figure 2. Sample Dimensions Measured
Figure 3. Executive Survey Technology Word Cloud
Figure 4. Executive Structured Interview Sample
Introduction

Strategy is best created by establishing synergy among a company's activities.
The success of a strategy depends largely on integrating many activities well rather than excelling at only one. Without synergy among activities, no distinctive or sustainable strategy is possible (Porter, 1996). Strategic fit among several activities is, therefore, the cornerstone of creating and sustaining a competitive advantage. Interlocking and synchronizing several activities is simply more difficult for competitors to emulate: a competitive position built on a system of multiple activities is more sustainable than one built on a single capability or activity.

Establishing a sustainable competitive advantage in analytics therefore requires a network of interrelated analytic-centric activities, including business intelligence (BI), visualization, data warehousing (DW), data integration, statistics, and other relevant activities. Companies that know how to leverage their IT resources gain an analytic-enabled competitive advantage (Porter, 1980; Sambamurthy, 2000), which is the basis of Stream Integration's analytic-enabled competitive advantage research. For the purpose of this paper, the term analytics represents the comprehensive view that encompasses concepts such as predictive and exploratory analysis, BI, visualization, DW, and Big Data.

The challenge when creating an analytic strategy is to identify which activities to focus on. To that end, our research identifies factors of analytic-centric initiatives that significantly contribute to the overall maturity and success of a program (Gonzales, 2012). Building on this research, coupled with extensive practical application of maturity assessments for leading companies, Stream Integration's Comprehensive Analytic Maturity Assessment (CAMA) creates an index that measures the analytic-enabled competitive maturity of an organization. The constructs of this index and the metrics on which the constructs are quantified are outlined in Table 1.

Table 1. Assessment Measures

Construct           Metrics
Sponsorship         Leadership Style; Organization; Planning; Funding
Value & Risk        Risk; Investment; Value
Architecture        Infrastructure; Data; Development; Metadata
User Competencies   Skill; Training & Learning; Analytic Team

The objective of CAMA is to estimate not only the overall level of analytic maturity within an organization, but to do so while providing guidance and a roadmap to evolve to higher levels of maturity. Once the metrics are calculated and the competitive advantage index is quantified, it is then evaluated against a maturity model. We recommend leveraging the maturity model shown in Figure 1.¹

Figure 1. Analytic Maturity Level Model

¹ The Maturity Level Model is based on a TDWI model originally developed by Wayne Eckerson and updated by Michael L. Gonzales, PhD, in 2012.

Stream Integration's approach to measuring the analytic maturity level of an organization is therefore based on two models:

1. The first model provides the means to quantify the analytic-enabled competitive advantage index.
2. The second model then applies that index to a best-practice maturity model in order to categorize the maturity level within widely accepted, prescriptive maturity stages.

A minimal sketch of this two-step scoring appears below. Table 2 then provides context for each of the intersections between the assessment measures and maturity levels.
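For illustration only, the following sketch walks through the two-step scoring, assuming hypothetical metric scores on a 1-to-5 scale, equal weighting, and evenly spaced stage cut-offs; the weights CAMA actually applies are derived from the underlying research, not shown here.

```python
# Illustrative two-step scoring: construct scores -> index -> maturity stage.
# Metric values, equal weighting, and stage cut-offs are assumptions for
# demonstration; CAMA derives its weights from the underlying research.
from statistics import mean

# Hypothetical metric scores on a 1-5 scale, grouped by Table 1 construct.
metric_scores = {
    "Sponsorship":       {"Leadership Style": 3.2, "Organization": 2.8,
                          "Planning": 3.0, "Funding": 2.5},
    "Value & Risk":      {"Risk": 2.9, "Investment": 3.1, "Value": 3.4},
    "Architecture":      {"Infrastructure": 3.6, "Data": 3.0,
                          "Development": 2.7, "Metadata": 2.2},
    "User Competencies": {"Skill": 3.3, "Training & Learning": 2.6,
                          "Analytic Team": 3.0},
}

# Step 1: quantify the index (here, an unweighted mean of construct means).
construct_scores = {c: mean(m.values()) for c, m in metric_scores.items()}
index = mean(construct_scores.values())

# Step 2: place the index on the maturity model's five stages.
stages = [(1.5, "Nonexistent"), (2.5, "Preliminary"), (3.5, "Repeatable"),
          (4.5, "Managed"), (5.1, "Optimized")]
stage = next(label for cutoff, label in stages if index < cutoff)

print(f"Index: {index:.2f} -> {stage}")  # e.g. Index: 2.96 -> Repeatable
```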
Table 2. Maturity Model

Leadership
  Nonexistent: No formal sponsorship or leadership
  Preliminary: Departmental support with limited sponsorship
  Repeatable: Departmental leadership and sponsorship
  Managed: Enterprise sponsorship and leadership held accountable
  Optimized: C-level support with strong accountability

Infrastructure
  Nonexistent: Spreadmarts
  Preliminary: Disparate data marts
  Repeatable: Multiple, non-integrated data warehouses
  Managed: Enterprise data warehousing with dependent data marts
  Optimized: Self-service and timely BI and advanced analytics

Value & Risk
  Nonexistent: Cost centre
  Preliminary: Tactical
  Repeatable: Strategic
  Managed: Mission critical
  Optimized: Competitive differentiator

Skill
  Nonexistent: Self-taught and dependent on tacit knowledge transfer
  Preliminary: Some vendor-based training
  Repeatable: Formal corporate training and support
  Managed: Highly skilled user segments
  Optimized: Corporate curriculum tied to career path

The Value of Assessment

Many organizations that invest in an analytic-centric assessment do so in order to establish an unbiased measurement of their organization and, more specifically, of their analytic program. While this is an important task for any company focused on an analytic-enabled competitive advantage, it is only a fraction of the value that can be mined from this investment. Four other success factors should be leveraged in order to maximize your return on investment (ROI):

1. Establish performance metrics to measure and monitor your program
2. Periodically conduct the same assessment to measure and monitor progress
3. Create a roadmap for your analytic program's improvement and evolution to higher levels of maturity
4. Ensure both business and IT are involved

Each is discussed below.

Performance Metrics

An effective maturity study will measure multiple dimensions of your organization and its analytic program, including leadership, organizational structure, user competencies, and other points. Each provides key metrics to both measure your current maturity level and monitor the program's progress.

Figure 2. Sample Dimensions Measured

Establish a Repeatable Process

Organizations embark on analytic maturity assessments for several reasons. Whatever your motivation for conducting this type of study, you should not treat the assessment results as a one-time snapshot and then shelve the expensive report. Instead, the results should be used as a starting point: a baseline for your analytic strategy. A baseline assumes subsequent measurements will be conducted. To that end, you should establish an assessment schedule. Depending on the volatility of your ecosystem and organization, you may want to conduct the same assessment, internally, once every six to 12 months. Doing so achieves the following:

- The ability to demonstrate real gains across quantitative points.
- Contribution to budget deliberations. If you can demonstrate significant maturity increases over several key metrics, the results will support your argument for budget increases to secure more staff, software, hardware, etc.

A minimal sketch of comparing assessment rounds follows.
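For illustration, the sketch below compares hypothetical baseline and follow-up construct scores to report the quantitative gain per dimension; the values themselves are assumptions for demonstration.

```python
# Illustrative round-over-round comparison of construct scores (1-5 scale).
# The baseline/follow-up values are hypothetical; the point is to report
# quantitative gains per dimension between assessment cycles.
baseline  = {"Leadership": 2.4, "Infrastructure": 2.1,
             "Value & Risk": 2.8, "Skill": 2.0}
follow_up = {"Leadership": 2.9, "Infrastructure": 2.8,
             "Value & Risk": 2.9, "Skill": 2.6}

for dimension, before in baseline.items():
    after = follow_up[dimension]
    print(f"{dimension:15s} {before:.1f} -> {after:.1f}  ({after - before:+.1f})")
```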
However, if you are going to conduct the same assessment periodically, you must insist on retaining the instruments used and the methodology applied to arrive at the gauged maturity level. Some assessment services will simply not comply. It is this author's recommendation that you not invest in any assessment process that contains black-box calculations or weights that are proprietary to the service provider. Frankly, if you do not have a white-box assessment, one that provides visibility into all aspects of how the assessment is derived, it is not worth the price you will be asked to pay. Real value from these initiatives is derived when you can internalize the assessment instruments and processes, enabling your organization to conduct the assessment periodically on its own.

Create a Roadmap

Assessments of value will expose a list of opportunities for improvement. It is important that the opportunities are identified in terms that are actionable. For example, if the assessment informs you that the program is weak but does not specify which aspects of the program are weak and what can be done to improve them, then the assessment is of little value: nothing identified is actionable. Actionable insights with clear objectives for improving your program should be the objective of a roadmap. A roadmap provides clear steps to improve your analytic initiatives. For example, if the assessment identified that your organization lacks technology standards, prohibits data exploration, and has no consistent definitions for key reference data, such as customers or products, then the roadmap could specify the creation of a data governance program, technical architecture standards, an exploratory (sandbox) platform, and the implementation of a customer Master Data Management (MDM) program, including the steps necessary to achieve each objective.

Gain Organizational Collaboration and Buy-In

An effective assessment project provides an excellent opportunity for an organization to foster collaboration between business and IT. When selecting members for the assessment team, there must be some members from the assessment firm and others from your organization. Of those representing your organization, you should select individuals associated with both the business and IT sides of your company. This means that not only is your company actively involved in the assessment process, but you have also ensured that business and IT are focused on an unbiased assessment of your company from both dimensions.

Some assessments are sponsored by business, often because they feel IT has not been responsive. And sometimes an assessment is sponsored by IT in order to measure its internal capabilities to deliver BI or to serve as a means to argue for more funding. This author has found that the most effective method for conducting enterprise assessments that provide clear, unbiased findings is to involve both business and IT. Read more about the assessment team in the Assessment Team section of this paper.

The Assessment Team

When conducting a maturity assessment, it is important to keep in mind the following:

1. Your organization must not only actively participate in the initial assessment process, but must also learn how to conduct the assessment for subsequent progress reports.
2. The assessment represents a great opportunity for collaboration between business and IT.

These two points dictate what you should expect to contribute to the process and the role the consulting firm providing the assessment is to play.
From the numerous assessments this author has conducted, there are at least eight key groups that can contribute to the overall effort, as defined in Table 3. The number of groups involved in the effort will be determined by the scope of your study.

Table 3. Team Roles and Responsibilities

Consulting Firm (Lead Assessment Process)
  Provide the necessary plan, process, and related instruments to conduct the assessment. This is true for the first time you conduct the assessment; subsequent assessments can be conducted without outside consultants.

Analytic/IT Stakeholders (Assessment Team Partners)
  With guidance from the consulting firm, analytic/IT stakeholders must be actively engaged in the assessment process. It is important that these individuals have intimate knowledge of the analytic program.

Business Stakeholders (Assessment Team Partners)
  Same as analytic/IT stakeholders.

Executives (Interview Participants)
  Make time available on your schedule for a one- to two-hour interview. Be frank and as complete as possible in your responses.

Analytic Business Subject Matter Experts (SMEs) (Survey Participants)
  Business SMEs are the individuals constantly attempting to leverage the analytic-centric ecosystem of your organization.

Analytic Technical SMEs (Survey Participants)
  Same as analytic business SMEs.

Non-Managerial Users (Survey Participants)
  These individuals should represent the broad user communities that consume your BI ecosystem. While they are not experts, they do use the results of the analytic environment.

3rd Party Experts (Ranking/Matching Participants)
  Between two and five external experts should be identified to participate in the assessment.

Enterprise efforts whose goal is to accurately measure the organization's level of maturity will likely leverage all the groups defined in Table 3. Smaller or more focused assessments may choose to use only the groups most relevant.

Assessment Design

You should insist that the assessment consulting company you hire provide an assessment design, including methodology and instruments. However, you can create your own assessment as outlined in this section or, at the very least, evaluate the type of assessment your consulting team plans to execute.

Understand the Scope of the Assessment

Not all analytic assessments are the same. This author has worked on assessments where sponsors would allow only a few executives to be interviewed and no end users were to participate. Other clients have opened their entire organization with the objective of gaining an unbiased, enterprise perspective of the analytic program. The scope will dictate the questions to ask and the type of instruments to be created.

Assessment Information Channels

Depending on the scope of your study, there are several channels of information that you can draw on for calculating the final results. These channels are consistent with the groups you have chosen to include in the study, as defined in Table 3 of the Assessment Team section of this paper. For brevity, outlined below are the key channel participants, each representing a particular dimension of the user experience of your analytic program:

Executive Interviews. These represent a key channel of information for your assessment. Executives' unique perspective must be collected and recorded in a manner consistent with their position.
Consequently, structured interviews are the only effective way to document their perspective, as defined in the Executive Interviews section of this paper.

Key Subject Matter Expert Survey. A few experts are invited to participate in the assessment. Specifically, these groups are associated with the following:
- Business. Includes senior analysts and data scientists.
- IT. Includes both technical and data architecture.
- 3rd Party. Experts in the analytic space.

Non-managerial User Survey. For any comprehensive study, end users must be included. They represent the front line of analytic-centric information consumption.

In addition to the interview and survey participants, there are other channels of information necessary for a composite view of the BI program:
- Application Footprint Inventory
- Data Architecture Inventory
- Technology Inventory

Combined, these multiple channels of information provide the assessment team with the broadest perspective of analytic-centric activity being implemented and experienced throughout the organization.

Identify Survey Questions

Many assessments offered by leading consulting companies are based only on anecdotal experience. This means that some of the firm's subject matter experts have decided that the questions are significant when measuring maturity. The problem is that many of these questions, while they may seem relevant, may not be statistically significant for assessing maturity. For example, this author reviewed 40 questions used by a major provider of BI/DW maturity assessments and found that only 18 of the questions were statistically significant; the other 22 questions were essentially irrelevant.

The best questions to include in an assessment are grounded, that is, questions that have been proven to be statistically significant in assessing maturity. With relatively simple research you can identify many questions that have been used and proven important in assessing maturity in previous, relevant studies. Since vendors typically do not publish their question pools, you can research academic studies that do. The key is to build a pool of grounded questions that can be leveraged in the subsequent instruments. Refer to Appendix A for more information on conducting and analyzing surveys.

Segment Questions to Match Participants

Once you have a database of grounded questions, you select the questions you plan to include in surveys designed for specific participating groups. As defined in the Assessment Information Channels section, potential participants responding to survey questions include executives, BI business SMEs, BI technical SMEs, non-managerial users, and 3rd party experts.

While some questions are relevant to only a specific group, other questions should be asked of every participating group. For example, a statement like, "The strategy of our BI program is well aligned with our corporate strategy," is best answered by executives and business SMEs. Asking non-managerial users to respond to this statement is likely not productive, for the simple reason that they may not know what the corporate or BI program strategies are. However, a statement like, "Employees are given access to the information they need to perform their duties," could be asked of every participating group. For non-managerial users you may want to change the wording slightly: instead of starting with "Employees are given…," consider "I am given…." A minimal sketch of this segmentation follows.
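For illustration, a minimal sketch of such a question pool follows. The question text comes from the examples above; the audience tags and the rewording map are assumptions for demonstration.

```python
# Illustrative question pool tagged by audience. Question text comes from the
# examples in this paper; tagging scheme and rewording map are assumptions.
QUESTION_POOL = [
    {"text": "The strategy of our BI program is well aligned with our corporate strategy.",
     "audiences": {"executive", "business_sme"}},
    {"text": "Employees are given access to the information they need to perform their duties.",
     "audiences": {"executive", "business_sme", "technical_sme", "end_user"},
     # Reword shared questions for the end user survey.
     "reword": {"end_user": "I am given access to the information I need to perform my duties."}},
]

def build_survey(audience: str) -> list[str]:
    """Select (and, where needed, reword) pool questions for one group."""
    survey = []
    for q in QUESTION_POOL:
        if audience in q["audiences"]:
            survey.append(q.get("reword", {}).get(audience, q["text"]))
    return survey

print(build_survey("end_user"))  # ['I am given access to the information I need ...']
```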
Repeating key questions between groups provides the research team a means to compare and contrast the perspectives of different organizational communities. In the previous example, asking executives, IT SMEs, and non-managerial users to respond to the same statement can provide important insight. It is entirely possible that the IT SMEs believe employees get access to all the information they need, whereas executives believe that access to data is limited without involving IT. This type of insight guides the assessment team in gauging maturity level and can demonstrate either: 1) IT's lack of understanding of business needs, or 2) executives' lack of knowledge of how to access the data. It represents a disconnect between IT and business, which always implies a lower level of maturity.

Executive Interviews

Conducted correctly, executive interviews provide valuable insight into the vision and direction of your organization and the impact that information and IT-centric applications have on its ability to compete. The operative word is 'correctly.' Many assessment efforts, including those from high-end business consulting firms, conduct executive interviews almost as ad hoc, information-trolling efforts. Once complete, all the notes scribed during the interviews are consolidated (assuming a scribe was dedicated to the effort), interviewers review their notes, and they provide their perspective in a black-box approach to rating issues within the company. This approach reduces executive interviews to anecdotal guidance regarding the challenges facing the organization, but little else. Structured executive interviews are the only professional approach.

Structured Interview Instruments

The worst use of an executive's time is an ad hoc interview. Executive interviews must be planned and scripted. To that end, there are three types of questions that must be crafted into a single interview to extract the maximum value from the executive's time and insight.

Table 4. Executive Structured Interview Questions

Short Answer
  Purpose: Get each executive's perspective on specific topics relevant to the assessment.
  Example: What are the critical success factors for BI at your firm?
  Cross-Correlation Study: Compare and contrast responses between executives from different BUs, FUs, or geographical regions.

Open Response
  Purpose: Provide a means for each executive to voice their hot issues or re-emphasize points raised.
  Example: What else do you think we need to know about BI/IT and its ability to serve the mission of this company?
  Cross-Correlation Study: Isolate unforeseen issues and compare and contrast the key concerns between participating executives.

Single Answer
  Purpose: Ask direct questions that elicit single-answer responses.
  Example: We view BI as a competitive advantage for our company. (Agree/Disagree)
  Cross-Correlation Study: Ask similar questions of all other participating assessment groups. This gives us insight into what executives think versus SMEs versus end users.

Shown in Table 4 are the three types of questions that should be part of any executive interview: Short Answer, Open Response, and Single Answer. The questions provide executives an opportunity to share information, but in a structured, guided format. If your assessment consulting firm suggests that a few leading questions are all that is necessary to get an interview session started, after which the executive may ramble on and share whatever is on their mind, fire them!
Our objective is to gain the insight and perspective of the executive office with regard to BI-centric issues and their impact on the company's goals and objectives. Moreover, the interview should be conducted in a style that provides a means to quantify comparisons between executives as well as contrast the executive office with other important user communities. To that end, this author has successfully created a structured interview form that is followed to maximize the time with an executive; refer to Appendix B for an example.

The format of any interview follows the question types outlined in Table 4. We start with pertinent, short-answer responses to specific questions. The questions will vary between companies based on industry, culture, geographic region, etc., but they are all focused on BI-enabled competitive advantage. After 10 to 15 short-answer questions, an open response question is offered. The strategy of asking short-answer questions before an open-ended question is simple: if your short-answer questions are well crafted, the executive will have little to add when given the opportunity to share anything they believe was missed or that they want to re-emphasize. This is an important measure of the effectiveness of the structured interview: how many notes are taken when an executive is offered the opportunity to respond to an open-ended question? If few notes are taken, you can infer that you are asking a good combination of short-answer questions. If the open-ended invitation generates significant notes, you need to adjust the short-answer questions. Your goal should be to ask the right combination of short-answer questions so as to minimize open-ended response notes.

The end of the interview should be a series of single-response questions, in other words, questions that require one response, typically on a Likert scale, e.g., Strongly Disagree, Disagree, Uncertain, Agree, or Strongly Agree. This list of questions must be similar to the questions on the surveys for SMEs and end users. This allows the assessment team to quantify, compare, and contrast how the executives respond versus other key user communities.

Selecting and Scheduling Interviews

Interviews are the most time-consuming information channel of an assessment, not in terms of the interview itself, but in terms of scheduling. Since an executive's time is in demand, getting on their schedule is a challenge in itself. Consequently, it stretches out the time it will likely take to interview all participating executives. This is exacerbated when assessments are conducted around holidays or during the summer vacation season. Do not underestimate the scheduling challenges.

In order to minimize the scheduling issue, some assessment teams attempt to conduct workshops in which several executives are asked to participate. This author believes such workshops are ill conceived and a poor substitute when attempting to gather insight from the executive office. Not only do the executives have competing agendas that must be properly documented, but each represents a particular aspect of the organization, functional and business unit alike. Their perspectives must be accorded the attention and documentation warranted by such a unique, single individual. Knowing the potential problem of scheduling time with executives, the assessment team must emphasize the following:
1. Have a properly built structured interview instrument to maximize the limited time in front of an executive.
2. Be judicious in selecting the executives to be interviewed. Driven by company politics, non-C-level individuals are often added to the interview list; consequently, assessment teams interview far too many 'managers' as opposed to strictly executives. As long as the number of interviews does not adversely affect the assessment timeline, adding non-C-level individuals may not be a problem, but be selective. Teams should limit face-to-face interviews to C-level executives, who are few even in the largest organizations.

Conducting an Interview

At least two individuals from the team are necessary for each interview: an interviewer and a scribe. Armed with the structured interview, the interviewer sets the pace and guides the executive through the questions and responses. If the interviewer asks a question that requires a short response, the interviewer must reserve the right to guide the executive back to the question at hand if they veer off topic. The interviewer must attempt to get responses to all questions of the structured instrument. It is the scribe's role to properly document the responses to each question and, where possible, to document relevant follow-up questions and answers between the interviewer and executive.

Techniques for Analyzing Short Answer and Open Ended Questions

Most executive interviews are conducted using open-ended or short-answer questions that create free-form text. Generally a scribe is assigned to the interview team to take notes, or the interviews are recorded for later transcription. Put simply: interviews generate significant text. When interviews are conducted without a means for quantifying the results, this author believes they are essentially a waste of time: any insight gleaned will be anecdotal at best and will require a black-box process to score. It is always recommended that you conduct executive interviews with a clear understanding of how the free-form responses will be scored and incorporated into the overall assessment. From this author's perspective, interviews should be designed and information collected with the following method in mind:

Text mining for key words and phrases. Text mining provides an effective means of quantifying free-form responses. Text mining processes can range from simple word clouds (see Figure 3) to semantic examination. This type of mining technique allows the assessment team to quantify the following:
- An individual executive's key concerns across all subjects queried
- Comparisons and contrasts between executives and their functional and business units
- The overall executive perspective per short-answer question. For example, Figure 3 shows a word cloud, built across all executives interviewed, of the technology used to analyze data.

Figure 3. Executive Survey Technology Word Cloud

A minimal keyword-frequency sketch of this kind of text mining follows.
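For illustration, the sketch below shows the simplest form of this mining, a keyword-frequency count over hypothetical transcript snippets. The tokenization and stop-word list are deliberate simplifications; real efforts may add stemming, phrase detection, or dedicated word-cloud and semantic-analysis tooling.

```python
# Minimal keyword-frequency sketch for mining interview transcripts.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "we", "our", "in",
              "for", "it", "is", "are", "with", "on", "that", "i", "use"}

def keyword_counts(transcript: str) -> Counter:
    """Count non-trivial words in one executive's free-form responses."""
    words = re.findall(r"[a-z']+", transcript.lower())
    return Counter(w for w in words if w not in STOP_WORDS)

# Hypothetical snippets of scribed responses about analysis technology.
transcripts = [
    "We use Excel for most analysis, with some SQL reports and dashboards.",
    "Mostly Excel and email; the dashboards are rarely timely.",
]

# Aggregate across executives: the basis of a word cloud like Figure 3.
overall = sum((keyword_counts(t) for t in transcripts), Counter())
print(overall.most_common(5))  # e.g. [('excel', 2), ('dashboards', 2), ...]
```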
Techniques for Analyzing Executive Survey Responses

The single-response survey question section of a structured executive interview is designed like the survey questions on the SME and end user surveys. Doing so allows your assessment team to identify a pool of questions to be asked across all participating survey communities, including executives, SMEs, and end users. Asking similar questions across the communities provides a unique assessment perspective; specifically, are there differences in responses between participating communities? For example, do executives believe end users have access to all the data they need, and does that match the end user perspective? Single-response questions should be scored and compared between the following respondent groups:

- Individually, between executives and their functional and business units
- As an aggregate of 'executive' scores compared to the SME and end user communities

For more information about survey design and analysis, please refer to Appendix A.

SME Survey

The SME survey will always be the most comprehensive. This user community represents those with the most intimate knowledge of the analytic ecosystem, the data, and the scope of analysis. Consequently, surveys crafted for this group should be comprehensive, covering a broad range of topics.

Creating the Survey

This author recommends executing online surveys. Refer to Appendix A for more information about how to create and execute surveys. Appendix C provides an example of a detailed SME survey.

The SME survey will contain the entire scope of single-response and short-answer questions. From this survey, subsets of questions are duplicated and reworded for the other survey audiences. For example, single-response questions about leadership and the executive perspective of the BI program are duplicated in the structured interview for executives, and questions about users getting the information and support they need are duplicated in the end user surveys. Consequently, the SME survey must represent the scope of relevant questions that are correlated between all survey participants. It not only represents the entire pool of single- and short-response questions, but serves as the point of rationalization for the questions asked across all channels of the assessment.

Identify SME Participants

The assessment team should select the participants for the SME survey. Since this is a comprehensive questionnaire, it can only be submitted to individuals intimate with many aspects of the BI program. Consequently, the audience will be small, fewer than 20. Many of the assessment team members from the organization will likely be candidates for this survey.

Techniques for Analyzing Survey Responses for SMEs

Since the SMEs participating in the survey are considered your organization's experts regarding BI, data warehousing, and analytics, they are uniquely qualified to respond to more detailed questions and longer surveys. An SME survey often covers the entire scope of dimensions being assessed (refer to Table 1): Leadership, Value & Risk, Infrastructure, and Skill. While the detail and number of questions asked of SMEs differ from the single responses included in the executive and end user surveys, how questions are constructed and scored is the same. For more information regarding the design and analysis of survey questions, refer to Appendix A.

End User Survey

End user surveys are often referred to as non-management or front-end user surveys. These participants represent the consumers of the analytic program. This user community can help assess the value of the reports they receive (in terms of content, quality, and timeliness) as well as the training and support provided.
Creating the Survey

This author recommends executing online surveys; refer to Appendix A for more information about how to create and execute surveys. Appendix D provides an example of an end user survey.

The content of this survey must be reduced to a handful of questions that you can expect end user communities to be able to answer. While these questions are duplicates of specific questions in the SME survey, they must be selected and reworded for easy understanding by any end user consumer of BI-centric reporting or analysis. There are unique considerations when crafting end user surveys, specifically regional diversity, language and word choices, and legal constraints. To start, this author highly recommends that non-management surveys be approved by the legal or compliance department. Next, it is important for the assessment team to consider language translation and what particular word choices might invoke when crafting the final questions.

Identify End User Participants Using Sampling

This step is unique to non-management surveys. Since end users potentially represent the largest group of consumers of analytic-centric output, it is often difficult or impractical to survey the entire population. For example, your program might have 20,000 registered users across multiple global regions, encompassing several business and functional units. Consequently, the assessment team must leverage statistical methods to ensure that a representative group of this large community is identified to participate. To that end, this author recommends that the assessment team select participants based on the following steps:

1. Company team members must identify the total population of non-management users who are consumers of the analytic program's output. For example, we need to identify those who receive one or more of the reports produced by the analytic program; or, the program publishes a sales dashboard to 5,000 sales representatives covering three geographic regions (North America, Asia, and Europe) and encompassing the manufacturing and distribution business units.

2. Once the total population of non-managerial consumers of analytic-centric output is quantified, a stratified random sample should be executed. This sample must include consideration of the following:
   - What are the strata we want to consider in this assessment? This is a company decision. The company might have a heavy presence in a single geographic region, or most of its revenue may be represented in a single business unit. All these aspects define your organization and must be considered when identifying the strata to be represented in the assessment.
   - Once you have defined the strata, the total population of end users can be identified and serve as the basis of a random sample. A redacted list of the total potential survey respondents is consolidated into a file that contains the geographic region, business unit, and functional unit in which each respondent works. Using a tool such as IBM SPSS Statistics, a random sample is executed and the final survey participants are selected.

3. Identified participants are invited to a pre-survey workshop.

Stratified random sampling is not a trivial process, assuming you want to have confidence in the inferences made from the survey responses. From this author's perspective, if the assessment team does not have the skill to properly conduct this sampling, the team must reach outside the group to ensure a skilled resource can assist. A minimal sketch of the sampling step follows.
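For illustration, a minimal sketch of step 2 follows, assuming the redacted respondent list is a CSV file with hypothetical column names (user_id, region, bu, fu) and an illustrative 10% sampling fraction; a tool such as IBM SPSS Statistics performs the equivalent function.

```python
# Minimal stratified random sample of an end user population, assuming a
# file with one row per registered user. Column names and the 10% sampling
# fraction are hypothetical.
import pandas as pd

# Redacted respondent list: region, business unit, functional unit per user.
users = pd.read_csv("end_user_population.csv")  # columns: user_id, region, bu, fu

# Sample 10% within every region/BU stratum so each is proportionally
# represented; random_state makes the draw repeatable for later audits.
sample = (
    users.groupby(["region", "bu"], group_keys=False)
         .sample(frac=0.10, random_state=42)
)
sample.to_csv("survey_invitees.csv", index=False)
```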
Conducting Pre-Survey Workshop

While the objective of a pre-survey workshop is similar to that of the other surveys conducted, there are unique challenges in conducting a workshop for the non-management participants. Most of the complexity is a direct result of regional diversity, time zones, and the sheer number of participants. It is important that the assessment team does not underestimate the level of preparation necessary for large end user audiences. While global differences have obvious implications in terms of time zones and languages, these issues might also be experienced within a single country: there are several countries with multiple national languages. This means that your team may need to communicate with, and survey, employees in all recognized national languages. This might be necessary to adhere to legal requirements or even union regulations.

Executing the End User Survey

By design, end user surveys are much smaller in scope, and therefore fewer questions are asked. It is recommended that these surveys be designed to be completed within 15 to 20 minutes, limiting the survey to about 15 questions. Since there are only a limited number of questions, you need to keep the survey active only long enough for participants to respond. It is recommended that you pick a 48-hour window for end users to respond. And while a weekend may be provided for SMEs to consider their responses, we should never expect end users to work over their days off.

Techniques for Analyzing End User Survey Responses

In order to compare and contrast executive, SME, and end user responses, the questions must be similar. While the wording of questions may vary to accommodate the different participating communities, the construction of each question, and how it is scored and incorporated into the assessment, must be the same. Refer to Appendix A for more information regarding the design and analysis of survey questions. A minimal sketch of such a cross-community comparison follows.
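For illustration, the sketch below compares hypothetical mean scores on one shared 5-point question across the three communities; the scores, and the notion that gaps above about one point warrant attention, are assumptions for demonstration.

```python
# Illustrative cross-community comparison on a shared 5-point question.
# Scores are hypothetical; a large gap between communities (e.g. executives
# vs. end users) flags the kind of disconnect discussed above.
from statistics import mean

question = "Employees are given access to the information they need."
scores = {
    "executives": [4, 4, 5, 4],
    "smes":       [3, 4, 3, 3, 4],
    "end_users":  [2, 3, 2, 2, 3, 2],
}

means = {group: mean(vals) for group, vals in scores.items()}
gap = max(means.values()) - min(means.values())
print(means)                      # {'executives': 4.25, 'smes': 3.4, 'end_users': 2.33}
print(f"largest gap: {gap:.2f}")  # gaps above ~1 point warrant investigation
```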
Project Plan

The project plan is based on the breadth of the assessment undertaken. The Assessment Information Channels section of this guide identifies several areas and information sources that can be included in an assessment, referred to as Information Channels. The more channels included, the more comprehensive the study and the higher the confidence you will have in its findings. Information Channels include:

- Executive Interviews
- Key Subject Matter Expert Surveys
- Non-management User Surveys
- Application Footprint Inventory
- Data Architecture Inventory
- Technology Inventory

For example, an assessment that includes all the information channels above and is conducted with the full support of the organization can be executed in 8 to 10 weeks. Appendix E details a comprehensive study conducted in 8 weeks. The plan in Appendix E can be summarized by week as shown in Table 5.

Table 5. Sample Plan Weekly Summary

Week 1
  Planning: Identify resources; outline artifacts, risks, and milestones
Week 2
  Planning: Define project scope, formalize assessment schedule, and plan for instruments
  Logistics: Secure work area and team access and identify all participants
Week 3
  Instruments: Develop all relevant instruments
  Logistics: Schedule conference calls and meetings with participants
Week 4
  Assessment: Conduct inventories
  Logistics: Schedule conference call for end user participants
Week 5
  Assessment: Complete inventories, conduct interviews, execute user and self-assessment surveys
Week 6
  Logistics: Schedule team meetings for findings documentation
  Assessment: Finalize interviews
  Artifact: Analyze information gathered, formalize draft reports, and submit for early readout
Week 7
  Logistics: Schedule internal team meeting
  Artifact: Start final report development
Week 8
  Artifact: Submit final report and conduct presentation

On the other hand, if the organization wants only executives interviewed and a quick review of the data and technical architecture, the assessment might be completed within 5 to 6 weeks. The number of Information Channels included in the scope of the assessment determines the length of the assessment. Other aspects of the effort can also contribute to the project duration, including:

- Availability of assessment team members from the organization. We generally recommend that team members from the client side be prepared to dedicate 50% to 75% of their time for the duration of the assessment.
- Availability of the participants identified. For example, executives have busy schedules. If the organization's executives are committed to participating, they will open their schedules to accommodate the study. On the other hand, executives can create scheduling delays that extend the project duration.

Assessments should not be excessively long or unnecessarily complex. A robust study can be executed relatively quickly, in as little as 8 weeks, assuming the commitment from top management is strong and consistent.

Appendix A – Conducting Surveys and Analyzing Responses

When conducting surveys associated with the maturity assessment, the following steps are highly recommended by this author.

Conducting Surveys

There are five key steps to executing any survey:

1. Create the survey based on the target audience and its integration into the overall assessment
2. Select survey participants
3. Conduct a short pre-survey workshop
4. Conduct the survey
5. Analyze the results and incorporate them into the overall findings

This author recommends that surveys be conducted online. Doing so affords the following benefits:

1. Online surveys allow the research team to collapse the overall assessment time, since online surveys can be conducted for different respondent audiences at the same time as other Information Channels are executed, that is, other surveys, executive interviews, the data architecture inventory, etc.
2. An online survey captures an individual's perspective, uninfluenced by their boss or peers. Taking an online survey is a personal event; no outside bias is introduced while the respondent considers their answers.
3. SME surveys can be executed anonymously, which is likely to elicit more frank responses from participants, further enhancing their contribution to the overall assessment.
4. Sufficient time can be provided to respondents. Since the SME surveys are the most comprehensive, there are often 60 to 100 questions to answer, covering a broad array of topics. Consequently, it is important that respondents are given sufficient time to consider their answers. This author recommends that SME participants be given access to the survey on a Thursday and that it remain active until close of business the following Monday or Tuesday. This gives them enough time to carefully weigh their responses.

Unless a company already has survey software at its disposal, it is highly recommended that an existing survey service, such as SurveyMonkey, be employed. Not only will your organization save significant time and money compared to developing your own survey tools, but you will be able to conduct the survey quickly, thus collapsing the overall assessment time.

Techniques for Analyzing Survey Responses

While there are several methods for creating and scoring surveys, this author highly recommends adopting a Likert scale. This is one of the most popular methods for crafting and analyzing surveys, and it is associated with responses most of us are familiar with. There are three common Likert scales: 3-point, 5-point, and 7-point. Table 6 outlines response examples for each scale.

Table 6. Likert Scales

3-point: Agree; Uncertain; Disagree
5-point: Strongly Agree; Agree; Uncertain; Disagree; Strongly Disagree
7-point: Strongly Agree; Agree; Somewhat Agree; Uncertain; Somewhat Disagree; Disagree; Strongly Disagree

Aside from the obvious differences between the scales, there are other factors to consider. For example, a 3-point scale is the easiest for participants simply because there are only three choices, but what is easier for respondents does not give researchers much information about their perspective. A 7-point scale provides considerably more detail about the perspective of participants but is more difficult to use; the number of choices requires more thought. This author believes that a 5-point scale provides the right level of detail without being overly difficult for the various survey communities to complete.

The best practice for analyzing Likert surveys is to break the analysis into two distinct approaches for different aspects of the survey:

1. At the aggregate level of dimensions/constructs. This approach uses analysis of means: the researcher calculates the mean value of all responses for a particular construct that is comprised of two or more specific questions. This is often the type of result reported from surveys; for example, if the average response for all Technical Architecture questions was 4.1 on a scale of 1 to 5 (assuming a 5-point scale where 1 is negative and 5 is positive), the reader could infer a relatively positive perspective from respondents. Analysis of means is not only aggregation of means across dimensions but also between respondents and participant communities; proper analysis attempts to identify statistically significant differences between respondents based on their mean scores.

2. At the question level. This approach uses range and mode analysis, which describes the shape of the data by providing a measure of center (median and mode) and a measure of dispersion (range):
- Range is used as an estimate of the statistical dispersion of the data. A high value suggests a wide variety of opinions within a group of respondents; range scores of 4 and 3 indicate wide variability of opinion. A low value is associated with consistency between respondents; range scores of 2 and 1 are considered relatively consistent responses, and zero means complete agreement among respondents.
- Mode identifies the answer that was given more often than the others. It is significant if only one or two mode responses exist for a single question.

These are two distinct methods for analyzing Likert-based surveys. A common mistake by many researchers is using means analysis for specific questions. If your consulting firm uses only means analysis on a Likert-based survey, you should challenge their approach. A minimal sketch of both approaches follows.
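For illustration, the following is a minimal sketch of both approaches, assuming hypothetical 5-point responses to two questions under a single construct; the construct and question text are placeholders.

```python
# Minimal sketch of the two Likert analyses: construct-level means and
# question-level range/mode. Responses are hypothetical 5-point scores
# (1 = Strongly Disagree ... 5 = Strongly Agree).
from statistics import mean, multimode

# Responses per question, grouped under an assessed construct.
responses = {
    "Technical Architecture": {
        "Standards are established for technology and tools": [4, 5, 4, 4, 3],
        "Users can directly access the data they need":       [2, 5, 2, 4, 1],
    },
}

for construct, questions in responses.items():
    # 1. Aggregate level: mean across every response in the construct.
    all_scores = [s for scores in questions.values() for s in scores]
    print(f"{construct}: mean = {mean(all_scores):.1f}")

    # 2. Question level: range (dispersion) and mode (most common answer).
    for text, scores in questions.items():
        spread = max(scores) - min(scores)  # 3-4 = wide variability, 0-2 = consistent
        print(f"  '{text}': range = {spread}, mode(s) = {multimode(scores)}")
```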
Appendix B – Sample of Executive Structured Interview

Figure 4. Executive Structured Interview Sample

Executive Structured Questionnaire. Each interview will be scheduled for a two-hour session.

30 minutes maximum. Short Answer Questions: Value Map
1. From your perspective, what are the key business drivers for COMPANY?
2. What are the performance indicators or metrics you use to measure your business?
3. What are the performance indicators or metrics you use to measure your functional area?
4. What are the performance indicators or metrics you would like to use?
5. What is preventing you from using them?
6. What is the impact to the business?
7. What are the critical success factors required of BI/DW for COMPANY success?

30 minutes maximum. Short Answer Questions: General BI/DW Requirements
1. What business questions are most critical to operating your unit effectively?
2. What data/information do you consider critical in running your business today? Within 1 or 2 years?
3. Are you currently getting the type of information and insight necessary to make better decisions? If not, what do you need?
4. What type of analytics (predictive, exploratory, etc.) do you need to support your decision-making processes?
5. Is there data/information in other COMPANY BUs/FUs that you feel is necessary to make better decisions? For example, sales/marketing may need greater access to available inventory.
6. Are there initiatives in other COMPANY BUs/FUs that are more important to the success of COMPANY than yours?
7. How often do you need data refreshed and information available (monthly, weekly, daily, intra-day, real-time)? What level of history, if any, is required?
8. What tools (data mining, mobile apps, dashboards) do you use today that are invaluable in running your business?

30 minutes maximum. Freeform: Allow the interviewee time to express ideas and concerns.

30 minutes maximum. Specific Questions with Single Responses (Strongly Agree, Agree, Uncertain, Disagree, Strongly Disagree), grouped under Leadership, Technology, Personnel, Value, and Competition:
1. Within our industry, I consider our BI/DW to be a LEADER
2. Our BI/DW is considered to be an EXPENSE instead of an INVESTMENT
3. The strategy of our BI group is well aligned with our firm's strategy
4. COMPANY's level of investment in BI data management and warehousing is much HIGHER than competitors'
5. COMPANY's level of investment in BI/DW applications and analytics is much HIGHER than competitors'
6. COMPANY's level of investment in BI/DW infrastructure is much HIGHER than competitors'
7. COMPANY's level of investment in BI/DW personnel management skills is much HIGHER than competitors'
8. COMPANY's level of investment in BI/DW personnel interpersonal skills is much HIGHER than competitors'
9. COMPANY's level of investment in BI/DW personnel business knowledge is much HIGHER than competitors'
10. BI/DW applications assist in reducing the cost of recruiting, hiring, and training organizational personnel
11. BI/DW reduces the cost of general management activities such as purchasing, marketing, and sales
12. BI/DW reduces the cost of providing services to maintain or enhance the value of our services
13. BI/DW maintains and provides a common customer view that is available to everyone in our firm
14. BI/DW maintains and provides a common product view that is available to everyone in our firm
15. BI/DW assists in our ability to threaten suppliers with vertical integration (e.g., assimilate external functions)
16. BI/DW assists in our organization's ability to evaluate and choose the most appropriate customers
17. Our BI/DW stores integrated, historical data that are difficult for competitors to replicate
18. Our BI/DW offers information services to suppliers/customers that are difficult for competitors to replicate

Appendix C – Sample Survey Questions

Table 7. Sample Survey Questions

Questions are grouped by construct; each appears on one or more of the executive, SME, and end user surveys.

Leadership
- Which best describes the SPONSOR(S) of your BI initiative?
- Which best describes how EXECUTIVES perceive the PURPOSE of the BI environment?
- To what degree is your sponsor(s) COMMITTED to the BI program?
- To what degree is the sponsor(s) held ACCOUNTABLE for the outcome of the BI solution?
- How many levels below the CEO is the BI top position?
- Top management views BI as an INVESTMENT and not an EXPENSE
- Top management views our BI program as a leader in our industry
- Management encourages the use of BI
- User satisfaction with BI is a concern of management
- Users are actively involved in BI development
- BI objectives are derived from overall business objectives
- BI strategy is aligned with the strategic plan of the organization
- BI objectives adapt to the changing objectives of the organization
- BI leaders educate top management on the importance of BI
- BI team is constantly assessing the strategic importance of emerging technologies
- BI team is constantly generating ideas to reengineer business processes
- BI team seeks to identify BI-related opportunities to support the strategic direction of the organization
- A standardized process for prioritizing BI projects has been established

Value & Risk
- How easy is it to get FUNDING for the annual BI budget?
- What group ALLOCATES funds to the BI environments?
- Which best describes the current degree of CAPITAL INVESTMENT in your BI system?
Appendix C – Sample Survey Questions

Table 7. Sample Survey Questions

The questions below are grouped into four dimensions: Leadership, Value & Risk, Infrastructure, and Skill. In the original table, each question is also cross-referenced to the survey instruments in which it appears (Executive, SME, and/or End User).

Leadership
- Which best describes the SPONSOR(S) of your BI initiative?
- Which best describes how EXECUTIVES perceive the PURPOSE of the BI environment?
- To what degree is your sponsor(s) COMMITTED to the BI program?
- To what degree is the sponsor(s) held ACCOUNTABLE for the outcome of the BI solution?
- How many levels below the CEO is the BI top position?
- Top management views BI as an INVESTMENT and not an EXPENSE
- Top management views our BI program as a leader in our industry
- Management encourages the use of BI
- User satisfaction with BI is a concern of management
- Users are actively involved in BI development
- BI objectives are derived from overall business objectives
- BI strategy is aligned with the strategic plan of the organization
- BI objectives adapt to the changing objectives of the organization
- BI leaders educate top management on the importance of BI
- BI team is constantly assessing the strategic importance of emerging technologies
- BI team is constantly generating ideas to reengineer business processes
- BI team seeks to identify BI-related opportunities to support the strategic direction of the organization
- A standardized process for prioritizing BI projects has been established

Value & Risk
- How easy is it to get FUNDING for the annual BI budget?
- What group ALLOCATES funds to the BI environments?
- Which best describes the current degree of CAPITAL INVESTMENT in your BI system?
- Which best describes the current MAINTENANCE BUDGET for the BI system?
- BI value is measured as cost savings or meeting budgets
- BI leadership has the ability to identify key problem areas
- BI team/organization is flexible and adapts to unanticipated changes
- BI leadership has the ability to gain cooperation among user groups for BI
- BI leadership identifies and resolves potential sources of resistance to BI plans
- BI projects always contain an assessment of risk
- Our organization's level of investment in BI technology is ...
- Our organization's level of investment in the BI team's business function is ...
- Our organization's level of investment in the BI team's technical skill is ...
- BI reduces costs for many business processes
- BI enhances the value of our products and/or services
- BI assists in identifying the most appropriate customers/clients for our organization
- BI assists in monetizing information for customers or suppliers

Infrastructure
- What is the PREDOMINANT ARCHITECTURE of your BI environment?
- To what degree can users directly ACCESS the data they need to make decisions from a single user interface?
- To what degree have you established standards for TECHNOLOGY and TOOLS in your BI environment?
- To what degree do individuals and groups ADHERE to the technology and tools standards established?
- To what degree has the BI group defined, documented, and implemented DEFINITIONS and RULES for key terms and metrics?
- How many unique DATA SOURCES does your BI environment draw from?
- To what degree do end users TRUST the data in your BI environment?
- Which best describes the degree of synchronization among the DATA MODELS that your group maintains?
- Which best describes your BI group's approach to DEVELOPING BI solutions?
- To what degree has each group defined, documented, and implemented STANDARDS for developing, testing, and deploying BI functionality?
- On average, how many BI PROJECTS that last three or more months does your group run concurrently?
- Metadata tools are generally available
- Metadata tools are used regularly across the organization
- There is a general understanding of data structures across the organization
- There is a well-defined data environment including stewardship and metadata
- Which best describes how users access BUSINESS METADATA?

Skill
- Of regular BI users, most have strong PROBLEM SOLVING ability
- Of regular BI users, most have strong DATA MANIPULATION skill
- Of regular BI users, most have a high tolerance for CHANGE and AMBIGUITY
- Of people who use BI on a regular basis, most have a strong understanding of BUSINESS FUNCTIONS
- There is a well-organized availability of technical training
- There is a well-organized availability of business training
- Management supports ongoing education
- Online training is available on demand
- Formal measurement of training is done to improve courses
- How many of the courses offered are from vendors?
- Members of the BI team have the right technical skills
- BI team thoroughly understands the data, applications, and technologies used by the firm

Appendix D – Sample End User Survey

The end user survey should include questions that overlap with the other instruments, such as the SME self-assessment and the executive interviews. This overlap enables cross-analysis to discern whether the organization carries a consistent message across its communities or whether there is a significant disconnect.
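One way to operationalize this cross-analysis is to compare each community's typical response on the overlapping questions and flag large gaps. The sketch below is illustrative only: it assumes responses coded 1-5 and grouped by community, uses one of the Table 8 statements as the hypothetical overlapping question, and the one-point gap threshold is an assumption rather than a prescribed cutoff. Medians are used instead of means, consistent with the earlier caution about applying means analysis to Likert data.

```python
import statistics

# Hypothetical coded responses (1-5) to one overlapping question, per community.
questions = {
    "The data our company uses and reports are trustworthy": {
        "executives": [4, 5, 4],
        "smes": [3, 3, 4, 2],
        "end_users": [2, 2, 3, 2, 1],
    },
}

GAP_THRESHOLD = 1  # assumed cutoff for flagging a disconnect

for question, groups in questions.items():
    # Median per community; appropriate for ordinal Likert data.
    medians = {group: statistics.median(resp) for group, resp in groups.items()}
    gap = max(medians.values()) - min(medians.values())
    status = "DISCONNECT" if gap > GAP_THRESHOLD else "consistent"
    print(f"{question}: {medians} -> gap={gap} ({status})")
```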
Table 8. Sample End User Survey

Respondents rate each statement on a five-point scale: Strongly Disagree, Disagree, Uncertain, Agree, Strongly Agree.

- The data our company uses and reports are trustworthy
- The data our company uses is clean
- Our company is quick to react to technological change
- Our company is driven by Information Technology innovation
- Our employees embrace change to new ways of doing things
- Employees share information with other employees
- Employees have a wide variety of communication tools to choose from, including telephone, Internet, and e-mail
- Our employees fully leverage all the available Information Technology
- Employees make extensive use of Information Technology
- Employees are given access to the information they need to perform their duties
- Our employees have access to information to make timely decisions
- Our company's BI/BA personnel are knowledgeable about business functions
- A common view of our products/services is available to everyone in the organization
- Mobile users have ready access to the same data used at desktops
- When employees need specific information, they know who has it or where to get it
- BI/BA staff quickly respond to user support requests

Appendix E – Sample Project Plan

Table 9. Sample Project Plan

Week 1 (Planning)
- Team participants
- Outline artifacts, milestones, and risks

Week 2 (Planning)
- Conduct scope and definition meetings with client
- Define the assessment process schedule
- Instrument planning

Week 2 (Logistics)
- Complete all required documentation for team members
- Secure work area for team members
- Secure site passes and system access for team members
- Formalize the entire team's roles
- Identify executive participants and schedule face-to-face executive interviews
- Identify SME self-assessment survey participants
- Identify user sentiment survey participants
- Identify IT and business candidates
- Identify the total population and conduct a stratified random sample if necessary (see the sampling sketch following this table)
- Schedule technical architecture meetings
- Schedule data/application meetings

Week 3 (Instruments)
- Develop user sentiment survey, self-assessment, and structured interviews (executives)
- Develop structured interview instrument
- Develop technical, application, and data architecture inventory instruments

Week 3 (Logistics)
- Schedule 30-minute conference call with SME participants
- Finalize executive interview schedules
- Schedule team roundtable meeting
- Schedule whiteboard meeting with the sponsors
- Have user sentiment survey approved by HR and/or legal

Week 4 (Assessment)
- Conduct technical inventory
- Conduct data/application inventory

Week 4 (Logistics)
- Schedule user sentiment conference

Week 5 (Assessment)
- Conduct executive interviews
- Conduct user sentiment survey
- Conduct self-assessment
- Complete technical inventory
- Complete data and application inventories
- Conduct remote executive interviews if necessary
- Schedule any secondary interviews, or send emails, to fill research gaps

Week 5 (Logistics)
- Schedule team meetings for product development and presentation
- Schedule internal team roundtable meetings

Week 6 (Artifact)
- Formalize survey findings
- Formalize technical inventory report
- Formalize data architecture inventory report
- Formalize applications inventory report
- Analyze executive interview notes
- Start analysis of User Sentiment and Self-Assessment survey results
- Submit formal reports

Week 6 (Logistics)
- Schedule internal team roundtable

Week 7 (Artifact)
- Start final report development

Week 8 (Artifact)
- Final presentation and report
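For the Week 2 task of drawing a stratified random sample of survey participants, the sketch below shows one common approach, proportional allocation across strata. It is a minimal illustration: the department strata, population, and overall sample size are hypothetical.

```python
import random

# Hypothetical population of end users keyed by stratum (e.g., department).
population = {
    "finance": [f"fin_user_{i}" for i in range(120)],
    "marketing": [f"mkt_user_{i}" for i in range(60)],
    "operations": [f"ops_user_{i}" for i in range(220)],
}

TOTAL_SAMPLE = 80  # assumed overall sample size
pop_size = sum(len(members) for members in population.values())

sample = {}
for stratum, members in population.items():
    # Proportional allocation: each stratum contributes in proportion to its size.
    # Rounding can leave the total slightly off TOTAL_SAMPLE; adjust the largest
    # stratum if an exact overall count is required.
    n = round(TOTAL_SAMPLE * len(members) / pop_size)
    sample[stratum] = random.sample(members, n)

for stratum, chosen in sample.items():
    print(stratum, len(chosen))
```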
Appendix F – References

Bharadwaj, A. S. (2000). A resource-based perspective on information technology capability and firm performance: An empirical investigation. MIS Quarterly, 24(1), 169-196.

Bhatt, G. D., & Grover, V. (2005). Types of information technology capabilities and their role in competitive advantage: An empirical study. Journal of Management Information Systems, 22(2), 253-277.

Dehning, B., & Stratopoulos, T. (2003). Determinants of a sustainable competitive advantage due to an IT-enabled strategy. Journal of Strategic Information Systems, 12(1), 7-28.

Gartner (2009). Key roles for successful BI/DW delivery: Business intelligence solution architect. Gartner.

Gonzales, M. L. (2012). Competitive advantage factors and diffusion of business intelligence and data warehousing. The University of Texas at El Paso.

Gonzales, M. L., Bagchi, K., Udo, G., & Kirs, P. (2011). Diffusion of business intelligence and data warehousing: An exploratory investigation of research and practice. 44th Hawaii International Conference on System Sciences.

Gonzales, M. L., Mahmood, M. A., & Gemoets, L. (2009). Technology-enabled competitive advantage: Leadership, skill and infrastructure. Decision Science Institute.

Gonzales, M. L., Mahmood, M. A., Gemoets, L., & Hall, L. (2009). Risk and IT factors that contribute to competitive advantage and corporate performance. Americas Conference on Information Systems, San Francisco, CA.

Gonzales, M. L., & Wells, D. L. (2006). BI strategy: How to create and document. El Paso, TX: HandsOn-BI, LLC.

IDC (2003). Leveraging the foundations of wisdom: The financial impact of business analytics. IDC.

Inmon, W. H. (1992). Building the data warehouse (1st ed.). Boston, MA: QED Technical Pub. Group.

Johannessen, J., & Olsen, B. (2003). Knowledge management and sustainable competitive advantages: The impact of dynamic contextual training. International Journal of Information and Management, 23(4), 277-289.

Oh, W., & Pinsonneault, A. (2007). On the assessment of the strategic value of information technologies: Conceptual and analytical approaches. MIS Quarterly, 31(2), 239-265.

Piccoli, G., & Ives, B. (2005). IT-dependent strategic initiatives and sustained competitive advantage: A review and synthesis of the literature. MIS Quarterly, 29(4), 749-776.

Porter, M. E. (1979). How competitive forces shape strategy. Harvard Business Review, 57(2), 137-145.

Porter, M. E. (1980). Competitive strategy. New York, NY: The Free Press.

Porter, M. E. (1998). Competitive advantage: Creating and sustaining superior performance. New York, NY: The Free Press.

Ramiller, N. C., Swanson, E. B., & Wang, P. (2008). Research directions in information systems: Toward an institutional ecology. Journal of the Association for Information Systems, 9(1), 1-22.

Ross, J. W., & Beath, C. M. (2002). Beyond the business case: New approaches to IT investment. MIT Sloan Management Review, 43(2), 21-24.

Sambamurthy, V. (2000). Business strategy in hypercompetitive environments: Rethinking the logic of IT differentiation. In R. W. Zmud (Ed.), Framing the domains of IT management (pp. 245-261). Cincinnati, OH: Pinnaflex Educational Resources.

Sambamurthy, V., Bharadwaj, A., & Grover, V. (2003). Shaping agility through digital options: Reconceptualizing the role of information technology in contemporary firms. MIS Quarterly, 27(2), 237-263.
Santhanam, R., & Hartono, E. (2003). Issues in linking information technology capability to firm performance. MIS Quarterly, 27(1), 125-153.

Turban, E., Aronson, J. E., Liang, T., & Sharda, R. (2007). Decision support and business intelligence systems (8th ed.). NJ: Prentice Hall.

Watson, H. J., Goodhue, D. L., & Wixom, B. H. (2002). The benefits of data warehousing: Why some organizations realize exceptional payoffs. Information & Management, 39(6), 491-502.

Weill, P., & Broadbent, M. (2000). Managing IT infrastructure: A strategic choice. Cincinnati, OH: Pinnaflex Educational Resources.

Weill, P., Subramani, M., & Broadbent, M. (2002). Building IT infrastructure for strategic agility. MIT Sloan Management Review, 44(1), 57-65.

Wixom, B. H., & Watson, H. J. (2001). An empirical investigation of the factors affecting data warehousing success. MIS Quarterly, 25(1), 17-41.

Stream Integration – Aligning Technology to Business

Stream Integration, now the Prolifics Information Management and Analytics practice, is a market-leading information lifecycle consultancy specializing in Enterprise Analytics, Big Data, and Information Management. We leverage our deep industry knowledge, broad functional experience, and mastery of technology to help our customers build robust business solutions that deliver real return on investment. In over a decade of business, Stream Integration has successfully delivered business value on hundreds of customer engagements. We focus exclusively on turning data into valuable business information. As a result, our customers are more competitive, efficient, and profitable.

Enterprise Analytics | Master Data Management | Information Governance | Data Integration | Big Data | Performance Management | Enterprise Enablement

OTTAWA | TORONTO | NEW YORK | ORLANDO | AUSTIN | AMSTERDAM | LONDON

© 2015 Stream Integration. All rights reserved. info@streamintegration.com