Fraud Control - IT Interventions and Solutions

Key considerations for the functional solution
• Understand the difference between abuse and fraud: fraud is knowing, intentional, willful and ongoing, for direct financial gain; abuse is excessive or unwarranted utilization that is potentially not needed
• Provide practical insights to insurers through portfolio analysis and comparison to industry benchmarks

Core Principles
• Focus on obtaining a demonstrable return on investment from the project by prioritizing high-financial-loss practices, such as systematic collusion
• Deliver tools that can be deployed at all levels (broker / agent / insurer / TPA / regulator) and across functions (distribution / underwriting / claims processing)
• A solution that provides a comprehensive data analysis and reporting environment facilitating MIS and fraud analytics reports, to dissect and highlight the patterns, trends, volume and scope of fraudulent claims observed
• Strengthen future data capture initiatives and develop greater data analysis capabilities within the insurance company

Solution Proposed
Components of the proposed solution: Domain Knowledge, Functional Solution, Technical Solution

Solution Proposed – Holistic View (recovered from the holistic-view diagram)
• Functional Solution: MIS & Fraud Detection Reports; Aggregate Level Fraud Modeling (Predictive Modeling, Social Network Analytics, Anomaly Detection, Rules); Real-time Fraud Detection at various stages (Detection at Underwriting, Detection at Preauthorization, Detection at Claims Process Stage)
• Technical Solution – Integrated Data: Operational Data Store (ODS), Data Cubes, Data Marts; Data Integration: Extract, Transform & Load (ETL), Data Quality – Cleansing, Profiling, Data Standardization & Certification
• Source data: Transactional Data (Policy, Member, Claims) and Lookup Data
• Additional requirements: Provider Registration Portal, standardized IDs for providers & employers, ICD-10 coding, procedure codes

Functional Solution: Aggregate-level Fraud Modeling & Analysis using data
Aggregate-level Fraud Modeling – Components
• Predictive Modeling: modeling of the portfolio with various methods, the most suitable to be selected based on results: logistic regression, decision trees, neural networks
• Social Network Analysis: analysis of possible collusion between industry players, e.g. collusion between TPAs and providers, collusion between a TPA and an employer, mis-selling or concealment by an intermediary, agent or underwriting office
• Anomaly detection rules: evaluation from a clinical or business-logic standpoint to detect anomalies, e.g. ICD-PCS mismatch, age-gender appropriateness, capacity / infrastructure appropriateness
• Outlier detection rules: identify outliers, compared to industry experience, for scrutiny, e.g. ICD code versus total charges and charge break-up, past payments for the same ICD code to the same provider
Flexibility: predictive models for fraud detection should be built using different statistical methods, with the final models determined after analyzing the results. Focus on enhancing predictive value (and reducing false positives) and on continuous improvement as new data fields become available. A minimal rule sketch follows below.
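The sketch below illustrates how two of the listed checks (age-gender appropriateness and the ICD-to-total-charges comparison) could be expressed as code. It is a minimal sketch only: the column names (member_gender, icd_code, total_charges) and the gender-specific ICD lookup are hypothetical placeholders, and a portfolio z-score stands in for a true industry benchmark.

```python
# Illustrative sketch of rule-based anomaly/outlier checks; field names and
# reference values are hypothetical, not the actual data model of the solution.
import pandas as pd

# Example gender-specific ICD codes (illustrative values only)
GENDER_SPECIFIC_ICD = {"O80": "F", "C61": "M"}

def age_gender_anomalies(claims: pd.DataFrame) -> pd.Series:
    """Flag claims whose ICD code is implausible for the member's gender."""
    expected = claims["icd_code"].map(GENDER_SPECIFIC_ICD)
    return expected.notna() & (expected != claims["member_gender"])

def charge_outliers(claims: pd.DataFrame, z_threshold: float = 3.0) -> pd.Series:
    """Flag claims whose total charges deviate strongly from the portfolio
    experience for the same ICD code (a simple stand-in for an industry
    benchmark comparison)."""
    stats = claims.groupby("icd_code")["total_charges"].agg(["mean", "std"])
    mean = claims["icd_code"].map(stats["mean"])
    std = claims["icd_code"].map(stats["std"])
    z = (claims["total_charges"] - mean) / std.where(std > 0)
    return z.abs() > z_threshold

def flag_for_scrutiny(claims: pd.DataFrame) -> pd.DataFrame:
    """Combine rule outputs into a single referral list for manual scrutiny."""
    claims = claims.copy()
    claims["age_gender_flag"] = age_gender_anomalies(claims)
    claims["charge_outlier_flag"] = charge_outliers(claims)
    return claims[claims["age_gender_flag"] | claims["charge_outlier_flag"]]
```

The same pattern extends to the other rules listed above (ICD-PCS mismatch, capacity / infrastructure appropriateness, past payments to the same provider) once the corresponding reference data is available.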
Proposed Technical Solution

Key Considerations for the Technical Solution
• Need for a platform that provides end-to-end capabilities, covering data integration, statistical modeling, fraud detection, and BI & reporting
• Choose a tool that supports advanced analytic approaches and fraud risk scoring techniques such as anomaly detection and social network analysis

Core Principles
• Build a comprehensive Operational Data Store (ODS) to hold persistent source-system data in a standard model for reporting and analytical requirements
• A unique approach that combines modeling techniques to leverage the strengths of each technique, be it logistic regression, decision trees or neural networks
• A solution that provides a comprehensive data analysis and reporting environment with MIS and fraud analytics reports, to dissect and highlight the patterns, trends, volume and scope of fraudulent claims
• A solution that caters to current requirements and is extensible to other lines of business
• Leverage relevant industry-specific frameworks, methodologies and processes to ensure flawless and timely delivery with the utmost quality

Technical Solution Overview
The integrated data will consist of the Operational Data Store (ODS), data cubes built using SAS tools, and data marts. This data will provide the base for the models and reports to be built for the solution.
(Tool labels recovered from the architecture diagram: SAS FFI with SAS Enterprise DI for data integration; Oracle Enterprise Edition plus SAS cubes for the integrated data layer; SAS FFI with Base SAS, Enterprise Miner and OLAP Cube Studio for modeling and cube building; SAS FFI with SAS Enterprise BI for reporting; fraud suspect extracts and investigation feedback flowing back into the solution.)

Model Development & Modeling Techniques
The DISC Analytics Methodology (Define, Simulate, Investigate, Consult) closely weaves the business outcome with the statistical techniques.
Modeling workflow (recovered from the flow diagram): data extraction from different sources → claims data merging → data cleaning → exploratory data analysis → outlier detection → claims segmentation → identification of the variables for the model → data split → predictive modeling → check whether the model is adequate (if not, fine-tune the model) → score the validation data → examine the predictive ability → if the results are satisfactory and adequate, move to results and insights; otherwise revisit the model.

Modeling Techniques proposed

Logistic Regression
• A statistical technique used to estimate the likelihood of a binary or categorical outcome from multivariate inputs
• Logistic regression can estimate the probability of a fraudulent claim in the next few months

Decision Tree
• A decision tree divides the population into segments with the greatest variation in the objective variable at each split; the algorithms usually work top-down
• Decision trees support identification of the segments that are more likely to have a concentration of fraud
• The key variables and logic that identify fraud concentration in the decision tree can also be used in the neural network for instant fraud detection

Neural Network
• An artificial neural network is a non-linear data analysis technique used to identify complex relationships between inputs and outputs
• By detecting complex non-linear relationships in the data, neural networks can help make accurate predictions about real-world problems
• Integrated learning: the significant logic coming out of the decision tree and logistic regression models can be fed into the neural network
• This enables detection rules and techniques to be continuously monitored and refined, reducing false positives and helping to identify and respond to emerging threats
A comparison sketch of the three techniques follows below.
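To make the "select the most suitable method based on results" step concrete, here is a minimal sketch that fits the three proposed techniques on a training split and keeps the one with the best validation performance. The feature matrix X, labels y, and the AUC criterion are assumptions for illustration, and scikit-learn is used only as a stand-in; the deck itself proposes SAS tooling for this step.

```python
# Illustrative model-comparison sketch: fit the three candidate techniques,
# score the validation split, and keep the best performer. Inputs X, y and the
# AUC criterion are hypothetical, not the solution's actual specification.
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

def select_fraud_model(X, y, random_state=0):
    """Fit logistic regression, decision tree and neural network models and
    return the one with the highest validation AUC (a stand-in for the
    'examine the predictive ability' step in the modeling workflow)."""
    X_train, X_valid, y_train, y_valid = train_test_split(
        X, y, test_size=0.3, stratify=y, random_state=random_state)

    candidates = {
        "logistic_regression": make_pipeline(
            StandardScaler(), LogisticRegression(max_iter=1000)),
        "decision_tree": DecisionTreeClassifier(
            max_depth=5, random_state=random_state),
        "neural_network": make_pipeline(
            StandardScaler(),
            MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                          random_state=random_state)),
    }

    scores = {}
    for name, model in candidates.items():
        model.fit(X_train, y_train)
        # Probability of the fraud class on the validation data
        scores[name] = roc_auc_score(y_valid, model.predict_proba(X_valid)[:, 1])

    best = max(scores, key=scores.get)
    return best, candidates[best], scores
```

In practice the "is the model adequate" loop in the workflow above would iterate on variable selection and fine-tuning before the final choice is made.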
Illustrative slides: Exploratory Data Analysis, Decision Tree Analysis, Neural Networks

Cognizant's Fraud Management Workbench
Sixth Sense Solution – Fraud Management Workbench
The Fraud Management Workbench will enable SIU users to orchestrate the complete process of investigating a suspect claim referred to the SIU, analyze the claim on its merits, and take it through to its logical closure.

Functional Features
• Automated and manual claims fraud referral from the claims system
• Automated case assignment based on SIU user skills and availability
• Automated creation of relevant tasks for each case based on claim type
• Claim fraud scoring with a 360-degree claims view
• Assignment and tracking of outside investigators
• Compliance alerts and reports
• Regulator referral utility

Technical Enablers
• Cloud ready
• Lightweight case management / workflow layer
• Rules engine interface
• Scoring engine to interpret predictive models and provide a claim fraud propensity score (see the sketch below)
• Multi-format claim investigation evidence updates (images, audio files, GIS data, etc.)
• Third-party reports interface
• Discussion forums and chat functionality to consult SIU experts
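As a concrete illustration of two of the enablers above, here is a minimal sketch of how a scoring engine could turn a fitted predictive model into a claim fraud propensity score, and how a referred claim could be auto-assigned to an SIU user by skills and availability. The class and field names (SIUUser, SuspectClaim, open_cases) are hypothetical and do not represent the workbench's actual API.

```python
# Illustrative sketch of workbench scoring and case assignment; names and
# structures are placeholders, not the Fraud Management Workbench API.
from dataclasses import dataclass

@dataclass
class SIUUser:
    name: str
    skills: set            # e.g. {"health", "motor"}
    open_cases: int = 0    # simple proxy for availability

@dataclass
class SuspectClaim:
    claim_id: str
    line_of_business: str
    propensity_score: float = 0.0   # filled in by the scoring step

def score_claim(model, features) -> float:
    """Interpret a fitted predictive model (e.g. the one selected earlier)
    and return a 0-100 fraud propensity score for one claim."""
    return float(model.predict_proba([features])[0, 1] * 100)

def assign_case(claim: SuspectClaim, users: list) -> SIUUser:
    """Route a referred claim to the least-loaded SIU user whose skills
    match the claim's line of business."""
    eligible = [u for u in users if claim.line_of_business in u.skills] or users
    chosen = min(eligible, key=lambda u: u.open_cases)
    chosen.open_cases += 1
    return chosen
```

A production workbench would layer this logic on the rules engine and the case management / workflow components rather than on in-memory objects.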