
Summary EBM

Week 1: Introduction to evidence-based management
Article: Evidence-based-management – the basic principles
What is evidence based practice?
The core idea: good-quality decisions should be based on a combination of critical thinking and the best
available evidence.
Definition of evidence-based practice=
It is about making decisions through the conscientious, explicit and judicious use of the best available evidence
from multiple sources by:
1. Asking: Translating a practical issue or problem into an answerable question.
2. Acquiring: systematically searching for and retrieving the evidence.
3. Appraising: critically judging the trustworthiness and relevance of the evidence.
4. Aggregating: weighing and pulling together the evidence
5. Applying: Incorporating the evidence into the decision-making process.
6. Assessing: evaluating the outcome of the decision taken
To increase the likelihood of favourable outcomes.
What counts as evidence?
Evidence means: information, facts or data supporting a claim or an assumption.
• May come from research suggesting generally applicable facts about the world, people or organizational
practices.
• Also, from organisations
• From professional experience
Why do we need evidence-based practice?
There are barriers to evidence-based practice, such as that few practitioners have been trained in the skills required
to critically evaluate the trustworthiness and relevance of the info they use. Information may also be difficult to
access. And practitioners are often not aware of the current scientific evidence available.
What source of evidence should be considered?
Start by: What is the available evidence? (What is known)
Evidence from four sources should be taken into account:
• The scientific literature
• The organization (data, facts and figures)
o Hard (numbers), soft (people experience)
• Practitioners (professional experience)
o Accumulated over time through reflection on outcomes
o ‘tacit knowledge’
• Stakeholders (values and concerns of people affected by the decision)
o Provides a frame of reference to analyse evidence from other sources
Why do we have to critically appraise evidence?
Critical appraisal always involves asking the same basic questions:
• Where and how is evidence gathered?
• Is it the best available evidence?
• Is there enough evidence to reach a conclusion?
• Are there reasons why the evidence could be biased in a particular direction?
• How are these figures calculated? Are they reliable?
• How was the data collected?
• How were the outcomes measured?
Why focus on the best available evidence?
The fundamental principle of evidence-based practice is that the quality of decisions is likely to improve the more
we make use of trustworthy evidence.
• Using limited quality evidence is still better than using no evidence as long as we are aware of the
limitations.
Some common misconceptions of evidence-based practice
Misconception 1: Evidence-based practice ignores the practitioner’s professional experience
• Directly contradicts the definition of evidence-based practice
• Doesn’t mean one source is more valid than another
• Evidence based practice is about using evidence from multiple sources rather than just relying on one
Misconception 2: Evidence-based practice is all about numbers and statistics
• It is not about doing statistics but statistical thinking is an important element.
Misconception 3: Managers need to make decisions quickly and don’t have time for evidence-based practice
• Vast amount of decisions are made over long time periods and require considerations of legal,
financial, strategic, logistical or other issues which all take time.
Misconception 4: Each organization is unique, so the usefulness of evidence from the scientific literature is
limited
• Organisations often repeatedly face similar issues
• Neither are perfectly unique or perfectly alike
Misconception 5: If you do not have high-quality evidence you cannot do anything
Misconception 6: Good quality evidence gives us the answer to the problem
• Evidence doesn’t speak for itself
• To make sense of evidence we need an understanding of the context and a critical mindset
• Evidence does not tell you what to decide but it does help you to make a better-informed decision
Lecture 1
Why do evidence-based practice?
• We always use evidence as a base for our decisions but that’s not the same as adopting an evidence-based approach
• All practitioners use evidence in their decisions but
o Pay limited attention to quality and relevance of evidence
o Use limited sources and types of evidence
o Are easily pushed off track when trying to make better-informed decisions
• Decisions are about important problems/opportunities and most likely solutions should be based on
evidence
• 4 sources of evidence
Paradox of evidence-based management
• Seems nobody really disagrees with evidence-based practice in principle…some evidence is better than
nothing
• So why isn’t it happening much or at all?
• It has barriers
Barriers: What gets in the way?
• Individual and group cognitive biases
• Fads, fashions
• Managers’ incentives pulling away from evidence-based practice
• Organisational politics/power
• Poor logic models/theories of change
Misconceptions of EBP: not necessarily a lack of good evidence; it’s about the best available evidence and a
process
• A lack of focus on a specific and well-identified problem
• Contexts in which practitioners’ practice is not evaluated
• Good intentions
• Erroneous belief that we already are evidence based or evidence based enough
• Access to only some type of evidence
• The perceived need for speed
• Crude benchmarking
Example: management fads and fashions
• New and exciting ideas practices and techniques
• Adopted quickly and widely across many countries and types of organisations
• Can strongly shape what organisations do (and do not do)
• Tend to disappear after a few years to be replaced by … another new and exciting idea
What’s the point of evidence-based practice?
• EBP is not about the evidence itself but rather something to help us…
o Do stuff that addresses important business/organisational/social problems and opportunities
(rather than trivial issues)
o Do stuff that is more likely to work rather than stuff that is unlikely to work or has little effect
• EBP is also ethically-based practice as it
o Maximises chances of doing good
o Minimises chances of doing harm
o Takes into account the values (including ethical values) of stakeholders as a key part of the evidence
picture
Some tips
• Always start with and spend more time getting evidence about and understanding the problem (or
opportunity)
• Beware of the tendency to look too quickly for solutions or solutioneering
• Ask the question ‘why?’ a lot
• Be healthily sceptical of cool and cutting-edge fads and fashions
• Remember we are full of biases – don’t just believe what you see and hear but check it out using
evidence
Week 2: Organisational decision making
Article: Making evidence-based organizational decisions in an uncertain world (Rousseau, 2018)
Organisational decisions are similar to but different from other decisions
Organisational decision makers work with and through others. And all judgements and decisions are affected by
the roles people play.
• Organisational decision makers tend to pay attention to certain information because it fits their role and
to avoid other information if it seems irrelevant or exposes them to risk
• Face complexity and uncertainty and highly dynamic situations
Organisational practices can repair decision bias
Using appropriate organisational practices helps overcome both well-established individual biases and biases
peculiar to organisations.
• It’s easier to recognise biases in other people than in ourselves
Six organisational biases and their targeted repairs:
1. Solving the wrong problem (idea led not problem driven)
o start by asking lots of questions
o start with search and get the right decision frame
o deliberate search helps keep an open mind
o accept uncertainty – assess the problem
2. Ignoring politics (sponsor biases, pet projects)
o Politics = implicit or explicit use of personal power and resources to influence others
§ Positive: integrative goals
§ Negative: pet projects, ignoring sensitive information, involving only some
stakeholders and leaving others out
o Addressing the politics of the decision
o Politics can be positive and negative
o Politics reflect the moral component of decision making
o Legitimate a de-biased decision focus
3. Considering just one option (pet project, gut feeling)
o Entertaining multiple options
o Systematic attention to each alternative’s pros and cons
4. Focusing on a single outcome (narrow view of success)
o Using several outcomes of decision success and effectiveness
5. Narrow interests dominate (stakeholders ignored)
o Broaden the kinds of stakeholders considered and involved
o Reflect the broad array of stakeholders
o Politics: including those outside the organisation
6. Relying on easily available information (Stories and hippos)
o Tradition (past experience)
o Authority (powerful people)
o Broadening sources of information to include scientific evidence, organisational data, expert
judgement and stakeholder concerns
o Critical mindset is key for good quality evidence
Implications
Organisational decisions have better odds of success if the above six de-biasing processes are used
• Synergistic and work well together
• In combination they help decision makers to confront and manage incomplete information, ambiguity
and uncertainty.
• These six responses can also promote positive politics
Decision processes matter: Three evidence -based decision processes
Processes matter because decision makers cannot rely on the results to know if they made a good decision.
1. Routine decisions
Characterised by stability and clear cause-effect understanding. Repeating conditions.
2. Non-routine decisions
Complicated situations where there is no full information, but the information exists somewhere. Well-structured decision processes: involving knowledgeable others in evidence gathering and appraisal,
developing alternatives.
3. Truly novel decisions
Critical information is required but doesn’t exist because historical evidence is irrelevant. Evidence must
be generated by action, learning and experimentation.
• Two types:
o Complex situations with new conditions
o Chaotic situations
Routine decisions involving known knowns
Causal connections are clear. The key is to gather facts on how the decision is currently being made, evaluate
its outcomes and then redesign the process periodically to get good outcomes more consistently.
• Variety of sources can help
• Accessibility of data may be an issue
Non-routine decisions with known unknowns
Here the critical information needed must first be identified
• Search and discovery processes are essential
• Political implications need to be addressed
• Multiple alternatives
• Multiple success criteria
• Identify relevant stakeholders
• Multiple sources of evidence needed
Also, can conduct periodic after action reviews (AARs)
Ask:
• What did we set out to do?
• What actually happened?
• Why did it happen?
• What are we going to do next time?
Novel decisions involving unknown unknowns
Involve technical as well as political uncertainty. Also, ambiguity.
Two kinds of unknown unknowns:
• Complex situations where the unknowns reflect new kinds of conditions and events whose underlying
order can ultimately be understood
• Chaotic situations where there may be more unknowables
o Require resilience
Article: Is decision-based evidence making necessarily bad? (Tingling & Brydon, 2010)
Evidence is not as frequent an input to decisions as suggested by the business press. Not all decisions use
evidence in the same way. Evidence can be used to make, inform or support a decision. Managers need to be
aware that evidence is shaped by subordinates to meet perceived expectations of company leaders.
Why does decision-based evidence making occur?
An algorithmic approach is well suited to highly structured decision problems in which the ends and means are
well understood.
• But here the role of evidence is unclear
Make a decision
Primary risk of making decisions by relying exclusively on hard evidence is that the algorithms and models used
to transform evidence into a decision provide an incomplete or misleading representation of reality.
Inform a decision
Evidence is used to inform a decision whenever the decision process combines hard, objective fact with
qualitative inputs.
• Evidence-based inputs either confirm or disconfirm the decision maker’s initial subjective beliefs and
preferences
Support a decision
Evidence is used to support a decision whenever the evidence is gathered or modified for the sole purpose of
lending legitimacy to a decision that has already been made.
• Depends on norms which are becoming more explicit and rigid
Is decision-based evidence making necessarily bad?
Can be effective when the audience is external and the manufactured evidence supports the organisation’s best
guesses about a complex and unpredictable decision environment.
What can be done to lessen the negative impact of decision-based evidence making?
Decision makers should have a clear understanding of the different roles evidence can and should play in a
decision process.
The following guidelines can help:
• Understand the nature of the decision problem and assess the potential contribution of formal evidence
to the quality of the decision process (for some decisions intuition is better than old evidence)
• Weigh the risks, costs and benefits of evidence when advocating an evidence-based approach to
decision making
• Differentiate between internal and external audience when engaging in decision-based evidence making
• Ensure that the objective evidence painstakingly gathered by analysts is reflected more often than
not in the decisions of the organisation
Lecture 2: Organisational decision making
Discussion of the Rousseau article
Aim of the article:
• Promote better organisational decision making
How
• By offering evidence-based management: What are the insights from the literature?
Rousseau (2018) three key ideas:
1. Organisational decisions are different from individual decisions – may introduce new biases
2. Bias in organisational decision and individual judgment may be offset by six evidence-based practices
in organisations
3. These practices are helpful in managing three types of decision processes
Organisational decisions
Note: About half of organisational decisions fail to achieve their goals
Are different from individual decisions in three ways:
• Social setting: participants can hold different information
• They may be de-biased in the organisation
• It is possible to install appropriate decision processes
Bias
• Time bias
o Hyperbolic discounting: favouring the immediate over future outcomes
• Ikea effect = how people tend to value an object more if they make it themselves
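Hyperbolic discounting can be made concrete in a small sketch. The discount rate k, the amounts and the delays below are invented for illustration, not taken from the lecture:

```python
# Hyperbolic discounting: perceived value V = A / (1 + k * delay).
# k is an illustrative per-day discount rate, not an empirical estimate.

def discounted_value(amount, delay_days, k=0.05):
    """Perceived present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

# Choice today: 100 now vs 110 in a week -> the immediate option wins.
now = discounted_value(100, 0)
later = discounted_value(110, 7)
assert now > later

# Same choice pushed a year into the future -> the preference reverses,
# which is the signature of hyperbolic (not exponential) discounting.
far = discounted_value(100, 365)
far_plus_week = discounted_value(110, 372)
assert far_plus_week > far
```

This preference reversal is why immediate outcomes get systematically overweighted in decisions.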
Overcoming your own bias?
• E.g. more deliberate thinking when thinking in another language
How do organisational decisions differ from individual (or professional) decisions?
Is management a profession?
• Yes, because…education?
• No, because…expertise on few decisions (e.g. expert judgment engineer on building bridges,
recognizing patterns, 10000 hrs practice rule)
Key idea 1
• Managers, teams, workgroups:
o Work with and through others
o May be put under pressure by politics
o Are accountable for their own and others’ decisions
o Make many types of decisions, often concurrently
o Affect many stakeholders
o Face high uncertainty (missing information, difficulty interpreting situations, changes in the
environment)
o Usually have few decision supports/protocols
• Uncertainty: develop skills to confront
o Foreseeable or unforeseeable
o Political - no clear course of action
o Environmental change: physical, economic, technological, social forces
o OR combination of the three
Key idea 2 (repairs)
Easier to detect biases in others than in ourselves: groups may correct individuals (e.g. valuing own experience
more than data or scientific findings). Example: favouring a candidate from the same university.
Use of best evidence: what counts as best evidence is contingent on its appropriateness to the question being
asked. E.g. effectiveness vs experience
Key idea 3: decision process matters: deciding how to decide
What is a decision process?
• A set of ideas that begins with identification of an issue or need and results in action (conventional)
• However: organisations are complex, so the decision process is often dynamic: deciding on the go,
taking action and learning, acting again
Why do decision processes matter?
• Results of a decision often not available – takes time, so learning from feedback difficult
• Processes appear to predict good outcomes, so improvement is possible there
Three types of decisions
• Routine – known knowns
o Stable
o Clear
o Cause and effect understanding
o Skills:
§ Why routinize: get consistent results, find ways of improving, free up time for other
activities
§ How: mental checklists; agile protocols – use human judgement to go off script if
needed. Understanding the protocol is crucial
§ Danger: overlooking ways to improve
• Non-routine – known unknowns
o Complicated decisions but information can be gathered to get to a good outcome
o Structured decision process can be conducted
o Skills:
§ 1. Comprehend problem
§ 2. Acknowledge political implication
§ 3. Define multiple alternatives
§ 4. Evaluate on multiple criteria.
§ 5. Involve stakeholders
§ 6. Evidence – 4 sources
o AARs = after action reviews
• Truly novel – unknown unknowns
o So vague that the problem cannot be identified
o Improvisation
o Two types:
§ Complex situations with new conditions: sensemaking and learning by doing
• Critical evidence does not exist, historical evidence is irrelevant, crisis
• Evidence comes from action, learning, experimentation
§ Chaotic situations: resilience, muddling through process, supported by
experimentations, acting reflecting
• Simulating crisis situations: train people in slow training, combined with
training under realistic conditions
• Learning after surprise event
• Small trials and experiments: explore what works
Learning cycle
Master each process to improve outcomes. Evaluate and update. Understanding cause and effect may not always
be possible.
Conclusion Rousseau (2018)
• Organisational decision making is flawed due to many types of bias
• Decision processes in organisations can be trained
• Improvement of decision processes may help to make better decisions – different types for different
types of decisions
• EBP matches best with non-routine
Contrast with Tingling and Brydon (2010)
• Politics and strategic
• Side effects of rhetoric of evidence-based decision making
o Ceremonial decision process
o Investment in information systems and analytical tools wasted (self-censorship, presenting
confirming evidence)
o Example: bluffing about rationality for new platform to convince others in the sector
• Guidelines:
o 1. Understand the nature of the decision problem
o 2. Weigh costs and benefits of an evidence-based approach
o 3. Differentiate between internal and external audiences when engaging in evidence making;
an internal audience is difficult to fool
o 4. Do not disregard evidence unless you have to (!!), i.e. when it serves a strategic purpose.
Barends and Rousseau (2018): why EBM is difficult:
• 1. Scientific evidence is hard to find
• 2. Scientific evidence is often not updated in courses
• 3. Scientific evidence is not made accessible
Part 2
Quality of scientific data:
• Systematic review/meta analysis: multiple studies provide an overview of evidence accumulated over
time
• Theoretical (narrative) review: often reviews the ideas rather than evidence as empirical studies
• Single studies
o Cause and effect
o Controls
o Before and after
Three conditions needed to establish causality
• Covariation: change in the manipulated variable goes together with change in the other variable
• Time: X precedes Y
• Control/isolation: other influences held constant: only this variable and no other can be the
cause of the effect. (outside the lab?)
Research strategies and levels of evidence
• Internal validity
o Research may contain bias
§ = distortion in the outcome
§ Caused by systematic errors in the way the study has been designed
§ Scientific method helps to guide reasoning and make bias less likely
o The degree to which alternative explanations for the research results are possible
• External validity: do the findings generalise to people and situations outside the study?
o However, in applied fields, experiments are not always possible
o The isolation of only a few variables in an experiment may make the study unrealistic
Experiment
• Theoretical predictions (hypothesis) are made
• Independent variables are varied systematically (manipulated) to test whether this manipulation or
treatment has an effect in comparison to a control condition in which the treatment was not given
• Random allocation of the sample to experimental conditions
• Controlled: holding other variables constant by use of randomization
• Observing the effect of independent on dependent variables
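The logic of random allocation can be sketched in a small simulation (all numbers invented): because assignment to conditions is random, unobserved baseline differences balance out across groups, so the difference in group means recovers the causal treatment effect:

```python
import random

random.seed(42)

def run_experiment(n=2000, true_effect=2.0):
    """Simulate a randomized experiment and estimate the treatment effect."""
    treated, control = [], []
    for _ in range(n):
        baseline = random.gauss(10, 1)   # unobserved individual differences
        if random.random() < 0.5:        # random allocation to conditions
            treated.append(baseline + true_effect)
        else:
            control.append(baseline)
    # Difference in group means estimates the causal effect.
    return sum(treated) / len(treated) - sum(control) / len(control)

estimate = run_experiment()
assert abs(estimate - 2.0) < 0.3   # close to the true effect of 2.0
```

Without the random allocation (e.g. if healthier people self-selected into treatment), the same difference in means would mix the treatment effect with baseline differences.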
Experimental designs:
Such as between-subjects and within-subjects designs.
Quasi experiment
• Non random assignment but there is a control group
• Implies that no causal inference can be made from the data but at least a comparison with a group that
did not receive the treatment can be made
Survey
• Structured collection of data from a sizable population
• Questionnaires, structured observation, structured interviews
• Advantages:
o reach many people in a short period of data collection
o quantitative analysis
• Disadvantages:
o limited time for respondents, or they do not participate
o superficiality: it cannot go into great depth because the observations are standardized and do
not go into specifics for specific respondents
o it is difficult to design a good survey: use validated measures designed by others.
Standardization of measures is good for comparisons with other studies: exactly the same
measurement is used
Case study/multiple case studies
• investigation of a particular contemporary topic, within its real-life context, using multiple sources of
evidence
• triangulation = the comparison of multiple sources to find out whether the same can be concluded
• often includes qualitative data, rich descriptions
• multiple cases may be compared for contrast, illustration or other reasoning
Note on qualitative research methods:
• low in internal validity
• reason to use it:
o exploratory studies
o need for an in-depth, detailed study to gain understanding; richness of qualitative info necessary
o natural setting is crucial
o large sets of tests, data mining is possible
Week 3: data driven decision making
Article
What contributes to making HR analytics a management fad?
List of analytic pitfalls that will contribute to making it a fad:
a. Lack of analytics about analytics
b. Mean/end inversion or data fetish. Some are enamoured with analytics thinking that more data is
always better. You don’t necessarily need more data but the right data.
c. Academic mindset in a business setting. You need to start with a deep understanding of business
challenges not with a theory you are testing. Need to understand the difference between academia and
practice.
d. HR analytics run from an HR Centre of Expertise (CoE). Impactful HR analytics is more about strategic
business focus than random patterns in big data.
e. A journalistic approach to HR analytics. HR analytics can be misused to maintain the status quo and
drive a certain agenda, i.e. you look for data supporting your story. But take from the journalistic
approach that clear storytelling is needed.
Our suggestions for moving HR analytics from fad to an ongoing part of management decision making
• Start with the business problem
• Take HR analytics out of HR. When HR analytics matures it initially starts cooperating more with other
departments’ teams and eventually becomes part of cross-functional/end-to-end analytics looking at the
whole value chain.
• Remember the ‘human’ in human resources. People and the organisation are not completely rational.
• Train HR professionals to have an analytical mindset
Article 2: The double-edged sword of big data in organizational and management research: a review of opportunities and risks (Wenzel
& Van Quaquebeke, 2018)
Characterising big data
Three central characteristics (Laney)
1. Volume
2. Variety
3. Velocity
Big data can be defined as observational records that may be exceptionally numerous, highly heterogeneous
and/or generated at a high rate, and systematically captured, aggregated and analysed to useful ends.
Drivers of big data:
1. Instrumentation
2. Interaction
3. Interconnection
Big data in organizational management research
Volume = number of observations under investigation.
• Opportunities
o …for universal inferences: universal or highly represented samples
o ….to enhance effect detection and model granularity: increasing the number of datapoints is
good for the statistical power of a test. Allows investigation with both breadth and depth
§ Greater analytical efficacy
o ….to discover
§ Process may be aided by integrative, computational approaches that automate the
construction and fitting of models from nonparametric
§ Association rules can be developed using data where conceivable attributes are
determined to either be present or absent
§ Make discoveries using massive textual data
§ Graphical portrayal of deviation etc
• Risks
o …of biased sampling: a big biased sample is less informative than a small representative one.
It occurs because not every single person in the world will be online
o …Of spurious relationships
§ Falsely rejecting null hypothesis
o Risk of analytical dilemmas
§ Too many dimensions for the data set
Variety = the heterogeneity of data modalities that are open for investigation; it is a function of the many
autonomous sources and means by which reality manifest in the digital realm
• Opportunities
o …to triangulate
§ Using many different methods and measures, which enables researchers to increase the
efficacy of their findings
§ Use both quantitative and qualitative data
o …to capture in situ signals
§ Big data era affords more unobtrusive and faithful measures that can address
methodological limitations
o …for perspective and reconciliation
§ Metadata: more global information associated with an entity or case, maybe obtained
across multiple data sets that could help link data sets
• Risk
o ...of deceptive data quality
§ Observations may not always have meaning assigned to them and may not produce
dependable data
§ Data affected by inherent technological and institutional structures
§ Importance of establishing internal validity and reliability
o …of privacy breach
§ At the expense of people’s privacy, their ability to control their own conception and its
expression
§ Unauthorized entities gaining access to data
§ Questions about informed consent
§ Ways for privacy:
• Clear legal, statutory and social rules that govern data owners’ control over
data collection etc.
• There may be exclusive or shared bigdata ownership
• Transparency and control
• Informed consent is desirable but not always achievable
• Risks can be eliminated by for example volunteered data
• Shared data can remain partially private
• Raw data can be obfuscated
• Computational means can decrease privacy risk (secure systems etc)
o …risk of capability lack
Velocity = the speed at which data under investigation accumulate; a function of the rate by which a
phenomenon is quantified or sampled into a digital object and then transmitted and retrieved.
• Opportunities
o …for time series and causal analysis
§ Big data affords data sources that can sample parameters without end and at
unprecedented rates, resulting in longer time series with reduced intervals between
signals.
§ Time series can support causal claims
o …to make research more practical
• Risks
o …of computational constraints
§ More data being stored than can be processed
Lecture 3: Data driven decision making
What is big data?
Big data (Wenzel & Van Quaquebeke, 2018)
Meaning varies strongly depending on the context it is used in
Made possible by:
• Technology
• Algorithms
• Myths
Key proposition: Traces of data we leave behind can help understanding patterns of human life.
3 Vs
• Volume
• Variety
• Velocity
Needed: Instrumentation, interconnection, interaction
Volume
Opportunities that volume offer
• Massive data: both tall (many records N) and wide (many parameters per record p)
• May transcend sampling (population level data)
• May help to detect effects (statistical power): split (holdout sample) and validate
• Discovery (automate the construction and fitting of models) (association rule learning; if-then rules) in:
o Numerical data
o Text data
o Visual data
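The 'association rule learning; if-then rules' idea above can be sketched as support/confidence counting over transactions. The transactions and item names below are toy data invented for illustration:

```python
# Toy association-rule mining: support and confidence over transactions.
transactions = [
    {"laptop", "mouse"},
    {"laptop", "dock"},
    {"laptop", "mouse", "dock"},
    {"mouse"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    hits = sum(1 for t in transactions if itemset <= t)
    return hits / len(transactions)

def confidence(antecedent, consequent):
    """P(consequent | antecedent): how often the if-then rule holds."""
    return support(antecedent | consequent) / support(antecedent)

# The rule "laptop -> mouse" holds in 2 of the 3 laptop transactions.
assert support({"laptop"}) == 0.75
assert round(confidence({"laptop"}, {"mouse"}), 2) == 0.67
```

Rule-mining algorithms automate exactly this counting over far larger sets of attributes, keeping only rules whose support and confidence exceed chosen thresholds.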
Risks of volume data
• May still be biased
o Only online (WEIRD)
o Or not applicable
o Bot data already systematically distorted
• Spurious relationships (in particular: decision on significance level)
• Analytical dilemmas (missing data)
→ External validity: range restriction/omitted variables bias
→ Data origin: who was excluded, less visible, untruthful or not real
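The 'split (holdout sample) and validate' tactic and the spurious-relationships risk can be demonstrated together on purely simulated noise (all data invented): with many candidate variables, some correlate with the outcome by chance in one half of the data, but such 'findings' mostly evaporate on the holdout half:

```python
import random

random.seed(7)
n, n_vars = 200, 200

def corr(xs, ys):
    """Pearson correlation of two equal-length lists."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

# Outcome and all predictors are pure, unrelated noise.
outcome = [random.gauss(0, 1) for _ in range(n)]
noise_vars = [[random.gauss(0, 1) for _ in range(n)] for _ in range(n_vars)]

half = n // 2
threshold = 1.96 / (half ** 0.5)   # ~5% false-positive cutoff for |r|

# Variables that look 'significant' in the first half of the data...
hits = [v for v in noise_vars if abs(corr(v[:half], outcome[:half])) > threshold]
# ...mostly fail to replicate on the holdout half.
replicated = [v for v in hits if abs(corr(v[half:], outcome[half:])) > threshold]
assert len(hits) > 0              # chance 'findings' do appear
assert len(replicated) < len(hits)
```

About 5% of the 200 pure-noise variables clear the threshold in the training half; validating on held-out data filters most of them out.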
Variety
Opportunities of variety
= data on space, time, physiology, kinetics, expression and ambience
• Triangulation (combined indicators may be better predictors than single ones)
• In situ ‘unobtrusive’ measures, rather than asking people
• Perspectives and reconciliation (metadata, network data)
Risks of variety
• Uncertain whether data reflect a real phenomenon (validity/correspondence) and whether similar
results under stable conditions can be found (reliability)
• Privacy breach (identity disclosure/distortion/abuse)
o Detailed profiling (discrimination)
o (Re)identification from (anonymized) public data
• Lack of expertise (…how to train scholars)
Velocity
Opportunities of velocity
• Time series and causal analysis (temporal ordering of continuous phenomena/events)
• Practical use for data-driven decision making and exchange between science and practice
Risks
• storage/processing capacity not sufficient (a.o. still sampling)
AI
AI & snake oil
• Perception: genuine, rapid technological progress (ethical concern: face recognition, deepfakes are
accurate)
• Automating judgement: far from perfect but improving
• Predicting social outcomes: fundamentally dubious (inaccurate)
Social outcomes more dubious because:
• Nobody can predict the future
• Regression analysis can cover much of the prediction
• Even human judgement is often ok (but more noise)
How to recognise a management fad
Illustration: HR analytics
• Dangers of analytics becoming a management fad
• How to recognise it
Management fad
= simple, one size fits all, falsely encouraging, novel but not radical, legitimized by gurus in tune with Zeitgeist
What can contribute to management fad in HR?
Rasmussen & Ulrich (2015)
• Lack of analytics on analytics (no critical assessment)
• Data fetish (more data is better)
• Moving in theory without considering questions
• Dustbowl empiricism
• A journalistic approach (evidence making)
Interview with Denise Rousseau
• To make sense of big data you need to be able to develop evidence-based logic models. Logic models
map connections among organisational data, like inputs and attributes of actors, to organisational
practises and outcomes that follow.
• Sometimes this is described as a theory of change. The key idea here is that data analytics is not
dustbowl empiricism. Instead, theory is needed to choose among the myriad ways data could be
analysed, to find a sensible logic that helps detect to whom the outcomes of interest are tied
Rasmussen & Ulrich (2015)
Suggestions
• start with the business problem (actionable)
• involve not only HR
• act on the tendency to reject data that threatens existing beliefs; data have to be sold to have impact
• train people to have an analytical mindset
Framework
• context
• stakeholders
• strategies
o What choices do we need to make? What can we discover and test? What data can we collect
and analyse? Which actions do we recommend on the basis of the data?
Two illustrations of models at Maersk Drilling (Rasmussen & Ulrich, 2015)
1. Leadership and turnover: incorporating both turnover and operational performance
2. Return on investment in training (comparing the ROI of two training programs)
Checklist organisational data (Barends & Rousseau, 2018)
• Is the collection of organisational data based on a logic model?
• Are the data relevant to the decision making?
• Are the data accurate?
• Is the data’s context taken into account?
• Is the data set sufficiently large (stats)?
• Relative or absolute change percentages?
• Not only means but also variance?
• Is the graph accurate?
• What is the regression coefficient based on?
• Are confidence intervals reported?
Illustration 1 (figure): the leadership and turnover model
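The checklist item on relative vs absolute change percentages is easy to illustrate: the same shift can sound dramatic or trivial depending on how it is reported (the numbers below are invented for illustration):

```python
# Suppose turnover rises from 2% to 3% of staff.
before, after = 0.02, 0.03

absolute_change = after - before              # in percentage points
relative_change = (after - before) / before   # as a fraction of the baseline

print(f"absolute: +{absolute_change:.0%} point")  # a 1-point increase
print(f"relative: +{relative_change:.0%}")        # a 50% increase
```

Reporting only "+50%" overstates the change; reporting only "+1 point" may understate it, so the checklist asks for both.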
Big data in HRM: article and guest speaker Shuain Yuan
HRM processes
• Personnel selection
• Performance evaluation
• Organisational culture
Different types of data
• Numbers: personality scores, performance scores, satisfaction, business data
• Videos : (AI) interview recordings
• Emails: contact networks of the organisation
• Others: time active on work systems
Big data in personnel selection: automatic trait evaluation
• Procedures:
Automatic trait evaluation; state of the art
• Building prediction models: predicting personalities from words, phrases and topics
• Regression models are limited because
o They cannot model nonlinear relationships
o They cannot model complex interactions between variables
• Machine learning and/or deep learning models are used as prediction models
• Using single words and phrases is limited because…
o It takes text out of context
• Next-generation text mining tools (e.g. BERT) that are pre-trained on a large corpus
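The first limitation of regression models, nonlinearity, can be sketched with toy data (the U-shaped relation below is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(-3, 3, size=500)
y = x ** 2 + rng.normal(scale=0.5, size=500)  # U-shaped relationship

def r2(y_true, y_pred):
    # Proportion of variance explained by the predictions.
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1 - ss_res / ss_tot

# A straight line cannot capture the curve; adding the squared term can.
lin = np.polyval(np.polyfit(x, y, deg=1), x)
quad = np.polyval(np.polyfit(x, y, deg=2), x)

print(f"linear R^2: {r2(y, lin):.2f}, quadratic R^2: {r2(y, quad):.2f}")
```

Here the nonlinear term is added by hand; machine learning models such as tree ensembles or neural networks address the same limitation without having to specify the right transformation in advance.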
How to evaluate the performance of the algorithms?
• The problem of overfitting
o Fitting the current data too well gives illusory success
o Solution: partition the entire data set into two parts: 80% of the data points as the training set
and 20% as the validation set
o Use the training set to train the algorithm and examine its performance on the
validation set
• Criterion 1: correlations with self-reported and other reported scores
• Criterion 2: language associated with each personality trait
• Criterion 3: reliability
o Whether the estimated personality scores are stable over time
o Solutions: examine whether the estimated personality scores are relatively stable in test and
retest
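The 80/20 partitioning described under the overfitting problem can be sketched as follows; the data and the polynomial models are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 30
x = rng.uniform(0, 1, size=n)
y = 2 * x + rng.normal(scale=0.3, size=n)  # true relation: linear plus noise

# Partition the data set: 80% of points for training, 20% for validation.
idx = rng.permutation(n)
cut = int(0.8 * n)
tr, va = idx[:cut], idx[cut:]

def errors(deg):
    # Train a polynomial of the given degree on the training set only,
    # then measure mean squared error on both sets.
    coefs = np.polyfit(x[tr], y[tr], deg)
    mse = lambda s: np.mean((y[s] - np.polyval(coefs, x[s])) ** 2)
    return mse(tr), mse(va)

train_lo, val_lo = errors(deg=1)    # model matching the true relation
train_hi, val_hi = errors(deg=12)   # overly flexible model

print(f"degree 1:  train {train_lo:.3f}, validation {val_lo:.3f}")
print(f"degree 12: train {train_hi:.3f}, validation {val_hi:.3f}")
```

The flexible model fits the training set better than the simple one, but its much larger validation error exposes the illusory success.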
Big data: privacy issue
• Posts on social media: private or public?
o It is difficult, if not impossible, to remove your digital traces once posted
o Posts and pictures contain information about other people (tags)
o It is not entirely clear what was given out when the consent form was signed
Big data in personnel selection: gamification
Games are used to assess (1) personality, (2) job-related situational judgement, (3) numeric skills, (4) memory and
(5) interpretation of emotion
• Pros
o Games are more engaging
o More difficult to fake good
o Possible to continuously adjust
o More real work-related experience
• Cons
o Costly to build
o Reliability and validity are not always guaranteed
AI and machine learning
Initial candidate screening
• education background: schools, majors, GPA, publication
• career background: work experiences, skills (soft skills and hard skills)
Week 4: Implementing evidence-based decision making in organisations
Article: Implementing big data strategies – a managerial perspective (Tabesh, Mousavidin & Hasani, 2019)
Big data and big data dreams
Big data (here) refers to the large and complex data assets that require cost-effective management and analysis for
value extraction. 4 features characterise big data:
1. Volume = large scale of big data requiring innovative tools for collection, storage etc.
2. Velocity = rate of generation, updates
3. Variety = variation in types of data
4. Veracity = complex structures of big data assets that make them ambiguous, imprecise and inconsistent
Big data analytics cycle
The goal of big data analytics is to enhance organisational decision making and decision execution processes.
• Managers collect data, generate several strategies and evaluate their likely outcomes before making final decisions
• Once implemented the realized outcomes will be evaluated to generate additional information that is
cycled back into subsequent decision-making phases.
The big data analytics cycle is comprised of 4 important phases:
1. Large diverse and usually unstructured data are collected from internal and external sources and
processed using analytical tools and algorithms in order to gain insights.
o These are then interpreted by decision makers and used in the process of decision making
2. Insights from phase 1 are transformed into decisions.
o Done by managers who contextualize the insights generated from their data analysis and attach
meaning to them.
3. Decisions are transformed into specific operational actions - decisions are executed
4. Transformation of decisions into actions generates additional outcomes (data points) which are cycled
back into the process for future efforts.
è The self-perpetuating cycle of big data analytics can significantly benefit organisational decision-making
processes and outcomes. Large volumes of both internal operations data and data collected from external
sources are gathered and transformed into actionable insights.
è Each phase entails specific tasks, requires specific resources and needs critical attention from specific
organisational actors.
o Data scientists are mainly responsible for the technical tasks related to data collection &
analysis (1 & 4)
o & can help the interpretation process by communicating technical findings to managers (3)
o Managers play a primary role in the interpretation and execution phases of the cycle
o Managers are also responsible for the orchestration of efforts in the entire cycle
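The four phases can be sketched as a feedback loop; the function names and the toy "insight" below are hypothetical, only the cycle structure follows the article:

```python
# Hypothetical skeleton of the big data analytics cycle (phase numbers per the article).

def collect_and_analyse(data):        # phase 1: data scientists extract insights
    return {"insight": sum(data) / len(data)}

def interpret_and_decide(insights):   # phase 2: managers contextualise insights
    return "act" if insights["insight"] > 0 else "hold"

def execute(decision):                # phase 3: decisions become operational actions,
    return [1.0] if decision == "act" else [-1.0]  # generating new outcome data points

data = [0.2, 0.5, -0.1]
for _ in range(3):                    # phase 4: outcomes cycle back into the process
    decision = interpret_and_decide(collect_and_analyse(data))
    data += execute(decision)

print(decision, data)
```

The loop makes the self-perpetuating character concrete: each pass adds outcome data that feeds the next round of analysis.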
Barriers to effective implementation of big data strategies
Technological barriers
Range from the costly infrastructures required for big data acquisition, storage and analysis to the shortage of
qualified data scientists and analysts.
• The challenges impact the entire big data analytics cycle
• Implementation of big data initiatives calls for capital investment in building or buying new data
management systems to enable effective storing and analysis of big data
• Unstructured data requires extensive processing
• 66% of organisations are currently unable to successfully fill their available data scientists’ positions
with qualified candidates
The list of challenges extends beyond infrastructure and talent to also include the technical acumen of senior
managers.
• Failures are often tied to management's misunderstanding of the process
Concerns regarding consumer data ownership and privacy pose another technological challenge to many
organisations.
• Organisations need to devise technological procedures to ensure that their activities in the entire big
data analytics cycle comply with the new regulations
Cultural barriers
Organisational culture = set of values, beliefs and attitudes shared by the members of an organisation
Challenges of developing a data driven culture (=the extent to which organisational members make decisions
based on the insights extracted from data).
• Executives rely heavily on prior experience or intuitive hunches instead of following evidence based
and data driven decision processes.
• If top managers do not value data-driven decision making, their behaviour affects the decision patterns at
all levels of the organisation
Difficulties in creating a unified vision about the organisational big data strategy are another major roadblock to
effective implementation.
• Lack of understanding of what big data analytics is and what benefits it can generate
Big data strategy implementation: managerial responsibilities
Three categories of managerial levers that can help mitigate implementation challenges:
1. Structural influences
Refer to existing budgets, plans and control mechanisms that support implementation of a specific
strategy. Managerial structural support for formulated strategies is important.
2. Relational influences
Refer to the interpersonal mechanisms that influence communication of objectives and coordination of
efforts at different levels of the organisation.
o Strategic initiatives should be well understood across the organisation
3. Knowledge influences
Refer to the managerial strategy-specific expertise that contributes to successful implementation
Structural influences: providing continued commitment and support
Full exploitation of big data analytics is impossible without senior and middle management's involvement in the
process. Time also matters: managers should not expect their investment in technology to generate
immediate returns, as this process takes time to pay off.
Managers need to stay committed to data-driven decision making and be persistent in providing structural
support for big data initiatives. This contributes to a data-driven culture.
Providing ample financial support for these activities will address the technological needs.
Relational influences: effective communications and coordination of efforts
Establishing a common understanding of big data goals among managers and their technical teams is an
important step in addressing the implementation barriers and realizing big data dreams.
• Business goals of big data analytics should be communicated effectively to the technical staff
• Through all phases of the cycle the communication channels between managers and data scientists
should remain open.
Coordination of data-driven decision-making process at all levels of an organisation is another important
responsibility of managers.
• Led to new executive positions such as chief data/digital officer (CDO) to streamline the process and
facilitate communication
• Building multiskilled teams is important
Knowledge influences: Gaining fundamental managerial analytics acumen
Managerial analytics acumen is a prerequisite for effective creation of value out of big data investments.
Managers at different levels need to become familiar with the general concepts and applications related to these
techniques.
• Understanding will enable managers to ask the right questions
Effectiveness of data tools depends on how well they fit the problem domain at hand and how much they
empower the organisation in addressing their most pressing needs.
Fundamentals of big data analytics tools and applications
Descriptive tools
Help managers learn about the current state of their business based on data from the past.
• Tools address: what happened – by uncovering existing states or patterns at an aggregate level.
• Can help discover hidden and potentially useful information related to business processes
Predictive tools
Help managers predict probable future states, patterns or outcomes based on analysis of existing data.
• Tools address: what is likely to happen? And what should we do next?
• Enable decision makers to predict the estimated value of a variable of interest using existing data
o Data mining, statistical modelling
• Tools based on regression analysis
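The descriptive/predictive distinction can be sketched with a regression-based tool: summarise the past, then predict the value of a variable of interest for a new case (the numbers below are invented for illustration):

```python
import numpy as np

# Past quarters: marketing spend (in k) vs units sold (illustrative numbers).
spend = np.array([10, 20, 30, 40, 50], dtype=float)
sales = np.array([120, 190, 310, 390, 510], dtype=float)

# Descriptive: what happened (aggregate summary of the past).
print(f"mean sales so far: {sales.mean():.0f}")

# Predictive: what is likely to happen (regression on existing data).
slope, intercept = np.polyfit(spend, sales, deg=1)
forecast = slope * 60 + intercept  # expected sales at a spend of 60
print(f"forecast at spend=60: {forecast:.0f}")
```

The descriptive part only looks backwards at an aggregate level; the predictive part extrapolates the fitted relationship to an unseen value.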
Article: Big data, big decisions: the impact of big data on board level decision making (Merendino et al., 2018)
Theoretical approach
A knowledge based view (KBV) is adopted: data as the key resource in decision making. The paper explores the
key determinants of knowledge at three different levels:
• The director/ individual levels (managerial cognitive capabilities)
o Need to develop the mental models and skills or managerial cognitive capabilities to perceive,
analyse and process changes in the environment (cognitive complexity)
• The board level (behavioural factors)
o Routines are essential; they consist mainly of knowledge that is tacit and hard to codify, and refer to
behaviour that is learned, repeated and rooted in tacit knowledge.
• The stakeholders level (dynamic capabilities)
o Companies are expected to proactively respond to environmental changes by correctly
anticipating the stakeholders' needs
o Dynamic capabilities encompass the ability to secure new and additional knowledge by
empowering decision makers to proactively address environmental changes.
Findings
First core category: cognitive capabilities
Most informants found integrating BD resources and developing the necessary capabilities to be complex.
• Organisations don’t have the capabilities to store, manage and analyse BD
Three subthemes identified:
1. Shortfall in cognitive capability
The required technical capabilities and those needed to integrate, build and reconfigure the necessary
internal and external competencies to use the data were often absent.
2. Cognitive biases
Evidence of anchoring, in which old ways of thinking inhibited the decision-making processes, and a
poor understanding of BD.
3. Cognitive overload
Occurs when individuals are exposed to more information than they are able to process.
Second core category: board cohesion
Strong evidence of BD disrupting the cohesive dynamics of board-level decision making, with implications for the
required dynamic and adaptive capabilities. Perceived clash between old and new ways of working.
Decision making disruption
BD disrupting the strategic decision making process leading to anxiety about the impact for the organisation and
worries from individuals about their personal preparedness.
Temporal issues
Informants acknowledged the speed of BD and welcomed being able to do things faster. However, there was a perceived
mismatch between BD velocity and their capacity to respond quickly.
Board composition issues
Boards consist of a mix of technologically native people and older people.
Third core category: responsibility & control
The organisational impact of sub-group formation
People commented on the need to create new senior roles to make the necessary connections during this
transformational period. This has a major impact upon how the board functions and responds to the assimilation of
knowledge acquired through the interpretation of BD.
External stakeholders
New trend of outsourcing of resources to external stakeholders. Informants felt that their clients depended
heavily on these outsourcing relationships. Also, boards may be losing control of their influence over the firm's
strategic direction.
• Involvement of a broad range of stakeholders in shaping how BD is transforming learning and action,
particularly regarding how those decisions are made.
Discussion and conclusion
Main findings
• At the level of individual directors: a gap between the cognitive capabilities organisations possess to cope
with BD and the capabilities that are crucial in avoiding the cognitive biases and overloads to which BD can
contribute
• At board level, board cohesion is disrupted by BD which has consequences for the decision-making
process itself
• Taking a holistic view: boards seeking new ways of working that cut across traditional silos such as
through the introduction of sub boards and relying upon the capabilities of third parties to help handle
BD.