Reliability
Definition: The stability or consistency of a test.
Assumption: Obtained score = true score +/- error.

~ Domain Sampling Model ~
Test items are a sample drawn from a larger item domain.

Test-Retest Method [error is due to changes occurring with the passage of time]
Test 1 ---- r ---- Test 1 (same test, two administrations)
Issues:
• Length of time between test administrations is crucial (generally, the longer the interval, the lower the reliability)
• Memory
• Stability of the construct being assessed
• Speed tests, sensory discrimination, psychomotor tests (possible fatigue factor)

Parallel/Alternate Forms [error due to test content and perhaps passage of time]
Test 1 ---- r ---- Test 2
Issues:
• Need the same number & type of items on each form
• Item difficulty must be the same on each form
• Variability of scores must be the same on each form
Two types:
1) Immediate (back-to-back administrations)
2) Delayed (a time interval between administrations)

KR-20 and Coefficient Alpha [error due to item heterogeneity]
• KR-20 is used with scales that have right & wrong responses (e.g., achievement tests)
• Alpha is used with scales that have a range of response options and no right or wrong answers (e.g., 7-point Likert-type scales)
KR-20:  r_tt = (k / (k - 1)) x (1 - Σ p_i(1 - p_i) / σ_y²)
Alpha:  α = (k / (k - 1)) x (1 - Σ σ_i² / σ_y²)
where k = number of items, p_i = proportion of examinees answering item i correctly, σ_i² = variance of scores on item i, and σ_y² = variance of total test scores. (A computational sketch of both coefficients follows the list of factors below.)

Factors Affecting Reliability
1) Variability of scores (generally, the more variability, the higher the reliability)
2) Number of items (the more questions, the higher the reliability)
3) Item difficulty (moderately difficult items lead to higher reliability, e.g., p-values of .40 to .60)
4) Homogeneity/similarity of item content (e.g., item-total score correlations; the more homogeneity, the higher the reliability)
5) Scale format/number of response options (the more options, the higher the reliability)
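A minimal computational sketch of the two internal-consistency coefficients defined above. The data, function names, and the use of population (rather than sample) variances are illustrative assumptions, not part of the original notes.

```python
import numpy as np

def kr20(responses):
    """KR-20 for dichotomous (0/1) items: (k/(k-1)) * (1 - sum(p_i*(1-p_i)) / var(total))."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]                      # k = number of items
    p = responses.mean(axis=0)                  # p_i = proportion answering each item correctly
    item_var = (p * (1 - p)).sum()              # sum of item variances, p_i(1 - p_i)
    total_var = responses.sum(axis=1).var()     # variance of total test scores (population form)
    return (k / (k - 1)) * (1 - item_var / total_var)

def coefficient_alpha(responses):
    """Coefficient alpha for multi-point items: (k/(k-1)) * (1 - sum(item variances) / var(total))."""
    responses = np.asarray(responses, dtype=float)
    k = responses.shape[1]
    item_var = responses.var(axis=0).sum()      # sum of sigma_i^2
    total_var = responses.sum(axis=1).var()     # sigma_y^2
    return (k / (k - 1)) * (1 - item_var / total_var)

# Illustrative (made-up) data: 5 examinees x 4 right/wrong items, and 5 respondents x 3 Likert items
right_wrong = [[1, 1, 0, 1], [1, 0, 0, 1], [1, 1, 1, 1], [0, 0, 0, 1], [1, 1, 1, 0]]
likert = [[5, 4, 5], [3, 3, 4], [2, 2, 1], [4, 5, 4], [1, 2, 2]]
print(round(kr20(right_wrong), 3))              # internal consistency of the right/wrong scale
print(round(coefficient_alpha(likert), 3))      # internal consistency of the Likert scale
```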
Content Validity: The extent to which test items represent a domain (e.g., subject matter expert opinions)
• Connect each item to a KSA (or two)
• Rate the difficulty of each item (5-point scale) relative to the level of the KSA needed on the job
Sample Item Rating Form

Types of Validity (cont.)
Criterion-Related Validity [correlation between test scores and job performance scores]
Predictive [correlation between test scores of applicants and their performance scores after some time interval has passed since they were hired]
• Range restriction issue on performance scores (see the simulation sketch later in these notes, after the application blank material)
• Time, cost, & pragmatic concerns
Concurrent [correlation between test scores and performance scores of current employees]
• Motivation level
• Guessing, faking
• Job experience factor
• Range restriction issue on performance scores

Application Blanks
• Content of items (use of job analysis)
• Number of application blanks (one for each position or job category)
• Legal issues
• Image of organization (e.g., format, recruitment issue, perceived fairness of questions)
• Accuracy of data (e.g., education, salary, job title/duties, years worked)
• 95% of college students reported being willing to include one lie on an application blank (AB); 45% had done so
• 40% to 60% of candidates overstated their qualifications on resumes (George & Marett, 2005)

Yahoo CEO Scott Thompson caught padding his resume
Yahoo director Patti Hart, who was in charge of the company's CEO search, will not seek re-election at the next shareholder meeting. An exact date for the meeting has not been set, but last year's event was held in June. Hart, who joined Yahoo's board in 2010, is the CEO of International Game Technology. Statements from both Yahoo and IGT said that IGT's board asked her not to seek re-election at Yahoo.

Thompson stepping down as Yahoo CEO (May 13th, Washington Post)
Yahoo chief executive Scott Thompson, hired from eBay in January to turn around the struggling Web portal, is stepping down after it was discovered that his academic credentials were misrepresented, according to a person with knowledge of the matter. In a series of published biographical statements stretching back for years, including his bio on Yahoo's website, Thompson has said that he "holds a Bachelor's degree in accounting and computer science" from Stonehill College. But his degree is actually only in accounting. … Yahoo sent out a statement saying references to Thompson earning a computer science degree were an "inadvertent error." … It's an error Thompson made repeatedly. References to Thompson's nonexistent computer science degree are featured in his bios on sites for PayPal, the eBay subsidiary where he previously served as president.

Ways To Increase Applicant Honesty on ABs & Resumes
• Tell applicants that the information they furnish will affect their employability
• Inform applicants that their answers will be thoroughly checked
• Have applicants sign a statement certifying the accuracy of the information they provide
• Include warnings of penalties (not being hired, or termination upon discovery) for deliberately lying

Frequency of Common Inappropriate Application Blank Questions
Item                                  Worded Appropriately   Not Asked   Item Not Appropriate
Past salary                           98.9%                  0%          1.1%
Minimum acceptable salary             72.7%                  0%          27.2%
Reference source                      59.1%                  0%          40.9%
Age                                   54.5%                  37.5%       8.0%
Relatives                             50.0%                  10.2%       39.8%
Conviction records                    43.2%                  28.4%       28.4%
Health                                40.9%                  2.3%        56.8%
Military service                      30.7%                  30.7%       38.6%
Marital status                        27.3%                  0%          72.7%
Who to notify in case of emergency    25.0%                  43.2%       31.8%
Length of time in residence           23.9%                  0%          76.1%
Physical description, photo           19.3%                  0%          80.7%
Rent or own car or home               18.2%                  0%          81.8%
Handicap                              17.0%                  6.8%        76.2%
Organizational membership             15.9%                  21.6%       12.5%
Work schedule                         13.6%                  63.6%       22.7%

"Years of experience and previous salary are the strongest predictors of starting salary, and starting salary is the greatest predictor of current salary."
--- Mickey Silberman, Jackson Lewis, LLP, Industry Liaison Conference (2011)
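The range-restriction issue flagged under criterion-related validity above can be illustrated with a small simulation. This is a hedged sketch, not material from the notes: it generates hypothetical test and performance scores with a known correlation, then shows how the observed validity coefficient shrinks when only the highest-scoring applicants (those actually hired) can be followed up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical applicant pool: test scores and later job performance, true correlation ~ .50
n = 5000
test = rng.normal(size=n)
performance = 0.5 * test + np.sqrt(1 - 0.5**2) * rng.normal(size=n)

def validity(x, y):
    """Criterion-related validity = Pearson correlation between test and performance scores."""
    return np.corrcoef(x, y)[0, 1]

# Unrestricted validity (all applicants followed up -- rarely possible in practice)
print("full applicant pool:", round(validity(test, performance), 2))

# Range restriction: only applicants above the cutoff are hired, so only their
# performance scores are ever observed; the observed correlation is attenuated.
cutoff = np.quantile(test, 0.70)          # hire the top 30% of test scorers
hired = test >= cutoff
print("hired subgroup only:", round(validity(test[hired], performance[hired]), 2))
```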
Other possible concerns include: personal email, personal web pages, and/or Facebook accounts.

Effect of Name on Resumes and Interview Rates
[table: interview invitation rates by name type ("White"-sounding vs. "Black"-sounding) and resume quality (low vs. high)]
"Black"-sounding names: 50% less chance of being invited for an interview versus "Whites" with high qualifications (Bertrand & Mullainathan, 2004)

EEOC Guidance on Arrest and Conviction Records (2012)
Conviction records are generally easier to defend than arrest records from a legal perspective.
The EEOC stresses consideration of the following three factors regarding conviction records:
1) the nature and severity of the offense
2) the amount of time that has passed since the conviction (or completion of the sentence)
3) the nature and type of job sought

Social Media and Selection
Frequency of use:
• 18% indicated that they have used social networking websites to screen applicants, while 11% planned on using such sites in the future (survey of over 400 organizations by the Society for Human Resource Management in 2011)
• 45% of employers used social networking sites to investigate job applicants (survey of over 2,600 hiring managers conducted by Harris Interactive for CareerBuilder.com)
Consequences?
• 35% of organizations in the Harris survey said they did not hire candidates due to content available on social networking sites. The most common examples of negative information included provocative attire, images of drug or alcohol use, and complaints about previous employers.

Password Protection Act (2012)
If passed, it would:
• Prohibit an employer from forcing prospective or current employees to provide access to their own private accounts as a condition of employment
• Prohibit employers from discriminating or retaliating against a prospective or current employee because that employee refuses to provide access to a password-protected account

An Example Rating Form for Use in Evaluating Training and Experience of Applicants for the Job of Personnel Research Analyst

Evaluating Applicant Training and Experience (T&E)
Recommendations:
• Establish minimum levels for acceptance
• Use as a rough screening device (rule out those without minimum levels)
• Tie T&E requirements to job-related tasks or KSAs
Overall, a good predictor of job performance; best regarding "early" job performance (first 3-5 years)

~ Letters of Recommendation ~
Issues:
• Self-selection (little variation in responses, mostly positive)
• Knowledge of applicant
• Writing ability of recommender, unstructured content
• Time availability
• Role of negative information
• Scoring (subjective versus established criteria)

~ Honesty Testing ~
Two types:
1) Overt integrity tests
2) Covert/personality (honesty test questions embedded within personality measures, e.g., among items assessing drug use/attitudes or tendency toward violence)
Sample Honesty Test Questions
Other examples: Should a person be fired if caught stealing $5.00? Have you ever thought about taking company merchandise without actually taking any?

~ Honesty Testing ~
Some considerations:
• Should applicants be eliminated if they "fail" an integrity test but perform well on other tests measuring job knowledge and cognitive ability? (An important issue, as some research has indicated that approximately 40-70 percent of test takers fail honesty tests)
• Should high scores on other tests be allowed to compensate for poor honesty test scores?
• Faking of answers (more likely on overt tests)
*** Despite the above, honesty tests are strong predictors of job performance and counterproductive work behaviors. They also result in very little adverse impact.

Drug Testing
• Approximately 57% of companies use pre-employment drug testing (Society for Human Resource Management survey, 2011)
• What does a positive drug test score indicate? Some contend that drug use does not negatively affect performance in the majority of jobs (Macdonald, 1997). Other research indicates that drug use is related to poor job performance (e.g., increased injuries and involuntary turnover) (Bass et al., 1996; Normand, Salyards, & Mahoney, 1990).

Drug Testing -- Some Issues:
• Those being tested: applicants, employees, or both
• Testing procedure: random or for cause
• Type of company: public or private
• Type of test: accuracy, cross-reactivity
• Type of job: safety concerns or not

Background Checks
• Driving record
• Criminal background
• Credit history
• Educational record

SHRM Survey on Use of Credit Background Checks (2010)
• For all job candidates: 13%
• Only for certain job candidates: 47%
• No: 40%
On which categories of job candidates does your organization conduct credit background checks?

Some Credit Check Issues:
• Correlation with work performance (weak, inconsistent). Table below from: Statement of Michael Aamodt, Ph.D., Principal Consultant, DCI Consulting Group, Inc., EEOC Meeting of October 20, 2010, "Employer Use of Credit History as a Screening Tool" (a sketch of the weighted-correlation computation follows this section)

Criterion              K     N       r
Work problems          10    7,464   .149
Discipline             5     5,946   .131
Absenteeism            6     1,678   .211
Performance ratings    3     561     .069
K = number of studies; N = total sample size; r = sample-size weighted uncorrected average correlation

• No relationship between credit ratings and performance scores or termination decisions (Bryan & Palmer, 2012) -- over 170 employees in a financial organization
• Recent study: relationships with task performance (supervisor ratings), OCBs, and conscientiousness, but not theft or aggressiveness
• Adverse impact (minorities, esp. Blacks and Hispanics, have lower scores)
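A minimal sketch of how the "sample-size weighted uncorrected average correlation" (the r column in the credit-history table above) is typically computed. The study-level numbers below are made up for illustration and are not the values behind the table.

```python
# Sample-size weighted average correlation: r_bar = sum(n_i * r_i) / sum(n_i)
def weighted_mean_r(studies):
    """studies: list of (n, r) pairs, one per primary study."""
    total_n = sum(n for n, _ in studies)
    return sum(n * r for n, r in studies) / total_n

# Hypothetical input: three studies relating credit history to a work criterion
studies = [(2500, 0.12), (900, 0.20), (4000, 0.15)]
print(round(weighted_mean_r(studies), 3))   # sample-size weighted, uncorrected
```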
~ Reference Checks ~
(Exceptionally common technique; e.g., 95% usage by organizations)
In-person (e.g., interview)
• Costly, time consuming
• Used for jobs that involve concern for risk (e.g., security, $)
• Can elicit different types of information (differences between in-person and written reference information)
Mail (or e-mail)
• Low return rate using "snail" mail (e.g., 56 - 64%)
• Standardized questions, format
• Written record of responses
• Ensure confidentiality of responses (signed statement by applicant)

~ Telephone Checks ~
(More frequently used than written references)
• Allow follow-up or clarification of answers given
• Less resistance to giving certain types of information
• Quick process (many last 10-15 minutes)
• Important data can be gleaned from various verbal cues (e.g., pauses, hesitations, voice inflections, voice level, intonations)
• Relatively high return rate
• Better responsiveness, more interactive nature of the method
• More confidence in the identity of the responder

Usefulness of Reference Information
• Relatively low validity; relationship to performance measures (e.g., .14, .16)
• Relatively low interrater reliability (e.g., .40, but sometimes from different sources)
Most useful if:
• Data are collected from the immediate supervisor
• The referee knows the applicant well (chance to observe job behavior) and has similar demographic characteristics
• There is similarity between the prior job and the one being applied for

Sample Biographical Information Blank Items [Biodata -- provides life history data]
• During high school, how many times did you make the honor roll?
• How much freedom or independence did your parents allow you in grade school?
• How much did your favorite high school teachers stress discipline in the classroom?
• How many times did you change schools before you were sixteen years old?
• Compared to other people in high school, how many friends did you have?
• How old were you when you spent your first week (or more) away from your parents?
• How bothered are you if a job is left undone?
• How often do you read craft and mechanics magazines?
• How quickly do you normally work?
• How well do you feel you can understand the feelings of others?
• How well do you tolerate performing routine tasks?

Classification of Biographical Items (each dimension contrasts two item types)
1. Verifiable: Did you graduate from college? / Unverifiable: How much did you enjoy high school?
2. Historical: How many jobs have you held in the past five years? / Futuristic: What job would you like to hold five years from now?
3. Actual Behavior: Have you ever repaired a broken radio? / Hypothetical Behavior: If you had your choice, what job would you like to hold now?
4. Memory: How would you describe your life at home while growing up? / Conjecture: If you were to go through college again, what would you choose as a major?
5. Factual: How many hours do you spend at work in a typical week? / Interpretive: If you could choose your supervisor, what characteristic would you want him or her to have?
6. Specific: While growing up, did you collect coins? / General: While growing up, what activities did you enjoy most?
7. Response: Which of the following hobbies do you enjoy? / Response Tendency: When you have a problem at work, to whom do you turn for assistance?
8. External Event: When you were a teenager, how much time did your father spend with you? / Internal Event: Which best describes the feelings you had when you last worked with a computer?

Biodata Scoring
A commonly used approach: correlate items with a criterion, e.g., by contrasting high and low performance groups (median split, or upper vs. lower thirds). Items are based on job analysis (e.g., tasks, KSAs, constructs). (A scoring sketch follows the biodata summary below.)
Issues:
• Validity of the criterion measure
• Items (and their weights) are specific to the criterion
• Need for item pretesting (e.g., remove items with low variability, low/no correlation with the criterion, or a correlation with protected group status)

Summary of Bio-Data Validity Studies

Biodata Summary
• Good predictor of job performance
• Wide range of information collected
• Possible situational specificity
• Requires large sample sizes
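A rough sketch of the empirical-keying approach described under Biodata Scoring above: item responses are related to a performance criterion, and items that show little variability or little relationship to the criterion are dropped before scoring. The data, thresholds, and function names are illustrative assumptions, not part of the original notes; the criterion could equally be a high/low group indicator from a median split.

```python
import numpy as np

def key_biodata_items(items, performance, min_sd=0.25, min_r=0.10):
    """Empirical keying sketch: retain items that vary and correlate with the criterion.

    items:       applicants x items matrix of biodata responses (pretest sample)
    performance: criterion score for each applicant (e.g., supervisor rating,
                 or a 0/1 indicator from a median split into low/high performers)
    Returns indices of retained items and their item-criterion correlations (used as weights).
    """
    items = np.asarray(items, dtype=float)
    keep, weights = [], []
    for j in range(items.shape[1]):
        x = items[:, j]
        if x.std() < min_sd:                 # pretest: drop items with low variability
            continue
        r = np.corrcoef(x, performance)[0, 1]
        if abs(r) < min_r:                   # pretest: drop items unrelated to the criterion
            continue
        keep.append(j)
        weights.append(r)
    return keep, weights

def score_applicant(responses, keep, weights):
    """Score = weighted sum of the retained (keyed) items."""
    return sum(w * responses[j] for j, w in zip(keep, weights))

# Hypothetical pretest sample: 6 applicants x 4 biodata items, plus performance ratings
items = [[3, 1, 5, 2], [4, 1, 4, 3], [2, 1, 2, 2], [5, 1, 4, 5], [1, 1, 1, 1], [4, 1, 3, 4]]
performance = [3.5, 4.0, 2.0, 4.5, 1.5, 4.0]
keep, weights = key_biodata_items(items, performance)
print(keep, [round(w, 2) for w in weights])          # which items survive pretesting
print(round(score_applicant([4, 1, 5, 3], keep, weights), 2))   # score for a new applicant
```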
~ Employment Interview ~
• Frequently used to make selection decisions (over 90%)
• Social exchange (interpersonal) process
• Search for information

Common "Traditional" Interview Problems
• Variety of Interviewer Biases: 1st Impressions, Expectancy Effect, Contrast Effect, Stereotype Matching
• Different Questions Asked of Applicants (lack of standardization; 29% - 94%)
• Disagreement on the Desirability of Interview Responses
• Little Formal Interviewer Training (44% receive such training: Wang & Yancey, 2012)
• Subjective (or No) Scoring System (50%: Wang & Yancey, 2012)
• Interview Conducted and Scored by One Person
• Poor Reliability, Validity, and Job Relevancy (open to legal challenge)

~ Summary of Situational Interview Process ~
• Perform a job analysis using the critical incident technique
• Place critical incidents into relevant job dimensions (e.g., Safety, Responsibility, Interpersonal Skills)
• Reword critical incidents into question form
• Decide on the desirability of responses [think of how good, average, and mediocre workers would answer such a question]
• Conduct interviews in groups of two or more; each interviewer scores the applicant independently, and a single score is given after group discussion

Situational Interview Process (cont.)
Behavior Description Interview

Ability Tests
Sensory (e.g., hearing, vision)
Motor (e.g., dexterity, strength, agility)
ADA concerns:
• Reasonable accommodation
• Essential job duties
Cognitive (e.g., intelligence)
Cognitive ability (e.g., ability to learn; acquire new knowledge and skill)
• General cognitive ability
• Special abilities (e.g., verbal, quantitative, spatial abilities, arithmetic reasoning, memory, mechanical comprehension, reasoning ability)

Cognitive Ability Tests --- Pros & Cons
Pros
• Significant predictors of managerial performance (Hunter & Hunter, 1984), especially scores on verbal and numerical ability (Grimsley & Jarrett, 1973, 1975)
• Data from meta-analytic studies indicate they are some of the best predictors of performance available across an array of jobs (Bobko, Roth, & Potosky, 1999; Schmidt & Hunter, 1998)
• Better predictors among highly complex jobs
Cons
• Potential adverse impact (Blacks and Hispanics score significantly lower on these tests; Roth, Bevier, Bobko, Switzer, & Tyler, 2001)
• They do not assess other types of intelligence (e.g., analytical, creative, and practical intelligence; Sternberg, 1985, 2000)

Differential Aptitude Test (DAT): Abstract Reasoning [sample item with problem figures and answer figures A-E]
Differential Aptitude Test (DAT): Mechanical Reasoning [sample item: Which weighs more, A or B? (If equal, mark C.)]
Differential Aptitude Test (DAT): Space Relations [sample item with answer options A-D]
http://www.cctexas.com/?fuseaction=main.view&page=2478

Texas city hit with police sex discrimination suit
CHRISTOPHER SHERMAN, Associated Press. Updated 7:13 p.m., Tuesday, July 3, 2012
McALLEN, Texas (AP) — The Justice Department sued the city of Corpus Christi on Tuesday, alleging the Police Department discriminated in hiring women by using a physical ability test few female applicants have been able to pass. Federal prosecutors say only about one in five women who took the test between 2005 and 2009 passed it, compared with about two-thirds of the men. In the last two years the pass rates for men and women increased due to a change in the cutoff scores, but the gap between men and women persisted. The complaint filed in federal court in Corpus Christi says the department hired 12 female entry-level officers and 113 males from 2005 to 2011.
(A short sketch using the pass rates quoted above appears at the end of these notes.)

Consent Decree (Settlement)
The consent decree requires that Corpus Christi no longer use the physical abilities test challenged by the United States for selecting entry-level police officers. It also requires the city to develop a new selection procedure that complies with Title VII. Additionally, the consent decree requires the city to pay $700,000 in back pay to female applicants who took and failed the challenged physical abilities test between 2005 and 2011 and are determined to be eligible for relief. Also under the consent decree, some women who took and failed the challenged physical abilities test between 2005 and 2011 may receive offers of priority employment with retroactive seniority and benefits. Applicants interested in priority employment must pass the new, lawful selection procedure developed by Corpus Christi under the decree and meet other qualifications required of all applicants considered for entry-level police officer positions.

~ Work Sample Tests ~
(Performing a piece, or sample task, of the job)
• Good content validity
• Less adverse impact (e.g., than cognitive ability tests)
Issues:
• Can be time consuming and costly to develop (what questions/tasks to include; determining the correct answer or response)
• Need for standardization (e.g., instructions, administration, scoring, administrator training)

~ Assessment Center Process ~
Candidates participate in situational exercises:
Sample group exercises: Leaderless Group Discussion, Business Game
Sample individual exercises: Interview Simulation, Scheduling Exercise, In-Basket
• Candidates are observed, discussed, and scored on various dimensions (e.g., communication, decision making, planning/organizational skills) by trained raters
• Overall scores computed and ranked for personnel decisions
• Individual dimension scores used for developmental purposes
• Can be time consuming and costly (e.g., over $1,500/person)

Personality Assessment (self-report versus projective techniques)
~ "Big 5" Personality Factors ~
• Extraversion: Outgoing, sociable
• Neuroticism (low Emotional Stability): Depressed, anxious, prone to worry
• Agreeableness: Flexible, forgiving
• Conscientiousness: Careful, thorough, persevering
• Openness to Experience: Curious, imaginative
Overall, conscientiousness and extraversion are the best predictors of managerial performance across jobs.
Personality measures add to prediction above and beyond other commonly used measures such as cognitive ability.
Tests intended to assess psychiatric disorders (e.g., the MMPI) are considered medical tests under the ADA and must be given post-offer.
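A small sketch using the approximate pass rates quoted in the Corpus Christi complaint above (roughly one in five women versus about two-thirds of men). Expressing the two selection rates as a ratio is one common way to quantify the kind of adverse impact described in the suit; the specific variable names and the ratio framing are illustrative, and the notes themselves do not prescribe a particular threshold.

```python
# Approximate pass rates reported in the DOJ complaint (2005-2009 test administrations)
women_pass_rate = 1 / 5        # about one in five women passed
men_pass_rate = 2 / 3          # about two-thirds of men passed

# Ratio of the lower group's pass rate to the higher group's pass rate:
# values well below 1.0 reflect the disparity alleged in the suit.
impact_ratio = women_pass_rate / men_pass_rate
print(f"women: {women_pass_rate:.0%}, men: {men_pass_rate:.0%}, ratio: {impact_ratio:.2f}")
```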