EEOC Statements on Pre-Employment Inquiries

"Although Title VII does not make pre-employment inquiries concerning race, color, religion or national origin per se violations of the law, the Commission's responsibility to equal employment opportunity compels it to regard such inquiries with extreme disfavor."

"… in the investigation of charges alleging the commission of unlawful employment practices, the Commission will pay particular attention to the use by the party against whom charges have been made of pre-employment inquiries concerning race, religion, color, or national origin, or other inquiries which tend directly or indirectly to disclose such information. The fact that such questions are asked may, unless otherwise explained, constitute evidence of discrimination, and will weigh significantly in the Commission's decision as to whether or not Title VII has been violated."

Application Blanks
• Content of items (use of job analysis)
• Number of application blanks (one for each position or job category)
• Legal issues
• Image of organization (e.g., format, recruitment issue, perceived fairness)
• Accuracy of data

Research on application form accuracy:
• Applicants overstated length of employment and past salary on application forms (Goldstein, 1971)
• One serious lie appeared on 25% of application forms and resumes (LoPresto, Mitcham, & Ripley, 1986)
• 40% to 60% of candidates overstated their qualifications on resumes (George & Marett, 2005)
• The most frequent falsifications are job history, job duties, educational record, position title, and previous salary (Broussard & Brannen, 1986)

Increasing Application Form Accuracy
• Inform applicants, verbally and in writing, that the information they furnish will affect their employability
• Inform applicants that the data they provide will be thoroughly checked
• Require applicants to sign a statement certifying the accuracy of the information they provided on the form
• Include warnings of penalties (not being hired, or termination upon discovery) for deliberate falsification
• Include a statement that the application does not create a binding obligation of employment for any specific period of time

Previous Research Studies Examining Employment Applications (by date of study)

Study | Results | Type of Application & Sample
Wallace & Vodanovich (2002) | Fortune 500 sample = 2.99 inappropriate items; Customer Service sample = 5.35 inappropriate items | 191 Fortune 500 finance/accounting applications; 109 customer service applications (e.g., retail, food service)
Wallace, Tye, & Vodanovich (2000) | Average of 4.2 inappropriate items; most problematic: salary, age, driver's license | 42 online state general employment applications
Vodanovich & Lowe (1992) | Average of 7.4 inappropriate items; most problematic: age, convictions, & salary | Retail; 46 categories
Jolly & Frierson (1989) | 25% of 20 categories were problematic (e.g., salary) | 283 random applications from American Society of Public Administration members; 20 categories
Coady (1986) | Most problematic: improper use of EEO worksheets | 50 state libraries; 25 categories
Lowell & DeLoach (1982) | Most problematic: military service & age | 50 US firms; 17 categories
Burrington (1982) | Average of 7.7 inappropriate items | 50 general state applications; 30 categories
Miller (1980) | Average of 9.74 inappropriate items | 151 of Fortune 500; 72 categories

Note: Adapted from Wallace, Tye, and Vodanovich (2000).
Frequency of Common Inappropriate Application Blank Questions (%)

Item | Not appropriate | Worded appropriately | Not asked
Past salary | 98.9 | 0 | 1.1
Minimum salary | 72.7 | 0 | 27.2
Age | 54.5 | 37.5 | 8.0
Information about relatives | 50.0 | 10.2 | 39.8
Conviction records | 43.2 | 28.4 | 28.4
Health | 40.9 | 2.3 | 56.8
Military service | 30.7 | 30.7 | 38.6
Marital status | 27.3 | 0 | 72.7
Emergency contact | 25.0 | 43.2 | 31.8

From: Vodanovich & Lowe (1992), Public Personnel Management

"Years of experience and previous salary are the strongest predictors of starting salary, and starting salary is the greatest predictor of current salary."
--- Mickey Silberman, Jackson Lewis, LLP, Industry Liaison Conference (2011)

Percentage of Most Commonly Identified Inadvisable Application Blank Items by Sample
[From Wallace & Vodanovich (2004), Public Personnel Management]

Item | Customer Service: Inadvisable | Customer Service: Legitimate | Fortune 500: Inadvisable | Fortune 500: Legitimate
Desired salary | 66.1 | 15.0 | 20.8 | 5.2
Personal e-mail address* | 49.5 | 0.0 | 88.0 | 2.5
Lowest acceptable salary | 46.8 | 19.2 | 25.0 | 1.2
Graduation date | 33.9 | 3.1 | 54.2 | 3.5
Work schedule | 33.0 | 36.2 | 1.6 | 0.6
Conviction (w/o disclaimer) | 26.6 | 56.8 | 3.6 | 38.2
References | 26.6 | 42.4 | 3.1 | 23.6
Gender (w/o EEO disclaimer) | 25.7 | 38.0 | 20.3 | 19.5
Race (w/o EEO disclaimer) | 24.8 | 25.1 | 15.1 | 18.9
Driver's license | 22.9 | 16.2 | 1.6 | 0.8
Relatives | 21.1 | 5.1 | 2.6 | 0.0
EEO worksheet | 16.5 | 8.1 | 14.1 | 25.6
Handicap (w/o EEO disclaimer) | 14.7 | 3.2 | 3.6 | 15.6
Age (w/o EEO disclaimer) | 14.7 | 3.2 | 3.1 | 15.8
Language fluency | 11.9 | 1.5 | 3.1 | 0.7
Emergency contact | 7.3 | 0.0 | 0.5 | 0.0
Marital status | 3.7 | 0.8 | 5.7 | 0.3
Personal web page address | 3.7 | 0.0 | 16.7 | 0.0
National origin (w/o EEO disclaimer) | 2.8 | 1.8 | 5.7 | 23.1

Effect of Name on Resumes and Interview Rates
Resumes with "Black"-sounding names had about a 50% lower chance of being invited for an interview than resumes with "White"-sounding names, even when the resumes were of equally high quality.

Research on court cases: The most common application blank challenges were based on questions about sex (28%), age (25%), and race (12%). Source: Kethley & Terpstra (2005)

2012 EEOC Guidance on Arrest and Conviction Records

In 2012, the EEOC issued guidelines on the use of arrest and conviction records in making selection decisions, its first update since 1990. Refusing to hire those with arrest records is not considered justifiable. However, if an individual's behavior underlying an arrest makes the person unfit for a given job, then a decision not to hire may be legitimate. Conviction records are generally easier to defend from a legal perspective, but the EEOC stresses consideration of the following three factors: (1) the nature and severity of the offense, (2) the amount of time that has passed since the conviction (or completion of one's sentence), and (3) the nature and type of job sought. The EEOC suggests not asking about conviction records on application forms. If included, such questions ought to be restricted to convictions "for which exclusion would be job related for the position in question and consistent with business necessity."
See: http://www.eeoc.gov/laws/guidance/arrest_conviction.cfm

"Ban the Box" Movement

Best Practices: EEOC Use of Criminal Records
(http://www.eeoc.gov/laws/guidance/arrest_conviction.cfm#VIII)
The following are examples of best practices for employers who are considering criminal record information when making employment decisions.

General
· Eliminate policies or practices that exclude people from employment based on any criminal record.
· Train managers, hiring officials, and decisionmakers about Title VII and its prohibition on employment discrimination.

Developing a Policy
· Develop a narrowly tailored written policy and procedure for screening applicants and employees for criminal conduct.
  o Identify essential job requirements and the actual circumstances under which the jobs are performed.
  o Determine the specific offenses that may demonstrate unfitness for performing such jobs.
    - Identify the criminal offenses based on all available evidence.
  o Determine the duration of exclusions for criminal conduct based on all available evidence.
    - Include an individualized assessment.
  o Record the justification for the policy and procedures.
  o Note and keep a record of consultations and research considered in crafting the policy and procedures.
· Train managers, hiring officials, and decisionmakers on how to implement the policy and procedures consistent with Title VII.

Questions about Criminal Records
· When asking questions about criminal records, limit inquiries to records for which exclusion would be job related for the position in question and consistent with business necessity.

Confidentiality
· Keep information about applicants' and employees' criminal records confidential. Only use it for the purpose for which it was intended.

SHRM Credit Background Check Survey Results

Survey questions on credit checks:
2004: "In general, how frequently does your organization, or an agency hired by your organization, check any of the following references for its job candidates?"
2010: "Does your organization, or an agency hired by your organization, conduct credit background checks for any job candidates by reviewing the candidates' consumer reports?"

Credit checks, 2004 | Credit checks, 2010
Always: 19% | All job candidates: 13%
Sometimes: 24% | Select job candidates: 47%
Rarely: 18% | No: 40%
Never: 39% |

2004: Survey margin of error: +/- 5%. Note: n = 296; excludes respondents who responded "Don't know." Source: SHRM Reference and Background Checking Survey (2004)
2010: Survey margin of error: +/- 5%. Note: n = 343; excludes respondents who responded "Not sure." Source: SHRM Background Checking Survey (2010)

SHRM Survey on Use of Credit Background Checks (2010): On which categories of job candidates does your organization conduct credit background checks?

SHRM Survey (cont.): When does your organization, or any agency hired by your organization, initiate credit background checks on job candidates?

SHRM Survey (cont.): Does your organization allow job candidates, in certain circumstances, the opportunity to explain the results (e.g., high debt, bankruptcy, etc.) of their consumer report that might have an adverse effect on an employment decision?

SHRM Credit Check Survey Research Summary
• The use of credit background checks in employment decisions has not changed in any discernible way over the past 6 years.
• Most organizations do not conduct credit background checks on all job candidates.
• Organizations conduct credit background checks for those positions where this information is most job-relevant.
• Employers place lower relative importance on credit background checks than on other job-related factors in making hiring decisions.
• Employers do not use credit background checks to screen out mass numbers of candidates in the early phases of the application process.
• Credit background check results are seldom used as a definitive hiring criterion.
Two large studies, by the Federal Reserve System in 2003 and Freddie Mac in 2000, concluded that Asians and Whites have higher credit scores than do Hispanics and African Americans.

Meta-Analysis

Criterion | K | N | r
Work problems | 10 | 7,464 | .149
Discipline | 5 | 5,946 | .131
Absenteeism | 6 | 1,678 | .211
Performance ratings | 3 | 561 | .069

K = number of studies; N = total sample size; r = sample-size weighted uncorrected average correlation

Credit score: A number that provides a "snapshot" over a certain period of time (not shown to employers)
Credit report: Generates information about an individual's debt over a longer time frame than a credit score

From: Statement of Michael Aamodt, Ph.D., Principal Consultant, DCI Consulting Group, Inc., EEOC Meeting of October 20, 2010 - Employer Use of Credit History as a Screening Tool

Credit Scores
• No relationship between credit ratings and performance scores or termination decisions (Bryan & Palmer, 2012) -- over 170 employees in a financial organization
• A recent study (Bernerth, Taylor, Walker, & Whitman, 2011) found credit scores to be predictive of certain work-related outcomes and Big 5 personality scores.
  The authors found that credit scores were significantly and negatively related to supervisor ratings of:
  • Task performance and employee engagement in OCBs
  Credit scores were also predictive of the Big 5 personality traits of:
  • Greater conscientiousness
  • Lower agreeableness
  But credit scores were NOT found to predict supervisor ratings of workplace deviance (e.g., theft, aggressiveness).
• However, the authors caution against the use of credit scores absent data demonstrating their job relatedness for certain jobs, and they note the potential for adverse impact.

Recent Rulings on Use of Credit and Criminal Background Checks

EEOC v. Kaplan Higher Education (2013)
• EEOC position: Kaplan's use of credit reports adversely impacted Black applicants.
• The EEOC had to use "race raters" to determine applicants' race from driver's license photos.
• But the district court judge (N.D. Ohio) ruled this approach to be unreliable and not scientifically rigorous enough (it did NOT meet the Daubert standard):
  1) whether the technique or theory can be or has been tested
  2) whether it has been subject to peer review and publication
  3) the known or potential rate of error of the technique or theory
  4) the existence and maintenance of standards and controls
  5) whether the technique or theory has been generally accepted in the scientific community
>>> The judge issued a summary judgment dismissal (SJD).

EEOC v. Freeman (2013)
EEOC evidence:
• 51 Black applicants were passed over between March 23, 2007, and Aug. 11, 2011, because of credit histories
• 83 Black and male workers were passed over between Nov. 30, 2007, and July 12, 2012, based on criminal records
• The data in the report issued by the EEOC were deemed problematic (e.g., the report did not include data on all available applicants for the two classes for the entire class period)
• Words used by the judge in this case: "flawed," "skewed," "rife with analytical errors," "laughable," and "an egregious example of scientific dishonesty"
• The judge also ruled that the EEOC did not identify a specific employment practice that caused the alleged adverse impact
• Also in 2013: nine state attorneys general sent a letter opposing the EEOC guidance on criminal background data

Characteristics of Training & Experience (T&E) Evaluations (e.g., information from application blanks, resumes)
• A listing or description of tasks, KSAs, or other job-relevant content areas
• A means by which applicants can describe, indicate, or rate the extent of their training or experience with these job content areas
• A basis for evaluating or scoring applicants' self-reported training, experience, or education

Some Uses of Training and Experience Evaluations (e.g., gleaned from application blank information, resumes)
• As the sole basis for deciding whether an individual is or is not minimally qualified
• As a means for rank-ordering individuals from high to low based on a T&E score
• As a basis for prescreening applicants prior to administering more expensive, time-consuming predictors (for example, an interview)
• In combination with other predictors used for making an employment decision

Applications via the Internet
• Increasing frequency of requiring application blanks and resume information via the Internet (e.g., preset fields, check boxes)
• Greater convenience and standardization -- but can lead to fewer applicants
• Effect on those without access to the Web (adverse impact)
• Use of third-party companies for resume submission
  • Privacy concerns (tell users how data will be handled)

Social Media and Selection

Frequency of use
• 18% of organizations indicated that they have used social networking websites to screen applicants, while 11% planned on using such sites in the future (survey of over 400 organizations by the Society for Human Resource Management in 2011)
• 45% of employers used social networking sites to investigate job applicants (survey of over 2,600 hiring managers by Harris Interactive for CareerBuilder.com)

Consequences?
• 35% of organizations in the Harris survey said they did not hire candidates due to content available on social networking sites. The most common examples of negative information included:
  • Provocative attire
  • Images of drug or alcohol use
  • Complaints about previous employers

http://www.siop.org/tip/oct12/05davison.aspx (TIP article on social media & selection)

Password Protection Act (introduced March 2012)
• Prohibits an employer from forcing prospective or current employees to provide access to their own private account as a condition of employment
• Prohibits employers from discriminating or retaliating against a prospective or current employee because that employee refuses to provide access to a password-protected account
Example: Scoring resume data for sales and accounting jobs

Brief Training and Experience Evaluation Used for Appraising Applications Submitted for the Job of Clerk
An Example Training and Experience Evaluation Form for the Job of Personnel Research Analyst
An Example Rating Form for Use in Evaluating Training and Experience of Applicants for the Job of Personnel Research Analyst

Decision-Making Methods for T&E Data

• Holistic Judgment
  An informal, unstructured approach that an individual takes when reviewing an application or T&E form
  The individual makes a cursory review of the information and arrives at a broad, general judgment of the applicant's suitability
  Because of its unstandardized nature and unknown reliability and validity, it should be avoided as an approach to T&E evaluations

• Point Method
  A pre-established rating system for crediting applicants' prior training, education, and experience considered relevant to the job
  Points are assigned based on the recentness, type, and amount of training, job experience, and education received
  Analysts using the point method make their ratings and then sum the credited points

• Grouping Method
  Divides applicants into groups that best represent each applicant's level of qualifications
  The number of groups used will depend on the particular situation
  High group: suitable applicants well qualified for the job
  Middle group: applicants not fitting in either the high or low group
  Low group: applicants with minimum qualifications but poorly suited because of limited experience or training
  Unqualified group: applicants lacking minimum qualifications

Grouping Method Example

• Behavioral Consistency Method
  Applicant descriptions of achievements related to key job requirements or competencies are formally scored using scales derived from subject matter experts
  Principles of the method:
  • Behaviors evaluated have been identified by SMEs as showing differences between superior and minimally acceptable workers
  • Applicants' past accomplishments can be reliably rated by SMEs
  • Past accomplishments are considered predictive of future behaviors

Sample of the Behavioral Consistency Method

Conducting Empirical Research: Concerns the conduct of research activities, including designing a research study, collecting and analyzing data to test specific research hypotheses or answer research questions, and writing up research results in the form of a formal report.

For the behavior Conducting Empirical Research, defined above, think about your past activities and accomplishments. Then write a narrative description of your activities and accomplishments in the space below. In your description, be sure to answer the following questions:
1. What specifically did you do? When did you do it?
2. Give examples of what you did that illustrate how you accomplished the above behavior.
3. What percentage of credit do you claim for your work in this area?

Description: During my senior year (2005–2006), I wrote a senior research thesis as a partial requirement for graduation with honors in psychology. I designed a research study to investigate the effects of interviewer race on interviewee performance in a structured interview. I personally designed the research study and conducted it in a metropolitan police department. White and African-American applicants for the job of patrol police officer were randomly assigned to White and African-American interviewers.
After conducting an analysis of the patrol police job, a structured interview schedule was developed. The various interviewee-interviewer racial combinations were then compared in terms of their performance in the structured interview. I consider the vast majority of the work (80 percent) to be my own. My major professor accounted for about 20 percent of the work. Her work consisted of helping to obtain site approval for the research, helping to design the study, and reviewing my work products.

Name and Address of an Individual Who Can Verify the Work You Described Above:
Name: Dr. Amy Prewett
Address: Department of Psychology, Pascal Univ., State College, ID
Phone: 607-555-0821

An Example Rating Scale for Scoring the Behavioral Consistency Method of T&E Evaluation

Decision-Making Methods for T&E Data (cont.)

• Task-Based Method
  Critical job tasks identified from a comprehensive job analysis serve as the basis for the task-based method
  Applicants indicate on a list of tasks whether they have performed each task and, if so, how often
  Applicants furnish specific information so that their self-ratings can be verified

• KSA-Based Method
  Similar to the task-based method, but with KSAs substituted for tasks on the questionnaire that applicants self-rate

Psychometrics of T&E Evaluations

• Reliability
  T&E evaluations show high inter-rater reliability estimates (.80s)
  The task-based method has the highest reliability; the grouping method produces the lowest

• Validity
  The validity of T&E ratings varies with the type of procedure used
  The behavioral consistency method demonstrates the highest validity
  The point- and task-based methods show useful validities for applicant groups having low levels of job experience
  Past work experience predicts job performance (.27); experience measured at the task level predicts better than job or organizational experience
  GPA correlates .32 with performance, but years of education only .10; job tenure (organizational tenure) and hours worked correlate .18 and .20, respectively

Research Findings on T&E Evaluations
• T&E evaluations consistently predict important work outcomes
• They vary significantly in the strength of their predictive validity
• Some methods of evaluating experience and training exhibit substantial correlations with success (e.g., .45 for the behavioral consistency method)
• Other methods show low validities (e.g., the point method: .15; the task method: .11)
• T&E evaluations are particularly valuable for the first three to five years on the job

T&E Recommendations
• Use T&E evaluations to set specific minimum job qualifications (KSAs), rather than using them as a selection standard
• Replace holistic methods with competency-based approaches -- the behavioral consistency and grouping methods
• T&E evaluations are subject to the Uniform Guidelines
• Use T&E evaluations only as rough screening procedures for positions where previous experience and training are necessary
• Forms and procedures for collecting and scoring T&E evaluations should be standardized/structured as much as possible
• Verify self-report data, particularly data given by applicants who are going to be offered a job
• Base final hiring decisions on other selection measures when distortion of self-evaluation information is likely to be a problem

Reference Checks (an exceptionally common technique; e.g., 95% usage by organizations)

Basic purposes:
• Verify information provided by the applicant (check for inconsistencies)
• Uncover unreported or additional information
• Predict job performance (pass or fail decisions)

Typical types of information collected:
• Employment dates
• Rehire?
• Job title
• Job tasks
• Salary (e.g., beginning and ending): technically not illegal to ask, but potentially problematic -- can lead to pay offers that are different (lower) for minorities and females
(From SHRM Reference and Background Checks survey, 2005)

Sources of Reference Data
• Supervisor (most common and most useful)
• Personal reference
• Agencies (e.g., credit ratings)
• Public records (criminal background, driving records, court records, workers' compensation)
• Educational background (verification)

Reference Check Methods

In-person (e.g., interview)
• Costly, time consuming
• Used for jobs that involve concern for risk (e.g., security, money)
• Can elicit different types of information (differences between in-person and written reference information)

Mail (or e-mail) -- see Table 9.4 on page 409
• Low return rate with use of "snail" mail (e.g., 56-64%); many choose to set up phone reference checks via e-mail
• Standardized questions and format
• Written record of responses
• Ensure confidentiality of responses (signed statement by applicant)

Telephone checks -- see Table 9.9 on page 407
• Allow follow-up or clarification of answers given
• Less resistance to giving certain types of information, so more can be collected
• Structured phone interview validity (.25)
• Relatively quick process
• Important data can be gleaned from various verbal cues (e.g., pauses, hesitations, voice inflections, voice level, intonations)
• Relatively high return rate (especially if a time was set up earlier)
• Better responsiveness, more interactive nature of the method, and more confidence in the identity of the responder

Reference Check Recommendations
• Use job-related questions (e.g., KSAs from a job analysis)
• Use multiple reference check forms (job specificity)
• Follow provisions contained in the Uniform Guidelines (e.g., regarding fairness, validity)
• Use a behaviorally focused and objective set of questions
• Get written permission from applicants
• Train interviewers (phone, in-person) and keep records
• Ask for additional references if the ones submitted are not available
• Verify the information that is collected!

Usefulness of Reference Information
• Relatively low validity; relationship to performance measures (e.g., .18, .25)
• Relatively low interrater reliability (e.g., .40, but sometimes from different sources)
Most useful if:
• Data are collected from the immediate supervisor
• The referee knows the applicant well (has had a chance to observe job behavior) and has similar demographic characteristics
• There is similarity between the prior job and the one being applied for

Reference Checks --- Legal Issues

Defamation
• Content can be written or oral
• The statement must be false
• Injury must occur (e.g., not being hired)
• The company does not have privilege:
  Absolute privilege: immunity from legal challenge
  Qualified privilege: lost if the statement is knowingly false or malicious

Negligent Hiring
• An injury to a third party is caused by an employee
• The employee is shown to be unfit for the job that he or she holds
• The employer knew, or should have known (had a background check or criminal check been conducted), that the employee was unfit
• The injury to the third party was a foreseeable outcome of hiring the unfit employee
• The injury is a reasonable and probable outcome of what the employer did or did not do in hiring the individual

There are more lawsuits regarding reference and background checks than ones involving medical exams, drug tests, or polygraphs -- but still a small percentage. Should applicants sign a waiver?
Letters of Recommendation (mainly used for highly skilled or professional jobs)

Some generic indicators:
• Meaning of certain adjectives (e.g., mental-ability adjectives relate to performance; cooperativeness/personality adjectives do not)
• Number of words used or length of letter (a longer letter is better)

Concerns:
• Pre-selection of referees (often only positive information is included)
• Verbal and organizational skill of the writer
• Unstructured content
• Omissions
• Time availability
• Subjective scoring (e.g., focus on irrelevant information, status of the writer)

Biographical Data (Bio-Data) Process

Developing Biodata Items
• Task/job-based approach -- to assess safety orientation: "How often have you been a participant in safety meetings?" Assumes prior experience in the job domain.
• KSAO approach -- to assess a job-related construct such as initiative: "On past jobs, how often did you volunteer to work on specific projects?" More generic, but not explicitly related to job duties.

Item Screening
• Relevance of items to the intended construct or job/task relevance of the item
• Potential overlap with other constructs
• Social desirability and/or faking potential
• Possible bias against protected groups
• Privacy concerns

Eliminate an item from the bio-data inventory if it:
• Has little variability
• Has a skewed response distribution
• Is correlated with protected-group characteristics such as ethnicity
• Has no correlation with other items thought to be measuring the same life history construct
• Has no correlation with the criterion (no item validity)

Classification of Biographical Items

Historical: How old were you when you got your first job?
Hypothetical: What job do you think you'll have in 10 years?

External: Did you ever get fired from a job?
Internal: What is your attitude toward friends who smoke marijuana?

Objective: How many hours did you study for your math tests?
Subjective: Would you describe yourself as shy?

First-hand: How punctual are you about coming to work?
Second-hand: How would your teachers describe your punctuality?

Discrete: At what age did you receive your driver's license?
Summative: How many hours do you study during an average week?

Verifiable: What was your GPA in college?
Non-verifiable: How many fresh vegetables do you eat daily?

Controllable: How many times did you drop classes in college?
Non-controllable: How many siblings do you have?

Best results are obtained if items are historical, objective, discrete, job relevant, and external.

Classification of Biographical Items (cont.)
1. Verifiable: Did you graduate from college?
   Unverifiable: How much did you enjoy high school?
2. Historical: How many jobs have you held in the past five years?
   Futuristic: What job would you like to hold five years from now?
3. Actual Behavior: Have you ever repaired a broken radio?
   Hypothetical Behavior: If you had your choice, what job would you like to hold now?
4. Memory: How would you describe your life at home while growing up?
   Conjecture: If you were to go through college again, what would you choose as a major?
5. Factual: How many hours do you spend at work in a typical week?
   Interpretive: If you could choose your supervisor, what characteristic would you want him or her to have?
6. Specific: While growing up, did you collect coins?
   General: While growing up, what activities did you enjoy most?
7. Response: Which of the following hobbies do you enjoy?
   Response Tendency: When you have a problem at work, to whom do you turn for assistance?
8. External Event: When you were a teenager, how much time did your father spend with you?
   Internal Event: Which best describes the feelings you had when you last worked with a computer?

Biodata Item Response Formats
1. Yes-No Response: Are you satisfied with your life?
   a. Yes  b. No
2. Continuum, Single-Choice Response: About how many fiction books have you read in the past year?
   a. None  b. 1 or 2  c. 3 or 4  d. 5 or 6  e. More than 6
3. Noncontinuum, Single-Choice Response: Which one of the following would you most prefer to do in your leisure time?
   a. Read a book  b. Work crossword puzzles  c. Attend a party  d. Play golf, tennis, or softball  e. Repair a broken appliance or make minor home repairs
4. Noncontinuum, Multiple-Choice Response: Check each of the following activities you had participated in by the time you were 18.
   a. Shot a rifle  b. Driven a car  c. Worked a full-time job  d. Traveled alone more than 500 miles from home  e. Repaired an electrical appliance
5. Continuum, Plus Escape Option: When you were a teenager, how often did your father help you with your schoolwork?
   a. Very often  b. Often  c. Sometimes  d. Seldom  e. Never  f. Father was not at home
6. Noncontinuum, Plus Escape Option: In what branch of the military did you serve?
   a. Army  b. Air Force  c. Navy  d. Marines  e. Never served in the military
7. Common Stem, Multiple Continuum: In the last 5 years, how much have you enjoyed each of the following? (Use a rating scale of 1 to 4: (1) Very much, (2) Some, (3) Very little, (4) Not at all)
   a. Reading books  b. Watching TV  c. Working at your job  d. Traveling  e. Outdoor recreation

Biodata Scoring: Empirical Keying Approach
• Based on the correlation between items and a criterion
• Items are keyed against high and low performance groups (e.g., median split, upper vs. lower thirds)
Issues:
• Validity of the criterion measure
• Contamination/bias in the criterion measure
• Items (and their weights) are specific to the criterion

Summary of Bio-Data Validity Studies

Bio-Data (cont.)
• Reliability: .60 to .80 across several studies (higher for more verifiable items)
• Validity: many validity coefficients above .30
• Accuracy: some distortion exists, mainly on unverifiable items (e.g., interests, preferences) and more when the desirability of answers is apparent (i.e., faking can occur)

Bio-Data: Why Does It Work?
• Use of life history items, e.g., personal background, life experiences, interests (past behavior is the best predictor of future behavior)
• Only relevant (empirically significant) items/constructs are selected
• Correlation between BIB content and the criterion
• Wide range of information (many different questions and types)

Some Bio-Data Issues
• Situational specificity
• Need a large sample to construct properly
• Assumption of a "correct" life history
• Pure empirical approach (e.g., versus a content approach)
• Legal issues (e.g., adverse impact, validity, reliability)
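As a rough illustration of the empirical keying idea described above, the Python sketch below correlates each biodata item with a performance criterion, retains items whose item validity clears a minimum threshold, and scores applicants by summing criterion-weighted responses. The DataFrame, column names, threshold, and use of the raw item-criterion correlation as the weight are all illustrative assumptions, not a prescribed scoring procedure.

```python
# Minimal sketch of an empirical keying approach to biodata scoring.
# Assumes a pandas DataFrame with one numeric column per biodata item plus a
# 'performance' criterion column; all names and values here are hypothetical.
import pandas as pd

def build_empirical_key(df: pd.DataFrame, criterion: str = "performance",
                        min_item_validity: float = 0.10) -> dict:
    """Correlate each item with the criterion and keep items whose absolute
    correlation (item validity) meets the threshold; the correlation is used
    as the item weight in this simplified sketch."""
    key = {}
    for item in df.columns:
        if item == criterion:
            continue
        r = df[item].corr(df[criterion])   # item-criterion correlation
        if pd.notna(r) and abs(r) >= min_item_validity:
            key[item] = r
    return key

def score_applicants(df: pd.DataFrame, key: dict) -> pd.Series:
    """Sum criterion-weighted item responses to produce a biodata score."""
    scores = pd.Series(0.0, index=df.index)
    for item, weight in key.items():
        scores += weight * df[item]
    return scores

# Illustrative use on a small, made-up development sample:
dev = pd.DataFrame({
    "volunteered_projects": [4, 2, 5, 1, 3, 5],   # KSAO-type item (initiative)
    "safety_meetings":      [3, 1, 4, 2, 2, 5],   # task/job-based item
    "performance":          [4.1, 2.5, 4.8, 2.0, 3.2, 4.6],
})
key = build_empirical_key(dev)
print(key)                          # retained items and their weights
print(score_applicants(dev, key))   # weighted biodata score per applicant
```

In practice, empirical keys are developed on a large sample and then cross-validated on a holdout group, since item weights capitalize on chance; correlation-as-weight is only one of several keying options, and items correlated with protected-group characteristics would be screened out, as noted in the item-screening criteria above.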