SOCIAL MEDIA AND SELECTION: HOW DOES NEW TECHNOLOGY CHANGE AN
OLD GAME?
Julie Wade
JTWADE@uscupstate.edu
USC-Upstate
Phillip Roth
rothp@clemson.edu
Clemson University
Jason Bennett Thatcher
jthatch@clemson.edu
Clemson University
Monday, February 1, 2016
 Working Paper – please do not circulate without author’s permission 
ABSTRACT
A variety of sources indicate decision makers use social media, such as Facebook and
LinkedIn, to make decisions regarding potential employees. Unfortunately, there is scant
academic research on the implications of this practice. To shed light on the relationship between
social media and selection, we investigate whether applicants’ political attitudes and individuating
information (i.e., job-related information) found on social media impact decision makers’
evaluations of job candidates’ likeability, similarity, and “hireability”. To evaluate these
relationships, we conducted an experiment that manipulated the presentation of political attitudes
and individuating information on two social media platforms. Our results indicated that perceived
similarity influenced liking and, in turn, hireability across all of our political conditions, regardless of
the social media platform on which the information was viewed. Further, we found these effects even
in the presence of individuating information. The study has many implications for practice: it
indicates that political information on social media may influence hiring decisions and suggests a
need for future research on how to craft appropriate hiring policies.
Keywords: social media, social media platform, hiring decisions
INTRODUCTION
“A recent tweet by one would-be employee of Cisco Systems – a computer networking
company - may have cost him a job. A man who’s known on Twitter as “The Connor” posted this
after an interview: “Cisco just offered me a job! Now I have to weigh the utility of a fatty
paycheck against the daily commute to San Jose and hating the work.” A Cisco affiliate tweeted
back: “Who is the hiring manager? I’m sure they’d love to know that you’ll hate the work.
We here at Cisco are versed in the web.” (“Think Before You Post,” 2014). The Cisco story is one
of many stories found in the press of applicants losing potential jobs over pictures and comments
posted over Twitter, Facebook, and other social media platforms (Poppick, 2014; “The Facebook
Fired,” 2015; Ritter, 2014). In contrast, how decision makers view job applicants’ social media
presence, and how it impacts their decision making about individuals, is a phenomenon that has
received scant coverage in academic publications.
A number of studies indicate that decision makers view social media to learn more about
job seekers. Further, reports from decision makers suggest these social media assessments do
impact job seekers’ employment prospects. In fact, a 2013 survey conducted by the Society for
Human Resource Management (SHRM) indicates 77 percent of decision makers view social
media profiles to screen job applicants. In another survey, conducted by the American Bar
Association, 89 percent of decision makers reported checking social media, with over 79 percent of
them choosing Facebook as their first choice for gathering information about job candidates
(“Managing Your Online Image Across Social Networks”, 2011). Further, some surveys indicate 1 in 10 job
seekers lose a job opportunity because of their social media profile (Sherman, 2013) and a
Eurocom study of 300 European firms suggests that as many as 1 in 5 job seekers lost a job offer
due to their social media activity (Eurocom Worldwide, 2012).
That decision makers turn to social media is not surprising, because many social media
platforms contain a substantial amount of information about job seekers, including personal
interests and demographic information (Ellison & boyd, 2006). For example, LinkedIn, a
professional networking site, enables users to present their education, professional experience,
skills, previous employers, professional certifications, organizational memberships, and so
on. In contrast, while affording opportunities to disseminate professional information, Facebook, a
personal networking site, directs users’ attention to sharing personal stories, pictures, and
opinions. Irrespective of the platform’s orientation, creating a public profile opens social media
users’ private information to a sometimes unintended or unknown audience of current and future
employers, schools, parents, and so on.
Decision makers secure access to social media information in a number of ways. Some of
them request that job applicants “friend” human resource decision makers or log into a company
computer during the interview (McFarland, 2012). Others search for job seekers’ publicly
available user profiles to discreetly collect information on them. Although often talked
about, the practice of requesting applicant social media passwords (likened, by some recent
college students, to “protecting the keys to one’s house”) appears to be waning in popularity,
perhaps due to legislation that has been introduced or is pending in 28 states (National
Conference of State Legislatures, 2014; Ho, 2014). However, there is still no federal legislation in place that
makes this practice completely illegal (Dame, 2014), nor do many laws address the issue of
“friending” a job applicant to the organization.
Collecting social media information introduces ethical and practical problems for decision
makers. When using social media to conduct background checks, decision makers may gather
protected information on job seekers, particularly demographic characteristics, such as age, race,
gender, or even political beliefs, that they are not allowed to request in an application or interview.
For example, decision makers can read status updates, photo captions, notes, and the like, thought to
indicate users’ attitudes (for instance, an applicant’s stance on homosexuality) that have no
bearing on how competent an applicant may be at a potential job. Even though such information is not
relevant to job performance, many recent studies indicate decision makers do not hire job seekers who have
“troubling” Facebook profiles; for instance, decision makers avoid job seekers who post
pornographic pictures or use high levels of profanity (Erwin, 2014). Unsurprisingly, studies also
indicate decision makers react unfavorably to profiles that depict a job candidate as consuming
various levels of alcohol (Bohnert & Ross, 2010; Peluchette & Karl, 2008).
Though some practitioner research suggests decision makers screen out applicants with “red
flag” behaviors, much less is known about how they respond to information indicating that job
applicants are similar to, or different from, the decision maker, such as an
applicant’s political attitudes. For example, in light of recent political and social events, it might
be unsurprising for a decision maker to find a job seeker who supports legalizing marijuana to
update her status to reflect this, or another job seeker who strongly supports gun control laws to air
his attitudes openly on Facebook. Understanding the implications of viewing such opinions is
important, because social media encourages discussion and expressing opinions.
Within the context of hedonic and utilitarian social media platforms, we examine how
viewing job seekers’ expressions or statements about social and political issues may impact
decision makers’ assessments of perceived similarity to job applicants and associated evaluations
of the job applicants’ hireability. Broadly, we investigate whether feelings of perceived similarity,
formed from viewing information about job applicants on social media, influence how
decision makers view the applicants and, as a result, whether they choose to hire (or not hire) them.
Specifically, this study examines the perceived similarity issue from a political standpoint, asking: do
attitudes expressed by job seekers on social media sites impact decision makers’ screening of
applicants, particularly attitudes about gun control laws, legalizing marijuana, and the Affordable
Care Act (often referred to as “Obamacare”)?
The results of this study provide important theoretical contributions to MIS. The study
contributes to the field by shedding light on how information found on social media affects important
business decisions, in contrast to the broader social media literature that often focuses on issues
such as impression management/identity formation, networks and network structures, and privacy
issues (Ellison & boyd, 2006). Though some literature streams in our field do focus on how IT can
improve decision-making, the role of social media, which is increasingly pervasive in its use, as a
conduit for decision-making is largely ignored, particularly from a human resources standpoint.
This is troubling, given the large number of decision makers who claim to use social media to aid
in decision making (SHRM, 2013; American Bar Association, 2011). The study also contributes by
highlighting the importance (or lack thereof) of the social media platform and the role the platform
may play in hiring decisions.
The remainder of this manuscript unfolds as follows. First, we review relevant theories
and studies surrounding social media, decision-making (particularly, demographic similarity
theories) and individuating information that informed our model and hypotheses’ development.
Then we present our experimental design. Next, we present our hypothesis tests and discuss the
results. We conclude by presenting implications for research and practice.
REVIEW OF SOCIAL MEDIA, DECISION-MAKING AND INDIVIDUATING
INFORMATION LITERATURES
In this section, we review the relevant literature in social media and theories of decision-making
and individuating information that inform our research model and hypotheses, presented in
Figure 1.
“Social media” is an umbrella term used to describe social software, “various, loosely
connected types of applications that allow individuals to communicate with one another, and to
track discussions across the Web as they happen” (Tepper, 2003, p. 19), and social networking
websites (that focus on user relationships and networking), such as Facebook and LinkedIn
(Barnes, 2006). Social media may also be defined as “a group of Internet-based applications that
build on the ideological and technological foundations of Web 2.0, and that allow the creation and
exchange of User Generated Content” (Kaplan & Haenlein, 2010, p. 61).
User profiles are a defining characteristic of social media. Users often have more than one
profile, and profiles may interconnect across multiple social media platforms (for instance, applicants
may share Instagram photos on LinkedIn or Facebook) (boyd & Ellison, 2008). A user’s profile
shapes how one is perceived on social media (i.e., users “type oneself into being” on a platform)
(Sunden, 2009). User profiles may be text-based or multimedia-based and are generally set to
default privacy settings that are public or semi-public in nature (though privacy settings in most
platforms are customizable). The public nature of these user profiles grants decision makers
potentially easy access to personal information, user attitudes and the user’s network of
connections, including friends, family members and coworkers (boyd & Ellison, 2008), which
may impact how decision makers evaluate job applicants.
Empirical academic research has been mostly silent on the issue of how decision makers
use social media in employment decisions. There is a small literature that examines employment,
impression management/identity formation, networks and network structures, and privacy issues
(Ellison & boyd, 2006). For example, a study of 236 undergraduate students found that, even after
reading a vignette about the dangers of disclosing personal information, respondents still disclosed
all the information they were asked about (Nosko, 2010). Yet, the role of how decision makers
perceive social media user behaviors and activities, as well as how this impacts their hiring
decisions, is largely missing from academic social media literature to date.
In contrast to the academic literature, given the widespread use of social media in
recruiting, it is not surprising that a large practitioner literature has surfaced that focuses on
“how-to” guides for decision makers (Bates, 2013; Charlton, 2012) or reviews how decision
makers use social media, discussing who uses it, what platforms they use, what information they
take in about candidates, and so on (Schwabel, 2012). Despite the advice coming from many
sources, there is little in the way of rigorous empirical research to support one view or another.
Such research is important because theories of bounded rationality and demographic
similarity theory suggest that social media may introduce unwanted information into the selection
process that could bias decision making. Through social media, the decision maker receives a
“snapshot” of the applicant (in conjunction with information about other job applicants also being
considered) and the snapshot may contain varying amounts of job related information (McCarthy
et al, 2010). Further, decision makers may even find complex and contradictory information about
job applicants on social media (Simon, 1982). Hiring decisions based on
examination of social media sites likely expose decision makers to very large amounts of
information, spanning potentially years of an applicant’s life, that is likely not uniform. For
example, individuals may post different types of information, with one individual posting many
blogs on current affairs while another posts hobby-related information. In sum, the sheer volume
of information, and the likely inconsistency of the information on applicants available on social media,
likely taxes decision makers’ abilities to make good decisions about applicants.
To better handle this volume of information, decision makers may use heuristics, or
mental shortcuts, particularly for routine, simple decisions, such as screening out job applicants
via negative information found on social media profiles. A decision maker, who has limited time
or information processing ability, may focus on the most salient information in the social media
profile (which may not be job related, e.g., membership in a certain religious group), which will
“cue” his hiring evaluations. For example, a decision maker might immediately screen out an
applicant who constantly posts political views that oppose his own, because these cues are most salient for
the decision maker. While heuristics can be beneficial, allowing decision makers to save
deeper cognitive processing for more complex decisions (Hastie & Dawes, 2009), they may
introduce problems into the selection process, such as “liking” some job applicants more
than others irrespective of their job qualifications, a phenomenon that can be explained by
Demographic Similarity Theory.
Perceived Similarity → Liking → Hireability Relationship
Demographic similarity theories describe one class of variables that help understand why
decision makers might tend to like certain job applicants more than others. The theoretical position
suggests that, when forming an impression, individuals make observations about how similar they
perceive they are to another person (often referred to as “demographic similarity”), preferring
those individuals with whom they have commonalities. This theoretical position has often been
used to explain the influence of variables such as gender and ethnicity (Harrison et al, 1998; Sacco
et al, 2003).
Two theories underpin the emphasis on similarity and demographic similarity. Social
Identity Theory maintains that decision makers form their personal identities through the groups
(and categories of individuals) they associate with, from work groups to extracurricular activities
(Tajfel, 1982; Tajfel & Turner, 1986). To maintain a positive sense of identity and decrease
cognitive dissonance, decision makers will ascribe positive characteristics to those employees who
belong to the same groups and negative characteristics to employees who do not (Abrams &
Hogg, 1988). Social Identity Theory has given rise to a rich literature in Organizational Behavior
and Social Psychology (e.g., see the review of Ashforth & Mael, 1989 as well as work by Opotow,
1990a,b).
Likewise, the Similarity Attraction Paradigm suggests that, when decision makers perceive
they are similar to job applicants, they will tend to favor them (e.g., Byrne, 1971). For example,
decision makers who are similar to applicants will likely understand and like those applicants
better than those job applicants who are perceived as dissimilar. Dissimilar applicants may be
regarded as illogical, uninformed or not liked (e.g., see Reskin et al, 1999 in the sociology
literature). We return to this theory in more depth below.
Social Identity Theory and the Similarity Attraction Paradigm have given rise to many
theories in Organizational Behavior. Relational Demography Theory emphasizes the variety of
demographic characteristics that might influence decisions, including variables such as age in
addition to ethnicity and gender, as well as emphasizing the cumulative nature of demographic
differences (Tsui, Egan, & O’Reilly, 1992; Tsui & Gutek, 1984; Tsui & O’Reilly, 1989).
Likewise, the faultlines literature in OB focuses on diversity, and the influence of diversity as
embodied through demographic differences, on group performance (Lau & Murnighan, 2005; S.
Thatcher and Patel, 2011). Demographic differences are also theorized to influence group
attitudes and group learning such that groups with individual members with strong social identities
across multiple demographic differences have potential faultlines that may be triggered by internal
or external events and degrade group performance. Thus, a wide variety of theories emphasize the
similarity of individuals and decision makers and serve as a basis for our work.
However, it is important to note that similarity goes beyond demographic variables such as
gender and ethnicity (or age); decision makers may be swayed, for example, by “verbal and
nonverbal communication, physical appearance and dress” (Graves & Powell, 1995, p. 86), and
this may include sharing political attitudes. Given that other non-biological variables may
influence this sense of perceived similarity, it is somewhat surprising that research in most fields
(including OB and IS) has largely ignored the role of political attitudes in assessments of
similarity. Some social psychologists treat political attitudes as part of a larger domain of
personal attitudes, "an individual's tendency or predisposition to evaluate an object or the symbol
of that object in a certain way" (Katz & Stotland, 1960, p. 428). Such attitudes, when perceived to
be conflicting with the decision maker’s views, may negatively impact work-related outcomes;
likewise, when an individual perceives another individual to be likeminded or attitudinally similar,
this can lead to positive work-related outcomes (Byrne, 1971).
Similarity theories have the potential to extend understanding of the connection between
decision makers’ viewing social media posts and their assessments of job seekers. Through social
media, individuals disclose their attitudes about a variety of topics, through posting memes and
pictures, status updates, comments on walls, sharing articles and so on, including political attitudes
about gun control, marijuana usage and the Affordable Care Act (or “Obamacare”). Further,
similarity theories suggest that through this social interaction (a one-sided interaction of
reading a social media profile), a decision maker may develop a liking for a job candidate who is
predisposed towards similar views, including similar views on “hot button” political issues such as gun
control laws (Engle & Lord, 1997).
While there are several similarity and identity theories, we use the Similarity Attraction
Paradigm to develop our hypotheses. We do so because the Similarity Attraction Paradigm notes
the key contribution of the variable of liking (Byrne, 1971). That is, the perceived similarity of
one person to another is a descriptive theoretical state involving an overall assessment of
similarity. Decision makers are then thought to add affect to this state along a positive-to-negative
continuum such that the decision maker “likes” (or dislikes) a particular applicant. So, for a
variety of reasons, such as possessing a positive identity or personal attraction, decision makers
“like” job applicants who they perceive are similar to them.
In a social media context, practitioner reports suggest that decision makers are likely privy
to information cues made publicly accessible by applicants through social media profiles
(SHRM, 2013; American Bar Association, 2011). These cues give rise to perceived similarity of
the decision maker to the job applicant. Following this judgment of similarity, decision
makers will develop feelings of liking (or disliking) toward applicants. With this in mind, we
hypothesize:
H1: Perceived similarity positively influences liking of job applicants.
Further, when decision makers perceive they are similar to and like a job applicant, the
Similarity Attraction Paradigm suggests this liking will positively influence subsequent judgments
of the job applicant (Goldberg, 2005). Liking has been “…related to many positive work-related
outcomes, such as more positive superior–subordinate and mentoring relationships,
communication, and job satisfaction” (Sacco et al, 2003, p. 853). Further, the Similarity
Attraction Paradigm suggests that liking fully mediates the relationship between perceived
similarity and various ratings by decision makers. Thus, theoretically there is no need for a direct
path from perceived similarity to ratings in this model.
In this study, we focus on decision makers’ evaluations of how “hireable” they rate a job
applicant to be, and we consider this from the angle of how well a decision maker believes an
applicant will accomplish responsibilities and job duties, often called “task performance” in the
OB and HR literatures. In addition, theory about the nature of job performance has
also expanded. Theoretical and empirical work in the last couple of decades has delineated a
second “dimension” of job performance (Motowidlo & Van Scotter, 1994). This dimension of
job performance involves how well an individual goes “above and beyond” the job description
activities to help others or the organization as a whole. For example, a salesperson may informally
counsel another salesperson not in their area to help that person perform their job better (and help
the organization perform better as well), even though it is not in the job description and even
though they do not receive any formal credit for engaging in such activities. In our case, we ask a
decision maker how well a potential applicant will perform the tasks of the jobs as well as asking
how the potential employee will treat other employees and go “above and beyond” to be a good
organizational citizen, exhibiting “Organizational Citizenship Behaviors” (OCB) (Organ, 1997; see also
Van Dyne et al, 1994; Podsakoff et al, 2000).
As noted above, there is a rich literature examining the impact of perceived similarity and
liking on job-related outcomes, especially in interviewing and selection, and a degree of empirical
support for these relationships. However, the empirical results of many studies of
demographic similarity are decidedly mixed (McCarthy et al, 2010), and this is generally
attributed to the amount of structure employed: more structured/standardized interviews,
with clear procedures and lengthy training processes, did not find meaningful relationships
between demographic variables, perceived similarity, liking, and selection (Sacco et al., 2003). The
social media context is important because using social media for making hiring decisions is notably
lacking in structure, with over 54 percent of organizations claiming they do not have any sort of
social media policy in place to use for hiring decisions (SHRM report, 2014). Even having a
policy does not guarantee that the policy applies directly to hiring procedures or helps to structure
those activities. The lack of structure allows non-job-related information to potentially have
greater influence on ratings (Huffcutt & Roth, 1998). For example, consider a social media
situation in which decision makers, who research multiple candidates, are inundated with a vast
amount of information about job applicants, straining their cognitive capabilities. In such cases,
there is little to restrain factors, including a more general affective state (liking), from influencing
ratings of hireability. In this instance, job-irrelevant information on issues such as political
affiliation may have a strong influence on whether a decision maker likes an applicant and rates
that applicant well or poorly, based on how well the decision maker perceives the applicant will do the
job or how good an organizational citizen the applicant will be when hired.
Given the Similarity Attraction Paradigm and empirical work in less structured
employment decision making situations, we hypothesize:
H2: Liking of job applicants positively influences hireability ratings for (a) task and (b)
organizational citizenship behaviors (OCB).
Individuating Information → Hireability Relationship
Individuating information (e.g., job-related information about an applicant, such as
knowledge, skills, or abilities) will also likely play a role in decision makers’ choices (McCarthy et
al, 2010). Theory explicates the importance of individuating information in many ways.
Expectation States Theory suggests individuating information may be thought of as a “direct status
cue” of likely levels of job performance (Berger, Rosenholtz, & Zelditch, 1980; Berger, Wagner,
& Zelditch, 1985; Berger & Webster, 2006; Correll & Ridgeway, 2003). Hence, the chain of reasoning
between the cue (e.g., writing ability) and job performance is short,
straightforward, and compelling. In contrast, cues such as ethnicity or gender are “diffuse status
cues” that are only indirectly related to performance variables (i.e., the logical chain of
relationships is longer and less direct). For example, an applicant may be a minority who is
stereotypically thought to write more poorly, and therefore assumed to perform more poorly.
Importantly, decision makers generally pay more attention to direct status cues than diffuse status
cues when both are present (e.g., Freese & Cohen, 1973).
Parallel processing theories also emphasize the importance of individuating information.
These theories suggest that decision makers process categorical information (e.g., race, gender) at
the same time as they process individuating information to arrive at an overall judgment (e.g.,
Kunda & Thagard, 1996, see also Kunda & Sinclair, 1999; Kunda, Davies, Adams, & Spencer,
2002). That is, we propose that political attitudes will invoke category-based processing that
might involve stereotypes and similarity while decision makers simultaneously attend to individuating
information. For example, a decision maker may see that an applicant is affiliated with the
Democratic Party and this could arouse category-based processing around attitude and value
similarity. At the same time, the decision maker is processing individuating information (e.g., the
applicant made the dean’s list) that might involve the level of qualifications an individual
possesses. Likewise, a decision maker might consider how well an employee met their sales
quotas when considering promoting them to the job of regional manager, while also being
aware that the person is a Republican.
Finally, controlled processing models from Social Psychology suggest that stereotyping
and related mental processes might be controlled by many individuals (e.g., Hilton and Fein,
1989). That is, more controlled mental processing (as opposed to automatic processing) can be
involved in the actual use or potential use of stereotypes. One motivation for such controlled
processing is that most people do not wish to be considered prejudiced/biased and prefer to be
thought of as fair (e.g., Monteith, Sherman, & Devine, 1998). For example, roughly 80% of
people studied contrasted how they might respond versus how they should respond to suspect
groups, such as African Americans or homosexuals, and dissonance associated with such states are
associated with guilt and discomfort (e.g., Monteith & Mark, 2005). People can use controlled
processing to periodically check if they believe stereotypes are operating and influencing their
judgment. Further, such states can lead to retrospective reflection such that alternative behaviors
can be exhibited in the future (e.g., Monteith & Mark, 2005). Such effects are particularly salient
when individuals are motivated to make important and accurate assessments (Kunda et al., 2002)
such as hiring or promotion decisions. In our case, it is possible that individuals attempt to control
the influence of political attitudes on hiring recommendations.
Research in the organizational sciences indicates individuating information can have powerful
effects (Landy, 2005, 2010). For example, individuating information can swamp
gender-related information (Locksley, Borgida, Brekke, & Hepburn, 1980), and meta-analyses of
experimental studies suggest individuating information is roughly eight times more powerful than
gender information in hiring scenarios (Olian, Schwab, & Haberfeld, 1988). Other types of
individuating information might include personality (Caldwell & Burger, 1998) and cognitive
ability (Schmidt et al, 2007), or actual performance in performance appraisal ratings (Landy &
Farr, 1980).
The social media literature has only begun to address the issue of individuating
information. Social media may contain individuating information about job applicants. Through
social media, job applicants routinely post information such as education and work experience;
judgments may also be made about verbal/writing ability and so on. For example, Kluemper and
Rosen (2009), using a sample of undergraduate students, found that observers could discern Big
Five personality traits from social media profiles with reasonable accuracy (when 4-6 observers rated
such characteristics). Such information might relate to subsequent job performance.
Unfortunately, the social media literature has seldom addressed the issue of how category-based
processing (e.g., of demographic information) and individuating-information-based processing might
relate to each other; the literature appears to be silent on this issue at this time.
Based on Expectation States Theory, dual processing theories, and results from OB and
HR, we hypothesize:
H3: Individuating information influences hireability ratings for (a) task and (b) organizational
citizenship behaviors (OCB).
Before leaving the topic of individuating information, we make a final point: another
reason to incorporate the variable of individuating information in our research model is to
test whether political attitude information has an effect independent of the individuating information.
Given that individuating information tends to diminish the effect of demographic similarity on
employment decisions, we thought it important to give our study the opportunity to show
that political attitude information could still have an effect even when job-related information appears
in our social media profiles. If so, this would be evidence of the potentially potent role
that political attitude information can play even in the presence of individuating information, and would
reinforce the importance of political attitude-based information in social media.
Platform’s Moderating Influence
The social media platform (the website, or varying interfaces and software combinations
through which user-generated content, interaction and connectivity are communicated; examples
of well-known platforms include Instagram, Tumblr, Facebook and LinkedIn) on which decision
makers view information may impact their decision-making. Given the large number of social
media platforms (some sources indicate hundreds exist at any given time and job applicants may
be members of multiple platforms), social media platforms may be classified in a variety of ways,
including by similar and differing feature sets. Some platforms, especially Facebook and
LinkedIn, have very similar feature sets (such as text and multimedia capabilities), though the
user’s intended reason for use is often notably different (boyd & Ellison, 2008). For instance,
LinkedIn is mostly utilitarian, used for building professional networks and displaying one’s
resume. Facebook, in contrast, is more hedonic: users share details about their daily lives,
professional or otherwise, may consider the social network “fun” and “entertaining” (Beer, 2008;
van der Heijden, 2004; Babin, 1994), and often use it to communicate with friends and relieve
boredom (Lampe et al., 2008).
The argument may be made that platforms such as Facebook and LinkedIn are intended
for different (hedonic versus utilitarian) purposes, and that the different framings of information
presented on different websites may affect feelings of liking, as well as strengthen or weaken the
influence of individuating information. For example, since LinkedIn is intended for professional
networking and reads much like an employment resume, decision makers may approach this
platform with a more objective, professional outlook, focusing more on individuating information
and paying less attention to personal feelings of “liking” a job applicant. Conversely, since
Facebook is intended to be the “fun, frivolous” platform, decision makers may be more likely to
attend to “extraneous” information, like political attitudes. Moreover, they may even expect to
find “red flag” behaviors on Facebook, such as excessive profanity, and may take individuating
information less seriously on such a lighthearted platform. We hypothesize:
H4: The social media platform will moderate the perceived similarity and liking relationship. The
use of a Facebook platform will be associated with a stronger relationship than the use of a
LinkedIn platform.
H5: The social media platform will moderate the relationship between individuating information
and hireability ratings for (a) task and (b) organizational citizenship behaviors (OCB). The
LinkedIn platform will increase the strength of this relationship, while the Facebook platform will
decrease it.
The constructs used in our research model, as well as their definitions and operationalizations, are
summarized in Table 1.
RESEARCH METHOD
Experimental Task and Design
Since our model posits that perceived similarities and individuating information influence
decision makers’ evaluations, we designed an experiment whose task included both types of
information in a social media context. Further, the experiment presented individuating
information and political attitudes on Facebook and LinkedIn to isolate effects of platform on
candidate evaluations (see Appendix A: Experiment Development for details on the development
of the experiment and the social media profiles).
Construct/Variable          Constitutional Definition                                        Operationalization
Perceived Similarity        The extent to which decision makers perceive themselves as       Tepper, Moss, and Duffy (2011)
                            similar (or alike) to a job applicant (Engle & Lord, 1997)       perceived similarity measure
Liking                      An attraction or positive feeling towards another individual     Wayne and Ferris (1990)
                            (Byrne, 1961)                                                    liking measure
Hireability                 How well a job applicant is expected to perform job              Williams and Anderson (1991)
                            responsibilities (task performance) and be a good                hireability measure
                            organizational citizen (organizational citizenship behaviors)
                            (Organ, 1997; Podsakoff et al., 2000; Dyne et al., 1994)
Social Media Platform       The website, or varying interfaces and software combinations     Facebook and LinkedIn
                            through which user-generated content, interaction and
                            connectivity are communicated
Individuating Information   Job-related information (for example, knowledge, skills and      Low and high levels of
                            abilities, among other attributes) (McCarthy et al., 2010)       individuating information
Table 1. Construct Definitions and Operationalizations
Figure 2. Overview of the Methodological Development
Measures and Manipulations
Measures.
Established scales were used and slightly adapted to measure perceived similarity (Tepper
et al., 2011, α = .96), liking (Wayne & Ferris, 1990, α = .86), and hireability (both task and
organizational citizenship behaviors, from the Williams & Anderson, 1991, scale, α = .75).
Please see Appendix C: Perceptual Measures for the scales and the changes made to them, and
Appendix D: Confirmatory Factor Analysis for scale reliabilities and validities.
Experimental Manipulations.
Social media profiles were created that emulated the Facebook and LinkedIn
environments without being “intrusive” (i.e., without asking a job applicant for a social media
password or “friending” the applicant; instead, our subjects viewed publicly accessible,
fictionalized social media profile pages). First, we considered that many Facebook and LinkedIn
profiles use “privacy settings” that allow only certain information to be viewed globally, or
publicly (for example, a Facebook user may elect to make some information available worldwide
and other cues available only to “all friends” or “select friends”). Next, even profiles that are
entirely accessible generally display only recent information, requiring considerable effort to pull
up data beyond the previous month. Finally, since the study’s objective was to evaluate
decision-making, all information provided on profiles was presented in the same manner (for
example, the applicant posted the same article on both platforms) or was held constant (for
example, profile pictures were the same in all conditions within one of our studies).
Next, we identified three salient political issues as stimuli for cueing judgments of
similarity in the research model. Each political condition presented attitudes supporting or
opposing an important political issue (marijuana legalization, gun control laws, or the Affordable
Care Act), crossed with high and low levels of individuating information to test for the influence
of work-related attributes of our fictionalized job applicants. To measure the impact of platform,
we created profiles in both Facebook and LinkedIn: 12 profiles for each platform, or 24 profiles
in all. Table 2 shows the experimental selections and manipulations.
Factor                      Factor Type   Selection/Manipulation   Level
Political Attitudes         Between       For                      1
                            Between       Against                  2
Individuating Information   Between       High                     1
                            Between       Low                      2
Platform                    Within        Facebook                 1
                            Within        LinkedIn                 2
Table 2. Factor Table
These experimental selections correspond to a 2 x 2 x 2 design. Political attitudes (“for”
or “against” one of three political issues: legalizing marijuana, gun control laws, and the
Affordable Care Act) and individuating information (“high,” containing job-related information,
or “low,” containing information unrelated to one’s employment) were between-subjects factors,
and platform (Facebook or LinkedIn) was the within-subjects factor. Within each platform,
subjects were divided into groups that viewed profiles containing individuating information
(“high”) or not (“low”; these profiles contained innocuous, non-job-related information). Profiles
reflected the three political conditions (legalizing marijuana, gun control laws, and the
Affordable Care Act), though we assumed (and tested) that there was no variation between
conditions (i.e., we tested that our results were pervasive and not specific to, for example, the
legalizing marijuana attitudes alone; hence the design is a 2 x 2 x 2 design rather than a
2 x 2 x 2 x 3 design). Examples of the conditions and their levels used in each experiment can be
found in Appendix A.
The experiment was a post-test-only randomized design that randomly assigned
participants to conditions (Campbell et al., 1963; Shadish et al., 2002).
Participants were randomly assigned to different treatments (i.e., viewed the social media
profiles). Then participants were asked to respond to the manipulation checks and survey
questions corresponding to the experimental conditions applied. Previous social media studies
used similar designs (Mazer et al, 2007; Kluemper & Rosen, 2009). Figure 3 presents an example
of a social media profile created in the legalizing marijuana condition, with the manipulations
highlighted.
RESULTS
Prior to running the full experiment, our measures were pretested and pilot tested (see
Appendix E). This section discusses our sample, manipulation checks, assumption checks,
confirmatory factor analysis, and hypotheses tests.
Sample Characteristics
Our participants consisted primarily of graduate students from a Southeastern university in
the United States, recruited under the assumption that they would have previous hiring
experience, and decision makers from a local company (n > 300 employees) reached through
contacts of the dissertation author. Importantly, about 66% of our participants had direct
experience conducting hiring interviews. Thus, our sample was somewhat older, more educated,
and more experienced with actual HR practices than the samples of many laboratory experiments
(e.g., Kluemper & Rosen, 2009), and we controlled for these sample characteristics in our
analysis. Sample characteristics
are described in Table 3. Out of 270 invited participants, a total of 191 individuals participated in
our survey (71% response rate).
[Figure content omitted: a fictionalized profile in which most posts are innocuous; the status
update shows the applicant strongly supports legalizing marijuana (political issue manipulation),
and the profile reflects the “low” individuating information condition.]
Figure 3. Sample Social Media Profile for the Legalizing Marijuana Condition
Assumption Tests
We conducted several assumption tests. We estimated skewness and kurtosis; the values
fell within the required bounds (all were < 3 on skewness and, though there is some debate as to
what makes kurtosis a “problem,” all fell under the conservative rule of thumb of < 10.0) (Kline,
2011). We also tested for multivariate normality using Mardia’s coefficient, a statistic that
identifies nonnormality of data and the cases that contribute most to it (Ullman, 2006), and found
our normalized estimate to be over the recommended value of 3.00, even after deleting an outlier,
indicating nonnormal data (Ullman, 2006). Hence, we used the robust ML method in EQS for our
confirmatory factor analysis and report the Satorra-Bentler chi-square statistic (Byrne, 2006).
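These screening steps follow standard formulas; a minimal sketch (function and variable names are ours, and the simulated data are purely illustrative, not our study data) computes per-item skewness/kurtosis and Mardia's multivariate kurtosis with its normalized estimate:

```python
import numpy as np
from scipy import stats

def normality_screen(X):
    """Univariate skewness/kurtosis per item, plus Mardia's multivariate
    kurtosis expressed as a normalized (z) estimate."""
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    skew = stats.skew(X, axis=0)
    kurt = stats.kurtosis(X, axis=0, fisher=False)  # Pearson definition
    # Squared Mahalanobis distance of each case from the centroid; large
    # values flag the cases contributing most to nonnormality.
    Xc = X - X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False, bias=True))
    d2 = np.einsum("ij,jk,ik->i", Xc, S_inv, Xc)
    b2p = np.mean(d2 ** 2)  # Mardia's multivariate kurtosis
    z = (b2p - p * (p + 2)) / np.sqrt(8 * p * (p + 2) / n)  # normalized estimate
    return skew, kurt, z

# Illustrative data: multivariate normal responses plus one extreme outlier,
# which drives the normalized estimate well past the 3.00 cutoff.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 4))
X[0] = 25.0
skew, kurt, z = normality_screen(X)
print(z)
```

A normalized estimate above 3.00, as here, would motivate the robust estimation described above.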
Scale Reliability and Validity
To assess the dimensionality and reliability of the measures, we conducted a confirmatory
factor analysis (CFA). We found acceptable fit, with all scales yielding reliabilities over .8.
Further, the average variance extracted (AVE) of each scale exceeded .63, above the .5 threshold
required to indicate convergent validity. Finally, the square root of the AVE for each construct
was higher than its correlations with other constructs, suggesting discriminant validity (see
Appendix E for more information on the measurement model).
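The convergent and discriminant validity checks above follow conventional formulas (AVE as the mean squared standardized loading; the Fornell-Larcker criterion comparing sqrt(AVE) to inter-construct correlations). A minimal sketch, using made-up loadings and correlations rather than our actual estimates:

```python
import numpy as np

def ave(std_loadings):
    """Average variance extracted: mean squared standardized loading."""
    lam = np.asarray(std_loadings, dtype=float)
    return float(np.mean(lam ** 2))

def fornell_larcker_ok(aves, corr):
    """Discriminant validity: sqrt(AVE) of each construct must exceed its
    correlations with every other construct."""
    root = np.sqrt(np.asarray(aves, dtype=float))
    corr = np.asarray(corr, dtype=float)
    k = len(root)
    return all(root[i] > abs(corr[i, j])
               for i in range(k) for j in range(k) if i != j)

# Hypothetical standardized loadings for three constructs, and a
# hypothetical inter-construct correlation matrix.
aves = [ave([.85, .80, .82]), ave([.78, .81, .75]), ave([.88, .84])]
corr = np.array([[1.0, .45, .50],
                 [.45, 1.0, .40],
                 [.50, .40, 1.0]])
print([round(a, 2) for a in aves])     # each exceeds the .5 threshold
print(fornell_larcker_ok(aves, corr))  # True
```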
Common Method Bias and Social Desirability Bias
We used procedural and statistical remedies to control for bias. To control for common
method bias, we used multiple methods of measuring our perceptual constructs, protected
respondent anonymity, and reduced evaluation apprehension (Podsakoff et al., 2003). When
informing subjects of their IRB rights in the informed consent letter, we assured respondents that
their answers were anonymous and that all personally identifying information was secure and
confidential; we repeated this guarantee in the recruiting email. We also told our participants that
there were no right or wrong answers and to respond to all questions as honestly as possible.
Because we manipulated political issues, we also controlled for social desirability bias. These
precautions reduced the likelihood that method effects biased our results.
Table 3. Sample Characteristics
Hypotheses Testing
We used structural equation modeling (SEM) to evaluate the measurement and structural
models (Anderson & Gerbing, 1988), using EQS (version 6.1, build 97) (Bentler, 1995).¹ We ran
the model (in Figure 1) on responses to social media profiles created in the legalizing marijuana,
gun control, and Affordable Care Act (“Obamacare”) conditions. Overall, the model
Satorra-Bentler chi-square statistic was 1882.35 (p = .04) (Satorra & Bentler, 1988). The model
fit and hypotheses tests (not including interactions) are reported in Tables 4 and 5.
Fit Index                              Definition                                                      Good          Our Study
Satorra-Bentler Chi-square             Measures goodness-of-fit for small sample sizes, large          NA            1882.35
(S-B Chi-square)*                      models and/or nonnormal samples (Satorra & Bentler, 1988)
Bentler's Comparative Fit Index        An incremental or comparative fit index that assesses fit of    >.95          0.92
(CFI)*                                 a structural model in comparison with the null model
                                       (Satorra & Bentler, 1988)
Root Mean Square Error of              A "badness" of fit index, where values closer to 0 are          ≤.08;         0.05;
Approximation (RMSEA)*,                better; includes model parsimony and sample size                .05, .1       .05, .06
RMSEA Intervals*                       (Kline, 2011)
Standardized Root Mean Square          The square root of the difference of residuals of the sample    ≤.08          0.08
Residual (SRMR)                        covariance matrix and hypothesized model (Hooper et al., 2008)
* = robust fit indices
Table 4. Fit Indices and Model Fit
Legalizing Marijuana Condition
Hypothesis 1 stated that perceived similarity influences liking of job applicants. Our SEM
tested this hypothesis when subjects were shown social media profiles of job applicants who
posted status updates supporting or opposing legalizing marijuana. The analysis revealed support
for hypothesis 1 (b = .49, β = .72, SE = .06, t = 8.74, p < .001). Hence, in a social media context
where job applicants post about legalizing marijuana, respondents appear to like applicants
whom they perceive to be more similar to them.
¹ For our parameter estimates, there was evidence of suppression, and non-robust estimates were
therefore more reliable.
Model Fit
Normal Chi-square:    5523, d.f. = 1431, p < .001
S-B Chi-square*:      1882.45, d.f. = 1304, p = .04
CFI*:                 0.92
RMSEA*:               0.05
RMSEA Intervals*:     .05, .06
SRMR:                 0.08

Tests of Hypotheses
              b (β)           S.E.          t-value         p-value
Legalizing Marijuana
H1            0.49 (.72)      0.06 (.06)    8.74 (8.18)     **
H2a           1.15 (.67)      0.21 (.25)    5.57 (4.6)      **
H2b           0.99 (.67)      0.18 (.21)    5.59 (4.7)      **
H3a           0.08 (-.04)     0.14 (.14)    -0.59 (-.59)    >.05
H3b           0.14 (.07)      0.13 (.13)    1.1 (1.11)      >.05
Gun Control Laws
H1            0.37 (.77)      0.05 (.06)    7.68 (6.64)     **
H2a           1.26 (.5)       0.32 (.35)    3.97 (3.57)     **
H2b           1.15 (.57)      0.27 (.28)    4.32 (4.19)     **
H3a           0.8 (-.33)      0.15 (.15)    -5.34 (-5.38)   **
H3b           0.43 (-.22)     0.12 (.12)    -3.52 (-3.54)   **
The Affordable Care Act ("Obamacare")
H1            0.51 (.78)      0.05 (.06)    9.47 (9.19)     **
H2a           1.64 (1.02)     0.23 (.42)    7.02 (3.94)     **
H2b           1.13 (.85)      0.17 (.21)    6.62 (5.51)     **
H3a           0.83 (.4)       0.12 (.12)    7.13 (7.26)     **
H3b           0.66 (.37)      0.1 (.1)      6.44 (6.43)     **
Parentheses = standardized estimates; * = robust fit indices; ** = p < .001
Table 5. Model Fit and Parameter Estimates
Hypothesis 2a stated that liking job applicants influences hireability ratings of task
behaviors. Our model showed a significant relationship with task performance (b = 1.15, β = .67,
SE = .21, t = 5.57, p < .001), so hypothesis 2a was supported: our subjects gave likeable job
applicants significantly higher hireability ratings of expected task performance. Hypothesis 2b
(involving organizational citizenship behaviors) was also supported (b = .99, β = .67, SE = .18,
t = 5.59, p < .001), indicating this relationship also exists for organizational citizenship
behaviors. Hypotheses 3a and 3b, that individuating information (job-related information about
our job applicants) leads to more positive hireability ratings, were not supported in this model for
either task performance (b = .08, β = -.04, SE = .14, t = -.59, p > .05) or OCB (b = .14, β = .07,
SE = .13, t = 1.10, p > .05). Individuating information was dummy-coded to reflect profiles that
were “low” in individuating information (=0) and “high” in individuating information (=1).
To test hypotheses 4, 5a, and 5b, we created interaction terms and ran a general linear
model in SPSS using composite variables, following it up by examining the simple slopes of the
significant interactions. We also tested the interactions using latent variables in our structural
model in EQS and by running a multilinear model using composite variables. We dummy-coded
the platform measure (0 = Facebook, 1 = LinkedIn) and individuating information (0 = low
individuating information, 1 = high individuating information). Hypothesis 4 posited that social
media platform, Facebook or LinkedIn, moderates the perceived similarity and liking
relationship. We examined the standardized residuals for outliers (values higher than 3 are
considered outliers) and removed one case from this analysis.² Using the interaction term, we
found that the relationship was not significant, F(1, 126) < 1.0, p = .72. Overall, the platform,
whether Facebook or LinkedIn, did not impact the relationship between similarity and liking in a
social media context in the legalizing marijuana condition.
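The moderation tests here amount to comparing a main-effects model against one adding an interaction term. A minimal sketch of that nested-model F-test (on simulated data with illustrative names, not our SPSS analysis or study data):

```python
import numpy as np
from scipy import stats

def interaction_f_test(y, x, moderator):
    """F-test for moderation: compare an OLS model with main effects only
    against one that adds the x * moderator interaction term."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    X_restricted = np.column_stack([np.ones(n), x, moderator])
    X_full = np.column_stack([X_restricted, np.asarray(x) * np.asarray(moderator)])

    def rss(X):
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return float(resid @ resid)

    df1 = 1                       # one added parameter (the interaction)
    df2 = n - X_full.shape[1]
    F = (rss(X_restricted) - rss(X_full)) / df1 / (rss(X_full) / df2)
    return F, stats.f.sf(F, df1, df2)

# Simulated example: liking depends on similarity more strongly when the
# dummy-coded platform = 1, i.e., there is a true interaction to detect.
rng = np.random.default_rng(7)
n = 200
similarity = rng.normal(size=n)
platform = rng.integers(0, 2, size=n)  # 0/1 dummy code
liking = (0.5 * similarity + 0.8 * similarity * platform
          + rng.normal(scale=0.5, size=n))
F, p = interaction_f_test(liking, similarity, platform)
print(F, p)
```

A significant F here would correspond to the moderation effects reported for H5; a nonsignificant F corresponds to the null results for H4.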
Hypothesis 5 stated that platform will moderate the individuating information →
hireability relationship. That is, we expected that the individuating information → hireability
(task) relationship would be strengthened under the Facebook platform, as opposed to a platform
designed around working professionals (LinkedIn). We found that this interaction was supported
for task performance, F(1, 171) = 17.65, p < .001. We tested this further using analysis of
variance (with our hireability variables serving as dependent variables, interactions as fixed
factors, and all other variables, including control variables, entered as covariates): the simple
effect of individuating information was stronger for the Facebook platform than for LinkedIn
(mean difference for Facebook = 2.43, p < .05; mean difference for LinkedIn = 1.39, p = .06),
and evaluations were highest under Facebook with high individuating information. The means
are shown in Table 6.
² We conducted a preliminary analysis of standardized residuals and identified one case that fell
outside the recommended bound of 3. This case was removed from our analysis of Hypothesis 4
for the legalizing marijuana condition. No additional cases were removed from our analyses of
moderating relationships.
Test of Simple Effects: Hireability - Task Behaviors
Individuating Information   Facebook   LinkedIn
Low                         0.003      0.55
High                        2.43       1.93
Mean difference             2.43*      1.39
Table 6. Simple Effects for H5a – Legalizing Marijuana Condition
We also found that this interaction was supported for hireability evaluations of
organizational citizenship behaviors, F(1, 171) = 15.02, p < .001. We tested this further using
analysis of variance (post hoc tests) and found that the Facebook platform had the stronger
simple effect (mean difference = 2.11, p = .03): hireability ratings increase when moving from
low to high individuating information, and the Facebook platform appears to strengthen this
effect. The means are shown in Table 7.
Test of Simple Effects: Hireability - OCB
Individuating Information   Facebook   LinkedIn
Low                         0.48       0.44
High                        2.59       1.81
Mean difference             2.11*      1.86*
Table 7. Simple Effects for H5b – Legalizing Marijuana Condition
Gun Control Condition
Hypothesis 1 stated that perceived similarity influences liking of job applicants. We tested
this hypothesis when subjects were shown social media profiles of job applicants who posted
about the polarizing political issue of gun control. The model showed a significant relationship
(b = .37, β = .77, SE = .05, t = 7.68, p < .001), indicating hypothesis 1 was supported.
Hypothesis 2a stated that liking job applicants influences hireability ratings of task
behaviors. Our model showed a significant relationship (b = 1.26, β = .5, SE = .32, t = 3.97,
p < .001), implying hypothesis 2a was supported: our subjects again gave likeable job applicants
significantly higher hireability ratings of expected task performance. Hypothesis 2b (involving
organizational citizenship behaviors) was also supported (b = 1.15, β = .57, SE = .27, t = 4.32,
p < .001): subjects also gave likeable job applicants significantly higher hireability ratings of
expected organizational citizenship behaviors.
Hypotheses 3a and 3b, that individuating information (job-related information about our
job applicants) influences hireability ratings, were supported in this model for both task
performance expectations (b = .80, β = -.33, SE = .15, t = -5.34, p < .001) and organizational
citizenship behaviors (b = .43, β = -.22, SE = .12, t = -3.52, p < .001). Individuating information
was dummy-coded to reflect profiles that were “low” in individuating information (=0) and
“high” in individuating information (=1). For profiles that were “high” in this job-related
information, hireability evaluations of both expected task performance and organizational
citizenship behaviors increased. That is, the presence of individuating information influences
hireability evaluations, with high levels of individuating information increasing hireability
evaluations and low levels decreasing them.
To test hypotheses 4, 5a, and 5b, we created interaction terms and ran a linear regression
in SPSS, following it up by examining the simple slopes of the significant interactions.
Hypothesis 4 posited that social media platform, Facebook or LinkedIn, moderates the perceived
similarity and liking relationship. Using the interaction term, we found that this relationship was
not significant, F(1, 118) < 1.0, p = .67. Overall, the platform, whether Facebook or LinkedIn,
did not moderate the relationship between similarity and liking in a social media context. That is,
the effect generalized across platforms.
Hypotheses 5a and 5b stated that platform will moderate the individuating information →
hireability relationship. That is, we expected job-related information would not have as much of
an impact on a website designed around working professionals (LinkedIn) as it would on
Facebook. We found that this interaction was not supported for task performance, F(1, 174) <
1.0, p = .38, though we found support for organizational citizenship behaviors, F(1, 177) = 4.26,
p = .04. We tested this further using analysis of variance and found that the Facebook platform
strengthened the individuating information → hireability (OCB) relationship (mean difference =
1.5, p < .05). The means are shown in Table 8.
Test of Simple Effects: Hireability - OCB
Individuating Information   Facebook   LinkedIn
Low                         0.3        1.35
High                        1.8        1.45
Mean difference             1.5*       0.11
Table 8. Simple Effects for H5b – Gun Control Condition
Affordable Care Act Condition
Hypothesis 1 stated that perceived similarity influences liking of job applicants. We tested
this hypothesis when subjects were shown social media profiles of job applicants who posted
support for or opposition to the Affordable Care Act (“Obamacare”). The model showed a
significant relationship (b = .51, β = .78, SE = .05, t = 9.47, p < .001), indicating hypothesis 1
was supported. Hence, in a social media context where job applicants post about the Affordable
Care Act, respondents appear to like applicants whom they perceive to be more similar to them.
Hypothesis 2a stated that liking job applicants influences hireability ratings of task
behaviors. Our model showed a significant relationship (b = 1.64, β = 1.02, SE = .23, t = 7.02,
p < .001), implying hypothesis 2a was supported: our subjects gave likeable job applicants
significantly higher hireability ratings of expected task performance. Subjects also gave likeable
job applicants significantly higher hireability ratings of expected organizational citizenship
behaviors; thus, hypothesis 2b was also supported (b = 1.13, β = .85, SE = .17, t = 6.62, p < .001).
Hypotheses 3a and 3b, that individuating information (job-related information about our
job applicants) leads to more positive hireability ratings, were supported in this model for both
task performance expectations (b = .83, β = .4, SE = .12, t = 7.13, p < .001) and organizational
citizenship behaviors (b = .66, β = .37, SE = .10, t = 6.44, p < .001). Individuating information
was dummy-coded to reflect profiles that were “low” in individuating information (=0) and
“high” in individuating information (=1). For profiles that were “high” in this job-related
information, hireability evaluations of both expected task performance and organizational
citizenship behaviors increased. That is, the presence of individuating information positively
influences hireability evaluations.
To test hypotheses 4, 5a, and 5b, we created interaction terms and ran a linear regression
in SPSS. Hypothesis 4 posited that social media platform, Facebook or LinkedIn, moderates the
perceived similarity and liking relationship. Using the interaction term, we found that this
relationship was not significant, F(1, 128) = 1.09, p = .37. Overall, the platform, whether
Facebook or LinkedIn, did not impact the relationship between similarity and liking in a social
media context.
Hypothesis 5 stated that platform will moderate the individuating information →
hireability relationship. That is, we expected job-related information would not have as much of
an impact on a website designed around working professionals (LinkedIn) as it would on
Facebook. We found that this interaction was not supported for task performance, F(1, 170) =
2.75, p = .1, and we did not find a significant interaction for hypothesis 5b, F(1, 170) = 1.4,
p = .24.
The hypotheses, whether they were supported or not, as well as the statistical evidence, are
presented in Table 9.
Table 9. Research Model Hypotheses
Order of results: legalizing marijuana, gun control, Affordable Care Act (“Obamacare”)
( ) = standardized coefficients, * = p<.05, ** = p < .001
Figure 4. Structural Model with Results
DISCUSSION AND IMPLICATIONS
We sought to examine how attitudes expressed on social media (i.e., political issue
endorsements) influenced recommendations for hiring in social media assessments. We noted an
increasing number of organizations are using such approaches for screening job applicants and
that a number of applicants are losing job offers due to information posted on social media
platforms (Sherman, 2013; Eurocom Worldwide, 2012). Further, it appears that most social media
assessments are unstructured in that they do not involve the use of specific procedures and policies
(SHRM, 2014).
Our results suggest that the position taken on various political issues by our simulated job
applicants influenced how they were rated in terms of hireability. We found support for the
relationship between perceived similarity of the applicant to the decision maker and liking (thus,
the more our respondents felt they had in common with the job applicants, the more they liked
the applicants). Liking, in turn, influenced ratings of expected levels of task
performance as well as expected levels of OCB performance. These effects are rather novel: few,
if any, studies in Information Systems, Organizational Behavior, or Human Resource
Management have examined how political attitudes (manifest on social media platforms)
influence hiring decisions.
We also investigated if individuating information influenced hiring decisions. In two of
our three studies, we found support for this effect. This was consistent with the Organizational
Behavior literature (e.g., Olian et al., 1988). However, the fact that we found effects for political
issue endorsement in an experimental design that included individuating information is of
theoretical interest. That is, we found an effect for political attitudes in all three studies even
when individuating information was present and able to account for variance in ratings of
hireability. Thus, individuating information did not account for all of the variance in hiring
recommendations; rather, political attitudes, through perceived similarity and liking, still had
effects.
We also hypothesized that social media platform would moderate effects in our studies.
There was little support for this in that we found similarity effects for both Facebook and LinkedIn
simulated job applicants. Put another way, our effects generalized across the type of platform
used to assess social media information. This suggests that political issue information may have
effects in multiple types of social media platforms, even when individuating information is
available on various platforms.
All told, the results of our studies suggest political issue information may influence the
hiring process. This contrasts with much of the Organizational Behavior and Human Resource
Management literature, which shows small and/or inconsistent effects for ethnic or gender
similarity (e.g., McFarland et al., 2006; Sacco et al., 2003). Such effects may have a number of
theoretical and practical implications across various fields.
Theoretical & Practical Implications
This study has a number of theoretical implications. First, we identified a gap in the MIS
literature: little research examines social media’s effects on personnel decisions. To contribute to
the field, we studied how decision makers use social media to make screening and hiring
decisions. We also examined the role of social media platform as a moderator of several
relationships in our research model. The study contributes to Organizational Behavior, a referent
field, by examining similarity theories in a social media environment (using political issues) and
by documenting social media platform’s (lack of) influence on the relationship between
perceived similarity and liking. Finally, we contributed to the growing literature on individuating
information, showing that individuating information found in user profiles influences hireability
ratings, though to an extent this depends upon the platform, with the fun, hedonic Facebook
platform strengthening the relationship.
The study also has practical implications for decision makers as it shows that our
respondents made decisions based on non-job related information (i.e., their stances on political
issues). An implication is that mangers and organizations should take time to develop, articulate
and publicize clear, transparent policies involving the use of social media to screen applicants or
make hiring decisions. In particular, the policies, and associated training, should focus on finding
job relevant information. The study also suggests that individuating information provided on
social media directly impacts hireability evaluations as well. Finding and evaluating individuating
information (that is job related) about job applicants should be the focus of using social media (if
it is used). For example, decision makers might instruct employees to pay extra attention to
qualities of applicants directly related to job performance (e.g., education, writing skills, languages
spoken). To do this, the organization should develop uniform, standardized criteria for evaluating
information found about applicants online so all applicants are evaluated consistently. All
applicants should be forewarned that their social media profiles may be viewed (perhaps job
descriptions, especially descriptions provided online, should include this). An organization’s
social media hiring policy should also require that all screening activity be documented. Social media may be a
useful tool for learning about applicants, but decision makers must use caution to avoid “biasing”
their evaluations with their own perceptions of similarity to, or liking of, job applicants.
The findings of our study have important implications for job applicants as well. Since we
found that perceived similarity (influenced by political similarity), liking and individuating
information all have a role in making hireability evaluations, this suggests that individuals who are
on the job market should become familiar with and use privacy settings available in the different
social media platforms. Applicants should research the organization carefully, take time to
learn its social media policy (if one exists), and, most importantly, know their rights
involving social media. Since our results do indicate a moderating influence of social media
platform when individuating information is involved, with Facebook increasing the strength of the
relationships, applicants may also use social media platforms, particularly ones that are hedonic in
nature, to demonstrate individuating information for decision makers (for example, applicants
might use multiple social media platforms to create a consistent image and to demonstrate KSAs,
such as communications or technical skills). Finally, individuals currently on the job market may
wish to consider what kind of information that they post to social media. In our case, political
attitudes influenced hiring ratings even in the presence of individuating information (and most of
our participants had actual hiring experience). Pragmatically, applicants may decide to weigh the
positive aspects and negative aspects of certain types of posts.
LIMITATIONS AND DIRECTIONS FOR FUTURE RESEARCH
Though our model was pre- and pilot-tested, further testing and refinement should be
completed for many reasons. First, our study consisted mostly of MBA students and graduate
business students, and roughly two-thirds of them had previous experience in conducting actual
employment interviews. We also controlled for this in our experimental model (see Appendix C),
but future studies should use organizational samples and may even seek to use other groups such
as HR managers or individuals in different cultures. Next, though this study is likely high in
internal validity, it may be lacking in ecological validity (real-world semblance, though we made
efforts to address this issue by keeping most of the information consistent and innocuous, having
an extensive profile development process, pre- and pilot-testing measures, etc., as noted in
Appendix A). Though using social media platforms in real-life may improve ecological validity,
our large number of manipulations and need to keep profiles consistent to avoid confounding data
also influenced our decisions. Also, we measured individuating information using two levels
(“low” and “high”), and this might not have covered all levels and kinds of individuating
information that can be used in this context. Though few studies in IS or social media specifically
measure individuating information in a social media context, it has been measured in many ways
in psychology (McCarthy et al., 2010). Our study also primarily focused on work-related
accomplishments, but it should be noted that individuating information may manifest in a number of
ways, such as knowledge, skills, abilities, and personality traits (Lee, 1997; Caldwell & Burger, 1998;
discerning personality traits on social media has been studied, e.g., Kluemper and Rosen’s
2009 study, though not to infer individuating information) and behaviors (Locksley et al., 1980).
Substantial future research is needed in this area. In both the U.S. and abroad, many other
political issues are discussed on social media and additional research into these issues would
increase the generalizability of our work. Our study focused on how political attitudes expressed
on social media influence hireability rankings but future endeavors may investigate other
demographic or attitudinal attributes in this context. Richness of information provided is another
potential variable to research and can impact how it is received (Dennis et al, 2006); future studies
might evaluate how the “richness” of cues about an individual’s personality expressed on social
media through images, text, video, applications, and so on may influence decision makers’
perceptions of employment seekers. Further, future studies might look at decision makers’ responses to job
applicants’ behaviors across multiple platforms.
CONCLUSION
Our study provides evidence that political attitudes can influence decision making. In
particular, decision makers’ perceptions of similarity to job applicants, formed through information
viewed on social media, were related to their overall positive feelings (liking) toward applicants. In turn,
this liking or favorability of the applicant had a direct effect on the hireability of the applicant.
Thus, political attitudes appear to “matter” in the hiring process. Interestingly, this result
occurred even with the presence of individuating information, which could have potentially
accounted for much or all of the variance in hiring recommendations. Moreover, we found that
these effects generalized across both Facebook and LinkedIn websites. Our results show that
perceptions of similarity, especially in the case of political attitudes, do “matter,” even when
job-related information is also present and regardless of the website, or platform, on which it was viewed.
REFERENCES
"Employer Access to Social Media User Names and Passwords" (2014, April 10). National
Conference of State Legislatures, ncsl.org.
"For the First Time, Americans Favor Legalizing Marijuana" (2013, October 22). Gallup,
gallup.com.
"Gun Control: Key Data Points from Pew Research" (2013, May 5). Pew Research Center,
pewresearch.org.
"Managing Your Online Image Across Social Networks" (2011, September 27). The Reppler
Effect, blog.reppler.com.
"One in Five Technology Firms Has Rejected a Job Applicant because of Social Media Profile –
Eurocom Worldwide Annual Survey" (2012, March 15). Eurocom Worldwide,
eurocompr.com.
"SHRM Research Spotlight: Social Media in the Workplace" (2011, November 1). Society for
Human Resource Management, shrm.org.
"Social Networking Fact Sheet" (2013, September). Pew Research Internet Project,
pewresearch.org.
"State of the States" (2014, June 12). Gallup, gallup.com.
"Think before You Post" (2014, May 14). Intelligence for Your Life, teshreport.com.
Ashforth, B. E., & Mael, F. (1989). Social identity theory and the organization. Academy of
Management Review, 14, 20-39.
Berger, J., Rosenholtz, S. J., & Zelditch, M. (1980). Status organizing processes. Annual Review
of Sociology, 6, 479-508.
Berger, J., Wagner, D. G., & Zelditch, M. (1985). Introduction: Expectation states theory. In J.
Berger & M. Zelditch (Eds.), Status, rewards, and influence (pp. 1-72). San Francisco:
Jossey-Bass.
Berger, J., & Webster, M. (2006). Expectations, status, and behavior. In P. Burke (Ed.),
Contemporary social psychological theories (pp. 268-300). Stanford, CA: Stanford
University Press.
Correll, S. J., & Ridgeway, C. L. (2003). Expectation states theory. In J. Delamater (Ed.),
Handbook of social psychology (pp. 29-51). New York: Kluwer Academic/Plenum
Publishers.
Davison, H. K., & Burke, M. J. (2000). Sex discrimination in simulated employment contexts: A
meta-analytic investigation. Journal of Vocational Behavior, 56, 225-248.
Freese, L., & Cohen, B. P. (1973). Eliminating status generalization. Sociometry, 36, 177-193.
Hilton, J. L., & Fein, S. (1989). The role of typical diagnosticity in stereotype-based judgments.
Journal of Personality and Social Psychology, 57, 201-211.
Kunda, Z., Davies, P. G., Adams, B. D., & Spencer, S. J. (2002). The dynamic course of stereotype
activation: Activation, dissipation, and resurrection. Journal of Personality and Social
Psychology, 82, 283-299.
Kunda, Z., & Sinclair, L. (1999). Motivated reasoning with stereotypes: Activation, application,
and inhibition. Psychological Inquiry, 10, 12-22.
Kunda, Z., & Thagard, P. (1996). Forming impressions from stereotypes, traits, and behaviors: A
parallel-constraint-satisfaction theory. Psychological Review, 103, 284-308.
Landy, F. J. (2005). Employment discrimination litigation: Behavioral, quantitative, and legal
perspectives. San Francisco: Jossey-Bass.
Landy, F. J. (2010). Performance ratings: Then and now. In J. L. Outtz (Ed.), Adverse impact:
Implications for organizational staffing and high stakes selection (pp. 227-248). New York:
Routledge.
Landy, F. J., & Farr, J. L. (1980). Performance rating. Psychological Bulletin, 87, 72-107.
Lau, D. C., & Murnighan, J. K. (2005). Interactions within groups and subgroups: The effects of
demographic faultlines. Academy of Management Journal, 48, 645-659.
Locksley, A., Borgida, E., Brekke, N., & Hepburn, C. (1980). Sex stereotypes and social judgment.
Journal of Personality and Social Psychology, 39, 821-831.
Monteith, M. J., & Mark, A. Y. (2005). Changing one’s prejudiced ways: Awareness, affect, and
self-regulation. European Review of Social Psychology, 16, 113-154.
Monteith, M. J., Sherman, J. W., & Devine, P. G. (1998). Suppression as a stereotype control
strategy. Personality and Social Psychology Review, 2, 63-82.
Motowidlo, S., & Van Scotter, J. (1994). Evidence that task performance should be distinguished
from contextual performance. Journal of Applied Psychology, 79, 475-480.
Opotow, S. (1990a). Moral exclusion and injustice: An introduction. Journal of Social Issues, 46,
1-20.
Opotow, S. (1990b). Deterring moral exclusion. Journal of Social Issues, 46, 173-182.
Tajfel, H. (1982). Social identity and intergroup relations. Cambridge, UK: Cambridge University
Press.
Tajfel, H., & Turner, J. (1986). The social identity theory of intergroup behavior. In S. Worchel &
W. G. Austin (Eds.), The psychology of intergroup relations (pp. 7-24). Chicago: Nelson-Hall.
Thatcher, S. M. B., & Patel, P. C. (2011). Demographic faultlines: A meta-analysis of the
literature. Journal of Applied Psychology, 96, 1119-1139.
Tsui, A. S., Egan, T. D., & O'Reilly, C. A. (1992). Being different: Relational demography and
organizational attachment. Administrative Science Quarterly, 37, 549-579.
Tsui, A. S., & Gutek, B. A. (1984). A role set analysis of gender differences in performance,
affective relationships, and career success of industrial middle managers. Academy
of Management Journal, 27, 619-635.
Tsui, A. S., & O’Reilly, C. A. (1989). Beyond simple demographic effects: The importance of
relational demography in superior-subordinate dyads. Academy of Management Journal,
32, 402-423.
APPENDIX A:
DEVELOPMENT OF SOCIAL MEDIA PROFILES
A rigorous process was used to create social media profiles for the experiment (detailed in
the figure, above). The first author reviewed over fifty Facebook and LinkedIn profiles of college
students, paying particular attention to pairs of profiles (that is, individuals who had both a
Facebook and a LinkedIn profile) and recording similarities and differences between the
platforms, including the information cues presented on each profile, the forms those cues took, and
how the information was presented.
To develop the social media profiles, the first author created real profiles on Facebook and
LinkedIn, obtained permission from close friends and family members to use their profile
pictures, and then used Adobe Photoshop to emulate real data and include the manipulations. We were
especially concerned with maintaining the “spirit” of each social media platform. Despite this, it
was important to present information in a way that it remained consistent across both platforms as
well, so as not to confound our experimental results.
Selection of Political Attitudes
The three political issues used in the study have been extensively covered in the press (a search in
LexisNexis from January 1, 2012 to October 17, 2014, returned 2,642 articles about legalizing
marijuana, 999 articles about gun control laws, and 992 articles about the Affordable Care Act or
“Obamacare”). The issues were salient because of several events that occurred in the broader
environment.
Legalized Marijuana: Though many states have legalized the use of medicinal marijuana,
Colorado was the first state to legalize the use of recreational marijuana in 2012 (Ferner, 2012)
and has since been the subject of debate over whether the rest of the nation should follow suit.
Gun Control: After a tragic elementary school shooting in December of 2012, President
Barack Obama took “…23 executive actions to restrict access to guns. However, an Obama-backed
bill for stronger background checks for gun purchases could not get enough support to pass
the… Senate” (Lucas, 2014). Following another high school shooting in Oregon in 2014, it was
expected there would be more discussions in the U.S. Senate regarding gun control laws (Mapes,
2014), even while some states, such as Georgia, are in the midst of passing bills allowing for guns
in churches, airports, bars, and schools (Buchsbaum, 2014).
The Affordable Care Act: The Affordable Care Act was signed into law in 2010, under the
core principle that every American should have some form of healthcare (Stolberg & Pear, 2010),
though support for the act usually splinters along party lines (supported by Democrats and opposed
by Republicans). It is currently a subject of intense debate, with House Republicans still promising
to work to repeal it (O’Keefe, 2014).
All three of these issues are salient and/or polarizing issues that elicit emotional responses
and debate among United States citizens. For example, for the first time in the nation’s history, a
2013 Gallup poll found that 58 percent of Americans believe marijuana use should be legalized
(compared to 39 percent who believe it should not be legal), a 10-percentage-point upswing from the
previous year’s results (2013). Gun control is also highly polarizing, with a 2013 Pew Center poll
showing 50 percent of all Americans support gun control laws while 48 percent oppose
them. Finally, a 2013 Gallup poll indicates that American opinions on the Affordable Care Act
are very split, with 50 percent of Americans favoring it and 48 percent not favoring it. The United
States is very divided about how to approach each issue. For these reasons (the issues are current
and polarizing), the issues of legalizing marijuana for recreational use, gun control, and the
Affordable Care Act were selected (these reasons are summarized in the table, below).
Selection of Social Media Platforms
Our study used multiple manipulations. Profiles were created in two separate social media
platforms, Facebook and LinkedIn, because they are popular and well-established. Since its
inception in 2004, Facebook has attracted the most users, with 757 million members recorded as
of December 31, 2013 (Sedghi, 2014); LinkedIn was established in 2003 and had over 332 million
members as of November 2014 (per LinkedIn’s press center). Facebook has also been used in a
number of social media studies (Nosko, 2006; Mazer et al., 2006; Kluemper & Rosen, 2009).
Experimental Manipulations
Transferring information across platforms was a notable issue for both political attitudes
and individuating information. First, since LinkedIn is a professional networking website and
members can recreate their entire resume on their profile, members may supply more
individuating information on this particular platform. For example, when editing one’s profile,
LinkedIn members have the option to provide information about their work experience with dates
and descriptions, tag relevant skills and endorsements, discuss honors and awards, and so on.
Facebook users can discuss work experience but do not have a specific area in which they may
delineate relevant skills and endorsements or discuss their honors and awards. On LinkedIn,
colleagues or supervisors can recommend or “promote” skillsets of job candidates, while this
option is not available on Facebook (for example, a decision maker might post “Great job today”
on his/her employee’s wall instead). For this reason, it was important to create profiles with
individuating information that were appropriate for the platform itself and its main purposes for
use.
For the “high” condition of individuating information, the condition needed to contain job-related information about our fictional employment candidates. In each case and in each platform
condition, the fictional employment candidate posted a manipulation in the form of a status update
containing job-related information (for example, Trent Thompson posted “I was named Employee
of the Month for having top sales numbers in June!!” Posts for Shane and Mark were created in
the same media form – text-based status updates – and used similar wording).
For the “low” condition of individuating information, the condition needed to contain
information that was not related to the job applicant’s job, or KSAs (knowledge, skills and
abilities appropriate for employing a job applicant). Instead, it was important to post information
that was more innocuous in nature. For example, each fictional employment candidate posted
individuating information on his Facebook or LinkedIn condition in the form of a status update
containing information that was clearly not job-related (for example, Mark Matthews posted a
status update saying “Annnd it’s gone! Go Buffs!” in support of the local minor league baseball
team). Posts for Shane and Trent were created using similar versions of innocuous information in
the same text-based form (the development of innocuous information is discussed further in
the following chapter, as the profiles were populated with innocuous information in order to
appear “authentic”).
Similarly, though Facebook and LinkedIn have similar feature sets, members use them
differently and thus, political attitudes conveyed in one platform do not necessarily seamlessly
transfer to the other platform. Since LinkedIn is primarily used as a professional networking and
job search platform with a utilitarian purpose, members are less likely to express political
attitudes, for example, through posting long, ranting status updates (a practice that is acceptable
and within the norm for a hedonic platform, such as Facebook). When manipulating political attitudes
within profiles, it was important to consider the means by which this information might manifest
itself in a way that remained appropriate for the platform. Across platforms, political attitudes may
be manifested in sharing articles in status updates; “liking” politically-charged posts, pictures and
articles; and joining political groups, among other ways (all of our manipulations are featured in
the table below).
[Table of experimental manipulations: for each of the three political issue conditions (Legalizing Marijuana, Gun Control Laws, and The Affordable Care Act, or “Obamacare”), both a Political Attitude manipulation and an Individuating Information manipulation were created, each shown at a High Level and a Low Level. The manipulation screenshots are not reproduced here.]
These manipulations were checked by asking the respondents in the following survey,
“Does this applicant like/support legalizing marijuana/the National Rifle Association/the
Affordable Care Act?” They were then asked, “Do you support legalizing marijuana/passing
stricter gun control laws/the Affordable Care Act?” (Yes/Maybe/No/Decline to Specify), followed
by a question asking how strongly they supported this position (where 1 = “Strongly Don’t Support”
and 7 = “Strongly Support”).
APPENDIX B:
SOCIAL MEDIA PROFILES
Experiment #1: Legalizing Marijuana
Mark Matthews’ Facebook: Supports legalizing marijuana, high individuating information
Mark Matthews’ Facebook: Supports legalizing marijuana, low individuating information
Mark Matthews’ Facebook: Does not support legalizing marijuana, high individuating information
Mark Matthews’ Facebook: Does not support legalizing marijuana, low individuating information
Mark Matthews’ LinkedIn: Supports legalizing marijuana, high individuating information
Mark Matthews’ LinkedIn: Supports legalizing marijuana, low individuating information
Mark Matthews’ LinkedIn: Does not support legalizing marijuana, high individuating information
Mark Matthews’ LinkedIn: Does not support legalizing marijuana, low individuating information
Experiment #2: Gun Control Laws
Trent Thompson’s Facebook: Supports stricter gun control laws, high individuating information
Trent Thompson’s Facebook: Supports stricter gun control laws, low individuating information
Trent Thompson’s Facebook: Does not support stricter gun control laws, high individuating information
Trent Thompson’s Facebook: Does not support stricter gun control laws, low individuating information
Trent Thompson’s LinkedIn: Supports stricter gun control laws, high individuating information
Trent Thompson’s LinkedIn: Supports stricter gun control laws, low individuating information
Trent Thompson’s LinkedIn: Does not support stricter gun control laws, high individuating information
Trent Thompson’s LinkedIn: Does not support stricter gun control laws, low individuating information
Experiment #3: Affordable Care Act (“Obamacare”)
Shane Smith’s Facebook: Supports Affordable Care Act, high individuating information
Shane Smith’s Facebook: Supports Affordable Care Act, low individuating information
Shane Smith’s Facebook: Does not support Affordable Care Act, high individuating information
Shane Smith’s Facebook: Does not support Affordable Care Act, low individuating information
Shane Smith’s LinkedIn: Supports Affordable Care Act, high individuating information
Shane Smith’s LinkedIn: Supports Affordable Care Act, low individuating information
Shane Smith’s LinkedIn: Does not support Affordable Care Act, high individuating information
Shane Smith’s LinkedIn: Does not support Affordable Care Act, low individuating information
APPENDIX C:
PERCEPTUAL MEASURES AND CONTROL VARIABLES
Measure of Perceived Similarity (Tepper, Moss & Duffy, 2011)
Anchors: 1= strongly disagree; 2= disagree; 3= moderately disagree; 4 = neither agree nor
disagree; 5= moderately agree; 6= agree; 7= strongly agree
Measure of Interpersonal Attraction/Liking (Wayne & Ferris, 1990)
77
Based on what you have seen, please tell us how much you liked the job applicant on the
[Facebook or LinkedIn] website using the following items.
How much do you like this job applicant?
Ratings and anchors:
1 = I don't like this job applicant at all
3 = I neither like nor dislike this job applicant
5 = I like this job applicant very much
(no anchors for 2 and 4)
Ratings and anchors: 1= strongly disagree; 2= disagree; 3= moderately disagree; 4 = neither agree
nor disagree; 5= moderately agree; 6= agree; 7= strongly agree
Personal feelings (check one) – reverse-coded
_I feel that I would probably like this person very much.
_I feel that I would probably like this person.
_I feel that I would probably like this person to a slight degree.
_I feel that I would probably neither particularly like nor particularly dislike this person.
_I feel that I would probably dislike this person to a slight degree.
_I feel that I would probably dislike this person.
_I feel that I would probably dislike this person very much.
Working together (check one)
_I feel that I would very much dislike working with this person.
_I feel that I would dislike working with this person.
_I feel that I would dislike working with this person to a slight degree.
_I feel that I would neither particularly dislike nor particularly enjoy working with this person.
_I feel that I would enjoy working with this person to a slight degree.
_I feel that I would enjoy working with this person.
_I feel that I would very much enjoy working with this person.
Measure of Hireability (Williams & Anderson, 1991; Cable & Judge, 1997)
Please give your overall evaluation of this candidate (ranging from very negative [1] to very
positive [5]).
Based upon what you have seen, tell us what kind of employee you think the job applicant will be
using the following items. The job applicant can be expected to:
Ratings and anchors: 1= strongly disagree; 2= disagree; 3= moderately disagree; 4 = neither agree
nor disagree; 5= moderately agree; 6= agree; 7= strongly agree
Control Variables
We controlled for relevant demographic and psychological variables. Control variables,
their definitions, and their measures are presented in Appendix A. Though research results as to
the impact of gender and ethnicity are mixed in psychology (as per the discussions in Chapters 2
and 3), they are popular variables in many studies using demographic similarity theory (McCarthy
et al, 2010). We measured these demographic factors, as well as sexual orientation.
We also measured cognitive absorption, a user’s deep involvement with social
media, using the Cognitive Absorption scale developed by Agarwal and Karahanna
(2000). We were concerned that a more “entertaining” platform, such as Facebook, might engage
our subjects more, possibly influencing our results, so this scale was used to account for how
engaged subjects were with the two different platforms, Facebook and LinkedIn. The scale
consists of five components (control, focused immersion, curiosity, heightened enjoyment and
temporal disassociation), all with reliabilities ranging from .83 to .93. Items, such as “time appears
to go by very quickly when I am using the web,” are rated on a 7-point Likert scale (1 = “strongly
disagree” and 7 = “strongly agree”).
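The component reliabilities cited above (.83 to .93) are internal-consistency estimates. As a reference for readers, a minimal sketch of Cronbach's alpha, the usual internal-consistency coefficient for Likert scales like these, follows; the response matrix is hypothetical and not data from the study.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) matrix of Likert responses."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point Likert responses (five respondents, three items);
# illustrative values only, not data from the study.
responses = np.array([
    [7, 6, 7],
    [5, 5, 6],
    [2, 3, 2],
    [6, 6, 5],
    [3, 2, 3],
])
alpha = cronbach_alpha(responses)  # high for these internally consistent responses
```

Values in the .8 to .9 range, like those reported for the Cognitive Absorption components, indicate that the items of a component move together closely across respondents.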
In the study’s instructions, we clearly informed respondents that there was no right or
wrong way to answer our survey items. We also suggested that viewing the social media profiles
could be seen as a “fun” activity as well in an effort to alleviate pressure respondents might feel to
answer in a socially desirable way. We also felt that, since our subjects were judging others
instead of reporting on their own behaviors, our subjects might be more honest (Collerton et al,
2006). However, since the political beliefs we manipulated are considered polarizing, some
respondents might still attempt to respond in a way they deem more socially appropriate. As such,
we controlled for social desirability, an individual’s proclivity to respond to items in a way they
might feel is socially admired, using a shortened version of the Marlowe-Crowne social
desirability scale (Reynolds, 1982). The scale had a reliability of .82 in the literature. The scale
consists of statements, such as “I never hesitate to go out of my way to help someone in trouble,”
and operates on a 7-point Likert scale.
As a procedural control, we also controlled for information presented in the profiles that
was not directly relevant to the manipulations. All profiles, not including the experimental
manipulations, contained the same information in forms appropriate for the platform condition.
For example, for each political issue, the same profile picture was used for every condition of the
experiment. All fictional job applicants were business graduates from the University of Colorado
with similar work experience (note, across experiments, the wording was varied and the names of
the companies the applicants currently worked for were also varied).
Finally, we controlled for relevant professional and educational experience. Our research
subjects were also asked, “Have you ever interviewed anyone before?”, “Have you been trained in
how to evaluate social media?”, and “Have you served in a human resources management position
before?”
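Statistically, controlling for a variable such as social desirability amounts to entering it as an additional predictor so that the focal effect is estimated net of the control. The sketch below illustrates this with ordinary least squares on synthetic data; all variable names, effect sizes, and values are invented for illustration and are not the study's data or analysis.

```python
import numpy as np

# Synthetic data: liking is driven by perceived similarity plus a social
# desirability response tendency; all effect sizes here are invented.
rng = np.random.default_rng(0)
n = 200
similarity = rng.normal(size=n)            # focal predictor
social_desirability = rng.normal(size=n)   # control variable
liking = 0.6 * similarity + 0.3 * social_desirability + rng.normal(scale=0.5, size=n)

# Entering the control as an additional column yields the similarity
# effect adjusted for social desirability.
X = np.column_stack([np.ones(n), similarity, social_desirability])
coefs, *_ = np.linalg.lstsq(X, liking, rcond=None)
intercept, b_similarity, b_control = coefs
```

With enough observations, `b_similarity` recovers the similarity effect while holding the response tendency constant, which is the logic behind including such controls in the experimental model.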
Measure of Cognitive Absorption (Agarwal & Karahanna, 2000)
Please indicate how you feel about Facebook/LinkedIn below:
Cognitive Absorption Scale
1. Time flies when I am using Facebook/LinkedIn.
2. Most times when I get on Facebook/LinkedIn, I end up spending more time than I planned.
3. Sometimes I lose track of time when I use Facebook/LinkedIn.
4. When using Facebook/LinkedIn, I am able to block out most distractions.
5. While using Facebook/LinkedIn, I am absorbed in what I am doing.
6. While using Facebook/LinkedIn, I am immersed in the task I am performing.
7. I often spend more time on Facebook/LinkedIn than I intended.
8. I have fun using Facebook/LinkedIn.
9. Using Facebook/LinkedIn bores me.
10. I enjoy using Facebook/LinkedIn.
Italicized = reverse-coded items
Ratings and anchors: 1= strongly disagree; 2= disagree; 3= moderately disagree; 4 = neither agree
nor disagree; 5= moderately agree; 6= agree; 7= strongly agree
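Before scale scores are computed, reverse-coded items such as "Using Facebook/LinkedIn bores me" are rescored so that higher values consistently indicate more absorption. On a 7-point Likert scale this is the standard transformation score' = 8 − score, sketched below with illustrative values:

```python
def reverse_code(score: int, scale_max: int = 7, scale_min: int = 1) -> int:
    """Reverse-code a Likert response so that, e.g., 1 maps to 7 and 7 maps to 1."""
    return scale_max + scale_min - score

# Example: recoding responses to a reverse-scored item such as
# "Using Facebook/LinkedIn bores me." (values are illustrative)
raw_scores = [2, 7, 4]
recoded = [reverse_code(s) for s in raw_scores]  # -> [6, 1, 4]
```

The same function handles other scale widths (e.g., `reverse_code(2, scale_max=5)` returns 4 on a 5-point scale), since the midpoint always maps to itself.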
Measure of Social Desirability (Reynolds, 1982)
Please indicate your agreement with the following statements.
Social Desirability Scale
1. I never hesitate to go out of my way to help someone in trouble.
2. I have never intensely disliked anyone.
3. When I don't know something, I don't at all mind admitting it.
4. I am always courteous, even to people who are disagreeable.
5. I would never think of letting someone else be punished for my wrongdoings.
6. I sometimes feel resentful when I don't get my way.
7. There have been times when I felt like rebelling against people in authority even
though I knew they were right.
8. I can remember "playing sick" to get out of something.
9. There have been times when I was quite jealous of the good fortune of others.
10. I am sometimes irritated by people who ask favors of me.
Italicized = reverse-coded items
Ratings and anchors: 1= strongly disagree; 2= disagree; 3= moderately disagree; 4 = neither agree
nor disagree; 5= moderately agree; 6= agree; 7= strongly agree
APPENDIX D:
CONFIRMATORY FACTOR ANALYSIS
CFA Model Iterations and Fit Indices

Fit indices used:
- Satorra-Bentler Chi-square (S-B Chi-square): measures goodness-of-fit for small sample sizes, large models, and/or nonnormal samples (Satorra & Bentler, 1988).
- Bentler's Comparative Fit Index (CFI): an incremental or comparative fit index that assesses fit of a structural model in comparison with the null model (Hu & Bentler, 1999).
- Root Mean Square Error of Approximation (RMSEA), with RMSEA intervals: a "badness-of-fit" index, where values closer to 0 are better; incorporates model parsimony and sample size (Hu & Bentler, 1999).
- Standardized Root Mean Square Residual (SRMR): the square root of the difference of the residuals of the sample covariance matrix and the hypothesized model (Hooper et al., 2008).

CFA Model Fit Indices
Good fit: CFI* > .90; RMSEA* ≤ .05 (interval 0, .08); SRMR ≤ .08

Initial Model: Normal Chi-square = 7293 (d.f. = 1596); S-B Chi-square* = 4214.7 (d.f. = 1527), p < .001; CFI* = 0.70; RMSEA* = 0.10 (.10, .10); SRMR = 0.338
Split Hireability Model: Normal Chi-square = 6899.88 (d.f. = 1431); S-B Chi-square* = 2771.77 (d.f. = 1340), p < .001; CFI* = 0.83; RMSEA* = 0.08 (.07, .08); SRMR = 0.18
Split Hireability, Deleted Item Model: Normal Chi-square = 6899.88 (d.f. = 1338); S-B Chi-square* = 2051.4 (d.f. = 1306), p = 0.02; CFI* = 0.90; RMSEA* = 0.06 (.05, .06); SRMR = 0.08
Final Model: Normal Chi-square = 5271.362 (d.f. = 1304); S-B Chi-square* = 2023.2 (d.f. = 1155), p = 0.01; CFI* = 0.96; RMSEA* = 0.05 (.05, .06); SRMR = 0.08

* = robust fit indices, S-B = Satorra-Bentler
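The RMSEA values reported for each model can be related back to the chi-square and degrees of freedom through the standard point-estimate formula RMSEA = sqrt(max(χ² − df, 0) / (df · (N − 1))). A minimal sketch, using the final model's robust chi-square (2023.2, d.f. = 1155) and an assumed sample size of N = 300 purely for illustration (N is not stated in this appendix):

```python
import math

def rmsea(chi_square, df, n):
    """Point estimate of RMSEA from a chi-square statistic.

    Standard formula: sqrt(max(chi2 - df, 0) / (df * (n - 1))).
    A model whose chi-square is at or below its df yields RMSEA = 0.
    """
    return math.sqrt(max(chi_square - df, 0.0) / (df * (n - 1)))

# Final-model values from the table above; N = 300 is an assumption.
print(round(rmsea(2023.2, 1155, 300), 3))
```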
Factor Loadings and Scale Reliabilities

Standardized factor loadings are shown with unstandardized loadings in parentheses, followed by standard errors (S.E.). Scale reliability estimates (coefficient alpha and rho) ranged from 0.87 to 0.97 across all scales and conditions.

Condition 1 - Legalizing Marijuana

Similarity ("The job applicant and I…")
- Are similar in terms of our outlook, perspective, and values: .86 (.93), S.E. = 0.09
- Analyze problems in a similar way: .82 (.91), S.E. = 0.08
- Think alike in terms of coming up with a similar solution to a problem: .87 (.93), S.E. = 0.05
- Are alike in a number of areas: .85 (.92), S.E. = 0.06
- See things in much the same way: .87 (.93), S.E. = 0.09

Liking
- How much do you like the job applicant?: .78 (.89), S.E. = 0.04
- I would likely get along well with this job applicant.: .84 (.92), S.E. = 0.06
- Supervising this job applicant would likely be a pleasure.: .83 (.91), S.E. = 0.06
- I think this job applicant would likely make a good friend.: .70 (.84), S.E. = 0.06
- Based on what you have seen, please tell us how much you liked the job applicant on the [Facebook or LinkedIn] website: .83 (.91), S.E. = 0.08
- Based on what you have seen, please tell us how much you liked the job applicant on the [Facebook or LinkedIn] website: .70 (.84), S.E. = 0.11

Hireability - Task ("The job applicant can be expected to…")
- Adequately complete assigned duties: .92 (.96), S.E. = 0.06
- Perform tasks that are expected of him/her: .91 (.96), S.E. = 0.07
- Meet formal performance requirements of a job: .87 (.93), S.E. = 0.07

Hireability - OCB ("The job applicant can be expected to…")
- Help others who have heavy workloads: .89 (.94), S.E. = 0.07
- Go out of his/her way to help new employees: .82 (.90), S.E. = 0.08
- Take a personal interest in other employees: .85 (.92), S.E. = 0.09

Condition 2 - Gun Control Laws

Similarity ("The job applicant and I…")
- Are similar in terms of our outlook, perspective, and values: .86 (.93), S.E. = 0.07
- Analyze problems in a similar way: .85 (.92), S.E. = 0.05
- Think alike in terms of coming up with a similar solution to a problem: .83 (.91), S.E. = 0.05
- Are alike in a number of areas: .82 (.90), S.E. = 0.08
- See things in much the same way: .88 (.94), S.E. = 0.06

Liking
- How much do you like the job applicant?: .63 (.81), S.E. = 0.04
- I would likely get along well with this job applicant.: .86 (.93), S.E. = 0.06
- Supervising this job applicant would likely be a pleasure.: .83 (.91), S.E. = 0.07
- I think this job applicant would likely make a good friend.: .81 (.90), S.E. = 0.10
- Based on what you have seen, please tell us how much you liked the job applicant on the [Facebook or LinkedIn] website: .86 (.93), S.E. = 0.14
- Based on what you have seen, please tell us how much you liked the job applicant on the [Facebook or LinkedIn] website: .69 (.83), S.E. = 0.15

Hireability - Task ("The job applicant can be expected to…")
- Adequately complete assigned duties: .96 (.98), S.E. = 0.04
- Perform tasks that are expected of him/her: .93 (.97), S.E. = 0.06
- Meet formal performance requirements of a job: .82 (.91), S.E. = 0.11

Hireability - OCB ("The job applicant can be expected to…")
- Help others who have heavy workloads: .85 (.92), S.E. = 0.08
- Go out of his/her way to help new employees: .78 (.88), S.E. = 0.13
- Take a personal interest in other employees: .89 (.94), S.E. = 0.09

Condition 3 - The Affordable Care Act ("Obamacare")

Similarity ("The job applicant and I…")
- Are similar in terms of our outlook, perspective, and values: .83 (.91), S.E. = 0.06
- Analyze problems in a similar way: .88 (.94), S.E. = 0.04
- Think alike in terms of coming up with a similar solution to a problem: .88 (.94), S.E. = 0.05
- Are alike in a number of areas: .82 (.90), S.E. = 0.06
- See things in much the same way: .93 (.96), S.E. = 0.04

Liking
- How much do you like the job applicant?: .79 (.89), S.E. = 0.03
- I would likely get along well with this job applicant.: .82 (.91), S.E. = 0.08
- Supervising this job applicant would likely be a pleasure.: .81 (.90), S.E. = 0.06
- I think this job applicant would likely make a good friend.: .79 (.89), S.E. = 0.07
- Based on what you have seen, please tell us how much you liked the job applicant on the [Facebook or LinkedIn] website: .60 (.78), S.E. = 0.24
- Based on what you have seen, please tell us how much you liked the job applicant on the [Facebook or LinkedIn] website: .63 (.80), S.E. = 0.19

Hireability - Task ("The job applicant can be expected to…")
- Adequately complete assigned duties: .91 (.96), S.E. = 0.05
- Perform tasks that are expected of him/her: .92 (.96), S.E. = 0.05
- Meet formal performance requirements of a job: .85 (.92), S.E. = 0.07

Hireability - OCB ("The job applicant can be expected to…")
- Help others who have heavy workloads: .86 (.93), S.E. = 0.04
- Go out of his/her way to help new employees: .85 (.92), S.E. = 0.07
- Take a personal interest in other employees: .86 (.93), S.E. = 0.05
Correlation Matrix

Table 6. Latent Variable Correlations
Sim = Similarity; Like = Liking; Task = Hireability - Task; OCB = Hireability - OCB
Conditions: 1 = Legalizing Marijuana; 2 = Gun Control Laws; 3 = Affordable Care Act ("Obamacare")
Measurement Equivalence in Confirmatory Factor Analyses
Before evaluating our structural model, we tested whether our confirmatory factor
analyses of the three political conditions had the same pattern of fixed and free factor loadings
(that is, whether the three conditions were equivalent, or showed "metric invariance"; Vandenberg &
Lance, 2000; Horn & McArdle, 1992). To do so, we compared the Satorra-Bentler chi-square of
the study's factor structure to that of a constrained model (with factor loadings constrained equal
across conditions, a more restrictive model). The Satorra-Bentler chi-square for our model (2023.2
with 1155 degrees of freedom) was compared to the Satorra-Bentler chi-square of the constrained
model (2,000 at 1184 d.f.), yielding a scaled difference of 29.49. This difference was not significant
(p > .2), providing evidence of metric invariance (i.e., the factor structures across the conditions are
equivalent) (Bryant et al., 2012; Vandenberg & Lance, 2000).
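The invariance test above comes down to referring the scaled difference (29.49) to a chi-square distribution with the difference in degrees of freedom (1184 − 1155 = 29). A minimal sketch of that final step, with the chi-square survival function implemented from the regularized incomplete gamma series so it runs on the standard library alone (scipy.stats.chi2.sf would do the same job):

```python
import math

def chi2_sf(x, df):
    """P(X > x) for a chi-square variable with df degrees of freedom.

    Uses the series for the regularized lower incomplete gamma function:
    P(a, x) = x^a e^-x / Gamma(a) * sum_{n>=0} x^n / (a (a+1) ... (a+n)).
    """
    a, x2 = df / 2.0, x / 2.0
    if x2 <= 0:
        return 1.0
    term = 1.0 / a
    total = term
    n = 0
    while term > total * 1e-15:
        n += 1
        term *= x2 / (a + n)
        total += term
    lower = total * math.exp(a * math.log(x2) - x2 - math.lgamma(a))
    return 1.0 - lower

# Scaled S-B chi-square difference and df difference from the text above.
diff, df_diff = 29.49, 1184 - 1155
p = chi2_sf(diff, df_diff)
print(round(p, 3))  # well above .05, so metric invariance is not rejected
```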
APPENDIX E:
PRE- AND PILOT TESTING RESULTS
The initial survey instrument was extensively pretested. We did so to ensure the
experimental manipulations, particularly the differing information cues ("liking" the National Rifle
Association vs. posting an article about legalizing marijuana), provided a valid level of stimulus; to
ascertain whether the profiles appeared authentic to respondents; and to evaluate the experimental
procedure and questionnaire (Dennis & Valacich, 2001).
The pre- and pilot testing processes refined development of the social media profiles (the
process is highlighted in Figure 3). Our pre-testing sample consisted of business graduate students
who had experience selecting new employees (n = 10). Participants indicated the initial profiles
were "vanilla-looking" and appeared inauthentic, prompting the research team to include innocuous
information that gave the appearance of authenticity without drawing focus from the experimental
manipulations, leading to a second iteration of profiles used in the experiments. A sample social
media profile is presented in Figure 3. This profile was created for Experiment 1: Legalizing
Marijuana and contains a condition of positive political attitudes about legalizing marijuana and a
low instance of individuating information. In this particular profile, the innocuous information
populating the applicant's profile includes his list of likes (the Denver Broncos, YouTube,
Starbucks, etc.) and the places he has visited (Prentup Field, Longhorn Steakhouse, etc.). To
provide additional authenticity, the status updates each have a number of "likes," and a few of
the status updates are also mapped to the "places" the applicant has visited. The remaining social
media profiles are included in Appendix C of this document.
After developing our preliminary instrument and social media profiles, we conducted a
pilot study to simulate the full experiment (n = 35 undergraduate business majors). We had two
major objectives for the pilot test: to uncover logistical problems with our survey and to estimate
the total time required to complete it (Dennis & Valacich, 2001). Because of the relatively small
pilot sample (n = 35), we were unable to run statistical analyses of our research model. However,
we did determine that the experiment ran smoothly and required around 20-25 minutes to
administer. We determined that showing the profile at the bottom of each question set was ideal
for survey respondents. Further, to ensure respondents were attending to the experimental
manipulations, and to provide manipulation checks, we asked an open-ended question with each
profile: "What did you notice most about this social media profile?" In all, for each political issue,
we determined the experimental manipulations, regardless of the cue, were visible to our
respondents (using descriptive statistics and frequency charts, we determined 100% of respondents
noticed our manipulations in experiment one, 98% in experiment two, and 95% in experiment
three). We also examined the reliability of our scales, finding acceptable reliability for all of them
(ranging from .85 to .93). On the basis of these results, we determined no significant changes
needed to be made to our social media profiles or the survey instruments, and we proceeded as
planned.
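The scale-reliability check reported here is a Cronbach's alpha computation: α = k/(k−1) · (1 − Σ item variances / variance of total scores). A minimal standard-library sketch (the function name and the toy data are ours, not the pilot data):

```python
from statistics import variance  # sample variance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of per-item score columns.

    item_scores[i] holds every respondent's rating on item i.
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals).
    """
    k = len(item_scores)
    totals = [sum(resp) for resp in zip(*item_scores)]
    item_var_sum = sum(variance(col) for col in item_scores)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Three perfectly parallel items (each respondent's ratings move in
# lockstep), so alpha comes out at exactly 1.0.
items = [
    [1, 2, 3, 4],  # item 1 across four respondents
    [2, 3, 4, 5],  # item 2
    [3, 4, 5, 6],  # item 3
]
print(cronbach_alpha(items))
```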
APPENDIX F:
STRUCTURAL INVARIANCE IN RESEARCH MODELS
For this study, comparison between groups was important; we needed to ensure that our
results were not specific to one particular political issue and were pervasive across all three. We
tested whether the three political conditions had the same pattern of parameters (that is, whether
the three conditions were equal, or showed "structural invariance"; Vandenberg & Lance, 2000;
Horn & McArdle, 1992). Structural invariance can be tested by comparing the Satorra-Bentler
chi-squares and independent chi-squares of the study's structural model to those of a constrained
model (with constrained relationships, a more restrictive model). This test determined whether the
relationships in our model were moderated by the political condition.
We first compared Condition 1 (legalizing marijuana) to Condition 2 (gun control laws).
Two equality constraints significantly harmed model fit, indicating these relationships were not
equal across conditions. The equality constraints for the liking → hireability relationships for task
performance (b = 1.15 and 1.26 in the legalizing marijuana and gun control laws conditions) and
organizational citizenship behaviors (b = 1.15 and 1.13 in the legalizing marijuana and gun control
laws conditions) resulted in significant chi-squares of 18.17 (p < .001) and 10.572 (p < .001),
respectively, where the results for our legalizing marijuana condition were significantly higher
than those for the gun control laws condition. No other relationships across models tested as
significant, indicating the remaining relationships did not differ significantly across conditions.
For Condition 2 (gun control laws) and Condition 3 (the Affordable Care Act), three
constraints harmed model fit, indicating these relationships were not equal across conditions. The
liking → hireability relationship constraints for task performance (b = 1.26 and 1.64 for the gun
control and Affordable Care Act conditions, indicating a significantly higher relationship in the
Affordable Care Act condition) and organizational citizenship behaviors (b = 1.15 and 1.13 for the
gun control and Affordable Care Act conditions, showing a significantly higher relationship for the
gun control laws condition) resulted in significant chi-square values of 5.89 (p = .015) and 9.747
(p = .002). The individuating information × platform → hireability - OCB constraint harmed
model fit as well (b = .41 and .22 for legalizing marijuana and gun control laws; chi-square =
16.57, p = .001), showing a significantly higher relationship for the legalizing marijuana
condition. No other relationships across models tested as significant, indicating the remaining
relationships did not differ significantly across conditions.
Finally, for Condition 1 (legalizing marijuana) and Condition 3 (the Affordable Care Act),
two constraints harmed model fit: the individuating information → hireability link for task
behaviors (b = .08 and .83 for the legalizing marijuana and Affordable Care Act conditions;
chi-square = 15.15, p = .02, with a significantly higher relationship in the Affordable Care Act
condition) and the individuating information × platform → hireability constraint for task behaviors
(b = .41 and .18 for the legalizing marijuana and Affordable Care Act conditions; chi-square =
5.802, p = .03), where the relationship was significantly higher in the legalizing marijuana
condition. No other relationships across models tested as significant, indicating the remaining
relationships did not differ significantly across conditions.