The interview questions themselves cover a wide range of issues related to social media and were developed to address the questions I was hearing from students during lecture visits for Sex and the Soul. The online survey was developed after the interview process was completed and consisted of a smaller set of exclusively open-ended questions, selected based on their salience for students during the interviews.
With the help of both Dr. Smith and Dr. Nicolette Manglos-Weber, I went through the University of Notre Dame’s institutional review board (IRB) for both the interview and the online survey dimensions of this study. The Notre Dame IRB approached my application with the understanding that, in theory, the board would be providing approval on behalf of all participating institutions. I also subsequently went through the IRB approval process at all thirteen schools that participated in my study, but Notre Dame’s stamp of approval helped to expedite this process.
INTERVIEW METHODOLOGY AND PARTICIPATION: SAMPLING AND ADMINISTRATION
The student interviews are the primary and most important source of data collected in this study. To obtain the strongest pool of interview candidates, I employed a random sampling methodology (more on this later).
The interviews I conducted were semistructured and lasted anywhere from thirty to ninety minutes. With every participant, I covered the same series of topics and questions. However, every interview was unique because the students raised different themes and issues; when a student made a comment that was particularly intriguing, or that he or she wished to explore in more depth, I reserved the right to ask additional questions related to that topic. In some instances, this semistructured approach led me to add questions to subsequent interviews, since several participants raised important issues that were not addressed in the basic questionnaire. I did my best to ensure that every interview touched on every main theme and question, even if that meant the interview ran very long. All students were informed before the interview started that they had the right to skip any question, but very few chose to do so. Each student was promised anonymity and signed an informed consent document prior to the interview.
One of my primary concerns was spreading the interviews out across a range of institution types. When selecting the schools, I considered factors such as religious/nonreligious affiliation and geographic location, and I tried to reach as diverse a student population as possible in terms of educational background, race, ethnicity, and socioeconomic status. I began my search by approaching personal acquaintances and colleagues at a number of schools, as well as taking advantage of new contacts and opportunities provided by lecture visits, if the inviting institution fit the school type for which I was searching. Finding institutional participants was relatively easy: everywhere I inquired, colleges and universities were eager for data on social media and their student populations, so the study intrigued them. My campus contacts ranged from faculty members to Campus Ministry to Student Affairs administrators; these contacts helped me navigate the IRB process at their institutions, facilitated access to a random sample of students for the interview pool, and identified a space where the interviews could take place.
In the end I visited thirteen schools for the interviews: three public (one each in the Northeast, the Southeast, and the Midwest); three private-secular (one each in the Southwest, the Northeast, and the Midwest); three Catholic (one each in the Northeast, the Mid-Atlantic, and the Midwest); two evangelical Christian (one in the Southeast, one in the West); and two mainline Protestant (both in the Midwest). All participating institutions were promised anonymity and will remain anonymous.
To identify the pool of potential interview candidates, once a participating school passed through the IRB process, my campus contact would generate a random sample of thirty to forty students, trying to balance for gender. At the largest institutions, my campus contact helped identify a manageable way to sample from a large pool of students. At two such institutions (both of them private-secular), this meant the sample was pulled from the entirety of the school’s Honors College. At one of the largest public universities, this meant drawing from the pool of students at one of the university’s specialized colleges. At another two of the larger institutions (one evangelical Christian, one private-secular), the sample was drawn from the complete rosters of students living in campus housing. At five of the schools (one Catholic, two mainline Protestant, two public), the random sample came from the combined student lists from a series of classes. At the three remaining schools (two Catholic, one evangelical Christian), the sample was drawn from the entire undergraduate student population.
At each school, I interviewed between eleven and sixteen students, for a total of 184 interviews across the thirteen participating institutions. To identify these students, I took the randomly sampled list of names, email addresses, and genders provided by the campus contact and emailed an invitation to participate in the interview process (with all necessary information and the informed consent form attached) to the first eight women and the first eight men on the list. Then I would do my best to get as many of those students as possible to agree to meet with me when I was on campus—ideally the first sixteen I initially contacted. I followed up with emails and, if necessary, also asked the campus contact to follow up with the students I hadn’t yet heard from to encourage their participation. The response rate was high overall, though at two of the schools it proved unusually difficult to recruit participants from such a small random sample.
In the end, to get 184 student participants for the interviews, I emailed a total of 235 invitations to randomly sampled students across the thirteen participating institutions, for a response rate of 78 percent. Not a single student who participated in the interview process volunteered based on interest. Once the invitations went out, some students responded nearly immediately, and I scheduled their interviews; I needed to email other students three or even four times—sending reminders and pleas—to get them to agree to meet with me. At times, I called on my campus contact to follow up with the stragglers on my behalf, and this usually generated a few more responses as well.
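Because the recruitment procedure above has several moving parts, the following is a minimal sketch, in Python, of its logic: drawing a gender-balanced random sample of thirty to forty students from a roster and then inviting the first eight women and the first eight men, along with the response-rate arithmetic. The code, the roster format, and the function name are hypothetical illustrations only; they are not the actual tools my campus contacts used to generate the samples.

import random

def draw_recruitment_list(roster, sample_size=40, invites_per_gender=8, seed=None):
    """Hypothetical sketch of the sampling and invitation procedure described above.

    roster: a list of dicts with "name", "email", and "gender" keys (field names are my own).
    Returns the gender-balanced random sample and the subset actually invited.
    """
    rng = random.Random(seed)
    women = [s for s in roster if s["gender"] == "woman"]
    men = [s for s in roster if s["gender"] == "man"]

    # Draw roughly half of the target sample from each gender to balance the pool.
    half = sample_size // 2
    sample = rng.sample(women, min(half, len(women))) + rng.sample(men, min(half, len(men)))
    rng.shuffle(sample)

    # Invitations then go to the first eight women and the first eight men on the list.
    invited, counts = [], {"woman": 0, "man": 0}
    for student in sample:
        if counts[student["gender"]] < invites_per_gender:
            invited.append(student)
            counts[student["gender"]] += 1
    return sample, invited

# Response-rate arithmetic reported above: 184 completed interviews from 235 invitations.
response_rate = 184 / 235  # roughly 0.78, or about 78 percent

In practice, the sampling frames differed by school, as described above; the sketch simply makes the gender balancing, the first-eight rule, and the 78 percent figure explicit.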
Each potential participant was offered a cash incentive, which he or she would receive at the end of the interview in the form of a thirty-dollar prepaid credit card, generated at Notre Dame for this purpose. Of the fifty-two students who received invitations but did not do interviews, forty-three simply never responded to my attempts to get in touch with them, six actively declined, two canceled at the last minute because they got sick, and one canceled at the last minute because of a lab that got rescheduled during the interview slot, on the day that I was leaving campus. All participating students had to be at least eighteen years old and enrolled as undergraduates.
The benefit of random sampling is obvious: the pool of interviewees at each school was drawn simply because the students were currently enrolled undergraduates at the participating institution, not because they self-identified as being particularly interested in or active on social media or eager to discuss it. This method of sampling helps to alleviate self-selection bias in the interview pool. One potential weakness in the interview pool involves the five institutions where the random sample was pulled from a series of classes and professors—in other words, from a smaller population of students on campus, as low as eighty to a hundred—as opposed to the eight other institutions, where the sample was drawn from populations that generally ranged from the high hundreds (at the very least) into the thousands, depending on the school’s size. Even at these five schools, however, there was not a single volunteer. All participants received their invitations “out of the blue,” so to speak, and then we went from there.
Another weakness that may have affected the response rate is that email is a tough way to contact students today. Many students check their university email accounts very infrequently, if at all. Some universities try to require students to check their campus email, but students tend to prefer contact via either text or, perhaps a bit ironically, Facebook. Two of the institutions (one Catholic, one private-secular) assigned me an undergraduate to help me track down the students; these student assistants followed up with the stragglers on Facebook, which helped a good deal in signing them up for interviews. I, however, did not contact any potential interviewees by text or via their social media accounts. In the future, if I were to initiate a new study, I would consider contacting potential interviewees by means other than email.
Overall, the demographic breakdown of the interviewees was as follows:
Gender: 92 women (approx. 50%), 91 men (approx. 50%)
Racial breakdown: 121 white; 22 African American; 12 East Asian; 5 Southeast Asian/Indian; 1 Middle Eastern; 13 Hispanic; 10 biracial (4 half black/half white; 2 half Pacific Islander/half white; 2 half East Asian/half white; 1 half Hispanic/half white; 1 half East Asian/half Native Hawaiian)
Sexual orientation: 181 heterosexual, 1 gay, 1 lesbian, 1 bisexual
School type: public: 42/184 = 23%; private-secular: 44/184 = 24%; Catholic: 45/184 = 24%; evangelical Christian: 27/184 = 15%; mainline Protestant: 25/184 = 14%
The topics for the interview included the following:
Highlights of college
Friendships (general)
Romantic relationships (general)
Describe self (generally)
Describe self (socially)
Meaning/happiness (generally)
Religious background
Describe self (on social media)
Why do you participate/post?
Criteria for posts
Feelings about people’s reactions/nonreactions to posts
Expressing emotions on social media
Faith on social media? (if relevant)
Have you ever thought about quitting any accounts?
Bullying
Social media history—first account, what age, etc.
Parents and social media
Selfies
Comparing self to others/FOMO
Success and social media
Social media is necessary/unnecessary today?
Online image
Privacy
Gender
Competition/jealousy on social media
Self-esteem
Relationships/romance/dating and social media
Sexting
Smartphone and related issues
Social media and happiness
Discussions of social media at college with professors/staff?
Anything else?
ONLINE SURVEY METHODOLOGY AND PARTICIPATION: SELECTION AND ADMINISTRATION
The questions for the online survey were drawn directly from the pool of questions used for the interviews, and they were chosen after I finished all the interviews at the thirteen participating institutions. I waited to administer the online survey until the interview process was finished because the interviews helped me to select, for inclusion in the survey, only the topics that seemed most important to the students. I wanted the online survey to be as short and as efficient as possible—especially because it consisted of open-ended essay questions.
All questions were open-ended because—in my opinion—this format allows students to explain why they feel the way they do and why they do what they do, giving us far more information and insight into the landscape and significance (or insignificance) of a subject than questions that merely seek quantitative statistics.
All questions were also optional—save two. The first of these asked students either to check the social media sites to which they belonged or to check that they had none, which would send them to a different set of questions about why they don’t participate in social media. The second asked students if they had a smartphone; again, those who checked “no” were sent to a different set of questions. The decision to make all the essays optional was intended both to make participation easier and to reveal which topics interested the students taking the survey enough that they chose to answer them.
All students were required to give informed consent for the survey before they could proceed to the survey questions. All students also had to be eighteen or older to participate and had to be enrolled as an undergraduate at their participating institution.
Aside from being asked to provide basic demographic data, students were asked to answer a gridlike series of questions about how various issues on social media make them feel, before moving on to a series of open-ended essays—all of which were optional. On the main page for essays, students were asked to choose five of ten topics to write about, before moving on to a series of optional essay questions about relationships and then another series about smartphones (the overall routing is sketched after the topic lists below).
The following is the list of all optional essay topics students could answer for the online survey, in abbreviated form:
FIRST ESSAY PAGE
•Dos and don’ts of social media
•Social media behavior—then (when you first got on) versus now
•If you could go back, what do you wish you’d known about social media?
SECOND ESSAY PAGE (CHOOSE 5)
•Social media and self-expression
•Most/least favorite thing about social media
•Selfies
•Online image
•Social media as obligation
•Rules/criteria for posting
•Comparing yourself to others
•Gender and social media
•Anonymous sites
•Quitting social media
THIRD ESSAY PAGE
•Apps like Tinder
•Friendships
•Sexting
FOURTH ESSAY PAGE—ON SMARTPHONES
For students who have a smartphone:
•What do you like about them?
•Does having one mean you are “always available”?
•Do you ever intentionally take a break from it?
For students who do not have a smartphone:
•Why don’t you have one?
•How does not having one affect your life/college experience, if at all?
Please note: students who did not participate in any social media were directed to a special essay page designed for this, immediately after they filled out the demographic information. On it were four essay questions that dealt with the following topics:
•Why do you stay off social media?
•Have you always been this way, or did you quit?
•How does not having it affect your life/college experience (if at all)?
•What do you think of the phenomenon of social media?
Also, all students were given the option at the very end of the online survey to add anything else they’d like to say, using an essay format.
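To gather the survey’s structure in one place, here is a minimal sketch, in Python, of the routing just described. Only the two branching questions were required; every page and function name below is my own hypothetical label rather than anything from the survey platform, and the pages shown to students not on social media, beyond the dedicated essay page, are my assumption.

def route_respondent(uses_social_media, has_smartphone):
    """Hypothetical sketch of the survey's skip logic; not the actual survey software."""
    pages = ["informed_consent", "demographics"]
    if not uses_social_media:
        # Non-users went straight to a dedicated page of four essay questions.
        pages.append("essays_for_non_users")
    else:
        pages += ["feelings_grid", "essay_page_1",
                  "essay_page_2_choose_5_of_10", "essay_page_3_relationships"]
    # The required smartphone question routed students to one of two fourth essay pages.
    # (Whether non-users of social media also saw this page is an assumption on my part.)
    pages.append("essay_page_4_smartphone" if has_smartphone else "essay_page_4_no_smartphone")
    pages.append("anything_else")  # the final, optional open-ended question for everyone
    return pages

# Example: a social media user with a smartphone sees the full set of optional essay pages.
print(route_respondent(uses_social_media=True, has_smartphone=True))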
Unlike the interviewees, who were sampled randomly, all students who took the online survey volunteered to do so. My reasons for this were twofold. The first was practicality: convincing randomly selected, nonvolunteer students to participate in a study is incredibly time-consuming and labor-intensive. The second has to do with the potential results: I was interested to see how the results would differ depending on whether the students (1) had been randomly sampled, as with the interviews, or (2) had volunteered because they were interested in the subject matter. I thought it would be valuable to have both types of participation so that I could compare them.
The obvious weakness of my sampling method for the online survey is exactly this: it’s an all-volunteer student pool. But, in my opinion, the fact that the interviews were all randomly sampled and therefore provide a comparison pool against the volunteers not only helps compensate for this weakness but actually helps to strengthen the results of the study overall.
Another thing to note about the online survey is that only nine of the thirteen institutions where I conducted interviews participated—the three Catholic, the three private-secular, the two evangelical Christian, and one of the public institutions. At two of the public institutions, it proved too difficult to get approval to administer the online survey to students. Many larger universities restrict how frequently their students can be surveyed, and they often prioritize their own surveys over those that come from outsiders such as myself; they do not want their students experiencing “survey fatigue.” At one of the mainline Protestant schools, I simply got through to the people who needed to approve the online survey too late for it to go out (students were already well into final exams before summer); at the second mainline Protestant school, I never heard back from my contact about administering the survey at that institution.
As with the interviews, students were invited to participate via email by my campus contact at each institution. At three schools (two Catholic, one evangelical Christian), the survey invitation was sent out to the entire student population on campus; at two schools (one evangelical Christian and one private-secular), the survey was sent out to the entire population of students who lived in undergraduate housing; at two schools (both private-secular), the survey was emailed to all students in the Honors College; at the public university, it was sent to all students at one particular undergraduate college; and at the remaining Catholic school, it was sent out to the population of students taking theology courses during that particular spring semester.
There was no cash incentive to complete the online survey. A total of 884 students across the nine participating schools volunteered to take the survey, which opened in March 2015 and closed in June 2015. Though the response rate was low, the students who did volunteer wrote extensive answers to at least some, if not all, of the essay questions.