A Piedmont High School email inbox wouldn’t be quite the same without the weekly pile-up of surveys. Whether hidden behind a “Course Update” or heralding “Survey!” in the subject line, PHS students typically receive at least two surveys a week by email. While most students are aware of the surveys, opinions on them vary.
“I don’t see the benefits,” senior Max Gaylord said. “No action is taken on them.”
Many other students shared similar opinions.
“They don’t benefit the school and no one listens to them. They’re pointless,” junior Jasper Schuetz said.
Even when students do answer the surveys, they can feel their opinions are still ignored.
“In the [survey] about the pride flag, they didn’t really use a lot of our opinions,” freshman Cory Minor said.
Multiple other students across grade levels agreed that the surveys are not being used to actually implement change.
On the other hand, many students simply disregard the surveys they receive.
“I’ve never opened those [surveys],” senior Harper Mand said.
Junior Olive Reining shared a similar sentiment.
“I never do them,” Reining said. “If it looks vaguely interesting I’ll fill it out.”
“No one answers them unless the teacher specifically mentions it,” junior Nate McKenzie said.
Freshman Zoya Schulze had a similar idea.
“If you want to get the students to answer the surveys, I think it should be an assignment in class,” Schulze said. “I think that the surveys are really great but I think a lot of people in the school don’t really open up the surveys and check to see what they’re about, so the surveys don’t get the student opinion.”
However, some students who take the surveys believe they are being used for good.
“Usually [the surveys] have information that I would like to know about or things I would like to make change to,” sophomore Raiden Holt said. “I think they do use the information in the surveys to make change.”
Math teacher Thomas Palsa agreed, adding that there is good intent behind the surveys being sent.
“When Mr. Yoshihara sends [surveys] out, they’re concrete surveys. He’s earnestly trying to gauge what people want,” Palsa said.
Principal David Yoshihara expressed similar thoughts.
“We want to see if there is student interest in different ideas,” Yoshihara said. “They are just to gauge general support.”
Additionally, Yoshihara said he makes sure to run his surveys by others in the office to prevent bias.
“When I create a survey, I’ll run it by some other folks, so I’ll run it by Mr. Marik or Mr. Dolowich,” Yoshihara said.
Math department chair and statistics teacher Dr. John Hayden emphasized the importance of these checks, explaining from a statistical perspective why surveys need to be reviewed for bias.
“Something a lot of people forget about is having someone else review your survey before you send it out because an outsider can find sources of bias,” Hayden said. “Surveys can be very manipulative if you ask questions in particular ways that sort of lead your participant to answer certain ways, and depending on the construction of the survey, you can kind of predetermine what the outcome is gonna be.”
Hayden also highlighted some benefits of surveys.
“When they’re done right, they can be really effective in gathering information from a much larger audience than you might be able to get otherwise,” Hayden said. “Doing a survey allows people to give answers that can be more honest, depending upon how the survey is set up.”
Yoshihara agreed with this, mentioning how honesty is a priority in the surveys he sends.
“In general, if the surveys are anonymous, you’re not dealing with integrity issues,” Yoshihara said. “I think most people don’t have a reason to not be truthful on them.”
Additionally, Yoshihara explained how he addresses response rates on the surveys he sends.
“We usually get maybe a quarter, 25 percent. If we push harder we might get a higher amount, but a quarter most of the time,” Yoshihara said. “When I send out a survey, I like it to be short. I think you get a better response rate if people know it’s a couple questions rather than 20 or 30.”
However, Yoshihara also said he is aware that some students never check their email, or the surveys in it, at all.
“There could be people who either don’t read them, or don’t have time,” Yoshihara said.
ASB President senior Gen Hiller agreed, emphasizing that the students who do check their emails end up having a bigger say in the things decided by surveys.
“People that check their email and that are always on Schoology and super active have kind of a bigger voice, or a louder voice,” Hiller said. “And so I think at times, those surveys can be less of a good gauge of the public opinion, because you tend to only get people who are very, very passionate about it, and not some of the other people that maybe have an opinion but don’t know that they can voice their opinions.”
Hiller also addressed student interest in survey results, explaining why ASB does not typically release them.
“We haven’t released the results of our surveys for some of the smaller things, not because we’re trying to hide things, but more just because we’re like, oh, ‘this is feedback specifically to us’,” Hiller said.
At the district level, there is also significant interest in survey results, according to Superintendent Jennifer Hawn. Still, while results from the cell phone survey were discussed in a board workshop, direct results have never been publicly released for any of the surveys.
Given that lack of information, the question remains: what do administrators actually do with the results?
“We haven’t really made any decisions with any of the surveys, including the belonging survey,” Hawn said. “So no decisions have been made based on the surveys … we’re not a survey company, we’re not creating surveys for any formal purpose.”
Hawn continued to emphasize the informal nature of the surveys, explaining the general intent behind them.
“The goal is just to get feedback,” Hawn said. “It’s one method of getting perspective.”