In the age of smartphones, hacking and fears of violated privacy, taking the pulse of a city of 1.5 million is increasingly difficult.
To solve the problem, Temple University's Institute for Survey Research is trying a new surveying method to gather data that paints an accurate picture of Philadelphia. The ISR is using bus stop ads, social media and neighborhood outreach to recruit a panel of several thousand Philadelphians who will field survey requests from government and nonprofits in the area indefinitely, staff there said. Those who sign up for BeHeardPhilly can respond to surveys through text message, phone call or email.
"It's extremely immediate," said Heidi Grunwald, ISR director. "It's right there on your phone."
BeHeardPhilly is the first citywide panel of its type in the country, Temple's staff said. The method will get its first application in the coming days, as the ISR gathers information about Philadelphians' traffic-related knowledge and behaviors for the city's Streets Department. Those who sign up get $5 as an incentive for completing the survey.
"Instead of reinventing the wheel every time you want to know something about your citizens," Grunwald said, "this is a much more easy way to do this."
Those interested can join in several ways:
People can also email for more information at email@example.com.
The opt-in approach, which essentially extends an open invitation to anyone to sign up and then surveys those who respond, has critics. Because it isn't random like cold calling or knocking on every door in a neighborhood, its accuracy can't be validated with statistical methodology, the suite of tools that generates figures like the margin of error. But it also offers a way to overcome the low response rates, in some cases in the single digits, that plague statistical surveying.
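For context, the margin of error mentioned above comes from a simple formula that only applies to random samples, which is why opt-in panels can't claim one. A minimal sketch with illustrative numbers (not ISR figures):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion p estimated
    from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# A random sample of 1,000 respondents gives roughly a
# plus-or-minus 3 percentage point margin of error.
print(round(margin_of_error(1000) * 100, 1))  # → 3.1
```

The formula assumes every member of the population had a known chance of being sampled; with a self-selected panel that assumption fails, so the number would be meaningless.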
"I think it's safe to say this is the future of survey research," Grunwald said. "We're trying to be on the cutting edge."
Modern technology is giving survey methods a beating, experts said. People who don't have landlines and don't answer cellphone calls undermine one of the most dependable ways to build a probability sample. Fears of hacking and information theft have left people wary about answering questions, said Robert Santos, vice president of the American Statistical Association and chief methodologist at the Urban Institute in Washington, D.C. Those methods have also become very expensive; assembling a random panel can cost up to $20,000, Grunwald said.
Assembling an opt-in panel, and turning to the same pool of people whenever a study is needed, costs a fraction of that per survey, she said. And with this approach, dismal response rates rise to around 30 to 40 percent.
Stats experts say the approach has merit. It's a good way to measure how much issues matter to a community, or to gauge attitudes, Santos said. A random panel is more precise, but some questions don't need decimal-point accuracy to provide insight into a population.
"Statistical methods are ways of gathering information and knowledge," Santos said. "A lot of times you don't need that."
Another expert, Roger Tourangeau, vice president and associate director at Westat, said opt-in surveys are simply more practical in some cases. He, along with colleagues Frederick Conrad and Mick Couper, authored a book on online surveys, The Science of Web Surveys.
"These online panels make things affordable," he said. "The other option isn't to do a probability sample, because they're too expensive. The other option is to do nothing."
Temple University will face challenges with this approach, though, the experts said. Among the most obvious: the volunteers who participate risk creating selection bias. People who actively seek out a survey, rather than those who are randomly selected, may skew results. There are also issues of confirming that online participants are who they say they are and aren't engaging in fraud, Santos said. A panel that's used repeatedly tends to suffer attrition as people drop off or ignore surveys, and panelists can become conditioned in ways that skew results.
Another concern: older, less educated and poorer people are less likely to participate in an online study, Tourangeau said. In part, that's because some of those populations have less access to the internet.
And, while statistical precision isn't always needed, not using it raises the possibility of getting results that can't be tested for accuracy, Santos said.
"It means you're putting a lot more faith with a lot less science to defend it," he said.
Temple's stats gurus are aware of the potential pitfalls in their approach, Grunwald said. They are actively working to find volunteers in communities underrepresented in opt-in studies. They also plan to do random checks of panel participants to confirm the identities of those who register online.
The ISR plans to recruit steadily to ensure the panel continues to represent the city's demographics. Offering texting and phone options, not just online access, is another way to get buy-in from groups less likely to respond. They also say they will use different approaches for different kinds of studies. Some don't need the accuracy statistics offers, but there are tools that allow opt-in survey data to be weighted so that it resembles, and can be analyzed like, data gathered from a probability sample.
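The weighting described here is commonly done by post-stratification: respondents from underrepresented groups get larger weights so the panel's demographics match the city's known population shares. A minimal sketch with invented numbers (not ISR or Census data):

```python
# Post-stratification: weight each respondent so the panel's age mix
# matches known population shares. All figures below are made up
# for illustration only.
population_share = {"18-34": 0.35, "35-54": 0.35, "55+": 0.30}
panel_counts = {"18-34": 500, "35-54": 300, "55+": 200}  # opt-in skews young

n = sum(panel_counts.values())
# weight = (population share) / (panel share) for each group
weights = {group: population_share[group] / (panel_counts[group] / n)
           for group in panel_counts}

# Each 55+ respondent now counts 1.5x; each 18-34 respondent, 0.7x.
print({g: round(w, 2) for g, w in weights.items()})
```

As the experts caution, this corrects only for the demographics you weight on; if volunteers differ from non-volunteers in ways the weights don't capture, the bias remains.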
That data manipulation isn't a complete fix, though, Tourangeau cautioned.
The ISR has about 750 people volunteering for the panel so far, and hopes to have 3,000 to 5,000 in the first year of recruiting. The ultimate goal is a 10,000-person panel with participants representing the full range of the city's population. The participants' information would be kept in a database, allowing future surveys to easily home in on hard-to-study populations.
"Over time we learn more and more about our panel members," Grunwald said.
The methodology the ISR is pursuing is also just a starting point, she said. The problems it faced were low response rates and high costs; the opt-in panel is an experiment that will be modified and adjusted to get the best results.