Fax-based surveys give PC World magazine flexibility and quick turnaround at a low cost

PC World magazine uses fax-based surveys, with the help of the program Teleform, as a fast and economical research method.

Author: Quirk’s Editor Joseph Rydholm

The fax has become an indispensable part of doing business, primarily because it has two attributes that are much-prized in the money-making world: speed and low cost. Since these attributes are also sought after by marketing researchers, it’s no surprise that fax machines are being used more frequently to do surveys.

Quick turnaround and low cost are what drew the research staff at PC World magazine to fax technology almost two years ago. Since then, the publication’s readers have faxed back thousands of questionnaires to the magazine’s San Francisco offices.

PC World is a monthly with a circulation of 910,000. Its target audience is business managers who are responsible for buying computers and related products for their companies.

Along with the focus groups and phone and mail surveys it uses to determine editorial direction, the PC World research staff uses a software program called Teleform, made by Cardiff Software, Solana Beach, Calif., to design a variety of fax-based surveys that are inserted into the magazine each month.

Teleform does automated forms processing, reading the data on the forms (whether it’s hand-printed, typed or machine-printed characters [OCR] or shaded and checked circles [OMR]) and entering it into the user’s data analysis package of choice.

The survey form includes a mark in each of its corners, which helps correct, prior to processing, any elongation or skew introduced during fax transmission. In addition, a machine-readable code gives each survey form a unique identity, which simplifies sorting when completed questionnaires are coming in for more than one fax-based survey.
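The geometry behind the corner marks is simple to illustrate. The following Python sketch, using entirely hypothetical pixel coordinates (Teleform's actual algorithms are not documented here), shows how detected corner positions can yield the skew angle and vertical stretch that a processing step would then undo:

```python
import math

# Hypothetical corner-mark coordinates (in pixels) for one received fax.
# On the printed form the four marks sit at the corners of a perfect
# rectangle; fax transmission can rotate (skew) and stretch (elongate) it.
printed = {"tl": (0, 0), "tr": (1000, 0), "bl": (0, 1400), "br": (1000, 1400)}
scanned = {"tl": (12, 30), "tr": (1008, 48), "bl": (-13, 1424), "br": (983, 1442)}

def skew_degrees(marks):
    """Rotation of the form, estimated from the two upper corner marks."""
    (x1, y1), (x2, y2) = marks["tl"], marks["tr"]
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

def elongation(marks, reference):
    """Vertical stretch relative to the printed form (fax rollers can
    lengthen or shorten the page slightly)."""
    scanned_height = marks["bl"][1] - marks["tl"][1]
    printed_height = reference["bl"][1] - reference["tl"][1]
    return scanned_height / printed_height

print(f"skew: {skew_degrees(scanned):.2f} degrees")
print(f"elongation: {elongation(scanned, printed):.3f}x")
```

Once the skew and stretch are known, every answer area on the page can be mapped back to its position on the original form before the marks and bubbles are read.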

Editorial research

PC World has been using Teleform for two years now, says Thomas Gewecke, research manager, PC World magazine. Initially the program caught the staff’s attention as a way to do editorial research cheaply and quickly.

Regular reader surveys help the magazine answer important questions such as whether editorial should focus on hardware versus software, tech tips or consumer information, etc.

The magazine already had a long-standing monthly telephone tracking survey, in which subscribers were called to find out which articles they read and how valuable they were, and to measure the success of editorial experiments. While the data from the telephone survey is valuable, it’s costly to obtain, Gewecke says. “Even a fairly simple phone survey can become a $10,000 or $20,000 enterprise, which is a big part of our editorial research budget.”

In addition, lengthy processing times mean that the data isn’t always available soon enough for the magazine to act on it. By the time the results come back, it can be several weeks after the issue being researched was mailed to subscribers. This is troubling for any publication, which typically has a long lead-time between when stories are planned and written and the time they appear in the magazine. But it’s especially so, Gewecke says, for a magazine covering the computer industry, where technology changes so rapidly. “If problems are uncovered in the telephone research, a delay of three or six weeks in generating results means you miss two to three issues before you can correct a problem and change editorial direction.”

18-month experiment

After discovering Teleform, the research staff began an 18-month experiment that involved putting surveys in the magazine and having readers fax them back. The in-magazine form used the same questions as those in the telephone survey and offered a prize of computer equipment to encourage responses. The survey also asked readers to supply demographic information.

Response volume averaged between 5,000 and 7,000 faxes per month. The magazine began receiving 500 to 1,000 responses within just a few days after issues were mailed.

For about a year and a half the magazine’s fax surveys were done in tandem with the telephone survey, Gewecke says, to allow comparison of the results. “Over time we were able to compare the results of the random sample survey with those from the self-selected survey and come up with some characteristic ways in which they tended to vary, which gave us ways to interpret the early returns of the fax survey.”

Gewecke fully acknowledges that the fax survey has the inherent problem of self-selection bias but he says the surveys have proven very valuable to the magazine, chiefly as a kind of “early read” on the readership levels of each month’s articles.

“The fax survey allowed us to generate large samples which let us get an early feel for which articles were being read. It gave us a very valuable comparative tool both to benchmark the random sample phone surveys and to get some of the same data much sooner for much less money.”

In tracking the two surveys Gewecke found that the fax survey accurately reflected the readership levels indicated by the telephone survey for the most- and least-popular articles. Those that fell in the middle range were more difficult to gauge.

“We did a lot of comparisons to see if the non-random sample generated the same article rankings, for example. We found that about 75 percent of the time they generated rankings that were similar.”
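A ranking comparison of this kind can be done with a standard rank-correlation measure. The sketch below uses invented readership scores for eight articles (the article doesn't report the actual figures or the statistic PC World used) to show how closely a self-selected fax sample might track a random phone sample, with the extremes agreeing and the middle shuffling, as Gewecke describes:

```python
# Hypothetical readership scores for eight articles from the two surveys.
phone = {"A": 62, "B": 55, "C": 48, "D": 47, "E": 40, "F": 33, "G": 21, "H": 15}
fax   = {"A": 70, "B": 44, "C": 50, "D": 41, "E": 45, "F": 30, "G": 24, "H": 12}

def ranks(scores):
    """Map each article to its rank (1 = most read)."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {article: i + 1 for i, article in enumerate(ordered)}

def spearman(a, b):
    """Spearman rank correlation, using the no-ties formula."""
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    d_squared = sum((ra[k] - rb[k]) ** 2 for k in a)
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))

print(f"rank agreement (Spearman rho): {spearman(phone, fax):.2f}")
```

In this made-up example the most- and least-read articles rank identically in both surveys while the mid-range articles trade places, which is exactly the pattern the staff reported: reliable at the extremes, fuzzier in the middle.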

Once the staff had validated the usefulness of the methodology, it became a valuable and time-saving tool to fine tune PC World’s larger editorial direction, Gewecke says. “Having flexibility and agility in terms of changing our editorial direction quickly is very important.”

New use

Following the success of the editorial research, the magazine last fall found a new use for the fax-based surveys. As part of what the magazine calls Service and Support Monitor, each issue contains a faxable form that asks readers to detail any experiences they’ve had with the products and service of a wide range of PC manufacturers. “One of our big research problems for a long time has been how to gather usable service and support experience data for the large number of PC brands we review without spending huge sums,” Gewecke says.

The survey form lists over 60 computer makers and asks respondents to answer questions about any computer they’ve purchased from those manufacturers, to determine how the computer has functioned, if there have been any problems, and if so, what kind of service and support the customer received.

“Service and support has become one of the most critical buying criteria for computer buyers and certainly for our readers, the managers who buy for their workplace and people who buy for home use. It’s a way to differentiate between brands in a market where prices have dropped and many models have similar features,” Gewecke says.

Assess performance

PC World has tried to obtain this information in the past, for example, by having staffers pose as customers and call service departments with problems to assess the company’s performance. Another approach is to survey a random sample of readers by mail. But these methods haven’t netted the depth of information the magazine wanted.

“The problem is, after the largest vendors, there are some small companies who are very innovative or inexpensive or who are doing things that make them worth our readers’ consideration. If you survey any universe of computer buyers it’s very easy to get subsamples of people who own computers made by market share leaders like IBM or Apple. But it’s almost impossible and would cost hundreds of thousands of dollars to do a mail survey of PC owners and get usable subsamples of owners of all the smaller brands we review. And there’s really no easily available commercial list of computer owners with PC brand data attached.”

The fax-back survey is an attempt to gather that data. Readers are sending them back at a rate of 10,000 to 20,000 a month. “We’re going to be able to build a database — again, a non-random, self-selected one — but we’ll have 100,000 responses in six to nine months. The result will be a unique database of service and support experiences segmented by brand that doesn’t exist anywhere else.

“We hope to use it to provide a historical assessment of our readers’ experience with different vendors and also to provide an ongoing tracking vehicle. Because one of our goals is to rank vendors in a positive sense in terms of those who have delivered good service and support to their customers.”

Readers are invited to send in an entry each month to update their experiences. “Assuming the sample size is large enough, we expect to be able to correlate other indications we might have that a company is improving its service or experiencing difficulties with changes in the flow of surveys coming in.”

The magazine already invites readers to send letters on their service and repair experiences to its Consumer Watch column. But it’s difficult to know if one reader’s experience is an isolated case or an example of a larger problem. The fax survey data can help clarify that.

“One of our goals with this survey is to have a much bigger sample of people coming in all the time. So we can say, for example, that on average we get 100 or 200 responses a month about vendor X and 80% of them are always positive. And if suddenly only half of them are positive that would give us another data source to investigate and follow up on. We want to collect data in this area and alert readers where it seems prudent.”
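The kind of alert Gewecke describes is a simple baseline comparison. The sketch below, with invented monthly tallies for a single vendor (the magazine's actual data and thresholds are not published in this article), flags a month whose positive-response rate falls well below the vendor's historical average:

```python
# Hypothetical monthly Service and Support Monitor tallies for one vendor:
# (total responses, positive responses) per month.
history = [(110, 90), (95, 78), (120, 97), (105, 85), (98, 80)]
latest = (102, 55)  # this month: only about half the responses are positive

def positive_rate(month):
    total, positive = month
    return positive / total

def flag_shift(history, latest, drop_threshold=0.15):
    """Flag the latest month if its positive rate falls more than
    drop_threshold below the historical average rate."""
    baseline = sum(positive_rate(m) for m in history) / len(history)
    current = positive_rate(latest)
    return current < baseline - drop_threshold, baseline, current

flagged, baseline, current = flag_shift(history, latest)
print(f"baseline {baseline:.0%}, current {current:.0%}, investigate: {flagged}")
```

A flagged month wouldn't be published on its own; as the quote makes clear, it would simply prompt the staff to investigate and follow up with other sources before alerting readers.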

Basic drawback

Gewecke stresses that he realizes this fax survey information has a basic drawback: it’s the antithesis of a random sample survey. “We don’t use any of this data in isolation. The editorial tracking study was always used in parallel with another random sample survey. And it turns out to have unique characteristics that made it quite valuable to have in addition to the random sample surveys. We never discontinued the other sampling methodologies and no one here would consider it valid to rely just on the fax survey for the editorial readership scores.

“In the case of the Service and Support Monitor, we’ll never claim that we’re getting a fully projectable measurement of readers’ experiences. When we present findings from it in the magazine we’ll note that there are some baseline thresholds of error. But because the survey isn’t biased for or against any particular vendor, we believe that the data is a very valid and certainly unique source of comparative information about the service and support experiences our readers have with different PC vendors. The absolute numbers we get back, such as the percentage of a PC brand’s users who have experienced a hardware problem, are interesting but not as valid as those produced by a random sample mail or telephone study. But the relative numbers, comparing one brand to another, are quite valid, particularly as we track changes from month to month.

“It doesn’t supplant more traditional classic research that we do but it allows us to do kinds of research that no one else is able to do. It’s given us a lot of flexibility in our approach to research in designing new kinds of surveys where before we wouldn’t have bothered or it wouldn’t have been cost effective. This produces some research tools and instruments that have proven very valuable and would not have been possible to obtain any other way.”