Conducting surveys and interviews may seem simple enough, but it is amazing how many advocacy evaluations are based on poor data. While not everyone has to be an expert, there are a few simple tips that can make your advocacy evaluation efforts more rigorous and lead to more informed and accurate results. We are committed to giving you an overview of critical evaluation skills without overwhelming you, so we offer these pointers on qualitative and quantitative techniques to get your feet wet in the topic.
Rest assured, you will probably realize that the most basic skills are things you already have. If you would like to improve your advocacy evaluation skills, we provide some sites and resources at the end of this section to learn more.
Qualitative evaluation techniques deal with things that cannot be put into numbers. These might include responses from people through interviews, focus groups and observation. Some general rules of thumb for qualitative evaluation:
- Avoid leading questions: Our interest in reaching the desired conclusion is completely understandable. Don't let it stand in the way of constructing questions that will yield the best information. When conducting an interview, survey or focus group, be sure to review your questions to ensure that they don't lead respondents to a particular conclusion. Don't ask someone if “the cold weather” impacted their decision not to participate in your rally. Just ask, “Why didn't you participate in the rally?” This example makes the problem obvious. But the next time you write evaluation questions, double check whether you are asking a truly open question or sending someone towards a particular answer. You may be surprised what you find.
- Survey questions: Surveys can quickly provide information. But again, be careful not to narrow the potential feedback you can receive. Every time you force someone to check off an answer that you have predetermined, you are limiting your ability to receive “out of the box” feedback. Be sure to always give your respondents an “other” option, which gives them room to explain their thoughts.
- Focus groups: If you convene a focus group to learn more about your advocacy activities, be mindful that these groups are designed to represent different views from your constituency to determine the most effective course for your organization. Work to get as many diverse views around the table while still keeping the group manageable. The more people participating, the more ideas and feedback. A good moderator will carefully construct questions and conversation prompts to gather information you need while not leading the group to predetermined conclusions. The beauty of a focus group emerges when you allow the participants to bounce ideas and thoughts off one another and travel down unintended paths. Those paths may lead to the best feedback.
- Alternative forms of feedback—artistic responses: Not all feedback has to come via the written or spoken word. Think about ways that you can use the arts to evaluate your work. You can learn a lot about the impact of your work from the way your constituents express themselves. Take for instance the Global Campaign for Education's (GCE) 2005 Send My Friend to School campaign. Teachers implemented a lesson plan on access to education in other countries. Instead of sending a petition, young people were asked to create a “cut-out” friend representing one of the 100 million children around the world without access to education and to write a message to G-8 leaders on the back of the cut out. By looking at the “friends,” evaluators were able to see what young people think about the issue of universal access to education. This gave some formative feedback on the effectiveness of the lesson plan, which sought to portray people living in poverty respectfully.
- Technology is your friend: Digital technology can also help you receive feedback. From blogs to photos, digital information can help you evaluate your advocacy efforts in ways that were previously impossible. And you no longer have to convene an in-person focus group or stand on a street corner to get surveys filled out. Think about ways that online survey tools, Web cams, online conferences and the telephone can make your evaluation life easier.
- Encourage continuous feedback: People like to be consulted about their opinion. Promote a policy of open feedback from your constituents and facilitate channels through which they can talk to you. If you do receive feedback, acknowledge it, thank the constituent, and tell her or him how you have used it.
Quantitative techniques use numbers to help evaluate your initiatives. Here are some tips on how to reach some powerful conclusions:
- Make your qualitative responses count: Once you have responses to qualitative questions, think of ways that you can turn this information into a number. Asking questions with limited choices (while still incorporating a section for “other”) is a great way to easily be able to state that, for example, “25 percent of the interviewed constituents identify with the advocacy slogan” or “86 percent of the interviewed participants found the e-mail blast to be too frequent.”
- How much is enough: People often claim that their results are “statistically significant.” All that means is that the researchers drew a random sample that was large enough for the survey responses to reliably reflect the views of the population it represents, rather than being a product of chance.
A random sample is an unbiased, completely random group of people drawn from your larger population. Sometimes statistics are misused because the sample is not random. For instance, if your information comes from an optional Web survey, it is not random; it is a survey of the people who are able and willing to respond to Web surveys. But, depending on how detailed you need to get, an online survey sample may be good enough to get you some helpful information. In some cases, a Web survey is the only appropriate tool, especially for groups that primarily or often use the Web for campaigning.
To figure out how many people you need to survey to get highly accurate data, check out one of the simple sample size calculators available online.
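To make the arithmetic concrete, here is a minimal sketch of the standard sample-size formula (Cochran's formula with a finite population correction) that such online calculators typically use. The function name and the 95 percent confidence default are our own illustrative choices, not part of any particular calculator:

```python
import math

def sample_size(population, z=1.96, margin=0.05, p=0.5):
    """Estimate how many survey responses you need.

    population -- size of the group you want to represent
    z          -- z-score for the confidence level (1.96 = 95 percent)
    margin     -- acceptable margin of error (0.05 = plus/minus 5 points)
    p          -- assumed response proportion (0.5 is the most conservative)
    """
    # Cochran's formula for an effectively unlimited population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction shrinks the sample for smaller groups
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# A 1,000-person constituency needs far fewer than 1,000 responses:
print(sample_size(1000))    # 278
print(sample_size(100000))  # 383
```

Note the intuition the formula captures: beyond a few hundred responses, the required sample grows very slowly as the population grows, which is why even national polls can get by with roughly a thousand respondents.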
Cut costs without cutting corners
So maybe your staff cannot spend its valuable time developing perfect survey questions or maybe it will take too long to interview enough people to get the right data. You may need to think about other sources of manpower and expertise. One thing to consider would be to hire undergraduate or graduate interns from fields you may have thought were distant from your advocacy initiative:
- A statistics major may want to apply his or her skills to a social cause and can design some great quantitative evaluation tools.
- An anthropology major could use his or her qualitative skills to help construct unbiased questions for your next survey or focus group.
- Communications majors may have learned how to conduct a random sample survey over the phone.
Getting more information
As we said, we cannot begin to cover the fields of qualitative or quantitative evaluation skills in a small space. Check out some of these resources to get more information and training to boost your ability to conduct and understand advocacy evaluation:
- The Innovation Network's Evaluation Resource Center is host to many online guides on surveys, interviews, focus groups, statistics and data analysis and reporting.
- The United States Department of Agriculture Graduate School offers inexpensive skill-building courses in statistics and data analysis in Washington, D.C., other parts of the country and online.
- Community colleges often offer continuing education credits in quantitative or qualitative analysis.
- Many nonprofit associations offer training workshops that cater to the 9-to-5 schedule. Check if your organization is a member of an association that offers professional development opportunities.