
NATIONAL INFORMATION CENTER
ON VOLUNTEERISM

Boulder, Colorado

© -- permission for use with acknowledgment



Basic Feedback System:

A Self-Assessment Process for Volunteer Programs

By
Bobette W. Reigel
With assistance from Ruth D. Miller

National Information Center on Volunteerism, 1977


Contents:

I.  Introduction to the Basic Feedback System
II. How to Use the Basic Feedback System
III. Uses of New Checklist for Voluntary Action Centers and Volunteer Bureaus
IV. Scorecard for Volunteer Coordinators and Directors
V. Top Management Checklist
VI. Volunteer Feedback Form
VII. Staff Reactions to Volunteer Programs
VIII."You Have A Volunteer – What Do you Think?" Feedback form for one-to-one clients
IX. Checklist for Board Members
X.  Checklist for Voluntary Action Centers and Volunteer Bureaus

 

CHAPTER I

INTRODUCTION TO THE BASIC FEEDBACK SYSTEM

Developed by Dr. Ivan B. Scheier and the Operations Analysis Unit of the National Information Center on Volunteerism, the Basic Feedback System (BFS) is a structured self-assessment process designed especially for volunteer programs. Its purpose is to increase the effectiveness of the volunteer program at a minimum cost of time and money, with maximum input from a wide selection of the people involved with the program.

The Basic Feedback System cannot substitute for an outside professional evaluation. It can, however, offer a way of gauging or roughly measuring the function, performance, commitment, and satisfaction levels of those involved with the volunteer program. When the system is used on a regular, ongoing basis, the volunteer director can identify small problems as they emerge and take action to resolve them.

A special advantage of this system is the development of individual forms for six of the volunteer program constituencies: the director or coordinator, volunteers, paid staff working with volunteers, top administration, one-to-one clients, and board members. In addition, this publication introduces the final version of a new programmatic form designed for directors of Voluntary Action Centers and Volunteer Bureaus.

In addition to roughly taking the "temperature" of a volunteer program, this system can yield a number of unforeseen benefits. Administering the feedback forms to everyone in the program can act as a consciousness-raising device in itself, giving the program the visibility and serious consideration it may have lacked previously. Often individuals are more willing to articulate frustrations in writing and anonymously than through direct personal contact. And the conclusions the volunteer coordinator draws can help in formulating objectives for a professional outside evaluation. Tabulations from previously administered forms, even though they are self-reported, can serve as useful background information for an outside consultant; often the consultant can record impressions on the same forms, as a validity check on the self-assessment process.

The Basic Feedback System and the national comparative norms are designed only as guides, to be incorporated with other impressions, evidence, and data on hand. They should be viewed as flexible, as feedback for discussion and even training. Whenever possible, BFS should be used as an adjunct or stimulus to outside, professional evaluation.

A. A WORD ON NATIONAL NORMS

This publication presents the latest and largest base for BFS national norms. National norms for the older forms have been established over the past four years. However, NICOV does not have a sufficient number of sectional and total scores to compile norms for some of the newer forms. No norms exist for the new VAC Checklist or the revised one-to-one client form, and only very general estimates are presented for the new Board Members Checklist. All norms appearing in this publication are at best approximations, and it is important to note that much in these forms is valuable even though it is not scoreable or entered into the norms.

The norms show the distribution of all scores NICOV has received from a wide range of volunteer programs: public schools, hospital programs, youth service organizations, Red Cross, criminal justice agencies, RSVP, JRWCA, and other human service programs. Percentiles (the estimated percentage of the total population scoring at or below a given score) are used to standardize the scores. If a raw score falls at the fiftieth percentile, it is fair to say that it is an average score, with approximately half of all raw scores above it and half below.
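
Purely as an illustration of how a percentile rank is computed (this sketch is not part of the original BFS materials, and the score values in it are hypothetical), the idea can be written out in a few lines of Python:

    # Hypothetical raw scores reported by many different programs.
    collected_scores = [42, 45, 48, 53, 55, 59, 61, 62, 66, 70]

    def percentile_rank(raw_score, all_scores):
        # Estimated percentage of scores falling at or below the given raw score.
        at_or_below = sum(1 for s in all_scores if s <= raw_score)
        return 100.0 * at_or_below / len(all_scores)

    # A program scoring 59 lands near the middle of this hypothetical distribution.
    print(percentile_rank(59, collected_scores))

A program whose raw score falls near the middle of the collected scores would, on this reading, be about average for the programs reporting.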

NICOV has a system for the development of new self-assessment forms and is interested in contracting to create forms for specialized key areas and key functions in volunteerism. Your completed forms and your input on a system which is still developing will be very welcome. 

CHAPTER II

HOW TO USE THE BASIC FEEDBACK SYSTEM

There are many ways to apply BFS, but most volunteer program directors elect to administer the forms regularly, perhaps every three months, to each of the six constituencies. Often, a director will decide to adapt the forms to the specific conditions of the individual community or program. We strongly suggest adding questions at the end of the forms rather than changing the forms themselves. In this way, the forms can be scored as per instructions, the national norms can be used, and responses to the special added questions can be analyzed separately.

Whenever possible, BFS forms should be administered in a face-to-face setting, either to individuals or to groups, rather than mailed. Ordinarily, it helps to offer anonymity and confidentiality. It is also possible to use the structured interview approach, where the interviewer records the responses. If you decide to administer the BFS forms, the following suggestions may be helpful.

1. The Scorecard is designed for the volunteer coordinator or director. It can be self-administered at the very beginning of the program as a useful "standard setter," and every subsequent quarter to gauge development and potential problem areas.

2. If at all possible, administer the Top Management Checklist before the program begins. Designed for the high-level administrator or policy-maker in the agency or major unit, this form asks questions concerning specific agency commitments to the volunteer program. The program should not proceed until the administrator has an understanding and acceptance of the basic necessary commitments.

3. As we all know, an effective volunteer program must have continual input and cooperation from paid staff. The Staff Reactions form provides important attitude indicators, as well as demonstrating to staff that their feedback is essential. Be sure to follow up with a report on findings and a discussion with staff and, later, staff and volunteers together.

4. The other forms (Volunteer Feedback, Board Members, and one-to-one client) can be administered in groups or individually. Ask for frankness, with the assurance of confidentiality. The most effective administration of the one-to-one client form (You Have A Volunteer--What Do You Think?) has been for the interviewer to read each question aloud, allowing for some explanation or discussion, with the response recorded by the interviewer.

5. Instructions for scoring the forms and existing national norms are presented with each form.

6. As mentioned above, in addition to assessing the program, response tabulations from BFS can serve as a springboard for discussion among the several constituencies; these people also deserve to know how the program is doing.

Perhaps the following fictitious example will illustrate one general approach to the use of the Basic Feedback System.

Project Proof Positive is a volunteer program working within the Division of Social Services in an urban area. The part-time volunteer coordinator is funded by a federal grant with some county support. This program involves volunteers working with families who have alcohol-related problems; of course, the volunteers are dependent upon the cooperation of the paid case workers for much of their information and success with clients.

The program has been operating for over a year, long enough to seem somewhat organized, but Ms. Sharp, the Volunteer Coordinator, has noticed that the volunteer turnover rate appears to remain rather high. Upon checking the attendance and resignation records, she becomes convinced that the situation needs some attention, but her observations of the volunteers do not yield any tangible clues. She decides to try the Basic Feedback System to see if any patterns emerge within the different groups involved in the program.

She completes the Volunteer Coordinator Scorecard herself and discovers, upon comparison to the national norms, that her program's score for the section on Orientation and Training of volunteers and staff is below the national average; the scores for the sections on Motivation and Incentive, and Record-Keeping and Evaluation, are also somewhat low. However, she is pleased to see that her hard work in Public Relations and Recruitment compares extremely well on a national scale. Knowing that it is crucial to get the perspective of everyone involved with the volunteers, she proceeds with the rest of the process, after adding some open questions which pertain specifically to local conditions.

After giving a brief orientation to the purpose of the system, Ms. Sharp asks the division's top administrator to take ten minutes to complete the "Top Management Checklist." In the next two weeks she is able to personally contact all the paid case workers to ask them to complete the "Staff Reactions to Volunteer Programs" form. At the next group meeting for volunteers, she explains the purpose of the "Volunteer Feedback" form, asks them to be very frank in completing the form, and explains that they need not sign the forms. Board members are asked to complete the new "Checklist for Board Members." And, with the help of some caseworkers, she administers "You Have A Volunteer--What Do You Think?" to a selection of receptive clients, whose responses will be anonymous.

Upon tabulation of the six different forms, some patterns do emerge; the scoring patterns point to several possible causes for loss in volunteer motivation and the open-ended questions reveal "between the lines" some significant attitudes. For instance, top management and paid staff have professed from the beginning to be committed to the volunteer program; however, their commitment apparently does not uniformly extend to specific, organized management, time, and resource investment. Volunteer satisfaction level is low, with a decided lack of real direction and cooperation from staff, which seems to be reflected in uncertain and inconsistent feedback from clients.

Of course, none of this came as a complete surprise to Ms. Sharp, but the written feedback gave her something tangible to work with, and the general perspective of the forms helped her to step back from the daily workings of the program with a balanced, long-term outlook. She was also able to take heart from the positive feedback: for instance, Ms. Sharp has the good fortune to have a "working" board. She brought her main findings to the board, and together they developed from all the feedback a plan of action that included, among other things, heavy staff input and participation in in-service training, a public statement and resource commitment from top management, regular volunteer/staff meetings, and a volunteer recognition plan. Ms. Sharp has decided to administer the forms regularly, every three months, to a random selection of the six main constituencies of the program.

CHAPTER III

USES OF NEW CHECKLIST FOR VOLUNTARY ACTION CENTERS AND VOLUNTEER BUREAUS

When a special program form such as the VAC/VB Checklist is used, insights can be gained by administering the form to a wide variety of people involved with the VAC. In addition to the Director, responses from staff, volunteers, the board, the funder, clientele agencies, and an outside observer can prove valuable. Not only does the VAC Director gain a perspective on how others view the various functions of the VAC, but the consistency of perceptions between the groups can be revealing, too.

What does it mean when the responses of all those people agree? A consensus could mean confirmation. Or it could mean a misperception common among the groups, in which case the role of the outside evaluator becomes even more crucial. In other words, the VAC form can serve as an instrument for inside-outside verification. What if there is wide disagreement in certain sections between the involved groups? At this point it is interesting to lay out a matrix or profile showing who disagrees on which programmatic areas, and take a closer look. Again, the administration of the form to such diversely involved individuals can serve as a basis for dialogue on program development.
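
One way to lay out such a profile, offered only as an illustrative sketch (the group names, section names, scores, and threshold below are hypothetical, and Python is used simply as a convenient notation), is to tabulate each group's sectional scores and flag the sections with the widest spread:

    # Hypothetical sectional scores from four groups completing the VAC/VB Checklist.
    profile = {
        "Director":   {"Agency Relations": 18, "Recruitment": 22, "Funding": 15},
        "Staff":      {"Agency Relations": 12, "Recruitment": 21, "Funding": 14},
        "Board":      {"Agency Relations": 19, "Recruitment": 20, "Funding": 9},
        "Volunteers": {"Agency Relations": 11, "Recruitment": 23, "Funding": 13},
    }

    # Flag sections where the groups disagree widely (spread above a chosen threshold).
    for section in profile["Director"]:
        scores = [profile[group][section] for group in profile]
        spread = max(scores) - min(scores)
        if spread >= 5:   # the threshold is arbitrary, chosen here only for illustration
            print(f"Wide disagreement on {section}: {scores}")

The flagged sections are simply the places to start the closer look and the dialogue described above, not a verdict on the program.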

This form and several of the other BFS forms can be applied as a rather unusual training technique. Participants can complete the form and break into small discussion groups according to low-scored sections. For example, VAC participants who find they rate their programs low in the area of "Agency Relations and Assistance" can group together for problem-solving or brainstorming discussions.

Since this is a new form, NICOV does not have a sufficient number of sectional and total scores from which to develop national norms. If you are a VAC or VB Director intending to utilize this form, please send copies of your completed forms to NICOV to be entered anonymously into the new norms. This system can only be effective if a wide range of programs participate. Of course, NICOV administers the form at every opportunity, too.

 
