Program and service evaluation

Evaluation is a thorough analysis of all the information collected and can assist an organisation in assessing how effectively the program or service is meeting its goals. It often occurs at the end of a program, or once a year, depending on what is required or appropriate.

Evaluations are important for many reasons, including to:

  • verify if a program is meeting its aims and objectives
  • help staff understand, confirm or increase the impact of a program
  • support an organisation in making strategic decisions about a program
  • help an organisation plan how to spend funds and resources effectively
  • motivate staff and give them direction
  • promote best practice


Evaluations do not have to be a standalone project. In fact, the best evaluations occur when staff recognise the evaluation process as an integral part of program delivery itself.

Some common evaluation approaches are:

  • Design evaluation concentrates on defining a program. A design evaluation is used to document a program already in operation, or plan a new program
  • Process evaluation is concerned with what actually happens in practice, focusing on the activity of a program
  • Impact evaluation measures the impact of the program in terms of client or community benefits. It focuses on the extent to which the program has been able to achieve its objectives and deliver the desired outcomes
  • Action research uses a series of cycles to research and reflect on practices within a program. The approach allows practices to be continually developed and refined over time while the program is being implemented

Your goal should be to select the evaluation method (or methods) that will be the most cost-effective and practical for your organisation, and for the audience who will read the results. Whichever approach you use, you will need to plan an evaluation process involving four steps: design, implement, reflect and review.

The evaluation team

Evaluations are most useful when there is an evaluation team working together to produce a final report. One person may have overall responsibility for writing the evaluation, but the information gathered for the report should come from a range of places and people.

Factors to consider in the selection of an evaluator or project team include:

  • Which staff or organisation members have the time and commitment to lead the project?
  • Does the organisation have people skilled in designing and carrying out this project?
  • Is an independent perspective required? The chances of ending up with bias in your final report may be reduced if you can access outside skills and knowledge
  • Does the organisation have the resources or funding to pay an independent consultant?

For evaluations which provide feedback to the organisation on a single aspect of operations or a specific program, you may find that there are staff or volunteers keen to lead the project and develop their evaluation skills. There are many resources available to help beginners. If the evaluation is more comprehensive, or intended to underpin a major restructuring of the organisation or program, an independent person may need to lead the project and provide the final recommendations to the board.

If your organisation does not have the finances to pay a consultant to lead the whole process, consider using a consultant for aspects of the evaluation which are particularly sensitive or liable to influence. These could include developing the design or the performance indicators, conducting a focus group, or analysing the results of a questionnaire.

If you decide to engage an independent consultant, you should put together a brief about the project and what kind of skills and experience you are looking for.

The evaluation process

Design

Put together an evaluation team of interested or key staff and managers and begin by developing a program logic. A program logic (also known as a logic model) clearly states why a program is being delivered, what issue it is addressing or changing, and what outcomes are expected. The logic model can then be used by your team to agree on an evaluation framework and methodology, depending on what you want to learn or discover.

When you have clearly documented your preferred evaluation approach, you will need to develop performance indicators that are relevant, meaningful and within your control. These performance indicators will help you decide on your methodology: the manner in which you are going to collect objective and reliable information related to the indicators.

Your evaluation team should decide how extensive the evaluation is going to be. The size of the evaluation depends on your aims, skills, resources, budget and time. If your organisation does not have the time or budget to engage a consultant, you will need to up-skill your program staff so they can conduct the evaluation themselves. If you need to collect a large amount of data, or to use interviews or surveys, you can either conduct the evaluation yourself, provided you have the time and skills, or engage an independent consultant to do it for you.

Before you move on to the implementation phase, ensure you have gained formal support and endorsement of the proposed evaluation from your organisation’s board. Without formal support for your efforts, it is likely that the evaluation will be incomplete or selective in its investigations and therefore in its results. The outcomes will not be widely accepted and may lead to inaccurate recommendations.


Implement

Implement your chosen data collection methods and ensure an ongoing commitment to the evaluation. If even one staff member or volunteer does not really understand why they are collecting the data, your results may be inaccurate.

Collect the relevant data over an appropriate period of time. One month may be adequate for short-term projects, but one year may be more appropriate if you are evaluating a larger and ongoing program. Some evaluations may need data collection to occur at the beginning as well as the end of a project, or at regular intervals, and even after the project has finished (to assess long term outcomes).

In most cases, it is best to conduct a ‘pilot’ at an appropriate time after you have started collecting the data. A pilot means testing your data collection and results to see if any problems need ironing out before you go any further. After the pilot, continue to monitor the ongoing progress of your data collection throughout the project. If you are collecting a large amount of data, or collecting data over a long period, ensure that it is still relevant and staff remain enthusiastic and clear about the process.


Reflect

Making sense of the information you have collected is the most important task! This is where you really discover if your planning and data collection methods were successful. There are many ways of collecting both quantitative and qualitative data, which means there are many ways of analysing and interpreting this data as well.

You need to assess the information collected and sort it into meaningful evidence. The way the evidence is presented will shape the conclusions drawn from it. For example, if you report that 80 per cent of people completing a vocational course found work experience, this suggests a positive outcome and that your program was a success. However, if that 80 per cent struggled during the work experience and left the placement before completing it, the finding might instead demonstrate that your program needs to do more life-skills and preparatory work before sending participants on work experience.

Hopefully, the information collected will confirm that you have met your performance indicators, show how many outcomes were produced, and demonstrate what impact the organisation or program is having. The data may also reveal the efficiency and cost effectiveness of the program, including the total cost of delivering the identified outcomes.

Be careful that you do not fall into the trap of only collecting and reporting on positive outcomes and indicators. Evaluations that are objective and willing to state what did not go so well, or identify areas of weakness, are much more helpful and realistic. You will learn a great deal about a program by understanding its challenges and areas that need to be changed or modified.


Review

You should present the findings of your evaluation by organising the information you have collected, analysing it, and then describing the results accurately and objectively. How you do this will depend on the purpose of the evaluation and the needs of your primary audience.

If you decide to include recommendations in your final report, then there should be a clear relationship between the findings and your recommendations. Recommendations should draw directly on what you have learned in your evaluation and what you know about the program.

Involve as many people from your organisation as possible in reviewing the findings of the evaluation. Evaluations should help you to make informed decisions that benefit the organisation, enabling you to make changes to a service or organisation based on the findings. However, be aware that if your data is too ambiguous or unreliable, you could be making decisions based on a lack of evidence or on subjective opinions and judgements. This could harm your organisation and cause conflict, rather than improving your services and programs.

After the review, return to the planning process, to evaluate whether your changes have improved your program.

Key points

  • Do not avoid evaluations because they take up too much time. The benefit you will gain from completing an evaluation far outweighs the effort required
  • Start collecting information now. Do not underestimate how important information collected over the life of a program is to the evaluation process
  • Try to include qualitative data in your evaluation methodology. Quotes and personal stories are a powerful way to support findings from questionnaires
  • Avoid reporting just the successful elements of the program. Be objective and report about findings that can help you improve the program
  • Once your evaluation is complete, disseminate it widely. Be generous with your communication and help your funders and community partners learn from the process

Performance indicators

Performance indicators are measures of achievement. You need to be careful when writing performance indicators, because if they are not measurable, or do not determine what you want to know, your evaluation could be a waste of time. Good performance indicators are clearly and consistently defined and are relevant to the organisation collecting the data and the funding body.

If you were undertaking a literacy program for unemployed people, examples of performance indicators might be:

  • Ten unemployed people attend a six-week literacy course
  • Participants demonstrate increased literacy skills in tests taken before and after the course
  • During the six-week course, participants read three books to increase their reading skills
  • After the course, 85 per cent of participants report they feel more confident when using a computer

Collecting data

Valid and useful information can be qualitative or quantitative, but it needs to be relevant and reliable. Qualitative data is information in the form of feedback, memos, reports and observations. Quantitative data includes statistics, figures and financial records. In the planning stage of the evaluation process, you will decide what information is most relevant to your evaluation.

These are examples of methods to obtain data:

  • Community profiles which describe your local target group and their needs
  • Questionnaires, surveys or discussions/focus groups which collect client and other stakeholder feedback and observations
  • Direct observation
  • Exit interviews with clients
  • Written feedback from other organisations
  • Information and statistics about who accesses your service
  • Photographs and video recordings
  • Cost analysis and budgets
  • Analysis of the time spent by staff and/or volunteers on each part of their job
  • Records and documentation, such as files, case notes, logs, diaries and correspondence
  • Case studies

Service agreements

If you receive funding from a funding body or government department, it is likely that you will have either a Memorandum of Understanding or a Service Agreement. Service Agreements outline the responsibilities of both parties and often detail what reporting and evaluation requirements the organisation must adhere to. The agreement creates a legal relationship between the funding body and the funded organisation. It outlines the approved services, including outputs, outcomes, and performance measures.

Most government departments require funded organisations to undertake research and data collection, policy and practice development, service planning, monitoring and evaluation. Before you sign your service agreement, read the document carefully to ensure that you can meet all of the practice and reporting requirements.

It is wise to negotiate specific funding which covers the resources and expenses related to carrying out monitoring and evaluation, especially if you want to engage a consultant to train staff or assist the process. Remember, evaluations help you to be transparent and accountable, so having the funding available to conduct an evaluation process should be a prerequisite.

Organisational policy

Policies are written statements, developed in light of the organisation’s mission and values, which communicate and document your organisation’s plans, instructions, intents, and processes. Policies should guide management, staff and volunteers, clarify your organisation’s values and influence your organisation’s culture.

For more information on the Human Services Quality Framework, and how to develop policies and procedures for audit, see the HSQF section of Community Door.

For samples and templates of standard policies and procedures, see the administration section. Ideally, policies should be expressed as formal written documents, so that everyone in the organisation is clear about the organisation’s expectations and limitations.

Good governance relies on clear policies which are related to the goals of the organisation, and which are flexible and responsive to external factors and changes. Clearly written policies give the workforce guidelines and a framework for action that help them do their job, however new they are to the organisation.

It is important that you have a way of determining the appropriateness and success of your policies. Therefore, policies should be closely linked to planning, evaluation and review processes. Your organisation will then be managed through a continuous cycle of setting goals and policies; planning and implementing activities; evaluating the success of those activities; developing modifications or completely new activities; implementing and evaluating changes.

Even if you do not have to comply with a government department’s requirements, it is still best practice to have the right policies to help you manage your organisation. Policies can protect the organisation from legal problems, ensure fair treatment for employees, and establish consistent work standards, rules, and regulations.

Developing policies

Writing policies that are meaningful, effective, and enforceable by management and staff is difficult and takes time. It is wise to follow an agreed process that ensures you are spending your valuable time and resources on good policy, which the workforce understands and is committed to.

Steps to formulating policy:

  • Appoint a special committee or policy development working group.
  • Identify the broad policy areas and then prioritise when each policy needs to be written.
  • Agree which policy you are writing and brainstorm the issues involved.
  • Conduct research about the issue, e.g. legal issues, practice issues, resource issues.
  • Prepare a draft policy.
  • Circulate the draft to key staff and stakeholders for comment.
  • Amend or revise the draft.
  • Recirculate a final draft.
  • Present a report to the board about the policy, implications for the organisation and what feedback has been received in the consultation phase.
  • Ask the board to ratify the policy when they are satisfied with the final draft.
  • Insert the policy in the Policy and Procedures Manual.
  • Implement a training and communication strategy to ensure that staff and volunteers have the knowledge and skills to implement the policy.
  • Write any related procedures, forms or checklists that describe exactly how to carry out the policy.

Good policy

Good policies:

  • are written in clear, concise, simple language
  • represent a consistent practice and decision making framework
  • are compatible with the organisation’s values
  • are easily accessible and understood
  • address what the rule is, rather than how to implement the rule
  • are regularly reviewed and updated with changes communicated to those who are affected by them.

When writing a policy, always use simple words and concepts. Speak directly to the people who will be reading, enforcing, and living by the policy. Keep the policy short – use bullet points and subheadings to make it easy to read.

Policies often begin with a policy statement, which should explain why the policy has been developed.

After writing a policy, decide who will be affected by it and select the best way to write down or communicate the policy to them. The layout, style, design, and presentation of the policy are just as important as the text. You must make your policy easy to read. Long paragraphs, crowded pages, poor use of white space, and poor print quality all discourage or confuse the reader.


Policies should use language that is:

  • succinct
  • unambiguous
  • simple – plain English, active voice, avoid acronyms
  • free from jargon, clichés and unfamiliar words and phrases
  • free from unnecessary technical expressions, with any technical terms explained where they must be used
  • set out using short sentences
  • factual
  • future-proof – avoid information that may become outdated quickly.

During the policy writing process, remember to consider the constraints which impact on the organisation. There may be constraints in the organisation’s constitution, in funding guidelines, or with the availability of training or resources, that affect what the policy contains. Sound policy making recognises these constraints and uses them to produce policy that is reflective of the real world. Nobody respects a policy that is idealistic or impossible to adhere to.

Five questions to ask when forming a policy:
  • Does this policy reflect our values, ethics and priorities?
  • Have we considered the legal requirements?
  • Does the policy reflect reality?
  • What does the policy promise, and can we deliver it?
  • Will everyone understand this policy?

The board may decide to pass the final draft on to a legal expert before ratifying the policy and disseminating it. They may also decide that staff need some training in understanding and implementing the policy. For instance, if you write a policy that requires all staff to have a first aid certificate, you will need to make sure that staff know what this is, where and when they should obtain it, and who is expected to pay for it.

The board should approve written policies and procedures governing the work and actions of the organisation’s workforce.

The environment is always changing, so it is important to ensure that policies are reviewed regularly so they continue to meet best practice. Best practice is the current recommendation about the best way to manage and deliver services.

Most organisations put all their policies and procedures together and call it a policy and procedure manual. This manual is the result of many hours of thinking, analysing, researching, writing and re-writing, so you may find that your policy and procedure manual is a work in progress and is continually being updated or revised. For this reason, manuals need to have a quality control system to inform staff which version is the most current and when a specific policy within the manual has been changed or made obsolete.
