Assuming you have tailored your evidence collection, you should now have excellent information ready to analyse. Our focus here is on providing you with a choice of options for your reports. In the following table, we have expanded the example table built to date, now also focussing on analysis methods and options. The table highlights how adaptable our evidence can be and its ability to serve multiple purposes. As in the previous module, the new content for the current section is shown as a green shaded column.

FUNDERS: TO ACHIEVE OUR VISION, HOW SHOULD WE APPEAR TO OUR FUNDERS?

Objectives

Measures and Tools Used

Analysis Methods/Options (NEW)

OBJECTIVE 1:
We are a highly efficient charity.

New survey developed and sent to donors to gather their views of our efficiency via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded.
Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents. Feedback collated into an annual report to the CEO and internal team.
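
The coding and collation step just described can be sketched in a few lines of Python. The responses and the 'efficient' field below are invented for illustration; a spreadsheet formula would serve equally well.

```python
# Sketch: code Y/N answers and collate basic statistics on survey responses.
# The responses below are hypothetical examples, not real donor data.
responses = [
    {"efficient": "Y", "rating": 4},
    {"efficient": "Y", "rating": 5},
    {"efficient": "N", "rating": 2},
    {"efficient": "Y", "rating": 4},
]

# Code Y/N as 1/0 so answers can be summed and reported as a percentage.
coded = [1 if r["efficient"] == "Y" else 0 for r in responses]

pct_yes = 100 * sum(coded) / len(coded)
avg_rating = sum(r["rating"] for r in responses) / len(responses)

print(f"{pct_yes:.0f}% answered Yes; average rating {avg_rating:.2f}")
```

The same coding convention (Y=1, N=0; ratings kept numeric) makes every later percentage and average calculation a one-line operation.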

OBJECTIVE 2:
We have multiple revenue sources including investment returns.

Evidence team gains access to the CEO's report to the Board, including variance analysis. Significant variances (positive or negative) are recorded and analysed.

Variances from the CEO's report will be analysed, likely via entry of the variances into an ‘evaluation evidence’ database. The main focus in this objective is the accuracy of the budgetary planning process and investment returns. Once the data is analysed, accuracy will be clear. If accuracy is lacking, the budget may need to be revisited and adjusted or improved.
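
The significant-variance check could be sketched as below. All figures, line-item names and the 10% threshold are assumptions for illustration only.

```python
# Sketch: flag budget variances large enough to record and analyse.
# Budget/actual figures and the 10% cutoff are hypothetical.
budget = {"donations": 50000, "grants": 30000, "investment_returns": 8000}
actual = {"donations": 46000, "grants": 31000, "investment_returns": 10500}

THRESHOLD = 0.10  # flag variances larger than 10% of the budgeted amount

flagged = []
for item, planned in budget.items():
    variance = actual[item] - planned
    pct = variance / planned
    if abs(pct) > THRESHOLD:
        flagged.append(item)
        print(f"{item}: {variance:+} ({pct:+.0%}) - record and analyse")
```

Note that both positive and negative variances are flagged, matching the objective's interest in planning accuracy rather than just shortfalls.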

OBJECTIVE 3:
We offer a welcoming, safe and supportive environment to our peer members.

Survey developed and sent to existing members and those no longer attending, including questions on how welcomed, safe and supported they feel in the group via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded.
Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, compared across the two groups – current attendees and former attendees. Feedback collated into a report to the CEO and internal team (including relevant facilitators) to enable adjustments and improvements, and ILC reporting/submissions to illustrate ILC Outcome evidence.
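
The comparison of current versus former attendees might be sketched as follows; the ratings and group labels are invented examples of coded 1–5 responses.

```python
# Sketch: compare average 'feel welcomed' ratings between current and
# former attendees (hypothetical 1-5 ratings).
ratings = {
    "current": [5, 4, 5, 4, 3],
    "former":  [3, 2, 4, 2],
}

averages = {}
for group, vals in ratings.items():
    averages[group] = sum(vals) / len(vals)
    print(f"{group}: average rating {averages[group]:.2f} (n={len(vals)})")
```

A gap between the two averages is exactly the kind of evidence that points facilitators toward why members stop attending.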

OBJECTIVE 4:
We focus on building Individual Capacity by providing high quality, relevant information at peer sessions.

Survey developed and sent to existing members and those no longer attending, including various questions on the information provided via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, with information quality and relevance measured using rating-scale question(s). Target = 85% of members agree they receive high-quality, relevant information. Feedback collated into a report to the CEO and internal team (including relevant facilitators) to enable adjustments and improvements, and ILC reporting/submissions to illustrate ILC Outcome evidence.
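
Checking results against the 85% target could be sketched as below. The ratings are invented, and treating a 4 or 5 on a 5-point scale as "agree" is an assumption you would fix in your own coding rules.

```python
# Sketch: check survey results against the 85% agreement target.
# Ratings are hypothetical; 'agree' is assumed to mean 4 or 5 on a
# 5-point scale.
ratings = [5, 4, 4, 5, 3, 4, 5, 2, 4, 5]
TARGET = 85

agree = sum(1 for r in ratings if r >= 4)
pct_agree = 100 * agree / len(ratings)

status = "met" if pct_agree >= TARGET else "not met"
print(f"{pct_agree:.0f}% agree (target {TARGET}%): {status}")
```

Stating the agreement rule up front (which answers count as "agree") keeps the target comparison consistent across reporting periods.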

OBJECTIVE 5:
We regularly invest in peer program development and group leader training.

Survey developed and sent to peer group facilitators, including questions on training received and training requested/needed via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, covering training received and unmet needs. Feedback collated into a report to the CEO and internal team to enable training planning during the budgeting process, improvements if needed, and ILC reporting/submissions to illustrate ILC Outcome evidence.

MEMBERS: TO ACHIEVE OUR VISION, HOW SHOULD WE APPEAR TO MEMBERS?

Objectives

Measures and Tools Used

Analysis Methods/Options (NEW)

OBJECTIVE 1:
We focus on building Individual Capacity by providing high quality, relevant information at peer sessions.

Survey developed and sent to existing members and those no longer attending, including various questions on the information provided via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, with information quality and relevance measured using rating-scale question(s). Target = 85% of members agree they receive high-quality, relevant information. Feedback collated into a report to the CEO and internal team (including relevant facilitators) to enable adjustments and improvements, and ILC reporting/submissions to illustrate ILC Outcome evidence.

OBJECTIVE 2:
We provide high quality, relevant programs that are easily accessible.

Attendance sheets developed for use in each session. A system is in place to record the data centrally into a spreadsheet (a centrally located, protected file is needed).

Data to be entered into an Attendance database noting NEW member numbers and total attendance per event, along with event details (group, location, time etc.). Analyse across groups and topics to ensure each group regularly brings in new members (and that they continue to attend). Use internally/externally.
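
Aggregating the attendance records per group might look like the sketch below; the rows, group names and column names are invented stand-ins for your spreadsheet columns.

```python
# Sketch: aggregate attendance records per group, tracking new members.
# The records are hypothetical rows from an attendance spreadsheet.
from collections import defaultdict

records = [
    {"group": "North", "attendees": 12, "new_members": 2},
    {"group": "North", "attendees": 14, "new_members": 1},
    {"group": "South", "attendees": 9,  "new_members": 0},
    {"group": "South", "attendees": 11, "new_members": 3},
]

totals = defaultdict(lambda: {"attendees": 0, "new_members": 0, "events": 0})
for rec in records:
    t = totals[rec["group"]]
    t["attendees"] += rec["attendees"]
    t["new_members"] += rec["new_members"]
    t["events"] += 1

for group, t in totals.items():
    print(f"{group}: {t['events']} events, {t['attendees']} attendances, "
          f"{t['new_members']} new members")
```

Summing per group (rather than overall) is what reveals whether each individual group is regularly attracting new members.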

OBJECTIVE 3:
We educate, inform and upskill via: peer group sessions, special events, website and newsletters.

Survey developed and sent to existing members and those no longer attending, including questions on the value of peer program components using multi-choice, Y/N and rating questions.

Data to be entered into a database with component rankings and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, identifying the program components most valued. Feedback collated into a report to the CEO and internal team to enable adjustments, budgetary decisions and ILC reporting/submissions illustrating ILC Outcome evidence.

OBJECTIVE 4:
We offer informal advocacy and advice resulting in referrals that are accurate and timely.

Survey developed and sent to all members, including questions on whether they have received informal advocacy/referrals and their opinions of these, using multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and opinion ratings coded, including whether members feel they received what they needed and whether there were outcomes from the advocacy. Statistics on coded responses are collated, including the percentage of members receiving and/or benefiting from this support. Feedback collated into a report to the CEO and internal team to enable adjustments, budgetary decisions and ILC reporting/submissions illustrating ILC Outcome evidence.

OBJECTIVE 5:
We offer members a welcoming, safe and supportive environment.

Survey developed and sent to existing members and those no longer attending, including questions on how welcomed, safe and supported they feel in the group via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded.
Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, compared across the two groups – current attendees and former attendees. Feedback collated into a report to the CEO and internal team (including relevant facilitators) to enable adjustments and improvements, and ILC reporting/submissions to illustrate ILC Outcome evidence.

OBJECTIVE 6:
New members join our groups, and those that depart provide positive feedback on their peer experience.

Survey developed and sent to group facilitators, including questions on group membership and changes in membership via multi-choice, Y/N and rating questions.

Data from surveys of peer group facilitators to be entered into a database with Y/N and ratings coded. Statistics on coded responses are collated. Feedback collated into a report to the CEO and internal team to enable adjustments and improvements, and ILC reporting/submissions to illustrate ILC Outcome evidence.

BUILD: TO ACHIEVE OUR VISION, WHAT MUST WE BUILD INTERNALLY?

Objectives

Measures and Tools Used

Analysis Methods/Options (NEW)

OBJECTIVE 1:
IT infrastructure meets our needs as an innovative, growing charity.

Survey developed and sent to team members, including questions on the IT system available and their use of it via multi-choice, Y/N and rating questions.

Surveys of team members (including peer facilitators) include questions on the IT system and the IT resources they are using, and whether these assist them in their role. If not, what do they need? Do they need training? Data to be entered into a database with Y/N and ratings coded. Feedback collated into a report to the CEO and internal team to enable IT improvements.

OBJECTIVE 2:
We manage new members professionally and consistently.

Survey developed and sent to new members, including various questions on the joining process, receipt of the new member package and whether their needs are being met by the peer group, via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings. Feedback collated into a report to the CEO and internal team to enable adjustments and improvements, and ILC reporting/submissions to illustrate ILC Outcome evidence.

OBJECTIVE 3:
Our office and session facilities are secure, safe and clean spaces.

Survey developed and sent to team members, including facilitators, with questions on the office and other peer program spaces via multi-choice, Y/N and rating questions.

Surveys of team members (including peer facilitators) collect data on opinions about program facilities. Data to be entered into a database with Y/N and ratings coded. Feedback collated into a report to the CEO and internal team to enable facility improvements.

OBJECTIVE 4:
Our peer program has clear policies and procedures that support, and protect, both our members and our team.

Survey developed and sent to members, including questions on one or two policies that should affect them and, where these are not working, questions about what may be missing or not being followed. Use multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings, to gain evidence on the consistent application of policies and procedures. Feedback collated into a report to the Board and internal team to enable improvements, and possibly ILC reporting to illustrate adherence to policies such as accessibility, equity and access to complaints and feedback mechanisms.

OBJECTIVE 5:
We continually improve and develop our programs, expertise and evidence, training resources and other materials or program resources.

A focus group is held, attended by a range of peer group members who have attended various groups for some time. The discussion is recorded and transcribed. A trained focus group facilitator directs the discussion around the evolving program, changes members have experienced, whether these are positive, and ideas for beneficial change.

Data is entered into a program (QDA Miner Lite) enabling all the transcribed discussions to be put through a process of thematic analysis. The program's output identifies the key themes in the content, enabling conclusions about improvements and positive change to be confirmed or not. The feedback and key themes, along with key quotes and comments, are then collated into a report to the Board and internal team, and possibly ILC reporting, to illustrate a commitment to continual improvement and evidence of including peer members in feedback and peer program development over time.
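
Thematic analysis proper involves human coding (in QDA Miner Lite or similar); as a very rough first pass, keyword frequencies in the transcript can hint at recurring themes. The transcript and stopword list below are invented for illustration.

```python
# Sketch: a crude first pass at theme spotting - counting keyword
# frequencies in a transcribed discussion. This does NOT replace
# thematic analysis; the transcript is a hypothetical example.
from collections import Counter
import re

transcript = """The new session format is a real improvement.
I felt the changes to the program were positive.
More training resources would be a welcome improvement."""

# Common words to ignore (an assumed, abbreviated stopword list).
STOPWORDS = {"the", "a", "to", "is", "i", "were", "would", "be", "more"}

words = re.findall(r"[a-z]+", transcript.lower())
counts = Counter(w for w in words if w not in STOPWORDS)

for word, n in counts.most_common(3):
    print(word, n)
```

Frequent words only suggest where to look; the analyst still reads the surrounding passages to confirm whether a genuine theme (such as "improvement") is present.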

LEARN: TO ACHIEVE OUR VISION, NOW AND IN THE FUTURE, WHAT MUST WE LEARN?

Objectives

Measures and Tools Used

Analysis Methods/Options (NEW)

OBJECTIVE 1:
National/overseas conference attendances and presentations are sought, secured and funded.

Survey developed and sent to team members with questions on submissions, attendances and presentations via multi-choice, Y/N and open-ended questions.

Surveys of team members collect data on conference and other attendances, submissions and presentations on the peer program and related content. Data to be entered into a database with Y/N coded and open-ended comments included. Feedback collated into a report to the CEO and into ILC reporting, as this illustrates ongoing program development and a research/evaluation focus.

OBJECTIVE 2:
We have a trained, motivated and empowered team that are flexible across multiple roles.

Survey developed and sent to team members, volunteers and facilitators with questions on their expertise, satisfaction, flexibility and motivation via multi-choice, Y/N, rating and open-ended questions.

Surveys of team members (all, including volunteers and group leaders) collect data on their expertise, satisfaction in their role(s), flexibility (ability to operate in other roles) and motivation. Data to be entered into a database with Y/N, multi-choice and rating answers all coded and open-ended comments included. Feedback collated into a report to the CEO and into ILC reporting, as this illustrates team member attributes essential for ongoing program success. Could also be used for performance review purposes.
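
Keeping verbatim open-ended comments alongside the coded answers means the same dataset yields both statistics and illustrative quotes for the report. The rows and field names below are hypothetical.

```python
# Sketch: store coded answers and verbatim comments together, so reports
# can combine statistics with quotes (hypothetical survey rows).
responses = [
    {"trained": "Y", "motivation": 5, "comment": "I love facilitating."},
    {"trained": "N", "motivation": 3, "comment": "I would like more training."},
    {"trained": "Y", "motivation": 4, "comment": ""},
]

pct_trained = 100 * sum(r["trained"] == "Y" for r in responses) / len(responses)
avg_motivation = sum(r["motivation"] for r in responses) / len(responses)
quotes = [r["comment"] for r in responses if r["comment"]]

print(f"{pct_trained:.0f}% trained; average motivation {avg_motivation:.1f}")
for q in quotes:
    print("-", q)
```

A percentage tells the CEO how widespread an issue is; a quote such as a request for more training shows what the issue actually looks like on the ground.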

OBJECTIVE 3:
Our organisation develops leading-edge information topics.

Survey developed and sent to existing members and those no longer attending, including various questions on the information provided via multi-choice, Y/N and rating questions.

Data to be entered into a database with Y/N and ratings coded. Basic statistics on coded responses are collated, including the percentage of respondents selecting each answer and average ratings across respondents, with information quality and relevance measured using rating-scale question(s). Feedback collated into a report to the CEO and internal team (including relevant facilitators) to enable input into topic selection, and ILC reporting/submissions to illustrate ILC Outcome evidence.

OBJECTIVE 4:
We regularly explore organisational collaborations and grow links over time.

Survey developed and sent to team members, volunteers and facilitators with questions on collaborations or other links they develop via multi-choice, Y/N, rating and open-ended questions.

Surveys of team members (all, including volunteers and group leaders) collect data on the ways in which they link in with, or collaborate with, other organisations. Data to be entered into a database with Y/N, multi-choice and rating answers all coded and open-ended comments included. Feedback collated into a report to the CEO and into ILC reporting to illustrate collaboration and evidence of this approach being used.

For each of the four BSC perspectives, we had developed tables listing measures for each objective. Some of these indicators came from secondary sources and others from primary sources. Some were opinions collected from key people via surveys or interviews; others were figures, such as group attendance or the number of new members. For at least some of your objectives, you will be asking for feedback from a stakeholder such as a peer group member, a staff member or, potentially, a donor. In these cases, we need to develop a tool, such as a survey, or utilise a pre-existing one, to collect this tailored evidence.

However, collecting the evidence is by no means the end of the process. We then need to follow our data analysis basics to present the evidence collected in the most suitable and powerful way possible. This module has focussed on the analysis of data and the various ways we can best manage the different types of data gathered within our peer organisation. The table of examples illustrates the kind of brief notes and planning required for our data analysis during our data collection planning. Consideration of how we will undertake this process should commence very early in evaluation planning, rather than once data collection is already complete.

Capsule: Data can be analysed in various ways depending upon its purpose, its audience and its data type. Your own resources will also determine the available options. Utilising your data for various purposes ensures it brings your peer program maximum possible benefits.

SELF STUDY:

You previously responded to questions 5.8, 5.9, 5.10 and 5.11, providing information on each objective and measure (identified in questions 4.6 & 4.7 (Funder), 4.9, 4.10 & 4.11 (Member), 4.12 & 4.14 (Build) and 4.15 & 4.16 (Learning)), and used a formatted table to list Indicators, Evidence Collection Strategy and Tools & Frequency. Now, please complete the following table with your planned data analysis details based on the content covered in this section of the Training Package.

SELF STUDY Q6.8

Complete the Funder objectives, measures and tools used (as per 5.8) and then add in the data analysis methods and options being considered.

SELF STUDY Q6.9

Complete the Member objectives, measures and tools used (as per 5.9) and then add in the data analysis methods and options being considered.

SELF STUDY Q6.10

Complete the Build objectives, measures and tools used (as per 5.10) and then add in the data analysis methods and options being considered.

SELF STUDY Q6.11

Complete the Learning objectives, measures and tools used (as per 5.11) and then add in the data analysis methods and options being considered.