Inventivo Design advocates evaluation capacity building (ECB) with all clients. For evaluation to be successful and sustainable, organizations need to learn about and be involved in all aspects of evaluation, including strategy, planning, instrument design and delivery, data analysis, and reporting.

Inventivo Design will guide you through the process and develop your capacity to take on increasing amounts of evaluation work within your organization (to the extent you're comfortable), so that evaluation can become an embedded practice that is trustworthy and reliable.

Below are examples of evaluation capacity building topics we can provide to your organization, each of which can be customized to meet your specific needs. Clients of full program evaluations receive many of these topics as part of their service agreement.

What Managers Should Know About Evaluation

 • Identifying the scope of work and potential process flexibility

 • Contracting options

 • What it actually takes to execute evaluation activities - your involvement, the involvement of others, and the organizational implications

 • Expectations - evaluation doesn't necessarily 'fix' anything unless you engage in participatory, empowerment, or organizational learning-based evaluation (and even then it still requires action)

 • Cost areas - what you can and can't cut and still get a decent outcome

 • Process logistics and choice points

 • Measurement and data issues (existing data, new data, storage, and maintenance)

 • The importance of focusing the evaluation before any data collection is considered

How much data do I need?

 • Sampling frames, populations, and response rates

 • Statistical power

 • Validity and reliability

 • Differences between qualitative and quantitative methods that influence sample size, generalizability, and representativeness

Survey Basics (minimum 2 parts, 90 min. each)

 • Assumptions going into a survey-based evaluation project (an evaluation plan is in place, a survey was your only or best option, etc.)

 • Human Subjects Protections

 • Validity and Reliability

 • Writing survey questions (focus, clarity, simplicity, rating vs. text)

 • Ordering effects

 • Sampling issues

 • Why you need to think carefully about demographic data

 • Scale anchors and what they mean for reporting data

 • The importance of proofreading and of pre-release testing and feedback

 • Managing the process

 • Reporting the results - what you can and can't do and say

 • Communication about the survey and its results

Logic Models

 • What they are

 • What they can look like

 • How to create a logic model

 • How to use one

 • Why logic models should be revised regularly, and their long-term implications

Choosing Metrics

 • Types of metrics

 • New data versus existing data, and associated implications

 • Data collection, maintenance, and sustainability

 • Accurate reporting and use of metrics

 • Limitations and cautions

Evaluation Guidance

 • On-demand consultation about your evaluation activities from pre-collection
    through reporting

P.O. Box 281068    Lakewood, CO 80228-1068    720 413 0600