While NC State’s Industry Expansion Solutions (IES) Evaluation Group provides comprehensive evaluation services across various industries, we also seek to train our clients to build evaluation capacity within their own programs. Here are a few ‘quick tips’ we’ve learned and would like to share with you.

Tip #1: Learn about logic models

A logic model is a visual depiction of how a program is intended to operate: it focuses on the elements that go into the program to support it (inputs) and looks ahead to the outcomes and impacts you want the program to produce. By building a logic model for your program, you will develop a clearer picture of how your resources align with the appropriate activities to accomplish your deliverables and long-term goals. Here are a few basic prompts to ask yourself for each component of the logic model:

  • Inputs: What resources will be used to support the project/program?
  • Activities: What are the main things the program is aiming to do?
  • Outputs: What products will be created by the program that are directly observable?
  • Short-term Outcomes: What will occur as a direct result of the activities and outputs?
  • Mid-term Outcomes: What results (behavioral and process-focused) should follow the short-term outcomes?
  • Long-term Outcomes: What are the broader results to follow the short and mid-term outcomes?
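As an illustration, the components above can be sketched as a simple data structure that you fill in for your own program. The program details below are hypothetical, invented purely for this example:

```python
# A hypothetical logic model for an imaginary job-training program.
# Every entry below is illustrative, not drawn from a real program.
logic_model = {
    "inputs": ["grant funding", "two instructors", "training facility"],
    "activities": ["deliver 12-week skills course", "recruit employer partners"],
    "outputs": ["40 participants trained", "3 employer partnerships formed"],
    "short_term_outcomes": ["participants gain certifiable skills"],
    "mid_term_outcomes": ["participants apply new skills on the job"],
    "long_term_outcomes": ["higher regional employment in targeted trades"],
}

# Reading the model left to right shows how resources align to goals.
for component, items in logic_model.items():
    print(f"{component}: {'; '.join(items)}")
```

Even in this rough form, listing the components side by side makes gaps obvious, such as an activity with no output, or an outcome no activity supports.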

Stay tuned for Blog #2 in this series which will provide a more in-depth look at logic models.

Tip #2 and Tip #3: Plan to collect data, and collect what data you planned

Data collection can be a tedious and sometimes difficult task, especially when different partners, information systems and policies are involved. A huge component of evaluation is quantitative (numerical) data that helps communicate your program outputs and progression toward goals. It’s important to take the time early in your program implementation to meticulously plan what data you will collect, how it will be used and how you will regularly review your performance metrics. A clear data collection plan will also ensure that all stakeholders identify and record data in the same way. As part of your data collection and reporting guidelines, particularly if multiple partners report data to you, make certain that the person in the data collection role has a backup. This ensures data is continuously collected and maintained despite transitions within or outside the organization.
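One way to make such a plan concrete is to record, for each metric, its source, collection frequency, the person responsible and a named backup. The metrics, sources and roles below are hypothetical, invented for illustration:

```python
# A hypothetical data collection plan; all metrics, sources and roles
# are invented for this example.
data_plan = [
    {"metric": "participants enrolled", "source": "registration system",
     "frequency": "monthly", "owner": "program coordinator",
     "backup": "office manager"},
    {"metric": "course completion rate", "source": "learning management system",
     "frequency": "quarterly", "owner": "instructor",
     "backup": "program coordinator"},
]

# Verify every metric has a designated backup collector, so reporting
# survives staff transitions.
missing_backup = [row["metric"] for row in data_plan if not row.get("backup")]
assert not missing_backup, f"No backup assigned for: {missing_backup}"
```

Whether you keep this in a spreadsheet or a shared document matters less than agreeing on it early and reviewing it on a regular schedule.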

Tip #4: Communicate effectively and frequently with stakeholders

While this step may seem simple, it is often the one that causes the most angst for a project. Consider creating a project charter that lists each person’s role in the project and contact information. Also consider the small things, such as the best method for contacting each person: email, phone call or face-to-face communication. Which method will provide the fastest response given the project’s urgency? Is there an alternate contact who can provide assistance? For external communications to partners or advisory members, consider an executive briefing to keep them informed before, during and after your regular meetings with them.

Tip #5: Develop a project plan and stick to it (or at least document why you could not)

A common best practice in project management is establishing a clear project timeline that enables everyone to understand their contributions throughout the project cycle. Consider creating a shared calendar of events that each partner, participant and stakeholder can access to see what activities are planned throughout the year. That said, we realize that many factors can affect how quickly project activities are completed. That is ok, and funders should be aware of this possibility. If you are running behind on your activities, make notes and allot time to reflect on the reasons why; for example, provide clear explanations for delays, such as a team member getting the flu or severe weather. Some things are simply out of your control, so be sure to document them and modify plans accordingly.

If you are interested in the evaluation services that our team can provide, drop me a line and we can set up a free consultation. We would love to learn more about your organization and how we can help you.

Dominick Stephenson

Dominick Stephenson is the Assistant Director, Research Development and Evaluation for North Carolina State University Industry Expansion Solutions. Dominick is responsible for managing evaluation functions throughout a project, including client engagement, evaluation design, data collection, instrument design and report writing. His current work focuses on workforce development, community college and STEM programs and spans local, state and federal agencies. Dominick is a graduate of East Carolina University with an M.A.Ed. in Adult Education and a B.S.B.A. in Management Information Systems.