Blog

  • 06/08/2020 10:46 AM | Greater Boston Evaluation Network (Administrator)

    The Greater Boston Evaluation Network (GBEN) unequivocally condemns police brutality and white supremacy, which have taken the lives of Black Americans and continue to subject People of Color to police violence at this moment. We encourage jurisdictions to adopt evidence-based solutions to reduce police violence against Black Americans as quickly as possible.

    In addition, we are appalled at the health inequities laid bare by COVID-19 in Massachusetts and beyond, which have similarly stolen the breath from communities of color and immigrant communities. Black Americans have been dying from COVID-19 at 2.4 times the rate of White Americans. Of the ten communities in Massachusetts with the highest COVID-19 infection rates, nine have proportions of people of color that far exceed the statewide average.

    The police violence and these health inequities are a direct manifestation of a legacy of racism that Black Americans have experienced since slavery began in the United States in 1619. Though much has been accomplished in dismantling discrimination in our country, much more needs to be done. We commit to the fight for equity, and in particular to leveraging the truth-speaking power of evaluation.


    What we are doing and how you can help

    While evaluation would not have saved George Floyd’s life or that of countless others, we are committed to looking within ourselves and our organization to do better and do what we can to dismantle discrimination, racism, and white supremacy. Here is where we are starting: 

    • Earlier this year, GBEN formed a standing Diversity, Equity, and Inclusion (DEI) Committee. This committee is working towards a plan of action to improve the state of DEI within GBEN and among our members. Please consider volunteering with us (email info@greaterbostoneval.org if interested).

    • More than ever, as part of our planning, the DEI Committee needs to hear from GBEN members about your thoughts, knowledge, and experience around equitable evaluation. We have launched a short survey and invite all GBEN members to participate. Please fill out the survey, so that we can design the best plan for our organization and members.

    • In partnership with and through the generous support of the Barr Foundation, GBEN will shortly be issuing a Request for Proposals for a consultant to help us with our DEI planning. Please look out for this RFP and help circulate it to your networks.

    If there are other ways for us to help, please do not hesitate to be in touch.


    Resources:

    List of anti-racism resources

    Equitable evaluation resources: 

    Can Social Justice Live in a House of Structural Racism? A Question for the Field of Evaluation

    Being responsive: The first assessment of Culturally Responsive Evaluation in Wisconsin: Findings from the 2017 survey

    More on the GBEN site (requires member login)


    Sincerely,

    The GBEN Executive Committee

    Danelle Marable, President

    Matan BenYishay, Vice President

    Eileen Marra, Treasurer

    Elizabeth Brown, Clerk

    Kelly Washburn, Programming Co-chair

    Min Ma, Programming Co-chair and DEI Co-chair

    Calpurnyia Roberts, DEI Co-chair

    Bryan Hall, Communications Chair

    Annie Summers, Membership Chair



  • 02/27/2020 3:29 PM | Greater Boston Evaluation Network (Administrator)

    On February 12, 2020, Chuck Carter, Senior Evidence Director at Project Evident, hosted a roundtable workshop and discussion on building evidence with 22 GBEN members.  The workshop was titled “The New Normal: Practitioners at the Center of Evidence Building.” 

    Mr. Carter started the discussion by stating that Project Evident firmly believes that practitioners should be at the center of building evidence.  When led by practitioners, evidence building will result in better outcomes for communities.  The next generation of evidence building must:

    • Be Practitioner-Led: Practitioners must be empowered to move from the caboose to the engine to drive their own continuous evidence-building agendas.
    • Embrace Research and Development: R&D must become standard practice in the social sector to enable timely and relevant learning and innovation.
    • Elevate Community Voice: The needs, beliefs, values, and voices of the community must inform and drive the work of practitioners and the field.

    Mr. Carter also stated that it is critical to intentionally build evidence with a diversity, equity, and inclusion (DEI) lens to continually understand what works, for whom, and under what circumstances.  Project Evident has developed a DEI Evidence Matrix, a structured process designed to intentionally integrate a DEI framework into an organization's or program's evidence-building activities.   The DEI Evidence Matrix helps organizations evaluate their current evidence-building practices using a DEI lens, prioritize learning questions, and identify next steps that drive toward equity-focused outcomes.

    Mr. Carter then provided three examples of what it might look like for practitioners to be at the center of evidence building.

    1. A non-profit engaging youth and young adults, with lots of new data, a new program model, and a developing data and evidence infrastructure, but where staff are not focused on or prioritizing the collection or use of data.
    2. An urban housing authority with an established data team, clear evaluation questions, and the capacity for conducting experimental studies.
    3. A pediatric primary care program with a model that has shown positive impact, internal research and evaluation capacity, and has questions about effectiveness and scaling in new areas. 

    The group discussion centered on how one can create a learning culture in an organization and engage practitioners.

    The workshop concluded with some questions for practitioners to consider:

    • Are you using program data for decisions on how to improve the model delivery and participant outcomes?
    • Do you have a learning agenda that evidence can inform?
    • Do you have a theory of change? Is it reviewed on a regular basis?
    • Is senior leadership bought in?
    • Do you understand how data is collected, stored, shared, and reviewed?
    • Are underrepresented and vulnerable groups a part of the data collection practices?
    • Are you collecting data that will highlight underrepresented and vulnerable groups?
    • Are there regular team performance meetings, learning goals, coaching for staff, and ongoing staff recognition?

    Mr. Carter’s presentation slides can be found in the member roundtable resources section.


  • 11/25/2019 11:22 AM | Greater Boston Evaluation Network (Administrator)

    On October 29, 2019, Hila Bernstein, from the Cambridge Public Health Department, facilitated a conversation about Institutional Review Boards (IRBs), how to determine whether your work needs IRB review, and best practices for evaluators when working with IRB staff to navigate the process.  Nearly 20 GBEN members participated in the discussion.

    During the roundtable, Hila described IRBs, how an IRB reviews evaluation studies, considerations for evaluators working with IRBs, and recommendations, then facilitated a Q&A and discussion.   Below are a few key summary points from the presentation.


    What is an IRB? 

    An IRB is a committee or group of individuals who review certain risks associated with research studies and, if warranted, provide approval for the study to go forward.   IRBs review all components of a research study, including protocols, methods, materials, and tools used in the study.  A key IRB responsibility is to ensure that a study meets the following requirements:

    1. Risks to subjects are minimized.
    2. Risks to subjects are reasonable in relation to anticipated benefits.
    3. Selection of subjects is equitable.
    4. Informed consent will be sought from each prospective subject or legally authorized representative.
    5. Informed consent will be appropriately documented.
    6. When appropriate, the research plan makes adequate provision for monitoring the data collected to ensure safety of subjects.
    7. When appropriate, there are adequate provisions to protect the privacy of subjects and to maintain the confidentiality of data.


    Identifying an IRB to Review Your Work

    Depending on the funding for your study and the type of organization or company you work for, the requirements and options for identifying an IRB to review your work can differ.  Research that is federally funded must be reviewed by an IRB.  Internal evaluators may already have an established IRB in their organization; if not, that organization could use another institution's IRB by obtaining a Federalwide Assurance (FWA) number.   In certain situations, an evaluator may need IRB review at their own institution (for example, at large institutions and research centers associated with hospitals and universities).  Independent evaluators, or those without an institutional IRB, may need to contract with an external IRB entity.


    Is it Research?

    An important question to ask before even reaching out to an IRB is whether your work is considered “research” at all.  This may seem like a simple or unnecessary question, but many institutions have complex, specific definitions of research, and those definitions may differ across institutions and/or disciplines.  Many institutional definitions of research focus on the use or involvement of human subjects.  If a project is determined by an IRB not to be research, it may not require IRB review at all. 


    Oakes, J. M. (2002). Risks and wrongs in social science research: An evaluator's guide to the IRB. Evaluation Review, 26, 443–479. https://doi.org/10.1177/019384102236520


    Common Modes of IRB Review

    There are four common modes of IRB review:

    • Exempt:  Exempt from IRB review.  An experienced reviewer must make the determination that a project is exempt (this usually requires submitting materials for review).
    • Limited Review:  Limited IRB review is required for research that meets specific expedited or exempt categories.  This mode is new as of January 2019.
    • Expedited:  The protocol is reviewed by one IRB member on a rolling basis.  Studies must fit into one of the seven expedited categories. 
    • Full Board:  The study involves greater than minimal risk or doesn't fit into an expedited category.  These are the projects that go to a fully convened IRB.


    Considerations for Evaluators

    Hila offered a few helpful considerations for evaluators. 

    • Evaluation that’s part of a research study is reviewed together with that research study (you cannot separate out components of a research study), while an evaluation of practice would itself be reviewed even if the practice is not subject to review.
    • If a project is not research or not human subjects research, it is still good practice to carefully consider consent content and process for collecting consent. 
    • Evaluators should look through the required elements of consent and think about what information they would want to have available if they were a participant.  Recruitment processes, data protection, privacy, and confidentiality are also helpful to think about. 
    • Take advantage of all of the resources that your IRB offers (may include guidance documents, meetings, educational sessions) and don’t be afraid to ask questions or request examples.
    • Start early!  Obtaining a determination or going through the review process takes time; there may be questions (about the proposed activities or evaluation more generally), and the details of the project may evolve.


    Hila’s presentation slides can be found here (members only).


  • 09/26/2019 12:51 PM | Greater Boston Evaluation Network (Administrator)

    On September 10, 2019, Bryan Hall, Senior Director, Evaluation at BellXcel, discussed his experience hiring evaluation professionals at his organization, building a three-person evaluation department, and the process and challenges he encountered along the way.   Over 20 GBEN members participated in person or virtually. 

    Mr. Hall discussed the pre-hiring process and what hiring managers should consider before making a job posting public, the hiring process and what to consider when creating a job posting, the applicant and resume review process, and the key characteristics of strong evaluation professionals.    Below are a few key summary points from Bryan’s presentation.


    The Hiring Process Begins Long Before a Job Posting is Public

    The pre-hiring process is critical, and can make or break a hiring initiative.  Before developing and making public a job posting, it’s important to consider a host of factors. 

    • Hiring managers should identify the key processes and stakeholders that will be part of the entire hiring initiative.  For example, what role will your organization’s Human Resources department play in the hiring process?  Who will participate in the interview process and what scheduling accommodations will be needed?
    • It’s important to know your full budget for hiring, beyond (but including) the salary range for the position.  Do you have budget to fly a candidate in for an interview?  Will you have budget to train staff once hired?
    • Don’t underestimate the amount of time needed to complete the hiring process.  Some hiring processes can take upwards of a year to complete.  The interview process alone can sometimes take two to three months.  Bryan noted that a recent hiring process for an Evaluation Manager position took six to eight months before leading to a hire. 
    • Think seriously about the workload of the position you are hiring for.  What will their day-to-day, month-to-month work life look like?  It’s important to consider whether you even need a full-time employee at all or if part-time, seasonal, temporary, or consultant staff would be a better fit for your needs.


    The job posting is important for candidates and the hiring manager

    Once you are ready to formally start the hiring process, it’s important to develop a job posting that you will make public to interested candidates.   As with the pre-hiring process, it’s important that the job posting strongly reflect your organization’s needs.  A poorly designed job posting can delay your hiring process or attract candidates who may not be the best fit for your needs.  A few considerations for the job posting:

    • The job posting is not the same as the job description you hand a new employee once they start.  Avoid making the job posting an exhaustive list of responsibilities; instead, try to capture the high-level job responsibilities and requirements you are looking for.
    • A job posting should include key information that allows an interested candidate to decide if they are a good fit for the job.  Key items to include are: a brief position description, key responsibilities and expectations, key hiring attributes and requirements, a brief description of your organization and work, a brief summary of benefits, the process to apply, and the salary range (ideally, though not always possible).
    • Seriously consider which attributes a strong candidate must have in order to be considered for the position, and which are simply nice to have.   Most employees learn on the job and receive significant training once they start.  Consider which attributes are flexible and which are non-negotiable for a candidate.
    • Consider the power of transferable and general skill domains, versus hiring for a specific skill set.  For example, if your organization uses Salesforce as a data system, you don’t necessarily need to hire a candidate with Salesforce experience.  Instead, consider (and advertise for in the job posting) a candidate with strong “technology proficiency” in other systems who can be trained on how to use Salesforce.
    • Don’t let perfect be the enemy of great.  There is no such thing as a perfect candidate.  Of the “must have” and “nice to have” attributes in your job posting, consider which 3-5 are most important to you, and which others you can be flexible on. 


    There are many online resources for job postings

    Your job posting should ideally be hosted on your company’s website and/or LinkedIn account.  The evaluation world also has a few key websites for job postings, including the American Evaluation Association (AEA) website and evaluationjobs.org.  Also consider general job websites like LinkedIn, indeed.com, and idealist.org (for non-profits).  Job sites like Monster and CareerBuilder may not be that useful for evaluation positions.  Lastly, consider discipline-specific professional associations, as most offer the ability to post job openings.  For example, if you work in the field of public health, consider listing the job through the American Public Health Association (APHA) website. 


    Interview Questions Should Fill in the Gaps of the Resume

    A resume and cover letter (if provided) should tell you 80-90% of what you need to know about a candidate.  The purpose of the interview is to fill in the rest.  Therefore, focus your interview on attributes that may not come through in the resume, such as passion for the work, soft skills, and problem-solving skills.  Example questions that Bryan has used in past interviews include:

    • Why are you interested in this position?  Why did you apply?
    • Tell me about a recent job experience and its relevance to this job.
    • A key job responsibility is ____.  Tell me about a time you did ____.  Are you comfortable with/do you enjoy doing ______?
    • Tell me about your work personality.  How do you work with others and/or independently?  What are your needs as an employee?
    • Tell me about a time you faced a conflict/challenge/problem.  How did you approach and resolve it?


    Mr. Hall’s full presentation slides can be found here (members only). 


  • 09/26/2019 12:41 PM | Greater Boston Evaluation Network (Administrator)

    The next 2-year terms for GBEN Treasurer and Clerk start January 1, 2020. This is an opportunity to show your commitment to the value of GBEN and help to shape its future! You may nominate yourself or a committed GBEN colleague. The deadline for nominations is Monday, October 7, 2019.


    Position Descriptions

    GBEN is governed by an Executive Committee, which serves as the board of directors and consists of a President, a Vice-President, a Treasurer, a Clerk, and chairs from each of the committees. The President, Vice-President, Treasurer, and Clerk are elected by the membership. Any GBEN member may serve on the GBEN Board.

    The Executive Committee meets monthly to oversee all GBEN activities and operations, including all subcommittees. The Committee is also responsible for setting dues and approving a budget for each year. The Executive Committee oversees elections, fills vacancies, holds special elections, and removes Committee members as outlined in GBEN’s by-laws.


    Clerk Position Description:

    • Record the proceedings of GBEN;

    • Keep the records of Bylaws and subsequent amendments; 

    • Handle all the general correspondence of GBEN, as directed by the President and Vice-President;

    • Support creation of agendas for GBEN meetings;

    • Work with the Treasurer to submit the annual IRS filing for 501(c)(3) status and attend to any other administrative and annual reporting work associated with 501(c)(3) status.


    Treasurer Position Description:

    • Collect dues and any other funds to be received by GBEN;

    • Document all financial transactions related to GBEN;

    • Report monthly financial updates to the President and Vice-President and the Executive Committee;

    • Report at general membership meetings and prepare an annual/fiscal year report;

    • Transact the general business of GBEN in the interim between meetings; 

    • Disburse funds and pay bills in accordance with the provision of the Bylaws or policies of the Executive Committee;

    • Work with the Clerk to submit the annual IRS filing for 501(c)(3) status and attend to any other administrative and annual reporting work associated with 501(c)(3) status.

    • The outgoing officers shall deliver to their successors all books and materials of their respective offices by January 15th.


    Qualifications and Time Commitment:

    • Membership with GBEN and AEA

    • Some leadership or management experience

    • Minimum of 3 years’ experience with evaluation-related work

    • Capacity to commit 10-15 hours per month

    • Some Board experience helpful

    • Strong organizational skills helpful


    Submission Process:

    Each nomination submission should include:

    • Name, Title, Affiliation, Email, Phone

    • Resume or CV

    • A brief statement answering the following questions:

      • Why are you interested in becoming Clerk or Treasurer of GBEN?

      • What are your qualifications for Clerk or Treasurer?

      • What is your vision for GBEN?

    Submit COMPLETED applications to GBEN via email (greaterbostoneval@gmail.com) by Monday, October 7, 2019, or earlier if possible.


    Questions?

    If you have questions about the nominations process, please contact Danelle Marable, DMARABLE@mghihp.edu.

  • 07/31/2019 10:14 AM | Greater Boston Evaluation Network (Administrator)

    Big data.  Data science.  Predictive analytics.  Social network analysis.  The field of evaluation is expanding to new frontiers, becoming a transdisciplinary practice.   

    Based on your work experience and interests in the field of evaluation, which area(s) do you want to learn more about next and integrate into your evaluation practice?


  • 06/26/2019 3:24 PM | Greater Boston Evaluation Network (Administrator)

    On Tuesday, June 18th, GBEN hosted its second roundtable on the topic of social network analysis.   Over a dozen GBEN members and guests participated.  The roundtable discussion was led by Kelly Washburn, MPH, from Massachusetts General Hospital’s Center for Community Health Improvement.  Kelly is also one of GBEN’s Programming Committee co-chairs.

    “Social network analysis (SNA) is the mapping and measuring of relationships and flows between people, groups, organizations, computers or other information/knowledge processing entities” (Valdis Krebs, 2002).  SNA can show:

    • the performance of the network as a whole and its ability to achieve its key goals;
    • characteristics of the network that are not immediately obvious, such as the existence of a smaller sub-network operating within it;
    • the relationships between prominent people of interest whose position may provide the greatest influence over the rest of the network; and
    • how directly and quickly information flows between people in different parts of the network.

    Kelly walked through a small social network analysis she conducted, taking participants through the steps involved, the challenges, and the lessons learned.  The project discussed was a provider task force working to improve connections among service providers, streamline services, and enhance care coordination efforts.  The SNA provided a baseline on how the task force members work with each other by asking four questions:

    1. Do you know this person?
    2. Have you gone to this person for information in the last year?
    3. Have you worked with this person to obtain new resources in the last year?
    4. Have you referred a client to their organization in the last year?

    The analysis was done in Gephi, a free, open-source software package for conducting social network analyses.  Data cleaning was the most tedious part of the project and was done manually; however, there are ways to bypass manual data cleaning.  After the data were set up in the appropriate nodes and edges files, they were uploaded into Gephi.  Once in Gephi, the steps detailed in its manuals were followed to take the analysis from the initial map to the finalized map.  Following Kelly’s discussion of her project, others in attendance spoke of their own experiences using social network analysis in their work.
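    For readers who want to prototype a similar analysis before (or alongside) Gephi, the same nodes-and-edges structure can be sketched in a few lines of Python with the networkx library.  The sketch below is a hypothetical illustration only, not Kelly's actual project: the names, responses, and question labels are invented, and the output files simply follow the Source/Target CSV layout that Gephi's spreadsheet importer accepts.

    # A minimal sketch, assuming Python 3 with networkx installed (pip install networkx).
    # All respondents and responses below are invented for illustration.
    import csv
    import networkx as nx

    # Each tuple records (respondent, person they named, the survey question that
    # linked them), echoing questions like "Have you gone to this person for
    # information in the last year?"
    responses = [
        ("Alice", "Bea", "information"),
        ("Alice", "Cam", "resources"),
        ("Bea", "Cam", "referral"),
        ("Cam", "Alice", "information"),
        ("Dee", "Cam", "information"),
    ]

    # Build a directed graph: each edge points from the respondent to the person named.
    G = nx.DiGraph()
    for source, target, question in responses:
        G.add_edge(source, target, question=question)

    # In-degree centrality surfaces the people many others turn to, i.e. the
    # "prominent people of interest" whose position may carry the most influence.
    for person, score in sorted(nx.in_degree_centrality(G).items(),
                                key=lambda item: -item[1]):
        print(f"{person}: {score:.2f}")

    # Write nodes and edges files in the CSV layout Gephi expects.
    with open("nodes.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Id", "Label"])
        for node in G.nodes:
            writer.writerow([node, node])

    with open("edges.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["Source", "Target", "Type", "question"])
        for source, target, data in G.edges(data=True):
            writer.writerow([source, target, "Directed", data["question"]])

    In a real project, the responses list would be replaced with rows read from the survey export, which is where the manual data cleaning Kelly described (deduplicating names, reconciling spellings) typically happens.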

    Key Challenges and Lessons Learned:

    The roundtable participants discussed a few challenges and lessons learned when conducting an SNA, including:

    • New analytical methods and techniques, like SNA, can require a lot of patience and time to learn and master.  Be sure to invest the necessary time when learning how to conduct an SNA for the first time.
    • A high response rate means a lot of follow-up to ensure the data are representative of the population you are analyzing.  Be sure to invest the necessary time and resources in follow-up for your project.
    • Make sure the questions being asked are the right questions, as it’s difficult to change direction once the project and analysis have started.
    • Continually ask yourself and/or your team(s):  Do I need to collect new data, or is there already-collected data I can use for the SNA?
    • SNA can be frustrating to administer and master at times.  Patience during the process is key to a successful outcome. 
    • The visual map was key for the task force in understanding the analysis. 




  • 05/28/2019 1:12 PM | Greater Boston Evaluation Network (Administrator)

    Feminism, at its core, is about the transformation of power—but how do you know that’s happening at the organizational level? How can you understand the core drivers of that transformation? How can your own process of evaluating that transformation democratize the evaluators’ power?

    Taylor Witkowski and Amy Gray are evaluation and learning specialists at Oxfam America, designing and testing feminist monitoring, evaluation and learning processes for a gender justice-focused organizational change initiative.


    WHAT IS FEMINIST EVALUATION?

    Everything is political – even evaluations.

    Traditional evaluations, even when using participatory methods, prioritize certain voices and experiences based on gender, race, class, etc., which distorts perceptions of realities. Evaluators themselves carry significant power and privilege, including through their role in design and implementation, often deciding which questions to ask, which methodologies to use, and who to consult. 

    Feminist evaluation recognizes that knowledge is dependent upon cultural and social dynamics, and that some forms of knowledge are privileged over others, reflecting the systemic and structural nature of inequality.  However, there are multiple ways of knowing that must be recognized, made visible, and given voice. 

    In feminist evaluation, knowledge has power and should belong to those who create, hold, and share it.  The evaluator should therefore ensure that evaluation processes and findings attempt to bring about change, and that power (knowledge) is held by the people, project, or program being evaluated.

    In other words, evaluation is a political activity and the evaluator is an activist.


    APPLYING FEMINIST EVALUATION TO ORGANIZATIONAL TRANSFORMATION AT OXFAM AMERICA

    Oxfam America is seeking to understand what it means to be a gender just organization—from the inside out.

    In order to do this, Oxfam America recognizes that a holistic, multi-level approach is required. We believe that transformational change begins at the individual level and ripples outwardly into the organizational culture and external-facing work (Figure 1).


    (Figure 1: transformational change begins at the individual level and ripples outward into organizational culture and external-facing work)

    This is why we are investing in a feminist approach to monitoring and evaluation: even though feminist values (adaptivity, intersectionality, power-sharing, reflexivity, transparency) seem like good practice, without mainstreaming and operationalization they would not be fully understood or tied to accountability mechanisms at the individual, organizational, or external levels.

    Therefore, as evaluators, we are holding ourselves accountable to critically exploring and implementing these values in our piece of this process. The foundational elements of this emergent approach include:

    • Power-Sharing: The design, implementation and refinement of the monitoring and evaluation framework and tools are democratized through participatory consultations with a range of stakeholders—a steering committee, the senior leadership team, board members, gender team and evaluation team.
    • Co-Creation: Monitoring includes self-reported data from project contributors as well as process documentation from both consultants and evaluation staff, and data is continually fed into peer validation loops for ongoing reflection and refinement.
    • Transparency: Information regarding the monitoring and evaluation framework, approach, and activities is communicated and made accessible to staff on a rolling basis as it evolves.
    • Peer Accountability: Monitoring mechanisms that capture failures and the cultivation of peer-to-peer safe spaces to discuss them create new opportunities for horizontal learning and growth. This includes a social network analysis (SNA) of perceived power dynamics within teams (contributed by team members via an anonymous survey), followed by a group discussion in which they reflected on the visual depiction of their working dynamics through the lens of hierarchy and intersectionality.

    EVALUATORS AS ACTIVISTS

    As the monitoring, evaluation and learning (MEL) staff working on this initiative, we recognize that we have an opportunity to directly contribute to change. We therefore see ourselves as activists, ensuring MEL processes and tools share knowledge and power as well as generate evidence that reflects diverse realities and perspectives, and can be used for accountability and learning at multiple levels. As a result of this feminist approach to MEL, participating Oxfam staff can see and influence organizational transformation.                                                                       

    How have you used feminist evaluation in your work? Do you have any tips, resources or lessons learned you’d like to share? Do you think this would make a good roundtable discussion?






  • 04/29/2019 3:12 PM | Greater Boston Evaluation Network (Administrator)

    As evaluators, we sometimes collect more data than we can use.



    What are one or two methods or tricks you use to make your data collection process more meaningful and/or more aligned to your evaluation questions?

  • 03/27/2019 10:52 AM | Greater Boston Evaluation Network (Administrator)

    On Tuesday, February 5th, GBEN and Northeastern University’s Public Evaluation Lab (NU-PEL) co-hosted a panel on Impact Evaluation.  This was the first GBEN event of 2019 and the first event co-sponsored with NU-PEL.  The event saw the greatest turnout in the history of GBEN with 66 attendees!

    The panel featured five local internal and external evaluation leaders whose programs have recently undergone randomized controlled trial (RCT) or quasi-experimental impact evaluations. 

    The five panelists were:

    More and more, non-profits must demonstrate impact in order to ensure their ability to grow and innovate.  The purpose of the panel discussion was to explore what drives non-profits to engage in an impact evaluation, how to choose a methodology, and lessons learned about communicating results. 

    Here are some of the key takeaways from the engaging panel discussion:


    Methodological Rigor vs Reality

    Several of the panelists discussed the push-and-pull between ideal methodological rigor and what is actually possible and/or ethical for programs.  In particular, Ms. Britt and Mr. Nichols-Barrier spoke to being able, or unable, to randomize based on program over-subscription.  On the flip side, Ms. Goldblatt Grace and Professor Farrell from My Life My Choice shared a powerful anecdote about overcoming skepticism of their project’s rigorous methodology in order to allow a research assistant to be present at the mentor-mentee match sessions.


    Organizations Conduct Impact Evaluations for Lots of Reasons
     

    The motivating factors behind the decision to evaluate impact are diverse. Organizational values, political context, and funders can all play a role in the decision to conduct an evaluation as well as decisions around study methodologies. 


    Communicating Results
     

    Several of the panelists shared helpful tips about communicating results, specifically going beyond sticking them in a report that few people read. Ms. Britt shared a strong example from Year Up, which developed a year-long plan for communicating parts of their results throughout the whole organization, including a big celebratory kick-off event. 


    GBEN would like to thank the five panelists for being a part of this incredible event as well as NU-PEL for co-hosting.  Be on the look-out for future co-hosted events with NU-PEL!

