Sibiu quality criteria and indicators: progress and sustainability


 

 

CORE QUALITY CRITERIA:  PROGRESS AND SUSTAINABILITY

Main aspects important for quality assurance with regard to progress and sustainability: the partnership / network meets its mission and progresses towards its vision by planning and achieving a balanced set of results that meet both the short- and long-term needs of its stakeholders and, where relevant, exceed them.

QUALITY INDICATORS
(CORE AND ADDITIONAL/DESCRIPTORS)

What might indicate quality in the partnership?

 

EVIDENCE TO SUPPORT INDICATORS

What evidence is available to support the inclusion of the indicators?

 

The capacity to continuously understand results and the reasons behind them, and the capacity to use this understanding to influence ongoing planning

 

Definition:

Monitoring is about checking: checking whether inputs match outputs, whether income balances expenditure, and whether actual activity matches planned activity. It is also about recording the gaps between them. Monitoring is not the same as evaluation, because it is descriptive rather than interpretive and is not intrinsically directed toward learning, but the two are often confused. However, evaluation is almost impossible if there is no monitoring system in place.
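
A minimal sketch, in Python, of the kind of gap-recording described above, assuming nothing more than a comparison of planned against actual key figures (all item names and numbers below are invented for illustration); in keeping with the definition, it records the gaps without interpreting them:

# Hypothetical planned and actual figures for a partnership's activities.
planned = {"workshops": 12, "budget_eur": 50_000, "partner_meetings": 4}
actual = {"workshops": 9, "budget_eur": 54_200, "partner_meetings": 4}

def monitoring_report(planned, actual):
    """Return the gap between planned and actual for each monitored item."""
    return {item: actual[item] - planned[item] for item in planned}

# Record the gap for every monitored item; interpretation is left to evaluation.
for item, gap in monitoring_report(planned, actual).items():
    status = "matches plan" if gap == 0 else f"gap of {gap:+}"
    print(f"{item}: {status}")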

 

Definition:

There are probably as many definitions of evaluation as there are books written about it. However, the shared understanding is that

 

  • evaluation is purposeful: it is a means to an end, not an end in itself,
  • evaluation of things which have happened helps people make decisions about the future,
  • it is based on asking specific questions about a project and finding the answers,
  • it is an investigative process: evaluation is systematic and scientific,
  • it involves collecting evidence, making comparisons and measuring things against criteria (see the sketch after this list),
  • evaluation means that someone, ultimately, has to make judgements about the value or worth of something, so its outputs must be interpretive, not simply descriptive.
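
As a minimal Python sketch of the last two points, assuming the collected evidence has already been condensed into scores against agreed criteria (the criteria names, targets and scores below are invented for illustration):

# Measure evidence-based scores against agreed criteria and turn the
# comparison into an interpretive judgement (all values are hypothetical).
agreed_targets = {"relevance": 3, "sustainability": 2, "stakeholder_benefit": 3}   # agreed target on a 1-4 scale
evidence_scores = {"relevance": 4, "sustainability": 2, "stakeholder_benefit": 2}  # scores derived from collected evidence

for criterion, target in agreed_targets.items():
    score = evidence_scores[criterion]
    judgement = "meets or exceeds" if score >= target else "falls short of"
    print(f"{criterion}: score {score} {judgement} the agreed target of {target}")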

 

Below you will find the handbook "Evaluate Europe", which has been developed by the Capitalization & Evaluation Research Network (CERN) in order to support project managers in developing evaluation schemes for a variety of purposes.

 

Evaluate_Europe_Volume1.pdf  

There is a formal process agreed around evaluation; soft and hard progress indicators are determined and agreed; evaluation and review address product (outcomes) as well as process (lessons)

 

Monitoring mechanisms and evaluation procedures are in place and scheduled, and resources are allocated, both financial and in terms of human responsibility

 

Clarity of monitoring, evaluation and review methods and instruments

 

All partner organisations have to fill in a questionnaire on evaluation which is designed for all partners and possibly additionally used with primary stakeholders, in order to collect opinions and results and to benchmark them

 

Research and planning skills are available in order to conduct monitoring, evaluation and review effectively

Documentation of the formal process, signed by all stakeholders of the network

 

Documentation of monitoring instruments, implementation schemes, and responsibilities

 

Number of inquiries made and requests for support to effectively use monitoring, evaluation and review methods and instruments

Monitoring and evaluation inform strategic direction as well as policy and practice

 

Definition:

Capitalisation means building on the achievements of an activity, project (or programme) and using the results in future activities.

Key stakeholders ensure that personnel time is committed to review

 

There is a clear understanding of the capitalisation process, allowing policymakers and practitioners to effectively capitalise on evaluation results

 

Results of evaluation and review are fed into the planning of Learning Region (LR) actions and approaches

Hours / days spent by stakeholders for review of monitoring and evaluation results within a certain period

 

Easy-to-use documentation of the capitalisation process, agreed by all stakeholders

 

Number of inquiries concerned with capitalisation issues

 

Changes applied during planning stage of actions and approaches 

Evaluation and review (quality) are prioritised and internalised: seen as a core activity and not just an add-on

An evaluation culture is established and there is a common understanding of evaluation as a stimulus for development

 

It is ensured that evaluation and review are part of the day-to-day activities of the network

 

Self-evaluation is promoted within the network

Evaluation and review are acknowledged as a priority task in strategic documents

 

Evaluation and review are acknowledged as a priority task in operational planning

 

Number of inquiries concerned with difficulties in understanding the task

 

Number of self-evaluations implemented

Means of measuring must be appropriate in the context of the learning region

There is a macro as well as micro picture (product as well as process)

 

It is ensured that outcomes are also determined over the longer term, since effective LR strategies may take generations to produce positive results rather than yielding them immediately

 

Possible outcomes include effects on the broader community (not just the individual), and positive benefits are identified in different contexts

 

Achievements at the product / process / professional practice / policy level

 

Experience of measuring good practice and contribution to public good

Measuring sticks are defined for both levels, the Learning Region network and its sub-networks, and are laid out in strategic planning documents

 

Long-term outcomes are defined, and indicators are mapped onto evaluation schemes

 

Outcomes are defined for the broader community, and indicators are mapped onto evaluation schemes

 

Outcomes and indicators are defined on different levels and mapped onto evaluation schemes

 

Good measuring practice is documented and acknowledged by the network actors; that is, stakeholders feel comfortable with the measuring sticks and perceive transformation (soft indicators)

 

Review of broader regional development strategies to determine the influence of the LR strategy

Results and findings of monitoring, evaluation and review must be capable of being widely understood

Evaluation and review results are made widely available

 

Results are stakeholder-specific

Number of network actors and target groups reached

 

Results of document analysis, for example on the style of publications and the different modes of communication targeted at the main user groups

 

Positive feedback from stakeholders, and inquiries indicating a gap in understanding

Flexibility of partners to share information (not defensive)

Learning Region actors see the added value of sharing insights and giving information

Number of documents shared

 

Quantity and quality of participation in forums, subscriptions to mailing lists, and use of means for knowledge sharing, such as databases

 

Results from stakeholder audits into the effectiveness of knowledge sharing

 

Smaller networks created within the hub

 

Mutual agreement governing confidentiality and ownership of information exchanged

Flexibility and openness of partners to accept results (failures as well as successes) and to act on them

Evolution of a culture of change and emergence of change agents, shakers and shapers

Results from stakeholder audits into organisational change, patterns of innovative action, changes of routines and habits, and new roles / functions indicating the appearance of change agents

Outcomes and impact are regularly checked, demonstrated and communicated to all members of the network and the community at large

There are mechanisms in place which allow for the periodic analysis, synthesis, demonstration and dissemination of the outcomes and impact of LR initiatives, based on monitoring and evaluation findings

Frequency and number of periodic reports

Unintended as well as planned outcomes are documented and shared

Different types of outcomes are documented in a database

Number of database records by type of outcome, both intended and unintended

Benefits identified are broad-based
(not just education-linked)

 

Links between all these dimensions (regarding the value of learning) are understood and demonstrated

 

Review of broader regional development strategies to determine the influence of the LR strategy

   

 

 

 

Comments (1)

jutta.thinesse@web.de said

at 3:15 pm on Jun 18, 2011

Regarding monitoring, my suggestion would be to develop an evaluation sheet all together. All partners then have to fill in a questionnaire on evaluation which is designed for all partners and possibly additionally used by primary stakeholders, in order to collect opinions and results and to benchmark them. This helps to reshape the project and leads to sustainability.
