= Indicators =

{{TOC right}}
This page summarises an initial, internal brainstorming on indicators. Indicators can be recorded, measured or analysed in order to provide a simple and reliable means of identifying achievement, measuring changes connected with an intervention, or helping to assess performance. Indicators can be:
* quantitative (numeric),
* qualitative, or
* pseudo-quantitative: a number is converted to a scale (e.g. 1 = poor, 5 = good); a small conversion sketch is given further down this page.

{{Under construction|THIS IS WORK IN PROGRESS!}}
{{note|Further ideas/TODOs:
* For each indicator, mention appropriate display mechanisms --> for this, define categories of display mechanisms.
* For each indicator, mention whether it is quantitative or qualitative, and possibly how it could be calculated and where the data could be taken from.
* Define a matrix of the main indicator characteristics.
* The overall goal is to come up with a reference set of indicators per mine action area (land release, MRE, victim assistance, etc.).}}

== Important meeting notes ==__NOEDITSECTION__
* Indicators cannot be isolated; they always have to be put in context with outcomes, objectives, etc.
* Approach: cause-effect ("what leads to what").
* Short-term objective: in MINT/Geoportal, get started with output indicators and validate them with the Russell case (are they good ones? could they be presented in a better way? etc.).
* Mid-term objective: GICHD publication on (outcome) indicators, their development, etc. by mid-2015.

== Examples of dimensions of change relevant to mine action ==__NOEDITSECTION__
From a document distributed during the Copenhagen workshop in 2013, examples of dimensions of change relevant to mine action (outcome level) include:
* Changes relating to land and land use
* Changes relating to safety / risk from mines and ERW
* Changes relating to national capacity to address mine and ERW problems
* Changes relating to gender
* Changes in the support to mine and ERW victims
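As an illustration of the pseudo-quantitative type of indicator mentioned in the introduction above, the sketch below converts a raw numeric value into a rating from 1 (poor) to 5 (good). This is only a minimal sketch: the threshold bands are hypothetical and would have to be agreed per indicator.

<syntaxhighlight lang="python">
# Minimal sketch: mapping a numeric indicator value onto a pseudo-quantitative
# 1 (poor) to 5 (good) rating. The thresholds are hypothetical examples only;
# real bands would have to be agreed per indicator.
def to_scale(value, thresholds=(20, 40, 60, 80)):
    rating = 1
    for limit in thresholds:
        if value >= limit:
            rating += 1
    return rating

# e.g. an indicator value of 35 (say, percent of tasked area released on schedule)
print(to_scale(35))  # prints 2
</syntaxhighlight>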
== Principles for the development of indicators ==__NOEDITSECTION__
=== From a DDG presentation ===__NOEDITSECTION__
* '''Valid''' - Does the indicator directly represent the change it is intended to measure? Is the change within the scope of the project?
* '''Objective''' - Is the definition precise, simple and unambiguous about what is to be measured?
* '''Owned''' - Do the local communities and programme management agree that this indicator makes sense?
=== SMART & SMARTER ===__NOEDITSECTION__
SMART (or SMARTER) is a mnemonic/acronym giving criteria to guide the setting of objectives and indicators:
* Specific
* Measurable
* Achievable
* Relevant
* Time-bound
* (Evaluate)
* (Reevaluate)
See the [http://en.wikipedia.org/wiki/SMART_criteria Wikipedia article].
== Categories/Levels of indicators ==__NOEDITSECTION__
This is just to distinguish different sets of indicators, for different types / levels / purposes / areas:
* Output-level indicators
* Outcome-level indicators
* Performance indicators
* Impact-level indicators?
* Activity-level indicators?
== Examples of indicators ==__NOEDITSECTION__
This is an initial collection of indicators encountered so far in the mine action context. '''It is not yet an assessment of their applicability, usefulness or relevance!'''

=== Indicators mentioned in the Copenhagen initiative output document ===__NOEDITSECTION__
{| class="wikitable"
|}
=== Indicators mentioned in the UN M&E framework ===__NOEDITSECTION__
In the UN M&E framework for mine action, indicators are targeted at measuring progress towards the UN-specific mine action objectives. They address two levels: vision-level objectives and strategic objectives. The latest document describing those indicators is the [[Media:20140318(v2) - Survey Instrument for Pilot.docx|UN Survey Instrument]].

=== Inspirational indicators from the WHO document ===__NOEDITSECTION__
Cf. http://www.who.int/healthinfo/systems/WHO_MBHSS_2010_full_web.pdf?ua=1 -- write a summary here after going through the document.

=== Indicators mentioned in a discussion about operational efficiency ===__NOEDITSECTION__
From a discussion between Helen, Rana and Elisabeth (a small calculation sketch for these indicators is given at the end of this section, after the comment from Russell):
* % of areas worked on that had mines
* % of areas worked on that had UXO
* Average size of cleared areas
* Average size of surveyed areas

=== Comment from Russell about approaches and learning from other people's experience ===__NOEDITSECTION__
The key issue (I would go as far as to say the over-riding issue) in getting indicators into common use is the widespread negative perception among field operators. Mention indicators and the reaction is something like "that's all nonsense, no-one in the field has time to go around collecting page after page of data that is never going to be used anyway". This is almost a direct quote of a reaction I got from a colleague who is relatively positive about GIS and other technology, and it seems to sum up the most common reaction. As a result, it seems likely that the biggest problem to overcome is perception and acceptance, with technical issues coming second. In terms of an approach, this means things like:
* Go for quick wins with indicators that may not be very useful overall but are easy to collect data for and produce obvious results, so that people start to change their perception.
* Try to find out the biggest issue that needs to be addressed in a given situation (i.e. which indicator is most requested) and decide whether it is a feasible problem or not. If it is not feasible, ruthlessly set it aside and look at the next most urgent one. What we choose ''not'' to do is going to be important (cf. Steve Jobs: "there is no shortage of good ideas, but you have to say no to all of them if you are going to work on the best ideas"). Don't be distracted by unrealistic expectations.
* In looking at indicators from parallel development areas, we should focus not on the "technical" fit of the indicator to mine action as our first criterion, but on how well the indicator is accepted in the field. Build up a library of these and then look for common factors and characteristics in why and how they are accepted (and perhaps also the timescale and process from first use to acceptance). If our main problem is perception, then that must be addressed analytically as far as possible.
* Accept that indicators are partial and indicative, and generally only indicative of anything when it is too late, but that this is ''far'' better than the alternative of working blind, on hearsay, and on widely accepted myths and lies about effectiveness and impact.
* Remember that "doing the right job" is more important than "doing the job right", and look at how other sectors address the "right job" question.
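The operational-efficiency indicators listed above are simple proportions and averages over hazardous-area records. Below is a minimal sketch, assuming a hypothetical list of area records with ''status'', ''size_m2'', ''had_mines'' and ''had_uxo'' fields; these names are illustrative only and do not reflect an actual database schema.

<syntaxhighlight lang="python">
# Minimal sketch: computing the operational-efficiency indicators discussed
# above from a hypothetical list of area records. The field names
# (status, size_m2, had_mines, had_uxo) are illustrative assumptions only.
areas = [
    {"status": "cleared",  "size_m2": 12000, "had_mines": True,  "had_uxo": False},
    {"status": "cleared",  "size_m2":  8000, "had_mines": False, "had_uxo": True},
    {"status": "surveyed", "size_m2": 25000, "had_mines": False, "had_uxo": False},
]

# Areas "worked on" = cleared or surveyed
worked_on = [a for a in areas if a["status"] in ("cleared", "surveyed")]

pct_with_mines = 100 * sum(a["had_mines"] for a in worked_on) / len(worked_on)
pct_with_uxo = 100 * sum(a["had_uxo"] for a in worked_on) / len(worked_on)

cleared_sizes = [a["size_m2"] for a in areas if a["status"] == "cleared"]
surveyed_sizes = [a["size_m2"] for a in areas if a["status"] == "surveyed"]

avg_cleared_size = sum(cleared_sizes) / len(cleared_sizes) if cleared_sizes else 0
avg_surveyed_size = sum(surveyed_sizes) / len(surveyed_sizes) if surveyed_sizes else 0

print(f"% of areas worked on that had mines: {pct_with_mines:.1f}")
print(f"% of areas worked on that had UXO:   {pct_with_uxo:.1f}")
print(f"Average size of cleared areas (m2):  {avg_cleared_size:.0f}")
print(f"Average size of surveyed areas (m2): {avg_surveyed_size:.0f}")
</syntaxhighlight>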
== References (external links) ==__NOEDITSECTION__
* Feinstein International Center: ''Participatory Impact Assessment: A Design Guide''; available at [http://fic.tufts.edu/assets/PIA-guide_revised-2014-3.pdf Participatory Impact Assessment: A Design Guide]
* World Health Organization: ''Monitoring the Building Blocks of Health Systems: A Handbook of Indicators and their Measurement Strategies''; available at [http://www.who.int/healthinfo/systems/WHO_MBHSS_2010_full_web.pdf?ua=1 A Handbook of Indicators and their Measurement Strategies]
* Mikkel Nedergaard (DDG): ''Outcome Monitoring in Humanitarian Mine Action'', The Journal of ERW and Mine Action, 2014; available at [http://www.jmu.edu/cisr/journal/18.1/focus/nedergaard.shtml Outcome Monitoring in Humanitarian Mine Action]
* Miscellaneous links/resources related to '''Outcome Mapping''' (please update as you find useful material):
** [http://www.researchtoaction.org/2012/01/outcome-mapping-a-basic-introduction/ Outcome Mapping: A Basic Introduction]
** [http://www.idrc.ca/EN/Resources/Publications/Pages/IDRCBookDetails.aspx?PublicationID=121 Building Learning and Reflection into Development Programs]
** [http://www.outcomemapping.ca/ Outcome Mapping Learning Community]
{{NavBox HubBusiness Intelligence}}[[Category:NoPublicNAA]]