Welcome to the Capturing Development blog

Through my company Capturing Development, I (Marlen Arkesteijn) support NGOs and research institutes active in the fields of nature conservation, agriculture and poverty alleviation in developing planning, monitoring and evaluation (PM&E) mechanisms that promote learning.

The core of good monitoring and evaluation practice lies, in my view, in curiosity combined with the need for change: wanting to know how your interventions contribute to lasting, sustainable change and exploring how interventions can be made more effective.

Currently we are using 50 per cent more resources than our Earth can provide (see e.g. the Living Planet Report 2012). To ensure that both human beings and other living creatures on this planet can live in harmony and have sufficient space, clean water, air and food, transformative change is needed. One of the keys lies in changing our consumption and production patterns, at all levels. Individuals (you and me), organisations, companies and governments all need to change their practices and, perhaps even more importantly, the rules that (re-)produce these practices and sustain the current system.

For transformational change of practices and rules there are no (simple) recipes. Often it is a matter of questioning assumptions, of trial and error, of exploring what works, what does not, and why. Capturing Development supports NGOs in this exploration. I facilitate collaborative inquiries and research, trying to make differing roles, perspectives and assumptions explicit; conduct evaluations and reviews (preferably collectively with staff, to embed results and lessons); support the development and structuring of M&E practices within NGOs; and encourage double-loop learning and system learning (moving towards questioning our ways of thinking and our problem definitions).

Tools and methods (both qualitative and quantitative) can be helpful (but not leading) in these ventures: I have good experience with Outcome Mapping, Most Significant Change, Reflexive Monitoring in Action, action research, surveys (e.g. with Survey Monkey), etc.

In addition, whenever relevant and useful, I use visuals (videos and photographs) in my assignments to facilitate internal communication and reflection (see Visual M&E).

Video made by staff of Masindi Farmers’ Association on gender impact, after I trained them in camera work, editing, interviewing and the gender framework.

Capturing Development is owned by Marlen Arkesteijn (see ‘Who is Capturing Development?’). Although self-employed, I frequently work together with other professionals.

Capturing Development is a partner in Evaluation & Co., Partnership for Innovative Evaluation. This group specialises in evaluation for learning in complex situations.

With Barbara van Mierlo of Wageningen University (Communication and Innovation Science), I work on Reflexive Monitoring in Action, an M&E methodology developed specifically for learning for social innovation and system change.

With Maple Tulips (Dominique Darmon) I developed e-Valuation, a novel visual evaluation method.


Happy 2017!

[New Year’s card 2017]


Happy 2016!

[New Year’s card 2016]


Documenting Gender Impact in Uganda

Training in visual documentation techniques.

A few months ago I trained staff of three district farmers’ associations (Hodfa, Madfa and Mbabadifa), three partners of Trias Uganda, in visual documentation techniques.

Over the years the three district farmers’ associations implemented a range of gender projects to ensure that women, men and youth benefit equally from their general programmes on food and income security and on value chains. And indeed, the associations seemed to be quite successful in their endeavour to mainstream gender, as reported by different experts visiting the districts, but none of the farmers’ associations really kept systematic track of these results. After years of experience, they felt the need to review and discuss the status and results of gender mainstreaming, and to determine the way forward.

At the same time the farmers’ associations felt the need to go off the beaten track and to show their results instead of writing up a study. Through an earlier evaluation, two of the farmers’ associations had seen the power of video for facilitating discussions, so they decided to document their gender results through videos themselves. And so it happened.

This meant that staff needed to be trained in video documentation, especially in documenting results of gender mainstreaming. I developed a comprehensive training that included camera and editing techniques and skills and, more importantly, social research techniques: interviewing and storytelling, triangulation and validation, and training in understanding and recognising gender impact.

This last issue, ‘recognising gender impact’, took an especially important place in the training: what does gender impact mean in the light of farmers’ associations and their programmes?

To move beyond capturing mere singular testimonies, I introduced the gender@work framework, which includes changes in gender awareness, in access to and control over resources, services and decision-making for women, men and youth, and changes at institutional and community level (laws, regulations, policies and norms, values and practices) that enable gender changes to happen.

The results of three weeks of training were absolutely amazing, and certainly not without challenges! In the end all three farmers’ associations had rather strong videos on gender impact, featuring specific women and men. The changes in awareness and the changes of norms, values and practices at community level are very well visible. As Benon, one of the interviewees, tells: ‘Only 15 years ago, women were not supposed to join our meetings, and if they did, they were supposed to keep their mouth shut. And look at us now, we even have women leaders that lead our groups.’ Access to services like extension and credit, and to resources like land and seed, was visible in the videos as well. However, control over resources and services was still largely absent: for example, ownership of land and decision-making about money are still mainly in the hands of men.

Recently, the videos were published by Trias Uganda. And to be honest, I am really, really proud of my students and of how they were able to capture gender impact after only three weeks of training.


Visual outcome mapping and harvesting

Today I would like to showcase some of the work I do with my partners at Evaluation & Co on visual outcome mapping and harvesting. Please find below our first Update:


Why use Outcome Mapping?
The organisations we work with on Visual Outcome Mapping have programmes with specific characteristics: they involve a broad range of actors; their objectives focus on changing the behaviours, policies and practices of another range of actors; it is not clear what exactly could work and how; and the programmes face rapidly changing environments. Good old logframe planning and simple monitoring methods are not particularly helpful in these cases. For these problems, dynamic and flexible design, monitoring and evaluation methods are needed that pay heed to the uncertainty of ‘not knowing’ and to the different perspectives on solutions and problems. Outcome Mapping is such a method. Read more

Does one image say more than a thousand words?
In many organisations written words are the lifeline of policies, practices, learning, reporting and communications. This is how we have been educated. But there are other, and sometimes more effective or relevant, ways of sharing lessons, ideas and results, for example through visuals. Many people learn better by seeing things. Visuals like pictures and videos provide information that words and texts alone cannot, such as attitudes, images of situations and atmospheres. Read more

Monitoring is the new ‘evaluation’
Currently, we are supporting an advocacy organisation in the Netherlands in their efforts to monitor and learn from one of their programmes. Their problem was that they were so caught up in their daily work that even a simple joint reflection on what had been achieved was postponed time and again. So they have beautiful plans but no time to look back, reflect and take stock. Sounds familiar, doesn’t it?

Fortunately the team leader realised that they would lose many opportunities for learning and asked us to facilitate the monitoring process. This was also something relatively new for us: we were educated with the notion that monitoring is for internal learning and reflection and is best done by the organisations themselves, while outsiders (evaluators like us) conduct mid- and end-term evaluations. Read more

Wemos – Visual Outcome Mapping for the HRH Alliance

In 2012 we supported Wemos, a Dutch health NGO, in developing a 3-year planning and monitoring trajectory (2013-2015) for one of their networks, the Human Resources for Health Alliance. This alliance attempts to promote, within the Netherlands, the WHO Code of Practice on ethical international recruitment of health personnel and to facilitate the strengthening of health systems.

Currently the Netherlands has a surplus of health workers. However, in the next decade demand is expected to increase sharply as the population ages. This will have consequences for the number of health workers needed and for their skills and competences. To avoid shortages of health workers in the future, policy measures have to be developed now. Read more



Complexity season

Although discussions on ‘complexity’ never really waned, lately the topic seems to be gaining momentum and attention as never before. Just to name a few examples: within the Outcome Mapping Learning Community an interesting discussion is going on about the use of Outcome Mapping for complex situations; ODI has released a background note on ‘Planning and strategy development in the face of complexity’; and, last but not least, Ben Ramalingam will soon launch his new book ‘Aid on the Edge of Chaos: rethinking international cooperation in a complex world’. The launch is taking place on 6 November 2013 and can be attended online. Do not miss it; it is sure to be interesting.

One of the questions posed in the Outcome Mapping Learning Community was what the added value is of using Outcome Mapping in complex systems, or, in more general terms, ‘what is the use of planning when facing complexity?’. The ODI background note, written by Hummelbrunner and Jones, answers this question (and other questions) quite nicely.

Hummelbrunner and Jones reason that a situation is complex when there is a) no clear advance knowledge of how to tackle the issue at hand: we do not know what interventions could be useful and/or how contexts and trends will influence the issue (uncertainty); and b) no agreement among actors about what to do (disagreement) (see also Kurtz and Snowden, 2003; Patton, 2011). They add a third characteristic: the distribution of knowledge and capacities. [I would rather add the dimension of ‘systemic stability’ instead, but for the sake of space and time I will come back to that discussion in another blog post.]

They state that planning does not become obsolete in the face of complexity, but requires different approaches and formats. First of all they reason (as do Rogers, 2008 and Wilson-Grau, 2013) that not all aspects of a situation are complex. Secondly, for the aspects of a situation that are complex, they reckon that plans should be adaptive, so that new developments, challenges and opportunities can be incorporated, since a large part of the information needed is generated along the way.

Setting learning objectives may be as important as setting performance objectives, and interventions should, in their view, be designed to actively test hypotheses. This means short feedback cycles are needed (see also Patton, 2011) to regularly adapt plans. They suggest moving from static to dynamic planning, from prescriptive to flexible planning modes, and from comprehensive to diversified planning.

In their note they present various planning approaches that are appropriate for the complex aspects of situations, ranging from scenario techniques to assumption-based planning, adaptive strategy development and Outcome Mapping.

In my view this is an excellent background note, and very timely. So let us embrace uncertainty and realise that ‘without any planning on where to go, the chances are smaller that we eventually get anywhere’. The focus, however, could move from ‘front-loaded’ planning to learning and adapting along the way.


Viewing evaluation videos underneath the mango tree

In 2012 I hardly had time to write blog posts. One of the reasons was that I was engaged in a video-evaluation of the Community Agro-Enterprise Development Programme (CAEDP) in Uganda, with Trias Uganda and their partners Hodfa and Madfa (two district farmers’ associations) and Hofokam (a micro-finance institution). Actually, the word video-evaluation is not entirely correct: I used video as a tool to harvest stories and for reflection, alongside all kinds of other tools such as a one-page survey, focus group discussions, reflection sessions, document review and in-depth interviews. But since using video is still rather new in the world of evaluation, I use the word video-evaluation to emphasise the use of video.

During the CAEDP evaluation I used video to harvest stories, in this case Most Significant Change stories, told by farmers (men and women) on changes in food and income security, and told by CAEDP staff on capacity strengthening. In every community we visited, people selected the two most significant change stories to be put on video. Of course, before leaving the community, we held a reflection session during which the selected MSC stories were shown and further reflected upon. Since none of the communities had electricity, we used a generator, and that worked very well; even viewing underneath the mango tree went smoothly.

Viewing always provoked a lot of discussion and laughter! It was never a problem to get people’s reactions and reflections.

With the partners we (re-)viewed the stories told by the farmers as well. And here, again, the videos really facilitated the discussions and the reflections on progress.

Once again I have seen how videos tremendously support evaluations: the data gathering, the analyses and the reflections. I say ‘support’ evaluations because, while the videos clearly showed how the programme contributed to income and food security and provided qualitative information on impact and on the process, the group discussions and the one-page household survey showed what the quantitative impact of the programme was. The qualitative cannot show its glory without the quantitative, and vice versa.

Below you will find a trailer of the farmers’ MSC stories.


Systems thinking for evaluation once more!

Yesterday I attended the live webinar on ‘Systems Thinking for Equity-focused Evaluation’ organised by UNICEF, UN Women, the Rockefeller Foundation and a bunch of other organisations and institutes that together formed MY M&E (a really awesome and informative website with all you ever wanted to know about monitoring and evaluation).

Yesterday’s seminar is just one of the very many webinars they organise on evaluation (there are series of webinars on equity-focused evaluations, on emerging practices in development evaluation, on developing capacities for country-led M&E systems, on country-led M&E systems, etc.). Every other week or so you can attend, for free, lectures delivered by top-notch evaluators and methodologists like Michael Quinn Patton, Patricia Rogers, Bob Williams and Martin Reynolds, and, in theory, debate with them! Yes, we live in a world of wonders!

Yesterday Bob Williams and Martin Reynolds, both renowned systems thinkers and evaluators, gave short introductions on ‘Systems thinking for Equity-focused Evaluations’ for a global interactive classroom of nearly 100 participants. Bob Williams briefly explained the key principles of thinking systemically: inter-relationships, perspectives and boundaries. These are the three principles that many methods from the ‘systems field’ have in common (Williams claimed there were about 1,200-1,300 methods in the systems field!). Martin Reynolds dived into the crossroads between equity-focused evaluation and one of the systems methods: Critical Systems Heuristics (CSH).

Although Martin Reynolds’ presentation looked rather impressive, the complexity of his story, combined with some technical disturbances, made his lecture hard to follow and understand. There was one topic, though, that really made me prick up my ears! He was talking about the steps in CSH, starting with making an ‘ideal mapping’ of the ‘ought’ (sounds like a fairy tale), followed by a descriptive mapping comparing the ‘is’ with the ‘ought’, and other steps. This ‘ideal mapping of the ought’ is placed at the beginning of the whole exercise to provoke ‘blue-sky thinking’ and to let people realise that reality is constructed, and can be re-constructed if we really want to.

Why does this remark raise my interest? Well, if you have followed my earlier blog posts, my quest is very much ‘How can evaluation contribute to re-construction?’, or, in other words, how could evaluation contribute to ‘system change’? Bob Williams commented on Reynolds’ point, saying that it made him think very much of organisational development and ‘vision’ building, and that is certainly true as well.

And all that brings me back to my eternal question: ‘How does systems thinking contribute to evaluation practice?’ Is it the emperor’s new clothes, or can it really contribute something solid? Again, I come to the conclusion that it is not so much about the tools and instruments from the systems field itself, but about the way of thinking. Think big, act small, and see our world as one big construction site: take nothing for granted and challenge the existing rules of the game. Let evaluation (either with or without systems thinking) help us contribute to the transformation of this world!

Next week, on 22 November, Patricia Rogers will give a lecture, and on 6 December 2011 it is Michael Quinn Patton’s turn! You are strongly advised to join!


Maria Joao Pires & Evaluation Practice

What has Maria Joao Pires, the renowned pianist, to do with evaluation practice? Well, at first sight probably nothing for most people, but in my reality, or better, in my brain, Maria makes a great connection with evaluation.

It is already quite some years ago that I came across a documentary on Maria Joao Pires. In this documentary you see her students struggle with some of the most complex piano pieces, intertwined with shots of the gorgeous surroundings of her farm in Portugal. Although I am not a connoisseur, I guess the students played, technically, superbly and showed great virtuosity!

Despite their virtuosity, Maria was, most of the time, not impressed. I do not recall exactly what she said, but it was very much along the lines of: ‘Yes, technically you played the piece very well, but tell me, why should you play this piece? What did you add to this piece? How did you interpret it? I want you to put your soul into this piece! Otherwise the piece could be played by anybody else. What makes your piece different from the (same) piece student X is playing?’ (After writing this I found some clips on YouTube; ah, memory is a feeble thing. Anyway, for the point of this blog it does not make much difference ;-).)

I am not saying that evaluators are piano players, but Maria has a point here, also for evaluators. As evaluators we need to have expertise (knowledge: technical, procedural and intellectual) as a ground rule. Without this expertise we are nowhere and not worth hiring anyway. The question here is: is that enough? If we are virtuosos in our expertise, does that suffice to make us ‘good’ evaluators?

During a dinner gathering with other evaluators (organised by Evaluation 5.0), we discussed part of this topic. A first additional qualification that good evaluators (in our view) should have, we concluded, is proper behaviour. The outcomes of an evaluation are influenced by many different factors, but one we have some control over is our own behaviour. When we are directive, those being evaluated will very likely be defensive or timid. When we are open and truly listening, they may be open too and speak their minds.

But still, does this qualify us as good evaluators? Not necessarily. So do we need to put our soul into our work, just as the piano players should according to Maria? I am not quite sure about that. But what we do need to do is be aware of our vision and motivation. What is it we are actually doing? Are we mainly earning money? Or do we want to contribute to a more just and sustainable world through our practice? Shouldn’t we first clarify our vision, and then use our expertise and behaviour to contribute to that vision?

Not that I have my vision ready, but my, I could start trying and ask myself: ‘Why should I do this evaluation and not somebody else?’


Reading Michael Quinn Patton

Since I am writing an article on development cooperation and its M&E approaches (and naturally to keep myself updated), I read Michael Quinn Patton’s latest book (2011), ‘Developmental Evaluation. Applying Complexity Concepts to Enhance Innovation and Use’, published by The Guilford Press, New York.

To avoid any confusion: developmental evaluation has nothing in particular to do with development cooperation. ‘Developmental’ refers to the approach that Patton follows. He writes (building on a quote from Pagels): “Evaluation has explored merit and worth, processes and outcomes, formative and summative evaluation; we have a good sense of the lay of the land. The great unexplored frontier is evaluation under conditions of complexity. Developmental evaluation explores that frontier.” (p. 1)

So developmental evaluation is an evaluation approach for dealing with complexity. However, various practitioners and evaluation professionals have started using developmental evaluation within development cooperation.

The book is a very good read, with illustrative examples and hilarious anecdotes. Patton describes, for example, how he came up with developmental evaluation. He was working for a programme, using formative and summative evaluations as his repertoire, while the team did not want to arrive at a fixed model that could then be tested in a summative evaluation. “We want to keep developing and changing”, they stated… “Formative evaluation! Summative evaluation! Is that all you evaluators have to offer?”, one of the team members exclaimed. ‘Frustration, even hostility, was palpable in his tone.’… “Well,” I said, seeking inspiration in my coffee cup, “I suppose we could do, umm, we could, umm, well, we might do, you know… we could try developmental evaluation!” (p. 3)

Developmental evaluation supports innovation development in order to guide adaptation to emergent and dynamic realities in complex environments, so it is quite different from regular evaluation approaches, which focus more on control and on finding order in the chaos. Patton mentions five complex situations that developmental evaluation is particularly appropriate for:

  1. Ongoing development in adapting a program, policy, or innovation to new conditions in complex dynamic systems;
  2. Adapting effective principles to a local context as ideas and innovations are taken from elsewhere and developed in a new setting;
  3. Developing a rapid response in the face of a sudden major change, exploring real time solutions;
  4. Preformative development of potentially broad impact scalable innovation;
  5. Major system change and cross-scale developmental evaluation.

A very important key feature of developmental evaluation is that it aims to contribute to social change and to ‘nurture developmental, emergent, innovative, and transformative processes’. It is not so much about testing and refining a model (formative) or about a judgement (summative); it has a strong action-research component. With this, Patton is embarking on a rather new purpose of evaluation. Of course, other types of evaluation aim to contribute to social change as well, but usually in an indirect way, exploring what works and what doesn’t. Developmental evaluation goes a step further and aims to be part of the action, facilitating interventions that may work (or not).

Another, closely related key feature is the ‘closeness’ of the evaluator to a programme. Instead of being a person who only visits mid-term or at the end of a programme, a developmental evaluator is ‘continuously’ present: asking questions, probing, exploring with the programme, and providing feedback in ‘real time’ in rather short feedback loops.

These two features are, in my opinion, exactly what may be needed when dealing with complex situations. Such situations are complex, unpredictable, multi-causal, non-linear and emergent, and may need constant attention. Programme or project leaders are, in my experience, often too involved in their management activities to also remain reflective and ask critical questions themselves. A developmental evaluator could help here.

Overall, it is an inspiring and thought-provoking book, and it offers good guidance without falling into the pitfall of blueprints or steps! Of course it also raises questions, especially where Patton talks about system change, the fifth complex situation. Here he refers to the work of Bob Williams, who uses quite a broad understanding of ‘system’, as long as boundaries, perspectives and interrelationships are involved. In the end this means that almost all situations are ‘systems’, and that is what I see happening in debates.

I think (and correct me if I am wrong) that the ‘system’ concept needs unravelling and ‘demystification’. What is really necessary is to challenge the institutional settings and their related norms, values and cultures that reproduce current unsustainable practices. What could help this unravelling is to borrow from concepts and theory used in innovation science.

In my article on development cooperation and M&E approaches, this will be one of the topics I will further explore and discuss. It is going to be an inspiring and hot summer! I hope to write more about this topic in my next blog post.
