Product matrix


Participatory Agricultural Research: Approaches, Design and Evaluation

Oxford, 9-13 December 2013

Product Matrix


  1. Roadmap for monitoring, evaluation and learning of participatory agricultural research (Marina)
  2. The contribution of Participatory Agricultural Research (PAR) to achieving the SLOs (Marc)
  3. The PAR community of practice - what will it look like?
  4. PAR tool typology (Tracy)
  5. PAR state of the art review / conceptual framework (Tracy)
  6. Strategy for capacity development linked to participatory agricultural research (Saa)
  7. Strategy paper for advocacy on participatory agricultural research (Nicole)
  8. Roadmap for the delivery of an initial framework on participatory agricultural research (Katherine and Beth)
  9. PAR value proposition (from the workshop)
  10. PAR profiles - of tools and approaches (Terry)
  11. PAR personal journeys - how PAR changes us (Terry)

Blogposts from the meeting

Blog piece on PARADE, what I got out of it (Jo)

Presentations at the expert meeting



The writeshop 'roadmap' (full size image)





The final output from the clustering activity on Thursday:

Ideas were grouped into seven clusters: Framework; Community of practice; Taskforce; Capacity development; Advocacy, lobbying, comms and outreach; M&E; and State of the art. The products and actions proposed across the clusters were:

* A framework for integrated demand-driven systems research by Q1 2015
* Product: a community of practice in each region, actively monitored
* Product: training module on PAR, state-of-the-art methods, approaches and ethics
* Develop a statement on how PAR contributes to achieving the SLOs
* Checklist for monitoring and evaluation indicators
* Study on how participatory agricultural research has been successful and how this will help us meet the SLOs
* Product: strategy paper and action plan
* Community of practice annual meeting, with Yammer and a wiki, by Q1 2014
* Product/action: training and capacity development on PAR (R4D) processes and management approaches
* Side event on participatory agricultural research at the Fund Council in Nov 2014
* Product/action: designing common M&E frameworks and good practices; use as an accountability mechanism explicit to PAR
* Product/action: state of the art review
* 5 year plan
* User perception and needs assessment of PAR
* Terms of reference for a PAR taskforce
* Funding for a higher degree, with continued support on return
* Action: 5 year planning
* Consensus conference on PAR
* Process: develop appropriate evaluation frameworks and methods
* An outline for a systematic review paper on the successes and failures of PAR
* Strategy action plan
* Action: building alliances within the CG and with partners and stakeholders to champion PAR
* More research money directly to NARS, for them to decide how to spend
* Share, advocate and lobby for PAR results
* M&E tools for PAR programs within our CRP
* Product/action: collective demonstration products
* Cross-CRP working group: start with the system CRPs on PAR by Q1 2014
* CG support for university faculty to embed in ministries for x years
* Explore the opportunity for a special issue on PAR
* Toolbox for M&E
* A definition and typology of PAR
* Identification of enabling conditions for transdisciplinary approaches
* A manifesto describing how the process will work
* Progress/action: document the PAR evidence base
* Product/action: incentives for PAR within organizations
* Toolbox for facilitating PAR
* Promote better interaction among and between NARS

Names attached to the clusters: Fergus, Floriane, Nils, Marina, Murat, Clovis, Valentine, Nicole, Mark, Saa, Amet, Regis, Marc, Tracy, Geraldine, Adrian, Tim, Martha.




Nils Ferrand and colleagues also proposed some follow-up activities ([[1]])

End-users' needs assessment

Aim: Expose the final target stakeholders of PAR (mainly farmers, citizens and local CBOs) to PAR design and use issues, and enable them to express their own requirements and criteria for designing and implementing PAR.
How:
0. Organize means
1. Agree on a common international methodology combining "push" and "pull", using well-suited methods such as drama, role-playing games and film screenings
2. Select target groups worldwide, using the partner network
3. Implement the assessment globally
4. Gather and analyse results
5. Feed results back to all participating communities and publish them
Who:
* PAR experts for the method
* Any PAR implementers
* Communities
Cost: 4 PM for the managing group + 2 PM per case study for local management + 3.5 days per participant in communities
Risks: difficult push/pull balance; method design proves impossible; limited reach; results add no value
When: 2014-2015

High-level consensus conference on PAR

Aim: A decision-makers' consensus conference on PAR: bring high-level decision makers (CG Board, donor representatives, top academic leaders) together to assess the pros and cons of PAR through an informed and structured debate in front of a pool of experts.
How:
0. Agree on and organize means
1. Get top-level decision makers to agree on the principles
2. Select the panel (high-level decision makers)
3. Agree on the process, content and agenda
4. Select experts and "witnesses" from case studies
5. Organize logistics
6. Run, facilitate and monitor the consensus conference
7. Write up conclusions and disseminate them through public events
Who:
* Consensus conference experts for the method and facilitation
* PAR experts for the expert panel
* High-level decision makers / policy makers interested in PAR vs non-PAR
* Other "witnesses"
Cost: 3 PM for organization + 6 PDs for all participants + 2 PM for post-evaluation
Risks: decision makers refuse to engage; poor debate; no conclusive outcome
When: 2015-2016

Systematic experimental assessment of PAR processes and value

Aim: A set of highly structured, controlled experiments to test and compare the processes, outcomes and constraints of different PAR approaches, tools and methods in different contexts, using the principles of randomized trials plus policy and field experiments.
How:
0. Agree on and organize means; select partners and participants
1. Select target PAR methods, tools and issues
2. Decide the experimental methodology with external experts, including the monitoring protocol and the definition of value assessment
3. Design and pre-test methods for the experiments
4. Select and organize local test contexts
5. Implement the local tests, including results collection
6. Post-process and synthesize results per experiment
7. Aggregate and compare
8. Publish
9. Revise, repeat, extend
Who:
* Expert scientists in experiments, social or biophysical
* PAR experts - method designers and implementers
* Local partners in case studies
Cost: high per case and per experiment
Risks: diversity and multiplicity of treatments; ethics; complexity of methodological design; tension between experimental control and realism; multiple value assessments
When: 2015-2018

Systematic comparative protocols of existing cases

Aim: Review and post-evaluate existing case studies using a common method.
How:
0. Organize means and partners
1. Decide the list of target methods and cases
2. Decide the evaluation methods
3. Select target cases and organize partnerships
4. Implement the ex-post evaluation on site
5. Collect and process results
6. Analyze
7. Publish
Who:
* PAR experts
* M&E experts
* PAR case implementers / holders
* Local case study partners
Cost: 4 PM for structuring + 2 PM per case for evaluation + 6 PM for synthesis
Risks: lost memory; post-hoc reconstruction; difficulty agreeing on common M&E; partners unwilling to undergo ex-post evaluation; complexity and diversity of processes; disentangling factors
When: 2014-2016

General meta-PAR workflow

Aim: A generic methodological workflow for thinking about, choosing and implementing methods with and for stakeholders from different perspectives, to help both scientists and policy makers consider why and how PAR should or should not be used.
How:
0. Organize means and partners
1. Review past experiences (cf. other PARADE actions)
2. Review the literature on the same
3. Build an abstract general model of PAR and R4D processes, with actors, information flows and conditions
4. Build an operational decision-support tool (computer-based or not) on this model
5. Test and evaluate the tool
6. Disseminate through demonstration exercises with real users
Who:
* PAR experts
* Process modellers
* Decision-support system developers
* "Guinea pig" users
Cost: 12 PM for the model and framework + 6 PM for tool development + 6 PM for testing
Risks: complexity of the model; acceptance of the decision trajectories; user-friendliness and compliance of the decision tool; adoption
When: 2014-2016
