Some thoughts on decision support systems
and the provision of advice to farmers
Jan Edwards
District Agronomist, Cowra
GRDC’s premise
• The advice currently given to farmers by advisers is:
– inconsistent
– generic (rather than paddock specific)
– not the best available
– not rigorous enough
– more art than science
• GRDC wants to lift the standard of advice
• Will a decision support tool help:
– diagnose why crops aren’t reaching their water-limited yield
– assist GRDC to make research investment decisions?
Why would advice differ?
• Because advisers:
– use their training
– use past experience
– use trial data
– extrapolate from experience and data
– use rules of thumb
– use DSS
– use things learnt from training workshops
– ask other agronomists
– seek help from experts
– make educated guesses
• This makes the advice individual
Is it the only reason outcomes differ?



• Giving advice is an exchange
– Farmers are not the same
• age, debt level, family, education
• stage of life and family composition are important
– Farms are complex partnerships involving many people
• Decision making is a very human thing
– Profit is not necessarily the main driving force
– It is also social (not just technical)
• A lot of adoption occurs when an idea or practice becomes part of ‘good farm management’
• Farmers construct their own knowledge
– Scientific advice
• is evaluated against other information, knowledge and beliefs
• does not automatically have credibility and legitimacy
• is used when consistent with their own understanding
• is often adapted to fit their own world view
Diagnostic agronomy
• What limits consistent advice?
– Not enough information
– Too much information
– Confusing information
– Not enough time to integrate / summarise information
– Not enough strategic, whole farm thinking
– Complex situations
• Diagnostic agronomy requires very effective monitoring
– Currently this is done on farm, mostly by farmers, and is very subjective
• There are only so many things it could be
– More likely it is a “combination of factors”
– Most of which you don’t need a model to figure out
• What do you do after the problem is diagnosed?
Benchmarking performance
• We have experience
– Collation of data over many seasons and paddocks
• Crop production groups
• CropCheck database
• TopCrop
• When analysed:
– there was rarely a single identifiable reason for good crop performance
– ‘rules’ for success were rarely to do with rates and dates
• so not easy to transfer
• Didn’t deal with the ‘so what next?’ question
Can decision support systems help?
• Computer based
– Simulation models
• GrassGro
• APSIM
– Predictive models
• WheatMan
• SowMan
– Websites
• CropMate
– Excel-based
• Salvaging crops calculator
• VarietyChooser
• Paper based
– Rules of thumb
• Nitrogen budgeting workshop
– References
• Wheat growth and development book
• Sowing guides
• AgFacts
• GRDC review has identified about 60 DSS
Decision making and DSS
• The promise of DSS
– they can organise and process the dazzling volume of crop management information
• Farming systems are complex
• DSS tend to be deep but narrow
– Models are aimed at biophysical accuracy
• at the expense of realism in management?
– Also tend to focus on
• Tactical decisions (which crop, rates and dates)
– numerous, have lesser consequences and lower priorities
– less important than strategies?
– Rather than:
• Operational decisions (sowing, spraying, harvesting)
• Strategic (pasture to crop mix, land purchases)
Developing DSS
• They take time and lots of money to develop
• They need a champion, someone who believes in the concept
– The more they believe
• the more likely it is to be built
• the less likely it is that anyone else will understand its purpose or be as passionate about using it
• The more they mimic biological systems:
– the more data they need
– the more complex they are
– the harder they are to drive
• The more complex they are, the easier it is to create nonsense
• They need updating or they get old very quickly
– they have to have current terminology and/or variety names or people don’t relate to them
Using DSS
• Tools like APSIM are fabulous
– they are the only way to handle very complex questions
– but without continued basic research they have limitations
• row spacing, lupins in NSW, frost, disease
• ‘Black box’ models put people off
– they instinctively want to change a value even if it is not relevant to do so
• Excel-based spreadsheets are more flexible and adaptable
• Simple systems
– to teach you a method or a principle which you then formulate into your own rule of thumb (see the sketch below)
– You don’t have to use it every time you make a decision
DSS have a history of failed adoption
• Farmers make good decisions without DSS
• Lack of demand for DSS
• They are built by the people who will use it
– Reflect the decision-making style of the user
– Don’t necessarily match users’ needs
– Users generally not involved in development
• The analysis is inferior to experienced human judgement
– Often no link to what can be done about the situation
• DSS focus on operational or tactical decisions
• They can take a lot of time to learn
– Unless they are used often you forget how
• Tedious data entry
• Complicated set-up process
• Lack of software support post-release
• Technical interpretation needed
• Need large amounts of input data
– some of which the user can’t get easily
• Lack of computer use for management
• Time constraints
• Complex to use
• Lack of local relevance
• Poor marketing
• Hayman (2002)
• Robinson and Freebairn (2000)
Precision

• Models give the illusion of precision
– The added precision may make no difference to the decision.
– There may be less to be gained from being precisely right with detailed simulation
• Especially when you factor in the risk of being precisely wrong


• It may be better to be vaguely right than precisely wrong, and to solve the whole problem roughly than part of the problem extremely well
• Malcolm (1994)
• Decision makers are often confronted with situations that are:
– Obvious
• a large response to N fertiliser, or a very suboptimal sowing time
– Marginal
• the gain from N fertiliser equal to its cost, or small differences if sown a week earlier or later (see the sketch below)
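To make the obvious/marginal distinction concrete, here is a minimal sketch comparing the expected return from extra nitrogen with its cost. The prices, nitrogen cost, yield response and the ±20% “marginal” band are illustrative assumptions, not figures from the presentation.

```python
# Illustrative only: prices, N cost and yield response are assumed figures,
# not from the presentation; the +/-20% "marginal" band is arbitrary.

def n_topdress_verdict(extra_n_kg_ha: float,
                       grain_response_kg_per_kg_n: float,
                       grain_price_per_t: float,
                       n_cost_per_kg: float) -> str:
    """Compare expected extra grain revenue with the cost of the extra N."""
    extra_revenue = extra_n_kg_ha * grain_response_kg_per_kg_n / 1000.0 * grain_price_per_t
    cost = extra_n_kg_ha * n_cost_per_kg
    margin = extra_revenue - cost
    if abs(margin) <= 0.2 * cost:
        return f"marginal (margin ${margin:.0f}/ha) - precision unlikely to change the decision"
    return f"obvious ({'apply' if margin > 0 else 'skip'} the extra N, margin ${margin:.0f}/ha)"

# Example: 40 kg N/ha, 10 kg grain per kg N, wheat at $300/t, N at $1.50/kg
print(n_topdress_verdict(40, 10, 300, 1.50))  # obvious (apply the extra N, margin $60/ha)
```

In the obvious case no model is needed; in the marginal case extra precision rarely changes the decision, which is the point of the ‘vaguely right’ argument above.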


• Better decisions don’t necessarily follow from:
– Better information
– More information
What makes a DSS successful
• Simple to use
• Relevant
• Able to be localised
• Effective
• Low cost
• User-friendly
• End-users involved in the development
• Very clear purpose
• Aimed at learning
– Modelling and simulation contribute to learning and knowledge
• but there is a weak link between modelling / simulation and change
• Robinson and Freebairn (2000)
Other options?
• DSS
– More time spent aggregating data
– More literature reviews
– Publish summarised ‘guides to …’
– Less concentration on trial results, more concentration on updating BMP
– More appropriate data management
– Better organised website
– Rules of thumb
– Decision trees
– Paper-based tools
• Consistent advice
– Accreditation for agronomists, advisers and consultants
– Membership of AACA or similar
– Regular refresher training
– Training for farmers
– Less focus on individual trials at the Updates