Cognition Network meeting, February 2011
Notes FAL

NB: Please note that these are my personal, unedited notes, and not an attempt to objectively summarize what was said!

Henriette:
- Nordic Conference in April 2012.
  o Who should give lectures? Relevant subjects for courses or seminars?
  o Concerning different groups: "ground staff", consultants, and psychologists.
- Projects?
  o E.g. guidelines.
- How to make the knowledge visible?
- How to give way for a dialogue about the topic?
- Publication?
- Anything that needs to be put on the web page?

The future of the network?
Stream at NVC conference in Norway 2012.

Why do we want to increase the knowledge on assessment of cognition?
- To "box" the deafblind person? To find the "zone of proximal development"? To see the learning potentials? To diagnose? To document (advances in) practice? Etc.
- Accommodate the "need" from the field.
  o Saskia: How is the case person understood by the staff?
  o Need for diagnosis?
  o Will the deafblind person profit from a deafblind programme?
This must be formulated.

How do we increase our knowledge?
Case analysis/assessment
- How do we get cases?
- How do we assess cognition on the basis of the case material?
- What kinds of case material do we need?
  o Videotaped interactions/activities.
  o Context material.
  o What questions do we need to ask the staff/family?
Discussing with other experts
- Inviting theoretical researchers

How do we understand cognition and assessment?
Models
- One coherent model
  o Interactive understanding
Scales/observation cues and categories
- How do we observe?
Assessment
- How to use the models and the "scales"?
- What is the assessment situation/conditions? (What are the conditions for the psychologist, having only a few videos and meeting points?)
- How do we see improvement or differences in cognitive functioning?

How do we go about assessing cognition?
Guidelines

How do we share knowledge?
Lecturing
- Seminars
- Courses
Publication
- Guidelines
- Good examples.
Inviting scholars and practitioners; lecturing

Case descriptions: How do we describe/assess?
Shared Knowledge: What is cognition?

Jude
(Tactile) working memory assessment
Wood, Bruner, Ross, 1976: Scaffolding
Wood, D., Bruner, J., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry, 17, 89-100.
Errorless learning: reduce the degrees of freedom.

Annika
Case presentation
List of abilities – yes/no assessment based on tacit knowledge.
[Maybe degree would be more suitable than yes/no.]
[Still a big issue to discuss how we map the cognitive concepts/abilities onto the actual (inter-)acts.]

Tuesday 8/2

Henrik
Case

FAL: How to make a dynamic assessment
1. Starting point: video recording.
2. First analysis of the video to look for focus points, and to formulate initial focus points and hypotheses and – importantly – anti-hypotheses (e.g. arousal or reflection; attention problems or lack of interest).
   a. Overview of different aspects of the situation (cognitive, emotional, arousal, participation, communicative topics, etc.).
   b. Micro-analysis of relevant parts of the interaction/communication.
   c. Activity analysis.
3. Getting an overview of the historical specificities regarding medical condition, sensory/perceptual functioning, and communication in general, as well as the history of communication with the individual communication partners.
4. Start a programme for optimising the communication/situation/activity structure.
   a. Partner competencies
      i. How to communicate
      ii. How to do things together
      iii. What to do together
5. Setting up a specific situation that will display the cognitive (socio-emotional-cognitive) abilities in focus.
   a. Analysing the activity
   b. Preparing the partner
   c. Video recording
6. Preliminary cognitive assessment on the basis of video analysis.
   a. Same points as in 2.
   b. Focussed analysis of the interaction with regard to the focus points.
7. And so forth, in (dynamic) circles.

Dynamic Assessment
Petri Partanen: Dynamic assessment
www.dynamicassessment.com\id2.html
* The underlying assumption of dynamic assessment is that all learners are capable of some degree of learning (change; modifiability). This contrasts with the underlying assumption of standardized psychometric testing that the learning ability of most individuals is inherently stable. Research with dynamic assessment has demonstrated that determination of the current levels of independent functioning of learners is far from a perfect predictor of their ability to respond to intervention. [Good!]
* The assessment is most often administered in a pretest-intervention-posttest format.
* Dynamic assessment procedures vary on a number of dimensions, but primarily with regard to degree of standardization of interventions, as well as regarding content. There are four basic models that fit most of the procedures:
1. An open-ended, clinical approach that follows the learner, using generic problem-solving tasks such as matrices (e.g., Feuerstein et al.). The approach to intervention focuses on principles and strategies of problem solution and aims to promote independent problem solving.
2. Use of generic problem-solving tasks, but offering a standardized intervention. All learners are provided with the same intervention involving principles and strategies for problem solution (e.g., Budoff, Guthke, et al.). These approaches tend to focus on classification of learners, attempting to reduce the negative results of cultural bias.
3. A graduated prompting procedure where learners are offered increasingly more explicit hints in response to incorrect responses. All learners progress through the same menu of prompts or hints, varying with regard to the number of prompts required for task solution (e.g., Campione, Brown, et al.).
4. Curriculum-based approaches that use actual content from the learner's educational program, with interventions based on "best practices" of teaching. These can vary regarding degree of standardization of interventions (Lidz, Jepsen, et al.). These approaches focus on IEP development for learners with special needs.
The 1992 monograph 'Dynamic Assessment and Mediated Learning: Assessment and Intervention for Developing Cognitive and Knowledge Structures - An Alternative in the Era of Reform' is now available for downloading as a pdf file from www.mindladder.org.

It is potentially problematic that the focus is on "problem solving" and on "prompting and correct/incorrect responses".
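[A minimal sketch of the graduated prompting idea in model 3 above, only to make the counting logic concrete: the dynamic measure is how many prompts a learner needs, not a pass/fail result. The task, the prompt menu and the try_task function are hypothetical placeholders, not taken from Partanen's material or from the meeting.]

    # Illustrative sketch only - try_task and prompts are hypothetical placeholders.
    def graduated_prompting(try_task, prompts):
        """Present one task with an ordered menu of increasingly explicit hints.

        Returns the number of prompts needed before a correct response
        (0 = solved independently), or None if the menu is exhausted."""
        if try_task(hint=None):              # first attempt, without any prompt
            return 0
        for used, hint in enumerate(prompts, start=1):
            if try_task(hint=hint):          # hints offered in a fixed, graduated order
                return used
        return None                          # not solved even with the full menu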
How to organise the network
1. Development network (as now)
2. Collaborative network
   a. All professionals responsible for cognitive assessment processes
   b. Discussing cases, and developing assessment practice
3. Advisory (expert) panel
   a. Institution with formal advisory role

Karin
Case (the "spinner" from last time)
Activity: Cutting bigger holes in boxes for putting smaller boxes inside.

Discussion
[Long-term memory "assessed" on the basis of anecdotal "evidence", e.g. "He remembers that he was here before – he recognises the environment/the activity." It could be investigated by comparative video recordings/analyses of the primary and the recall situations.]

Teaching: Backwards Chaining
1. Imagine a future goal.
2. Break the activity down into decreasingly complex activities, and begin at the level that is manageable.
Testing:
1. Start by describing the degree of complexity of the activities that are enacted now.
2. Gradually expand the complexity until the manageable limit is found.
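[A minimal sketch of the testing idea just above – start from the complexity level enacted now and expand stepwise until the manageable limit is reached. The ordered levels list and the is_manageable check are hypothetical placeholders, not from the case material.]

    # Illustrative sketch only - levels and is_manageable are hypothetical placeholders.
    def find_manageable_limit(levels, current_index, is_manageable):
        """levels: activity variants ordered from least to most complex.
        current_index: the level the person enacts now.

        Returns the index of the most complex level that is still manageable."""
        limit = current_index
        for i in range(current_index + 1, len(levels)):
            if not is_manageable(levels[i]):   # stop at the first unmanageable step
                break
            limit = i
        return limit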
Dorrit
Case
Video A: Music lesson, "drumming".
Video B: End of hearing test.
- Interactive play (turn-taking) with sounds (clap, "prrt", coughs, vocalisations, sound patterns).

Discussion
"Disrupted" turn-takings (i.e. long overlaps) – Why?
Competing hypotheses:
- Hypothesis A: It is on purpose. They enjoy making simultaneous sounds.
- Hypothesis B: It is not on purpose. One or both of them are not respecting the turn-taking patterns – why?
  o Hypothesis B1: Sound processing problems. He cannot distinguish that she is also making sounds.
  o Hypothesis B2: Conversation structure problems. He does not respect the turn shift.
  o Hypothesis B3: Conversation structure problems. She does not respect the turn shift.
If B2 or B3, why then is it not respected?

Wednesday

Cognitive models
+ scaling of parameters
= cognitive scales
+ behavioural cues (a fixed set of such cues, or cues interpreted dynamically/contextually/semiotically)
= assessment "scales"

Jude: When you do cognitive assessment, you are doing communication analysis, and while you do communication analysis, you are doing cognitive assessment.

FAL: It is always a tricky thing to translate from "ordinary language" to "cognition language" – and then back again! When you "translate" descriptions from everyday language to theory-driven terminology (whether from neuro-cognition, cognitive semiotics, communication theory, or any other kind of psychological, sociological or ethnographical theory on human doing and thinking), the level of complexity in the description rises (that is why we have theory – to be able to describe complex matters). This complexity is very difficult to preserve when you translate "back" to everyday language.

Jude: Assessment categories:
- Verbal understanding
- Perceptual organisation
- Working memory
- Processing speed
Learning abilities are what we measure.

Brainstorm:
What is good about the network?
- Our way of discussion
- A safe environment
What is the next step?
- Products!
- Clarifying/developing concepts/knowledge
- Passing our knowledge on to others
- Writing stuff together
  o Saskia: Preliminary title of our "guidelines": Dynamic assessment of cognition in interaction.
- Analysing/describing video together
- Trying out in practice
- Discussing cases

Next meeting: 22-24 August