
VideoHacking: Automated tracking and quantification of locomotor behavior with
open-source software and off-the-shelf equipment
Ian Woods, Assistant Professor, Ithaca College • iwoods@ithaca.edu • 1 August 2014
Behavior is the ultimate outcome of nervous system function. The analysis and
quantification of animal behaviors, therefore, is a powerful way to assay the activity
of the nervous system. Our goal today is to quantify locomotor behavior using
free software and ubiquitously available hardware.
Numerous hardware systems and software applications are available to quantify
animal behaviors. These are typically difficult to implement widely in an
undergraduate laboratory setting because of their high cost, need for extensive
operator training, and limited customizability. In this lab, we will use the video
acquisition devices in smartphones, webcams, and laptops to detect the timing,
magnitude, and structure of locomotor behaviors in larval and adult zebrafish. Video
data from these devices will be analyzed using Python, a powerful open-source
scripting language that can be implemented in any computer system. Specifically,
we will take advantage of Python’s ability to interact with OpenCV, a freely available suite of tools for computer vision and image analysis, and Matplotlib, a set
of tools designed to create stunning visualizations of data. Together, Python,
OpenCV, and Matplotlib offer a free alternative to commercial (and often expensive)
programs such as MATLAB.
These techniques will be explored by tracking the locomotor behaviors of zebrafish
that have been exposed to drugs that interfere with or modulate nervous system
function.
Procedure:
1) Divide into 4-6 groups, and choose whether you’d like to use adult or larval fish
(note that we only have equipment available for one group to study adult fish). It
would be optimal to have at least one iPhone in each group. If you’re feeling brave
(and have a laptop), one person can try downloading the necessary software. This is
OPTIONAL at this point, as we can do the analyses on my computer. If you do want
to try the software installation, see Appendix D. Macs are a bit easier than PCs, but
be prepared for a bit of frustration on either platform.
2) Choose which compound you’d like to explore from the table on the next page.
3) Choose how many conditions you’d like to explore (two is probably best at this
point), and how many fish you’d like to expose to each condition. We probably have
enough larvae for each group to use about 24 fish. The group that studies adult fish
can use 12 individuals, and is likely limited to studying the effects of ethanol.
4) Choose a format for arraying your larvae: we have 24-well, 48-well, and 96-well
dishes available.
5) Using water and the pipettors provided, measure the volume of each of the wells
in your dish. Determine the total volume you will need to fill all wells for each
condition in your experiment. For example, if you have 24 fish split across two
conditions, you would need 12 × (well volume) for each condition. Add about 10%
extra volume to make sure that you have enough; a short calculation sketch after
step 11 works through an example.
6) Measure out this volume of fish system water (+0.1% DMSO) into a container,
and add the drug from the stock solutions according to the table below. Gloves
would be a good idea here.
Table of available compounds:

Drug                    Concentration                  Dilution for reasonable dose
Ethanol                 100%                           ~1250X (for B.A.C. of 0.08%)
Strychnine              20 mM                          ~100X
Caffeine                10 mg/mL (500 mg in 50 mL)     ~200X
Glycine                 50 mM (185 mg in 50 mL)        ~100X
Serotonin               4 mM (42.5 mg in 50 mL)        ~200X
Clonidine               2 mM (27 mg in 50 mL)          ~100X
Carbamyl-beta-choline   2 mM (19.6 mg in 50 mL)        ~100X
7) Fill the appropriate wells with your solutions.
8) Using a transfer pipette (or a net for adults), transfer one fish into each well.
Minimize the amount of water transferred with the fish. Use a different pipette for
each treatment.
9) Wait > 10 minutes for the fish to acclimate to their new conditions, and to allow
for drug penetration.
10) While you are waiting, figure out how you are going to film your fish. Some ideas
are pictured in the appendix. The major items you will want to think about are (1)
Alignment: the edges of the plate must be parallel to the edges of the video; use the
landmarks on the phone screen to help you align your fish; and (2) Glare and
shadows: sources of motion artifacts include light shining off of moving water, and
shadows moving across the plate; try to shield your fish from direct ambient light.
11) After your fish have calmed down a bit, begin acquiring video. The maximum
length of video that can be emailed from the iPhone is about 1 minute. You can take
multiple videos; we’ll stitch them together. Email your video(s) to me at
iwoods@ithaca.edu. One minute is probably sufficient for illustration purposes, but
it is OK if you would like to record more. If you have a long video (>1 min) that you
would like to use, we can transfer it directly to my computer.
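Steps 5 and 6 boil down to two small calculations: the total volume per condition,
and the amount of stock needed for a given dilution. Here is a minimal Python sketch
of that arithmetic; the well volume, fish count, and dilution factor below are
placeholders, so substitute your own measurements.

# Volume needed per condition (step 5)
well_volume_mL = 1.0        # assumed volume of one well; use your measured value
fish_per_condition = 12     # e.g., 24 fish split across two conditions
extra = 1.10                # add ~10% so you don't run short
total_mL = fish_per_condition * well_volume_mL * extra
print("Prepare about %.1f mL per condition" % total_mL)

# Stock to add (step 6): a ~1250X dilution means 1 part stock per 1250 parts final volume
dilution_factor = 1250.0    # e.g., ethanol from the table
stock_uL = total_mL / dilution_factor * 1000.0   # mL converted to microliters
print("Add about %.0f uL of stock to the fish water" % stock_uL)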
Appendix A - Overview of analysis procedure: Locomotion is detected by
quantifying pixel differences in successive video frames:
Figure 1. Two successive frames from a video are shown. After subtracting the
pixels in image ‘Frame n’ from those in ‘Frame n+1’, the difference between the
frames can be visualized in the ‘Difference Image,’ and the magnitude and timing of
each movement can thus be recorded.
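A minimal sketch of this frame-subtraction step, written with OpenCV in Python, is
shown below. It is an illustration of the idea rather than the workshop’s own
scripts; the filename 'fish.mov' and the noise threshold of 25 are placeholders.

import cv2

cap = cv2.VideoCapture('fish.mov')           # placeholder video file
ok, prev = cap.read()                        # first frame ('Frame n')
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

motion = []
while True:
    ok, frame = cap.read()                   # next frame ('Frame n+1')
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)           # the 'Difference Image'
    diff = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)[1]   # suppress noise
    motion.append(int(diff.sum() / 255))     # number of pixels that changed
    prev = gray

cap.release()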
Figure 2. Regions of Interest. Regions of interest (ROIs) can be denoted on an
image from the video. The analysis program treats each ROI independently. Here we
have twelve adult fish, each in a separate container. The timing and magnitude of
each fish’s movement is recorded simultaneously but independently of the other fish.
Figure 3. Output format. The output of the analysis consists of an array in which
each row records the pixel difference between successive video frames. Here, the
individual in ROI #2 moved for about half a second, while no movement was detected
in the other ROIs.
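To make the connection between Figures 2 and 3 concrete, the sketch below sums the
difference image inside each ROI and records one value per ROI per frame, then
saves the array in .npy format. The rectangle coordinates, filename, and array
layout are assumptions for illustration, not output from the actual pipeline.

import cv2
import numpy as np

rois = [(10, 10, 60, 60), (80, 10, 60, 60), (150, 10, 60, 60)]  # (x, y, w, h) placeholders

cap = cv2.VideoCapture('fish.mov')           # placeholder video file
ok, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

rows = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev)
    # one summed pixel difference per ROI for this pair of frames
    rows.append([int(diff[y:y + h, x:x + w].sum()) for (x, y, w, h) in rois])
    prev = gray

cap.release()
data = np.array(rows)                        # rows = frame pairs, columns = ROIs
np.save('motion.npy', data)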
Appendix B – ideas for filming fish
Figure 4. The phone is held over the dish of larvae or arrayed adults by a bridge
connected by two supports. The image is centered on the screen of the phone and
aligned carefully.
Appendix C – Overview of computational pipeline
1) roiSelect.py: manually draw rectangular region(s)-of-interest (ROI) on an image
from the video feed. Several rectangles can be drawn, or a single one can be drawn
over a regularly-gridded group of regions, which can then be partitioned using
gridROI.py below. Usage: python roiSelect.py [opt: videoFile]. If no video file is
specified, an image is acquired from the computer’s webcam.
2) gridROI.py: partition a single rectangle drawn in roiSelect.py into a regularly-
spaced grid of individual regions. Usage: python gridROI.py #rows #columns
3) deltaPix.py: analysis of video feed. Pixel differences in successive video frames
are recorded from each ROI. Usage: python deltaPix.py [opt: videoFile]. If no
video file is specified, live video feed is initiated on the computer’s webcam and is
analyzed in real time. Data is saved in .npy (numpy) format.
4) analyzeMotion.py: plotting of movement data. Various choices are available for
visualizing the data. Each choice can be ‘commented’ on or off in the python code.
Usage: python analyzeMotion.py
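As a rough illustration of the last step (not the actual analyzeMotion.py), a few
lines of Matplotlib are enough to turn the saved array into a movement trace for
each ROI. The filename 'motion.npy' and the 30 frames-per-second rate are
assumptions; adjust them to match your own recording.

import numpy as np
import matplotlib.pyplot as plt

data = np.load('motion.npy')                 # rows = frame pairs, columns = ROIs
frame_rate = 30.0                            # assumed acquisition rate
time_s = np.arange(data.shape[0]) / frame_rate

for roi in range(data.shape[1]):
    plt.plot(time_s, data[:, roi], label='ROI %d' % (roi + 1))

plt.xlabel('Time (s)')
plt.ylabel('Pixel difference (movement)')
plt.legend()
plt.show()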
Appendix D – Installation of Software (Python, Numpy, Matplotlib, OpenCV)
1) Mac: Python, numpy, and matplotlib are already installed if you have OS X 10.9. To
install OpenCV, follow the instructions here:
http://www.jeffreythompson.org/blog/2013/08/22/update-installing-opencv-on-mac-mountain-lion/
2) PC: To install it all, follow the instructions “Installing OpenCV from Prebuilt
Libraries” here:
http://docs.opencv.org/trunk/doc/py_tutorials/py_setup/py_setup_in_windows/py_setup_in_windows.html
Matplotlib will complain about needing additional software. You can find all of these
here: http://www.lfd.uci.edu/~gohlke/pythonlibs/
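3) On either platform, a quick way to check that everything installed correctly is
to run the following from a terminal; if it prints a version number without errors,
you are ready to go:
python -c "import cv2, numpy, matplotlib; print(cv2.__version__)"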