
Events

SfN 2019 Satellite Symposium: Understanding nature's complexity with the UltraScope II

Tuesday, October 22, 2019

6:30–9:00 p.m.

Location: McCormick Place, N230b

Agenda

Chair: Thomas Pingel
Director of Sales & Marketing, LaVision BioTec/Miltenyi Biotec, Germany

Mapping the mesoscale structural plasticity of the brain using iDISCO+ and ClearMap
Nicolas Renier
Group Leader, Laboratory of Structural Plasticity, ICM Brain and Spine Institute; Principal Investigator, Inserm, Paris, France

Aggression reward and relapse: from behavior to whole brain
Sam A. Golden
Assistant Professor, Department of Biological Structure, University of Washington, Seattle, WA


SfN 2019 Dynamic Poster 325.12 (Simon Nilsson): Automated analysis of prosocial and aggressive behaviours using computer vision and machine learning

Monday, October 21, 2019

8:00 a.m.

Authors

*S. R. O. NILSSON, J. J. CHOONG, S. A. GOLDEN;
Biol. Structure, Univ. of Washington, Seattle, WA

Disclosures

S.R.O. Nilsson: None. J.J. Choong: None. S.A. Golden: None.

Abstract

Background. Disrupted social behaviour is a fundamental shared symptom of many neuropsychiatric disorders, including drug addiction, depression and PTSD. However, freely behaving mice are seldom considered in the experimental design of preclinical models. This is predominantly due to technical limitations preventing high-throughput, consistent, and unbiased scoring of freely-moving complex social interactions.
Method. We developed predictive classifiers of social and aggressive behaviors during mouse dyadic encounters. Single C57BL/6J mice were placed into the home-cage of a CD-1 mouse, and interactions were recorded under variable lighting conditions and at different resolutions and frame rates. We used DeepLabCut (Mathis et al., 2018, Nat Neurosci) to generate a model that tracks eight body parts on each of the two mice. We detected and reduced tracking inaccuracies and calculated a battery of diverse features (>100) based on body-part movements, distances, angles, sizes, and their deviations across rolling windows. We used the features in sklearn-based machine learning algorithms against multiple socially relevant targets (e.g., aggressive events, anogenital sniffing, tail rattling, pursuit, lateral threat display), and we visualized the tracking and the predictions with OpenCV.
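
The method paragraph above outlines the general pipeline: pose estimates for each body part, engineered features computed over rolling windows, and supervised classifiers trained against human frame-by-frame annotations. Below is a minimal sketch of that pipeline, assuming pose estimates have already been exported (e.g., from DeepLabCut) to a CSV with x/y columns per body part plus a human-scored target column; the file name, column names, window sizes, and parameters are illustrative assumptions, not the authors' actual code.

```python
# Illustrative sketch only: hypothetical file and column names, not the
# authors' pipeline. Assumes per-frame pose estimates and a human-scored
# "attack" column (1 = attack frame, 0 = no attack).
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("tracking_with_annotations.csv")  # hypothetical export

# Example feature: inter-animal nose-to-nose distance on each frame.
df["nose_distance"] = np.hypot(
    df["resident_nose_x"] - df["intruder_nose_x"],
    df["resident_nose_y"] - df["intruder_nose_y"],
)

# Example feature: frame-to-frame movement of the resident's centroid.
df["resident_movement"] = np.hypot(
    df["resident_centroid_x"].diff(), df["resident_centroid_y"].diff()
).fillna(0)

# Rolling-window statistics capture how features deviate over short time spans.
for window in (5, 15, 30):  # frames; illustrative window sizes
    roll = df["nose_distance"].rolling(window, min_periods=1)
    df[f"nose_distance_mean_{window}"] = roll.mean()
    df[f"nose_distance_std_{window}"] = roll.std().fillna(0)

# Train a random forest against the human frame-by-frame annotations.
feature_cols = [c for c in df.columns if c != "attack"]
X_train, X_test, y_train, y_test = train_test_split(
    df[feature_cols], df["attack"], test_size=0.2, shuffle=False
)
clf = RandomForestClassifier(n_estimators=500, n_jobs=-1)
clf.fit(X_train, y_train)
print(f"Held-out agreement with human scoring: {clf.score(X_test, y_test):.3f}")
```
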
Results. Model predictions were in excellent or good agreement with manual human frame-by-frame scoring. For example, random forest implementations based on re-sampled data predicted aggressive and tail rattling events with more than 95% accuracy. The model generalized well to new recording conditions.
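
The results paragraph above notes that the best-performing random forests were trained on re-sampled data, which typically means correcting the class imbalance between rare behavioural events and the far more numerous background frames. The sketch below shows one common approach (down-sampling the majority class before fitting, then scoring against the original class balance); it uses synthetic data and illustrative parameters, not the authors' features or exact re-sampling scheme.

```python
# Self-contained illustration with synthetic data standing in for the real
# feature matrix; the re-sampling strategy and all parameters are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

# Imbalanced toy data: ~1% "event" frames, ~99% background frames.
X, y = make_classification(
    n_samples=20_000, n_features=50, weights=[0.99, 0.01], random_state=0
)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

# Down-sample background frames so both classes are equally represented.
rng = np.random.default_rng(0)
event_idx = np.flatnonzero(y_train == 1)
background_idx = rng.choice(
    np.flatnonzero(y_train == 0), size=len(event_idx), replace=False
)
balanced_idx = np.concatenate([event_idx, background_idx])

clf = RandomForestClassifier(n_estimators=500, n_jobs=-1, random_state=0)
clf.fit(X_train[balanced_idx], y_train[balanced_idx])

# Report agreement with the held-out labels at their original class balance.
print(classification_report(y_test, clf.predict(X_test)))
```
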
Conclusion. The data support that complex social behaviors can be readily quantified in an unbiased, fast, and automated way in unmarked individual mice, using DeepLabCut for feature detection and our Python modules for machine learning.
