Tutorials
Installation
Installation of the Element requires an integrated development environment and database. Instructions to set up each of the components can be found on the User Instructions page. These instructions use the example workflow for Element DeepLabCut, which can be modified for a user's specific experimental requirements. This example workflow uses four Elements (Lab, Animal, Session, and DeepLabCut) to construct a complete pipeline, and is able to ingest experimental metadata and run model training and inference.
The DeepLabCut (DLC) website has a rich library of resources for downloading the software and understanding its various features. This includes getting started with their software (see links below).
Steps to run the Element
The Element assumes you:
- Have downloaded and installed DLC.
- Have a DLC project folder on your machine. You can declare a project either from the DLC GUI or via a terminal.
- Have labeled data in your DLC project folder. Again, this can be done via the GUI or a terminal (a minimal sketch of both steps follows this list).
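If you are starting from scratch, both steps can be run through the DLC Python API. A minimal sketch, with placeholder project, experimenter, and video names:

```python
import deeplabcut

# Declare a new DLC project from one or more videos (placeholder paths)
config_path = deeplabcut.create_new_project(
    "my_project",                   # project name (placeholder)
    "my_name",                      # experimenter (placeholder)
    ["/data/videos/session1.mp4"],  # video(s) to track
    copy_videos=True,
)

# Extract frames, then label them in the DLC GUI to build the training set
deeplabcut.extract_frames(config_path, mode="automatic", userfeedback=False)
deeplabcut.label_frames(config_path)
```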
With these steps in place, you can use the materials below to start model training and pose estimation. Training starts by configuring parameters in the `train` schema and launching training via the `ModelTraining` table. When you're happy with the state of a model, you can insert it into the `Model` table and pair it with videos to trigger pose estimation via the `PoseEstimationTask` table in the `model` schema. See Element Architecture for a full list of table functions.
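As a rough sketch of that flow, assuming the module names and table attributes of the example workflow (your keys, paths, and helper-method arguments may differ):

```python
from workflow_deeplabcut.pipeline import train, model  # example-workflow modules

# 1. Pair labeled videos with a training parameter set, then launch training
train.TrainingTask.insert1(
    dict(video_set_id=0, paramset_idx=0, training_id=0,
         project_path="my_dlc_project/"),           # placeholder key values
    skip_duplicates=True,
)
train.ModelTraining.populate()                      # runs DLC model training

# 2. Register the trained model
model.Model.insert_new_model(
    model_name="my_model",                          # placeholder name
    dlc_config="my_dlc_project/config.yaml",
    shuffle=1,
    trainingsetindex=0,
    model_description="example model",
)

# 3. Pair the model with a recorded video and trigger pose estimation
model.PoseEstimationTask.insert_estimation_task(
    dict(subject="subject6", session_datetime="2021-06-02 14:04:22",
         recording_id=1, model_name="my_model"),    # placeholder key values
    task_mode="trigger",
)
model.PoseEstimation.populate()                     # runs inference and ingests results
```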
Videos
The Element DeepLabCut tutorial gives an overview of the workflow files and notebooks as well as core concepts related to DeepLabCut.
Notebooks
Each of the notebooks in the workflow (download here) steps through ways to interact with the Element itself. For convenience, these notebooks are also rendered as part of this site. To try out Elements notebooks in an online Jupyter environment with access to example data, visit CodeBook. (DeepLabCut notebooks coming soon!)
- Data Download highlights how to use DataJoint tools to download a sample model for trying out the Element.
- Configure helps configure your local DataJoint installation to point to the correct database.
- Workflow Structure demonstrates the table architecture of the Element and key DataJoint basics for interacting with these tables.
- Process steps through adding data to these tables and launching key DeepLabCut features, like model training.
- Automate highlights the same steps as above, but utilizing all built-in automation tools.
- Visualization demonstrates how to fetch data from the Element to generate figures and label data.
- Drop schemas provides the steps for dropping all the tables to start fresh.
- NWB Export (coming soon!) will describe how to export into NWB files. For now, see the Data Export section below.
- Alternate Dataset does all of the above, but with a dataset from DeepLabCut.
Data Export to Neurodata Without Borders (NWB)
The `export/nwb.py` module calls DLC2NWB to save output generated by Element DeepLabCut as NWB files. The main function, `dlc_session_to_nwb`, contains a flag to control calling a parallel function in Element Session.
Before using, please install DLC2NWB:

```bash
pip install dlc2nwb
```
Then, call the export function using keys from the `PoseEstimation` table. A sketch of the call, assuming the `model` schema from the example workflow:

```python
from element_deeplabcut.export.nwb import dlc_session_to_nwb

# Fetch the key for one session's pose estimation results
key = (model.PoseEstimation & CONDITION).fetch1("KEY")
dlc_session_to_nwb(
    key,
    use_element_session=True,   # also pull session metadata via Element Session
    session_kwargs=SESSION_KWARGS,
)
```
Here, `CONDITION` should uniquely identify a session, and `SESSION_KWARGS` is a dictionary of any of the items described in the docstring of `element_session.export.nwb.session_to_nwb`.
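For illustration only, these might look like the following. The `SESSION_KWARGS` keys shown are assumptions; confirm the accepted items against that docstring.

```python
# Hypothetical values: CONDITION restricts PoseEstimation to one session;
# the SESSION_KWARGS keys below are assumptions, not a guaranteed API
CONDITION = 'subject="subject6"'
SESSION_KWARGS = dict(
    lab_key={"lab": "LabA"},
    project_key={"project": "ProjA"},
    protocol_key={"protocol": "ProtA"},
)
```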
As DLC2NWB does not currently offer a separate function for generating `PoseEstimation` objects (see ndx-pose), the current solution is to allow DLC2NWB to write to disk, and optionally rewrite this file using metadata provided by the export function in Element Session.
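To sanity-check the exported file, it can be opened with pynwb; the path below is a placeholder for wherever DLC2NWB wrote the file:

```python
from pynwb import NWBHDF5IO

# Placeholder path to the exported file
with NWBHDF5IO("/data/sub-subject6_ses-20210602.nwb", "r") as io:
    nwbfile = io.read()
    print(nwbfile.processing)  # pose estimation typically lands in a processing module
```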