Welcome
Welcome! This is an interactive walkthrough of our publication “Neuroscout: a unified platform for generalizable and reproducible fMRI research”. Here you can visualize and re-run the code we used to create analyses and figures.
In the paper, we validate the Neuroscout platform by replicating established effects from cognitive neuroscience using automatically extracted features in over 30 naturalistic datasets. We then use meta-analysis to synthesize single-dataset findings, resulting in more robust and generalizable estimates. We also showcase more exploratory applications in two domains (face processing & natural language perception) that demonstrate how Neuroscout can be used to conduct more generalizable naturalistic fMRI research.
These analyses require specifying and estimating models at the level of individual datasets/tasks, and the outputs of these analyses are used as inputs to meta-analyses. This is reflected by the structure of the GitHub repository and of this book.
Re-running the analyses
The analyses follow the structure of the figures in the manuscript. Most analyses require first estimating models for individual datasets (using Neuroscout) and then performing a meta-analysis on the resulting maps (using NiMARE).
You can use this resource to simply visualize the analyses, or to re-run them and recreate the figures.
Note that, if you want to re-run meta-analyses, you do not need to re-run the dataset-level models. All statistical maps are uploaded to NeuroVault and can be downloaded using our meta-analysis code. If you wish to recreate and re-estimate dataset-level models, you will have to do so locally.
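For orientation, here is a minimal Python sketch of what downloading those dataset-level maps could look like. It is not the paper's meta-analysis code: the collection ID is a placeholder, and nilearn's NeuroVault fetcher is simply one convenient way to retrieve and inspect the images.
# Minimal, illustrative sketch (not the workflow used in the meta-analysis notebooks)
from nilearn.datasets import fetch_neurovault_ids
from nilearn.image import mean_img
from nilearn.plotting import plot_stat_map

# 12345 is a placeholder; use the collection IDs referenced in the meta-analysis code
maps = fetch_neurovault_ids(collection_ids=[12345])

# naive voxel-wise average of the downloaded maps, purely for a quick visual check
avg = mean_img(maps.images)
plot_stat_map(avg, title="Mean of downloaded dataset-level maps")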
Cloud computing
Notebooks can be re-run in the cloud using mybinder by clicking on the rocket icon at the top of each notebook page. This is probably the easiest option, as you don’t have to install or download anything. You can also easily access all analyses at:
Software containers
If you want to re-run the analyses and recreate the figures locally, you can use our software containers to recreate a suitable environment. More precisely, you can obtain the corresponding Docker image via:
docker pull neuroscout/neuroscout-paper:preprint
and then start it:
docker run -it --rm -p 8888:8888 neuroscout/neuroscout-paper:preprint
Subsequently, start a jupyter notebook server via:
jupyter-notebook --port=8888 --no-browser --ip=0.0.0.0
which should provide you with a link that looks roughly like this:
http://127.0.0.1:8888/?token=d47d101bcb9d1233471aa4fb21240ff74d520887d4c0e0b6
If you click on this link or copy-paste it into your browser, you should see a jupyter notebook server that allows you to navigate these resources.
Local python environment
Finally, if you want to re-run the analyses and recreate the figures locally without software containers, you can do so using a python environment. For this to work, you first need to download the repository with the notebooks and other necessary files from GitHub. It is recommended to create a new python environment through e.g. conda to avoid installation and dependency issues. For example:
conda create -n neuroscout_analyses python==3.8
which you can then activate and, after navigating to the downloaded repository, install the required libraries via:
conda activate neuroscout_analyses
cd /path/to/neuroscout-paper
pip install -r requirements.txt
(NB: you need to run the above commands line by line and replace the /path/to part with the path you downloaded the neuroscout-paper repository to.)
Subsequently, start a jupyter notebook server via:
jupyter-notebook
which should provide you with a link that looks roughly like this:
http://127.0.0.1:8888/?token=d47d101bcb9d1233471aa4fb21240ff74d520887d4c0e0b6
If you click on this link or copy-paste it into your browser, you should see a jupyter notebook server that allows you to re-run the analyses and recreate the figures through the dedicated python environment created above.
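Once the environment is active and the requirements are installed, a quick sanity check such as the one below can confirm that the core packages import correctly. The package names are assumptions based on the tools mentioned above (pyns for Neuroscout, NiMARE for meta-analysis, nilearn for imaging utilities); requirements.txt in the repository is the authoritative list.
import importlib

# the package list below is an assumption; check requirements.txt for the full set
for pkg in ["pyns", "nimare", "nilearn"]:
    mod = importlib.import_module(pkg)  # raises ImportError if the package is missing
    print(pkg, getattr(mod, "__version__", "unknown version"))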
Feedback & Questions
If you have any feedback, don’t hesitate to get in touch! We also support public reviews and comments through a hypothes.is plugin, with which you can interact by clicking on the arrow at the top right side of the page.