Merge pull request #1 from nilearn/master
getting up-to-date
pbellec authored Oct 14, 2018
2 parents b4f71e5 + 149dde2 commit 93016ff
Showing 166 changed files with 5,344 additions and 1,786 deletions.
63 changes: 63 additions & 0 deletions .circleci/config.yml
@@ -0,0 +1,63 @@
version: 2

jobs:
  build:
    docker:
      - image: circleci/python:3.6
    environment:
      DISTRIB: "conda"
      PYTHON_VERSION: "3.6"
      NUMPY_VERSION: "*"
      SCIPY_VERSION: "*"
      SCIKIT_LEARN_VERSION: "*"
      MATPLOTLIB_VERSION: "*"

    steps:
      - checkout
      # Get rid of existing virtualenvs on circle ci as they conflict with conda.
      # Trick found here:
      # https://discuss.circleci.com/t/disable-autodetection-of-project-or-application-of-python-venv/235/10
      - run: cd && rm -rf ~/.pyenv && rm -rf ~/virtualenvs
      # We need to remove conflicting texlive packages.
      - run: sudo -E apt-get -yq remove texlive-binaries --purge
      # Install the packages required for the `make -C doc check` command to work.
      - run: sudo -E apt-get -yq update
      - run: sudo -E apt-get -yq --no-install-suggests --no-install-recommends --force-yes install dvipng texlive-latex-base texlive-latex-extra
      - restore_cache:
          key: v1-packages+datasets-{{ .Branch }}
      - run: wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda.sh
      - run: chmod +x ~/miniconda.sh && ~/miniconda.sh -b
      - run:
          name: Setup conda path in env variables
          command: |
            echo 'export PATH="$HOME/miniconda3/bin:$PATH"' >> $BASH_ENV
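            # $BASH_ENV is sourced at the start of every subsequent step, so this PATH change persists for the rest of the job.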
      - run:
          name: Create conda env
          command: |
            conda create -n testenv python=3.6 numpy scipy scikit-learn matplotlib pandas \
              lxml mkl sphinx pillow -yq
            conda install -n testenv nibabel -c conda-forge -yq
      - run:
          name: Running CircleCI test (make html)
          command: |
            source activate testenv
            pip install -e .
            set -o pipefail && cd doc && make html-strict 2>&1 | tee ~/log.txt
          no_output_timeout: 5h
      - save_cache:
          key: v1-packages+datasets-{{ .Branch }}
          paths:
            - $HOME/nilearn_data
            - $HOME/miniconda3

      - store_artifacts:
          path: doc/_build/html
      - store_artifacts:
          path: coverage
      - store_artifacts:
          path: $HOME/log.txt
          destination: log.txt

2 changes: 2 additions & 0 deletions .gitignore
@@ -33,3 +33,5 @@ tags
*.tgz

.idea/
+
+doc/themes/nilearn/static/jquery.js
26 changes: 16 additions & 10 deletions .travis.yml
@@ -20,27 +20,33 @@ matrix:
  include:
    # Oldest supported versions (with neurodebian)
    - env: DISTRIB="conda" PYTHON_VERSION="2.7"
-          NUMPY_VERSION="1.8.2" SCIPY_VERSION="0.14"
-          SCIKIT_LEARN_VERSION="0.15.1" MATPLOTLIB_VERSION="1.3.1"
-          PANDAS_VERSION="0.13.0" NIBABEL_VERSION="2.0.2" COVERAGE="true"
+          NUMPY_VERSION="1.11.2" SCIPY_VERSION="0.17"
+          SCIKIT_LEARN_VERSION="0.18" MATPLOTLIB_VERSION="1.5.1"
+          PANDAS_VERSION="0.18.0" NIBABEL_VERSION="2.0.2" COVERAGE="true"
    # Oldest supported versions without matplotlib
    - env: DISTRIB="conda" PYTHON_VERSION="2.7"
-          NUMPY_VERSION="1.8.2" SCIPY_VERSION="0.14"
-          SCIKIT_LEARN_VERSION="0.15"
+          NUMPY_VERSION="1.11.2" SCIPY_VERSION="0.17"
+          SCIKIT_LEARN_VERSION="0.18"
    # Fake Ubuntu Xenial (Travis doesn't support Xenial yet)
    - env: DISTRIB="conda" PYTHON_VERSION="2.7"
-          NUMPY_VERSION="1.11" SCIPY_VERSION="0.17"
-          SCIKIT_LEARN_VERSION="0.17"
+          NUMPY_VERSION="1.13" SCIPY_VERSION="0.19"
+          SCIKIT_LEARN_VERSION="0.18.1"
           NIBABEL_VERSION="2.0.2"
    # Python 3.4 with intermediary versions
    - env: DISTRIB="conda" PYTHON_VERSION="3.4"
-          NUMPY_VERSION="1.8" SCIPY_VERSION="0.14"
-          SCIKIT_LEARN_VERSION="0.15" MATPLOTLIB_VERSION="1.4"
+          NUMPY_VERSION="1.11.2" SCIPY_VERSION="0.17"
+          SCIKIT_LEARN_VERSION="0.18" MATPLOTLIB_VERSION="1.5.1"
    # Most recent versions
    - env: DISTRIB="conda" PYTHON_VERSION="3.5"
           NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
           SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
-    # FLAKE8 linting on diff wrt common ancestor with upstream/master
+          LXML_VERSION="*"
+    - env: DISTRIB="conda" PYTHON_VERSION="3.7"
+          NUMPY_VERSION="*" SCIPY_VERSION="*" PANDAS_VERSION="*"
+          SCIKIT_LEARN_VERSION="*" MATPLOTLIB_VERSION="*" COVERAGE="true"
+          LXML_VERSION="*"
+
+    # FLAKE8 linting on diff wrt common ancestor with upstream/master
    # Note: the python value is only there to trigger allow_failures
    - python: 2.7
      env: DISTRIB="conda" PYTHON_VERSION="2.7" FLAKE8_VERSION="*" SKIP_TESTS="true"
8 changes: 8 additions & 0 deletions AUTHORS.rst
@@ -24,6 +24,8 @@ particular:
* Andres Hoyos Idrobo
* Salma Bougacha
* Mehdi Rahim
+* Sylvain Lanuzel
+* `Kshitij Chawla <https://github.com/kchawla-pi>`_

Many have also contributed outside of Parietal, notably:

@@ -43,6 +45,8 @@ Mehdi Rahim, Philippe Gervais were paid by the `NiConnect
project, funded by the French `Investissement d'Avenir
<http://www.gouvernement.fr/investissements-d-avenir-cgi>`_.

+NiLearn is also supported by `DigiCosme <https://digicosme.lri.fr>`_ |digicosme logo|
+
.. _citing:

Citing nilearn
@@ -69,3 +73,7 @@ guarantee the future of the toolkit, if you use it, please cite it.
See the scikit-learn documentation on `how to cite
<http://scikit-learn.org/stable/about.html#citing-scikit-learn>`_.


+.. |digicosme logo| image:: logos/digi-saclay-logo-small.png
+   :height: 25
+   :alt: DigiCosme Logo
8 changes: 4 additions & 4 deletions README.rst
@@ -40,13 +40,13 @@ The required dependencies to use the software are:

* Python >= 2.7,
* setuptools
-* Numpy >= 1.6.1
-* SciPy >= 0.14
-* Scikit-learn >= 0.15
+* Numpy >= 1.11
+* SciPy >= 0.17
+* Scikit-learn >= 0.18
* Nibabel >= 2.0.2

If you are using nilearn plotting functionalities or running the
-examples, matplotlib >= 1.1.1 is required.
+examples, matplotlib >= 1.5.1 is required.

If you want to run the tests, you need nose >= 1.2.1 and coverage >= 3.6.

47 changes: 0 additions & 47 deletions circle.yml

This file was deleted.

10 changes: 0 additions & 10 deletions continuous_integration/circle_ci_test_doc.sh

This file was deleted.

7 changes: 4 additions & 3 deletions continuous_integration/install.sh
@@ -35,7 +35,8 @@ print_conda_requirements() {
    # - for scikit-learn, SCIKIT_LEARN_VERSION is used
    TO_INSTALL_ALWAYS="pip nose"
    REQUIREMENTS="$TO_INSTALL_ALWAYS"
-   TO_INSTALL_MAYBE="python numpy scipy matplotlib scikit-learn pandas flake8"
+   TO_INSTALL_MAYBE="python numpy scipy matplotlib scikit-learn pandas \
+                     flake8 lxml"
    for PACKAGE in $TO_INSTALL_MAYBE; do
        # Capitalize package name and add _VERSION
        PACKAGE_VERSION_VARNAME="${PACKAGE^^}_VERSION"
@@ -61,10 +62,10 @@ create_new_conda_env() {

    # Use the miniconda installer for faster download / install of conda
    # itself
-   wget http://repo.continuum.io/miniconda/Miniconda-latest-Linux-x86_64.sh \
+   wget http://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh \
        -O ~/miniconda.sh
    chmod +x ~/miniconda.sh && ~/miniconda.sh -b
-   export PATH=$HOME/miniconda2/bin:$PATH
+   export PATH=$HOME/miniconda3/bin:$PATH
    echo $PATH
    conda update --quiet --yes conda

29 changes: 26 additions & 3 deletions doc/conf.py
@@ -14,9 +14,30 @@

import sys
import os
+import shutil
import sphinx
from distutils.version import LooseVersion

+# jquery is included in plotting package data because it is needed for
+# interactive plots. It is also needed by the documentation, so we copy
+# it to the themes/nilearn/static folder.
+shutil.copy(
+    os.path.join(os.path.dirname(os.path.dirname(__file__)),
+                 'nilearn', 'plotting', 'data', 'js', 'jquery.min.js'),
+    os.path.join(os.path.dirname(__file__), 'themes', 'nilearn', 'static',
+                 'jquery.js'))
+
+
+# -- Parallel computing ------------------------------------------------------
+try:
+    from sklearn.utils import parallel_backend, cpu_count
+    # The 'threading' backend and the floor of 4 workers are assumptions;
+    # the try/except keeps the doc build working when these helpers are
+    # missing from the installed scikit-learn.
+    parallel_backend('threading', n_jobs=max(cpu_count(), 4))
+except Exception:
+    pass
+
+# ----------------------------------------------------------------------------
+

# If extensions (or modules to document with autodoc) are in another
# directory, add these directories to sys.path here. If the directory
# is relative to the documentation root, use os.path.abspath to make it
@@ -294,10 +315,10 @@
    'reference_url' : {
        'nilearn': None,
        'matplotlib': 'http://matplotlib.org',
-       'numpy': 'http://docs.scipy.org/doc/numpy-1.6.0',
-       'scipy': 'http://docs.scipy.org/doc/scipy-0.11.0/reference',
+       'numpy': 'http://docs.scipy.org/doc/numpy-1.11.0',
+       'scipy': 'http://docs.scipy.org/doc/scipy-0.17.0/reference',
        'nibabel': 'http://nipy.org/nibabel',
-       'sklearn': 'http://scikit-learn.org/0.17/',
+       'sklearn': 'http://scikit-learn.org/0.18/',
        'pandas': 'http://pandas.pydata.org'}
    }

@@ -319,6 +340,8 @@ def touch_example_backreferences(app, what, name, obj, options, lines):

# Add the 'copybutton' javascript, to hide/show the prompt in code
# examples
+
+
def setup(app):
    app.add_javascript('copybutton.js')
    app.connect('autodoc-process-docstring', touch_example_backreferences)
32 changes: 20 additions & 12 deletions doc/connectivity/functional_connectomes.rst
@@ -205,29 +205,37 @@ In the case of the MSDL atlas
with MNI coordinates for each region (see for instance example:
:ref:`sphx_glr_auto_examples_03_connectivity_plot_probabilistic_atlas_extraction.py`).

-.. image:: ../auto_examples/03_connectivity/images/sphx_glr_plot_probabilistic_atlas_extraction_002.png
-   :target: ../auto_examples/03_connectivity/plot_probabilistic_atlas_extraction.html
-
..
   For doctesting
   >>> from nilearn import datasets
   >>> atlas_filename = datasets.fetch_atlas_msdl().maps # doctest: +SKIP

-For another atlas this information can be computed for each region with
-the :func:`nilearn.plotting.find_xyz_cut_coords` function
-(see example:
-:ref:`sphx_glr_auto_examples_03_connectivity_plot_multi_subject_connectome.py`)::
+As you can see, the correlation matrix gives a very "full" graph: every
+node is connected to every other one. This is because it also captures
+indirect connections. In the next section we will see how to focus on
+direct connections only.

-   >>> from nilearn import image, plotting
-   >>> atlas_region_coords = [plotting.find_xyz_cut_coords(img) for img in image.iter_img(atlas_filename)] # doctest: +SKIP
+A functional connectome: extracting coordinates of regions
+==========================================================
+For atlases without readily available label coordinates, center coordinates
+can be computed for each region of a hard parcellation or probabilistic atlas.
+
+* For hard parcellation atlases (e.g. :func:`nilearn.datasets.fetch_atlas_destrieux_2009`),
+  use the :func:`nilearn.plotting.find_parcellation_cut_coords`
+  function. See example:
+  :ref:`sphx_glr_auto_examples_03_connectivity_plot_atlas_comparison.py`
+
+* For probabilistic atlases (e.g. :func:`nilearn.datasets.fetch_atlas_msdl`), use the
+  :func:`nilearn.plotting.find_probabilistic_atlas_cut_coords` function.
+  See example: :ref:`sphx_glr_auto_examples_03_connectivity_plot_multi_subject_connectome.py`::

+.. image:: ../auto_examples/03_connectivity/images/sphx_glr_plot_probabilistic_atlas_extraction_002.png
+   :target: ../auto_examples/03_connectivity/plot_probabilistic_atlas_extraction.html
+    >>> from nilearn import plotting
+    >>> atlas_region_coords = plotting.find_probabilistic_atlas_cut_coords(atlas_filename) # doctest: +SKIP
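
For the hard-parcellation case in the first bullet, a minimal sketch (not part
of this changeset; it assumes the Destrieux fetcher named there)::

    >>> from nilearn import datasets, plotting  # doctest: +SKIP
    >>> destrieux = datasets.fetch_atlas_destrieux_2009()  # doctest: +SKIP
    >>> coords = plotting.find_parcellation_cut_coords(destrieux.maps)  # doctest: +SKIP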

-As you can see, the correlation matrix gives a very "full" graph: every
-node is connected to every other one. This is because it also captures
-indirect connections. In the next section we will see how to focus on
-only direct connections.

|
10 changes: 5 additions & 5 deletions doc/decoding/decoding_intro.rst
@@ -224,10 +224,10 @@ in a `K-Fold strategy
    >>> cv = 2

There is a specific function,
-:func:`sklearn.cross_validation.cross_val_score` that computes for you
+:func:`sklearn.model_selection.cross_val_score` that computes for you
the score for the different folds of cross-validation::

-    >>> from sklearn.cross_validation import cross_val_score # doctest: +SKIP
+    >>> from sklearn.model_selection import cross_val_score # doctest: +SKIP
    >>> cv_scores = cross_val_score(svc, fmri_masked, target, cv=5) # doctest: +SKIP

`cv=5` stipulates a 5-fold cross-validation. Note that this function is located
@@ -267,7 +267,7 @@ caveats, and guidelines*, Neuroimage 2017).
Here, in the Haxby example, we are going to leave a session out, in order
to have a test set independent from the train set. For this, we are going
to use the session label, present in the behavioral data file, and
-:class:`sklearn.cross_validation.LeaveOneLabelOut`.
+:class:`sklearn.model_selection.LeaveOneGroupOut`.
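
A minimal sketch of how the new class slots in (not part of this changeset;
``svc``, ``fmri_masked`` and ``target`` are the objects built earlier, and
``session`` is assumed to hold the session labels from the behavioral file)::

    >>> from sklearn.model_selection import LeaveOneGroupOut, cross_val_score  # doctest: +SKIP
    >>> cv = LeaveOneGroupOut()  # doctest: +SKIP
    >>> cv_scores = cross_val_score(svc, fmri_masked, target, cv=cv, groups=session)  # doctest: +SKIP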

.. note::

@@ -320,9 +320,9 @@ at chance, is to use a *"dummy"* classifier,

**Permutation testing**: A more controlled way, but slower, is to do
permutation testing on the labels, with
-:func:`sklearn.cross_validation.permutation_test_score`::
+:func:`sklearn.model_selection.permutation_test_score`::

-    >>> from sklearn.cross_validation import permutation_test_score
+    >>> from sklearn.model_selection import permutation_test_score
    >>> null_cv_scores = permutation_test_score(svc, fmri_masked, target, cv=cv) # doctest: +SKIP

|
2 changes: 1 addition & 1 deletion doc/decoding/estimator_choice.rst
@@ -115,7 +115,7 @@ not perform as well on new data.
   :scale: 60

With scikit-learn, nested cross-validation is done via
-:class:`sklearn.grid_search.GridSearchCV`. It is unfortunately time
+:class:`sklearn.model_selection.GridSearchCV`. It is unfortunately time
consuming, but the ``n_jobs`` argument can spread the load on multiple
CPUs.
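
A minimal sketch of that pattern (not part of this changeset; the SVC
estimator and parameter grid are illustrative assumptions, and ``fmri_masked``
and ``target`` are as in the decoding introduction)::

    >>> from sklearn.model_selection import GridSearchCV  # doctest: +SKIP
    >>> from sklearn.svm import SVC  # doctest: +SKIP
    >>> grid = GridSearchCV(SVC(), param_grid={'C': [0.1, 1.0, 10.0]}, cv=5, n_jobs=4)  # doctest: +SKIP
    >>> grid.fit(fmri_masked, target)  # doctest: +SKIP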

2 changes: 1 addition & 1 deletion doc/decoding/searchlight.rst
@@ -113,7 +113,7 @@ Kriegskorte et al. use a 5.6mm radius because it yielded the best detection
performance in their simulation.

.. literalinclude:: ../../examples/02_decoding/plot_haxby_searchlight.py
-   :start-after: import nilearn.decoding
+   :start-after: cv = KFold(n_splits=4)
    :end-before: # F-scores computation

Visualization
Expand Down