[pre-commit.ci] pre-commit autoupdate (#303)
* [pre-commit.ci] pre-commit autoupdate

updates:
- [github.com/astral-sh/ruff-pre-commit: v0.5.7 → v0.6.3](astral-sh/ruff-pre-commit@v0.5.7...v0.6.3)

* [pre-commit.ci] auto fixes from pre-commit.com hooks

for more information, see https://pre-commit.ci

* ignore linting old notebooks

* update README

---------

Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
Co-authored-by: Richard Preen <[email protected]>
pre-commit-ci[bot] and rpreen authored Sep 3, 2024
1 parent 368ee38 commit 10b4ac0
Showing 11 changed files with 54 additions and 54 deletions.
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -33,7 +33,7 @@ repos:
# Ruff, the Python auto-correcting linter/formatter written in Rust
- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.5.7
rev: v0.6.3
hooks:
- id: ruff
args: ["--fix", "--show-fixes"]
13 changes: 8 additions & 5 deletions README.md
@@ -1,11 +1,12 @@
# SACRO-ML

[![License](https://img.shields.io/badge/license-MIT-blue.svg?style=flat)](https://opensource.org/licenses/MIT)
[![Latest Version](https://img.shields.io/github/v/release/AI-SDC/SACRO-ML?style=flat)](https://github.com/AI-SDC/SACRO-ML/releases)
[![DOI](https://zenodo.org/badge/518801511.svg)](https://zenodo.org/badge/latestdoi/518801511)
[![codecov](https://codecov.io/gh/AI-SDC/SACRO-ML/branch/main/graph/badge.svg?token=AXX2XCXUNU)](https://codecov.io/gh/AI-SDC/SACRO-ML)
[![PyPI package](https://img.shields.io/pypi/v/sacroml.svg)](https://pypi.org/project/sacroml)
[![Python versions](https://img.shields.io/pypi/pyversions/sacroml.svg)](https://pypi.org/project/sacroml)

# SACRO-ML

A collection of tools and resources for managing the [statistical disclosure control](https://en.wikipedia.org/wiki/Statistical_disclosure_control) of trained [machine learning](https://en.wikipedia.org/wiki/Machine_learning) models. For a brief introduction, see [Smith et al. (2022)](https://doi.org/10.48550/arXiv.2212.01233).

The `sacroml` package provides:
@@ -14,8 +15,6 @@ The `sacroml` package provides:

## Installation

[![PyPI package](https://img.shields.io/pypi/v/sacroml.svg)](https://pypi.org/project/sacroml)

Install `sacroml` and manually copy the [`examples`](examples/).

To install only the base package, which includes the attacks used for assessing privacy:
@@ -35,10 +34,14 @@ Note: macOS users may need to install libomp due to a dependency on XGBoost:
$ brew install libomp
```

## Running
## Usage

See the [`examples`](examples/).

## Documentation

See [API documentation](https://ai-sdc.github.io/SACRO-ML/).

## Acknowledgement

This work was funded by UK Research and Innovation under Grant Numbers MC_PC_21033 and MC_PC_23006 as part of Phase 1 of the [DARE UK](https://dareuk.org.uk) (Data and Analytics Research Environments UK) programme, delivered in partnership with Health Data Research UK (HDR UK) and Administrative Data Research UK (ADR UK). The specific projects were Semi-Automatic checking of Research Outputs (SACRO; MC_PC_23006) and Guidelines and Resources for AI Model Access from TrusTEd Research environments (GRAIMATTER; MC_PC_21033).­This project has also been supported by MRC and EPSRC [grant number MR/S010351/1]: PICTURES.
13 changes: 4 additions & 9 deletions examples/notebooks/example-notebook-SVC.ipynb
@@ -17,16 +17,11 @@
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import sys\n",
"import pylab as plt\n",
"import numpy as np\n",
"import logging\n",
"from sklearn.svm import SVC\n",
"from sklearn.linear_model import LogisticRegression\n",
"import os\n",
"\n",
"import numpy as np\n",
"from sklearn import datasets\n",
"from os.path import expanduser\n",
"\n",
"# next few commented out lines are for developers only\n",
"# ROOT_DIR = os.path.dirname(os.path.dirname(os.path.abspath(\"\")))\n",
@@ -191,7 +186,7 @@
],
"source": [
"target_json = os.path.normpath(\"testSaveSVC/target.json\")\n",
"with open(target_json, \"r\") as f:\n",
"with open(target_json) as f:\n",
" print(f.read())"
]
},
@@ -282,7 +277,7 @@
],
"source": [
"target_json = os.path.normpath(\"testSaveSVC/target.json\")\n",
"with open(target_json, \"r\") as f:\n",
"with open(target_json) as f:\n",
" print(f.read())"
]
},
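The repeated `open(target_json, "r")` → `open(target_json)` edits in this notebook drop an explicit read mode that is already `open()`'s default, the kind of redundant-mode fix ruff's pyupgrade-derived rules apply automatically. A minimal, self-contained sketch of the before/after behaviour (using a throwaway temp file rather than the notebook's `target.json`):

```python
import json
import tempfile

# Write a small stand-in file so the example runs anywhere.
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as tmp:
    json.dump({"model_name": "SVC"}, tmp)
    path = tmp.name

# Before the fix: explicit "r", which is already the default mode.
with open(path, "r") as f:
    before = f.read()

# After the fix: the redundant mode argument is dropped; behaviour is identical.
with open(path) as f:
    after = f.read()

assert before == after
print(after)
```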
11 changes: 5 additions & 6 deletions examples/notebooks/example-notebook-decisiontree.ipynb
@@ -17,7 +17,6 @@
"metadata": {},
"outputs": [],
"source": [
"import sys\n",
"import os\n",
"\n",
"# from os.path import expanduser\n",
@@ -91,8 +90,8 @@
"metadata": {},
"outputs": [],
"source": [
"from sklearn.tree import plot_tree\n",
"import matplotlib.pyplot as plt"
"import matplotlib.pyplot as plt\n",
"from sklearn.tree import plot_tree"
]
},
{
@@ -229,8 +228,8 @@
"metadata": {},
"outputs": [],
"source": [
"from aisdc.safemodel.safemodel import SafeModel\n",
"from aisdc.safemodel.classifiers import SafeDecisionTreeClassifier"
"from aisdc.safemodel.classifiers import SafeDecisionTreeClassifier\n",
"from aisdc.safemodel.safemodel import SafeModel"
]
},
{
@@ -894,7 +893,7 @@
],
"source": [
"target_json = os.path.normpath(\"hacked_unsafe/target.json\")\n",
"with open(target_json, \"r\") as f:\n",
"with open(target_json) as f:\n",
" print(f.read())"
]
},
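The remaining edits in this notebook are import re-ordering: ruff's isort-style rules group imports as standard library, then third-party, then first-party, alphabetised within each group. A sketch of the layout these hunks converge on (assuming `aisdc` is installed, as the notebook requires):

```python
# Standard library imports come first.
import os

# Third-party imports next, sorted alphabetically by module.
import matplotlib.pyplot as plt
from sklearn.tree import plot_tree

# First-party (project) imports last.
from aisdc.safemodel.classifiers import SafeDecisionTreeClassifier
from aisdc.safemodel.safemodel import SafeModel
```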
27 changes: 11 additions & 16 deletions examples/notebooks/example-notebook-keras.ipynb
@@ -16,7 +16,6 @@
"metadata": {},
"outputs": [],
"source": [
"import sys\n",
"import os\n",
"\n",
"\n",
@@ -38,30 +37,26 @@
"metadata": {},
"outputs": [],
"source": [
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"\n",
"%matplotlib inline\n",
"\n",
"# Scikit-learn utils\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.datasets import make_classification, make_moons\n",
"from sklearn.metrics import ConfusionMatrixDisplay\n",
"from sklearn.metrics import confusion_matrix, classification_report, roc_curve, auc\n",
"\n",
"# Tensorflow imports\n",
"import tensorflow as tf\n",
"from tensorflow.keras.models import Model\n",
"from tensorflow.keras.layers import Input, Dense, Dropout\n",
"import tensorflow_privacy as tf_privacy\n",
"from tensorflow_privacy.privacy.analysis import compute_dp_sgd_privacy\n",
"\n",
"# Classifiers for attack models\n",
"from sklearn.linear_model import LogisticRegression\n",
"from sklearn.neural_network import MLPClassifier\n",
"\n",
"# Safe Keras\n",
"from aisdc.safemodel.classifiers import SafeKerasModel\n",
"from sklearn.datasets import make_classification\n",
"\n",
"# Classifiers for attack models\n",
"from sklearn.metrics import (\n",
" classification_report,\n",
" confusion_matrix,\n",
")\n",
"from sklearn.model_selection import train_test_split\n",
"from tensorflow.keras.layers import Dense, Input\n",
"\n",
"# set tensorflow messages to warning level\n",
"tf.get_logger().setLevel(\"WARNING\")"
@@ -614,7 +609,7 @@
],
"source": [
"target_json = os.path.normpath(\"safe1/target.json\")\n",
"with open(target_json, \"r\") as f:\n",
"with open(target_json) as f:\n",
" print(f.read())"
]
},
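Besides re-sorting, the keras hunk also drops imports the cell never uses (`sys`, `tensorflow_privacy`, several sklearn classifiers and metrics), consistent with ruff's unused-import rule. A small stand-alone sketch of the same cleanup, keeping only the names that are actually referenced:

```python
# Dropped in the cleanup (never referenced below):
#   import sys
#   from sklearn.metrics import auc, roc_curve

# Kept: the imports the cell actually uses.
from sklearn.metrics import classification_report, confusion_matrix

y_true = [0, 1, 1, 0, 1]
y_pred = [0, 1, 0, 0, 1]

print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred))
```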
11 changes: 4 additions & 7 deletions examples/notebooks/example-notebook-randomforest.ipynb
@@ -15,7 +15,6 @@
"metadata": {},
"outputs": [],
"source": [
"import sys\n",
"import os\n",
"\n",
"##Commented out lines below applies to developers only\n",
@@ -90,8 +89,8 @@
"metadata": {},
"outputs": [],
"source": [
"from sklearn.tree import plot_tree\n",
"import matplotlib.pyplot as plt"
"import matplotlib.pyplot as plt\n",
"from sklearn.tree import plot_tree"
]
},
{
@@ -110,7 +109,6 @@
"metadata": {},
"outputs": [],
"source": [
"from aisdc.safemodel.safemodel import SafeModel\n",
"from aisdc.safemodel.classifiers import SafeRandomForestClassifier"
]
},
@@ -247,7 +245,7 @@
],
"source": [
"target_json = os.path.normpath(\"testSaveRF/target.json\")\n",
"with open(target_json, \"r\") as f:\n",
"with open(target_json) as f:\n",
" print(f.read())"
]
},
@@ -278,7 +276,6 @@
}
],
"source": [
"from aisdc.safemodel.safemodel import SafeModel\n",
"from aisdc.safemodel.classifiers import SafeRandomForestClassifier\n",
"\n",
"safeRFModel = SafeRandomForestClassifier(n_estimators=100) # (criterion=\"entropy\")\n",
@@ -352,7 +349,7 @@
],
"source": [
"target_json = os.path.normpath(\"testSaveRF/target.json\")\n",
"with open(target_json, \"r\") as f:\n",
"with open(target_json) as f:\n",
" print(f.read())"
]
},
@@ -41,13 +41,14 @@
"source": [
"import random\n",
"from itertools import product\n",
"\n",
"import numpy as np\n",
"\n",
"np.random.seed(1234)\n",
"random.seed(12345)\n",
"\n",
"from scipy.stats import poisson\n",
"import pandas as pd\n",
"from scipy.stats import poisson\n",
"from sklearn.svm import SVC"
]
},
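This risk-example hunk only adds a blank line after the `itertools` import and re-orders the `pandas`/`scipy` imports, leaving the notebook's seeding pattern intact: both NumPy's and the standard library's random generators are seeded before any sampling, so repeated runs draw the same synthetic data. A short sketch of that pattern:

```python
import random

import numpy as np

# Seed both generators up front so every run draws the same values.
np.random.seed(1234)
random.seed(12345)

print(np.random.poisson(lam=3.0, size=5))
print(random.randint(0, 10))
```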
8 changes: 4 additions & 4 deletions examples/risk_examples/python/instance_based_mimic.ipynb
@@ -29,9 +29,9 @@
}
],
"source": [
"import os\n",
"import logging\n",
"import numpy as np\n",
"import os\n",
"\n",
"import pylab as plt\n",
"\n",
"%matplotlib inline\n",
@@ -123,9 +123,9 @@
}
],
"source": [
"from sklearn.svm import SVC\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.metrics import roc_curve\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.svm import SVC\n",
"\n",
"font = {\"size\": 14}\n",
"plt.rc(\"font\", **font)\n",
@@ -40,16 +40,16 @@
"outputs": [],
"source": [
"import random\n",
"from itertools import product\n",
"\n",
"import numpy as np\n",
"\n",
"np.random.seed(1234)\n",
"random.seed(12345)\n",
"\n",
"from scipy.stats import poisson\n",
"import pandas as pd\n",
"from sklearn.svm import SVC\n",
"from sklearn.model_selection import train_test_split"
"from scipy.stats import poisson\n",
"from sklearn.model_selection import train_test_split\n",
"from sklearn.svm import SVC"
]
},
{
10 changes: 10 additions & 0 deletions pyproject.toml
@@ -74,6 +74,16 @@ lint.select = [
"YTT", # flake8-2020
]

exclude = [
"**example-notebook-SVC.ipynb",
"**example-notebook-decisiontree.ipynb",
"**example-notebook-keras.ipynb",
"**example-notebook-randomforest.ipynb",
"**attribute_inference_cancer.ipynb",
"**instance_based_mimic.ipynb",
"**membership_inference_cancer.ipynb",
]

lint.ignore = [
"ANN101", # missing-type-self
"EM101", # raw-string-in-exception
2 changes: 1 addition & 1 deletion tests/conftest.py
@@ -93,7 +93,7 @@ def _cleanup():
os.remove(file)


@pytest.fixture()
@pytest.fixture
def get_target(request) -> Target: # pylint: disable=too-many-locals
"""Return a target object with test data and fitted model.
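The only `conftest.py` change swaps `@pytest.fixture()` for `@pytest.fixture`: with no arguments the two forms are equivalent, and ruff's flake8-pytest-style checks prefer the bare decorator. A minimal sketch with a hypothetical fixture (not from the repo):

```python
import pytest


@pytest.fixture
def sample_target():
    # Stand-in for the repo's fitted-model fixture.
    return {"model": "RandomForestClassifier", "n_estimators": 100}


def test_sample_target(sample_target):
    assert sample_target["n_estimators"] == 100
```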
