From a3d6a9830dc69c4becb18743f6336115524a20b9 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Mon, 30 Dec 2024 15:15:39 -0500 Subject: [PATCH 01/19] data curation guide with narrative, 1st draft --- .../ch3-data-generation-and-curation.md | 761 +++++++----------- .../ch3-data-generation-and-curation.md.bkup | 500 ++++++++++++ 2 files changed, 791 insertions(+), 470 deletions(-) create mode 100644 pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index eb1761d..e3018b3 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -4,484 +4,305 @@ - *[Daniel Wheeler](https://www.nist.gov/people/daniel-wheeler), NIST*, [@wd15] - *[Damien Pinto](https://ca.linkedin.com/in/damien-pinto-4748387b), McGill*, [@DamienPinto] +# Data Generation and Curation ## Overview - - Look at lit on data and see how this is implemented - - Generation and dissemination - -## Ideas - - - Data formats - - FAIR - - Metadata (hierarchical data standards (look for current versions)) - - What standards exist - - One or two examples of phase field data stored currently - - Use an existing - - Create our own example - - Practical choices for storing data (figshare, zenodo, dryad, MDF) - - Deciding what data to keep - - What data to store when publishing - - What is supplementary material versus store versus leave on hard drive - - Lit review, good citations - - minting DOIs for the data - - might include simulatioin execution and logging - - how frequently to store data - - how to store - -## How do we want to structure the sections? - - 1. Intro (Daniel) - - What is data? - - What is metadata? - - Why do we need curate data? - - What is the motivation for this document? - - What should the reader get out of this document - - Why is this useful - - Create a distinction between software, data, and post-processed results - - 2. Data Generation (Trevor) - - HPC - - file systems - - data formats - - formats not to use (e.g., don't use serialization that depends on the version of the code that reads and writes because code changes) - - don't use pickles - - restarts - - data frequency - - post-processing -> refer to other document for this - - precision - - importance of folder structure - - 3. Data Curation (Trevor) - - Why is curating data important? - - Why do we need to store our data - - What formats can we use - - Is my data too large, data sizes - - What is useful for third party / subsequent users - - How might the data be used for AI or something else - - Could a reviewer make use of your curated data - - Storing post-processed data and raw data and which to store or keep - - Minting DOIs for your software when publishing a paper - - FAIR - - 4. Metadata standards (Daniel) - - Why do we need to keep some metadata beyond the data - - Zoo of things like data dictionaries, ontologies - - however, these are not well developed for our use case - - For example, you curate on Zenodo - - what extra data should you include - - how to describe the data files - - how to maintain some minimalist info about the simulation that generated the run - - When, why and how I ran this simulation - - What software - - Give example of a yaml file with 10 flat fields - - The future should be better in this regard. 
People actively working to improve this issue.

  5. Examples
     - Practical examples (Trevor)
       - Using Zenodo for a PFHub record to store data and metadata
         - Relatively rich metadata scheme

     - Simulation from scratch (Damien)
       - data generation
       - folder structure
       - HPC issues with data
       - capture process / descriptive parameters for the data that
         are useful for subsequent ML practitioners that use the data
       - ML / store data
       - Narrative of what gets stored to disk
       - Decisions of what to keep and how frequently to save data
       - Auxiliary metadata decisions


  6. Summary (Daniel)
  7. Biblio (Daniel)

Phase field models are characterized by a form of PDE related to an
Eulerian free boundary problem and defined by a diffuse
interface. Phase field models for practical applications require
sufficiently high fidelity to resolve both the macro length scale
relevant to the application and the micro length scales associated
with the free boundary. This requires extensive computational
resources and generates large volumes of raw field data. This data
consists of field variables sampled frequently across a domain, or of
interpolation functions with many Gauss points. Typically, data is
stored at sufficient temporal frequency to reconstruct the evolution
of the field variables.

In recent years there have been efforts to embed phase field models
into ICME-based materials design workflows. However, leveraging
relevant phase field resources in these workflows requires a
systematic approach to archiving and accessing data. Furthermore, it
is often difficult for downstream researchers to find and access raw
or minimally processed data from phase field studies before the
post-processing steps and final publication. In this document, we
provide motivation, guidance and a template for packaging and
publishing FAIR data from phase field studies, as well as for managing
unpublished raw data. Following the protocols outlined in this guide
will give downstream researchers an enhanced capability to use phase
field simulations as part of larger ICME workflows and, in particular,
for data intensive usages such as AI surrogate models. This guide is
not a complete guide to scientific data, but more of a thought
provoker, so that phase field practitioners are aware of the
fundamental issues before embarking on a phase field study.


## Definitions

It is beneficial for the reader to be clear about the main concepts of
FAIR data management as applied to phase field studies. Broadly
speaking, **FAIR data management** encompasses the curation of
simulation workflows (including the software, data inputs and data
outputs) for subsequent researchers or even machine agents. **FAIR
data** concepts have been well explained elsewhere, see [X, Y, Z]. A
**scientific workflow** is generally conceptualized as a graph of
connected actions with various inputs and outputs. Some of the nodes
in a workflow may not be entirely automated and may require input from
human agents, which can increase the complexity of workflow
curation. For phase field simulation workflows, the workflow nodes
include the pre- and post-processing steps. In this guide, the **raw
and post-processed** data is considered to be distinct from the
**metadata**, which describes the simulation and associated
workflow. The **raw data** is generated by the simulation as it is
running and often consists of field data.
The **post-processed data** +consists of derived quantities and images generated using the **raw +data**. The **software** used the for the simulation generally refers +to the phase field code used to run the simulation directly and is +part of the larger **computational environment**. The **code** might +also refer to **software**, but the distinction is that the **code** +may have been modified by the researcher and might include **input +files** to the **software application**. Although software and code +can be considered as data, the curation process involves different +tools and practices. See the [Software Development] section of the +best practices guide for a more detailed discussion of software and +code curation. + +```mermaid --- - -## Old version - -- Save the data from your published work as much as possible, with meta data -- Save the inputs used to produce the results from all your published work - -### FAIR Data - -We discussed the [FAIR Principles] at [CHiMaD Phase-Field XIII][fair-phase-field]: - -#### Findable - -- [ ] (Meta)data are assigned a globally unique and persistent identifier -- [ ] Data are described with rich metadata (defined by R1 below) -- [ ] Metadata clearly and explicitly include the identifier of the data they describe -- [ ] (Meta)data are registered or indexed in a searchable resource - -#### Accessible - -- [ ] (Meta)data are retrievable by their identifier using a standardized - communications protocol - - [ ] The protocol is open, free, and universally implementable - - [ ] The protocol allows for an authentication and authorisation procedure, - where necessary -- [ ] Metadata are accessible, even when the data are no longer available - -#### Interoperable - -- [ ] (Meta)data use a formal, accessible, shared, and broadly applicable language - for knowledge representation. -- [ ] (Meta)data use vocabularies that follow FAIR principles -- [ ] (Meta)data include qualified references to other (meta)data - -#### Reusable - -- [ ] (Meta)data are richly described with a plurality of accurate and relevant attributes - - [ ] (Meta)data are released with a clear and accessible data usage license - - [ ] (Meta)data are associated with detailed provenance - - [ ] (Meta)data meet domain-relevant community standards - -### Zenodo - -Historically, [PFHub] has accepted datasets linked from any host on the Web. -At this time, we recommend using [Zenodo] to host your benchmark data. Why? *It's not "just" a shared folder.* - -* Guided prompts to describe what you're uploading -* DOI is automatically assigned to your dataset -* Basic metadata exported in multiple formats -* Browser-based viewers for CSV, Markdown, PDF, images, videos - -#### Metadata Examples - -Zenodo gives you the option to import a repository directly from GitHub. The original [FAIR Phase-field talk](https://doi.org/10.5281/zenodo.6540105) was "uploaded" this way, producing the following record. While basic authorship information was captured, this tells an interested person or machine nothing meaningful about the dataset. 
- -```json -{ - "@context": "https://schema.org/", - "@id": "https://doi.org/10.5281/zenodo.6540105", - "@type": "SoftwareSourceCode", - "name": "tkphd/fair-phase-field-data: CHiMaD Phase-field XIII", - "description": "FAIR Principles for Phase-Field Practitioners", - "version": "v0.1.0", - "license": "", - "identifier": "https://doi.org/10.5281/zenodo.6540105", - "url": "https://zenodo.org/record/6540105", - "datePublished": "2022-05-11", - "creator": [{ - "@type": "Person", - "givenName": "Trevor", - "familyName": "Keller", - "affiliation": "NIST"}], - "codeRepository": "https://github.com/tkphd/fair-phase-field-data/tree/v0.1.0" -} -``` - -The *strongly* preferred method is to upload files directly. The following record represents an upload for [Benchmark 1b using HiPerC](https://doi.org/10.5281/zenodo.1124941). I would consider this metadata ***rich!*** - -```json -{ - "@context": "https://schema.org/", - "@id": "https://doi.org/10.5281/zenodo.1124941", - "@type": "Dataset", - "name": "hiperc-gpu-cuda-spinodal" - "description": "Solution to the CHiMaD Phase Field benchmark problem on spinodal decomposition using CUDA, with a 9-point discrete Laplacian stencil", - "identifier": "https://doi.org/10.5281/zenodo.1124941", - "license": "https://creativecommons.org/licenses/by/4.0/legalcode", - "url": "https://zenodo.org/record/1124941", - "datePublished": "2017-12-21", - "creator": [{ - "@type": "Person", - "@id": "https://orcid.org/0000-0002-2920-8302", - "givenName": "Trevor", - "familyName": "Keller", - "affiliation": "NIST"}], - "keywords": ["phase-field", "pfhub", "chimad"], - "sameAs": ["https://doi.org/10.6084/m9.figshare.5715103.v2"], - "distribution": [ - { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/free-energy-9pt.csv", - "encodingFormat": "csv" - }, { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0000000.png", - "encodingFormat": "png" - }, { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0100000.png", - "encodingFormat": "png" - }, { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0200000.png", - "encodingFormat": "png" - }] -} -``` - -#### Metadata Files - -After uploading the HiPerC simulation data, I also registered it with PFHub using `meta.yaml`. This file tells the website-generating machinery what to do with the dataset, and provides additional information about the resources required to perform the simulation. 
- -```yaml +title: A Phase Field Workflow --- -benchmark: - id: 1b - version: '1' -data: -- name: run_time - values: - - sim_time: '200000' - wall_time: '7464' -- name: memory_usage - values: - - unit: KB - value: '308224' -- description: free energy data - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/free-energy-9pt.csv -- description: microstructure at t=0 - type: image - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-000000.png -- description: microstructure at t=100,000 - type: image - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-100000.png -- description: microstructure at t=200,000 - type: image - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-200000.png -metadata: - author: - email: trevor.keller@nist.gov - first: Trevor - github_id: tkphd - last: Keller - hardware: - acc_architecture: gpu - clock_rate: '1.48' - cores: '1792' - cpu_architecture: x86_64 - nodes: 1 - parallel_model: threaded - implementation: - repo: - url: https://github.com/usnistgov/hiperc - version: b25b14acda7c5aef565cdbcfc88f2df3412dcc46 - simulation_name: hiperc_cuda - summary: HiPerC spinodal decomposition result using CUDA on a Tesla P100 - timestamp: 18 December, 2017 +flowchart TD + id1(Input Files\nParameters) + id1.1(Code) + id2(Computational\nEnvironment) + id2.5[[Pre-processing,\ne.g. CALPHAD or Meshing]] + id2.7([Pro-processed Data]) + id3[[Phase Field Simulation]] + id3.5([Scratch Data]) + id4([Raw Field Data]) + id5[[Post-processing\ne.g. Data Visualization]] + id6([Post-processed Data\ne.g. Derivied Quantities, Images]) + id1-->id2.5 + id1-->id5 + id2.5-->id2.7-->id3 + id2.5-->id3 + id1-->id3 + id1.1-->id3 + id2-->id3 + id3-->id4-->id5-->id6 + id3-->id3.5-->id3 + id2-->id2.5 + id2-->id5 ``` -This file is not part of my dataset: it resides in the [PFHub repository on GitHub]. Furthermore, since the structure of this file specifically suits PFHub, it is of no use at all to other software, websites, or researchers. - -### Structured Data Schemas - -In the Zenodo metadata above, note the `@context` fields: [Schema.org] is a [structured data schema] *and controlled vocabulary* for describing things on the Internet. How is this useful? - -Consider the [CodeMeta] project. It creates metadata files for software projects using [Schema.org] building blocks. There's even a handy [CodeMeta Generator]! If you maintain a phase-field software framework, you can (and should!) use it to document your code in a standards-compliant, machine-readable format. This improves interoperability and reusability! 
- -```json -{ - "@context": "https://doi.org/10.5063/schema/codemeta-2.0", - "@type": "SoftwareSourceCode", - "license": "https://spdx.org/licenses/CC-PDDC", - "codeRepository": "git+https://github.com/usnistgov/hiperc", - "dateCreated": "2017-08-07", - "dateModified": "2019-03-04", - "downloadUrl": "https://github.com/usnistgov/hiperc/releases/tag/v1.0", - "issueTracker": "https://github.com/usnistgov/hiperc/issues", - "name": "HiPerC", - "version": "1.0.0", - "description": "High-Performance Computing in C and CUDA", - "applicationCategory": "phase-field", - "developmentStatus": "inactive", - "programmingLanguage": ["C", "CUDA", "OpenCL", "OpenMP", "TBB"], - "author": [ - { - "@type": "Person", - "@id": "https://orcid.org/my-orcid?orcid=0000-0002-2920-8302", - "givenName": "Trevor", - "familyName": "Keller", - "email": "trevor.keller@nist.gov", - "affiliation": { - "@type": "Organization", - "name": "NIST" - } - } - ] -} -``` +## Data Generation + +Let's first draw the distinction between data generation and data +curation. Data generation involves writing raw data to disk during the +simulation execution and generating post-processed data from that raw +data. Data curation involves packaging the generated data objects from +a phase field workflow or study along with sufficient provenance +metadata into a FAIR research object for consumption by subsequent +scientific studies. + +When performing a phase field simulation, one must be cognizant of +several factors pertaining to data generation. Generally speaking, the +considerations can be defined as follow, + +- choosing data to generate (and then curate), +- file formats, +- file system hierarchy, +- restarts and recovering from crashes +- data generation and workflow tools, and +- HPC environments and writing to disk in parallel. + +These considerations are often conflicting, require trial and error to +determine the best approach and are highly specific to the +requirements of the workflow and post-processing. However, there are +some general guidelines that will be outlined below. + +### Choosing data to generate + +Selecting the appropriate data to write to disk during the simulation +largely depends on the requirements such as post-processing or +debugging. However, it is good practice to consider future uses of the +data for future work such as subsequent researchers trying to reuse +the workflow or even reviewers. Lack of forethought in saving data +could hinder the data curation of the eventual curation of the data +research object. This step should be considered independently from +restarts. The data required to reconstruct derived quantities or the +evolution of field data will not be the same as the data required to +restart a simulation. + +Another aspect of saving data to disk is the frequency off the disk +writes. This choice can often impact the performance of a simulation +as the simulation might have to wait on the disk before continuing the +computation. A method to avoid this is to use a separate thread that +runs concurrently to write the data (in the same process), see +[Stackflow +Question](https://stackoverflow.com/questions/1014113/efficient-way-to-save-data-to-disk-while-running-a-computationally-intensive-tas). In +fact many practitioners overlook optimizing this part of aspect of +phase field codes +[ref](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0202410). Generally, +when writing data it is best to do single large write to disk as +opposed to multiple small writes. 
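The following is a minimal sketch of this pattern using only NumPy, h5py and
the Python standard library: a background thread drains a queue and appends
each batch of cached fields to a single HDF5 file while the main loop keeps
computing. The field update, the file name and the batch size are
placeholders rather than a prescription for any particular phase field code.

```python
import queue
import threading

import h5py
import numpy as np


def writer(q, filename):
    """Drain the queue and append each batch of fields to one HDF5 file."""
    with h5py.File(filename, "w") as f:
        while True:
            batch = q.get()
            if batch is None:  # sentinel value: no more data is coming
                q.task_done()
                break
            for step, phi in batch:
                f.create_dataset(f"phi/step_{step:07d}", data=phi)
            f.flush()  # one large write per batch rather than many small ones
            q.task_done()


q = queue.Queue()
thread = threading.Thread(target=writer, args=(q, "fields.h5"), daemon=True)
thread.start()

cache, flush_every = [], 10      # batch ten saved steps into one write
phi = np.random.rand(256, 256)   # stand-in for the evolving field variable

for step in range(100):
    phi = phi + 0.01 * np.random.rand(256, 256)  # stand-in for a time step
    if step % 5 == 0:
        cache.append((step, phi.copy()))
    if len(cache) == flush_every:
        q.put(cache)  # hand the batch to the writer thread and keep computing
        cache = []

q.put(cache)   # flush whatever is left
q.put(None)    # tell the writer thread to finish
q.join()
thread.join()
```

Deferring the writes to a single extra thread keeps the simulation loop from
stalling on disk, at the price of holding the cached steps in memory, which is
exactly the trade-off discussed next.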
In practice this could involve
caching multiple field variables across multiple print steps and writing
them as a single data blob to an HDF5 file. However, there is a trade-off
between simulation performance and memory usage, as well as latency and
communication overhead when considering parallel simulations.

https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0202410
https://dl.acm.org/doi/abs/10.1145/1713072.1713079

### File formats

In general, when running phase field simulations, the user is limited to
the file formats that the software supports. For example, if the
researcher is using PRISMS-PF, the default data format is VTK and there is
usually no reason to seek an alternative. If an alternative file format is
required, then the researcher could write a C++ function to output the
data in another format, such as NetCDF.

As a general rule, it is best to choose file formats that work with the
tools already in use and / or that your colleagues are using. There are
other considerations to be aware of, though. Human-readable formats such
as CSV and JSON are often useful for small to medium sized data sets (such
as derived quantities), since some metadata can be embedded alongside the
raw data, resulting in a FAIRer data product than standard binary
formats. Some binary file formats also support metadata and might be more
useful for the final data curation of a phase field study even if they are
not used during the research process. One main benefit of using binary
data (beyond saving disk space) is the ability to preserve the full
precision of floating point numbers. The longevity of file formats should
be considered as well. A particularly egregious case of ignoring longevity
would be using the Pickle file format in Python, which is both language
dependent and code dependent. Pickle is an example of data serialization,
which is intended mainly for in-process data storage and asynchronous
tasks, not for long term data storage.

There are many binary formats for storing field data based on an Eulerian
mesh or grid. Common formats for field data are NetCDF, VTK, XDMF and
EXODUS. Within the phase field community, VTK seems to be the most widely
used. VTK is actually a visualization library, but it supports a number of
different native file formats based on both XML and HDF5, with both ASCII
and binary encodings. The VTK library works well with FE simulations,
supporting many different element types as well as parallel data storage
for domain decomposition. See the [XML file formats
documentation](https://docs.vtk.org/en/latest/design_documents/VTKFileFormats.html#xml-file-formats)
for VTK for an overview of the zoo of different file extensions and their
meanings. In contrast to VTK, NetCDF is geared more towards gridded data,
having arisen from atmospheric research, which uses FD and FV more than
FE. For a comparison of performance and metrics for different file types
see the [Python MeshIO tool's
README.md](https://github.com/nschloe/meshio?tab=readme-ov-file#performance-comparison).

The Python MeshIO tool is a good place to start for IO when writing custom
phase field codes in Python (or Julia using `pyimport`). MeshIO is also a
good place to start for exploring, debugging or picking apart file data in
an interactive Python environment, which can be harder to do with
dedicated viewing tools like ParaView.
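To give a feel for the MeshIO interface, here is a minimal, self-contained
sketch that writes and re-reads a toy order-parameter field on a
two-triangle mesh. The mesh, the field values and the file names are purely
illustrative.

```python
import meshio
import numpy as np

# Four points and two triangles covering the unit square (z = 0).
points = np.array(
    [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 1.0, 0.0], [0.0, 1.0, 0.0]]
)
cells = [("triangle", np.array([[0, 1, 2], [0, 2, 3]]))]

# A scalar order parameter defined at the mesh points.
phi = np.array([0.0, 0.2, 0.8, 1.0])

mesh = meshio.Mesh(points, cells, point_data={"phi": phi})

# The output format is inferred from the file extension.
mesh.write("phi.vtu")  # VTK XML unstructured grid
mesh.write("phi.vtk")  # legacy VTK

# Read a file back in to inspect it in an interactive session.
again = meshio.read("phi.vtu")
print(again.point_data["phi"])
```

The same `Mesh` object can be written to most of the formats discussed above
simply by changing the file extension, which makes it a convenient way to
check which formats the downstream tools in a workflow will accept.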
The
scientific Python ecosystem is very rich with tools for data manipulation
and storage, such as Pandas, which supports storage in many different
formats, and xarray for higher dimensional data. xarray supports NetCDF
file storage, which includes coordinate systems and metadata and is built
on HDF5. Both Pandas and xarray can be used in a parallel or distributed
manner in conjunction with Dask. Dask along with xarray also supports
writing to the Zarr data format. Zarr allows data to be kept on disk
during analysis to avoid loading the entire data object into memory.

https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/
https://docs.vtk.org/en/latest/index.html
https://docs.xarray.dev/en/stable/user-guide/io.html

### Recovering from crashes and restarts

A study from 2020 of HPC systems calculated the success rate (i.e. no
error code on completion) of multi-node jobs with non-shared memory at
between 60% and 70%
[Kumar](https://engineering.purdue.edu/dcsl/publications/papers/2020/fresco_dsn20_cameraready.pdf). This
success rate diminishes rapidly as the run time of jobs
increases. Needless to say, check-pointing is absolutely required for any
job of more than a few hours. Nearly every day, an HPC platform will
experience some sort of failure
[Benoit1](https://inria.hal.science/hal-03264047/file/rr9413.pdf)
[Aupy](https://www.sciencedirect.com/science/article/abs/pii/S0743731513002219). That
doesn't mean that every job will fail every day, but it would be
optimistic to think that jobs will run for more than a week without some
issue. Given that fact, one can estimate how long it might take to
complete a job without check-pointing. A very rough estimate for the
expected completion time, assuming instantaneous restarts and no queuing
time, is given by

$$ E(T) = \frac{1}{2} \left(1 + e^{\frac{T}{\mu}} \right) T $$

where $T$ is the nominal job completion time with no failures and $\mu$
is the mean time to failure. The formula predicts an expected time of 3.8
days for a job that nominally runs for 3 days with a $\mu$ of one
week. The formula is of course a gross simplification and includes many
simplifying assumptions, but regardless of the assumed failure
distribution, the exponential growth in run time without check-pointing
is inescapable. Assuming that we agree on the need for check-pointing,
the next step is to decide on the optimal time interval between
checkpoints. This is given by the well-known Young/Daly formula,
$W=\sqrt{2 \mu C}$, where $C$ is the time taken to write a checkpoint
[Benoit2](https://icl.utk.edu/files/publications/2022/icl-utk-1569-2022.pdf)
[Bautista-Gomez](https://www.ittc.ku.edu/~sun/publications/fgcs24.pdf). The
Young/Daly formula accounts for the trade-off between the start-up cost
of a job getting back to its original point of failure and the cost
associated with writing each checkpoint to disk. For example, with a
weekly failure rate and $C=6$ minutes, $W=5.8$ hours. In practice these
estimates for $\mu$ and $C$ might be a little pessimistic, but be aware
of the trade-off [Benoit1]. Note that some HPC systems impose upper
bounds on run times (e.g. TACC has a 7 day time limit, so effectively
$\mu<7$ days regardless of other system failures).

Given the above theory, what is some practical advice for check-pointing
jobs?
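Before the practical advice, it may help to evaluate the two formulas
numerically. The short sketch below, in plain Python, reproduces the figures
quoted above (roughly 3.8 days of expected run time for a nominal 3 day job,
and a checkpoint interval of roughly 5.8 hours for a 6 minute checkpoint
write); the specific values of $\mu$, $T$ and $C$ are only examples.

```python
from math import exp, sqrt

hours_per_day = 24.0

mu = 7.0 * hours_per_day  # mean time to failure: one week, in hours
T = 3.0 * hours_per_day   # nominal failure-free run time: three days, in hours
C = 0.1                   # time to write one checkpoint: six minutes, in hours

# Expected completion time without check-pointing (instantaneous restarts).
E_T = 0.5 * (1.0 + exp(T / mu)) * T
print(f"expected completion time: {E_T / hours_per_day:.1f} days")  # ~3.8 days

# Young/Daly estimate of the optimal interval between checkpoints.
W = sqrt(2.0 * mu * C)
print(f"checkpoint roughly every {W:.1f} hours")  # ~5.8 hours
```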
- Estimate both $\mu$ and $C$. It might be worth discussing the $\mu$
  value with the HPC cluster administrator to get some realistic numbers,
  and $C$ can be estimated by running test jobs. It's good to know whether
  you should be writing checkpoints every day, every hour or every minute.
- Ensure that restarts are deterministic (i.e. results don't change
  between a job that restarts and one that doesn't). One way to do this is
  to hash the output files, assuming that the simulation itself is
  deterministic.
- Consider using a checkpointing library if you're using a custom phase
  field code, or even a workflow tool such as Snakemake, which has the
  inbuilt ability to handle checkpointing. A tool like Snakemake is good
  for large parameter studies where it is difficult to keep track of which
  jobs wrote which files. The `pickle` library is acceptable for
  checkpointing Python programs in this short-lived circumstance.
- Use the inbuilt check-pointing available in the phase field code that
  you're using.
- Whatever system is being used, check that the check-pointing actually
  works and is deterministic.

- https://hivehpc.haifa.ac.il/index.php/slurm?start=5
- https://icl.utk.edu/files/publications/2022/icl-utk-1569-2022.pdf
- https://inria.hal.science/hal-03264047/file/rr9413.pdf
- https://www.sciencedirect.com/science/article/abs/pii/S0743731513002219
- https://www.ittc.ku.edu/~sun/publications/fgcs24.pdf
- https://icl.utk.edu/files/publications/2020/icl-utk-1385-2020.pdf
- https://ftp.cs.toronto.edu/csrg-technical-reports/621/ut-csrg-621.pdf
- https://arxiv.org/pdf/2012.00825
- https://icl.utk.edu/~herault/papers/007%20-%20Checkpointing%20Strategies%20for%20Shared%20High-Performance%20Computing%20Platforms%20-%20IJNC%20(2019).pdf
- https://dl.acm.org/doi/10.1145/2184512.2184574
- https://engineering.purdue.edu/dcsl/publications/papers/2020/fresco_dsn20_cameraready.pdf
- [Job
failures](https://pdf.sciencedirectassets.com/271503/1-s2.0-S0898122111X00251/1-s2.0-S0898122111005980/main.pdf?X-Amz-Security-Token=IQoJb3JpZ2luX2VjEKf%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaCXVzLWVhc3QtMSJIMEYCIQCnYz7Yg2JHorkw2CwX7PI5fbyLRr02ykVPbgtxZhNy8QIhAIY%2BTq58bdBe3iRdnRXNP%2FjQ0%2B4LgrXUQh7aakHn9TSTKrsFCID%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEQBRoMMDU5MDAzNTQ2ODY1IgwlVGRSpcXtIR5IRTcqjwUcZrP%2B7%2Byn4dYmelmvfF1CCdKNMP%2BftdY1KdvKA%2BnlBpDHwh%2FulDxZPkotPpiaFnrHfT85QaPGB0Q5Ck16mfWIG5KAjrrPFXY2azR3%2FxLIM6I8Ka4aHzcvUDa5L2rn8PpHqVF61wtBWRZYI8N0YM5CZi8r2%2B6NLe9OJvgLR1%2B55%2BfwK5GucDahcWDrP2FvbBFWQyidEBNl7thbpO4NIKoUTGJkb8H%2BBezk09N%2F4CPCjHel5mA1CHA8cQLH9lcCPiLurzKTBP8ozNi%2FtrlAZUKRKk%2BYHMy5HyFl%2Bobumh1eesuGe19b%2FpYOZBGzrQ4mn9eblczLd2SQi3k%2FoAws7yW9HHqzmMkJnla2B3tgfpP9WxaCnb5ZMNLIjlwEfu67ZiydyWQ2VnygTjG8CsiBYUeFuCbTplAeviP4WN%2BtiK%2FAsXbQZW93Q7cH7K%2B7lhfPiuPaauh0tlgQJNVdp2QzT8qvsxbQtzdEOQ5ethcoNbXU8YmXYgYdUtGwQbySD18i3aQo6zdUD0h9YtNNl%2BXskT8nv9xVPzsfMdkvrWA23PdIDYagX6n4Dd45DkeYIa10oXQfQKy7JiYIyfy7L1zt6tE6Rr0H9aMogJZIfjZG3tFcRea97xN%2BputMKCCqpyz%2FBJgfCptLvjhoKsWCfIqiE23xTHTTl%2FTkTK20ZQ8yO8lHuKHwtjbJJrX%2BQgnfiZQp2Sm6sKWchwej1Nn8IgtbHswKexMqoyaQHUeokSJ6MwQmtHAoUjuwuhaVcOqHn5gNEBuAWo3pAS%2Fwn2TTG5g00gF7RpQRoZvflp4b9poAN20kS%2F0lmLzutQ3wHOq1Ak59OzQhZASygvTGiqCxRp%2BzqctUt6%2B3%2FKRFwhR7q%2BwRluPBMMyix7sGOrAB3n1WiLFCV6WV5KLzctnyXliLNDqpIxUPVeXy2v%2FvcR8zUDFo49QquQ6nJudq2u9aUJ4nKzEkzpLTdAOCnQ%2FIR68LdMPQF%2BqJr0BD78PqcfoaacB%2BH4vV0FhCOVs0rVLdevPVGoAhB%2BIFbrsv%2BvHvQUZxU3wX%2FuOp2ChL1cUohEeoQzo2PyF2FZXkNcUenB2EWhcZGpgAIL1EYIyu1iCYmbfjVlISQL7xMJHj%2BSq1H3c%3D&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20241230T003505Z&X-Amz-SignedHeaders=host&X-Amz-Expires=299&X-Amz-Credential=ASIAQ3PHCVTYXOJ65CE4%2F20241230%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=4d3b88d3c0b7d80e5ff0df11ef0d4246875d4aabdc0c8d49be54c9c029beae8e&hash=ebead6fd98beaced790a9b84e76f9f6326f41ebbf4cedadb1fde7cfc734bfc31&host=68042c943591013ac2b2430a89b270f6af2c76d8dfd086a07176afe7c76c2c61&pii=S0898122111005980&tid=spdf-4974a446-6433-4619-afca-a5c210488d71&sid=a6179c5a4b870940603b745364079c847353gxrqa&type=client&tsoh=d3d3LnNjaWVuY2VkaXJlY3QuY29t&ua=161d5b095903025d02&rr=8f9df2a7da4fc990&cc=us) +- https://www.cs.cmu.edu/~bianca/dsn06.pdf + + -That's nice! But what about our datasets? Shouldn't the [PFHub] metadata "describing" a dataset live alongside that data? - -#### Towards a Phase-Field Schema - -We are working to build a phase-field schema (or schemas) using [Schema.org] and the [schemaorg] Python library. The work-alike port of `meta.yaml` looks like the following. - -> *N.B.:* We're going to deploy a generator similar to CodeMeta's so you won't have to write this! - -```json -{ - "@context": "https://www.schema.org", - "@type": "DataCatalog", - "author": [ - { - "@type": "Person", - "affiliation": { - "@type": "GovernmentOrganization", - "name": "Materials Science and Engineering Division", - "parentOrganization": { - "@type": "GovernmentOrganization", - "name": "Material Measurement Laboratory", - "parentOrganization": { - "@type": "GovernmentOrganization", - "address": { - "@type": "PostalAddress", - "addressCountry": "US", - "addressLocality": "Gaithersburg", - "addressRegion": "Maryland", - "postalCode": "20899", - "streetAddress": "100 Bureau Drive" - }, - "identifier": "NIST", - "name": "National Institute of Standards and Technology", - "parentOrganization": "U.S. 
Department of Commerce", - "url": "https://www.nist.gov" - } - } - }, - "email": "trevor.keller@nist.gov", - "familyName": "Keller", - "givenName": "Trevor", - "identifier": "tkphd", - "sameAs": "https://orcid.org/0000-0002-2920-8302" - }, { - "@type": "Person", - "affiliation": { - "@type": "GovernmentOrganization", - "name": "Materials Science and Engineering Division", - "parentOrganization": { - "@type": "GovernmentOrganization", - "name": "Material Measurement Laboratory", - "parentOrganization": { - "@type": "GovernmentOrganization", - "address": { - "@type": "PostalAddress", - "addressCountry": "US", - "addressLocality": "Gaithersburg", - "addressRegion": "Maryland", - "postalCode": "20899", - "streetAddress": "100 Bureau Drive" - }, - "identifier": "NIST", - "name": "National Institute of Standards and Technology", - "parentOrganization": "U.S. Department of Commerce", - "url": "https://www.nist.gov" - } - } - }, - "email": "daniel.wheeler@nist.gov", - "familyName": "Wheeler", - "givenName": "Daniel", - "identifier": "wd15", - "sameAs": "https://orcid.org/0000-0002-2653-7418" - } - ], - "dataset": [ - { - "@type": "Dataset", - "distribution": [ - { - "@type": "PropertyValue", - "name": "parallel_nodes", - "value": 1 - }, { - "@type": "PropertyValue", - "name": "cpu_architecture", - "value": "amd64" - }, { - "@type": "PropertyValue", - "name": "parallel_cores", - "value": 12 - }, { - "@type": "PropertyValue", - "name": "parallel_gpus", - "value": 1 - }, { - "@type": "PropertyValue", - "name": "gpu_architecture", - "value": "nvidia" - }, { - "@type": "PropertyValue", - "name": "gpu_cores", - "value": 6144 - }, { - "@type": "PropertyValue", - "name": "wall_time", - "unitCode": "SEC", - "unitText": "s", - "value": 384 - }, { - "@type": "PropertyValue", - "name": "memory_usage", - "unitCode": "E63", - "unitText": "mebibyte", - "value": 1835 - } - ], - "name": "irl" - }, { - "@type": "Dataset", - "distribution": [ - { - "@type": "DataDownload", - "contentUrl": "8a/free_energy_1.csv", - "name": "free energy" - }, { - "@type": "DataDownload", - "contentUrl": "8a/solid_fraction_1.csv", - "name": "solid fraction" - }, { - "@type": "DataDownload", - "contentUrl": "8a/free_energy_2.csv", - "name": "free energy" - }, { - "@type": "DataDownload", - "contentUrl": "8a/solid_fraction_2.csv", - "name": "solid fraction" - }, { - "@type": "DataDownload", - "contentUrl": "8a/free_energy_3.csv", - "name": "free energy" - }, { - "@type": "DataDownload", - "contentUrl": "8a/solid_fraction_3.csv", - "name": "solid fraction" - } - ], - "name": "output" - } - ], - "dateCreated": "2022-10-25T19:25:02+00:00", - "description": "A fake dataset for Benchmark 8a unprepared using FiPy by @tkphd & @wd15", - "isBasedOn": { - "@type": "SoftwareSourceCode", - "codeRepository": "https://github.com/tkphd/fake-pfhub-bm8a", - "description": "Fake benchmark 8a upload with FiPy", - "runtimePlatform": "fipy", - "targetProduct": "amd64", - "version": "9df6603e" - }, - "isPartOf": { - "@type": "Series", - "identifier": "8a", - "name": "Homogeneous Nucleation", - "url": "https://pages.nist.gov/pfhub/benchmarks/benchmark8.ipynb" - }, - "keywords": [ - "phase-field", - "benchmarks", - "pfhub", - "fipy", - "homogeneous-nucleation" - ], - "license": "https://www.nist.gov/open/license#software" -} -``` diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup new file mode 100644 index 0000000..eb1761d --- /dev/null +++ 
b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup @@ -0,0 +1,500 @@ +# Data Generation and Curation + +- *[Trevor Keller](https://www.nist.gov/people/trevor-keller), NIST*, [@tkphd] +- *[Daniel Wheeler](https://www.nist.gov/people/daniel-wheeler), NIST*, [@wd15] +- *[Damien Pinto](https://ca.linkedin.com/in/damien-pinto-4748387b), McGill*, [@DamienPinto] + + +## Overview + + - Look at lit on data and see how this is implemented + - Generation and dissemination + +## Ideas + + - Data formats + - FAIR + - Metadata (hierarchical data standards (look for current versions)) + - What standards exist + - One or two examples of phase field data stored currently + - Use an existing + - Create our own example + - Practical choices for storing data (figshare, zenodo, dryad, MDF) + - Deciding what data to keep + - What data to store when publishing + - What is supplementary material versus store versus leave on hard drive + - Lit review, good citations + - minting DOIs for the data + - might include simulatioin execution and logging + - how frequently to store data + - how to store + +## How do we want to structure the sections? + + 1. Intro (Daniel) + - What is data? + - What is metadata? + - Why do we need curate data? + - What is the motivation for this document? + - What should the reader get out of this document + - Why is this useful + - Create a distinction between software, data, and post-processed results + + 2. Data Generation (Trevor) + - HPC + - file systems + - data formats + - formats not to use (e.g., don't use serialization that depends on the version of the code that reads and writes because code changes) + - don't use pickles + - restarts + - data frequency + - post-processing -> refer to other document for this + - precision + - importance of folder structure + + 3. Data Curation (Trevor) + - Why is curating data important? + - Why do we need to store our data + - What formats can we use + - Is my data too large, data sizes + - What is useful for third party / subsequent users + - How might the data be used for AI or something else + - Could a reviewer make use of your curated data + - Storing post-processed data and raw data and which to store or keep + - Minting DOIs for your software when publishing a paper + - FAIR + + 4. Metadata standards (Daniel) + - Why do we need to keep some metadata beyond the data + - Zoo of things like data dictionaries, ontologies + - however, these are not well developed for our use case + - For example, you curate on Zenodo + - what extra data should you include + - how to describe the data files + - how to maintain some minimalist info about the simulation that generated the run + - When, why and how I ran this simulation + - What software + - Give example of a yaml file with 10 flat fields + - The future should be better in this regard. People actively working to improve this issue. + + 5. Examples + - Practical examples (Trevor) + - Using Zenodo for a PFHub record to store data and metadata + - Relatively rich metadata scheme + + - Simulation from scratch (Damien) + - data generation + - folder structure + - HPC issues with data + - capture process / descriptive parameters for the data that + are useful for subsequent ML practitioners that use the data + - ML / store data + - Narrative of what gets stored to disk + - Decisions of what to keep and how frequently to save data + - Auxiliary metadata decisions + + + 6. Summary (Daniel) + 7. 
Biblio (Daniel) + +--- + +## Old version + +- Save the data from your published work as much as possible, with meta data +- Save the inputs used to produce the results from all your published work + +### FAIR Data + +We discussed the [FAIR Principles] at [CHiMaD Phase-Field XIII][fair-phase-field]: + +#### Findable + +- [ ] (Meta)data are assigned a globally unique and persistent identifier +- [ ] Data are described with rich metadata (defined by R1 below) +- [ ] Metadata clearly and explicitly include the identifier of the data they describe +- [ ] (Meta)data are registered or indexed in a searchable resource + +#### Accessible + +- [ ] (Meta)data are retrievable by their identifier using a standardized + communications protocol + - [ ] The protocol is open, free, and universally implementable + - [ ] The protocol allows for an authentication and authorisation procedure, + where necessary +- [ ] Metadata are accessible, even when the data are no longer available + +#### Interoperable + +- [ ] (Meta)data use a formal, accessible, shared, and broadly applicable language + for knowledge representation. +- [ ] (Meta)data use vocabularies that follow FAIR principles +- [ ] (Meta)data include qualified references to other (meta)data + +#### Reusable + +- [ ] (Meta)data are richly described with a plurality of accurate and relevant attributes + - [ ] (Meta)data are released with a clear and accessible data usage license + - [ ] (Meta)data are associated with detailed provenance + - [ ] (Meta)data meet domain-relevant community standards + +### Zenodo + +Historically, [PFHub] has accepted datasets linked from any host on the Web. +At this time, we recommend using [Zenodo] to host your benchmark data. Why? *It's not "just" a shared folder.* + +* Guided prompts to describe what you're uploading +* DOI is automatically assigned to your dataset +* Basic metadata exported in multiple formats +* Browser-based viewers for CSV, Markdown, PDF, images, videos + +#### Metadata Examples + +Zenodo gives you the option to import a repository directly from GitHub. The original [FAIR Phase-field talk](https://doi.org/10.5281/zenodo.6540105) was "uploaded" this way, producing the following record. While basic authorship information was captured, this tells an interested person or machine nothing meaningful about the dataset. + +```json +{ + "@context": "https://schema.org/", + "@id": "https://doi.org/10.5281/zenodo.6540105", + "@type": "SoftwareSourceCode", + "name": "tkphd/fair-phase-field-data: CHiMaD Phase-field XIII", + "description": "FAIR Principles for Phase-Field Practitioners", + "version": "v0.1.0", + "license": "", + "identifier": "https://doi.org/10.5281/zenodo.6540105", + "url": "https://zenodo.org/record/6540105", + "datePublished": "2022-05-11", + "creator": [{ + "@type": "Person", + "givenName": "Trevor", + "familyName": "Keller", + "affiliation": "NIST"}], + "codeRepository": "https://github.com/tkphd/fair-phase-field-data/tree/v0.1.0" +} +``` + +The *strongly* preferred method is to upload files directly. The following record represents an upload for [Benchmark 1b using HiPerC](https://doi.org/10.5281/zenodo.1124941). 
I would consider this metadata ***rich!*** + +```json +{ + "@context": "https://schema.org/", + "@id": "https://doi.org/10.5281/zenodo.1124941", + "@type": "Dataset", + "name": "hiperc-gpu-cuda-spinodal" + "description": "Solution to the CHiMaD Phase Field benchmark problem on spinodal decomposition using CUDA, with a 9-point discrete Laplacian stencil", + "identifier": "https://doi.org/10.5281/zenodo.1124941", + "license": "https://creativecommons.org/licenses/by/4.0/legalcode", + "url": "https://zenodo.org/record/1124941", + "datePublished": "2017-12-21", + "creator": [{ + "@type": "Person", + "@id": "https://orcid.org/0000-0002-2920-8302", + "givenName": "Trevor", + "familyName": "Keller", + "affiliation": "NIST"}], + "keywords": ["phase-field", "pfhub", "chimad"], + "sameAs": ["https://doi.org/10.6084/m9.figshare.5715103.v2"], + "distribution": [ + { + "@type": "DataDownload", + "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/free-energy-9pt.csv", + "encodingFormat": "csv" + }, { + "@type": "DataDownload", + "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0000000.png", + "encodingFormat": "png" + }, { + "@type": "DataDownload", + "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0100000.png", + "encodingFormat": "png" + }, { + "@type": "DataDownload", + "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0200000.png", + "encodingFormat": "png" + }] +} +``` + +#### Metadata Files + +After uploading the HiPerC simulation data, I also registered it with PFHub using `meta.yaml`. This file tells the website-generating machinery what to do with the dataset, and provides additional information about the resources required to perform the simulation. + +```yaml +--- +benchmark: + id: 1b + version: '1' +data: +- name: run_time + values: + - sim_time: '200000' + wall_time: '7464' +- name: memory_usage + values: + - unit: KB + value: '308224' +- description: free energy data + url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/free-energy-9pt.csv +- description: microstructure at t=0 + type: image + url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-000000.png +- description: microstructure at t=100,000 + type: image + url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-100000.png +- description: microstructure at t=200,000 + type: image + url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-200000.png +metadata: + author: + email: trevor.keller@nist.gov + first: Trevor + github_id: tkphd + last: Keller + hardware: + acc_architecture: gpu + clock_rate: '1.48' + cores: '1792' + cpu_architecture: x86_64 + nodes: 1 + parallel_model: threaded + implementation: + repo: + url: https://github.com/usnistgov/hiperc + version: b25b14acda7c5aef565cdbcfc88f2df3412dcc46 + simulation_name: hiperc_cuda + summary: HiPerC spinodal decomposition result using CUDA on a Tesla P100 + timestamp: 18 December, 2017 +``` + +This file is not part of my dataset: it resides in the [PFHub repository on GitHub]. Furthermore, since the structure of this file specifically suits PFHub, it is of no use at all to other software, websites, or researchers. + +### Structured Data Schemas + +In the Zenodo metadata above, note the `@context` fields: [Schema.org] is a [structured data schema] *and controlled vocabulary* for describing things on the Internet. How is this useful? 
+ +Consider the [CodeMeta] project. It creates metadata files for software projects using [Schema.org] building blocks. There's even a handy [CodeMeta Generator]! If you maintain a phase-field software framework, you can (and should!) use it to document your code in a standards-compliant, machine-readable format. This improves interoperability and reusability! + +```json +{ + "@context": "https://doi.org/10.5063/schema/codemeta-2.0", + "@type": "SoftwareSourceCode", + "license": "https://spdx.org/licenses/CC-PDDC", + "codeRepository": "git+https://github.com/usnistgov/hiperc", + "dateCreated": "2017-08-07", + "dateModified": "2019-03-04", + "downloadUrl": "https://github.com/usnistgov/hiperc/releases/tag/v1.0", + "issueTracker": "https://github.com/usnistgov/hiperc/issues", + "name": "HiPerC", + "version": "1.0.0", + "description": "High-Performance Computing in C and CUDA", + "applicationCategory": "phase-field", + "developmentStatus": "inactive", + "programmingLanguage": ["C", "CUDA", "OpenCL", "OpenMP", "TBB"], + "author": [ + { + "@type": "Person", + "@id": "https://orcid.org/my-orcid?orcid=0000-0002-2920-8302", + "givenName": "Trevor", + "familyName": "Keller", + "email": "trevor.keller@nist.gov", + "affiliation": { + "@type": "Organization", + "name": "NIST" + } + } + ] +} +``` + +That's nice! But what about our datasets? Shouldn't the [PFHub] metadata "describing" a dataset live alongside that data? + +#### Towards a Phase-Field Schema + +We are working to build a phase-field schema (or schemas) using [Schema.org] and the [schemaorg] Python library. The work-alike port of `meta.yaml` looks like the following. + +> *N.B.:* We're going to deploy a generator similar to CodeMeta's so you won't have to write this! + +```json +{ + "@context": "https://www.schema.org", + "@type": "DataCatalog", + "author": [ + { + "@type": "Person", + "affiliation": { + "@type": "GovernmentOrganization", + "name": "Materials Science and Engineering Division", + "parentOrganization": { + "@type": "GovernmentOrganization", + "name": "Material Measurement Laboratory", + "parentOrganization": { + "@type": "GovernmentOrganization", + "address": { + "@type": "PostalAddress", + "addressCountry": "US", + "addressLocality": "Gaithersburg", + "addressRegion": "Maryland", + "postalCode": "20899", + "streetAddress": "100 Bureau Drive" + }, + "identifier": "NIST", + "name": "National Institute of Standards and Technology", + "parentOrganization": "U.S. Department of Commerce", + "url": "https://www.nist.gov" + } + } + }, + "email": "trevor.keller@nist.gov", + "familyName": "Keller", + "givenName": "Trevor", + "identifier": "tkphd", + "sameAs": "https://orcid.org/0000-0002-2920-8302" + }, { + "@type": "Person", + "affiliation": { + "@type": "GovernmentOrganization", + "name": "Materials Science and Engineering Division", + "parentOrganization": { + "@type": "GovernmentOrganization", + "name": "Material Measurement Laboratory", + "parentOrganization": { + "@type": "GovernmentOrganization", + "address": { + "@type": "PostalAddress", + "addressCountry": "US", + "addressLocality": "Gaithersburg", + "addressRegion": "Maryland", + "postalCode": "20899", + "streetAddress": "100 Bureau Drive" + }, + "identifier": "NIST", + "name": "National Institute of Standards and Technology", + "parentOrganization": "U.S. 
Department of Commerce", + "url": "https://www.nist.gov" + } + } + }, + "email": "daniel.wheeler@nist.gov", + "familyName": "Wheeler", + "givenName": "Daniel", + "identifier": "wd15", + "sameAs": "https://orcid.org/0000-0002-2653-7418" + } + ], + "dataset": [ + { + "@type": "Dataset", + "distribution": [ + { + "@type": "PropertyValue", + "name": "parallel_nodes", + "value": 1 + }, { + "@type": "PropertyValue", + "name": "cpu_architecture", + "value": "amd64" + }, { + "@type": "PropertyValue", + "name": "parallel_cores", + "value": 12 + }, { + "@type": "PropertyValue", + "name": "parallel_gpus", + "value": 1 + }, { + "@type": "PropertyValue", + "name": "gpu_architecture", + "value": "nvidia" + }, { + "@type": "PropertyValue", + "name": "gpu_cores", + "value": 6144 + }, { + "@type": "PropertyValue", + "name": "wall_time", + "unitCode": "SEC", + "unitText": "s", + "value": 384 + }, { + "@type": "PropertyValue", + "name": "memory_usage", + "unitCode": "E63", + "unitText": "mebibyte", + "value": 1835 + } + ], + "name": "irl" + }, { + "@type": "Dataset", + "distribution": [ + { + "@type": "DataDownload", + "contentUrl": "8a/free_energy_1.csv", + "name": "free energy" + }, { + "@type": "DataDownload", + "contentUrl": "8a/solid_fraction_1.csv", + "name": "solid fraction" + }, { + "@type": "DataDownload", + "contentUrl": "8a/free_energy_2.csv", + "name": "free energy" + }, { + "@type": "DataDownload", + "contentUrl": "8a/solid_fraction_2.csv", + "name": "solid fraction" + }, { + "@type": "DataDownload", + "contentUrl": "8a/free_energy_3.csv", + "name": "free energy" + }, { + "@type": "DataDownload", + "contentUrl": "8a/solid_fraction_3.csv", + "name": "solid fraction" + } + ], + "name": "output" + } + ], + "dateCreated": "2022-10-25T19:25:02+00:00", + "description": "A fake dataset for Benchmark 8a unprepared using FiPy by @tkphd & @wd15", + "isBasedOn": { + "@type": "SoftwareSourceCode", + "codeRepository": "https://github.com/tkphd/fake-pfhub-bm8a", + "description": "Fake benchmark 8a upload with FiPy", + "runtimePlatform": "fipy", + "targetProduct": "amd64", + "version": "9df6603e" + }, + "isPartOf": { + "@type": "Series", + "identifier": "8a", + "name": "Homogeneous Nucleation", + "url": "https://pages.nist.gov/pfhub/benchmarks/benchmark8.ipynb" + }, + "keywords": [ + "phase-field", + "benchmarks", + "pfhub", + "fipy", + "homogeneous-nucleation" + ], + "license": "https://www.nist.gov/open/license#software" +} +``` + + + +[@tkphd]: https://github.com/tkphd +[@wd15]: https://github.com/wd15 +[@DamienPinto]: https://github.com/DamienPinto +[CodeMeta]: https://codemeta.github.io +[CodeMeta Generator]: https://codemeta.github.io/codemeta-generator/ +[FAIR Principles]: https://www.go-fair.org/fair-principles/ +[PFHub]: https://pages.nist.gov/pfhub +[PFHub repository on GitHub]: https://github.com/usnistgov/pfhub +[Schema.org]: https://www.schema.org +[Zenodo]: https://zenodo.org +[fair-phase-field]: https://doi.org/10.5281/zenodo.7254581 +[schemaorg]: https://github.com/openschemas/schemaorg +[structured data schema]: https://en.wikipedia.org/wiki/Data_model From 354326cec3428d0ffce015fd38755dd74f07c2e9 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Tue, 31 Dec 2024 11:55:29 -0500 Subject: [PATCH 02/19] workflow tools overview to data curation section --- .../ch3-data-generation-and-curation.md | 65 +++++++++++++++++++ 1 file changed, 65 insertions(+) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md 
b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index e3018b3..705d38f 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -302,7 +302,72 @@ check-pointing jobs? - https://www.cs.cmu.edu/~bianca/dsn06.pdf +### Using Workflow Tools + +The authors of this article use Snakemake for their workflows so will +discuss this in particular, but most of the ideas will apply to other +workflow tools. In general when running many phase field jobs for a +parameter study or dealing with many pre and post-processing steps, it +is wise to employ a workflow tool such as Snakemake. One of the main +benefits of workflow tools is the automation of all the steps in a +workflow that researchers often neglect to implement in the absence of +a workflow tool (e.g. with bash scripts). This forces a structure and +the researchers to think carefully about the inputs / outputs and task +graph. As a side effect, the graph structure produces a much FAIRer +research object when the research is published and shared and even so +that the researcher can rerun the simulation steps in the future. For +example, when using Snakemake, the `Snakefile` itself is a clear +record of the steps required to re-execute the workflow. Ideally, the +`Snakefile` will include all the steps required to go from the raw +inputs to images and data tables used in publications, but this might +not always be possible. + +A secondary impact of using a workflow tool is that it often imposes a +directory and file structure on the project. For example, Snakemake +has an ideal suggested structure. An example folder structure when +using Snakemake would look like the following. + +```plain +. +├── config +│   └── config.yaml +├── LICENSE.md +├── README.md +├── resources +├── results +│   └── image.png +└── workflow + ├── envs + │   ├── env.yaml + │   ├── flake.lock + │   ├── flake.nix + │   ├── poetry.lock + │   └── pyproject.toml + ├── notebooks + │   └── analysis.ipynb + ├── rules + │   ├── postprocess.smk + │   ├── preprocess.smk + │   └── sim.smk + ├── scripts + │   ├── func.py + │   └── run.py + └── Snakefile +``` + +Notice that the above directory strucuture includes the `envs` +directory. This allows diffferent steps in the workflow to be run in +diffferent types of environments. The benefit of this is that the +steps can be highly hetrogeneous in terms of the required +computational enviornment. Additionally, most workflow tools will +support both HPC and local workstation execution and make porting +between systems easier. + +See https://pmc.ncbi.nlm.nih.gov/articles/PMC8114187/ for a more +details overview off Snakemake and a list of other good workflow +tools. +- https://snakemake.readthedocs.io/en/stable/snakefiles/deployment.html From 2802409903e2b0a9fe73e4100d321ec508835176 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Thu, 2 Jan 2025 11:22:27 -0500 Subject: [PATCH 03/19] add data curation intro --- .../ch3-data-generation-and-curation.md | 50 +++++++++++++++++++ 1 file changed, 50 insertions(+) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 705d38f..b2de8e4 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -369,6 +369,56 @@ tools. 
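To make the Snakemake discussion concrete, here is a minimal sketch of what a
`workflow/Snakefile` for a small parameter sweep might look like, using the
folder layout shown above. The rule names, wildcard values and referenced
files (`run.py`, `func.py`, `env.yaml`) are illustrative only and would need
to be adapted to a real study.

```
# Minimal sketch of workflow/Snakefile for a hypothetical parameter sweep.

PARAMS = ["a", "b", "c"]

rule all:
    input:
        expand("results/{param}/free_energy.csv", param=PARAMS)

rule simulate:
    input:
        "config/config.yaml"
    output:
        "results/{param}/raw_fields.h5"
    conda:
        "envs/env.yaml"
    script:
        "scripts/run.py"

rule postprocess:
    input:
        "results/{param}/raw_fields.h5"
    output:
        "results/{param}/free_energy.csv"
    script:
        "scripts/func.py"
```

Running `snakemake --use-conda --cores 4` from the project root would then
build every target listed in `rule all`, skipping any outputs that are
already up to date.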
- https://snakemake.readthedocs.io/en/stable/snakefiles/deployment.html + +## Data Curation + +Data curation involves the steps required to turn an unstructured data +from a research project into a coherent research data object +satisfying the principles of FAIR data. A robust data curation process +is often a requirement for compliance for funding requirements and to +simply meet the most basic needs of transparency in scientific +research. + +Simulation FAIR data paragraph and importance of metadata + +The fundamental steps to curate a computational research project into +a research data object and publish are as follows. + +- Automate the entire computational workflow where possible during the + research process from initial inputs to final research products such + as images and data tables. +- Publish the code and workflows appropriately during development (see + the ... guide). +- Employ a suitable metadata standard where possible to describe + different aspects of the research project such as the raw data + files, derived data assets, software environments, numerical + algorithms and problems specification. +- Identify the significant raw and derived data assets that are + required to produce the final research products. +- License the research work appropriately. This may require a separate license for the data products as they are generally not archived in the code repository. +- Select a data repository to curate the data +- Obtain a DOI for the data object and link with other research + products + +The above steps are difficult to implement near the conclusion of a +research project. The authors suggest implementing the majority of +these steps at the outset of the project and developing these steps as +part of a protocol for all research projects within a computational +materials research group. + +### Automation + +### Metadata Standards + +### Publish the codes and workflows during development + +### Identifying the significant data assets + +### Licensing + +### Selecting a data repository + + [@tkphd]: https://github.com/tkphd From 26c82661177379f31dc5480f49d7d14f289e8533 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Fri, 3 Jan 2025 12:06:50 -0500 Subject: [PATCH 04/19] various changes to the data generation chapter - update .gitignore to ignore _build subfolder - update README.md to include use of Python's simple web server - Add Nix build - Add access to sphinxcontrib.mermaid to build mermaid diagram - Ensure Overview section of data generation is using citations correctly. - Make Mermaid diagram work correctly - Include draft of automation section - Add Bibliography section --- .gitignore | 1 + README.md | 4 +- flake.lock | 175 ++ flake.nix | 63 + pf-recommended-practices/_config.yml | 2 + .../ch3-data-generation-and-curation.md | 84 +- pf-recommended-practices/references.bib | 84 + poetry.lock | 2721 +++++++++++++++++ pyproject.toml | 18 + 9 files changed, 3127 insertions(+), 25 deletions(-) create mode 100644 flake.lock create mode 100644 flake.nix create mode 100644 poetry.lock create mode 100644 pyproject.toml diff --git a/.gitignore b/.gitignore index 0188677..102c919 100644 --- a/.gitignore +++ b/.gitignore @@ -7,3 +7,4 @@ tmp/ .auctex-auto/ logo.png /_build/ +pf-recommended-practices/_build \ No newline at end of file diff --git a/README.md b/README.md index e7e09a8..142eabb 100644 --- a/README.md +++ b/README.md @@ -14,7 +14,9 @@ If you'd like to develop and/or build the Phase Field Method Recommended Practic 4. 
Run `jupyter-book clean pf-recommended-practices/` to remove any existing builds 5. Run `jupyter-book build pf-recommended-practices/` -A fully-rendered HTML version of the book will be built in `pf-recommended-practices/_build/html/`. +A fully-rendered HTML version of the book will be built in +`pf-recommended-practices/_build/html/`. Render using `python -m +http.server` in the `pf-recommended-practices/_build/html/` directory. ### Hosting the book diff --git a/flake.lock b/flake.lock new file mode 100644 index 0000000..e87e064 --- /dev/null +++ b/flake.lock @@ -0,0 +1,175 @@ +{ + "nodes": { + "flake-utils": { + "inputs": { + "systems": "systems" + }, + "locked": { + "lastModified": 1726560853, + "narHash": "sha256-X6rJYSESBVr3hBoH0WbKE5KvhPU5bloyZ2L4K60/fPQ=", + "owner": "numtide", + "repo": "flake-utils", + "rev": "c1dfcf08411b08f6b8615f7d8971a2bfa81d5e8a", + "type": "github" + }, + "original": { + "owner": "numtide", + "repo": "flake-utils", + "type": "github" + } + }, + "nix-github-actions": { + "inputs": { + "nixpkgs": [ + "poetry2nix", + "nixpkgs" + ] + }, + "locked": { + "lastModified": 1729742964, + "narHash": "sha256-B4mzTcQ0FZHdpeWcpDYPERtyjJd/NIuaQ9+BV1h+MpA=", + "owner": "nix-community", + "repo": "nix-github-actions", + "rev": "e04df33f62cdcf93d73e9a04142464753a16db67", + "type": "github" + }, + "original": { + "owner": "nix-community", + "repo": "nix-github-actions", + "type": "github" + } + }, + "nixpkgs": { + "locked": { + "lastModified": 1735563628, + "narHash": "sha256-OnSAY7XDSx7CtDoqNh8jwVwh4xNL/2HaJxGjryLWzX8=", + "owner": "NixOS", + "repo": "nixpkgs", + "rev": "b134951a4c9f3c995fd7be05f3243f8ecd65d798", + "type": "github" + }, + "original": { + "owner": "NixOS", + "ref": "nixos-24.05", + "repo": "nixpkgs", + "type": "github" + } + }, + "poetry2nix": { + "inputs": { + "flake-utils": "flake-utils", + "nix-github-actions": "nix-github-actions", + "nixpkgs": [ + "nixpkgs" + ], + "systems": "systems_2", + "treefmt-nix": "treefmt-nix" + }, + "locked": { + "lastModified": 1735164664, + "narHash": "sha256-DaWy+vo3c4TQ93tfLjUgcpPaSoDw4qV4t76Y3Mhu84I=", + "owner": "nix-community", + "repo": "poetry2nix", + "rev": "1fb01e90771f762655be7e0e805516cd7fa4d58e", + "type": "github" + }, + "original": { + "owner": "nix-community", + "repo": "poetry2nix", + "type": "github" + } + }, + "root": { + "inputs": { + "nixpkgs": "nixpkgs", + "poetry2nix": "poetry2nix", + "utils": "utils" + } + }, + "systems": { + "locked": { + "lastModified": 1681028828, + "narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=", + "owner": "nix-systems", + "repo": "default", + "rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e", + "type": "github" + }, + "original": { + "owner": "nix-systems", + "repo": "default", + "type": "github" + } + }, + "systems_2": { + "locked": { + "lastModified": 1681028828, + "narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=", + "owner": "nix-systems", + "repo": "default", + "rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e", + "type": "github" + }, + "original": { + "id": "systems", + "type": "indirect" + } + }, + "systems_3": { + "locked": { + "lastModified": 1681028828, + "narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=", + "owner": "nix-systems", + "repo": "default", + "rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e", + "type": "github" + }, + "original": { + "owner": "nix-systems", + "repo": "default", + "type": "github" + } + }, + "treefmt-nix": { + "inputs": { + "nixpkgs": [ + "poetry2nix", + "nixpkgs" + ] + }, + "locked": { + 
"lastModified": 1730120726, + "narHash": "sha256-LqHYIxMrl/1p3/kvm2ir925tZ8DkI0KA10djk8wecSk=", + "owner": "numtide", + "repo": "treefmt-nix", + "rev": "9ef337e492a5555d8e17a51c911ff1f02635be15", + "type": "github" + }, + "original": { + "owner": "numtide", + "repo": "treefmt-nix", + "type": "github" + } + }, + "utils": { + "inputs": { + "systems": "systems_3" + }, + "locked": { + "lastModified": 1731533236, + "narHash": "sha256-l0KFg5HjrsfsO/JpG+r7fRrqm12kzFHyUHqHCVpMMbI=", + "owner": "numtide", + "repo": "flake-utils", + "rev": "11707dc2f618dd54ca8739b309ec4fc024de578b", + "type": "github" + }, + "original": { + "owner": "numtide", + "repo": "flake-utils", + "type": "github" + } + } + }, + "root": "root", + "version": 7 +} diff --git a/flake.nix b/flake.nix new file mode 100644 index 0000000..139bc26 --- /dev/null +++ b/flake.nix @@ -0,0 +1,63 @@ +{ + description = "Environment for Jupyter Book"; + + inputs = { + nixpkgs.url = "github:NixOS/nixpkgs/nixos-24.05"; + utils.url = "github:numtide/flake-utils"; + poetry2nix = { + url = "github:nix-community/poetry2nix"; + inputs.nixpkgs.follows = "nixpkgs"; + }; + }; + + outputs = { self, nixpkgs, utils, poetry2nix}: (utils.lib.eachSystem ["x86_64-linux" ] (system: + let + pkgs = import nixpkgs { + inherit system; + config.cudaSupport = true; + config.allowUnfree = true; + overlays = [ + poetry2nix.overlays.default + ]; + }; + + pypkgs-build-requirements = { + # hbreader = [ "setuptools" ]; + }; + + p2n-overrides = pkgs.poetry2nix.defaultPoetryOverrides.extend (self: super: + builtins.mapAttrs (package: build-requirements: + (builtins.getAttr package super).overridePythonAttrs (old: { + buildInputs = (old.buildInputs or [ ]) ++ (builtins.map (pkg: if builtins.isString pkg then builtins.getAttr pkg super else pkg) build-requirements); + }) + ) pypkgs-build-requirements + ); + + args = { + projectDir = ./.; + preferWheels = true; + overrides = p2n-overrides; + python = pkgs.python313; + }; + env = pkgs.poetry2nix.mkPoetryEnv args; + app = pkgs.poetry2nix.mkPoetryApplication args; + + ruby = pkgs.ruby.withPackages (ps: with ps; [ jekyll kramdown-parser-gfm webrick ]); + in + rec { + ## See https://github.com/nix-community/poetry2nix/issues/1433 + ## It seems like poetry2nix does not seem to install as dev + ## environment + ## devShells.default = env.env; + devShells.default = pkgs.mkShell { + packages = [ env ruby ]; + shellHook = '' + export PYTHONPATH=$PWD + ''; + }; + packages.amdt = app; + packages.default = self.packages.${system}.amdt; + } + ) + ); +} diff --git a/pf-recommended-practices/_config.yml b/pf-recommended-practices/_config.yml index f4a6a5b..db884e6 100644 --- a/pf-recommended-practices/_config.yml +++ b/pf-recommended-practices/_config.yml @@ -45,3 +45,5 @@ parse: sphinx: config: mathjax_path: https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js + extra_extensions: + - sphinxcontrib.mermaid diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index b2de8e4..40a70ab 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -8,35 +8,36 @@ ## Overview + Phase field models are characterized by a form of PDE related to an Eulerian free boundary problem and defined by a diffuse interface. 
Phase field models for practical applications require
sufficient high fidelity to resolve both the macro length scale
related to the application and the micro length scales associated with
-the free boundary. This requires extensive computationally resources
-and generates large volumes of raw field data. This data consists of
-field variables defined frequently across a domain or an interpolation
-function with many Gauss points. Typically, data is stored at
-sufficient temporal frequency to reconstruct the evolution of the
-field variables.
+the free boundary. Performing a useful phase field simulation requires
+extensive computational resources and can generate large volumes of
+raw field data. This data consists of field variables defined
+frequently across a domain or an interpolation function with many
+Gauss points. Typically, data is stored at sufficient temporal
+frequency to reconstruct the evolution of the field variables.

In recent years there have been efforts to embed phase field models
-into ICME-based materials design workflows. However, to leverage
-relevant phase field resources for these workflows a systematic
-approach is required for archiving and accessing data. Furthermore, it
-is often difficult for downstream researchers to find and access raw
-or minimally processed data from phase field studies, before the
-post-processing steps and final publication. In this document, we will
-provide motivation, guidance and a template for packaging and
-publishing FAIR data from phase field studies as well as managing
-unpublished raw data. Following the protocols outlined in this guide
+into ICME-based materials design workflows
+{cite}`TOURRET2022100810`. However, to leverage relevant phase field
+resources for these workflows, a systematic approach is required for
+archiving and accessing data. Furthermore, it is often difficult for
+downstream researchers to find and access raw or minimally processed
+data from phase field studies, before the post-processing steps and
+final publication. In this document, we will provide motivation,
+guidance and a template for packaging and publishing FAIR data from
+phase field studies as well as managing unpublished raw data
+{cite}`Wilkinson2016`. Following the protocols outlined in this guide
will provide downstream researchers with an enhanced capability to use
phase field as part of larger ICME workflows and, in particular, data
-intensive usages such as AI surrogate models. This guide is not a
-complete guide about scientific data, but more of a though provoker so
-phase field practitioners are aware of the fundamental issues before
-embarking on a phase field study.
+intensive usages such as AI surrogate models. This guide serves as a
+primer rather than a detailed reference on scientific data, aiming to
+stimulate thought and ensure that phase field practitioners are aware
+of the key considerations before initiating a phase field study.

## Definitions

@@ -68,10 +69,10 @@ tools and practices. See the [Software Development] section of the
best practices guide for a more detailed discussion of software and
code curation.

-```mermaid
+```{mermaid}
---
title: A Phase Field Workflow
----
+---
flowchart TD
  id1(Input Files\nParameters)
  id1.1(Code)
@@ -377,7 +378,8 @@ from a research project into a coherent research data object satisfying the principles of FAIR data.
A robust data curation process
is often required for compliance with funding requirements and to
simply meet the most basic needs of transparency in scientific
-research.
+research. The main benefits of data curation are outlined by the
+[DCC](https://www.dcc.ac.uk/guidance/how-guides/develop-data-plan#Why%20develop).

Simulation FAIR data paragraph and importance of metadata

The fundamental steps to curate a computational research project into
a research data object and publish it are as follows.

- Automate the entire computational workflow where possible during the
  research process from initial inputs to final research products such
  as images and data tables.
- Publish the code and workflows appropriately during development (see
  the ... guide).
- Employ a suitable metadata standard where possible to describe
  different aspects of the research project such as the raw data
  files, derived data assets, software environments, numerical
  algorithms and problem specification.
- Identify the significant raw and derived data assets that are
  required to produce the final research products.
-- License the research work appropriately. This may require a separate license for the data products as they are generally not archived in the code repository.
+- License the research work appropriately. This may require a separate
+  license for the data products as they are generally not archived in
+  the code repository.
- Select a data repository to curate the data.
- Obtain a DOI for the data object and link it with other research
  products.

The above steps are difficult to implement near the conclusion of a
research project. The authors suggest implementing the majority of
these steps at the outset of the project and developing them as part
of a protocol for all research projects within a computational
materials research group.

### Automation

+Automating workflows in computational materials science is useful for
+many reasons; however, for data curation purposes it provides an
+added benefit. In short, an explicit workflow associated with a
+curated FAIR object is a major way to improve its FAIR quality for
+subsequent researchers. For most workflow tools, the operation script
+outlining the workflow graph is the ultimate form of metadata about
+how the archived data files are used or generated during the
+research. For example, with Snakemake, the `Snakefile` has clearly
+outlined inputs and outputs as well as the procedure associated with
+each input / output pair. In particular, the computational
+environment, command line arguments, and environment variables are
+recorded, as well as the order of execution for each step.
+
+In recent years there have been efforts in the life sciences to
+provide a minimum workflow for independent code execution during the
+peer review process. The [CODECHECK
+initiative](https://doi.org/10.12688/f1000research.51738.2) tries to
+provide a standard for executing workflows and a certification if the
+workflow satisfies basic criteria. These types of efforts will likely
+be adopted within the computational materials science community in the
+coming years, so using automated workflow tools as part of your
+research will greatly benefit this process.
+
+- https://www.sciencedirect.com/science/article/pii/S2666389921001707?via%3Dihub

### Metadata Standards

### Publish the codes and workflows during development

### Identifying the significant data assets

### Licensing

### Selecting a data repository

+## References
+
+```{bibliography}
+:filter: docname in docnames
+```

@@ -434,3 +468,5 @@ materials research group.
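As a sketch of the kind of minimal, flat metadata record that the curation
steps above call for, the short script below writes a machine-readable
sidecar file next to the raw output of a single run. The field names and
values are illustrative assumptions only; they do not follow a formal
metadata standard.

```python
# record_metadata.py: a minimal sketch of capturing run-level metadata
# alongside raw simulation output. All field names and values are
# illustrative placeholders, not a formal standard.
import json
import platform
from datetime import datetime, timezone
from pathlib import Path

run_dir = Path("results/delta_1.0")  # hypothetical output directory
run_dir.mkdir(parents=True, exist_ok=True)

metadata = {
    "title": "Dendritic solidification parameter study, delta = 1.0",
    "creator": "Jane Researcher",              # placeholder name
    "created": datetime.now(timezone.utc).isoformat(),
    "software": "my-phase-field-code 1.2.0",   # hypothetical code and version
    "workflow": "workflow/Snakefile",
    "parameters": {"delta": 1.0, "undercooling": 0.3},
    "raw_data": [str(run_dir / "raw.h5")],
    "hostname": platform.node(),
    "license": "CC-BY-4.0",
}

with open(run_dir / "metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

A record like this is cheap to generate from within the workflow itself, for
example as a final rule in the `Snakefile`, and it makes the archived raw
data far easier for downstream users and reviewers to interpret.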
[fair-phase-field]: https://doi.org/10.5281/zenodo.7254581 [schemaorg]: https://github.com/openschemas/schemaorg [structured data schema]: https://en.wikipedia.org/wiki/Data_model +[FAIR]: https://doi.org/10.1038/sdata.2016.18 + diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib index 5a11773..280a3fd 100644 --- a/pf-recommended-practices/references.bib +++ b/pf-recommended-practices/references.bib @@ -74,3 +74,87 @@ @article{boettinger_phase-field_2002 year = {2002}, pages = {163--194}, } + +@article{TOURRET2022100810, +title = {Phase-field modeling of microstructure evolution: Recent applications, perspectives and challenges}, +journal = {Progress in Materials Science}, +volume = {123}, +pages = {100810}, +year = {2022}, +note = {A Festschrift in Honor of Brian Cantor}, +issn = {0079-6425}, +doi = {https://doi.org/10.1016/j.pmatsci.2021.100810}, +url = {https://www.sciencedirect.com/science/article/pii/S0079642521000347}, +author = {Damien Tourret and Hong Liu and Javier LLorca}, +keywords = {Phase-field, Microstructure evolution, Solid state transformations, Solidification}, +abstract = {We briefly review the state-of-the-art in phase-field modeling of microstructure evolution. The focus is placed on recent applications of phase-field simulations of solid-state microstructure evolution and solidification that have been compared and/or validated with experiments. They show the potential of phase-field modeling to make quantitative predictions of the link between processing and microstructure. Finally, some current challenges in extending the application of phase-field models within the context of integrated computational materials engineering are mentioned.} +} + +@Article{Wilkinson2016, +author={Wilkinson, Mark D. +and Dumontier, Michel +and Aalbersberg, IJsbrand Jan +and Appleton, Gabrielle +and Axton, Myles +and Baak, Arie +and Blomberg, Niklas +and Boiten, Jan-Willem +and da Silva Santos, Luiz Bonino +and Bourne, Philip E. +and Bouwman, Jildau +and Brookes, Anthony J. +and Clark, Tim +and Crosas, Merc{\`e} +and Dillo, Ingrid +and Dumon, Olivier +and Edmunds, Scott +and Evelo, Chris T. +and Finkers, Richard +and Gonzalez-Beltran, Alejandra +and Gray, Alasdair J.G. +and Groth, Paul +and Goble, Carole +and Grethe, Jeffrey S. +and Heringa, Jaap +and 't Hoen, Peter A.C +and Hooft, Rob +and Kuhn, Tobias +and Kok, Ruben +and Kok, Joost +and Lusher, Scott J. +and Martone, Maryann E. +and Mons, Albert +and Packer, Abel L. +and Persson, Bengt +and Rocca-Serra, Philippe +and Roos, Marco +and van Schaik, Rene +and Sansone, Susanna-Assunta +and Schultes, Erik +and Sengstag, Thierry +and Slater, Ted +and Strawn, George +and Swertz, Morris A. +and Thompson, Mark +and van der Lei, Johan +and van Mulligen, Erik +and Velterop, Jan +and Waagmeester, Andra +and Wittenburg, Peter +and Wolstencroft, Katherine +and Zhao, Jun +and Mons, Barend}, +title={The FAIR Guiding Principles for scientific data management and stewardship}, +journal={Scientific Data}, +year={2016}, +month={Mar}, +day={15}, +volume={3}, +number={1}, +pages={160018}, +abstract={There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders---representing academia, industry, funding agencies, and scholarly publishers---have come together to design and jointly endorse a concise and measureable set of principles that we refer to as the FAIR Data Principles. 
The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them, and some exemplar implementations in the community.}, +issn={2052-4463}, +doi={10.1038/sdata.2016.18}, +url={https://doi.org/10.1038/sdata.2016.18} +} + diff --git a/poetry.lock b/poetry.lock new file mode 100644 index 0000000..d676967 --- /dev/null +++ b/poetry.lock @@ -0,0 +1,2721 @@ +# This file is automatically @generated by Poetry 1.8.5 and should not be changed by hand. + +[[package]] +name = "accessible-pygments" +version = "0.0.5" +description = "A collection of accessible pygments styles" +optional = false +python-versions = ">=3.9" +files = [ + {file = "accessible_pygments-0.0.5-py3-none-any.whl", hash = "sha256:88ae3211e68a1d0b011504b2ffc1691feafce124b845bd072ab6f9f66f34d4b7"}, + {file = "accessible_pygments-0.0.5.tar.gz", hash = "sha256:40918d3e6a2b619ad424cb91e556bd3bd8865443d9f22f1dcdf79e33c8046872"}, +] + +[package.dependencies] +pygments = ">=1.5" + +[package.extras] +dev = ["pillow", "pkginfo (>=1.10)", "playwright", "pre-commit", "setuptools", "twine (>=5.0)"] +tests = ["hypothesis", "pytest"] + +[[package]] +name = "alabaster" +version = "0.7.16" +description = "A light, configurable Sphinx theme" +optional = false +python-versions = ">=3.9" +files = [ + {file = "alabaster-0.7.16-py3-none-any.whl", hash = "sha256:b46733c07dce03ae4e150330b975c75737fa60f0a7c591b6c8bf4928a28e2c92"}, + {file = "alabaster-0.7.16.tar.gz", hash = "sha256:75a8b99c28a5dad50dd7f8ccdd447a121ddb3892da9e53d1ca5cca3106d58d65"}, +] + +[[package]] +name = "appnope" +version = "0.1.4" +description = "Disable App Nap on macOS >= 10.9" +optional = false +python-versions = ">=3.6" +files = [ + {file = "appnope-0.1.4-py2.py3-none-any.whl", hash = "sha256:502575ee11cd7a28c0205f379b525beefebab9d161b7c964670864014ed7213c"}, + {file = "appnope-0.1.4.tar.gz", hash = "sha256:1de3860566df9caf38f01f86f65e0e13e379af54f9e4bee1e66b48f2efffd1ee"}, +] + +[[package]] +name = "asttokens" +version = "3.0.0" +description = "Annotate AST trees with source code positions" +optional = false +python-versions = ">=3.8" +files = [ + {file = "asttokens-3.0.0-py3-none-any.whl", hash = "sha256:e3078351a059199dd5138cb1c706e6430c05eff2ff136af5eb4790f9d28932e2"}, + {file = "asttokens-3.0.0.tar.gz", hash = "sha256:0dcd8baa8d62b0c1d118b399b2ddba3c4aff271d0d7a9e0d4c1681c79035bbc7"}, +] + +[package.extras] +astroid = ["astroid (>=2,<4)"] +test = ["astroid (>=2,<4)", "pytest", "pytest-cov", "pytest-xdist"] + +[[package]] +name = "attrs" +version = "24.3.0" +description = "Classes Without Boilerplate" +optional = false +python-versions = ">=3.8" +files = [ + {file = "attrs-24.3.0-py3-none-any.whl", hash = "sha256:ac96cd038792094f438ad1f6ff80837353805ac950cd2aa0e0625ef19850c308"}, + {file = "attrs-24.3.0.tar.gz", hash = "sha256:8f5c07333d543103541ba7be0e2ce16eeee8130cb0b3f9238ab904ce1e85baff"}, +] + +[package.extras] +benchmark = ["cloudpickle", "hypothesis", "mypy (>=1.11.1)", "pympler", "pytest (>=4.3.0)", "pytest-codspeed", "pytest-mypy-plugins", "pytest-xdist[psutil]"] +cov = ["cloudpickle", "coverage[toml] (>=5.3)", "hypothesis", "mypy (>=1.11.1)", "pympler", "pytest (>=4.3.0)", 
"pytest-mypy-plugins", "pytest-xdist[psutil]"] +dev = ["cloudpickle", "hypothesis", "mypy (>=1.11.1)", "pre-commit-uv", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"] +docs = ["cogapp", "furo", "myst-parser", "sphinx", "sphinx-notfound-page", "sphinxcontrib-towncrier", "towncrier (<24.7)"] +tests = ["cloudpickle", "hypothesis", "mypy (>=1.11.1)", "pympler", "pytest (>=4.3.0)", "pytest-mypy-plugins", "pytest-xdist[psutil]"] +tests-mypy = ["mypy (>=1.11.1)", "pytest-mypy-plugins"] + +[[package]] +name = "babel" +version = "2.16.0" +description = "Internationalization utilities" +optional = false +python-versions = ">=3.8" +files = [ + {file = "babel-2.16.0-py3-none-any.whl", hash = "sha256:368b5b98b37c06b7daf6696391c3240c938b37767d4584413e8438c5c435fa8b"}, + {file = "babel-2.16.0.tar.gz", hash = "sha256:d1f3554ca26605fe173f3de0c65f750f5a42f924499bf134de6423582298e316"}, +] + +[package.extras] +dev = ["freezegun (>=1.0,<2.0)", "pytest (>=6.0)", "pytest-cov"] + +[[package]] +name = "beautifulsoup4" +version = "4.12.3" +description = "Screen-scraping library" +optional = false +python-versions = ">=3.6.0" +files = [ + {file = "beautifulsoup4-4.12.3-py3-none-any.whl", hash = "sha256:b80878c9f40111313e55da8ba20bdba06d8fa3969fc68304167741bbf9e082ed"}, + {file = "beautifulsoup4-4.12.3.tar.gz", hash = "sha256:74e3d1928edc070d21748185c46e3fb33490f22f52a3addee9aee0f4f7781051"}, +] + +[package.dependencies] +soupsieve = ">1.2" + +[package.extras] +cchardet = ["cchardet"] +chardet = ["chardet"] +charset-normalizer = ["charset-normalizer"] +html5lib = ["html5lib"] +lxml = ["lxml"] + +[[package]] +name = "certifi" +version = "2024.12.14" +description = "Python package for providing Mozilla's CA Bundle." +optional = false +python-versions = ">=3.6" +files = [ + {file = "certifi-2024.12.14-py3-none-any.whl", hash = "sha256:1275f7a45be9464efc1173084eaa30f866fe2e47d389406136d332ed4967ec56"}, + {file = "certifi-2024.12.14.tar.gz", hash = "sha256:b650d30f370c2b724812bee08008be0c4163b163ddaec3f2546c1caf65f191db"}, +] + +[[package]] +name = "cffi" +version = "1.17.1" +description = "Foreign Function Interface for Python calling C code." 
+optional = false +python-versions = ">=3.8" +files = [ + {file = "cffi-1.17.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:df8b1c11f177bc2313ec4b2d46baec87a5f3e71fc8b45dab2ee7cae86d9aba14"}, + {file = "cffi-1.17.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:8f2cdc858323644ab277e9bb925ad72ae0e67f69e804f4898c070998d50b1a67"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:edae79245293e15384b51f88b00613ba9f7198016a5948b5dddf4917d4d26382"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:45398b671ac6d70e67da8e4224a065cec6a93541bb7aebe1b198a61b58c7b702"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:ad9413ccdeda48c5afdae7e4fa2192157e991ff761e7ab8fdd8926f40b160cc3"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5da5719280082ac6bd9aa7becb3938dc9f9cbd57fac7d2871717b1feb0902ab6"}, + {file = "cffi-1.17.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2bb1a08b8008b281856e5971307cc386a8e9c5b625ac297e853d36da6efe9c17"}, + {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:045d61c734659cc045141be4bae381a41d89b741f795af1dd018bfb532fd0df8"}, + {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:6883e737d7d9e4899a8a695e00ec36bd4e5e4f18fabe0aca0efe0a4b44cdb13e"}, + {file = "cffi-1.17.1-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:6b8b4a92e1c65048ff98cfe1f735ef8f1ceb72e3d5f0c25fdb12087a23da22be"}, + {file = "cffi-1.17.1-cp310-cp310-win32.whl", hash = "sha256:c9c3d058ebabb74db66e431095118094d06abf53284d9c81f27300d0e0d8bc7c"}, + {file = "cffi-1.17.1-cp310-cp310-win_amd64.whl", hash = "sha256:0f048dcf80db46f0098ccac01132761580d28e28bc0f78ae0d58048063317e15"}, + {file = "cffi-1.17.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:a45e3c6913c5b87b3ff120dcdc03f6131fa0065027d0ed7ee6190736a74cd401"}, + {file = "cffi-1.17.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:30c5e0cb5ae493c04c8b42916e52ca38079f1b235c2f8ae5f4527b963c401caf"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f75c7ab1f9e4aca5414ed4d8e5c0e303a34f4421f8a0d47a4d019ceff0ab6af4"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a1ed2dd2972641495a3ec98445e09766f077aee98a1c896dcb4ad0d303628e41"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:46bf43160c1a35f7ec506d254e5c890f3c03648a4dbac12d624e4490a7046cd1"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a24ed04c8ffd54b0729c07cee15a81d964e6fee0e3d4d342a27b020d22959dc6"}, + {file = "cffi-1.17.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:610faea79c43e44c71e1ec53a554553fa22321b65fae24889706c0a84d4ad86d"}, + {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:a9b15d491f3ad5d692e11f6b71f7857e7835eb677955c00cc0aefcd0669adaf6"}, + {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:de2ea4b5833625383e464549fec1bc395c1bdeeb5f25c4a3a82b5a8c756ec22f"}, + {file = "cffi-1.17.1-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:fc48c783f9c87e60831201f2cce7f3b2e4846bf4d8728eabe54d60700b318a0b"}, + {file = "cffi-1.17.1-cp311-cp311-win32.whl", hash 
= "sha256:85a950a4ac9c359340d5963966e3e0a94a676bd6245a4b55bc43949eee26a655"}, + {file = "cffi-1.17.1-cp311-cp311-win_amd64.whl", hash = "sha256:caaf0640ef5f5517f49bc275eca1406b0ffa6aa184892812030f04c2abf589a0"}, + {file = "cffi-1.17.1-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:805b4371bf7197c329fcb3ead37e710d1bca9da5d583f5073b799d5c5bd1eee4"}, + {file = "cffi-1.17.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:733e99bc2df47476e3848417c5a4540522f234dfd4ef3ab7fafdf555b082ec0c"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1257bdabf294dceb59f5e70c64a3e2f462c30c7ad68092d01bbbfb1c16b1ba36"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:da95af8214998d77a98cc14e3a3bd00aa191526343078b530ceb0bd710fb48a5"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d63afe322132c194cf832bfec0dc69a99fb9bb6bbd550f161a49e9e855cc78ff"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f79fc4fc25f1c8698ff97788206bb3c2598949bfe0fef03d299eb1b5356ada99"}, + {file = "cffi-1.17.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b62ce867176a75d03a665bad002af8e6d54644fad99a3c70905c543130e39d93"}, + {file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:386c8bf53c502fff58903061338ce4f4950cbdcb23e2902d86c0f722b786bbe3"}, + {file = "cffi-1.17.1-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:4ceb10419a9adf4460ea14cfd6bc43d08701f0835e979bf821052f1805850fe8"}, + {file = "cffi-1.17.1-cp312-cp312-win32.whl", hash = "sha256:a08d7e755f8ed21095a310a693525137cfe756ce62d066e53f502a83dc550f65"}, + {file = "cffi-1.17.1-cp312-cp312-win_amd64.whl", hash = "sha256:51392eae71afec0d0c8fb1a53b204dbb3bcabcb3c9b807eedf3e1e6ccf2de903"}, + {file = "cffi-1.17.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f3a2b4222ce6b60e2e8b337bb9596923045681d71e5a082783484d845390938e"}, + {file = "cffi-1.17.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:0984a4925a435b1da406122d4d7968dd861c1385afe3b45ba82b750f229811e2"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d01b12eeeb4427d3110de311e1774046ad344f5b1a7403101878976ecd7a10f3"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:706510fe141c86a69c8ddc029c7910003a17353970cff3b904ff0686a5927683"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:de55b766c7aa2e2a3092c51e0483d700341182f08e67c63630d5b6f200bb28e5"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:c59d6e989d07460165cc5ad3c61f9fd8f1b4796eacbd81cee78957842b834af4"}, + {file = "cffi-1.17.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dd398dbc6773384a17fe0d3e7eeb8d1a21c2200473ee6806bb5e6a8e62bb73dd"}, + {file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:3edc8d958eb099c634dace3c7e16560ae474aa3803a5df240542b305d14e14ed"}, + {file = "cffi-1.17.1-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:72e72408cad3d5419375fc87d289076ee319835bdfa2caad331e377589aebba9"}, + {file = "cffi-1.17.1-cp313-cp313-win32.whl", hash = "sha256:e03eab0a8677fa80d646b5ddece1cbeaf556c313dcfac435ba11f107ba117b5d"}, + {file = "cffi-1.17.1-cp313-cp313-win_amd64.whl", 
hash = "sha256:f6a16c31041f09ead72d69f583767292f750d24913dadacf5756b966aacb3f1a"}, + {file = "cffi-1.17.1-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:636062ea65bd0195bc012fea9321aca499c0504409f413dc88af450b57ffd03b"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c7eac2ef9b63c79431bc4b25f1cd649d7f061a28808cbc6c47b534bd789ef964"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e221cf152cff04059d011ee126477f0d9588303eb57e88923578ace7baad17f9"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:31000ec67d4221a71bd3f67df918b1f88f676f1c3b535a7eb473255fdc0b83fc"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:6f17be4345073b0a7b8ea599688f692ac3ef23ce28e5df79c04de519dbc4912c"}, + {file = "cffi-1.17.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0e2b1fac190ae3ebfe37b979cc1ce69c81f4e4fe5746bb401dca63a9062cdaf1"}, + {file = "cffi-1.17.1-cp38-cp38-win32.whl", hash = "sha256:7596d6620d3fa590f677e9ee430df2958d2d6d6de2feeae5b20e82c00b76fbf8"}, + {file = "cffi-1.17.1-cp38-cp38-win_amd64.whl", hash = "sha256:78122be759c3f8a014ce010908ae03364d00a1f81ab5c7f4a7a5120607ea56e1"}, + {file = "cffi-1.17.1-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:b2ab587605f4ba0bf81dc0cb08a41bd1c0a5906bd59243d56bad7668a6fc6c16"}, + {file = "cffi-1.17.1-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:28b16024becceed8c6dfbc75629e27788d8a3f9030691a1dbf9821a128b22c36"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1d599671f396c4723d016dbddb72fe8e0397082b0a77a4fab8028923bec050e8"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ca74b8dbe6e8e8263c0ffd60277de77dcee6c837a3d0881d8c1ead7268c9e576"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f7f5baafcc48261359e14bcd6d9bff6d4b28d9103847c9e136694cb0501aef87"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:98e3969bcff97cae1b2def8ba499ea3d6f31ddfdb7635374834cf89a1a08ecf0"}, + {file = "cffi-1.17.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cdf5ce3acdfd1661132f2a9c19cac174758dc2352bfe37d98aa7512c6b7178b3"}, + {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:9755e4345d1ec879e3849e62222a18c7174d65a6a92d5b346b1863912168b595"}, + {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:f1e22e8c4419538cb197e4dd60acc919d7696e5ef98ee4da4e01d3f8cfa4cc5a"}, + {file = "cffi-1.17.1-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c03e868a0b3bc35839ba98e74211ed2b05d2119be4e8a0f224fba9384f1fe02e"}, + {file = "cffi-1.17.1-cp39-cp39-win32.whl", hash = "sha256:e31ae45bc2e29f6b2abd0de1cc3b9d5205aa847cafaecb8af1476a609a2f6eb7"}, + {file = "cffi-1.17.1-cp39-cp39-win_amd64.whl", hash = "sha256:d016c76bdd850f3c626af19b0542c9677ba156e4ee4fccfdd7848803533ef662"}, + {file = "cffi-1.17.1.tar.gz", hash = "sha256:1c39c6016c32bc48dd54561950ebd6836e1670f2ae46128f67cf49e789c52824"}, +] + +[package.dependencies] +pycparser = "*" + +[[package]] +name = "charset-normalizer" +version = "3.4.1" +description = "The Real First Universal Charset Detector. Open, modern and actively maintained alternative to Chardet." 
+optional = false +python-versions = ">=3.7" +files = [ + {file = "charset_normalizer-3.4.1-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:91b36a978b5ae0ee86c394f5a54d6ef44db1de0815eb43de826d41d21e4af3de"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7461baadb4dc00fd9e0acbe254e3d7d2112e7f92ced2adc96e54ef6501c5f176"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:e218488cd232553829be0664c2292d3af2eeeb94b32bea483cf79ac6a694e037"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:80ed5e856eb7f30115aaf94e4a08114ccc8813e6ed1b5efa74f9f82e8509858f"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b010a7a4fd316c3c484d482922d13044979e78d1861f0e0650423144c616a46a"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4532bff1b8421fd0a320463030c7520f56a79c9024a4e88f01c537316019005a"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d973f03c0cb71c5ed99037b870f2be986c3c05e63622c017ea9816881d2dd247"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:3a3bd0dcd373514dcec91c411ddb9632c0d7d92aed7093b8c3bbb6d69ca74408"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d9c3cdf5390dcd29aa8056d13e8e99526cda0305acc038b96b30352aff5ff2bb"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:2bdfe3ac2e1bbe5b59a1a63721eb3b95fc9b6817ae4a46debbb4e11f6232428d"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:eab677309cdb30d047996b36d34caeda1dc91149e4fdca0b1a039b3f79d9a807"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-win32.whl", hash = "sha256:c0429126cf75e16c4f0ad00ee0eae4242dc652290f940152ca8c75c3a4b6ee8f"}, + {file = "charset_normalizer-3.4.1-cp310-cp310-win_amd64.whl", hash = "sha256:9f0b8b1c6d84c8034a44893aba5e767bf9c7a211e313a9605d9c617d7083829f"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8bfa33f4f2672964266e940dd22a195989ba31669bd84629f05fab3ef4e2d125"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:28bf57629c75e810b6ae989f03c0828d64d6b26a5e205535585f96093e405ed1"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:f08ff5e948271dc7e18a35641d2f11a4cd8dfd5634f55228b691e62b37125eb3"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:234ac59ea147c59ee4da87a0c0f098e9c8d169f4dc2a159ef720f1a61bbe27cd"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fd4ec41f914fa74ad1b8304bbc634b3de73d2a0889bd32076342a573e0779e00"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:eea6ee1db730b3483adf394ea72f808b6e18cf3cb6454b4d86e04fa8c4327a12"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c96836c97b1238e9c9e3fe90844c947d5afbf4f4c92762679acfe19927d81d77"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_i686.whl", 
hash = "sha256:4d86f7aff21ee58f26dcf5ae81a9addbd914115cdebcbb2217e4f0ed8982e146"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:09b5e6733cbd160dcc09589227187e242a30a49ca5cefa5a7edd3f9d19ed53fd"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:5777ee0881f9499ed0f71cc82cf873d9a0ca8af166dfa0af8ec4e675b7df48e6"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:237bdbe6159cff53b4f24f397d43c6336c6b0b42affbe857970cefbb620911c8"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-win32.whl", hash = "sha256:8417cb1f36cc0bc7eaba8ccb0e04d55f0ee52df06df3ad55259b9a323555fc8b"}, + {file = "charset_normalizer-3.4.1-cp311-cp311-win_amd64.whl", hash = "sha256:d7f50a1f8c450f3925cb367d011448c39239bb3eb4117c36a6d354794de4ce76"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:73d94b58ec7fecbc7366247d3b0b10a21681004153238750bb67bd9012414545"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dad3e487649f498dd991eeb901125411559b22e8d7ab25d3aeb1af367df5efd7"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:c30197aa96e8eed02200a83fba2657b4c3acd0f0aa4bdc9f6c1af8e8962e0757"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2369eea1ee4a7610a860d88f268eb39b95cb588acd7235e02fd5a5601773d4fa"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc2722592d8998c870fa4e290c2eec2c1569b87fe58618e67d38b4665dfa680d"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:ffc9202a29ab3920fa812879e95a9e78b2465fd10be7fcbd042899695d75e616"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:804a4d582ba6e5b747c625bf1255e6b1507465494a40a2130978bda7b932c90b"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0f55e69f030f7163dffe9fd0752b32f070566451afe180f99dbeeb81f511ad8d"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:c4c3e6da02df6fa1410a7680bd3f63d4f710232d3139089536310d027950696a"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5df196eb874dae23dcfb968c83d4f8fdccb333330fe1fc278ac5ceeb101003a9"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:e358e64305fe12299a08e08978f51fc21fac060dcfcddd95453eabe5b93ed0e1"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-win32.whl", hash = "sha256:9b23ca7ef998bc739bf6ffc077c2116917eabcc901f88da1b9856b210ef63f35"}, + {file = "charset_normalizer-3.4.1-cp312-cp312-win_amd64.whl", hash = "sha256:6ff8a4a60c227ad87030d76e99cd1698345d4491638dfa6673027c48b3cd395f"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:aabfa34badd18f1da5ec1bc2715cadc8dca465868a4e73a0173466b688f29dda"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22e14b5d70560b8dd51ec22863f370d1e595ac3d024cb8ad7d308b4cd95f8313"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:8436c508b408b82d87dc5f62496973a1805cd46727c34440b0d29d8a2f50a6c9"}, + {file = 
"charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2d074908e1aecee37a7635990b2c6d504cd4766c7bc9fc86d63f9c09af3fa11b"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:955f8851919303c92343d2f66165294848d57e9bba6cf6e3625485a70a038d11"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:44ecbf16649486d4aebafeaa7ec4c9fed8b88101f4dd612dcaf65d5e815f837f"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:0924e81d3d5e70f8126529951dac65c1010cdf117bb75eb02dd12339b57749dd"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:2967f74ad52c3b98de4c3b32e1a44e32975e008a9cd2a8cc8966d6a5218c5cb2"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:c75cb2a3e389853835e84a2d8fb2b81a10645b503eca9bcb98df6b5a43eb8886"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:09b26ae6b1abf0d27570633b2b078a2a20419c99d66fb2823173d73f188ce601"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:fa88b843d6e211393a37219e6a1c1df99d35e8fd90446f1118f4216e307e48cd"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-win32.whl", hash = "sha256:eb8178fe3dba6450a3e024e95ac49ed3400e506fd4e9e5c32d30adda88cbd407"}, + {file = "charset_normalizer-3.4.1-cp313-cp313-win_amd64.whl", hash = "sha256:b1ac5992a838106edb89654e0aebfc24f5848ae2547d22c2c3f66454daa11971"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:f30bf9fd9be89ecb2360c7d94a711f00c09b976258846efe40db3d05828e8089"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:97f68b8d6831127e4787ad15e6757232e14e12060bec17091b85eb1486b91d8d"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7974a0b5ecd505609e3b19742b60cee7aa2aa2fb3151bc917e6e2646d7667dcf"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:fc54db6c8593ef7d4b2a331b58653356cf04f67c960f584edb7c3d8c97e8f39e"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:311f30128d7d333eebd7896965bfcfbd0065f1716ec92bd5638d7748eb6f936a"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:7d053096f67cd1241601111b698f5cad775f97ab25d81567d3f59219b5f1adbd"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_i686.whl", hash = "sha256:807f52c1f798eef6cf26beb819eeb8819b1622ddfeef9d0977a8502d4db6d534"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_ppc64le.whl", hash = "sha256:dccbe65bd2f7f7ec22c4ff99ed56faa1e9f785482b9bbd7c717e26fd723a1d1e"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_s390x.whl", hash = "sha256:2fb9bd477fdea8684f78791a6de97a953c51831ee2981f8e4f583ff3b9d9687e"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:01732659ba9b5b873fc117534143e4feefecf3b2078b0a6a2e925271bb6f4cfa"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-win32.whl", hash = "sha256:7a4f97a081603d2050bfaffdefa5b02a9ec823f8348a572e39032caa8404a487"}, + {file = "charset_normalizer-3.4.1-cp37-cp37m-win_amd64.whl", 
hash = "sha256:7b1bef6280950ee6c177b326508f86cad7ad4dff12454483b51d8b7d673a2c5d"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:ecddf25bee22fe4fe3737a399d0d177d72bc22be6913acfab364b40bce1ba83c"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8c60ca7339acd497a55b0ea5d506b2a2612afb2826560416f6894e8b5770d4a9"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b7b2d86dd06bfc2ade3312a83a5c364c7ec2e3498f8734282c6c3d4b07b346b8"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:dd78cfcda14a1ef52584dbb008f7ac81c1328c0f58184bf9a84c49c605002da6"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6e27f48bcd0957c6d4cb9d6fa6b61d192d0b13d5ef563e5f2ae35feafc0d179c"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:01ad647cdd609225c5350561d084b42ddf732f4eeefe6e678765636791e78b9a"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:619a609aa74ae43d90ed2e89bdd784765de0a25ca761b93e196d938b8fd1dbbd"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_i686.whl", hash = "sha256:89149166622f4db9b4b6a449256291dc87a99ee53151c74cbd82a53c8c2f6ccd"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_ppc64le.whl", hash = "sha256:7709f51f5f7c853f0fb938bcd3bc59cdfdc5203635ffd18bf354f6967ea0f824"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_s390x.whl", hash = "sha256:345b0426edd4e18138d6528aed636de7a9ed169b4aaf9d61a8c19e39d26838ca"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:0907f11d019260cdc3f94fbdb23ff9125f6b5d1039b76003b5b0ac9d6a6c9d5b"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-win32.whl", hash = "sha256:ea0d8d539afa5eb2728aa1932a988a9a7af94f18582ffae4bc10b3fbdad0626e"}, + {file = "charset_normalizer-3.4.1-cp38-cp38-win_amd64.whl", hash = "sha256:329ce159e82018d646c7ac45b01a430369d526569ec08516081727a20e9e4af4"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:b97e690a2118911e39b4042088092771b4ae3fc3aa86518f84b8cf6888dbdb41"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:78baa6d91634dfb69ec52a463534bc0df05dbd546209b79a3880a34487f4b84f"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1a2bc9f351a75ef49d664206d51f8e5ede9da246602dc2d2726837620ea034b2"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:75832c08354f595c760a804588b9357d34ec00ba1c940c15e31e96d902093770"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0af291f4fe114be0280cdd29d533696a77b5b49cfde5467176ecab32353395c4"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:0167ddc8ab6508fe81860a57dd472b2ef4060e8d378f0cc555707126830f2537"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:2a75d49014d118e4198bcee5ee0a6f25856b29b12dbf7cd012791f8a6cc5c496"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_i686.whl", hash = 
"sha256:363e2f92b0f0174b2f8238240a1a30142e3db7b957a5dd5689b0e75fb717cc78"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_ppc64le.whl", hash = "sha256:ab36c8eb7e454e34e60eb55ca5d241a5d18b2c6244f6827a30e451c42410b5f7"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_s390x.whl", hash = "sha256:4c0907b1928a36d5a998d72d64d8eaa7244989f7aaaf947500d3a800c83a3fd6"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:04432ad9479fa40ec0f387795ddad4437a2b50417c69fa275e212933519ff294"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-win32.whl", hash = "sha256:3bed14e9c89dcb10e8f3a29f9ccac4955aebe93c71ae803af79265c9ca5644c5"}, + {file = "charset_normalizer-3.4.1-cp39-cp39-win_amd64.whl", hash = "sha256:49402233c892a461407c512a19435d1ce275543138294f7ef013f0b63d5d3765"}, + {file = "charset_normalizer-3.4.1-py3-none-any.whl", hash = "sha256:d98b1668f06378c6dbefec3b92299716b931cd4e6061f3c875a71ced1780ab85"}, + {file = "charset_normalizer-3.4.1.tar.gz", hash = "sha256:44251f18cd68a75b56585dd00dae26183e102cd5e0f9f1466e6df5da2ed64ea3"}, +] + +[[package]] +name = "click" +version = "8.1.8" +description = "Composable command line interface toolkit" +optional = false +python-versions = ">=3.7" +files = [ + {file = "click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2"}, + {file = "click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a"}, +] + +[package.dependencies] +colorama = {version = "*", markers = "platform_system == \"Windows\""} + +[[package]] +name = "colorama" +version = "0.4.6" +description = "Cross-platform colored terminal text." +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7" +files = [ + {file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"}, + {file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"}, +] + +[[package]] +name = "comm" +version = "0.2.2" +description = "Jupyter Python Comm implementation, for usage in ipykernel, xeus-python etc." 
+optional = false +python-versions = ">=3.8" +files = [ + {file = "comm-0.2.2-py3-none-any.whl", hash = "sha256:e6fb86cb70ff661ee8c9c14e7d36d6de3b4066f1441be4063df9c5009f0a64d3"}, + {file = "comm-0.2.2.tar.gz", hash = "sha256:3fd7a84065306e07bea1773df6eb8282de51ba82f77c72f9c85716ab11fe980e"}, +] + +[package.dependencies] +traitlets = ">=4" + +[package.extras] +test = ["pytest"] + +[[package]] +name = "contourpy" +version = "1.3.1" +description = "Python library for calculating contours of 2D quadrilateral grids" +optional = false +python-versions = ">=3.10" +files = [ + {file = "contourpy-1.3.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:a045f341a77b77e1c5de31e74e966537bba9f3c4099b35bf4c2e3939dd54cdab"}, + {file = "contourpy-1.3.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:500360b77259914f7805af7462e41f9cb7ca92ad38e9f94d6c8641b089338124"}, + {file = "contourpy-1.3.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b2f926efda994cdf3c8d3fdb40b9962f86edbc4457e739277b961eced3d0b4c1"}, + {file = "contourpy-1.3.1-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:adce39d67c0edf383647a3a007de0a45fd1b08dedaa5318404f1a73059c2512b"}, + {file = "contourpy-1.3.1-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:abbb49fb7dac584e5abc6636b7b2a7227111c4f771005853e7d25176daaf8453"}, + {file = "contourpy-1.3.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a0cffcbede75c059f535725c1680dfb17b6ba8753f0c74b14e6a9c68c29d7ea3"}, + {file = "contourpy-1.3.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:ab29962927945d89d9b293eabd0d59aea28d887d4f3be6c22deaefbb938a7277"}, + {file = "contourpy-1.3.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:974d8145f8ca354498005b5b981165b74a195abfae9a8129df3e56771961d595"}, + {file = "contourpy-1.3.1-cp310-cp310-win32.whl", hash = "sha256:ac4578ac281983f63b400f7fe6c101bedc10651650eef012be1ccffcbacf3697"}, + {file = "contourpy-1.3.1-cp310-cp310-win_amd64.whl", hash = "sha256:174e758c66bbc1c8576992cec9599ce8b6672b741b5d336b5c74e35ac382b18e"}, + {file = "contourpy-1.3.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:3e8b974d8db2c5610fb4e76307e265de0edb655ae8169e8b21f41807ccbeec4b"}, + {file = "contourpy-1.3.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:20914c8c973f41456337652a6eeca26d2148aa96dd7ac323b74516988bea89fc"}, + {file = "contourpy-1.3.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:19d40d37c1c3a4961b4619dd9d77b12124a453cc3d02bb31a07d58ef684d3d86"}, + {file = "contourpy-1.3.1-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:113231fe3825ebf6f15eaa8bc1f5b0ddc19d42b733345eae0934cb291beb88b6"}, + {file = "contourpy-1.3.1-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4dbbc03a40f916a8420e420d63e96a1258d3d1b58cbdfd8d1f07b49fcbd38e85"}, + {file = "contourpy-1.3.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a04ecd68acbd77fa2d39723ceca4c3197cb2969633836ced1bea14e219d077c"}, + {file = "contourpy-1.3.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:c414fc1ed8ee1dbd5da626cf3710c6013d3d27456651d156711fa24f24bd1291"}, + {file = "contourpy-1.3.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:31c1b55c1f34f80557d3830d3dd93ba722ce7e33a0b472cba0ec3b6535684d8f"}, + {file = "contourpy-1.3.1-cp311-cp311-win32.whl", hash = "sha256:f611e628ef06670df83fce17805c344710ca5cde01edfdc72751311da8585375"}, + {file = 
"contourpy-1.3.1-cp311-cp311-win_amd64.whl", hash = "sha256:b2bdca22a27e35f16794cf585832e542123296b4687f9fd96822db6bae17bfc9"}, + {file = "contourpy-1.3.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:0ffa84be8e0bd33410b17189f7164c3589c229ce5db85798076a3fa136d0e509"}, + {file = "contourpy-1.3.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:805617228ba7e2cbbfb6c503858e626ab528ac2a32a04a2fe88ffaf6b02c32bc"}, + {file = "contourpy-1.3.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ade08d343436a94e633db932e7e8407fe7de8083967962b46bdfc1b0ced39454"}, + {file = "contourpy-1.3.1-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:47734d7073fb4590b4a40122b35917cd77be5722d80683b249dac1de266aac80"}, + {file = "contourpy-1.3.1-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2ba94a401342fc0f8b948e57d977557fbf4d515f03c67682dd5c6191cb2d16ec"}, + {file = "contourpy-1.3.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:efa874e87e4a647fd2e4f514d5e91c7d493697127beb95e77d2f7561f6905bd9"}, + {file = "contourpy-1.3.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:1bf98051f1045b15c87868dbaea84f92408337d4f81d0e449ee41920ea121d3b"}, + {file = "contourpy-1.3.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:61332c87493b00091423e747ea78200659dc09bdf7fd69edd5e98cef5d3e9a8d"}, + {file = "contourpy-1.3.1-cp312-cp312-win32.whl", hash = "sha256:e914a8cb05ce5c809dd0fe350cfbb4e881bde5e2a38dc04e3afe1b3e58bd158e"}, + {file = "contourpy-1.3.1-cp312-cp312-win_amd64.whl", hash = "sha256:08d9d449a61cf53033612cb368f3a1b26cd7835d9b8cd326647efe43bca7568d"}, + {file = "contourpy-1.3.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:a761d9ccfc5e2ecd1bf05534eda382aa14c3e4f9205ba5b1684ecfe400716ef2"}, + {file = "contourpy-1.3.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:523a8ee12edfa36f6d2a49407f705a6ef4c5098de4f498619787e272de93f2d5"}, + {file = "contourpy-1.3.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:ece6df05e2c41bd46776fbc712e0996f7c94e0d0543af1656956d150c4ca7c81"}, + {file = "contourpy-1.3.1-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:573abb30e0e05bf31ed067d2f82500ecfdaec15627a59d63ea2d95714790f5c2"}, + {file = "contourpy-1.3.1-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:a9fa36448e6a3a1a9a2ba23c02012c43ed88905ec80163f2ffe2421c7192a5d7"}, + {file = "contourpy-1.3.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ea9924d28fc5586bf0b42d15f590b10c224117e74409dd7a0be3b62b74a501c"}, + {file = "contourpy-1.3.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:5b75aa69cb4d6f137b36f7eb2ace9280cfb60c55dc5f61c731fdf6f037f958a3"}, + {file = "contourpy-1.3.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:041b640d4ec01922083645a94bb3b2e777e6b626788f4095cf21abbe266413c1"}, + {file = "contourpy-1.3.1-cp313-cp313-win32.whl", hash = "sha256:36987a15e8ace5f58d4d5da9dca82d498c2bbb28dff6e5d04fbfcc35a9cb3a82"}, + {file = "contourpy-1.3.1-cp313-cp313-win_amd64.whl", hash = "sha256:a7895f46d47671fa7ceec40f31fae721da51ad34bdca0bee83e38870b1f47ffd"}, + {file = "contourpy-1.3.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:9ddeb796389dadcd884c7eb07bd14ef12408aaae358f0e2ae24114d797eede30"}, + {file = "contourpy-1.3.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:19c1555a6801c2f084c7ddc1c6e11f02eb6a6016ca1318dd5452ba3f613a1751"}, + {file = 
"contourpy-1.3.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:841ad858cff65c2c04bf93875e384ccb82b654574a6d7f30453a04f04af71342"}, + {file = "contourpy-1.3.1-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4318af1c925fb9a4fb190559ef3eec206845f63e80fb603d47f2d6d67683901c"}, + {file = "contourpy-1.3.1-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:14c102b0eab282427b662cb590f2e9340a9d91a1c297f48729431f2dcd16e14f"}, + {file = "contourpy-1.3.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:05e806338bfeaa006acbdeba0ad681a10be63b26e1b17317bfac3c5d98f36cda"}, + {file = "contourpy-1.3.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4d76d5993a34ef3df5181ba3c92fabb93f1eaa5729504fb03423fcd9f3177242"}, + {file = "contourpy-1.3.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:89785bb2a1980c1bd87f0cb1517a71cde374776a5f150936b82580ae6ead44a1"}, + {file = "contourpy-1.3.1-cp313-cp313t-win32.whl", hash = "sha256:8eb96e79b9f3dcadbad2a3891672f81cdcab7f95b27f28f1c67d75f045b6b4f1"}, + {file = "contourpy-1.3.1-cp313-cp313t-win_amd64.whl", hash = "sha256:287ccc248c9e0d0566934e7d606201abd74761b5703d804ff3df8935f523d546"}, + {file = "contourpy-1.3.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:b457d6430833cee8e4b8e9b6f07aa1c161e5e0d52e118dc102c8f9bd7dd060d6"}, + {file = "contourpy-1.3.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cb76c1a154b83991a3cbbf0dfeb26ec2833ad56f95540b442c73950af2013750"}, + {file = "contourpy-1.3.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:44a29502ca9c7b5ba389e620d44f2fbe792b1fb5734e8b931ad307071ec58c53"}, + {file = "contourpy-1.3.1.tar.gz", hash = "sha256:dfd97abd83335045a913e3bcc4a09c0ceadbe66580cf573fe961f4a825efa699"}, +] + +[package.dependencies] +numpy = ">=1.23" + +[package.extras] +bokeh = ["bokeh", "selenium"] +docs = ["furo", "sphinx (>=7.2)", "sphinx-copybutton"] +mypy = ["contourpy[bokeh,docs]", "docutils-stubs", "mypy (==1.11.1)", "types-Pillow"] +test = ["Pillow", "contourpy[test-no-images]", "matplotlib"] +test-no-images = ["pytest", "pytest-cov", "pytest-rerunfailures", "pytest-xdist", "wurlitzer"] + +[[package]] +name = "cycler" +version = "0.12.1" +description = "Composable style cycles" +optional = false +python-versions = ">=3.8" +files = [ + {file = "cycler-0.12.1-py3-none-any.whl", hash = "sha256:85cef7cff222d8644161529808465972e51340599459b8ac3ccbac5a854e0d30"}, + {file = "cycler-0.12.1.tar.gz", hash = "sha256:88bb128f02ba341da8ef447245a9e138fae777f6a23943da4540077d3601eb1c"}, +] + +[package.extras] +docs = ["ipython", "matplotlib", "numpydoc", "sphinx"] +tests = ["pytest", "pytest-cov", "pytest-xdist"] + +[[package]] +name = "debugpy" +version = "1.8.11" +description = "An implementation of the Debug Adapter Protocol for Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "debugpy-1.8.11-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:2b26fefc4e31ff85593d68b9022e35e8925714a10ab4858fb1b577a8a48cb8cd"}, + {file = "debugpy-1.8.11-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:61bc8b3b265e6949855300e84dc93d02d7a3a637f2aec6d382afd4ceb9120c9f"}, + {file = "debugpy-1.8.11-cp310-cp310-win32.whl", hash = "sha256:c928bbf47f65288574b78518449edaa46c82572d340e2750889bbf8cd92f3737"}, + {file = "debugpy-1.8.11-cp310-cp310-win_amd64.whl", hash = 
"sha256:8da1db4ca4f22583e834dcabdc7832e56fe16275253ee53ba66627b86e304da1"}, + {file = "debugpy-1.8.11-cp311-cp311-macosx_14_0_universal2.whl", hash = "sha256:85de8474ad53ad546ff1c7c7c89230db215b9b8a02754d41cb5a76f70d0be296"}, + {file = "debugpy-1.8.11-cp311-cp311-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8ffc382e4afa4aee367bf413f55ed17bd91b191dcaf979890af239dda435f2a1"}, + {file = "debugpy-1.8.11-cp311-cp311-win32.whl", hash = "sha256:40499a9979c55f72f4eb2fc38695419546b62594f8af194b879d2a18439c97a9"}, + {file = "debugpy-1.8.11-cp311-cp311-win_amd64.whl", hash = "sha256:987bce16e86efa86f747d5151c54e91b3c1e36acc03ce1ddb50f9d09d16ded0e"}, + {file = "debugpy-1.8.11-cp312-cp312-macosx_14_0_universal2.whl", hash = "sha256:84e511a7545d11683d32cdb8f809ef63fc17ea2a00455cc62d0a4dbb4ed1c308"}, + {file = "debugpy-1.8.11-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ce291a5aca4985d82875d6779f61375e959208cdf09fcec40001e65fb0a54768"}, + {file = "debugpy-1.8.11-cp312-cp312-win32.whl", hash = "sha256:28e45b3f827d3bf2592f3cf7ae63282e859f3259db44ed2b129093ca0ac7940b"}, + {file = "debugpy-1.8.11-cp312-cp312-win_amd64.whl", hash = "sha256:44b1b8e6253bceada11f714acf4309ffb98bfa9ac55e4fce14f9e5d4484287a1"}, + {file = "debugpy-1.8.11-cp313-cp313-macosx_14_0_universal2.whl", hash = "sha256:8988f7163e4381b0da7696f37eec7aca19deb02e500245df68a7159739bbd0d3"}, + {file = "debugpy-1.8.11-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:6c1f6a173d1140e557347419767d2b14ac1c9cd847e0b4c5444c7f3144697e4e"}, + {file = "debugpy-1.8.11-cp313-cp313-win32.whl", hash = "sha256:bb3b15e25891f38da3ca0740271e63ab9db61f41d4d8541745cfc1824252cb28"}, + {file = "debugpy-1.8.11-cp313-cp313-win_amd64.whl", hash = "sha256:d8768edcbeb34da9e11bcb8b5c2e0958d25218df7a6e56adf415ef262cd7b6d1"}, + {file = "debugpy-1.8.11-cp38-cp38-macosx_14_0_x86_64.whl", hash = "sha256:ad7efe588c8f5cf940f40c3de0cd683cc5b76819446abaa50dc0829a30c094db"}, + {file = "debugpy-1.8.11-cp38-cp38-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:189058d03a40103a57144752652b3ab08ff02b7595d0ce1f651b9acc3a3a35a0"}, + {file = "debugpy-1.8.11-cp38-cp38-win32.whl", hash = "sha256:32db46ba45849daed7ccf3f2e26f7a386867b077f39b2a974bb5c4c2c3b0a280"}, + {file = "debugpy-1.8.11-cp38-cp38-win_amd64.whl", hash = "sha256:116bf8342062246ca749013df4f6ea106f23bc159305843491f64672a55af2e5"}, + {file = "debugpy-1.8.11-cp39-cp39-macosx_14_0_x86_64.whl", hash = "sha256:654130ca6ad5de73d978057eaf9e582244ff72d4574b3e106fb8d3d2a0d32458"}, + {file = "debugpy-1.8.11-cp39-cp39-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:23dc34c5e03b0212fa3c49a874df2b8b1b8fda95160bd79c01eb3ab51ea8d851"}, + {file = "debugpy-1.8.11-cp39-cp39-win32.whl", hash = "sha256:52d8a3166c9f2815bfae05f386114b0b2d274456980d41f320299a8d9a5615a7"}, + {file = "debugpy-1.8.11-cp39-cp39-win_amd64.whl", hash = "sha256:52c3cf9ecda273a19cc092961ee34eb9ba8687d67ba34cc7b79a521c1c64c4c0"}, + {file = "debugpy-1.8.11-py2.py3-none-any.whl", hash = "sha256:0e22f846f4211383e6a416d04b4c13ed174d24cc5d43f5fd52e7821d0ebc8920"}, + {file = "debugpy-1.8.11.tar.gz", hash = "sha256:6ad2688b69235c43b020e04fecccdf6a96c8943ca9c2fb340b8adc103c655e57"}, +] + +[[package]] +name = "decorator" +version = "5.1.1" +description = "Decorators for Humans" +optional = 
false +python-versions = ">=3.5" +files = [ + {file = "decorator-5.1.1-py3-none-any.whl", hash = "sha256:b8c3f85900b9dc423225913c5aace94729fe1fa9763b38939a95226f02d37186"}, + {file = "decorator-5.1.1.tar.gz", hash = "sha256:637996211036b6385ef91435e4fae22989472f9d571faba8927ba8253acbc330"}, +] + +[[package]] +name = "docutils" +version = "0.20.1" +description = "Docutils -- Python Documentation Utilities" +optional = false +python-versions = ">=3.7" +files = [ + {file = "docutils-0.20.1-py3-none-any.whl", hash = "sha256:96f387a2c5562db4476f09f13bbab2192e764cac08ebbf3a34a95d9b1e4a59d6"}, + {file = "docutils-0.20.1.tar.gz", hash = "sha256:f08a4e276c3a1583a86dce3e34aba3fe04d02bba2dd51ed16106244e8a923e3b"}, +] + +[[package]] +name = "executing" +version = "2.1.0" +description = "Get the currently executing AST node of a frame, and other information" +optional = false +python-versions = ">=3.8" +files = [ + {file = "executing-2.1.0-py2.py3-none-any.whl", hash = "sha256:8d63781349375b5ebccc3142f4b30350c0cd9c79f921cde38be2be4637e98eaf"}, + {file = "executing-2.1.0.tar.gz", hash = "sha256:8ea27ddd260da8150fa5a708269c4a10e76161e2496ec3e587da9e3c0fe4b9ab"}, +] + +[package.extras] +tests = ["asttokens (>=2.1.0)", "coverage", "coverage-enable-subprocess", "ipython", "littleutils", "pytest", "rich"] + +[[package]] +name = "fastjsonschema" +version = "2.21.1" +description = "Fastest Python implementation of JSON schema" +optional = false +python-versions = "*" +files = [ + {file = "fastjsonschema-2.21.1-py3-none-any.whl", hash = "sha256:c9e5b7e908310918cf494a434eeb31384dd84a98b57a30bcb1f535015b554667"}, + {file = "fastjsonschema-2.21.1.tar.gz", hash = "sha256:794d4f0a58f848961ba16af7b9c85a3e88cd360df008c59aac6fc5ae9323b5d4"}, +] + +[package.extras] +devel = ["colorama", "json-spec", "jsonschema", "pylint", "pytest", "pytest-benchmark", "pytest-cache", "validictory"] + +[[package]] +name = "fonttools" +version = "4.55.3" +description = "Tools to manipulate font files" +optional = false +python-versions = ">=3.8" +files = [ + {file = "fonttools-4.55.3-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:1dcc07934a2165ccdc3a5a608db56fb3c24b609658a5b340aee4ecf3ba679dc0"}, + {file = "fonttools-4.55.3-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:f7d66c15ba875432a2d2fb419523f5d3d347f91f48f57b8b08a2dfc3c39b8a3f"}, + {file = "fonttools-4.55.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27e4ae3592e62eba83cd2c4ccd9462dcfa603ff78e09110680a5444c6925d841"}, + {file = "fonttools-4.55.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:62d65a3022c35e404d19ca14f291c89cc5890032ff04f6c17af0bd1927299674"}, + {file = "fonttools-4.55.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:d342e88764fb201286d185093781bf6628bbe380a913c24adf772d901baa8276"}, + {file = "fonttools-4.55.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:dd68c87a2bfe37c5b33bcda0fba39b65a353876d3b9006fde3adae31f97b3ef5"}, + {file = "fonttools-4.55.3-cp310-cp310-win32.whl", hash = "sha256:1bc7ad24ff98846282eef1cbeac05d013c2154f977a79886bb943015d2b1b261"}, + {file = "fonttools-4.55.3-cp310-cp310-win_amd64.whl", hash = "sha256:b54baf65c52952db65df39fcd4820668d0ef4766c0ccdf32879b77f7c804d5c5"}, + {file = "fonttools-4.55.3-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:8c4491699bad88efe95772543cd49870cf756b019ad56294f6498982408ab03e"}, + {file = "fonttools-4.55.3-cp311-cp311-macosx_10_9_x86_64.whl", hash = 
"sha256:5323a22eabddf4b24f66d26894f1229261021dacd9d29e89f7872dd8c63f0b8b"}, + {file = "fonttools-4.55.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5480673f599ad410695ca2ddef2dfefe9df779a9a5cda89503881e503c9c7d90"}, + {file = "fonttools-4.55.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:da9da6d65cd7aa6b0f806556f4985bcbf603bf0c5c590e61b43aa3e5a0f822d0"}, + {file = "fonttools-4.55.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e894b5bd60d9f473bed7a8f506515549cc194de08064d829464088d23097331b"}, + {file = "fonttools-4.55.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:aee3b57643827e237ff6ec6d28d9ff9766bd8b21e08cd13bff479e13d4b14765"}, + {file = "fonttools-4.55.3-cp311-cp311-win32.whl", hash = "sha256:eb6ca911c4c17eb51853143624d8dc87cdcdf12a711fc38bf5bd21521e79715f"}, + {file = "fonttools-4.55.3-cp311-cp311-win_amd64.whl", hash = "sha256:6314bf82c54c53c71805318fcf6786d986461622dd926d92a465199ff54b1b72"}, + {file = "fonttools-4.55.3-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:f9e736f60f4911061235603a6119e72053073a12c6d7904011df2d8fad2c0e35"}, + {file = "fonttools-4.55.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7a8aa2c5e5b8b3bcb2e4538d929f6589a5c6bdb84fd16e2ed92649fb5454f11c"}, + {file = "fonttools-4.55.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:07f8288aacf0a38d174445fc78377a97fb0b83cfe352a90c9d9c1400571963c7"}, + {file = "fonttools-4.55.3-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b8d5e8916c0970fbc0f6f1bece0063363bb5857a7f170121a4493e31c3db3314"}, + {file = "fonttools-4.55.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:ae3b6600565b2d80b7c05acb8e24d2b26ac407b27a3f2e078229721ba5698427"}, + {file = "fonttools-4.55.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:54153c49913f45065c8d9e6d0c101396725c5621c8aee744719300f79771d75a"}, + {file = "fonttools-4.55.3-cp312-cp312-win32.whl", hash = "sha256:827e95fdbbd3e51f8b459af5ea10ecb4e30af50221ca103bea68218e9615de07"}, + {file = "fonttools-4.55.3-cp312-cp312-win_amd64.whl", hash = "sha256:e6e8766eeeb2de759e862004aa11a9ea3d6f6d5ec710551a88b476192b64fd54"}, + {file = "fonttools-4.55.3-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a430178ad3e650e695167cb53242dae3477b35c95bef6525b074d87493c4bf29"}, + {file = "fonttools-4.55.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:529cef2ce91dc44f8e407cc567fae6e49a1786f2fefefa73a294704c415322a4"}, + {file = "fonttools-4.55.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e75f12c82127486fac2d8bfbf5bf058202f54bf4f158d367e41647b972342ca"}, + {file = "fonttools-4.55.3-cp313-cp313-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:859c358ebf41db18fb72342d3080bce67c02b39e86b9fbcf1610cca14984841b"}, + {file = "fonttools-4.55.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:546565028e244a701f73df6d8dd6be489d01617863ec0c6a42fa25bf45d43048"}, + {file = "fonttools-4.55.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:aca318b77f23523309eec4475d1fbbb00a6b133eb766a8bdc401faba91261abe"}, + {file = "fonttools-4.55.3-cp313-cp313-win32.whl", hash = "sha256:8c5ec45428edaa7022f1c949a632a6f298edc7b481312fc7dc258921e9399628"}, + {file = "fonttools-4.55.3-cp313-cp313-win_amd64.whl", hash = "sha256:11e5de1ee0d95af4ae23c1a138b184b7f06e0b6abacabf1d0db41c90b03d834b"}, + {file = 
"fonttools-4.55.3-cp38-cp38-macosx_10_9_universal2.whl", hash = "sha256:caf8230f3e10f8f5d7593eb6d252a37caf58c480b19a17e250a63dad63834cf3"}, + {file = "fonttools-4.55.3-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:b586ab5b15b6097f2fb71cafa3c98edfd0dba1ad8027229e7b1e204a58b0e09d"}, + {file = "fonttools-4.55.3-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8c2794ded89399cc2169c4d0bf7941247b8d5932b2659e09834adfbb01589aa"}, + {file = "fonttools-4.55.3-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cf4fe7c124aa3f4e4c1940880156e13f2f4d98170d35c749e6b4f119a872551e"}, + {file = "fonttools-4.55.3-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:86721fbc389ef5cc1e2f477019e5069e8e4421e8d9576e9c26f840dbb04678de"}, + {file = "fonttools-4.55.3-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:89bdc5d88bdeec1b15af790810e267e8332d92561dce4f0748c2b95c9bdf3926"}, + {file = "fonttools-4.55.3-cp38-cp38-win32.whl", hash = "sha256:bc5dbb4685e51235ef487e4bd501ddfc49be5aede5e40f4cefcccabc6e60fb4b"}, + {file = "fonttools-4.55.3-cp38-cp38-win_amd64.whl", hash = "sha256:cd70de1a52a8ee2d1877b6293af8a2484ac82514f10b1c67c1c5762d38073e56"}, + {file = "fonttools-4.55.3-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:bdcc9f04b36c6c20978d3f060e5323a43f6222accc4e7fcbef3f428e216d96af"}, + {file = "fonttools-4.55.3-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:c3ca99e0d460eff46e033cd3992a969658c3169ffcd533e0a39c63a38beb6831"}, + {file = "fonttools-4.55.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:22f38464daa6cdb7b6aebd14ab06609328fe1e9705bb0fcc7d1e69de7109ee02"}, + {file = "fonttools-4.55.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ed63959d00b61959b035c7d47f9313c2c1ece090ff63afea702fe86de00dbed4"}, + {file = "fonttools-4.55.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5e8d657cd7326eeaba27de2740e847c6b39dde2f8d7cd7cc56f6aad404ddf0bd"}, + {file = "fonttools-4.55.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:fb594b5a99943042c702c550d5494bdd7577f6ef19b0bc73877c948a63184a32"}, + {file = "fonttools-4.55.3-cp39-cp39-win32.whl", hash = "sha256:dc5294a3d5c84226e3dbba1b6f61d7ad813a8c0238fceea4e09aa04848c3d851"}, + {file = "fonttools-4.55.3-cp39-cp39-win_amd64.whl", hash = "sha256:aedbeb1db64496d098e6be92b2e63b5fac4e53b1b92032dfc6988e1ea9134a4d"}, + {file = "fonttools-4.55.3-py3-none-any.whl", hash = "sha256:f412604ccbeee81b091b420272841e5ec5ef68967a9790e80bffd0e30b8e2977"}, + {file = "fonttools-4.55.3.tar.gz", hash = "sha256:3983313c2a04d6cc1fe9251f8fc647754cf49a61dac6cb1e7249ae67afaafc45"}, +] + +[package.extras] +all = ["brotli (>=1.0.1)", "brotlicffi (>=0.8.0)", "fs (>=2.2.0,<3)", "lxml (>=4.0)", "lz4 (>=1.7.4.2)", "matplotlib", "munkres", "pycairo", "scipy", "skia-pathops (>=0.5.0)", "sympy", "uharfbuzz (>=0.23.0)", "unicodedata2 (>=15.1.0)", "xattr", "zopfli (>=0.1.4)"] +graphite = ["lz4 (>=1.7.4.2)"] +interpolatable = ["munkres", "pycairo", "scipy"] +lxml = ["lxml (>=4.0)"] +pathops = ["skia-pathops (>=0.5.0)"] +plot = ["matplotlib"] +repacker = ["uharfbuzz (>=0.23.0)"] +symfont = ["sympy"] +type1 = ["xattr"] +ufo = ["fs (>=2.2.0,<3)"] +unicode = ["unicodedata2 (>=15.1.0)"] +woff = ["brotli (>=1.0.1)", "brotlicffi (>=0.8.0)", "zopfli (>=0.1.4)"] + +[[package]] +name = "idna" +version = "3.10" +description = "Internationalized Domain Names in Applications (IDNA)" +optional = false +python-versions = ">=3.6" +files = [ + {file = "idna-3.10-py3-none-any.whl", hash = 
"sha256:946d195a0d259cbba61165e88e65941f16e9b36ea6ddb97f00452bae8b1287d3"}, + {file = "idna-3.10.tar.gz", hash = "sha256:12f65c9b470abda6dc35cf8e63cc574b1c52b11df2c86030af0ac09b01b13ea9"}, +] + +[package.extras] +all = ["flake8 (>=7.1.1)", "mypy (>=1.11.2)", "pytest (>=8.3.2)", "ruff (>=0.6.2)"] + +[[package]] +name = "imagesize" +version = "1.4.1" +description = "Getting image size from png/jpeg/jpeg2000/gif file" +optional = false +python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*" +files = [ + {file = "imagesize-1.4.1-py2.py3-none-any.whl", hash = "sha256:0d8d18d08f840c19d0ee7ca1fd82490fdc3729b7ac93f49870406ddde8ef8d8b"}, + {file = "imagesize-1.4.1.tar.gz", hash = "sha256:69150444affb9cb0d5cc5a92b3676f0b2fb7cd9ae39e947a5e11a36b4497cd4a"}, +] + +[[package]] +name = "importlib-metadata" +version = "8.5.0" +description = "Read metadata from Python packages" +optional = false +python-versions = ">=3.8" +files = [ + {file = "importlib_metadata-8.5.0-py3-none-any.whl", hash = "sha256:45e54197d28b7a7f1559e60b95e7c567032b602131fbd588f1497f47880aa68b"}, + {file = "importlib_metadata-8.5.0.tar.gz", hash = "sha256:71522656f0abace1d072b9e5481a48f07c138e00f079c38c8f883823f9c26bd7"}, +] + +[package.dependencies] +zipp = ">=3.20" + +[package.extras] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)"] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=2.2)"] +perf = ["ipython"] +test = ["flufl.flake8", "importlib-resources (>=1.3)", "jaraco.test (>=5.4)", "packaging", "pyfakefs", "pytest (>=6,!=8.1.*)", "pytest-perf (>=0.9.2)"] +type = ["pytest-mypy"] + +[[package]] +name = "ipykernel" +version = "6.29.5" +description = "IPython Kernel for Jupyter" +optional = false +python-versions = ">=3.8" +files = [ + {file = "ipykernel-6.29.5-py3-none-any.whl", hash = "sha256:afdb66ba5aa354b09b91379bac28ae4afebbb30e8b39510c9690afb7a10421b5"}, + {file = "ipykernel-6.29.5.tar.gz", hash = "sha256:f093a22c4a40f8828f8e330a9c297cb93dcab13bd9678ded6de8e5cf81c56215"}, +] + +[package.dependencies] +appnope = {version = "*", markers = "platform_system == \"Darwin\""} +comm = ">=0.1.1" +debugpy = ">=1.6.5" +ipython = ">=7.23.1" +jupyter-client = ">=6.1.12" +jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0" +matplotlib-inline = ">=0.1" +nest-asyncio = "*" +packaging = "*" +psutil = "*" +pyzmq = ">=24" +tornado = ">=6.1" +traitlets = ">=5.4.0" + +[package.extras] +cov = ["coverage[toml]", "curio", "matplotlib", "pytest-cov", "trio"] +docs = ["myst-parser", "pydata-sphinx-theme", "sphinx", "sphinx-autodoc-typehints", "sphinxcontrib-github-alt", "sphinxcontrib-spelling", "trio"] +pyqt5 = ["pyqt5"] +pyside6 = ["pyside6"] +test = ["flaky", "ipyparallel", "pre-commit", "pytest (>=7.0)", "pytest-asyncio (>=0.23.5)", "pytest-cov", "pytest-timeout"] + +[[package]] +name = "ipython" +version = "8.31.0" +description = "IPython: Productive Interactive Computing" +optional = false +python-versions = ">=3.10" +files = [ + {file = "ipython-8.31.0-py3-none-any.whl", hash = "sha256:46ec58f8d3d076a61d128fe517a51eb730e3aaf0c184ea8c17d16e366660c6a6"}, + {file = "ipython-8.31.0.tar.gz", hash = "sha256:b6a2274606bec6166405ff05e54932ed6e5cfecaca1fc05f2cacde7bb074d70b"}, +] + +[package.dependencies] +colorama = {version = "*", markers = "sys_platform == \"win32\""} +decorator = "*" +jedi = ">=0.16" +matplotlib-inline = "*" +pexpect = {version = ">4.3", markers = "sys_platform != \"win32\" and 
sys_platform != \"emscripten\""} +prompt_toolkit = ">=3.0.41,<3.1.0" +pygments = ">=2.4.0" +stack_data = "*" +traitlets = ">=5.13.0" + +[package.extras] +all = ["ipython[black,doc,kernel,matplotlib,nbconvert,nbformat,notebook,parallel,qtconsole]", "ipython[test,test-extra]"] +black = ["black"] +doc = ["docrepr", "exceptiongroup", "intersphinx_registry", "ipykernel", "ipython[test]", "matplotlib", "setuptools (>=18.5)", "sphinx (>=1.3)", "sphinx-rtd-theme", "sphinxcontrib-jquery", "tomli", "typing_extensions"] +kernel = ["ipykernel"] +matplotlib = ["matplotlib"] +nbconvert = ["nbconvert"] +nbformat = ["nbformat"] +notebook = ["ipywidgets", "notebook"] +parallel = ["ipyparallel"] +qtconsole = ["qtconsole"] +test = ["packaging", "pickleshare", "pytest", "pytest-asyncio (<0.22)", "testpath"] +test-extra = ["curio", "ipython[test]", "matplotlib (!=3.2.0)", "nbformat", "numpy (>=1.23)", "pandas", "trio"] + +[[package]] +name = "jedi" +version = "0.19.2" +description = "An autocompletion tool for Python that can be used for text editors." +optional = false +python-versions = ">=3.6" +files = [ + {file = "jedi-0.19.2-py2.py3-none-any.whl", hash = "sha256:a8ef22bde8490f57fe5c7681a3c83cb58874daf72b4784de3cce5b6ef6edb5b9"}, + {file = "jedi-0.19.2.tar.gz", hash = "sha256:4770dc3de41bde3966b02eb84fbcf557fb33cce26ad23da12c742fb50ecb11f0"}, +] + +[package.dependencies] +parso = ">=0.8.4,<0.9.0" + +[package.extras] +docs = ["Jinja2 (==2.11.3)", "MarkupSafe (==1.1.1)", "Pygments (==2.8.1)", "alabaster (==0.7.12)", "babel (==2.9.1)", "chardet (==4.0.0)", "commonmark (==0.8.1)", "docutils (==0.17.1)", "future (==0.18.2)", "idna (==2.10)", "imagesize (==1.2.0)", "mock (==1.0.1)", "packaging (==20.9)", "pyparsing (==2.4.7)", "pytz (==2021.1)", "readthedocs-sphinx-ext (==2.1.4)", "recommonmark (==0.5.0)", "requests (==2.25.1)", "six (==1.15.0)", "snowballstemmer (==2.1.0)", "sphinx (==1.8.5)", "sphinx-rtd-theme (==0.4.3)", "sphinxcontrib-serializinghtml (==1.1.4)", "sphinxcontrib-websupport (==1.2.4)", "urllib3 (==1.26.4)"] +qa = ["flake8 (==5.0.4)", "mypy (==0.971)", "types-setuptools (==67.2.0.1)"] +testing = ["Django", "attrs", "colorama", "docopt", "pytest (<9.0.0)"] + +[[package]] +name = "jinja2" +version = "3.1.5" +description = "A very fast and expressive template engine." 
+optional = false +python-versions = ">=3.7" +files = [ + {file = "jinja2-3.1.5-py3-none-any.whl", hash = "sha256:aba0f4dc9ed8013c424088f68a5c226f7d6097ed89b246d7749c2ec4175c6adb"}, + {file = "jinja2-3.1.5.tar.gz", hash = "sha256:8fefff8dc3034e27bb80d67c671eb8a9bc424c0ef4c0826edbff304cceff43bb"}, +] + +[package.dependencies] +MarkupSafe = ">=2.0" + +[package.extras] +i18n = ["Babel (>=2.7)"] + +[[package]] +name = "jsonschema" +version = "4.23.0" +description = "An implementation of JSON Schema validation for Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "jsonschema-4.23.0-py3-none-any.whl", hash = "sha256:fbadb6f8b144a8f8cf9f0b89ba94501d143e50411a1278633f56a7acf7fd5566"}, + {file = "jsonschema-4.23.0.tar.gz", hash = "sha256:d71497fef26351a33265337fa77ffeb82423f3ea21283cd9467bb03999266bc4"}, +] + +[package.dependencies] +attrs = ">=22.2.0" +jsonschema-specifications = ">=2023.03.6" +referencing = ">=0.28.4" +rpds-py = ">=0.7.1" + +[package.extras] +format = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3987", "uri-template", "webcolors (>=1.11)"] +format-nongpl = ["fqdn", "idna", "isoduration", "jsonpointer (>1.13)", "rfc3339-validator", "rfc3986-validator (>0.1.0)", "uri-template", "webcolors (>=24.6.0)"] + +[[package]] +name = "jsonschema-specifications" +version = "2024.10.1" +description = "The JSON Schema meta-schemas and vocabularies, exposed as a Registry" +optional = false +python-versions = ">=3.9" +files = [ + {file = "jsonschema_specifications-2024.10.1-py3-none-any.whl", hash = "sha256:a09a0680616357d9a0ecf05c12ad234479f549239d0f5b55f3deea67475da9bf"}, + {file = "jsonschema_specifications-2024.10.1.tar.gz", hash = "sha256:0f38b83639958ce1152d02a7f062902c41c8fd20d558b0c34344292d417ae272"}, +] + +[package.dependencies] +referencing = ">=0.31.0" + +[[package]] +name = "jupyter-book" +version = "1.0.3" +description = "Build a book with Jupyter Notebooks and Sphinx." 
+optional = false +python-versions = ">=3.9" +files = [ + {file = "jupyter_book-1.0.3-py3-none-any.whl", hash = "sha256:d99b0c92c62b469abc2b61047c1290923bce460367c6a7e296589cadc726119f"}, + {file = "jupyter_book-1.0.3.tar.gz", hash = "sha256:f05788a76f4d284de9887fa816f4a99a4f16cc5f7c0fd5a56074a3d03bfc3e50"}, +] + +[package.dependencies] +click = ">=7.1,<9" +Jinja2 = "*" +jsonschema = "<5" +linkify-it-py = ">=2,<3" +myst-nb = ">=1,<3" +myst-parser = ">=1,<3" +pyyaml = "*" +sphinx = ">=5,<8" +sphinx-book-theme = ">=1.1.0,<2" +sphinx-comments = "*" +sphinx-copybutton = "*" +sphinx-design = ">=0.5,<1" +sphinx-external-toc = ">=1.0.1,<2" +sphinx-jupyterbook-latex = ">=1,<2" +sphinx-multitoc-numbering = ">=0.1.3,<1" +sphinx-thebe = ">=0.3.1,<1" +sphinx_togglebutton = "*" +sphinxcontrib-bibtex = ">=2.5.0,<3" + +[package.extras] +code-style = ["pre-commit (>=3.1,<4.0)"] +pdfhtml = ["pyppeteer"] +sphinx = ["altair", "bokeh", "folium", "ipywidgets", "jupytext", "matplotlib", "nbclient", "numpy (>=2)", "pandas", "plotly", "sphinx-click", "sphinx-examples", "sphinx-proof", "sphinx_inline_tabs", "sphinxext-rediraffe (>=0.2.3,<0.3.0)", "sympy"] +testing = ["altair", "beautifulsoup4", "beautifulsoup4", "cookiecutter", "coverage", "jupytext", "matplotlib", "numpy (>=2)", "pandas", "pydata-sphinx-theme (>=0.15.3)", "pyppeteer", "pytest (>=6.2.4)", "pytest-cov", "pytest-regressions", "pytest-timeout", "pytest-xdist", "sphinx_click", "sphinx_inline_tabs", "texsoup"] + +[[package]] +name = "jupyter-cache" +version = "1.0.1" +description = "A defined interface for working with a cache of jupyter notebooks." +optional = false +python-versions = ">=3.9" +files = [ + {file = "jupyter_cache-1.0.1-py3-none-any.whl", hash = "sha256:9c3cafd825ba7da8b5830485343091143dff903e4d8c69db9349b728b140abf6"}, + {file = "jupyter_cache-1.0.1.tar.gz", hash = "sha256:16e808eb19e3fb67a223db906e131ea6e01f03aa27f49a7214ce6a5fec186fb9"}, +] + +[package.dependencies] +attrs = "*" +click = "*" +importlib-metadata = "*" +nbclient = ">=0.2" +nbformat = "*" +pyyaml = "*" +sqlalchemy = ">=1.3.12,<3" +tabulate = "*" + +[package.extras] +cli = ["click-log"] +code-style = ["pre-commit (>=2.12)"] +rtd = ["ipykernel", "jupytext", "myst-nb", "nbdime", "sphinx-book-theme", "sphinx-copybutton"] +testing = ["coverage", "ipykernel", "jupytext", "matplotlib", "nbdime", "nbformat (>=5.1)", "numpy", "pandas", "pytest (>=6)", "pytest-cov", "pytest-regressions", "sympy"] + +[[package]] +name = "jupyter-client" +version = "8.6.3" +description = "Jupyter protocol implementation and client libraries" +optional = false +python-versions = ">=3.8" +files = [ + {file = "jupyter_client-8.6.3-py3-none-any.whl", hash = "sha256:e8a19cc986cc45905ac3362915f410f3af85424b4c0905e94fa5f2cb08e8f23f"}, + {file = "jupyter_client-8.6.3.tar.gz", hash = "sha256:35b3a0947c4a6e9d589eb97d7d4cd5e90f910ee73101611f01283732bd6d9419"}, +] + +[package.dependencies] +jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0" +python-dateutil = ">=2.8.2" +pyzmq = ">=23.0" +tornado = ">=6.2" +traitlets = ">=5.3" + +[package.extras] +docs = ["ipykernel", "myst-parser", "pydata-sphinx-theme", "sphinx (>=4)", "sphinx-autodoc-typehints", "sphinxcontrib-github-alt", "sphinxcontrib-spelling"] +test = ["coverage", "ipykernel (>=6.14)", "mypy", "paramiko", "pre-commit", "pytest (<8.2.0)", "pytest-cov", "pytest-jupyter[client] (>=0.4.1)", "pytest-timeout"] + +[[package]] +name = "jupyter-core" +version = "5.7.2" +description = "Jupyter core package. A base package on which Jupyter projects rely." 
+optional = false +python-versions = ">=3.8" +files = [ + {file = "jupyter_core-5.7.2-py3-none-any.whl", hash = "sha256:4f7315d2f6b4bcf2e3e7cb6e46772eba760ae459cd1f59d29eb57b0a01bd7409"}, + {file = "jupyter_core-5.7.2.tar.gz", hash = "sha256:aa5f8d32bbf6b431ac830496da7392035d6f61b4f54872f15c4bd2a9c3f536d9"}, +] + +[package.dependencies] +platformdirs = ">=2.5" +pywin32 = {version = ">=300", markers = "sys_platform == \"win32\" and platform_python_implementation != \"PyPy\""} +traitlets = ">=5.3" + +[package.extras] +docs = ["myst-parser", "pydata-sphinx-theme", "sphinx-autodoc-typehints", "sphinxcontrib-github-alt", "sphinxcontrib-spelling", "traitlets"] +test = ["ipykernel", "pre-commit", "pytest (<8)", "pytest-cov", "pytest-timeout"] + +[[package]] +name = "kiwisolver" +version = "1.4.8" +description = "A fast implementation of the Cassowary constraint solver" +optional = false +python-versions = ">=3.10" +files = [ + {file = "kiwisolver-1.4.8-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:88c6f252f6816a73b1f8c904f7bbe02fd67c09a69f7cb8a0eecdbf5ce78e63db"}, + {file = "kiwisolver-1.4.8-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:c72941acb7b67138f35b879bbe85be0f6c6a70cab78fe3ef6db9c024d9223e5b"}, + {file = "kiwisolver-1.4.8-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:ce2cf1e5688edcb727fdf7cd1bbd0b6416758996826a8be1d958f91880d0809d"}, + {file = "kiwisolver-1.4.8-cp310-cp310-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:c8bf637892dc6e6aad2bc6d4d69d08764166e5e3f69d469e55427b6ac001b19d"}, + {file = "kiwisolver-1.4.8-cp310-cp310-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:034d2c891f76bd3edbdb3ea11140d8510dca675443da7304205a2eaa45d8334c"}, + {file = "kiwisolver-1.4.8-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d47b28d1dfe0793d5e96bce90835e17edf9a499b53969b03c6c47ea5985844c3"}, + {file = "kiwisolver-1.4.8-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eb158fe28ca0c29f2260cca8c43005329ad58452c36f0edf298204de32a9a3ed"}, + {file = "kiwisolver-1.4.8-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d5536185fce131780ebd809f8e623bf4030ce1b161353166c49a3c74c287897f"}, + {file = "kiwisolver-1.4.8-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:369b75d40abedc1da2c1f4de13f3482cb99e3237b38726710f4a793432b1c5ff"}, + {file = "kiwisolver-1.4.8-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:641f2ddf9358c80faa22e22eb4c9f54bd3f0e442e038728f500e3b978d00aa7d"}, + {file = "kiwisolver-1.4.8-cp310-cp310-musllinux_1_2_ppc64le.whl", hash = "sha256:d561d2d8883e0819445cfe58d7ddd673e4015c3c57261d7bdcd3710d0d14005c"}, + {file = "kiwisolver-1.4.8-cp310-cp310-musllinux_1_2_s390x.whl", hash = "sha256:1732e065704b47c9afca7ffa272f845300a4eb959276bf6970dc07265e73b605"}, + {file = "kiwisolver-1.4.8-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:bcb1ebc3547619c3b58a39e2448af089ea2ef44b37988caf432447374941574e"}, + {file = "kiwisolver-1.4.8-cp310-cp310-win_amd64.whl", hash = "sha256:89c107041f7b27844179ea9c85d6da275aa55ecf28413e87624d033cf1f6b751"}, + {file = "kiwisolver-1.4.8-cp310-cp310-win_arm64.whl", hash = "sha256:b5773efa2be9eb9fcf5415ea3ab70fc785d598729fd6057bea38d539ead28271"}, + {file = "kiwisolver-1.4.8-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:a4d3601908c560bdf880f07d94f31d734afd1bb71e96585cace0e38ef44c6d84"}, + {file = "kiwisolver-1.4.8-cp311-cp311-macosx_10_9_x86_64.whl", hash = 
"sha256:856b269c4d28a5c0d5e6c1955ec36ebfd1651ac00e1ce0afa3e28da95293b561"}, + {file = "kiwisolver-1.4.8-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c2b9a96e0f326205af81a15718a9073328df1173a2619a68553decb7097fd5d7"}, + {file = "kiwisolver-1.4.8-cp311-cp311-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c5020c83e8553f770cb3b5fc13faac40f17e0b205bd237aebd21d53d733adb03"}, + {file = "kiwisolver-1.4.8-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:dace81d28c787956bfbfbbfd72fdcef014f37d9b48830829e488fdb32b49d954"}, + {file = "kiwisolver-1.4.8-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:11e1022b524bd48ae56c9b4f9296bce77e15a2e42a502cceba602f804b32bb79"}, + {file = "kiwisolver-1.4.8-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3b9b4d2892fefc886f30301cdd80debd8bb01ecdf165a449eb6e78f79f0fabd6"}, + {file = "kiwisolver-1.4.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3a96c0e790ee875d65e340ab383700e2b4891677b7fcd30a699146f9384a2bb0"}, + {file = "kiwisolver-1.4.8-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:23454ff084b07ac54ca8be535f4174170c1094a4cff78fbae4f73a4bcc0d4dab"}, + {file = "kiwisolver-1.4.8-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:87b287251ad6488e95b4f0b4a79a6d04d3ea35fde6340eb38fbd1ca9cd35bbbc"}, + {file = "kiwisolver-1.4.8-cp311-cp311-musllinux_1_2_ppc64le.whl", hash = "sha256:b21dbe165081142b1232a240fc6383fd32cdd877ca6cc89eab93e5f5883e1c25"}, + {file = "kiwisolver-1.4.8-cp311-cp311-musllinux_1_2_s390x.whl", hash = "sha256:768cade2c2df13db52475bd28d3a3fac8c9eff04b0e9e2fda0f3760f20b3f7fc"}, + {file = "kiwisolver-1.4.8-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:d47cfb2650f0e103d4bf68b0b5804c68da97272c84bb12850d877a95c056bd67"}, + {file = "kiwisolver-1.4.8-cp311-cp311-win_amd64.whl", hash = "sha256:ed33ca2002a779a2e20eeb06aea7721b6e47f2d4b8a8ece979d8ba9e2a167e34"}, + {file = "kiwisolver-1.4.8-cp311-cp311-win_arm64.whl", hash = "sha256:16523b40aab60426ffdebe33ac374457cf62863e330a90a0383639ce14bf44b2"}, + {file = "kiwisolver-1.4.8-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:d6af5e8815fd02997cb6ad9bbed0ee1e60014438ee1a5c2444c96f87b8843502"}, + {file = "kiwisolver-1.4.8-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:bade438f86e21d91e0cf5dd7c0ed00cda0f77c8c1616bd83f9fc157fa6760d31"}, + {file = "kiwisolver-1.4.8-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b83dc6769ddbc57613280118fb4ce3cd08899cc3369f7d0e0fab518a7cf37fdb"}, + {file = "kiwisolver-1.4.8-cp312-cp312-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:111793b232842991be367ed828076b03d96202c19221b5ebab421ce8bcad016f"}, + {file = "kiwisolver-1.4.8-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:257af1622860e51b1a9d0ce387bf5c2c4f36a90594cb9514f55b074bcc787cfc"}, + {file = "kiwisolver-1.4.8-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:69b5637c3f316cab1ec1c9a12b8c5f4750a4c4b71af9157645bf32830e39c03a"}, + {file = "kiwisolver-1.4.8-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:782bb86f245ec18009890e7cb8d13a5ef54dcf2ebe18ed65f795e635a96a1c6a"}, + {file = "kiwisolver-1.4.8-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:cc978a80a0db3a66d25767b03688f1147a69e6237175c0f4ffffaaedf744055a"}, + {file = 
"kiwisolver-1.4.8-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:36dbbfd34838500a31f52c9786990d00150860e46cd5041386f217101350f0d3"}, + {file = "kiwisolver-1.4.8-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:eaa973f1e05131de5ff3569bbba7f5fd07ea0595d3870ed4a526d486fe57fa1b"}, + {file = "kiwisolver-1.4.8-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:a66f60f8d0c87ab7f59b6fb80e642ebb29fec354a4dfad687ca4092ae69d04f4"}, + {file = "kiwisolver-1.4.8-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:858416b7fb777a53f0c59ca08190ce24e9abbd3cffa18886a5781b8e3e26f65d"}, + {file = "kiwisolver-1.4.8-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:085940635c62697391baafaaeabdf3dd7a6c3643577dde337f4d66eba021b2b8"}, + {file = "kiwisolver-1.4.8-cp312-cp312-win_amd64.whl", hash = "sha256:01c3d31902c7db5fb6182832713d3b4122ad9317c2c5877d0539227d96bb2e50"}, + {file = "kiwisolver-1.4.8-cp312-cp312-win_arm64.whl", hash = "sha256:a3c44cb68861de93f0c4a8175fbaa691f0aa22550c331fefef02b618a9dcb476"}, + {file = "kiwisolver-1.4.8-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:1c8ceb754339793c24aee1c9fb2485b5b1f5bb1c2c214ff13368431e51fc9a09"}, + {file = "kiwisolver-1.4.8-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:54a62808ac74b5e55a04a408cda6156f986cefbcf0ada13572696b507cc92fa1"}, + {file = "kiwisolver-1.4.8-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:68269e60ee4929893aad82666821aaacbd455284124817af45c11e50a4b42e3c"}, + {file = "kiwisolver-1.4.8-cp313-cp313-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:34d142fba9c464bc3bbfeff15c96eab0e7310343d6aefb62a79d51421fcc5f1b"}, + {file = "kiwisolver-1.4.8-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3ddc373e0eef45b59197de815b1b28ef89ae3955e7722cc9710fb91cd77b7f47"}, + {file = "kiwisolver-1.4.8-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:77e6f57a20b9bd4e1e2cedda4d0b986ebd0216236f0106e55c28aea3d3d69b16"}, + {file = "kiwisolver-1.4.8-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:08e77738ed7538f036cd1170cbed942ef749137b1311fa2bbe2a7fda2f6bf3cc"}, + {file = "kiwisolver-1.4.8-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a5ce1e481a74b44dd5e92ff03ea0cb371ae7a0268318e202be06c8f04f4f1246"}, + {file = "kiwisolver-1.4.8-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:fc2ace710ba7c1dfd1a3b42530b62b9ceed115f19a1656adefce7b1782a37794"}, + {file = "kiwisolver-1.4.8-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:3452046c37c7692bd52b0e752b87954ef86ee2224e624ef7ce6cb21e8c41cc1b"}, + {file = "kiwisolver-1.4.8-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:7e9a60b50fe8b2ec6f448fe8d81b07e40141bfced7f896309df271a0b92f80f3"}, + {file = "kiwisolver-1.4.8-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:918139571133f366e8362fa4a297aeba86c7816b7ecf0bc79168080e2bd79957"}, + {file = "kiwisolver-1.4.8-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e063ef9f89885a1d68dd8b2e18f5ead48653176d10a0e324e3b0030e3a69adeb"}, + {file = "kiwisolver-1.4.8-cp313-cp313-win_amd64.whl", hash = "sha256:a17b7c4f5b2c51bb68ed379defd608a03954a1845dfed7cc0117f1cc8a9b7fd2"}, + {file = "kiwisolver-1.4.8-cp313-cp313-win_arm64.whl", hash = "sha256:3cd3bc628b25f74aedc6d374d5babf0166a92ff1317f46267f12d2ed54bc1d30"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:370fd2df41660ed4e26b8c9d6bbcad668fbe2560462cba151a721d49e5b6628c"}, + {file = 
"kiwisolver-1.4.8-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:84a2f830d42707de1d191b9490ac186bf7997a9495d4e9072210a1296345f7dc"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:7a3ad337add5148cf51ce0b55642dc551c0b9d6248458a757f98796ca7348712"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:7506488470f41169b86d8c9aeff587293f530a23a23a49d6bc64dab66bedc71e"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2f0121b07b356a22fb0414cec4666bbe36fd6d0d759db3d37228f496ed67c880"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d6d6bd87df62c27d4185de7c511c6248040afae67028a8a22012b010bc7ad062"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:291331973c64bb9cce50bbe871fb2e675c4331dab4f31abe89f175ad7679a4d7"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:893f5525bb92d3d735878ec00f781b2de998333659507d29ea4466208df37bed"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:b47a465040146981dc9db8647981b8cb96366fbc8d452b031e4f8fdffec3f26d"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:99cea8b9dd34ff80c521aef46a1dddb0dcc0283cf18bde6d756f1e6f31772165"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:151dffc4865e5fe6dafce5480fab84f950d14566c480c08a53c663a0020504b6"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:577facaa411c10421314598b50413aa1ebcf5126f704f1e5d72d7e4e9f020d90"}, + {file = "kiwisolver-1.4.8-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:be4816dc51c8a471749d664161b434912eee82f2ea66bd7628bd14583a833e85"}, + {file = "kiwisolver-1.4.8-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:e7a019419b7b510f0f7c9dceff8c5eae2392037eae483a7f9162625233802b0a"}, + {file = "kiwisolver-1.4.8-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:286b18e86682fd2217a48fc6be6b0f20c1d0ed10958d8dc53453ad58d7be0bf8"}, + {file = "kiwisolver-1.4.8-pp310-pypy310_pp73-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4191ee8dfd0be1c3666ccbac178c5a05d5f8d689bbe3fc92f3c4abec817f8fe0"}, + {file = "kiwisolver-1.4.8-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7cd2785b9391f2873ad46088ed7599a6a71e762e1ea33e87514b1a441ed1da1c"}, + {file = "kiwisolver-1.4.8-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c07b29089b7ba090b6f1a669f1411f27221c3662b3a1b7010e67b59bb5a6f10b"}, + {file = "kiwisolver-1.4.8-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:65ea09a5a3faadd59c2ce96dc7bf0f364986a315949dc6374f04396b0d60e09b"}, + {file = "kiwisolver-1.4.8.tar.gz", hash = "sha256:23d5f023bdc8c7e54eb65f03ca5d5bb25b601eac4d7f1a042888a1f45237987e"}, +] + +[[package]] +name = "latexcodec" +version = "3.0.0" +description = "A lexer and codec to work with LaTeX code in Python." 
+optional = false +python-versions = ">=3.7" +files = [ + {file = "latexcodec-3.0.0-py3-none-any.whl", hash = "sha256:6f3477ad5e61a0a99bd31a6a370c34e88733a6bad9c921a3ffcfacada12f41a7"}, + {file = "latexcodec-3.0.0.tar.gz", hash = "sha256:917dc5fe242762cc19d963e6548b42d63a118028cdd3361d62397e3b638b6bc5"}, +] + +[[package]] +name = "linkify-it-py" +version = "2.0.3" +description = "Links recognition library with FULL unicode support." +optional = false +python-versions = ">=3.7" +files = [ + {file = "linkify-it-py-2.0.3.tar.gz", hash = "sha256:68cda27e162e9215c17d786649d1da0021a451bdc436ef9e0fa0ba5234b9b048"}, + {file = "linkify_it_py-2.0.3-py3-none-any.whl", hash = "sha256:6bcbc417b0ac14323382aef5c5192c0075bf8a9d6b41820a2b66371eac6b6d79"}, +] + +[package.dependencies] +uc-micro-py = "*" + +[package.extras] +benchmark = ["pytest", "pytest-benchmark"] +dev = ["black", "flake8", "isort", "pre-commit", "pyproject-flake8"] +doc = ["myst-parser", "sphinx", "sphinx-book-theme"] +test = ["coverage", "pytest", "pytest-cov"] + +[[package]] +name = "markdown-it-py" +version = "3.0.0" +description = "Python port of markdown-it. Markdown parsing, done right!" +optional = false +python-versions = ">=3.8" +files = [ + {file = "markdown-it-py-3.0.0.tar.gz", hash = "sha256:e3f60a94fa066dc52ec76661e37c851cb232d92f9886b15cb560aaada2df8feb"}, + {file = "markdown_it_py-3.0.0-py3-none-any.whl", hash = "sha256:355216845c60bd96232cd8d8c40e8f9765cc86f46880e43a8fd22dc1a1a8cab1"}, +] + +[package.dependencies] +mdurl = ">=0.1,<1.0" + +[package.extras] +benchmarking = ["psutil", "pytest", "pytest-benchmark"] +code-style = ["pre-commit (>=3.0,<4.0)"] +compare = ["commonmark (>=0.9,<1.0)", "markdown (>=3.4,<4.0)", "mistletoe (>=1.0,<2.0)", "mistune (>=2.0,<3.0)", "panflute (>=2.3,<3.0)"] +linkify = ["linkify-it-py (>=1,<3)"] +plugins = ["mdit-py-plugins"] +profiling = ["gprof2dot"] +rtd = ["jupyter_sphinx", "mdit-py-plugins", "myst-parser", "pyyaml", "sphinx", "sphinx-copybutton", "sphinx-design", "sphinx_book_theme"] +testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"] + +[[package]] +name = "markupsafe" +version = "3.0.2" +description = "Safely add untrusted strings to HTML/XML markup." 
+optional = false +python-versions = ">=3.9" +files = [ + {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50"}, + {file = "MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d"}, + {file = "MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225"}, + {file = 
"MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30"}, + {file = "MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1"}, + {file = "MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d"}, + {file = 
"MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6"}, + {file = "MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:eaa0a10b7f72326f1372a713e73c3f739b524b3af41feb43e4921cb529f5929a"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:48032821bbdf20f5799ff537c7ac3d1fba0ba032cfc06194faffa8cda8b560ff"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a9d3f5f0901fdec14d8d2f66ef7d035f2157240a433441719ac9a3fba440b13"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88b49a3b9ff31e19998750c38e030fc7bb937398b1f78cfa599aaef92d693144"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfad01eed2c2e0c01fd0ecd2ef42c492f7f93902e39a42fc9ee1692961443a29"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:1225beacc926f536dc82e45f8a4d68502949dc67eea90eab715dea3a21c1b5f0"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3169b1eefae027567d1ce6ee7cae382c57fe26e82775f460f0b2778beaad66c0"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:eb7972a85c54febfb25b5c4b4f3af4dcc731994c7da0d8a0b4a6eb0640e1d178"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-win32.whl", hash = "sha256:8c4e8c3ce11e1f92f6536ff07154f9d49677ebaaafc32db9db4620bc11ed480f"}, + {file = "MarkupSafe-3.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:6e296a513ca3d94054c2c881cc913116e90fd030ad1c656b3869762b754f5f8a"}, + {file = "markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0"}, +] + +[[package]] +name = "matplotlib" +version = "3.10.0" +description = "Python plotting package" +optional = false +python-versions = ">=3.10" +files = [ + {file = "matplotlib-3.10.0-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:2c5829a5a1dd5a71f0e31e6e8bb449bc0ee9dbfb05ad28fc0c6b55101b3a4be6"}, + {file = "matplotlib-3.10.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:a2a43cbefe22d653ab34bb55d42384ed30f611bcbdea1f8d7f431011a2e1c62e"}, + {file = "matplotlib-3.10.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:607b16c8a73943df110f99ee2e940b8a1cbf9714b65307c040d422558397dac5"}, + {file = "matplotlib-3.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:01d2b19f13aeec2e759414d3bfe19ddfb16b13a1250add08d46d5ff6f9be83c6"}, + {file = "matplotlib-3.10.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:5e6c6461e1fc63df30bf6f80f0b93f5b6784299f721bc28530477acd51bfc3d1"}, + {file 
= "matplotlib-3.10.0-cp310-cp310-win_amd64.whl", hash = "sha256:994c07b9d9fe8d25951e3202a68c17900679274dadfc1248738dcfa1bd40d7f3"}, + {file = "matplotlib-3.10.0-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:fd44fc75522f58612ec4a33958a7e5552562b7705b42ef1b4f8c0818e304a363"}, + {file = "matplotlib-3.10.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:c58a9622d5dbeb668f407f35f4e6bfac34bb9ecdcc81680c04d0258169747997"}, + {file = "matplotlib-3.10.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:845d96568ec873be63f25fa80e9e7fae4be854a66a7e2f0c8ccc99e94a8bd4ef"}, + {file = "matplotlib-3.10.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5439f4c5a3e2e8eab18e2f8c3ef929772fd5641876db71f08127eed95ab64683"}, + {file = "matplotlib-3.10.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4673ff67a36152c48ddeaf1135e74ce0d4bce1bbf836ae40ed39c29edf7e2765"}, + {file = "matplotlib-3.10.0-cp311-cp311-win_amd64.whl", hash = "sha256:7e8632baebb058555ac0cde75db885c61f1212e47723d63921879806b40bec6a"}, + {file = "matplotlib-3.10.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:4659665bc7c9b58f8c00317c3c2a299f7f258eeae5a5d56b4c64226fca2f7c59"}, + {file = "matplotlib-3.10.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:d44cb942af1693cced2604c33a9abcef6205601c445f6d0dc531d813af8a2f5a"}, + {file = "matplotlib-3.10.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a994f29e968ca002b50982b27168addfd65f0105610b6be7fa515ca4b5307c95"}, + {file = "matplotlib-3.10.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9b0558bae37f154fffda54d779a592bc97ca8b4701f1c710055b609a3bac44c8"}, + {file = "matplotlib-3.10.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:503feb23bd8c8acc75541548a1d709c059b7184cde26314896e10a9f14df5f12"}, + {file = "matplotlib-3.10.0-cp312-cp312-win_amd64.whl", hash = "sha256:c40ba2eb08b3f5de88152c2333c58cee7edcead0a2a0d60fcafa116b17117adc"}, + {file = "matplotlib-3.10.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:96f2886f5c1e466f21cc41b70c5a0cd47bfa0015eb2d5793c88ebce658600e25"}, + {file = "matplotlib-3.10.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:12eaf48463b472c3c0f8dbacdbf906e573013df81a0ab82f0616ea4b11281908"}, + {file = "matplotlib-3.10.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2fbbabc82fde51391c4da5006f965e36d86d95f6ee83fb594b279564a4c5d0d2"}, + {file = "matplotlib-3.10.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ad2e15300530c1a94c63cfa546e3b7864bd18ea2901317bae8bbf06a5ade6dcf"}, + {file = "matplotlib-3.10.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:3547d153d70233a8496859097ef0312212e2689cdf8d7ed764441c77604095ae"}, + {file = "matplotlib-3.10.0-cp313-cp313-win_amd64.whl", hash = "sha256:c55b20591ced744aa04e8c3e4b7543ea4d650b6c3c4b208c08a05b4010e8b442"}, + {file = "matplotlib-3.10.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:9ade1003376731a971e398cc4ef38bb83ee8caf0aee46ac6daa4b0506db1fd06"}, + {file = "matplotlib-3.10.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:95b710fea129c76d30be72c3b38f330269363fbc6e570a5dd43580487380b5ff"}, + {file = "matplotlib-3.10.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5cdbaf909887373c3e094b0318d7ff230b2ad9dcb64da7ade654182872ab2593"}, + {file = "matplotlib-3.10.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:d907fddb39f923d011875452ff1eca29a9e7f21722b873e90db32e5d8ddff12e"}, + {file = "matplotlib-3.10.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:3b427392354d10975c1d0f4ee18aa5844640b512d5311ef32efd4dd7db106ede"}, + {file = "matplotlib-3.10.0-cp313-cp313t-win_amd64.whl", hash = "sha256:5fd41b0ec7ee45cd960a8e71aea7c946a28a0b8a4dcee47d2856b2af051f334c"}, + {file = "matplotlib-3.10.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:81713dd0d103b379de4516b861d964b1d789a144103277769238c732229d7f03"}, + {file = "matplotlib-3.10.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:359f87baedb1f836ce307f0e850d12bb5f1936f70d035561f90d41d305fdacea"}, + {file = "matplotlib-3.10.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ae80dc3a4add4665cf2faa90138384a7ffe2a4e37c58d83e115b54287c4f06ef"}, + {file = "matplotlib-3.10.0.tar.gz", hash = "sha256:b886d02a581b96704c9d1ffe55709e49b4d2d52709ccebc4be42db856e511278"}, +] + +[package.dependencies] +contourpy = ">=1.0.1" +cycler = ">=0.10" +fonttools = ">=4.22.0" +kiwisolver = ">=1.3.1" +numpy = ">=1.23" +packaging = ">=20.0" +pillow = ">=8" +pyparsing = ">=2.3.1" +python-dateutil = ">=2.7" + +[package.extras] +dev = ["meson-python (>=0.13.1,<0.17.0)", "pybind11 (>=2.13.2,!=2.13.3)", "setuptools (>=64)", "setuptools_scm (>=7)"] + +[[package]] +name = "matplotlib-inline" +version = "0.1.7" +description = "Inline Matplotlib backend for Jupyter" +optional = false +python-versions = ">=3.8" +files = [ + {file = "matplotlib_inline-0.1.7-py3-none-any.whl", hash = "sha256:df192d39a4ff8f21b1895d72e6a13f5fcc5099f00fa84384e0ea28c2cc0653ca"}, + {file = "matplotlib_inline-0.1.7.tar.gz", hash = "sha256:8423b23ec666be3d16e16b60bdd8ac4e86e840ebd1dd11a30b9f117f2fa0ab90"}, +] + +[package.dependencies] +traitlets = "*" + +[[package]] +name = "mdit-py-plugins" +version = "0.4.2" +description = "Collection of plugins for markdown-it-py" +optional = false +python-versions = ">=3.8" +files = [ + {file = "mdit_py_plugins-0.4.2-py3-none-any.whl", hash = "sha256:0c673c3f889399a33b95e88d2f0d111b4447bdfea7f237dab2d488f459835636"}, + {file = "mdit_py_plugins-0.4.2.tar.gz", hash = "sha256:5f2cd1fdb606ddf152d37ec30e46101a60512bc0e5fa1a7002c36647b09e26b5"}, +] + +[package.dependencies] +markdown-it-py = ">=1.0.0,<4.0.0" + +[package.extras] +code-style = ["pre-commit"] +rtd = ["myst-parser", "sphinx-book-theme"] +testing = ["coverage", "pytest", "pytest-cov", "pytest-regressions"] + +[[package]] +name = "mdurl" +version = "0.1.2" +description = "Markdown URL utilities" +optional = false +python-versions = ">=3.7" +files = [ + {file = "mdurl-0.1.2-py3-none-any.whl", hash = "sha256:84008a41e51615a49fc9966191ff91509e3c40b939176e643fd50a5c2196b8f8"}, + {file = "mdurl-0.1.2.tar.gz", hash = "sha256:bb413d29f5eea38f31dd4754dd7377d4465116fb207585f97bf925588687c1ba"}, +] + +[[package]] +name = "myst-nb" +version = "1.1.2" +description = "A Jupyter Notebook Sphinx reader built on top of the MyST markdown parser." 
+optional = false +python-versions = ">=3.9" +files = [ + {file = "myst_nb-1.1.2-py3-none-any.whl", hash = "sha256:9b7034e5d62640cb6daf03f9ca16ef45d0462fced27944c77aa3f98c7cdcd566"}, + {file = "myst_nb-1.1.2.tar.gz", hash = "sha256:961b4005657029ca89892a4c75edbf0856c54ceaf6172368b46bf7676c1f7700"}, +] + +[package.dependencies] +importlib_metadata = "*" +ipykernel = "*" +ipython = "*" +jupyter-cache = ">=0.5" +myst-parser = ">=1.0.0" +nbclient = "*" +nbformat = ">=5.0" +pyyaml = "*" +sphinx = ">=5" +typing-extensions = "*" + +[package.extras] +code-style = ["pre-commit"] +rtd = ["alabaster", "altair", "bokeh", "coconut (>=1.4.3)", "ipykernel (>=5.5)", "ipywidgets", "jupytext (>=1.11.2)", "matplotlib", "numpy", "pandas", "plotly", "sphinx-book-theme (>=0.3)", "sphinx-copybutton", "sphinx-design", "sphinxcontrib-bibtex", "sympy"] +testing = ["beautifulsoup4", "coverage (>=6.4)", "ipykernel (>=5.5)", "ipython (!=8.1.0)", "ipywidgets (>=8)", "jupytext (>=1.11.2)", "matplotlib (==3.7.*)", "nbdime", "numpy", "pandas", "pyarrow", "pytest", "pytest-cov (>=3)", "pytest-param-files", "pytest-regressions", "sympy (>=1.10.1)"] + +[[package]] +name = "myst-parser" +version = "2.0.0" +description = "An extended [CommonMark](https://spec.commonmark.org/) compliant parser," +optional = false +python-versions = ">=3.8" +files = [ + {file = "myst_parser-2.0.0-py3-none-any.whl", hash = "sha256:7c36344ae39c8e740dad7fdabf5aa6fc4897a813083c6cc9990044eb93656b14"}, + {file = "myst_parser-2.0.0.tar.gz", hash = "sha256:ea929a67a6a0b1683cdbe19b8d2e724cd7643f8aa3e7bb18dd65beac3483bead"}, +] + +[package.dependencies] +docutils = ">=0.16,<0.21" +jinja2 = "*" +markdown-it-py = ">=3.0,<4.0" +mdit-py-plugins = ">=0.4,<1.0" +pyyaml = "*" +sphinx = ">=6,<8" + +[package.extras] +code-style = ["pre-commit (>=3.0,<4.0)"] +linkify = ["linkify-it-py (>=2.0,<3.0)"] +rtd = ["ipython", "pydata-sphinx-theme (==v0.13.0rc4)", "sphinx-autodoc2 (>=0.4.2,<0.5.0)", "sphinx-book-theme (==1.0.0rc2)", "sphinx-copybutton", "sphinx-design2", "sphinx-pyscript", "sphinx-tippy (>=0.3.1)", "sphinx-togglebutton", "sphinxext-opengraph (>=0.8.2,<0.9.0)", "sphinxext-rediraffe (>=0.2.7,<0.3.0)"] +testing = ["beautifulsoup4", "coverage[toml]", "pytest (>=7,<8)", "pytest-cov", "pytest-param-files (>=0.3.4,<0.4.0)", "pytest-regressions", "sphinx-pytest"] +testing-docutils = ["pygments", "pytest (>=7,<8)", "pytest-param-files (>=0.3.4,<0.4.0)"] + +[[package]] +name = "nbclient" +version = "0.10.2" +description = "A client library for executing notebooks. Formerly nbconvert's ExecutePreprocessor." 
+optional = false +python-versions = ">=3.9.0" +files = [ + {file = "nbclient-0.10.2-py3-none-any.whl", hash = "sha256:4ffee11e788b4a27fabeb7955547e4318a5298f34342a4bfd01f2e1faaeadc3d"}, + {file = "nbclient-0.10.2.tar.gz", hash = "sha256:90b7fc6b810630db87a6d0c2250b1f0ab4cf4d3c27a299b0cde78a4ed3fd9193"}, +] + +[package.dependencies] +jupyter-client = ">=6.1.12" +jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0" +nbformat = ">=5.1" +traitlets = ">=5.4" + +[package.extras] +dev = ["pre-commit"] +docs = ["autodoc-traits", "flaky", "ipykernel (>=6.19.3)", "ipython", "ipywidgets", "mock", "moto", "myst-parser", "nbconvert (>=7.1.0)", "pytest (>=7.0,<8)", "pytest-asyncio", "pytest-cov (>=4.0)", "sphinx (>=1.7)", "sphinx-book-theme", "sphinxcontrib-spelling", "testpath", "xmltodict"] +test = ["flaky", "ipykernel (>=6.19.3)", "ipython", "ipywidgets", "nbconvert (>=7.1.0)", "pytest (>=7.0,<8)", "pytest-asyncio", "pytest-cov (>=4.0)", "testpath", "xmltodict"] + +[[package]] +name = "nbformat" +version = "5.10.4" +description = "The Jupyter Notebook format" +optional = false +python-versions = ">=3.8" +files = [ + {file = "nbformat-5.10.4-py3-none-any.whl", hash = "sha256:3b48d6c8fbca4b299bf3982ea7db1af21580e4fec269ad087b9e81588891200b"}, + {file = "nbformat-5.10.4.tar.gz", hash = "sha256:322168b14f937a5d11362988ecac2a4952d3d8e3a2cbeb2319584631226d5b3a"}, +] + +[package.dependencies] +fastjsonschema = ">=2.15" +jsonschema = ">=2.6" +jupyter-core = ">=4.12,<5.0.dev0 || >=5.1.dev0" +traitlets = ">=5.1" + +[package.extras] +docs = ["myst-parser", "pydata-sphinx-theme", "sphinx", "sphinxcontrib-github-alt", "sphinxcontrib-spelling"] +test = ["pep440", "pre-commit", "pytest", "testpath"] + +[[package]] +name = "nest-asyncio" +version = "1.6.0" +description = "Patch asyncio to allow nested event loops" +optional = false +python-versions = ">=3.5" +files = [ + {file = "nest_asyncio-1.6.0-py3-none-any.whl", hash = "sha256:87af6efd6b5e897c81050477ef65c62e2b2f35d51703cae01aff2905b1852e1c"}, + {file = "nest_asyncio-1.6.0.tar.gz", hash = "sha256:6f172d5449aca15afd6c646851f4e31e02c598d553a667e38cafa997cfec55fe"}, +] + +[[package]] +name = "numpy" +version = "2.2.1" +description = "Fundamental package for array computing in Python" +optional = false +python-versions = ">=3.10" +files = [ + {file = "numpy-2.2.1-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:5edb4e4caf751c1518e6a26a83501fda79bff41cc59dac48d70e6d65d4ec4440"}, + {file = "numpy-2.2.1-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:aa3017c40d513ccac9621a2364f939d39e550c542eb2a894b4c8da92b38896ab"}, + {file = "numpy-2.2.1-cp310-cp310-macosx_14_0_arm64.whl", hash = "sha256:61048b4a49b1c93fe13426e04e04fdf5a03f456616f6e98c7576144677598675"}, + {file = "numpy-2.2.1-cp310-cp310-macosx_14_0_x86_64.whl", hash = "sha256:7671dc19c7019103ca44e8d94917eba8534c76133523ca8406822efdd19c9308"}, + {file = "numpy-2.2.1-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4250888bcb96617e00bfa28ac24850a83c9f3a16db471eca2ee1f1714df0f957"}, + {file = "numpy-2.2.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a7746f235c47abc72b102d3bce9977714c2444bdfaea7888d241b4c4bb6a78bf"}, + {file = "numpy-2.2.1-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:059e6a747ae84fce488c3ee397cee7e5f905fd1bda5fb18c66bc41807ff119b2"}, + {file = "numpy-2.2.1-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f62aa6ee4eb43b024b0e5a01cf65a0bb078ef8c395e8713c6e8a12a697144528"}, + {file = "numpy-2.2.1-cp310-cp310-win32.whl", hash = 
"sha256:48fd472630715e1c1c89bf1feab55c29098cb403cc184b4859f9c86d4fcb6a95"}, + {file = "numpy-2.2.1-cp310-cp310-win_amd64.whl", hash = "sha256:b541032178a718c165a49638d28272b771053f628382d5e9d1c93df23ff58dbf"}, + {file = "numpy-2.2.1-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:40f9e544c1c56ba8f1cf7686a8c9b5bb249e665d40d626a23899ba6d5d9e1484"}, + {file = "numpy-2.2.1-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:f9b57eaa3b0cd8db52049ed0330747b0364e899e8a606a624813452b8203d5f7"}, + {file = "numpy-2.2.1-cp311-cp311-macosx_14_0_arm64.whl", hash = "sha256:bc8a37ad5b22c08e2dbd27df2b3ef7e5c0864235805b1e718a235bcb200cf1cb"}, + {file = "numpy-2.2.1-cp311-cp311-macosx_14_0_x86_64.whl", hash = "sha256:9036d6365d13b6cbe8f27a0eaf73ddcc070cae584e5ff94bb45e3e9d729feab5"}, + {file = "numpy-2.2.1-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:51faf345324db860b515d3f364eaa93d0e0551a88d6218a7d61286554d190d73"}, + {file = "numpy-2.2.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:38efc1e56b73cc9b182fe55e56e63b044dd26a72128fd2fbd502f75555d92591"}, + {file = "numpy-2.2.1-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:31b89fa67a8042e96715c68e071a1200c4e172f93b0fbe01a14c0ff3ff820fc8"}, + {file = "numpy-2.2.1-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:4c86e2a209199ead7ee0af65e1d9992d1dce7e1f63c4b9a616500f93820658d0"}, + {file = "numpy-2.2.1-cp311-cp311-win32.whl", hash = "sha256:b34d87e8a3090ea626003f87f9392b3929a7bbf4104a05b6667348b6bd4bf1cd"}, + {file = "numpy-2.2.1-cp311-cp311-win_amd64.whl", hash = "sha256:360137f8fb1b753c5cde3ac388597ad680eccbbbb3865ab65efea062c4a1fd16"}, + {file = "numpy-2.2.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:694f9e921a0c8f252980e85bce61ebbd07ed2b7d4fa72d0e4246f2f8aa6642ab"}, + {file = "numpy-2.2.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:3683a8d166f2692664262fd4900f207791d005fb088d7fdb973cc8d663626faa"}, + {file = "numpy-2.2.1-cp312-cp312-macosx_14_0_arm64.whl", hash = "sha256:780077d95eafc2ccc3ced969db22377b3864e5b9a0ea5eb347cc93b3ea900315"}, + {file = "numpy-2.2.1-cp312-cp312-macosx_14_0_x86_64.whl", hash = "sha256:55ba24ebe208344aa7a00e4482f65742969a039c2acfcb910bc6fcd776eb4355"}, + {file = "numpy-2.2.1-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9b1d07b53b78bf84a96898c1bc139ad7f10fda7423f5fd158fd0f47ec5e01ac7"}, + {file = "numpy-2.2.1-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:5062dc1a4e32a10dc2b8b13cedd58988261416e811c1dc4dbdea4f57eea61b0d"}, + {file = "numpy-2.2.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:fce4f615f8ca31b2e61aa0eb5865a21e14f5629515c9151850aa936c02a1ee51"}, + {file = "numpy-2.2.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:67d4cda6fa6ffa073b08c8372aa5fa767ceb10c9a0587c707505a6d426f4e046"}, + {file = "numpy-2.2.1-cp312-cp312-win32.whl", hash = "sha256:32cb94448be47c500d2c7a95f93e2f21a01f1fd05dd2beea1ccd049bb6001cd2"}, + {file = "numpy-2.2.1-cp312-cp312-win_amd64.whl", hash = "sha256:ba5511d8f31c033a5fcbda22dd5c813630af98c70b2661f2d2c654ae3cdfcfc8"}, + {file = "numpy-2.2.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:f1d09e520217618e76396377c81fba6f290d5f926f50c35f3a5f72b01a0da780"}, + {file = "numpy-2.2.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3ecc47cd7f6ea0336042be87d9e7da378e5c7e9b3c8ad0f7c966f714fc10d821"}, + {file = "numpy-2.2.1-cp313-cp313-macosx_14_0_arm64.whl", hash = 
"sha256:f419290bc8968a46c4933158c91a0012b7a99bb2e465d5ef5293879742f8797e"}, + {file = "numpy-2.2.1-cp313-cp313-macosx_14_0_x86_64.whl", hash = "sha256:5b6c390bfaef8c45a260554888966618328d30e72173697e5cabe6b285fb2348"}, + {file = "numpy-2.2.1-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:526fc406ab991a340744aad7e25251dd47a6720a685fa3331e5c59fef5282a59"}, + {file = "numpy-2.2.1-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f74e6fdeb9a265624ec3a3918430205dff1df7e95a230779746a6af78bc615af"}, + {file = "numpy-2.2.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:53c09385ff0b72ba79d8715683c1168c12e0b6e84fb0372e97553d1ea91efe51"}, + {file = "numpy-2.2.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f3eac17d9ec51be534685ba877b6ab5edc3ab7ec95c8f163e5d7b39859524716"}, + {file = "numpy-2.2.1-cp313-cp313-win32.whl", hash = "sha256:9ad014faa93dbb52c80d8f4d3dcf855865c876c9660cb9bd7553843dd03a4b1e"}, + {file = "numpy-2.2.1-cp313-cp313-win_amd64.whl", hash = "sha256:164a829b6aacf79ca47ba4814b130c4020b202522a93d7bff2202bfb33b61c60"}, + {file = "numpy-2.2.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:4dfda918a13cc4f81e9118dea249e192ab167a0bb1966272d5503e39234d694e"}, + {file = "numpy-2.2.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:733585f9f4b62e9b3528dd1070ec4f52b8acf64215b60a845fa13ebd73cd0712"}, + {file = "numpy-2.2.1-cp313-cp313t-macosx_14_0_arm64.whl", hash = "sha256:89b16a18e7bba224ce5114db863e7029803c179979e1af6ad6a6b11f70545008"}, + {file = "numpy-2.2.1-cp313-cp313t-macosx_14_0_x86_64.whl", hash = "sha256:676f4eebf6b2d430300f1f4f4c2461685f8269f94c89698d832cdf9277f30b84"}, + {file = "numpy-2.2.1-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:27f5cdf9f493b35f7e41e8368e7d7b4bbafaf9660cba53fb21d2cd174ec09631"}, + {file = "numpy-2.2.1-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c1ad395cf254c4fbb5b2132fee391f361a6e8c1adbd28f2cd8e79308a615fe9d"}, + {file = "numpy-2.2.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:08ef779aed40dbc52729d6ffe7dd51df85796a702afbf68a4f4e41fafdc8bda5"}, + {file = "numpy-2.2.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:26c9c4382b19fcfbbed3238a14abf7ff223890ea1936b8890f058e7ba35e8d71"}, + {file = "numpy-2.2.1-cp313-cp313t-win32.whl", hash = "sha256:93cf4e045bae74c90ca833cba583c14b62cb4ba2cba0abd2b141ab52548247e2"}, + {file = "numpy-2.2.1-cp313-cp313t-win_amd64.whl", hash = "sha256:bff7d8ec20f5f42607599f9994770fa65d76edca264a87b5e4ea5629bce12268"}, + {file = "numpy-2.2.1-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:7ba9cc93a91d86365a5d270dee221fdc04fb68d7478e6bf6af650de78a8339e3"}, + {file = "numpy-2.2.1-pp310-pypy310_pp73-macosx_14_0_x86_64.whl", hash = "sha256:3d03883435a19794e41f147612a77a8f56d4e52822337844fff3d4040a142964"}, + {file = "numpy-2.2.1-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4511d9e6071452b944207c8ce46ad2f897307910b402ea5fa975da32e0102800"}, + {file = "numpy-2.2.1-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:5c5cc0cbabe9452038ed984d05ac87910f89370b9242371bd9079cb4af61811e"}, + {file = "numpy-2.2.1.tar.gz", hash = "sha256:45681fd7128c8ad1c379f0ca0776a8b0c6583d2f69889ddac01559dfe4390918"}, +] + +[[package]] +name = "packaging" +version = "24.2" +description = "Core utilities for Python packages" +optional = false +python-versions = ">=3.8" +files = [ + {file = "packaging-24.2-py3-none-any.whl", hash = 
"sha256:09abb1bccd265c01f4a3aa3f7a7db064b36514d2cba19a2f694fe6150451a759"}, + {file = "packaging-24.2.tar.gz", hash = "sha256:c228a6dc5e932d346bc5739379109d49e8853dd8223571c7c5b55260edc0b97f"}, +] + +[[package]] +name = "parso" +version = "0.8.4" +description = "A Python Parser" +optional = false +python-versions = ">=3.6" +files = [ + {file = "parso-0.8.4-py2.py3-none-any.whl", hash = "sha256:a418670a20291dacd2dddc80c377c5c3791378ee1e8d12bffc35420643d43f18"}, + {file = "parso-0.8.4.tar.gz", hash = "sha256:eb3a7b58240fb99099a345571deecc0f9540ea5f4dd2fe14c2a99d6b281ab92d"}, +] + +[package.extras] +qa = ["flake8 (==5.0.4)", "mypy (==0.971)", "types-setuptools (==67.2.0.1)"] +testing = ["docopt", "pytest"] + +[[package]] +name = "pexpect" +version = "4.9.0" +description = "Pexpect allows easy control of interactive console applications." +optional = false +python-versions = "*" +files = [ + {file = "pexpect-4.9.0-py2.py3-none-any.whl", hash = "sha256:7236d1e080e4936be2dc3e326cec0af72acf9212a7e1d060210e70a47e253523"}, + {file = "pexpect-4.9.0.tar.gz", hash = "sha256:ee7d41123f3c9911050ea2c2dac107568dc43b2d3b0c7557a33212c398ead30f"}, +] + +[package.dependencies] +ptyprocess = ">=0.5" + +[[package]] +name = "pillow" +version = "11.1.0" +description = "Python Imaging Library (Fork)" +optional = false +python-versions = ">=3.9" +files = [ + {file = "pillow-11.1.0-cp310-cp310-macosx_10_10_x86_64.whl", hash = "sha256:e1abe69aca89514737465752b4bcaf8016de61b3be1397a8fc260ba33321b3a8"}, + {file = "pillow-11.1.0-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:c640e5a06869c75994624551f45e5506e4256562ead981cce820d5ab39ae2192"}, + {file = "pillow-11.1.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a07dba04c5e22824816b2615ad7a7484432d7f540e6fa86af60d2de57b0fcee2"}, + {file = "pillow-11.1.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e267b0ed063341f3e60acd25c05200df4193e15a4a5807075cd71225a2386e26"}, + {file = "pillow-11.1.0-cp310-cp310-manylinux_2_28_aarch64.whl", hash = "sha256:bd165131fd51697e22421d0e467997ad31621b74bfc0b75956608cb2906dda07"}, + {file = "pillow-11.1.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:abc56501c3fd148d60659aae0af6ddc149660469082859fa7b066a298bde9482"}, + {file = "pillow-11.1.0-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:54ce1c9a16a9561b6d6d8cb30089ab1e5eb66918cb47d457bd996ef34182922e"}, + {file = "pillow-11.1.0-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:73ddde795ee9b06257dac5ad42fcb07f3b9b813f8c1f7f870f402f4dc54b5269"}, + {file = "pillow-11.1.0-cp310-cp310-win32.whl", hash = "sha256:3a5fe20a7b66e8135d7fd617b13272626a28278d0e578c98720d9ba4b2439d49"}, + {file = "pillow-11.1.0-cp310-cp310-win_amd64.whl", hash = "sha256:b6123aa4a59d75f06e9dd3dac5bf8bc9aa383121bb3dd9a7a612e05eabc9961a"}, + {file = "pillow-11.1.0-cp310-cp310-win_arm64.whl", hash = "sha256:a76da0a31da6fcae4210aa94fd779c65c75786bc9af06289cd1c184451ef7a65"}, + {file = "pillow-11.1.0-cp311-cp311-macosx_10_10_x86_64.whl", hash = "sha256:e06695e0326d05b06833b40b7ef477e475d0b1ba3a6d27da1bb48c23209bf457"}, + {file = "pillow-11.1.0-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:96f82000e12f23e4f29346e42702b6ed9a2f2fea34a740dd5ffffcc8c539eb35"}, + {file = "pillow-11.1.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a3cd561ded2cf2bbae44d4605837221b987c216cff94f49dfeed63488bb228d2"}, + {file = "pillow-11.1.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = 
"sha256:f189805c8be5ca5add39e6f899e6ce2ed824e65fb45f3c28cb2841911da19070"}, + {file = "pillow-11.1.0-cp311-cp311-manylinux_2_28_aarch64.whl", hash = "sha256:dd0052e9db3474df30433f83a71b9b23bd9e4ef1de13d92df21a52c0303b8ab6"}, + {file = "pillow-11.1.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:837060a8599b8f5d402e97197d4924f05a2e0d68756998345c829c33186217b1"}, + {file = "pillow-11.1.0-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:aa8dd43daa836b9a8128dbe7d923423e5ad86f50a7a14dc688194b7be5c0dea2"}, + {file = "pillow-11.1.0-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0a2f91f8a8b367e7a57c6e91cd25af510168091fb89ec5146003e424e1558a96"}, + {file = "pillow-11.1.0-cp311-cp311-win32.whl", hash = "sha256:c12fc111ef090845de2bb15009372175d76ac99969bdf31e2ce9b42e4b8cd88f"}, + {file = "pillow-11.1.0-cp311-cp311-win_amd64.whl", hash = "sha256:fbd43429d0d7ed6533b25fc993861b8fd512c42d04514a0dd6337fb3ccf22761"}, + {file = "pillow-11.1.0-cp311-cp311-win_arm64.whl", hash = "sha256:f7955ecf5609dee9442cbface754f2c6e541d9e6eda87fad7f7a989b0bdb9d71"}, + {file = "pillow-11.1.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:2062ffb1d36544d42fcaa277b069c88b01bb7298f4efa06731a7fd6cc290b81a"}, + {file = "pillow-11.1.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:a85b653980faad27e88b141348707ceeef8a1186f75ecc600c395dcac19f385b"}, + {file = "pillow-11.1.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9409c080586d1f683df3f184f20e36fb647f2e0bc3988094d4fd8c9f4eb1b3b3"}, + {file = "pillow-11.1.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7fdadc077553621911f27ce206ffcbec7d3f8d7b50e0da39f10997e8e2bb7f6a"}, + {file = "pillow-11.1.0-cp312-cp312-manylinux_2_28_aarch64.whl", hash = "sha256:93a18841d09bcdd774dcdc308e4537e1f867b3dec059c131fde0327899734aa1"}, + {file = "pillow-11.1.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:9aa9aeddeed452b2f616ff5507459e7bab436916ccb10961c4a382cd3e03f47f"}, + {file = "pillow-11.1.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3cdcdb0b896e981678eee140d882b70092dac83ac1cdf6b3a60e2216a73f2b91"}, + {file = "pillow-11.1.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:36ba10b9cb413e7c7dfa3e189aba252deee0602c86c309799da5a74009ac7a1c"}, + {file = "pillow-11.1.0-cp312-cp312-win32.whl", hash = "sha256:cfd5cd998c2e36a862d0e27b2df63237e67273f2fc78f47445b14e73a810e7e6"}, + {file = "pillow-11.1.0-cp312-cp312-win_amd64.whl", hash = "sha256:a697cd8ba0383bba3d2d3ada02b34ed268cb548b369943cd349007730c92bddf"}, + {file = "pillow-11.1.0-cp312-cp312-win_arm64.whl", hash = "sha256:4dd43a78897793f60766563969442020e90eb7847463eca901e41ba186a7d4a5"}, + {file = "pillow-11.1.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:ae98e14432d458fc3de11a77ccb3ae65ddce70f730e7c76140653048c71bfcbc"}, + {file = "pillow-11.1.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cc1331b6d5a6e144aeb5e626f4375f5b7ae9934ba620c0ac6b3e43d5e683a0f0"}, + {file = "pillow-11.1.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:758e9d4ef15d3560214cddbc97b8ef3ef86ce04d62ddac17ad39ba87e89bd3b1"}, + {file = "pillow-11.1.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:b523466b1a31d0dcef7c5be1f20b942919b62fd6e9a9be199d035509cbefc0ec"}, + {file = "pillow-11.1.0-cp313-cp313-manylinux_2_28_aarch64.whl", hash = "sha256:9044b5e4f7083f209c4e35aa5dd54b1dd5b112b108648f5c902ad586d4f945c5"}, + {file = "pillow-11.1.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = 
"sha256:3764d53e09cdedd91bee65c2527815d315c6b90d7b8b79759cc48d7bf5d4f114"}, + {file = "pillow-11.1.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:31eba6bbdd27dde97b0174ddf0297d7a9c3a507a8a1480e1e60ef914fe23d352"}, + {file = "pillow-11.1.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:b5d658fbd9f0d6eea113aea286b21d3cd4d3fd978157cbf2447a6035916506d3"}, + {file = "pillow-11.1.0-cp313-cp313-win32.whl", hash = "sha256:f86d3a7a9af5d826744fabf4afd15b9dfef44fe69a98541f666f66fbb8d3fef9"}, + {file = "pillow-11.1.0-cp313-cp313-win_amd64.whl", hash = "sha256:593c5fd6be85da83656b93ffcccc2312d2d149d251e98588b14fbc288fd8909c"}, + {file = "pillow-11.1.0-cp313-cp313-win_arm64.whl", hash = "sha256:11633d58b6ee5733bde153a8dafd25e505ea3d32e261accd388827ee987baf65"}, + {file = "pillow-11.1.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:70ca5ef3b3b1c4a0812b5c63c57c23b63e53bc38e758b37a951e5bc466449861"}, + {file = "pillow-11.1.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:8000376f139d4d38d6851eb149b321a52bb8893a88dae8ee7d95840431977081"}, + {file = "pillow-11.1.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9ee85f0696a17dd28fbcfceb59f9510aa71934b483d1f5601d1030c3c8304f3c"}, + {file = "pillow-11.1.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:dd0e081319328928531df7a0e63621caf67652c8464303fd102141b785ef9547"}, + {file = "pillow-11.1.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:e63e4e5081de46517099dc30abe418122f54531a6ae2ebc8680bcd7096860eab"}, + {file = "pillow-11.1.0-cp313-cp313t-win32.whl", hash = "sha256:dda60aa465b861324e65a78c9f5cf0f4bc713e4309f83bc387be158b077963d9"}, + {file = "pillow-11.1.0-cp313-cp313t-win_amd64.whl", hash = "sha256:ad5db5781c774ab9a9b2c4302bbf0c1014960a0a7be63278d13ae6fdf88126fe"}, + {file = "pillow-11.1.0-cp313-cp313t-win_arm64.whl", hash = "sha256:67cd427c68926108778a9005f2a04adbd5e67c442ed21d95389fe1d595458756"}, + {file = "pillow-11.1.0-cp39-cp39-macosx_10_10_x86_64.whl", hash = "sha256:bf902d7413c82a1bfa08b06a070876132a5ae6b2388e2712aab3a7cbc02205c6"}, + {file = "pillow-11.1.0-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:c1eec9d950b6fe688edee07138993e54ee4ae634c51443cfb7c1e7613322718e"}, + {file = "pillow-11.1.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8e275ee4cb11c262bd108ab2081f750db2a1c0b8c12c1897f27b160c8bd57bbc"}, + {file = "pillow-11.1.0-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4db853948ce4e718f2fc775b75c37ba2efb6aaea41a1a5fc57f0af59eee774b2"}, + {file = "pillow-11.1.0-cp39-cp39-manylinux_2_28_aarch64.whl", hash = "sha256:ab8a209b8485d3db694fa97a896d96dd6533d63c22829043fd9de627060beade"}, + {file = "pillow-11.1.0-cp39-cp39-manylinux_2_28_x86_64.whl", hash = "sha256:54251ef02a2309b5eec99d151ebf5c9904b77976c8abdcbce7891ed22df53884"}, + {file = "pillow-11.1.0-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:5bb94705aea800051a743aa4874bb1397d4695fb0583ba5e425ee0328757f196"}, + {file = "pillow-11.1.0-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:89dbdb3e6e9594d512780a5a1c42801879628b38e3efc7038094430844e271d8"}, + {file = "pillow-11.1.0-cp39-cp39-win32.whl", hash = "sha256:e5449ca63da169a2e6068dd0e2fcc8d91f9558aba89ff6d02121ca8ab11e79e5"}, + {file = "pillow-11.1.0-cp39-cp39-win_amd64.whl", hash = "sha256:3362c6ca227e65c54bf71a5f88b3d4565ff1bcbc63ae72c34b07bbb1cc59a43f"}, + {file = "pillow-11.1.0-cp39-cp39-win_arm64.whl", hash = "sha256:b20be51b37a75cc54c2c55def3fa2c65bb94ba859dde241cd0a4fd302de5ae0a"}, + {file = 
"pillow-11.1.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:8c730dc3a83e5ac137fbc92dfcfe1511ce3b2b5d7578315b63dbbb76f7f51d90"}, + {file = "pillow-11.1.0-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:7d33d2fae0e8b170b6a6c57400e077412240f6f5bb2a342cf1ee512a787942bb"}, + {file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a8d65b38173085f24bc07f8b6c505cbb7418009fa1a1fcb111b1f4961814a442"}, + {file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:015c6e863faa4779251436db398ae75051469f7c903b043a48f078e437656f83"}, + {file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_28_aarch64.whl", hash = "sha256:d44ff19eea13ae4acdaaab0179fa68c0c6f2f45d66a4d8ec1eda7d6cecbcc15f"}, + {file = "pillow-11.1.0-pp310-pypy310_pp73-manylinux_2_28_x86_64.whl", hash = "sha256:d3d8da4a631471dfaf94c10c85f5277b1f8e42ac42bade1ac67da4b4a7359b73"}, + {file = "pillow-11.1.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:4637b88343166249fe8aa94e7c4a62a180c4b3898283bb5d3d2fd5fe10d8e4e0"}, + {file = "pillow-11.1.0.tar.gz", hash = "sha256:368da70808b36d73b4b390a8ffac11069f8a5c85f29eff1f1b01bcf3ef5b2a20"}, +] + +[package.extras] +docs = ["furo", "olefile", "sphinx (>=8.1)", "sphinx-copybutton", "sphinx-inline-tabs", "sphinxext-opengraph"] +fpx = ["olefile"] +mic = ["olefile"] +tests = ["check-manifest", "coverage (>=7.4.2)", "defusedxml", "markdown2", "olefile", "packaging", "pyroma", "pytest", "pytest-cov", "pytest-timeout", "trove-classifiers (>=2024.10.12)"] +typing = ["typing-extensions"] +xmp = ["defusedxml"] + +[[package]] +name = "pip" +version = "24.3.1" +description = "The PyPA recommended tool for installing Python packages." +optional = false +python-versions = ">=3.8" +files = [ + {file = "pip-24.3.1-py3-none-any.whl", hash = "sha256:3790624780082365f47549d032f3770eeb2b1e8bd1f7b2e02dace1afa361b4ed"}, + {file = "pip-24.3.1.tar.gz", hash = "sha256:ebcb60557f2aefabc2e0f918751cd24ea0d56d8ec5445fe1807f1d2109660b99"}, +] + +[[package]] +name = "platformdirs" +version = "4.3.6" +description = "A small Python package for determining appropriate platform-specific dirs, e.g. a `user data dir`." +optional = false +python-versions = ">=3.8" +files = [ + {file = "platformdirs-4.3.6-py3-none-any.whl", hash = "sha256:73e575e1408ab8103900836b97580d5307456908a03e92031bab39e4554cc3fb"}, + {file = "platformdirs-4.3.6.tar.gz", hash = "sha256:357fb2acbc885b0419afd3ce3ed34564c13c9b95c89360cd9563f73aa5e2b907"}, +] + +[package.extras] +docs = ["furo (>=2024.8.6)", "proselint (>=0.14)", "sphinx (>=8.0.2)", "sphinx-autodoc-typehints (>=2.4)"] +test = ["appdirs (==1.4.4)", "covdefaults (>=2.3)", "pytest (>=8.3.2)", "pytest-cov (>=5)", "pytest-mock (>=3.14)"] +type = ["mypy (>=1.11.2)"] + +[[package]] +name = "prompt-toolkit" +version = "3.0.48" +description = "Library for building powerful interactive command lines in Python" +optional = false +python-versions = ">=3.7.0" +files = [ + {file = "prompt_toolkit-3.0.48-py3-none-any.whl", hash = "sha256:f49a827f90062e411f1ce1f854f2aedb3c23353244f8108b89283587397ac10e"}, + {file = "prompt_toolkit-3.0.48.tar.gz", hash = "sha256:d6623ab0477a80df74e646bdbc93621143f5caf104206aa29294d53de1a03d90"}, +] + +[package.dependencies] +wcwidth = "*" + +[[package]] +name = "psutil" +version = "6.1.1" +description = "Cross-platform lib for process and system monitoring in Python." 
+optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,>=2.7" +files = [ + {file = "psutil-6.1.1-cp27-cp27m-macosx_10_9_x86_64.whl", hash = "sha256:9ccc4316f24409159897799b83004cb1e24f9819b0dcf9c0b68bdcb6cefee6a8"}, + {file = "psutil-6.1.1-cp27-cp27m-manylinux2010_i686.whl", hash = "sha256:ca9609c77ea3b8481ab005da74ed894035936223422dc591d6772b147421f777"}, + {file = "psutil-6.1.1-cp27-cp27m-manylinux2010_x86_64.whl", hash = "sha256:8df0178ba8a9e5bc84fed9cfa61d54601b371fbec5c8eebad27575f1e105c0d4"}, + {file = "psutil-6.1.1-cp27-cp27mu-manylinux2010_i686.whl", hash = "sha256:1924e659d6c19c647e763e78670a05dbb7feaf44a0e9c94bf9e14dfc6ba50468"}, + {file = "psutil-6.1.1-cp27-cp27mu-manylinux2010_x86_64.whl", hash = "sha256:018aeae2af92d943fdf1da6b58665124897cfc94faa2ca92098838f83e1b1bca"}, + {file = "psutil-6.1.1-cp27-none-win32.whl", hash = "sha256:6d4281f5bbca041e2292be3380ec56a9413b790579b8e593b1784499d0005dac"}, + {file = "psutil-6.1.1-cp27-none-win_amd64.whl", hash = "sha256:c777eb75bb33c47377c9af68f30e9f11bc78e0f07fbf907be4a5d70b2fe5f030"}, + {file = "psutil-6.1.1-cp36-abi3-macosx_10_9_x86_64.whl", hash = "sha256:fc0ed7fe2231a444fc219b9c42d0376e0a9a1a72f16c5cfa0f68d19f1a0663e8"}, + {file = "psutil-6.1.1-cp36-abi3-macosx_11_0_arm64.whl", hash = "sha256:0bdd4eab935276290ad3cb718e9809412895ca6b5b334f5a9111ee6d9aff9377"}, + {file = "psutil-6.1.1-cp36-abi3-manylinux_2_12_i686.manylinux2010_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:b6e06c20c05fe95a3d7302d74e7097756d4ba1247975ad6905441ae1b5b66003"}, + {file = "psutil-6.1.1-cp36-abi3-manylinux_2_12_x86_64.manylinux2010_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:97f7cb9921fbec4904f522d972f0c0e1f4fabbdd4e0287813b21215074a0f160"}, + {file = "psutil-6.1.1-cp36-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:33431e84fee02bc84ea36d9e2c4a6d395d479c9dd9bba2376c1f6ee8f3a4e0b3"}, + {file = "psutil-6.1.1-cp36-cp36m-win32.whl", hash = "sha256:384636b1a64b47814437d1173be1427a7c83681b17a450bfc309a1953e329603"}, + {file = "psutil-6.1.1-cp36-cp36m-win_amd64.whl", hash = "sha256:8be07491f6ebe1a693f17d4f11e69d0dc1811fa082736500f649f79df7735303"}, + {file = "psutil-6.1.1-cp37-abi3-win32.whl", hash = "sha256:eaa912e0b11848c4d9279a93d7e2783df352b082f40111e078388701fd479e53"}, + {file = "psutil-6.1.1-cp37-abi3-win_amd64.whl", hash = "sha256:f35cfccb065fff93529d2afb4a2e89e363fe63ca1e4a5da22b603a85833c2649"}, + {file = "psutil-6.1.1.tar.gz", hash = "sha256:cf8496728c18f2d0b45198f06895be52f36611711746b7f30c464b422b50e2f5"}, +] + +[package.extras] +dev = ["abi3audit", "black", "check-manifest", "coverage", "packaging", "pylint", "pyperf", "pypinfo", "pytest-cov", "requests", "rstcheck", "ruff", "sphinx", "sphinx_rtd_theme", "toml-sort", "twine", "virtualenv", "vulture", "wheel"] +test = ["pytest", "pytest-xdist", "setuptools"] + +[[package]] +name = "ptyprocess" +version = "0.7.0" +description = "Run a subprocess in a pseudo terminal" +optional = false +python-versions = "*" +files = [ + {file = "ptyprocess-0.7.0-py2.py3-none-any.whl", hash = "sha256:4b41f3967fce3af57cc7e94b888626c18bf37a083e3651ca8feeb66d492fef35"}, + {file = "ptyprocess-0.7.0.tar.gz", hash = "sha256:5c5d0a3b48ceee0b48485e0c26037c0acd7d29765ca3fbb5cb3831d347423220"}, +] + +[[package]] +name = "pure-eval" +version = "0.2.3" +description = "Safely evaluate AST nodes without side effects" +optional = false +python-versions = "*" +files = [ + {file = "pure_eval-0.2.3-py3-none-any.whl", hash = 
"sha256:1db8e35b67b3d218d818ae653e27f06c3aa420901fa7b081ca98cbedc874e0d0"}, + {file = "pure_eval-0.2.3.tar.gz", hash = "sha256:5f4e983f40564c576c7c8635ae88db5956bb2229d7e9237d03b3c0b0190eaf42"}, +] + +[package.extras] +tests = ["pytest"] + +[[package]] +name = "pybtex" +version = "0.24.0" +description = "A BibTeX-compatible bibliography processor in Python" +optional = false +python-versions = ">=2.7,!=3.0.*,!=3.1.*,!=3.2.*" +files = [ + {file = "pybtex-0.24.0-py2.py3-none-any.whl", hash = "sha256:e1e0c8c69998452fea90e9179aa2a98ab103f3eed894405b7264e517cc2fcc0f"}, + {file = "pybtex-0.24.0.tar.gz", hash = "sha256:818eae35b61733e5c007c3fcd2cfb75ed1bc8b4173c1f70b56cc4c0802d34755"}, +] + +[package.dependencies] +latexcodec = ">=1.0.4" +PyYAML = ">=3.01" +six = "*" + +[package.extras] +test = ["pytest"] + +[[package]] +name = "pybtex-docutils" +version = "1.0.3" +description = "A docutils backend for pybtex." +optional = false +python-versions = ">=3.7" +files = [ + {file = "pybtex-docutils-1.0.3.tar.gz", hash = "sha256:3a7ebdf92b593e00e8c1c538aa9a20bca5d92d84231124715acc964d51d93c6b"}, + {file = "pybtex_docutils-1.0.3-py3-none-any.whl", hash = "sha256:8fd290d2ae48e32fcb54d86b0efb8d573198653c7e2447d5bec5847095f430b9"}, +] + +[package.dependencies] +docutils = ">=0.14" +pybtex = ">=0.16" + +[[package]] +name = "pycparser" +version = "2.22" +description = "C parser in Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "pycparser-2.22-py3-none-any.whl", hash = "sha256:c3702b6d3dd8c7abc1afa565d7e63d53a1d0bd86cdc24edd75470f4de499cfcc"}, + {file = "pycparser-2.22.tar.gz", hash = "sha256:491c8be9c040f5390f5bf44a5b07752bd07f56edf992381b05c701439eec10f6"}, +] + +[[package]] +name = "pydata-sphinx-theme" +version = "0.16.1" +description = "Bootstrap-based Sphinx theme from the PyData community" +optional = false +python-versions = ">=3.9" +files = [ + {file = "pydata_sphinx_theme-0.16.1-py3-none-any.whl", hash = "sha256:225331e8ac4b32682c18fcac5a57a6f717c4e632cea5dd0e247b55155faeccde"}, + {file = "pydata_sphinx_theme-0.16.1.tar.gz", hash = "sha256:a08b7f0b7f70387219dc659bff0893a7554d5eb39b59d3b8ef37b8401b7642d7"}, +] + +[package.dependencies] +accessible-pygments = "*" +Babel = "*" +beautifulsoup4 = "*" +docutils = "!=0.17.0" +pygments = ">=2.7" +sphinx = ">=6.1" +typing-extensions = "*" + +[package.extras] +a11y = ["pytest-playwright"] +dev = ["pandoc", "pre-commit", "pydata-sphinx-theme[doc,test]", "pyyaml", "sphinx-theme-builder[cli]", "tox"] +doc = ["ablog (>=0.11.8)", "colorama", "graphviz", "ipykernel", "ipyleaflet", "ipywidgets", "jupyter_sphinx", "jupyterlite-sphinx", "linkify-it-py", "matplotlib", "myst-parser", "nbsphinx", "numpy", "numpydoc", "pandas", "plotly", "rich", "sphinx-autoapi (>=3.0.0)", "sphinx-copybutton", "sphinx-design", "sphinx-favicon (>=1.0.1)", "sphinx-sitemap", "sphinx-togglebutton", "sphinxcontrib-youtube (>=1.4.1)", "sphinxext-rediraffe", "xarray"] +i18n = ["Babel", "jinja2"] +test = ["pytest", "pytest-cov", "pytest-regressions", "sphinx[test]"] + +[[package]] +name = "pygments" +version = "2.18.0" +description = "Pygments is a syntax highlighting package written in Python." 
+optional = false +python-versions = ">=3.8" +files = [ + {file = "pygments-2.18.0-py3-none-any.whl", hash = "sha256:b8e6aca0523f3ab76fee51799c488e38782ac06eafcf95e7ba832985c8e7b13a"}, + {file = "pygments-2.18.0.tar.gz", hash = "sha256:786ff802f32e91311bff3889f6e9a86e81505fe99f2735bb6d60ae0c5004f199"}, +] + +[package.extras] +windows-terminal = ["colorama (>=0.4.6)"] + +[[package]] +name = "pyparsing" +version = "3.2.1" +description = "pyparsing module - Classes and methods to define and execute parsing grammars" +optional = false +python-versions = ">=3.9" +files = [ + {file = "pyparsing-3.2.1-py3-none-any.whl", hash = "sha256:506ff4f4386c4cec0590ec19e6302d3aedb992fdc02c761e90416f158dacf8e1"}, + {file = "pyparsing-3.2.1.tar.gz", hash = "sha256:61980854fd66de3a90028d679a954d5f2623e83144b5afe5ee86f43d762e5f0a"}, +] + +[package.extras] +diagrams = ["jinja2", "railroad-diagrams"] + +[[package]] +name = "python-dateutil" +version = "2.9.0.post0" +description = "Extensions to the standard Python datetime module" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +files = [ + {file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"}, + {file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"}, +] + +[package.dependencies] +six = ">=1.5" + +[[package]] +name = "pywin32" +version = "308" +description = "Python for Window Extensions" +optional = false +python-versions = "*" +files = [ + {file = "pywin32-308-cp310-cp310-win32.whl", hash = "sha256:796ff4426437896550d2981b9c2ac0ffd75238ad9ea2d3bfa67a1abd546d262e"}, + {file = "pywin32-308-cp310-cp310-win_amd64.whl", hash = "sha256:4fc888c59b3c0bef905ce7eb7e2106a07712015ea1c8234b703a088d46110e8e"}, + {file = "pywin32-308-cp310-cp310-win_arm64.whl", hash = "sha256:a5ab5381813b40f264fa3495b98af850098f814a25a63589a8e9eb12560f450c"}, + {file = "pywin32-308-cp311-cp311-win32.whl", hash = "sha256:5d8c8015b24a7d6855b1550d8e660d8daa09983c80e5daf89a273e5c6fb5095a"}, + {file = "pywin32-308-cp311-cp311-win_amd64.whl", hash = "sha256:575621b90f0dc2695fec346b2d6302faebd4f0f45c05ea29404cefe35d89442b"}, + {file = "pywin32-308-cp311-cp311-win_arm64.whl", hash = "sha256:100a5442b7332070983c4cd03f2e906a5648a5104b8a7f50175f7906efd16bb6"}, + {file = "pywin32-308-cp312-cp312-win32.whl", hash = "sha256:587f3e19696f4bf96fde9d8a57cec74a57021ad5f204c9e627e15c33ff568897"}, + {file = "pywin32-308-cp312-cp312-win_amd64.whl", hash = "sha256:00b3e11ef09ede56c6a43c71f2d31857cf7c54b0ab6e78ac659497abd2834f47"}, + {file = "pywin32-308-cp312-cp312-win_arm64.whl", hash = "sha256:9b4de86c8d909aed15b7011182c8cab38c8850de36e6afb1f0db22b8959e3091"}, + {file = "pywin32-308-cp313-cp313-win32.whl", hash = "sha256:1c44539a37a5b7b21d02ab34e6a4d314e0788f1690d65b48e9b0b89f31abbbed"}, + {file = "pywin32-308-cp313-cp313-win_amd64.whl", hash = "sha256:fd380990e792eaf6827fcb7e187b2b4b1cede0585e3d0c9e84201ec27b9905e4"}, + {file = "pywin32-308-cp313-cp313-win_arm64.whl", hash = "sha256:ef313c46d4c18dfb82a2431e3051ac8f112ccee1a34f29c263c583c568db63cd"}, + {file = "pywin32-308-cp37-cp37m-win32.whl", hash = "sha256:1f696ab352a2ddd63bd07430080dd598e6369152ea13a25ebcdd2f503a38f1ff"}, + {file = "pywin32-308-cp37-cp37m-win_amd64.whl", hash = "sha256:13dcb914ed4347019fbec6697a01a0aec61019c1046c2b905410d197856326a6"}, + {file = "pywin32-308-cp38-cp38-win32.whl", hash = "sha256:5794e764ebcabf4ff08c555b31bd348c9025929371763b2183172ff4708152f0"}, + 
{file = "pywin32-308-cp38-cp38-win_amd64.whl", hash = "sha256:3b92622e29d651c6b783e368ba7d6722b1634b8e70bd376fd7610fe1992e19de"}, + {file = "pywin32-308-cp39-cp39-win32.whl", hash = "sha256:7873ca4dc60ab3287919881a7d4f88baee4a6e639aa6962de25a98ba6b193341"}, + {file = "pywin32-308-cp39-cp39-win_amd64.whl", hash = "sha256:71b3322d949b4cc20776436a9c9ba0eeedcbc9c650daa536df63f0ff111bb920"}, +] + +[[package]] +name = "pyyaml" +version = "6.0.2" +description = "YAML parser and emitter for Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "PyYAML-6.0.2-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:0a9a2848a5b7feac301353437eb7d5957887edbf81d56e903999a75a3d743086"}, + {file = "PyYAML-6.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:29717114e51c84ddfba879543fb232a6ed60086602313ca38cce623c1d62cfbf"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8824b5a04a04a047e72eea5cec3bc266db09e35de6bdfe34c9436ac5ee27d237"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7c36280e6fb8385e520936c3cb3b8042851904eba0e58d277dca80a5cfed590b"}, + {file = "PyYAML-6.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ec031d5d2feb36d1d1a24380e4db6d43695f3748343d99434e6f5f9156aaa2ed"}, + {file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:936d68689298c36b53b29f23c6dbb74de12b4ac12ca6cfe0e047bedceea56180"}, + {file = "PyYAML-6.0.2-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:23502f431948090f597378482b4812b0caae32c22213aecf3b55325e049a6c68"}, + {file = "PyYAML-6.0.2-cp310-cp310-win32.whl", hash = "sha256:2e99c6826ffa974fe6e27cdb5ed0021786b03fc98e5ee3c5bfe1fd5015f42b99"}, + {file = "PyYAML-6.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:a4d3091415f010369ae4ed1fc6b79def9416358877534caf6a0fdd2146c87a3e"}, + {file = "PyYAML-6.0.2-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:cc1c1159b3d456576af7a3e4d1ba7e6924cb39de8f67111c735f6fc832082774"}, + {file = "PyYAML-6.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:1e2120ef853f59c7419231f3bf4e7021f1b936f6ebd222406c3b60212205d2ee"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:5d225db5a45f21e78dd9358e58a98702a0302f2659a3c6cd320564b75b86f47c"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:5ac9328ec4831237bec75defaf839f7d4564be1e6b25ac710bd1a96321cc8317"}, + {file = "PyYAML-6.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3ad2a3decf9aaba3d29c8f537ac4b243e36bef957511b4766cb0057d32b0be85"}, + {file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:ff3824dc5261f50c9b0dfb3be22b4567a6f938ccce4587b38952d85fd9e9afe4"}, + {file = "PyYAML-6.0.2-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:797b4f722ffa07cc8d62053e4cff1486fa6dc094105d13fea7b1de7d8bf71c9e"}, + {file = "PyYAML-6.0.2-cp311-cp311-win32.whl", hash = "sha256:11d8f3dd2b9c1207dcaf2ee0bbbfd5991f571186ec9cc78427ba5bd32afae4b5"}, + {file = "PyYAML-6.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:e10ce637b18caea04431ce14fabcf5c64a1c61ec9c56b071a4b7ca131ca52d44"}, + {file = "PyYAML-6.0.2-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:c70c95198c015b85feafc136515252a261a84561b7b1d51e3384e0655ddf25ab"}, + {file = "PyYAML-6.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ce826d6ef20b1bc864f0a68340c8b3287705cae2f8b4b1d932177dcc76721725"}, + {file = 
"PyYAML-6.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1f71ea527786de97d1a0cc0eacd1defc0985dcf6b3f17bb77dcfc8c34bec4dc5"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:9b22676e8097e9e22e36d6b7bda33190d0d400f345f23d4065d48f4ca7ae0425"}, + {file = "PyYAML-6.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:80bab7bfc629882493af4aa31a4cfa43a4c57c83813253626916b8c7ada83476"}, + {file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:0833f8694549e586547b576dcfaba4a6b55b9e96098b36cdc7ebefe667dfed48"}, + {file = "PyYAML-6.0.2-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:8b9c7197f7cb2738065c481a0461e50ad02f18c78cd75775628afb4d7137fb3b"}, + {file = "PyYAML-6.0.2-cp312-cp312-win32.whl", hash = "sha256:ef6107725bd54b262d6dedcc2af448a266975032bc85ef0172c5f059da6325b4"}, + {file = "PyYAML-6.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:7e7401d0de89a9a855c839bc697c079a4af81cf878373abd7dc625847d25cbd8"}, + {file = "PyYAML-6.0.2-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:efdca5630322a10774e8e98e1af481aad470dd62c3170801852d752aa7a783ba"}, + {file = "PyYAML-6.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:50187695423ffe49e2deacb8cd10510bc361faac997de9efef88badc3bb9e2d1"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0ffe8360bab4910ef1b9e87fb812d8bc0a308b0d0eef8c8f44e0254ab3b07133"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:17e311b6c678207928d649faa7cb0d7b4c26a0ba73d41e99c4fff6b6c3276484"}, + {file = "PyYAML-6.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:70b189594dbe54f75ab3a1acec5f1e3faa7e8cf2f1e08d9b561cb41b845f69d5"}, + {file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:41e4e3953a79407c794916fa277a82531dd93aad34e29c2a514c2c0c5fe971cc"}, + {file = "PyYAML-6.0.2-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:68ccc6023a3400877818152ad9a1033e3db8625d899c72eacb5a668902e4d652"}, + {file = "PyYAML-6.0.2-cp313-cp313-win32.whl", hash = "sha256:bc2fa7c6b47d6bc618dd7fb02ef6fdedb1090ec036abab80d4681424b84c1183"}, + {file = "PyYAML-6.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:8388ee1976c416731879ac16da0aff3f63b286ffdd57cdeb95f3f2e085687563"}, + {file = "PyYAML-6.0.2-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:24471b829b3bf607e04e88d79542a9d48bb037c2267d7927a874e6c205ca7e9a"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d7fded462629cfa4b685c5416b949ebad6cec74af5e2d42905d41e257e0869f5"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:d84a1718ee396f54f3a086ea0a66d8e552b2ab2017ef8b420e92edbc841c352d"}, + {file = "PyYAML-6.0.2-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9056c1ecd25795207ad294bcf39f2db3d845767be0ea6e6a34d856f006006083"}, + {file = "PyYAML-6.0.2-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:82d09873e40955485746739bcb8b4586983670466c23382c19cffecbf1fd8706"}, + {file = "PyYAML-6.0.2-cp38-cp38-win32.whl", hash = "sha256:43fa96a3ca0d6b1812e01ced1044a003533c47f6ee8aca31724f78e93ccc089a"}, + {file = "PyYAML-6.0.2-cp38-cp38-win_amd64.whl", hash = "sha256:01179a4a8559ab5de078078f37e5c1a30d76bb88519906844fd7bdea1b7729ff"}, + {file = "PyYAML-6.0.2-cp39-cp39-macosx_10_9_x86_64.whl", hash = 
"sha256:688ba32a1cffef67fd2e9398a2efebaea461578b0923624778664cc1c914db5d"}, + {file = "PyYAML-6.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:a8786accb172bd8afb8be14490a16625cbc387036876ab6ba70912730faf8e1f"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d8e03406cac8513435335dbab54c0d385e4a49e4945d2909a581c83647ca0290"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f753120cb8181e736c57ef7636e83f31b9c0d1722c516f7e86cf15b7aa57ff12"}, + {file = "PyYAML-6.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:3b1fdb9dc17f5a7677423d508ab4f243a726dea51fa5e70992e59a7411c89d19"}, + {file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:0b69e4ce7a131fe56b7e4d770c67429700908fc0752af059838b1cfb41960e4e"}, + {file = "PyYAML-6.0.2-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:a9f8c2e67970f13b16084e04f134610fd1d374bf477b17ec1599185cf611d725"}, + {file = "PyYAML-6.0.2-cp39-cp39-win32.whl", hash = "sha256:6395c297d42274772abc367baaa79683958044e5d3835486c16da75d2a694631"}, + {file = "PyYAML-6.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:39693e1f8320ae4f43943590b49779ffb98acb81f788220ea932a6b6c51004d8"}, + {file = "pyyaml-6.0.2.tar.gz", hash = "sha256:d584d9ec91ad65861cc08d42e834324ef890a082e591037abe114850ff7bbc3e"}, +] + +[[package]] +name = "pyzmq" +version = "26.2.0" +description = "Python bindings for 0MQ" +optional = false +python-versions = ">=3.7" +files = [ + {file = "pyzmq-26.2.0-cp310-cp310-macosx_10_15_universal2.whl", hash = "sha256:ddf33d97d2f52d89f6e6e7ae66ee35a4d9ca6f36eda89c24591b0c40205a3629"}, + {file = "pyzmq-26.2.0-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:dacd995031a01d16eec825bf30802fceb2c3791ef24bcce48fa98ce40918c27b"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:89289a5ee32ef6c439086184529ae060c741334b8970a6855ec0b6ad3ff28764"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:5506f06d7dc6ecf1efacb4a013b1f05071bb24b76350832c96449f4a2d95091c"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:8ea039387c10202ce304af74def5021e9adc6297067f3441d348d2b633e8166a"}, + {file = "pyzmq-26.2.0-cp310-cp310-manylinux_2_28_x86_64.whl", hash = "sha256:a2224fa4a4c2ee872886ed00a571f5e967c85e078e8e8c2530a2fb01b3309b88"}, + {file = "pyzmq-26.2.0-cp310-cp310-musllinux_1_1_aarch64.whl", hash = "sha256:28ad5233e9c3b52d76196c696e362508959741e1a005fb8fa03b51aea156088f"}, + {file = "pyzmq-26.2.0-cp310-cp310-musllinux_1_1_i686.whl", hash = "sha256:1c17211bc037c7d88e85ed8b7d8f7e52db6dc8eca5590d162717c654550f7282"}, + {file = "pyzmq-26.2.0-cp310-cp310-musllinux_1_1_x86_64.whl", hash = "sha256:b8f86dd868d41bea9a5f873ee13bf5551c94cf6bc51baebc6f85075971fe6eea"}, + {file = "pyzmq-26.2.0-cp310-cp310-win32.whl", hash = "sha256:46a446c212e58456b23af260f3d9fb785054f3e3653dbf7279d8f2b5546b21c2"}, + {file = "pyzmq-26.2.0-cp310-cp310-win_amd64.whl", hash = "sha256:49d34ab71db5a9c292a7644ce74190b1dd5a3475612eefb1f8be1d6961441971"}, + {file = "pyzmq-26.2.0-cp310-cp310-win_arm64.whl", hash = "sha256:bfa832bfa540e5b5c27dcf5de5d82ebc431b82c453a43d141afb1e5d2de025fa"}, + {file = "pyzmq-26.2.0-cp311-cp311-macosx_10_15_universal2.whl", hash = "sha256:8f7e66c7113c684c2b3f1c83cdd3376103ee0ce4c49ff80a648643e57fb22218"}, + {file = "pyzmq-26.2.0-cp311-cp311-macosx_10_9_x86_64.whl", hash = 
"sha256:3a495b30fc91db2db25120df5847d9833af237546fd59170701acd816ccc01c4"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:77eb0968da535cba0470a5165468b2cac7772cfb569977cff92e240f57e31bef"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:6ace4f71f1900a548f48407fc9be59c6ba9d9aaf658c2eea6cf2779e72f9f317"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:92a78853d7280bffb93df0a4a6a2498cba10ee793cc8076ef797ef2f74d107cf"}, + {file = "pyzmq-26.2.0-cp311-cp311-manylinux_2_28_x86_64.whl", hash = "sha256:689c5d781014956a4a6de61d74ba97b23547e431e9e7d64f27d4922ba96e9d6e"}, + {file = "pyzmq-26.2.0-cp311-cp311-musllinux_1_1_aarch64.whl", hash = "sha256:0aca98bc423eb7d153214b2df397c6421ba6373d3397b26c057af3c904452e37"}, + {file = "pyzmq-26.2.0-cp311-cp311-musllinux_1_1_i686.whl", hash = "sha256:1f3496d76b89d9429a656293744ceca4d2ac2a10ae59b84c1da9b5165f429ad3"}, + {file = "pyzmq-26.2.0-cp311-cp311-musllinux_1_1_x86_64.whl", hash = "sha256:5c2b3bfd4b9689919db068ac6c9911f3fcb231c39f7dd30e3138be94896d18e6"}, + {file = "pyzmq-26.2.0-cp311-cp311-win32.whl", hash = "sha256:eac5174677da084abf378739dbf4ad245661635f1600edd1221f150b165343f4"}, + {file = "pyzmq-26.2.0-cp311-cp311-win_amd64.whl", hash = "sha256:5a509df7d0a83a4b178d0f937ef14286659225ef4e8812e05580776c70e155d5"}, + {file = "pyzmq-26.2.0-cp311-cp311-win_arm64.whl", hash = "sha256:c0e6091b157d48cbe37bd67233318dbb53e1e6327d6fc3bb284afd585d141003"}, + {file = "pyzmq-26.2.0-cp312-cp312-macosx_10_15_universal2.whl", hash = "sha256:ded0fc7d90fe93ae0b18059930086c51e640cdd3baebdc783a695c77f123dcd9"}, + {file = "pyzmq-26.2.0-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:17bf5a931c7f6618023cdacc7081f3f266aecb68ca692adac015c383a134ca52"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:55cf66647e49d4621a7e20c8d13511ef1fe1efbbccf670811864452487007e08"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:4661c88db4a9e0f958c8abc2b97472e23061f0bc737f6f6179d7a27024e1faa5"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ea7f69de383cb47522c9c208aec6dd17697db7875a4674c4af3f8cfdac0bdeae"}, + {file = "pyzmq-26.2.0-cp312-cp312-manylinux_2_28_x86_64.whl", hash = "sha256:7f98f6dfa8b8ccaf39163ce872bddacca38f6a67289116c8937a02e30bbe9711"}, + {file = "pyzmq-26.2.0-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:e3e0210287329272539eea617830a6a28161fbbd8a3271bf4150ae3e58c5d0e6"}, + {file = "pyzmq-26.2.0-cp312-cp312-musllinux_1_1_i686.whl", hash = "sha256:6b274e0762c33c7471f1a7471d1a2085b1a35eba5cdc48d2ae319f28b6fc4de3"}, + {file = "pyzmq-26.2.0-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:29c6a4635eef69d68a00321e12a7d2559fe2dfccfa8efae3ffb8e91cd0b36a8b"}, + {file = "pyzmq-26.2.0-cp312-cp312-win32.whl", hash = "sha256:989d842dc06dc59feea09e58c74ca3e1678c812a4a8a2a419046d711031f69c7"}, + {file = "pyzmq-26.2.0-cp312-cp312-win_amd64.whl", hash = "sha256:2a50625acdc7801bc6f74698c5c583a491c61d73c6b7ea4dee3901bb99adb27a"}, + {file = "pyzmq-26.2.0-cp312-cp312-win_arm64.whl", hash = "sha256:4d29ab8592b6ad12ebbf92ac2ed2bedcfd1cec192d8e559e2e099f648570e19b"}, + {file = "pyzmq-26.2.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:9dd8cd1aeb00775f527ec60022004d030ddc51d783d056e3e23e74e623e33726"}, + {file = "pyzmq-26.2.0-cp313-cp313-macosx_10_15_universal2.whl", 
hash = "sha256:28c812d9757fe8acecc910c9ac9dafd2ce968c00f9e619db09e9f8f54c3a68a3"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4d80b1dd99c1942f74ed608ddb38b181b87476c6a966a88a950c7dee118fdf50"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:8c997098cc65e3208eca09303630e84d42718620e83b733d0fd69543a9cab9cb"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7ad1bc8d1b7a18497dda9600b12dc193c577beb391beae5cd2349184db40f187"}, + {file = "pyzmq-26.2.0-cp313-cp313-manylinux_2_28_x86_64.whl", hash = "sha256:bea2acdd8ea4275e1278350ced63da0b166421928276c7c8e3f9729d7402a57b"}, + {file = "pyzmq-26.2.0-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:23f4aad749d13698f3f7b64aad34f5fc02d6f20f05999eebc96b89b01262fb18"}, + {file = "pyzmq-26.2.0-cp313-cp313-musllinux_1_1_i686.whl", hash = "sha256:a4f96f0d88accc3dbe4a9025f785ba830f968e21e3e2c6321ccdfc9aef755115"}, + {file = "pyzmq-26.2.0-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:ced65e5a985398827cc9276b93ef6dfabe0273c23de8c7931339d7e141c2818e"}, + {file = "pyzmq-26.2.0-cp313-cp313-win32.whl", hash = "sha256:31507f7b47cc1ead1f6e86927f8ebb196a0bab043f6345ce070f412a59bf87b5"}, + {file = "pyzmq-26.2.0-cp313-cp313-win_amd64.whl", hash = "sha256:70fc7fcf0410d16ebdda9b26cbd8bf8d803d220a7f3522e060a69a9c87bf7bad"}, + {file = "pyzmq-26.2.0-cp313-cp313-win_arm64.whl", hash = "sha256:c3789bd5768ab5618ebf09cef6ec2b35fed88709b104351748a63045f0ff9797"}, + {file = "pyzmq-26.2.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:034da5fc55d9f8da09015d368f519478a52675e558c989bfcb5cf6d4e16a7d2a"}, + {file = "pyzmq-26.2.0-cp313-cp313t-macosx_10_15_universal2.whl", hash = "sha256:c92d73464b886931308ccc45b2744e5968cbaade0b1d6aeb40d8ab537765f5bc"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:794a4562dcb374f7dbbfb3f51d28fb40123b5a2abadee7b4091f93054909add5"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:aee22939bb6075e7afededabad1a56a905da0b3c4e3e0c45e75810ebe3a52672"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ae90ff9dad33a1cfe947d2c40cb9cb5e600d759ac4f0fd22616ce6540f72797"}, + {file = "pyzmq-26.2.0-cp313-cp313t-manylinux_2_28_x86_64.whl", hash = "sha256:43a47408ac52647dfabbc66a25b05b6a61700b5165807e3fbd40063fcaf46386"}, + {file = "pyzmq-26.2.0-cp313-cp313t-musllinux_1_1_aarch64.whl", hash = "sha256:25bf2374a2a8433633c65ccb9553350d5e17e60c8eb4de4d92cc6bd60f01d306"}, + {file = "pyzmq-26.2.0-cp313-cp313t-musllinux_1_1_i686.whl", hash = "sha256:007137c9ac9ad5ea21e6ad97d3489af654381324d5d3ba614c323f60dab8fae6"}, + {file = "pyzmq-26.2.0-cp313-cp313t-musllinux_1_1_x86_64.whl", hash = "sha256:470d4a4f6d48fb34e92d768b4e8a5cc3780db0d69107abf1cd7ff734b9766eb0"}, + {file = "pyzmq-26.2.0-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:3b55a4229ce5da9497dd0452b914556ae58e96a4381bb6f59f1305dfd7e53fc8"}, + {file = "pyzmq-26.2.0-cp37-cp37m-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:9cb3a6460cdea8fe8194a76de8895707e61ded10ad0be97188cc8463ffa7e3a8"}, + {file = "pyzmq-26.2.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8ab5cad923cc95c87bffee098a27856c859bd5d0af31bd346035aa816b081fe1"}, + {file = "pyzmq-26.2.0-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = 
"sha256:9ed69074a610fad1c2fda66180e7b2edd4d31c53f2d1872bc2d1211563904cd9"}, + {file = "pyzmq-26.2.0-cp37-cp37m-musllinux_1_1_aarch64.whl", hash = "sha256:cccba051221b916a4f5e538997c45d7d136a5646442b1231b916d0164067ea27"}, + {file = "pyzmq-26.2.0-cp37-cp37m-musllinux_1_1_i686.whl", hash = "sha256:0eaa83fc4c1e271c24eaf8fb083cbccef8fde77ec8cd45f3c35a9a123e6da097"}, + {file = "pyzmq-26.2.0-cp37-cp37m-musllinux_1_1_x86_64.whl", hash = "sha256:9edda2df81daa129b25a39b86cb57dfdfe16f7ec15b42b19bfac503360d27a93"}, + {file = "pyzmq-26.2.0-cp37-cp37m-win32.whl", hash = "sha256:ea0eb6af8a17fa272f7b98d7bebfab7836a0d62738e16ba380f440fceca2d951"}, + {file = "pyzmq-26.2.0-cp37-cp37m-win_amd64.whl", hash = "sha256:4ff9dc6bc1664bb9eec25cd17506ef6672d506115095411e237d571e92a58231"}, + {file = "pyzmq-26.2.0-cp38-cp38-macosx_10_15_universal2.whl", hash = "sha256:2eb7735ee73ca1b0d71e0e67c3739c689067f055c764f73aac4cc8ecf958ee3f"}, + {file = "pyzmq-26.2.0-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:1a534f43bc738181aa7cbbaf48e3eca62c76453a40a746ab95d4b27b1111a7d2"}, + {file = "pyzmq-26.2.0-cp38-cp38-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:aedd5dd8692635813368e558a05266b995d3d020b23e49581ddd5bbe197a8ab6"}, + {file = "pyzmq-26.2.0-cp38-cp38-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:8be4700cd8bb02cc454f630dcdf7cfa99de96788b80c51b60fe2fe1dac480289"}, + {file = "pyzmq-26.2.0-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1fcc03fa4997c447dce58264e93b5aa2d57714fbe0f06c07b7785ae131512732"}, + {file = "pyzmq-26.2.0-cp38-cp38-musllinux_1_1_aarch64.whl", hash = "sha256:402b190912935d3db15b03e8f7485812db350d271b284ded2b80d2e5704be780"}, + {file = "pyzmq-26.2.0-cp38-cp38-musllinux_1_1_i686.whl", hash = "sha256:8685fa9c25ff00f550c1fec650430c4b71e4e48e8d852f7ddcf2e48308038640"}, + {file = "pyzmq-26.2.0-cp38-cp38-musllinux_1_1_x86_64.whl", hash = "sha256:76589c020680778f06b7e0b193f4b6dd66d470234a16e1df90329f5e14a171cd"}, + {file = "pyzmq-26.2.0-cp38-cp38-win32.whl", hash = "sha256:8423c1877d72c041f2c263b1ec6e34360448decfb323fa8b94e85883043ef988"}, + {file = "pyzmq-26.2.0-cp38-cp38-win_amd64.whl", hash = "sha256:76589f2cd6b77b5bdea4fca5992dc1c23389d68b18ccc26a53680ba2dc80ff2f"}, + {file = "pyzmq-26.2.0-cp39-cp39-macosx_10_15_universal2.whl", hash = "sha256:b1d464cb8d72bfc1a3adc53305a63a8e0cac6bc8c5a07e8ca190ab8d3faa43c2"}, + {file = "pyzmq-26.2.0-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:4da04c48873a6abdd71811c5e163bd656ee1b957971db7f35140a2d573f6949c"}, + {file = "pyzmq-26.2.0-cp39-cp39-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:d049df610ac811dcffdc147153b414147428567fbbc8be43bb8885f04db39d98"}, + {file = "pyzmq-26.2.0-cp39-cp39-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:05590cdbc6b902101d0e65d6a4780af14dc22914cc6ab995d99b85af45362cc9"}, + {file = "pyzmq-26.2.0-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:c811cfcd6a9bf680236c40c6f617187515269ab2912f3d7e8c0174898e2519db"}, + {file = "pyzmq-26.2.0-cp39-cp39-musllinux_1_1_aarch64.whl", hash = "sha256:6835dd60355593de10350394242b5757fbbd88b25287314316f266e24c61d073"}, + {file = "pyzmq-26.2.0-cp39-cp39-musllinux_1_1_i686.whl", hash = "sha256:bc6bee759a6bddea5db78d7dcd609397449cb2d2d6587f48f3ca613b19410cfc"}, + {file = "pyzmq-26.2.0-cp39-cp39-musllinux_1_1_x86_64.whl", hash = "sha256:c530e1eecd036ecc83c3407f77bb86feb79916d4a33d11394b8234f3bd35b940"}, + {file = "pyzmq-26.2.0-cp39-cp39-win32.whl", hash = 
"sha256:367b4f689786fca726ef7a6c5ba606958b145b9340a5e4808132cc65759abd44"}, + {file = "pyzmq-26.2.0-cp39-cp39-win_amd64.whl", hash = "sha256:e6fa2e3e683f34aea77de8112f6483803c96a44fd726d7358b9888ae5bb394ec"}, + {file = "pyzmq-26.2.0-cp39-cp39-win_arm64.whl", hash = "sha256:7445be39143a8aa4faec43b076e06944b8f9d0701b669df4af200531b21e40bb"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-macosx_10_15_x86_64.whl", hash = "sha256:706e794564bec25819d21a41c31d4df2d48e1cc4b061e8d345d7fb4dd3e94072"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8b435f2753621cd36e7c1762156815e21c985c72b19135dac43a7f4f31d28dd1"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:160c7e0a5eb178011e72892f99f918c04a131f36056d10d9c1afb223fc952c2d"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2c4a71d5d6e7b28a47a394c0471b7e77a0661e2d651e7ae91e0cab0a587859ca"}, + {file = "pyzmq-26.2.0-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:90412f2db8c02a3864cbfc67db0e3dcdbda336acf1c469526d3e869394fe001c"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-macosx_10_9_x86_64.whl", hash = "sha256:2ea4ad4e6a12e454de05f2949d4beddb52460f3de7c8b9d5c46fbb7d7222e02c"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:fc4f7a173a5609631bb0c42c23d12c49df3966f89f496a51d3eb0ec81f4519d6"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:878206a45202247781472a2d99df12a176fef806ca175799e1c6ad263510d57c"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:17c412bad2eb9468e876f556eb4ee910e62d721d2c7a53c7fa31e643d35352e6"}, + {file = "pyzmq-26.2.0-pp37-pypy37_pp73-win_amd64.whl", hash = "sha256:0d987a3ae5a71c6226b203cfd298720e0086c7fe7c74f35fa8edddfbd6597eed"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-macosx_10_9_x86_64.whl", hash = "sha256:39887ac397ff35b7b775db7201095fc6310a35fdbae85bac4523f7eb3b840e20"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-manylinux_2_12_i686.manylinux2010_i686.whl", hash = "sha256:fdb5b3e311d4d4b0eb8b3e8b4d1b0a512713ad7e6a68791d0923d1aec433d919"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-manylinux_2_12_x86_64.manylinux2010_x86_64.whl", hash = "sha256:226af7dcb51fdb0109f0016449b357e182ea0ceb6b47dfb5999d569e5db161d5"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0bed0e799e6120b9c32756203fb9dfe8ca2fb8467fed830c34c877e25638c3fc"}, + {file = "pyzmq-26.2.0-pp38-pypy38_pp73-win_amd64.whl", hash = "sha256:29c7947c594e105cb9e6c466bace8532dc1ca02d498684128b339799f5248277"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-macosx_10_15_x86_64.whl", hash = "sha256:cdeabcff45d1c219636ee2e54d852262e5c2e085d6cb476d938aee8d921356b3"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:35cffef589bcdc587d06f9149f8d5e9e8859920a071df5a2671de2213bef592a"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:18c8dc3b7468d8b4bdf60ce9d7141897da103c7a4690157b32b60acb45e333e6"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:7133d0a1677aec369d67dd78520d3fa96dd7f3dcec99d66c1762870e5ea1a50a"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-manylinux_2_28_x86_64.whl", hash = 
"sha256:6a96179a24b14fa6428cbfc08641c779a53f8fcec43644030328f44034c7f1f4"}, + {file = "pyzmq-26.2.0-pp39-pypy39_pp73-win_amd64.whl", hash = "sha256:4f78c88905461a9203eac9faac157a2a0dbba84a0fd09fd29315db27be40af9f"}, + {file = "pyzmq-26.2.0.tar.gz", hash = "sha256:070672c258581c8e4f640b5159297580a9974b026043bd4ab0470be9ed324f1f"}, +] + +[package.dependencies] +cffi = {version = "*", markers = "implementation_name == \"pypy\""} + +[[package]] +name = "referencing" +version = "0.35.1" +description = "JSON Referencing + Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "referencing-0.35.1-py3-none-any.whl", hash = "sha256:eda6d3234d62814d1c64e305c1331c9a3a6132da475ab6382eaa997b21ee75de"}, + {file = "referencing-0.35.1.tar.gz", hash = "sha256:25b42124a6c8b632a425174f24087783efb348a6f1e0008e63cd4466fedf703c"}, +] + +[package.dependencies] +attrs = ">=22.2.0" +rpds-py = ">=0.7.0" + +[[package]] +name = "requests" +version = "2.32.3" +description = "Python HTTP for Humans." +optional = false +python-versions = ">=3.8" +files = [ + {file = "requests-2.32.3-py3-none-any.whl", hash = "sha256:70761cfe03c773ceb22aa2f671b4757976145175cdfca038c02654d061d6dcc6"}, + {file = "requests-2.32.3.tar.gz", hash = "sha256:55365417734eb18255590a9ff9eb97e9e1da868d4ccd6402399eaf68af20a760"}, +] + +[package.dependencies] +certifi = ">=2017.4.17" +charset-normalizer = ">=2,<4" +idna = ">=2.5,<4" +urllib3 = ">=1.21.1,<3" + +[package.extras] +socks = ["PySocks (>=1.5.6,!=1.5.7)"] +use-chardet-on-py3 = ["chardet (>=3.0.2,<6)"] + +[[package]] +name = "rpds-py" +version = "0.22.3" +description = "Python bindings to Rust's persistent data structures (rpds)" +optional = false +python-versions = ">=3.9" +files = [ + {file = "rpds_py-0.22.3-cp310-cp310-macosx_10_12_x86_64.whl", hash = "sha256:6c7b99ca52c2c1752b544e310101b98a659b720b21db00e65edca34483259967"}, + {file = "rpds_py-0.22.3-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:be2eb3f2495ba669d2a985f9b426c1797b7d48d6963899276d22f23e33d47e37"}, + {file = "rpds_py-0.22.3-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:70eb60b3ae9245ddea20f8a4190bd79c705a22f8028aaf8bbdebe4716c3fab24"}, + {file = "rpds_py-0.22.3-cp310-cp310-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:4041711832360a9b75cfb11b25a6a97c8fb49c07b8bd43d0d02b45d0b499a4ff"}, + {file = "rpds_py-0.22.3-cp310-cp310-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:64607d4cbf1b7e3c3c8a14948b99345eda0e161b852e122c6bb71aab6d1d798c"}, + {file = "rpds_py-0.22.3-cp310-cp310-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:81e69b0a0e2537f26d73b4e43ad7bc8c8efb39621639b4434b76a3de50c6966e"}, + {file = "rpds_py-0.22.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bc27863442d388870c1809a87507727b799c8460573cfbb6dc0eeaef5a11b5ec"}, + {file = "rpds_py-0.22.3-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:e79dd39f1e8c3504be0607e5fc6e86bb60fe3584bec8b782578c3b0fde8d932c"}, + {file = "rpds_py-0.22.3-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:e0fa2d4ec53dc51cf7d3bb22e0aa0143966119f42a0c3e4998293a3dd2856b09"}, + {file = "rpds_py-0.22.3-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:fda7cb070f442bf80b642cd56483b5548e43d366fe3f39b98e67cce780cded00"}, + {file = "rpds_py-0.22.3-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:cff63a0272fcd259dcc3be1657b07c929c466b067ceb1c20060e8d10af56f5bf"}, + {file = "rpds_py-0.22.3-cp310-cp310-win32.whl", hash = 
"sha256:9bd7228827ec7bb817089e2eb301d907c0d9827a9e558f22f762bb690b131652"}, + {file = "rpds_py-0.22.3-cp310-cp310-win_amd64.whl", hash = "sha256:9beeb01d8c190d7581a4d59522cd3d4b6887040dcfc744af99aa59fef3e041a8"}, + {file = "rpds_py-0.22.3-cp311-cp311-macosx_10_12_x86_64.whl", hash = "sha256:d20cfb4e099748ea39e6f7b16c91ab057989712d31761d3300d43134e26e165f"}, + {file = "rpds_py-0.22.3-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:68049202f67380ff9aa52f12e92b1c30115f32e6895cd7198fa2a7961621fc5a"}, + {file = "rpds_py-0.22.3-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fb4f868f712b2dd4bcc538b0a0c1f63a2b1d584c925e69a224d759e7070a12d5"}, + {file = "rpds_py-0.22.3-cp311-cp311-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:bc51abd01f08117283c5ebf64844a35144a0843ff7b2983e0648e4d3d9f10dbb"}, + {file = "rpds_py-0.22.3-cp311-cp311-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:0f3cec041684de9a4684b1572fe28c7267410e02450f4561700ca5a3bc6695a2"}, + {file = "rpds_py-0.22.3-cp311-cp311-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:7ef9d9da710be50ff6809fed8f1963fecdfecc8b86656cadfca3bc24289414b0"}, + {file = "rpds_py-0.22.3-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:59f4a79c19232a5774aee369a0c296712ad0e77f24e62cad53160312b1c1eaa1"}, + {file = "rpds_py-0.22.3-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1a60bce91f81ddaac922a40bbb571a12c1070cb20ebd6d49c48e0b101d87300d"}, + {file = "rpds_py-0.22.3-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:e89391e6d60251560f0a8f4bd32137b077a80d9b7dbe6d5cab1cd80d2746f648"}, + {file = "rpds_py-0.22.3-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:e3fb866d9932a3d7d0c82da76d816996d1667c44891bd861a0f97ba27e84fc74"}, + {file = "rpds_py-0.22.3-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:1352ae4f7c717ae8cba93421a63373e582d19d55d2ee2cbb184344c82d2ae55a"}, + {file = "rpds_py-0.22.3-cp311-cp311-win32.whl", hash = "sha256:b0b4136a252cadfa1adb705bb81524eee47d9f6aab4f2ee4fa1e9d3cd4581f64"}, + {file = "rpds_py-0.22.3-cp311-cp311-win_amd64.whl", hash = "sha256:8bd7c8cfc0b8247c8799080fbff54e0b9619e17cdfeb0478ba7295d43f635d7c"}, + {file = "rpds_py-0.22.3-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:27e98004595899949bd7a7b34e91fa7c44d7a97c40fcaf1d874168bb652ec67e"}, + {file = "rpds_py-0.22.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1978d0021e943aae58b9b0b196fb4895a25cc53d3956b8e35e0b7682eefb6d56"}, + {file = "rpds_py-0.22.3-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:655ca44a831ecb238d124e0402d98f6212ac527a0ba6c55ca26f616604e60a45"}, + {file = "rpds_py-0.22.3-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:feea821ee2a9273771bae61194004ee2fc33f8ec7db08117ef9147d4bbcbca8e"}, + {file = "rpds_py-0.22.3-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:22bebe05a9ffc70ebfa127efbc429bc26ec9e9b4ee4d15a740033efda515cf3d"}, + {file = "rpds_py-0.22.3-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:3af6e48651c4e0d2d166dc1b033b7042ea3f871504b6805ba5f4fe31581d8d38"}, + {file = "rpds_py-0.22.3-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e67ba3c290821343c192f7eae1d8fd5999ca2dc99994114643e2f2d3e6138b15"}, + {file = "rpds_py-0.22.3-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:02fbb9c288ae08bcb34fb41d516d5eeb0455ac35b5512d03181d755d80810059"}, + {file = 
"rpds_py-0.22.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:f56a6b404f74ab372da986d240e2e002769a7d7102cc73eb238a4f72eec5284e"}, + {file = "rpds_py-0.22.3-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:0a0461200769ab3b9ab7e513f6013b7a97fdeee41c29b9db343f3c5a8e2b9e61"}, + {file = "rpds_py-0.22.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:8633e471c6207a039eff6aa116e35f69f3156b3989ea3e2d755f7bc41754a4a7"}, + {file = "rpds_py-0.22.3-cp312-cp312-win32.whl", hash = "sha256:593eba61ba0c3baae5bc9be2f5232430453fb4432048de28399ca7376de9c627"}, + {file = "rpds_py-0.22.3-cp312-cp312-win_amd64.whl", hash = "sha256:d115bffdd417c6d806ea9069237a4ae02f513b778e3789a359bc5856e0404cc4"}, + {file = "rpds_py-0.22.3-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:ea7433ce7e4bfc3a85654aeb6747babe3f66eaf9a1d0c1e7a4435bbdf27fea84"}, + {file = "rpds_py-0.22.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:6dd9412824c4ce1aca56c47b0991e65bebb7ac3f4edccfd3f156150c96a7bf25"}, + {file = "rpds_py-0.22.3-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:20070c65396f7373f5df4005862fa162db5d25d56150bddd0b3e8214e8ef45b4"}, + {file = "rpds_py-0.22.3-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:0b09865a9abc0ddff4e50b5ef65467cd94176bf1e0004184eb915cbc10fc05c5"}, + {file = "rpds_py-0.22.3-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:3453e8d41fe5f17d1f8e9c383a7473cd46a63661628ec58e07777c2fff7196dc"}, + {file = "rpds_py-0.22.3-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:f5d36399a1b96e1a5fdc91e0522544580dbebeb1f77f27b2b0ab25559e103b8b"}, + {file = "rpds_py-0.22.3-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:009de23c9c9ee54bf11303a966edf4d9087cd43a6003672e6aa7def643d06518"}, + {file = "rpds_py-0.22.3-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:1aef18820ef3e4587ebe8b3bc9ba6e55892a6d7b93bac6d29d9f631a3b4befbd"}, + {file = "rpds_py-0.22.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f60bd8423be1d9d833f230fdbccf8f57af322d96bcad6599e5a771b151398eb2"}, + {file = "rpds_py-0.22.3-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:62d9cfcf4948683a18a9aff0ab7e1474d407b7bab2ca03116109f8464698ab16"}, + {file = "rpds_py-0.22.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9253fc214112405f0afa7db88739294295f0e08466987f1d70e29930262b4c8f"}, + {file = "rpds_py-0.22.3-cp313-cp313-win32.whl", hash = "sha256:fb0ba113b4983beac1a2eb16faffd76cb41e176bf58c4afe3e14b9c681f702de"}, + {file = "rpds_py-0.22.3-cp313-cp313-win_amd64.whl", hash = "sha256:c58e2339def52ef6b71b8f36d13c3688ea23fa093353f3a4fee2556e62086ec9"}, + {file = "rpds_py-0.22.3-cp313-cp313t-macosx_10_12_x86_64.whl", hash = "sha256:f82a116a1d03628a8ace4859556fb39fd1424c933341a08ea3ed6de1edb0283b"}, + {file = "rpds_py-0.22.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3dfcbc95bd7992b16f3f7ba05af8a64ca694331bd24f9157b49dadeeb287493b"}, + {file = "rpds_py-0.22.3-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:59259dc58e57b10e7e18ce02c311804c10c5a793e6568f8af4dead03264584d1"}, + {file = "rpds_py-0.22.3-cp313-cp313t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5725dd9cc02068996d4438d397e255dcb1df776b7ceea3b9cb972bdb11260a83"}, + {file = "rpds_py-0.22.3-cp313-cp313t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:99b37292234e61325e7a5bb9689e55e48c3f5f603af88b1642666277a81f1fbd"}, + {file = 
"rpds_py-0.22.3-cp313-cp313t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:27b1d3b3915a99208fee9ab092b8184c420f2905b7d7feb4aeb5e4a9c509b8a1"}, + {file = "rpds_py-0.22.3-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f612463ac081803f243ff13cccc648578e2279295048f2a8d5eb430af2bae6e3"}, + {file = "rpds_py-0.22.3-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:f73d3fef726b3243a811121de45193c0ca75f6407fe66f3f4e183c983573e130"}, + {file = "rpds_py-0.22.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:3f21f0495edea7fdbaaa87e633a8689cd285f8f4af5c869f27bc8074638ad69c"}, + {file = "rpds_py-0.22.3-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:1e9663daaf7a63ceccbbb8e3808fe90415b0757e2abddbfc2e06c857bf8c5e2b"}, + {file = "rpds_py-0.22.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:a76e42402542b1fae59798fab64432b2d015ab9d0c8c47ba7addddbaf7952333"}, + {file = "rpds_py-0.22.3-cp313-cp313t-win32.whl", hash = "sha256:69803198097467ee7282750acb507fba35ca22cc3b85f16cf45fb01cb9097730"}, + {file = "rpds_py-0.22.3-cp313-cp313t-win_amd64.whl", hash = "sha256:f5cf2a0c2bdadf3791b5c205d55a37a54025c6e18a71c71f82bb536cf9a454bf"}, + {file = "rpds_py-0.22.3-cp39-cp39-macosx_10_12_x86_64.whl", hash = "sha256:378753b4a4de2a7b34063d6f95ae81bfa7b15f2c1a04a9518e8644e81807ebea"}, + {file = "rpds_py-0.22.3-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:3445e07bf2e8ecfeef6ef67ac83de670358abf2996916039b16a218e3d95e97e"}, + {file = "rpds_py-0.22.3-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:7b2513ba235829860b13faa931f3b6846548021846ac808455301c23a101689d"}, + {file = "rpds_py-0.22.3-cp39-cp39-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:eaf16ae9ae519a0e237a0f528fd9f0197b9bb70f40263ee57ae53c2b8d48aeb3"}, + {file = "rpds_py-0.22.3-cp39-cp39-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:583f6a1993ca3369e0f80ba99d796d8e6b1a3a2a442dd4e1a79e652116413091"}, + {file = "rpds_py-0.22.3-cp39-cp39-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:4617e1915a539a0d9a9567795023de41a87106522ff83fbfaf1f6baf8e85437e"}, + {file = "rpds_py-0.22.3-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0c150c7a61ed4a4f4955a96626574e9baf1adf772c2fb61ef6a5027e52803543"}, + {file = "rpds_py-0.22.3-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2fa4331c200c2521512595253f5bb70858b90f750d39b8cbfd67465f8d1b596d"}, + {file = "rpds_py-0.22.3-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:214b7a953d73b5e87f0ebece4a32a5bd83c60a3ecc9d4ec8f1dca968a2d91e99"}, + {file = "rpds_py-0.22.3-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:f47ad3d5f3258bd7058d2d506852217865afefe6153a36eb4b6928758041d831"}, + {file = "rpds_py-0.22.3-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:f276b245347e6e36526cbd4a266a417796fc531ddf391e43574cf6466c492520"}, + {file = "rpds_py-0.22.3-cp39-cp39-win32.whl", hash = "sha256:bbb232860e3d03d544bc03ac57855cd82ddf19c7a07651a7c0fdb95e9efea8b9"}, + {file = "rpds_py-0.22.3-cp39-cp39-win_amd64.whl", hash = "sha256:cfbc454a2880389dbb9b5b398e50d439e2e58669160f27b60e5eca11f68ae17c"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-macosx_10_12_x86_64.whl", hash = "sha256:d48424e39c2611ee1b84ad0f44fb3b2b53d473e65de061e3f460fc0be5f1939d"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-macosx_11_0_arm64.whl", hash = "sha256:24e8abb5878e250f2eb0d7859a8e561846f98910326d06c0d51381fed59357bd"}, + {file = 
"rpds_py-0.22.3-pp310-pypy310_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4b232061ca880db21fa14defe219840ad9b74b6158adb52ddf0e87bead9e8493"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ac0a03221cdb5058ce0167ecc92a8c89e8d0decdc9e99a2ec23380793c4dcb96"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:eb0c341fa71df5a4595f9501df4ac5abfb5a09580081dffbd1ddd4654e6e9123"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bf9db5488121b596dbfc6718c76092fda77b703c1f7533a226a5a9f65248f8ad"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0b8db6b5b2d4491ad5b6bdc2bc7c017eec108acbf4e6785f42a9eb0ba234f4c9"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:b3d504047aba448d70cf6fa22e06cb09f7cbd761939fdd47604f5e007675c24e"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:e61b02c3f7a1e0b75e20c3978f7135fd13cb6cf551bf4a6d29b999a88830a338"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-musllinux_1_2_i686.whl", hash = "sha256:e35ba67d65d49080e8e5a1dd40101fccdd9798adb9b050ff670b7d74fa41c566"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:26fd7cac7dd51011a245f29a2cc6489c4608b5a8ce8d75661bb4a1066c52dfbe"}, + {file = "rpds_py-0.22.3-pp310-pypy310_pp73-win_amd64.whl", hash = "sha256:177c7c0fce2855833819c98e43c262007f42ce86651ffbb84f37883308cb0e7d"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-macosx_10_12_x86_64.whl", hash = "sha256:bb47271f60660803ad11f4c61b42242b8c1312a31c98c578f79ef9387bbde21c"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-macosx_11_0_arm64.whl", hash = "sha256:70fb28128acbfd264eda9bf47015537ba3fe86e40d046eb2963d75024be4d055"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:44d61b4b7d0c2c9ac019c314e52d7cbda0ae31078aabd0f22e583af3e0d79723"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5f0e260eaf54380380ac3808aa4ebe2d8ca28b9087cf411649f96bad6900c728"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b25bc607423935079e05619d7de556c91fb6adeae9d5f80868dde3468657994b"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:fb6116dfb8d1925cbdb52595560584db42a7f664617a1f7d7f6e32f138cdf37d"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a63cbdd98acef6570c62b92a1e43266f9e8b21e699c363c0fef13bd530799c11"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-manylinux_2_5_i686.manylinux1_i686.whl", hash = "sha256:2b8f60e1b739a74bab7e01fcbe3dddd4657ec685caa04681df9d562ef15b625f"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-musllinux_1_2_aarch64.whl", hash = "sha256:2e8b55d8517a2fda8d95cb45d62a5a8bbf9dd0ad39c5b25c8833efea07b880ca"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-musllinux_1_2_i686.whl", hash = "sha256:2de29005e11637e7a2361fa151f780ff8eb2543a0da1413bb951e9f14b699ef3"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-musllinux_1_2_x86_64.whl", hash = "sha256:666ecce376999bf619756a24ce15bb14c5bfaf04bf00abc7e663ce17c3f34fe7"}, + {file = "rpds_py-0.22.3-pp39-pypy39_pp73-win_amd64.whl", hash = 
"sha256:5246b14ca64a8675e0a7161f7af68fe3e910e6b90542b4bfb5439ba752191df6"}, + {file = "rpds_py-0.22.3.tar.gz", hash = "sha256:e32fee8ab45d3c2db6da19a5323bc3362237c8b653c70194414b892fd06a080d"}, +] + +[[package]] +name = "setuptools" +version = "75.6.0" +description = "Easily download, build, install, upgrade, and uninstall Python packages" +optional = false +python-versions = ">=3.9" +files = [ + {file = "setuptools-75.6.0-py3-none-any.whl", hash = "sha256:ce74b49e8f7110f9bf04883b730f4765b774ef3ef28f722cce7c273d253aaf7d"}, + {file = "setuptools-75.6.0.tar.gz", hash = "sha256:8199222558df7c86216af4f84c30e9b34a61d8ba19366cc914424cdbd28252f6"}, +] + +[package.extras] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)", "ruff (>=0.7.0)"] +core = ["importlib_metadata (>=6)", "jaraco.collections", "jaraco.functools (>=4)", "jaraco.text (>=3.7)", "more_itertools", "more_itertools (>=8.8)", "packaging", "packaging (>=24.2)", "platformdirs (>=4.2.2)", "tomli (>=2.0.1)", "wheel (>=0.43.0)"] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "pygments-github-lexers (==0.0.5)", "pyproject-hooks (!=1.1)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-favicon", "sphinx-inline-tabs", "sphinx-lint", "sphinx-notfound-page (>=1,<2)", "sphinx-reredirects", "sphinxcontrib-towncrier", "towncrier (<24.7)"] +enabler = ["pytest-enabler (>=2.2)"] +test = ["build[virtualenv] (>=1.0.3)", "filelock (>=3.4.0)", "ini2toml[lite] (>=0.14)", "jaraco.develop (>=7.21)", "jaraco.envs (>=2.2)", "jaraco.path (>=3.2.0)", "jaraco.test (>=5.5)", "packaging (>=24.2)", "pip (>=19.1)", "pyproject-hooks (!=1.1)", "pytest (>=6,!=8.1.*)", "pytest-home (>=0.5)", "pytest-perf", "pytest-subprocess", "pytest-timeout", "pytest-xdist (>=3)", "tomli-w (>=1.0.0)", "virtualenv (>=13.0.0)", "wheel (>=0.44.0)"] +type = ["importlib_metadata (>=7.0.2)", "jaraco.develop (>=7.21)", "mypy (>=1.12,<1.14)", "pytest-mypy"] + +[[package]] +name = "six" +version = "1.17.0" +description = "Python 2 and 3 compatibility utilities" +optional = false +python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7" +files = [ + {file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"}, + {file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"}, +] + +[[package]] +name = "snowballstemmer" +version = "2.2.0" +description = "This package provides 29 stemmers for 28 languages generated from Snowball algorithms." +optional = false +python-versions = "*" +files = [ + {file = "snowballstemmer-2.2.0-py2.py3-none-any.whl", hash = "sha256:c8e1716e83cc398ae16824e5572ae04e0d9fc2c6b985fb0f900f5f0c96ecba1a"}, + {file = "snowballstemmer-2.2.0.tar.gz", hash = "sha256:09b16deb8547d3412ad7b590689584cd0fe25ec8db3be37788be3810cbf19cb1"}, +] + +[[package]] +name = "soupsieve" +version = "2.6" +description = "A modern CSS selector implementation for Beautiful Soup." 
+optional = false +python-versions = ">=3.8" +files = [ + {file = "soupsieve-2.6-py3-none-any.whl", hash = "sha256:e72c4ff06e4fb6e4b5a9f0f55fe6e81514581fca1515028625d0f299c602ccc9"}, + {file = "soupsieve-2.6.tar.gz", hash = "sha256:e2e68417777af359ec65daac1057404a3c8a5455bb8abc36f1a9866ab1a51abb"}, +] + +[[package]] +name = "sphinx" +version = "7.4.7" +description = "Python documentation generator" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinx-7.4.7-py3-none-any.whl", hash = "sha256:c2419e2135d11f1951cd994d6eb18a1835bd8fdd8429f9ca375dc1f3281bd239"}, + {file = "sphinx-7.4.7.tar.gz", hash = "sha256:242f92a7ea7e6c5b406fdc2615413890ba9f699114a9c09192d7dfead2ee9cfe"}, +] + +[package.dependencies] +alabaster = ">=0.7.14,<0.8.0" +babel = ">=2.13" +colorama = {version = ">=0.4.6", markers = "sys_platform == \"win32\""} +docutils = ">=0.20,<0.22" +imagesize = ">=1.3" +Jinja2 = ">=3.1" +packaging = ">=23.0" +Pygments = ">=2.17" +requests = ">=2.30.0" +snowballstemmer = ">=2.2" +sphinxcontrib-applehelp = "*" +sphinxcontrib-devhelp = "*" +sphinxcontrib-htmlhelp = ">=2.0.0" +sphinxcontrib-jsmath = "*" +sphinxcontrib-qthelp = "*" +sphinxcontrib-serializinghtml = ">=1.1.9" + +[package.extras] +docs = ["sphinxcontrib-websupport"] +lint = ["flake8 (>=6.0)", "importlib-metadata (>=6.0)", "mypy (==1.10.1)", "pytest (>=6.0)", "ruff (==0.5.2)", "sphinx-lint (>=0.9)", "tomli (>=2)", "types-docutils (==0.21.0.20240711)", "types-requests (>=2.30.0)"] +test = ["cython (>=3.0)", "defusedxml (>=0.7.1)", "pytest (>=8.0)", "setuptools (>=70.0)", "typing_extensions (>=4.9)"] + +[[package]] +name = "sphinx-book-theme" +version = "1.1.3" +description = "A clean book theme for scientific explanations and documentation with Sphinx" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinx_book_theme-1.1.3-py3-none-any.whl", hash = "sha256:a554a9a7ac3881979a87a2b10f633aa2a5706e72218a10f71be38b3c9e831ae9"}, + {file = "sphinx_book_theme-1.1.3.tar.gz", hash = "sha256:1f25483b1846cb3d353a6bc61b3b45b031f4acf845665d7da90e01ae0aef5b4d"}, +] + +[package.dependencies] +pydata-sphinx-theme = ">=0.15.2" +sphinx = ">=5" + +[package.extras] +code-style = ["pre-commit"] +doc = ["ablog", "folium", "ipywidgets", "matplotlib", "myst-nb", "nbclient", "numpy", "numpydoc", "pandas", "plotly", "sphinx-copybutton", "sphinx-design", "sphinx-examples", "sphinx-tabs", "sphinx-thebe", "sphinx-togglebutton", "sphinxcontrib-bibtex", "sphinxcontrib-youtube", "sphinxext-opengraph"] +test = ["beautifulsoup4", "coverage", "defusedxml", "myst-nb", "pytest", "pytest-cov", "pytest-regressions", "sphinx_thebe"] + +[[package]] +name = "sphinx-comments" +version = "0.0.3" +description = "Add comments and annotation to your documentation." +optional = false +python-versions = "*" +files = [ + {file = "sphinx-comments-0.0.3.tar.gz", hash = "sha256:00170afff27019fad08e421da1ae49c681831fb2759786f07c826e89ac94cf21"}, + {file = "sphinx_comments-0.0.3-py3-none-any.whl", hash = "sha256:1e879b4e9bfa641467f83e3441ac4629225fc57c29995177d043252530c21d00"}, +] + +[package.dependencies] +sphinx = ">=1.8" + +[package.extras] +code-style = ["black", "flake8 (>=3.7.0,<3.8.0)", "pre-commit (==1.17.0)"] +sphinx = ["myst-parser", "sphinx (>=2)", "sphinx-book-theme"] +testing = ["beautifulsoup4", "myst-parser", "pytest", "pytest-regressions", "sphinx (>=2)", "sphinx-book-theme"] + +[[package]] +name = "sphinx-copybutton" +version = "0.5.2" +description = "Add a copy button to each of your code cells." 
+optional = false +python-versions = ">=3.7" +files = [ + {file = "sphinx-copybutton-0.5.2.tar.gz", hash = "sha256:4cf17c82fb9646d1bc9ca92ac280813a3b605d8c421225fd9913154103ee1fbd"}, + {file = "sphinx_copybutton-0.5.2-py3-none-any.whl", hash = "sha256:fb543fd386d917746c9a2c50360c7905b605726b9355cd26e9974857afeae06e"}, +] + +[package.dependencies] +sphinx = ">=1.8" + +[package.extras] +code-style = ["pre-commit (==2.12.1)"] +rtd = ["ipython", "myst-nb", "sphinx", "sphinx-book-theme", "sphinx-examples"] + +[[package]] +name = "sphinx-design" +version = "0.6.1" +description = "A sphinx extension for designing beautiful, view size responsive web components." +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinx_design-0.6.1-py3-none-any.whl", hash = "sha256:b11f37db1a802a183d61b159d9a202314d4d2fe29c163437001324fe2f19549c"}, + {file = "sphinx_design-0.6.1.tar.gz", hash = "sha256:b44eea3719386d04d765c1a8257caca2b3e6f8421d7b3a5e742c0fd45f84e632"}, +] + +[package.dependencies] +sphinx = ">=6,<9" + +[package.extras] +code-style = ["pre-commit (>=3,<4)"] +rtd = ["myst-parser (>=2,<4)"] +testing = ["defusedxml", "myst-parser (>=2,<4)", "pytest (>=8.3,<9.0)", "pytest-cov", "pytest-regressions"] +testing-no-myst = ["defusedxml", "pytest (>=8.3,<9.0)", "pytest-cov", "pytest-regressions"] +theme-furo = ["furo (>=2024.7.18,<2024.8.0)"] +theme-im = ["sphinx-immaterial (>=0.12.2,<0.13.0)"] +theme-pydata = ["pydata-sphinx-theme (>=0.15.2,<0.16.0)"] +theme-rtd = ["sphinx-rtd-theme (>=2.0,<3.0)"] +theme-sbt = ["sphinx-book-theme (>=1.1,<2.0)"] + +[[package]] +name = "sphinx-external-toc" +version = "1.0.1" +description = "A sphinx extension that allows the site-map to be defined in a single YAML file." +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinx_external_toc-1.0.1-py3-none-any.whl", hash = "sha256:d9e02d50731dee9697c1887e4f8b361e7b86d38241f0e66bd5a9f4096779646f"}, + {file = "sphinx_external_toc-1.0.1.tar.gz", hash = "sha256:a7d2c63cc47ec688546443b28bc4ef466121827ef3dc7bb509de354bad4ea2e0"}, +] + +[package.dependencies] +click = ">=7.1" +pyyaml = "*" +sphinx = ">=5" + +[package.extras] +code-style = ["pre-commit (>=2.12)"] +rtd = ["myst-parser (>=1.0.0)", "sphinx-book-theme (>=1.0.0)"] +testing = ["coverage", "pytest (>=7.1)", "pytest-cov", "pytest-regressions"] + +[[package]] +name = "sphinx-jupyterbook-latex" +version = "1.0.0" +description = "Latex specific features for jupyter book" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinx_jupyterbook_latex-1.0.0-py3-none-any.whl", hash = "sha256:e0cd3e9e1c5af69136434e21a533343fdf013475c410a414d5b7b4922b4f3891"}, + {file = "sphinx_jupyterbook_latex-1.0.0.tar.gz", hash = "sha256:f54c6674c13f1616f9a93443e98b9b5353f9fdda8e39b6ec552ccf0b3e5ffb62"}, +] + +[package.dependencies] +packaging = "*" +sphinx = ">=5" + +[package.extras] +code-style = ["pre-commit (>=2.12,<3.0)"] +myst = ["myst-nb (>=1.0.0)"] +rtd = ["myst-parser", "sphinx-book-theme", "sphinx-design", "sphinx-jupyterbook-latex"] +testing = ["coverage (>=6.0)", "myst-nb (>=1.0.0)", "pytest (>=7.1)", "pytest-cov (>=3)", "pytest-regressions", "sphinx-external-toc (>=1.0.0)", "sphinxcontrib-bibtex (>=2.6.0)", "texsoup"] + +[[package]] +name = "sphinx-multitoc-numbering" +version = "0.1.3" +description = "Supporting continuous HTML section numbering" +optional = false +python-versions = "*" +files = [ + {file = "sphinx-multitoc-numbering-0.1.3.tar.gz", hash = "sha256:c9607671ac511236fa5d61a7491c1031e700e8d498c9d2418e6c61d1251209ae"}, + {file 
= "sphinx_multitoc_numbering-0.1.3-py3-none-any.whl", hash = "sha256:33d2e707a9b2b8ad636b3d4302e658a008025106fe0474046c651144c26d8514"}, +] + +[package.dependencies] +sphinx = ">=3" + +[package.extras] +code-style = ["black", "flake8 (>=3.7.0,<3.8.0)", "pre-commit (==1.17.0)"] +rtd = ["myst-parser", "sphinx (>=3.0)", "sphinx-book-theme"] +testing = ["coverage (<5.0)", "jupyter-book", "pytest (>=5.4,<6.0)", "pytest-cov (>=2.8,<3.0)", "pytest-regressions"] + +[[package]] +name = "sphinx-thebe" +version = "0.3.1" +description = "Integrate interactive code blocks into your documentation with Thebe and Binder." +optional = false +python-versions = ">=3.8" +files = [ + {file = "sphinx_thebe-0.3.1-py3-none-any.whl", hash = "sha256:e7e7edee9f0d601c76bc70156c471e114939484b111dd8e74fe47ac88baffc52"}, + {file = "sphinx_thebe-0.3.1.tar.gz", hash = "sha256:576047f45560e82f64aa5f15200b1eb094dcfe1c5b8f531a8a65bd208e25a493"}, +] + +[package.dependencies] +sphinx = ">=4" + +[package.extras] +dev = ["sphinx-thebe[testing]"] +sphinx = ["myst-nb", "sphinx-book-theme", "sphinx-copybutton", "sphinx-design"] +testing = ["beautifulsoup4", "matplotlib", "myst-nb (>=1.0.0rc0)", "pytest", "pytest-regressions", "sphinx-copybutton", "sphinx-design"] + +[[package]] +name = "sphinx-togglebutton" +version = "0.3.2" +description = "Toggle page content and collapse admonitions in Sphinx." +optional = false +python-versions = "*" +files = [ + {file = "sphinx-togglebutton-0.3.2.tar.gz", hash = "sha256:ab0c8b366427b01e4c89802d5d078472c427fa6e9d12d521c34fa0442559dc7a"}, + {file = "sphinx_togglebutton-0.3.2-py3-none-any.whl", hash = "sha256:9647ba7874b7d1e2d43413d8497153a85edc6ac95a3fea9a75ef9c1e08aaae2b"}, +] + +[package.dependencies] +docutils = "*" +setuptools = "*" +sphinx = "*" +wheel = "*" + +[package.extras] +sphinx = ["matplotlib", "myst-nb", "numpy", "sphinx-book-theme", "sphinx-design", "sphinx-examples"] + +[[package]] +name = "sphinxcontrib-applehelp" +version = "2.0.0" +description = "sphinxcontrib-applehelp is a Sphinx extension which outputs Apple help books" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinxcontrib_applehelp-2.0.0-py3-none-any.whl", hash = "sha256:4cd3f0ec4ac5dd9c17ec65e9ab272c9b867ea77425228e68ecf08d6b28ddbdb5"}, + {file = "sphinxcontrib_applehelp-2.0.0.tar.gz", hash = "sha256:2f29ef331735ce958efa4734873f084941970894c6090408b079c61b2e1c06d1"}, +] + +[package.extras] +lint = ["mypy", "ruff (==0.5.5)", "types-docutils"] +standalone = ["Sphinx (>=5)"] +test = ["pytest"] + +[[package]] +name = "sphinxcontrib-bibtex" +version = "2.6.3" +description = "Sphinx extension for BibTeX style citations." 
+optional = false +python-versions = ">=3.7" +files = [ + {file = "sphinxcontrib_bibtex-2.6.3-py3-none-any.whl", hash = "sha256:ff016b738fcc867df0f75c29e139b3b2158d26a2c802db27963cb128be3b75fb"}, + {file = "sphinxcontrib_bibtex-2.6.3.tar.gz", hash = "sha256:7c790347ef1cb0edf30de55fc324d9782d085e89c52c2b8faafa082e08e23946"}, +] + +[package.dependencies] +docutils = ">=0.8,<0.18.dev0 || >=0.20.dev0" +pybtex = ">=0.24" +pybtex-docutils = ">=1.0.0" +setuptools = {version = "*", markers = "python_version >= \"3.12\""} +Sphinx = ">=3.5" + +[package.extras] +test = ["pytest", "pytest-cov"] + +[[package]] +name = "sphinxcontrib-devhelp" +version = "2.0.0" +description = "sphinxcontrib-devhelp is a sphinx extension which outputs Devhelp documents" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinxcontrib_devhelp-2.0.0-py3-none-any.whl", hash = "sha256:aefb8b83854e4b0998877524d1029fd3e6879210422ee3780459e28a1f03a8a2"}, + {file = "sphinxcontrib_devhelp-2.0.0.tar.gz", hash = "sha256:411f5d96d445d1d73bb5d52133377b4248ec79db5c793ce7dbe59e074b4dd1ad"}, +] + +[package.extras] +lint = ["mypy", "ruff (==0.5.5)", "types-docutils"] +standalone = ["Sphinx (>=5)"] +test = ["pytest"] + +[[package]] +name = "sphinxcontrib-htmlhelp" +version = "2.1.0" +description = "sphinxcontrib-htmlhelp is a sphinx extension which renders HTML help files" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinxcontrib_htmlhelp-2.1.0-py3-none-any.whl", hash = "sha256:166759820b47002d22914d64a075ce08f4c46818e17cfc9470a9786b759b19f8"}, + {file = "sphinxcontrib_htmlhelp-2.1.0.tar.gz", hash = "sha256:c9e2916ace8aad64cc13a0d233ee22317f2b9025b9cf3295249fa985cc7082e9"}, +] + +[package.extras] +lint = ["mypy", "ruff (==0.5.5)", "types-docutils"] +standalone = ["Sphinx (>=5)"] +test = ["html5lib", "pytest"] + +[[package]] +name = "sphinxcontrib-jsmath" +version = "1.0.1" +description = "A sphinx extension which renders display math in HTML via JavaScript" +optional = false +python-versions = ">=3.5" +files = [ + {file = "sphinxcontrib-jsmath-1.0.1.tar.gz", hash = "sha256:a9925e4a4587247ed2191a22df5f6970656cb8ca2bd6284309578f2153e0c4b8"}, + {file = "sphinxcontrib_jsmath-1.0.1-py2.py3-none-any.whl", hash = "sha256:2ec2eaebfb78f3f2078e73666b1415417a116cc848b72e5172e596c871103178"}, +] + +[package.extras] +test = ["flake8", "mypy", "pytest"] + +[[package]] +name = "sphinxcontrib-mermaid" +version = "1.0.0" +description = "Mermaid diagrams in yours Sphinx powered docs" +optional = false +python-versions = ">=3.8" +files = [ + {file = "sphinxcontrib_mermaid-1.0.0-py3-none-any.whl", hash = "sha256:60b72710ea02087f212028feb09711225fbc2e343a10d34822fe787510e1caa3"}, + {file = "sphinxcontrib_mermaid-1.0.0.tar.gz", hash = "sha256:2e8ab67d3e1e2816663f9347d026a8dee4a858acdd4ad32dd1c808893db88146"}, +] + +[package.dependencies] +pyyaml = "*" +sphinx = "*" + +[package.extras] +test = ["defusedxml", "myst-parser", "pytest", "ruff", "sphinx"] + +[[package]] +name = "sphinxcontrib-qthelp" +version = "2.0.0" +description = "sphinxcontrib-qthelp is a sphinx extension which outputs QtHelp documents" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinxcontrib_qthelp-2.0.0-py3-none-any.whl", hash = "sha256:b18a828cdba941ccd6ee8445dbe72ffa3ef8cbe7505d8cd1fa0d42d3f2d5f3eb"}, + {file = "sphinxcontrib_qthelp-2.0.0.tar.gz", hash = "sha256:4fe7d0ac8fc171045be623aba3e2a8f613f8682731f9153bb2e40ece16b9bbab"}, +] + +[package.extras] +lint = ["mypy", "ruff (==0.5.5)", "types-docutils"] +standalone = ["Sphinx 
(>=5)"] +test = ["defusedxml (>=0.7.1)", "pytest"] + +[[package]] +name = "sphinxcontrib-serializinghtml" +version = "2.0.0" +description = "sphinxcontrib-serializinghtml is a sphinx extension which outputs \"serialized\" HTML files (json and pickle)" +optional = false +python-versions = ">=3.9" +files = [ + {file = "sphinxcontrib_serializinghtml-2.0.0-py3-none-any.whl", hash = "sha256:6e2cb0eef194e10c27ec0023bfeb25badbbb5868244cf5bc5bdc04e4464bf331"}, + {file = "sphinxcontrib_serializinghtml-2.0.0.tar.gz", hash = "sha256:e9d912827f872c029017a53f0ef2180b327c3f7fd23c87229f7a8e8b70031d4d"}, +] + +[package.extras] +lint = ["mypy", "ruff (==0.5.5)", "types-docutils"] +standalone = ["Sphinx (>=5)"] +test = ["pytest"] + +[[package]] +name = "sqlalchemy" +version = "2.0.36" +description = "Database Abstraction Library" +optional = false +python-versions = ">=3.7" +files = [ + {file = "SQLAlchemy-2.0.36-cp310-cp310-macosx_10_9_x86_64.whl", hash = "sha256:59b8f3adb3971929a3e660337f5dacc5942c2cdb760afcabb2614ffbda9f9f72"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:37350015056a553e442ff672c2d20e6f4b6d0b2495691fa239d8aa18bb3bc908"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:8318f4776c85abc3f40ab185e388bee7a6ea99e7fa3a30686580b209eaa35c08"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c245b1fbade9c35e5bd3b64270ab49ce990369018289ecfde3f9c318411aaa07"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:69f93723edbca7342624d09f6704e7126b152eaed3cdbb634cb657a54332a3c5"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:f9511d8dd4a6e9271d07d150fb2f81874a3c8c95e11ff9af3a2dfc35fe42ee44"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-win32.whl", hash = "sha256:c3f3631693003d8e585d4200730616b78fafd5a01ef8b698f6967da5c605b3fa"}, + {file = "SQLAlchemy-2.0.36-cp310-cp310-win_amd64.whl", hash = "sha256:a86bfab2ef46d63300c0f06936bd6e6c0105faa11d509083ba8f2f9d237fb5b5"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-macosx_10_9_x86_64.whl", hash = "sha256:fd3a55deef00f689ce931d4d1b23fa9f04c880a48ee97af488fd215cf24e2a6c"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:4f5e9cd989b45b73bd359f693b935364f7e1f79486e29015813c338450aa5a71"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:d0ddd9db6e59c44875211bc4c7953a9f6638b937b0a88ae6d09eb46cced54eff"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2519f3a5d0517fc159afab1015e54bb81b4406c278749779be57a569d8d1bb0d"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:59b1ee96617135f6e1d6f275bbe988f419c5178016f3d41d3c0abb0c819f75bb"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:39769a115f730d683b0eb7b694db9789267bcd027326cccc3125e862eb03bfd8"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-win32.whl", hash = "sha256:66bffbad8d6271bb1cc2f9a4ea4f86f80fe5e2e3e501a5ae2a3dc6a76e604e6f"}, + {file = "SQLAlchemy-2.0.36-cp311-cp311-win_amd64.whl", hash = "sha256:23623166bfefe1487d81b698c423f8678e80df8b54614c2bf4b4cfcd7c711959"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:f7b64e6ec3f02c35647be6b4851008b26cff592a95ecb13b6788a54ef80bbdd4"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-macosx_11_0_arm64.whl", hash = 
"sha256:46331b00096a6db1fdc052d55b101dbbfc99155a548e20a0e4a8e5e4d1362855"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fdf3386a801ea5aba17c6410dd1dc8d39cf454ca2565541b5ac42a84e1e28f53"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:ac9dfa18ff2a67b09b372d5db8743c27966abf0e5344c555d86cc7199f7ad83a"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:90812a8933df713fdf748b355527e3af257a11e415b613dd794512461eb8a686"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1bc330d9d29c7f06f003ab10e1eaced295e87940405afe1b110f2eb93a233588"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-win32.whl", hash = "sha256:79d2e78abc26d871875b419e1fd3c0bca31a1cb0043277d0d850014599626c2e"}, + {file = "SQLAlchemy-2.0.36-cp312-cp312-win_amd64.whl", hash = "sha256:b544ad1935a8541d177cb402948b94e871067656b3a0b9e91dbec136b06a2ff5"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b5cc79df7f4bc3d11e4b542596c03826063092611e481fcf1c9dfee3c94355ef"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3c01117dd36800f2ecaa238c65365b7b16497adc1522bf84906e5710ee9ba0e8"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:9bc633f4ee4b4c46e7adcb3a9b5ec083bf1d9a97c1d3854b92749d935de40b9b"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:9e46ed38affdfc95d2c958de328d037d87801cfcbea6d421000859e9789e61c2"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:b2985c0b06e989c043f1dc09d4fe89e1616aadd35392aea2844f0458a989eacf"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:4a121d62ebe7d26fec9155f83f8be5189ef1405f5973ea4874a26fab9f1e262c"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-win32.whl", hash = "sha256:0572f4bd6f94752167adfd7c1bed84f4b240ee6203a95e05d1e208d488d0d436"}, + {file = "SQLAlchemy-2.0.36-cp313-cp313-win_amd64.whl", hash = "sha256:8c78ac40bde930c60e0f78b3cd184c580f89456dd87fc08f9e3ee3ce8765ce88"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-macosx_10_9_x86_64.whl", hash = "sha256:be9812b766cad94a25bc63bec11f88c4ad3629a0cec1cd5d4ba48dc23860486b"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:50aae840ebbd6cdd41af1c14590e5741665e5272d2fee999306673a1bb1fdb4d"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:4557e1f11c5f653ebfdd924f3f9d5ebfc718283b0b9beebaa5dd6b77ec290971"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-musllinux_1_2_aarch64.whl", hash = "sha256:07b441f7d03b9a66299ce7ccf3ef2900abc81c0db434f42a5694a37bd73870f2"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-musllinux_1_2_x86_64.whl", hash = "sha256:28120ef39c92c2dd60f2721af9328479516844c6b550b077ca450c7d7dc68575"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-win32.whl", hash = "sha256:b81ee3d84803fd42d0b154cb6892ae57ea6b7c55d8359a02379965706c7efe6c"}, + {file = "SQLAlchemy-2.0.36-cp37-cp37m-win_amd64.whl", hash = "sha256:f942a799516184c855e1a32fbc7b29d7e571b52612647866d4ec1c3242578fcb"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-macosx_10_9_x86_64.whl", hash = "sha256:3d6718667da04294d7df1670d70eeddd414f313738d20a6f1d1f379e3139a545"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-macosx_11_0_arm64.whl", hash = 
"sha256:72c28b84b174ce8af8504ca28ae9347d317f9dba3999e5981a3cd441f3712e24"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:b11d0cfdd2b095e7b0686cf5fabeb9c67fae5b06d265d8180715b8cfa86522e3"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e32092c47011d113dc01ab3e1d3ce9f006a47223b18422c5c0d150af13a00687"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-musllinux_1_2_aarch64.whl", hash = "sha256:6a440293d802d3011028e14e4226da1434b373cbaf4a4bbb63f845761a708346"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-musllinux_1_2_x86_64.whl", hash = "sha256:c54a1e53a0c308a8e8a7dffb59097bff7facda27c70c286f005327f21b2bd6b1"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-win32.whl", hash = "sha256:1e0d612a17581b6616ff03c8e3d5eff7452f34655c901f75d62bd86449d9750e"}, + {file = "SQLAlchemy-2.0.36-cp38-cp38-win_amd64.whl", hash = "sha256:8958b10490125124463095bbdadda5aa22ec799f91958e410438ad6c97a7b793"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-macosx_10_9_x86_64.whl", hash = "sha256:dc022184d3e5cacc9579e41805a681187650e170eb2fd70e28b86192a479dcaa"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:b817d41d692bf286abc181f8af476c4fbef3fd05e798777492618378448ee689"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:a4e46a888b54be23d03a89be510f24a7652fe6ff660787b96cd0e57a4ebcb46d"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c4ae3005ed83f5967f961fd091f2f8c5329161f69ce8480aa8168b2d7fe37f06"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:03e08af7a5f9386a43919eda9de33ffda16b44eb11f3b313e6822243770e9763"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:3dbb986bad3ed5ceaf090200eba750b5245150bd97d3e67343a3cfed06feecf7"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-win32.whl", hash = "sha256:9fe53b404f24789b5ea9003fc25b9a3988feddebd7e7b369c8fac27ad6f52f28"}, + {file = "SQLAlchemy-2.0.36-cp39-cp39-win_amd64.whl", hash = "sha256:af148a33ff0349f53512a049c6406923e4e02bf2f26c5fb285f143faf4f0e46a"}, + {file = "SQLAlchemy-2.0.36-py3-none-any.whl", hash = "sha256:fddbe92b4760c6f5d48162aef14824add991aeda8ddadb3c31d56eb15ca69f8e"}, + {file = "sqlalchemy-2.0.36.tar.gz", hash = "sha256:7f2767680b6d2398aea7082e45a774b2b0767b5c8d8ffb9c8b683088ea9b29c5"}, +] + +[package.dependencies] +typing-extensions = ">=4.6.0" + +[package.extras] +aiomysql = ["aiomysql (>=0.2.0)", "greenlet (!=0.4.17)"] +aioodbc = ["aioodbc", "greenlet (!=0.4.17)"] +aiosqlite = ["aiosqlite", "greenlet (!=0.4.17)", "typing_extensions (!=3.10.0.1)"] +asyncio = ["greenlet (!=0.4.17)"] +asyncmy = ["asyncmy (>=0.2.3,!=0.2.4,!=0.2.6)", "greenlet (!=0.4.17)"] +mariadb-connector = ["mariadb (>=1.0.1,!=1.1.2,!=1.1.5,!=1.1.10)"] +mssql = ["pyodbc"] +mssql-pymssql = ["pymssql"] +mssql-pyodbc = ["pyodbc"] +mypy = ["mypy (>=0.910)"] +mysql = ["mysqlclient (>=1.4.0)"] +mysql-connector = ["mysql-connector-python"] +oracle = ["cx_oracle (>=8)"] +oracle-oracledb = ["oracledb (>=1.0.1)"] +postgresql = ["psycopg2 (>=2.7)"] +postgresql-asyncpg = ["asyncpg", "greenlet (!=0.4.17)"] +postgresql-pg8000 = ["pg8000 (>=1.29.1)"] +postgresql-psycopg = ["psycopg (>=3.0.7)"] +postgresql-psycopg2binary = ["psycopg2-binary"] +postgresql-psycopg2cffi = ["psycopg2cffi"] +postgresql-psycopgbinary = ["psycopg[binary] (>=3.0.7)"] +pymysql = ["pymysql"] +sqlcipher = ["sqlcipher3_binary"] + +[[package]] 
+name = "stack-data" +version = "0.6.3" +description = "Extract data from python stack frames and tracebacks for informative displays" +optional = false +python-versions = "*" +files = [ + {file = "stack_data-0.6.3-py3-none-any.whl", hash = "sha256:d5558e0c25a4cb0853cddad3d77da9891a08cb85dd9f9f91b9f8cd66e511e695"}, + {file = "stack_data-0.6.3.tar.gz", hash = "sha256:836a778de4fec4dcd1dcd89ed8abff8a221f58308462e1c4aa2a3cf30148f0b9"}, +] + +[package.dependencies] +asttokens = ">=2.1.0" +executing = ">=1.2.0" +pure-eval = "*" + +[package.extras] +tests = ["cython", "littleutils", "pygments", "pytest", "typeguard"] + +[[package]] +name = "tabulate" +version = "0.9.0" +description = "Pretty-print tabular data" +optional = false +python-versions = ">=3.7" +files = [ + {file = "tabulate-0.9.0-py3-none-any.whl", hash = "sha256:024ca478df22e9340661486f85298cff5f6dcdba14f3813e8830015b9ed1948f"}, + {file = "tabulate-0.9.0.tar.gz", hash = "sha256:0095b12bf5966de529c0feb1fa08671671b3368eec77d7ef7ab114be2c068b3c"}, +] + +[package.extras] +widechars = ["wcwidth"] + +[[package]] +name = "tornado" +version = "6.4.2" +description = "Tornado is a Python web framework and asynchronous networking library, originally developed at FriendFeed." +optional = false +python-versions = ">=3.8" +files = [ + {file = "tornado-6.4.2-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:e828cce1123e9e44ae2a50a9de3055497ab1d0aeb440c5ac23064d9e44880da1"}, + {file = "tornado-6.4.2-cp38-abi3-macosx_10_9_x86_64.whl", hash = "sha256:072ce12ada169c5b00b7d92a99ba089447ccc993ea2143c9ede887e0937aa803"}, + {file = "tornado-6.4.2-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a017d239bd1bb0919f72af256a970624241f070496635784d9bf0db640d3fec"}, + {file = "tornado-6.4.2-cp38-abi3-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c36e62ce8f63409301537222faffcef7dfc5284f27eec227389f2ad11b09d946"}, + {file = "tornado-6.4.2-cp38-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bca9eb02196e789c9cb5c3c7c0f04fb447dc2adffd95265b2c7223a8a615ccbf"}, + {file = "tornado-6.4.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:304463bd0772442ff4d0f5149c6f1c2135a1fae045adf070821c6cdc76980634"}, + {file = "tornado-6.4.2-cp38-abi3-musllinux_1_2_i686.whl", hash = "sha256:c82c46813ba483a385ab2a99caeaedf92585a1f90defb5693351fa7e4ea0bf73"}, + {file = "tornado-6.4.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:932d195ca9015956fa502c6b56af9eb06106140d844a335590c1ec7f5277d10c"}, + {file = "tornado-6.4.2-cp38-abi3-win32.whl", hash = "sha256:2876cef82e6c5978fde1e0d5b1f919d756968d5b4282418f3146b79b58556482"}, + {file = "tornado-6.4.2-cp38-abi3-win_amd64.whl", hash = "sha256:908b71bf3ff37d81073356a5fadcc660eb10c1476ee6e2725588626ce7e5ca38"}, + {file = "tornado-6.4.2.tar.gz", hash = "sha256:92bad5b4746e9879fd7bf1eb21dce4e3fc5128d71601f80005afa39237ad620b"}, +] + +[[package]] +name = "traitlets" +version = "5.14.3" +description = "Traitlets Python configuration system" +optional = false +python-versions = ">=3.8" +files = [ + {file = "traitlets-5.14.3-py3-none-any.whl", hash = "sha256:b74e89e397b1ed28cc831db7aea759ba6640cb3de13090ca145426688ff1ac4f"}, + {file = "traitlets-5.14.3.tar.gz", hash = "sha256:9ed0579d3502c94b4b3732ac120375cda96f923114522847de4b3bb98b96b6b7"}, +] + +[package.extras] +docs = ["myst-parser", "pydata-sphinx-theme", "sphinx"] +test = ["argcomplete (>=3.0.3)", "mypy (>=1.7.0)", "pre-commit", "pytest 
(>=7.0,<8.2)", "pytest-mock", "pytest-mypy-testing"] + +[[package]] +name = "typing-extensions" +version = "4.12.2" +description = "Backported and Experimental Type Hints for Python 3.8+" +optional = false +python-versions = ">=3.8" +files = [ + {file = "typing_extensions-4.12.2-py3-none-any.whl", hash = "sha256:04e5ca0351e0f3f85c6853954072df659d0d13fac324d0072316b67d7794700d"}, + {file = "typing_extensions-4.12.2.tar.gz", hash = "sha256:1a7ead55c7e559dd4dee8856e3a88b41225abfe1ce8df57b7c13915fe121ffb8"}, +] + +[[package]] +name = "uc-micro-py" +version = "1.0.3" +description = "Micro subset of unicode data files for linkify-it-py projects." +optional = false +python-versions = ">=3.7" +files = [ + {file = "uc-micro-py-1.0.3.tar.gz", hash = "sha256:d321b92cff673ec58027c04015fcaa8bb1e005478643ff4a500882eaab88c48a"}, + {file = "uc_micro_py-1.0.3-py3-none-any.whl", hash = "sha256:db1dffff340817673d7b466ec86114a9dc0e9d4d9b5ba229d9d60e5c12600cd5"}, +] + +[package.extras] +test = ["coverage", "pytest", "pytest-cov"] + +[[package]] +name = "urllib3" +version = "2.3.0" +description = "HTTP library with thread-safe connection pooling, file post, and more." +optional = false +python-versions = ">=3.9" +files = [ + {file = "urllib3-2.3.0-py3-none-any.whl", hash = "sha256:1cee9ad369867bfdbbb48b7dd50374c0967a0bb7710050facf0dd6911440e3df"}, + {file = "urllib3-2.3.0.tar.gz", hash = "sha256:f8c5449b3cf0861679ce7e0503c7b44b5ec981bec0d1d3795a07f1ba96f0204d"}, +] + +[package.extras] +brotli = ["brotli (>=1.0.9)", "brotlicffi (>=0.8.0)"] +h2 = ["h2 (>=4,<5)"] +socks = ["pysocks (>=1.5.6,!=1.5.7,<2.0)"] +zstd = ["zstandard (>=0.18.0)"] + +[[package]] +name = "wcwidth" +version = "0.2.13" +description = "Measures the displayed width of unicode strings in a terminal" +optional = false +python-versions = "*" +files = [ + {file = "wcwidth-0.2.13-py2.py3-none-any.whl", hash = "sha256:3da69048e4540d84af32131829ff948f1e022c1c6bdb8d6102117aac784f6859"}, + {file = "wcwidth-0.2.13.tar.gz", hash = "sha256:72ea0c06399eb286d978fdedb6923a9eb47e1c486ce63e9b4e64fc18303972b5"}, +] + +[[package]] +name = "wheel" +version = "0.45.1" +description = "A built-package format for Python" +optional = false +python-versions = ">=3.8" +files = [ + {file = "wheel-0.45.1-py3-none-any.whl", hash = "sha256:708e7481cc80179af0e556bbf0cc00b8444c7321e2700b8d8580231d13017248"}, + {file = "wheel-0.45.1.tar.gz", hash = "sha256:661e1abd9198507b1409a20c02106d9670b2576e916d58f520316666abca6729"}, +] + +[package.extras] +test = ["pytest (>=6.0.0)", "setuptools (>=65)"] + +[[package]] +name = "zipp" +version = "3.21.0" +description = "Backport of pathlib-compatible object wrapper for zip files" +optional = false +python-versions = ">=3.9" +files = [ + {file = "zipp-3.21.0-py3-none-any.whl", hash = "sha256:ac1bbe05fd2991f160ebce24ffbac5f6d11d83dc90891255885223d42b3cd931"}, + {file = "zipp-3.21.0.tar.gz", hash = "sha256:2c9958f6430a2040341a52eb608ed6dd93ef4392e02ffe219417c1b28b5dd1f4"}, +] + +[package.extras] +check = ["pytest-checkdocs (>=2.4)", "pytest-ruff (>=0.2.1)"] +cover = ["pytest-cov"] +doc = ["furo", "jaraco.packaging (>=9.3)", "jaraco.tidelift (>=1.4)", "rst.linker (>=1.9)", "sphinx (>=3.5)", "sphinx-lint"] +enabler = ["pytest-enabler (>=2.2)"] +test = ["big-O", "importlib-resources", "jaraco.functools", "jaraco.itertools", "jaraco.test", "more-itertools", "pytest (>=6,!=8.1.*)", "pytest-ignore-flaky"] +type = ["pytest-mypy"] + +[metadata] +lock-version = "2.0" +python-versions = "^3.13" +content-hash = 
"01c2fcac2095d5742ed480ee6ae7ec1958c3ed53e06c2d9cd796899715a4e7bc" diff --git a/pyproject.toml b/pyproject.toml new file mode 100644 index 0000000..6b73104 --- /dev/null +++ b/pyproject.toml @@ -0,0 +1,18 @@ +[tool.poetry] +name = "pf-recommended-practices" +version = "0.1.0" +description = "" +authors = ["Daniel Wheeler "] +readme = "README.md" + +[tool.poetry.dependencies] +python = "^3.13" +jupyter-book = "^1.0.3" +numpy = "^2.2.1" +matplotlib = "^3.10" +sphinxcontrib-mermaid = "^1.0.0" + + +[build-system] +requires = ["poetry-core"] +build-backend = "poetry.core.masonry.api" From eb736e9cb1aa4f99fe2f7a8ef36c254dd39ae44d Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Fri, 3 Jan 2025 12:14:06 -0500 Subject: [PATCH 05/19] remove ruby environment from flake.nix --- flake.nix | 6 ++---- .../bp-guide-gh/ch3-data-generation-and-curation.md | 1 - 2 files changed, 2 insertions(+), 5 deletions(-) diff --git a/flake.nix b/flake.nix index 139bc26..0b42686 100644 --- a/flake.nix +++ b/flake.nix @@ -40,9 +40,7 @@ python = pkgs.python313; }; env = pkgs.poetry2nix.mkPoetryEnv args; - app = pkgs.poetry2nix.mkPoetryApplication args; - - ruby = pkgs.ruby.withPackages (ps: with ps; [ jekyll kramdown-parser-gfm webrick ]); + app = pkgs.poetry2nix.mkPoetryApplication args;3 in rec { ## See https://github.com/nix-community/poetry2nix/issues/1433 @@ -50,7 +48,7 @@ ## environment ## devShells.default = env.env; devShells.default = pkgs.mkShell { - packages = [ env ruby ]; + packages = [ env ]; shellHook = '' export PYTHONPATH=$PWD ''; diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 40a70ab..b1733d9 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -468,5 +468,4 @@ research will greatly benefit this process. [fair-phase-field]: https://doi.org/10.5281/zenodo.7254581 [schemaorg]: https://github.com/openschemas/schemaorg [structured data schema]: https://en.wikipedia.org/wiki/Data_model -[FAIR]: https://doi.org/10.1038/sdata.2016.18 From fca7960b815f56bff730e0cfb5cb1b7841744245 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Fri, 3 Jan 2025 14:43:49 -0500 Subject: [PATCH 06/19] update the definitions section of the data generation guide --- pf-recommended-practices/_config.yml | 3 + .../ch3-data-generation-and-curation.md | 111 ++++++++++-------- .../bp-guide-gh/ch4-software-development.md | 1 + pf-recommended-practices/references.bib | 9 ++ 4 files changed, 75 insertions(+), 49 deletions(-) diff --git a/pf-recommended-practices/_config.yml b/pf-recommended-practices/_config.yml index db884e6..4df5dc4 100644 --- a/pf-recommended-practices/_config.yml +++ b/pf-recommended-practices/_config.yml @@ -45,5 +45,8 @@ parse: sphinx: config: mathjax_path: https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js + mermaid_version: "11.4.1" extra_extensions: - sphinxcontrib.mermaid + + diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index b1733d9..82d1fd7 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -22,22 +22,24 @@ Gauss points. Typically, data is stored at sufficient temporal frequency to reconstruct the evolution of the field variables. 
In recent years there have been efforts to embed phase field models -into ICME-based materials design workflows -{cite}`TOURRET2022100810`. However, to leverage relevant phase field -resources for these workflows a systematic approach is required for -archiving and accessing data. Furthermore, it is often difficult for -downstream researchers to find and access raw or minimally processed -data from phase field studies, before the post-processing steps and -final publication. In this document, we will provide motivation, -guidance and a template for packaging and publishing FAIR data from -phase field studies as well as managing unpublished raw data -{cite}`Wilkinson2016`. Following the protocols outlined in this guide -will provide downstream researchers with an enhanced capability to use -phase field as part of larger ICME workflows and, in particular, data -intensive usages such as AI surrogate models. This guide serves as a -primer rather than a detailied reference on scientific data, aiming to -stimulate thought and ensure that phase field practitioners are aware -of the key considerations before initiating a phase field study. +into integrated computataional materials engineering (ICME) based +materials design workflows {cite}`TOURRET2022100810`. However, to +leverage relevant phase field resources for these workflows a +systematic approach is required for archiving and accessing +data. Furthermore, it is often difficult for downstream researchers to +find and access raw or minimally processed data from phase field +studies, before the post-processing steps and final publication. In +this document, we will provide motivation, guidance and a template for +packaging and publishing findable, accessible, interoperable and +reusable (FAIR) data from phase field studies as well as managing +unpublished raw data {cite}`Wilkinson2016`. Following the protocols +outlined in this guide will provide downstream researchers with an +enhanced capability to use phase field as part of larger ICME +workflows and, in particular, data intensive usages such as AI +surrogate models. This guide serves as a primer rather than a +detailied reference on scientific data, aiming to stimulate thought +and ensure that phase field practitioners are aware of the key +considerations before initiating a phase field study. ## Definitions @@ -46,48 +48,54 @@ concepts of FAIR data management when applied to phase field studies. Broadly speaking, **FAIR data management** encompasses the curation of simulation workflows (including the software, data inputs and data outputs) for subsequent researchers or even machine -agents. **FAIR data** concepts have been well explained elsewhere, see -[X, Y, Z]. A **scientific workflow** is generally conceptualized as a -graph of connected actions with various inputs and outputs. Some of -the nodes in a workflow may not be entirely automated and require -human agent inputs, which can increase the complexity of workflow +agents. **FAIR data** concepts for simulation workflows have been well +explained elsewhere, see +{cite}`wilkinson2024applyingfairprinciplescomputational`. A +**scientific workflow** is generally conceptualized as a graph of +connected actions with various inputs and outputs. Some of the nodes +in a workflow may not be entirely automated and require human agent +inputs, which can increase the complexity of workflow curation. Workflow nodes include the pre and post-processing steps for phase field simulation workflows. 
In this guide, the **raw and -post-processed** data is considered to be different from the -**metadata**, which describes the simulation and associated -workflow. The **raw data** is generated by the simulation as it is -running and often consists of field data. The **post-processed data** -consists of derived quantities and images generated using the **raw -data**. The **software** used the for the simulation generally refers -to the phase field code used to run the simulation directly and is -part of the larger **computational environment**. The **code** might -also refer to **software**, but the distinction is that the **code** -may have been modified by the researcher and might include **input -files** to the **software application**. Although software and code -can be considered as data, the curation process involves different -tools and practices. See the [Software Development] section of the -best practices guide for a more detailed discussion of software and -code curation. +post-processed data** is considered to be distinct from the +**metadata**, which describes the simulation, associated workflow and +data files. The **raw data** is generated by the simulation as it is +running and often consists of temporal field data. The +**post-processed data** consists of derived quantities and images +generated using the **raw data**. The **software** or **software +application** used to perform the simulation generally refers to a +particular phase field code, which is part of the larger +**computational environment**. The **code** might also refer to the +**software**, but the distinction is that the **code** may have been +modified by the researcher and might include **input files** to the +**software application**. Although the code can be considered as data +in the larger sense, in this work, the data curation process excludes +consideration of code curation, which involves its own distinct +practices. See the [Software Development](label-software-development) +section of the best practices guide for a more detailed discussion of +software and code curation. ```{mermaid} --- title: A Phase Field Workflow ---p flowchart TD - id1(Input Files\nParameters) - id1.1(Code) - id2(Computational\nEnvironment) - id2.5[[Pre-processing,\ne.g. CALPHAD or Meshing]] - id2.7([Pro-processed Data]) - id3[[Phase Field Simulation]] - id3.5([Scratch Data]) - id4([Raw Field Data]) - id5[[Post-processing\ne.g. Data Visualization]] - id6([Post-processed Data\ne.g. Derivied Quantities, Images]) + id1@{ shape: lean-r, label: "Input Files", fill: #f96 } + id1.2@{ shape: lean-r, label: "Parameters" } + id1.1([Code]) + id2([Computational Environment]) + id2.5@{ shape: rect, label: "Pre-processing, (e.g. CALPHAD or Meshing)" } + id2.7@{ shape: bow-rect, label: "Pre-processed Data" } + id3[Phase Field Simulation] + id3.5@{ shape: bow-rect, label: "Scratch Data" } + id4@{ shape: bow-rect, label: "Raw Field Data" } + id5@{ shape: rect, label: "Post-processing (e.g. Data Visualization)" } + id6@{ shape: bow-rect, label: "Post-processed Data" } + id7@{ shape: lin-cyl, label: "Data Repository" } + id1.2-->id1 id1-->id2.5 id1-->id5 id2.5-->id2.7-->id3 - id2.5-->id3 id1-->id3 id1.1-->id3 id2-->id3 @@ -95,12 +103,14 @@ flowchart TD id3-->id3.5-->id3 id2-->id2.5 id2-->id5 + id6--Curation-->id7 + id4--Curation-->id7 ``` ## Data Generation -Let's first draw the distinction between data generation and data -curation. 
Data generation involves writing raw data to disk during the +Let's first draw the distinction between **data generation** and **data +curation**. Data generation involves writing raw data to disk during the simulation execution and generating post-processed data from that raw data. Data curation involves packaging the generated data objects from a phase field workflow or study along with sufficient provenance @@ -447,6 +457,8 @@ research will greatly benefit this process. ### Selecting a data repository +Dockstore and Workflowhub https://arxiv.org/pdf/2410.03490 + ## References ```{bibliography} @@ -468,4 +480,5 @@ research will greatly benefit this process. [fair-phase-field]: https://doi.org/10.5281/zenodo.7254581 [schemaorg]: https://github.com/openschemas/schemaorg [structured data schema]: https://en.wikipedia.org/wiki/Data_model +[link1]: https://workflows.community/groups/fair/best-practices/ diff --git a/pf-recommended-practices/bp-guide-gh/ch4-software-development.md b/pf-recommended-practices/bp-guide-gh/ch4-software-development.md index 557b9a0..0664b64 100644 --- a/pf-recommended-practices/bp-guide-gh/ch4-software-development.md +++ b/pf-recommended-practices/bp-guide-gh/ch4-software-development.md @@ -1,3 +1,4 @@ +(label-software-development)= # Software Development Daniel Schwen ([@dschwen](https://github.com/dschwen)), Jon Guyer ([@guyer](https://github.com/guyer)), Trevor Keller ([@tkphd](https://github.com/tkphd)) diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib index 280a3fd..a23c508 100644 --- a/pf-recommended-practices/references.bib +++ b/pf-recommended-practices/references.bib @@ -158,3 +158,12 @@ @Article{Wilkinson2016 url={https://doi.org/10.1038/sdata.2016.18} } +@misc{wilkinson2024applyingfairprinciplescomputational, + title={Applying the FAIR Principles to Computational Workflows}, + author={Sean R. Wilkinson and Meznah Aloqalaa and Khalid Belhajjame and Michael R. Crusoe and Bruno de Paula Kinoshita and Luiz Gadelha and Daniel Garijo and Ove Johan Ragnar Gustafsson and Nick Juty and Sehrish Kanwal and Farah Zaib Khan and Johannes Köster and Karsten Peters-von Gehlen and Line Pouchard and Randy K. Rannow and Stian Soiland-Reyes and Nicola Soranzo and Shoaib Sufi and Ziheng Sun and Baiba Vilne and Merridee A. Wouters and Denis Yuen and Carole Goble}, + year={2024}, + eprint={2410.03490}, + archivePrefix={arXiv}, + primaryClass={cs.DL}, + url={https://arxiv.org/abs/2410.03490}, +} \ No newline at end of file From 83d3803939d229b138b67cde95e8607b1a5d2371 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Fri, 3 Jan 2025 16:53:15 -0500 Subject: [PATCH 07/19] clean up writing to disk section of data generation --- .../ch3-data-generation-and-curation.md | 85 ++++++++++--------- pf-recommended-practices/references.bib | 13 ++- 2 files changed, 58 insertions(+), 40 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 82d1fd7..25e1022 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -22,7 +22,7 @@ Gauss points. Typically, data is stored at sufficient temporal frequency to reconstruct the evolution of the field variables. 
In recent years there have been efforts to embed phase field models -into integrated computataional materials engineering (ICME) based +into integrated computational materials engineering (ICME) based materials design workflows {cite}`TOURRET2022100810`. However, to leverage relevant phase field resources for these workflows a systematic approach is required for archiving and accessing @@ -37,7 +37,7 @@ outlined in this guide will provide downstream researchers with an enhanced capability to use phase field as part of larger ICME workflows and, in particular, data intensive usages such as AI surrogate models. This guide serves as a primer rather than a -detailied reference on scientific data, aiming to stimulate thought +detailed reference on scientific data, aiming to stimulate thought and ensure that phase field practitioners are aware of the key considerations before initiating a phase field study. @@ -82,8 +82,8 @@ title: A Phase Field Workflow flowchart TD id1@{ shape: lean-r, label: "Input Files", fill: #f96 } id1.2@{ shape: lean-r, label: "Parameters" } - id1.1([Code]) - id2([Computational Environment]) + id1.1@{ shape: lean-r, label: "Code" } + id2@{ shape: lean-r, label: "Computational Environment" } id2.5@{ shape: rect, label: "Pre-processing, (e.g. CALPHAD or Meshing)" } id2.7@{ shape: bow-rect, label: "Pre-processed Data" } id3[Phase Field Simulation] @@ -92,6 +92,7 @@ flowchart TD id5@{ shape: rect, label: "Post-processing (e.g. Data Visualization)" } id6@{ shape: bow-rect, label: "Post-processed Data" } id7@{ shape: lin-cyl, label: "Data Repository" } + id8@{ shape: bow-rect, label: "Metadata" } id1.2-->id1 id1-->id2.5 id1-->id5 @@ -105,6 +106,11 @@ flowchart TD id2-->id5 id6--Curation-->id7 id4--Curation-->id7 + id8--Curation-->id7 + id1.2-->id8 + id1.1-->id8 + id1-->id8 + id2-->id8 ``` ## Data Generation @@ -119,52 +125,50 @@ scientific studies. When performing a phase field simulation, one must be cognizant of several factors pertaining to data generation. Generally speaking, the -considerations can be defined as follow, +considerations can be defined as follow. -- choosing data to generate (and then curate), -- file formats, -- file system hierarchy, -- restarts and recovering from crashes -- data generation and workflow tools, and -- HPC environments and writing to disk in parallel. + - Writing raw data to disk + - File formats + - Recovering from crashes and restarts + - Using workflow tools + - High performance computing (HPC) environments and parallel writes These considerations are often conflicting, require trial and error to determine the best approach and are highly specific to the requirements of the workflow and post-processing. However, there are some general guidelines that will be outlined below. -### Choosing data to generate +### Writing raw data to disk Selecting the appropriate data to write to disk during the simulation largely depends on the requirements such as post-processing or debugging. However, it is good practice to consider future uses of the -data for future work such as subsequent researchers trying to reuse -the workflow or even reviewers. Lack of forethought in saving data -could hinder the data curation of the eventual curation of the data -research object. This step should be considered independently from -restarts. The data required to reconstruct derived quantities or the -evolution of field data will not be the same as the data required to -restart a simulation. 
- -Another aspect of saving data to disk is the frequency off the disk -writes. This choice can often impact the performance of a simulation -as the simulation might have to wait on the disk before continuing the -computation. A method to avoid this is to use a separate thread that -runs concurrently to write the data (in the same process), see -[Stackflow -Question](https://stackoverflow.com/questions/1014113/efficient-way-to-save-data-to-disk-while-running-a-computationally-intensive-tas). In -fact many practitioners overlook optimizing this part of aspect of -phase field codes -[ref](https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0202410). Generally, -when writing data it is best to do single large write to disk as -opposed to multiple small writes. In practice this could involve -caching multiple field variables across multiple print steps as a -single data blob to an HDF5 file. However, there is a trade off -between simulation performance and memory usage as well as latency and -communication overhead when considering parallel simulations. - -https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0202410 -https://dl.acm.org/doi/abs/10.1145/1713072.1713079 +data that might not be of immediate benefit to the research +study. Lack of forethought in retaining data could hinder the data +curation of the final research object. The data generation process +should be considered independently from restarts, which is discussed +in a [subsequent section](#label-restarts). In general, the data +required to reconstruct derived quantities or the evolution of field +data will not be the same as the data required to restart a +simulation. + +On many HPC systems writing to disk frequently can be expensive and +intermittently stall a simulation due to a number off factors such as +I/O contention, see the [HPC section +below](#label-hpc-environments). Generally, when writing data it is +best to use single large writes to disk as opposed to multiple small +writes especially on shared file systems (i.e. "perform more write +bytes per write function call" {cite}`9406732`). In practice this +could involve caching multiple field variables across multiple save +steps and then writing to disk as a single data blob in an HDF5 file +for example. This is a trade-off between IO efficiency and losing all +the data if the job crashes without writing any useful data as well as +simulation performance, memory usage and communication overhead when +considering parallel simulations. Overall, it is essential that the IO +part of a code is well profiled using different write +configurations. The replicability of writes should also be tested by +checking the hash of data based on parallel configurations and write +frequencies. ### File formats @@ -229,6 +233,7 @@ https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/ https://docs.vtk.org/en/latest/index.html https://docs.xarray.dev/en/stable/user-guide/io.html= +(label-restarts)= ### Recovering from crashes and restarts A study from 2020 of HPC systems calculated the success rate (I.e. no @@ -380,6 +385,8 @@ tools. 
- https://snakemake.readthedocs.io/en/stable/snakefiles/deployment.html +(label-hpc-environments)= +### HPC Environments and parallel writes ## Data Curation diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib index a23c508..cdbfe93 100644 --- a/pf-recommended-practices/references.bib +++ b/pf-recommended-practices/references.bib @@ -166,4 +166,15 @@ @misc{wilkinson2024applyingfairprinciplescomputational archivePrefix={arXiv}, primaryClass={cs.DL}, url={https://arxiv.org/abs/2410.03490}, -} \ No newline at end of file +} + +@INPROCEEDINGS{9406732, + author={Paul, Arnab K. and Faaland, Olaf and Moody, Adam and Gonsiorowski, Elsa and Mohror, Kathryn and Butt, Ali R.}, + booktitle={2020 IEEE 27th International Conference on High Performance Computing, Data, and Analytics (HiPC)}, + title={Understanding HPC Application I/O Behavior Using System Level Statistics}, + year={2020}, + volume={}, + number={}, + pages={202-211}, + keywords={Runtime;File systems;High performance computing;Conferences;Machine learning;Metadata;Servers;I/O analysis;High Performance Computing;Parallel File Systems;Lustre File System;I/O contention}, + doi={10.1109/HiPC50609.2020.00034}} \ No newline at end of file From d0e5ab5541ed1e96b03b14e409b472ac8b1620f3 Mon Sep 17 00:00:00 2001 From: Trevor Keller Date: Mon, 6 Jan 2025 10:23:45 -0500 Subject: [PATCH 08/19] POSIX files end on a blank line. --- .gitignore | 13 +++++++------ 1 file changed, 7 insertions(+), 6 deletions(-) diff --git a/.gitignore b/.gitignore index 102c919..e6a5571 100644 --- a/.gitignore +++ b/.gitignore @@ -1,10 +1,11 @@ +*.ipynb_checkpoints/ +*.log */.DS_Store +*~undo-tree~ .DS_Store -.idea -*.log -tmp/ -*.ipynb_checkpoints/ .auctex-auto/ -logo.png +.idea /_build/ -pf-recommended-practices/_build \ No newline at end of file +logo.png +pf-recommended-practices/_build +tmp/ From 53aad627d7401b34d1c1c6422b56089d06e722e1 Mon Sep 17 00:00:00 2001 From: Trevor Keller Date: Mon, 6 Jan 2025 10:31:11 -0500 Subject: [PATCH 09/19] Wrap lines at 79th col. --- README.md | 45 ++++++++++++++++++++++++++++++++++----------- 1 file changed, 34 insertions(+), 11 deletions(-) diff --git a/README.md b/README.md index 142eabb..a70ef7e 100644 --- a/README.md +++ b/README.md @@ -1,33 +1,56 @@ # Phase Field Method Recommended Practices -A set of recommended practices that help to ensure the best outcomes from your use of the phase-field method +A set of recommended practices that help to ensure the best outcomes from your +use of the phase-field method. ## Usage ### Building the book -If you'd like to develop and/or build the Phase Field Method Recommended Practices book, you should: +If you'd like to develop and/or build the Phase Field Method Recommended +Practices book, you should: 1. Clone this repository -2. Run `pip install -r requirements.txt` (it is recommended you do this within a virtual environment) -3. (Optional) Edit the books source files located in the `pf-recommended-practices/` directory -4. Run `jupyter-book clean pf-recommended-practices/` to remove any existing builds +2. Run `pip install -r requirements.txt` (it is recommended you do this within + a virtual environment) +3. (Optional) Edit the books source files located in the + `pf-recommended-practices/` directory +4. Run `jupyter-book clean pf-recommended-practices/` to remove any existing + builds 5. Run `jupyter-book build pf-recommended-practices/` A fully-rendered HTML version of the book will be built in -`pf-recommended-practices/_build/html/`. 
Render using `python -m -http.server` in the `pf-recommended-practices/_build/html/` directory. +`pf-recommended-practices/_build/html/`. Render using `python -m http.server` +in the `pf-recommended-practices/_build/html/` directory. ### Hosting the book -Please see the [Jupyter Book documentation](https://jupyterbook.org/publish/web.html) to discover options for deploying a book online using services such as GitHub, GitLab, or Netlify. +Please see the [Jupyter Book documentation][jb-docs] to discover options for +deploying a book online using services such as GitHub, GitLab, or Netlify. -For GitHub and GitLab deployment specifically, the [cookiecutter-jupyter-book](https://github.com/executablebooks/cookiecutter-jupyter-book) includes templates for, and information about, optional continuous integration (CI) workflow files to help easily and automatically deploy books online with GitHub or GitLab. For example, if you chose `github` for the `include_ci` cookiecutter option, your book template was created with a GitHub actions workflow file that, once pushed to GitHub, automatically renders and pushes your book to the `gh-pages` branch of your repo and hosts it on GitHub Pages when a push or pull request is made to the main branch. +For GitHub and GitLab deployment specifically, the +[cookiecutter-jupyter-book][eb-cook] repository includes templates for, and +information about, optional continuous integration (CI) workflow files to help +easily and automatically deploy books online with GitHub or GitLab. For +example, if you chose `github` for the `include_ci` cookiecutter option, your +book template was created with a GitHub actions workflow file that, once pushed +to GitHub, automatically renders and pushes your book to the `gh-pages` branch +of your repo and hosts it on GitHub Pages when a push or pull request is made +to the main branch. ## Contributors -We welcome and recognize all contributions. You can see a list of current contributors in the [contributors tab](https://github.com/guyer/pf-recommended-practices/graphs/contributors). +We welcome and recognize all contributions. You can see a list of current +contributors in the [contributors tab][gh-cont]. ## Credits -This project is created using the excellent open source [Jupyter Book project](https://jupyterbook.org/) and the [executablebooks/cookiecutter-jupyter-book template](https://github.com/executablebooks/cookiecutter-jupyter-book). +This project is created using the excellent open source [Jupyter Book][jb] +project and the [executablebooks/cookiecutter-jupyter-book][eb-cook] template. + + +[gh-cont]: https://github.com/guyer/pf-recommended-practices/graphs/contributors + +[eb-cook]: https://github.com/executablebooks/cookiecutter-jupyter-book +[jb]: https://jupyterbook.org/ +[jb-docs]: https://jupyterbook.org/publish/web.html From c78de72a7ca05073b8b09f87e6fa3e0f6845d353 Mon Sep 17 00:00:00 2001 From: Trevor Keller Date: Mon, 6 Jan 2025 11:50:35 -0500 Subject: [PATCH 10/19] Harmonize BiBTeX style, add entries. 
--- .../ch3-data-generation-and-curation.md | 548 +++++++++--------- pf-recommended-practices/references.bib | 541 +++++++++++------ 2 files changed, 623 insertions(+), 466 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 25e1022..8a73564 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -1,116 +1,106 @@ # Data Generation and Curation -- *[Trevor Keller](https://www.nist.gov/people/trevor-keller), NIST*, [@tkphd] -- *[Daniel Wheeler](https://www.nist.gov/people/daniel-wheeler), NIST*, [@wd15] -- *[Damien Pinto](https://ca.linkedin.com/in/damien-pinto-4748387b), McGill*, [@DamienPinto] +Authors: -# Data Generation and Curation +- [Trevor Keller](https://www.nist.gov/people/trevor-keller), NIST, [@tkphd] +- [Daniel Wheeler](https://www.nist.gov/people/daniel-wheeler), NIST, [@wd15] +- [Damien Pinto](https://ca.linkedin.com/in/damien-pinto-4748387b), McGill, [@DamienPinto] ## Overview - -Phase field models are characterized by a form of PDE related to an -Eulerian free boundary problem and defined by a diffuse -interface. Phase field models for practical applications require -sufficient high fidelity to resolve both the macro length scale -related to the application and the micro length scales associated with -the free boundary. Performing a useful phase field simulation requires -extensive computationally resources and can generate large volumes of -raw field data. This data consists of field variables defined -frequently across a domain or an interpolation function with many -Gauss points. Typically, data is stored at sufficient temporal -frequency to reconstruct the evolution of the field variables. - -In recent years there have been efforts to embed phase field models -into integrated computational materials engineering (ICME) based -materials design workflows {cite}`TOURRET2022100810`. However, to -leverage relevant phase field resources for these workflows a -systematic approach is required for archiving and accessing -data. Furthermore, it is often difficult for downstream researchers to -find and access raw or minimally processed data from phase field -studies, before the post-processing steps and final publication. In -this document, we will provide motivation, guidance and a template for -packaging and publishing findable, accessible, interoperable and -reusable (FAIR) data from phase field studies as well as managing -unpublished raw data {cite}`Wilkinson2016`. Following the protocols -outlined in this guide will provide downstream researchers with an -enhanced capability to use phase field as part of larger ICME -workflows and, in particular, data intensive usages such as AI -surrogate models. This guide serves as a primer rather than a -detailed reference on scientific data, aiming to stimulate thought -and ensure that phase field practitioners are aware of the key -considerations before initiating a phase field study. +Phase-field models are characterized by a form of PDE related to an Eulerian +free boundary problem and defined by a diffuse interface. Phase-field models +for practical applications require sufficient high fidelity to resolve both the +macro length scale related to the application and the micro length scales +associated with the free boundary. 
Performing a useful phase-field simulation +requires extensive computational resources and can generate large volumes of +raw field data. This data consists of field variables defined throughout a +discretized domain or an interpolation function with many Gauss +points. Typically, data is stored at sufficient temporal frequency to +reconstruct the evolution of the field variables. + +In recent years there have been efforts to embed phase-field models into +integrated computational materials engineering (ICME) based materials design +workflows {cite}`Tourret2022`. However, to leverage relevant phase-field +resources for these workflows a systematic approach is required for archiving +and accessing data. Furthermore, it is often difficult for downstream +researchers to find and access raw or minimally processed data from phase-field +studies, before the post-processing steps and final publication. In this +document, we will provide motivation, guidance and a template for packaging and +publishing findable, accessible, interoperable, and reusable (FAIR) data from +phase-field studies as well as managing unpublished raw data +{cite}`Wilkinson2016`. Following the protocols outlined in this guide will +provide downstream researchers with an enhanced capability to use phase-field +as part of larger ICME workflows and, in particular, data intensive usages such +as AI surrogate models. This guide serves as a primer rather than a detailed +reference on scientific data, aiming to stimulate thought and ensure that +phase-field practitioners are aware of the key considerations before initiating +a phase-field study. ## Definitions -It is beneficial for the reader to be clear regarding the main -concepts of FAIR data management when applied to phase field -studies. Broadly speaking, **FAIR data management** encompasses the -curation of simulation workflows (including the software, data inputs -and data outputs) for subsequent researchers or even machine -agents. **FAIR data** concepts for simulation workflows have been well -explained elsewhere, see -{cite}`wilkinson2024applyingfairprinciplescomputational`. A -**scientific workflow** is generally conceptualized as a graph of -connected actions with various inputs and outputs. Some of the nodes -in a workflow may not be entirely automated and require human agent -inputs, which can increase the complexity of workflow +It is beneficial for the reader to be clear regarding the main concepts of FAIR +data management when applied to phase-field studies. Broadly speaking, **FAIR +data management** encompasses the curation of simulation workflows (including +the software, data inputs and data outputs) for subsequent researchers or even +machine agents. **FAIR data** concepts for simulation workflows have been well +explained elsewhere, see {cite}`Wilkinson2024`. A **scientific workflow** is +generally conceptualized as a graph of connected actions with various inputs +and outputs. Some of the nodes in a workflow may not be entirely automated and +require human agent inputs, which can increase the complexity of workflow curation. Workflow nodes include the pre and post-processing steps for -phase field simulation workflows. In this guide, the **raw and -post-processed data** is considered to be distinct from the -**metadata**, which describes the simulation, associated workflow and -data files. The **raw data** is generated by the simulation as it is -running and often consists of temporal field data. 
The -**post-processed data** consists of derived quantities and images -generated using the **raw data**. The **software** or **software -application** used to perform the simulation generally refers to a -particular phase field code, which is part of the larger -**computational environment**. The **code** might also refer to the -**software**, but the distinction is that the **code** may have been -modified by the researcher and might include **input files** to the -**software application**. Although the code can be considered as data -in the larger sense, in this work, the data curation process excludes -consideration of code curation, which involves its own distinct -practices. See the [Software Development](label-software-development) -section of the best practices guide for a more detailed discussion of -software and code curation. +phase-field simulation workflows. In this guide, the **raw and post-processed +data** is considered to be distinct from the **metadata**, which describes the +simulation, associated workflow and data files. The **raw data** is generated +by the simulation as it is running and often consists of temporal field +data. The **post-processed data** consists of derived quantities and images +generated using the **raw data**. The **software** or **software application** +used to perform the simulation generally refers to a particular phase-field +code, which is part of the larger **computational environment**. The **code** +might also refer to the **software**, but the distinction is that the **code** +may have been modified by the researcher and might include **input files** to +the **software application**. Although the code can be considered as data in +the larger sense, in this work, the data curation process excludes +consideration of code curation, which involves its own distinct practices. See +the [Software Development](label-software-development) section of the best +practices guide for a more detailed discussion of software and code curation. ```{mermaid} --- -title: A Phase Field Workflow +title: A Phase-Field Workflow ---p flowchart TD - id1@{ shape: lean-r, label: "Input Files", fill: #f96 } - id1.2@{ shape: lean-r, label: "Parameters" } - id1.1@{ shape: lean-r, label: "Code" } - id2@{ shape: lean-r, label: "Computational Environment" } - id2.5@{ shape: rect, label: "Pre-processing, (e.g. CALPHAD or Meshing)" } + id1@{ shape: lean-r, label: "Input Files", fill: #f96 } + id1.2@{ shape: lean-r, label: "Parameters" } + id1.1@{ shape: lean-r, label: "Code" } + id2@{ shape: lean-r, label: "Computational Environment" } + id2.5@{ shape: rect, label: "Pre-processing, (e.g. CALPHAD or Meshing)" } id2.7@{ shape: bow-rect, label: "Pre-processed Data" } - id3[Phase Field Simulation] - id3.5@{ shape: bow-rect, label: "Scratch Data" } - id4@{ shape: bow-rect, label: "Raw Field Data" } - id5@{ shape: rect, label: "Post-processing (e.g. 
Data Visualization)" } - id6@{ shape: bow-rect, label: "Post-processed Data" } - id7@{ shape: lin-cyl, label: "Data Repository" } - id8@{ shape: bow-rect, label: "Metadata" } - id1.2-->id1 - id1-->id2.5 - id1-->id5 - id2.5-->id2.7-->id3 - id1-->id3 - id1.1-->id3 - id2-->id3 - id3-->id4-->id5-->id6 - id3-->id3.5-->id3 - id2-->id2.5 - id2-->id5 - id6--Curation-->id7 - id4--Curation-->id7 - id8--Curation-->id7 - id1.2-->id8 - id1.1-->id8 - id1-->id8 - id2-->id8 + id3[Phase-Field Simulation] + id3.5@{ shape: bow-rect, label: "Scratch Data" } + id4@{ shape: bow-rect, label: "Raw Field Data" } + id5@{ shape: rect, label: "Post-processing (e.g. Data Visualization)" } + id6@{ shape: bow-rect, label: "Post-processed Data" } + id7@{ shape: lin-cyl, label: "Data Repository" } + id8@{ shape: bow-rect, label: "Metadata" } + id1.2-->id1 + id1-->id2.5 + id1-->id5 + id2.5-->id2.7-->id3 + id1-->id3 + id1.1-->id3 + id2-->id3 + id3-->id4-->id5-->id6 + id3-->id3.5-->id3 + id2-->id2.5 + id2-->id5 + id6--Curation-->id7 + id4--Curation-->id7 + id8--Curation-->id7 + id1.2-->id8 + id1.1-->id8 + id1-->id8 + id2-->id8 ``` ## Data Generation @@ -118,142 +108,124 @@ flowchart TD Let's first draw the distinction between **data generation** and **data curation**. Data generation involves writing raw data to disk during the simulation execution and generating post-processed data from that raw -data. Data curation involves packaging the generated data objects from -a phase field workflow or study along with sufficient provenance -metadata into a FAIR research object for consumption by subsequent -scientific studies. +data. Data curation involves packaging the generated data objects from a +phase-field workflow or study along with sufficient provenance metadata into a +FAIR research object for consumption by subsequent scientific studies. -When performing a phase field simulation, one must be cognizant of -several factors pertaining to data generation. Generally speaking, the -considerations can be defined as follow. +When performing a phase-field simulation, one must be cognizant of several +factors pertaining to data generation. Generally speaking, the considerations +can be defined as follow. - - Writing raw data to disk - - File formats - - Recovering from crashes and restarts - - Using workflow tools - - High performance computing (HPC) environments and parallel writes +- Writing raw data to disk +- File formats +- Recovering from crashes and restarts +- Using workflow tools +- High performance computing (HPC) environments and parallel writes These considerations are often conflicting, require trial and error to -determine the best approach and are highly specific to the -requirements of the workflow and post-processing. However, there are -some general guidelines that will be outlined below. +determine the best approach and are highly specific to the requirements of the +workflow and post-processing. However, there are some general guidelines that +will be outlined below. ### Writing raw data to disk -Selecting the appropriate data to write to disk during the simulation -largely depends on the requirements such as post-processing or -debugging. However, it is good practice to consider future uses of the -data that might not be of immediate benefit to the research -study. Lack of forethought in retaining data could hinder the data -curation of the final research object. The data generation process -should be considered independently from restarts, which is discussed -in a [subsequent section](#label-restarts). 
In general, the data -required to reconstruct derived quantities or the evolution of field -data will not be the same as the data required to restart a -simulation. +Selecting the appropriate data to write to disk during the simulation largely +depends on the requirements such as post-processing or debugging. However, it +is good practice to consider future uses of the data that might not be of +immediate benefit to the research study. Lack of forethought in retaining data +could hinder the data curation of the final research object. The data +generation process should be considered independently from restarts, which is +discussed in a [subsequent section](#label-restarts). In general, the data +required to reconstruct derived quantities or the evolution of field data will +not be the same as the data required to restart a simulation. On many HPC systems writing to disk frequently can be expensive and -intermittently stall a simulation due to a number off factors such as -I/O contention, see the [HPC section -below](#label-hpc-environments). Generally, when writing data it is -best to use single large writes to disk as opposed to multiple small -writes especially on shared file systems (i.e. "perform more write -bytes per write function call" {cite}`9406732`). In practice this -could involve caching multiple field variables across multiple save -steps and then writing to disk as a single data blob in an HDF5 file -for example. This is a trade-off between IO efficiency and losing all -the data if the job crashes without writing any useful data as well as -simulation performance, memory usage and communication overhead when -considering parallel simulations. Overall, it is essential that the IO -part of a code is well profiled using different write -configurations. The replicability of writes should also be tested by -checking the hash of data based on parallel configurations and write -frequencies. +intermittently stall a simulation due to a number off factors such as I/O +contention, see the [HPC section below](#label-hpc-environments). Generally, +when writing data it is best to use single large writes to disk as opposed to +multiple small writes especially on shared file systems (i.e. "perform more +write bytes per write function call" {cite}`Paul2020`). In practice this could +involve caching multiple field variables across multiple save steps and then +writing to disk as a single data blob in an HDF5 file for example. This is a +trade-off between IO efficiency and losing all the data if the job crashes +without writing any useful data as well as simulation performance, memory usage +and communication overhead when considering parallel simulations. Overall, it +is essential that the IO part of a code is well profiled using different write +configurations. The replicability of writes should also be tested by checking +the hash of data based on parallel configurations and write frequencies. ### File formats -In general, when running phase field simulations, the user is limited -to the file format that the software supports. For example, if the -research is using PRISMS-PF the default data format is VTK and there -is no reason to seek an alternative. If an alternative file format is -required then the researcher could code a C++ function to write data -in an alternative format to VTK such as NetCDF. - -As a general rule it is best to choose file formats that work with the -tools already in use and / or that your colleagues are using. There -are other considerations to be aware of though. 
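
To make the advice about caching save steps and performing fewer, larger writes concrete, the sketch below accumulates two field variables in memory and flushes them to a single HDF5 file in one call every few save steps. This is a minimal illustration only: it assumes `h5py` and NumPy are available, and the array names, sizes and flush interval are arbitrary choices rather than recommendations.

```python
import numpy as np
import h5py

nx, ny = 512, 512
steps_per_flush = 10          # number of save steps to cache before one write
cache = {"phi": [], "c": []}  # in-memory cache of field snapshots

def save_step(step, phi, c):
    """Cache field snapshots in memory; write one large block per flush."""
    cache["phi"].append(phi.copy())
    cache["c"].append(c.copy())
    if len(cache["phi"]) == steps_per_flush:
        with h5py.File("raw_fields.h5", "a") as h5:
            grp = h5.create_group(f"steps_{step - steps_per_flush + 1}_{step}")
            grp.create_dataset("phi", data=np.stack(cache["phi"]), compression="gzip")
            grp.create_dataset("c", data=np.stack(cache["c"]), compression="gzip")
        cache["phi"].clear()
        cache["c"].clear()

# Toy usage with random arrays standing in for solver output.
for step in range(1, 31):
    save_step(step, np.random.rand(nx, ny), np.random.rand(nx, ny))
```

The trade-offs discussed above still apply to a scheme like this: a larger cache means fewer, bigger writes but more memory use and more data at risk if the job dies between flushes.
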
Human readable formats -such as CSV and JSON are often useful for small medium data sets (such -as derived quantities) as some metadata can be embedded alongside the -raw data resulting in a FAIRer data product than standard binary -formats. Some binary file formats also support metadata and might be -more useful for final data curation of a phase field study even if not -used during the research process. One main benefit of using binary -data (beyond saving disk space) is the ability to preserve full -precision for floating point numbers. The longevity of file formats -should be considered as well. A particularly egregious case of -ignoring longevity would be using the Pickle file format in Python, -which is both language dependent and code dependent. It is an example -of data serialization, which is used mainly for in-process data -storage for asynchronous tasks, but not good for long term data -storage. - -There are many binary formats used for storing field data based on an -Eulerian mesh or grid. Common formats for field data are NetCDF, VTK, -XDMF and EXODUS. Within the phase field community, VTK seems to be the -mostly widely used. VTK is actually a visualization library, but -supports a number of different native file formats based on both XML -and HDF5 (both non-binary and binary). The VTK library works well with -FE simulations supporting many different element types as well as -parallel data storage for domain decomposition. See the [XML file -formats -documentation](https://docs.vtk.org/en/latest/design_documents/VTKFileFormats.html#xml-file-formats) -for VTK for an overview of zoo of different file extensions and their -meaning. In contrast to VTK, NetCDF is more geared towards gridded -data having arisen from atmospheric research, which uses more FD and -FV than FE. For a comparison of performance and metrics for different -file types see the [Python MeshIO tools's -README.md](https://github.com/nschloe/meshio?tab=readme-ov-file#performance-comparison) - -The Python MeshIO tool is a good place to start for IO when writing -custom phase field codes in Python (or Julia using `pyimport`). MeshIO -is also a good place to start for exploring, debugging or picking -apart file data in an interactive Python environment, which can be -harder to do with dedicated viewing tools like Paraview. The -scientific Python ecosystem is very rich with tools for data -manipulation and storage such as Pandas, which supports storage in -many different formats, and xarray for higher dimensional data. xarray -supports NetCDF file storage, which includes coordinate systems and -metadata in HDF5. Both Pandas and xarray can be used in a parallel or -a distributed manner in conjucntion with Dask. Dask along with xarray -supports writing to the Zarr data format. Zarr allows data to be -stored on disk during analysis to avoid loading the entire data object -into memory. - -https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/ -https://docs.vtk.org/en/latest/index.html -https://docs.xarray.dev/en/stable/user-guide/io.html= +In general, when running phase-field simulations, the user is limited to the +file format that the software supports. For example, if the research is using +PRISMS-PF the default data format is VTK and there is no reason to seek an +alternative. If an alternative file format is required then the researcher +could code a C++ function to write data in an alternative format to VTK such as +NetCDF. 
+ +As a general rule it is best to choose file formats that work with the tools +already in use and / or that your colleagues are using. There are other +considerations to be aware of though. Human readable formats such as CSV and +JSON are often useful for small medium data sets (such as derived quantities) +as some metadata can be embedded alongside the raw data resulting in a FAIRer +data product than standard binary formats. Some binary file formats also +support metadata and might be more useful for final data curation of a +phase-field study even if not used during the research process. One main +benefit of using binary data (beyond saving disk space) is the ability to +preserve full precision for floating point numbers. The longevity of file +formats should be considered as well. A particularly egregious case of ignoring +longevity would be using the Pickle file format in Python, which is both +language dependent and code dependent. It is an example of data serialization, +which is used mainly for in-process data storage for asynchronous tasks, but +not good for long term data storage. + +There are many binary formats used for storing field data based on an Eulerian +mesh or grid. Common formats for field data are NetCDF, VTK, XDMF and +EXODUS. Within the phase-field community, VTK seems to be the mostly widely +used. VTK is actually a visualization library, but supports a number of +different native file formats based on both XML and HDF5 (both non-binary and +binary). The VTK library works well with FE simulations supporting many +different element types as well as parallel data storage for domain +decomposition. See the [XML file formats documentation][vtk-xml] for VTK for +an overview of zoo of different file extensions and their meaning. In contrast +to VTK, NetCDF is more geared towards gridded data having arisen from +atmospheric research, which uses more FD and FV than FE. For a comparison of +performance and metrics for different file types see the +[Python MeshIO tools's README.md][meshio]. + +The Python MeshIO tool is a good place to start for IO when writing custom +phase-field codes in Python (or Julia using `pyimport`). MeshIO is also a good +place to start for exploring, debugging or picking apart file data in an +interactive Python environment, which can be harder to do with dedicated +viewing tools like Paraview. The scientific Python ecosystem is very rich with +tools for data manipulation and storage such as Pandas, which supports storage +in many different formats, and xarray for higher dimensional data. xarray +supports NetCDF file storage, which includes coordinate systems and metadata in +HDF5. Both Pandas and xarray can be used in a parallel or a distributed manner +in conjucntion with Dask. Dask along with xarray supports writing to the Zarr +data format. Zarr allows data to be stored on disk during analysis to avoid +loading the entire data object into memory. + +- https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/ +- https://docs.vtk.org/en/latest/index.html +- https://docs.xarray.dev/en/stable/user-guide/io.html= (label-restarts)= ### Recovering from crashes and restarts -A study from 2020 of HPC systems calculated the success rate (I.e. no -error code on completion) of multi-node jobs with non-shared memory at -between 60 and 70 % -[Kumar](https://engineering.purdue.edu/dcsl/publications/papers/2020/fresco_dsn20_cameraready.pdf). This -success rate diminishes rapidly as the run time of jobs -increases. 
Needless to say that check-pointing is absolutely required -for any jobs of more than a few hours. Nearly everyday, an HPC -platform will experience some sort of failure -[Benoit1](https://inria.hal.science/hal-03264047/file/rr9413.pdf) -[Aupy](https://www.sciencedirect.com/science/article/abs/pii/S0743731513002219). That -doesn't mean that every job will fail everyday, but it would be -optimistic to think that jobs will go beyond a week without some -issues. Given that fact one can estimate how long it might take to run -a job without check-pointing. A very rough estimate for expected -completion time assuming instantaneous restarts and no queuing time is -given by, - -$$ E(T) = \frac{1}{2} \left(1 + e^{\frac{T}{\mu}} \right) T $$ +A study from 2020 of HPC systems calculated the success rate (I.e. no error +code on completion) of multi-node jobs with non-shared memory at between 60% +and 70% {cite}`Kumar2020`. This success rate diminishes rapidly as the run time +of jobs increases. Needless to say that check-pointing is absolutely required +for any jobs of more than a few hours. Nearly everyday, an HPC platform will +experience some sort of failure {cite}`Benoit2022b`, {cite}`Aupy2014`. That +doesn't mean that every job will fail every day, but it would be optimistic to +think that jobs will go beyond a week without some issues. Given that fact one +can estimate how long it might take to run a job without check-pointing. A very +rough estimate for expected completion time assuming instantaneous restarts and +no queuing time is given by, + +$$ E(T) = \frac{1}{2} \left(1 + e^{T / \mu} \right) T $$ where $T$ is the nominal job completion time with no failures and $\mu$ is the mean time to failure. The formula predicts an expected @@ -264,18 +236,19 @@ failure distribution the exponential time increase without check-pointing is inescapable. Assuming that we're agreed on the need for checkpoint, the next step is to decide on the optimal time interval between checkpoints. This is given by the well known -Young/Daly formula, $W=\sqrt{2 \mu C}$, where $C$ is the time taken -for a checkpoint -[Benoit2](https://icl.utk.edu/files/publications/2022/icl-utk-1569-2022.pdf) -[Bautista-Gomez](https://www.ittc.ku.edu/~sun/publications/fgcs24.pdf). The -Young/Daly formula accounts for the trade off between the start up -time cost for a job to get back to its original point of failure and -the cost associated with writing the checkpoint to disk. For example, -with a weekly failure rate and $C=6$ minutes, $W=5.8$ hours. In -practice these estimates for $\mu$ and $C$ might be a little -pessimistic, but be aware of the trade off. [Benoit1] . Note that some -HPC systems have upper bounds on run times (e.g. TACC has a 7 days -time limit so $\mu<7$ days regardless of other system failures). +Young/Daly formula, + +$$ W = \sqrt{2 \mu C} $$ + +where $C$ is the time taken for a checkpoint {cite}`Benoit2022a`, +{cite}`BautistaGomez2024`. The Young/Daly formula accounts for the trade off +between the start up time cost for a job to get back to its original point of +failure and the cost associated with writing the checkpoint to disk. For +example, with a weekly failure rate and $C=6$ minutes, $W=5.8$ hours. In +practice these estimates for $\mu$ and $C$ might be a little pessimistic, but +be aware of the trade off {cite}`Benoit2022b`. Note that some HPC systems have +upper bounds on run times (e.g. TACC has a 7 days time limit so $\mu<7$ days +regardless of other system failures). 
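
As a worked illustration of the two formulas above, the short script below evaluates the Young/Daly interval for the quoted weekly failure rate and six-minute checkpoint cost (recovering the $W \approx 5.8$ hour figure) and the expected completion time of an illustrative, uncheckpointed one-week job. The numbers are illustrative only.

```python
import math

def expected_completion(T, mu):
    """Expected completion time for a job of nominal length T with mean
    time to failure mu, assuming no check-pointing and instant restarts."""
    return 0.5 * (1.0 + math.exp(T / mu)) * T

def young_daly_interval(mu, C):
    """Optimal interval between check-points for mean time to failure mu
    and check-point write cost C (same time units)."""
    return math.sqrt(2.0 * mu * C)

mtbf = 7 * 24.0    # mean time to failure: one week, in hours
cost = 6.0 / 60.0  # check-point write cost: 6 minutes, in hours

print(f"W    = {young_daly_interval(mtbf, cost):.1f} hours")          # ~5.8 hours
print(f"E(T) = {expected_completion(7 * 24.0, mtbf) / 24:.1f} days")  # ~13 days for a 1-week job
```
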
Given the above theory, what is the some practical advice for check-pointing jobs? @@ -284,45 +257,46 @@ check-pointing jobs? value with the HPC cluster administrator to get some valid numbers. Of course $C$ can be estimated by running test jobs. It's good to know if you should be writing checkpoints every day or every - hour or every minute. + hour -- definitely not every minute! - Ensure that restarts are deterministic (i.e. results don't change between a job that restarts and one that doesn't). One way to do this is to hash output files assuming that the simulation itself is - deterministic + deterministic. - Consider using a checkpointing library if you're using a custom - phase field code or even a workflow tool such as Snakemake which has + phase-field code or even a workflow tool such as Snakemake which has the inbuilt ability to handle checkpointing. A tool like Snakemake is good for large parameter studies where it is difficult to keep track of which jobs wrote which files. The `pickle` library is - acceptable for checkpointint Python programs in this shortlived - circumstance. -- Use the inbuilt check-pointing available in the phase field code + acceptable for checkpointint Python programs _in this short-lived + circumstance_. +- Use the built-in checkpointing available in the phase-field code that you're using. -- Whatever system is being used check that the check-pointing actually - works and is deterministic. - -- https://hivehpc.haifa.ac.il/index.php/slurm?start=5 -- https://icl.utk.edu/files/publications/2022/icl-utk-1569-2022.pdf -- https://inria.hal.science/hal-03264047/file/rr9413.pdf -- https://www.sciencedirect.com/science/article/abs/pii/S0743731513002219 -- https://icl.utk.edu/files/publications/2022/icl-utk-1569-2022.pdf -- https://www.ittc.ku.edu/~sun/publications/fgcs24.pdf -- https://www.ittc.ku.edu/~sun/publications/fgcs24.pdf -- https://icl.utk.edu/files/publications/2020/icl-utk-1385-2020.pdf -- https://ftp.cs.toronto.edu/csrg-technical-reports/621/ut-csrg-621.pdf -- https://arxiv.org/pdf/2012.00825 -- https://icl.utk.edu/~herault/papers/007%20-%20Checkpointing%20Strategies%20for%20Shared%20High-Performance%20Computing%20Platforms%20-%20IJNC%20(2019).pdf -- https://dl.acm.org/doi/10.1145/2184512.2184574 -- https://engineering.purdue.edu/dcsl/publications/papers/2020/fresco_dsn20_cameraready.pdf -- [Job 
failures](https://pdf.sciencedirectassets.com/271503/1-s2.0-S0898122111X00251/1-s2.0-S0898122111005980/main.pdf?X-Amz-Security-Token=IQoJb3JpZ2luX2VjEKf%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEaCXVzLWVhc3QtMSJIMEYCIQCnYz7Yg2JHorkw2CwX7PI5fbyLRr02ykVPbgtxZhNy8QIhAIY%2BTq58bdBe3iRdnRXNP%2FjQ0%2B4LgrXUQh7aakHn9TSTKrsFCID%2F%2F%2F%2F%2F%2F%2F%2F%2F%2FwEQBRoMMDU5MDAzNTQ2ODY1IgwlVGRSpcXtIR5IRTcqjwUcZrP%2B7%2Byn4dYmelmvfF1CCdKNMP%2BftdY1KdvKA%2BnlBpDHwh%2FulDxZPkotPpiaFnrHfT85QaPGB0Q5Ck16mfWIG5KAjrrPFXY2azR3%2FxLIM6I8Ka4aHzcvUDa5L2rn8PpHqVF61wtBWRZYI8N0YM5CZi8r2%2B6NLe9OJvgLR1%2B55%2BfwK5GucDahcWDrP2FvbBFWQyidEBNl7thbpO4NIKoUTGJkb8H%2BBezk09N%2F4CPCjHel5mA1CHA8cQLH9lcCPiLurzKTBP8ozNi%2FtrlAZUKRKk%2BYHMy5HyFl%2Bobumh1eesuGe19b%2FpYOZBGzrQ4mn9eblczLd2SQi3k%2FoAws7yW9HHqzmMkJnla2B3tgfpP9WxaCnb5ZMNLIjlwEfu67ZiydyWQ2VnygTjG8CsiBYUeFuCbTplAeviP4WN%2BtiK%2FAsXbQZW93Q7cH7K%2B7lhfPiuPaauh0tlgQJNVdp2QzT8qvsxbQtzdEOQ5ethcoNbXU8YmXYgYdUtGwQbySD18i3aQo6zdUD0h9YtNNl%2BXskT8nv9xVPzsfMdkvrWA23PdIDYagX6n4Dd45DkeYIa10oXQfQKy7JiYIyfy7L1zt6tE6Rr0H9aMogJZIfjZG3tFcRea97xN%2BputMKCCqpyz%2FBJgfCptLvjhoKsWCfIqiE23xTHTTl%2FTkTK20ZQ8yO8lHuKHwtjbJJrX%2BQgnfiZQp2Sm6sKWchwej1Nn8IgtbHswKexMqoyaQHUeokSJ6MwQmtHAoUjuwuhaVcOqHn5gNEBuAWo3pAS%2Fwn2TTG5g00gF7RpQRoZvflp4b9poAN20kS%2F0lmLzutQ3wHOq1Ak59OzQhZASygvTGiqCxRp%2BzqctUt6%2B3%2FKRFwhR7q%2BwRluPBMMyix7sGOrAB3n1WiLFCV6WV5KLzctnyXliLNDqpIxUPVeXy2v%2FvcR8zUDFo49QquQ6nJudq2u9aUJ4nKzEkzpLTdAOCnQ%2FIR68LdMPQF%2BqJr0BD78PqcfoaacB%2BH4vV0FhCOVs0rVLdevPVGoAhB%2BIFbrsv%2BvHvQUZxU3wX%2FuOp2ChL1cUohEeoQzo2PyF2FZXkNcUenB2EWhcZGpgAIL1EYIyu1iCYmbfjVlISQL7xMJHj%2BSq1H3c%3D&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Date=20241230T003505Z&X-Amz-SignedHeaders=host&X-Amz-Expires=299&X-Amz-Credential=ASIAQ3PHCVTYXOJ65CE4%2F20241230%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Signature=4d3b88d3c0b7d80e5ff0df11ef0d4246875d4aabdc0c8d49be54c9c029beae8e&hash=ebead6fd98beaced790a9b84e76f9f6326f41ebbf4cedadb1fde7cfc734bfc31&host=68042c943591013ac2b2430a89b270f6af2c76d8dfd086a07176afe7c76c2c61&pii=S0898122111005980&tid=spdf-4974a446-6433-4619-afca-a5c210488d71&sid=a6179c5a4b870940603b745364079c847353gxrqa&type=client&tsoh=d3d3LnNjaWVuY2VkaXJlY3QuY29t&ua=161d5b095903025d02&rr=8f9df2a7da4fc990&cc=us) -- https://www.cs.cmu.edu/~bianca/dsn06.pdf - +- Whatever system is being used, check that the checkpointing machinery + actually works and is deterministic. + +Some links to further reading: + +- +- +- +- +- +- +- +- +- +- +- +- +- +- [Job failures](https://pdf.sciencedirectassets.com/271503/1-s2.0-S0898122111X00251/1-s2.0-S0898122111005980/main.pdf) +- ### Using Workflow Tools The authors of this article use Snakemake for their workflows so will discuss this in particular, but most of the ideas will apply to other -workflow tools. In general when running many phase field jobs for a +workflow tools. In general when running many phase-field jobs for a parameter study or dealing with many pre and post-processing steps, it is wise to employ a workflow tool such as Snakemake. One of the main benefits of workflow tools is the automation of all the steps in a @@ -379,9 +353,8 @@ computational enviornment. Additionally, most workflow tools will support both HPC and local workstation execution and make porting between systems easier. -See https://pmc.ncbi.nlm.nih.gov/articles/PMC8114187/ for a more -details overview off Snakemake and a list of other good workflow -tools. +See {cite}`Moelder2021` for a more detailed overview of Snakemake and a list of +other good workflow tools. 
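
As a sketch of what this looks like in practice, a minimal Snakemake workflow for a small phase-field parameter sweep is shown below. The rule names, scripts and parameter values are hypothetical; the point is only to show how inputs, outputs and parameters become explicit in the workflow definition.

```
# Snakefile -- minimal sketch of a phase-field parameter sweep
# (rule names, scripts and parameter values are hypothetical)

GAMMAS = [0.5, 1.0, 1.5]  # interfacial energy values to sweep over

rule all:
    input:
        expand("results/gamma_{gamma}/free_energy.csv", gamma=GAMMAS)

rule simulate:
    output:
        "results/gamma_{gamma}/raw_fields.h5"
    shell:
        "python run_simulation.py --gamma {wildcards.gamma} --output {output}"

rule postprocess:
    input:
        "results/gamma_{gamma}/raw_fields.h5"
    output:
        "results/gamma_{gamma}/free_energy.csv"
    shell:
        "python derive_quantities.py {input} {output}"
```

Because every output path is declared by a rule, the provenance of each file (which script and which parameter value produced it) is recorded as a side effect of simply running the study.
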
- https://snakemake.readthedocs.io/en/stable/snakefiles/deployment.html
@@ -398,7 +371,7 @@ simply meet the most basic needs of transparency in scientific
 research. The main benefits of data curation include (see
 [DCC](https://www.dcc.ac.uk/guidance/how-guides/develop-data-plan#Why%20develop))
 
-Simulation FAIR data paragraph and importance of metadata
+_To Do:_ Simulation FAIR data paragraph and importance of metadata
 
 The fundamental steps to curate a computational research project into
 a research data object and publish are as follows.
@@ -421,38 +394,34 @@ a research data object and publish are as follows.
 - Obtain a DOI for the data object and link with other research
   products
 
-The above steps are difficult to implement near the conclusion of a
-research project. The authors suggest implementing the majority of
-these steps at the outset of the project and developing these steps as
-part of a protocol for all research projects within a computational
-materials research group.
+The above steps are difficult to implement near the conclusion of a research
+project. The authors suggest implementing the majority of these steps at the
+outset of the project and developing these steps as part of a protocol for all
+research projects within a computational materials research group.
 
 ### Automation
 
-Automating workflows in computational materials science is useful for
-many reasons, however, for data curation purposed it provides and
-added benefit. In short, an outlined workflow associated with a
-curated FAIR object is a major way to improve FAIR quality for
-subsequent researchers. For most workflow tools, the operation script
-outlining the workflow graph is the ultimate form of metadata about
-how the archived data files are used or generated during the
-research. For example, with Snakemake, the `Snakefile` has clearly
-outlined inputs and outputs as well as the procedure associated with
-each input / output pair. In particular, the computational
-environment, command line arguments, environment variables are
-recorded as well as the order of execution for each step.
-
-In recent years there have been efforts in the life sciences to
-provide a minimum workflow for independent code execution during the
-peer review process. The [CODECHECK
-initiative](https://doi.org/10.12688/f1000research.51738.2) trys to
-provide a standard for executing workflows and a certification if the
-workflow satisifies basic criteria. These types of efforts will likely
-be used within the compuational materials science community in the
-coming years so adopting automated workflow tools as part of your
-research will greatly benefit this process.
-
-- https://www.sciencedirect.com/science/article/pii/S2666389921001707?via%3Dihub
+Automating workflows in computational materials science is useful for many
+reasons; however, for data curation purposes it provides an added benefit. In
+short, an outlined workflow associated with a curated FAIR object is a major
+way to improve FAIR quality for subsequent researchers. For most workflow
+tools, the operation script outlining the workflow graph is the ultimate form
+of metadata about how the archived data files are used or generated during the
+research. For example, with Snakemake, the `Snakefile` has clearly outlined
+inputs and outputs as well as the procedure associated with each input / output
+pair. In particular, the computational environment, command line arguments, and
+environment variables are recorded as well as the order of execution for each
+step.
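
To make the role of the `Snakefile` as workflow metadata concrete, a minimal
sketch is shown below. The rule names, file paths, environment file and helper
scripts (`run_simulation.py`, `compute_free_energy.py`) are hypothetical
placeholders, not part of any particular phase-field code:

```snakemake
# Minimal Snakemake sketch: one rule per workflow step, with explicit
# inputs, outputs and environments for a small phase-field parameter study.
SAMPLES = ["a", "b", "c"]

rule all:
    input:
        expand("results/{sample}/free_energy.csv", sample=SAMPLES)

rule simulate:
    input:
        params="params/{sample}.yaml"
    output:
        fields="results/{sample}/fields.h5"
    conda:
        "envs/phase-field.yaml"   # records the computational environment
    shell:
        "python run_simulation.py --params {input.params} --output {output.fields}"

rule postprocess:
    input:
        fields="results/{sample}/fields.h5"
    output:
        csv="results/{sample}/free_energy.csv"
    shell:
        "python compute_free_energy.py {input.fields} > {output.csv}"
```

Archiving a file like this alongside the curated outputs records the order of
execution, the command line arguments and the computational environment with
little extra effort.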
+
+In recent years there have been efforts in the life sciences to provide a
+minimum workflow for independent code execution during the peer review
+process. The CODECHECK initiative {cite}`Nuest2021` tries to provide a standard
+for executing workflows and a certification if the workflow satisfies basic
+criteria. These types of efforts will likely be used within the computational
+materials science community in the coming years, so adopting automated workflow
+tools as part of your research will greatly benefit this process.
+
+- See also {cite}`Leipzig2021`
 
 ### Metadata Standards
 
@@ -488,4 +457,5 @@ Dockstore and Workflowhub
 https://arxiv.org/pdf/2410.03490
 [schemaorg]: https://github.com/openschemas/schemaorg
 [structured data schema]: https://en.wikipedia.org/wiki/Data_model
 [link1]: https://workflows.community/groups/fair/best-practices/
-
+[mehio]: https://github.com/nschloe/meshio?tab=readme-ov-file#performance-comparison
+[vtk-xml]: https://docs.vtk.org/en/latest/design_documents/VTKFileFormats.html#xml-file-formats
diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib
index cdbfe93..86dc4d1 100644
--- a/pf-recommended-practices/references.bib
+++ b/pf-recommended-practices/references.bib
@@ -1,180 +1,367 @@
----
----
-
-@article{bi_phase-field_1998,
-  title = {Phase-field model of solidification of a binary alloy},
-  volume = {261},
-  issn = {0378-4371},
-  url = {https://www.sciencedirect.com/science/article/pii/S0378437198003641},
-  doi = {10.1016/S0378-4371(98)00364-1},
-  abstract = {We develop a rather general thermodynamically consistent phase-field model for solidification of a binary alloy, based on an entropy functional that contains squared gradient terms in the energy density, the composition and the phase-field variable. By assuming positive local entropy production, we derive generalized phase-field equations for an alloy, including cross terms that connect thermal and compositional driving forces to energy and solute fluxes. We explore this model in detail for a regular solution and show that four existing models can be recovered as special cases. We also use it to develop a new phase-field model for an alloy in which an explicit phase-field variable is absent.},
-  number = {1},
-  urldate = {2024-09-05},
-  journal = {Physica A: Statistical Mechanics and its Applications},
-  author = {Bi, Zhiqiang and Sekerka, Robert F.},
-  month = dec,
-  year = {1998},
-  keywords = {Phase-field, Models, Alloy, Solidification, Irreversible thermodynamics},
-  pages = {95--106},
-}
-
-@book{groot_non-equilibrium_2013,
-  title = {Non-{Equilibrium} {Thermodynamics}},
-  isbn = {9780486153506},
-  abstract = {The study of thermodynamics is especially timely today, as its concepts are being applied to problems in biology, biochemistry, electrochemistry, and engineering. This book treats irreversible processes and phenomena — non-equilibrium thermodynamics.S. R. de Groot and P. Mazur, Professors of Theoretical Physics, present a comprehensive and insightful survey of the foundations of the field, providing the only complete discussion of the fluctuating linear theory of irreversible thermodynamics. 
The application covers a wide range of topics: the theory of diffusion and heat conduction, fluid dynamics, relaxation phenomena, acoustical relaxation, and the behavior of systems in an electromagnetic field.The statistical foundations of non-equilibrium thermodynamics are treated in detail, and there are special sections on fluctuation theory, the theory of stochastic processes, the kinetic theory of gases, and the derivation of the Onsager reciprocal relations. The implications of causality conditions and of dispersion relations are analyzed in depth.Advanced students will find a great number of challenging problems, with hints for their solutions. Chemists will be especially interested in the applications to electrochemistry and the theory of chemical reactions. Physicists, teachers, scholars, biologists, and anyone interested in the principle and modern applications of non-equilibrium thermodynamics will find this classic monograph an invaluable reference.}, - language = {en}, - publisher = {Courier Corporation}, - author = {Groot, S. R. De and Mazur, P.}, - month = jan, - year = {2013}, - note = {Google-Books-ID: mfFyG9jfaMYC}, - keywords = {Science / Physics / General}, -} - -@book{malvern_introduction_1969, - title = {Introduction to the {Mechanics} of a {Continuous} {Medium}}, - isbn = {9780134876030}, - abstract = {A unified presentation of the concepts and general principles common to all branches of solid and fluid mechanics.}, - language = {en}, - publisher = {Prentice-Hall}, - author = {Malvern, Lawrence E.}, - year = {1969}, - note = {Google-Books-ID: IIMpAQAAMAAJ}, - keywords = {Science / Mechanics / General, Science / Mechanics / Fluids, Technology \& Engineering / Civil / General}, -} - -@article{mishin_irreversible_2013, - title = {Irreversible thermodynamics of creep in crystalline solids}, - volume = {88}, - url = {https://link.aps.org/doi/10.1103/PhysRevB.88.184303}, - doi = {10.1103/PhysRevB.88.184303}, - abstract = {We develop an irreversible thermodynamics framework for the description of creep deformation in crystalline solids by mechanisms that involve vacancy diffusion and lattice site generation and annihilation. The material undergoing the creep deformation is treated as a nonhydrostatically stressed multicomponent solid medium with nonconserved lattice sites and inhomogeneities handled by employing gradient thermodynamics. Phase fields describe microstructure evolution, which gives rise to redistribution of vacancy sinks and sources in the material during the creep process. We derive a general expression for the entropy production rate and use it to identify of the relevant fluxes and driving forces and to formulate phenomenological relations among them taking into account symmetry properties of the material. As a simple application, we analyze a one-dimensional model of a bicrystal in which the grain boundary acts as a sink and source of vacancies. The kinetic equations of the model describe a creep deformation process accompanied by grain boundary migration and relative rigid translations of the grains. They also demonstrate the effect of grain boundary migration induced by a vacancy concentration gradient across the boundary.}, - number = {18}, - urldate = {2024-09-05}, - journal = {Physical Review B}, - author = {Mishin, Y. and Warren, J. A. and Sekerka, R. F. and Boettinger, W. 
J.}, - month = nov, - year = {2013}, - pages = {184303}, -} - -@article{boettinger_phase-field_2002, - title = {Phase-{Field} {Simulation} of {Solidification}}, - volume = {32}, - issn = {1531-7331, 1545-4118}, - url = {https://www.annualreviews.org/doi/10.1146/annurev.matsci.32.101901.155803}, - doi = {10.1146/annurev.matsci.32.101901.155803}, - abstract = {▪ Abstract  An overview of the phase-field method for modeling solidification is presented, together with several example results. Using a phase-field variable and a corresponding governing equation to describe the state (solid or liquid) in a material as a function of position and time, the diffusion equations for heat and solute can be solved without tracking the liquid-solid interface. The interfacial regions between liquid and solid involve smooth but highly localized variations of the phase-field variable. The method has been applied to a wide variety of problems including dendritic growth in pure materials; dendritic, eutectic, and peritectic growth in alloys; and solute trapping during rapid solidification.}, - language = {en}, - number = {1}, - urldate = {2024-09-05}, - journal = {Annual Review of Materials Research}, - author = {Boettinger, W. J. and Warren, J. A. and Beckermann, C. and Karma, A.}, - month = aug, - year = {2002}, - pages = {163--194}, -} - -@article{TOURRET2022100810, -title = {Phase-field modeling of microstructure evolution: Recent applications, perspectives and challenges}, -journal = {Progress in Materials Science}, -volume = {123}, -pages = {100810}, -year = {2022}, -note = {A Festschrift in Honor of Brian Cantor}, -issn = {0079-6425}, -doi = {https://doi.org/10.1016/j.pmatsci.2021.100810}, -url = {https://www.sciencedirect.com/science/article/pii/S0079642521000347}, -author = {Damien Tourret and Hong Liu and Javier LLorca}, -keywords = {Phase-field, Microstructure evolution, Solid state transformations, Solidification}, -abstract = {We briefly review the state-of-the-art in phase-field modeling of microstructure evolution. The focus is placed on recent applications of phase-field simulations of solid-state microstructure evolution and solidification that have been compared and/or validated with experiments. They show the potential of phase-field modeling to make quantitative predictions of the link between processing and microstructure. Finally, some current challenges in extending the application of phase-field models within the context of integrated computational materials engineering are mentioned.} +@article{Aupy2014, + author = {Guillaume Aupy and + Yves Robert and + Frédéric Vivien and + Dounia Zaidouni}, + title = {Checkpointing algorithms and fault prediction}, + journal = {Journal of Parallel and Distributed Computing}, + volume = {74}, + number = {2}, + pages = {2048-2064}, + year = {2014}, + issn = {0743-7315}, + doi = {10.1016/j.jpdc.2013.10.010}, + keywords = {Algorithms, Checkpoint, Prediction, Fault-tolerance, Resilience, Exascale}, + abstract = {This paper deals with the impact of fault prediction techniques on checkpointing strategies. We extend the classical first-order analysis of Young and Daly in the presence of a fault prediction system, characterized by its recall and its precision. In this framework, we provide optimal algorithms to decide whether and when to take predictions into account, and we derive the optimal value of the checkpointing period. 
These results allow us to analytically assess the key parameters that impact the performance of fault predictors at very large scale.} +} + +@article{BautistaGomez2024, + author = {Leonardo Bautista-Gomez and + Anne Benoit and + Sheng Di and + Thomas Herault and + Yves Robert and + Hongyang Sun}, + title = {A survey on checkpointing strategies: {S}hould we always checkpoint à la {Y}oung/{D}aly?}, + journal = {Future Generation Computer Systems}, + volume = {161}, + pages = {315-328}, + year = {2024}, + issn = {0167-739X}, + doi = {10.1016/j.future.2024.07.022}, + url = {https://www.ittc.ku.edu/~sun/publications/fgcs24.pdf}, + urldate={2025-01-06}, + keywords = {Checkpointing, Optimal period, Young/Daly formula, Resilience}, + abstract = {The Young/Daly formula provides an approximation of the optimal checkpointing period for a parallel application executing on a supercomputing platform. It was originally designed to handle fail-stop errors for preemptible tightly-coupled applications, but has been extended to other application and resilience frameworks. We provide some backgroundand survey various scenarios to assess the usefulness and limitations of the formula, both for preemptible applications and workflow applications represented as a graph of tasks. We also discuss scenarios with uncertainties, and extend the study to silent errors. We exhibit cases where the optimal period is of a different order than that dictated by the Young/Daly formula, and finally we explain how checkpointing can be further combined with replication.} +} + +@inproceedings{Benoit2022a, + author = {Benoit, Anne and + Du, Yishu and + Herault, Thomas and + Marchal, Loris and + Pallez, Guillaume and + Perotin, Lucas and + Robert, Yves and + Sun, Hongyang and + Vivien, Frederic}, + title = {Checkpointing \`{a} la {Y}oung/{D}aly: {A}n Overview}, + year = {2022}, + isbn = {9781450396752}, + publisher = {Association for Computing Machinery}, + address = {New York, NY}, + url = {https://icl.utk.edu/files/publications/2022/icl-utk-1569-2022.pdf}, + urldate={2025-01-06}, + doi = {10.1145/3549206.3549328}, + abstract = {The Young/Daly formula provides an approximation of the optimal checkpoint period for a parallel application executing on a supercomputing platform. The Young/Daly formula was originally designed for preemptible tightly-coupled applications. We provide some background and survey various application scenarios to assess the usefulness and limitations of the formula.}, + booktitle = {Proceedings of the 2022 Fourteenth International Conference on Contemporary Computing}, + pages = {701–710}, + numpages = {10}, + keywords = {Checkpoint, Optimal period, Young/Daly formula.}, + location = {Noida, India}, + series = {IC3-2022} +} + +@article{Benoit2022b, + author = {Benoit, Anne and + Perotin, Luca and + Robert, Yves and + Sun, Hongyang}, + title = {Checkpointing Workflows \`{a} la {Y}oung/{D}aly Is Not Good Enough}, + year = {2022}, + issue_date = {December 2022}, + publisher = {Association for Computing Machinery}, + address = {New York, NY, USA}, + volume = {9}, + number = {4}, + issn = {2329-4949}, + url = {https://inria.hal.science/hal-03264047/file/rr9413.pdf}, + urldate={2025-01-06}, + doi = {10.1145/3548607}, + abstract = {This article revisits checkpointing strategies when workflows composed of multiple tasks execute on a parallel platform. The objective is to minimize the expectation of the total execution time. For a single task, the Young/Daly formula provides the optimal checkpointing period. 
However, when many tasks execute simultaneously, the risk that one of them is severely delayed increases with the number of tasks. To mitigate this risk, a possibility is to checkpoint each task more often than with the Young/Daly strategy. But is it worth slowing each task down with extra checkpoints? Does the extra checkpointing make a difference globally? This article answers these questions. On the theoretical side, we prove several negative results for keeping the Young/Daly period when many tasks execute concurrently, and we design novel checkpointing strategies that guarantee an efficient execution with high probability. On the practical side, we report comprehensive experiments that demonstrate the need to go beyond the Young/Daly period and to checkpoint more often for a wide range of application/platform settings.}, + journal = {ACM Trans. Parallel Comput.}, + month = dec, + articleno = {14}, + numpages = {25}, + keywords = {Young/Daly formula, concurrent tasks, workflow, Checkpoint} +} + +@article{Bi1998, + title = {Phase-field model of solidification of a binary alloy}, + volume = {261}, + issn = {0378-4371}, + url = {https://www.sciencedirect.com/science/article/pii/S0378437198003641}, + doi = {10.1016/S0378-4371(98)00364-1}, + abstract = {We develop a rather general thermodynamically consistent phase-field model for solidification of a binary alloy, based on an entropy functional that contains squared gradient terms in the energy density, the composition and the phase-field variable. By assuming positive local entropy production, we derive generalized phase-field equations for an alloy, including cross terms that connect thermal and compositional driving forces to energy and solute fluxes. We explore this model in detail for a regular solution and show that four existing models can be recovered as special cases. We also use it to develop a new phase-field model for an alloy in which an explicit phase-field variable is absent.}, + number = {1}, + urldate = {2024-09-05}, + journal = {Physica A: Statistical Mechanics and its Applications}, + author = {Bi, Zhiqiang and Sekerka, Robert F.}, + month = dec, + year = {1998}, + keywords = {Phase-field, Models, Alloy, Solidification, Irreversible thermodynamics}, + pages = {95--106}, +} + +@article{Boettinger2002, + title = {Phase-Field Simulation of Solidification}, + volume = {32}, + issn = {1531-7331, 1545-4118}, + url = {https://www.annualreviews.org/doi/10.1146/annurev.matsci.32.101901.155803}, + doi = {10.1146/annurev.matsci.32.101901.155803}, + abstract = {▪ Abstract  An overview of the phase-field method for modeling solidification is presented, together with several example results. Using a phase-field variable and a corresponding governing equation to describe the state (solid or liquid) in a material as a function of position and time, the diffusion equations for heat and solute can be solved without tracking the liquid-solid interface. The interfacial regions between liquid and solid involve smooth but highly localized variations of the phase-field variable. The method has been applied to a wide variety of problems including dendritic growth in pure materials; dendritic, eutectic, and peritectic growth in alloys; and solute trapping during rapid solidification.}, + language = {en}, + number = {1}, + urldate = {2024-09-05}, + journal = {Annual Review of Materials Research}, + author = {Boettinger, W. J. and Warren, J. A. and Beckermann, C. 
and Karma, A.}, + month = aug, + year = {2002}, + pages = {163--194}, +} + +@book{Groot2013, + title = {{N}on-{E}quilibrium {T}hermodynamics}, + isbn = {9780486153506}, + abstract = {The study of thermodynamics is especially timely today, as its concepts are being applied to problems in biology, biochemistry, electrochemistry, and engineering. This book treats irreversible processes and phenomena — non-equilibrium thermodynamics. S. R. de Groot and P. Mazur, Professors of Theoretical Physics, present a comprehensive and insightful survey of the foundations of the field, providing the only complete discussion of the fluctuating linear theory of irreversible thermodynamics. The application covers a wide range of topics: the theory of diffusion and heat conduction, fluid dynamics, relaxation phenomena, acoustical relaxation, and the behavior of systems in an electromagnetic field.The statistical foundations of non-equilibrium thermodynamics are treated in detail, and there are special sections on fluctuation theory, the theory of stochastic processes, the kinetic theory of gases, and the derivation of the Onsager reciprocal relations. The implications of causality conditions and of dispersion relations are analyzed in depth.Advanced students will find a great number of challenging problems, with hints for their solutions. Chemists will be especially interested in the applications to electrochemistry and the theory of chemical reactions. Physicists, teachers, scholars, biologists, and anyone interested in the principle and modern applications of non-equilibrium thermodynamics will find this classic monograph an invaluable reference.}, + language = {en}, + publisher = {Courier Corporation}, + author = {Groot, S. R. De and Mazur, P.}, + month = jan, + year = {2013}, + note = {Google-Books-ID: mfFyG9jfaMYC}, + keywords = {Science / Physics / General}, +} + +@INPROCEEDINGS{Kumar2020, + author={Kumar, Rakesh and + Jha, Saurabh and + Mahgoub, Ashraf and + Kalyanam, Rajesh and + Harrell, Stephen and + Song, Xiaohui Carol and + Kalbarczyk, Zbigniew and + Kramer, William and + Iyer, Ravishankar and + Bagchi, Saurabh}, + booktitle={2020 50th Annual IEEE/IFIP International Conference on Dependable Systems and Networks (DSN)}, + title={The Mystery of the Failing Jobs: {I}nsights from Operational Data from Two University-Wide Computing Systems}, + year={2020}, + pages={158-171}, + keywords={Predictive models;Checkpointing;Runtime;Computational modeling;Reliability;Indexes;Organizations;HPC, Production failure data, Data analytics, Compute clusters}, + doi={10.1109/DSN48063.2020.00034}, + url={https://engineering.purdue.edu/dcsl/publications/papers/2020/fresco_dsn20_cameraready.pdf}, + urldate={2025-01-06} +} + +@article{Leipzig2021, + title = {The role of metadata in reproducible computational research}, + journal = {Patterns}, + volume = {2}, + number = {9}, + pages = {100322}, + year = {2021}, + issn = {2666-3899}, + doi = {10.1016/j.patter.2021.100322}, + author = {Jeremy Leipzig and + Daniel Nüst and + Charles Tapley Hoyt and + Karthik Ram and + Jane Greenberg}, + keywords = {reproducible research, reproducible computational research, RCR, reproducibility, replicability, metadata, provenance, workflows, pipelines, ontologies, notebooks, containers, software dependencies, semantic, FAIR}, + abstract = {Reproducible computational research (RCR) is the keystone of the scientific method for in silico analyses, packaging the transformation of raw data to published results. 
In addition to its role in research integrity, improving the reproducibility of scientific studies can accelerate evaluation and reuse. This potential and wide support for the FAIR principles have motivated interest in metadata standards supporting reproducibility. Metadata provide context and provenance to raw data and methods and are essential to both discovery and validation. Despite this shared connection with scientific data, few studies have explicitly described how metadata enable reproducible computational research. This review employs a functional content analysis to identify metadata standards that support reproducibility across an analytic stack consisting of input data, tools, notebooks, pipelines, and publications. Our review provides background context, explores gaps, and discovers component trends of embeddedness and methodology weight from which we derive recommendations for futurework.} +} + +@book{Malvern1969, + title = {Introduction to the {Mechanics} of a {Continuous} {Medium}}, + isbn = {9780134876030}, + abstract = {A unified presentation of the concepts and general principles common to all branches of solid and fluid mechanics.}, + language = {en}, + publisher = {Prentice-Hall}, + author = {Malvern, Lawrence E.}, + year = {1969}, + note = {Google-Books-ID: IIMpAQAAMAAJ}, + keywords = {Science / Mechanics / General, Science / Mechanics / Fluids, Technology \& Engineering / Civil / General}, +} + +@article{Mishin2013, + title = {Irreversible thermodynamics of creep in crystalline solids}, + volume = {88}, + url = {https://link.aps.org/doi/10.1103/PhysRevB.88.184303}, + doi = {10.1103/PhysRevB.88.184303}, + abstract = {We develop an irreversible thermodynamics framework for the description of creep deformation in crystalline solids by mechanisms that involve vacancy diffusion and lattice site generation and annihilation. The material undergoing the creep deformation is treated as a nonhydrostatically stressed multicomponent solid medium with nonconserved lattice sites and inhomogeneities handled by employing gradient thermodynamics. Phase fields describe microstructure evolution, which gives rise to redistribution of vacancy sinks and sources in the material during the creep process. We derive a general expression for the entropy production rate and use it to identify of the relevant fluxes and driving forces and to formulate phenomenological relations among them taking into account symmetry properties of the material. As a simple application, we analyze a one-dimensional model of a bicrystal in which the grain boundary acts as a sink and source of vacancies. The kinetic equations of the model describe a creep deformation process accompanied by grain boundary migration and relative rigid translations of the grains. They also demonstrate the effect of grain boundary migration induced by a vacancy concentration gradient across the boundary.}, + number = {18}, + urldate = {2024-09-05}, + journal = {Physical Review B}, + author = {Mishin, Y. and Warren, J. A. and Sekerka, R. F. and Boettinger, W. 
J.}, + month = nov, + year = {2013}, + pages = {184303}, +} + +@article{Moelder2021, + author = {Felix Mölder and + Kim Philipp Jablonski and + Brice Letcher and + Michael B Hall and + Christopher H Tomkins-Tinch and + Vanessa Sochat and + Jan Forster and + Soohyun Lee and + Sven O Twardziok and + Alexander Kanitz and + Andreas Wilm and + Manuel Holtgrewe and + Sven Rahmann and + Sven Nahnsen and + Johannes Köster}, + title = {Sustainable data analysis with {Snakemake}}, + journal = {F1000Research}, + year = {2021}, + volume = {10}, + number = {33}, + doi = {10.12688/f1000research.29032.2} +} + +@article{Nuest2021, + author = {Nüst D and + Eglen SJ}, + title = {{CODECHECK}: an {O}pen {S}cience initiative for the independent execution of computations underlying research articles during peer review to improve reproducibility}, + journal = {F1000Research}, + year = {2021}, + volume = {10}, + number = {253}, + doi = {10.12688/f1000research.51738.2} +} + +@INPROCEEDINGS{Paul2020, + author={Paul, Arnab K. and + Faaland, Olaf and + Moody, Adam and + Gonsiorowski, Elsa and + Mohror, Kathryn and + Butt, Ali R.}, + booktitle={2020 IEEE 27th International Conference on High Performance Computing, Data, and Analytics (HiPC)}, + title={Understanding {HPC} Application {I/O} Behavior Using System Level Statistics}, + year={2020}, + pages={202-211}, + keywords={Runtime;File systems;High performance computing;Conferences;Machine learning;Metadata;Servers;I/O analysis;High Performance Computing;Parallel File Systems;Lustre File System;I/O contention}, + doi={10.1109/HiPC50609.2020.00034} +} + +@article{Tourret2022, + author = {Damien Tourret and + Hong Liu and + Javier LLorca}, + title = {Phase-field modeling of microstructure evolution: {R}ecent applications, perspectives and challenges}, + journal = {Progress in Materials Science}, + volume = {123}, + pages = {100810}, + year = {2022}, + note = {A Festschrift in Honor of {B}rian {C}antor}, + issn = {0079-6425}, + doi = {10.1016/j.pmatsci.2021.100810}, + url = {https://www.sciencedirect.com/science/article/pii/S0079642521000347}, + keywords = {Phase-field, Microstructure evolution, Solid state transformations, Solidification}, + abstract = {We briefly review the state-of-the-art in phase-field modeling of microstructure evolution. The focus is placed on recent applications of phase-field simulations of solid-state microstructure evolution and solidification that have been compared and/or validated with experiments. They show the potential of phase-field modeling to make quantitative predictions of the link between processing and microstructure. Finally, some current challenges in extending the application of phase-field models within the context of integrated computational materials engineering are mentioned.} } @Article{Wilkinson2016, -author={Wilkinson, Mark D. -and Dumontier, Michel -and Aalbersberg, IJsbrand Jan -and Appleton, Gabrielle -and Axton, Myles -and Baak, Arie -and Blomberg, Niklas -and Boiten, Jan-Willem -and da Silva Santos, Luiz Bonino -and Bourne, Philip E. -and Bouwman, Jildau -and Brookes, Anthony J. -and Clark, Tim -and Crosas, Merc{\`e} -and Dillo, Ingrid -and Dumon, Olivier -and Edmunds, Scott -and Evelo, Chris T. -and Finkers, Richard -and Gonzalez-Beltran, Alejandra -and Gray, Alasdair J.G. -and Groth, Paul -and Goble, Carole -and Grethe, Jeffrey S. -and Heringa, Jaap -and 't Hoen, Peter A.C -and Hooft, Rob -and Kuhn, Tobias -and Kok, Ruben -and Kok, Joost -and Lusher, Scott J. -and Martone, Maryann E. 
-and Mons, Albert -and Packer, Abel L. -and Persson, Bengt -and Rocca-Serra, Philippe -and Roos, Marco -and van Schaik, Rene -and Sansone, Susanna-Assunta -and Schultes, Erik -and Sengstag, Thierry -and Slater, Ted -and Strawn, George -and Swertz, Morris A. -and Thompson, Mark -and van der Lei, Johan -and van Mulligen, Erik -and Velterop, Jan -and Waagmeester, Andra -and Wittenburg, Peter -and Wolstencroft, Katherine -and Zhao, Jun -and Mons, Barend}, -title={The FAIR Guiding Principles for scientific data management and stewardship}, -journal={Scientific Data}, -year={2016}, -month={Mar}, -day={15}, -volume={3}, -number={1}, -pages={160018}, -abstract={There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders---representing academia, industry, funding agencies, and scholarly publishers---have come together to design and jointly endorse a concise and measureable set of principles that we refer to as the FAIR Data Principles. The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them, and some exemplar implementations in the community.}, -issn={2052-4463}, -doi={10.1038/sdata.2016.18}, -url={https://doi.org/10.1038/sdata.2016.18} -} - -@misc{wilkinson2024applyingfairprinciplescomputational, - title={Applying the FAIR Principles to Computational Workflows}, - author={Sean R. Wilkinson and Meznah Aloqalaa and Khalid Belhajjame and Michael R. Crusoe and Bruno de Paula Kinoshita and Luiz Gadelha and Daniel Garijo and Ove Johan Ragnar Gustafsson and Nick Juty and Sehrish Kanwal and Farah Zaib Khan and Johannes Köster and Karsten Peters-von Gehlen and Line Pouchard and Randy K. Rannow and Stian Soiland-Reyes and Nicola Soranzo and Shoaib Sufi and Ziheng Sun and Baiba Vilne and Merridee A. Wouters and Denis Yuen and Carole Goble}, - year={2024}, - eprint={2410.03490}, - archivePrefix={arXiv}, - primaryClass={cs.DL}, - url={https://arxiv.org/abs/2410.03490}, -} - -@INPROCEEDINGS{9406732, - author={Paul, Arnab K. and Faaland, Olaf and Moody, Adam and Gonsiorowski, Elsa and Mohror, Kathryn and Butt, Ali R.}, - booktitle={2020 IEEE 27th International Conference on High Performance Computing, Data, and Analytics (HiPC)}, - title={Understanding HPC Application I/O Behavior Using System Level Statistics}, - year={2020}, - volume={}, - number={}, - pages={202-211}, - keywords={Runtime;File systems;High performance computing;Conferences;Machine learning;Metadata;Servers;I/O analysis;High Performance Computing;Parallel File Systems;Lustre File System;I/O contention}, - doi={10.1109/HiPC50609.2020.00034}} \ No newline at end of file + author={Wilkinson, Mark D. and + Dumontier, Michel and + Aalbersberg, IJsbrJan and + Appleton, Gabrielle and + Axton, Myles and + Baak, Arie and + Blomberg, Niklas and + Boiten, Jan-Willem and + da Silva Santos, Luiz Bonino and + Bourne, Philip E. and + Bouwman, Jildau and + Brookes, Anthony J. and + Clark, Tim and + Crosas, Merc{\`e} and + Dillo, Ingrid and + Dumon, Olivier and + Edmunds, Scott and + Evelo, Chris T. and + Finkers, Richard and + Gonzalez-Beltran, Alejandra and + Gray, Alasdair J.G. 
and + Groth, Paul and + Goble, Carole and + Grethe, Jeffrey S. and + Heringa, Jaap and + 't Hoen, Peter A.C and + Hooft, Rob and + Kuhn, Tobias and + Kok, Ruben and + Kok, Joost and + Lusher, Scott J. and + Martone, Maryann E. and + Mons, Albert and + Packer, Abel L. and + Persson, Bengt and + Rocca-Serra, Philippe and + Roos, Marco and + van Schaik, Rene and + Sansone, Susanna-Assunta and + Schultes, Erik and + Sengstag, Thierry and + Slater, Ted and + Strawn, George and + Swertz, Morris A. and + Thompson, Mark and + van der Lei, Johan and + van Mulligen, Erik and + Velterop, Jan and + Waagmeester, Andra and + Wittenburg, Peter and + Wolstencroft, Katherine and + Zhao, Jun and + Mons, Barend}, + title={The {FAIR} Guiding Principles for scientific data management and stewardship}, + journal={Scientific Data}, + year={2016}, + month={Mar}, + day={15}, + volume={3}, + number={1}, + pages={160018}, + abstract={There is an urgent need to improve the infrastructure supporting the reuse of scholarly data. A diverse set of stakeholders---representing academia, industry, funding agencies, and scholarly publishers---have come together to design and jointly endorse a concise and measureable set of principles that we refer to as the FAIR Data Principles. The intent is that these may act as a guideline for those wishing to enhance the reusability of their data holdings. Distinct from peer initiatives that focus on the human scholar, the FAIR Principles put specific emphasis on enhancing the ability of machines to automatically find and use the data, in addition to supporting its reuse by individuals. This Comment is the first formal publication of the FAIR Principles, and includes the rationale behind them, and some exemplar implementations in the community.}, + issn={2052-4463}, + doi={10.1038/sdata.2016.18}, + url={https://doi.org/10.1038/sdata.2016.18} +} + +@misc{Wilkinson2024, + title={Applying the {FAIR} Principles to Computational Workflows}, + author={Sean R. Wilkinson and + Meznah Aloqalaa and + Khalid Belhajjame and + Michael R. Crusoe and + Bruno de Paula Kinoshita and + Luiz Gadelha and + Daniel Garijo and + Ove Johan Ragnar Gustafsson and + Nick Juty and + Sehrish Kanwal and + Farah Zaib Khan and + Johannes Köster and + Karsten Peters-von Gehlen and + Line Pouchard and + Randy K. Rannow and + Stian Soiland-Reyes and + Nicola Soranzo and + Shoaib Sufi and + Ziheng Sun and + Baiba Vilne and + Merridee A. Wouters and + Denis Yuen and + Carole Goble}, + year={2024}, + eprint={2410.03490}, + archivePrefix={arXiv}, + primaryClass={cs.DL}, + url={https://arxiv.org/abs/2410.03490}, +} From dbedca1bb919f097144534418cf5b9fedd98d628 Mon Sep 17 00:00:00 2001 From: Trevor Keller Date: Mon, 6 Jan 2025 11:55:46 -0500 Subject: [PATCH 11/19] Drop backup file. 
--- .gitignore | 1 + .../ch3-data-generation-and-curation.md.bkup | 500 ------------------ 2 files changed, 1 insertion(+), 500 deletions(-) delete mode 100644 pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup diff --git a/.gitignore b/.gitignore index e6a5571..9174fdc 100644 --- a/.gitignore +++ b/.gitignore @@ -1,3 +1,4 @@ +*.bkup *.ipynb_checkpoints/ *.log */.DS_Store diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup deleted file mode 100644 index eb1761d..0000000 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md.bkup +++ /dev/null @@ -1,500 +0,0 @@ -# Data Generation and Curation - -- *[Trevor Keller](https://www.nist.gov/people/trevor-keller), NIST*, [@tkphd] -- *[Daniel Wheeler](https://www.nist.gov/people/daniel-wheeler), NIST*, [@wd15] -- *[Damien Pinto](https://ca.linkedin.com/in/damien-pinto-4748387b), McGill*, [@DamienPinto] - - -## Overview - - - Look at lit on data and see how this is implemented - - Generation and dissemination - -## Ideas - - - Data formats - - FAIR - - Metadata (hierarchical data standards (look for current versions)) - - What standards exist - - One or two examples of phase field data stored currently - - Use an existing - - Create our own example - - Practical choices for storing data (figshare, zenodo, dryad, MDF) - - Deciding what data to keep - - What data to store when publishing - - What is supplementary material versus store versus leave on hard drive - - Lit review, good citations - - minting DOIs for the data - - might include simulatioin execution and logging - - how frequently to store data - - how to store - -## How do we want to structure the sections? - - 1. Intro (Daniel) - - What is data? - - What is metadata? - - Why do we need curate data? - - What is the motivation for this document? - - What should the reader get out of this document - - Why is this useful - - Create a distinction between software, data, and post-processed results - - 2. Data Generation (Trevor) - - HPC - - file systems - - data formats - - formats not to use (e.g., don't use serialization that depends on the version of the code that reads and writes because code changes) - - don't use pickles - - restarts - - data frequency - - post-processing -> refer to other document for this - - precision - - importance of folder structure - - 3. Data Curation (Trevor) - - Why is curating data important? - - Why do we need to store our data - - What formats can we use - - Is my data too large, data sizes - - What is useful for third party / subsequent users - - How might the data be used for AI or something else - - Could a reviewer make use of your curated data - - Storing post-processed data and raw data and which to store or keep - - Minting DOIs for your software when publishing a paper - - FAIR - - 4. Metadata standards (Daniel) - - Why do we need to keep some metadata beyond the data - - Zoo of things like data dictionaries, ontologies - - however, these are not well developed for our use case - - For example, you curate on Zenodo - - what extra data should you include - - how to describe the data files - - how to maintain some minimalist info about the simulation that generated the run - - When, why and how I ran this simulation - - What software - - Give example of a yaml file with 10 flat fields - - The future should be better in this regard. People actively working to improve this issue. - - 5. 
Examples - - Practical examples (Trevor) - - Using Zenodo for a PFHub record to store data and metadata - - Relatively rich metadata scheme - - - Simulation from scratch (Damien) - - data generation - - folder structure - - HPC issues with data - - capture process / descriptive parameters for the data that - are useful for subsequent ML practitioners that use the data - - ML / store data - - Narrative of what gets stored to disk - - Decisions of what to keep and how frequently to save data - - Auxiliary metadata decisions - - - 6. Summary (Daniel) - 7. Biblio (Daniel) - ---- - -## Old version - -- Save the data from your published work as much as possible, with meta data -- Save the inputs used to produce the results from all your published work - -### FAIR Data - -We discussed the [FAIR Principles] at [CHiMaD Phase-Field XIII][fair-phase-field]: - -#### Findable - -- [ ] (Meta)data are assigned a globally unique and persistent identifier -- [ ] Data are described with rich metadata (defined by R1 below) -- [ ] Metadata clearly and explicitly include the identifier of the data they describe -- [ ] (Meta)data are registered or indexed in a searchable resource - -#### Accessible - -- [ ] (Meta)data are retrievable by their identifier using a standardized - communications protocol - - [ ] The protocol is open, free, and universally implementable - - [ ] The protocol allows for an authentication and authorisation procedure, - where necessary -- [ ] Metadata are accessible, even when the data are no longer available - -#### Interoperable - -- [ ] (Meta)data use a formal, accessible, shared, and broadly applicable language - for knowledge representation. -- [ ] (Meta)data use vocabularies that follow FAIR principles -- [ ] (Meta)data include qualified references to other (meta)data - -#### Reusable - -- [ ] (Meta)data are richly described with a plurality of accurate and relevant attributes - - [ ] (Meta)data are released with a clear and accessible data usage license - - [ ] (Meta)data are associated with detailed provenance - - [ ] (Meta)data meet domain-relevant community standards - -### Zenodo - -Historically, [PFHub] has accepted datasets linked from any host on the Web. -At this time, we recommend using [Zenodo] to host your benchmark data. Why? *It's not "just" a shared folder.* - -* Guided prompts to describe what you're uploading -* DOI is automatically assigned to your dataset -* Basic metadata exported in multiple formats -* Browser-based viewers for CSV, Markdown, PDF, images, videos - -#### Metadata Examples - -Zenodo gives you the option to import a repository directly from GitHub. The original [FAIR Phase-field talk](https://doi.org/10.5281/zenodo.6540105) was "uploaded" this way, producing the following record. While basic authorship information was captured, this tells an interested person or machine nothing meaningful about the dataset. 
- -```json -{ - "@context": "https://schema.org/", - "@id": "https://doi.org/10.5281/zenodo.6540105", - "@type": "SoftwareSourceCode", - "name": "tkphd/fair-phase-field-data: CHiMaD Phase-field XIII", - "description": "FAIR Principles for Phase-Field Practitioners", - "version": "v0.1.0", - "license": "", - "identifier": "https://doi.org/10.5281/zenodo.6540105", - "url": "https://zenodo.org/record/6540105", - "datePublished": "2022-05-11", - "creator": [{ - "@type": "Person", - "givenName": "Trevor", - "familyName": "Keller", - "affiliation": "NIST"}], - "codeRepository": "https://github.com/tkphd/fair-phase-field-data/tree/v0.1.0" -} -``` - -The *strongly* preferred method is to upload files directly. The following record represents an upload for [Benchmark 1b using HiPerC](https://doi.org/10.5281/zenodo.1124941). I would consider this metadata ***rich!*** - -```json -{ - "@context": "https://schema.org/", - "@id": "https://doi.org/10.5281/zenodo.1124941", - "@type": "Dataset", - "name": "hiperc-gpu-cuda-spinodal" - "description": "Solution to the CHiMaD Phase Field benchmark problem on spinodal decomposition using CUDA, with a 9-point discrete Laplacian stencil", - "identifier": "https://doi.org/10.5281/zenodo.1124941", - "license": "https://creativecommons.org/licenses/by/4.0/legalcode", - "url": "https://zenodo.org/record/1124941", - "datePublished": "2017-12-21", - "creator": [{ - "@type": "Person", - "@id": "https://orcid.org/0000-0002-2920-8302", - "givenName": "Trevor", - "familyName": "Keller", - "affiliation": "NIST"}], - "keywords": ["phase-field", "pfhub", "chimad"], - "sameAs": ["https://doi.org/10.6084/m9.figshare.5715103.v2"], - "distribution": [ - { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/free-energy-9pt.csv", - "encodingFormat": "csv" - }, { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0000000.png", - "encodingFormat": "png" - }, { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0100000.png", - "encodingFormat": "png" - }, { - "@type": "DataDownload", - "contentUrl": "https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal.0200000.png", - "encodingFormat": "png" - }] -} -``` - -#### Metadata Files - -After uploading the HiPerC simulation data, I also registered it with PFHub using `meta.yaml`. This file tells the website-generating machinery what to do with the dataset, and provides additional information about the resources required to perform the simulation. 
- -```yaml ---- -benchmark: - id: 1b - version: '1' -data: -- name: run_time - values: - - sim_time: '200000' - wall_time: '7464' -- name: memory_usage - values: - - unit: KB - value: '308224' -- description: free energy data - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/free-energy-9pt.csv -- description: microstructure at t=0 - type: image - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-000000.png -- description: microstructure at t=100,000 - type: image - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-100000.png -- description: microstructure at t=200,000 - type: image - url: https://zenodo.org/api/files/ce1ca4a3-b6bc-4e2c-9b70-8fe45fc243fd/spinodal-200000.png -metadata: - author: - email: trevor.keller@nist.gov - first: Trevor - github_id: tkphd - last: Keller - hardware: - acc_architecture: gpu - clock_rate: '1.48' - cores: '1792' - cpu_architecture: x86_64 - nodes: 1 - parallel_model: threaded - implementation: - repo: - url: https://github.com/usnistgov/hiperc - version: b25b14acda7c5aef565cdbcfc88f2df3412dcc46 - simulation_name: hiperc_cuda - summary: HiPerC spinodal decomposition result using CUDA on a Tesla P100 - timestamp: 18 December, 2017 -``` - -This file is not part of my dataset: it resides in the [PFHub repository on GitHub]. Furthermore, since the structure of this file specifically suits PFHub, it is of no use at all to other software, websites, or researchers. - -### Structured Data Schemas - -In the Zenodo metadata above, note the `@context` fields: [Schema.org] is a [structured data schema] *and controlled vocabulary* for describing things on the Internet. How is this useful? - -Consider the [CodeMeta] project. It creates metadata files for software projects using [Schema.org] building blocks. There's even a handy [CodeMeta Generator]! If you maintain a phase-field software framework, you can (and should!) use it to document your code in a standards-compliant, machine-readable format. This improves interoperability and reusability! - -```json -{ - "@context": "https://doi.org/10.5063/schema/codemeta-2.0", - "@type": "SoftwareSourceCode", - "license": "https://spdx.org/licenses/CC-PDDC", - "codeRepository": "git+https://github.com/usnistgov/hiperc", - "dateCreated": "2017-08-07", - "dateModified": "2019-03-04", - "downloadUrl": "https://github.com/usnistgov/hiperc/releases/tag/v1.0", - "issueTracker": "https://github.com/usnistgov/hiperc/issues", - "name": "HiPerC", - "version": "1.0.0", - "description": "High-Performance Computing in C and CUDA", - "applicationCategory": "phase-field", - "developmentStatus": "inactive", - "programmingLanguage": ["C", "CUDA", "OpenCL", "OpenMP", "TBB"], - "author": [ - { - "@type": "Person", - "@id": "https://orcid.org/my-orcid?orcid=0000-0002-2920-8302", - "givenName": "Trevor", - "familyName": "Keller", - "email": "trevor.keller@nist.gov", - "affiliation": { - "@type": "Organization", - "name": "NIST" - } - } - ] -} -``` - -That's nice! But what about our datasets? Shouldn't the [PFHub] metadata "describing" a dataset live alongside that data? - -#### Towards a Phase-Field Schema - -We are working to build a phase-field schema (or schemas) using [Schema.org] and the [schemaorg] Python library. The work-alike port of `meta.yaml` looks like the following. - -> *N.B.:* We're going to deploy a generator similar to CodeMeta's so you won't have to write this! 
- -```json -{ - "@context": "https://www.schema.org", - "@type": "DataCatalog", - "author": [ - { - "@type": "Person", - "affiliation": { - "@type": "GovernmentOrganization", - "name": "Materials Science and Engineering Division", - "parentOrganization": { - "@type": "GovernmentOrganization", - "name": "Material Measurement Laboratory", - "parentOrganization": { - "@type": "GovernmentOrganization", - "address": { - "@type": "PostalAddress", - "addressCountry": "US", - "addressLocality": "Gaithersburg", - "addressRegion": "Maryland", - "postalCode": "20899", - "streetAddress": "100 Bureau Drive" - }, - "identifier": "NIST", - "name": "National Institute of Standards and Technology", - "parentOrganization": "U.S. Department of Commerce", - "url": "https://www.nist.gov" - } - } - }, - "email": "trevor.keller@nist.gov", - "familyName": "Keller", - "givenName": "Trevor", - "identifier": "tkphd", - "sameAs": "https://orcid.org/0000-0002-2920-8302" - }, { - "@type": "Person", - "affiliation": { - "@type": "GovernmentOrganization", - "name": "Materials Science and Engineering Division", - "parentOrganization": { - "@type": "GovernmentOrganization", - "name": "Material Measurement Laboratory", - "parentOrganization": { - "@type": "GovernmentOrganization", - "address": { - "@type": "PostalAddress", - "addressCountry": "US", - "addressLocality": "Gaithersburg", - "addressRegion": "Maryland", - "postalCode": "20899", - "streetAddress": "100 Bureau Drive" - }, - "identifier": "NIST", - "name": "National Institute of Standards and Technology", - "parentOrganization": "U.S. Department of Commerce", - "url": "https://www.nist.gov" - } - } - }, - "email": "daniel.wheeler@nist.gov", - "familyName": "Wheeler", - "givenName": "Daniel", - "identifier": "wd15", - "sameAs": "https://orcid.org/0000-0002-2653-7418" - } - ], - "dataset": [ - { - "@type": "Dataset", - "distribution": [ - { - "@type": "PropertyValue", - "name": "parallel_nodes", - "value": 1 - }, { - "@type": "PropertyValue", - "name": "cpu_architecture", - "value": "amd64" - }, { - "@type": "PropertyValue", - "name": "parallel_cores", - "value": 12 - }, { - "@type": "PropertyValue", - "name": "parallel_gpus", - "value": 1 - }, { - "@type": "PropertyValue", - "name": "gpu_architecture", - "value": "nvidia" - }, { - "@type": "PropertyValue", - "name": "gpu_cores", - "value": 6144 - }, { - "@type": "PropertyValue", - "name": "wall_time", - "unitCode": "SEC", - "unitText": "s", - "value": 384 - }, { - "@type": "PropertyValue", - "name": "memory_usage", - "unitCode": "E63", - "unitText": "mebibyte", - "value": 1835 - } - ], - "name": "irl" - }, { - "@type": "Dataset", - "distribution": [ - { - "@type": "DataDownload", - "contentUrl": "8a/free_energy_1.csv", - "name": "free energy" - }, { - "@type": "DataDownload", - "contentUrl": "8a/solid_fraction_1.csv", - "name": "solid fraction" - }, { - "@type": "DataDownload", - "contentUrl": "8a/free_energy_2.csv", - "name": "free energy" - }, { - "@type": "DataDownload", - "contentUrl": "8a/solid_fraction_2.csv", - "name": "solid fraction" - }, { - "@type": "DataDownload", - "contentUrl": "8a/free_energy_3.csv", - "name": "free energy" - }, { - "@type": "DataDownload", - "contentUrl": "8a/solid_fraction_3.csv", - "name": "solid fraction" - } - ], - "name": "output" - } - ], - "dateCreated": "2022-10-25T19:25:02+00:00", - "description": "A fake dataset for Benchmark 8a unprepared using FiPy by @tkphd & @wd15", - "isBasedOn": { - "@type": "SoftwareSourceCode", - "codeRepository": 
"https://github.com/tkphd/fake-pfhub-bm8a", - "description": "Fake benchmark 8a upload with FiPy", - "runtimePlatform": "fipy", - "targetProduct": "amd64", - "version": "9df6603e" - }, - "isPartOf": { - "@type": "Series", - "identifier": "8a", - "name": "Homogeneous Nucleation", - "url": "https://pages.nist.gov/pfhub/benchmarks/benchmark8.ipynb" - }, - "keywords": [ - "phase-field", - "benchmarks", - "pfhub", - "fipy", - "homogeneous-nucleation" - ], - "license": "https://www.nist.gov/open/license#software" -} -``` - - - -[@tkphd]: https://github.com/tkphd -[@wd15]: https://github.com/wd15 -[@DamienPinto]: https://github.com/DamienPinto -[CodeMeta]: https://codemeta.github.io -[CodeMeta Generator]: https://codemeta.github.io/codemeta-generator/ -[FAIR Principles]: https://www.go-fair.org/fair-principles/ -[PFHub]: https://pages.nist.gov/pfhub -[PFHub repository on GitHub]: https://github.com/usnistgov/pfhub -[Schema.org]: https://www.schema.org -[Zenodo]: https://zenodo.org -[fair-phase-field]: https://doi.org/10.5281/zenodo.7254581 -[schemaorg]: https://github.com/openschemas/schemaorg -[structured data schema]: https://en.wikipedia.org/wiki/Data_model From aff79d7648a8aab188b7dd2301e40360b83f515d Mon Sep 17 00:00:00 2001 From: Trevor Keller Date: Mon, 6 Jan 2025 12:00:04 -0500 Subject: [PATCH 12/19] Include Mermaid. --- requirements.txt | 1 + 1 file changed, 1 insertion(+) diff --git a/requirements.txt b/requirements.txt index 7e821e4..311d021 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,3 +1,4 @@ jupyter-book matplotlib numpy +sphinxcontrib-mermaid From 6f226e760804478c09b270ebdc03e97ce35f904b Mon Sep 17 00:00:00 2001 From: Trevor Keller Date: Mon, 6 Jan 2025 12:03:18 -0500 Subject: [PATCH 13/19] Revised BiBTeX keys. --- .../bp-guide-gh/ch1-model-formulation.md | 12 ++++++------ 1 file changed, 6 insertions(+), 6 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md b/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md index e106ed3..24b56ef 100644 --- a/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md +++ b/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md @@ -13,7 +13,7 @@ Phase field models are, quite generally, extensions of classical non-equilibrium thermodynamics. There are quite a few treatments of this in the literature. Here we will follow formulations similar to those developed by Sekerka and Bi in "Interfaces for the Twenty-First Century." -{cite}`bi_phase-field_1998` In that spirit, we will start with a very general +{cite}`Bi1998` In that spirit, we will start with a very general formulation, including hydrodynamics, and then simplify the problem. For those interested in starting with a simple model, you can skip over much of the initial formulation and jump to the section on the further reduction of the @@ -64,10 +64,10 @@ A complete discussion of how to use the above rules in this context is outside the scope of this best-practice guide, but we offer a highly abbreviated discussion of the ''flavor,'' following the ideas of irreversible thermodynamics (we largely are following works like deGroot and Mazur -{cite}`groot_non-equilibrium_2013`, although the continuum mechanics community +{cite}`Groot2013`, although the continuum mechanics community may be more comfortable with Noll, Coleman, and Truesdale -{cite}`malvern_introduction_1969`), as well as the aforementioned work by -Sekerka and Bi {cite}`bi_phase-field_1998`. 
+{cite}`Malvern1969`), as well as the aforementioned work by +Sekerka and Bi {cite}`Bi1998`. ### Mass @@ -239,7 +239,7 @@ arise, and also, for careful readers, how to extend this approach to multiple phases and additional gradient corrections, we will proceed with some more simplifications for a less complex system. For those who are interested in solid state systems that can creep, the work of Mishin, Warren, Sekerka, and -Boettinger (2013) {cite}`mishin_irreversible_2013` extends this framework. +Boettinger (2013) {cite}`Mishin2013` extends this framework. Here we eliminate the ${\bf v}$ equations by fiat, assuming that only diffusion controls the evolution of the system, which is often reasonable in microgravity situations. We can also go further, and consider an isothermal system. Then we @@ -299,7 +299,7 @@ modeling a liquid-solid binary alloy, although the details are less important that understanding that a specific choice of state function has to come from _somewhere_. Following the treatment in the Annual Reviews of Materials Research (2001) by Boettinger, Warren, Beckerman and Karma -{cite}`boettinger_phase-field_2002` we note that the free energy can be +{cite}`Boettinger2002` we note that the free energy can be determined through a multi-step process where the two components are called $A$ and $B$ respectively: From b2533f7c2e54eb86ac46c0323ddf250142cbd17a Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Mon, 6 Jan 2025 20:01:47 -0500 Subject: [PATCH 14/19] clean up writing raw data to disk section --- flake.nix | 2 +- .../ch3-data-generation-and-curation.md | 21 +++++++++---------- pf-recommended-practices/references.bib | 10 +++++++++ 3 files changed, 21 insertions(+), 12 deletions(-) diff --git a/flake.nix b/flake.nix index 0b42686..50e8159 100644 --- a/flake.nix +++ b/flake.nix @@ -40,7 +40,7 @@ python = pkgs.python313; }; env = pkgs.poetry2nix.mkPoetryEnv args; - app = pkgs.poetry2nix.mkPoetryApplication args;3 + app = pkgs.poetry2nix.mkPoetryApplication args; in rec { ## See https://github.com/nix-community/poetry2nix/issues/1433 diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 8a73564..021cf99 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -122,10 +122,7 @@ can be defined as follow. - Using workflow tools - High performance computing (HPC) environments and parallel writes -These considerations are often conflicting, require trial and error to -determine the best approach and are highly specific to the requirements of the -workflow and post-processing. However, there are some general guidelines that -will be outlined below. +These considerations will be outlined below. ### Writing raw data to disk @@ -146,13 +143,15 @@ when writing data it is best to use single large writes to disk as opposed to multiple small writes especially on shared file systems (i.e. "perform more write bytes per write function call" {cite}`Paul2020`). In practice this could involve caching multiple field variables across multiple save steps and then -writing to disk as a single data blob in an HDF5 file for example. This is a -trade-off between IO efficiency and losing all the data if the job crashes -without writing any useful data as well as simulation performance, memory usage -and communication overhead when considering parallel simulations. 
Overall, it -is essential that the IO part of a code is well profiled using different write -configurations. The replicability of writes should also be tested by checking -the hash of data based on parallel configurations and write frequencies. +writing to disk as a single data blob in an HDF5 file for example. Caching and +chunking data writes is a trade-off between IO efficiency, data loss due to +jobs crashing, simulation performance, memory usage and communication overhead +for parallel jobs. Overall, it is essential that the IO part of a code is well +profiled using different write configurations. The replicability of writes +should also be tested by checking the hash of data files while varying parallel +configurations, write frequencies and data chunking strategies. I/O performance +can be a major bottleneck for larger parallel simulations, but there are tools +to help characterize I/O, see {cite}`Ather2024` for an overview. ### File formats diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib index 86dc4d1..713b9bc 100644 --- a/pf-recommended-practices/references.bib +++ b/pf-recommended-practices/references.bib @@ -365,3 +365,13 @@ @misc{Wilkinson2024 primaryClass={cs.DL}, url={https://arxiv.org/abs/2410.03490}, } + +@misc{Ather2024, + title={Parallel I/O Characterization and Optimization on Large-Scale HPC Systems: A 360-Degree Survey}, + author={Hammad Ather and Jean Luca Bez and Chen Wang and Hank Childs and Allen D. Malony and Suren Byna}, + year={2024}, + eprint={2501.00203}, + archivePrefix={arXiv}, + primaryClass={cs.DC}, + url={https://arxiv.org/abs/2501.00203}, +} From 74842fbc077156ae67520dadfcef3718157ebc1f Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Tue, 7 Jan 2025 13:45:07 -0500 Subject: [PATCH 15/19] clean up file formats section of data generation guide --- .../ch3-data-generation-and-curation.md | 76 +++++++++---------- pf-recommended-practices/references.bib | 19 +++++ 2 files changed, 54 insertions(+), 41 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 021cf99..b908351 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -151,32 +151,27 @@ profiled using different write configurations. The replicability of writes should also be tested by checking the hash of data files while varying parallel configurations, write frequencies and data chunking strategies. I/O performance can be a major bottleneck for larger parallel simulations, but there are tools -to help characterize I/O, see {cite}`Ather2024` for an overview. +to help characterize I/O, see {cite}`Ather2024` for a thorough overview. ### File formats -In general, when running phase-field simulations, the user is limited to the -file format that the software supports. For example, if the research is using -PRISMS-PF the default data format is VTK and there is no reason to seek an -alternative. If an alternative file format is required then the researcher -could code a C++ function to write data in an alternative format to VTK such as -NetCDF. - As a general rule it is best to choose file formats that work with the tools already in use and / or that your colleagues are using. There are other -considerations to be aware of though. 
Human readable formats such as CSV and -JSON are often useful for small medium data sets (such as derived quantities) -as some metadata can be embedded alongside the raw data resulting in a FAIRer -data product than standard binary formats. Some binary file formats also -support metadata and might be more useful for final data curation of a +considerations to be aware of though. Human readable formats such as CSV, JSON +or even YAML are often useful for small medium data sets (such as derived +quantities) as some metadata can be embedded alongside the raw data resulting +in a FAIRer data product than standard binary formats. Some binary file formats +also support metadata and might be more useful for final data curation of a phase-field study even if not used during the research process. One main benefit of using binary data (beyond saving disk space) is the ability to -preserve full precision for floating point numbers. The longevity of file -formats should be considered as well. A particularly egregious case of ignoring -longevity would be using the Pickle file format in Python, which is both -language dependent and code dependent. It is an example of data serialization, -which is used mainly for in-process data storage for asynchronous tasks, but -not good for long term data storage. +preserve full precision for floating point numbers. See the [Working with +Data][working-with-data] section of the Python for Scientific Computing +document for a comparison of binary versus text based formats. The longevity of +file formats should be considered as well. A particularly egregious case of +ignoring longevity would be using the Pickle file format in Python, which is +both language dependent and code dependent. It is an example of data +serialization, which is used mainly for in-process data storage for +asynchronous tasks and checkpointing, but not good for long term data storage. There are many binary formats used for storing field data based on an Eulerian mesh or grid. Common formats for field data are NetCDF, VTK, XDMF and @@ -186,28 +181,24 @@ different native file formats based on both XML and HDF5 (both non-binary and binary). The VTK library works well with FE simulations supporting many different element types as well as parallel data storage for domain decomposition. See the [XML file formats documentation][vtk-xml] for VTK for -an overview of zoo of different file extensions and their meaning. In contrast -to VTK, NetCDF is more geared towards gridded data having arisen from -atmospheric research, which uses more FD and FV than FE. For a comparison of -performance and metrics for different file types see the -[Python MeshIO tools's README.md][meshio]. - -The Python MeshIO tool is a good place to start for IO when writing custom -phase-field codes in Python (or Julia using `pyimport`). MeshIO is also a good -place to start for exploring, debugging or picking apart file data in an -interactive Python environment, which can be harder to do with dedicated -viewing tools like Paraview. The scientific Python ecosystem is very rich with -tools for data manipulation and storage such as Pandas, which supports storage -in many different formats, and xarray for higher dimensional data. xarray -supports NetCDF file storage, which includes coordinate systems and metadata in +an overview of the many different file extensions and their meanings. 
In +contrast to VTK, NetCDF is more geared towards gridded data having arisen from +atmospheric research (using finite difference grids rather than finite element +meshes). For a comparison of performance and metrics for different file types +see the [MeshIO README.md][meshio]. + +The MeshIO tool {cite}`Schlomer` is a good place to start for IO when writing +custom phase-field codes in Python (or Julia using `pyimport`). MeshIO is also +a good place to start for exploring, debugging or picking apart file data in an +interactive Python environment. Debugging data can be much more difficult with +GUI style data viewers such as Paraview. The scientific Python ecosystem is +very rich with tools for data manipulation and storage such as Pandas, which +supports table data storage in many different formats, and xarray +{cite}`Hoyer2017` for higher dimensional data storage. [xarray supports NetCDF +file storage][xarray-io], which includes coordinate systems and metadata in HDF5. Both Pandas and xarray can be used in a parallel or a distributed manner -in conjucntion with Dask. Dask along with xarray supports writing to the Zarr -data format. Zarr allows data to be stored on disk during analysis to avoid -loading the entire data object into memory. - -- https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/ -- https://docs.vtk.org/en/latest/index.html -- https://docs.xarray.dev/en/stable/user-guide/io.html= +in conjunction with Dask. Dask along with xarray supports writing to the Zarr +data format which supports out-of-memory operations. (label-restarts)= ### Recovering from crashes and restarts @@ -456,5 +447,8 @@ Dockstore and Workflowhub https://arxiv.org/pdf/2410.03490 [schemaorg]: https://github.com/openschemas/schemaorg [structured data schema]: https://en.wikipedia.org/wiki/Data_model [link1]: https://workflows.community/groups/fair/best-practices/ -[mehio]: https://github.com/nschloe/meshio?tab=readme-ov-file#performance-comparison +[meshio]: https://github.com/nschloe/meshio?tab=readme-ov-file#performance-comparison [vtk-xml]: https://docs.vtk.org/en/latest/design_documents/VTKFileFormats.html#xml-file-formats +[working-with-data]: https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/#binary-file-formats +[xarray-io]: https://docs.xarray.dev/en/stable/user-guide/io.html + diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib index 713b9bc..12da889 100644 --- a/pf-recommended-practices/references.bib +++ b/pf-recommended-practices/references.bib @@ -375,3 +375,22 @@ @misc{Ather2024 primaryClass={cs.DC}, url={https://arxiv.org/abs/2501.00203}, } + +@misc{Schlomer, + author={Schlömer, Nico}, + doi={10.5281/zenodo.1173115}, + license = {MIT}, + title={{meshio: Tools for mesh files}}, + url={https://github.com/nschloe/meshio} +} + +@article{Hoyer2017, + author={Hoyer, Stephan and Joseph, Hamman}, + doi={10.5334/jors.148}, + journal={Journal of Open Research Software}, + month=apr, + number={1}, + title={{xarray: N-D labeled Arrays and Datasets in Python}}, + volume={5}, + year={2017} +} \ No newline at end of file From 9bb53168a0684778a2990b0a8b21cd80263f5e92 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Tue, 7 Jan 2025 18:53:31 -0500 Subject: [PATCH 16/19] clean up checkpointing section in data generation guide --- .../ch3-data-generation-and-curation.md | 118 ++++++++---------- pf-recommended-practices/references.bib | 67 ++++++---- 2 files changed, 92 insertions(+), 93 deletions(-) diff --git 
a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index b908351..ba07ce6 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -205,82 +205,66 @@ data format which supports out-of-memory operations. A study from 2020 of HPC systems calculated the success rate (I.e. no error code on completion) of multi-node jobs with non-shared memory at between 60% -and 70% {cite}`Kumar2020`. This success rate diminishes rapidly as the run time -of jobs increases. Needless to say that check-pointing is absolutely required -for any jobs of more than a few hours. Nearly everyday, an HPC platform will +and 70% {cite}`Kumar2020`. Needless to say that check-pointing is absolutely +required for any jobs of more than a day. Nearly everyday, an HPC platform will experience some sort of failure {cite}`Benoit2022b`, {cite}`Aupy2014`. That doesn't mean that every job will fail every day, but it would be optimistic to -think that jobs will go beyond a week without some issues. Given that fact one -can estimate how long it might take to run a job without check-pointing. A very -rough estimate for expected completion time assuming instantaneous restarts and -no queuing time is given by, +think that jobs will go beyond a week without some issues. Given the failure +rate one can estimate how long it might take to run a job without +check-pointing. A very rough estimate for expected completion time assuming +instantaneous restarts and no queuing time is given by, $$ E(T) = \frac{1}{2} \left(1 + e^{T / \mu} \right) T $$ -where $T$ is the nominal job completion time with no failures and -$\mu$ is the mean time to failure. The formula predicts an expected -time of 3.8 days for a job that nominally runs for 3 days with a $\mu$ -of one week. The formula is of course a gross simplification and -includes many invalid assumptions, but regardless of the assumed -failure distribution the exponential time increase without -check-pointing is inescapable. Assuming that we're agreed on the need -for checkpoint, the next step is to decide on the optimal time -interval between checkpoints. This is given by the well known -Young/Daly formula, +where $T$ is the nominal job completion time with no failures and $\mu$ is the +mean time to failure. The formula predicts an expected time of 3.8 days for a +job that nominally runs for 3 days with a $\mu$ of one week. The formula is of +course a gross simplification and includes many assumptions that are invalid in +practice (such as a uniform failure distribution), but regardless of the +assumptions the exponential time increase without check-pointing is +inescapable. Assuming that we're agreed on the need for checkpoints, the next +step is to decide on the optimal time interval between checkpoints. This is +given by the well known Young/Daly formula, $$ W = \sqrt{2 \mu C} $$ -where $C$ is the time taken for a checkpoint {cite}`Benoit2022a`, -{cite}`BautistaGomez2024`. The Young/Daly formula accounts for the trade off -between the start up time cost for a job to get back to its original point of -failure and the cost associated with writing the checkpoint to disk. For -example, with a weekly failure rate and $C=6$ minutes, $W=5.8$ hours. In -practice these estimates for $\mu$ and $C$ might be a little pessimistic, but -be aware of the trade off {cite}`Benoit2022b`. 
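The example numbers quoted above can be checked with a few lines of Python; the
one week mean time to failure, three day nominal run time and six minute
checkpoint cost are simply the illustrative values used in this discussion, not
recommended settings.

```python
import math

mu = 7 * 24   # mean time to failure in hours (one week, example value)
T = 3 * 24    # nominal job completion time in hours (three days)
C = 0.1       # wall time per checkpoint in hours (six minutes)

# expected completion time without check-pointing, E(T) = (1/2) (1 + exp(T / mu)) T
expected = 0.5 * (1 + math.exp(T / mu)) * T

# Young/Daly optimal interval between checkpoints, W = sqrt(2 mu C)
interval = math.sqrt(2 * mu * C)

print(expected / 24)  # ~3.8 days
print(interval)       # ~5.8 hours
```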
Note that some HPC systems have -upper bounds on run times (e.g. TACC has a 7 days time limit so $\mu<7$ days -regardless of other system failures). - -Given the above theory, what is the some practical advice for -check-pointing jobs? - -- Estimate both $\mu$ and $C$. It might be worth discussing the $\mu$ - value with the HPC cluster administrator to get some valid - numbers. Of course $C$ can be estimated by running test jobs. It's - good to know if you should be writing checkpoints every day or every - hour -- definitely not every minute! -- Ensure that restarts are deterministic (i.e. results don't change - between a job that restarts and one that doesn't). One way to do - this is to hash output files assuming that the simulation itself is +where $C$ is the wall time required to execute the code associated with a +checkpoint {cite}`Benoit2022a`, {cite}`BautistaGomez2024`. The Young/Daly +formula accounts for the trade off between the start up time cost for a job to +get back to its original point of failure and the cost associated with writing +the checkpoint to disk. For example, with a weekly failure rate and $C=6$ +minutes the optimal write frequency is 5.8 hours. In practice these estimates +for $\mu$ and $C$ might be a little pessimistic, but be aware of the trade off +{cite}`Benoit2022b`. Note that some HPC systems have upper bounds on run +times. The Texas Advanced Computing Center has an upper bound of 7 days for +most jobs so $\mu<7$ days regardless of other system failures. + +Given the above theory, what are some practical conclusions to draw? + +- Take some time to estimate both $\mu$ and $C$. It might be worth discussing + the $\mu$ value with the HPC cluster administrator to get some valid + numbers. Of course $C$ can be estimated by running test jobs. Estimating + these values can be difficult due to HPC cluster volatility, but it's good to + know if you should be checkpointing every day or every hour or even never + checkpointing at all in the circumstances that $W \approx T$. +- Ensure that restarts are deterministic (i.e. results don't change between a + job that restarts and one that doesn't). One way to do this is to compare + hashes from raw data output files assuming that the simulation itself is deterministic. -- Consider using a checkpointing library if you're using a custom - phase-field code or even a workflow tool such as Snakemake which has - the inbuilt ability to handle checkpointing. A tool like Snakemake - is good for large parameter studies where it is difficult to keep - track of which jobs wrote which files. The `pickle` library is - acceptable for checkpointint Python programs _in this short-lived - circumstance_. -- Use the built-in checkpointing available in the phase-field code - that you're using. -- Whatever system is being used, check that the checkpointing machinery - actually works and is deterministic. - -Some links to further reading: - -- -- -- -- -- -- -- -- -- -- -- -- -- -- [Job failures](https://pdf.sciencedirectassets.com/271503/1-s2.0-S0898122111X00251/1-s2.0-S0898122111005980/main.pdf) -- +- Consider using a checkpointing library if you're using a custom phase-field + code or even a workflow tool such as Snakemake which has the inbuilt ability + to handle checkpointing. A tool like Snakemake is good for large parameter + studies where it is difficult to keep track of a multiplicy of jobs and their + various output files making restarts complicated. 
The `pickle` library is + acceptable for checkpointing Python programs as checkpoint data is only + useful for a brief period. +- Many PDE solvers and dedicated phase field codes will have a checkpoint + mechanism built in. However, never trust the veracity of these + mechanisms. Always run your own tests varying parallel parameters and + checkpoint frequency! + +Checkpointing strategies on HPC clusters is a complex topic, see +{cite}`Herault2019` for an overview. ### Using Workflow Tools diff --git a/pf-recommended-practices/references.bib b/pf-recommended-practices/references.bib index 12da889..6a5d8de 100644 --- a/pf-recommended-practices/references.bib +++ b/pf-recommended-practices/references.bib @@ -15,6 +15,16 @@ @article{Aupy2014 abstract = {This paper deals with the impact of fault prediction techniques on checkpointing strategies. We extend the classical first-order analysis of Young and Daly in the presence of a fault prediction system, characterized by its recall and its precision. In this framework, we provide optimal algorithms to decide whether and when to take predictions into account, and we derive the optimal value of the checkpointing period. These results allow us to analytically assess the key parameters that impact the performance of fault predictors at very large scale.} } +@misc{Ather2024, + title={Parallel I/O Characterization and Optimization on Large-Scale HPC Systems: A 360-Degree Survey}, + author={Hammad Ather and Jean Luca Bez and Chen Wang and Hank Childs and Allen D. Malony and Suren Byna}, + year={2024}, + eprint={2501.00203}, + archivePrefix={arXiv}, + primaryClass={cs.DC}, + url={https://arxiv.org/abs/2501.00203}, +} + @article{BautistaGomez2024, author = {Leonardo Bautista-Gomez and Anne Benoit and @@ -133,6 +143,29 @@ @book{Groot2013 keywords = {Science / Physics / General}, } +@article{Herault2019, + author = {Thomas Herault and Yves Robert and Aurelien Bouteiller and Dorian Arnold and Kurt Ferreira and George Bosilca and Jack Dongarra}, + title = {Checkpointing Strategies for Shared High-Performance Computing Platforms}, + journal = {International Journal of Networking and Computing}, + volume = {9}, + number = {1}, + year = {2019}, + keywords = {}, + abstract = {}, + issn = {2185-2847}, pages = {28--52}, url = {http://www.ijnc.org/index.php/ijnc/article/view/195} +} + +@article{Hoyer2017, + author={Hoyer, Stephan and Joseph, Hamman}, + doi={10.5334/jors.148}, + journal={Journal of Open Research Software}, + month=apr, + number={1}, + title={{xarray: N-D labeled Arrays and Datasets in Python}}, + volume={5}, + year={2017} +} + @INPROCEEDINGS{Kumar2020, author={Kumar, Rakesh and Jha, Saurabh and @@ -249,6 +282,14 @@ @INPROCEEDINGS{Paul2020 doi={10.1109/HiPC50609.2020.00034} } +@misc{Schlomer, + author={Schlömer, Nico}, + doi={10.5281/zenodo.1173115}, + license = {MIT}, + title={{meshio: Tools for mesh files}}, + url={https://github.com/nschloe/meshio} +} + @article{Tourret2022, author = {Damien Tourret and Hong Liu and @@ -366,31 +407,5 @@ @misc{Wilkinson2024 url={https://arxiv.org/abs/2410.03490}, } -@misc{Ather2024, - title={Parallel I/O Characterization and Optimization on Large-Scale HPC Systems: A 360-Degree Survey}, - author={Hammad Ather and Jean Luca Bez and Chen Wang and Hank Childs and Allen D. 
Malony and Suren Byna}, - year={2024}, - eprint={2501.00203}, - archivePrefix={arXiv}, - primaryClass={cs.DC}, - url={https://arxiv.org/abs/2501.00203}, -} -@misc{Schlomer, - author={Schlömer, Nico}, - doi={10.5281/zenodo.1173115}, - license = {MIT}, - title={{meshio: Tools for mesh files}}, - url={https://github.com/nschloe/meshio} -} -@article{Hoyer2017, - author={Hoyer, Stephan and Joseph, Hamman}, - doi={10.5334/jors.148}, - journal={Journal of Open Research Software}, - month=apr, - number={1}, - title={{xarray: N-D labeled Arrays and Datasets in Python}}, - volume={5}, - year={2017} -} \ No newline at end of file From 4aead9572fd57f76cbb097dd3e61affce35c21b8 Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Wed, 8 Jan 2025 12:02:12 -0500 Subject: [PATCH 17/19] clean up using workflow tools section of data generation guide --- .../ch3-data-generation-and-curation.md | 83 +++++++++---------- 1 file changed, 40 insertions(+), 43 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index ba07ce6..e9a8702 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -203,10 +203,10 @@ data format which supports out-of-memory operations. (label-restarts)= ### Recovering from crashes and restarts -A study from 2020 of HPC systems calculated the success rate (I.e. no error -code on completion) of multi-node jobs with non-shared memory at between 60% -and 70% {cite}`Kumar2020`. Needless to say that check-pointing is absolutely -required for any jobs of more than a day. Nearly everyday, an HPC platform will +A study from 2020 of HPC systems calculated the success rate (I.e. no error code +on completion) of multi-node jobs with non-shared memory at between 60% and 70% +{cite}`Kumar2020`. Needless to say that check-pointing is absolutely required +for any jobs of more than a day. Nearly everyday, an HPC platform will experience some sort of failure {cite}`Benoit2022b`, {cite}`Aupy2014`. That doesn't mean that every job will fail every day, but it would be optimistic to think that jobs will go beyond a week without some issues. Given the failure @@ -236,16 +236,16 @@ the checkpoint to disk. For example, with a weekly failure rate and $C=6$ minutes the optimal write frequency is 5.8 hours. In practice these estimates for $\mu$ and $C$ might be a little pessimistic, but be aware of the trade off {cite}`Benoit2022b`. Note that some HPC systems have upper bounds on run -times. The Texas Advanced Computing Center has an upper bound of 7 days for -most jobs so $\mu<7$ days regardless of other system failures. +times. The Texas Advanced Computing Center has an upper bound of 7 days for most +jobs so $\mu<7$ days regardless of other system failures. Given the above theory, what are some practical conclusions to draw? - Take some time to estimate both $\mu$ and $C$. It might be worth discussing the $\mu$ value with the HPC cluster administrator to get some valid - numbers. Of course $C$ can be estimated by running test jobs. Estimating - these values can be difficult due to HPC cluster volatility, but it's good to - know if you should be checkpointing every day or every hour or even never + numbers. Of course $C$ can be estimated by running test jobs. 
Estimating these + values can be difficult due to HPC cluster volatility, but it's good to know + if you should be checkpointing every day or every hour or even never checkpointing at all in the circumstances that $W \approx T$. - Ensure that restarts are deterministic (i.e. results don't change between a job that restarts and one that doesn't). One way to do this is to compare @@ -256,8 +256,8 @@ Given the above theory, what are some practical conclusions to draw? to handle checkpointing. A tool like Snakemake is good for large parameter studies where it is difficult to keep track of a multiplicy of jobs and their various output files making restarts complicated. The `pickle` library is - acceptable for checkpointing Python programs as checkpoint data is only - useful for a brief period. + acceptable for checkpointing Python programs as checkpoint data is only useful + for a brief period. - Many PDE solvers and dedicated phase field codes will have a checkpoint mechanism built in. However, never trust the veracity of these mechanisms. Always run your own tests varying parallel parameters and @@ -268,28 +268,28 @@ Checkpointing strategies on HPC clusters is a complex topic, see ### Using Workflow Tools -The authors of this article use Snakemake for their workflows so will -discuss this in particular, but most of the ideas will apply to other -workflow tools. In general when running many phase-field jobs for a -parameter study or dealing with many pre and post-processing steps, it -is wise to employ a workflow tool such as Snakemake. One of the main -benefits of workflow tools is the automation of all the steps in a -workflow that researchers often neglect to implement in the absence of -a workflow tool (e.g. with bash scripts). This forces a structure and -the researchers to think carefully about the inputs / outputs and task -graph. As a side effect, the graph structure produces a much FAIRer -research object when the research is published and shared and even so -that the researcher can rerun the simulation steps in the future. For -example, when using Snakemake, the `Snakefile` itself is a clear -record of the steps required to re-execute the workflow. Ideally, the -`Snakefile` will include all the steps required to go from the raw -inputs to images and data tables used in publications, but this might -not always be possible. - -A secondary impact of using a workflow tool is that it often imposes a -directory and file structure on the project. For example, Snakemake -has an ideal suggested structure. An example folder structure when -using Snakemake would look like the following. +In general when running many phase-field jobs for a parameter study or dealing +with many pre and post-processing steps, it is wise to employ a workflow +tool. The authors are particularly familiar with Snakemake so discussion is +slanted towards this tool. One of the main benefits of using a workflow tool is +that the user is more likely to automate workflow steps that ordinarily would +not be automated with ad-hoc tools such as Bash scripts. Workflow tools enforce +a structure on and careful consideration of the inputs, outputs and overall task +graph of the workflow. As a side effect, the imposed graph structure produces a +much FAIRer research object when the research is eventually published. Future +reuse of the study is much easier when the steps in producing the final data +objects are clearly expressed. 
When using Snakemake, the `Snakefile` itself is a +clear human readable record of the steps required to re-execute the +workflow. Ideally, the `Snakefile` will fully automate all the steps required, +starting from the parameters and raw input data, to reach the final images and +data tables used in any publications. In practice this might be quite difficult +to implement due to the chaotic nature of research projects and the associated +workflows. + +A secondary impact of using a workflow tool is that it often imposes a directory +and file structure on the project. For example, Snakemake has an [ideal +suggested directory structure][snakemake-directory]. An example folder structure +when using Snakemake would look like the following. ```plain . @@ -319,22 +319,19 @@ using Snakemake would look like the following. └── Snakefile ``` -Notice that the above directory strucuture includes the `envs` -directory. This allows diffferent steps in the workflow to be run in -diffferent types of environments. The benefit of this is that the -steps can be highly hetrogeneous in terms of the required -computational enviornment. Additionally, most workflow tools will -support both HPC and local workstation execution and make porting -between systems easier. +Notice that the above directory structure includes the `envs` directory. This +allows different steps in the workflow to be run with independent computational +environments. Additionally, most workflow tools will support both HPC and local +workstation execution and make porting between systems easier. See {cite}`Moelder2021` for a more detailed overview of Snakemake and a list of other good workflow tools. -- https://snakemake.readthedocs.io/en/stable/snakefiles/deployment.html - (label-hpc-environments)= ### HPC Environments and parallel writes +Under construction + ## Data Curation Data curation involves the steps required to turn an unstructured data @@ -435,4 +432,4 @@ Dockstore and Workflowhub https://arxiv.org/pdf/2410.03490 [vtk-xml]: https://docs.vtk.org/en/latest/design_documents/VTKFileFormats.html#xml-file-formats [working-with-data]: https://aaltoscicomp.github.io/python-for-scicomp/work-with-data/#binary-file-formats [xarray-io]: https://docs.xarray.dev/en/stable/user-guide/io.html - +[snakemake-directory]: https://snakemake.readthedocs.io/en/stable/snakefiles/deployment.html From 7a836959eb6b3281e07b320064465708ae15728d Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Wed, 8 Jan 2025 17:24:44 -0500 Subject: [PATCH 18/19] clean up data curation intro in data guide --- .../bp-guide-gh/ch1-model-formulation.md | 1 + .../ch3-data-generation-and-curation.md | 83 ++++++++----------- .../bp-guide-gh/ch4-software-development.md | 1 + 3 files changed, 38 insertions(+), 47 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md b/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md index 24b56ef..5c99cef 100644 --- a/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md +++ b/pf-recommended-practices/bp-guide-gh/ch1-model-formulation.md @@ -504,4 +504,5 @@ scientific and engineering applications. 
## References ```{bibliography} +:filter: docname in docnames ``` diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index e9a8702..322d0f0 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -334,61 +334,54 @@ Under construction ## Data Curation -Data curation involves the steps required to turn an unstructured data -from a research project into a coherent research data object -satisfying the principles of FAIR data. A robust data curation process -is often a requirement for compliance for funding requirements and to -simply meet the most basic needs of transparency in scientific -research. The main benefits of data curation include (see -[DCC](https://www.dcc.ac.uk/guidance/how-guides/develop-data-plan#Why%20develop)) - -_To Do:_ Simulation FAIR data paragraph and importance of metadata - -The fundamental steps to curate a computational research project into -a research data object and publish are as follows. - -- Automate the entire computational workflow where possible during the - research process from initial inputs to final research products such - as images and data tables. -- Publish the code and workflows appropriately during development (see - the ... guide). -- Employ a suitable metadata standard where possible to describe - different aspects of the research project such as the raw data - files, derived data assets, software environments, numerical - algorithms and problems specification. -- Identify the significant raw and derived data assets that are - required to produce the final research products. -- License the research work appropriately. This may require a separate - license for the data products as they are generally not archived in +Data curation involves manipulating an assortment of unstructured data files, +scripts and metadata from a research study into a coherent research data object +that satisfies the principles of FAIR data. A robust data curation process is +often a requirement for compliance with funding bodies and to simply meet the +most basic needs of transparency in scientific research. The fundamental steps +to curate a computational research project into a research data object and +publish are as follows. + +- **Automation:** Automate the entire computational workflow where possible + during the research process from initial inputs to final research products + such as images and data tables. +- **Public Development:** Submit the code and workflows appropriately during + development. This step will not be described here, but is discussed in the + [Version control and metadata section](label-version-control-and-metadata) of + the [Software Development Guide](label-software-development). +- **Metadata Standards:** Employ a suitable metadata standard where possible to + describe different aspects of the research project such as the raw data files, + derived data assets, software environments, numerical algorithms and problem + specification. +- **Licensing:** License the research work appropriately. This may require a + separate license for the data products as they are generally not archived in the code repository. 
-- Select a data repository to curate the data -- Obtain a DOI for the data object and link with other research - products +- **Data Repositories:** Select a data repository to curate the data, submit the + data and then obtain a DOI. The above steps are difficult to implement near the conclusion of a research -project. The authors suggest implementing the majority of these steps at the -outset of the project and developing these steps as part of a protocol for all -research projects within a computational materials research group. +study. The authors suggest considering these steps at the outset and during a +study and also considering these steps as part of an overall protocol in a +computational materials research group. ### Automation Automating workflows in computational materials science is useful for many reasons, however, for data curation purposed it provides and added benefit. In -short, an outlined workflow associated with a curated FAIR object is a major -way to improve FAIR quality for subsequent researchers. For most workflow -tools, the operation script outlining the workflow graph is the ultimate form -of metadata about how the archived data files are used or generated during the -research. For example, with Snakemake, the `Snakefile` has clearly outlined -inputs and outputs as well as the procedure associated with each input / output -pair. In particular, the computational environment, command line arguments, -environment variables are recorded as well as the order of execution for each -step. +short, an outlined workflow associated with a curated FAIR object is a major way +to improve FAIR quality for subsequent researchers. For most workflow tools, the +operation script outlining the workflow graph is the ultimate form of metadata +about how the archived data files are used or generated during the research. For +example, with Snakemake, the `Snakefile` has clearly outlined inputs and outputs +as well as the procedure associated with each input / output pair. In +particular, the computational environment, command line arguments, environment +variables are recorded as well as the order of execution for each step. In recent years there have been efforts in the life sciences to provide a minimum workflow for independent code execution during the peer review process. The CODECHECK initiative {cite}`Nuest2021` tries to provide a standard -for executing workflows and a certification if the workflow satisifies basic -criteria. These types of efforts will likely be used within the compuational +for executing workflows and a certification if the workflow satisfies basic +criteria. These types of efforts will likely be used within the computational materials science community in the coming years so adopting automated workflow tools as part of your research will greatly benefit this process. @@ -396,10 +389,6 @@ tools as part of your research will greatly benefit this process. 
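To make the role of the `Snakefile` as workflow metadata concrete, a minimal
sketch of a two-rule simulate-then-plot workflow might look as follows; the rule
names, script paths, environment files and parameter sets are hypothetical and
would need to be adapted to a real study.

```
# Snakefile: minimal sketch of a simulate -> plot workflow (hypothetical paths)
configfile: "config/config.yaml"

RUNS = ["run_a", "run_b"]   # parameter sets described in the config directory

rule all:
    input:
        "results/figures/summary.png"

rule simulate:
    input:
        params="config/{run}.yaml"
    output:
        "results/raw/{run}/fields.h5"
    conda:
        "envs/simulation.yaml"   # records the computational environment
    shell:
        "python workflow/scripts/run_simulation.py {input.params} {output}"

rule plot:
    input:
        expand("results/raw/{run}/fields.h5", run=RUNS)
    output:
        "results/figures/summary.png"
    conda:
        "envs/analysis.yaml"
    script:
        "scripts/plot_summary.py"
```

Each rule records its inputs, outputs, environment and command, so a file like
this doubles as the kind of human-readable workflow metadata discussed above.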
### Metadata Standards -### Publish the codes and workflows during development - -### Identifying the significant data assets - ### Licensing ### Selecting a data repository diff --git a/pf-recommended-practices/bp-guide-gh/ch4-software-development.md b/pf-recommended-practices/bp-guide-gh/ch4-software-development.md index 514d837..86b3d5f 100644 --- a/pf-recommended-practices/bp-guide-gh/ch4-software-development.md +++ b/pf-recommended-practices/bp-guide-gh/ch4-software-development.md @@ -67,6 +67,7 @@ should contain * Theory docs * Reference documentation +(label-version-control-and-metadata)= ### Version control and metadata To facilitate contributions to a code, version control is From ffc8bb275ce93470149f3f78441fd0e7facc81bf Mon Sep 17 00:00:00 2001 From: Daniel Wheeler Date: Fri, 10 Jan 2025 10:41:13 -0500 Subject: [PATCH 19/19] clean up automation section of data generation guide --- .../ch3-data-generation-and-curation.md | 21 ++++++++++--------- 1 file changed, 11 insertions(+), 10 deletions(-) diff --git a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md index 322d0f0..3ecbbe3 100644 --- a/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md +++ b/pf-recommended-practices/bp-guide-gh/ch3-data-generation-and-curation.md @@ -291,7 +291,7 @@ and file structure on the project. For example, Snakemake has an [ideal suggested directory structure][snakemake-directory]. An example folder structure when using Snakemake would look like the following. -```plain +``` . ├── config │   └── config.yaml @@ -367,15 +367,16 @@ computational materials research group. ### Automation Automating workflows in computational materials science is useful for many -reasons, however, for data curation purposed it provides and added benefit. In -short, an outlined workflow associated with a curated FAIR object is a major way -to improve FAIR quality for subsequent researchers. For most workflow tools, the -operation script outlining the workflow graph is the ultimate form of metadata -about how the archived data files are used or generated during the research. For -example, with Snakemake, the `Snakefile` has clearly outlined inputs and outputs -as well as the procedure associated with each input / output pair. In -particular, the computational environment, command line arguments, environment -variables are recorded as well as the order of execution for each step. +reasons, however, for data curation purposes it provides and added benefit. In +short, an outlined workflow associated with a curated FAIR object is a primary +method to improve FAIR quality for subsequent researchers. For most workflow +tools, the operation script outlining the workflow graph is the ultimate form of +metadata about how the archived data files are used or generated during the +research. For example, with Snakemake, the `Snakefile` has clearly outlined, +human-readable inputs and outputs as well as the procedure associated with each +input / output pair. The computational environment, command line arguments, +environment variables are recorded for each workflow step as well as the order +of execution of each of these steps. In recent years there have been efforts in the life sciences to provide a minimum workflow for independent code execution during the peer review