Sync container branch with main #724

Merged · 31 commits · Dec 19, 2024
Commits
9d4aab8
Container (#678)
aradhakrishnanGFDL Sep 16, 2024
ed422f6
Container Documentation (#687)
jtmims Sep 18, 2024
f3cd54b
Fix ci bugs (#688)
wrongkindofdoctor Sep 18, 2024
25aee92
Remove timeout lines and comment unused test tarballs in mdtf_tests.yml
wrongkindofdoctor Sep 19, 2024
9d054c5
infer 'start_time' and 'end_time' from 'time_range' due to type issue…
jtmims Sep 23, 2024
795274f
move line setting date_range in query_catalog() (#693)
jtmims Sep 26, 2024
f7ec46f
Remove modifier entry from areacello in trop_pac_sea_lev POD settings…
wrongkindofdoctor Sep 27, 2024
e1a590e
Fix issues in pp query (#692)
wrongkindofdoctor Sep 30, 2024
9024e81
add escape brackets to command-line commands (#694)
jtmims Oct 1, 2024
99862ca
Fix convective_transition_diag POD (#695)
wrongkindofdoctor Oct 2, 2024
d6a3b23
add ua200-850 and va200-850 to gfld-cmor-tables (#696)
jtmims Oct 22, 2024
25671db
add ice/ocean precip entries to GFDL fieldlist (#697)
wrongkindofdoctor Oct 23, 2024
a383592
Add alternate standard names entry to fieldlists and varlistEntry obj…
wrongkindofdoctor Oct 25, 2024
a278511
add function check_multichunk to fix issue with chunk_freqs (#701)
jtmims Oct 29, 2024
f5d4807
add plots link to pod_error_snippet.html (#705)
jtmims Oct 31, 2024
fb9c0b2
add variable table tool and put output into docs (#706)
jtmims Nov 1, 2024
2bb9ec9
rework ref_vartable.rst to link directly to html file of the table (#…
jtmims Nov 4, 2024
6b2408e
remove example_pp_script.py from user_pp_scripts list in multirun_con…
wrongkindofdoctor Nov 5, 2024
eb4ed79
remove .nc files found in OUTPUT_DIR depending on config file (#710)
jtmims Nov 18, 2024
92a7ae9
fix formatting issues in output reference documentation (#711)
jtmims Nov 18, 2024
6dd87dc
fix forcing_feedback settings.jsonc formatting and remove extra freq …
wrongkindofdoctor Nov 19, 2024
38dcb7c
Add check for user_pp_scripts attribute in config object to DaskMulti…
wrongkindofdoctor Nov 19, 2024
1ca02aa
add check for user_pp-scripts attr to execute_pp_functions
wrongkindofdoctor Nov 19, 2024
6deaa75
update 'standard_name' for each var in write_pp_catalog (#713)
jtmims Nov 20, 2024
e808181
Update docs about --env_dir flag (#715)
jtmims Nov 22, 2024
b8e1d69
fix logic when defining log messages in pod_setup
wrongkindofdoctor Nov 24, 2024
6affc67
Fix dummy translation method in NoTranslationFieldlist (#717)
wrongkindofdoctor Dec 2, 2024
5b0d16c
Reimplement crop date range capability (#718)
wrongkindofdoctor Dec 8, 2024
48a016a
Create drop attributes func (#720)
wrongkindofdoctor Dec 12, 2024
3d2bc45
Update mdtf dev env file (#722)
wrongkindofdoctor Dec 17, 2024
abc89d6
Fix various pp issues related to running seaice_suite (#721)
jtmims Dec 18, 2024
61 changes: 61 additions & 0 deletions .github/workflows/docker-build-and-push.yml
@@ -0,0 +1,61 @@
#
name: Create and publish a Docker image

# Configures this workflow to run every time a change is pushed to the branch called `container`.
on:
  push:
    # on pull_request from container for testing
    branches: ['container']

# Defines two custom environment variables for the workflow. These are used for the Container registry domain, and a name for the Docker image that this workflow builds.
env:
  REGISTRY: ghcr.io
  IMAGE_NAME: ${{ github.repository }}

# There is a single job in this workflow. It's configured to run on the latest available version of Ubuntu.
jobs:
  build-and-push-image:
    runs-on: ubuntu-latest
    # Sets the permissions granted to the `GITHUB_TOKEN` for the actions in this job.
    permissions:
      contents: read
      packages: write
      attestations: write
      id-token: write
    #
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4
      # Uses the `docker/login-action` action to log in to the Container registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
      - name: Log in to the Container registry
        uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      # This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
      - name: Extract metadata (tags, labels) for Docker
        id: meta
        uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
      # This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
      # It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see "[Usage](https://github.com/docker/build-push-action#usage)" in the README of the `docker/build-push-action` repository.
      # It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
      - name: Build and push Docker image
        id: push
        uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4
        with:
          context: .
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}

      # This step generates an artifact attestation for the image, which is an unforgeable statement about where and how it was built. It increases supply chain security for people who consume the image. For more information, see "[Using artifact attestations to establish provenance for builds](https://docs.github.com/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds)."
      - name: Generate artifact attestation
        uses: actions/attest-build-provenance@v1
        with:
          subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          subject-digest: ${{ steps.push.outputs.digest }}
          push-to-registry: true
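
For anyone consuming the package this workflow publishes, a pull along the following lines should work after the workflow has run on the `container` branch. The image path below is an assumption: `${{ github.repository }}` resolves to the owner/repo of whichever fork runs the workflow (lowercased by ghcr.io), and docker/metadata-action tags branch builds with the branch name by default.

    # hypothetical image path; substitute the fork that actually ran the workflow
    docker pull ghcr.io/noaa-gfdl/mdtf-diagnostics:container
    # open an interactive shell in the published image
    docker run -it --rm ghcr.io/noaa-gfdl/mdtf-diagnostics:container /bin/bash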

110 changes: 60 additions & 50 deletions .github/workflows/mdtf_tests.yml
@@ -19,30 +19,29 @@ jobs:
     strategy:
       matrix:
         os: [ubuntu-latest, macos-13]
-        json-file: ["tests/github_actions_test_ubuntu_set1.jsonc","tests/github_actions_test_macos_set1.jsonc"]
-        json-file-set2: ["tests/github_actions_test_ubuntu_set2.jsonc", "tests/github_actions_test_macos_set2.jsonc"]
-        json-file-set3: ["tests/github_actions_test_ubuntu_set3.jsonc", "tests/github_actions_test_macos_set3.jsonc"]
+        json-file-1a: ["tests/github_actions_test_ubuntu_1a.jsonc","tests/github_actions_test_macos_1a.jsonc"]
+        json-file-1b: ["tests/github_actions_test_ubuntu_1b.jsonc","tests/github_actions_test_macos_1b.jsonc"]
+        json-file-2: ["tests/github_actions_test_ubuntu_2.jsonc", "tests/github_actions_test_macos_2.jsonc"]
+        json-file-3: ["tests/github_actions_test_ubuntu_3.jsonc", "tests/github_actions_test_macos_3.jsonc"]
         # if experimental is true, other jobs to run if one fails
         experimental: [false]
         exclude:
           - os: ubuntu-latest
-            json-file: "tests/github_actions_test_macos_set1.jsonc"
+            json-file-1a: "tests/github_actions_test_macos_1a.jsonc"
           - os: ubuntu-latest
-            json-file-set2: "tests/github_actions_test_macos_set2.jsonc"
+            json-file-1b: "tests/github_actions_test_macos_1b.jsonc"
           - os: ubuntu-latest
-            json-file-set3: "tests/github_actions_test_macos_set3.jsonc"
-          - os: macos-12
-            json-file: "tests/github_actions_test_ubuntu_set1.jsonc"
-          - os: macos-12
-            json-file-set2: "tests/github_actions_test_ubuntu_set2.jsonc"
-          - os: macos-12
-            json-file-set3: "tests/github_actions_test_ubuntu_set3.jsonc"
+            json-file-2: "tests/github_actions_test_macos_2.jsonc"
+          - os: ubuntu-latest
+            json-file-3: "tests/github_actions_test_macos_3.jsonc"
           - os: macos-13
-            json-file: "tests/github_actions_test_ubuntu_set1.jsonc"
+            json-file-1a: "tests/github_actions_test_ubuntu_1a.jsonc"
           - os: macos-13
-            json-file-set2: "tests/github_actions_test_ubuntu_set2.jsonc"
+            json-file-1b: "tests/github_actions_test_ubuntu_1b.jsonc"
           - os: macos-13
-            json-file-set3: "tests/github_actions_test_ubuntu_set3.jsonc"
+            json-file-2: "tests/github_actions_test_ubuntu_2.jsonc"
+          - os: macos-13
+            json-file-3: "tests/github_actions_test_ubuntu_3.jsonc"
         max-parallel: 3
     steps:
       - uses: actions/checkout@v3
@@ -62,19 +61,13 @@ jobs:
           condarc: |
            channels:
              - conda-forge
-
-      - name: Install XQuartz if macOS
-        if: ${{ matrix.os == 'macos-12' || matrix.os == 'macos-13'}}
+      - name: Set conda environment variables for macOS
+        if: ${{ matrix.os == 'macos-13' }}
         run: |
-          echo "Installing XQuartz"
-          brew install --cask xquartz
           echo "CONDA_ROOT=$(echo /Users/runner/micromamba)" >> $GITHUB_ENV
           echo "MICROMAMBA_EXE=$(echo /Users/runner/micromamba-bin/micromamba)" >> $GITHUB_ENV
           echo "CONDA_ENV_DIR=$(echo /Users/runner/micromamba/envs)" >> $GITHUB_ENV
-      - name: Set environment variables
-        run: |
-          echo "POD_OUTPUT=$(echo $PWD/../wkdir)" >> $GITHUB_ENV
-      - name: Set conda vars
+      - name: Set conda environment variables for ubuntu
         if: ${{ matrix.os == 'ubuntu-latest' }}
         run: |
           echo "MICROMAMBA_EXE=$(echo /home/runner/micromamba-bin/micromamba)" >> $GITHUB_ENV
@@ -84,7 +77,7 @@ jobs:
         run: |
           echo "Installing Conda Environments"
           echo "conda root ${CONDA_ROOT}"
-         echo "env dir ${CONDA_ENV_DIR}"
+          echo "env dir ${CONDA_ENV_DIR}"
           # MDTF-specific setup: install all conda envs
           ./src/conda/micromamba_env_setup.sh --all --micromamba_root ${CONDA_ROOT} --micromamba_exe ${MICROMAMBA_EXE} --env_dir ${CONDA_ENV_DIR}
           echo "Creating the _MDTF_synthetic_data environment"
@@ -104,7 +97,7 @@ jobs:
          mkdir wkdir
          ## make input data directories
          mkdir -p inputdata/obs_data
-      - name: Get Observational Data for Set 1
+      - name: Get Observational Data for Set 1a
        run: |
          echo "${PWD}"
          cd ../
@@ -113,39 +106,56 @@ jobs:
          # attempt FTP data fetch
          # allow 20 min for transfer before timeout; Github actions allows 6 hours for individual
          # jobs, but we don't want to max out resources that are shared by the NOAA-GFDL repos.
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/convective_transition_diag_obs_data.tar --output convective_transition_diag_obs_data.tar
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/EOF_500hPa_obs_data.tar --output EOF_500hPa_obs_data.tar
+          # curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/EOF_500hPa_obs_data.tar --output EOF_500hPa_obs_data.tar
          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/Wheeler_Kiladis_obs_data.tar --output Wheeler_Kiladis_obs_data.tar
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/MJO_teleconnection_obs_data.tar --output MJO_teleconnection_obs_data.tar
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/MJO_suite_obs_data.tar --output MJO_suite_obs_data.tar
          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/precip_diurnal_cycle_obs_data.tar --output precip_diurnal_cycle_obs_data.tar
-          echo "Untarring set 1 NCAR/CESM standard test files"
-          tar -xvf convective_transition_diag_obs_data.tar
-          tar -xvf EOF_500hPa_obs_data.tar
+          echo "Untarring set 1a NCAR/CESM standard test files"
+          # tar -xvf EOF_500hPa_obs_data.tar
          tar -xvf precip_diurnal_cycle_obs_data.tar
-          tar -xvf MJO_teleconnection_obs_data.tar
-          tar -xvf MJO_suite_obs_data.tar
          tar -xvf Wheeler_Kiladis_obs_data.tar
          # clean up tarballs
          rm -f *.tar
-      - name: Run diagnostic tests set 1
+      - name: Run diagnostic tests set 1a
        run: |
-          echo "POD_OUTPUT is: "
+          echo "POD_OUTPUT=$(echo $PWD/../wkdir)" >> $GITHUB_ENV
+          echo "POD_OUTPUT is "
          echo "${POD_OUTPUT}"
          micromamba activate _MDTF_base
          # trivial check that install script worked
          ./mdtf_framework.py --help
          # run the test PODs
-          ./mdtf -f ${{matrix.json-file}}
+          ./mdtf -f ${{matrix.json-file-1a}}
          # Debug POD log(s)
          # cat ${POD_OUTPUT}/MDTF_NCAR.Synthetic_1975_1981/Wheeler_Kiladis/Wheeler_Kiladis.log
+      - name: Get observational data for set 1b
+        run: |
+          # clean up data from previous runs
+          echo "deleting data from set 1a"
+          cd ../wkdir
+          rm -rf *
+          cd ../inputdata/obs_data
+          rm -rf *
+          cd ../../
+          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/convective_transition_diag_obs_data.tar --output convective_transition_diag_obs_data.tar
+          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/MJO_teleconnection_obs_data.tar --output MJO_teleconnection_obs_data.tar
+          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/MJO_suite_obs_data.tar --output MJO_suite_obs_data.tar
+          tar -xvf MJO_teleconnection_obs_data.tar
+          tar -xvf MJO_suite_obs_data.tar
+          tar -xvf convective_transition_diag_obs_data.tar
+          # clean up tarballs
+          rm -f *.tar
+      - name: Run diagnostic tests set 1b
+        run: |
+          ./mdtf -f ${{matrix.json-file-1b}}
       - name: Get observational data for set 2
        run: |
          echo "${PWD}"
          # remove data from previous run
          # Actions moves you to the root repo directory in every step, so need to cd again
+          echo "deleting data from set 1b"
          cd ../wkdir
          rm -rf *
          cd ../inputdata/obs_data
-          echo "deleting obs data from set 1"
          rm -rf *
          cd ../../
          echo "Available Space"
@@ -160,18 +170,18 @@ jobs:
          rm -f *.tar
       - name: Run diagnostic tests set 2
        run: |
          micromamba activate _MDTF_base
          # run the test PODs
-          ./mdtf -f ${{matrix.json-file-set2}}
+          ./mdtf -f ${{matrix.json-file-2}}
          # Uncomment the following line for debugging
          #cat ../wkdir/MDTF_GFDL.Synthetic_1_10/MJO_prop_amp/MJO_prop_amp.log
       - name: Get observational data for set 3
        run: |
          echo "${PWD}"
          # remove data from previous run
          # Actions moves you to the root repo directory in every step, so need to cd again
+          echo "deleting data from set 2"
          cd ../wkdir
          rm -rf *
          cd ../inputdata/obs_data
-          echo "deleting obs data from set 2"
          rm -rf *
          cd ../../
          echo "Available Space"
@@ -181,27 +191,27 @@ jobs:
          # jobs, but we don't want to max out resources that are shared by the NOAA-GFDL repos.
          #curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/temp_extremes_distshape_obs_data.tar --output temp_extremes_distshape_obs_data.tar
          #curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/tropical_pacific_sea_level_obs_data.tar.gz --output tropical_pacific_sea_level_obs_data.tar.gz
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/mixed_layer_depth_obs_data.tar --output mixed_layer_depth_obs_data.tar
+          #curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/mixed_layer_depth_obs_data.tar --output mixed_layer_depth_obs_data.tar
          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/ocn_surf_flux_diag_obs_data.tar --output ocn_surf_flux_diag_obs_data.tar
          # curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/albedofb_obs_data.tar --output albedofb_obs_data.tar
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/seaice_suite_obs_data.tar --output seaice_suite_obs_data.tar
-          curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/stc_eddy_heat_fluxes_obs_data.tar --output stc_eddy_heat_fluxes_obs_data.tar
+          #curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/seaice_suite_obs_data.tar --output seaice_suite_obs_data.tar
+          #curl --verbose --ipv4 --connect-timeout 8 --max-time 1200 --retry 128 --ftp-ssl --ftp-pasv -u "anonymous:anonymous" ftp://ftp.gfdl.noaa.gov/perm/oar.gfdl.mdtf/stc_eddy_heat_fluxes_obs_data.tar --output stc_eddy_heat_fluxes_obs_data.tar
          echo "Untarring set 3 CMIP standard test files"
          #tar -xvf temp_extremes_distshape_obs_data.tar
          #tar -zxvf tropical_pacific_sea_level_obs_data.tar.gz
-          tar -xvf mixed_layer_depth_obs_data.tar
+          #tar -xvf mixed_layer_depth_obs_data.tar
          tar -xvf ocn_surf_flux_diag_obs_data.tar
          # tar -xvf albedofb_obs_data.tar
-          tar -xvf seaice_suite_obs_data.tar
-          tar -xvf stc_eddy_heat_fluxes_obs_data.tar
+          # tar -xvf seaice_suite_obs_data.tar
+          # tar -xvf stc_eddy_heat_fluxes_obs_data.tar
          # clean up tarballs
          rm -f *.tar
          rm -f *.tar.gz
       - name: Run CMIP diagnostic tests set 3
        run: |
          micromamba activate _MDTF_base
          # run the test PODs
-          ./mdtf -f ${{matrix.json-file-set3}}
+          ./mdtf -f ${{matrix.json-file-3}}
      #- name: Run unit tests
      #  run: |
      #  micromamba activate _MDTF_base
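
To reproduce one of these CI sets locally, something like the following should work from a repository checkout with the conda/micromamba environments already installed; the jsonc file name comes straight from the matrix above, and the environment name from the workflow steps.

    # activate the framework's base environment, as the CI steps do
    micromamba activate _MDTF_base
    # run one test set (any of the matrix jsonc files can be substituted)
    ./mdtf -f tests/github_actions_test_ubuntu_1a.jsonc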
35 changes: 35 additions & 0 deletions Dockerfile
@@ -0,0 +1,35 @@
FROM mambaorg/micromamba:1.5.8 as micromamba

USER root
# Container Metadata
LABEL maintainer="mdtf-framework-team"
LABEL org.opencontainers.image.source=https://github.com/aradhakrishnanGFDL/MDTF-diagnostics/
LABEL org.opencontainers.image.description="This is a docker image for the MDTF-diagnostics package"
LABEL version="20140100.beta"

# Copy the MDTF-diagnostics package contents from local machine to image (or from git)
ENV CODE_ROOT=/proj/MDTF-diagnostics

COPY src ${CODE_ROOT}/src

COPY data ${CODE_ROOT}/data
COPY diagnostics ${CODE_ROOT}/diagnostics
COPY mdtf_framework.py ${CODE_ROOT}
COPY shared ${CODE_ROOT}/shared
COPY tests ${CODE_ROOT}/tests

# Install conda environments
ENV CONDA_ROOT=/opt/conda/
ENV CONDA_ENV_DIR=/opt/conda/envs
RUN apt-get -y update
#dev purpose only - install vim
RUN apt-get -y install vim
RUN apt-get -y install git

RUN micromamba create -f /proj/MDTF-diagnostics/src/conda/env_base.yml && \
micromamba create -f /proj/MDTF-diagnostics/src/conda/env_python3_base.yml && \
micromamba create -f /proj/MDTF-diagnostics/src/conda/_env_synthetic_data.yml && \
micromamba clean --all --yes && \
micromamba clean --force-pkgs-dirs --yes

ENV PATH="${PATH}:/proj/MDTF-diagnostics/"
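
As a rough sketch of how this image might be built and used locally (the tag, bind mounts, and shell entry below are illustrative assumptions, not part of the Dockerfile):

    # build from the repository root, where the Dockerfile lives
    docker build -t mdtf-diagnostics .
    # run interactively; the inputdata/wkdir mount points are hypothetical
    docker run -it --rm \
      -v $HOME/mdtf/inputdata:/proj/inputdata \
      -v $HOME/mdtf/wkdir:/proj/wkdir \
      mdtf-diagnostics /bin/bash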
4 changes: 1 addition & 3 deletions README.md
@@ -107,9 +107,7 @@ for, the Windows Subsystem for Linux.
when micromamba is installed
- `$MICROMAMBA_EXE` is full path to the micromamba executable on your system
(e.g., /home/${USER}/.local/bin/micromamba). This is defined by the `MAMBA_EXE` environment variable on your system
-- The `--env_dir` flag allows you to put the program files in a designated location `$CONDA_ENV_DIR`
-  (for space reasons, or if you don’t have write access).
-  You can omit this flag, and the environments will be installed within `$CONDA_ROOT/envs/` by default.
+- All flags noted for your system above must be supplied for the script to work.

#### NOTE: The micromamba environments may differ from the conda environments because of package compatibility discrepancies between solvers
`% ./src/conda/micromamba_env_setup.sh --all --micromamba_root $MICROMAMBA_ROOT --micromamba_exe $MICROMAMBA_EXE --env_dir $CONDA_ENV_DIR` builds
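
For reference, a full invocation might look like the following; the exported paths are illustrative assumptions and should be replaced with the locations on your system.

    export MICROMAMBA_ROOT=$HOME/micromamba            # assumed micromamba install prefix
    export MICROMAMBA_EXE=$HOME/.local/bin/micromamba  # value of $MAMBA_EXE on your system
    export CONDA_ENV_DIR=$HOME/micromamba/envs         # where the MDTF environments will be created
    ./src/conda/micromamba_env_setup.sh --all --micromamba_root $MICROMAMBA_ROOT --micromamba_exe $MICROMAMBA_EXE --env_dir $CONDA_ENV_DIR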
8 changes: 8 additions & 0 deletions data/fieldlist_CMIP.jsonc
@@ -181,6 +181,14 @@
"standard_name": "precipitation_flux",
"realm": "atmos",
"units": "kg m-2 s-1",
"alternate_standard_names": ["rainfall_flux"],
"ndim": 3
},
"rainfall_flux": {
"standard_name": "rainfall_flux",
"realm": "seaIce",
"units": "kg m-2 s-1",
"alternate_standard_names": ["precipitation_flux"],
"ndim": 3
},
"prc": {