The STAC item creation scripts have to get the S3 bucket location for inclusion in the URL for each STAC item. It's slightly annoying to have to configure AWS credentials (e.g. when running the creation and ingest scripts on an EC2 instance) just for this operation. We should be able to send an unsigned request since the dataset buckets allow public access to the GetBucketLocation operation, but there seems to be a boto3 bug preventing unsigned requests for this operation: boto/boto3#3522
We could wait until the bug is fixed and then implement unsigned requests, or we could just hard-code the bucket location, though that risks the location changing at some point and invalidating our STAC item URLs.
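For reference, a minimal sketch of the unsigned lookup we'd like to make once the boto3 bug is fixed. The bucket name, function names, and the `us-east-1` fallback for a null `LocationConstraint` are illustrative assumptions, not part of the existing scripts:

```python
def bucket_region(bucket: str) -> str:
    """Look up a bucket's region with an unsigned GetBucketLocation request.

    Hypothetical sketch: currently blocked by boto/boto3#3522, which
    prevents unsigned requests for this operation.
    """
    import boto3
    from botocore import UNSIGNED
    from botocore.config import Config

    # No credentials needed: the dataset buckets allow public
    # access to GetBucketLocation.
    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    constraint = s3.get_bucket_location(Bucket=bucket)["LocationConstraint"]
    return normalize_location(constraint)


def normalize_location(constraint):
    # S3 reports us-east-1 buckets with a null LocationConstraint,
    # so fall back to the region name explicitly.
    return constraint or "us-east-1"
```

If we hard-code instead, keeping the fallback logic in one helper like `normalize_location` would make the eventual switch to unsigned requests a one-line change.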