Promote latest from Deploy test (#216)
* make config agree with deployed index name;
* fix schema configuration and resolve_node helper;
* linting 1st pass: black + isort
* cleaned up many unused imports with autoflake;
* WIP on linting ; prep for graphql libs update;
* WIP on graphene LIB update;
* all tests pass again
* detox (but no mypy);
* cleanup new test;
* fixed search updates; fixed enums; notes for TODOs; smoketests and unit tests all passing; latest graphene etc;
* update node packages; update GHA plugins
* added bump2version and changelog
* Bump version: 0.1.0 → 0.2.0
* detoxed
* updated changelog; removed setup.py; cleaned up stale comments;
* added the about and version resolvers
chrisbc authored Jun 17, 2024
1 parent befe6d6 commit b23894e
Showing 95 changed files with 1,722 additions and 801 deletions.
16 changes: 16 additions & 0 deletions .bumpversion.cfg
@@ -0,0 +1,16 @@
[bumpversion]
current_version = 0.2.0
commit = True
tag = False

[bumpversion:file:pyproject.toml]
search = version = "{current_version}"
replace = version = "{new_version}"

[bumpversion:file:graphql_api/__init__.py]
search = __version__ = '{current_version}'
replace = __version__ = '{new_version}'

[bumpversion:file:package.json]
search = "version": "{current_version}",
replace = "version": "{new_version}",
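Each file stanza above hands bump2version a literal search/replace template. A minimal stdlib sketch of that substitution (`bump_file` is a hypothetical helper for illustration, not bump2version's API):

```python
def bump_file(text: str, search: str, replace: str, current: str, new: str) -> str:
    """Apply one [bumpversion:file:...] stanza: render both templates
    with the concrete version strings, then do a literal replacement."""
    old_line = search.format(current_version=current)
    new_line = replace.format(new_version=new)
    if old_line not in text:
        raise ValueError(f"pattern not found: {old_line!r}")
    return text.replace(old_line, new_line)


pyproject = 'name = "nshm-toshi-api"\nversion = "0.1.0"\n'
bumped = bump_file(
    pyproject,
    search='version = "{current_version}"',
    replace='version = "{new_version}"',
    current="0.1.0",
    new="0.2.0",
)
print(bumped)  # version line now reads 0.2.0
```

The same substitution runs against `graphql_api/__init__.py` and `package.json`, which is why the quoting in each stanza must match the target file exactly.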
4 changes: 2 additions & 2 deletions .github/workflows/deploy-aws-lambda.yaml
@@ -23,7 +23,7 @@ jobs:
python-version: ['3.10']

steps:
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-version }}
@@ -36,7 +36,7 @@ jobs:
installer-parallel: true

- name: Use Node.js ${{ matrix.node-version }}
uses: actions/setup-node@v3
uses: actions/setup-node@v4
with:
node-version: ${{ matrix.node-version }}
check-latest: true
26 changes: 13 additions & 13 deletions .github/workflows/run-tests.yaml
@@ -17,7 +17,7 @@ jobs:

steps:
# Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
- uses: actions/checkout@v3
- uses: actions/checkout@v4
- uses: actions/setup-python@v4
with:
python-version: ${{ matrix.python-versions }}
@@ -48,18 +48,18 @@ jobs:
poetry install --no-interaction --no-root --with dev --all-extras
# poetry add tox-gh-actions
# - name: test with tox (uses tox-gh-actions to select correct environment)
# run:
# poetry run tox
- name: test with tox (uses tox-gh-actions to select correct environment)
run:
poetry run tox

# - name: list files
# run: ls -l .
- name: list files
run: ls -l .

# - uses: codecov/codecov-action@v3
# with:
# fail_ci_if_error: false
# files: coverage.xml
- uses: codecov/codecov-action@v3
with:
fail_ci_if_error: false
files: coverage.xml

- name: Run test suite
run: |
SLS_OFFLINE=1 TESTING=1 poetry run pytest
# - name: Run test suite
# run: |
# SLS_OFFLINE=1 TESTING=1 poetry run pytest
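The re-enabled tox step relies on tox-gh-actions to select the environment matching the workflow's matrix Python version. A typical mapping looks like this (illustrative `tox.ini` fragment; the repository's actual envlist and commands may differ):

```ini
[tox]
envlist = py310

[gh-actions]
; map the matrix python-version to a tox environment
python =
    3.10: py310

[testenv]
allowlist_externals = poetry
; the offline/testing flags from the old workflow step would need passing through
passenv = SLS_OFFLINE, TESTING
commands =
    poetry run pytest
```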
21 changes: 21 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,21 @@
# Changelog

## [0.2.0] - 2024-06-14

### Changed
- fixed `source_solution` resolvers bug #214 - breaking graphql API change
- upgraded NPM packages
- updated flask, flask-cors and graphene libraries (major version update)
- replace superseded `flask_graphql` import with `graphql_server.flask`
- elastic search index name is now `toshi_index_mapped`
- fixed index update method

### Added
- CHANGELOG.md and .bumpversion.cfg files
- added QA tools to workflow: `black`, `isort`, `tox`
- new resolvers: `object_identities` and `legacy_object_identities`
- new resolvers: `about` and `version`

### Removed
- setup.py
- many unused imports (with autoflake)
2 changes: 1 addition & 1 deletion app.py
@@ -1,2 +1,2 @@
# app.py
from api import api
from api import api # noqa: F401
2 changes: 1 addition & 1 deletion debug_api.py
@@ -2,4 +2,4 @@
from graphql_api import api

app = api.app
app.run()
app.run()
5 changes: 5 additions & 0 deletions graphql_api/__init__.py
@@ -0,0 +1,5 @@
"""Top-level package for nshm-toshi-api."""

__author__ = 'GNS Science'
__email__ = '[email protected]'
__version__ = '0.2.0'
5 changes: 4 additions & 1 deletion graphql_api/api.py
@@ -5,14 +5,17 @@
import yaml
from flask import Flask
from flask_cors import CORS
from flask_graphql import GraphQLView
from graphql_server.flask import GraphQLView

from graphql_api.config import LOGGING_CFG, TESTING
from graphql_api.dynamodb.models import migrate
from graphql_api.schema import root_schema

from .library_version_check import log_library_info

# from flask_graphql import GraphQLView


"""
Setup logging configuration
ref https://fangpenlin.com/posts/2012/08/26/good-logging-practice-in-python/
1 change: 1 addition & 0 deletions graphql_api/config.py
@@ -1,6 +1,7 @@
"""
This module exports configuration for the current system
"""

import os


1 change: 1 addition & 0 deletions graphql_api/data/__init__.py
@@ -1,6 +1,7 @@
"""
Module entry point
"""

import base64

from .base_data import BaseData, BaseDynamoDBData
54 changes: 38 additions & 16 deletions graphql_api/data/base_data.py
@@ -1,11 +1,10 @@
"""
BaseData is the base class for AWS_S3 data handlers
"""

import json
import logging
import os
import random
import traceback
from collections import namedtuple
from datetime import datetime as dt
from importlib import import_module
@@ -14,28 +14,23 @@
import backoff
import boto3
import pynamodb.exceptions
import requests.exceptions
from botocore.exceptions import ClientError
from graphene.relay import connection
from graphql_relay.node.node import from_global_id, to_global_id
from pynamodb.connection.base import Connection
from pynamodb.exceptions import DoesNotExist, PutError, TransactWriteError, VerboseClientError
from pynamodb.exceptions import DoesNotExist
from pynamodb.transactions import TransactWrite

import graphql_api.dynamodb
from graphql_api.cloudwatch import ServerlessMetricWriter
from graphql_api.config import (
CW_METRICS_RESOLUTION,
DB_ENDPOINT,
DEPLOYMENT_STAGE,
FIRST_DYNAMO_ID,
IS_OFFLINE,
REGION,
S3_BUCKET_NAME,
STACK_NAME,
TESTING,
)
from graphql_api.dynamodb.models import ToshiFileObject, ToshiIdentity, ToshiThingObject
from graphql_api.dynamodb.models import ToshiIdentity

db_metrics = ServerlessMetricWriter(
lambda_name=STACK_NAME, metric_name="MethodDuration", resolution=CW_METRICS_RESOLUTION
@@ -111,7 +105,7 @@ def get_all(self, clazz_name=None):
prefix, result_id, _ = obj_summary.key.split('/')
assert prefix == self._prefix
object = self.get_one(result_id)
if clazz == None or isinstance(object, clazz):
if clazz is None or isinstance(object, clazz):
results.append(object)
db_metrics.put_duration(__name__, 'get_all', dt.utcnow() - t0)
return results
@@ -120,7 +114,8 @@ def get_all_s3_paginated(self, limit, after):
"""legacy iterator"""
count, seen = 0, 0
after = after or ""
# TODO refine this, see https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3/bucket/objects.html#filter
# TODO refine this, see
# https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3/bucket/objects.html#filter
# need to handle multiple versions
# should use:
# - Marker to define start of itertion
@@ -248,7 +243,7 @@ def get_next_id(self) -> str:
t0 = dt.utcnow()
try:
identity = ToshiIdentity.get(self._prefix)
except DoesNotExist as e:
except DoesNotExist:
# very first use of the identity
logger.debug(f'get_next_id setting initial ID; table_name={self._prefix}, object_id={FIRST_DYNAMO_ID}')
identity = ToshiIdentity(table_name=self._prefix, object_id=FIRST_DYNAMO_ID)
@@ -273,7 +268,7 @@ def _read_object(self, object_id):
obj = self.get_object(object_id)
db_metrics.put_duration(__name__, '_read_object', dt.utcnow() - t0)
return obj.object_content
except:
except Exception:
obj = self._from_s3(object_id)
db_metrics.put_duration(__name__, '_read_object', dt.utcnow() - t0)
return obj
@@ -298,11 +293,17 @@ def _write_object(self, object_id, object_type, body):

toshi_object = self._model(object_id=str(object_id), object_type=body['clazz_name'], object_content=body)

# Note that we won't see any json serialisation errors here, body serialise is called
logger.debug(f"toshi_object: {toshi_object}")

# print(dir(toshi_object))
# print(toshi_object.to_json())

with TransactWrite(connection=self._connection) as transaction:
transaction.update(identity, actions=[ToshiIdentity.object_id.add(1)])
transaction.save(toshi_object)

logger.debug(f"toshi_object: object_id {object_id} object_type: {body['clazz_name']}")
logger.info(f"toshi_object: object_id {object_id} object_type: {body['clazz_name']}")

db_metrics.put_duration(__name__, '_write_object', dt.utcnow() - t0)
es_key = f"{self._prefix}_{object_id}"
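The `_write_object` hunk pairs the identity-counter increment with the object save inside one DynamoDB transaction, so an ID is never consumed without its object being written. An in-memory stand-in for that all-or-nothing pattern (stdlib only; `FakeTransaction` is a hypothetical sketch, not pynamodb's `TransactWrite` API):

```python
class FakeTransaction:
    """All-or-nothing write batch, mimicking the shape of a DynamoDB transaction."""

    def __init__(self, store):
        self.store = store
        self.pending = []

    def __enter__(self):
        return self

    def stage(self, key, value):
        # buffer the write; nothing touches the store yet
        self.pending.append((key, value))

    def __exit__(self, exc_type, exc, tb):
        if exc_type is None:  # commit only if the block succeeded
            for key, value in self.pending:
                self.store[key] = value
        return False  # never swallow exceptions


store = {"identity:File": 100000}

# happy path: counter increment and object save land together
with FakeTransaction(store) as txn:
    next_id = store["identity:File"] + 1
    txn.stage("identity:File", next_id)
    txn.stage(f"File_{next_id}", {"object_type": "File"})

# failure path: neither write is applied, so no ID is leaked
try:
    with FakeTransaction(store) as txn:
        txn.stage("identity:File", store["identity:File"] + 1)
        raise RuntimeError("simulated save failure")
except RuntimeError:
    pass

print(store["identity:File"])  # → 100001 (the failed increment rolled back)
```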
@@ -337,9 +338,15 @@ def create(self, clazz_name, **kwargs):
Raises:
ValueError: invalid data exception
"""
logger.info(f"create() {clazz_name} {kwargs}")

clazz = getattr(import_module('graphql_api.schema'), clazz_name)
next_id = self.get_next_id()

# TODO: this whole approach sucks !@#%$#
# consider the ENUM problem, and datetime serialisation
# maybe graphene o
# can't we just use the graphene classes' json serialisation ??
def new_body(next_id, kwargs):
new = clazz(next_id, **kwargs)
body = new.__dict__.copy()
@@ -348,8 +355,23 @@ def new_body(next_id, kwargs):
body['created'] = body['created'].isoformat()
return body

self._write_object(next_id, self._prefix, new_body(next_id, kwargs))
return clazz(next_id, **kwargs)
object_instance = clazz(next_id, **kwargs)

# print(object_instance.__class__)
# print(type(object_instance))
# print(dir(object_instance))
# print(f" TODICT: {graphql.utilities.ast_to_dict(object_instance)}")
# # print(f" PRINT_TYPE: {graphql.utilities.print_type(object_instance._type.value_from_ast)}")
# # print( graphql.utilities.value_from_ast_untyped(object_instance.created) )

try:
self._write_object(next_id, self._prefix, new_body(next_id, kwargs))
except Exception as err:
logger.error(f"failed to write {clazz_name} {kwargs} {err}")
raise

logger.info(f"create() object_instance: {object_instance}")
return object_instance

def get_all(self, object_type, limit: int, after: str):
t0 = dt.utcnow()
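The `new_body` helper above hand-converts `created` to ISO format before the dict is stored, and the TODO in `create()` asks whether serialisation could be generalised. One stdlib answer (a sketch of the idea only, not what this codebase does) is a `json.JSONEncoder` subclass that handles the two awkward types, `datetime` and `Enum`:

```python
import json
from datetime import datetime, timezone
from enum import Enum


class SketchEncoder(json.JSONEncoder):
    """Serialise the two types that trip up hand-rolled dict copying."""

    def default(self, o):
        if isinstance(o, datetime):
            return o.isoformat()  # same conversion new_body() does by hand
        if isinstance(o, Enum):
            return o.name
        return super().default(o)


class Status(Enum):
    DONE = 1


body = {
    "id": "File_100001",
    "created": datetime(2024, 6, 14, tzinfo=timezone.utc),
    "status": Status.DONE,
}
print(json.dumps(body, cls=SketchEncoder))
```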
1 change: 0 additions & 1 deletion graphql_api/data/data_manager.py
@@ -14,7 +14,6 @@ def get_data_manager():


class DataManager:

"""DataManager provides the entry point to the data handlers"""

def __init__(self, search_manager, client_args=None):
23 changes: 10 additions & 13 deletions graphql_api/data/file_data.py
@@ -1,25 +1,19 @@
"""
The object manager for File (and subclassed) schema objects
"""

import json
import logging
import re
from datetime import datetime as dt
from importlib import import_module

from boto3.resources.model import Identifier
from graphene.relay import connection
from pynamodb.exceptions import DoesNotExist
from pynamodb.transactions import Connection, TransactGet, TransactWrite

from graphql_api.dynamodb.models import ToshiFileObject, ToshiIdentity, ToshiThingObject
from graphql_api.cloudwatch import ServerlessMetricWriter
from graphql_api.config import CW_METRICS_RESOLUTION, STACK_NAME

from .base_data import BaseDynamoDBData, append_uniq
from .base_data import BaseDynamoDBData

logger = logging.getLogger(__name__)

from graphql_api.cloudwatch import ServerlessMetricWriter
from graphql_api.config import CW_METRICS_RESOLUTION, STACK_NAME

db_metrics = ServerlessMetricWriter(
lambda_name=STACK_NAME, metric_name="MethodDuration", resolution=CW_METRICS_RESOLUTION
@@ -52,11 +46,12 @@ def create(self, clazz_name, **kwargs):
Returns:
File: the File object
"""
logger.info(f"FileData.create {kwargs}")
new_instance = super().create(clazz_name, **kwargs)
data_key = "%s/%s/%s" % (self._prefix, new_instance.id, new_instance.file_name)

t0 = dt.utcnow()
response2 = self.s3_bucket.put_object(Key=data_key, Body="placeholder_to_be_overwritten")
self.s3_bucket.put_object(Key=data_key, Body="placeholder_to_be_overwritten")
parts = self.s3_client.generate_presigned_post(
Bucket=self._bucket_name,
Key=data_key,
@@ -131,12 +126,15 @@ def from_json(jsondata):
clazz_name = jsondata.pop('clazz_name')
clazz = getattr(import_module('graphql_api.schema'), clazz_name)

# Rule based migration
# Rule based migration,
# this deals with graphql_api/tests/legacy/test_inversion_solution_file_migration_bug.py
if clazz_name == "File" and (jsondata.get('tables') or jsondata.get('metrics')):
# this is actually an InversionSolution
logger.info("from_json migration to InversionSolution of: %s" % str(jsondata))
clazz = getattr(import_module('graphql_api.schema'), 'InversionSolution')

logger.debug("from_json clazz: %s" % str(clazz))

## produced_by_id -> produced_by schema migration
produced_by_id = jsondata.pop('produced_by_id', None)
if produced_by_id and not jsondata.get('produced_by'):
@@ -147,5 +145,4 @@ def new_body(next_id, kwargs):
for tbl in jsondata.get('tables'):
tbl['created'] = dt.fromisoformat(tbl['created'])

# print('updated json', jsondata)
return clazz(**jsondata)
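The migration rule above reclassifies a stored `File` record as an `InversionSolution` when it carries `tables` or `metrics`. A self-contained sketch of that dispatch (simplified stand-in classes, not the real schema module):

```python
from dataclasses import dataclass, field


@dataclass
class File:
    id: str


@dataclass
class InversionSolution:
    id: str
    tables: list = field(default_factory=list)


CLASSES = {"File": File, "InversionSolution": InversionSolution}


def from_json(jsondata: dict):
    clazz_name = jsondata.pop("clazz_name")
    # rule-based migration: a "File" record with tables is really an InversionSolution
    if clazz_name == "File" and jsondata.get("tables"):
        clazz_name = "InversionSolution"
    return CLASSES[clazz_name](**jsondata)


obj = from_json({"clazz_name": "File", "id": "F1", "tables": [{"k": 1}]})
print(type(obj).__name__)  # → InversionSolution
```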