Using TinyDB JSON with S3

Thursday, October 28, 2021

We recently wanted to use S3 as a simple database backed by a single JSON object. Using TinyDB and JsonObject together, I was able to make this work: I wrote a custom subclass of TinyDB's Storage class that uses boto3 to read and write a JSON object in S3, giving us a pseudo-ORM database.

import json

import boto3
from tinydb import TinyDB
from tinydb.storages import Storage


class S3Storage(Storage):
    def __init__(self, bucket, file):
        self.bucket = bucket
        self.file = file
        self.client = boto3.resource('s3')

    def read(self):
        # Pull the JSON object from S3 and hand it back to TinyDB as a dict
        obj = self.client.Object(self.bucket, self.file)
        data = obj.get()
        return json.loads(data['Body'].read())

    def write(self, data):
        # Serialize the whole database and overwrite the object in S3
        self.client.Object(self.bucket, self.file).put(Body=json.dumps(data))

    def close(self):
        pass

Instantiate TinyDB with the custom storage class; any extra keyword arguments are passed through to the storage:

db = TinyDB(bucket='json-data-test', file='data.json', storage=S3Storage)
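From here it behaves like any other TinyDB instance. A minimal sketch, assuming the bucket and object above already exist; the field values are illustrative:

from tinydb import Query

# Insert a plain dict and query it back; every operation round-trips through S3
db.insert({'build_name': 'nightly', 'account': 'dev', 'completed': False})

Build = Query()
print(db.search(Build.build_name == 'nightly'))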

Define the objects using JsonObject, which makes working with the records feel ORM-like:

from jsonobject import (JsonObject, StringProperty, BooleanProperty,
                        DateTimeProperty, IntegerProperty)


class Builds(JsonObject):
    build_name = StringProperty(required=True)
    account = StringProperty(required=True)
    k8s_version = StringProperty()
    completed = BooleanProperty(default=False)
    start_time = DateTimeProperty(required=True)
    end_time = DateTimeProperty()
    total_time = IntegerProperty()
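Gluing the two together looks roughly like this. It is a sketch that assumes jsonobject's to_json()/wrap() round-trip; the field values are made up:

from datetime import datetime

# Convert the JsonObject to a plain dict before storing it in TinyDB
build = Builds(build_name='cluster-a', account='dev',
               start_time=datetime.utcnow().replace(microsecond=0))
db.insert(build.to_json())

# Wrap documents coming back out of TinyDB to get typed objects again
records = [Builds.wrap(doc) for doc in db.all()]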

The largest caveat is that this cannot be used concurrently: only one process or thread can read the database and write it back at a time. S3 provides no locking, so concurrent writers will overwrite each other and you will get into some very nasty race conditions.
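Within a single process you can at least serialize access yourself. A minimal sketch using a threading.Lock around a hypothetical helper (this does nothing for multiple processes or hosts):

import threading

db_lock = threading.Lock()

def record_build(doc):
    # Serialize reads and writes within this process; S3 itself is not locked
    with db_lock:
        db.insert(doc)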
