Actually, moto is just one bandaid for that problem - there are SO MANY s3 storage implementations, including the pre-license-switch Apache 2 version of minio (one need not use a bleeding edge for something as relatively stable as the S3 Api)
EDIT: They probably do not, I'm guessing they mean https://docs.getmoto.org/en/latest/index.html ?
I use this, along with testing.postgresql, for unit testing my API servers with barely any mocks at all.
I suppose given this is under the AWS Labs org, they don’t really care about non-AWS S3 implementations.
[0] https://github.com/awslabs/git-remote-s3?tab=readme-ov-file#...
I believe moto has an "embedded" version such that one need not even have it listen on a network port, but I find it much, much less mental gymnastics to just override the "endpoint" address in the actual AWS SDKs to point to 127.0.0.1:4566, and you're off to the races. The AWS SDKs are even so friendly as to not mandate TLS or have allowlists of endpoint addresses, unlike their misguided Azure colleagues
1: https://docs.gitlab.com/ee/user/infrastructure/iac/terraform...
Doesn't S3 provide primitives to do the same? At least since moving to strong read-after-write consistency?
PS: I wrote the above package. Happy to answer questions about it.
But I'm still confused as to what dvc is, even after a cursory glance at their homepage.
They address this directly in their section on concurrent writes: https://github.com/awslabs/git-remote-s3?tab=readme-ov-file#...
And in their design: https://github.com/awslabs/git-remote-s3?tab=readme-ov-file#...
But it seems like this is just the wrong tool for the job (hosting git repos).
S3 standard, which is likely what people would use for git storage, doesn't have that minimum file size charge.
See the asterisk sections in https://aws.amazon.com/s3/pricing/
$ brew create --python --set-license Apache-2 https://github.com/awslabs/git-remote-s3/archive/refs/tags/v0.1.19.tar.gz
Formula name [git-remote-s3]:
==> Downloading https://github.com/awslabs/git-remote-s3/archive/refs/tags/v0.1.19.tar.gz
==> Downloading from https://codeload.github.com/awslabs/git-remote-s3/tar.gz/refs/tags/v0.1.19
Warning: Cannot verify integrity of '84b0a9a6936ebc07a39f123a3e85cd23d7458c876ac5f42e9f3ffb027dcb3a0f--git-remote-s3-0.1.19.tar.gz'.
No checksum was provided.
For your reference, the checksum is:
sha256 "3faa1f9534c4ef2ec130fac2df61428d4f0a525efb88ebe074db712b8fd2063b"
==> Retrieving PyPI dependencies for "https://github.com/awslabs/git-remote-s3/archive/refs/tags/v0.1.19.tar.gz"...
==> Retrieving PyPI dependencies for excluded ""...
==> Getting PyPI info for "boto3==1.35.44"
==> Getting PyPI info for "botocore==1.35.44"
==> Excluding "git-remote-s3==0.1.19"
==> Getting PyPI info for "jmespath==1.0.1"
==> Getting PyPI info for "python-dateutil==2.9.0.post0"
==> Getting PyPI info for "s3transfer==0.10.3"
==> Getting PyPI info for "six==1.16.0"
==> Getting PyPI info for "urllib3==2.2.3"
==> Updating resource blocks
Please run the following command before submitting:
HOMEBREW_NO_INSTALL_FROM_API=1 brew audit --new git-remote-s3
Editing /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula/g/git-remote-s3.rb
They also support building from git directly, if you want to track non-tagged releases (see the "--head" option to create)

I’ve used this guy’s CloudFormation template since forever for LFS on S3.
GitHub has to lower its egregious LFS pricing.
GCS also allows for conditional overwrites using `If-Match: <etag>` which means you can do optimistic concurrency control. https://cloud.google.com/storage/docs/request-preconditions
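The optimistic concurrency control that `If-Match` enables is a simple read-modify-write loop. Here is a toy sketch of that pattern; `ObjectStore` below is a hypothetical in-memory stand-in for a bucket with etag-conditional writes, not a real GCS or S3 client:

```python
import hashlib

class PreconditionFailed(Exception):
    """Raised when the stored etag no longer matches (HTTP 412 analogue)."""

class ObjectStore:
    """In-memory stand-in for a bucket supporting etag-conditional writes."""
    def __init__(self):
        self._data = {}  # key -> (etag, bytes)

    def get(self, key):
        return self._data.get(key, (None, None))

    def put_if_match(self, key, body, expected_etag):
        current_etag, _ = self.get(key)
        if current_etag != expected_etag:
            raise PreconditionFailed(key)  # someone else wrote first
        self._data[key] = (hashlib.md5(body).hexdigest(), body)

def append_line(store, key, line, retries=3):
    """Read-modify-write with optimistic concurrency: retry on conflict."""
    for _ in range(retries):
        etag, body = store.get(key)
        try:
            store.put_if_match(key, (body or b"") + line, etag)
            return True
        except PreconditionFailed:
            continue  # lost the race; re-read and retry
    return False

store = ObjectStore()
append_line(store, "refs", b"ref-a\n")
append_line(store, "refs", b"ref-b\n")
print(store.get("refs")[1])
```

Against real GCS you would carry the etag (or generation number) from the read into the conditional write; a concurrent writer causes a 412 instead of a silent lost update.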
Been using it to store datasets via LFS. It's written in Rust and has been very reliable.