Getting extra information about S3 buckets on NetApp object storage

Correction: it's not just NetApp, but I don't want people to come here looking for AWS S3 and get pissed off. In reality the approach works with any S3-compatible storage: if the bucket isn't huge, you may be able to list all objects every day, or at least every week, and then either stuff that data into a logging platform (such as Loki) or store it in a database (e.g. InfluxDB).
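
Here's a minimal sketch of what such a daily listing job could look like, assuming boto3, an InfluxDB 2.x instance and its Python client; the endpoint, bucket, token and org names are placeholders I made up, not values from the video:

```python
# Hedged sketch: list every object in one bucket on an S3-compatible endpoint
# (e.g. StorageGRID or ONTAP S3) and write a daily summary point to InfluxDB.
# All endpoint/bucket/credential values below are placeholders.
import boto3
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example.internal:10443",  # placeholder on-prem S3 endpoint
    aws_access_key_id="PLACEHOLDER_KEY",
    aws_secret_access_key="PLACEHOLDER_SECRET",
)

BUCKET = "my-bucket"  # placeholder bucket name

# Paginate through the whole bucket; fine for "not huge" buckets,
# run once a day (or week) from cron or a scheduler.
total_objects = 0
total_bytes = 0
for page in s3.get_paginator("list_objects_v2").paginate(Bucket=BUCKET):
    for obj in page.get("Contents", []):
        total_objects += 1
        total_bytes += obj["Size"]

# Store one summary point per run; per-object points would work the same way.
with InfluxDBClient(url="http://localhost:8086", token="PLACEHOLDER_TOKEN", org="my-org") as client:
    point = (
        Point("s3_bucket_inventory")
        .tag("bucket", BUCKET)
        .field("objects", total_objects)
        .field("bytes", total_bytes)
    )
    client.write_api(write_options=SYNCHRONOUS).write(bucket="s3-analytics", record=point)
```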

The first part also highlights that - on ONTAP or StorageGRID - you can store Loki's own data on S3 as well, which is shown in the demo.
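
For reference, a hedged sketch of what the Loki storage section might look like when pointed at an S3-compatible endpoint such as StorageGRID or ONTAP S3; the endpoint, bucket name and credentials are placeholders and the exact keys can differ between Loki versions:

```yaml
# Sketch of Loki's common storage block for an S3-compatible backend.
# Endpoint, bucket and credentials are placeholders.
common:
  storage:
    s3:
      endpoint: s3.example.internal:10443
      bucketnames: loki-chunks
      access_key_id: PLACEHOLDER_KEY
      secret_access_key: PLACEHOLDER_SECRET
      s3forcepathstyle: true   # path-style addressing, typical for on-prem S3
      insecure: false
```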

In the second half of this video I point out that having the data in a DB lets you avoid constantly "re-querying" the bucket live (especially no bueno if you have buckets with many objects), and that you can expose the same DB to any other app or front-end rather than having the information only in your logs (here, Loki).
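
And a small sketch of the flip side: once the inventory is in the DB, any other app or front-end can query it instead of re-listing the bucket. Again assuming the InfluxDB 2.x Python client and the placeholder names from the listing sketch above:

```python
# Hedged sketch: read the stored bucket inventory back out of InfluxDB
# (no live bucket listing involved). Names match the placeholders above.
from influxdb_client import InfluxDBClient

flux = '''
from(bucket: "s3-analytics")
  |> range(start: -30d)
  |> filter(fn: (r) => r._measurement == "s3_bucket_inventory")
  |> filter(fn: (r) => r.bucket == "my-bucket")
'''

with InfluxDBClient(url="http://localhost:8086", token="PLACEHOLDER_TOKEN", org="my-org") as client:
    for table in client.query_api().query(flux):
        for rec in table.records:
            # Feed this to Grafana, a web UI, a report, an MCP tool, etc.
            print(rec.get_time(), rec.get_field(), rec.get_value())
```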

I muse about that in the blog post linked below. The same (or similar) approach applies to SMB and NFS shares. It's a bit longish, or "ranty" if you will, but I was excited to find that this is actually feasible for buckets and shares far larger than I expected.

https://scaleoutsean.github.io/2025/06/05/simple-filesystem-and-s3-analytics-and-mcp.html
