More MinIO Data Options - The MinIO FTP/SFTP Server
MinIO has added support for FTP and SFTP to the MinIO Server.
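As a quick, hedged illustration of what this enables (not taken from the post itself), here is a minimal sketch of browsing a bucket over FTP with Python's standard ftplib client; the host, port, and credentials are placeholders for a server already started with FTP enabled.

```python
# Minimal sketch: browsing a MinIO bucket over FTP with Python's stdlib client.
# Host, port, and credentials are placeholders for a server started with FTP enabled.
from ftplib import FTP

ftp = FTP()
ftp.connect("minio.example.com", 8021)      # hypothetical FTP address exposed by MinIO
ftp.login("minioadmin", "minioadmin")       # same credentials as the S3 API user
print(ftp.nlst())                           # buckets appear as top-level directories
ftp.cwd("my-bucket")
with open("report.csv", "wb") as f:
    ftp.retrbinary("RETR report.csv", f.write)  # download an object as a local file
ftp.quit()
```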
Read more
What is ArgoCD? In short, it's a GitOps continuous deployment tool that stores the state of the infrastructure in a Git repository and automates deployment by tracking the changes between the existing and new deployment configurations.
Read more
I wanted to share my thoughts on the semi-annual confab that is Kubecon, this one the European edition. These are fairly candid takes; I can be critical or complimentary, but given how important this space is to us, it is worthy of analysis. Let’s get one thing out of the way. This was a superb Kubecon. The location was
Read more
Apache Kafka is an open-source distributed event streaming platform that is used for building real-time data pipelines and streaming applications. It was originally developed by LinkedIn and is now maintained by the Apache Software Foundation. Kafka is designed to handle high-volume, high-throughput, and low-latency data streams, making it a popular choice for building scalable and reliable data
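For readers new to it, here is a minimal, illustrative sketch of the producer/consumer model using the kafka-python client; the broker address, topic name, and payload are placeholders, not an excerpt from the post.

```python
# Minimal sketch of Kafka's event-streaming model with the kafka-python package.
# Broker address, topic name, and payload are placeholders.
from kafka import KafkaProducer, KafkaConsumer

producer = KafkaProducer(bootstrap_servers="localhost:9092")
producer.send("page-views", value=b'{"user": "alice", "path": "/pricing"}')
producer.flush()  # block until the broker acknowledges the event

consumer = KafkaConsumer(
    "page-views",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    consumer_timeout_ms=5000,  # stop iterating if no new events arrive
)
for message in consumer:
    print(message.offset, message.value)
```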
Read more
We are back with another educational course in our rapidly expanding repertoire on our YouTube channel. This month, MinIO’s Will Dinyes is discussing Object Management for those who have set up their object store and want to learn how to build an efficient and sustainable data lifecycle management strategy. This 11-part series spanning just over an
Read more
Build your on-prem data lake with Apache Iceberg, Dremio and MinIO
Read more
Learn how to get started with Dremio and MinIO on Kubernetes for fast, scalable analytics.
Read more
In this post we’ll talk about Erasure Coding and Erasure Sets, and then dive deeper into how to use the Erasure Code Calculator when designing deployments to get the most out of MinIO by choosing the right hardware configuration from the get-go.
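To make the capacity trade-off concrete, here is a small back-of-the-envelope sketch, assuming a hypothetical 16-drive erasure set with 8 TB drives and EC:4 parity; these are placeholder inputs of the kind you would feed to the calculator.

```python
# Back-of-the-envelope erasure-set math for a hypothetical configuration.
# Drive count, drive size, and parity level are placeholder inputs.
set_size = 16          # drives per erasure set
drive_tb = 8           # usable TB per drive
parity = 4             # parity shards per object (EC:4)

data_shards = set_size - parity
raw_tb = set_size * drive_tb
usable_tb = data_shards * drive_tb
efficiency = data_shards / set_size

print(f"Raw capacity:    {raw_tb} TB")
print(f"Usable capacity: {usable_tb} TB ({efficiency:.0%} efficiency)")
print(f"Tolerates the loss of up to {parity} drives per set without data loss")
```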
Read more
After listening to community feedback, we feel as if an update is warranted on a number of fronts including Weka. We wanted to take the opportunity to clarify some language we used - specifically around revocation. At the outset we want to make clear: Any company is free to use MinIO’s licensed code for commercial purposes, even for competitive
Read more
Let's review some of the tools available to get data out of S3, a local filesystem, NFS, Azure, GCP, Hitachi Content Platform, Ceph, and others, and into MinIO clusters.
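The post surveys dedicated tools, but as an illustrative aside, here is a minimal sketch of the underlying copy operation using the boto3 and minio Python SDKs; the endpoint, bucket names, and credentials are placeholders.

```python
# Minimal sketch of copying objects from an S3 bucket into a MinIO bucket,
# using the boto3 and minio Python SDKs. Endpoint, bucket names, and
# credentials are placeholders.
import boto3
from minio import Minio

s3 = boto3.client("s3")  # assumes AWS credentials in the environment
dst = Minio("minio.example.com:9000",
            access_key="minioadmin", secret_key="minioadmin", secure=False)

if not dst.bucket_exists("migrated-data"):
    dst.make_bucket("migrated-data")

paginator = s3.get_paginator("list_objects_v2")
for page in paginator.paginate(Bucket="source-bucket"):
    for obj in page.get("Contents", []):
        # Stream each object from S3 straight into MinIO without buffering to disk.
        body = s3.get_object(Bucket="source-bucket", Key=obj["Key"])["Body"]
        dst.put_object("migrated-data", obj["Key"], body, length=obj["Size"])
```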
Read more
In this blog post, we will build a Notebook that uses MinIO as object storage for Spark jobs to manage Iceberg tables.
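As a hedged preview of the kind of setup the Notebook performs, here is a minimal PySpark sketch of a SparkSession that stores Iceberg tables in MinIO over s3a; the endpoint, credentials, catalog name, and warehouse bucket are placeholders, and the iceberg-spark-runtime and hadoop-aws jars are assumed to be on the classpath.

```python
# Minimal sketch: a SparkSession that stores Iceberg tables in MinIO via s3a.
# Endpoint, credentials, catalog name, and warehouse bucket are placeholders;
# the iceberg-spark-runtime and hadoop-aws jars are assumed to be available.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("iceberg-on-minio")
    .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .config("spark.sql.catalog.demo", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.demo.type", "hadoop")
    .config("spark.sql.catalog.demo.warehouse", "s3a://warehouse/")
    .config("spark.hadoop.fs.s3a.endpoint", "http://minio.example.com:9000")
    .config("spark.hadoop.fs.s3a.access.key", "minioadmin")
    .config("spark.hadoop.fs.s3a.secret.key", "minioadmin")
    .config("spark.hadoop.fs.s3a.path.style.access", "true")
    .getOrCreate()
)

spark.sql("CREATE TABLE IF NOT EXISTS demo.db.events (id BIGINT, ts TIMESTAMP) USING iceberg")
spark.sql("INSERT INTO demo.db.events VALUES (1, current_timestamp())")
spark.sql("SELECT * FROM demo.db.events").show()
```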
Read more
Learn how MinIO simplifies and streamlines the generation and assignment of TLS certificates for services running on Kubernetes.
Read more
What if you had a personal health monitor that kept tabs on your well-being, without any effort on your part? Imagine a system that tracks your health daily, highlighting even the slightest problems, so that you can seek help before minor issues become major concerns.
Read more
GitLab can use MinIO as its object storage backend to store large files such as artifacts, Docker images, and Git LFS files. Given the right underlying hardware, MinIO provides the performance and scale to support any modern workload, including GitLab.
Read more
At MinIO, we are dedicated to the principles of open source software. From the beginning, we’ve remained committed to this philosophy and that’s why you will find that the upstream and the commercial code are exactly the same. We are obligated to protect our software - particularly from companies that package it in their proprietary products and pass
Read more
Time to first byte is a key performance metric for video streaming. Learn how MinIO improves customer experience and reduces churn.
Read more
We’ve joined forces with Rafay to develop this tutorial to show you how to make the most of multi-cloud Kubernetes: use Rafay to deploy, update, and manage Kubernetes and applications, with MinIO for object storage.
Read more
MinIO's Object Lambda implementation allows for the transformation of your data to serve unique data format requirements on an application-by-application basis. For example, a dataset created by an e-commerce application might include personally identifiable information (PII). When the same data is processed for analytics, PII should be redacted. However, if the same dataset is used for a marketing
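To give a feel for the kind of transformation involved, here is a small, generic sketch of a PII-redaction step such a function might apply before returning an object; the regular expressions are illustrative only, not the handler from the post.

```python
# Generic sketch of a PII-redaction transform such as an Object Lambda
# function might apply to a record before returning it. The patterns are
# illustrative only; real PII detection requires far more care.
import re

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def redact_pii(text: str) -> str:
    """Replace email addresses and SSN-like strings with fixed markers."""
    text = EMAIL.sub("[REDACTED-EMAIL]", text)
    return SSN.sub("[REDACTED-SSN]", text)

record = "order=1042, customer=jane.doe@example.com, ssn=123-45-6789"
print(redact_pii(record))
# order=1042, customer=[REDACTED-EMAIL], ssn=[REDACTED-SSN]
```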
Read more
Apache Spark and MinIO are powerful tools for data lakes and analytics. Learn how to run them in Kubernetes.
Read more
This post focuses on some of the features associated with this unique model and is the second in a series that details the features and capabilities that come with a commercial relationship.
Read more