Automated Migration from Hitachi Content Platform to MinIO

We wrote the HCP-to-MinIO tool to help our customers migrate data from HCP object storage to MinIO. The tool can be freely downloaded from its GitHub repo.

We developed this tool to make life easier for the many customers who came to us as they transitioned away from their HCP object storage environments. The reasons are varied. Most often, it is to move to a platform that is software-defined as well as cloud-and-Kubernetes-native. These are requirements for the modern application stack that our customers use. The second most common reason is to improve performance-at-scale for mixed object sizes, both very small and very large. What our customers tell us echoes the larger trend in the market – dedicated storage appliances are being put out to pasture en masse as enterprises embrace software-defined S3-compatible object storage for greater flexibility and cloud neutrality.

HCP-to-MinIO Migration

During the migration process, we recommend writing only to MinIO and not to HCP; customers usually treat MinIO as the primary object store from day one. While objects are still being migrated, GetObject calls go to MinIO first, and if the object does not yet exist in MinIO, it is read from the HCP object store. Once all objects have been copied to MinIO, this fallback can be removed from the application code. While copying objects, the tool preserves each object's last-modified timestamp as it was in HCP.
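
To illustrate the fallback, here is a rough sketch of the read path while migration is in flight. This is not part of the tool, and in a real deployment the logic lives in your application code; the myminio alias, bucket, object key, credentials, and HCP endpoint below are all placeholders.

# Hypothetical read-fallback sketch: serve the object from MinIO if it is
# already there, otherwise read it from HCP. All names are placeholders.
if mc stat myminio/newbucket/path/to/object >/dev/null 2>&1; then
    mc cp myminio/newbucket/path/to/object ./object
else
    # Object not yet migrated, so read it from the HCP REST gateway instead.
    curl -sk -o ./object \
         -H "Authorization: HCP bXl1c2Vy:3f3c6784e97531774380db177774ac8d" \
         -H "Host: s3testbucket.tenant.hcp.example.com" \
         "https://hcp-vip.example.com/rest/path/to/object"
fi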

The tool is first used to make a list of the objects stored in HCP object storage. That list is then used to download the objects from HCP object storage and copy them over to MinIO. The tool provides two commands (list and migrate) to accomplish this.

The list command produces a list of objects stored in HCP and writes it to a file on a local drive. That list file is then split into multiple smaller list files, and the HCP-to-MinIO tool is run from multiple client machines, one split file per machine (see the sketch below). This way, multiple nodes download objects from HCP and migrate the data to MinIO in parallel, allowing customers to saturate the network and move as much data as possible in the shortest amount of time.
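
One simple way to divide the work is with the standard split utility. The sketch below assumes the listing was written to /tmp/data; <listing-file> stands for whatever file name the list command produced there, and the chunk size is arbitrary.

# Split the object listing into chunks of 100,000 entries each.
$ split -l 100000 /tmp/data/<listing-file> /tmp/data/to-migrate-part-
# Copy each to-migrate-part-* file to a different client machine and point
# that machine's migrate run at it with --input-file.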

Once the migration is done, applications will use S3-compatible APIs to talk to MinIO. Data is now available for cloud-native analytics and AI/ML frameworks.
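
As a quick sanity check after migration, you can point the MinIO client (mc) at the new deployment and list the migrated bucket; the alias, endpoint, credentials, and bucket name below are placeholders matching the configuration used later in this post.

$ mc alias set myminio https://<Your-MinIO-IP-Address>:9000 <Your-MinIO-Access-Key> <Your-MinIO-Secret-Key>
$ mc ls --recursive myminio/newbucket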

Usage for the list and migrate commands is given below. Download the HCP-to-MinIO tool to your workstation, then run list, followed by migrate.

list: List objects in HCP namespace and download to disk

NAME:
  hcp-to-minio list - List objects in HCP namespace and download to disk

USAGE:
  hcp-to-minio list --auth-token --namespace-url --host-header --data-dir

FLAGS:
  --auth-token value, -a value     authorization token for HCP
  --namespace-url value, -n value  namespace URL path, e.g. https://namespace-name.tenant-name.hcp-domain-name/rest
  --host-header value              host header for HCP
  --data-dir value, -d value       path to work directory for tool
  --insecure, -i                   disable TLS certificate verification
  --log, -l                        enable logging
  --debug                          enable debugging
  --help, -h                       show help
  

EXAMPLES:
1. List objects in HCP namespace https://hcp-vip.example.com and download list to /tmp/data
     $ hcp-to-minio list -a "HCP bXl1c2Vy:3f3c6784e97531774380db177774ac8d" --host-header "HOST:s3testbucket.tenant.hcp.example.com" \
                  --namespace-url "https://hcp-vip.example.com" --data-dir "/tmp/data"

migrate: Migrate objects from HCP object store to MinIO

NAME:
  hcp-to-minio migrate - Migrate objects from HCP object store to MinIO

USAGE:
  hcp-to-minio migrate --auth-token --namespace-url --host-header --data-dir --bucket --input-file

FLAGS:
  --auth-token value, -a value     authorization token for HCP
  --namespace-url value, -n value  namespace URL path, e.g. https://namespace-name.tenant-name.hcp-domain-name/rest
  --host-header value              host header for HCP
  --data-dir value, -d value       path to work directory for tool
  --bucket                         bucket name on HCP
  --input-file                     file that contains the list of objects to be migrated from HCP to MinIO
  --insecure, -i                   disable TLS certificate verification
  --log, -l                        enable logging
  --debug                          enable debugging
  --help, -h                       show help

You can set the following configuration prior to beginning migration:

$ export MINIO_ACCESS_KEY=<Your-MinIO-Access-Key>
$ export MINIO_SECRET_KEY=<Your-MinIO-Secret-Key>
$ export MINIO_ENDPOINT=https://<Your-MinIO-IP-Address>:9000
$ export MINIO_BUCKET=newbucket   # optional; if unspecified, the HCP bucket name is used

Create the temporary directory to house the list:

$ mkdir /tmp/data   # temporary directory where the output of listing is stored

Begin migration:

$ hcp-to-minio migrate --namespace-url https://finance.europe.hcp.example.com \
    --auth-token "HCP bXl1c2Vy:3f3c6784e97531774380db177774ac8d" \
    --host-header "s3testbucket.sandbox.hcp.example.com" \
    --data-dir /mnt/data \
    --bucket s3testbucket \
    --input-file /tmp/data/to-migrate.txt

Migrate to MinIO Today

Upgrading your infrastructure and application stack can be challenging. We created the HCP-to-MinIO migration tool to make it easier for you and your organization to embrace the technologies that are driving the cloud forward – the S3 API and Kubernetes. It’s time to break free from on-premises storage appliances and experience the fastest and most scalable object storage on the planet.

Download MinIO and the HCP-to-MinIO tool and get started today. As always, if you need help planning or have additional questions, please email us at hello@min.io.