AHAR is a tool for the Hadoop Distributed File System (HDFS) that lets you add new files to an existing Hadoop Archive (HAR). AHAR runs without any modification to HDFS.
https://github.com/joh-mue/ahar
AHAR appends the binary data of the new files to one of the part-n files of the HAR archive, choosing the target part file with a first-fit algorithm. Afterwards, the index entries in the _masterindex and _index files are updated and rewritten to HDFS.
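The first-fit placement can be sketched as follows. This is a minimal illustration, not AHAR's actual code: the method name, the per-part size limit, and the fallback behavior are assumptions for the sketch.

```java
// Hypothetical sketch of first-fit part selection: scan the part-n files
// in order and pick the first one with enough headroom for the new file.
public class FirstFitSketch {

    // Returns the index of the first part file that can hold newFileSize
    // additional bytes without exceeding partSizeLimit, or -1 if none fits
    // (in which case a caller might start a new part file).
    static int firstFit(long[] partSizes, long partSizeLimit, long newFileSize) {
        for (int i = 0; i < partSizes.length; i++) {
            if (partSizes[i] + newFileSize <= partSizeLimit) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        // Current sizes of part-0, part-1, part-2 in bytes.
        long[] parts = {900, 400, 100};
        // With a 1000-byte limit, a 300-byte file does not fit in part-0
        // (900 + 300 > 1000), so first fit selects part-1.
        System.out.println(firstFit(parts, 1000, 300)); // prints 1
    }
}
```

After the winning part file is chosen, the new file's bytes are appended to it, and the index entries record the part file name plus the offset and length of the appended data.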
AHAR uses the HDFS append operation to add new data to existing HAR part files. Because of known issues with append (HDFS-4600, HDFS-8960), you therefore need a cluster with three or more DataNodes, or you must set the replication factor of the part files to one when testing AHAR on a smaller cluster.