Package org.apache.hadoop.hbase.backup
Class TestHFileArchiving
java.lang.Object
org.apache.hadoop.hbase.backup.TestHFileArchiving
Test that the HFileArchiver correctly removes all the parts of a region when cleaning up a region.
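For orientation, here is a minimal plain-Java sketch of the behavior this class tests. It uses java.nio only (not the real HFileArchiver or Hadoop FileSystem API, and the names are illustrative): "archiving" a region means moving its store files into a parallel archive directory and then removing the emptied region directory.

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class ArchiveSketch {
  // Move every file directly under regionDir into archiveDir, then delete
  // the now-empty region directory -- the cleanup TestHFileArchiving checks.
  static void archiveRegion(Path regionDir, Path archiveDir) throws IOException {
    Files.createDirectories(archiveDir);
    try (DirectoryStream<Path> files = Files.newDirectoryStream(regionDir)) {
      for (Path f : files) {
        Files.move(f, archiveDir.resolve(f.getFileName()), StandardCopyOption.REPLACE_EXISTING);
      }
    }
    Files.delete(regionDir); // fails if anything was left behind in the region dir
  }

  public static void main(String[] args) throws IOException {
    Path region = Files.createTempDirectory("region");
    Path archive = Files.createTempDirectory("archive").resolve("region");
    Files.createFile(region.resolve("storefile1"));
    archiveRegion(region, archive);
    System.out.println(Files.notExists(region) && Files.exists(archive.resolve("storefile1")));
  }
}
```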
Nested Class Summary

Modifier and Type    Class
private static interface    TestHFileArchiving.ArchivingFunction
Field Summary

Modifier and Type    Field
static final HBaseClassTestRule    CLASS_RULE
private static final org.slf4j.Logger    LOG
org.junit.rules.TestName    name
private static org.apache.hadoop.hbase.master.cleaner.DirScanPool    POOL
private static final byte[]    TEST_FAM
private static final HBaseTestingUtility    UTIL
Constructor Summary

Constructor
TestHFileArchiving()
Method Summary

Modifier and Type    Method    Description
private void    assertArchiveFiles(org.apache.hadoop.fs.FileSystem fs, List<String> storeFiles, long timeout)
static void    cleanupTest()
private void    clearArchiveDirectory()
private List<String>    getAllFileNames(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path archiveDir)    Get the names of all the files below the given directory
private List<org.apache.hadoop.hbase.regionserver.HRegion>    initTableForArchivingRegions(org.apache.hadoop.hbase.TableName tableName)
private List<String>    recurseOnFiles(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.FileStatus[] files, List<String> fileNames)    Recursively look up all the file names under the file[] array
static void    setupCluster()    Set up the config for the cluster
private static void    setupConf(org.apache.hadoop.conf.Configuration conf)
void    tearDown()
void    testArchiveOnTableDelete()
void    testArchiveOnTableFamilyDelete()    Test that the store files are archived when a column family is removed.
void    testArchiveRecoveredEditsWalDirDifferentFS()
void    testArchiveRecoveredEditsWalDirNull()
void    testArchiveRecoveredEditsWalDirNullOrSame()
void    testArchiveRecoveredEditsWalDirSameFsStoreFiles()
void    testArchiveRecoveredEditsWrongFS()
void    testArchiveRegions()
void    testArchiveRegionsWhenPermissionDenied()
void    testArchiveRegionTableAndRegionDirsNull()
void    testArchiveRegionWithRegionDirNull()
void    testArchiveRegionWithTableDirNull()
private void    testArchiveStoreFilesDifferentFileSystems(String walDir, String expectedBase, TestHFileArchiving.ArchivingFunction<org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.FileSystem, org.apache.hadoop.hbase.client.RegionInfo, org.apache.hadoop.fs.Path, byte[], Collection<org.apache.hadoop.hbase.regionserver.HStoreFile>> archivingFunction)
private void    testArchiveStoreFilesDifferentFileSystems(String walDir, String expectedBase, boolean archiveFileExists, boolean sourceFileExists, boolean archiveFileDifferentLength, TestHFileArchiving.ArchivingFunction<org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.FileSystem, org.apache.hadoop.hbase.client.RegionInfo, org.apache.hadoop.fs.Path, byte[], Collection<org.apache.hadoop.hbase.regionserver.HStoreFile>> archivingFunction)
void    testArchiveStoreFilesDifferentFileSystemsArchiveFileMatchCurrent()
void    testArchiveStoreFilesDifferentFileSystemsArchiveFileMismatch()
void    testArchiveStoreFilesDifferentFileSystemsFileAlreadyArchived()
void    testArchiveStoreFilesDifferentFileSystemsWallAndRootSame()
void    testArchiveStoreFilesDifferentFileSystemsWallNullPlainRoot()
void    testArchiveStoreFilesDifferentFileSystemsWallWithSchemaPlainRoot()
void    testCleaningRace()    Test HFileArchiver.resolveAndArchive() race condition (HBASE-7643)
void    testDeleteRegionWithNoStoreFiles()    Test that the region directory is removed when we archive a region without store files, but still has hidden files.
void    testRemoveRegionDirOnArchive()
-
Field Details

CLASS_RULE
public static final HBaseClassTestRule CLASS_RULE
-
LOG
private static final org.slf4j.Logger LOG
-
UTIL
private static final HBaseTestingUtility UTIL
-
TEST_FAM
private static final byte[] TEST_FAM
-
POOL
private static org.apache.hadoop.hbase.master.cleaner.DirScanPool POOL
-
name
public org.junit.rules.TestName name
Constructor Details
-
TestHFileArchiving
public TestHFileArchiving()
Method Details
-
setupCluster
Set up the config for the cluster.
- Throws:
Exception
-
setupConf
-
tearDown
- Throws:
Exception
-
cleanupTest
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystemsWallWithSchemaPlainRoot
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystemsWallNullPlainRoot
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystemsWallAndRootSame
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystemsFileAlreadyArchived
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystemsArchiveFileMatchCurrent
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystemsArchiveFileMismatch
- Throws:
Exception
-
testArchiveStoreFilesDifferentFileSystems
private void testArchiveStoreFilesDifferentFileSystems(String walDir, String expectedBase, TestHFileArchiving.ArchivingFunction<org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.FileSystem, org.apache.hadoop.hbase.client.RegionInfo, org.apache.hadoop.fs.Path, byte[], Collection<org.apache.hadoop.hbase.regionserver.HStoreFile>> archivingFunction) throws IOException
- Throws:
IOException
-
testArchiveStoreFilesDifferentFileSystems
private void testArchiveStoreFilesDifferentFileSystems(String walDir, String expectedBase, boolean archiveFileExists, boolean sourceFileExists, boolean archiveFileDifferentLength, TestHFileArchiving.ArchivingFunction<org.apache.hadoop.conf.Configuration, org.apache.hadoop.fs.FileSystem, org.apache.hadoop.hbase.client.RegionInfo, org.apache.hadoop.fs.Path, byte[], Collection<org.apache.hadoop.hbase.regionserver.HStoreFile>> archivingFunction) throws IOException
- Throws:
IOException
-
testArchiveRecoveredEditsWalDirNull
- Throws:
Exception
-
testArchiveRecoveredEditsWalDirSameFsStoreFiles
- Throws:
Exception
-
testArchiveRecoveredEditsWalDirNullOrSame
- Throws:
Exception
-
testArchiveRecoveredEditsWrongFS
- Throws:
Exception
-
testArchiveRecoveredEditsWalDirDifferentFS
- Throws:
Exception
-
testRemoveRegionDirOnArchive
- Throws:
Exception
-
testDeleteRegionWithNoStoreFiles
Test that the region directory is removed when we archive a region without store files, but still has hidden files.
- Throws:
Exception
-
initTableForArchivingRegions
private List<org.apache.hadoop.hbase.regionserver.HRegion> initTableForArchivingRegions(org.apache.hadoop.hbase.TableName tableName) throws IOException
- Throws:
IOException
-
testArchiveRegions
- Throws:
Exception
-
testArchiveRegionsWhenPermissionDenied
- Throws:
Exception
-
testArchiveOnTableDelete
- Throws:
Exception
-
assertArchiveFiles
private void assertArchiveFiles(org.apache.hadoop.fs.FileSystem fs, List<String> storeFiles, long timeout) throws IOException
- Throws:
IOException
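assertArchiveFiles waits up to the given timeout for the expected store files to appear in the archive. A hedged sketch of that wait-then-assert pattern in plain Java (the real method works against the Hadoop FileSystem API; waitFor and the 50 ms poll interval here are assumptions for illustration):

```java
import java.util.function.BooleanSupplier;

public class WaitUntil {
  // Poll the condition every 50 ms until it holds or timeoutMs elapses.
  static boolean waitFor(BooleanSupplier condition, long timeoutMs) throws InterruptedException {
    long deadline = System.currentTimeMillis() + timeoutMs;
    while (!condition.getAsBoolean()) {
      if (System.currentTimeMillis() >= deadline) {
        return false; // timed out; the caller turns this into an assertion failure
      }
      Thread.sleep(50); // back off briefly between polls
    }
    return true;
  }

  public static void main(String[] args) throws InterruptedException {
    System.out.println(waitFor(() -> true, 1000));  // condition already holds
    System.out.println(waitFor(() -> false, 200));  // never holds, so times out
  }
}
```

The test then compares the archived file names against the expected store-file list once the wait succeeds.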
-
testArchiveOnTableFamilyDelete
Test that the store files are archived when a column family is removed.
- Throws:
Exception
-
testCleaningRace
Test HFileArchiver.resolveAndArchive() race condition (HBASE-7643).
- Throws:
Exception
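The race in HBASE-7643 is of the check-then-move kind: a cleaner thread can delete a store file between the archiver listing it and moving it. A hypothetical plain-Java sketch (archiveQuietly is an illustrative name, not HBase code) of the tolerant handling such a race requires:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.NoSuchFileException;
import java.nio.file.Path;

public class RaceSketch {
  // Try to archive (move) a file; if a concurrent cleaner already removed it,
  // treat that as success for the pass rather than failing the whole archive job.
  static boolean archiveQuietly(Path source, Path target) throws IOException {
    try {
      Files.move(source, target);
      return true;                 // we won the race and moved the file
    } catch (NoSuchFileException raced) {
      return false;                // the cleaner got there first; nothing left to do
    }
  }

  public static void main(String[] args) throws IOException {
    Path dir = Files.createTempDirectory("race");
    Path src = dir.resolve("storefile");
    Files.createFile(src);
    Files.delete(src);             // simulate the cleaner winning the race
    System.out.println(archiveQuietly(src, dir.resolve("archived")));
  }
}
```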
-
testArchiveRegionTableAndRegionDirsNull
- Throws:
IOException
-
testArchiveRegionWithTableDirNull
- Throws:
IOException
-
testArchiveRegionWithRegionDirNull
- Throws:
IOException
-
clearArchiveDirectory
- Throws:
IOException
-
getAllFileNames
private List<String> getAllFileNames(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.Path archiveDir) throws IOException
Get the names of all the files below the given directory.
- Parameters:
fs - the file system to inspect
archiveDir - the directory in which to look
- Returns:
a list of all files in the directory and sub-directories
- Throws:
IOException
-
recurseOnFiles
private List<String> recurseOnFiles(org.apache.hadoop.fs.FileSystem fs, org.apache.hadoop.fs.FileStatus[] files, List<String> fileNames) throws IOException
Recursively look up all the file names under the file[] array.
- Throws:
IOException
-
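getAllFileNames and recurseOnFiles together implement a recursive directory walk that collects file names. A minimal equivalent with java.nio standing in for the Hadoop FileSystem API (the class and method names here are illustrative, not the test's actual code):

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

public class FileLister {
  // Return the names of all regular files at or below dir, recursing into
  // subdirectories -- the same contract as the test's getAllFileNames.
  static List<String> getAllFileNames(Path dir) throws IOException {
    List<String> names = new ArrayList<>();
    try (DirectoryStream<Path> entries = Files.newDirectoryStream(dir)) {
      for (Path entry : entries) {
        if (Files.isDirectory(entry)) {
          names.addAll(getAllFileNames(entry)); // recurse, like recurseOnFiles
        } else {
          names.add(entry.getFileName().toString());
        }
      }
    }
    return names;
  }

  public static void main(String[] args) throws IOException {
    Path root = Files.createTempDirectory("walk");
    Files.createFile(root.resolve("a"));
    Files.createFile(Files.createDirectories(root.resolve("sub")).resolve("b"));
    System.out.println(getAllFileNames(root).size()); // 2
  }
}
```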