Package org.apache.hadoop.hbase.test
Class IntegrationTestBigLinkedList.Verify.VerifyReducer
java.lang.Object
org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable>
org.apache.hadoop.hbase.test.IntegrationTestBigLinkedList.Verify.VerifyReducer
- Enclosing class:
- IntegrationTestBigLinkedList.Verify
public static class IntegrationTestBigLinkedList.Verify.VerifyReducer
extends org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable,org.apache.hadoop.io.BytesWritable>
Per reducer, we output problem rows as byte arrays so they can be used as input for subsequent
investigative mapreduce jobs. Each emitted value is prefaced by a flag saying what sort of
emission it is; the flag is the Count enum ordinal written as a short (Bytes.SIZEOF_SHORT bytes).
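For orientation, here is a minimal sketch of how such a flagged value can be built and read back with the HBase Bytes utility. It only illustrates the byte layout described above; the class name and the two-constant Counts enum are stand-ins for this sketch, not the test's own code.

import org.apache.hadoop.hbase.util.Bytes;

public class FlagSketch {
  // Hypothetical stand-in for the real Counts enum; only the ordinals matter here.
  enum Counts { UNREFERENCED, LOST_FAMILIES }

  public static void main(String[] args) {
    byte[] row = Bytes.toBytes("problem-row");

    // Encode: a short-sized ordinal prefix followed by the row bytes
    // (the layout described for addPrefixFlag).
    byte[] flagged = Bytes.add(Bytes.toBytes((short) Counts.UNREFERENCED.ordinal()), row);

    // Decode: read the prefix back (as whichType does) and strip it (as getRowOnly does).
    short ordinal = Bytes.toShort(flagged, 0, Bytes.SIZEOF_SHORT);
    byte[] rowOnly = Bytes.copy(flagged, Bytes.SIZEOF_SHORT, flagged.length - Bytes.SIZEOF_SHORT);

    System.out.println(Counts.values()[ordinal] + " -> " + Bytes.toString(rowOnly));
  }
}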
-
Nested Class Summary
Nested classes/interfaces inherited from class org.apache.hadoop.mapreduce.Reducer
org.apache.hadoop.mapreduce.Reducer.Context
-
Field Summary
Modifier and Type / Field
-
private org.apache.hadoop.hbase.client.Connection
connection
-
private final org.apache.hadoop.io.BytesWritable
LOSTFAM
-
private ArrayList<byte[]>
refs
-
private AtomicInteger
rows
-
private final org.apache.hadoop.io.BytesWritable
UNREF
-
Constructor Summary
VerifyReducer()
-
Method Summary
Modifier and Type / Method / Description
-
static byte[]
addPrefixFlag(int ordinal, byte[] r)
Returns a new byte array that has ordinal as a prefix on the front, taking up Bytes.SIZEOF_SHORT bytes, followed by r.
-
protected void
cleanup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
-
private StringBuilder
dumpExtraInfoOnRefs(org.apache.hadoop.io.BytesWritable key, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context, List<byte[]> refs)
Dump out extra info around references if there are any.
-
static byte[]
getRowOnly(org.apache.hadoop.io.BytesWritable bw)
Returns row bytes minus the type flag.
-
void
reduce(org.apache.hadoop.io.BytesWritable key, Iterable<org.apache.hadoop.io.BytesWritable> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
-
protected void
setup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
-
static IntegrationTestBigLinkedList.Verify.Counts
whichType(byte[] bs)
Returns type from the Counts enum of this row.
-
Methods inherited from class org.apache.hadoop.mapreduce.Reducer
run
-
Field Details
-
refs
private ArrayList<byte[]> refs
-
UNREF
private final org.apache.hadoop.io.BytesWritable UNREF
-
LOSTFAM
private final org.apache.hadoop.io.BytesWritable LOSTFAM
-
rows
private AtomicInteger rows
-
connection
private org.apache.hadoop.hbase.client.Connection connection
-
-
Constructor Details
-
VerifyReducer
public VerifyReducer()
-
-
Method Details
-
setup
protected void setup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
              throws IOException, InterruptedException
- Overrides:
setup in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
IOException
InterruptedException
-
cleanup
protected void cleanup(org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
              throws IOException, InterruptedException
- Overrides:
cleanup in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
IOException
InterruptedException
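setup(Context) and cleanup(Context) bracket the reducer's use of its Connection field. The sketch below shows the usual lifecycle for such a per-task connection, assuming the standard ConnectionFactory API; it is illustrative only, not the test's actual setup logic.

import java.io.IOException;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.io.BytesWritable;
import org.apache.hadoop.mapreduce.Reducer;

// Illustrative only: a reducer that opens an HBase Connection in setup()
// and releases it in cleanup(), the usual lifecycle for a per-task connection.
public class ConnectionLifecycleSketch
    extends Reducer<BytesWritable, BytesWritable, BytesWritable, BytesWritable> {

  private Connection connection;

  @Override
  protected void setup(Context context) throws IOException, InterruptedException {
    // Build the connection from the job configuration shipped to the task.
    connection = ConnectionFactory.createConnection(context.getConfiguration());
  }

  @Override
  protected void cleanup(Context context) throws IOException, InterruptedException {
    if (connection != null) {
      connection.close();
    }
  }
}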
-
addPrefixFlag
static byte[] addPrefixFlag(int ordinal, byte[] r)
Returns a new byte array that has ordinal as a prefix on the front, taking up Bytes.SIZEOF_SHORT bytes, followed by r.
-
whichType
static IntegrationTestBigLinkedList.Verify.Counts whichType(byte[] bs)
Returns type from the Counts enum of this row. Reads the prefix added by addPrefixFlag(int, byte[]).
-
getRowOnly
static byte[] getRowOnly(org.apache.hadoop.io.BytesWritable bw)
Returns row bytes minus the type flag.
-
reduce
public void reduce(org.apache.hadoop.io.BytesWritable key, Iterable<org.apache.hadoop.io.BytesWritable> values, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context)
            throws IOException, InterruptedException
- Overrides:
reduce in class org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>
- Throws:
IOException
InterruptedException
-
dumpExtraInfoOnRefs
private StringBuilder dumpExtraInfoOnRefs(org.apache.hadoop.io.BytesWritable key, org.apache.hadoop.mapreduce.Reducer<org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable, org.apache.hadoop.io.BytesWritable>.Context context, List<byte[]> refs)
                throws IOException
Dump out extra info around references if there are any. Helps debugging.
- Returns:
- StringBuilder filled with references if any.
- Throws:
IOException
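As a rough illustration of what "extra info around references" can look like, the sketch below point-Gets each referencing row and records whether it is still present in the table. The class name, helper name, table name handling, and connection wiring are assumptions made for this sketch, not details taken from the test.

import java.io.IOException;
import java.util.List;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

// Illustrative helper only: for each referencing row, do a point Get and note
// whether it still exists, accumulating the findings in a StringBuilder.
public class RefDumpSketch {
  static StringBuilder dumpRefs(Connection connection, TableName tableName, List<byte[]> refs)
      throws IOException {
    StringBuilder sb = new StringBuilder();
    try (Table table = connection.getTable(tableName)) {
      for (byte[] ref : refs) {
        Result result = table.get(new Get(ref));
        sb.append(Bytes.toStringBinary(ref))
          .append(result.isEmpty() ? " (missing)" : " (present)")
          .append('\n');
      }
    }
    return sb;
  }
}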
-