
Sqoop imports data from Mysql to HDFS using lzop compression format and reports NullPointerException

2022-04-23 20:11:00 My brother is not strong enough to fight
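
For context, the failure comes from an import that writes .lzo files and then indexes them with hadoop-lzo's DistributedLzoIndexer. A command of roughly the following shape reproduces the scenario; the connection string, credentials, table name, and paths below are placeholders, not the original job's values:

sqoop import \
    --connect jdbc:mysql://mysql_host:3306/test_db \
    --username root \
    --password 123456 \
    --table user_info \
    --target-dir /origin_data/test_db/user_info \
    --delete-target-dir \
    --num-mappers 1 \
    --compress \
    --compression-codec com.hadoop.compression.lzo.LzopCodec

# index the .lzo output so later MapReduce jobs can split it
hadoop jar /path/to/hadoop-lzo.jar \
    com.hadoop.compression.lzo.DistributedLzoIndexer \
    /origin_data/test_db/user_info

The Sqoop import itself completes; it is the DistributedLzoIndexer job launched afterwards that fails with the NullPointerException.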

The specific errors are as follows:

Error: java.lang.NullPointerException
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:63)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)

2021-12-21 11:14:28,253 INFO mapreduce.Job: Task Id : attempt_1639967851440_0006_m_000000_1, Status : FAILED
Error: java.lang.NullPointerException
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:63)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)

2021-12-21 11:14:31,265 INFO mapreduce.Job: Task Id : attempt_1639967851440_0006_m_000000_2, Status : FAILED
Error: java.lang.NullPointerException
        at com.hadoop.mapreduce.LzoSplitRecordReader.initialize(LzoSplitRecordReader.java:63)
        at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.initialize(MapTask.java:560)
        at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:798)
        at org.apache.hadoop.mapred.MapTask.run(MapTask.java:347)
        at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
        at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)

2021-12-21 11:14:36,283 INFO mapreduce.Job:  map 100% reduce 0%
2021-12-21 11:14:37,292 INFO mapreduce.Job: Job job_1639967851440_0006 failed with state FAILED due to: Task failed task_1639967851440_0006_m_000000
Job failed as tasks failed. failedMaps:1 failedReduces:0 killedMaps:0 killedReduces: 0

2021-12-21 11:14:37,344 INFO mapreduce.Job: Counters: 9
        Job Counters
                Failed map tasks=4
                Launched map tasks=4
                Other local map tasks=3
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=11344
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=5672
                Total vcore-milliseconds taken by all map tasks=5672
                Total megabyte-milliseconds taken by all map tasks=5808128
2021-12-21 11:14:37,345 ERROR lzo.DistributedLzoIndexer: DistributedIndexer job job_1639967851440_0006 failed.

Solution:

The NullPointerException is thrown because LzoSplitRecordReader cannot obtain a codec for the .lzo file, which typically means the hadoop-lzo codecs are not registered in the cluster configuration. Adding the LZO compression configuration to core-site.xml fixes it:

<configuration>
    <property>
        <name>io.compression.codecs</name>
        <value>
            org.apache.hadoop.io.compress.GzipCodec,
            org.apache.hadoop.io.compress.DefaultCodec,
            org.apache.hadoop.io.compress.BZip2Codec,
            org.apache.hadoop.io.compress.SnappyCodec,
            com.hadoop.compression.lzo.LzoCodec,
            com.hadoop.compression.lzo.LzopCodec
        </value>
    </property>

    <property>
        <name>io.compression.codec.lzo.class</name>
        <value>com.hadoop.compression.lzo.LzoCodec</value>
    </property>
</configuration>
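
After updating core-site.xml, distribute the file to every node (the indexer's map tasks run wherever YARN schedules them), make sure the hadoop-lzo jar is on the Hadoop classpath (for example under $HADOOP_HOME/share/hadoop/common/), and restart the cluster before re-running the indexer. A rough outline of the follow-up steps, with placeholder paths (xsync here stands for whatever config-distribution script your cluster uses):

# push the updated config to all nodes
xsync $HADOOP_HOME/etc/hadoop/core-site.xml

# restart HDFS and YARN so the new codec list takes effect
stop-dfs.sh && start-dfs.sh
stop-yarn.sh && start-yarn.sh

# re-run the LZO indexer on the Sqoop output directory
hadoop jar /path/to/hadoop-lzo.jar \
    com.hadoop.compression.lzo.DistributedLzoIndexer \
    /origin_data/test_db/user_info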

Copyright notice
This article was written by [My brother is not strong enough to fight]. Please include a link to the original when reposting. Thank you.
https://yzsam.com/2022/04/202204210556214387.html
