MATLAB Error: parallel:internal:DeserializationException?
Pulkesh Haran on 1 May 2017
Commented: Walter Roberson on 1 May 2017
While processing 50K images with a MATLAB + Hadoop mapreduce integration, we hit the error below. How can we solve this problem? (A simplified sketch of how the job is set up follows the log.)
MATLAB Exception: parallel:internal:DeserializationException
Error =
MException with properties:
identifier: 'parallel:internal:DeserializationException'
message: 'Deserialization threw an exception.'
cause: {0×1 cell}
stack: [3×1 struct]
Hadoop Data Node Log File:
2017-05-01 11:02:07,299 INFO [main] org.apache.hadoop.metrics2.impl.MetricsConfig: loaded properties from hadoop-metrics2.properties
2017-05-01 11:02:07,416 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
2017-05-01 11:02:07,416 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system started
2017-05-01 11:02:07,425 INFO [main] org.apache.hadoop.mapred.YarnChild: Executing with tokens:
2017-05-01 11:02:07,425 INFO [main] org.apache.hadoop.mapred.YarnChild: Kind: mapreduce.job, Service: job_1493614370302_0004, Ident: (org.apache.hadoop.mapreduce.security.token.JobTokenIdentifier@6fc2862b)
2017-05-01 11:02:07,587 INFO [main] org.apache.hadoop.mapred.YarnChild: Sleeping for 0ms before retrying again. Got null now.
2017-05-01 11:02:09,429 INFO [main] org.apache.hadoop.mapred.YarnChild: mapreduce.cluster.local.dir for child: /tmp/hadoop-nitw_viper_user/nm-local-dir/usercache/nitw_viper_user/appcache/application_14936$
2017-05-01 11:02:10,133 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2017-05-01 11:02:10,610 INFO [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: File Output Committer Algorithm version is 1
2017-05-01 11:02:10,762 INFO [main] org.apache.hadoop.mapred.Task: Using ResourceCalculatorProcessTree : [ ]
2017-05-01 11:02:10,990 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: hdfs://master:9000/images/39/394286.jpg:0+280234
2017-05-01 11:02:11,014 INFO [main] org.apache.hadoop.mapred.MapTask: (EQUATOR) 0 kvi 26214396(104857584)
2017-05-01 11:02:11,014 INFO [main] org.apache.hadoop.mapred.MapTask: mapreduce.task.io.sort.mb: 100
2017-05-01 11:02:11,014 INFO [main] org.apache.hadoop.mapred.MapTask: soft limit at 83886080
2017-05-01 11:02:11,014 INFO [main] org.apache.hadoop.mapred.MapTask: bufstart = 0; bufvoid = 104857600
2017-05-01 11:02:11,014 INFO [main] org.apache.hadoop.mapred.MapTask: kvstart = 26214396; length = 6553600
2017-05-01 11:02:11,017 INFO [main] org.apache.hadoop.mapred.MapTask: Map output collector class = org.apache.hadoop.mapred.MapTask$MapOutputBuffer
2017-05-01 11:03:36,877 INFO [main] org.apache.hadoop.mapred.MapTask: Starting flush of map output
2017-05-01 11:03:46,807 WARN [main] org.apache.hadoop.mapred.YarnChild: Exception running child : com.mathworks.toolbox.parallel.hadoop.worker.RemoteFuture$CommunicationLostException
at com.mathworks.toolbox.parallel.hadoop.worker.RemoteFuture.get(Unknown Source)
at com.mathworks.toolbox.parallel.hadoop.worker.RemoteFuture.get(Unknown Source)
at com.mathworks.toolbox.parallel.hadoop.link.MatlabWorkerFevalFuture.get(Unknown Source)
at com.mathworks.toolbox.parallel.hadoop.link.MatlabMapper.map(Unknown Source)
at com.mathworks.toolbox.parallel.hadoop.link.MatlabMapper.map(Unknown Source)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at com.mathworks.toolbox.parallel.hadoop.MatlabReflectionMapper.run(Unknown Source)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:164)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
2017-05-01 11:03:49,307 INFO [main] org.apache.hadoop.mapred.Task: Runnning cleanup for the task
2017-05-01 11:03:55,474 WARN [main] org.apache.hadoop.mapreduce.lib.output.FileOutputCommitter: Could not delete hdfs://master:9000/corel_image_10L_seq_8/_temporary/1/_temporary/attempt_1493614370302_0004$
2017-05-01 11:03:56,752 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
2017-05-01 11:03:57,366 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
2017-05-01 11:03:57,388 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete.
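For context, the job is set up roughly along the lines below. This is a simplified sketch with placeholder map/reduce functions and an assumed Hadoop install path, not the exact code we run:

% Simplified sketch of the mapreduce-on-Hadoop setup (placeholder names).
% '/usr/local/hadoop' is an assumed install location.
cluster = parallel.cluster.Hadoop('HadoopInstallFolder', '/usr/local/hadoop');
mr = mapreducer(cluster);

% ~50K JPEGs on HDFS; each map call receives one decoded image.
imds = imageDatastore('hdfs://master:9000/images', 'IncludeSubfolders', true);

outds = mapreduce(imds, @imageMapper, @imageReducer, mr, ...
                  'OutputFolder', 'hdfs://master:9000/corel_image_10L_seq_8');

% --- imageMapper.m (placeholder) ---
function imageMapper(data, info, intermKVStore)
    % data is the decoded image array, info.Filename its HDFS path.
    feat = squeeze(mean(mean(double(data), 1), 2)).';   % e.g. mean value per channel
    add(intermKVStore, info.Filename, feat);
end

% --- imageReducer.m (placeholder) ---
function imageReducer(key, intermValIter, outKVStore)
    % Pass the intermediate values through to the output store.
    while hasnext(intermValIter)
        add(outKVStore, key, getnext(intermValIter));
    end
end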
Thanks
0 comments
Accepted Answer
Walter Roberson on 1 May 2017
Possibly you ran out of memory; that would prevent deserialization.
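If memory is the problem, one thing to try is to keep each map call's working set small: downscale the image early and emit only a compact result rather than the full image matrix. A rough, untested sketch (placeholder names, not your actual map function):

% Rough, untested sketch: shrink the image and emit only a small summary value.
function imageMapper(data, info, intermKVStore)
    img  = imresize(data, 0.25);                        % cut the in-memory footprint early
    feat = squeeze(mean(mean(double(img), 1), 2)).';    % small per-channel summary
    add(intermKVStore, info.Filename, feat);            % emit the summary, not the image
end

If the images really have to be processed at full resolution, raising the YARN map container memory (mapreduce.map.memory.mb in mapred-site.xml) might also be worth trying.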
2 comments
Walter Roberson on 1 May 2017
I would recommend talking to MathWorks about this. I do not have any experience with mapreduce.
More Answers (0)