MCR problems with Linux/Hadoop

Evan on 15 Jul 2013
I am trying to run a program called Phenoripper (image analysis software) that uses MCR inside a Hadoop multi-node setup (version 1.1.2) on Ubuntu 13.04. The Hadoop setup works perfectly with generic Python/Java MapReduce jobs; however, when I try to use Phenoripper in the setup I get this error:
boost::filesystem::create_directory:permission denied
I have asked the Phenoripper developers about this, and they have assured me that their program does not create or write to temporary directories. Since it works perfectly on a single computer, I believe MCR is responsible for this error. If I understand correctly, it is trying to create a temporary directory somewhere it doesn't have permission, likely on one of the slave nodes. Does anyone have any idea where this directory might be located or how to find it? If I can find it, my plan would be to create it permanently with appropriate permissions. Does that sound like a viable solution? Any ideas as to what is going on and how to fix it would be much appreciated!

Accepted Answer

Evan on 15 Jul 2013
Problem fixed. The most recent version of MCR as of this post has a major bug in which some process looks for a /homes/ directory containing the .matlab directory. Some programmer must have just put an extra 's' in there, which cost me days of pain. Creating that directory with 777 permissions and putting the .matlab folder in it fixed the problem. This is a major bug for any distributed computing task attempting to use MCR and should be fixed in the next release.
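A minimal sketch of that workaround on each node, assuming the Hadoop tasks run as a hypothetical user 'hduser' whose real home directory (and existing .matlab folder) is /home/hduser (both names are examples, not from the original post):

# Assumption: .matlab already exists under the real home directory of the
# user the Hadoop tasks run as (here /home/hduser).
sudo mkdir -p /homes                      # directory MCR mistakenly looks for
sudo cp -r /home/hduser/.matlab /homes/   # provide the .matlab folder it expects
sudo chmod -R 777 /homes                  # world-writable, as in the fix above

Note that world-writable permissions are a blunt workaround; the configuration-level fix in the answer below avoids it.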

More Answers (1)

Rick Amos on 12 Nov 2014
This behavior arises because Hadoop defines the 'HOME' environment variable of task processes to be '/homes' by default. MATLAB requires the 'HOME' environment variable to point to a valid writable directory. If you have control of the Hadoop configuration for the cluster, an alternative solution is to set one of the following two Hadoop configuration properties:
mapreduce.admin.user.home.dir (for Hadoop v1.X)
yarn.nodemanager.user-home-dir (for Hadoop v2.X)
For example, if the Hadoop cluster is running as 'hduser', one option is to set these properties to '/home/hduser'.
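A sketch of the corresponding entries, assuming the standard Hadoop configuration files mapred-site.xml (Hadoop 1.x) and yarn-site.xml (Hadoop 2.x) and the example user 'hduser':

<!-- Hadoop 1.x: mapred-site.xml -->
<property>
  <name>mapreduce.admin.user.home.dir</name>
  <value>/home/hduser</value>
</property>

<!-- Hadoop 2.x (YARN): yarn-site.xml -->
<property>
  <name>yarn.nodemanager.user-home-dir</name>
  <value>/home/hduser</value>
</property>

The relevant daemons (TaskTracker or NodeManager) need to be restarted for the new value to take effect.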
