Web Crawler fails in GPFS with permission issues
The Web Crawler application fails in GPFS with permission issues.
When the Web Crawler application runs in GPFS, the crawler creates two directories, /tmp/web-crawler and /tmp/mapred, with permissions set to 750.
The application runs successfully for the first user who executes it, but fails with permission issues when other users try to run it.
Diagnosing the problem
The JobTracker log displays a permission error message.
Resolving the problem
As a workaround, before running the application, set the permissions of the /tmp/web-crawler and /tmp/mapred directories to 777.
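The workaround can be applied with a short shell snippet such as the following. This is a sketch, not an official script: it assumes you run it as the user that owns the directories (or as root), and the mkdir -p call is only there so the snippet also works before the first run has created the directories.

```shell
# Relax permissions on the crawler's scratch directories so that
# all users can run the Web Crawler application (workaround sketch).
for dir in /tmp/web-crawler /tmp/mapred; do
  # Create the directory if the first run has not created it yet.
  mkdir -p "$dir"
  # Grant read, write, and execute to owner, group, and others.
  chmod 777 "$dir"
done
```

Note that 777 permissions on shared directories are a workaround, not a hardened configuration; revisit them once the underlying permission handling is fixed.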