Solution: /tmp/hive is a temporary (scratch) directory; only transient files are kept in this location. It is safe to delete this directory, it will be recreated automatically, with the proper permissions, the next time it is needed.
Step 1) In HDFS, remove the /tmp/hive directory ==> "hdfs dfs -rm -r /tmp/hive"
Step 2) At the OS level, delete the local /tmp/hive directory as well ==> "rm -rf /tmp/hive" (a combined sketch of both steps follows below)
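Putting the two steps together, here is a minimal cleanup sketch. It assumes the shell user has the rights to delete the HDFS path (you may need to run it as the HDFS superuser); the -skipTrash flag, which bypasses the HDFS trash, is optional:

    # Step 1: remove the HDFS scratch directory (-skipTrash bypasses the trash)
    hdfs dfs -rm -r -skipTrash /tmp/hive

    # Step 2: remove the local scratch directory as well
    rm -rf /tmp/hive

    # Optional: confirm the HDFS copy is gone before restarting spark-shell
    hdfs dfs -ls /tmp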
After this, I started spark-shell again and it worked fine.
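To verify the fix, you can check that the scratch directory was recreated with writable permissions once spark-shell is up (the exact mode can vary by Hive/Spark version, so treat the expected output as an assumption):

    # -d lists the directory itself rather than its contents;
    # the permissions column should show the directory as writable
    hdfs dfs -ls -d /tmp/hive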