This write-up is fairly verbose; if you are eager to find the answer, jump straight to the bold part of the ....
(PS: Everything here comes from the official Hadoop 2.5.2 documentation; these are the problems I ran into while working through it.)
While running a MapReduce job locally by following the steps in the official documentation, I hit a "No such file or directory" error. The steps:
1. Format the NameNode
bin/hdfs namenode -format
2. Start the NameNode and DataNode daemons
sbin/start-dfs.sh
3. If startup succeeds, open http://localhost:50070/ in a browser to view the NameNode web UI
4. Create the directories required to run MapReduce jobs
bin/hdfs dfs -mkdir /user (note: there is a space between -mkdir and /user)
bin/hdfs dfs -mkdir /user/<username>
5. Copy the input files into HDFS
bin/hdfs dfs -put etc/hadoop input (etc/hadoop is the source; input is the destination directory)
Error: put: 'input': No such file or directory
Workaround: add a leading "/" to input, i.e. change the command to bin/hdfs dfs -put etc/hadoop /input (note the "/" in front of input)
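A likely explanation for that error (my assumption based on how HDFS resolves paths, not something stated in the official doc): a relative destination such as input is resolved against your HDFS home directory, /user/<username>, so if that directory from step 4 does not exist or does not match your login name, the put fails. A minimal sketch of the resolution rule:

```shell
# Sketch of HDFS relative-path resolution (assumption: a relative path is
# resolved against the current user's HDFS home directory, /user/<username>).
rel_path="input"                      # the destination given to dfs -put
hdfs_home="/user/$(whoami)"           # HDFS home dir for the current user
resolved="${hdfs_home}/${rel_path}"   # what HDFS actually tries to write to
echo "$resolved"                      # e.g. /user/alice/input
```

If /user/<username> was created correctly in step 4, the relative form works; the leading "/" in the workaround simply bypasses the home-directory lookup by writing to /input at the HDFS root.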
6. Run one of the provided examples
bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.5.2.jar grep input output 'dfs[a-z.]+'
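The grep example scans every file under input for matches of the regular expression dfs[a-z.]+ and counts them. To see what that pattern matches, you can try it with ordinary grep on a sample line (the file name and contents below are made up for illustration):

```shell
# Demonstrate the pattern used by the Hadoop grep example with plain grep.
# /tmp/sample.txt and its contents are illustrative, not part of the Hadoop setup.
printf 'dfs.replication\nyarn.nodemanager.aux-services\n' > /tmp/sample.txt
grep -oE 'dfs[a-z.]+' /tmp/sample.txt   # -o prints only the matched text
# prints: dfs.replication
```

In the real job, the match counts end up in the output directory in HDFS and can be inspected with bin/hdfs dfs -cat output/*.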