To load a file into Hadoop manually, you first copy the file to the name node server. From there, you can load it into the Hadoop Distributed File System (HDFS) using one of two commands at the Hadoop command prompt. While this approach is not ideal for most data-loading requirements, it works well for development exercises and other one-off situations where the data file is small enough to fit on the name node.
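For reference, the two HDFS shell commands typically used for this kind of manual load are put and copyFromLocal, which behave identically when the source is a local file. A minimal sketch (the paths shown are illustrative and match the folders used later in this walkthrough):

```shell
REM Copy a local file into an HDFS folder
hadoop fs -put C:\temp\Integers.txt /demo/simple/in

REM -copyFromLocal is equivalent to -put for local sources
hadoop fs -copyFromLocal C:\temp\Integers.txt /demo/simple/in
```

Either command will fail if the target file already exists in HDFS, so only one of the two needs to be run.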
To demonstrate the manual loading of files, we will load the Integers.txt file, created or downloaded in previous posts in this series, from the desktop development environment's name node server. Make sure to place the Integers.txt file in the C:\temp folder of the name node server, or adjust the paths in the commands below accordingly:
Note that the steps described here apply whether HDFS or ASV is used as the underlying Hadoop data storage mechanism.
1. Start the Hadoop command prompt from the desktop:
If you have configured the environment variables, you can start a DOS window directly instead.
2. From the Hadoop command prompt, create the /demo/simple/in folder structure in the Hadoop file system by issuing the following command:
hadoop fs -mkdir /demo/simple/in
3. Create an out folder under /demo/simple using the following command:
hadoop fs -mkdir /demo/simple/out
Note that this folder will be used for future exercises.
4. Load the Integers.txt file from the local file system into /demo/simple/in in the Hadoop file system using the following command:
hadoop fs -put "C:\temp\Integers.txt" /demo/simple/in
5. Verify that the Integers.txt file is in the /demo/simple/in folder by issuing the following command:
hadoop fs -ls /demo/simple/in
The result is shown in the figure below:
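The steps above can be collected into a single command sequence, which is convenient for repeating the setup in later exercises (the paths assume the C:\temp location used in this walkthrough):

```shell
REM Create the input and output folders in HDFS
hadoop fs -mkdir /demo/simple/in
hadoop fs -mkdir /demo/simple/out

REM Load the local file into the input folder
hadoop fs -put "C:\temp\Integers.txt" /demo/simple/in

REM Confirm the file landed in HDFS
hadoop fs -ls /demo/simple/in
```

The final -ls command should list a single entry for Integers.txt along with its size and replication count.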