Solving Some Minor Hadoop Problems (Compiled Version)

1 dfsadmin -setQuota problem
dfsadmin -setQuota limits the number of files in a directory.
dfsadmin -setSpaceQuota limits the disk space a directory may use.
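A minimal sketch of setting and checking both quota types (the directory path is an example; on older Hadoop 1.x installs the command is `hadoop dfsadmin` rather than `hdfs dfsadmin`):

```shell
# Limit the directory to at most 1000 names (files + subdirectories)
hdfs dfsadmin -setQuota 1000 /user/example

# Limit the total disk space consumed under the directory to 10 GB
hdfs dfsadmin -setSpaceQuota 10g /user/example

# Verify: the -q flag shows the QUOTA and SPACE_QUOTA columns
hdfs dfs -count -q /user/example

# Remove the quotas when no longer needed
hdfs dfsadmin -clrQuota /user/example
hdfs dfsadmin -clrSpaceQuota /user/example
```

Note that the name quota counts directories as well as files, and the space quota counts the replicated size of blocks, not the logical file size.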


2 How to solve the Hadoop small file problem?
The default size of a data block is 64 MB. A file smaller than the block size is a Hadoop small file.
Small files waste space, so use the Hadoop archive tool to merge them. The data block size can be
configured through the dfs.block.size property.
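A sketch of merging small files with the archive tool mentioned above (the input and output paths are examples):

```shell
# Pack everything under /user/data into a single archive named data.har,
# written to /user/archives; -p sets the parent path of the inputs
hadoop archive -archiveName data.har -p /user/data /user/archives

# The archived files remain readable through the har:// scheme
hadoop fs -ls har:///user/archives/data.har
```

The archive reduces NameNode metadata pressure because many small files collapse into a few large HAR part files, while the original directory layout stays browsable through the har:// filesystem.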


3 start-dfs.sh warning message
Unable to load native-hadoop library for your platform...
using builtin-java classes where applicable
The native library was not found, so Hadoop falls back to its built-in Java implementation. This indicates that the Hadoop native code was not compiled for your platform. It is best to download the source code and compile it yourself to obtain native libraries built for your system.
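Before recompiling, you can confirm which native components are actually missing (available in Hadoop 2.x and later):

```shell
# List each native library (hadoop, zlib, snappy, lz4, openssl, ...)
# and whether it can be loaded on this platform
hadoop checknative -a
```

Any component shown as "false" will fall back to the slower built-in Java code path; recompiling the native code for your platform removes the warning.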

