Opening:
Pored over the docs for two days, hit errors 38 times, and in the end simply stripped out all the extra conditions.
No interface at all, just ran through a lump of the most bare-bones upload flow on the command line!
Still braving the heat ...
Making a record here; tomorrow it can be paired with a nice interface and I'll keep debugging.
The gloomy mood of the last few days instantly improved by 10%.
1. Registration and related configuration:
Sign up for an Amazon account. If you often shop on Amazon,
you already have an Amazon retail account and can use it to sign in to AWS directly.
But how could a company grow that big without making you bleed a little?
Along the way it costs $1 for the card verification ...
After I entered the credit card information it deducted the $1 without asking anything further, and that one dollar scared the daylights out of me.
Sign in, go to Security Credentials, and create a new Access Key.
If you are using Linux or OS X, write the following into the ~/.aws/credentials file:
[default]
aws_access_key_id = your access key ID
aws_secret_access_key = your secret access key
If you use Windows, we don't want to be friends anymore ...
(For Windows users the file is C:\Users\USER_NAME\.aws\credentials ...)
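With that file in place the SDK finds the keys by itself, so no keys ever appear in the code. As a quick sanity check, something like the following should work; the us-east-1 region and the listBuckets call are just my own example, not part of the flow below:
var AWS = require('aws-sdk');
// no keys in the code: the SDK reads them from ~/.aws/credentials on its own
AWS.config.update({ region: 'us-east-1' }); // use whatever region you actually picked
var s3 = new AWS.S3();
// listBuckets is a cheap call that fails fast if the credentials are wrong
s3.listBuckets(function (err, data) {
  if (err) console.log('credentials problem =====> ', err);
  else console.log('it works, buckets: ', data.Buckets);
});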
2. Installing the dependencies
mkdir a directory, with a name like myaws.
Create package.json and write:
{
  "dependencies": {
    "aws-sdk": ">= 2.0.9",
    "node-uuid": ">= 1.4.1"
  }
}
These are the two dependencies we need.
Once it's saved, you can confidently go and npm install!
3. Test the upload on the command line
Create a new app.js; with Node, of course, the first step is to require:
var AWS = require('aws-sdk');
var uuid = require('node-uuid');
Then store your bucket name: var bucket_name = "testupload";
(You can use uuid v4 to append a random string to the bucket name; a later test showed it also passes without one.)
The bucket can be created by hand in the AWS console,
or from the JS code with createBucket (best to check first whether the bucket already exists; see the sketch below).
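There is no ready-made checkBucketExists call in the SDK; what I mean by checking first is something along these lines, using headBucket, which errors out when the bucket is missing. The names are the ones from above, and this is only a rough sketch:
var AWS = require('aws-sdk');
var uuid = require('node-uuid');
var s3 = new AWS.S3();

// optionally make the name unique: var bucket_name = "testupload-" + uuid.v4();
var bucket_name = "testupload";

// headBucket returns an error when the bucket does not exist (or is not visible to us)
s3.headBucket({ Bucket: bucket_name }, function (err) {
  if (err) {
    s3.createBucket({ Bucket: bucket_name }, function (err, data) {
      if (err) console.log("createBucket failed =====> ", err);
      else console.log("bucket created =====> ", data.Location);
    });
  } else {
    console.log("bucket already exists, nothing to do");
  }
});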
Then var s3 = new AWS.S3(); creates an S3 client.
Because time is short, today I only finish the upload from the command line, without any interface,
so the test is just writing the simplest possible TXT file:
var keyName = "download-me.txt";
var keyBody = "Thank you for downloading me!";
Now it really is time to go to bed properly:
...... The upload goes like this:
s3.createBucket({ Bucket: bucket_name }, function () {
  var params = {
    Bucket: bucket_name,
    Key: keyName,
    ACL: 'public-read',
    Body: keyBody
  };
  // (I just commented this one out, T_T) s3.putObject(params, function (err, data) { ... });
  // putObject can only send back an ETag, no Location.
  s3.upload(params, function (err, data) {
    if (err) {
      console.log("error! err =====> ", err);
    } else {
      var url = data['Location'];
      console.log("Successfully uploaded! URL =====> ", url);
    }
  });
});
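If nothing is misconfigured, saving this as app.js and running node app.js should print the public URL of download-me.txt in the terminal.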
One remark: because putObject's callback returned only the ETag and no Location, I replaced it with upload.
The ACL parameter sets the access permissions of the file; if you don't set it here, you have to find the corresponding bucket in the console and add a policy to it,
otherwise the uploaded file cannot be accessed.
Adding a policy is an incredibly abusive process, so I won't go into it today, to spare myself the nausea ...
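For the record, the same permission can also be granted from code instead of the console; this is only a rough sketch using the example bucket name testupload and the SDK's putBucketPolicy call, not the policy I actually clicked together:
var AWS = require('aws-sdk');
var s3 = new AWS.S3();

// a minimal "everyone may GET every object in this bucket" policy
var policy = {
  Version: "2012-10-17",
  Statement: [{
    Sid: "PublicReadGetObject",
    Effect: "Allow",
    Principal: "*",
    Action: "s3:GetObject",
    Resource: "arn:aws:s3:::testupload/*"
  }]
};

s3.putBucketPolicy({ Bucket: "testupload", Policy: JSON.stringify(policy) }, function (err) {
  if (err) console.log("putBucketPolicy failed =====> ", err);
  else console.log("public-read policy applied");
});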
4. Conclusion
This is the simplest, most bare-bones upload flow.
I'm so hungry ...
Using Node to complete the world's simplest version of an upload flow to AWS S3.