Develop NodeJS applications based on Docker



The Node Application

The application consists of a package.json, a server.js, and a .gitignore file, all of which are straightforward to write.

.gitignore

node_modules/*

package.json

{ "name": "docker-dev", "version": "0.1.0", "description": "Docker Dev", "dependencies": {  "connect-redis": "~1.4.5",  "express": "~3.3.3",  "hiredis": "~0.1.15",  "redis": "~0.8.4" }}

server.js

var express = require('express'),
    app = express(),
    redis = require('redis'),
    RedisStore = require('connect-redis')(express),
    server = require('http').createServer(app);

app.configure(function() {
  app.use(express.cookieParser('keyboard-cat'));
  app.use(express.session({
    store: new RedisStore({
      host: process.env.REDIS_HOST || 'localhost',
      port: process.env.REDIS_PORT || 6379,
      db: process.env.REDIS_DB || 0
    }),
    cookie: {
      expires: false,
      maxAge: 30 * 24 * 60 * 60 * 1000
    }
  }));
});

app.get('/', function(req, res) {
  res.json({
    status: "ok"
  });
});

var port = process.env.HTTP_PORT || 3000;
server.listen(port);
console.log('Listening on port ' + port);

server.js pulls in the dependencies and starts the application. The application stores session information in Redis and exposes a single request endpoint that returns a JSON status message. This is all fairly standard.

One thing to note is that the Redis connection details can be overridden with environment variables; this will be useful later when moving from development (dev) to production (prod).
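For example, a production container could be pointed at an external Redis instance purely through docker's -e flag. A minimal sketch, where redis.internal is only a placeholder host name and sqldump/docker-dev:0.1 is the image we build below:

# Hypothetical production-style run: point the app at an external Redis via environment variables
docker run -p 3000:3000 \
           -e REDIS_HOST=redis.internal \
           -e REDIS_PORT=6379 \
           -e REDIS_DB=0 \
           sqldump/docker-dev:0.1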

Dockerfile

For development purposes, we will run Redis and Node in the same container, so we will use the following Dockerfile to configure it.

Dockerfile

FROM dockerfile/ubuntu
MAINTAINER Abhinav Ajgaonkar <abhinav316@gmail.com>

# Install Redis
RUN \
  apt-get -y -qq install python redis-server

# Install Node
RUN \
  cd /opt && \
  wget http://nodejs.org/dist/v0.10.28/node-v0.10.28-linux-x64.tar.gz && \
  tar -xzf node-v0.10.28-linux-x64.tar.gz && \
  mv node-v0.10.28-linux-x64 node && \
  cd /usr/local/bin && \
  ln -s /opt/node/bin/* . && \
  rm -f /opt/node-v0.10.28-linux-x64.tar.gz

# Set the working directory
WORKDIR /src

CMD ["/bin/bash"]

Let's go through it line by line.

FROM dockerfile/ubuntu
This tells Docker to use the dockerfile/ubuntu image provided by Docker Inc. as the base image for the build.

RUN \
  apt-get -y -qq install python redis-server

The base image contains almost nothing, so we use apt-get to install everything the application needs to run. This statement installs python and redis-server. Redis is required because we will store session information in it, and python is needed so that npm can build the C extension required by the hiredis node module.

RUN \
  cd /opt && \
  wget http://nodejs.org/dist/v0.10.28/node-v0.10.28-linux-x64.tar.gz && \
  tar -xzf node-v0.10.28-linux-x64.tar.gz && \
  mv node-v0.10.28-linux-x64 node && \
  cd /usr/local/bin && \
  ln -s /opt/node/bin/* . && \
  rm -f /opt/node-v0.10.28-linux-x64.tar.gz

This downloads and extracts the 64-bit NodeJS binaries and links them into /usr/local/bin.

WORKDIR /src

This statement tells Docker to cd into /src once the container has started, before executing what the CMD instruction specifies.

CMD ["/bin/bash"]

As the last step, run /bin/bash.

Build and run containers

Now that the Dockerfile is ready, let's build a Docker image:

docker build -t sqldump/docker-dev:0.1 .

Once the image is built, we can run a container with the following command:

docker run -i -t --rm \
    -p 3000:3000 \
    -v `pwd`:/src \
    sqldump/docker-dev:0.1

Let's take a look at what is happening in the docker run command.

-i starts the container in interactive mode (in contrast to -d, detached mode). This means the container will exit once the interactive session ends.

-t allocates a pseudo-tty.

--rm removes the container and its filesystem on exit.

-p 3000:3000 forwards port 3000 on the host to port 3000 in the container.

-v `pwd`:/src mounts the current working directory on the host (i.e., our project files) to /src inside the container. Because the source is mounted rather than baked into the image with the ADD command in the Dockerfile, any modifications made in a text editor show up in the container immediately (a production-style alternative using ADD is sketched after this walkthrough).

sqldump/docker-dev:0.1 is the name and tag of the Docker image to run. It is the same name and tag we used when building the image.
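For contrast, a production-oriented image would usually bake the source into the image at build time instead of relying on a volume mount. A minimal sketch of what that line in the Dockerfile might look like:

# Copy the project files into the image at build time (instead of mounting them with -v)
ADD . /src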

Because the Dockerfile specifies CMD ["/bin/bash"], we are dropped into a bash shell inside the container once it starts.
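Once inside the container's shell, an optional sanity check confirms that the Dockerfile installed what we expect (the exact output depends on the versions installed):

# Inside the container: verify the tools the Dockerfile installed
node -v
redis-server --version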

Start Development

Now that the container is running, we need to sort out a few standard, non-Docker-related things before writing code. First, start the Redis server inside the container:

service redis-server start
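Optionally, confirm that Redis is up before moving on (assuming redis-cli was installed alongside redis-server):

# Should print PONG if the Redis server started correctly
redis-cli ping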

Then, install the project dependencies and nodemon. nodemon watches the project files for changes and restarts the server when needed.

npm install
npm install -g nodemon

Finally, run the following command to start the server:

nodemon server.js

Now, if you navigate to http://localhost:3000 in your browser, you should see the JSON status response returned by server.js.
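Equivalently, from a terminal on the host, given the port mapping used above, a quick check looks roughly like this:

# Prints the JSON status, e.g. {"status":"ok"}
curl http://localhost:3000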

To simulate the development process, let's add another endpoint to server.js:

app.get('/hello/:name', function(req, res) {
  res.json({
    hello: req.params.name
  });
});

You will see that nodemon detects the modification and restarts the server.

Now, if you point the browser at http://localhost:3000/hello/world, you will see a response from the new endpoint.
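Again from the host, the new endpoint should echo the route parameter back, roughly:

# Prints the echoed parameter, e.g. {"hello":"world"}
curl http://localhost:3000/hello/world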

Production Environment

The container in its current state is far from production-ready. Redis data will not persist across container restarts: if you restart the container, all session data is wiped out, and the same happens when you destroy the container and start a new one. Obviously this is not what you want. I will discuss this in the second part, which covers productionizing the setup.


Why we migrated from NodeJS to Ruby on Rails

This only describes the reasoning behind our decision. Both frameworks are excellent and well designed, which is why some of our modules still run on NodeJS. I am a big fan of NodeJS; I think it is a very exciting technology and I believe it will become more and more popular. I genuinely appreciate it, even though we recently migrated our Targeter App from NodeJS to Ruby on Rails.

The reasons we built it on NodeJS in the first place were simple: I had a package that let us bring the application online quickly (we spent 54 hours doing so), I use JavaScript much more often than Ruby, and because our stack involves MongoDB, my skills made the most sense in a NodeJS environment. However, as the application grew, I realized that NodeJS was the wrong choice for it. Here is why.

NodeJS is well suited to applications that handle a large number of short-lived requests. For traditional CRUD applications it works, but it is not ideal. PHP, Ruby, and Python have mature, optimized frameworks for this kind of application. The idea that everything in NodeJS executes asynchronously gains nothing for CRUD applications, and the popular frameworks in other languages provide excellent caching and cover all of these needs, including asynchronous execution.

NodeJS is a very young framework and its ecosystem of libraries is not yet mature. I mean no offense to the contributors; they are excellent and have built many fine libraries. But most libraries still need to mature, and NodeJS's rapidly changing environment means every version upgrade brings a lot of changes. When you build on cutting-edge technology, you have to keep up with the latest version as quickly as possible, which is a lot of trouble for a startup.

Another reason is testing. The test frameworks for NodeJS are not bad, but they lag behind those available on Django or RoR. For an application with a large amount of code committed every day and a release every day or two, it is critical that nothing breaks; otherwise your hard work goes to waste, and nobody wants to spend a day fixing trivial bugs.

Finally, we needed something that could cache everything and do it as quickly as possible. Although our application keeps growing and gets tens of thousands of hits per second, it will never face an enormous number of concurrent requests; this is not a chat program. The peak load of the main application, in requests per second, is nothing for Ruby on Rails and Nginx.

If you are still reading, you have heard everything I wanted to say, but you may still want to know where we continue to use NodeJS. Our application consists of two parts: the user-facing interface, and the report management and logging section. The latter is an ideal use case for NodeJS: a large number of short-lived requests. These actions need to complete as quickly as possible, even before our data push has finished. This matters because while a request is still executing, the browser keeps waiting for the response, which hurts the user experience. NodeJS's asynchronous nature saves us here: the data is either stored in the database or in processing, and once the request returns, the browser can move on to other important work.
