Continuous integration solution


Outline

 

Introduction: the predecessor of continuous integration

Before adopting continuous integration, many development teams used daily builds. Microsoft followed this practice for many years: whoever broke the build was responsible for monitoring subsequent builds until the next person who broke the build was found.

Why continuous integration?
For most projects, adopting continuous integration is a big step toward higher efficiency and higher quality. It gives teams that build large, complex systems a high degree of confidence and control. When a code commit introduces a problem, continuous integration provides fast feedback, ensuring that the software the team develops keeps working. Continuous integration focuses on the development team: the output of the continuous integration system usually becomes the input of the manual testing process and the subsequent release process. In the software release process, a great deal of waste comes from testing and O&M. For example, we often see:
  • Build and O&M team members waiting for instructions or defect fixes
  • Testers waiting for a "good" build
  • Fault reports arriving only some time after a new feature has been developed
  • Discovering, only when development is complete, that the current software architecture cannot meet some of the system's non-functional requirements
Continuous integration practices:

Deliver software in a more complete, end-to-end way. We can deploy a software version with one click, even into the production environment. This establishes a very effective feedback loop: because it is easy to deploy the application into a test environment, the team gets fast feedback on both the software's features and the deployment process. Because the deployment process (whether for testing or integration) is automated, it runs frequently and regularly and is itself tested, which reduces the risk of a release and the risk involved in handing knowledge about the deployment process over to the development team.

From a lean perspective, we have implemented a "pull system". The test team can deploy the system on its own simply by clicking a button, and managers can easily obtain key metrics such as cycle time, throughput, and code quality. Continuous integration includes:

  • Automation of software building, integration, testing, and deployment
  • Deployment pipelines at the team and organization level
  • Improved collaboration between developers, testers, and O&M personnel
  • Incremental, concurrent software development in large distributed teams
  • Efficient configuration management policies
  • Analysis and implementation of automated acceptance testing
  • Capacity testing and other non-functional tests
  • Continuous deployment and zero-downtime releases
  • Management of infrastructure, data, components, and dependencies
  • Risk management, compliance, and audit
1. Build

What is a build?
In modern software development, the term "build" comes up in teams of every size, from large teams following formal (ISO-style) processes to small and medium teams practicing Agile development and XP.


Build tools: MSBuild, Maven, Ant, NAnt, and Gradle

Unit tests are the cornerstone of the whole test pyramid (in construction, the cornerstone is the stone the building rests on). If the cornerstone is unstable, what is the point of building the Service and UI layers on top of it? Only with a rock-solid foundation can the superstructure be sound.
I wanted to explain this with the example of a Swiss watch, but a colleague said the car example is better. A car consists of many parts. Which of the following two options would you choose?

1. None of the individual parts has been tested. At the dealership, the salesperson tells you the car has just been assembled and has been running for a day, and invites you to try it.
2. Every individual part has been strictly tested during production. The salesperson tells you the car has passed national certification and factory qualification and comes with quality assurance, and invites you to try it.

MSBuild is the build system used by Microsoft and Visual Studio. It is not just a build tool; it is better described as an automation platform with powerful extensibility. MSBuild consists of three main parts: the execution engine, the build project, and tasks. The core is the execution engine, which defines the project file format, interprets the project file, and executes each "build action". The project file describes the build; in most cases, using MSBuild means following these rules and writing a project file. Each "build action" executed by the MSBuild engine is implemented by a task, and tasks are MSBuild's extension mechanism: by writing new tasks you can keep extending what MSBuild can do. These three parts therefore represent the engine, the script, and the extensibility of the platform.
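As a small illustration of the task extension mechanism described above (a minimal sketch; the HelloTask class and its Message property are made up for this example), a custom task is simply a class that derives from Microsoft.Build.Utilities.Task:

using Microsoft.Build.Framework;
using Microsoft.Build.Utilities;

// A minimal custom MSBuild task: the engine calls Execute() when the
// corresponding "build action" is reached in the project file.
public class HelloTask : Task
{
    // Value supplied from the project file, e.g. <HelloTask Message="..." />
    [Required]
    public string Message { get; set; }

    public override bool Execute()
    {
        Log.LogMessage(MessageImportance.High, "HelloTask says: {0}", Message);
        return true; // returning false would fail the build
    }
}

In a project file, such a task would typically be registered with a <UsingTask> element and then invoked from inside a <Target>.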
MSBuild introduction and use

2. Version Control

Configuration management: use version control
A version control system (source code management system) is a mechanism for keeping multiple versions of files. In general, open-source tools such as Subversion and Git meet the needs of most teams. Every version control system has to solve the same basic problem: how do you let users share information without accidentally interfering with each other?
Without the help of version control tools, we often run into the following problems during development:
1. Chaotic code management.
2. Code conflicts that are hard to resolve.
3. Deep bugs introduced during code integration.
4. No control over code ownership and permissions.
5. Difficulty releasing different project versions.

Versioning all content

Version control is not only for source code: every artifact related to the software being developed should be placed under version control, including source code, test code, database scripts, build and deployment scripts, documentation, and the configuration files used by web containers (such as the Tomcat configuration).

Ensure frequent submission of reliable code to the trunk

Frequently committing reliable, quality-assured code (successful compilation is the most basic requirement) makes it easy to roll back to the latest reliable version. Each commit can trigger a continuous integration build, so feedback arrives in a timely manner.

Submit meaningful commit messages

The reason team members are required to write meaningful commit messages, and even to associate commits with development tasks, is this: when the build fails, you know who broke it, can identify the likely cause, and can locate the defect. This extra information shortens the time needed to fix defects. Example: the team uses SVN together with Redmine.

Some enhancements

  • Git requires a message for every commit.
  • Keep test code and database scripts in separate branches.
  • Script the build.
3. Deployment

3.1 The most basic deployment pipeline

The starting point of the process is a developer committing code to the version control repository. The continuous integration system responds to the commit and triggers an instance of the pipeline.
The first stage compiles the code, runs the unit tests, performs code analysis, and creates a binary software package. If all unit tests pass and the code complies with the coding standard, the executable code is packaged and placed in an artifact repository. Some of these steps run at commit time, while others, such as preparing a database for the acceptance tests, are executed afterwards.
The second stage consists of long-running automated acceptance tests. The continuous integration server should therefore support splitting the tests into several groups so they can run concurrently across the build grid, which improves execution efficiency. This stage is triggered automatically once the first stage completes.
The deployment pipeline may branch, so that a build can be deployed independently to several different environments, such as the user acceptance environment, the capacity test environment, and the production environment. In each case, the automated deployment script is used to carry out the deployment to the corresponding environment. Testers should be able to see every build available for testing and its status.
Objective: to get feedback as fast as possible.

3.2 Deployment pipeline practices

  • Generate a binary package only once
    We call any set of executable code a binary package, such as a .NET assembly. Sometimes the code does not need to be compiled; in that case the "binary" is simply the collection of all files. A related anti-pattern is to always work from source code instead of from a binary package: every time a deployed version needs modifying, the source is manually exported from the production branch and recompiled into a new binary package. This can also introduce differences caused by compiler or dependency versions.

Visualization of deployment

4. Unit testing (TDD)

Test Type:

BDD, TDD, ATDD
BDD is mainly driven by scenarios and requirements, and ATDD is acceptance-oriented; we will not cover them in depth here.
Here we mainly introduce TDD. What is TDD?

Test-driven development

Advantages of TDD
TDD ensures code quality from the very beginning: developers are encouraged to write only the "minimal" code needed to make a specific test pass.
TDD encourages following the SOLID principles: single responsibility, open/closed, Liskov substitution, interface segregation, and dependency inversion.
TDD ensures a high degree of consistency between the code and the business requirements.
TDD encourages simpler, more focused APIs.
TDD encourages more communication with the business and within the team.
TDD helps remove redundant code.
TDD provides built-in regression testing.

Without unit tests to help identify and diagnose bugs, most developers fall back on the debugger, setting breakpoints wherever they think a bug might be, an approach sometimes called the "shotgun method".

Unit test characteristics (see the sketch after this list):

  • Isolated from other codes
  • Targeted
  • Isolated from other developers
  • Repeatable
  • Predictable
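As a small sketch of what "repeatable" and "predictable" mean in practice (the IClock, FixedClock, and AgeCalculator types here are hypothetical examples, not from the original text), injecting the clock removes the dependency on DateTime.Now:

using System;
using NUnit.Framework;

// Injecting the clock keeps the test isolated, repeatable, and predictable.
public interface IClock
{
    DateTime Today { get; }
}

public class FixedClock : IClock
{
    private readonly DateTime _today;
    public FixedClock(DateTime today) { _today = today; }
    public DateTime Today { get { return _today; } }
}

public class AgeCalculator
{
    private readonly IClock _clock;
    public AgeCalculator(IClock clock) { _clock = clock; }

    public int AgeInYears(DateTime birthDate)
    {
        int age = _clock.Today.Year - birthDate.Year;
        if (birthDate.Date > _clock.Today.AddYears(-age)) age--;
        return age;
    }
}

[TestFixture]
public class AgeCalculatorTests
{
    [Test]
    public void AgeIsComputedAgainstTheInjectedClock()
    {
        // The result never changes, no matter when or where the test runs.
        var calculator = new AgeCalculator(new FixedClock(new DateTime(2015, 1, 1)));
        Assert.AreEqual(25, calculator.AgeInYears(new DateTime(1990, 1, 1)));
    }
}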
Principles of Automated unit testing:

What is the key point of the commit-stage tests? To quickly catch the most common errors that changes introduce into the system and notify developers so they can fix them immediately. The feedback from the commit stage is only valuable if it arrives fast enough for the team to act on it efficiently.

  • Isolate UI operations
    The UI belongs at a higher level of the test pyramid: it takes a lot of time to prepare data, the business logic involved is complex, and moving to the UI stage too early easily dilutes the effort that should go into unit tests.
  • Isolate databases, file reads/writes, and network overhead
    In an automated test, writing results to the database and then verifying that they were written correctly is simple and easy to understand, but it is not efficient; that kind of verification belongs at the integration-test level.
    First, interacting with the database is slow, and the database itself may need maintenance. This gets in the way of fast tests, and developers cannot get timely, useful feedback. If it takes an hour to verify an interaction with the database, that is far too long a wait.
    Second, managing the data costs effort: production data may be at the terabyte level while the test environment holds only megabytes, and deciding how to filter it down to an appropriate size adds management cost (although DbUnit can solve some of these problems in integration tests).
    Finally, if a test cannot be completed without read/write operations, you should also question the testability of the code and whether refactoring is needed.
    Unit tests must never depend on databases, file systems, network calls, or other external dependencies.
  • Use an isolation (mocking) framework
    You can use a mocking toolset such as Rhino.Mocks, Moq, or TypeMock; our R&D team mainly follows Mockito-style practices. Compared with tests that have to assemble real dependencies and state, tests that use mocks usually run very fast, so developers get feedback quickly from the continuous integration platform after committing code (see the sketch after this list).
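As a rough sketch of this idea, using Moq (one of the frameworks mentioned above; the ICustomerRepository interface and the "VIP" value are made up for illustration), the test below never touches a real database:

using Moq;
using NUnit.Framework;

public interface ICustomerRepository
{
    string GetTitle(int customerId);
}

[TestFixture]
public class CustomerTitleTests
{
    [Test]
    public void GetTitle_ReturnsStubbedValue()
    {
        // The repository is replaced by a stub, so no database,
        // file system, or network is involved and the test runs fast.
        var repository = new Mock<ICustomerRepository>();
        repository.Setup(r => r.GetTitle(30)).Returns("VIP");

        Assert.AreEqual("VIP", repository.Object.GetTitle(30));
        repository.Verify(r => r.GetTitle(30), Times.Once());
    }
}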
Practice

General
Test an encryption and decryption method

[Test]
[Ignore("This test has a problem!")]
public void TestEncrypt()
{
    FileEncrypt fileEncrypt = new FileEncrypt();
    string encrypted = fileEncrypt.EncryptContext("Hello World", "123");
    string decrypted = fileEncrypt.DecryptContext(encrypted, "123");
    Assert.AreEqual("Hello World", decrypted);
}

Methods marked with the Ignore attribute are skipped when the tests are run.

Category

With the Category attribute you can run only the test cases in a specified category.
Rhino.Mocks is simple and practical.

[Test]
[Category("Simulated Object")]
public void TestCustomer()
{
    MockRepository mocks = new MockRepository();
    ICustomer customer = mocks.StrictMock<ICustomer>();
    Expect.Call(customer.ShowTitle("")).Return("567");
    Expect.Call(customer.Pid).Return(30);
    mocks.ReplayAll();
    Assert.AreEqual(customer.ShowTitle(""), "567");
}

[Test]
public void SayHelloWorld()
{
    MockRepository mocks = new MockRepository();
    INameSource nameSource = mocks.DynamicMock<INameSource>();
    Expect.Call(nameSource.CreateName(null, null))
        .IgnoreArguments()
        .Do(new NameSourceDelegate(Formal));
    mocks.ReplayAll();
    const string expected = "Hi, my name is Ayende Rahien";
    string actual = new Speaker("Ayende", "Rahien", nameSource).Introduce();
    Assert.AreEqual(expected, actual);
}

// Helper used by the Do() callback above to build the name.
private static string Formal(string first, string surname)
{
    return first + " " + surname;
}

delegate string NameSourceDelegate(string first, string surname);

public class Speaker
{
    private readonly string _firstName;
    private readonly string _surname;
    private readonly INameSource _nameSource;

    public Speaker(string firstName, string surname, INameSource nameSource)
    {
        _firstName = firstName;
        _surname = surname;
        _nameSource = nameSource;
    }

    public string Introduce()
    {
        string name = _nameSource.CreateName(_firstName, _surname);
        return string.Format("Hi, my name is {0}", name);
    }
}
Red, green, refactor

The "red, green, refactor" cycle describes the workflow developers should follow when practicing TDD.

  • Red stage
    When starting with TDD, many developers ask: "How can I write a test for code that does not exist?" In fact, many tests target classes or methods that do not yet exist, which means those tests do not even compile; the effect is essentially the same as a failing test. That is fine. Remember, the corresponding code will come into existence because of these tests.
    The red stage mainly involves writing code that makes the test fail, for example:
    [Category("simple")]
    [Test]
    public bool MyMethod(int inputParameter)
    {
        throw new NotImplementedException();
    }
  • Green stage
    Write just enough code to make the new test pass without breaking any existing tests:
    [Category("simple")]
    [Test]
    public bool MyMethod(int inputParameter)
    {
        return false;
    }
    Some people feel this is too little code and that the method should at least express its intent, for example:
    [Category("simple")]
    [Test]
    public bool MyMethod(int inputParameter)
    {
        if (inputParameter == 60)
        {
            return false;
        }
        return true;
    }
    But if our expectation is simply that the method returns false when the input is 60, then returning false is enough. Tests added later will force us to expand the method. The principle is not to introduce unnecessary complexity into the code.
  • Refactoring stage:
    Refactoring makes our code stronger, improving maintainability, readability, and overall code quality.
    Refactoring example:
    We will illustrate this with a simple tic-tac-toe game.
    Rule: the winner is the player (X or O) with three marks in a row; if no player wins, an empty character is returned.
    We create a test method, for example:

    private IGameWinnerService _gameWinnerService;
    private char[,] _gameBoard;

    [SetUp]
    public void SetupUnitTests()
    {
        _gameWinnerService = new GameWinnerService();
        _gameBoard = new char[3, 3]
        {
            {' ', ' ', ' '},
            {' ', ' ', ' '},
            {' ', ' ', ' '}
        };
    }

    [Test]
    public void NeitherPlayerHasThreeInARow()
    {
        const char expected = ' ';
        var actual = _gameWinnerService.Validate(_gameBoard);
        Assert.AreEqual(expected, actual);
    }

    We pass in an empty board, meaning neither player has three marks in a row. Our expectation is that an empty character is returned, i.e., nobody has won.
    However, the code above does not compile, because we have not yet defined the interface IGameWinnerService:

    public interface IGameWinnerService
    {
        char Validate(char[,] gameBoard);
    }

    It still fails, because we have not defined the implementing class, GameWinnerService:

    public class GameWinnerService : IGameWinnerService
    {
        public char Validate(char[,] gameBoard)
        {
            throw new NotImplementedException();
        }
    }

    Now the compilation succeeds, but the test fails at run time. We can then improve the method:

    public class GameWinnerService : IGameWinnerService
    {
        private const char SymbolForNoWinner = ' ';

        public char Validate(char[,] gameBoard)
        {
            return SymbolForNoWinner;
        }
    }
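    At this point the red-green-refactor cycle simply repeats: the next failing test drives the next bit of behavior. As a sketch of a possible next step (this test is an illustration, not part of the original example), we could assert that a player with three marks in the top row wins, which fails until Validate is extended accordingly:
    [Test]
    public void PlayerWithThreeInTopRowWins()
    {
        const char expected = 'X';
        // Fill the top row with the same mark.
        for (var column = 0; column < 3; column++)
        {
            _gameBoard[0, column] = expected;
        }
        var actual = _gameWinnerService.Validate(_gameBoard);
        Assert.AreEqual(expected, actual);
    }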
5. Documentation of architecture and code

1. What are the documents?

Development documentation, test documentation, requirement documentation, user manuals, technical manuals, etc.

2. Simplified document writing

  • Development documentation: GhostDoc has a paid version and a free version; the free version is usually enough.
    GhostDoc can apply rules to comments according to policies and supports custom macros.
    Documentation is committed to the version library together with the code.
  • Requirement document:

  • Export Test Cases

3. Document Integration

Knowledge Base:
Wiki:
News:

  • Current Method
    OneNote + SharePoint + TFS + Project for collaborative office work
    OneNote: user manuals, technical manuals
    SharePoint: requirement documents, file-centric
    TFS: development documents and use cases (requirement cases and test cases)
    Project: project management, multi-dimensional views
    Examples:
    SharePoint + TFS integration
    SharePoint + OneNote integration; a OneNote change triggers a SharePoint email reminder
    Study the project requirements management features of Project 2010 + TFS 2010
    TFS-Project Server integration: synchronization process overview
4. Continuous document updates

Many projects neglect documentation, mainly because once the team grows large it is hard to guarantee that comments and documents are kept up to date whenever a function or method is modified.

5. Automatic document publishing

OneNote

6. Convenient document retrieval

Generate HTML documentation for API functions and methods.
Support a table of contents, an index, and search.

Example:

News and Vico search
7. Document permission control

8. Requirement change management

Instance:

  • Requirement change management
    Requirement changes must be recorded.
    Historical records should be queryable and the results clearly visible.

    Project follow-up
  • Emails can be sent to the relevant personnel.
  • Requirement change review
    Unreviewed new requirements are not visible to developers and testers; only approved requirements can be viewed.
8. Practice

6. Naming Conventions:
  • Extended Design
  • Error definition:
  • Type Design Specifications
  • Member design specifications
  • Framework Design Specifications
7. Database scalability

Database sharding can be used to spread the load across different databases. This approach suits systems whose data volume is exploding: performance and storage problems can be solved roughly linearly just by adding hardware. If fast future growth is expected, tables can additionally be split within each shard, and on top of sharding the data you can also shard by function or by other criteria.
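As a rough sketch of one common sharding approach, hash-based routing on a shard key (the ShardRouter class and its connection strings are hypothetical examples, not a specific product's API):

using System;

// Routes a customer to one of several databases by hashing the shard key.
public class ShardRouter
{
    private readonly string[] _connectionStrings;

    public ShardRouter(string[] connectionStrings)
    {
        if (connectionStrings == null || connectionStrings.Length == 0)
            throw new ArgumentException("At least one shard is required.");
        _connectionStrings = connectionStrings;
    }

    // Adding shards adds capacity, but note that changing the shard count
    // re-maps existing keys unless consistent hashing or re-balancing is used.
    public string GetConnectionStringFor(long customerId)
    {
        int shardIndex = (int)(Math.Abs(customerId) % _connectionStrings.Length);
        return _connectionStrings[shardIndex];
    }
}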

Database sharding principles:

8. Automation

The founder of the Scrum software development process once said:
If a process can be fully defined (that is, you understand all the details involved, so it can be designed to run repeatedly and its results safely predicted), it is called a "defined process". In theory, any defined process can be automated. On the other hand, if people do not understand all the details of a process and only know that, given certain initial conditions, the desired result can be reached through adjustment and control, such a process is called an "empirical process".

We can use build tools to automate work according to predefined settings and report errors to the relevant developers, reducing manual workload. Automated builds also protect the project: it is always clear who broke the build. Every time the project's core code is modified, the whole application is rebuilt and the regression tests run automatically, ensuring that the change has not broken anything.

  • Automated scripts
  • Hook
9. Feedback

Common continuous integration platforms: Jenkins and TeamCity
Feedback platforms: Jenkins & SonarQube, TFS & Project & SharePoint, TeamCity & NotCover

Objective: to discover potential bugs and code smells in the project, so as to improve code quality.

A brief introduction to SonarQube

Sonar is an open-source platform for code quality management. It manages source code quality and can inspect it along seven dimensions.
It supports code quality management and inspection for more than 20 programming languages, including Java, C#, C/C++, PL/SQL, Cobol, JavaScript, and Groovy.
Why Sonar?

Developers' seven deadly sins
1. Poorly distributed complexity
Files, classes, and methods that are too complex are hard to change and hard to understand. The more complex the code, the more error-prone it is; and without automated unit tests, changing any component of the program may require a full regression test.

2. Duplication
A program that contains a lot of copy-and-paste code is obviously of low quality.
Sonar can show where serious duplication exists in the source code.

3. Lack of unit tests
Sonar makes it easy to calculate and display unit test coverage.

4. Code coverage
5. Coding standards
Sonar checks coding standards through rule-detection tools such as PMD, CheckStyle, and FindBugs.
Sonar also supports multiple languages, such as C#, Java, Python, and Android.
6. Potentially problematic code
7. Design, comments, and architecture

8. Comparison
Projects can be compared with each other based on metrics.


9. Scalability:
Sonar integration is implemented through plug-ins, which allow it to integrate with other tools.

Jenkins

Jenkins, formerly called Hudson, is a Java-based continuous integration tool used to monitor recurring work, including:
1. Continuous software release and test projects.
2. Monitoring jobs executed by external calls.

 
 
  • Backup and recovery
    Backup and recovery are very simple; just copy the Jenkins directory:
    All the settings, build logs, and artifact archives are stored under the JENKINS_HOME directory. Simply archive this directory to make a backup. Similarly, restoring the data is just a matter of replacing the contents of the JENKINS_HOME directory from a backup.
  • Automatically running builds
    There are three ways to trigger a build:
    Builds in Jenkins can be triggered periodically (on a schedule specified in the configuration); the schedule syntax is the common Unix cron syntax.
    Builds can also be triggered when source changes are detected in the project.
    You can configure Jenkins to poll SVN or TFS for changes at regular intervals, or trigger the check manually at http://YOURHOST/jenkins/job/PROJECTNAME/polling. You can also have the version control system notify Jenkins from a post-commit hook, which is especially useful when polling for changes takes a long time.
  • Distributing builds to multiple slave nodes for performance reasons
    From the Jenkins administration interface you can easily add nodes. When configuring a node, you must provide the machine it runs on, the login username and password, and the working directory to use.
    The slave does not need its own Jenkins installation; Jenkins starts the slave agent automatically and detects the build tools on the remote machine.
    Note that build results and artifacts always end up on the master server, so you do not need to log in to each node to view the files and logs a build produced.
    A local workspace is created on the slave node and used at build time; after all, since the build runs on the slave node, that node must have everything the build needs.
  • Creating a project
    Because Jenkins can be used to run all kinds of CI, test, and batch-processing tasks, these tasks are collectively referred to as "free-style software projects" in Jenkins.

Currently: a low-cost integration solution
Jenkins + MSTest + MSBuild
10. Practice

Tools: TFS + Jenkins + Sonar
TFS as the core of version control
Jenkins as the key to automated builds
Sonar as the executor of static analysis and the center of code-quality feedback
NUnit (recommended) or MSTest as the unit testing tool
Dependency management
Tool: NuGet
