1. System Integration: concept, role, type, and development of system integration technology
2. Information System Construction: Information System Life Cycle, objectives of each stage, main work content, and information system development methods
3. Software Engineering: software requirement analysis and definition, software design, testing and maintenance, software reuse, and software development environment.
4. Object-oriented System Analysis and Design: basic object-oriented concepts, unified modeling language, visual modeling, object-oriented system analysis, and object-oriented system design.
5. Software Architecture: Software Architecture Definition, typical architecture, software architecture design method, software architecture analysis and evaluation, and software middleware.
6. Typical Application Integration Technology: database and data warehouse technology, Web Services technology, the J2EE architecture, the .NET architecture, workflow technology, components, their importance in the system, and common construction standards.
7. Computer Network Knowledge: network technology standards and protocols, Internet technology and applications, network classification, network management, network servers, network switching technology, network storage technology, wireless network technology, optical network technology, network access technology, structured cabling, IDC engineering, and network planning, design, and implementation.
Software engineering is a systematic method for developing, operating, maintaining, and repairing software.

Requirement Analysis
Software requirements include functional requirements, non-functional requirements, and design constraints.
1. Functional requirements: the functions the software system must provide to be useful to its users.
2. Non-functional requirements: the attributes or quality required by the product, such as reliability, performance, response time, fault tolerance, and scalability.
3. Design constraints: also known as restrictions or supplementary conventions, these are usually constraints on the solution. For example, an operating system with independent intellectual property rights must be used, or the system must run on Linux.
The key to requirement analysis lies in the research and understanding of problem domains.
To facilitate understanding of problem domains, the recommended practice of modern software engineering methods is to abstract the problem domain, break it into several basic elements, and then model the relationships between those elements. Requirement analysis then refines, analyzes, and carefully reviews the collected requirements to ensure that all stakeholders understand their meaning, and to identify errors, omissions, and other deficiencies.
The basic principle of software design is information hiding and module independence.
Factors that may change in the future are identified during the outline design, and these factors are placed inside individual modules when the system is divided into modules. That is, the implementation details of each module are hidden from other modules: the information contained in a module (both data and procedures) is inaccessible to modules that have no need for it.
Module independence means that each module in the software system involves only a specific function of the system, and its interfaces to other modules are simple. The concept of module independence is a direct result of modularity, abstraction, information hiding, and localization.
Two criteria are generally used to measure the module independence: module coupling and module cohesion. Coupling is a measurement of the relative independence between modules. Cohesion is a measurement of the module's functional strength. A module with strong independence should be highly cohesive and low coupling.
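The high-cohesion, low-coupling principle above can be illustrated with a minimal sketch (module and function names are hypothetical): the `SalesModule` class is cohesive because every member concerns sales records, and coupling stays low because other code depends only on its narrow `report()` interface, never on the hidden storage format.

```python
class SalesModule:
    """Highly cohesive: every method concerns one thing, sales records."""
    def __init__(self):
        self._records = []          # hidden detail: internal storage format

    def add_sale(self, amount):
        self._records.append(amount)

    def report(self):
        """The only interface other modules need: a total, not raw data."""
        return sum(self._records)

def print_summary(sales):
    # Low coupling: this module sees only the simple report() interface,
    # so SalesModule can change its internals without breaking the caller.
    return f"total sales: {sales.report()}"

sales = SalesModule()
sales.add_sale(100)
sales.add_sale(250)
print(print_summary(sales))   # total sales: 350
```

If `SalesModule` later switched from a list to a database, `print_summary` would not change, which is exactly what module independence buys.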
Software testing is one of the main means of ensuring software quality, and it is a step that must be completed before the software is delivered to the customer. Since the problem of proving software correctness has not been fundamentally solved, software testing remains the main means of discovering software errors and defects.
Software testing methods are generally divided into two types: dynamic testing and static testing. Dynamic testing detects errors by running the program and includes black-box testing, white-box testing, and gray-box testing. In static testing, the program is not run on a machine; instead, it is examined by manual inspection and computer-aided static analysis. The main manual methods of static analysis are desk checking, code review, and code walkthrough.
Based on the purpose of testing at different stages, testing can be divided into unit testing, integration testing, validation testing, system testing, and other types.
1. Unit testing: also known as module testing, it tests the correctness of the smallest unit of software design (the program module). Its purpose is to check whether each program unit correctly implements the functions, performance, interfaces, and design constraints in the detailed design specification, and to discover the module's internal errors.
2. Integration testing: also known as assembly testing or joint testing. It integrates the modules that have passed unit testing and mainly tests the collaboration between modules. The integration test plan is generally completed in the software outline design phase.
3. Validation testing: also known as effectiveness testing, it primarily verifies the functionality, performance, and other features of the software against user requirements. The validation test plan is generally completed in the requirement analysis phase.
4. System testing: the software is combined with the other system elements it depends on, such as hardware, peripherals, supporting software, and external data, and a series of integration and validation tests is performed on the complete computer system in its actual operating environment. The main content of system testing generally includes functional testing, robustness testing, performance testing, user interface testing, security testing, and installation and uninstallation testing. The system test plan is generally completed in the system analysis (requirement analysis) phase.
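As a concrete sketch of the unit-testing level described above (the function under test and all names are hypothetical), the standard `unittest` framework checks the smallest unit against both its normal behavior and a design constraint:

```python
import unittest

def divide(a, b):
    """Hypothetical unit under test: returns a / b, rejecting a zero divisor."""
    if b == 0:
        raise ValueError("divisor must be non-zero")
    return a / b

class DivideTest(unittest.TestCase):
    """Unit test for the smallest testable unit, the divide() function."""

    def test_normal_case(self):
        # Checks the function against its specified behavior.
        self.assertEqual(divide(10, 2), 5)

    def test_design_constraint(self):
        # The design constraint says a zero divisor must raise, not crash.
        with self.assertRaises(ValueError):
            divide(1, 0)
```

Running the file with `python -m unittest` executes both checks; a failure in either pinpoints an internal error of this one module before integration begins.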
In terms of software characteristics, software maintainability is mainly determined by three factors: comprehensibility, testability, and modifiability. Software maintenance can be divided into corrective maintenance, adaptive maintenance, preventive maintenance, and perfective maintenance.
1. Corrective maintenance: despite rigorous testing, there is no guarantee that the software is free of problems. As running time accumulates, data volumes grow, and application environments change, errors may still be exposed, so corrective maintenance is required.
2. With the continuous release of new computer hardware products and new operating system versions, adaptive maintenance is required for running software to adapt to different software and hardware environments.
3. Once users are familiar with the software, they propose improvement requirements, and perfective maintenance is carried out to meet them; it accounts for almost half of the maintenance workload. Examples include adjusting print formats, adding statistical criteria, and improving business processes.
A component is a software unit with complete semantics, correct syntax, and reuse value; it is a clearly identifiable constituent in software reuse, combining semantic description, communication interfaces, and implementation code. Simply put, a component is a program body with a definite function that can work independently or be assembled to work in coordination with other components. The use of a component is independent of its development and production.
The component model is a program-level description of the essential features of components. Many component models have been developed internationally, with different goals and functions: some are reference models, some are descriptive models, and others are implementation models. In recent years, three major camps have formed: OMG (Object Management Group)'s CORBA, Sun's EJB, and Microsoft's DCOM (Distributed Component Object Model).
Component Software Architecture
The Client/Server (C/S) software architecture developed from the unequal distribution and sharing of resources and was a mature technology by the 1990s. The C/S architecture defines how workstations connect to a server so that data and applications can be distributed across multiple processing machines. The C/S architecture consists of three parts: the database server, the client applications, and the network.
Compared with the two-tier C/S architecture, the three-tier C/S architecture adds an application server, allowing all application logic to reside on the application server, with only the presentation layer on the client. This structure is called a "thin client". The three-tier C/S architecture divides application functions into three parts: the presentation layer, the function layer, and the data layer. In the three-tier C/S architecture, middleware is the most important component.
In the three-tier C/S architecture, the presentation layer handles user input and output to the client (for efficiency, it may check validity before transmitting user input). The function layer establishes database connections, generates SQL statements to access the database based on user requests, and returns the results to the client. The data layer stores and retrieves the actual databases, responds to data-processing requests from the function layer, and returns the results to the function layer.
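The division of labor among the three layers can be sketched in a few toy functions (all names and the in-memory "database" are hypothetical); each layer talks only to the layer below it, mirroring the three-tier split:

```python
# Data layer: stores and retrieves the actual "database".
DATABASE = {"alice": 3, "bob": 7}

def data_layer_query(key):
    return DATABASE.get(key)

# Function layer: applies business rules, builds the query, returns results.
def function_layer_get_orders(user):
    if not user:                      # business rule lives in this layer
        raise ValueError("user required")
    return data_layer_query(user.lower())

# Presentation layer: validates user input and formats output for the client.
def presentation_layer(raw_input):
    user = raw_input.strip()          # validity check before transmission
    count = function_layer_get_orders(user)
    return f"{user} has {count} orders"

print(presentation_layer("  alice "))   # alice has 3 orders
```

In a real three-tier system the function layer would sit on the application server and issue SQL over middleware rather than call a local dictionary, but the dependency direction is the same.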
Browser/Server (B/S) is an implementation of the three-tier application described above; its concrete structure is browser/web server/database server. The B/S structure is based mainly on mature WWW browser technology combined with various browser scripting languages, using a general-purpose browser to deliver powerful functionality that previously required complex dedicated software, thereby also saving development costs.

Middleware
Middleware is software that sits between the operating system and applications in a distributed system environment. It is independent system software or a service program with which distributed application software shares resources across different technologies. Middleware sits on top of the operating systems of clients and servers and manages computing and network resources.
The task of middleware is to make application development easier: by providing uniform program abstractions, it hides the complexity of low-level programming in heterogeneous and distributed systems. Middleware can be classified in many ways; one common scheme has three layers: underlying middleware, general-purpose middleware, and integrated middleware.
1. Mainstream underlying middleware technologies include the Java Virtual Machine (JVM), the Common Language Runtime (CLR), Java Database Connectivity (JDBC), Open Database Connectivity (ODBC), and the Adaptive Communication Environment (ACE).
2. Mainstream general-purpose middleware technologies include the Common Object Request Broker Architecture (CORBA), J2EE, message-oriented middleware (MOM), and COM. Main products include IONA Orbix, BEA WebLogic, and IBM MQSeries.
3. Mainstream integrated middleware technologies include workflow and Enterprise Application Integration (EAI); representative products include BEA WebLogic and IBM WebSphere.
Basic concepts of Object-Oriented Methods
An object is an entity that describes a thing in the objective world; it is the basic unit from which systems are composed. An object-oriented software system is composed of objects, and complex objects are built from simpler objects. The three elements of an object are its identity, attributes, and services (operations).
A class is an abstract definition of an object. It is a combination of objects with the same data structure and operations. Class definition includes a set of data attributes and a combination of operations on data. Class definition can be considered as a template for objects with similar characteristics and common behavior, and can be used to generate objects.
Classes and objects stand in the relationship of abstract description to concrete instance: a concrete object is called an instance of its class. Instances can use all the features the class provides, and the state of an object is held in its instance variables.
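The class-versus-instance relationship above can be shown in a few lines (the `Account` class is a made-up example): the class defines the data structure and operations once, and each instance carries its own state in instance variables.

```python
class Account:                        # the class: abstract definition, data + operations
    def __init__(self, owner, balance=0):
        self.owner = owner            # object state lives in instance variables
        self.balance = balance

    def deposit(self, amount):        # an operation (service) defined once in the class
        self.balance += amount
        return self.balance

# Two distinct instances of the same class, each with independent state.
a = Account("alice")
b = Account("bob", 100)
a.deposit(50)

print(a.balance, b.balance)   # 50 100
```

Changing `a` leaves `b` untouched, which is exactly the instance-variable point: the template is shared, the state is not.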
Message communication is an important principle of the object-oriented methodology, closely tied to the principle of encapsulation. Encapsulation seals objects into independent units, each performing its own role without interfering with the others; message communication then provides the only valid channel through which they interact to form an organic system.
The Unified Modeling Language (UML) is a visual modeling language for systems.
UML 2.0 defines 13 kinds of diagrams, divided into static models, which represent the static structure of the system, and dynamic models, which represent its dynamic behavior.
1. Class diagram: shows a set of classes, interfaces, and collaborations and their relationships. The class diagram is the most common diagram created when modeling an object-oriented system. It gives the static design view of the system; a class diagram containing active classes gives the static process view of the system.
2. Object diagram: shows a set of objects and their relationships. An object diagram depicts a static snapshot of instances of the things declared in a class diagram. Like class diagrams, object diagrams give the static design view or static process view of the system, but from the perspective of real or prototypical cases.
3. Component diagram: shows the internal structure of an encapsulated class together with its interfaces, ports, and embedded components and connectors. It is a variant of the class diagram.
4. composite structure diagram: It can depict the internal structure of a structured class.
5. Use case diagram: shows a set of use cases and actors and the relationships between them.
6. Sequence diagram and communication diagram: both are interaction diagrams. An interaction diagram consists of a set of objects or roles and the messages that may be passed between them. The sequence diagram is the interaction diagram that emphasizes the time ordering of messages; the communication diagram emphasizes the structural organization of the objects or roles that send and receive messages.
7. State diagram: shows a state machine consisting of states, transitions, events, and activities. It is useful for modeling reactive systems.
8. Activity diagram: displays the process or other computing structures as a step-by-step control flow and data flow within the computing.
9. Deployment diagram: shows the configuration of the processing nodes at run time and the components that exist in the nodes.
10. Package diagram: shows the organizational units and their dependencies that are decomposed by the model itself.
11. Timing diagram: an interaction diagram that shows the actual times at which messages cross different objects or roles.
12. Interaction overview diagram: a hybrid of the activity diagram and the sequence diagram.
Typical Application Integration Technology
A data warehouse is a subject-oriented, integrated, non-volatile, time-variant collection of data used to support management decisions.
1. A data warehouse is subject-oriented, whereas traditional operational systems are organized around the applications of the company.
2. A data warehouse is integrated: data moves from application-oriented operational environments into the analysis-oriented data warehouse. Because different application systems are inconsistent in encoding, naming conventions, actual attributes, and attribute measurement, these inconsistencies must be eliminated as data enters the warehouse.
3. A data warehouse is non-volatile: its data is usually loaded and accessed in bulk, and data in the warehouse environment is not updated in the ordinary sense.
4. A data warehouse is time-variant: the time horizon of its data is much longer than that of an operational system. An operational database holds "current value" data whose accuracy is valid at the moment of access, and current values can be updated, whereas the data in a data warehouse is a series of snapshots taken at particular points in time. The key structure of operational data may contain time elements such as year, month, and day; the key structure of a data warehouse always contains a time element.
5. Based on the concept of multidimensional data, OLAP provides multidimensional analysis operations such as slicing, dicing, drill-down, roll-up, and pivoting (rotation).
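Three of the OLAP operations just listed can be demonstrated on a tiny made-up cube with dimensions (region, quarter) and a sales measure:

```python
# Toy fact data: (region, quarter, sales) tuples.
cube = [
    ("north", "Q1", 10), ("north", "Q2", 20),
    ("south", "Q1", 30), ("south", "Q2", 40),
]

def slice_(data, quarter):            # slice: fix one dimension at one value
    return [row for row in data if row[1] == quarter]

def dice(data, regions, quarters):    # dice: select a sub-cube on several dimensions
    return [r for r in data if r[0] in regions and r[1] in quarters]

def roll_up(data):                    # roll-up: aggregate the quarter dimension away
    totals = {}
    for region, _, sales in data:
        totals[region] = totals.get(region, 0) + sales
    return totals

print(slice_(cube, "Q1"))   # [('north', 'Q1', 10), ('south', 'Q1', 30)]
print(roll_up(cube))        # {'north': 30, 'south': 70}
```

Drill-down is simply the inverse of roll-up: moving back from the regional totals to the per-quarter rows.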
OLAP system architectures mainly include ROLAP (Relational OLAP), based on relational databases; MOLAP (Multidimensional OLAP), based on multidimensional databases; and HOLAP (Hybrid OLAP), based on hybrid data organization.
ROLAP is the OLAP implementation based on relational databases: it takes the relational database as its core and uses relational structures to represent and store multidimensional data. ROLAP maps the multidimensional structure of a multidimensional database onto two types of tables: fact tables, which store the measures and dimension keys, and dimension tables, at least one per dimension, which store information about the dimension such as its levels and member categories.
MOLAP is the OLAP implementation based on multidimensional data organization: it takes multidimensional data as its core and stores data in multidimensional arrays. MOLAP queries use a combination of index lookup and direct addressing, which is much faster than ROLAP's table index searches and table joins.
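The ROLAP star schema described above can be sketched with hypothetical in-memory tables: the fact table holds measures plus dimension keys, each dimension table describes one dimension, and a query is answered by joining them.

```python
# Dimension tables: one per dimension, keyed by a surrogate key.
dim_time = {1: {"year": 2023, "quarter": "Q1"},
            2: {"year": 2023, "quarter": "Q2"}}
dim_product = {10: {"name": "widget", "category": "tools"}}

# Fact table: (time_key, product_key, amount) rows, measures + dimension keys.
fact_sales = [
    (1, 10, 500),
    (2, 10, 700),
]

def sales_by_quarter():
    """Answer a multidimensional query with a relational-style join."""
    result = {}
    for time_key, product_key, amount in fact_sales:
        quarter = dim_time[time_key]["quarter"]   # join fact row to dimension table
        result[quarter] = result.get(quarter, 0) + amount
    return result

print(sales_by_quarter())   # {'Q1': 500, 'Q2': 700}
```

In a real ROLAP engine the same query would be a SQL join between the fact table and the time dimension table with a GROUP BY on quarter.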
Data mining is the process of extracting hidden, previously unknown, but potentially useful information and knowledge from large amounts of incomplete, noisy, fuzzy, and random data. Data mining can be divided into descriptive data mining and predictive data mining. Descriptive data mining includes data summarization, clustering, and association analysis; predictive data mining includes classification, regression, and time series analysis.
Web Services is a technology that allows applications to communicate with each other. Strictly speaking, a Web Service is an interface describing a set of operations, with the interface described in standard, canonical XML. The description includes all the details needed to interact with the service, including the message formats, the transport protocols, and the service location. The implementation details are hidden behind the interface, which exposes only a set of invocable operations that are independent of the hardware and software platform on which the service runs and of the programming language in which it is written. Web Services can be used independently or composed with other Web Services to implement complex business functions.
There are three roles in the Web Services model: the service provider (server) and the service requestor (client) are mandatory, while the service registry is an optional role. Their interactions and operations constitute the Web Services architecture. The service provider defines and implements the Web Service, then publishes the service description to the service requestor or the service registry. The service requestor uses a find operation to retrieve the service description locally or from the registry, then uses the description to bind to the service provider and invoke the Web Service.
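The publish-find-bind cycle among the three roles can be reduced to a toy sketch (the registry dictionary and the `echo` service are invented for illustration; a real system would use WSDL and UDDI rather than Python objects):

```python
registry = {}                             # service registry (the optional role)

def provider_echo(message):               # the provider's service implementation
    return f"echo: {message}"

# 1. Publish: the provider registers its service description.
registry["echo"] = {"operation": provider_echo, "format": "text"}

# 2. Find: the requestor retrieves the service description by name.
description = registry["echo"]

# 3. Bind and invoke: the requestor calls the service via the description.
result = description["operation"]("hello")
print(result)   # echo: hello
```

The requestor never sees how `provider_echo` is implemented, only the description it found, which is the point of the architecture.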
J2EE is an architecture that uses the Java 2 platform to simplify the development, deployment, and management of enterprise solutions. The foundation of J2EE is the Java 2 Platform, Standard Edition: J2EE not only consolidates many advantages of the Standard Edition but also adds comprehensive support for EJB, the Java Servlet API, JSP, and XML technologies. Its ultimate goal is an architecture that greatly shortens enterprises' time to market.
A multi-layer application can provide an independent layer for different services. The typical four-layer structure of J2EE is as follows:
1. Client layer components running on client machines
2. web layer components running on the J2EE Server
3. Business logic layer components running on the J2EE Server
4. enterprise information system layer software running on the EIS Server
The J2EE application components can be installed and deployed to several containers.
1. The EJB container manages the execution of all enterprise beans (EJBs) in J2EE applications. EJBs and their container run on the J2EE server.
2. The web container manages the execution of all JSP pages and servlet components in J2EE applications. The web components and their container run on the J2EE server.
3. The application client container manages the execution of all application client components in J2EE applications. Application clients and their container run on the client machine.
The applet container is a combination of Web browsers and Java components running on the client machine.
.NET Architecture
The .NET architecture includes four products:
1. .NET development tools: .NET languages (C# and VB.NET), an integrated IDE (Visual Studio .NET), class libraries, and the Common Language Runtime (CLR)
2. .NET dedicated servers: dedicated servers supporting data storage, e-mail, B2B e-commerce, and so on
3. .NET Web Services
A workflow is the automated operation of part or all of a business process: documents, information, or tasks are passed between participants and acted upon according to a set of procedural rules. Simply put, a workflow is a series of interconnected, automated business activities or tasks. The entire business process can be pictured as a river, and the workflow is the water flowing through it.
Workflow management is the automated coordination, control, and communication between people and computers: in a computer-supported business process, software running on the network keeps the execution of all commands under control. Under workflow management, workloads can be monitored and tasks distributed among different users to achieve balance.
The workflow management system defines, creates, and manages workflows through software. It runs on one or more workflow engines. These engines interpret the process definition, interact with workflow participants, and call other IT tools or applications as needed.
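The engine-plus-process-definition structure just described can be sketched minimally (the three-task document process and all names are hypothetical): the engine interprets the definition, routes the work item from task to task, and invokes each participant's handler in turn.

```python
# Process definition: an ordered list of (task name, participant handler).
process_definition = [
    ("draft",   lambda doc: doc + ["drafted"]),
    ("review",  lambda doc: doc + ["reviewed"]),
    ("approve", lambda doc: doc + ["approved"]),
]

def run_workflow(definition, work_item):
    """Workflow engine: interpret the definition and route the work item."""
    log = []
    for task_name, handler in definition:
        work_item = handler(work_item)    # the participant acts on the item
        log.append(task_name)             # the engine records the routing
    return work_item, log

doc, log = run_workflow(process_definition, [])
print(doc)   # ['drafted', 'reviewed', 'approved']
print(log)   # ['draft', 'review', 'approve']
```

A production workflow management system would add conditional routing, parallel branches, and worklists per user, but the separation between the process definition and the engine that interprets it is the same.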
Professional Information System Integration Technology