DB2 Magazine (Chinese Edition): Support for Real-Time Analysis


In a competitive environment, companies need a thorough understanding of their own processes and those of their partners. Knowledge of these processes is the key to becoming a leading enterprise.

However, most companies are still struggling with performance challenges and inefficient processes. Companies that clearly understand their process needs and can implement those requirements in real time gain a huge advantage. When a value chain uses information effectively, it can respond quickly to business changes and support competitive customer service.

Initiatives such as the Real-Time Enterprise (RTE) and Business Process Management (BPM) support this flow of information. RTE and BPM are strong evidence that batch-oriented, static, and disparate business processes can hold back the modern enterprise.

Using real-time analysis for optimization

Operational transformation requires the integration of streamlined processes and automation technologies. Increasing the speed of the various systems reduces the delay between action and reaction, thereby increasing efficiency. Replacing repetitive processes and decision loops that require human intervention with automated responses reduces costs, makes actions more consistent, and reduces errors. Streamlined processes and IT systems help the enterprise gain competitive advantage.

There are two core issues in optimizing the business environment through real-time analysis: first, understanding what real time means for the enterprise; and second, determining how to improve decision making in real time.

Defining real time. Real-time requirements arise from a variety of business needs, many of them customer-facing. For example, a bank teller needs to know that the customer standing at the counter has just finished a customer service call, and he or she needs to know the nature of the call and its outcome. Sometimes, however, real-time data is confusing or outright incorrect. Some types of analysis, such as insurance case studies, require a dataset that remains static for a specific period. Therefore, before performing real-time analysis, you should determine which data must be real-time, which data can be real-time, and which data must remain temporarily static.

When it comes to the analysis environment, the definition of real time is harder to pin down. In a traditional transactional environment, real time means that you press a button and the system responds immediately. In an analysis environment, however, real time rarely means instantaneous. In the context of analysis systems, "real time" is often used to mean simply that data is loaded more frequently than the nightly batch job.

Of course, just because an application area requires real-time data does not mean that the operational systems can support such a requirement. If the business needs the data within a minute, but the data does not arrive for five minutes, a decision has to be made: change the operational system, or change the requirement? Legacy systems (systems that are 5, 10, 20, or even 40 years old) are the biggest obstacle to operational transformation.
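As a concrete illustration of this sorting exercise, here is a minimal Python sketch of a data-freshness inventory: each element records the latency the business asks for, the latency the operational system can actually deliver, and whether the data must stay static for analysis. The element names and latency figures are hypothetical, not drawn from the article.

```python
from dataclasses import dataclass

@dataclass
class DataElement:
    name: str
    required_latency_min: float    # latency the business asks for, in minutes
    achievable_latency_min: float  # latency the operational system can deliver
    must_be_static: bool = False   # e.g., insurance case studies need a frozen snapshot

# Hypothetical inventory; names and numbers are illustrative only.
inventory = [
    DataElement("customer_service_calls", 1, 5),
    DataElement("account_balances", 15, 5),
    DataElement("insurance_case_snapshot", 24 * 60, 60, must_be_static=True),
]

for e in inventory:
    if e.must_be_static:
        print(f"{e.name}: keep as a static snapshot for the analysis period")
    elif e.achievable_latency_min > e.required_latency_min:
        # The gap forces the choice described above:
        # change the operational system or change the requirement.
        gap = e.achievable_latency_min - e.required_latency_min
        print(f"{e.name}: {gap} min short of the requirement "
              f"- change the source system or relax the requirement")
    else:
        print(f"{e.name}: the operational system already meets the requirement")
```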

Understanding the current operational environment and how it can best support analysis is a prerequisite for real-time analysis.

Better decisions.

When building a system that supports better decisions, the BI architect must answer two top-level questions:

What is the decision-making process?

How does the BI environment make this process better?

Once the architect understands the decision-making process, he or she needs to answer two secondary questions:

Which prominent, repeatable decision-making processes are currently in use?

What techniques and technologies can I implement in my BI architecture to support those decision-making processes?

Using Jens Rasmussen's original decision ladder, you can identify the distinct steps taken to reach a decision and determine which repeatable steps can be automated and which must rely on human intervention. Figure 1 shows the eight steps of the decision process defined by Rasmussen.

Figure 1. Decision making process
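Because Figure 1 is not reproduced here, the short sketch below simply enumerates the eight steps of the ladder using the names most often cited in the literature, from activation through execution. Treat the exact labels as an assumption; the original figure may word them differently.

```python
from enum import Enum, auto

class LadderStep(Enum):
    """Rasmussen's decision ladder, using commonly cited step names (an assumption;
    Figure 1 in the original article may label them differently)."""
    ACTIVATION = auto()             # an event is detected and the process starts
    OBSERVATION = auto()            # data about the event is gathered
    IDENTIFICATION = auto()         # the current state of the system is identified
    INTERPRETATION = auto()         # the consequences of that state are interpreted
    EVALUATION = auto()             # options are weighed against goals
    TASK_DEFINITION = auto()        # the task needed to reach the chosen goal is defined
    PROCEDURE_FORMULATION = auto()  # the concrete procedure is formulated
    EXECUTION = auto()              # the procedure is carried out

for step in LadderStep:
    print(f"{step.value}. {step.name.replace('_', ' ').title()}")
```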

Figure 2 shows the process that occurs in most data warehouse and BI environments: all decisions are made by people. The computer system's activity is limited to its more traditional role, detecting an event and activating the process. All the other stages are performed by the people involved. Although this largely manual approach may be effective in some cases, it is not suited to real-time analysis: once human intervention is required, real time becomes meaningless.

Figure 3, on the other hand, shows a new solution in which the computer system is used more extensively. For repetitive events that it recognizes, the computer system can initiate an action on its own. In other words, the computer system can make a decision and carry out an action without human intervention. For unrecognized events, the computer system produces a report that requires human intervention. In this case, the combination of computer systems and human intervention makes real-time analysis possible.

Figure 2. Decision making process that is not suitable for real-time analysis

Figure 3. Decisions in real-time solutions
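The contrast between Figures 2 and 3 can be reduced to a simple routing rule: events the system recognizes as repetitive are decided and acted on by the machine, while everything else is reported for human intervention. The Python sketch below is one way to express that rule; the event types and automated responses are hypothetical.

```python
# Hypothetical automated responses for recognized, repetitive events (the Figure 3 path).
AUTOMATED_RESPONSES = {
    "duplicate_payment": lambda event: f"reverse payment {event['id']}",
    "address_change":    lambda event: f"update customer record {event['id']}",
}

def handle_event(event: dict) -> str:
    """Automate recognized repetitive events; escalate everything else to a person."""
    respond = AUTOMATED_RESPONSES.get(event["type"])
    if respond is not None:
        # Recognized event: the system decides and acts without human intervention.
        return f"automated: {respond(event)}"
    # Unrecognized event: produce a report and hand it off for human intervention.
    return f"escalated to analyst: review event {event['id']} of type {event['type']}"

print(handle_event({"type": "duplicate_payment", "id": 42}))
print(handle_event({"type": "unusual_trade_pattern", "id": 43}))
```

Running the sketch shows the recognized duplicate-payment event handled automatically, while the unrecognized event is escalated for review, which is exactly the division of labor that makes real-time analysis workable.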

Consider a modern nuclear reactor, a complex system that requires sophisticated controllers to ensure high performance and safe operation. Given the growing demands for reliability, low environmental impact, and high performance, and given the complexity and uncertainty associated with the scale and scope of the work (both data and analysis), autonomous, machine-based intelligent control becomes an inevitable choice. Most researchers believe that while the design and operation of nuclear reactors involve important human cognition (thinking, learning, and adaptation), some processes are better suited to machine-intelligence components (expert systems, fuzzy logic, neural networks, and genetic algorithms). In other words, because the reactor environment is so complex and the analytical work required to run it is so onerous, it is better for the computer to handle most of the decision-making process.

The combination of conventional control systems and machine intelligence components allows for higher performance in reactor start-up, emergency shutdown, fault detection and diagnosis, and alarm processing and diagnostics.
