"Computer, what should I do next?" ”
When I was a child, I wanted to be an astronaut. Life was simpler then, but so was I: the value of pi I calculated was "about 3," no more accurate than the estimates of late Bronze Age cultures. In other words, you would not have wanted me plotting the orbit of the next Mars rover. But once I had a calculator, I could check my math homework every time, my estimates of a circle became far more accurate, and the prospects of future astronauts improved accordingly.
As I grew older, the problems got harder, and my expectations that technology would help me manage the complexity rose with them. I learned to generate and measure data, but turning that information into a decision about what to do next still takes a lot of work. All the activity of my systems, and their interactions with the Internet, generates plenty of data; the complex task of making sense of that data, however, is left entirely to me. It demands the ability to make the right decision, meaning a decision that (in hindsight) turns out to be the right answer. Yet I am never quite sure I made the right decision; I simply chose the path that matched the most data points.
If my software could at least help me make a good decision about the next step, that would be great!
The power of analysis and optimization
IBM business analytics and optimization software gives you the power of scientific modeling, statistical analysis, and optimization (among many other analytics tools) to help you solve practical problems. The software is designed to draw insights and implications from massive amounts of data: not only the data generated by what we call true "big data" activities, but also the data behind the problems we routinely encounter in our work.
Business analytics means I can analyze a problem to uncover relationships that already exist in the data. For example, I can analyze the cost of a product or service as a function of market factors. With an optimization package, I can take a set of inputs and constraints and determine the maximum or minimum value I can achieve. I can finally understand what all this data has been trying to tell me.
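As a toy illustration of that optimization idea (a generic sketch using the open-source SciPy library, not an IBM API, and with invented numbers), here is a tiny "maximize a value subject to constraints" problem expressed in code:

    # A minimal optimization sketch using the open-source SciPy library
    # (not an IBM product API; all numbers are invented for illustration).
    from scipy.optimize import linprog

    # Hypothetical problem: choose how many instances of two services to
    # run, maximizing total throughput while staying inside a budget.
    throughput_per_unit = [120.0, 200.0]   # requests/sec per instance
    cost_per_unit = [[300.0, 550.0]]       # dollars per instance
    budget = [10_000.0]                    # total dollars available

    # linprog minimizes, so negate the objective to maximize throughput.
    objective = [-t for t in throughput_per_unit]

    result = linprog(objective, A_ub=cost_per_unit, b_ub=budget,
                     bounds=[(0, None), (0, None)])
    print("instances to run:", result.x)
    print("maximum total throughput:", -result.fun)

The real products handle far richer models, but the shape of the problem is the same: an objective, some inputs, and some constraints.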
Big data is undoubtedly a major driver of this demand, but for all of us the real payoff is access to these tools. IBM delivers many of these solutions through a cloud delivery model, which means this powerful software consumes fewer resources in your data center without diminishing the value or power of the solution. Combine that with run-time packaging that lets you embed the necessary APIs into your own systems, making the analytics part of your systems' functionality.
For years, IBM software has been at the forefront of analytics in many areas. You can see our statistical analysis at work in our performance evaluation software. You can even use our diagnostic software, such as IBM Support Assistant, to pinpoint a memory leak through the tool's deep analysis, or evaluate performance statistics from IBM WebSphere Application Server to help find bottlenecks.
But business analytics products take the potential of deep statistical analysis and push it to its fullest. They help you make decisions backed by your own business data, using the best mathematical algorithms developed over years of study. This quantitative approach adds a unique dimension to decision support: it helps you make "good" decisions grounded in statistical modeling and first-class optimization methods.
Simple example: capacity planning
Perhaps you think business analytics is mainly for financial solutions, or that your application is too small to justify such methods. But as with any tool, once you see the value and learn to apply its power, you will find uses everywhere, whether as a calculator for checking your math or as a way to make decisions from your business data.
So what does business analytics bring to the world I live in?
As a software architect, I face a very common problem: capacity planning. Suppose I intend to deploy a new system and have a pile of hardware to choose from. What is the relationship between cost and throughput? Given what the hardware costs, which combination delivers the highest throughput at the lowest cost? This simple example will help you understand the power behind this software.
I'm fairly sure that faster hardware costs more, but how much more? I will find out by analyzing the data.
Suppose I have a spreadsheet containing the cost and throughput statistics for some computers in a fictitious data center. Suppose also that I understand the statistics well enough to know I need a certain sample size for meaningful results, and that I have about 30 entries. Finally, assume that although cost is probably correlated with CPU clock speed, I will not worry about that in this simple example. The data looks something like the following table:
There are 30 such entries in total. Scanning them, cost certainly seems to be a factor in throughput, but what kind of factor?
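To make the shape of that spreadsheet concrete in code, here is a hypothetical sketch using the open-source pandas library; the column names and values are invented placeholders, not the article's actual figures:

    # Hypothetical capacity-planning data (invented placeholder values,
    # not the article's actual spreadsheet), held in a pandas DataFrame.
    import pandas as pd

    data = pd.DataFrame({
        "clock_speed_ghz": [2.0, 2.4, 2.8, 3.2, 3.6, 4.0],
        "cost_usd":        [1200, 1400, 1900, 2300, 2600, 3100],
        "throughput_tps":  [300, 340, 420, 470, 540, 600],
    })
    print(data.describe())  # quick summary statistics for each column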
I will open the IBM SPSS software package and run a linear regression analysis to see how CPU clock speed and cost affect my throughput.
Figure 1. The SPSS Statistics Data Editor showing the linear regression analysis
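The article performs this step in SPSS; as a rough code equivalent (a sketch using the open-source statsmodels library, not SPSS itself, and reusing the invented placeholder data from above), an ordinary least squares regression looks like this:

    # Rough code equivalent of the SPSS linear regression step, using the
    # open-source statsmodels library (a sketch, not SPSS itself).
    import pandas as pd
    import statsmodels.api as sm

    # Same invented placeholder data as the earlier sketch.
    data = pd.DataFrame({
        "clock_speed_ghz": [2.0, 2.4, 2.8, 3.2, 3.6, 4.0],
        "cost_usd":        [1200, 1400, 1900, 2300, 2600, 3100],
        "throughput_tps":  [300, 340, 420, 470, 540, 600],
    })

    X = sm.add_constant(data[["clock_speed_ghz", "cost_usd"]])  # predictors plus intercept
    y = data["throughput_tps"]                                  # response variable

    model = sm.OLS(y, X).fit()
    print(model.summary())  # coefficients, R-squared, and significance tests

The fitted coefficients answer the "what kind of factor?" question: each one estimates how much throughput changes per unit change in clock speed or cost, holding the other constant.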