The use of "big data" to improve corporate information security is not mere hype; it will become a reality within the next few years, Gartner analysts said.
However, implementation will be difficult: a successful big data security analytics deployment will redraw the logical boundaries of the IT department, and few security vendors currently offer products that support this transition.
At the 2012 Gartner Security and Risk Management Summit, Neil MacDonald, a Gartner vice president, discussed the security implications of big data, noting that given its volume, velocity, variety and complexity, analyzing and processing big data requires different approaches. Few IT organizations today have scalable systems that can analyze terabytes of data at a reasonable rate.
He also cited several types of big data that are difficult to correlate, analyze, and use for business and information security decisions, including network packet captures, sensor data, various transaction data, compliance monitoring data, and threat intelligence.
"The focus is not on the data, but on what you should do with it: analyze the data to get the information you need," MacDonald said. "I think big data analysis is real, not just hype, which is the type of information security data we'll be dealing with and analyzing." ”
With the rapid rise of advanced persistent threats (APTs), big data analytics has become an urgent concern for many enterprise information security departments. Traditional security defenses struggle to detect advanced persistent attacks because they differ completely from previous malware patterns.
"How do you know there is a problem? In the past, you subscribed to a vendor's service, they told you what the problem looked like, and then you went looking for it," MacDonald said. "But now, no one knows what it looks like. How do you find the problem?"
The enterprise must first determine what normal, non-malicious activity looks like, and then look for deviations from that baseline to uncover malicious activity. But to succeed, MacDonald said, companies need more data to build the baseline, and that is where big data comes in.
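As a rough illustration of the baseline approach MacDonald describes (not taken from Gartner; the data shapes, field names and threshold are hypothetical), the sketch below builds a per-host baseline of event rates from historical logs and flags hosts whose current activity deviates sharply from it:

```python
# Minimal sketch of baseline anomaly detection on security event counts.
# Data shapes, host names and the z-score threshold are illustrative only.
from collections import defaultdict
from statistics import mean, stdev

def build_baseline(historical_events):
    """historical_events: list of (host, events_per_hour) samples from normal activity."""
    samples = defaultdict(list)
    for host, count in historical_events:
        samples[host].append(count)
    # Baseline = mean and standard deviation of normal activity per host.
    return {host: (mean(c), stdev(c) if len(c) > 1 else 0.0)
            for host, c in samples.items()}

def flag_anomalies(current_counts, baseline, z_threshold=3.0):
    """Return hosts whose current event rate deviates strongly from their baseline."""
    anomalies = []
    for host, count in current_counts.items():
        mu, sigma = baseline.get(host, (0.0, 0.0))
        if sigma == 0.0:
            continue  # not enough history to judge this host
        if abs(count - mu) / sigma > z_threshold:
            anomalies.append((host, count, mu))
    return anomalies

if __name__ == "__main__":
    history = [("web01", 100), ("web01", 110), ("web01", 95),
               ("db01", 20), ("db01", 25), ("db01", 22)]
    baseline = build_baseline(history)
    # web01 spikes to 400 events/hour and is flagged; db01 stays near its baseline.
    print(flag_anomalies({"web01": 400, "db01": 23}, baseline))
```

In practice the point of the article is scale: the same idea applied across terabytes of packet captures, transaction records and sensor data, which is what current SIEM products struggle with.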
MacDonald predicts that by 2016, 40% of enterprises in industries such as banking, insurance, pharmaceuticals and defense will actively analyze at least 10 TB of data to identify potentially dangerous activity. However, vendors' product portfolios cannot change overnight. Enterprises today typically rely on SIEM systems to correlate and analyze security-related data, but MacDonald said current SIEM products cannot handle such large workloads: most provide near-real-time data but can only process normalized data, while some can handle large volumes of raw transaction data but cannot deliver real-time intelligence.
MacDonald said this means some enterprises will have to build effective big data analytics systems on their own, and he predicts more companies will launch big data projects, such as Zions Bancorporation's deployment: the company built a Hadoop-based security data warehouse using technology from the startup Zettaset.
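To give a sense of what analyzing raw security data on Hadoop can look like (this is not Zions Bancorporation's or Zettaset's actual implementation; the log format and script name are hypothetical), the sketch below is a Hadoop Streaming-style mapper and reducer that count failed-login events per source IP across raw auth logs:

```python
# Illustrative Hadoop Streaming job (mapper and reducer in one file) that counts
# failed-login events per source IP from raw auth logs.
# The log format and field positions are hypothetical; a real deployment would
# adapt the parsing to its own data sources.
import sys

def mapper():
    for line in sys.stdin:
        # Hypothetical log line: "<timestamp> <host> sshd: Failed password from <ip>"
        if "Failed password" in line:
            src_ip = line.split()[-1]
            print(f"{src_ip}\t1")

def reducer():
    # Hadoop Streaming delivers mapper output sorted by key, so equal IPs are adjacent.
    current_ip, count = None, 0
    for line in sys.stdin:
        ip, value = line.rstrip("\n").split("\t")
        if ip == current_ip:
            count += int(value)
        else:
            if current_ip is not None:
                print(f"{current_ip}\t{count}")
            current_ip, count = ip, int(value)
    if current_ip is not None:
        print(f"{current_ip}\t{count}")

if __name__ == "__main__":
    # Local test: cat auth.log | python failed_logins.py map | sort | python failed_logins.py reduce
    mapper() if sys.argv[1] == "map" else reducer()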
MacDonald recommends that, before building a custom deployment, companies ask their SIEM vendors for help and track market trends. Some of the larger vendors, such as IBM, HP, and EMC with its RSA and VMware subsidiaries, are building and consolidating similar technologies around their SIEM products.
Ultimately, big security data will evolve into part of the broader IT business intelligence trend, which combines information security intelligence with IT operational data to deliver higher-level business insight, MacDonald said. Combining security and business data can be highly valuable: as IT systems become virtualized, it is increasingly common for security and operations teams to share standard behavioral baselines to spot unusual behavior. In addition, the operations team will learn security-critical facts, such as which systems hold the enterprise's most valuable data.
"You can think of this picture, it contains massive amounts of data," MacDonald says: "To get to the top of the pyramid, you have to use meaningful patterns and visibility to extract large amounts of data, to make it operational, to know what to do with the data and what data should be prioritized. This may sound difficult, but that's the key, which is what Gartner calls Security Intelligence: I need to know where to focus-to show priorities through it risk ' hot maps '. ”
Views differ on whether such an evolution will occur within the next few years. Luis Scull, a research analyst with Open Field Capital, said he does not expect big data security analytics to take hold in the coming years because most companies lack the resources or the political capital.
However, Robert House, senior director of technical support and incident response at a company in New Jersey, said people at his organization have started discussing big data security analytics because they urgently need innovative ways to detect threats, yet SIEM vendors have not stepped up to help customers take the next step.
"My experience is that most vendor solutions do not supplement the data they provide you, and you need to do it in-house," says House. "Compliance is also driving the evolution of large data, and with the PCI DSS (payment card Industry Data Security standard) and Sox (Sarbanes-Oxley), there are more advanced defense methods, and in the next few years, we will be more urgent to secure large data systems."
"As Stuxnet and flame attack become more intelligent, you will need to tighten security controls and constantly improve your security methods." ”
(Responsible editor: Lu Guang)