October 2, 2013: Open Analytics

http://www.meetup.com/Open-Analytics-NYC/events/141845202/

On Wednesday, October 2, 2013, OLC attended Open Analytics: Cyber Security featuring Bob Stasio, CEO of Ronin Analytics, and Lance James, Head of Intelligence at Deloitte.

http://www.roninanalytics.com/
http://www.deloitte.com/

Bob Stasio presented “Examining Trends in Cyber Security: Merging Network Defense and Analysis”.

“The 80-20 rule is applied to many things: business, talks, content, attention...the 20% of threats cause 80% of problems. We’re talking about very organized hacker groups. Over the past 20-30 years, we’ve figured out how to battle the 80%, but not the 20%,” Stasio said. Today’s security systems rely on signature-based tools such as firewalls, encryption, and anti-virus software, and billions of dollars are spent on these preventative tactics against the 80%. “Even though we do this, the top 20% get in easily,” he said. “This year, we’re slated to spend $63 billion on security alone—and $93 billion by 2017. And 90-95% of that money is dedicated to static measures.”

Just 3% of threats are detected by anti-virus software and other preventative measures, and an intrusion lives inside a system for an average of 350 days before it is found. There are four phases to hacking: reconnaissance, scanning, exploitation, and persistence.
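
To make the dwell-time figure concrete, here is a minimal Python sketch (not from the talk) that models the four phases as an enum and computes how long an intrusion sat undetected; the dates are made up to roughly match the 350-day average.

```python
from datetime import date
from enum import Enum

# The four phases Stasio listed, modeled as an enum (illustrative only).
class AttackPhase(Enum):
    RECONNAISSANCE = 1
    SCANNING = 2
    EXPLOITATION = 3
    PERSISTENCE = 4

def dwell_time_days(compromised: date, detected: date) -> int:
    """Days the intruder lived inside the environment before detection."""
    return (detected - compromised).days

# A hypothetical intrusion found roughly 350 days after the initial compromise,
# in line with the average detection lag cited in the talk.
print(dwell_time_days(date(2012, 10, 1), date(2013, 9, 16)))  # -> 350
```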

“Cyber intelligence is a great way to get ahead of these problems. It takes external and internal feeds and analyzes them to put on a feed that gives you situational analysis.”

The cyber security map Stasio laid out progresses through three stages: compliance measures, built on signature-based security and defense in depth; intelligence programs, built on predictive analytics; and intelligence advantage, where the security apparatus shifts from pure risk reduction to profit enhancement.

Lance James presented “Buzzword Beatdown: Demystifying the Big Data Threat Intelligence—Security Analytics Debacle”.

James began by explaining that threat intelligence is not big data (at least not yet). “Feeds on average are less than 100 megabytes and there’s a lot of unnecessary data out there. We need smart data.” He also noted that many companies are using the wrong tools for the problem at hand. “A lot of the times, the software they’re using—it doesn’t scale,” James said. “You need to take baby steps. Fix what’s broken first.”
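
As an illustration of James’s “smart data” point, here is a minimal sketch of trimming a raw indicator feed down to what actually matters; the feed format and the relevance rule are assumptions made for this example, not anything James specified.

```python
# Hypothetical raw feed entries; the fields and the relevance rule below are
# assumptions made for this example.
raw_feed = [
    {"indicator": "203.0.113.7", "type": "ip", "targets": "windows"},
    {"indicator": "203.0.113.7", "type": "ip", "targets": "windows"},   # duplicate
    {"indicator": "evil.example.com", "type": "domain", "targets": "linux"},
    {"indicator": "198.51.100.22", "type": "ip", "targets": "scada"},   # not in our environment
]

our_platforms = {"windows", "linux"}

seen = set()
smart_data = []
for entry in raw_feed:
    key = (entry["indicator"], entry["type"])
    if key in seen or entry["targets"] not in our_platforms:
        continue  # drop duplicates and indicators that cannot affect our systems
    seen.add(key)
    smart_data.append(entry)

print(len(smart_data), "of", len(raw_feed), "entries kept")  # 2 of 4
```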

He outlined five levels of understanding a threat: network indications, host indications, little ‘t’ threats (most of today’s products and services provide information at these levels), big ‘T’ Threats, and attribution. “Experience suggests that balance is needed to understand requirements, be an adversary, understand your environment and the risks that come with it. In the end, we need to come up with some tangible objectives,” James said.
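
The five levels can be pictured as an ordered scale. The sketch below is an illustrative Python model (the names and the helper function are my own, not James’s); it simply encodes the observation that typical products stop around the little ‘t’ level.

```python
from enum import IntEnum

# James's five levels, ordered from raw indicators up to attribution.
# The enum names and the helper below are illustrative, not his wording.
class ThreatLevel(IntEnum):
    NETWORK_INDICATIONS = 1   # suspicious IPs, domains, traffic patterns
    HOST_INDICATIONS = 2      # file hashes, registry keys, rogue processes
    LITTLE_T_THREAT = 3       # a specific piece of malware or campaign
    BIG_T_THREAT = 4          # the adversary's capability and intent
    ATTRIBUTION = 5           # who is actually behind the activity

def covered_by_typical_products(level: ThreatLevel) -> bool:
    """Most of today's products and services stop around the little 't' level."""
    return level <= ThreatLevel.LITTLE_T_THREAT

print(covered_by_typical_products(ThreatLevel.HOST_INDICATIONS))  # True
print(covered_by_typical_products(ThreatLevel.ATTRIBUTION))       # False
```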

Generating new intelligence data requires six steps: plan, gather information, normalize the data, construct a meaningful picture, distribute it to relevant audiences, and improve future intelligence. Basically, it comes down to analytics.
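
Those six steps read naturally as a feedback loop. The following Python skeleton is a hypothetical illustration of that cycle; the function bodies are placeholders, and only the ordering of the steps comes from the talk.

```python
# Skeleton of the six-step cycle as a feedback loop. Only the order of the
# steps comes from the talk; the function bodies are placeholders.

def plan(requirements):
    return requirements                                     # 1. decide what to look for

def gather(requirements):
    return [f"raw items about {r}" for r in requirements]   # 2. collect information

def normalize(raw_items):
    return [item.strip().lower() for item in raw_items]     # 3. put the data in one shape

def construct_picture(normalized):
    return {"findings": normalized}                          # 4. build a meaningful picture

def distribute(picture, audience):
    return f"sent {len(picture['findings'])} findings to {audience}"  # 5. share it

def improve(feedback):
    return [f"refined requirement based on: {feedback}"]     # 6. feed lessons back in

requirements = ["phishing against the finance team"]
picture = construct_picture(normalize(gather(plan(requirements))))
feedback = distribute(picture, "SOC analysts")
next_cycle = improve(feedback)
print(feedback)
```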

“What we need to do is turn information into intelligence,” James said.