When it comes to security, your computer is your own worst enemy
It’s a sentiment echoed by the majority of security professionals across the industry.
“Most people think of the computer as the central hub where everything goes,” said Joe Ruggiero, senior director of security solutions for Cisco Systems.
“But in reality, a computer is just one part of a very decentralized system.
It is the network that is really the hub of everything else.”
Ruggiero’s perspective mirrors that of many IT professionals across the country.
“When it comes time to create a new server, I always go to the hardware vendors,” he said.
“There is a lot of pressure for a new architecture.
And, of course, security is a very big concern.”
Ruggiero’s view is not uncommon.
For some companies, this mindset has forced them to embrace cloud-based hosting, or virtualization, to improve security and ease the burden of managing their networks.
But Ruggiero’s approach to security is more nuanced: he sees it as a matter of maintaining a secure network environment.
“Security is really about making sure you are keeping your systems safe,” he explained.
“And you’re also not exposing your servers to bad actors.”
Rigorous testing and auditing are key to ensuring that a server is properly secured, and securing the network should be a primary consideration, Ruggiero said.
“If you’re going to have a new data center, then you need to get it tested for all of the things that can go wrong,” he added.
“I think that’s very important.”
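The kind of pre-deployment testing Ruggiero describes can be partly automated. As a hypothetical sketch (the approved port list and scan results below are illustrative examples, not Cisco policy or anything from the article), one basic check compares a server’s listening ports against an approved baseline and flags anything unexpected:

```python
# Hypothetical sketch: flag listening ports that are not on an approved baseline.
# The port numbers here are illustrative, not an actual hardening policy.

APPROVED_PORTS = {22, 443}  # e.g. allow SSH and HTTPS only

def audit_ports(listening_ports, approved=APPROVED_PORTS):
    """Return the set of unexpected open ports; an empty set means the check passes."""
    return set(listening_ports) - set(approved)

# Example: suppose a scan (however it was obtained) found these ports listening.
scan_result = [22, 443, 3306]
unexpected = audit_ports(scan_result)
if unexpected:
    print(f"FAIL: unexpected ports exposed: {sorted(unexpected)}")
else:
    print("PASS: only approved ports are listening")
```

In practice a check like this would be one item in a larger audit suite, run against every server before it is allowed to carry production traffic.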
Roughly two years ago, Ruggiero’s team at Cisco was tasked with preparing a new network for a cloud-hosted business.
The goal: secure and manage a large amount of data and applications.
“One of the goals of this project was to ensure that we were testing all the hardware and software that was going to be used to support the new infrastructure,” he noted.
“We were also looking at all the things like how to deploy and manage the cloud infrastructure, so we were able to get all of that tested and validated.”
As part of this process, Ruggiero’s security team and auditors were able to access the company’s systems to test how they were being used and to evaluate how they fit into the company’s security guidelines.
They also learned how well the systems were performing in other areas.
“We found out a lot about the performance of our data centers, and we also learned a lot about the network and how it was used,” he observed.
“So we found out that a lot was being done to make sure that all of those things were being managed properly.”
While Ruggiero’s team has been monitoring the network since its inception, it has also been making sure the data center is being used appropriately for the business.
“The goal of the project was not to have any security issues,” Ruggiero said.
Ruggiero has also been looking into the challenges of managing an enterprise that holds large amounts of sensitive data.
He said he sees this as a huge challenge for IT professionals.
“I think a lot of people are not aware of the threat of unauthorized access to sensitive data, especially in a cloud environment,” he pointed out.
“If your company does not have security protections in place, then there is a risk that things will get out of hand.”
Ruggiero’s team has also been investigating how security and auditing of the new network are being handled, as well as the security policies for it.
“In a way, we’re doing that in a very different way than we would for an existing infrastructure,” Ruggiero said, referring to the use of cloud-enabled services.
“It’s a very different thing that you’re doing.”
“This is one of those projects where you want to look at every single piece of software and see how they interact with each other,” he concluded.
“How are they configured to meet the requirements?”
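That kind of review, checking how each piece of software is configured against the requirements, can be expressed as a checklist of rules evaluated over each component’s settings. Here is a minimal sketch, assuming configurations are available as key-value dictionaries; the rule names and settings are hypothetical, not drawn from the article:

```python
# Hypothetical sketch: evaluate each component's configuration against a
# checklist of requirements. Rules and settings are illustrative only.

REQUIREMENTS = [
    ("tls_enabled",         lambda cfg: cfg.get("tls") is True),
    ("no_default_password", lambda cfg: cfg.get("password") != "admin"),
    ("audit_logging_on",    lambda cfg: cfg.get("audit_log") is True),
]

def check_config(name, cfg, requirements=REQUIREMENTS):
    """Return (component, rule) pairs for every requirement the config fails."""
    return [(name, rule) for rule, ok in requirements if not ok(cfg)]

# Example inventory of components and their (illustrative) settings.
components = {
    "web_server": {"tls": True,  "password": "s3cret", "audit_log": True},
    "database":   {"tls": False, "password": "admin",  "audit_log": True},
}

failures = []
for name, cfg in components.items():
    failures.extend(check_config(name, cfg))

for component, rule in failures:
    print(f"FAIL: {component} violates '{rule}'")
```

The design choice here is to keep each requirement as a small named predicate, so auditors can add or retire rules without touching the evaluation loop.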
In a world of cloud computing, Ruggiero points out, there is still a lot to learn about the security of systems running on cloud servers.
“You’re going through a lot,” he remarked, “because the infrastructure has to be set up so that there are security controls in place.”