Why is a cyber supply chain important?

Amid the current threat landscape, agencies like the General Services Administration, the Defense Department, the Department of Homeland Security and the intelligence community have begun working together to bring more efficient and secure methodologies to the procurement process, while paying close attention to risk management across the cybersecurity supply chain.

On this edition of Cyberchat, host Sean Kelley sat down with GSA’s William Zielinski, Assistant Commissioner, Information Technology Category and Lawrence Hale, Director, IT Security, Information Technology Category to discuss the acquisition process and vehicles for the cybersecurity supply chain.

“[GSA’s IT category] builds and maintains a series of very large pre-competed governmentwide acquisition contracts,” Zielinski said.

Generally, agencies that use GSA vehicles for procurement get better pricing and move through the acquisition process faster because the groundwork has already been laid, freeing government procurement professionals to focus their efforts on mission-critical acquisitions.

Zielinski explained how this new procurement process affects the cybersecurity supply chain. He said stakeholders from IT security, acquisitions and risk management work together to assess what they are buying and from whom. “We’re actually making a purchase of a technical capability, we have assessed those things and they are actually part of how we go about buying our technical capability.”

There is also increased pressure from Capitol Hill requiring agencies to take a closer look at the risks associated with the cybersecurity supply chain, through legislation like Section 889 of the National Defense Authorization Act and the Secure Technology Act.

GSA’s Hale explained another benefit of a cybersecurity supply chain. “There are a number of examples of GSA activity that reduce duplicative efforts by offering security screening that’s built into the solutions.” Hale pointed to the FedRAMP and CDM programs, where one authorization is used by multiple agencies.

Hale said using the GSA schedule helps avoid the added cost of competing on one-off contracts for industry. “We find that industry tends to be one of our best proponents … when they learned that agencies are thinking about doing a solicitation on cybersecurity, they’ll say, you should do that on GSA. And here’s why,” Hale said.

Reshaping how cybersecurity is delivered

Chad Sheridan has always been a voice for the customer. Over the past few years, he has morphed into an evangelist on how the government should deliver services to citizens and government customers.

On this edition of Cyber Chat, host Sean Kelley sits down with Sheridan to discuss his new role as Chief, Service Delivery and Operations at USDA-Farm Production and Conservation Business Center and what that means for cybersecurity.

“This whole thing stemmed [from] the consolidation at USDA,” Sheridan said. “We took the organization and brought it together to serve the business center, which has consolidated all the back-office functions like finance, HR, budget, etc. We are part of that business center serving all three agencies and the people of the business center.”

Sheridan said part of his job is to operationalize USDA cybersecurity and support the Chief Information Security Officer, even though cybersecurity is managed by a different part of the organization. The CISO is dependent on Sheridan and his staff to ensure patches and updates are applied and that systems are built secure before being deployed into the environment.

Sheridan said [the government] needs cybersecurity professionals that look outward and want to understand the business of those they support.

“CISOs need a broader base of knowledge and understanding of what the pain points are for running an organization … It’s no different than the journey we’ve made as (IT) operations professionals,” Sheridan said.

As for the future of cybersecurity? Sheridan said IT consumerization will drive it.

“The reality of the way the world [works] has really hit the government, and what I mean by that is the expectations of rapid delivery, rapid innovation and agility are hitting us with force.”

Former WH senior advisor talks data privacy


Privacy became a real issue for America in 2018.

Marc Groman, who joined Cyber Chat with host Sean Kelley, said existing incentives for data security have so far been wrong.

“When it comes to protecting the perimeter and protecting our networks, we’re still — in some cases — at data security one-oh-one,” Groman, a former senior advisor for privacy at the White House and now principal at Groman Consulting Group LLC, said. “We don’t incentivize data security enough. The incentive is to get your database up and running and I have been in more meetings than I can count where decisions were made to cut security.”

Groman said companies want to be generations ahead either in what they produce for sale, or in their own internal business processes. But in order for them to become better prepared to defend the entire ecosystem, agencies need to plan, implement controls and apply security appropriately.

“The insecure software product they rolled out with a bug gets exploited and when [society, the government, and sometimes a company feels the pain] data security and privacy is what often gets cut. Combating this problem is absolutely at an epidemic level. That’s in both the public and private sector and I don’t think we have a handle on it at all,” he said. “We repeat mistakes, don’t learn our lessons, and of course the threats are getting increasingly serious. Our adversaries are getting more sophisticated and so it’s not about just being good, it’s about keeping up with the threats and the adversaries and the risks and I don’t think we’re doing a very good job.”

Many serious hacks and data breaches in the past have come from phishing rather than sophisticated techniques. Groman said it was often just someone clicking on a link the intrusion prevention system didn’t catch.

“The damage is done … [Even] if you’re going to be storing highly sensitive data that is going to be the target for sophisticated adversaries, we’re still at human error,” he said.

Kelley asked Groman if privacy was at least in some realm already gone and how it’s affecting the next generation. Can we actually recover?

“I thought we needed this 10 years ago [because] we don’t have much privacy, particularly in the context of our internet and our online interaction; that’s just factually true,” Groman said. “We’re more than a decade late to this game and unless we get a handle around who can use it and for what purposes, I think we’re going to end up in a place that almost none of us are going to like.”

The United States does not have a comprehensive law at the moment for addressing privacy. This will become a major challenge if the administration doesn’t get it under control, Groman said. There are serious challenges in the context of privacy including collection that is responsible, ethical and fair, as well as the government or private sector’s use of the data collected.

“Artificial intelligence, machine learning, the Internet, the amount of data passively [being] collected by thousands of sensors around us from machine to machine communications is going to be mind blowing,” Groman said. “We have sectorial laws that apply in very narrow spheres and there are enormous gaps.”

He said in some ways we are all at fault for not predicting how our data could be used in negative ways, and the weekly Facebook headlines have become the poster child for data breaches.

“Today we’re into protecting passwords, last week not protecting data, going to third parties the week before, something else,” he said. “We’ve got to get a handle around that.”

Europe moved ahead with a very comprehensive privacy law that is having ripple effects across the entire globe, including on American companies, the public and private sectors, and organizations from Equifax to OPM.

America needs to do something similar and develop a comprehensive, federal privacy law that will govern the commercial sector, Groman said. But some are hesitant, because giving up some privacy in exchange for convenient applications and services feels like an acceptable trade.

One of the biggest problems when it comes to data privacy is that information on who has access to our data, or what it will be used for, is not readily available. Double-checking our privacy settings should become a priority.

“I use my privacy settings to ensure that I understand when and what apps are collecting. I change what’s accessible to the public or I use two-factor authentication, and same thing with social media,” he said. “If you are not using two-factor authentication with your social media accounts today, that’s moronic, [because] if you read what they’re doing with it, you’d be horrified.”

Cybersecurity battleground – Status of cyber threat info sharing

The current status of cybersecurity threats and information sharing between the public, private and government sectors is improving. With that said, there is still much work that needs to be done.

Host Sean Kelley sat down with an esteemed panel to discuss the topic. The guests included:

  • Wally Coggins, director of the IC Security Coordination Center within the Office of the Director of National Intelligence
  • Mo Bland, deputy chief of cybersecurity operations at NSA
  • Rex Booth, chief of cyber threat and risk analysis at the Cybersecurity and Infrastructure Security Agency
  • Allan Thomson, chief technology officer at LookingGlass

Both private and public partnerships are essential to counter those who are actively trying to penetrate networks. But how do we get the right information to the right people at the right time?

Transforming data into intelligence and making that information relevant to organizations is a real challenge. It’s not something that can be fixed easily or without a collective force of effort.

While there is a lot of noise coming from people who claim to have a mass of intelligence, they actually just have a lot of information, which isn’t the same thing. The key to having quality data is relevance.

Data overload is already upon us, and it’s something that is only going to continue to grow. Artificial intelligence and machine learning are two particularly exciting areas in the future for cybersecurity, and they will both enable a faster way through the murky clouds created by having too much data.

Partnerships between the private sector, federal civil service, Defense Department and the intelligence community at large can be leveraged to take the data that is gained and help to truly understand what the adversaries are doing, how they are doing it and the most effective ways to detect and mitigate those risks.

Overcoming challenges

How can we create a common language between analysts and the intelligence collectors?

The intelligence collectors need to understand the threats and the tools that the analysts in the cyber defense world use to protect their networks. A more defined focus on technology along with policy is what is needed. However, it will cost money and more importantly, it will take time.

Some of the considerations for the policy concern data privacy, data sharing, data handling and data storage. This could also differ based on a region, state, country or continent. The end goal seems to be understood, but the challenges lead us to believe it is not something that is close to being resolved any time soon.

We first need to know what suspicious activities actually look like. Once risks have been defined, the data will need to be transformed into a set of indicators. This data will then be searchable on a network. One of the key challenges is figuring out which controls need to be in place in order to detect harmful activity quickly and more effectively.
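To make the idea concrete, here is a minimal sketch, not drawn from the panel discussion, of what turning defined risks into searchable indicators can look like: a small set of hypothetical indicators of compromise matched against log events. All names, addresses and hash values below are illustrative.

```python
from dataclasses import dataclass

# Hypothetical indicators of compromise (IOCs); a real feed would come from
# a threat intelligence provider or an ISAC, not hard-coded values.
INDICATORS = {
    "ip": {"203.0.113.45", "198.51.100.7"},
    "domain": {"malicious.example.net"},
    "sha256": {"9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"},
}

@dataclass
class LogEvent:
    src_ip: str
    dest_domain: str
    file_hash: str

def match_indicators(event: LogEvent) -> list:
    """Return the indicator types this event matches, if any."""
    hits = []
    if event.src_ip in INDICATORS["ip"]:
        hits.append("ip")
    if event.dest_domain in INDICATORS["domain"]:
        hits.append("domain")
    if event.file_hash in INDICATORS["sha256"]:
        hits.append("sha256")
    return hits

# Scan a small batch of made-up events and flag anything that matches.
events = [
    LogEvent("10.0.0.4", "intranet.example.gov", "aaaa"),
    LogEvent("203.0.113.45", "malicious.example.net", "bbbb"),
]
for event in events:
    hits = match_indicators(event)
    if hits:
        print(f"ALERT: {event} matched indicator types: {', '.join(hits)}")
```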

The ultimate goal of cyber threat intelligence is to raise the cost of operations to the adversary.

The pool of active consumers that utilize cyber threat intelligence services is small. A larger subset of passive consumers gets access to this data through the use of various technologies, and could potentially be wiped out by an advanced attack. Reaching out to them is the last mile, and it’s going to be essential.

Persuading the private sector to bring forward data — which could have reputational or financial effects — or the government to share highly-classified information will always be difficult. But as an industry, we must find middle ground, where we can more easily access the information that end users need to do their jobs effectively.

A conversation with Greg Touhill, former federal CISO

In any organization, people tend to get fixated on the policy. Same goes for cybersecurity policies. But often while the policy is sound, the issues lie within the execution.

Retired Brigadier General and former federal Chief Information Security Officer Greg Touhill joined Sean Kelley, host of CyberChat, to discuss the future of cybersecurity.

Touhill said there are three major hurdles facing today’s organizations:

  • A lack of authority on the part of the CIO
  • A lack of unity of effort
  • Inefficient and ineffective architecture

While cybersecurity policies could be ultra-secure, Touhill said the organizations and entities your agency or company deals with don’t always have the same cybersecurity posture and capacities.

“Moving to the cloud is the right thing to do, but it needs to be done in the right way. The appeal of the overhead reduction is compelling, organizations can be more agile, and lower OpEx [operating expenditure] and CapEx [capital expenditure] results are both incredibly attractive. As I learned in the Air Force, you never fly into a cloud without knowing what’s on the inside or on the other side,” Touhill said.

Another consideration, Touhill said, is how to implement or sustain independent third-party auditing, a “must-have” tool. Organizations should also retain the ability to pen test and audit.

Touhill said organizations need to better execute existing policies, retire older technology, update their cybersecurity strategy toward a more modern zero trust model and leverage public sector best practices.

He went on to say public and private sectors have at times been resistant to pooling resources — bulk buys and leveraging their buying power — due to fear of losing control of their decision making authority.

Touhill said that in order to get to a common architecture, there needs to be a legislative approach and a push to drive change. “We should be all about protecting the people’s information.”

There are initiatives and education programs for future cybersecurity professionals, but Touhill said the pipeline is not being filled quickly enough to meet present-day demand.

“The Air Force is a great example of a government organization that is getting it right. With many jobs being replaced or being made redundant because of various technologies, affected individuals are being retrained into the world of cybersecurity,” Touhill said.

VA adopts DevOps, agile methodology to improve cybersecurity

The Department of Veterans Affairs has been going through a transformation for years, and a huge part of that centers on technology modernization and the agency’s adoption of DevOps and Agile methodologies.

On this week’s episode of Cyber Chat, Sean Kelley spoke with VA Office of Information and Technology executives Bill James, deputy assistant secretary of DevOps, and Drew Myklegard, executive director of PSF. The discussion focused on how the department’s modernization effort can lead to better cybersecurity practices.

The guests also touched on:

  • differences between DevOps and Agile methodologies and what that means for the VA;
  • a culture shift that takes place from training the workforce and contractors on these new processes; and
  • developing software faster, better and cheaper for veterans.

Myklegard said it’s important to begin with safe and secure products that already have the proper controls in place. James chimed in saying that having security personnel present during interactions with the customer is vital as the VA steps away from a “waterfall” approach when it comes to modernization.

“We bake in security from the beginning in this DevSecOps approach, as opposed to bolted on at the end. We’re developing products at the beginning incrementally, [instead of] the waterfall methodology where you’re looking for something at the end,” James said. “So we’re engaging the customer at the beginning as opposed to pushing all the testing down to the end. So it’s a different way of thinking.”
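As an illustration of what “baking security in from the beginning” can mean in practice, here is a small, hypothetical sketch of a build-pipeline gate that blocks a release when a pinned dependency appears on a known-vulnerable list. It is not the VA’s actual tooling; the package names, versions and deny-list are made up for the example.

```python
import sys

# Hypothetical deny-list of known-vulnerable dependency versions; a real
# pipeline would pull this from a vulnerability scanner or advisory feed.
KNOWN_VULNERABLE = {
    ("requests", "2.5.0"),
    ("django", "1.11.0"),
}

def parse_requirements(text):
    """Parse simple 'name==version' pins from a requirements-style file."""
    pins = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, version = line.split("==", 1)
        pins.append((name.lower(), version))
    return pins

def security_gate(requirements_text):
    """Return a nonzero exit code if any pinned dependency is deny-listed."""
    findings = [p for p in parse_requirements(requirements_text) if p in KNOWN_VULNERABLE]
    for name, version in findings:
        print(f"BLOCKED: {name}=={version} is on the known-vulnerable list")
    return 1 if findings else 0

if __name__ == "__main__":
    sample = "requests==2.5.0\nflask==2.3.2\n"  # stand-in for a real requirements file
    sys.exit(security_gate(sample))
```

Run early in the pipeline, a gate like this fails the build before insecure code ever reaches testing, which is the opposite of pushing all the checks to the end.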

The future of IPv6 and cybersecurity


Charles Sun, a government executive and IPv6 expert, appeared on Cyber Chat with host Sean Kelley to discuss infrastructure, IPv4 and the need for quicker adoption of IPv6.

“Infrastructure and cybersecurity are really hard [to] separate from each other,” Sun said. “The fact that the network and infrastructure logs do not report a data breach does not mean your network is secure. We are facing a challenge where networks are attacked on a daily basis.”

Sun said “these attacks are happening all over the world,” with on average more than six confirmed data breaches a day and more than 53,000 reported security incidents and breaches last year alone.

Sun said this is largely because most network environments are running in between two IP stacks, IPv4 and IPv6.

“This is especially true in the private sector. From my perspective, this is a huge challenge that both public and private sectors really need a different approach to. Just because you haven’t found the logs or the alarms haven’t gone off, it doesn’t mean that you haven’t been breached,” Sun said.

Sun said security information and event management (SIEM) has not brought the abilities or the clarity it promised with log management.

“I don’t think that SIEMs have produced what everybody thought they were going to produce from a tool standpoint,” he said, because of the dependence on human intervention for success.
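Sun’s point about human intervention is easiest to see in the detection rules themselves. The sketch below is a hypothetical example rather than any particular SIEM product’s syntax: a simple correlation rule that flags possible brute-force activity from failed-login events, where the threshold and time window still have to be chosen and tuned by an analyst.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical failed-login events: (source_ip, timestamp).
FAILED_LOGINS = [
    ("198.51.100.7", datetime(2024, 3, 4, 9, 0, s)) for s in range(0, 50, 5)
] + [
    ("10.0.0.8", datetime(2024, 3, 4, 9, 30, 0)),
]

THRESHOLD = 5                  # chosen and tuned by an analyst, not by the tool
WINDOW = timedelta(minutes=1)

def brute_force_candidates(events):
    """Flag source IPs with more than THRESHOLD failures inside WINDOW."""
    by_ip = defaultdict(list)
    for ip, ts in events:
        by_ip[ip].append(ts)
    flagged = []
    for ip, times in by_ip.items():
        times.sort()
        for start in times:
            hits = [t for t in times if start <= t < start + WINDOW]
            if len(hits) > THRESHOLD:
                flagged.append(ip)
                break
    return flagged

print(brute_force_candidates(FAILED_LOGINS))  # -> ['198.51.100.7']
```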

Sun said there needs to be new questions.

“What can we do differently? Can we bring a different perspective or opinion to address the issues? How do you reduce the overall attack vector or attack surfaces?” he asked.

In addition, Sun said eliminating IPv4 from the network would enhance security.

“By turning off that legacy IPv4, [we] will achieve a great reduction of all the attacks and the threats that are experienced today. The fact that currently we’re running dual stack mode of operations of both IPv4 and the IPv6 is a great vulnerability to the environment,” Sun said.
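One way to gauge the dual-stack exposure Sun describes is simply to inventory which protocol versions each host answers on. The sketch below is a minimal, hypothetical example using Python’s standard socket library; the hostnames are placeholders and a real inventory would iterate over an asset list.

```python
import socket

def address_families(hostname):
    """Resolve a hostname and group its addresses by IP version."""
    results = {"IPv4": set(), "IPv6": set()}
    try:
        for family, _type, _proto, _canon, sockaddr in socket.getaddrinfo(hostname, None):
            if family == socket.AF_INET:
                results["IPv4"].add(sockaddr[0])
            elif family == socket.AF_INET6:
                results["IPv6"].add(sockaddr[0])
    except socket.gaierror as exc:
        print(f"Could not resolve {hostname}: {exc}")
    return results

# Placeholder hostnames; in practice this would run over an asset inventory.
for host in ["www.example.com", "intranet.example.gov"]:
    families = address_families(host)
    mode = "dual-stack" if families["IPv4"] and families["IPv6"] else "single-stack or unresolved"
    print(host, mode, families)
```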

Sun acknowledged that getting rid of IPv4 will take time.

“Before we can truly enjoy automation and even artificial intelligence, we need to get down to one stack, one protocol and make sure IPv4 is entirely shut down,” he said.

Sun said quite a few carriers are already in the process of turning off IPv4, at least internally. According to a recent report, T-Mobile and Verizon are among them.

Takeaways:

  • Infrastructure and cyber security are hard to separate from each other.
  • Even if your network or your infrastructure logs don’t report a data breach, that doesn’t mean your network is secure.
  • On average, more than six confirmed data breaches occur every single day. Last year there were over 53,000 reported security incidents and breaches.
  • Eliminating IPv4 on the network will greatly enhance security.

What is an insider threat?


Any mention of an organization and insider threat in the same sentence generally conjures up an image of information being stolen by an employee — which is precisely the image Michael Theis and Matt Moynahan want to change.

Cyber Chat Host Sean Kelley sat down with Moynahan, CEO of ForcePoint and Theis, Chief Counterintelligence Expert at Carnegie Mellon University’s CERT Insider Threat Center.

Theis defined an insider threat as “the potential for an individual who has authorized access to your organization’s assets to use that [access] maliciously or unintentionally to act in a way that could negatively affect the organization.”

But Theis said it covers a lot more than just employees or former employees.

“Things like trusted business partners, those supply chain vendors. Anyone who has access to your physical people, your physical facilities, your info or your technology.”

“Insider threats to enterprises begin with access, privilege and the intentions of the person with that access,” Moynahan added. “The definition of insider becomes very blurry with things like digital transformation [or] movement to the cloud. Attackers are getting in, identities and credentials are being stolen, and the human being has become one of the primary vectors of attack.”

Companies are spending tremendous amounts of money on training, with the goal of becoming security companies, more or less, in order to combat insider threats.

“Hygiene training from the hygiene approach certainly raises the bar, but I don’t think that is the answer quite frankly,” Moynahan said. “The unintentional, ‘don’t click on the link, don’t open the attachment’ [is a necessity, but] we need to do something more for systems and technology in my opinion.”

Theis said it’s fair to ask for proper care and caution, but he doesn’t know how effective training is. He said training should be broken down by observables, both human behavior and technical. “What are your coworkers doing that could be putting [the company] at risk? There’s no one type of ‘insider.’ It really depends on the type of threat. It’s not as simple to say, ‘What are you likely to see? When are you likely to see it? What do you look for?’” Theis said.
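As a concrete example of a technical observable, the hypothetical sketch below flags after-hours access by otherwise authorized users. It is illustrative only; as Theis notes, which signals matter depends on the type of threat, and a real program would pair observables like this with behavioral and organizational indicators.

```python
from datetime import datetime

# Hypothetical access-log entries: (username, timestamp, resource).
ACCESS_LOG = [
    ("jdoe", datetime(2024, 3, 4, 14, 15), "hr_records"),
    ("jdoe", datetime(2024, 3, 4, 23, 40), "hr_records"),
    ("asmith", datetime(2024, 3, 5, 2, 5), "source_repo"),
]

BUSINESS_HOURS = range(7, 19)  # 07:00-18:59 local time

def after_hours_access(log):
    """Flag authorized access that happens outside normal working hours.

    This is a single technical observable; on its own it proves nothing,
    it only marks events worth a closer look.
    """
    return [entry for entry in log if entry[1].hour not in BUSINESS_HOURS]

for user, when, resource in after_hours_access(ACCESS_LOG):
    print(f"REVIEW: {user} accessed {resource} at {when.isoformat()}")
```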

“The challenge has been that despite the general best efforts, the industry hasn’t protected organizations,” Moynahan said. “And the problem with the current security marketplace is that things have gotten so bad that we’re forcing enterprises to become security companies. We’re forcing individuals to try and become security experts.”

Moynahan said around $1 trillion has been spent over the past seven years trying to keep people out, with a 95% failure rate.

“It’s not just a spend issue, I think it’s an approach issue that we need to think about in addition.”

Contextual intelligence in the cyber battlefield

Cybersecurity executives have an enormous responsibility. We have moved from the conventional data center model to a cloud environment with data spread across the world.

It used to be enough to protect an organization with some basic tools like local antivirus and a perimeter firewall. Today, that has exploded into an array of solutions like intrusion detection systems, intrusion prevention systems, network and host firewalls, security incident and event management tools, spam filtering and encryption, all of which need to be installed, integrated and managed.

The adoption of cloud technologies has also added a new level of complexity to the challenges faced by cybersecurity executives. Cloud and mobile technologies have them developing new ways to tackle these issues. Organizations need cybersecurity that provides complete visibility, intelligence, and the ability to scale to create a comprehensive view of the threat landscape. In this episode of Cyberchat, we discussed how an organization matures, uses threat intelligence, creates a comprehensive view of its cybersecurity posture and employs contextual intelligence in the cyber battlefield.

Our guests were Shane Barney, chief information security officer for U.S. Citizenship and Immigration Services; Matt Smith, senior adviser to the CISO at the Department of Homeland Security; Greg Willshusen, director of Information Technology and Cybersecurity at the Government Accountability Office; and Allan Thomson, chief technology officer at LookingGlass.

When asked if it is harder to secure data today, all agreed. All admitted that the threat landscape is changing, so a defense can’t really be aimed at one threat or vulnerability, but needs to create a comprehensive view. Smith added that though the threat landscape is changing and becoming more advanced, “the [defense] capabilities are also advanc[ing] in defending the data.” Barney added that “[USCIS was] a heavily paper-based agency for a long, long, long time. Now we’ve made this huge leap into the electronic world and we’re still sort of adjusting to that.”

Willshusen stated that “the cloud is certainly an opportunity to help secure data that’s out there, but it also does not allow agencies to say, well, it’s the cloud service provider’s responsibility for securing information. It’s still up to the agency to make sure that the cloud service provider is adequately protecting that information.”

He also said that, “With respect to security we have found that the security over data at most of the agencies we go to needs to be dramatically improved … and it’s not just [the Government Accountability Office], it’s also the inspectors general at the various agencies. At least 18 out of the 24 CFO Act agencies, which are the major federal departments and agencies, cite that their agencies’ information security program is not effective.”

The panel agreed that any modern program must take a holistic approach, but also felt that the staff was a huge part of any cybersecurity posture. Barney stated that it “involves getting the right people in the right places with the right knowledge and the right skills because that’s what’s gonna drive that [holistic approach].”

Thomson stated, “I would say finding experienced people in security is actually probably always going to be a challenge. So, making those individuals that you do have more effective, more efficient, [is how] to enable the achievement of the objectives of a security organization.” He also discussed how we get to threat intelligence. “There’s a lot of things that can go into threat intelligence. Ultimately, it’s about data that can be used to help protect the organization. So, there’s no shortage of data. I think the key challenge is what data is relevant to securing your organization. So, for example, how can intelligence make that data more effective and more useful in your organization? It could be as simple as what type of actors are performing certain types of campaigns, certain behaviors that ultimately can help inform or instruct your response to those aspects.”

Thomson also stated that “intelligence can be considered a much broader aspect [that] informs you about your organization as a whole … There’s many different aspects of intelligence, but fundamentally it’s about focusing your defensive efforts based on what that intelligence tells you.”

Smith brought a great point to the conversation: he valued threat intelligence but wanted a better understanding of the data used to create a given risk score. When discussing a risk score of eight, for example, he said, “What does the eight mean? [It] depends wildly on who your provider is, but the challenge that we have with that in operationalizing it is: is that what we really need?”

He also said there was plenty of data.

“But in order for me to contextualize that eight in my environment, I really need the bit of data that went into calculating that eight. And there’s some trade secret challenges in exposing that that we haven’t figured out how to overcome,” he said. “But if I had those data elements and could put that in context of my own data and my own analysis, then I can start identifying whether there’s a threat to systems at the southern border, or whether there’s a threat to a particular executive that I have, or whether there’s a threat to a location at a time that I might know that we’ve got either a particularly sensitive event going to be happening, or you know, particularly impactful travel that’s going to be happening.”
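A toy sketch of the kind of contextualization Smith is asking for might look like the following: an opaque vendor score weighted by locally maintained asset criticality. The assets, weights and score are hypothetical, and a real program would work from the underlying data elements rather than a single multiplier.

```python
# A vendor reports a threat with an opaque risk score of 8 out of 10.
VENDOR_SCORE = 8.0

# Locally maintained context: how critical each (hypothetical) asset is (0-1).
ASSET_CRITICALITY = {
    "border_sensor_network": 0.9,
    "executive_travel_system": 0.7,
    "public_website": 0.3,
}

def contextualized_risk(vendor_score, asset):
    """Scale an opaque vendor score by local asset criticality (0-10 result)."""
    weight = ASSET_CRITICALITY.get(asset, 0.5)  # unknown assets get a middling weight
    return round(vendor_score * weight, 1)

for asset in ASSET_CRITICALITY:
    print(asset, contextualized_risk(VENDOR_SCORE, asset))
```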
