A new cybersecurity appliance prevents hackers from being able to conduct reconnaissance and move laterally within a network.
Cryptonite’s CryptoniteNXT, which launched Oct. 26, is based on research funded by the Defense and Homeland Security departments into Moving Target Defense, which DHS defines as a way to increase complexity and uncertainty for attackers by “controlling change across multiple system dimensions.”
Delivered as a hardware appliance, CryptoniteNXT takes away hackers’ visibility into a network, making attacks more difficult and expensive.
It changes an endpoint’s view of the network into a dynamic, abstract structure, by “taking over how IP addresses are issued to the endpoints,” Cryptonite CEO Mike Simon said. That turns a once-static network into a moving target. It then maps the “obfuscated” network to the real one so legitimate traffic can flow normally across the network, but hackers’ east-west, or lateral, movement between endpoints is cut off.
“We enable any network to actively shield itself from cyberattacks by preventing reconnaissance,” an initial source of cyberattacks, Simon said.
CryptoniteNXT issues IP addresses on a temporary basis. As long as a session is active, a legitimate user has a Cryptonite token. But “as a hacker, if I come back later and try to use that temporary address as part of the planning and executing an attack, that will not work,” Simon said. In fact, if someone tries post-session to access those temporary addresses or the IP address that was in place before CryptoniteNXT was installed, the tool will trigger an alert.
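The issue-map-alert cycle Simon describes can be sketched in miniature. The class below is an illustrative toy, not Cryptonite's implementation; the pool range, method names and alert format are all invented. The key idea is that only addresses in the live session map resolve to real endpoints, so any attempt to reuse a stale address both fails and identifies the requester:

```python
import secrets
import ipaddress

class MovingTargetMapper:
    """Toy sketch of moving-target addressing: each active session gets a
    short-lived address drawn from an obfuscation pool, and any request for
    an address outside the live map fails and raises a high-value alert."""

    def __init__(self, pool="10.99.0.0/24"):
        self.pool = list(ipaddress.ip_network(pool).hosts())
        self.live = {}      # temporary address -> real endpoint address
        self.alerts = []    # (source, stale address) pairs

    def open_session(self, real_addr):
        temp = str(secrets.choice(self.pool))
        self.live[temp] = real_addr
        return temp

    def close_session(self, temp):
        self.live.pop(temp, None)

    def route(self, source, temp):
        # Legitimate in-session traffic maps back to the real address;
        # stale or guessed addresses never resolve and are logged instead.
        if temp in self.live:
            return self.live[temp]
        self.alerts.append((source, temp))
        return None

mapper = MovingTargetMapper()
t = mapper.open_session("192.168.1.50")
assert mapper.route("user-pc", t) == "192.168.1.50"
mapper.close_session(t)
assert mapper.route("attacker", t) is None   # stale address: blocked
assert mapper.alerts == [("attacker", t)]    # and flagged with its source
```

Because the alert carries the requesting source directly, no after-the-fact sifting of network flow data is needed to attribute the attempt, which is the property Simon highlights.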
“It will be an extremely high-value alert because we’ll know that someone was trying to [access] IP addresses they should not be,” Simon said. That alert goes to an organization’s security information management program within 30 seconds of the event occurring. It shows which user and IP source the address request came from, eliminating the need for the IT team to sift through network flow data or end-point analysis to determine the source of the problem, he added.
The appliance addresses three gaps identified by the National Institute of Standards and Technology’s Cybersecurity Framework, according to Simon. The first is protection. The guidelines ask agencies to install tools that would enable them to investigate and follow attack paths to track down the source. Using Moving Target Defense, CryptoniteNXT Net Guard stops attacks at the endpoint by mapping from the obscured network to the real one, enabling legitimate traffic to flow across the traditional network infrastructure.
The other two gaps are in detection and response. CryptoniteNXT detects blocked attempts by a compromised endpoint to access other applications and resources and issues alerts. “We’re telling you whether someone attempted to access something outside of segmentation policy and/or someone tried to access or address IP locations outside of what they’re able to do,” he said. “As soon as that comes into the SIM, you realize … they’ve already been blocked. You don’t have to worry about immediately investigating, but you know it was likely one of two things: a malicious attacker or a misconfiguration.”
Agencies don’t need to change anything about their existing infrastructure to make CryptoniteNXT work, Simon added. “Our approach was no software runs on endpoints, no software runs on servers,” he said. “We can use your existing switching equipment. We’re not going to make you change out anything.” Once the product is installed, he said, “immediately, without any configuration, the protection of reconnaissance data starts.”
There’s also no effect on end users because everything happens in the background. For instance, CryptoniteNXT uses FreeRADIUS, which will interact with RSA tokens or another two-factor authentication mechanism, and users will never know Cryptonite is there.
Network performance is a big concern for agencies when they add monitoring tools, but “we operate in some cases faster than the existing network,” Simon said. “That doesn’t mean there isn’t any latency. It means the latency in our device is very similar to another switch … in the network. Users don’t see any latency other than a normal device added to the network.”
Right now, Jason Li, vice president of network and security at Intelligent Automation, which spun off Cryptonite, is working with the Defense Information Systems Agency and DHS on a pilot deployment of CryptoniteNXT. He and a colleague were part of the original research into Moving Target Defense and identified the fundamental problems with existing cybersecurity, namely the ease of reconnaissance and lateral movement by hackers.
It has applications beyond DHS and DOD, he added. “This technology is universally applicable to all enterprise networks,” Li said. “We have totally changed the asymmetric nature of cybersecurity from the attackers have the upper hand to now the defenders have all the upper hand. That’s huge.”
Out of the gate, the company has several customers in the government, manufacturing, energy, and internet of things markets, and Simon said he’s working on building more business.
“This has been much needed. We’re approaching cybersecurity from a perspective that is in the hackers’ court. We’re giving them the advantage, and a lot of what we create is simply to track them down,” he said. “We need to be focused more on a defensive platform posture, and stop hackers vs. letting them into our networks.”
About the Author
Stephanie Kanowitz is a freelance writer based in northern Virginia.
Cyber attacks have been increasing in speed and scope, demonstrating capabilities that are challenging the ability of even the fastest machines to respond. Security organizations in the government and the private sector alike have begun to realize that machine analytics alone cannot substitute for real-time communication between human analysts when confronting the current threat environment.
By any estimate, the WannaCry attack in May was the fastest-spreading self-propagating worm ever detected. It spread through unpatched Windows machines around the world in a matter of hours, with the Financial Times reporting that it affected more than 200,000 machines in the first round of attacks. The discovery of a “kill switch” prevented the payload from executing in most situations, but the worm caused significant damage as it propagated through unpatched networks. Security researchers continued to report “rebound” infections inside vulnerable Windows networks for several weeks. The attack exploited a vulnerability for which a patch existed, and defense consisted of a “fire drill” to deploy the patch in any vulnerable network.
A far more serious threat (and the first view into what type of damage can occur from a true zero-day attack) occurred with the Petya or NotPetya virus that took down the Ukrainian economy and several major international corporations in June. Petya exploited a vulnerability in software used by the Ukrainian government to handle tax and other financial transactions with individuals and businesses. Pharmaceutical giant Merck and Danish shipping company Maersk were both incapacitated by the virus within hours. Although the initial infection exploited a vector in the Ukrainian software, it spread throughout Windows networks in a manner that Microsoft has yet to explain.
Different from WannaCry, Petya’s purpose was destruction, not ransom. Affected machines in the management systems of Merck and Maersk were bricked, unrecoverable, and front-office functions were shut down within hours, according to Forbes. Maersk lost the ability to move its ships in or out of international ports for several days, effectively crippling a major part of the transportation network. Merck is still trying to recover production and packaging capabilities. According to sources familiar with the attack, the virus spread at machine speed, hitting tens of thousands of Windows systems within minutes of gaining access to a network, successfully defeating end-point sandboxing technologies by crashing the boxes before analytics could be completed.
The total impact was so shocking that on Sept. 20, the House Energy and Commerce Committee issued letters to Merck requesting testimony on the scope of the damage. The committee also has asked the Department of Health and Human Services to explain what the government intends to do about the situation.
A small hint of one solution was shown by the role played by the Healthcare Cybersecurity Communications and Intelligence Center (HCCIC), a fledgling threat analytics center that played a central role for HHS during the WannaCry incident.
HHS Senior Advisor for Healthcare Public Health Sector Cybersecurity Leo Scanlon, testifying at a June 8 hearing of the Oversight and Investigations Subcommittee of the Energy and Commerce Committee, described the role that a small threat-sharing watch floor played in supporting emergency response capabilities that the agency typically brings to natural disasters.
The watch floor supported a group of analysts who were tracking events in real time and communicating with private-sector partners through the National Healthcare Information Sharing and Analysis Center. This network provided real-time updates on current intelligence and was able to dispel rumors and bad guidance that proliferated on the internet. Organizations with capabilities to reverse engineer malware samples made their findings known hours and days ahead of official information from other government sources, putting HHS out in front of other government agencies in responding to the crisis.
The secret to the watch floor idea is that it puts the human back in the analysis loop. Automated sharing of threat indicators has been widely heralded as the “silver bullet” to defend against automated attacks. The Department of Homeland Security, for example, has a much-publicized program called Automated Indicator Sharing, which provides threat indicator reports to its subscribers. The reports are formatted and transmitted using protocols that can be consumed by network defenses and deployed at machine speed. The problem with this effort, and the many commercial variants that provide enormous amounts of similar threat data, is that there is no way to ingest and consume the information without analyzing it to determine if it is valid, or relevant to a particular environment.
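The gap described here, feeds that cannot be consumed blindly, comes down to triage: before an indicator is deployed to network defenses, someone or something must check that it is well formed and relevant to the local environment. The sketch below is a deliberately simplified illustration; the field names and sample feed are invented (real AIS data arrives as structured STIX reports), but it shows the kind of filtering an analyst performs:

```python
# Toy sketch of indicator triage: discard malformed entries and entries
# that target software the organization does not run. Field names and the
# sample feed are invented for illustration, not the AIS format.
def triage(indicators, local_software):
    """Keep only indicators that are well-formed and locally relevant."""
    relevant = []
    for ind in indicators:
        if not ind.get("value"):                # malformed entry: discard
            continue
        if ind.get("targets") and ind["targets"] not in local_software:
            continue                            # exploits software we lack
        relevant.append(ind)
    return relevant

feed = [
    {"value": "198.51.100.7", "targets": "Windows SMBv1"},
    {"value": "", "targets": "Windows SMBv1"},           # malformed
    {"value": "203.0.113.9", "targets": "Solaris 8"},    # not in our stack
]
kept = triage(feed, {"Windows SMBv1", "Apache 2.4"})
assert [i["value"] for i in kept] == ["198.51.100.7"]
```

Even this trivial filter requires knowing what software the network runs, which is exactly the local context that automated feeds lack and human analysts supply.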
That type of analytical work can only be done by highly trained specialists who know their networks and who can compare notes with colleagues in real time. Linking those groups of analysts to each other is a primary goal of the HCCIC, according to its director, Maggie Amato. Speaking at the (ISC)2 CyberSecureGov conference in June, Amato said the agency was building links with DHS, the Defense Department and private sector partners that would strengthen resilience for the entire sector. “We really do want to get to a place where we are collaborating with each other and cooperating across the board, having dynamic threat sharing and not just automated indicators,” she said.
Cybersecurity watch floors are being integrated with state and local emergency response fusion centers. New Jersey has established the NJCCIC, California has numerous centers that serve as hubs for universities and localities, and the Los Angeles Integrated Security Operations Center is recognized globally as a leader in municipal cybersecurity.
The trend is clear with these developments — defenders must share and crowdsource the same way the attackers do — the machines can’t do that on their own. The most important outcome of this human analytical activity will be much-needed context about new attack mechanisms and how they are used to exploit vulnerabilities in ways specific to different sectors. This understanding of tactics and techniques, in context, will add a critical degree of resiliency that is now lacking.
A federal judge recently unsealed the source code for a software program developed by New York City’s crime lab, exposing to public scrutiny a disputed technique for analyzing complex DNA evidence.
Judge Valerie Caproni of the Southern District of New York lifted a protective order in response to a motion by ProPublica, which argued that there was a public interest in disclosing the code. ProPublica has obtained the source code, known as the Forensic Statistical Tool, or FST, and published it on GitHub; two newly unredacted defense expert affidavits are also available.
“Everybody who has been the subject of an FST report now gets to find out to what extent that was inaccurate,” said Christopher Flood, a defense lawyer who has sought access to the code for several years. “And I mean everybody — whether they pleaded guilty before trial, or whether it was presented to a jury, or whether their case was dismissed. Everybody has a right to know, and the public has a right to know.”
Caproni’s ruling comes amid increased complaints by scientists and lawyers that flaws in the now-discontinued software program may have sent innocent people to prison. Similar legal fights for access to proprietary DNA analysis software are ongoing elsewhere in the U.S. At the same time, New York City policymakers are pushing for transparency for all of the city’s decision-making algorithms, from pre-trial risk assessments to predictive policing systems, to methods of assigning students to high schools.
DNA evidence has long been a valuable tool in criminal investigations, and matching a defendant’s genetic material with a sample found on a weapon or at a crime scene has impressed many a judge and jury. But as new types of DNA analysis have emerged in recent years to interpret trace amounts or complex mixtures that used to be dismissed as hopelessly ambiguous, the techniques are coming under fire as overly ambitious and mistake-prone.
An article ProPublica co-published with The New York Times on Sept. 4 detailed the growing doubts about the Forensic Statistical Tool, which New York City created to determine the likelihood that a given defendant’s DNA was present in a mixture of multiple people’s genetic material. According to the crime lab’s estimates, FST was used to analyze crime-scene evidence in about 1,350 cases over about 5 1/2 years. It was phased out at the beginning of this year in favor of a newer tool.
A coalition of New York City defense lawyers has called for a review of all cases that may have been affected by either FST or a second disputed analysis method, called high-sensitivity DNA testing. The state inspector general, whose office acts as the lab’s ombudsman, has received the lawyers’ request but has not yet announced whether she will launch an investigation.
The crime lab, which is part of the Office of the Chief Medical Examiner, did not oppose ProPublica’s motion but maintains its support of its technology. “I want to be very clear that OCME continues to stand behind the science that the FST source code operationalized and that we will continue to defend FST,” Florence Hutner, general counsel for the medical examiner’s office, wrote to the judge on Oct. 6.
She added that the lab agreed to full disclosure of the expert affidavits because the redactions had “exacerbated the substantial misunderstanding of fundamental aspects of the FST source code that is reflected in multiple published criticisms of that code.”
ProPublica’s motion came in a federal gun possession case, U.S. v. Kevin Johnson. Johnson was staying with his ex-girlfriend in the Bronx when police were called to her apartment and found two socks wedged between the refrigerator and the wall, one containing a black pistol and the other a silver revolver. By FST’s calculation, the DNA found on one gun was 156 times more likely than not to contain Johnson’s genetic material. DNA from the other gun had an overwhelming likelihood of 66 million.
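Figures like 156 and 66 million are likelihood ratios: they compare how probable the DNA evidence is if the defendant contributed to the sample versus if an unrelated person did. The sketch below is a drastically simplified illustration of that concept only. Real tools like FST model mixtures of multiple contributors, allele drop-out and drop-in across many loci, and the allele frequencies here are invented:

```python
# Drastically simplified illustration of a DNA likelihood ratio. This is
# NOT FST's model; the allele frequencies below are invented, and real
# mixture software handles far more complexity.
def likelihood_ratio(loci):
    """LR = P(evidence | suspect contributed) /
            P(evidence | unrelated person contributed).
    Toy assumption: the suspect's genotype matches exactly at every locus,
    so the numerator is 1 and the denominator is the random-match
    probability of each matching genotype."""
    lr = 1.0
    for p, q in loci:              # allele frequencies at one locus
        lr *= 1.0 / (2 * p * q)    # heterozygote frequency is 2pq
    return lr

# Three matching heterozygous loci with invented population frequencies:
lr = likelihood_ratio([(0.1, 0.2), (0.05, 0.3), (0.2, 0.25)])
assert round(lr) == 8333   # evidence ~8,300 times likelier if he contributed
```

Even in this toy, the result is exquisitely sensitive to the frequency inputs and modeling choices, which is why defense experts wanted to inspect how FST actually computed its ratios.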
In that case, Caproni became the first judge to order the lab to hand over the code for examination by the defense, but her protective order barred attorneys and experts from discussing or sharing it. Nathaniel Adams, a computer scientist and an engineer at a private forensics consulting firm in Ohio, reviewed the code for the defense and submitted an affidavit that was partially redacted before being made public. “The correctness of the behavior of the FST software should be seriously questioned,” he wrote in an unredacted section.
ProPublica’s motion, filed on Sept. 25 with the help of the Media Freedom and Information Access (MFIA) clinic at Yale Law School, argued that the judge should vacate that protective order because of “the profound importance of this technology to the integrity of the criminal justice system, and the overriding public interest in transparency.”
“This ruling finally enables ProPublica to gain access to the code in order to report on this matter of vital public concern,” said Hannah Bloch-Wehba, a supervising attorney in the MFIA clinic, following the judge’s order. “As law enforcement agencies increasingly rely on algorithmic tools in the criminal justice system, it is all the more important that the press and public have access to the information critical to understand what the government is doing and hold it accountable.”
FST was invented by employees of the crime lab and programmed by software consultants. The lab began using it in 2011 to analyze complex mixtures of DNA left behind at crime scenes. About 50 jurisdictions as far away as Bozeman, Mont., and Floresville, Texas, also sent samples to New York City for testing. When defense attorneys challenged FST’s results in court and sought access to the program’s source code, the crime lab refused, saying it was proprietary.
Although almost all judges have allowed FST results as evidence in court, one state judge, Mark Dwyer of Brooklyn, ruled them inadmissible in two cases in 2014. Dwyer, now presiding in Manhattan, excluded FST evidence from two more cases this week. While prosecutors in both cases said DNA evidence analyzed with FST showed that the defendants violated gun possession laws, Dwyer said in court on Oct. 16 that his doubts about the program’s acceptance in the scientific community persist, especially since the New York lab is no longer using it, and no other lab has adopted it.
New information about the development of the FST source code and some of its purported weaknesses surfaced this past July in the cases before Dwyer in an affidavit by Eugene Lien, a technical leader in the DNA lab, whom the prosecution was using as an expert. After the lab started using FST for casework in early 2011, he and his colleagues discovered a problem with the program’s math that could skew a test’s results, according to Lien. “Because of this, the FST program was taken offline and portions of the software were re-coded,” he wrote.
The lab did a “performance check” of the new version before resuming casework with it in July 2011, he went on, but lab officials did not inform the state oversight commission about the change, nor did they run another full validation study on the program.
The letter to the state inspector general from the group of defense lawyers cited Lien’s account, saying it contained “damning admissions” about the lab’s lack of transparency. They also theorized that the recoding Lien described could itself have led to one problem identified by Adams — the exclusion of potentially valuable data from FST’s calculations of likelihood ratios. Characterizing Adams’ criticisms as merely cosmetic rather than substantive, the lab has contended that FST calculations were reliable.
Besides ongoing criminal cases in New York City involving FST, Caproni’s decision to unseal the source code may also affect another legal fight for access to a proprietary DNA software system. The American Civil Liberties Union and the Electronic Frontier Foundation intervened in a case in California’s appeals court on Sept. 13 in support of a defendant’s right to review the source code behind a commercially available DNA analysis program called TrueAllele.
“It’s a major credit to the court, the parties and ProPublica that the source code used in Mr. Johnson’s case will now be subject to public scrutiny,” said Brett Max Kaufmann, a staff attorney for the ACLU who is working on the California appeals case. “We urge other courts to follow this example when hearing cases involving similar types of evidence.”
Outside the courtroom, some New York City lawmakers are seeking more public review of algorithms and their impacts. On Oct. 16, the New York City Council’s Committee on Technology held a hearing about a proposed bill calling for all city agencies to publish online the source codes for algorithms that they use in decision-making. As an example of the danger of relying on algorithms, witnesses and a committee report cited ProPublica’s 2016 investigation that found racial bias in a software program used by courts to decide whether it’s safe to let defendants out on bail.
“These tools seem to offer objectivity, but we must be cognizant of the fact that algorithms are simply a way of encoding assumptions, and their design can be biased, and that the very data they possess can be flawed,” the bill’s author and the committee chair, James Vacca, said at the hearing. “I have proposed this legislation not to prevent city agencies from taking advantage of cutting-edge tools, but to assure that when they do, they remain accountable to the public.”
The committee heard from defense lawyers and others who support the bill as well as representatives from Mayor Bill de Blasio’s Office of Data Analytics and the city’s Department of Information Technology and Telecommunications, which both oppose it in its current form. After the hearing, Vacca told ProPublica that he would revise the bill to address criticisms he had heard about confidentiality concerns, and also to clarify that the proposal applies both to programs developed by third-party vendors and to software developed in-house by city employees. Vacca said he is determined to pass a law on this issue before the end of his term.
“To my knowledge, we are the first city, and the first legislative body in our country, to take on this issue,” Vacca said during the hearing. “And as with so many other things, I’m hoping that New York City will set the example for others throughout the world.”
Officials at the Maryland Department of Transportation learned two things since opting into a pilot test of digital driver’s licenses: The licenses are more secure than plastic cards, and people really like the idea.
“On your physical license, you’re giving pretty much all of your major data to the individual who receives that license when you hand it over at a bar, a liquor store — anyplace you go,” said Chrissy Nizer, administrator of MDOT’s Motor Vehicle Administration. With a digital driver’s license, only age verification will display if that’s what the license is being used for. “They’ll see that you’re over 18 for tobacco purchases, that you’re over 21 for alcohol purchases, but they don’t see your exact date of birth, they don’t see your driver’s license number,” Nizer said. “They don’t see all that personal information — your address — that people are, frankly, probably a little reluctant to have a stranger see.”
MDOT opened the test of Gemalto’s digital driver’s license to employees and their families, and more than 400 people enrolled, enabling the state and the company — which is working under a $2 million, two-year grant from the National Institute of Standards and Technology on the licenses — to study how the application would work on multiple smartphone makes, models and operating systems. Eighty-six percent of the participants told the department they were very interested in moving forward with a digital driver’s license, and at public events that showcased the technology, people inquired about getting one right away, Nizer said.
“The interesting thing for us is the amount of interest from the general public,” she said. “We were really impressed by how much the general public is really hungry for this new technology.”
The test licenses used enrollees’ actual information on their actual smartphones, said Tiffany Conway, Gemalto’s field marketing manager for government programs in North America. There was a pre-enrollment process in which the voluntary testers provided their phone numbers and email addresses and registered from their smartphones. They received an authorization code enabling them to download the app from the Google Play Store or Apple App Store, and they set up a login code that they had to enter every time they opened the app.
An interesting security feature is that the keypad where users enter their login code rearranges its digits every time a user logs in, Nizer said. “If somebody happened to be watching you logging into your phone, they wouldn’t know what number you’re entering in because there’s a randomization of the keypad.”
A dynamic QR code is generated on the device that displays to an establishment’s QR reader only the information, such as age, needed to complete a particular transaction.
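The selective disclosure behind that QR code can be illustrated with a small sketch: the app assembles only the claims a given transaction needs, and that minimal payload is what gets encoded. The record, field names and disclosure profiles below are invented for illustration; they are not Gemalto's data model:

```python
import json

# Hypothetical license record and disclosure profiles, invented for
# illustration of selective disclosure.
FULL_RECORD = {
    "name": "Jane Driver",
    "dob": "1990-04-01",
    "address": "123 Main St",
    "license_no": "D123-4567",
    "over_18": True,
    "over_21": True,
}

DISCLOSURE_PROFILES = {
    "alcohol_purchase": ["over_21"],
    "tobacco_purchase": ["over_18"],
    "traffic_stop": ["name", "dob", "license_no", "address"],
}

def qr_payload(record, transaction):
    """Build the minimal claim set for one transaction; in a real system
    this JSON would then be encoded into the dynamic QR code."""
    claims = {k: record[k] for k in DISCLOSURE_PROFILES[transaction]}
    return json.dumps(claims)

bar_scan = json.loads(qr_payload(FULL_RECORD, "alcohol_purchase"))
assert bar_scan == {"over_21": True}   # no birth date, address or license number
```

A bar's reader learns only the over-21 claim, while a traffic stop profile would release the full identity fields, matching the behavior Nizer describes.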
In addition to security, the digital licenses let the state keep its information updated. For instance, if someone’s driving privileges get revoked, that change gets pushed out over the air and pops up when the license is scanned.
“You can’t get more accurate than a digital driver’s license that’s reflective of what’s on the Motor Vehicle Administration’s system,” Nizer said. “That definitely is an enhancement from the state perspective — the accuracy of that data and making sure if somebody is not entitled to have a valid status at that point that, that is reflected.”
Digital licenses are nearly impossible to replicate, or fake, Conway added. Each of the digital driver’s licenses issued by the state DMV has a PKI certificate, and “if that PKI is ever tampered with or not recognized or not there because a hacker has tried to do something funny … it would recognize it as invalid immediately,” Conway said.
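The tamper-check Conway describes can be sketched with a signed credential: any change to the signed data invalidates the signature. The sketch below uses a symmetric HMAC purely as a stand-in; a real deployment would use an asymmetric PKI certificate chained to the issuing DMV, and the key and record fields here are invented:

```python
import hashlib
import hmac
import json

# Stand-in for the issuer's signing key; a real system would use an
# asymmetric key pair with a certificate, not a shared secret.
ISSUER_KEY = b"dmv-demo-key"

def sign_license(record):
    """Sign a canonical serialization of the license record."""
    body = json.dumps(record, sort_keys=True).encode()
    return hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()

def verify_license(record, signature):
    """Recompute the signature; constant-time compare defeats timing attacks."""
    body = json.dumps(record, sort_keys=True).encode()
    expected = hmac.new(ISSUER_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)

record = {"license_no": "D123-4567", "status": "valid"}
sig = sign_license(record)
assert verify_license(record, sig)          # untampered: accepted

record["status"] = "revoked"                # any modification...
assert not verify_license(record, sig)      # ...is recognized as invalid
```

The same mechanism supports the over-the-air updates mentioned above: the DMV simply signs and pushes a fresh record, and any stale or altered copy fails verification.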
The information also is safe if people lose their phone, she added. DMVs would be able to remotely wipe the credential and reissue a replacement to a new device almost instantaneously.
The process for getting the digital license is similar to the one for getting a physical license, but DMV employees must take one extra step: pairing the phone with the information.
“In the long run, it really helps them with managing these credentials in the field,” Conway said. “When you think about issuing a piece of plastic, once it’s issued, it’s fixed. It’s static. There’s no updating that, and you’re pretty much relying on the good memory of your residents to come back in if they need to make an update either to their address or if the license is expiring or if something has changed with their restrictions. That doesn’t always happen.”
NIST awarded a grant to Gemalto in October 2016 to explore an interoperable federated identity credential, and the company brought on four jurisdictions to test the technology: Colorado, Idaho, Maryland and Washington, D.C. Wyoming recently joined the group, too.
Other states are trying out digital licenses. Iowa started testing a mobile ID in 2015, Louisiana passed legislation last year that could lead to a digital ID and Virginia’s General Assembly passed a bill this year enabling the DMV to “digitally verify the authenticity and validity of driver’s licenses.”