The Lesson of the GitHub DDoS Attack: Why Your Web Host Matters
By Adam Stern | Hosting Journalist
Surviving a cyberattack isn’t like weathering a Cat 5 hurricane or coming through a 7.0 earthquake unscathed. Granting that natural disasters too often have horrendous consequences, there’s also a “right place, right time” element to making it through. Cyber-disasters – which can be every bit as calamitous in their own way as acts of nature – don’t typically bend to the element of chance. If you come out the other side intact, it’s probably no accident. It is, instead, the result of specific choices, tools, policies and practices that can be codified and emulated – and that need to be reinforced.
Consider the recent case of GitHub, the target of the largest DDoS attack recorded at the time: a 1.35 Tbps memcached amplification assault in February 2018. GitHub’s experience is instructive, and perhaps the biggest takeaway can be expressed in four simple words: Your web host matters.
That’s especially crucial where security is concerned. Cloud security isn’t like filling out a job application; it’s not a matter of checking boxes and moving on. Piecemeal approaches to security simply don’t work. Patching a hole or fixing a bug, and then putting it “behind” you – that’s hardly the stuff of which effective security policies are made. Because security is a moving target, scattershot repairs ignore the hundreds or even thousands of points of vulnerability that a policy of continuous monitoring can help mitigate.
Any cloud provider worth its salt brings to the task a phalanx of time-tested tools, procedures and technologies that ensure continuous uptime, regular backups, data redundancy, data encryption, anti-virus/anti-malware deployment, multiple firewalls, intrusion prevention and round-the-clock monitoring. So while data is considerably safer in the cloud than beached on equipment under someone’s desk, there is no substitute for active vigilance – accent on active, since vigilance is both a mindset and a verb. About that mindset: sound security planning requires assessing threats, choosing tools to meet those threats, implementing those tools, assessing the effectiveness of the tools implemented – and repeating this process on an ongoing basis.
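To make one of those line items concrete – round-the-clock monitoring – here is a minimal sketch in Python of the kind of recurring health check such a regimen implies. The endpoint URL, polling interval and print-based alert are hypothetical stand-ins; a real provider runs a dedicated monitoring stack, not a bare script.

```python
# A minimal monitoring-loop sketch. ENDPOINTS and the alert action are
# hypothetical placeholders, not any provider's actual configuration.
import time
import urllib.request

ENDPOINTS = ["https://example.com/health"]  # hypothetical health-check URL

def is_healthy(url: str, timeout: float = 5.0) -> bool:
    """Return True if the endpoint answers with an HTTP 2xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except OSError:  # covers URLError/HTTPError, timeouts, DNS failures
        return False

while True:
    for url in ENDPOINTS:
        if not is_healthy(url):
            print(f"ALERT: {url} failed its health check")  # stand-in for paging
    time.sleep(60)  # poll every minute, around the clock
```

The point of the sketch is the shape, not the script: assess, act, reassess, repeat – the same loop the planning process above describes.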
Among the elements of a basic cybersecurity routine: setting password expirations, obtaining and renewing certificates, avoiding the use of public networks, meeting with staff about security, and so on. Perfection in countering cyberattacks is as elusive here as it is in any other endeavor. Even so, that can’t be an argument for complacency or anything less than maximum due diligence, backed up by the most capable technology at each organization’s disposal.
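As one concrete item from that routine, the sketch below flags TLS certificates that are close to expiring – “obtaining certificates” implies keeping them renewed. The host inventory and the 30-day threshold are assumptions for illustration.

```python
# A certificate-expiry check sketch. The host list and threshold are
# hypothetical; substitute your own inventory and renewal policy.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    """Fetch the server's certificate and return days until its notAfter date."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    # notAfter looks like 'Jun  1 12:00:00 2025 GMT'
    expires = datetime.strptime(cert["notAfter"], "%b %d %H:%M:%S %Y %Z")
    return (expires.replace(tzinfo=timezone.utc) - datetime.now(timezone.utc)).days

for host in ["example.com"]:  # hypothetical host inventory
    remaining = days_until_expiry(host)
    if remaining < 30:  # assumed renewal threshold
        print(f"WARN: certificate for {host} expires in {remaining} days")
```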
In this turn of events is a counterintuitive lesson about who and what is most vulnerable during a hack. The experience of public cloud providers should put to rest the notion that the cloud isn’t safe. GitHub’s experience makes a compelling argument that the cloud is in fact the safest place to be in a cyber hurricane. Internal IT departments, fixated on their own in-house mixology, can be affected big-time – as they were in a number of recent ransomware attacks – raising the very legitimate question of why some roll-your-own organizations devote precious resources – including Bitcoin, when ransoms come due – to those departments in the belief that the cloud is a snakepit.
Cloud security isn’t what it used to be – and that’s a profound compliment to the cloud industry’s maturity and sophistication. What was once porous has been substantially hardened, which isn’t to deny that bad actors have raised their game as well. Some aspects of cloud migration have always been threatening to the old guard. Here and there, vendors and other members of the IT community have fostered misconceptions about security in the cloud – not in an effort to thwart migration but in a bid to control it. Fear fuels both confusion and dependence.
Sadly, while established cloud security protocols should be standard-issue stuff, they aren’t. The conventional wisdom is that one cloud hosting company is the same as another: because they’re all committed to life off-premises, they all must do the exact same thing, their feature sets are interchangeable, and the underlying architecture is immaterial. The message is that it doesn’t matter what equipment they’re using – that it doesn’t matter what choice you make. But in fact, it does. Never mind the analysts; cloud computing is not a commodity business. And never mind the Street; investors and Certain Others fervently want it to be a commodity, but because those Certain Others go by the names of Microsoft and Amazon, fuzzing the story won’t fly. They want to grab business on price and make scads of money on volume (which they are doing).
The push to reduce and simplify is being driven by a combination of marketing gurus who are unfamiliar with the technology and industry pundits who believe everything can be plotted on a two-dimensional graph. Service providers are delivering products that don’t necessarily fit the mold, and it’s ultimately pointless to squeeze those technologies into two or three dimensions; the emerging solutions are far more nuanced than that.
Vendors need to level with users. The devil really is in the details. There are literally hundreds of decisions to make when architecting a solution, and those choices mean that no solution is a commodity. Digital transformation isn’t going to emerge from some marketing contrivance, but from technologies that make cloud computing more secure, more accessible and more cost-effective.