Odyssey

…the wanderings

DDoS Wars

A DDoS attack is far easier to inflict than to deflect or defend against. Think of it as guerrilla warfare waged with a herd of zombies.

That is exactly what a botnet-based DDoS attack represents in the networking world. Coercion replaces loyalty as malware propagates through the network to grow the herd. All you need is the right triggers (like people who will click through links) and crowd-sourcing takes over. Last year Craig Labovitz at Arbor described a DDoS of over 30 Gbps against an Asian mobile operator. It was likely mounted with a herd numbering in the tens of thousands, rather than the million-strong botnets known to exist today.

This is a new turn on what was ‘affectionately’ called the Slashdot effect in earlier days, but with a malicious twist. Traffic is intentionally directed at the target by a botnet herder. The target might be popular, but it neither appreciates nor can benefit from the incoming volume.

The first thing to note about a DDoS (or DoS in general) is that trying to throttle such an attack is actually playing into the hands of the attacker: a self-inflicted denial of service. Trying a selective block is not very fruitful either when the attack is well distributed over the Internet landscape and laden with guerrilla tactics.
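
To make that concrete, here is a toy back-of-the-envelope sketch of how a per-source rate limit fares against a well-distributed herd. Every number is invented for illustration and this is not any real mitigation tool:

```python
# Toy illustration of why per-source throttling struggles against a
# distributed herd: each bot stays under the per-IP threshold, yet the
# aggregate still swamps the target, while tightening the threshold
# starts punishing legitimate users instead. All figures are invented.

BOTS = 50_000        # hypothetical herd size
BOT_RATE = 2         # requests/sec per bot: deliberately low and slow
USER_RATE = 5        # requests/sec from a typical legitimate client
CAPACITY = 20_000    # requests/sec the target can actually serve

def blocked(per_ip_limit, rate):
    """A source is dropped once it exceeds the per-IP rate limit."""
    return rate > per_ip_limit

for limit in (10, 4, 1):
    attack = 0 if blocked(limit, BOT_RATE) else BOTS * BOT_RATE
    user_ok = not blocked(limit, USER_RATE)
    print(f"limit={limit:>2}/s  attack={attack:>6}/s of {CAPACITY}/s capacity"
          f"  legitimate user served: {user_ok}")
```

With the loose limit the herd sails straight through and the target drowns anyway; with the tight limit the bots are stopped, but so are real users, and the site has in effect DoS'ed itself.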

It is a pipe dream to expect that everyone will protect their end hosts from malware attempts to subvert them and assimilate them into a growing botnet. And waiting for a crusader (a good-guy worm) that will spread along the malware channels to wipe out the bad boys has its own risks.

At a higher level, the Internet infrastructure could analyze elements of such a botnet and attempt to sterilize it (a quick and widely visible example is OpenDNS).
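
As a rough sketch of what that infrastructure-level sterilization might look like (domain names and addresses below are invented, and OpenDNS's actual service is of course far more involved), a resolver can simply refuse to hand out real answers for names known to belong to a botnet's command-and-control:

```python
# Minimal DNS-sinkhole sketch: queries for known command-and-control
# names get a harmless sinkhole address, so infected hosts never find
# their herder. Names and addresses are invented for illustration.

CC_BLOCKLIST = {"evil-herder.example", "rendezvous.example"}
SINKHOLE_IP = "192.0.2.1"                    # RFC 5737 documentation address

def resolve(name, upstream):
    """Answer from the sinkhole for blocklisted names, else ask upstream."""
    if name.lower().rstrip(".") in CC_BLOCKLIST:
        return SINKHOLE_IP
    return upstream(name)

# toy stand-in for a real upstream DNS lookup
fake_upstream = {"www.example.com": "93.184.216.34"}.get

print(resolve("evil-herder.example", fake_upstream))   # -> 192.0.2.1
print(resolve("www.example.com", fake_upstream))       # -> 93.184.216.34
```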

If a potential target (a company, a nation; Google?) can afford distributed hosting across the Internet, it poses an intimidating challenge to the botnet: multiple points now need to be overwhelmed before a successful DDoS is achieved. This is when a larger botnet will have to rear its head for a multiple, simultaneous DDoS.

As malware writers get more sophisticated, the attack itself becomes more silent and versatile. But DDoS is not a one-way tool and can be used by both sides. Around the start of this month, Aiplex Software was hired by the likes of the MPAA and RIAA to attack piracy sites. This instigated a retaliation which is currently in progress (www.aiplex.com is off-line as of this writing). So which side wins here? Neither, I guess; if anyone benefits, it would be the RBN, and ultimately, probably, terrorist organizations.

But this is not yet the end of the story, and over time I expect we will hear more about this round of DDoS.

Meanwhile, you can catch the background on this from a recent non-fiction book, Fatal System Error – The Hunt for the New Crime Lords Who Are Bringing Down the Internet, by Joseph Menn. It runs through an account of such warfare over the last decade, taking specific examples (see Prolexic).

[Figure: DDoS Evolution (Prolexic), a timeline showing the progress of DDoS attacks]


September 30, 2010 | reading, security

Insecurities

I wonder when people will stop blaming the OS vendors and start taking the security of their machines seriously…

June 20, 2009 | security

Conficker and the Curious Yellow

I was not planning on writing another post so soon, but Utopiah has referred me to a very nice article in his comment on my previous post. If you have not already read Brandon Wiley’s Curious Yellow: The First Coordinated Worm Design, I urge you to read it through.

It hits the spot on fast distribution through a peer-to-peer network. I used the same concept to hypothesize patch propagation (what he describes as an anti-worm).
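
For a rough feel of why that distribution model matters, here is a small simulation of a patch spreading peer-to-peer, each patched host pushing the fix to a few random peers per round. The population size and fan-out are arbitrary illustration values, not anything taken from Wiley's paper:

```python
# Toy simulation of anti-worm style patch propagation: every patched host
# forwards the fix to FANOUT random peers each round, so coverage grows
# roughly exponentially, just as a coordinated worm would spread.
import random

POPULATION = 100_000      # vulnerable hosts (illustrative)
FANOUT = 3                # peers each patched host contacts per round

random.seed(1)
patched = {0}             # patient zero: the host that first receives the fix
rounds = 0
while len(patched) < 0.99 * POPULATION:     # stop at ~99% coverage
    rounds += 1
    contacted = set()
    for _ in patched:
        contacted.update(random.randrange(POPULATION) for _ in range(FANOUT))
    patched |= contacted
    print(f"round {rounds:>2}: {len(patched):>6} of {POPULATION} hosts patched")
```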

The paper describes a scenario comparable to a powerful chess game, turning the yellow worm blue and back again. Probably in recognition of this idea, Conficker uses the latest cryptography, very likely making it the first field implementation of the MD6 algorithm, including its fixes! It appears that the Conficker writer is very well versed in this paper and in current technology 🙂

Besides the points made by Wiley on that page, there is one more ‘common goal’ such a network can target, and I am sure it is already stated somewhere: these compromised systems can be pooled to brute-force encryption.
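
To put a rough number on that, here is the back-of-the-envelope arithmetic; the key size, herd size and per-host rate are all made-up illustration values, not a claim about any real cipher:

```python
# Pooled brute force, illustrated: splitting an exhaustive key search
# across many hosts divides the work linearly. All figures are made up.

KEYSPACE = 2 ** 64            # toy 64-bit key space
HERD = 1_000_000              # compromised hosts pooled together
RATE_PER_HOST = 10_000_000    # candidate keys tried per second per host

SECONDS_PER_YEAR = 3600 * 24 * 365
single_host_years = KEYSPACE / RATE_PER_HOST / SECONDS_PER_YEAR
pooled_days = KEYSPACE / (HERD * RATE_PER_HOST) / (3600 * 24)

print(f"one host, exhaustive search: ~{single_host_years:,.0f} years")
print(f"{HERD:,} hosts pooled:       ~{pooled_days:,.0f} days")
```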

April 21, 2009 | security

The White Botnet

This is a work of fiction. Any resemblance to reality is entirely unexpected. All similarities (like pigs can fly) are coincidental. Of course, all trademark names used here (starting right from the next line) are property of their owners.

As the first quarter of 2009 ended, people had mixed feelings about the Conficker worm (aka Downadup, Kido). It was neither a joke nor an immediate disaster. But very few knew that this was a beta run of what would eventually be a white-hat vulnerability-patching network. It was clear that the botnet could only hit systems that had not been patched for a long-known vulnerability. The infection smartly started protecting the systems it conquered and made them safe from further malware. It moved on to become a server of protection that located other weak hosts and propagated towards them in a race against other malware.

Microsoft Windows machines that are not patched against known attack vectors usually stay that way because of pirated software or overworked IT administrators. Is that a good enough reason for malware to propagate towards unprepared legal users? That is where the Open Group came together to build a distributed protection system. This system had to work as a secondary solution in tandem with the existing anti-virus and anti-spyware defenses. It also had to be disconnected from them and, for that very reason, at crossroads with these solutions.

The solution is to propagate a neutralizing white botnet across the Internet. It is maintained by a group that partly consists of people from the AV/AS, OS vendor and search-engine companies, though most of these vendors are themselves not yet directly associated with it. Google has tweaked its search algorithms to locate and assimilate zero-day vulnerability information quickly. These public postings are verified (because they might be poisoned) and the associated patches are pushed through the white botnet to manage the ‘compromised’ machines. The window of attack shrinks back to the time it takes to find a patch for a zero-day exploit. All hosts will be patched, one way or the other.
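
A minimal sketch of that verify-before-push step might look like the following. The digest registry, patch name, peer list and apply/send stubs are all hypothetical plumbing invented for this story; a real design would rely on proper public-key signatures rather than a bare table of hashes:

```python
# Hypothetical white-botnet node logic: only apply and forward a patch
# whose digest matches one vouched for by the verification group.
import hashlib

# digests the (hypothetical) verification group has signed off on
TRUSTED_DIGESTS = {
    "ms08-067-fix": hashlib.sha256(b"example patch payload").hexdigest(),
}

def apply_locally(payload):
    """Stand-in for the real patch installer."""
    print(f"applied {len(payload)}-byte patch")

def send(peer, name, payload):
    """Stand-in for the peer-to-peer transport."""
    print(f"forwarded {name} to {peer}")

def handle_patch(name, payload, peers):
    digest = hashlib.sha256(payload).hexdigest()
    if TRUSTED_DIGESTS.get(name) != digest:
        return False                 # unknown or poisoned: drop it
    apply_locally(payload)           # install the fix on this host
    for peer in peers:               # then propagate it onward
        send(peer, name, payload)
    return True

handle_patch("ms08-067-fix", b"example patch payload", ["10.0.0.2", "10.0.0.3"])
```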

…and pigs will fly!

April 20, 2009 | writing