That is exactly what a botnet-based DDoS attack represents in the networking world. Coercion replaces loyalty as malware propagates through the network to grow the size of the herd. All you need is the right triggers (like people who will click through links) and crowd-sourcing takes over. Last year Craig Labovitz at Arbor described a DDoS of over 30 Gbps on an Asian mobile operator. It is likely to have been done with a herd numbering in the tens of thousands, rather than the million-strong botnets known to exist today.
This is a new turn on what was ‘affectionately’ called the Slashdot Effect in earlier days, but with a bad twist. Traffic is intentionally diverted by a botnet herder. Their target might be popular, but does not really appreciate (nor can it benefit from) the incoming volume.
The first thing to note about a DDoS (or DoS in general) is that trying to throttle such an attack is actually playing into the hands of the attacker: a self-inflicted denial of service. Trying a selective block is not very fruitful either, when the attack is well distributed over the Internet landscape and laden with guerrilla tactics.
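To see why a blunt throttle is self-defeating, here is a minimal sketch in Python. All numbers, and the single shared token-bucket design, are hypothetical assumptions for illustration, not any vendor’s implementation: once a flood drains the shared rate budget, legitimate requests get dropped almost as readily as attack traffic.

```python
import random

class TokenBucket:
    """A naive global rate limiter: every client draws from one shared bucket."""
    def __init__(self, capacity, refill_per_tick):
        self.capacity = capacity
        self.tokens = capacity
        self.refill = refill_per_tick

    def tick(self):
        # Top the bucket up once per simulated time slice.
        self.tokens = min(self.capacity, self.tokens + self.refill)

    def allow(self):
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

def simulate(attack_per_tick, legit_per_tick, ticks=100, seed=1):
    """Count legitimate requests served/dropped under the shared limiter."""
    random.seed(seed)
    bucket = TokenBucket(capacity=50, refill_per_tick=50)
    served = dropped = 0
    for _ in range(ticks):
        bucket.tick()
        # Attack and legitimate requests arrive interleaved at random.
        requests = ["atk"] * attack_per_tick + ["ok"] * legit_per_tick
        random.shuffle(requests)
        for kind in requests:
            allowed = bucket.allow()
            if kind == "ok":
                if allowed:
                    served += 1
                else:
                    dropped += 1
    return served, dropped

# Quiet day: a 50-requests/tick budget easily serves 10 legit requests per tick.
print(simulate(attack_per_tick=0, legit_per_tick=10))
# Under a 100x flood, the same throttle drops most legitimate traffic: the
# attacker achieved the denial of service through our own limiter.
print(simulate(attack_per_tick=1000, legit_per_tick=10))
```

Per-source accounting or upstream scrubbing avoids this trap, which is the point above: a blunt throttle finishes the attacker’s job.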
It is more of a pipe-dream to expect everyone to protect their end-hosts from malware attempts to subvert and assimilate them into a growing botnet. And waiting for a crusader (a good guy) that will spread along the malware channels to wipe out the bad boys has its own risks.
At a higher level, the Internet infrastructure could analyze elements of such a botnet and attempt to sterilize it (a quick and more generally visible example is OpenDNS).
If a potential target (company, nation; Google?) can afford distributed hosting on the Internet, it would make an intimidating challenge to the botnet, with multiple points that need to be overwhelmed before a successful DDoS is achieved. This is when a larger botnet will have to rear its head for a multiple DDoS.
As malware writers get more sophisticated, the attack itself becomes more silent and versatile. But DDoS is not a one-way tool and can be used by both sides. Around the start of this month, Aiplex Software was hired by the likes of the MPAA and RIAA to attack piracy sites. This instigated a retaliation which is currently in progress (www.aiplex.com is off-line as of this writing). So, which side wins with this? Neither, I guess. If anyone benefits, it would be the RBN, and ultimately, probably, terrorist organizations.
But, this is not yet the end of this story and over time I expect we will hear more on this round of DDoS.
Meanwhile, you can catch the background on this from a recent non-fiction book, Fatal System Error – The Hunt for the New Crime Lords Who are Bringing Down the Internet, by Joseph Menn. It runs through an account of such warfare over the last decade, taking specific examples (see Prolexic).
The morning article on the CAT 2010 results declared a “cent per cent” marks result. This was something that needed a second glance. Not to wonder about the abilities of Ankit Garg or his percentile-band-mates to reach there, but about the scoring system itself that declared someone had reached a full score: a cent per cent, a total 100! Or maybe to doubt the secondary articles that relay these results to the people.
I quote from a couple of articles that sample what you are probably reading this morning.
The CAT is out of the bag, and Ankit Garg is rolling in the satin. The 21-year-old from Chandigarh is among the rarest of the rare who landed cent per cent marks in the Common Admission Test (CAT) 2010, the entrance exam to India’s elite B-schools, including the Indian Institutes of Management (IIMs).
City lad Ankit Garg brought laurels to Chandigarh by scoring 100 percentile in the Common Aptitude Test (CAT). The result was declared on Sunday.
An examination system (and many a school exam would qualify to be quoted here) is not good enough if students can score a 100. That is a reference point never to be touched. Just like your car has a 220 kmph mark on the speedometer and your music system has a 100-percent setting on the volume knob, but you don’t go there (remember that old Michael Jackson video showing a volume knob with “Are you Nuts?”). When you need to touch that maximum number, it means you need better technology. Likewise, if you see an examination being ‘cracked’ by someone with a 100 percent score, you need to upgrade the examination (and of course laud the cracker too).
All this does not sound right for the CAT; they should know this already. CAT scores are not disclosed; it is the percentile that is declared. The result is therefore normalized to between 1 and 99 percentile (not 100, as many would like to say).
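A toy computation makes the distinction concrete. The CAT’s actual normalization formula is not public, so this sketch assumes the common textbook definition of a percentile, the percent of candidates scoring strictly below you. Under it, even the topper among N candidates lands at (N−1)/N × 100, never an exact 100.

```python
def percentile(candidate_score, all_scores):
    """Percent of candidates who scored strictly below the given score."""
    below = sum(1 for s in all_scores if s < candidate_score)
    return 100.0 * below / len(all_scores)

# 200,000 hypothetical candidates, all with distinct scores:
scores = list(range(200_000))
print(percentile(scores[-1], scores))  # topper: 99.9995, not 100
print(percentile(scores[0], scores))   # lowest scorer: exactly 0
```

So “100 percentile” in a headline is, at best, a rounding of a number like 99.9995, which is rather the point of the complaint above.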
Congratulations to Ankit Garg, Vivek Gupta and the top-band scorers of CAT 2010. May you live in interesting times where our media can describe your achievements better.
Some 40 years ago, Philip K. Dick penned a story of dark times with flying cars and unstable androids who wanted a life of their own. Almost two decades later, Ridley Scott directed Blade Runner which was (surprisingly for Scott) a much diluted dystopia (guess PKD was a bit too much for him).
The film and the book both have many things to talk about, and this post is not really about them. But one thing is notable in recent news. The Spinner flying cars from the plot represented a futuristic vehicle capable of being driven over the ground and flown through the air. Sound was necessary for proper effect, and the moving cars were shown giving off a high-pitched whistling noise (sound is important; check the Star Wars space-ship noises and compare them to the silence of 2001: A Space Odyssey). Many movies picked up this theme of flying cars and some even reproduced the sound made by these cars. One always wondered what sound advanced technologies would make as they replaced the engines of today.
Well, Nissan has taken up a different problem to solve for their new silent electric and hybrid cars: you need to have some noise to make a car safe for the external world. Curiously, they decided to use the Spinner’s whistle for this.
We have seen a lot of technology inspired by science fiction; this is a different instance of it.
Maybe we will see really silent glass lift doors with Star Trek sound effects to remind people when they operate.
A recent advisory from Microsoft (Microsoft Security Advisory 975497) says vulnerabilities in SMB could allow remote code execution. This “SMB2 zero-day” is focused on Microsoft Vista and Server 2008 systems.
If Vista were as widespread as Windows XP, this would have become a potential addition to the Conficker troupe. Interestingly, even Windows 7 (gaining momentum at the moment) does not seem vulnerable. Which does not preclude the Conficker writers themselves from deciding to add this new ‘tool’ to their variations. Return on investment may be the only reason they would not target Vista…
The Internet turns 40 shortly. This post is a snapshot of some things I see at the moment.
1. Conficker continues to proliferate
2. EBay to Sell Skype Stake to Group Led by Silver Lake
3. Opera allows you to host a web-site right off your laptops and desktops
4. You get DDoS Botnets on rent
And, we get back to Tetris as the best exercise for the brain (that one is un-numbered; leave it out :-).
Fresh reports of the McAfee update “felling PCs across the world” are sweeping across the news and help forums.
The problem seems to be a false-positive that surfaces when a new DAT file is patched on an older McAfee engine. The 4th of July holiday probably helped a bit too with people being a little less alert.
These things will happen, and the important lesson to learn is the critical focus required while working with security systems whose false positives carry a high cost. Hollywood has been cashing in on this idea for decades (remember RoboCop?). These things will get more interesting as we move to security systems more sophisticated than these.
I use Flickr to keep my pictures online and reference them from my photoblog. This is not very high-frequency usage; much like this blog. Yet, I have managed to hit Flickr’s 200-photo ceiling very quickly, in about 3 large sets of pictures. Yes, I am giving thought to the ‘Pro’ subscription option. But there is a slight resistance in my mind, and a feeling that if this were a Google service, things could have been different.
A short search found PicasaWeb in its ‘test’ phase. This is no Flickr competitor yet. But, needing some alternative to Flickr a little urgently, I decided to check it out. As of this writing, PicasaWeb is less than 100 days from launch and not even in beta; needless to say, this is too early for a review and I am by no means berating it.
What follows is a short comparative analysis of negative points (for brevity): if there is a point about some limitation of one service, the other service has an (at least relatively) better option on the same.
Problems with PicasaWeb.
1. You seem to need a Gmail account to use the album; not a very strong negative now.
2. You need Picasa installed to upload pictures in bulk; this could be quite restrictive. Though there appears to be an uploader for Mac users. There is also an ActiveX upload plugin if your browser supports ActiveX; otherwise you are limited to uploading one picture at a time.
3. Not very friendly to collaboration: uncomfortable comment handling, a single tag per picture, complicated tracking of friends’ albums, no multi-resolution storage of pictures. There is EXIF tracking for pictures, though.
4. 250MB limit in the free version: with high-resolution pictures you can quickly hit the ceiling. But this is better than the 200-picture limit at Flickr; Google is restricting by space rather than by number.
5. A bulk-subscription storage limit of 6GB. If I am paying, this is too small a storage size.
6. All extra storage is eliminated if your subscription expires. This is in line with the pay-for-storage policy, but as a user I am not happy to lose my pictures just because I stepped down from the subscription. Maybe that is because I know Flickr will not delete my pictures.
Problems with Flickr.
1. Does not handle EXIF tags; this kills half the fun of digital photography.
2. Downloading pictures in bulk is not easy; it takes multiple clicks to reach the right resolution for downloading each picture.
3. 20MB upload limit per month — could be uncomfortable for some people.
4. A 200-picture tracking limit; that’s a very short memory! But the fact that, even without a subscription, all your uploaded pictures are always retained in all supported resolutions is a powerful plus point.
While Flickr has moved to (what they call) the ‘Gamma’ stage, PicasaWeb is still in its early ‘test’ stage; there is still a chance for Google to clean up its act.
The Internet has not yet reached the critical-mass point for online photo services like it has for e-mail services today. The advantages of letting people keep good high-resolution digital pictures online: that’s for another post.
With the growing strength of open-source encyclopedic media (like Wikipedia) arise questions about accidental and, more importantly, intentional false information being published and retained on the web.
The Seigenthaler incident even suggests ways this could be done on Wikipedia by someone with malicious intent and know-how. Adding responsibility to the mechanism as Wales intends (disallowing anonymous creation of new entries) will probably not suffice. What is going to stop the anonymous from creating a few registrations?
Wikipedia does not require an e-mail ID to create a registration (an e-mail ID is considered a unique mapping to a person; but let’s not split hairs on that here) and has no protection against automated registrations through scripts (remember those small pictures with numbers and letters scrawled like a kid learning the alphabet?). Well, let’s assume Wales will introduce all these into the registration process while he is striking anonymous postings out.
But that does not solve the primary problem of well-placed misinformation being introduced. This was never a spam problem. At least it is not yet, while we don’t have robots doing this 🙂
About the postings being partisan (the Curry episode): it does sound a bit difficult for a dynamically managed information repository to remain objective. Personal bias will rule in little pockets all over; Wales has accepted this side of the coin.
So, where does this land? Would you trust the next page of information you read on Wikipedia?
It’s not that bleak, if you did not start browsing the Internet for information only today.
Here are a few things to start with.
1. All data on the Internet is put there with some purpose (and I am not talking theology here).
2. What is the probable ratio of people looking at a piece of information on Wikipedia who are (a) knowledgeable and interested in keeping it correct, to those (b) wanting to corrupt it?
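Item 2 above can be turned into a back-of-envelope model (all rates here are made-up assumptions for illustration): if a page receives g good-faith edits for every corrupting one, and any good-faith edit restores the page, the page spends roughly 1/(g+1) of its time corrupted.

```python
import random

def fraction_corrupted(fixers_per_vandal, visits=100_000, seed=7):
    """Monte Carlo toy model: each visit, one editor touches the page.
    A vandal corrupts it; any good-faith editor restores it. Returns
    the fraction of visits after which the page sits corrupted."""
    random.seed(seed)
    p_vandal = 1.0 / (1.0 + fixers_per_vandal)
    corrupted_visits = 0
    for _ in range(visits):
        if random.random() < p_vandal:
            corrupted_visits += 1  # page stays wrong until the next visit
    return corrupted_visits / visits

# With 50 well-meaning editors per vandal, the page is wrong ~2% of the time;
# with a 1:1 ratio, it is wrong about half the time.
print(round(fraction_corrupted(50), 3))
print(round(fraction_corrupted(1), 3))
```

The model says nothing about subtle, well-placed misinformation that good-faith readers fail to spot, which is the harder problem discussed above.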
You have very likely done this subconsciously already: formed a checklist of how to judge the value of Wikipedia pages. You can add to this list easily once you put your mind to it.
Edit: There is another angle to publishing with credibility: get an expert to review the content. This is something Digital Universe is working on. How does that model work? It would certainly be telling to follow where the likes of Digital Universe reach. The crux: is an expert objective in what they publish? Bias is at the core of human nature, be it at a non-profit organization or your regular school textbook publisher…
Google is starting to use its acquisition…
It was another flooded evening for Pune. The flow of people heading homewards was restricted to a few clear bridges and nudged through various traffic jams on the way.
The broad Mula crossover through Aundh over the Rajiv Gandhi bridge was blocked as water flooded its city-side ramp. All traffic was diverted to the neighbouring secondary bridge (a short treatise on the names of the two bridges I refer to here, by Salil).