Sys-Con Media publishes 437 copies of every article

I mentioned previously in My Sys-Con Nightmare that Sys-Con Media is spamming Google with duplicate content. Even I wasn't aware, however, of just how much duplicate content they were spewing onto the Internets.

This morning, I wrote a little more Python to download their list of topics and editors. Each of those topics and editors has its own subdomain, and every article is duplicated on every one of those subdomains.
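The scraping step is roughly this shape. This is a minimal sketch, not my actual script: the listing markup and the `example-pub.com` domain here are made-up stand-ins, and in practice you'd fetch the page with urllib first.

```python
import re

# Hypothetical markup for a topics/editors listing page -- the real page
# structure is an assumption. In practice you'd download it with urllib.
SAMPLE_LISTING = """
<ul class="topics">
  <li><a href="http://dotnet.example-pub.com/">.NET</a></li>
  <li><a href="http://java.example-pub.com/">Java</a></li>
  <li><a href="http://scottguthrie.example-pub.com/">Scott Guthrie</a></li>
</ul>
"""

def extract_subdomains(html, base_domain="example-pub.com"):
    """Pull the unique subdomain names out of a listing page."""
    pattern = re.compile(
        r'href="https?://([a-z0-9-]+)\.%s/?"' % re.escape(base_domain)
    )
    return sorted(set(pattern.findall(html)))

subdomains = extract_subdomains(SAMPLE_LISTING)
print(subdomains)  # ['dotnet', 'java', 'scottguthrie']
```

Each name that comes back is one more subdomain on which every single article is mirrored.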

So, on their main domain alone, there are 437 copies of every article. Their second domain is basically a mirror of the first, with different styling but the same content. So, across both domains, Sys-Con Media publishes 874 copies of every article in its system. The mirror was delisted from Google but still shows up in search results. Surely having 437 copies of every article on their system contravenes the rules on duplicate content in Google's Terms and Conditions?

Once more, visually

So how does having 437 copies of an article look? Here are the copies of just one article by Scott Guthrie, Corporate Vice-President at Microsoft and a Sys-Con author:

And keep in mind that these are the links to just one article. Every article on their system can be found at 437 different links on their main domain alone. (All links have rel="nofollow" on them to avoid giving Sys-Con any Google Juice.)
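To see where the 437-links-per-article figure comes from, the duplicates are just the cross product of subdomains, mirror domains, and an article ID. A toy sketch with three subdomains (the domain names and the article-path scheme are my assumptions, not Sys-Con's actual URL structure):

```python
# Every subdomain serves every article, so the duplicate URLs are the
# cross product of subdomains x domains for a single article ID.
subdomains = ["dotnet", "java", "scottguthrie"]      # ...imagine 437 of these
domains = ["example-pub.com", "example-mirror.com"]  # the site and its mirror
article_id = 439215  # the article ID searched for below

urls = [
    "http://%s.%s/node/%d" % (sub, dom, article_id)
    for dom in domains
    for sub in subdomains
]

print(len(urls))  # 3 subdomains x 2 domains = 6; with 437 subdomains it's 874
```

With the real 437 subdomains on each of the two domains, that comprehension yields 874 URLs for one article.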

Update: And this is how all that duplicate content looks on Google: search Google for inurl:sys-con inurl:439215. There are 241 results returned for just that one article. In other words, Google has well and truly been spammed by Sys-Con Media.