Stephen Speicher

Jihad in cyberspace


There seems to be a current series of denial-of-service attacks on blogs that speak out against the terrorists. See this entry from Michelle Malkin's blog, which places Saudi Arabia as the source of the attacks.


There seems to be a current series of denial-of-service attacks on blogs that speak out against the terrorists. See this entry from Michelle Malkin's blog, which places Saudi Arabia as the source of the attacks.

Aren't they our allies? :)

Anyway, one more reason why a mass republication of the Mohammed cartoons would lessen the effect of such attacks - they can't DOS thousands of sites at once. Taking over their countries would help too.


For anyone: What is the legal status of such attacks? Does the U.S.A., for example, have laws prohibiting such attacks? If so, who enforces such laws and with what penalties? Or is the situation anarchic?


For anyone: What is the legal status of such attacks? Does the U.S.A., for example, have laws prohibiting such attacks? If so, who enforces such laws and with what penalties? Or is the situation anarchic?

They are illegal in the U.S. I'm fairly sure they're a federal crime, theoretically investigated by the FBI. Since they originated in Saudi Arabia, the predictable course of events is for Bush to send them a stern warning to play nice. Most of the 9/11 hijackers were from S.A., and Washington still considers them some kind of ally, so I don't think a "little" thing like attacks on free speech within the U.S. is something they care about.

The first thought that came to mind was to mass-firewall all traffic from that part of the world before it even reaches the U.S. That is almost certainly possible technically, but it would never fly politically.


For anyone: What is the legal status of such attacks? Does the U.S.A., for example, have laws prohibiting such attacks? If so, who enforces such laws and with what penalties? Or is the situation anarchic?

Something to keep in mind: just because the traffic is originating from Saudi Arabia does not mean that is where the attacker is. It is just the location of the machines, called "zombies," that the attacker has hacked and taken control of. The attacker can literally be anywhere in the world, and it is extremely difficult to track down who and where that hacker is. Denial-of-service attacks are almost always carried out by swarms of zombies controlled as a network; a single individual can have thousands, or even tens of thousands, of systems under his control.

So while these attacks are illegal, enforcing the law amounts to trying to find a needle in an Internet haystack. Catching one of these guys usually involves listening in on the communications between the zombies and hoping you get lucky, or hoping the attacker does something really stupid. To make it worse, a lot of countries will not extradite for these attacks.

It is not quite anarchy but the technology very much favors the offense over the defense.


Something to keep in mind: just because the traffic is originating from Saudi Arabia does not mean that is where the attacker is. It is just the location of the machines, called "zombies," that the attacker has hacked and taken control of. The attacker can literally be anywhere in the world.

Yes, in *some* cases. There are tens of millions of PCs in the U.S. alone that are reportedly "zombies", for sale to anybody wishing to use them for spamming and, I suppose, DOS attacks and anything else they want. (Blame Microsoft's lax Internet Explorer security, before they got around to tightening it, for many of these zombies.)

But an attack originating from machines in S.A. is very suggestive that it might be something different. Why indeed would they use "zombies" in a country that has far fewer PCs than the U.S. or Europe? Saudi Arabia is floating in oil money, and more than one terrorist there (recall that bin Laden was a Saudi billionaire) surely has access to the funds to just plain buy a lot of machines and the biggest bandwidth connections they can find. They could probably spread them statistically across enough IP addresses to cover whatever range goes in and out of Saudi Arabia. Is this acceptable? Of course not. But when the U.S. government cynically calls S.A. an ally even though most of the 9/11 terrorists were Saudis, and S.A. is a terrible Islamic dictatorship, I'm not sure what the hell it would take for Bush and co. to do the right thing, i.e., flatten Saudi Arabia's disgusting little empire and flat out take the oil fields back. If they blow up the fields before we can secure them, hire the guy who put out the Kuwait fires in pretty short order after Iraq did something similar.

It's either going to come down to that or these fleabitten mystics are going to eventually figure out how to use WMD to destroy America while Washington wrings its hands begging the U.N. to "do something". I hope the former happens first.

They are illegal in the U.S. I'm fairly sure they're a federal crime, theoretically investigated by the FBI.

As far as I know, the first large-scale investigation of denial of service leading to prosecution occurred in Los Angeles two years ago. Here is an FBI pointer to one of the criminals, who is still at large. I seem to recall that five others were caught and prosecuted. Note that the write-up mentions that the DOS was used for "a commercial purpose," so I wonder what the actual law states. The Patriot Act also makes some mention of criminality for DOS, but perhaps that has a limited context too.

There are tens of millions of PCs in the U.S. alone that are reportedly "zombies", for sale to anybody wishing to use them for spamming and, I suppose, DOS attacks and anything else they want.

There are some 200 million personal computers in the United States, so "tens of millions" of zombies seemed extraordinarily high. But then I went to CipherTrust's ZombieMeter for a real-time estimate of zombies being created. I was amazed that, as of this moment, 964,020 new zombies had been added in the United States for the current month of April, along with some 735,598 in China - amazing numbers. And the United States figure for the month represents less than 20% of the total worldwide, which implies more than 4.8 million new zombies worldwide this month. This gives a different perspective to the notion of weapons of mass destruction.

I don't know how many of these zombies are available for sale, or how many are actually in use at any given time, but almost a million new zombie PCs added per month in the United States alone is astounding.

But an attack originating from machines in S.A. is very suggestive that it might be something different.

It's definitely suspicious, but we really don't know a lot at this point. I've seen comments on two blogs saying Hosting Matters stated the attack is coming from Saudi Arabia. Beyond that, the only thing we can say for sure is that it is big enough to take down servers at Hosting Matters, which has a fair amount of experience with DDOS attacks. It could be some Saudi prince dropping enough money on infrastructure to pump traffic at the infidel of his choice. If that is the case, eventually the Tier 1 Internet providers will get involved and shut him down.


There are some 200 million personal computers in the United States, so "tens of millions" of zombies seemed extraordinarily high.

Yes, unfortunately it's not an exaggeration. From everything I've read about the number lately, it's a sizeable percentage of the PCs in the U.S. alone. I think Microsoft should be doing a lot more to address the problem, since it was their Swiss-cheese security in Internet Explorer, Windows, and Microsoft Office (especially Outlook, but also Word and Excel) that made so many systems vulnerable. They have Windows Defender for free download, and it's good, but not good enough.

And it isn't as though they don't have the resources to fix this mess, with literally $50 billion in cash. This is essentially a software problem, and a fix could readily be distributed through their existing Windows Update network. I think identifying the top 10 programmers on the planet and offering them $1 million each to work as a team on a solution for a few months would probably work - it would be pocket change to the company, but the hardest part for them would probably be accepting the reality that few, if any, of those programmers work at Microsoft.

And it isn't as though they don't have the resources to fix this mess, with literally $50 billion in cash. This is essentially a software problem, and a fix could readily be distributed through their existing Windows Update network. I think identifying the top 10 programmers on the planet and offering them $1 million each to work as a team on a solution for a few months would probably work - it would be pocket change to the company, but the hardest part for them would probably be accepting the reality that few, if any, of those programmers work at Microsoft.

Sometimes, like here and now, I just do not understand the source and depth of such negativity. I mean, here is this company that is almost single-handedly responsible for this age of the personal computer, and supposedly they are unwilling to accept a simple solution to a big problem because it implies that the best of programmers do not work for them? I find it difficult to treat such a notion in a serious vein. What was it, then, just luck and good timing that so fortuitously allowed this second-rate group to create an entire industry where almost nothing existed before? We just got through posting about the 200 million personal computers existing in the United States; just how many of those do you think run Linux or Apple?

I have heard all the stories about how computer users do not know what is good for them and the like, but the fact is - the bottom line is - that people use Windows because it all works. And if it wasn't good programmers who created such value, then where did it come from? I do not mean programmers who exist on some Platonic level, but the very real, talented people who created a system of enormous value. Isn't it about time that these people and this company get acknowledged for their accomplishments, instead of being criticized at every opportunity from some godly perch from which it appears they did not accomplish all that they should have?

Honestly, Phil, if 10 first-rate programmers and $10 million could solve in a few months a problem that has plagued Microsoft for years, do you really think that they would not jump at the solution because of some false sense of pride? Look, personally, I am not a big fan of Bill Gates, and I have used Linux from the day it was born, but Microsoft and Windows are the best thing since sliced bread - at least until Vista comes along.

For anyone: What is the legal status of such attacks? Does the U.S.A., for example, have laws prohibiting such attacks? If so, who enforces such laws and with what penalties? Or is the situation anarchic?

Cybercrime.gov http://www.cybercrime.gov/

http://www.cybercrime.gov/cclaws.html

12/28/05: Man Pleads Guilty to Infecting Thousands of Computers Using Worm Program, then Launching Them in Denial of Service Attacks - "Botnet" Investigation Led by U.S. Secret Service's Electronic Crimes Task Force and the Computer Hacking and Intellectual Property Unit of the U.S. Attorney's Office. http://www.cybercrime.gov/clarkPlea.htm

18 U.S.C. §1030 http://www4.law.cornell.edu/uscode/html/us...30----000-.html


Sometimes, like here and now, I just do not understand the source and depth of such negativity.

I primarily use Windows (as well as Linux and Mac OS X), and I've always admired the success of Bill Gates and the fact that he created the world's first true software company. It's a history I am very familiar with. Gates is extraordinarily intelligent and tried to instill a policy of hiring very bright people. This certainly set them apart for many years and led to their dominant success, which has been honestly earned - and unfortunately attacked by antitrust and other such irrationalities.

But I think they've been steadily dropping the ball as they get larger, more bureaucratic, and more complacent in a number of ways. I have many reasons for saying this, from years of personal observation, that I won't try to put into this post.

It did not have to take them as long as it did to start rectifying the security problems, and most of them could have been (and often were) foreseen years ago. Running executable binary code in the browser that can do *anything* permitted in the user's security context (ActiveX controls) was identified as inherently perilous from a security standpoint long ago, but not really addressed until relatively recently - and it has been the source of many problems. Permitting incoming emails from any source to execute macros in the same security context as the user running Outlook - macros that can do virtually anything - is another big source of problems (and the reason I've used Eudora for email for over 10 years). And there are many other issues, some foreseen, others more subtle.

Because they did drop the ball on that security, many of those millions of trusting customers have had their data totally compromised - potentially every Quicken file, every Word document, every Excel spreadsheet, every photo can be copied to the attacker's systems for analysis (and, e.g., identity theft, which is one reason it's so much on the rise). And that's not counting that their systems can be used at will to perform DOS attacks, send spam, and so forth.

So it was not taking the problem seriously that led to where we are today, and it is not taking the problem really seriously now to let a million more PCs become infected with this junk every month. And it is largely a software issue. Their best specific solution so far is Windows Defender, based on a product developed outside Microsoft and acquired in late 2004.

Do I seriously think a small team of the most talented programmers could do better than Microsoft has at fixing the problem? Yes, I do. At the upper fractions of talent, quantity can't substitute for quality: one thousand above-average physicists added together do not equal the abilities of one Einstein, and I think the same idea holds true in any profession (the analogy holds even if we're talking about the current best rather than a rare genius). But it's practically certain that those people already have work - hence my suggestion of a large financial incentive to get them to work on *that* particular problem.

It's an emergency problem that needs to be fixed, and I simply don't think they're doing what's needed to fix it quickly enough; they should accept some of the responsibility for its existence. Certainly, the criminals who exploited the weaknesses bear primary responsibility, but that doesn't excuse Microsoft for ignoring the weaknesses for so long, nor for failing to treat the situation as the emergency that it is.


I have to strongly disagree with your view, PhilO.

They might not execute your specific strategy, but that does not mean Microsoft doesn't take security very seriously. They have already spent millions of dollars addressing the problem.

A quick question: have you looked at what exactly the majority of security exploits are? Most security mistakes are buffer overflows.

For anyone in this forum who does not know what a buffer overflow is: it is a programming mistake where user input is copied into a fixed-size area of memory without its length being checked. Input that is larger than the buffer overwrites the memory adjacent to it. Hackers craft their input so that it overwrites memory the program will execute, so that the program runs their input instead of the real code. After that, the hacker is running his own code on your computer.
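To make that concrete, here is a minimal sketch of the mistake and of the fix - my own illustration, not code from any actual product:

#include <stdio.h>
#include <string.h>

/* The classic mistake: buf is 16 bytes, but strcpy copies however
   much the caller supplies. Anything past the 16th byte silently
   overwrites adjacent memory. */
void vulnerable(const char *input)
{
    char buf[16];
    strcpy(buf, input);        /* no length check - overflow waiting to happen */
    printf("%s\n", buf);
}

/* The checked version measures the input first and refuses anything
   that cannot fit, so the buffer can never be overrun. */
int checked(const char *input)
{
    char buf[16];
    if (strlen(input) >= sizeof(buf))
        return -1;             /* too long: reject rather than overflow */
    strcpy(buf, input);
    printf("%s\n", buf);
    return 0;
}

int main(void)
{
    checked("short and safe");
    /* vulnerable("this string is far too long for a 16-byte buffer"); */
    return 0;
}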

To address that, Microsoft has done a number of things. First off, they put Data Execution Prevention (DEP) into Windows XP with Service Pack 2. It stops the processor from executing code out of memory that is marked as data, so an overflowed buffer can no longer be run as code; instead, the program is terminated with an error before any damage to the system is done. To utilise that feature fully, though, they have to rely on a hardware feature (the no-execute bit) in the user's CPU. Modern CPU cores have it, but there is still a large user base with older computers that don't support it. The security situation will improve significantly, without any further changes, as time goes on, old systems are retired, and new systems replace them. Short of buying users new computers, there is nothing more Microsoft can do there to improve security.
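For the curious, here is a rough sketch of what DEP enforces - my own illustration, and on a DEP-capable system you should expect it to be terminated at the marked line rather than run to completion:

#include <windows.h>
#include <string.h>

/* Memory committed as PAGE_READWRITE is data, not code. Before DEP,
   the call below would happily execute the byte we copied in (0xC3 is
   the x86 'ret' instruction); with DEP and a no-execute-capable CPU,
   the OS raises an access violation instead of executing data. */
int main(void)
{
    unsigned char payload[] = { 0xC3 };
    void *page = VirtualAlloc(NULL, 4096, MEM_COMMIT, PAGE_READWRITE);
    if (page == NULL)
        return 1;
    memcpy(page, payload, sizeof(payload));
    ((void (*)(void))page)();  /* DEP kills the program here */
    return 0;
}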

But they did not stop there. Microsoft always has huge deadlines to meet, but a while back they stopped all Windows development and ordered all of their programmers to go over the code they had written, auditing it very carefully for any way a buffer overflow could happen, and to fix those issues. That huge batch of security upgrades was then released as XP Service Pack 2 - not a trivial amount of resources to spend on the problem. They also sent large numbers of their programmers to courses to learn more about security.

The next thing they have done is declare a lot of functions in the Windows API obsolete. The old functions will continue to exist for compatibility reasons, but programmers are strongly encouraged to use the new ones, which offer the same functionality plus checks that stop buffer overflows.

As these are incorporated in various programs, Windows will become a lot more secure.
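As one example of the pattern (a sketch using the checked string routines in strsafe.h, which recent Platform SDKs supply as replacements for the old unchecked C functions):

#include <windows.h>
#include <strsafe.h>
#include <stdio.h>

int main(void)
{
    char dest[8];

    /* Old style: nothing stops the source from overrunning dest.   */
    /* strcpy(dest, "far too long for an eight-byte buffer");       */

    /* Checked replacement: the destination size (in characters) is
       passed explicitly, so an oversized source yields a failure
       code and a terminated, truncated string instead of a buffer
       overflow. */
    HRESULT hr = StringCchCopyA(dest, sizeof(dest), "far too long for an eight-byte buffer");
    if (FAILED(hr))
        printf("copy failed: source did not fit the buffer\n");
    else
        printf("%s\n", dest);
    return 0;
}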

As for security issues other than buffer overflows:

Before, Microsoft software was very trusting, in that it assumed third-party software would most likely not be malicious in nature. Stuff like ActiveX was a way for new programmers with brilliant ideas to extend the Internet with new forms of content that Microsoft hadn't dreamed of yet. There were no hurdles; the user could simply go to the page and enjoy the content.

Now, though, Microsoft realises that was very naive. Stuff like that is now blocked by default; if the user wants to see that content, they have to enable it for each site manually.

Microsoft has tried to bring in other measures in the past, such as requiring that all software be signed with a special digital certificate that can't be forged. If software is signed, it is trusted; if it is not signed, it is blocked; and if a signed program turns out to be malicious, its certificate gets revoked.

But there was a gigantic uproar that Microsoft was trying to control everybody and dictate what programs they could run. Google the keyword "Palladium" to see the controversy.

Future Microsoft products, such as Vista, are designed with security explicitly in mind. Bill Gates has said in corporate e-mails that security at the moment is far more important than new features.

Microsoft does hire the very best talent it can find. Each year they keep an eye on who is graduating from colleges around the world, and they offer the very best of them jobs at Microsoft. Now obviously, they make mistakes, since they haven't offered me a job yet. :)

But that doesn't change the fact that they do hire the best people they can find, everywhere they can find them.

One final point: a long time ago, Bill Gates predicted the time wasn't right for the Internet. He thought that security technologies were not mature enough for it, so he made NT and its security his primary focus. Microsoft missed the Internet boat as a result, with WinSock and Netscape taking dominance over the Internet.

It was at that point that Microsoft decided that the marketplace did not want security as a high priority (the modern security threats that exist today were not around back then), so they plunged straight into the Internet market with a focus on features and catching up to Netscape.


They might not execute your specific strategy, but that does not mean Microsoft doesn't take security very seriously. They have already spent millions of dollars addressing the problem.

Sure, I know that. But the bottom line is that it clearly hasn't been enough, and as I've tried to argue, it's probably more about *how* they're spending the money.

A quick question: have you looked at what exactly the majority of security exploits are? Most security mistakes are buffer overflows.

Yes, I know what a buffer overflow is - I was programming in Z80 assembly language before you were alive, Michael. :) Usually they are caused by overly optimistic (i.e., arguably sloppy) programming in C/C++ that allocates a buffer of fixed length and then fails to check the size of the strings written to it. That you should check the size, *especially* where your code can be handed strings by an external caller, has been commonly accepted best practice for probably 30+ years.

Not all security exploits are buffer overflows, though. Installing junk via ActiveX, or processing VBA macros in arbitrary incoming emails in Outlook, wasn't a buffer overflow problem - it was a *serious architectural mistake* to have introduced such an obviously large attack surface. These mistakes were pointed out to Microsoft years ago, and they effectively ignored them until they became a "big enough" problem.

But they did not stop there. Microsoft always has huge deadlines to meet, but a while back they stopped all Windows development.

What I heard was that it was one month in which pretty much everyone was given a mandate to do a line-by-line code review to find problems - not just Windows, but all development. SQL Server was targeted by a number of serious worms, you'll recall. It was a good thing to do, though inadequate.

Before, Microsoft software was very trusting, in that it assumed third-party software would most likely not be malicious in nature. Stuff like ActiveX was a way for new programmers with brilliant ideas to extend the Internet with new forms of content that Microsoft hadn't dreamed of yet. There were no hurdles; the user could simply go to the page and enjoy the content.

I would note, again, that there were pessimists at the time (unpleasant people like me) who did consider the downside of being able to go to a web page and have it automatically run arbitrary compiled code on your system, and who publicly said so, years before Microsoft did anything about it. It took millions of infected computers to spur action. I would not call that a particularly brilliant approach to problem solving.

Future Microsoft products, such as Vista, are designed with security explicitly in mind. Bill Gates has said in corporate e-mails that security at the moment is far more important than new features.

We'll see. NTFS permissions have been around for years and offer very fine-grained security, and yet they mostly get bypassed wholesale by non-corporate users, because it's impossible to do most Windows program installations without Administrator rights. As you know, that's one reason malicious programs can so easily get installed - and it comes down to a disintegrated, architectural mistake on Microsoft's part.

Microsoft does hire the very best talent it can find. Each year they keep an eye on who is graduating from colleges around the world, and they offer the very best of them jobs at Microsoft. Now obviously, they make mistakes, since they haven't offered me a job yet. :)

Did you acquire your programming smarts in college? Have you ever compared your self-taught knowledge and programming ability with the typical CS graduate's? (I have little doubt that you're better.)

They mostly hire smart people who can pass their method of hiring, which includes a filtering focus on people who like to solve "trick" or logic puzzles quickly (see the book How Would You Move Mount Fuji?). That is not the same as generally hiring the very best talent, though they have hired (or purchased by company acquisition) some really stellar people. For example, Jim Blinn is a giant of computer graphics and works at Microsoft Research. It is not apparent to me how his influence operates within the company, though - I have read in several different places that there's a disappointingly low connection between Microsoft Research and what gets put into commercially released software.

I have seen a programming example in the MSDN library, written in C, that was completely wrong from top to bottom - it wouldn't have compiled without a total rewrite, and obviously nobody had ever tried, or else they would have known it. The author was certainly a Microsoft employee.(*) Take a close look and honestly tell me that it isn't a hair-raising example of incompetence. It even involves a string buffer reference. :D

One final point: a long time ago, Bill Gates predicted the time wasn't right for the Internet. He thought that security technologies were not mature enough for it, so he made NT and its security his primary focus.

Ironically, I now use Firefox as my preferred browser. It is more secure, faster, and has lots of active development in interesting little extensions - which are probably what ActiveX should have been: a lot easier to program for, not automatically installed just by hitting a web page, and mostly downloadable from a single site that has a high probability of weeding out malicious programs.

I certainly will continue to use Microsoft Windows on my main systems, and will continue to use Microsoft development tools for C++/.NET Windows programming. Where possible, I would recommend SQL Server 2005 over Oracle because it's far better. I admire Microsoft's success, but even better *is* possible - which I know not from some "godlike perch" but because (1) they have done some pretty dumb things, in spite of, not because of, their success, things that were identified by others years before Microsoft acted, and (2) other companies, and single individuals, have written software that is arguably technically better, with far fewer resources.

Commercial success is a great thing to be admired, but it does not imply that the products or services are actually the best. IBM's existing market clout made the segmented Intel architecture an instant "standard" when IBM selected the Intel 8088 processor for the original PC. But the Motorola MC68000 (which, to Apple's credit, it used later) was *radically* superior to the hideously ugly, slapped-together, totally-lacking-in-integrity segmented Intel architecture of that time. The 68000 had (has) a flat 32-bit address space (24 bits usable externally), a large number of 32-bit registers that were mostly orthogonal (symmetrically accessible in the instruction opcodes), and other virtues. Programming the 68000 in assembly language (which I did for a while, for an embedded high-powered laser control system) was sometimes like programming in C, it was so powerful. The 8088 had a segmented 20-bit address space - be thankful if you never had to deal with segmented memory (I have, many times) - and a completely non-orthogonal hodge-podge of registers. But that one IBM decision cemented the use of the Intel architecture thereafter, with many bad consequences for years, despite the commercial success of the PC.
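For anyone who never had the pleasure, here is a sketch (in C, for clarity; my own illustration) of how a real-mode 8088 address is formed - and why so many of us hated it:

#include <stdio.h>

/* Real-mode x86: physical address = segment * 16 + offset, yielding a
   20-bit address space in which many different segment:offset pairs
   alias the same byte. The 68000, by contrast, used one flat address. */
unsigned long physical_address(unsigned short segment, unsigned short offset)
{
    return ((unsigned long)segment << 4) + (unsigned long)offset;
}

int main(void)
{
    printf("%05lX\n", physical_address(0x1234, 0x5678)); /* 179B8 */
    printf("%05lX\n", physical_address(0x179B, 0x0008)); /* 179B8 again */
    return 0;
}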

-------------

(*) Since proof is probably required, I decided to give the technical specifics here, from a post I made on another board some time ago:

Today I was looking through the Platform SDK documentation (for Visual Studio .NET 2003) at an example purporting to show how to enumerate the files in a directory. The example's help URL is:

ms-help://MS.VSCC.2003/MS.MSDNQTR.2003FEB.1033/fileio/base/listing_the_files_in_a_directory.htm

found in the hierarchy at: MSDN Library/Windows Development/Windows Base Services/Files and I/O/SDK Documentation/Storage/Storage Overview/Directory Management/Obtaining Directory Information/Listing the Files in a Directory.

The code sample is:

#define _WIN32_WINNT 0x0501
#include "windows.h"

int
main(int argc, char *argv[])
{
   WIN32_FIND_DATA FindFileData;
   HANDLE hFind = NULL;
   LPCTSTR lpDirSpec[MAXPATH]; // directory specification

   wsprintf ("Target directory is %s.\n", argv[1]);
   strncpy (lpDirSpec, argv[1], sizeof(argv[1]));
   strncpy (lpDirSpec, "\*", 3);

   hFind = FindFirstFile(lpDirSpec, &FindFileData);
   if (hFind == INVALID_HANDLE_VALUE) {
      wsprintf ("Invalid file handle. Error is %u\n", GetLastError());
      return (-1);
   } else {
      wsprintf ("First file name is %s\n", FindFileData.cFileName);
      while (FindNextFile(hFind, &FindFileData) != 0) {
         wsprintf ("Next file name is %s\n", FindFileData.cFileName);
      }
      DWORD dwError = GetLastError();
      if (dwError == ERROR_NO_MORE_FILES) {
         FindClose(hFind);
      } else {
         wsprintf ("FindNextFile error. Error is %u\n", dwError);
         return (-1);
      }
   }
   return (0);
}

The sample is *utterly wrong* from top to bottom. It will not remotely compile. Almost literally everything about it is screwed up, from the incorrect MAXPATH define (it should be MAX_PATH) to the 'wsprintf' function lacking a string target, to the 'LPCTSTR lpDirSpec[MAXPATH];' which is obviously meant to define a string buffer but which actually defines an array of LPCTSTRs (LPCTSTR is a pointer to a string).


For anyone: What is the legal status of such attacks? Does the U.S.A., for example, have laws prohibiting such attacks? If so, who enforces such laws and with what penalties? Or is the situation anarchic?

If it is launched not by a private party but by a government, it should be considered an act of war. For example, the recent case of Titan Rain has many suspecting the Chinese military is (surprise!) behind some recent network attacks. While the info here doesn't indicate denial-of-service attacks per se, there's no reason such attacks couldn't be used. Or, if you have some cyber-literate but religious types eager to do long-distance terrorizing, I wouldn't rule it out.

I have seen a programming example in the MSDN library, written in C, that was completely wrong from top to bottom - it wouldn't have compiled without a total rewrite, and obviously nobody had ever tried, or else they would have known it. The author was certainly a Microsoft employee.(*) Take a close look and honestly tell me that it isn't a hair-raising example of incompetence. It even involves a string buffer reference.

Do you think you're being fair when you bring this article up? Do you think this *one* incorrect article you've found out of MSDN really means anything?


Do you think you're being fair when you bring this article up? Do you think this *one* incorrect article you've found out of MSDN really means anything?

I did not do a systematic survey; I found it entirely at random. There are many other known problems with Microsoft software and documentation - that just happens to be one I personally found, and it seemed remarkable enough to note.

Do I think that finding a programming example from Microsoft - supposedly where only top minds are hired - that would not pass muster in the first week of an introductory programming class, and which is totally wrong and could not possibly have compiled, is meaningful? Well, I think so. I think it goes some way toward explaining internal quality control at the company, and why things such as buffer overflow problems have been so rampant (though in the example given, the string buffer itself was not even declared properly). But everyone can judge for themselves. I did let them know about it, and they wrote back saying that it would be fixed, but that is beside the point I am making.

Do I think that finding a programming example from Microsoft - supposedly where only top minds are hired - that would not pass muster in the first week of an introductory programming class, and which is totally wrong and could not possibly have compiled, is meaningful? Well, I think so. I think it goes some way toward explaining internal quality control at the company, and why things such as buffer overflow problems have been so rampant (though in the example given, the string buffer itself was not even declared properly). But everyone can judge for themselves.
Here is the new and improved version of the sample code you posted:

#define _WIN32_WINNT 0x0501

#include <windows.h>
#include <string.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
   WIN32_FIND_DATA FindFileData;
   HANDLE hFind = INVALID_HANDLE_VALUE;
   char DirSpec[MAX_PATH]; // directory specification
   DWORD dwError;

   printf ("Target directory is %s.\n", argv[1]);
   strncpy (DirSpec, argv[1], strlen(argv[1])+1);
   strncat (DirSpec, "\\*", 3);

   hFind = FindFirstFile(DirSpec, &FindFileData);

   if (hFind == INVALID_HANDLE_VALUE)
   {
      printf ("Invalid file handle. Error is %u\n", GetLastError());
      return (-1);
   }
   else
   {
      printf ("First file name is %s\n", FindFileData.cFileName);
      while (FindNextFile(hFind, &FindFileData) != 0)
      {
         printf ("Next file name is %s\n", FindFileData.cFileName);
      }

      dwError = GetLastError();
      FindClose(hFind);
      if (dwError != ERROR_NO_MORE_FILES)
      {
         printf ("FindNextFile error. Error is %u\n", dwError);
         return (-1);
      }
   }
   return (0);
}

-------------

(*) Since proof is probably required, I decided to give the technical specifics here, from a post I made on another board some time ago:

Today I was looking through the Platform SDK documentation (for Visual Studio .NET 2003) at an example purporting to show how to enumerate the files in a directory. The example's help URL is:

I think it would be a mistake to judge Microsoft's programmer hiring practices off of one piece of documentation in the help file. I suspect this is the work of a technical writer with just enough programming experience to get themselves in trouble. It is definitely an example of poor quality control in the documentation, though.

Back to browsers: the latest IE7 is an example of how Microsoft is trying to harden the browser. So far it has to be the most irritating program I have ever used. It continually forces you to manually enable ActiveX controls on a site even after you put it in the Trusted Sites list, and you find yourself continually having to re-enable ActiveX for the same site. I work at a company with 153,000 employees and *many* internal applications that rely on ActiveX. Which goes to the question of why Microsoft hasn't just hired top talent to fix its security issues: the fix would break millions of dollars' worth of applications at some of their best customers.

Like it or not, it is going to take time to fix their problems. I love Microsoft's server products, but I moved to OS X on the desktop for my personal systems (with the result that my Objectivism Research Guide sits unused in its caddy *hint* *hint*). I'm agnostic about Vista, as I have not had a chance to work with the betas yet, but if it handles notifications the same way IE7 does, I'll tire of it right quick.


Here is the new and improved version of the sample code you posted:

At least it appears to basically work. :D

I would note that this:

strncpy (DirSpec, argv[1], strlen(argv[1])+1);

is peculiar. It is functionally equivalent to a simple:

strcpy(DirSpec, argv[1]);

Also, re: the earlier point about string buffer overflows, note that there is no check to ensure that argv[1] plus the concatenated '\*' fits within the allocated buffer, DirSpec, of MAX_PATH characters. argv[1] is supplied externally, so there is no guarantee that the incoming string will not exceed the buffer size; a buffer overflow results if the length of argv[1] exceeds sizeof(DirSpec)-3 ('\*' plus the 0x0 terminator = 3 bytes extra).

If we replaced it with this:

strncpy (DirSpec, argv[1], sizeof(DirSpec)-3);
DirSpec[sizeof(DirSpec)-3] = 0x0;

then there would effectively be buffer overflow limiting, which is the only logical reason to have a strncpy there. (The last statement is required to properly set a string terminator in case argv[1] is actually too big.)

It is likely that strcpy by itself has become anathema in the "code review", so they've replaced it with strncpy, which limits the number of characters copied (but does not otherwise check for overflows). Ironically, the limit they supply is precisely the incoming string length, rather than the proper limit, which is the destination buffer size minus 3, as given above. And even truncation is not necessarily the correct behavior: an incoming string that is too large should halt processing immediately, not just be truncated and used, because an oversized string indicates that the caller is acting improperly.

It also doesn't check that the number of supplied parameters is correct (e.g. argc==2), so it blows up if you execute it by itself.

Conclusion: the new sample at least compiles and runs, but it does no error checking on arguments and the probably intended buffer-overflow check is completely wrong. So things have certainly improved. :D:):)
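For the record, here is a sketch with those points folded in - the argument-count check, and outright rejection (rather than silent truncation) of an oversized directory name. This is my own illustration, assuming a plain ANSI (non-UNICODE) build, not Microsoft's revised documentation:

#define _WIN32_WINNT 0x0501

#include <windows.h>
#include <string.h>
#include <stdio.h>

int main(int argc, char *argv[])
{
   WIN32_FIND_DATA FindFileData;
   HANDLE hFind;
   char DirSpec[MAX_PATH];   // directory specification
   DWORD dwError;

   /* Validate the argument count before touching argv[1]. */
   if (argc != 2) {
      printf("Usage: %s <directory>\n", argv[0]);
      return (-1);
   }

   /* Halt on oversized input: "\*" plus the terminator need 3 bytes. */
   if (strlen(argv[1]) > sizeof(DirSpec) - 3) {
      printf("Directory name is too long.\n");
      return (-1);
   }

   strcpy(DirSpec, argv[1]);   /* safe: length was checked above */
   strcat(DirSpec, "\\*");

   hFind = FindFirstFile(DirSpec, &FindFileData);
   if (hFind == INVALID_HANDLE_VALUE) {
      printf("Invalid file handle. Error is %u\n", GetLastError());
      return (-1);
   }

   printf("First file name is %s\n", FindFileData.cFileName);
   while (FindNextFile(hFind, &FindFileData) != 0)
      printf("Next file name is %s\n", FindFileData.cFileName);

   dwError = GetLastError();
   FindClose(hFind);
   if (dwError != ERROR_NO_MORE_FILES) {
      printf("FindNextFile error. Error is %u\n", dwError);
      return (-1);
   }
   return (0);
}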

On a tangential note, it's these kinds of problems that led to the development of Java and C#, which have higher-level string manipulation operators (as does C++ with appropriate class libraries, though you can still get in trouble in a hurry). In Java/C#/etc., it's much more difficult, if not practically impossible, to have buffer overflow errors of the kind you get with C; the tradeoff is execution speed. Note, however, that there is no substitute for thinking about what you're doing.

...

Usually they [i.e., buffer overflows] are caused by overly optimistic (i.e., arguably sloppy) programming in C/C++ that allocates a buffer of fixed length and then fails to check the size of the strings written to it. That you should check the size, *especially* where your code can be handed strings by an external caller, has been commonly accepted best practice for probably 30+ years.

I emphatically agree, and go farther: having buffer overflows in code that's supposed to be robust (like an operating system) isn't just arguably sloppy programming, it is sloppy programming. Shipping code like this and then, when problems are found, going back and trying to fix all the bugs, is just asking for trouble.

...

I would note, again, that there were pessimists at the time (unpleasant people like me) who did consider the downside of being able to go to a web page and have it automatically run arbitrary compiled code on your system, and who publicly said so, years before Microsoft did anything about it. It took millions of infected computers to spur action. I would not call that a particularly brilliant approach to problem solving.

...

I've always thought it dubious that web pages are apparently allowed to run arbitrarily powerful code on the browsing system, and that there's the same problem with Outlook email. After all, the purpose of email and web browsing is just communication; that should require giving only a limited capability to the code on the web page or in the email message.

Commercial success is a great thing to be admired, but it does not imply that the products or services are actually the best. IBM's existing market clout made the segmented Intel architecture an instant "standard" when IBM selected the Intel 8088 processor for the original PC. But the Motorola MC68000 (which, to Apple's credit, it used later) was *radically* superior to the hideously ugly, slapped-together, totally-lacking-in-integrity segmented Intel architecture of that time. The 68000 had (has) a flat 32-bit address space (24 bits usable externally), a large number of 32-bit registers that were mostly orthogonal (symmetrically accessible in the instruction opcodes), and other virtues. Programming the 68000 in assembly language (which I did for a while, for an embedded high-powered laser control system) was sometimes like programming in C, it was so powerful. The 8088 had a segmented 20-bit address space - be thankful if you never had to deal with segmented memory (I have, many times) - and a completely non-orthogonal hodge-podge of registers. But that one IBM decision cemented the use of the Intel architecture thereafter, with many bad consequences for years, despite the commercial success of the PC.

As somebody who's had to compile code for Intel x86 machines, I'll certainly agree that, compared to the Motorola 68000 series, they are more difficult to program. Register tracking and allocation with that non-orthogonal register set is quite a challenge. (But a bigger challenge is using the x86 floating-point registers in compiled code!) So why did IBM choose the 8088? My understanding is that, first, they thought being able to run old 8080 code was important, and second, the 68000 wasn't quite ready at the time - something like not saving context properly when an interrupt occurred. So IBM had valid reasons for choosing the 8088.


I don't think that one piece of bad code out of thousands and thousands of articles can be used to say anything negative about MSDN, let alone Microsoft as a whole. Just think of all the great things that Microsoft has built over the years - yet when the quality of Microsoft's products comes up, we're talking about a help file that says "MAXPATH" instead of "MAX_PATH."

Microsoft deserves criticism for its many mistakes, but if you added up everything its mistakes have cost its users, I bet you'd find that it's a small percentage of everything its users have gained. In that context, it's really hard for me to hold any negative feelings toward them or look at them as a problem.


I believe that the debate about code is a side issue that is bogging down this thread. The principle of the matter is nowhere apparent in this thread, and it is on principle that I agree with Stephen and disagree with Phillip.

The overarching principle here is that of virtue. If one acts virtuously to the best of one's ability, I see that person as a good person. I can use myself here. I have always been a good person. Even before I discovered the proper way to reason (and I'm still discovering it now), I still reasoned as well as I could. I used my own perception and thinking as my guide to knowledge. Sure, I'll admit that I sometimes thought incorrectly, and that I probably still have lots of wrong ideas from not thinking correctly, or not thinking at all (accepting others' conclusions). But the point is that I'm trying. I'm trying the best I can with what I've got, and that's all one can ask of another.

To say "here is what reason is, as fully laid out in theory; that is how somebody should reason," and then to say that people who, through no fault of their own, don't reason in exactly that way are bad people, is nothing more than Platonism. You've constructed a static (i.e., Platonic) ideal and are asking them to become it. But an ideal, or perfection, when applied to man, is not static. Perfection is a virtue - an action repeatedly done; a habit. I like to look at it this way: there is no end-game in life. There is no point at which one may say, "Well, I've reached perfection, I can quit now," and then stop. One must continually act in a certain way to live.

Applying this to the current issue, I believe that Phillip has done nothing more than set up a Platonic ideal. Based on what he would have done if he were Bill Gates or the others who manage Microsoft, he has created the ideal. Then, when Microsoft decides not to do what he prescribes, they've "dropped the ball" and are a bad company.

I don't see how anybody can do anything of the sort. Microsoft is a highly successful company that continues to make nice profits for its shareholders; obviously, they're doing something correctly. Besides, what do you know of project costs, deadlines, etc., all of which might have contributed to the decision not to address the security concerns until later? Furthermore, even though they addressed the concerns later, they still addressed them. The point is that there is no one way to run Microsoft "correctly" other than profitably, just as there is no one way for a man to live his life except virtuously. (Side note: I'm not advocating relativism.)

So, I believe that Microsoft is a good company. Sure, one might have done it differently - go ahead and disagree with them, and say that you think another way would also have been good. But that doesn't negate the fact that, though different, their way was still good. Microsoft has done good things - different things than someone else might have done, but still good, profitable things. I believe that Microsoft is a good company.


I just noticed that I posted on the Microsoft issue and not on the thread's topic. Well, my comments still stand, but I would like to point out that others have demonstrated my point as well, maybe not so explicitly.
