  Security, security! But do you test it?
by Andrey Karpov on 11/07/12 03:11:00 am

There is no fragment of program code where you cannot make a mistake - you can make them even in very simple fragments. While programmers have developed the habit of testing algorithms, data exchange mechanisms and interfaces, security testing fares much worse: it is often done on a leftover basis. A programmer thinks: "I'll just write a couple of lines now, and everything will be OK. I don't even need to test it - the code is too simple to make a mistake in!" That's wrong. Since you're working on security and writing code for that purpose, test it just as carefully!

When and where is security important? In many applications. But let's not discuss it in the abstract. Take, for instance, the source code of the Tor application. This is a system intended to enable online anonymity: the Tor client software directs internet traffic through a worldwide volunteer network of servers to conceal a user's location or usage from anyone conducting network surveillance or traffic analysis. To learn more about what it is and where it is used, see the Wikipedia article.

Everyone will agree that programmers should pay maximum attention to data security in such an application. And even more than that: let's put it this way, you should develop such an application in a state of paranoia and persecution mania.

Indeed, much is done in the TOR program to conceal and protect information. But when I study the code, I start to feel sad: many of the protection mechanisms simply sit idle because of trivial slip-ups and misprints.

One of the protection mechanisms is intended to clear buffers that are no longer used. These buffers may contain passwords, IP addresses, and other user data. If you don't destroy this data, it may end up being sent to the Internet as part of uninitialized "trash". That's not a fairy tale - it's a real-life situation. To find out exactly how it may happen, see the article "Overwriting memory - why?".

The TOR developers know of this danger and try to clear buffer contents using the memset() function. This is an Epic Fail: the compiler has the right to remove calls to memset() if the buffer being cleared is not used anywhere afterwards.

Consider a code fragment taken from TOR:

int
crypto_pk_private_sign_digest(....)
{
  char digest[DIGEST_LEN];
  ....
  memset(digest, 0, sizeof(digest));
  return r;
}

Now let's see how it works. The 'digest' buffer is created on the stack and used somewhere later; it doesn't matter exactly how, the point is that we want to clear it afterwards. The programmer has written a memset() call for this purpose. However, the 'digest' buffer is not used in any way in the function after that call, so the compiler will notice this during optimization and remove the call. That doesn't change the program's logic, but it does make the program dangerous from the viewpoint of data privacy.
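
For illustration, here is a minimal, self-contained sketch of the same pattern (it is not taken from TOR; the function name and the secret are made up). Built with optimization enabled, for example "/O2" in Visual C++ or "-O2" in GCC and Clang, the final memset() call may be removed as a dead store:

#include <stdio.h>
#include <string.h>

/* Hypothetical example: the secret only matters inside this function. */
static void handle_secret(const char *input)
{
  char password[64];

  strncpy(password, input, sizeof(password) - 1);
  password[sizeof(password) - 1] = '\0';

  printf("the secret is %zu characters long\n", strlen(password));

  /* 'password' is never read after this line, so an optimizing
     compiler is allowed to drop this call entirely. */
  memset(password, 0, sizeof(password));
}

int main(void)
{
  handle_secret("correct horse battery staple");
  return 0;
}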

Those interested in the details can look here: the assembler listing shows how the memset() call disappears. Visual C++ 2010 was used as the compiler, with the "/O2" switch.

To clear memory reliably, use functions such as RtlSecureZeroMemory(). These functions are created specifically for such cases and cannot be removed by the compiler.
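
As a sketch of what that looks like on Windows (assuming <windows.h> is available; on other platforms an equivalent such as explicit_bzero() or C11's optional memset_s() plays the same role), a buffer holding private data could be wiped like this; the helper function and buffer are hypothetical:

#include <windows.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical helper: use a secret, then wipe it reliably. */
static void use_and_wipe_secret(void)
{
  char digest[20];

  memset(digest, 0xAB, sizeof(digest));   /* stand-in for real use */
  printf("first byte: %d\n", digest[0]);

  /* SecureZeroMemory()/RtlSecureZeroMemory() is documented as never
     being optimized away, unlike a plain memset() of a dead buffer. */
  SecureZeroMemory(digest, sizeof(digest));
}

int main(void)
{
  use_and_wipe_secret();
  return 0;
}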

You may say that I'm making a mountain out of a molehill and that no important data will actually leak. Maybe. But can you be sure? Since the developers implemented the buffer-clearing mechanism, they must have been concerned about something, and they did it not in one or two places - there are many such fragments in the code. It's a pity their efforts were wasted in most cases. So as not to sound unfounded, here is a list of the fragments that contain errors.

This is the list of files and lines where the PVS-Studio analyzer has generated the warning "V597 The compiler could delete the 'memset' function call, which is used to flush '...' buffer. The RtlSecureZeroMemory() function should be used to erase the private data":

  • crypto.c 1015
  • crypto.c 1605
  • crypto.c 2233
  • crypto.c 2323
  • tortls.c 2453
  • connection_or.c 1798
  • connection_or.c 2128
  • onion.c 295
  • onion.c 384
  • onion.c 429
  • rendclient.c 320
  • rendclient.c 321
  • rendclient.c 699
  • rendclient.c 942
  • rendclient.c 1284
  • rendclient.c 1285
  • rendservice.c 705
  • rendservice.c 900
  • rendservice.c 903
  • rendservice.c 904
  • rendservice.c 905
  • rendservice.c 906
  • rendservice.c 1409
  • rendservice.c 1410
  • rendservice.c 1411
  • rendservice.c 1412
  • rendservice.c 1413
  • rendservice.c 1414
  • rendservice.c 1415
  • rendservice.c 2078
  • rendservice.c 2079
  • rendservice.c 2080
  • rendservice.c 2516
  • rendservice.c 2517
  • rendservice.c 2518
  • rendservice.c 2668
  • rendservice.c 2669
  • rendservice.c 2670
  • tor-gencert.c 108

I've cited such a long list deliberately: I want you to feel how deep the problem of untested security-related code runs. How on earth can one go wrong with memset()? Quite easily, as it turns out.

This is not only TOR's problem; it is common to many applications and libraries. We don't need to go far for an example: what libraries does TOR use? It uses OpenSSL, for instance - an open-source cryptographic package for SSL/TLS handling. Let's see how the OpenSSL developers clear memory.

The OpenSSL developers know that memset() cannot be used to clear memory buffers. That's why they have created their own function. Here it is:

unsigned char cleanse_ctr = 0;
void OPENSSL_cleanse(void *ptr, size_t len)
{
  unsigned char *p = ptr;
  size_t loop = len, ctr = cleanse_ctr;
  while(loop--)
  {
    *(p++) = (unsigned char)ctr;
    ctr += (17 + ((size_t)p & 0xF));
  }
  p=memchr(ptr, (unsigned char)ctr, len);
  if(p)
    ctr += (63 + (size_t)p);
  cleanse_ctr = (unsigned char)ctr;
}

Perfectly paranoid code, and it works: it really does clear the memory, and it fills it not just with zeroes but with varying values. Note that the trailing memchr() call reads the buffer back and feeds the result into the global cleanse_ctr variable, so the compiler cannot treat the writes as dead stores and remove them.

But there are errors in the code that make this function useless: the private data will remain there. Have a look at this code:

void usage(void)
{
  static unsigned char *buf=NULL,*obuf=NULL;
  ....
  OPENSSL_cleanse(buf,sizeof(buf));
  OPENSSL_cleanse(obuf,sizeof(obuf));
  ....  
}

So much effort spent on writing the OPENSSL_cleanse() function - all in vain.

Look closely. Do you see anything wrong?

The expressions sizeof(buf) and sizeof(obuf) calculate the size of the pointer, not the size of the buffer. As a result, only the first 4 bytes will be cleared in a 32-bit program (8 bytes in a 64-bit one), while the rest of the private data will remain untouched.
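
A minimal sketch of the pitfall (the buffer names and sizes here are made up, not taken from OpenSSL): applying sizeof to a pointer yields the size of the pointer itself, so for heap buffers the real length has to be tracked and passed explicitly:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
  unsigned char stack_buf[1024];                /* a real array          */
  unsigned char *heap_buf = malloc(1024);       /* a pointer to a buffer */

  memset(stack_buf, 0, sizeof(stack_buf));      /* correct: array size   */

  printf("sizeof(stack_buf) = %zu\n", sizeof(stack_buf));  /* 1024       */
  printf("sizeof(heap_buf)  = %zu\n", sizeof(heap_buf));   /* 4 or 8     */

  /* For a pointer, the correct call would be OPENSSL_cleanse(heap_buf, 1024),
     not OPENSSL_cleanse(heap_buf, sizeof(heap_buf)). */

  free(heap_buf);
  return 0;
}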

There are other errors of this type to be found in OpenSSL (see V597):

  • ec_mult.c 173
  • ec_mult.c 176

Conclusions:

  1. If data security is an important part of your software product, you must create corresponding tests to check it. For instance, when writing unit tests for a function, you also need to make sure that no important data are left behind on the stack. To do this, after calling the function under test, call another function with a large local array like "char buf[10000]" and search that array for words that could have been left on the stack (a rough sketch of this idea follows after this list).
  2. Don't test only the DEBUG version; pay attention to the RELEASE version too. The memset() error discussed in this article won't reveal itself in a DEBUG build.
  3. Use static code analyzers. They can tell you many interesting details about errors and unsafe code fragments.
  4. Applications that are critical from the security viewpoint should be open source. I've just come across the open-source TOR project and found these errors there; this information can be used to make the product better. Can I check proprietary code the same way? No. It means that such errors might remain unnoticed by the developers for many years.
  5. No matter how experienced and skilled a programmer is, he or she is never safe from simple and silly mistakes. Remember that the phrase "professional programmers never make simple mistakes and misprints" is a myth. It's not true. You'd better be self-critical: the mere awareness that you may make a mistake will help you avoid many of them. When you know it, you won't feel too lazy to write one more test, run a code analyzer, or simply reread the code you've written.
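
As a rough sketch of the stack-scanning idea from point 1: the function under test, the secret string, and the 10000-byte buffer below are all hypothetical, and deliberately reading an uninitialized local array is formally undefined behavior in C, so treat this as an illustration of the approach rather than production test code.

#include <stdio.h>
#include <string.h>

/* Hypothetical function under test: it handles a secret and is supposed
   to leave no trace of it on the stack when it returns. */
static void function_under_test(void)
{
  char secret[32];
  strcpy(secret, "TOP-SECRET-TOKEN");
  /* ... use the secret ... */
  memset(secret, 0, sizeof(secret));   /* may be optimized away! */
}

/* After the call, reuse roughly the same stack region and scan it for
   leftovers of the secret. */
static int stack_contains(const char *needle, size_t needle_len)
{
  char buf[10000];
  for (size_t i = 0; i + needle_len <= sizeof(buf); i++)
    if (memcmp(buf + i, needle, needle_len) == 0)
      return 1;
  return 0;
}

int main(void)
{
  function_under_test();
  if (stack_contains("TOP-SECRET-TOKEN", 16))
    printf("FAIL: the secret is still on the stack\n");
  else
    printf("OK: no trace of the secret found\n");
  return 0;
}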
 
 
Comments

Matt Boudreaux
Just wanted to say I enjoy these articles, I always learn something new.

