The world has ended! OpenSSL has this terrible bug which allows an attacker to receive a pot-luck of 64k portions of the memory address space of the SSL server. Things that are up for grabs include usernames, passwords and SSL private keys; a veritable grab-bag of secrets, all obtainable remotely from the server.
You should really change your password.
Actually, don’t bother; it’s not like it actually matters in the long run. You’re probably going to change your password from bunny3 anyway, and it’s not like this bug was the most likely point of escape for your password in the first place. It’s probably that toolbar that installed itself when you installed that video player that was needed to watch that movie; you know the one? The one with the kittens? The one that you never got to see because you got distracted.
It’s been a really bad few months for security, and while people have been pointing fingers in various directions and deploying a Nelson-style ‘Ha Ha’, the only real lesson to be found here is one the Bible puts rather well:
Do not gloat when your enemy falls; when they stumble, do not let your heart rejoice, or the Lord will see and disapprove and turn his wrath away from them.
… and then you’re in trouble.
As for me, I’m going to wait a few weeks and then change the passwords that are in any way connected to money. It would be nice if I could get some form of toolbar that would check sites against a database as I go to them, reveal whether they were subject to the bug, and prompt me to change my password once they’ve been declared clear. Someone get on that; it sounds like a fun project.
So I found this little security clanger in the manual page for dlopen on Mac OS X, where it states:
When path does not contain a slash character (i.e. it is just a leaf name), dlopen() searches the following until it finds a compatible Mach-O file: $LD_LIBRARY_PATH, $DYLD_LIBRARY_PATH, current working directory, $DYLD_FALLBACK_LIBRARY_PATH.
Yes, the current working directory, one of the classic vulnerability injection mechanisms. This is as epic a security clanger as Microsoft Windows’s LoadLibrary call but, apparently, nobody cares! Linux and Solaris have a far more sensible mechanism: the documentation actually enumerates the places the loader looks for the library, and unless you really, horrendously eff it up, it won’t look in the current working directory.
I nearly did a spit-take when I saw this explicitly called out. In this day and age, it’s an embarrassment.
I’ve seen it again and again… a developer wants to access some restricted data over the internet in a client application, but is unwilling to use a per-user login to access the data. As a result, they embed a password into the application. Then they discover that the password is readable via some mechanism once the application is on the client. Developer scratches their head and tries to figure out how to secure the password. Developer gets frustrated as people say ‘that doesn’t work’.
Fundamentally, you are trying to hide a secret in a client application. There is a long history of trying to do this in applications. It forms the basis for pretty much all forms of application protection – and it is fundamentally impossible. If everything needed to run the application is present on a system, then determining the secret just requires a certain amount of effort. The amount of effort varies, but in general it is a continual fight between the developer and the person trying to determine the secret.
Mechanisms have been developed to try and make the secret ‘something you have’. One of the earlier disk-based methods was to have ‘malformed’ sectors on a floppy drive that needed to be read. These sectors were only ‘malformed’ in that they were laid on the disk in a method that made them difficult to read normally. The sectors that were read became part of the code that was used to execute the application.
The fix to this form of protection was to read the protected content from an original disk, put that data into a new copy of the program in place of the invalid content, and then skip or remove the code that performed the read of the drive data into that location.
An extension to this protection was to actually encrypt the code that was loaded from disk and then decrypt it at execution time – the encryption varied from simple byte-level xor to fancier xor-with-rotate schemes. Typically this decryption code butted up against the decrypted code (sometimes even overlaying it), preventing you from setting a breakpoint at the first instruction following the decryption code. Solving this problem involved manually decrypting a few bytes (which at the time was a pen-and-paper operation), and then starting the decryption from the subsequent instructions. Sometimes easy, sometimes more difficult. This would typically be used in conjunction with the ‘special’ media to give a dual layer of protection.
Another mechanism was the hardware dongle. An oft-loved feature of expensive software, it typically embedded some data on the dongle that was necessary for the use of the application. Without the dongle, the application was useless. Some even went so far as to corrupt the data created by the application if the dongle was not present – e.g. straight lines would no longer be quite straight following a save-load cycle, so the files deteriorated with each pass (I think AutoCAD used this method).
The issue with hardware-based mechanisms is that they have a high per-unit cost. A quick search revealed a per-unit cost of €25 for low order quantities, which would need to be added to the cost of the application. In other words, this can quite often be an inappropriate choice for software which has a low price goal.
For any of these mechanisms, if someone obtained only one part of the solution (the application without the special disk/dongle), then a well-written protection would mean that the application was unusable without the second part. Poorly written protections would perform a simple test against the special item, and not actually make use of any of the underlying data from it. In general, once you had all the items needed to run the application, all that mattered after that was skill and time.
Special media, encryption, dongles, packers, obfuscation and anti-debugging tricks are among the many tools that have been used to secure applications.
What has this got to do with the opening paragraph? Well quite a bit, actually. The developer needs to store some kind of secret in the application. This secret can be anything, but in general it is some form of key to gain access to some form of resource. Nowadays, the application is not going to be shipped with any physical media – after all, this is the 21st century, and the use of physical media is archaic. This tends to rule out special media and dongles from the mix.
This leaves encryption, packers, obfuscation and other anti-debugging tricks. There are some very good solutions out there for packing, encryption and anti-debugging. A quick Google for ‘executable packer anti-debugging’ yielded some interesting results. It’s a fun and interesting area, where the developer is trying to outwit the villainous cracker. Some of the solutions are commercial – adding to the cost of application development, and reducing the ability to debug the application when deployed in the field. These are trade-offs the developer has to weigh when deciding how to protect their application.
You have to do the math on it. If the cost of developing and implementing the protection exceeds the overall value that you have placed on the application then you simply cannot afford to spend time, effort and money on a solution that will cost you more than you will ever make.
The big take-away from this is that if you have a secret that you want to keep safe, don’t put it in the application – all that will accomplish is to keep it temporarily out of view. The truly wily hacker will be able to get it, and if the secret is a password to one of your important on-line accounts, then you should think of some other way to mediate access to the resource.