Useful OpenSSL Tricks

By Van Emery

——————————————————————————–
Introduction
OpenSSL deserves a lot of credit. It is an extremely useful and valuable Open Source project. When people talk about how successful Apache is, rock-solid crypto toolkits like OpenSSL and OpenSSH should also be mentioned. Here are a few (of the many) functions that I have found useful, along with examples of how to use them:

• Base64 Encoding and Decoding
• Symmetric Encryption and Decryption of Files
• Cryptographic Hashing of Files
• S_CLIENT SSL/TLS Test Utility
These examples assume that you are using a Unix-like OS, with OpenSSL 0.9.6b or higher.

——————————————————————————–

Base64 Encode/Decode
Base64 encoding is a standard method for converting 8-bit binary data into a limited subset of ASCII characters for safe transport through e-mail and other systems that are not 8-bit safe. With OpenSSL, it is very easy to encode and decode Base64 data:

$ openssl enc -base64 -in myfile -out myfile.b64

$ openssl enc -d -base64 -in myfile.b64 -out myfile.decrypt
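
The enc command also reads standard input when no -in file is given, which makes for a quick check of the round trip:

$ echo "The quick brown fox" | openssl enc -base64
VGhlIHF1aWNrIGJyb3duIGZveAo=

$ echo "VGhlIHF1aWNrIGJyb3duIGZveAo=" | openssl enc -d -base64
The quick brown fox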

Symmetric Encryption/Decryption of Files
As you can imagine, being able to encrypt and decrypt files with strong ciphers is a useful function. With OpenSSL, you can even use the commands in shell scripts (a non-interactive example follows the commands below). Here are some command-line examples using the Blowfish, Triple DES, and CAST5 ciphers:

$ openssl enc -e -a -salt -bf -in tomcat.jpg -out tomcat.blowfish
enter bf-cbc encryption password:
Verifying password - enter bf-cbc encryption password:

$ openssl enc -d -a -bf -in tomcat.blowfish -out tomcat-decrypt.jpg
enter bf-cbc decryption password:

$ openssl enc -e -a -salt -des3 -in tomcat.jpg -out tomcat.des3
enter des-ede3-cbc encryption password:
Verifying password - enter des-ede3-cbc encryption password:

$ openssl enc -d -a -des3 -in tomcat.des3 -out tomcat-des3.jpg
enter des-ede3-cbc decryption password:

$ openssl enc -e -a -salt -cast5-cbc -in tomcat.jpg -out tomcat.cast5
enter cast5-cbc encryption password:
Verifying password - enter cast5-cbc encryption password:

$ openssl enc -d -a -cast5-cbc -in tomcat.cast5 -out tomcat-cast5.jpg
enter cast5-cbc decryption password:
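
For scripting, enc can take the passphrase from a source given with the -pass argument instead of prompting. A minimal sketch, assuming the passphrase is kept in a MYPASS environment variable:

$ export MYPASS="my-secret-passphrase"
$ openssl enc -e -a -salt -bf -pass env:MYPASS -in tomcat.jpg -out tomcat.blowfish
$ openssl enc -d -a -bf -pass env:MYPASS -in tomcat.blowfish -out tomcat-decrypt.jpg

Avoid the pass: form in scripts, since the passphrase would then be visible to other users in the process list; the env: and file: sources avoid that.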

If the file will not be transported as an e-mail attachment, you can forgo the -a argument, which Base64 encodes the ciphertext when encrypting and decodes it when decrypting. This is sometimes referred to as “ASCII armor”. The non-Base64-encoded files will be smaller, since Base64 expands data by roughly one third. Here is an example using the CAST5-CBC algorithm:

$ openssl enc -e -salt -cast5-cbc -in tomcat.jpg -out tomcat.nob64
enter cast5-cbc encryption password:
Verifying password - enter cast5-cbc encryption password:

$ openssl enc -d -cast5-cbc -in tomcat.nob64 -out tomcat-nob64.jpg
enter cast5-cbc decryption password:
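
You can confirm the size difference by listing both ciphertexts; the Base64-encoded file from the earlier example should be about a third larger:

$ ls -l tomcat.cast5 tomcat.nob64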

Cryptographic Hashing Functions
What if you want to verify that a file has not been tampered with? One simple way is to use a cryptographic hash function, which produces a fixed-length string (called a message digest) from an input file of any length. SHA-1 and RIPEMD-160 are considered current; MD5 is considered outdated.

$ openssl dgst -sha1 -c tomcat.jpg
SHA1(tomcat.jpg)= 92:b1:9b:96:ef:45:c3:89:b4:2e:e6:96:5b:43:bf:02:66:4a:47:8f

$ openssl dgst -ripemd160 -c tomcat.jpg
RIPEMD160(tomcat.jpg)= 68:f2:05:a9:9d:52:f1:cc:04:ed:d7:1e:42:80:0a:b8:c0:e6:cc:6d

$ openssl dgst -md5 -c tomcat.jpg
MD5(tomcat.jpg)= e7:13:d6:a7:cc:16:e3:da:0a:f7:ab:5a:fa:e3:3b:34

You can see that the md5sum utility shipped with most GNU/Linux distributions returns the same value as the OpenSSL MD5 digest:

$ md5sum tomcat.jpg
e713d6a7cc16e3da0af7ab5afae33b34 tomcat.jpg

The OpenSSL dgst (message digest/hashing) command also has numerous options for signing digests, verifying signatures, etc.
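
For example, with an RSA key pair (privkey.pem and pubkey.pem are hypothetical file names), you can sign a file's digest and later verify the signature:

$ openssl dgst -sha1 -sign privkey.pem -out tomcat.sig tomcat.jpg
$ openssl dgst -sha1 -verify pubkey.pem -signature tomcat.sig tomcat.jpg
Verified OK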

S_CLIENT SSL/TLS Test Utility
OpenSSL has a great test utility called s_client, which lets you exercise SSL/TLS servers directly from the command line. The following is an example of using s_client to view information about a secure web server (note that -connect takes a host:port pair, not a URL):

$ openssl s_client -connect www.redhat.com:443

CONNECTED(00000003)
depth=0 /C=US/ST=North Carolina/L=Durham/O=Red Hat, Inc./OU=Web Operations/CN=www.redhat.com
verify error:num=20:unable to get local issuer certificate
verify return:1
depth=0 /C=US/ST=North Carolina/L=Durham/O=Red Hat, Inc./OU=Web Operations/CN=www.redhat.com
verify error:num=27:certificate not trusted
verify return:1
depth=0 /C=US/ST=North Carolina/L=Durham/O=Red Hat, Inc./OU=Web Operations/CN=www.redhat.com
verify error:num=21:unable to verify the first certificate
verify return:1

Certificate chain
0 s:/C=US/ST=North Carolina/L=Durham/O=Red Hat, Inc./OU=Web Operations/CN=www.redhat.com
i:/C=US/O=RSA Data Security, Inc./OU=Secure Server Certification Authority

Server certificate
-----BEGIN CERTIFICATE-----
MIID3TCCA0qgAwIBAgIQC4A9mzg//B7clolOw0V4WzANBgkqhkiG9w0BAQQFADBf
MQswCQYDVQQGEwJVUzEgMB4GA1UEChMXUlNBIERhdGEgU2VjdXJpdHksIEluYy4x
LjAsBgNVBAsTJVNlY3VyZSBTZXJ2ZXIgQ2VydGlmaWNhdGlvbiBBdXRob3JpdHkw
HhcNMDExMTE0MDAwMDAwWhcNMDMxMjA1MjM1OTU5WjCBgTELMAkGA1UEBhMCVVMx
FzAVBgNVBAgTDk5vcnRoIENhcm9saW5hMQ8wDQYDVQQHFAZEdXJoYW0xFjAUBgNV
BAoUDVJlZCBIYXQsIEluYy4xFzAVBgNVBAsUDldlYiBPcGVyYXRpb25zMRcwFQYD
VQQDFA53d3cucmVkaGF0LmNvbTCBnzANBgkqhkiG9w0BAQEFAAOBjQAwgYkCgYEA
4MFi5Xg1rYKETCZ4inSeLJwK4/g/WcOI8JUpH7aK/Hm/e8Lz0uwagzEg/EQnACGl
o6HZsAwlNwV/H4LDXhf4I7NIfgLHmrp6qY1e3SX5qfAAPbxFl4ghiGzNdlTR2Pkn
XQhj/0eW8Pt7NdmQ6LDaMHxb2WchBQYVTYC/cK2zU+8CAwEAAaOCAXkwggF1MAkG
A1UdEwQCMAAwCwYDVR0PBAQDAgWgMDwGA1UdHwQ1MDMwMaAvoC2GK2h0dHA6Ly9j
cmwudmVyaXNpZ24uY29tL1JTQVNlY3VyZVNlcnZlci5jcmwwgawGA1UdIASBpDCB
oTCBngYLYIZIAYb4RQEHAQEwgY4wKAYIKwYBBQUHAgEWHGh0dHBzOi8vd3d3LnZl
cmlzaWduLmNvbS9DUFMwYgYIKwYBBQUHAgIwVjAVFg5WZXJpU2lnbiwgSW5jLjAD
AgEBGj1WZXJpU2lnbidzIENQUyBpbmNvcnAuIGJ5IHJlZmVyZW5jZSBsaWFiLiBs
dGQuIChjKTk3IFZlcmlTaWduMB0GA1UdJQQWMBQGCCsGAQUFBwMBBggrBgEFBQcD
AjAZBgpghkgBhvhFAQYPBAsWCTg3ODA1MTU1NjA0BggrBgEFBQcBAQQoMCYwJAYI
KwYBBQUHMAGGGGh0dHA6Ly9vY3NwLnZlcmlzaWduLmNvbTANBgkqhkiG9w0BAQQF
AAN+AEBUhe0gnMw8OWcnKA5XnoglC3V9v//UIZh7lVJCaMA/K2tFAiRlmkGPsim7
H8rHpZhtTOUBqZl6PuA/VJD2wCECJ+uUYx0zUh1dKwoJKWgcaBQOQ6GsCgxsOB2a
i6wMUcAlqHZULjF1mDkM4bu0gNmLXpIMIsw9UotTvz/O
-----END CERTIFICATE-----
subject=/C=US/ST=North Carolina/L=Durham/O=Red Hat, Inc./OU=Web Operations/CN=www.redhat.com
issuer=/C=US/O=RSA Data Security, Inc./OU=Secure Server Certification Authority

No client certificate CA names sent

SSL handshake has read 1549 bytes and written 314 bytes

New, TLSv1/SSLv3, Cipher is EDH-RSA-DES-CBC3-SHA
Server public key is 1024 bit
SSL-Session:
    Protocol  : TLSv1
    Cipher    : EDH-RSA-DES-CBC3-SHA
    Session-ID: 97D3E2DF903F5757AF8BED807F5FD9665F43300F139BDFCD1701974D97E5C5CA
    Session-ID-ctx:
    Master-Key: 4B2295AEDCE520F4615769135FB65EBD6E2345C88FCE4EB7450B71B17FD1A2B4460D751DC3DF05C311DA54B02A7B04D1
    Key-Arg   : None
    Start Time: 1063899107
    Timeout   : 300 (sec)
    Verify return code: 21 (unable to verify the first certificate)

Once you have connected, you can manually type in any commands you want, such as “GET /” and “HEAD / HTTP/1.0” for secure web servers. There are also options like -no_tls1 and -no_ssl2 that let you specify which version of SSL/TLS you want to connect with.
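
You can also drive s_client non-interactively in a pipeline; a small sketch that fetches the response headers, using -quiet to suppress the certificate dump:

$ printf "HEAD / HTTP/1.0\r\n\r\n" | openssl s_client -connect www.redhat.com:443 -quiet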

The -showcerts and -debug options are also worth a look.

——————————————————————————–

Resources
• OpenSSL home page (http://www.openssl.org/)
• man openssl
• man enc
• man dgst
• man s_client

http://www.vanemery.com/Linux/Apache/openSSL.html

——————————————————————————–

The Rise and Fall of HL7

Interfaceware is a Toronto-based HL7 solutions provider whose customers include the CDC, Cerner, GE Medical Systems, IBM, Johns Hopkins Medical, the Mayo Foundation, MD Anderson Cancer Center, Mount Sinai Hospital, Partners Healthcare Systems, Philips, Quest Diagnostics, the VA, and Welch Allyn.

At 2.57pm EDT today, March 31, 2011 — on what will surely prove to be a historic day in the advance of healthcare information technology in the direction of reason and light — Eliot Muir, founder and CEO of Interfaceware, posted the following comment, which I here reproduce in full:

The Rise and Fall of HL7

That might seem an unusual comment from what is supposed to be an HL7 middleware vendor. But times are changing and that is not where I see our future.

Standards do not exist in a vacuum. To be successful, a standard must address market needs and solve real problems so that people can make or save money. Writing code costs money. Less than 0.01% of code gets written for free; the majority is written by people who are paid to solve problems with it.

There are plenty of standards which are not worth the paper they are printed on because they are not sufficiently useful or practical.

Complicated standards can be pushed for a while, but ultimately markets reject them. Even governments will ultimately reject complicated standards through a democratic correction process, although they usually waste a fair amount of other people's money along the way.

So back to HL7. Why was it successful?

Version 2.x of HL7 solved a very big problem for many people in healthcare IT back in the 90s. It replaced a lot of ad hoc data-sharing mechanisms used in the industry at the time. It gave three points of value; ironically, the first is not even an official part of the standard:
1. The so-called “de facto LLP standard” defined a uniform way to transport HL7 over a TCP/IP socket, which meant vendors could write standard socket implementations to exchange data.
2. The EDI format of HL7, with its classic | and ^ separators (illustrated below), meant vendors could write standard HL7 parsers.
3. The HL7 definitions gave some good suggestions on places to look for data.
And that is where the value stops.
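
To make that concrete, here is a hypothetical, heavily truncated HL7 v2 ADT message (not from any real system) showing the | field and ^ component separators:

MSH|^~\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|200303121530||ADT^A01|MSG00001|P|2.3
PID|1||123456^^^HOSP^MR||DOE^JOHN^A||19610615|M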

It is a lie when a vendor tries to claim they are “HL7 compliant”.

The term is meaningless.

The best any vendor can ever do is provide a stream of messages with fields that map adequately to most of the data from their application. HL7 interfaces always end up being a thin wrapper around the structure of the database of the application which feeds them. The standardization comes about because there are common ways of structuring a lot of the data. The pain comes from areas where it is unclear how to structure the data.

There are good reasons for the lack of “standard data models”. Technology and society change, which means data models must also change to describe new data requirements. Medicine changes. New entrepreneurs come up with clever new solutions and invent ways of using data that improve on old models.

HL7 is working on creating the final solution for healthcare interoperability – the Reference Information Model (RIM), which underlies the structure of version 3 (v3) of HL7.

I think that effort is doomed to fail, for these reasons:
1. There is no such thing as a single optimal data model to serve all purposes. A formal data model is always going to be a square peg going into a round hole. Some problems are best solved by small, simple models. There are approximations which work for certain problems but are not valid for others. If there were a single solution to everything, then one person would invent it and the rest of us would be out of work.
2. There is substantial academic criticism of the RIM that points to semantic inconsistency within the model itself.
3. It is creating complicated standards which are expensive to implement.
The only organisations spending money on v3 are governments and some big corporations like Oracle, which based its Healthcare Transaction Base (HTB) on it. Oracle salespeople can sell ice to Eskimos, but I have not heard a lot of great success stories for that product.

Now let us fast-forward to what I think will become the future: JSON-based web services over HTTPS. Let us look at the benefits:
1. HTTPS is analogous to LLP, only it comes with authentication and security baked in.
2. JSON is the simplest format imaginable, with free parsers in every language and environment, including JavaScript, which is strategic as the language of the web.
3. JSON data names and values give good suggestions on places to look for data.
Hmmm. Notice something? The value is more than what HL7 offers. In fact, a lot more, since these are very mainstream technologies that extend far beyond the healthcare market.
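
For comparison with the HL7 sample above, here is a hypothetical sketch of the same sort of patient demographics as a JSON payload (the field names are illustrative, not a proposed standard):

{
  "patient": {
    "id": "123456",
    "name": { "family": "Doe", "given": "John" },
    "birthDate": "1961-06-15",
    "gender": "M"
  }
}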

That is why I am not betting the future of my company on HL7. Our value was never really as an HL7 implementation tool. The value our tools provide is the wiggle room they give our customers to handle the incompatibilities that occur with real-world data. The Iguana Translator is all about making it easy to grab data from anywhere – be it HL7, X12, XML, JSON, databases, or web services – and to munch, transform, and consume that data.

That is the future I am betting on.

Eliot Muir – CEO of iNTERFACEWARE

http://hl7-watch.blogspot.com/2011/03/rise-and-fall-of-hl7.html