
A light-weight forensic analysis of the AshleyMadison Hack

19 August 2015 at 14:13

———–[Intro]

So Ashley Madison (AM) got hacked. It was first announced about a month ago, and the attackers claimed they’d drop the full monty of user data if the AM website did not cease operations. AM’s parent company, Avid Life Media (ALM), did not cease business operations for the site, and true to their word the attackers leaked everything they promised on August 18th 2015, including:

  • full database dumps of user data
  • emails
  • internal ALM documents
  • as well as a limited number of user passwords

Back in college I used to do forensics contests for the Honeynet Project and thought this might be a fun nostalgic trip: recreating my pseudo-forensics investigation style on the data within the AM leak.

Disclaimer: I will not be releasing any personal or confidential information
within this blog post that may be found in the AM leak. The purpose of
this blog post is to provide an honest holistic forensic analysis and minimal
statistical analysis of the data found within the leak. Consider this a
journalistic exploration more than anything.

Also note that the credit card files were deleted and not reviewed as part of this write-up.

———–[Grabbing the Leak]

First we go find where on the big bad dark web the release site is located. Thankfully, knowing a shady guy named Boris pays off for me, and we find a torrent file for the release of the August 18th Ashley Madison user data dump. The torrent file we found has the following SHA1 hash:
e01614221256a6fec095387cddc559bffa832a19  impact-team-ashley-release.torrent
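
For evidence-audit purposes, the hash of a downloaded copy can be checked against the value above:

$ sha1sum impact-team-ashley-release.torrent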

After extracting all the files we have the following sizes and
file hashes for evidence audit purposes:

$  du -sh *
4.0K    74ABAA38.txt
9.5G    am_am.dump
2.6G    am_am.dump.gz
4.0K    am_am.dump.gz.asc
13G     aminno_member.dump
3.1G    aminno_member.dump.gz
4.0K    aminno_member.dump.gz.asc
1.7G    aminno_member_email.dump
439M    aminno_member_email.dump.gz
4.0K    aminno_member_email.dump.gz.asc
111M    ashleymadisondump/
37M     ashleymadisondump.7z
4.0K    ashleymadisondump.7z.asc
278M    CreditCardTransactions.7z
4.0K    CreditCardTransactions.7z.asc
2.3G    member_details.dump
704M    member_details.dump.gz
4.0K    member_details.dump.gz.asc
4.2G    member_login.dump
2.7G    member_login.dump.gz
4.0K    member_login.dump.gz.asc
4.0K    README
4.0K    README.asc

$ sha1sum *
a884c4fcd61e23aecb80e1572254933dc85e2b4a  74ABAA38.txt
e4ff3785dbd699910a512612d6e065b15b75e012  am_am.dump
e0020186232dad71fcf92c17d0f11f6354b4634b  am_am.dump.gz
b7363cca17b05a2a6e9d8eb60de18bc98834b14e  am_am.dump.gz.asc
d412c3ed613fbeeeee0ab021b5e0dd6be1a79968  aminno_member.dump
bc60db3a78c6b82a5045b797e6cd428f367a18eb  aminno_member.dump.gz
8a1c328142f939b7f91042419c65462ea9b2867c  aminno_member.dump.gz.asc
2dcb0a5c2a96e4f3fff5a0a3abae19012d725a7e  aminno_member_email.dump
ab5523be210084c08469d5fa8f9519bc3e337391  aminno_member_email.dump.gz
f6144f1343de8cc51dbf20921e2084f50c3b9c86  aminno_member_email.dump.gz.asc
sha1sum: ashleymadisondump: Is a directory
26786cb1595211ad3be3952aa9d98fbe4c5125f9  ashleymadisondump.7z
eb2b6f9b791bd097ea5a3dca3414a3b323b8ad37  ashleymadisondump.7z.asc
0ad9c78b9b76edb84fe4f7b37963b1d956481068  CreditCardTransactions.7z
cb87d9fb55037e0b1bccfe50c2b74cf2bb95cd6c  CreditCardTransactions.7z.asc
11e646d9ff5d40cc8e770a052b36adb18b30fd52  member_details.dump
b4849cec980fe2d0784f8d4409fa64b91abd70ef  member_details.dump.gz
3660f82f322c9c9e76927284e6843cbfd8ab8b4f  member_details.dump.gz.asc
436d81a555e5e028b83dcf663a037830a7007811  member_login.dump
89fbc9c44837ba3874e33ccdcf3d6976f90b5618  member_login.dump.gz
e24004601486afe7e19763183934954b1fc469ef  member_login.dump.gz.asc
4d80d9b671d95699edc864ffeb1b50230e1ec7b0  README
a9793d2b405f31cc5f32562608423fffadc62e7a  README.asc

———–[Attacker Identity & Attribution]

The attackers make it clear they have no desire to bridge their dark web identities with their real-life identities and have taken many measures to ensure this does not occur.

The torrent file and messaging were released via the anonymous Tor network through an onion web server which serves only HTML/TXT content. If the attackers took proper OPSEC precautions while setting up the server, law enforcement and AM may never find them. That being said, hackers have been known to get sloppy and slip up their OPSEC. The two most famous cases of this are Sabu of Anonymous and the Dread Pirate Roberts of Silk Road, both of whom were caught even though they primarily used Tor for their internet activities.

Within the dump we see that the files are signed with PGP. Signing a file in this manner is a way of saying “I did this” even though we don’t know the real-life identity of the person/group claiming to have done it (there is a bunch of crypto and math that makes this possible). As a result we can be more confident that if there are files which are signed by this PGP key, they were released by the same person/group.

In my opinion, this is done for two reasons. First, the leaker wants to claim responsibility in an identity-attributable manner without revealing their real-life identity. Second, the leaker wishes to dispel statements regarding “false leaks” made by the Ashley Madison team. The AM executive and PR teams have been in crisis-communications mode, explaining that there have been many fake leaks.

The “Impact Team” is using the following public PGP key to sign their releases.

$ cat ./74ABAA38.txt

-----BEGIN PGP PUBLIC KEY BLOCK-----

Version: GnuPG v1.4.12 (GNU/Linux)




mQINBFW25a4BEADt5OKS5F36aACyyPc4UMZAnhLnbImhxv5A2n7koTKg1QhyA1mI
InLLriKW3GR0Y4Fx+84pvjbYdoJAnuqMemI0oP+2VAJqwC0LYVVcFHKK6ZElYiN8
4/3e5WWYv6vzrHwB+3NbQ1O9bbUjgk9ky2RsdTe+vDBhKwKS0kPSb28h0oMpAs87
pJcgWZ57jjtvyUEIKXQZAqLvFo5xayS8dEp8tRgNLauQ0SafKGsxjW5cRd2Ok3Z5
QtIS44WnYECe3tqqFYSOo4kdHBeswC8zaKapYaNzxsHw9msdZvx/rkrMgXtJye/o
vmf2RdLIcvqK0Nwf1LDLhweCBP61wVn8gWqSrzww+as1ObE6b64hYKHFzdIMcqJ3
sbAErRrfZMqZ6ihWnlSjzDDx2L3n5T16ZIDxGx5Mt0KDYIo8RqDdF+VKLCT7Eq/C
g/Ax+06Eez4rVnY+xeW6Tj+1iBAlrGRIcRHCX89fNwLxr4Bcq/q1KKrCwVsgonBK
+3Mzzs2/b9XQ/Z6bDHFnMWUTDhomBmNcZOz9sHrZZI9XUzx/bfS6CoQ3MIqDhNM+
l7cKZ/Icfs6IDoOsYIS3QeTWC8gv3IBTvtfKFnf1o6JnkP0Qv6SrckslztNA4HDL
2iIMMGs34vDc11ddTzMBBkig1NgtiaHqHhG5T8OoOD9c3hEmTQzir7iCPQARAQAB
tCRJbXBhY3QgVGVhbSA8aW1wYWN0dGVhbUBtYWlsdG9yLm5ldD6JAjgEEwECACIF
AlW25a4CGwMGCwkIBwMCBhUIAgkKCwQWAgMBAh4BAheAAAoJECQ3PNV0q6o445UQ
AKYIVyrpVKKBA4jliarqngKvkEBRd62CXHY42ZdjFmubLvRw5nC0nDdGUyGPRYOl
0RddL2C7ROqW9lCYfNl3BAQYEXMADDjoBMEQkepIxeIVehat46ksbJuFZ0+uI6EB
aVcJCR4S2C+hJP09q9tn/7RKacIolfeT0+s9IteFghKKK0c8Aot52A/hExrqjldo
fsMX6liSFQjDQpPhQpqiAJ8z9N3eeFwcAAc/gqNz9bE0Wug/OXh0OAHUQk3fS57a
uIi8medOr+kAqHziuO79+5Hkachsp+8c58jBtIzZM4bO6e42aEa2yHv0FGG5MhoB
x7MH0ympFdwbgebpF6kpH371GIsJcyumwQ3Yn4Sy2kp2XmB8xOQo2W8tWRtLW1dI
yGAXHXXy5UI5FJek7G1KvQXCy4pa756RGDFiqdqigq0KC27A/at02M8CP6R9RxC9
YSnru0Qrl7JeATekWM3w8sKs8r6yMEDFAcpK2NHaYzF6/o6t/HEqUWD41DZ2cqqg
9i4uoXpkAB3vAG/snNg1B8g89b3vbVUf6hSIcU89G3lgj9hh87Q/TSsISRJ+yq0N
sLEeVmDmOdf+xb44g3RuRJ9yh0h3j8jdQOq0FvvwW3UHKIVDQlFB3kgHY478TCIa
5MMCtMovGv/ukGKlU8aELKV0/sVsliMh8HDdFQICTd0MuQINBFW25a4BEADIh8Vg
tMGfByY/+IgPd9l3u0I4FZLHqKGKOIpfFEeA31jPAhfOqQyBRcnEN/TxLwJ8NLnL
+GdQ+0z1YncZPxpHU/z8zyMwGpZM/hMbkixA9ysyu06S7hna4YMfifT+lOe1lGSo
Tz3Fz1u2OGH+2UzVk5+Rv0FqDl6X1ZoqhMTswzW0jYR7JLLJip5MTMrLD0rSl0b5
a2XvF9Tpjzy9KWubsJk4W7x00Egu2EU9NhEZXaY18H3rxvYgXT7JMjq/y+IUp2Cd
Bv/XCNWmzl66/ZSLC8hzlcxmAYpmBkxafYNdptMeVzsH/xHmN2zSFjuBNx0Mkk+R
TrOxK/boS9onrGsSQ3zItWJAmodo2qYFjlirtu9pURSdYEINNQ5DgWymg43iAIfp
Xp5/yGBj4BlWE80qEAVsBB2BIRs7QHvpd34xETP08dXMsswIrMn/XxvHumyPoimj
mcNvIpvnAZqt6xppo6BSZ3y7MU4cSIRsZzLuSvkwGk97Jv2sMNvXlPRxzpU9ozsI
iYJAk6/n8kbQiTJk/SeiCTbf6e+BzbZbgIE3O9iPKhfW+6zWjC4TL+lBeyWTy1PP
PcQTT+najDqIwysz2BFuPozwuUQsnfQnyRytSjcI5m1fDoYpJPH8NNRIu9lzp+RN
YENVKXiCfnUCMCnSzxP3Kij3Wt227JLZQqnBUQARAQABiQIfBBgBAgAJBQJVtuWu
AhsMAAoJECQ3PNV0q6o4C2EP/29Bis5Skt9NxHVUBpC1OgRL8V+JD5TjNurMT6Pu
E75szLsMZ84z0MQ6n74ADIgEuznPDIa9hMZGK9DwlsQfFOlC/jyTYxSpgAgN6LAl
qoJztVzLRnMd2gZjOj6wajUy616b8u3Q3zovHcEKll5niUyNwHXovZcCzukFqJBF
a3JU/tkPvBuj2PEWf4ytuO6He2ERuSnsi+7mil8rTAAV/PPy7N2R/T7OUa6ERoGg
hqIGythWizRtZBVPRzush+8L181GBU2ps7nJ1resZ7T0OsCFL67J6t8r8IpmjWWt
fiiV05E71UAyNWLOWriS57qAwNcQ0W2UYKkFFKor+oWaBB+hCpvb8Za5867wpH8l
O6gpS/G17e+MKHTn60hw64xIVFJn7pka+OdAINjPRo5B5qVyvM3puEjRepx1piOG
HKOan00quI0dhF2Gia59zrBHK/agdF4FjkJSjER8uf/jJpo184p38zuQ7kyMXUxY
ExpGcVMVjVOoWKVRPGXYEz2nc9HIZ6mHbvhzsWQEAVwwIxZCos5dW1AMW3Otn30A
uFqPsx4jh/ANGhqUASz18bBrZ8DW3zceVs2zelkMpdL0z7ifU/UNn2rtDlpgLwFl
9ggUtPwXnSxqB7doSxfJyPJUum+bZxMb4Iq5BNNa/tme7TeWGl9bmsVwcQXSQlY2
uZnr
=v0qe
-----END PGP PUBLIC KEY BLOCK-----

The key has the following meta-data.
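
The packet-level breakdown below matches the output format of the pgpdump tool, so it can be reproduced with (assuming pgpdump is installed):

$ pgpdump ./74ABAA38.txt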

Old: Public Key Packet(tag 6)(525 bytes)
        Ver 4 - new
        Public key creation time - Mon Jul 27 22:15:10 EDT 2015
        Pub alg - RSA Encrypt or Sign(pub 1)
        RSA n(4096 bits) - ...
        RSA e(17 bits) - ...
Old: User ID Packet(tag 13)(36 bytes)
        User ID - Impact Team <[email protected]>
Old: Signature Packet(tag 2)(568 bytes)
        Ver 4 - new
        Sig type - Positive certification of a User ID and Public Key packet(0x13).
        Pub alg - RSA Encrypt or Sign(pub 1)
        Hash alg - SHA1(hash 2)
        Hashed Sub: signature creation time(sub 2)(4 bytes)
                Time - Mon Jul 27 22:15:10 EDT 2015
        Hashed Sub: key flags(sub 27)(1 bytes)
                Flag - This key may be used to certify other keys
                Flag - This key may be used to sign data
        Hashed Sub: preferred symmetric algorithms(sub 11)(5 bytes)
                Sym alg - AES with 256-bit key(sym 9)
                Sym alg - AES with 192-bit key(sym 8)
                Sym alg - AES with 128-bit key(sym 7)
                Sym alg - CAST5(sym 3)
                Sym alg - Triple-DES(sym 2)
        Hashed Sub: preferred hash algorithms(sub 21)(5 bytes)
                Hash alg - SHA256(hash 8)
                Hash alg - SHA1(hash 2)
                Hash alg - SHA384(hash 9)
                Hash alg - SHA512(hash 10)
                Hash alg - SHA224(hash 11)
        Hashed Sub: preferred compression algorithms(sub 22)(3 bytes)
                Comp alg - ZLIB <RFC1950>(comp 2)
                Comp alg - BZip2(comp 3)
                Comp alg - ZIP <RFC1951>(comp 1)
        Hashed Sub: features(sub 30)(1 bytes)
                Flag - Modification detection (packets 18 and 19)
        Hashed Sub: key server preferences(sub 23)(1 bytes)
                Flag - No-modify
        Sub: issuer key ID(sub 16)(8 bytes)
                Key ID - 0x24373CD574ABAA38
        Hash left 2 bytes - e3 95
        RSA m^d mod n(4096 bits) - ...
                -> PKCS-1
Old: Public Subkey Packet(tag 14)(525 bytes)
        Ver 4 - new
        Public key creation time - Mon Jul 27 22:15:10 EDT 2015
        Pub alg - RSA Encrypt or Sign(pub 1)
        RSA n(4096 bits) - ...
        RSA e(17 bits) - ...
Old: Signature Packet(tag 2)(543 bytes)
        Ver 4 - new
        Sig type - Subkey Binding Signature(0x18).
        Pub alg - RSA Encrypt or Sign(pub 1)
        Hash alg - SHA1(hash 2)
        Hashed Sub: signature creation time(sub 2)(4 bytes)
                Time - Mon Jul 27 22:15:10 EDT 2015
        Hashed Sub: key flags(sub 27)(1 bytes)
                Flag - This key may be used to encrypt communications
                Flag - This key may be used to encrypt storage
        Sub: issuer key ID(sub 16)(8 bytes)
                Key ID - 0x24373CD574ABAA38
        Hash left 2 bytes - 0b 61
        RSA m^d mod n(4095 bits) - ...
                -> PKCS-1

We can verify the released files are attributable to the PGP public key
in question using the following commands:

$ gpg --import ./74ABAA38.txt
$ gpg --verify ./member_details.dump.gz.asc ./member_details.dump.gz
gpg: Signature made Sat 15 Aug 2015 11:23:32 AM EDT using RSA key ID 74ABAA38
gpg: Good signature from "Impact Team <impactteam@mailtor.net>"
gpg: WARNING: This key is not certified with a trusted signature!
gpg:          There is no indication that the signature belongs to the owner.
Primary key fingerprint: 6E50 3F39 BA6A EAAD D81D  ECFF 2437 3CD5 74AB AA38

This also tells us the date on which the dump was signed and packaged.

———–[Catching the attackers]

The PGP key’s meta-data shows a user ID for the mailtor dark web email service, the last known location of which was:
http://mailtoralnhyol5v.onion

Don’t bother emailing the address found in the PGP key, as its domain does not have a valid MX record. The fact that this address exists at all seems to be one of those interesting artifacts of what happens when Internet tools like GPG get used on the dark web.
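
One quick way to confirm the missing MX record for yourself (assuming the dig DNS tool is available):

$ dig +short MX mailtor.net
# empty output; the domain publishes no mail exchanger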

If the AM attackers were to be caught, here (in no particular order) are the most likely ways this would happen:

  • The person(s) responsible tells somebody. Nobody keeps something like this a secret; if the attackers tell anybody, they’re likely going to get caught.
  • If the attackers review email from a web browser, they might get revealed by federal law enforcement or by private investigation/IR teams hired by AM. The FBI is known to have these capabilities.
  • If the attackers slip up in their diligence of messaging only via TXT and HTML on the web server. Meta-data sinks ships kids — don’t forget.
  • If the attackers slip up in their diligence when configuring their server. One bad config of a web server leaks an internal IP, or worse!
  • If the attackers slipped up during their persistent attack against AM and investigators hired by AM find evidence leading back to them.
  • If the attackers have not masked their writing or image-creation style and leave a semantic fingerprint from which they can be profiled.

If none of those things happens, I don’t think these attackers will ever be caught. The cyber-crime fighters have a daunting task in front of them; I’ve helped out a couple of FBI and NYPD cyber-crime fighters, and I do not envy the difficult and frustrating job they have — good luck to them! Today we’re living in the Wild West days of the Internet.

———–[Leaked file extraction and evidence gathering]

Now, to document the information seen within this data leak, we proceed with a couple of commands to gather the file sizes, and we’ll also check the file hashes to ensure the uniqueness of the files. Finally, we review the meta-data of some of the compressed files. The meta-data shows the time-stamps embedded in the various compressed files. Although meta-data can easily be faked, it usually is not.

Next we’ll extract these files and examine their file size to take a closer look.

$ 7z e ashleymadisondump.7z

Within the extracted 7zip archive we find another 7zip file,
“swappernet_User_Table.7z”, which was also extracted.

We now have the following file sizes and SHA1 hashes for evidence
integrity & auditing purposes:

$ du -sh ashleymadisondump/*
68K     20131002-domain-list.xlsx
52K     ALMCLUSTER (production domain) computers.txt
120K    ALMCLUSTER (production domain) hashdump.txt
68K     ALM - Corporate Chart.pptx
256K    ALM Floor Plan - ports and names.pdf
8.0M    ALM - January 2015 - Company Overview.pptx
1.8M    ALM Labs Inc. Articles of Incorporation.pdf
708K    announcement.png
8.0K    Areas of concern - customer data.docx
8.0K    ARPU and ARPPU.docx
940K    Ashley Madison Technology Stack v5(1).docx
16K     Avid Life Media - Major Shareholders.xlsx
36K     AVIDLIFEMEDIA (primary corporate domain) computers.txt
332K    AVIDLIFEMEDIA (primary corporate domain) user information and hashes.txt
1.7M    Avid Org Chart 2015 - May 14.pdf
24K     Banks.xlsx
6.1M    Copies of Option Agreements.pdf
8.0K    Credit useage.docx
16K     CSF Questionnaire (Responses).xlsx
132K    Noel's loan agreement.pdf
8.0K    Number of traveling man purchases.docx
1.5M    oneperday_am_am_member.txt
940K    oneperday_aminno_member.txt
672K    oneperday.txt
44K     paypal accounts.xlsx
372K    [email protected]_20101103_133855.pdf
16K     q2 2013 summary compensation detail_managerinput_trevor-s team.xlsx
8.0K    README.txt
8.0K    Rebill Success Rate Queries.docx
8.0K    Rev by traffic source rebill broken out.docx
8.0K    Rev from organic search traffic.docx
4.0K    Sales Queries
59M     swappernet_QA_User_Table.txt  #this was extracted from swappernet_User_Table.7z in the same dir
17M     swappernet_User_Table.7z
$ sha1sum ashleymadisondump/*
f0af9ea887a41eb89132364af1e150a8ef24266f  20131002-domain-list.xlsx
30401facc68dab87c98f7b02bf0a986a3c3615f0  ALMCLUSTER (production domain) computers.txt
c36c861fd1dc9cf85a75295e9e7bcf6cf04c7d2c  ALMCLUSTER (production domain) hashdump.txt
6be635627aa38462ebcba9266bed5b492a062589  ALM - Corporate Chart.pptx
4dec7623100f59395b68fd13d3dcbbff45bef9c9  ALM Floor Plan - ports and names.pdf
601e0b462e1f43835beb66743477fe94bbda5293  ALM - January 2015 - Company Overview.pptx
d17cb15a5e3af15bc600421b10152b2ea1b9c097  ALM Labs Inc. Articles of Incorporation.pdf
1679eca2bc172cba0b5ca8d14f82f9ced77f10df  announcement.png
6a618e7fc62718b505afe86fbf76e2360ade199d  Areas of concern - customer data.docx
91f65350d0249211234a52b260ca2702dd2eaa26  ARPU and ARPPU.docx
50acee0c8bb27086f12963e884336c2bf9116d8a  Ashley Madison Technology Stack v5(1).docx
71e579b04bbba4f7291352c4c29a325d86adcbd2  Avid Life Media - Major Shareholders.xlsx
ef8257d9d63fa12fb7bc681320ea43d2ca563e3b  AVIDLIFEMEDIA (primary corporate domain) computers.txt
ec54caf0dc7c7206a7ad47dad14955d23b09a6c0  AVIDLIFEMEDIA (primary corporate domain) user information and hashes.txt
614e80a1a6b7a0bbffd04f9ec69f4dad54e5559e  Avid Org Chart 2015 - May 14.pdf
c3490d0f6a09bf5f663cf0ab173559e720459649  Banks.xlsx
1538c8f4e537bb1b1c9a83ca11df9136796b72a3  Copies of Option Agreements.pdf
196b1ba40894306f05dcb72babd9409628934260  Credit useage.docx
2c9ba652fb96f6584d104e166274c48aa4ab01a3  CSF Questionnaire (Responses).xlsx
0068bc3ee0dfb796a4609996775ff4609da34acb  Noel's loan agreement.pdf
c3b4d17fc67c84c54d45ff97eabb89aa4402cae8  Number of traveling man purchases.docx
9e6f45352dc54b0e98932e0f2fe767df143c1f6d  oneperday_am_am_member.txt
de457caca9226059da2da7a68caf5ad20c11de2e  oneperday_aminno_member.txt
d596e3ea661cfc43fd1da44f629f54c2f67ac4e9  oneperday.txt
37fdc8400720b0d78c2fe239ae5bf3f91c1790f4  paypal accounts.xlsx
2539bc640ea60960f867b8d46d10c8fef5291db7  [email protected]_20101103_133855.pdf
5bb6176fc415dde851262ee338755290fec0c30c  q2 2013 summary compensation detail_managerinput_trevor-s team.xlsx
5435bfbf180a275ccc0640053d1c9756ad054892  README.txt
872f3498637d88ddc75265dab3c2e9e4ce6fa80a  Rebill Success Rate Queries.docx
d4e80e163aa1810b9ec70daf4c1591f29728bf8e  Rev by traffic source rebill broken out.docx
2b5f5273a48ed76cd44e44860f9546768bda53c8  Rev from organic search traffic.docx
sha1sum: Sales Queries: Is a directory
0f63704c118e93e2776c1ad0e94fdc558248bf4e  swappernet_QA_User_Table.txt
9d67a712ef6c63ae41cbba4cf005ebbb41d92f33  swappernet_User_Table.7z

———–[Quick summary of each of the leaked files]

The following files are MySQL data dumps of the main AM database:

  • member_details.dump.gz
  • aminno_member.dump.gz
  • member_login.dump.gz
  • aminno_member_email.dump.gz
  • CreditCardTransactions.7z

Also included was another AM database which contains user info (separate from the emails):

  • am_am.dump.gz

In the top level directory you can also find these additional files:

  • 74ABAA38.txt
    Impact Team’s Public PGP key used for signing the releases (The .asc files are the signatures)
  • ashleymadisondump.7z
    This contains various internal and corporate private files.
  • README
    Impact Team’s justification for releasing the user data.
  • Various .asc files such as “member_details.dump.gz.asc”
    These are all PGP signature files to prove that one or more persons who are part of the “Impact Team” attackers released them.

Within the ashleymadisondump.7z we can extract and view the following files:

  • Number of traveling man purchases.docx
    SQL queries to investigate high-travel users’ purchases.
  • q2 2013 summary compensation detail_managerinput_trevor-s team.xlsx
    Per-employee compensation listings.
  • AVIDLIFEMEDIA (primary corporate domain) user information and hashes.txt
  • AVIDLIFEMEDIA (primary corporate domain) computers.txt
    The output of the dnscmd Windows command executed on what appears to be a primary domain controller. The timestamp indicates that the command was run on July 1st 2015. There is also a “pwdump”-style export of 1324 user accounts which appear to be from the ALM domain controller. These passwords will be easy to crack, as NTLM hashes aren’t the strongest.
  • Noel’s loan agreement.pdf
    A promissory note for the CEO to pay back ~3MM in Canadian monies.
  • Areas of concern – customer data.docx
    Appears to be a risk profile of the major security concerns that ALM has regarding their customers’ data. And yes, a major user data dump is on the list of concerns.
  • Banks.xlsx
    A listing of all ALM associated bank account numbers and the biz which owns them.
  • Rev by traffic source rebill broken out.docx
  • Rebill Success Rate Queries.docx
    Both of these are SQL queries to investigate Rebilling of customers.
  • README.txt
    Impact Team statement regarding their motivations for the attack and leak.
  • Copies of Option Agreements.pdf
    All agreements for what appears to be all of the company’s outstanding options.
  • paypal accounts.xlsx
    Various user/passes for ALM paypal accounts (16 in total)
  • swappernet_QA_User_Table.txt
  • swappernet_User_Table.7z
    This file is a database export in CSV format. It appears to be from a QA server.
  • ALMCLUSTER (production domain) computers.txt
    The output of the dnscmd Windows command executed on what appears to be a production domain controller. The timestamp indicates that the command was run on July 1st 2015.
  • ALMCLUSTER (production domain) hashdump.txt
    A “pwdump” style export of 1324 user accounts which appear to be from the ALM domain controller. These passwords will be easy to crack as NTLM hashes aren’t the strongest.
  • ALM Floor Plan – ports and names.pdf
    Seating map of the main office; this type of map is usually used for network deployment purposes.
  • ARPU and ARPPU.docx
    A listing of SQL commands which provide revenue and other macro financial health info.
    Presumably these queries would run on the primary DB or a biz intel slave.
  • Credit useage.docx
    SQL queries to investigate credit card purchases.
  • Avid Org Chart 2015 – May 14.pdf
    A per-team organizational chart of what appears to be the entire company.
  • announcement.png
    The graphic created by Impact Team to announce their demand for ALM to shut down its flagship website AM.
  • [email protected]_20101103_133855.pdf
    Contract outlining the terms of a purchase of the biz Seekingarrangement.com.
  • CSF Questionnaire (Responses).xlsx
    Company exec Critical Success Factors spreadsheet. Answering questions like “In what area would you hate to see something go wrong?” and the CTO’s response is about hacking.
  • ALM – January 2015 – Company Overview.pptx
    This is a very detailed breakdown of current biz health, marketing spend, and future product plans.
  • Ashley Madison Technology Stack v5(1).docx
    A detailed walk-through of all major servers and services used in the ALM production environment.
  • oneperday.txt
  • oneperday_am_am_member.txt
  • oneperday_aminno_member.txt
    These three files have limited leak info as a “teaser” for the .dump files that are found in the highest level directory of the AM leak.
  • Rev from organic search traffic.docx
    SQL queries to explore the revenue generated from search traffic.
  • 20131002-domain-list.xlsx
    A list of the 1083 domain names that ALM owns, has owned, or is seeking to own.
  • Sales Queries/
    Empty Directory
  • ALM Labs Inc. Articles of Incorporation.pdf
    The full 109-page Articles of Incorporation, covering every aspect of initial company formation.
  • ALM – Corporate Chart.pptx
    A detailed block diagram defining the relationship between various tax and legal business entity names related to ALM businesses.
  • Avid Life Media – Major Shareholders.xlsx
    A listing of each major shareholder and their equity stake.

———–[File meta-data analysis]

First we’ll take a look at the 7zip file in the top level directory.

$ 7z l ashleymadisondump.7z

Listing archive: ashleymadisondump.7z

----
Path = ashleymadisondump.7z
Type = 7z
Method = LZMA
Solid = +
Blocks = 1
Physical Size = 37796243
Headers Size = 1303

   Date      Time    Attr         Size   Compressed  Name
------------------- ----- ------------ ------------  ------------------------
2015-07-09 12:25:48 ....A     17271957     37794940  swappernet_User_Table.7z
2015-07-10 12:14:35 ....A       723516               announcement.png
2015-07-01 18:03:56 ....A        51222               ALMCLUSTER (production domain) computers.txt
2015-07-01 17:58:55 ....A       120377               ALMCLUSTER (production domain) hashdump.txt
2015-06-25 22:59:22 ....A        35847               AVIDLIFEMEDIA (primary corporate domain) computers.txt
2015-06-14 21:18:11 ....A       339221               AVIDLIFEMEDIA (primary corporate domain) user information and hashes.txt
2015-07-18 15:23:34 ....A       686533               oneperday.txt
2015-07-18 15:20:43 ....A       959099               oneperday_aminno_member.txt
2015-07-18 19:00:45 ....A      1485289               oneperday_am_am_member.txt
2015-07-19 17:01:11 ....A         6031               README.txt
2015-07-07 11:41:36 ....A         6042               Areas of concern - customer data.docx
2015-07-07 12:14:42 ....A         5907               Sales Queries/ARPU and ARPPU.docx
2015-07-07 12:04:35 ....A       960553               Ashley Madison Technology Stack v5(1).docx
2015-07-07 12:14:42 ....A         5468               Sales Queries/Credit useage.docx
2015-07-07 12:14:43 ....A         5140               Sales Queries/Number of traveling man purchases.docx
2015-07-07 12:14:47 ....A         5489               Sales Queries/Rebill Success Rate Queries.docx
2015-07-07 12:14:43 ....A         5624               Sales Queries/Rev by traffic source rebill broken out.docx
2015-07-07 12:14:42 ....A         6198               Sales Queries/Rev from organic search traffic.docx
2015-07-08 23:17:19 ....A       259565               ALM Floor Plan - ports and names.pdf
2012-10-19 16:54:20 ....A      1794354               ALM Labs Inc. Articles of Incorporation.pdf
2015-07-07 12:04:10 ....A      1766350               Avid Org Chart 2015 - May 14.pdf
2012-10-20 12:23:11 ....A      6344792               Copies of Option Agreements.pdf
2013-09-18 14:39:25 ....A       132798               Noel's loan agreement.pdf
2015-07-07 10:16:54 ....A       380043               [email protected]_20101103_133855.pdf
2012-12-13 15:26:58 ....A        67816               ALM - Corporate Chart.pptx
2015-07-07 12:14:28 ....A      8366232               ALM - January 2015 - Company Overview.pptx
2013-10-07 10:30:28 ....A        67763               20131002-domain-list.xlsx
2013-07-15 15:20:14 ....A        13934               Avid Life Media - Major Shareholders.xlsx
2015-07-09 11:57:58 ....A        22226               Banks.xlsx
2015-07-07 11:41:41 ....A        15703               CSF Questionnaire (Responses).xlsx
2015-07-09 11:57:58 ....A        42511               paypal accounts.xlsx
2015-07-07 12:04:44 ....A        15293               q2 2013 summary compensation detail_managerinput_trevor-s team.xlsx
2015-07-18 13:54:40 D....            0            0  Sales Queries
------------------- ----- ------------ ------------  ------------------------
                              41968893     37794940  32 files, 1 folders

If we’re to believe this meta-data, the newest file is from July 19th 2015 and the oldest is from October 19th 2012. The timestamp for the file announcement.png shows a creation date of July 10th 2015; this file is the graphical announcement from the leakers. The file swappernet_User_Table.7z has a timestamp of July 9th 2015. Since that file is a database dump, one might presume that these files were created for the original release and that the other files were copied from a file-system that preserves timestamps.

Within that 7zip file we’ve found another which looks like:

$ 7z l ashleymadisondump/swappernet_User_Table.7z

Listing archive: ./swappernet_User_Table.7z

----
Path = ./swappernet_User_Table.7z
Type = 7z
Method = LZMA
Solid = -
Blocks = 1
Physical Size = 17271957
Headers Size = 158

   Date      Time    Attr         Size   Compressed  Name
------------------- ----- ------------ ------------  ------------------------
2015-06-27 18:39:40 ....A     61064200     17271799  swappernet_QA_User_Table.txt
------------------- ----- ------------ ------------  ------------------------
                              61064200     17271799  1 files, 0 folders

Within the ashleymadisondump directory extracted from ashleymadisondump.7z we’ve got
the following file types that we’ll examine for meta-data:

8 txt
8 docx
6 xlsx
6 pdf
2 pptx
1 png
1 7z

The PNG didn’t seem to have any EXIF meta-data, and we’ve already covered the 7z file.

The text files don’t usually yield anything to us meta-data-wise.
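
For the Office and PDF meta-data summarized below, a generic inspector such as exiftool is one convenient option (an assumption on tooling; any OOXML/PDF meta-data viewer works):

$ exiftool 'Ashley Madison Technology Stack v5(1).docx' | grep -iE 'creator|modify'
$ exiftool 'Avid Org Chart 2015 - May 14.pdf' | grep -iE 'create|modify|producer'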

In the MS Word docx files we have the following meta-data:

  • Areas of concern – customer data.docx
    No Metadata
  • ARPU and ARPPU.docx
    No Metadata
  • Ashley Madison Technology Stack v5(1).docx
    Created by Michael Morris; created and last modified on Sep 17 2013.
  • Credit useage.docx
    No Metadata
  • Number of traveling man purchases.docx
    No Metadata
  • Rebill Success Rate Queries.docx
    No Metadata
  • Rev by traffic source rebill broken out.docx
    No Metadata
  • Rev from organic search traffic.docx
    No Metadata

In the MS Powerpoint pptx files we have the following meta-data:

  • ALM – Corporate Chart.pptx
    Created by “Diana Horvat” on Dec 5 2012 and last updated by “Tatiana Kresling”
    on Dec 13th 2012
  • ALM – January 2015 – Company Overview.pptx
    Created by Rizwan Jiwan, Jan 21 2011, and last modified on Jan 20 2015.

In the MS Excel xlsx files we have the following meta-data:

  • 20131002-domain-list.xlsx
    Written by Kevin McCall, created and last modified Oct 2nd 2013
  • Avid Life Media – Major Shareholders.xlsx
    Created by Jamal Yehia; created and last modified July 15th 2013
  • Banks.xlsx
    Created by “Elena” and Keith Lalonde; created Dec 15 2009 and last modified Feb 26th 2010
  • CSF Questionnaire (Responses).xlsx
    No Metadata
  • paypal accounts.xlsx
    Created by Keith Lalonde; created Oct 28 2010 and last modified Dec 22nd 2010
  • q2 2013 summary compensation detail_managerinput_trevor-s team.xlsx
    No Metadata

And finally within the PDF files we also see additional meta-data:

  • ALM Floor Plan – ports and names.pdf
    Written by Martin Price in MS Visio, created and last modified April 23 2015
  • ALM Labs Inc. Articles of Incorporation.pdf
    Created with DocsCorp Pty Ltd (www.docscorp.com), created and last modified on Oct 17 2012
  • Avid Org Chart 2015 – May 14.pdf
    Created and last modified on May 14 2015
  • Copies of Option Agreements.pdf
    OmniPage CSDK 16 OcrToolkit, created and last modified on Oct 16 2012
  • Noel’s loan agreement.pdf
    Created and last modified on Sep 18 2013
  • [email protected]_20101103_133855.pdf
    Created and last modified on Jul 7 2015

———–[MySQL Dump file loading and evidence gathering]

At this point all of the dump files have been decompressed with gunzip or 7zip. The dump files are standard MySQL backup files (aka dump files); the info in the dump files implies that they were taken from multiple servers:

$ grep 'MySQL dump' *.dump
am_am.dump:-- MySQL dump 10.13  Distrib 5.5.33, for Linux (x86_64)
aminno_member.dump:-- MySQL dump 10.13  Distrib 5.5.40-36.1, for Linux (x86_64)
aminno_member_email.dump:-- MySQL dump 10.13  Distrib 5.5.40-36.1, for Linux (x86_64)
member_details.dump:-- MySQL dump 10.13  Distrib 5.5.40-36.1, for Linux (x86_64)
member_login.dump:-- MySQL dump 10.13  Distrib 5.5.40-36.1, for Linux (x86_64)

Also within the dump files was info referencing execution from localhost; this implies an attacker was on the database server in question.

Of course, all of this info is just text and can easily be faked, but it’s interesting nonetheless considering the possibility that it might be correct and unaltered.

To load up the MySQL dumps we’ll start with a fresh MySQL database instance
on a decently powerful server and run the following commands:

--As root MySQL user
CREATE DATABASE aminno;
CREATE DATABASE am;
CREATE USER 'am'@'localhost' IDENTIFIED BY 'loyaltyandfidelity';
GRANT ALL PRIVILEGES ON aminno.* TO 'am'@'localhost';
GRANT ALL PRIVILEGES ON am.* TO 'am'@'localhost';

Now back at the command line we’ll execute these to import the main dumps:

$ mysql -D aminno -uam -ployaltyandfidelity < aminno_member.dump

$ mysql -D aminno -uam -ployaltyandfidelity < aminno_member_email.dump

$ mysql -D aminno -uam -ployaltyandfidelity < member_details.dump

$ mysql -D aminno -uam -ployaltyandfidelity < member_login.dump

$ mysql -D am -uam -ployaltyandfidelity < am_am.dump
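
With the imports done, a quick sanity check confirms the schemas landed before digging in (the table names are whatever the dumps define, so we just list them):

$ mysql -D aminno -uam -ployaltyandfidelity -e 'SHOW TABLES;'
$ mysql -D am -uam -ployaltyandfidelity -e 'SHOW TABLES;'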

Now that you’ve got the data loaded up, you can recreate some of the findings ksugihara made with his analysis here [Edit: It appears ksugihara has taken this offline; I don’t have a mirror]. We didn’t have much more to add for holistic statistical analysis beyond what he’s already done, so check out his blog post for more on the primary data dumps. There is still one last database export though…

Within the file ashleymadisondump/swappernet_QA_User_Table.txt we have a final database export, but this one is not in the MySQL dump format; it is instead in CSV format. The file name implies this was an export from a QA database server.

This file has the following columns (left to right in the CSV):

  • recid
  • id
  • username
  • userpassword
  • refnum
  • disable
  • ipaddress
  • lastlogin
  • lngstatus
  • strafl
  • ap43
  • txtCoupon
  • bot

Sadly, within the file we see user passwords in clear text, which is always a bad security practice. At the moment we don’t know if these are actual production user account passwords, and if so how old they are. My guess is that these are from an old QA server from when AM was a smaller company and hadn’t moved to secure password-hashing practices like bcrypt.

These commands show us there are 765,607 records in this database export and
only four of them have a blank password. Many of the passwords repeat, and
387,974 of the passwords are unique.

$ cut -d , -f 4 < swappernet_QA_User_Table.txt | wc -l
765607
$ cut -d , -f 4 < swappernet_QA_User_Table.txt | sed '/^\s*$/d' | wc -l
765603
$ cut -d , -f 4 < swappernet_QA_User_Table.txt | sed '/^\s*$/d' | sort -u | wc -l
387974

Next we see the top 25 most frequently used passwords in this database export
using the command:

$ cut -d , -f 4 < swappernet_QA_User_Table.txt |sort|uniq -c |sort -rn|head -25
   5882 123456
   2406 password
    950 pussy
    948 12345
    943 696969
    917 12345678
    902 fuckme
    896 123456789
    818 qwerty
    746 1234
    734 baseball
    710 harley
    699 swapper
    688 swinger
    647 football
    645 fuckyou
    641 111111
    538 swingers
    482 mustang
    482 abc123
    445 asshole
    431 soccer
    421 654321
    414 1111
    408 hunter

After importing the CSV into MS Excel we can use sort and filter to make some
additional statements based on the data (rough command-line equivalents follow the list).

    1. The only logins marked in the “lastlogin” column as occurring in the year 2015 are from the
       following users:
       SIMTEST101
       SIMTEST130
       JULITEST2
       JULITEST3
       swappernetwork
       JULITEST4
       HEATSEEKERS
    2. The final and most recent login was from AvidLifeMedia’s office IP range.
    3. 275,285 of these users have an entry in the txtCoupon column.
    4. All users with the “bot” column set to TRUE have either the password “statueofliberty” or “cake”.
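
Rough command-line equivalents of those Excel filters are sketched below, assuming the column order listed earlier and that no field contains embedded commas (quoted CSV fields would break this naive splitting):

$ awk -F, '$8 ~ /2015/ {print $3}' swappernet_QA_User_Table.txt                       # users with a 2015 lastlogin
$ awk -F, '$12 != "" {c++} END {print c}' swappernet_QA_User_Table.txt                # rows with a txtCoupon entry
$ awk -F, 'toupper($13) == "TRUE" {print $4}' swappernet_QA_User_Table.txt | sort -u  # bot account passwords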


IncludeSec’s free training in Buenos Aires for our Argentine hacker friends.

29 April 2019 at 20:20

One of the things that has always been important in IncludeSec’s progress as a company is finding the best talent for the task at hand. We decided early on that if the best Python hacker in the world was not in the US then we would go find that person and work with them! The same goes for whatever technology the project at hand calls for: C, Go, Ruby, Scala, Java, etc.

As it turns out, the best Python hackers (and hackers in many other technologies) might actually be in Argentina. We’re not the only ones that have noticed this. Immunity Security, IOActive Security, Gotham Digital Science, and many others have a notable presence in Argentina (the NY Times even wrote an article on how great the hackers there are!). We’ve worked with dozens of amazing Argentinian hackers over the last six years, comprising ~30% of our team, and we’ve also enjoyed the quality of security conferences like EkoParty in Buenos Aires.

As a small thank-you to the entire Argentinian hacker scene, we’re going to do a free training class on May 30th and 31st, 2019, teaching advanced web hacking techniques. This training is oriented towards hackers earlier in their careers who have already experienced the world of the OWASP Top 10 and are looking to take their hacking skills to the next level.

If that sounds like you, you’re living in Argentina, and can make it to Buenos Aires on May 30th & 31st then this might be an awesome opportunity for you!

Please fill out the application here if this is something that would be awesome for you. We’ll close the application on May 10th.
https://docs.google.com/forms/d/e/1FAIpQLScrjV8wei7h-AY_kW7QwXZkYPDvSQswzUy6BTT9zg8L_Xejxg/viewform?usp=sf_link

Gracias,

Erik Cabetas
Managing Partner


Custom Static Analysis Rules Showdown: Brakeman vs. Semgrep

In application assessments you have to do the most effective work you can in the time period defined by the client to maximize the assurance you’re providing. At IncludeSec we’ve done a couple of innovative things to improve the overall effectiveness of the work we do, and we’re always on the hunt for more ways to squeeze even more value into our assessments by finding more risks and finding them faster. One topic that we revisit frequently to ensure we’re doing the best we can to maximize efficiency is static analysis tooling (aka SAST).

Recently we did a comparison of two open source static analysis tools to automate detection of suspicious or vulnerable code patterns identified during assessments. In this post we discuss the experience of using Semgrep and Brakeman to create the same custom rules within each tool for a client’s Ruby on Rails application our team was assessing. We’re also very interested in trying out GitHub’s CodeQL, but unfortunately its Ruby support is still in development, so that will have to wait for another time.

Semgrep is a pattern-matching tool that is semantically-aware and works with several languages (currently its Ruby support is marked as beta, so it is likely not at full maturity yet). Brakeman is a long-lived Rails-specific static-analysis tool, familiar to most who have worked with Rails security. Going in, I had no experience writing custom rules for either one.

This blog post is specifically about writing custom rules for code patterns that are particular to the project I’m assessing. First, though, I want to mention that both tools have pre-built general rules for use with most Ruby/Rails projects. Brakeman has a fantastic set of built-in rules for Rails projects that has proven very useful on assessments (just make sure the developers of the project haven’t disabled rules in config/brakeman.yml; yes, we have seen developers do this to make SAST warnings go away!). Semgrep has an online registry of user-submitted rules for Ruby that is also handy (especially as examples for writing your own rules), but its current rule set for Ruby is not quite as extensive as Brakeman’s. In Brakeman the rules are known as “checks”; for consistency we’ll use the term “rules” for both tools, but keep that fact in mind.
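
For context, running Brakeman’s built-in rule set against a Rails project is a one-liner (the path and report name here are placeholders):

$ brakeman /path/to/rails-app -o brakeman-report.html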

First custom rule: Verification of authenticated functionality

I chose a simple pattern for the first rule I wanted to make, mainly to familiarize myself with the process of creating rules in both Semgrep and Brakeman. The application had controllers that handle non-API routes. These controllers enforced authentication by adding a before filter: before_action :login_required. Often in Rails projects, this line is included in a base controller class, then skipped when authentication isn’t required using skip_before_filter. This was not the case in the webapp I was looking at — the before filter was manually set in every single controller that needed authentication, which seemed error-prone as an overall architectural pattern, but alas it is the current state of the code base.
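
To make that pattern concrete, here is a minimal sketch of the shapes the rule needs to distinguish (class and action names are hypothetical):

class SettingsController < ApplicationController
  before_action :login_required  # authentication enforced per-controller

  def update
    # ... authenticated functionality ...
  end
end

class StatusController < ApplicationController
  # no before_action :login_required -- exactly what we want to flag
  def show
    # ... potentially reachable without authentication ...
  end
end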

I wanted to get a list of any non-API controllers that lack this callback so I can ensure no functionality is unintentionally exposed without authentication. API routes handled authentication in a different way so consideration for them was not a priority for this first rule.

Semgrep

I went to the Semgrep website and found that Semgrep has a nice interactive tutorial, which walks you through building custom rules. I found it to be incredibly simple and powerful — after finishing the tutorial in about 20 minutes I thought I had all the knowledge I needed to make the rules I wanted. Although the site also has an online IDE for quick iteration, I opted to develop locally, as the online IDE would require submitting our client’s code to a 3rd party, which we obviously can’t do for security and liability reasons. The rule would eventually have to be run against the entire codebase anyway.

I encountered a few challenges when writing the rule:

  • It’s a little tricky to find things that do not match a pattern (e.g. lack of a login_required filter). You can’t just search all files for ones that don’t match, you have to have a pattern that it does search for, then exclude the cases matching your negative pattern. I was running into a bug here that made it even harder but the Semgrep team fixed it when we gave them a heads up!
  • Matching only classes derived from ApplicationController was difficult because Semgrep doesn’t currently trace base classes recursively, so any that were more than one level removed from ApplicationController would be excluded (for example, if there was a class DerivedController < ApplicationController, it wouldn’t match SecondLevelDerivedController < DerivedController.) The Semgrep team gave me a tip about using a metavariable regex to filter for classes ending in “Controller” which worked for this situation and added no false positives.

My final custom rule for Semgrep follows:

rules:
- id: controller-without-authn
  patterns:
  - pattern: |
      class $CLASS
        ...
      end
  - pattern-not: |
      class $CLASS
        ...
        before_action ..., :login_required, ...
        ...
      end
  - metavariable-regex:
      metavariable: '$CLASS'
      regex: '.*Controller'  
  message: |
    $CLASS does not use the login_required filter.
  severity: WARNING
  languages:
  - ruby

I ran the rule using the following command: semgrep --config=../../../semgrep/ | grep "does not use"

The final grep is necessary because Semgrep will print the matched patterns, which in this case were the entire classes. There’s currently no option in Semgrep to show only a list of matching files without the matched patterns themselves, and that made it difficult to see the list of affected controllers, so I used grep on the output to filter the patterns out. This rule resulted in 47 identified controllers. Creating this rule originally took about two hours, including going through the tutorial and debugging the issues I ran into, but now that those issues are fixed I expect it would take less time in future iterations.

Overall I think the rule is pretty self-explanatory — it finds all files that define a class, then excludes the ones that have a login_required before filter. Check out the Semgrep tutorial lessons if you’re unsure.

Brakeman

Brakeman has a wiki page which describes custom rule building, but it doesn’t have a lot of detail about what functionality is available to you. It gives examples of finding particular method calls and of checking whether user input finds its way into these calls. There’s no example of finding controllers.

The page didn’t cover any more of what I wanted to do, so I headed off to Brakeman’s folder of built-in rules on GitHub to see if there were any examples of something similar. There is a CheckSkipBeforeFilter rule which is very similar to what I want — it checks whether the login_required callback is skipped with skip_before_filter. As mentioned, the app isn’t implemented that way, but it showed me how to iterate over controllers and check before filters.

This got me most of the way there, but I also needed to skip API controllers for this particular app’s architecture. After about an hour of tinkering and looking through the Brakeman controller tracker code, I had the following rule:

require 'brakeman/checks/base_check'

class Brakeman::CheckControllerWithoutAuthn < Brakeman::BaseCheck
  Brakeman::Checks.add self

  @description = "Checks for a controller without authN"

  def run_check
    controllers = tracker.controllers.select do |_name, controller|
      not check_filters controller
    end
    Hash[controllers].each do |name, controller|
      warn :controller => name,
           :warning_type => "No authN",
           :warning_code => :basic_auth_password,
           :message => "No authentication for controller",
           :confidence => :high,
           :file => controller.file
    end
  end

  # Check whether a non-api controller has a :login_required before_filter
  def check_filters controller
    return true if controller.parent.to_s.include? "ApiController"
    controller.before_filters.each do |filter|
      next unless call? filter
      next unless filter.first_arg.value == :login_required
      return true
    end
    return false
  end
end

Running it with brakeman --add-checks-path ../brakeman --enable ControllerWithoutAuthn -t ControllerWithoutAuthn resulted in 43 controllers without authentication — 4 fewer than Semgrep flagged.

Taking a close look at the controllers that Semgrep flagged and Brakeman did not, I realized the app is importing shared functionality from another module, which included a login_required filter. Therefore, Semgrep had 4 false positives that Brakeman did not flag. Since Semgrep works on individual files I don’t believe there’s an easy way to prevent those ones from being flagged.

Second custom rule: Verification of correct and complete authorization across functionality

The next case I wanted assurance on was vertical authorization at the API layer. ApiControllers in the webapp have a method authorization_permissions() which is called at the top of each derived class with a hash table of function_name/permission pairs. This function saves the passed hash table into an instance variable. ApiControllers have a before filter that, when any method is invoked, will look up the permission associated with the called method in the hash table and ensure that the user has the correct permission before proceeding.
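
As a concrete illustration of that convention (the permission and action names here are hypothetical):

class WidgetsApiController < ApiController
  # Hash of action => required permission, saved into an instance
  # variable and consulted by a before filter on every request
  authorization_permissions :index  => :widgets_read,
                            :create => :widgets_admin

  def index; end    # covered by the hash above
  def create; end   # covered by the hash above
  def destroy; end  # absent from the hash; the rules below should flag it
end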

Manual review was required to determine whether any methods had incorrect privileges, as analysis tools can’t understand business logic, but they can find methods entirely lacking authorization control — that was my goal for these rules.

Semgrep

Despite being seemingly a more complex scenario, this was still pretty straightforward in Semgrep:

rules:
- id: method-without-authz
  patterns:
  - pattern: |
      class $CONTROLLER < ApiController
        ...
        def $FUNCTION
          ...
        end
        ...
      end
  - pattern-not: |
      class $CONTROLLER < ApiController
        ...
        authorization_permissions ... :$FUNCTION => ...
        ...
        def $FUNCTION
          ...
        end
        ...
      end
  message: |
    Detected api controller $CONTROLLER which does not check for authorization for the $FUNCTION method
  severity: WARNING
  languages:
  - ruby

It finds all methods on ApiControllers then excludes the ones that have some authorization applied. Semgrep found seven controllers with missing authorization checks.

Brakeman

I struggled to make this one in Brakeman at first, even thinking it might not be possible. The Brakeman team kindly guided me towards Collection#options, which contains all method calls invoked at the class level, excluding some common ones like before_filter. The following rule grabs all ApiControllers, looks through their options for the call to authorization_permissions, then checks whether each controller method has an entry in the authorization_permissions hash.

require 'brakeman/checks/base_check'

class Brakeman::CheckApicontrollerWithoutAuthz < Brakeman::BaseCheck
  Brakeman::Checks.add self

  @description = "Checks for an ApiController without authZ"

  def run_check
    # Find all api controllers
    api_controllers = tracker.controllers.select do |_name, controller|
      is_apicontroller controller
    end

    # Go through all methods on all ApiControllers
    # and find if they have any methods that are not in the authorization matrix
    Hash[api_controllers].each do |name, controller|
      perms = controller.options[:authorization_permissions].first.first_arg.to_s

      controller.each_method do |method_name, info|
        if not perms.include? ":#{method_name})"
          warn :controller => name,
               :warning_type => "No authZ",
               :warning_code => :basic_auth_password,
               :message => "No authorization check for #{name}##{method_name}",
               :confidence => :high,
               :file => controller.file
        end
      end
    end
  end

  def is_apicontroller controller
    # Only care about controllers derived from ApiController
    return controller.parent.to_s.include? "ApiController"
  end
end

Using this rule Brakeman found the same seven controllers with missing authorization as Semgrep.

Conclusion

So who is the winner of this showdown? For Ruby, both tools are valuable, and there is no definitive winner in our comparison when we’re specifically talking about custom rules. Currently I think Semgrep edges out Brakeman a bit for writing quick-and-dirty custom checks on assessments, as it’s faster to get going, but it did have slightly more false positives in our limited comparison testing.

Semgrep rules are fairly intuitive to write and self-explanatory; Brakeman requires additional understanding gained by looking into its source code to grasp its architecture, plus the need to use existing rules as a guide. After creating a few Brakeman rules it gets a lot easier, but the initial learning curve was a bit higher than with other SAST tools. However, Brakeman has some sophisticated features that Semgrep does not, especially the user-input tracing functionality, that weren’t really shown in these examples. If some dangerous function is identified and you need to see whether any user input gets to it (source/sink flow), that is a great Brakeman use case. Also, Brakeman’s default rule set is great, and I use it on every Rails test I do.

Ultimately, Semgrep and Brakeman are both great tools with quirks and particular use cases, and both deserve to be in your arsenal of SAST tooling. Enormous thanks to both Clint from the Semgrep team and Justin, the creator of Brakeman, for providing feedback on this post!

