RSS Security

Today, 5 August 2021 | KitPloit - PenTest & Hacking Tools

Elpscrk - An Intelligent Common User-Password Profiler Based On Permutations And Statistics

5 August 2021 at 12:30
By: Zion3R

An intelligent common user-password profiler, named after the tool of the same name in the Mr. Robot series (S01E01)

In simple words, elpscrk will ask you for all the info you know about your target and will then try to generate every possible password the target could think of. It all depends on the information you give, the flags you activate, and the complication level you specify.

There are 6 complication levels in elpscrk, one for each type of target out there, starting from the simple person (the default), through the nerd and paranoid person, up to the final boss: the nuclear level, which can generate 1,000,000 passwords or more.

Elpscrk is like cupp, but it is based on permutations and statistics while being memory efficient. So you get more results in nearly no time, complication levels for each type of user, and very customizable results, as you will see in the usage part.


It's simple: you just run the script, it will prompt you for the info you know about the target, and then it will build permutation lists of the common mixes of that data, as you will see next.

For more advanced usages and customizations, there are some things you need to pay attention to:

  1. Complication levels

This controls how complex you want the generated passwords to be. So, for example, given names:

- At level 0, which is the default, each name you give will be converted to (UPPERCASE, lowercase) forms, the first letter will be used in (UPPERCASE, lowercase) forms, and the first two letters will be used in (UPPERCASE, lowercase, capitalized) forms.
- At level 1, you will see everything from level 0, plus each name reversed and the first two letters of each name reversed.
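Taken literally, the level-0 name handling described above can be sketched as follows (an interpretation of the description, not elpscrk's actual code):

```python
def name_forms_level0(name):
    # Whole name in UPPERCASE / lowercase, the first letter in both cases,
    # and the first two letters in UPPERCASE / lowercase / capitalized form.
    forms = {name.upper(), name.lower()}
    forms |= {name[0].upper(), name[0].lower()}
    two = name[:2]
    forms |= {two.upper(), two.lower(), two.capitalize()}
    return sorted(forms)

print(name_forms_level0("karim"))
```

These fragments are then combined with other data (dates, numbers, ...) in the mixing stage.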

Here's a table explaining the whole complication levels:

Note: in the General idea column, everything mentioned is used in many (common/uncommon) mixtures, not alone.

Level General idea
0 - Simple person: This is the default level. Expect things like the name permutations explained above; dates are split into groups of days, months and years, plus the last two digits of the year (so 1990 also gives 90); phone numbers are converted to national format, with a list of the first/last four digits, as commonly used; and so on.
1 - Average person: Adds some interesting things on top of level 0. For names: each name reversed, and the first two letters of each name reversed. For dates: a 0 is prepended to days and months lower than 10 (so 5 becomes both 05 and 5), and the last 3 digits of the year are used as well (1990 gives 990).
2 - Cyber awareness: By default, old passwords you give are used as given, with any special characters removed. Starting from level 2, each given old password is also converted to (UPPERCASE, lowercase, capitalized, reversed) forms.
3 - Paranoid person: By default, if you use the --chars flag, elpscrk uses the 10 most common characters at levels 0 and 1. Starting from level 3, it uses the whole set of special characters allowed in passwords (see references).
4 - Nerd person: At level 3 and lower, elpscrk uses ordered pairs to keep permutations in order and cut a lot of uncommon password forms from the mixes. Level 4 uses the same common mixtures but without pairs, so instead of only getting passwords of (names & dates), you get mixes of (names & dates, names & names, dates & dates, ...), and so on.
5 - Nuclear!: Here is where shit hits the fan: elpscrk starts using uncommon and not very realistic results for the most complicated, inhuman targets.

Note: before all the permutations at these levels start, elpscrk generates some of the most-used password forms using simple addition, like cupp does, just to make sure they appear in the results.

  2. Leet flag

When you enable the leet flag, elpscrk works as normal; after finishing and exporting the results, it then generates all leet permutations of all passwords and saves them into a new file.

So, for example, a name like karim will result in ['k4r1m', 'k4rim', 'kar1m', 'karim']
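The leet pass can be pictured as substituting each leet-able letter with either itself or its digit and taking the cartesian product (a sketch under an assumed substitution table; elpscrk's real table may differ):

```python
from itertools import product

# assumed substitution table, not necessarily elpscrk's
LEET = {"a": "4", "e": "3", "i": "1", "o": "0", "s": "5"}

def leet_variants(word):
    # each leet-able letter contributes two choices: itself or its digit
    choices = [(ch, LEET[ch]) if ch in LEET else (ch,) for ch in word]
    return sorted("".join(combo) for combo in product(*choices))

print(leet_variants("karim"))  # ['k4r1m', 'k4rim', 'kar1m', 'karim']
```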

  3. Years and numbers ranges

Using the -y/--years option and giving it a year like 1980 will make elpscrk mix all passwords with all the years from 1980 to 2022 (the year we are currently in + 1), so you can expect passwords like these:


and so on. The same applies to the -r/--num-range option: giving it a number like 100 will add all numbers from 0 to 100, so expect passwords like karim99, 99karim, karim100...
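The year mixing can be sketched like this (assumed behaviour based on the description above, not elpscrk's actual code):

```python
def mix_with_years(words, start_year, current_year=2021):
    # Append and prepend every year from start_year to current_year + 1.
    out = []
    for w in words:
        for y in range(start_year, current_year + 2):
            out.append(f"{w}{y}")
            out.append(f"{y}{w}")
    return out

sample = mix_with_years(["karim"], 1980)
print(sample[:4])  # ['karim1980', '1980karim', 'karim1981', '1981karim']
```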

  • It should run on any OS but needs python 3.6 and above.
  • Clone the repo and, in its directory, install the requirements like this: pip install -r requirements.txt, or python3 -m pip install -r requirements.txt if you have more than one Python version installed.
  • You are ready to go.


  • Elpscrk was created to raise cyber awareness about the importance of strong, unpredictable passwords; its author is not responsible for misuse or illegal purposes.
  • It may be used only for legal penetration tests, educational purposes, etc.!
  • Copying code from this tool or using it in another tool is accepted as long as you mention the source, and pull requests are always welcome, of course.


Yesterday, 4 August 2021 | KitPloit - PenTest & Hacking Tools

Uchihash - A Small Utility To Deal With Malware Embedded Hashes

4 August 2021 at 21:30
By: Zion3R

Uchihash is a small utility that can save malware analysts the time of dealing with embedded hash values used for various things such as:

  • Dynamically importing APIs (especially in shellcode)
  • Checking running process used by analysts (Anti-Analysis)
  • Checking VM or Antivirus artifacts (Anti-Analysis)

Uchihash can generate hashes with your own custom hashing algorithm, search for a list of hashes in an already-generated hashmap, and generate an IDAPython script to annotate the hashes with their corresponding values for easier analysis.
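For a built-in algorithm such as crc32, the hashmap generation step can be sketched in a few lines (a simplified illustration of the idea, not Uchihash's exact code):

```python
import json
import zlib

def build_hashmap(words):
    # Map each word's CRC32 (as a hex string) to the word itself,
    # mirroring the hash -> value layout that --search expects.
    return {hex(zlib.crc32(w.encode())): w for w in words}

hashmap = build_hashmap(["LoadLibraryW", "VirtualAlloc", "VirtualProtect"])
print(json.dumps(hashmap, indent=2))
```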

$ git clone
$ pip install -r requirements.txt

usage: [-h] [--algo ALGO] [--apis] [--keywords] [--list LIST] [--script SCRIPT] [--search SEARCH] [--hashes HASHES] [--ida]

optional arguments:
-h, --help show this help message and exit
--algo ALGO Hashing algorithm
--apis Calculate hashes of APIs
--keywords Calculate hashes of keywords
--list LIST Calculate hashes of your own word list
--script SCRIPT Script file containing your custom hashing algorithm
--search SEARCH Search a JSON File containing hashes mapped to words
--hashes HASHES File containing list of hashes to search for
--ida Generate an IDAPython script to annotate hash values

* python --algo crc32 --apis
* python --algo murmur3 --list mywords.txt
* python --search hashmap.txt --hashes myhashes.txt


See the examples folder for more clarification.

Available Hashing Algorithms
  • md4
  • md5
  • sha1
  • sha224
  • sha256
  • sha384
  • sha512
  • ripemd160
  • whirlpool
  • crc8
  • crc16
  • crc32
  • crc64
  • djb2
  • sdbm
  • loselose
  • fnv1_32
  • fnv1a_32
  • fnv1_64
  • fnv1a_64
  • murmur3


Let's take an example with a real malware family. In this case we have BuerLoader, which uses hash values to dynamically import APIs, and it uses a custom hashing algorithm.

First we need to implement the hashing algorithm in python:

def ROR4(val, bits, bit_size=32):
    return ((val & (2 ** bit_size - 1)) >> bits % bit_size) | \
           (val << (bit_size - (bits % bit_size)) & (2 ** bit_size - 1))

def hashme(s):
    res = 0
    for c in map(ord, s):
        v3 = ROR4(res, 13)
        v4 = c - 32      # uppercase the character if it is lowercase
        if c < 97:
            v4 = c       # non-lowercase characters are used as-is
        res = v4 + v3
    return hex(res)

Then we calculate the hashes of all APIs:

$ python --script --apis

Finally, we search for the hash values that BuerLoader uses in the generated hashmap. We can also generate an IDAPython script to annotate those hash values with their corresponding API names:

$ python --search output/hashmap.txt --hashes buer_hashes.txt --ida

We should get 2 output files. One is "output/search_hashmap.txt", which maps BuerLoader's hash values to API names:

"0x8a8b468c": "LoadLibraryW",
"0x302ebe1c": "VirtualAlloc",
"0x1803b7e3": "VirtualProtect",
"0xe183277b": "VirtualFree",
"0x24e2968d": "GetComputerNameW",
"0xab489125": "GetNativeSystemInfo",

The other file is "output/" which will add the comments to your idb:
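Conceptually, generating such an annotation script boils down to emitting one IDAPython comment call per resolved hash. The sketch below is purely illustrative (hypothetical addresses, and not Uchihash's real output format):

```python
def make_ida_annotation_script(resolved):
    # `resolved` maps an address (int) to the API name whose hash was
    # found there; emit an idc.set_cmt call per entry.
    lines = ["import idc"]
    for addr, name in sorted(resolved.items()):
        lines.append(f"idc.set_cmt({hex(addr)}, {name!r}, 0)")
    return "\n".join(lines)

print(make_ida_annotation_script({0x401000: "LoadLibraryW", 0x401010: "VirtualAlloc"}))
```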

SharpLAPS - Retrieve LAPS Password From LDAP

4 August 2021 at 12:30
By: Zion3R

The attribute ms-mcs-AdmPwd stores the clear-text LAPS password.

This executable is made to be executed within a Cobalt Strike session using execute-assembly. It will retrieve the LAPS password from Active Directory.

Require (either):

  • Account with ExtendedRight or Generic All Rights
  • Domain Admin privilege
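SharpLAPS itself is C#, but the LDAP query it performs is easy to picture; the helper below only builds the filter string for computer objects with a readable ms-Mcs-AdmPwd attribute (an illustrative sketch, not SharpLAPS code):

```python
def laps_search_filter(computer_name=None):
    # Only principals with the rights listed above actually get the
    # ms-Mcs-AdmPwd value back from the DC.
    flt = "(&(objectCategory=computer)(ms-Mcs-AdmPwd=*)"
    if computer_name:
        flt += f"(sAMAccountName={computer_name}$)"
    return flt + ")"

print(laps_search_filter())        # (&(objectCategory=computer)(ms-Mcs-AdmPwd=*))
print(laps_search_filter("WS01"))
```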

  _____ __                     __    ___    ____  _____
/ ___// /_ ____ __________ / / / | / __ \/ ___/
\__ \/ __ \/ __ `/ ___/ __ \/ / / /| | / /_/ /\__ \
___/ / / / / /_/ / / / /_/ / /___/ ___ |/ ____/___/ /
/____/_/ /_/\__,_/_/ / .___/_____/_/ |_/_/ /____/
/host:<> LDAP host to target, most likely the DC

/user:<username> Username of the account
/pass:<password> Password of the account
/out:<file> Outputting credentials to file
/ssl Enable SSL (LDAPS://)

Usage: SharpLAPS.exe /user:DOMAIN\User /pass:[email protected]! /host:

Before yesterday | KitPloit - PenTest & Hacking Tools

Doldrums - A Flutter/Dart Reverse Engineering Tool

3 August 2021 at 21:30
By: Zion3R

To flutter: to move in quick, irregular motions, to beat rapidly, to be agitated.
Doldrums: a period of stagnation.

Doldrums is a reverse engineering tool for Flutter apps targeting Android. Concretely, it is a parser and information extractor for the Flutter/Dart Android binary, conventionally named, for all Dart version 2.10 releases. When run, it outputs a full dump of all classes present in the isolate snapshot.

The tool is currently in beta, and missing some deserialization routines and class information. If it does not work out-of-the-box, please let me know.


Doldrums requires pyelftools to parse the ELF format. You can install it with

pip3 install pyelftools


To use, simply run the following command, substituting for the appropriate binary, and output for the desired output file. Note that the verbose option only works for Dart snapshot v2.12.

python3 src/ [-v] output

The expected output is a dump of all classes, in the following format:

class MyApp extends StatelessWidget {
Widget build(DynamicType, DynamicType) {
Code at absolute offset: 0xec85c

String myPrint(DynamicType, DynamicType) {
Code at absolute offset: 0xeca80

The absolute code offset indicates the offset into the file where the native function may be found.

Reading material

For a detailed write-up on the format, please check my blog post.

Related works

darter is a fully implemented and fully tested parser for Dart version 2.5 releases.


If you'd like to help the project, consider making a pull request, or donating to:

  • ADA: DdzFFzCqrhsgHAVMtnep9Uq9iF61oxZ31LWVG3izmT8BH54Jz7C2gUBFcy6VnCkrbVNqrkevQ4wSwK7dfh7YrUfvSd5toKdE9tzZrcaB
  • BTC: 33piC5kfTdqFyQ5ionmuJkTDJXsFYdzGdS
  • ETH: 0x2bF670503C28B551C80191aeE9F7ACC96e101D9B

Logo by Luis Fonseca.

Rz-Ghidra - Deep Ghidra Decompiler And Sleigh Disassembler Integration For Rizin

3 August 2021 at 12:30
By: Zion3R

This is an integration of the Ghidra decompiler and Sleigh Disassembler for rizin. It is solely based on the decompiler part of Ghidra, which is written entirely in C++, so Ghidra itself is not required at all and the plugin can be built self-contained. This project was presented, initially for radare2, at r2con 2019 as part of the Cutter talk:


An rz-pm package is available that can easily be installed like:

rz-pm -i rz-ghidra

This package only installs the rizin part. To use rz-ghidra from Cutter, either use a provided pre-built release (starting with Cutter 1.9, which bundles rz-ghidra) or follow the build instructions below.

Usage: pdg   # Native Ghidra decompiler plugin
| pdg # Decompile current function with the Ghidra decompiler
| pdgd # Dump the debug XML Dump
| pdgx # Dump the XML of the current decompiled function
| pdgj # Dump the current decompiled function as JSON
| pdgo # Decompile current function side by side with offsets
| pdgs # Display loaded Sleigh Languages
| pdg* # Decompiled code is returned to rizin as comment

The following config vars (for the e command) can be used to adjust rz-ghidra's behavior:

    ghidra.cmt.cpp: C++ comment style
    ghidra.cmt.indent: Comment indent
    ghidra.indent: Indent increment
    ghidra.lang: Custom Sleigh ID to override auto-detection (e.g. x86:LE:32:default)
    ghidra.linelen: Max line length
    ghidra.nl.brace: Newline before opening '{'
    ghidra.nl.else: Newline before else
    ghidra.sleighhome: SLEIGHHOME

Here, ghidra.sleighhome must point to a directory containing the *.sla, *.lspec, ... files for the architectures that should be supported by the decompiler. This is, however, set up automatically when using the rz-pm package or installing as shown below.


First, make sure the submodule contained within this repository is fetched and up to date:

git submodule init
git submodule update

Then, the rizin plugin can be built and installed as follows:

mkdir build && cd build
cmake -DCMAKE_INSTALL_PREFIX=~/.local ..
make install

Here, set the CMAKE_INSTALL_PREFIX to a location where rizin can load the plugin from. The install step is necessary for the plugin to work because it includes installing the necessary Sleigh files.

To also build the Cutter plugin, pass -DBUILD_CUTTER_PLUGIN=ON -DCUTTER_SOURCE_DIR=/path/to/cutter/source to cmake, for example like this:

/my/path> git clone
/my/path> # build Cutter, clone rz-ghidra, etc.
/my/path/rz-ghidra> mkdir build && cd build
/my/path/rz-ghidra/build> cmake -DBUILD_CUTTER_PLUGIN=ON -DCUTTER_SOURCE_DIR=/my/path/cutter -DCMAKE_INSTALL_PREFIX=~/.local ..
/my/path/rz-ghidra/build> make && make install

Versioning and Rizin Compatibility

Rizin has a quickly evolving C API, so it is necessary to be explicit about which versions of rz-ghidra are compatible with which versions of Rizin:

When using Rizin and rz-ghidra from git:

  • rz-ghidra branch dev follows along Rizin branch dev.
  • rz-ghidra branch stable follows along Rizin branch stable.

Regarding releases, rz-ghidra is generally released simultaneously with Rizin and often uses the same version numbers (but not guaranteed, do not depend on these numbers!). Also, along with every Rizin release a tag like rz-0.1.2 is created on rz-ghidra, which exactly points to an rz-ghidra release and indicates that this release is compatible with the specified Rizin version. These tags can be used by distribution maintainers to look up how to set up dependencies.

Domhttpx - A Google Search Engine Dorker With HTTP Toolkit Built With Python, Can Make It Easier For You To Find Many URLs/IPs At Once With Fast Time

2 August 2021 at 21:30
By: Zion3R

domhttpx is a Google search engine dorker with an HTTP toolkit, built with Python, that makes it easier for you to find many URLs/IPs at once in a short time.



This will display help for the tool. Here are all the switches it supports.

Flag Description Example
-ip, --only-ip Show output as IP only domhttpx --only-ip
-od, --only-domain Show output as domain only domhttpx --only-domain
-rp, --real-path Extract real path domhttpx -k [keyword] -a [amount] --real-path
-p, --path Custom path url domhttpx -k [keyword] -a [amount] --path [custom_path]
-sc, --status-code Extract status code domhttpx -k [keyword] -a [amount] --status-code
-t, --title Extract title page domhttpx -k [keyword] -a [amount] --title
-ws, --web-server Extract web server domhttpx -k [keyword] -a [amount] --server
-cr, --check-result Check list result domhttpx --check-result
-sr, --show-result Show result content domhttpx --show-result result.txt
-rr, --remove-result Remove result file domhttpx --remove-result result.txt
-o, --output File to write output domhttpx -k [keyword] -a [amount] -o output.txt
-s, --silent Show only subdomains in output domhttpx -k [keyword] -a [amount] --silent
-v, --version Show current program version domhttpx --version

Basic Usage
> --keyword [keyword] --amount [amount]

One keyword
> --keyword pentesting --amount 5

Multiple keyword
> --keyword "pentesting basic" --amount 5

Extract Title Page
> --keyword "pentesting basic" --amount 5 --title

Extract Title Page from Real Path
> --keyword "pentesting basic" --amount 5 --title --real-path

Extract Web Server
> --keyword "pentesting basic" --amount 5 --web-server

Running Example

Running domHttpx with default command

This will run an automatic search tool with the specified keyword and number

> --keyword indonesia --amount 20

_ _ _ _ _
__| |___ _ __ | || | |_| |_ _ ____ __
/ _` / _ \ ' \| __ | _| _| '_ \ \ /
\__,_\___/_|_|_|_||_|\__|\__| .__/_\_\
|_| v1.0.0

[INFO] Searching domain for indonesia keyword
[INFO] Found 20 domain

Show output as IP
> --keyword indonesia --amount 9 --only-ip


[INFO] Searching IP for indonesia keyword
[INFO] Found 9 IP

Extracts the real path
> --keyword indonesia --amount 9 --real-path


[INFO] Searching domain for indonesia keyword
[INFO] Found 9 domain

Extracts status code
> --keyword "Indonesia Basketball League" --amount 10 --status-code

[200] [200] [200] [200] [200] [200] [200] [200] [200] [200]

[INFO] Searching domain for Indonesia Basketball League keyword
[INFO] Found 10 domain

Extracts title page
> --keyword "Ananta Dandy" --amount 10 --real-path --title

[Campers - Ananta Dandy]
[Ananta Dandy Profile | DBL ID]
[Page Not Found • Instagram]
[Ananta Dandy]
[Ananta Dandy - Rakan - Rafie - Saddam & Zee Bikin Komunitas #Basket Komplek | Isinya Jagoan Semua ! - YouTube]
[Next in Line #12: Ananta Dandy Tentang Bermain melawan Filipin & Motivasi Untuk Juara DBL. - YouTube]
[ANANTA DANDY DAN MUHAMAD HAFIZH | DYNAMIC DUO DARI SMAN 71 JAKARTA - YouTube]
[Ananta Dandy Putra Tarigan's profile | 2017 SEABA U16 Championship for Men | ARCHIVE.FIBA.COM]
[Data Sementara Alumni Siswa SMAN 71 Tahun 2020 yang Diterima di PTN - SMAN 71]
[Muhamad Hafizh: Gua Ingin Jadi Pemain Indonesia Pertama di NBA -]

[INFO] Searching domain for Ananta Dandy keyword
[INFO] Found 10 domain

Help & Bugs

If you are still confused or have found a bug, please open an issue. All bug reports are appreciated and will be responded to as soon as possible. Thanks!


PowerShellArmoury - A PowerShell Armoury For Security Guys And Girls

2 August 2021 at 12:30
By: Zion3R

The PowerShell Armoury is meant for pentesters, "insert-color-here"-teamers and everyone else who uses a variety of PowerShell tools during their engagements. It allows you to download and store all of your favourite PowerShell scripts in a single, encrypted file.

You do not have to hassle with updating Rubeus, PowerView, ... manually. Just create a configuration file once, or use the default one included with the tool. From then on, you just run "New-PSArmoury" before you head to the next engagement. In addition, your new and shiny armoury is encrypted and includes a bypass for AMSI, so you don't have to worry about AV.

Note: you have to provide a valid GitHub account as well as a personal access token so the script can properly use the GitHub API. Do not use username/password, since that will not work anyway if you have MFA enabled (and you should enable MFA); accessing the API with basic username/password is also deprecated. Follow this guide to create a personal access token.

Config reference

The config file needs to be valid JSON consisting of a single array with one or more objects, where every object is interpreted as a single script source. Every object has the following attributes:

Name (Mandatory)

A name of your choice to identify the script included in this object. This is just meant as a reference for yourself.

URL (Mandatory)

The location to get the script content from. This can be a URL to a web resource (https://), a local path (C:) or a network resource (\\...). The URL is passed to Net.WebClient or PowerShell's Get-Item respectively, so basically every format that one of those two can handle by default should work.

Type (Mandatory)

This gives a hint about the script location to the armoury creator. There are three valid types:

  • GitHub
    • Will prompt for credentials so we can authenticate against the GitHub API. It will also try to distinguish between a "raw" URL that points directly to a file and a URL that points to a repository. If the URL points to a repository, the script will automatically search for all PowerShell files in that repository and include them. Like ""
  • WebDownloadSimple
  • LocalFile
    • A file on disk like "C:\temp\test.ps1". If the path points to a directory, all files (recursive) with the extension ".ps1" will be included.

FileInclusionFilter (Optional)

Will only be interpreted in an object of type "GitHub". It is matched with PowerShell's "-like" comparison operator against the whole filename, so keep in mind that you need to include the wildcards yourself. Don't forget to include a star (*) if you want to match part of a filename: "*.ps1" matches all files that end with ".ps1", but ".ps1" matches only the literal name ".ps1".

You don't have to include a filter, but if you do, it will be honoured strictly: an empty InclusionFilter means no files.

FileExclusionFilter (Optional)

Like the InclusionFilter but obviously the other way round. Exclusion takes precedence.
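Putting the attributes together, a minimal config might look like this (the names, URLs and filters below are placeholders for illustration, not the shipped sample):

```json
[
    {
        "Name": "PowerView",
        "URL": "https://github.com/PowerShellMafia/PowerSploit",
        "Type": "GitHub",
        "FileInclusionFilter": "*PowerView.ps1"
    },
    {
        "Name": "LocalTools",
        "URL": "C:\\temp\\scripts",
        "Type": "LocalFile"
    }
]
```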


See inline Powershell help (man -full New-PSArmoury) for more details.


The path to your new armoury file. The default is ".\MyArmoury.ps1".


Load your PowerShell scripts directly from a local folder or file; this way you don't have to provide a config file.


The path to your JSON-config file. Have a look at the sample that comes with this script for ideas.


The password that will be used to encrypt your armoury. If you do not provide a password, the script will generate a random one.

Please note: the main goal of encryption in this script is to circumvent anti-virus. If confidentiality is important to you, use the "-OmitPassword" switch. Otherwise your password and salt will be stored in your armoury in PLAINTEXT!


The salt that will be used together with your password to generate an AES encryption key. If you do not provide a salt, the script will generate a random one.

Please note: the main goal of encryption in this script is to circumvent anti-virus. If confidentiality is important to you, use the "-OmitPassword" switch. Otherwise your password and salt will be stored in your armoury in PLAINTEXT!


This switch will remove the plaintext password from the final armoury script. Use this if confidentiality is important to you.


Use this together with "-Config" to let the script validate the basic syntax of your JSON config file without executing it.


Encrypts with 3DES instead of AES.


Instructs your armoury to require a protected PowerShell process. On first execution, your armoury will not load but will instead spawn a new PowerShell set to run with the BLOCK_NON_MICROSOFT_BINARIES_ALWAYS_ON process mitigation. This prevents non-Microsoft DLLs (e.g. AV/EDR products) from loading into PowerShell. Shamelessly copied from the great @_rastamouse:

Example usage

You can find a very brief introduction below. Also have a look a these two blog posts here and here.

Use the following commands to create an armoury with all default settings. You can start with the sample config file in this repository for inspiration.

. .\New-PSArmoury.ps1
New-PSArmoury -Config .\PSArmoury.json

This will create an encrypted .ps1 file called "MyArmoury.ps1" in the current working directory. Password and salt for encryption are randomly generated and included in cleartext in the file. (note that we use encryption only to prevent detection on disk and not for confidentiality)

You can load the armoury into your current session by using

cat -raw .\MyArmoury.ps1 | iex

Loading your armoury invokes the following steps:

  • Load all encrypted powershell functions into the current session as part of an array
  • Disable AMSI
  • Disable console history (can help prevent detection)
  • Decrypt everything and pipe into iex

After that, all powershell code you put in the armoury will be available. Just invoke the cmdlets as usual like this

Invoke-Rubeus -Command "kerberoast /stats"
Get-DomainGroupMember -Identity "Domain Admins" -Recurse

If it happens that you don't remember what you put inside the armoury, just load it and call the inventory :-)


tsharkVM - Tshark + ELK Analytics Virtual Machine

1 August 2021 at 21:30
By: Zion3R

This project builds a virtual machine which can be used for analytics of tshark -T ek (ndjson) output. The virtual appliance is built using Vagrant, which provisions Debian 10 with a pre-installed and pre-configured ELK stack.

After the VM is up, the process is simple:

  • decoded pcaps (tshark -T ek output / ndjson) are sent over TCP/17570 to the VM
  • ELK stack in VM will process and index the data
  • Kibana is running in VM and can be accessed on
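Shipping the decoded ndjson to the VM does not have to happen via a bash redirect; a small Python helper does the same thing (the port matches the listener above, the rest is an illustrative sketch):

```python
import socket

def send_ndjson(lines, host="localhost", port=17570):
    # Stream tshark -T ek output to the VM's TCP listener,
    # one JSON document per line, as the ELK pipeline expects.
    with socket.create_connection((host, port)) as s:
        for line in lines:
            s.sendall(line.rstrip("\n").encode("utf-8") + b"\n")
```

For example, `send_ndjson(open("trace.ek.json"))` after decoding a pcap with `tshark -r trace.pcapng -x -T ek > trace.ek.json` (file name assumed).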

Instructions to build the VM from an Ubuntu desktop

Clone source code
git clone

Build tshark VM
sudo apt update
sudo apt install tshark virtualbox vagrant
bash ./

Upload pcaps to VM
# copy your pcaps into ./Trace
# run following script

# or use tshark directly towards 17570/tcp
tshark -r trace.pcapng -x -T ek > /dev/tcp/localhost/17570

Open Kibana with browser

Open the Main Dashboard and increase the time window to, e.g., the last 100 years to see the sample pcaps.



cd ./VM
vagrant ssh

Delete VM
cd ./VM
vagrant destroy default

Start VM
cd ./VM
vagrant up

Stop VM
cd ./VM
vagrant halt

SSH into VM and check if ELK is running correctly
cd ./VM
vagrant ssh
sudo systemctl status kibana.service
sudo systemctl status elasticsearch.service
sudo systemctl status logstash.service

Elasticsearch mapping template

Included in the project is a simple Elasticsearch mapping template generated for the frame, eth, ip, udp, tcp and dhcp protocols. To handle additional protocols efficiently, it may be necessary to update the mapping template as follows:

# 1. Create custom mapping, by selecting required protocols
tshark -G elastic-mapping --elastic-mapping-filter frame,eth,ip,udp,tcp,dns > ./Kibana/custom_tshark_mapping.json

# 2. Deduplicate and post-process the mapping to fit current Elasticsearch version
ruby ./Public/process_tshark_mapping_json.rb

# 3. Upload file to vagrant VM
cd VM
vagrant upload ../Kibana/custom_tshark_mapping_deduplicated.json /home/vagrant/tsharkVM/Kibana/custom_tshark_mapping_deduplicated.json
cd ..

# 4. Connect to VM and upload template in the Elasticsearch
cd VM
vagrant ssh
cd tsharkVM/Kibana
curl -X PUT "localhost:9200/_index_template/packets_template" -H 'Content-Type: application/json' [email protected]_tshark_mapping_deduplicated.json

An alternative is to use dynamic mapping; see the template ./Kibana/template_tshark_mapping_dynamic.json, and consider setting the numeric_detection parameter to true/false depending on the mapping requirements and the pcaps used. Upload the template into Elasticsearch in a similar way as described above.


The tshark -G elastic-mapping --elastic-mapping-filter mapping could be outdated; it does not properly follow Elasticsearch changes, and its output can contain duplicates. Manual configuration and post-processing of the mapping template is therefore required.

Program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY.


The default license of the source code provided in this project is the Apache License v2.0. Additionally, refer to the individual licenses and terms of use of the installed software (see the licenses for Wireshark, Elastic and others).


Special thanks to people who helped with the Wireshark development or otherwise contributed to this work:

Example pcap in ./Traces subfolder was downloaded from

Created by Martin Kacer

Copyright 2021 H21 lab, All right reserved,

CSIRT-Collect - PowerShell Script To Collect Memory And (Triage) Disk Forensics

1 August 2021 at 12:30
By: Zion3R

A PowerShell script to collect memory and (triage) disk forensics for incident response investigations.

The script leverages a network share, from which it will access and copy the required executables and subsequently upload the acquired evidence to the same share post-collection.

Permission requirements for said directory will depend on the nuances of the environment and on which credentials are used for the script execution (interactive vs. automated).

In the demonstration code, a network location of \\Synology\Collections can be seen. This should be changed to reflect the specifics of your environment.

Collections folder needs to include:

  • subdirectory KAPE; copy the directory from existing install
  • subdirectory MEMORY; 7za.exe command line version of 7zip and winpmem.exe

  • Maps to existing network drive -
    • Subdir 1: β€œMemory” – Winpmem and 7zip executables
    • Subdir 2: ”KAPE” – directory (copied from local install)
  • Creates a local directory on asset
  • Copies the Memory exe files to local directory
  • Captures memory with Winpmem
  • When complete, ZIPs the memory image
  • Renames the zip file based on hostname
  • Documents the OS Build Info (no need to determine profile for Volatility)
  • Compressed image is copied to network directory and deleted from host after transfer complete
  • New temp Directory on asset for KAPE output
  • KAPE !SANS_Triage collection is run using VHDX as output format [$hostname.vhdx]
  • VHDX transfers to network
  • Removes the local KAPE directory after completion
  • Writes a β€œProcess complete” text file to network to signal investigators that collection is ready for analysis.


Essentially the same functionality as CSIRT-Collect.ps1, except that it is intended to be run from a USB device. The extra compression operations on the memory image and the KAPE .vhdx have been removed. There is a slight change to the folder structure for the USB version. On the root of the USB:

  • CSIRT-Collect_USB.ps1
  • folder (empty to start) titled 'Collections'
  • folders for KAPE and Memory - same as above

Execution:
  • Open PowerShell as Administrator
  • Navigate to the USB device
  • Execute ./CSIRT-Collect_USB.ps1

Cerbrutus - Network Brute Force Tool, Written In Python

31 July 2021 at 21:30
By: Zion3R

Modular brute force tool written in Python, for very fast password spraying of SSH and FTP, and in the near future other network services.
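The core of threaded spraying is a pool of workers pulling candidate passwords. The generic sketch below stubs out the login attempt; this is not Cerbrutus's actual code, which wraps paramiko for real SSH attempts:

```python
from concurrent.futures import ThreadPoolExecutor

def spray(host, user, passwords, try_login, threads=10):
    # try_login(host, user, password) -> bool; in a real tool this would
    # wrap an SSH/FTP client instead of a stub.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        results = pool.map(lambda p: try_login(host, user, p), passwords)
        for pwd, ok in zip(passwords, results):
            if ok:
                return pwd
    return None

# Stubbed demo: the "service" accepts exactly one password.
demo = lambda host, user, pwd: pwd == "Winter2021!"
print(spray("10.0.0.1", "admin", ["123456", "Winter2021!", "qwerty"], demo))
```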

COMING SOON: SMB, HTTP(S) POST, HTTP(S) GET, HTTP BASIC AUTH. Thanks to @0dayctf, Rondons, Enigma, and 001 for testing and contributing.

cd /opt
git clone

python3 /opt/cerbrutus/ --help
usage: [-h] -U USERS -P PASSWORDS [-p PORT] [-t THREADS] [-q [QUIET [QUIET ...]]] Host Service

Python based network brute forcing tool!

positional arguments:
Host The host to connect to - in IP or VHOST/Domain Name form
Service The service to brute force (currently implemented 'SSH')

optional arguments:
-h, --help show this help message and exit
-U USERS, --users USERS
Either a single user, or the path to the file of users you wish to use
-P PASSWORDS, --passwords PASSWORDS
Either a single password, or the path to the password list you wish to use
-p PORT, --port PORT The port you wish to target (only required if running on a non standard port)
-t THREADS, --threads THREADS
Number of threads to use
-q [QUIET [QUIET ...]], --quiet [QUIET [QUIET ...]]
Do not print banner
/opt/cerbrutus/ SSH -U "username" -P /opt/wordlists/fasttrack.txt -t 10

__ ___ ____ ____ ____ __ __ ______ __ __ _____
/ ] / _]| \ | \ | \| | || || | |/ ___/
/ / / [_ | D )| o )| D ) | || || | ( \_
/ / | _]| / | || /| | ||_| |_|| | |\__ |
/ \_ | [_ | \ | O || \| : | | | | : |/ \ |
\ || || . \| || . \ | | | | |\ |
\____||_____||__|\_||_____||__|\_|\__,_| |__| \__,_| \___|

Network Brute Force Tool
[*] - Initialising password list...
Read in 224 words from /opt/wordlists/fasttrack.txt
[+] - Running with 10 threads...
[*] - Starting attack against [email protected]
[*] - Trying: 65/224

Test Run:

# The password is in line number 12600 in rockyou
64 threads -> 1400 seconds ~ 7 minutes (hydra took 30 minutes)
1000 threads -> 464 seconds -> 27 requests per second
100 threads took 1000 seconds -> 12 requests per second

# The password is in line 460
100 threads took 32 seconds -> 14 requests per second
1000 threads took 16 seconds -> 28 requests per second
64 threads took 51 seconds -> 9 requests per second (hydra took the same time)

# Word number 20k in rockyou
1100 threads took 637 seconds, which means 31 rps
110 threads took 1457 seconds, so that's 13.7 rps

Uses a custom implementation of paramiko to overcome a few minor issues with using it for SSH brute forcing.