Parallel traces preprocessing

Description

This script preprocesses power traces by combining pairs of samples.
The candidate pairs of samples are taken inside a window that slides over the trace.
The operation used to combine the two samples of a pair can be chosen.
Thanks to the Python multiprocessing package, the trace is split into blocks that are processed in parallel.

Combining pairs of samples makes it possible to mount a first-order CPA against a first-order masked implementation, which would otherwise require a second-order CPA.
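
As an illustration, here is a minimal sketch of this preprocessing, assuming a single 1-D trace stored as a NumPy array. The names (combine_pairs, preprocess_trace) are hypothetical and belong to this sketch, not to the repository; the actual script may split the trace and handle edge cases differently.

import numpy as np
from itertools import combinations
from functools import partial
from multiprocessing import Pool

def combine_pairs(block, window_size, min_dist, op, dtype):
    # Combine every pair of samples whose indices differ by at least min_dist,
    # inside a window sliding over the block.
    combined = []
    for start in range(len(block) - window_size + 1):
        window = block[start:start + window_size]
        combined.extend(op(window[i], window[j])
                        for i, j in combinations(range(window_size), 2)
                        if j - i >= min_dist)
    return np.asarray(combined, dtype=dtype)

def preprocess_trace(trace, window_size=5, min_dist=1, op=np.multiply,
                     dtype=np.float64, ncores=4):
    # Split the trace into blocks that overlap by window_size - 1 samples,
    # so that every window position lies entirely inside exactly one block,
    # then process the blocks in parallel.
    overlap = window_size - 1
    step = max(window_size, -(-len(trace) // ncores))  # ceil(len(trace) / ncores)
    blocks = [trace[s:s + step + overlap]
              for s in range(0, len(trace) - overlap, step)]
    worker = partial(combine_pairs, window_size=window_size,
                     min_dist=min_dist, op=op, dtype=dtype)
    with Pool(ncores) as pool:
        return np.concatenate(pool.map(worker, blocks))

if __name__ == "__main__":
    trace = np.random.rand(1000)        # dummy trace for demonstration
    print(preprocess_trace(trace).shape)

The overlap of window_size - 1 samples ensures that every window position is processed exactly once, with none cut at a block boundary.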

Install

# Download sources
git clone git@gitlab.emse.fr:brice.colombier/traces-preprocessing.git
cd traces-preprocessing

# Install dependencies:
# On Windows
pip install scikit-image
# On Ubuntu
sudo apt-get install python-skimage

Usage

The script is typically used in the following manner:

python pairwise_operation.py masked_traces.npy --op=multiplication --window_size=5 --min_dist=1 --dtype=float64 --ncores=4

The positional argument is the file in which the traces are stored, in NumPy .npy format.
Options are detailed below.
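
The expected array layout inside the .npy file is not specified here; a common convention, assumed below, is a 2-D array with one row per trace and one column per sample.

import numpy as np

# Dummy traces: the 2-D (traces x samples) layout and the sizes are assumptions;
# check how the script actually loads the file.
traces = np.random.randint(0, 256, size=(1000, 5000)).astype(np.float64)
np.save("masked_traces.npy", traces)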

Options

  • --op: operation used to combine the two samples of a pair. Must be one of {'addition', 'multiplication', 'squared_addition', 'absolute_difference'}. According to the DPA book, the absolute difference is a good choice for second-order CPA attacks on implementations that leak the Hamming weight (see the sketch after this list).
  • --window_size: width of the sliding window
  • --min_dist: minimum distance between two samples in a pair
  • --dtype: numpy data type for the samples of the processed trace
  • --ncores: number of cores to use for the parallel computation
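
For reference, here is a sketch of how the four --op values could map onto operations on a pair of samples (a, b). The exact definitions, in particular whether squared_addition means (a + b)**2, are assumptions and not the script's actual code.

import numpy as np

# Hypothetical mapping from --op names to pairwise operations.
OPS = {
    'addition':            lambda a, b: a + b,
    'multiplication':      lambda a, b: a * b,
    'squared_addition':    lambda a, b: (a + b) ** 2,
    'absolute_difference': lambda a, b: np.abs(a - b),
}

a, b = 3.0, 5.0
for name, op in OPS.items():
    print(name, op(a, b))   # addition 8.0, multiplication 15.0, squared_addition 64.0, absolute_difference 2.0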