Commit b67dffe368f1c313748ad16e2472f400138d340c

Authored by Brice COLOMBIER
1 parent cb669b45e9
Exists in master

Update README

Showing 3 changed files with 137 additions and 24 deletions

... ... @@ -22,7 +22,7 @@
22 22  
23 23 * **npy\_to\_bin**
24 24  
25   - Convert *.npy* file into *.bin*.
  25 + Convert a `.npy` file to `.bin`.
26 26  
27 27 * **pairwise_operation**
28 28  
... ... @@ -43,7 +43,7 @@
43 43  
44 44 * **remove_window**
45 45  
46   - Remove a window from the traces.
  46 + Remove a window from the traces. Can plot the traces before removing.
47 47  
48 48 * **shorten**
49 49  
50 50  
51 51  
52 52  
53 53  
54 54  
55 55  
56 56  
57 57  
58 58  
59 59  
60 60  
61 61  
62 62  
63 63  
64 64  
65 65  
... ... @@ -72,50 +72,147 @@
72 72 sudo apt-get install python-skimage
73 73 ```
74 74  
75   -## Usage
  75 +## Use cases
76 76  
77 77 These scripts take one positional parameter and multiple keyword arguments.
78 78 The positional parameter is the file in which the traces are stored in `numpy` format.
79 79  
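For reference, a compatible traces file can be produced with `numpy.save`. The one-liner below is only an illustration and assumes the traces are stored as a 2-D array of shape `(number of traces, number of samples per trace)`, filled here with random placeholder values:

```bash
python -c "import numpy as np; np.save('traces.npy', np.random.rand(100, 5000))"
```
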
  80 +* **downsample**
80 81  
  82 +Keep only every 4<sup>th</sup> sample starting from sample 10:
  83 +
  84 +```bash
  85 +python downsample.py traces.npy --factor=4 --offset=10
  86 +```
  87 +
  88 +* **filter\_highest\_variance**
  89 +
  90 +Keep only the 1% of samples with the highest variance:
  91 +```bash
  92 +python filter_highest_variance.py traces.npy --ratio=0.01
  93 +```
  94 +
  95 +Keep only the 100 samples with the highest variance:
  96 +
  97 +```bash
  98 +python filter_highest_variance.py traces.npy --nsamples=100
  99 +```
  100 +
81 101 * **pairwise_operation**
82 102  
83   -To perform parallel multiplication of samples on 4 cores using a sliding window of 5 samples and all possible pairs of samples:
  103 +Perform parallel multiplication of samples on 4 cores using a sliding window of 5 samples and all possible pairs of samples:
84 104  
85 105 ```bash
86 106 python pairwise_operation.py masked_traces.npy --op=multiplication --window_size=5 --min_dist=1 --dtype=float64 --ncores=4
87 107 ```
88 108  
89   -To perform parallel absolute difference of samples on 16 cores using a sliding window of 100 samples and pairs of samples that are at least 80 samples away from one another:
  109 +Perform parallel absolute difference of samples on 16 cores using a sliding window of 100 samples and pairs of samples that are at least 80 samples away from one another:
90 110  
91 111 ```bash
92 112 python pairwise_operation.py masked_traces.npy --op=absolute_difference --window_size=100 --min_dist=80 --dtype=float64 --ncores=16
93 113 ```
94 114  
95   -* **downsample**
  115 +* **plot**
96 116  
97   -To keep only every 4<sup>th</sup> sample starting from sample 10.
  117 +Plot the first trace in the file:
98 118  
99 119 ```bash
100   -python downsample.py masked_traces.npy --factor=4 --offset=10
  120 +python plot.py traces.npy
101 121 ```
102 122  
103   -* **filter\_highest\_variance**
  123 +Plot the first ten traces in the file:
104 124  
105   -To keep only the 1% samples with the highest variance:
106 125 ```bash
107   -python filter_highest_variance.py masked_traces.npy --ratio=0.01
  126 +python plot.py traces.npy -n=10
108 127 ```
109 128  
110   -To keep only the 100 samples with the highest variance:
  129 +* **realign**
111 130  
  131 +Realign all the traces in a file against the 1<sup>st</sup> trace of the file:
  132 +
112 133 ```bash
113   -python filter_highest_variance.py masked_traces.npy --nsamples=100
  134 +python realign.py traces.npy
114 135 ```
115 136  
  137 +Realign all the traces in a file against the 21<sup>st</sup> trace of the file:
116 138  
  139 +```bash
  140 +python realign.py traces.npy -r=21
  141 +```
  142 +
  143 +* **remove_window**
  144 +
  145 +Plot the trace with the window from sample 500 to sample 1000 shown in red:
  146 +
  147 +```bash
  148 +python remove_window.py --start_index=500 --stop_index=1000 --plot_only=True traces.npy
  149 +```
  150 +
  151 +Remove samples 500 to 1000 from the trace:
  152 +
  153 +```bash
  154 +python remove_window.py --start_index=500 --stop_index=1000 traces.npy
  155 +```
  156 +
  157 +* **shorten**
  158 +
  159 +Keep only samples 500 to 1000 of the trace:
  160 +
  161 +```bash
  162 +python shorten.py --start_index=500 --stop_index=1000 traces.npy
  163 +```
  164 +
  165 +* **split**
  166 +
  167 +Split the traces file into four files:
  168 +
  169 +```bash
  170 +python split.py --nb_shares=4 traces.npy
  171 +```
  172 +
  173 +* **step_average**
  174 +
  175 +Compute the average of every block of four samples in the trace, reducing the size of the file by a factor of four:
  176 +
  177 +```bash
  178 +python step_average.py --step_size=4 traces.npy
  179 +```
  180 +
  181 +Compute the average of every block of four samples in the trace starting at sample 100:
  182 +
  183 +```bash
  184 +python step_average.py --step_size=4 --offset=100 traces.npy
  185 +```
  186 +
117 187 ## Keyword arguments
118 188  
  189 +Keyword arguments can be listed by calling a script with the `-h` argument:
  190 +```bash
  191 +python *script*.py -h
  192 +```
  193 +
  194 +* **downsample**
  195 +
  196 + - `--factor`: the downsampling factor n, to keep only every n<sup>th</sup> sample
  197 + - `--offset`: the offset at which downsampling starts
  198 +
  199 +* **filter\_highest\_variance**
  200 +
  201 + - `--ratio`: the ratio of samples with highest variance to keep
  202 +
  203 + **OR**
  204 + - `--nsamples`: the number of samples with highest variance to keep
  205 +
  206 +* **group_process**
  207 +
  208 + - `--prefix`: prefix of the name of the files to process
  209 + - `--nb_shares`: number of files to process
  210 + - `--function`: operation to apply on the files
  211 +
  212 +* **npy\_to\_bin**
  213 +
  214 + - `--output_format`: data format for the binary file
  215 +
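A hypothetical invocation of `npy_to_bin` (the `float32` output format is a placeholder; run `python npy_to_bin.py -h` to see the formats actually supported):

```bash
python npy_to_bin.py traces.npy --output_format=float32
```
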
119 216 * **pairwise_operation**
120 217  
121 218 - `--op`: the operation to compute on the pair of samples. It should belong to `{'addition','multiplication','squared_addition','absolute_difference'}`
122 219  
123 220  
124 221  
... ... @@ -126,16 +223,31 @@
126 223 - `--dtype`: the `numpy` data type for the samples of the processed trace
127 224 - `--ncores`: the number of cores to use for the parallel computation
128 225  
129   -* **downsample**
  226 +* **plot**
130 227  
131   - - `--factor`: the downsampling factor n, to keep only every n<sup>th</sup> sample
132   - - `--offset`: the offset at which downsampling starts
  228 + - `-n`: number of traces to plot
133 229  
134   -* **filter\_highest\_variance**
  230 +* **realign**
135 231  
136   - - `--ratio`: the ratio of samples with highest variance to keep
137   -
138   - **OR**
139   - - `--nsamples`: the number of samples with highest variance to kee
140   -
  232 + - `-r`: index of the trace to use as reference
  233 +
  234 +* **remove_window**
  235 +
  236 + - `--start_index`: start index of the window to remove
  237 + - `--stop_index`: stop index of the window to remove
  238 + - `--plot_only`: set to `True` to plot the window without removing it
  239 +
  240 +* **shorten**
  241 +
  242 + - `--start_index`: start index of the window to keep
  243 + - `--stop_index`: stop index of the window to keep
  244 +
  245 +* **split**
  246 +
  247 + - `--nb_shares`: number of files into which the file is split
  248 +
  249 +* **step_average**
  250 +
  251 + - `--step_size`: size of the chunk on which the average is computed
  252 + - `--offset`: start index
... ... @@ -3,7 +3,7 @@
3 3 import matplotlib.patches as patches
4 4 import argparse
5 5  
6   -def plot(traces_file, nb_subplots):
  6 +def plot(traces_file, nb_subplots=1):
7 7 traces = np.load(traces_file)
8 8 points_of_interest = [(21, 66),
9 9 (35, 66),
... ... @@ -24,6 +24,7 @@
24 24 # Parsing arguments
25 25 parser = argparse.ArgumentParser(description='Preprocess traces')
26 26 parser.add_argument("traces_file", type=str)
  27 + parser.add_argument("-r", "--ref_trace_index", type=int)
27 28 args = parser.parse_args()
28   - realign(args.traces_file)
  29 + realign(args.traces_file, args.ref_trace_index)
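
For context, a minimal sketch of how `realign` could accept the new optional argument; the default-to-first-trace behaviour shown here is an assumption rather than the actual implementation, and the alignment step itself is elided:

```python
import numpy as np

def realign(traces_file, ref_trace_index=None):
    # Assumption: when -r/--ref_trace_index is omitted, argparse passes None
    # and the first trace (index 0) is used as the reference.
    traces = np.load(traces_file)
    ref_index = 0 if ref_trace_index is None else ref_trace_index
    reference = traces[ref_index]
    # ... realign every trace against `reference` here ...
    return traces
```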