feat: transparently append to compressed archives
When appending to a compressed archive (gzip, brotli, zstd), the tool now handles compression automatically. Since compressed files generally can't be appended to in place, we write a new compressed file containing all the data and atomically rename it over the original (assuming there is enough space on that filesystem). This means you can work with compressed archives the same way as uncompressed ones: point the tool at your .json.gz file and append values, no manual decompression/recompression needed.
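A minimal sketch of the rewrite-and-rename path for the gzip case, using the `flate2` crate; the function and temp-file naming here are illustrative, not the tool's actual code:

```
use std::fs::{self, File};
use std::io::{self, Read, Write};
use std::path::Path;

use flate2::read::GzDecoder;
use flate2::write::GzEncoder;
use flate2::Compression;

/// Append `extra` to a gzip-compressed file by writing a fresh archive
/// next to it and renaming over the original. Illustrative sketch only.
fn append_to_gz(path: &Path, extra: &[u8]) -> io::Result<()> {
    // Decompress the existing contents.
    let mut existing = Vec::new();
    GzDecoder::new(File::open(path)?).read_to_end(&mut existing)?;

    // Recompress old + new data into a temp file in the same directory,
    // so the final rename stays on one filesystem and is atomic.
    let tmp = path.with_extension("gz.tmp");
    let mut enc = GzEncoder::new(File::create(&tmp)?, Compression::default());
    enc.write_all(&existing)?;
    enc.write_all(extra)?;
    enc.finish()?.sync_all()?;

    // Swap the new archive into place.
    fs::rename(&tmp, path)
}
```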
parent da0fed29de
commit 2ab1c31993
34 changed files with 4747 additions and 1099 deletions
docs/fuzz-testing.md (new file, 49 lines)
# Fuzz Testing
Fuzz testing throws random inputs at your code until something breaks.
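Each cargo-fuzz target is a tiny Rust entry point that receives those random bytes. A minimal sketch; `my_crate::apply_move` is a hypothetical stand-in for whatever the real target calls:

```
// fuzz/fuzz_targets/fuzz_apply_move.rs
#![no_main]
use libfuzzer_sys::fuzz_target;

fuzz_target!(|data: &[u8]| {
    // The fuzzer invokes this closure with arbitrary bytes; panics,
    // aborts, and sanitizer errors count as findings. Replace the
    // placeholder call with the real entry point under test.
    let _ = my_crate::apply_move(data);
});
```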
## Commands
List available fuzz targets:
```
cargo fuzz list
```
Run a fuzz target:
```
cargo fuzz run fuzz_apply_move
```
Runs until you kill it or it finds a crash.
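When it does find a crash, the failing input is saved under `fuzz/artifacts/<target>/`, and re-running the target on that file reproduces it (the crash filename below is illustrative):

```
cargo fuzz run fuzz_apply_move fuzz/artifacts/fuzz_apply_move/crash-<hash>
```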
## Reading the Output
```
#787958 REDUCE cov: 1281 ft: 6423 corp: 1112/621Kb lim: 4096 exec/s: 13823 rss: 584Mb L: 19/3954 MS: 1 EraseBytes-
#788755 REDUCE cov: 1281 ft: 6424 corp: 1113/621Kb lim: 4096 exec/s: 13837 rss: 584Mb L: 767/3954 MS: 2 CMP-CrossOver- DE: "6\000\000\000"-
#789383 REDUCE cov: 1281 ft: 6424 corp: 1113/621Kb lim: 4096 exec/s: 13848 rss: 584Mb L: 59/3954 MS: 3 InsertByte-ShuffleBytes-EraseBytes-
```
The fields:
- `#787958` — test case number. How many inputs have been tried.
- `REDUCE` — what happened. `NEW` means new code was reached. `REDUCE` means an input was shrunk while keeping the same coverage. `pulse` is just a heartbeat.
- `cov: 1281` — coverage. Number of code edges hit. This is what you care about.
- `ft: 6423` — features. Finer-grained coverage metric. Ignore it.
- `corp: 1112/621Kb` — corpus. 1112 interesting inputs saved, 621KB total.
- `exec/s: 13823` — speed. Test cases per second.
- `rss: 584Mb` — memory use.
- `L: 19/3954` — input length. This one was 19 bytes. Largest in corpus is 3954.
- `MS: 1 EraseBytes-` — mutation. How the input was generated. Doesn't matter.
## Is It Working?
Watch `cov`. If it goes up, the fuzzer is finding new code paths. If it stops going up, either you have good coverage or the fuzzer is stuck.
`exec/s` in the thousands is fine. If it drops to double digits, something is wrong.
Seeing `NEW` events means progress. Long stretches without `NEW` mean diminishing returns.
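If the edge count alone isn't enough, recent cargo-fuzz versions can also dump LLVM source-based coverage for the current corpus, showing exactly which lines the fuzzer reaches:

```
cargo fuzz coverage fuzz_apply_move
```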
## When to Stop
When `cov` stops increasing and you're bored. Hours for a quick check, days for thoroughness.
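If you'd rather bound the run up front, anything after `--` is passed straight to libFuzzer, so a time limit in seconds works:

```
cargo fuzz run fuzz_apply_move -- -max_total_time=3600
```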