We currently share bug lists and per-bug information via a JSON API, which requires multiple queries (linear in the number of bugs) and is subject to request throttling.
There have been a number of requests from external and internal parties that could be resolved by sharing a downloadable .tar.gz archive with all that information.
So far, it looks like the archive should be of the following form (a sketch of the corresponding layout and JSON shapes follows the list):
1) Multiple folders at root, one folder per bug.
2) For each bug, a JSON description with the basic information (title, status, fix/cause bisections).
3) For each bug, a crashes folder with the N (up to 10?) most relevant crashes. For each crash:
3.a) General info (crash title, date, arch, kernel and syzkaller revisions)
3.b) Crash report
3.c) C and Syz reproducers (if present).
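A minimal sketch of what one bug folder and its JSON files might look like, written as Go types; all file names (bug.json, info.json, ...), type names, and field names below are illustrative assumptions, not a final format:

```go
// Illustrative layout of one bug folder inside the archive:
//
//   <bug-id>/
//     bug.json            // 2) basic info: title, status, bisections
//     crashes/
//       <crash-id>/
//         info.json       // 3.a) general info
//         report.txt      // 3.b) crash report
//         repro.c         // 3.c) C reproducer (if present)
//         repro.syz       // 3.c) Syz reproducer (if present)
package archive

import "time"

// BugDescription is a hypothetical shape for the per-bug bug.json.
type BugDescription struct {
	Title          string     `json:"title"`
	Status         string     `json:"status"`
	FixBisection   *Bisection `json:"fix_bisection,omitempty"`
	CauseBisection *Bisection `json:"cause_bisection,omitempty"`
}

// Bisection describes the result of a cause/fix bisection, if one exists.
type Bisection struct {
	Commit string `json:"commit"`
	Title  string `json:"title"`
}

// CrashInfo is a hypothetical shape for the per-crash info.json (item 3.a).
type CrashInfo struct {
	Title             string    `json:"title"`
	Time              time.Time `json:"time"`
	Arch              string    `json:"arch"`
	KernelRevision    string    `json:"kernel_revision"`
	SyzkallerRevision string    `json:"syzkaller_revision"`
}
```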
There are two ways we could implement it:
1) Construct and upload the archive directly from GAE.
2) Invoke a standalone tool (like tools/syz-reprolist) that will make all those API queries to the Web dashboard (without throttling) and construct the archive (see the sketch after this list).
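For option 2, the archive-writing part could use Go's standard archive/tar and compress/gzip packages. The sketch below only covers packing already-fetched bug.json blobs into a .tar.gz; writeArchive and its inputs are hypothetical names, and the real tool would also add the per-crash info, reports, and reproducers fetched from the dashboard API:

```go
package main

import (
	"archive/tar"
	"compress/gzip"
	"os"
	"path"
	"time"
)

// writeArchive packs one bug.json per bug folder into a .tar.gz file.
// bugs maps a bug ID (used as the folder name) to its marshalled JSON.
func writeArchive(outFile string, bugs map[string][]byte) error {
	f, err := os.Create(outFile)
	if err != nil {
		return err
	}
	defer f.Close()

	gz := gzip.NewWriter(f)
	defer gz.Close()
	tw := tar.NewWriter(gz)
	defer tw.Close()

	// One folder per bug, each containing its bug.json description.
	for bugID, descJSON := range bugs {
		hdr := &tar.Header{
			Name:    path.Join(bugID, "bug.json"),
			Mode:    0644,
			Size:    int64(len(descJSON)),
			ModTime: time.Now(),
		}
		if err := tw.WriteHeader(hdr); err != nil {
			return err
		}
		if _, err := tw.Write(descJSON); err != nil {
			return err
		}
	}
	return nil
}
```

Note that the tar writer must be closed before the gzip writer (the deferred calls run in reverse order here) to produce a valid compressed stream.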
Cc @tarasmadan