Call for Artifacts
Artifact Submission Steps for Authors
- Please include the content of the ARTIFACT-APPENDIX.md file either within your README.md file or as a separate file. This file is important not only for reviewers during the evaluation process, but also for future researchers attempting to re-use your artifact. It has separate, marked sections for the different badges.
- Decide which badges you want to apply for. As described below, all submitted artifacts should in general apply at least for the “Artifact Available” badge, unless doing so would endanger someone. Authors should apply for all the badges whose requirements they believe their artifact meets.
- Ensure that you have filled out all sections of the ARTIFACT-APPENDIX.md file that are relevant for the badges you apply for.
- For your submission on HotCRP, you will need to provide a copy of your paper and a direct link to the ARTIFACT-APPENDIX.md file.
Artifact Badges
See badges.
Artifact Link
When the artifact evaluation is over, a persistent and stable link pointing to the final evaluated version of the artifact will be collected for each accepted artifact. Depending on the hosting option chosen, this is typically a link to a specific Git commit/tag, a DOI for a Zenodo record, etc. This link and the awarded badge(s) will be added to the website next to the corresponding paper title. Since updates to the artifact are likely to occur in response to reviewers’ feedback, we will only collect this link after a final decision is made.
What makes a Good Submission
To ensure a smooth submission process, please follow these important guidelines:
- Alpha-test your artifact from a fresh install or ask a friend to do so. Fix potential issues that are uncovered before submission.
- As discussed in the FAQ, go through the resources and examples of artifact packaging (Dockerfiles etc.) that have been put together by the artifact evaluation chairs.
- For the “Artifact Functional” and/or “Artifact Reproduced” badges, clear documentation and mapping between claims, results, and experiments usually go a long way in facilitating the evaluation. Ideally, reviewers should be able to execute a single script to install, configure, and reproduce results.
- Respond professionally to reviews and comments within one week.
- Incorporate requested changes, at least partially, within two weeks of the request. Partial progress should be visible to reviewers through version control (Git commits, updates to Zenodo records, etc.). Do not leave updates to the last minute. If some fixes require more time, communicate a timeline by which these changes will be made, so that reviewers can plan a re-evaluation.
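As a hedged illustration of the “single script” suggestion above, a top-level driver of roughly the following shape lets reviewers install, run, and collect results with one command. All file, script, and directory names here are hypothetical placeholders, not part of any specific artifact; the actual install and experiment commands depend entirely on your project and are shown only as comments.

```shell
#!/bin/sh
# reproduce.sh -- hypothetical one-shot driver for an artifact.
# Every name below (results/, the commented commands) is illustrative.
set -eu  # abort on any failing step or unset variable

RESULTS_DIR="results"
mkdir -p "$RESULTS_DIR"

echo "[1/3] installing dependencies"
# e.g. pip install -r requirements.txt   (depends on your artifact)

echo "[2/3] running experiments"
# e.g. python experiments/run_table2.py --out "$RESULTS_DIR/table2.csv"

echo "[3/3] generating figures and tables"
# e.g. python plot_all.py "$RESULTS_DIR"

echo "done: outputs collected in $RESULTS_DIR/"
```

Pairing such a script with a short table in the README that maps each claim in the paper to the experiment (and output file) that supports it makes the Functional/Reproduced evaluation considerably easier.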
Your cooperation in adhering to these guidelines will greatly contribute to the efficiency and effectiveness of your submission and review process. We eagerly anticipate receiving your high-quality contributions and look forward to showcasing your research!
Artifact Award
Since PETS 2022, distinguished artifacts have been recognized with an artifact award. The main objective of this award is to reward authors who put significant effort into the release of their artifact and to showcase exemplary submissions that contribute to the open-science and reproducibility efforts of our community.
Since PETS 2026, we provide explicit criteria for reviewers to judge whether an artifact should be nominated for this award. Reviewers should consider the artifact quality, completeness, documentation, ease of reuse, artifact maturity and scope of its target audience, interactivity and responsiveness of authors, etc. Ultimately, nominations are reviewed and ranked.
Our suggestion to authors is to simply prepare and release their artifact in the same way that they wish anyone in their field would do to facilitate adoption and reproducibility by others.
What Makes a Good Review
Artifact reviewers should familiarize themselves with the artifact call, the aforementioned guidelines, and the format of the ARTIFACT-APPENDIX.md file.
Reviewers should reach out to artifact chairs with any questions.
Towards the goal of contributing to open-science and reproducibility, the artifact evaluation process is designed to be interactive; authors are expected to take into account reviewers’ comments and modify their artifact accordingly. As such, reviewers are kindly asked to start their evaluation as early as possible and to post reviews or comments regularly. Once authors communicate to reviewers that issues were resolved, reviewers should then take another look and either approve the artifact or provide additional comments until a final decision is made.
We provide practical tips for reviewers below:
- Start the evaluation early.
- Notify artifact evaluation chairs ASAP if you are missing hardware or resources to perform the evaluation.
- Post a preliminary review and update it as authors make edits. Adding an “[EDIT]” tag or striking out the text of the prior review can indicate modified and/or solved comments, etc.
- Provide a concise list of issues/suggestions first (this helps give an overview to everyone), followed by more details (for the authors to make changes).
- Explicitly number or name these issues/suggestions; doing so facilitates future references to them in comments between authors and reviewers.
- If the code fails, include the environment it was run in, the error messages, and any steps you attempted to fix them.
- Actively participate in the discussions.
- Politely ping authors for updates or for a timeline if no response is received to your comments after a week.
- Respond politely and professionally.
- Tag the artifact evaluation chairs if you do not hear back from the authors, or if something needs to be brought to our attention.
Distinguished Artifact Reviewers
Since PETS 2025, we recognize members of the artifact evaluation committee as distinguished artifact reviewers, based on the following criteria:
- Timeliness, including responding to authors’ updates.
- High quality reviews and discussions that significantly improve the artifact.
- Going above and beyond, such as by helping out with extra reviews, helping with artifacts with special requirements, etc.
Volunteer for the Artifact Evaluation Committee
We are looking for volunteers to serve on the artifact evaluation committee. As a committee member, you will review artifacts according to the guidelines above. We are looking for volunteers who are interested in providing feedback on documentation and instructions, in getting source code to build, or who have experience re-using published datasets. Please fill out the reviewer nomination form linked in the menu of the PETS website.