Artifacts
Authors of accepted papers in the research, RE@Next!, and industry tracks of RE’20 are invited to submit an artifact to the Artifact Track. Papers with accepted artifacts receive a “badge” on the front page of the paper in the proceedings. These badges certify that the paper has associated artifacts of one of the following forms:
Artifacts Reusable
The artifacts associated with the research are found to be documented, consistent, complete, exercisable, and to include appropriate evidence of verification. In addition, they are carefully documented and well structured to the extent that reuse and repurposing are facilitated. In particular, the norms and standards of the research community for artifacts of this type are strictly adhered to.
Artifacts Available
This badge is applied to papers whose associated artifacts have been made permanently available for retrieval. Author-created artifacts relevant to the paper have been placed in a publicly accessible archival repository. A DOI or link to this repository, along with a unique identifier for the object, is provided.
Results Replicated
The main results of the paper have been obtained in a subsequent study by a person or team other than the authors, using, in part, artifacts provided by the author.
Results Reproduced
The main results of the paper have been independently obtained in a subsequent study by a person or team other than the authors, without the use of author-supplied artifacts.
Disclaimer: Badges are loosely adopted from the ACM badges as of December 2019. For more details regarding the badges, please visit https://www.acm.org/publications/policies/artifact-review-badging.
Papers with such badges contain reusable products that other researchers can use to bootstrap their own research. Such papers are likely to earn more citations and greater prestige in the research community. Artifacts of interest include the following; if your proposed artifact is not on this list, you are welcome to email the chairs before submitting.
- Software, i.e., implementations of systems or algorithms potentially useful in other studies, for instance tools to search for optimally consistent requirements specifications.
- Machine learning scripts, which can be understood as parts of a software tool, e.g., as Jupyter notebooks.
- Machine readable requirements models, e.g., in IstarML or UML interchange formats.
- Traceability relations between artifacts, such as requirements to source code.
- Data repositories, i.e., data (e.g., requirements models, requirements text, raw survey data) that can be used by multiple software engineering approaches.
- Requirements in natural language form, e.g., textual requirements in a spreadsheet, DOORS, JIRA export.
- User reviews, such as app reviews, product changelogs, and release notes.
- Frameworks, which are tools and services illustrating new approaches to requirements engineering that could be used by other researchers in different contexts. For example, a service to highlight inconsistencies in natural language.
Selection Criteria
The RE’20 artifacts will be evaluated against the criteria of the badges described above. The goal of this track is to encourage reusable research products that can accelerate science.
Review Process
- Artifacts will be reviewed via an open, GitHub-based review process. All reviews will be posted as GitHub issues.
- Artifacts will be checked for their “badge worthiness” by one reviewer unless there is an author/reviewer dispute, in which case a second check will be conducted by a second reviewer or one of the chairs.
- Both authors and reviewers will interact using their real GitHub IDs (i.e., non-anonymously).
- Reviewers will have 7 days to assess the artifacts.
- Reviewers and authors will then have 3 days to interact so that, for example, if one line is missing from a config file, that bug can be fixed and the artifact still declared “reusable”.
- The track chairs will then meet to write a “decisions.md” file in the RE’20 GitHub repository (https://github.com/researchart/rose7re20) containing a table of submissions and their associated badges (if any). Badges will also be recorded as labels in the GitHub repository.
Submission Process
Step 1: Prepare all the supporting materials needed to evaluate an artifact (a sketch of a pre-submission check is given after this list):
- A README.md main markdown file describing what the artifact does;
- A STATUS.md markdown file stating which badge you are applying for (one of reusable, available, replicated, reproduced) and the reasons why you think your artifact deserves that badge;
- A LICENSE.md markdown file describing the distribution rights (note that the license needs to be some form of open-source license);
- An INSTALL.md markdown file with installation instructions. These instructions should include notes on what output to expect, which confirms that the code is installed and working;
- For artifacts where the code / virtual machine / data sets / log files are larger than 1 GB, do not place those artifacts within the RE’20 Artifacts GitHub repository, but make sure INSTALL.md includes the appropriate download instructions;
- Enough associated code and data such that a computer scientist with a reasonable knowledge of scripting, build tools, etc. could install, build, and run your code.
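As a convenience, the sketch below (a hypothetical helper, not part of the official submission process) shows one way to confirm that the four required markdown files are present in an artifact folder before submitting; the script name and layout are assumptions for illustration only.

```python
# check_artifact.py -- hypothetical pre-submission check; not an official RE'20 tool.
# Verifies that the four required markdown files exist in the artifact folder.
import sys
from pathlib import Path

REQUIRED = ["README.md", "STATUS.md", "LICENSE.md", "INSTALL.md"]

def check(folder: str) -> bool:
    """Return True if every required markdown file is present in `folder`."""
    root = Path(folder)
    missing = [name for name in REQUIRED if not (root / name).is_file()]
    for name in missing:
        print(f"missing: {name}")
    return not missing

if __name__ == "__main__":
    # Usage: python check_artifact.py <path-to-artifact-folder>
    folder = sys.argv[1] if len(sys.argv) > 1 else "."
    sys.exit(0 if check(folder) else 1)
```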
Step 2: Contact the RE’20 artifact chairs to get access to the submission repository by communicating your GitHub ID via the following form:
https://docs.google.com/forms/d/e/1FAIpQLScRpjcQsOHjjsTQqgEFE02z3xgW1fa5-bFypAorD69ib7PLEQ/viewform
Step 3: Submit all of your materials [artifact, *.md files] to a folder within the RE’20 repository. Use the <paper ID>-<contact author last name> naming template; for example, 23-smith or 34-jones. For large artifacts (> 1 GB), store only the .md files in the folder, and make sure the INSTALL.md file includes appropriate download instructions.
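For illustration only, the following sketch (a hypothetical helper, assuming a local artifact directory) builds the folder name from a paper ID and the contact author’s last name, and flags artifacts above the 1 GB limit that should instead be hosted elsewhere and referenced from INSTALL.md.

```python
# name_folder.py -- hypothetical helper illustrating the <paper ID>-<last name>
# naming template and the 1 GB size rule. Not an official RE'20 tool.
from pathlib import Path

ONE_GB = 1_000_000_000  # rough limit for storing the artifact directly in the repository

def folder_name(paper_id: int, last_name: str) -> str:
    """Return the submission folder name, e.g. folder_name(23, "Smith") -> "23-smith"."""
    return f"{paper_id}-{last_name.lower()}"

def total_size(folder: str) -> int:
    """Total size in bytes of all files under the artifact folder."""
    return sum(p.stat().st_size for p in Path(folder).rglob("*") if p.is_file())

if __name__ == "__main__":
    print(folder_name(23, "Smith"))  # -> 23-smith
    if total_size(".") > ONE_GB:
        print("Artifact exceeds 1 GB: store only the .md files in the repository "
              "and add download instructions to INSTALL.md.")
```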
This call is only open to authors of a paper accepted in the research, RE@Next!, or industry innovation track.
Key Dates
| Milestone | Date |
| --- | --- |
| Artifact Submission | May 22, 2020 |
| Artifact Notification | June 12, 2020 |
| Camera Ready Due | June 22, 2020 |
All deadlines are 23:59 Anywhere on Earth (Standard Time).