Write out a codemeta.json file for a given package. This function is essentially a wrapper around create_codemeta(): it creates the codemeta object and writes it out to a JSON-LD-formatted file in one command. It can also be used simply to write out to JSON-LD any existing object created with create_codemeta().
write_codemeta(
pkg = ".",
path = "codemeta.json",
root = ".",
id = NULL,
use_filesize = TRUE,
force_update = getOption("codemeta_force_update", TRUE),
use_git_hook = NULL,
verbose = TRUE,
write_minimeta = FALSE,
...
)
pkg: path to the package root, or a package name, or a DESCRIPTION file (character), or a codemeta object (list)

path: file name of the output; leave at the default "codemeta.json"

root: if pkg is a codemeta object, optionally give the path to the package root. The default guess is the current directory.

id: identifier for the package, e.g. a DOI (or other resolvable URL)

use_filesize: whether to try estimating and adding a filesize using base::file.size(). Files in .Rbuildignore are ignored.

force_update: update guessed fields even if they are defined in an existing codemeta.json file

use_git_hook: deprecated argument

verbose: whether to print messages indicating opinions, e.g. when DESCRIPTION has no URL (see give_opinions), and indicating the progress of internet downloads

write_minimeta: whether to also create the file schemaorg.json, which corresponds to the metadata Google would validate, to be inserted into a webpage for SEO. It is saved as "inst/schemaorg.json" alongside path (by default, "codemeta.json").

...: additional arguments to write_json
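As a sketch of how these arguments combine (the DOI below is a made-up placeholder, not a real identifier), a call might look like:

```r
# Hypothetical example: write codemeta.json for the package in the current
# directory, record a placeholder DOI as its identifier, skip file-size
# estimation, and also emit inst/schemaorg.json for search-engine validation.
write_codemeta(
  pkg = ".",
  path = "codemeta.json",
  id = "https://doi.org/10.xxxx/example",  # placeholder, replace with a real DOI
  use_filesize = FALSE,
  write_minimeta = TRUE
)
```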
Writes out the codemeta.json file, and schemaorg.json if write_minimeta is TRUE.
If pkg is a codemeta object, the function will attempt to update any fields it can guess (i.e. from the DESCRIPTION file), overwriting any existing data in that block. In this case, the package root directory should be the current working directory.
When creating and writing a codemeta.json for the first time, the function adds "codemeta.json" to .Rbuildignore.
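For instance, a sketch of this two-step workflow (assuming the current working directory is the package root, and using an illustrative keywords field) might be:

```r
# Sketch: create the codemeta object first, amend a field by hand,
# then serialize it to JSON-LD. Assumes the working directory is the
# package root, so root = "." lets guessed fields be refreshed.
cm <- create_codemeta(".")
cm$keywords <- c("metadata", "codemeta")  # illustrative amendment
write_codemeta(cm, path = "codemeta.json", root = ".")
```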
Why bother creating a codemeta.json for your package? R packages
encode lots of metadata in the DESCRIPTION
file, README
, and other
places, telling users and developers about the package purpose, authors,
license, dependencies, and other information that facilitates discovery,
adoption, and credit for your software. Unfortunately, because each
software language records this metadata in a different format, that
information is hard for search engines, software repositories, and other
developers to find and integrate.
By generating a codemeta.json file, you turn your metadata into a format that can easily crosswalk between metadata in many other software languages. CodeMeta is built on schema.org, a simple structured data format developed by major search engines like Google and Bing to improve discoverability in search. CodeMeta is also understood by significant software archiving efforts such as the Software Heritage Project, which seeks to permanently archive all open source software.
For more general information about the CodeMeta Project for defining software metadata, see https://codemeta.github.io. In particular, new users might want to start with the User Guide, while those looking to learn more about JSON-LD and consuming existing codemeta files should see the Developer Guide.
How to keep codemeta.json up to date? In particular, how to keep it in sync with DESCRIPTION? codemetar itself no longer supports automatic sync, but there are quite a few methods available. Choose one that fits well into your workflow!
You could rely on devtools::release()
since it will ask you
whether you updated codemeta.json when such a file exists.
You could use a git pre-commit hook that prevents a commit from being done if DESCRIPTION is newer than codemeta.json.
You can use the precommit package, which includes a "codemeta-description-updated" hook.
If that's your only pre-commit hook (i.e. you don't have one created by e.g. usethis::use_readme_rmd()), then you can create it using

script = readLines(system.file("templates", "description-codemetajson-pre-commit.sh", package = "codemetar"))
usethis::use_git_hook("pre-commit", script = script)
You could use GitHub Actions. Refer to the GitHub Actions docs (https://github.com/features/actions) and to the example workflow provided in this package (type system.file("templates", "codemeta-github-actions.yml", package = "codemetar")). You can use the cm-skip keyword in your commit message if you don't want this to run on a specific commit. The example workflow provided is set up to only run when a push is made to the master branch. This setup is designed for a git-flow workflow where the master branch is only committed and pushed to via pull requests. After each PR merge (and the completion of this GitHub Action), your master branch will always be up to date, and as long as you don't make manual changes to the codemeta.json file, you won't have merge conflicts.
Alternatively, you can have GitHub Actions run codemetar on each commit. If you do this, try to remember to run git pull before making any new changes to your local project. However, if you forget to pull and have already committed new changes, fret not: you can use git pull --rebase to replay your local changes on top of the current upstream HEAD.
on:
  push:
    branches: master
    paths:
      - DESCRIPTION
      - .github/workflows/main.yml
name: Render codemeta
jobs:
  render:
    name: Render codemeta
    runs-on: macOS-latest
    if: "!contains(github.event.head_commit.message, 'cm-skip')"
    steps:
      - uses: actions/checkout@v1
      - uses: r-lib/actions/setup-r@v1
      - name: Install codemetar
        run: Rscript -e 'install.packages("codemetar")'
      - name: Render codemeta
        run: Rscript -e 'codemetar::write_codemeta()'
      - name: Commit results
        run: |
          git commit codemeta.json -m 'Re-build codemeta.json' || echo "No changes to commit"
          git push https://${{github.actor}}:${{secrets.GITHUB_TOKEN}}@github.com/${{github.repository}}.git HEAD:${{ github.ref }} || echo "No changes to commit"
# NOT RUN {
codemeta <- tempfile()
write_codemeta("codemetar", path = codemeta)
# }