Get the Wikidata Q identifiers of all Wikipedia pages that are linked from one or more given Wikipedia pages.
Usage

tw_get_wikipedia_page_links(
  url = NULL,
  title = NULL,
  language = tidywikidatar::tw_get_language(),
  cache = NULL,
  overwrite_cache = FALSE,
  cache_connection = NULL,
  disconnect_db = TRUE,
  wait = 1,
  attempts = 10
)

Value

A data frame (a tibble) with eight columns: source_title_url,
source_wikipedia_title, source_qid, wikipedia_title, wikipedia_id,
qid, description, and language.
Arguments

url
Full URL of a Wikipedia page. If given, title and language can be left empty.

title
Title of a Wikipedia page, or the final part of its URL. If given, url can be left empty, but language must be provided.

language
Two-letter language code used to define the Wikipedia version to use. Defaults to the language set with tw_set_language(); if none is set, "en". If url is given, this can be left empty.

cache
Defaults to NULL. If given, it should be either TRUE or FALSE. Typically set with tw_enable_cache() or tw_disable_cache().

overwrite_cache
Logical, defaults to FALSE. If TRUE, overwrites the table in the local sqlite database. Useful if the original Wikidata object has been updated.

cache_connection
Defaults to NULL. If NULL and caching is enabled, tidywikidatar uses a local sqlite database. A custom connection to another database can be given (see the caching vignette for details).

disconnect_db
Defaults to TRUE. If FALSE, leaves the connection to the cache open.

wait
In seconds, defaults to 1. Time to wait between queries to Wikidata. Wait time is not applied when data are cached locally. If you are running many queries systematically, you may want to add some waiting time between queries.

attempts
Defaults to 10. Number of times to re-attempt reaching the API before failing.
Examples

if (interactive()) {
  tw_get_wikipedia_page_links(title = "Margaret Mead", language = "en")
}
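As a slightly fuller sketch, the call below combines this function with the caching helpers described above. It assumes the tidywikidatar package is installed and a network connection is available; the column names used at the end (wikipedia_title, qid) are the ones listed in the Value section, while the subsetting step itself is just illustrative:

```r
library(tidywikidatar)

# Enable local caching so repeated queries read from the local
# sqlite database instead of hitting the APIs again.
tw_enable_cache()
tw_set_language("en")

# Retrieve the Q identifiers of all pages linked from "Margaret Mead".
links <- tw_get_wikipedia_page_links(title = "Margaret Mead")

# The result is a tibble with the eight columns described above;
# for example, keep only the linked page titles and their Q identifiers:
links[, c("wikipedia_title", "qid")]
```

Because caching is enabled, running the same call a second time returns the stored result without waiting between queries.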