Downloads a list of Wikipedia pages into a specified path on the computer, and returns a vector of the not-found names (if any).
getWikiFiles(X, language = c("es", "en", "fr"), directory = "./", maxtime = 0)
It returns a vector of errors, if any (NULL = no errors). All files are downloaded into the selected directory.
X: A vector of Wikipedia entries.
language: The language of the Wikipedia page version. This should consist of an ISO language code (default = "en").
directory: Directory to which the files are exported.
maxtime: In case you want to apply a random waiting time between consecutive searches (see the last example below).
Modesto Escobar, Department of Sociology and Communication, University of Salamanca. See https://sociocav.usal.es/blog/modesto-escobar/
This function downloads a set of Wikipedia pages into a chosen directory on the local computer. All errors (pages not found) are reported in the return value (NULL = no errors).
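A minimal usage sketch (the entry names here are illustrative): the return value can be inspected to see whether any downloads failed.

# (load the package providing getWikiFiles first)
not_found <- getWikiFiles(c("Rembrandt", "Vermeer"), directory = "./")
if (is.null(not_found)) message("All pages downloaded") else print(not_found)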
## Not run:
## In case you want to download the Wikipage of a person:
# getWikiFiles("Rembrandt", dir = "./")
## Or the pages of several painters:
# B <- c("Monet", "Renoir", "Caillebotte")
# getWikiFiles(B, dir = "./", language="fr")
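## maxtime applies a random wait between consecutive searches
## (a sketch; the exact delay semantics are assumed, capped at 5 seconds here):
# getWikiFiles(B, dir = "./", language = "fr", maxtime = 5)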
## End(Not run)