mirror of
https://codeberg.org/hyperreal/bin
synced 2024-11-01 08:33:06 +01:00
Add description of script
parent 7f9bee10ee
commit 5aa57d27da
@@ -4,9 +4,20 @@
 # It prints a list of URLs from fandom wiki pages to stdout.
 #
 # Stdout can then be redirected to a plaintext file with e.g:
-# `get_fandom_wiki_urls cyberpunk > ~/downloads/cyberpunk_wiki_urls.txt`
+# `get-fandom-wiki-urls cyberpunk > ~/downloads/cyberpunk-wiki-urls.txt`
 #
-# These URLs can then be imported directly into ArchiveBox.
+# These URLs can then be imported directly into ArchiveBox. Each URL will
+# be a page of the local sitemap. The local sitemap is a list of wiki pages
+# in alphabetical order. Importing the URLs scraped by the script into
+# ArchiveBox with a depth of '1' will pull every URL one hop away, so every
+# wiki page listed in the local sitemap will be archived.
 #
+# This script wouldn't be necessary if there was a way to view the entire
+# local sitemap in one html page. Then all you'd have to do is import the
+# URL for the local sitemap into ArchiveBox with a depth of '1'. As far I
+# know there is no way to get this view of the local sitemap. For some
+# unknown reason the Wiki fandom site developers didn't design the frontend
+# to enable that.
+#
 # LICENSE
 # This is free and unencumbered software released into the public domain.
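The comment added by this commit describes the script's approach: scrape a Fandom wiki's local sitemap and print each article URL to stdout. A minimal sketch of that idea, assuming the sitemap lives at `/wiki/Local_Sitemap` and article links appear as `/wiki/...` hrefs — the actual script in the repo may fetch and parse differently:

```shell
#!/usr/bin/env sh
# Sketch only: the "Local_Sitemap" page name and the href pattern are
# assumptions, not taken from the actual script in this repo.
set -eu

# Read HTML on stdin, print every /wiki/ link as an absolute URL under $1.
extract_wiki_urls() {
    base="$1"
    grep -oE 'href="/wiki/[^"]+"' \
        | sed -e 's|^href="|'"$base"'|' -e 's|"$||' \
        | sort -u
}

main() {
    wiki="${1:?usage: get-fandom-wiki-urls <wiki-name>}"
    base="https://${wiki}.fandom.com"
    curl -fsSL "${base}/wiki/Local_Sitemap" | extract_wiki_urls "$base"
}

# Run only when a wiki name is given, so the functions can also be sourced.
if [ "$#" -gt 0 ]; then
    main "$@"
fi
```

As the comment suggests, the output can be redirected to a file (e.g. `get-fandom-wiki-urls cyberpunk > ~/downloads/cyberpunk-wiki-urls.txt`) and that file imported into ArchiveBox, which accepts URL lists via `archivebox add`.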