mirror of https://codeberg.org/hyperreal/admin-scripts
synced 2024-11-25 09:03:41 +01:00

Refactor

This commit is contained in:
parent 4a72d27766
commit bf1d965374

README.org (12 lines changed)
@@ -2,18 +2,18 @@
 
 These are scripts I use to automate various tasks in my homelab.
 
 ** bin
 - ~add_scihub_torrents~ : This script uses [[https://github.com/charmbracelet/gum][gum]] to select paginated text files that contain URLs of Sci Hub torrent files. For each selected file, the URLs are read and added to a qBittorrent instance.
 - ~qbth~ : This is a helper program for adding Linux and BSD distros to a qBittorrent instance. It's a bit crude, and the Python linter yells at me for it, but it gets the job done, and that's all I need it to do. Thank you.
 - ~qbt_sum_size~ : This script prints the total size of completed torrents and the total size of all torrents added to a qBittorrent instance. The former is a subset of the latter.
 - ~seed_armbian_torrents~ : This script downloads an archive from Armbian, extracts the torrent files within to a temporary directory, and adds each file to a qBittorrent instance. It first removes older Armbian torrents from the qBittorrent instance.
 - ~seed_scihub_torrents~ : This script finds which torrents have less than or equal to N seeders, where N is an integer argument supplied by the user. It then adds these torrents to a qBittorrent instance.
-- ~server0_backup~ : This script dumps my Mastodon instance's PostgreSQL database, then uses rclone to sync ~/etc~, ~/var/log~, and ~/home/jas~ to an S3-compatible object storage bucket. It also copies the dumped Mastodon database and ~.env.production~ to the object storage bucket.
+- ~server0_backup~ : This script dumps my Mastodon instance's PostgreSQL database, then uses rclone to sync ~/etc~, ~/var/log~, and ~/home/jas~ to an S3-compatible object storage bucket (MinIO). It also copies the dumped Mastodon database and ~.env.production~ to the object storage bucket.
 
 ** systemd
 *** system
-- ~server0-backup.service~ : A systemd service unit that runs the ~server0-backup~ script.
+- ~server0-backup.service~ : A systemd service unit that runs the ~server0_backup~ script.
 - ~server0-backup.timer~ : A systemd timer unit that triggers the corresponding service unit.
 *** user
 - ~glances.service~ : A systemd service unit for the user scope that runs a glances server.
 - ~gmcapsuled.service~ : A systemd service unit for the user scope that runs the gmcapsuled Gemini server.
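The ~qbt_sum_size~ bullet above describes a completed-versus-total size computation. A minimal sketch of that computation, assuming each torrent dict carries the ~size~ and ~progress~ fields that qBittorrent's Web API returns; ~sum_sizes~ is a hypothetical helper name, and the real script would fetch the list from a live client rather than the fake data used here:

```python
def sum_sizes(torrents):
    """Return (completed_bytes, total_bytes) for qBittorrent torrent dicts.

    Assumes each dict has 'size' (bytes) and 'progress' (0.0 to 1.0), as in
    qBittorrent Web API responses. Completed torrents are those with
    progress == 1, so the first sum is a subset of the second.
    """
    total = sum(t["size"] for t in torrents)
    completed = sum(t["size"] for t in torrents if t["progress"] == 1)
    return completed, total


# Fake data standing in for a live client's torrent list:
sample = [
    {"size": 100, "progress": 1.0},
    {"size": 50, "progress": 0.5},
]
completed, total = sum_sizes(sample)
```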
@@ -46,7 +46,7 @@ if __name__ == "__main__":
     )
 
     # Read the contents of each file and put lines (which are URLs) into a list
-    torrent_urls = list()
+    torrent_urls = []
     for item in torrent_selection:
         with open(scihub_torrent_dir.joinpath(item), "r") as tf:
             urls = tf.readlines()
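The hunk above swaps ~list()~ for the literal ~[]~; the two are equivalent, the literal just being the more idiomatic (and marginally faster) spelling. A self-contained sketch of the same read-URLs-into-a-list loop, using a throwaway temporary directory and hard-coded file names in place of the real Sci Hub torrent-file directory and gum selection:

```python
import tempfile
from pathlib import Path

# Throwaway directory standing in for the real scihub_torrent_dir.
tmp_dir = Path(tempfile.mkdtemp())
(tmp_dir / "page1.txt").write_text("http://example.org/a.torrent\n")
(tmp_dir / "page2.txt").write_text("http://example.org/b.torrent\n")

torrent_selection = ["page1.txt", "page2.txt"]  # what gum would return

torrent_urls = []  # literal list, as in the refactored line
for item in torrent_selection:
    with open(tmp_dir.joinpath(item), "r") as tf:
        torrent_urls.extend(line.strip() for line in tf.readlines())
```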
@@ -49,8 +49,7 @@ if __name__ == "__main__":
     with ZipFile(BytesIO(req.content)) as zip_file:
         zip_file.extractall(tmp_dir)
 
-    torrents = qb.torrents()
-    for torrent in torrents:
+    for torrent in qb.torrents():
         if "Armbian" in torrent.get("name"):  # type: ignore
             qb.delete_permanently(torrent.get("hash"))  # type: ignore
             print(f"Removed {torrent.get('name')}")  # type: ignore
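This hunk drops the intermediate ~torrents~ binding and iterates over ~qb.torrents()~ directly, with no change in behavior. A sketch of the same cleanup loop with a hypothetical ~FakeClient~ standing in for the real qBittorrent client (field names mimic qBittorrent's torrent list entries):

```python
class FakeClient:
    """Stand-in for the qBittorrent client the real script logs into."""

    def __init__(self):
        self.deleted = []

    def torrents(self):
        # Shape mimics entries in qBittorrent's torrent list.
        return [
            {"name": "Armbian_24.2_Rpi4b", "hash": "aaa111"},
            {"name": "debian-12.4.0-amd64.iso", "hash": "bbb222"},
        ]

    def delete_permanently(self, infohash):
        self.deleted.append(infohash)


qb = FakeClient()
for torrent in qb.torrents():  # iterate over the call directly, as refactored
    if "Armbian" in torrent.get("name"):
        qb.delete_permanently(torrent.get("hash"))
        print(f"Removed {torrent.get('name')}")
```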
justfile (2 lines changed)
@@ -67,7 +67,7 @@ caddy-install:
     sudo apt install caddy
 
 thelounge-install:
-    curl -s https://api.github.com/repos/thelounge/thelounge-deb/releases/latest | grep "browser_download_url.*deb" | cut -d : -f 2,3 | tr -d \" | wget -qi -
+    curl -s https://api.github.com/repos/thelounge/thelounge-deb/releases/latest | grep "browser_download_url.*deb" | cut -d : -f 2,3 | tr -d '"' | wget -qi -
     sudo apt install -y ./thelounge*.deb
     rm -fv ./thelounge*.deb
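The justfile change rewrites ~tr -d \"~ as ~tr -d '"'~. Both hand a lone double-quote character to ~tr~, but the single-quoted form is unambiguous to the shell and to just's recipe parsing. The effect of the filter on a download-URL field, sketched with ~printf~ and a placeholder URL in place of the curl pipeline:

```shell
# Strip double quotes, as the recipe does to the browser_download_url value
# before handing the bare URL to wget.
printf '%s\n' '"https://example.org/thelounge.deb"' | tr -d '"'
# prints: https://example.org/thelounge.deb
```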