

That, or they have failing hardware or really bad upload speeds. Maybe they’re using a bad client or something. Seems odd for sure.
Most likely they’ll end up getting rid of that section. So I suggest anyone who is looking to keep that music alive start hoarding them if they haven’t already.
I’m part of a private tracker and am hosting low-seeder torrents all the time using my servers. If you want, I can download them from you, set them to their own category so they don’t get removed, and just seed them for ya. Feel free to message me.
That’s pretty obvious. Their body panels are falling off and showing how little there actually is to their vehicles :D
Assuming that Tesla goes bankrupt, actually shuts down forever, and shuts its servers down…
At a minimum someone would have to find out where the software sends and receives data from. Then you’d have to reverse engineer the software to control the vehicles.
Then you’d have to reprogram the software to send to your C&C server. I don’t think it would really take all that much to host that… it’s getting there that’s difficult.
I’d have to have friends across the internet that wanted files first…
I’ve been using Pinchflat, which is essentially a front end for yt-dlp, and it’s been working fine for me. Mind you, I have all YouTube traffic from Pinchflat running through a VPN to a different country, but that’s because that system also sails the high seas 🏴☠️😂
😂 kinda figured.
100%. That’s how I started, and that’s how I continue to operate. I currently have a few HP ProDesk and EliteDesk mini PCs, my old desktop converted into a Proxmox node that runs OPNsense as a VM, and an even older desktop that runs TrueNAS. However, I would like to replace my current TrueNAS system with something newer and lower power, as it consumes quite a bit for what it’s doing.
“Yit”? From a search of GitHub I’m not seeing a YouTube downloader that rivals yt-dlp… got a link?
Just recently started using it again… but that’s mainly because I found Pinchflat, which gives a great front end for yt-dlp and a bunch of its options. Mainly I thought it was a great project I could use to start archiving channels I want down the line, like all of Demolition Ranch, but I came to realize it’s just a great front end for yt-dlp lol
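For anyone curious what a front end like this is automating under the hood, a hand-rolled sketch might look something like this — the channel URL, archive file, and output template are just illustrative placeholders, not Pinchflat’s actual defaults:

```shell
# Illustrative channel-archiving run with yt-dlp.
# --download-archive records each downloaded video ID so re-runs
# only grab new uploads; the URL and paths below are examples.
yt-dlp \
  --download-archive archive.txt \
  -o "%(uploader)s/%(upload_date)s - %(title)s.%(ext)s" \
  "https://www.youtube.com/@DemolitionRanch"
```

Run it on a schedule (cron, systemd timer, etc.) and you get roughly the same “keep this channel archived” behavior.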
To add:
I follow these and some others whose names I can’t think of right now, but some great resources!
Just found Redirecterr and set that up, but that’s just for me since no one else seems to use Overseerr.
Purchased a new-to-me EOL enterprise switch that will let me expand my network while replacing existing hardware that is limited. It also lets me move to 10G networking, woot!
Find something that interests you, and look at the docs for how to get started. It really is the easiest way to learn and get involved in self-hosting.
Sounds like you should be good there then!
@xanza@lemm.ee has a great response and also suggests using AdGuard Home instead, which is what I run as well. The biggest benefit AGH has over Pi-hole for my family is that you can very easily define a client and the IPs that pertain to that client… so I can define a single client for all of my devices, a single client for each of my kids, etc.
Then from there I can block specific services like social media platforms per client group, or allow them. And similar to Pi-hole, I can set up all the blocklists that I want and it’ll block them across all clients.
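As a rough sketch of what a persistent client looks like in AdGuardHome.yaml — the exact keys vary between AGH versions, the names and IPs here are made up, and in practice you’d normally do this through the web UI:

```yaml
clients:
  persistent:
    - name: kid-1              # made-up client name
      ids:
        - 192.168.1.20         # example device IPs grouped under this client
        - 192.168.1.21
      use_global_settings: true
      blocked_services:
        ids:
          - tiktok             # per-client service blocks
          - youtube
```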
For my kids, this means it’s blocking all those pesky ads that pop up in games getting them to go and download more mind numbing and draining games…
Finally, I can keep tabs on my network traffic and see what individual devices are accessing what domains; however, this doesn’t mean that I can see the individual web pages.
I have two AGH instances set up on two different hosts, plus an AdGuardHome-sync container that syncs between the two instances to make sure all settings are mirrored.
Honestly, I think this might be a better way than what I’m using now. I’ve subscribed to dockerrelease.io (edit: docker-notify.com) and releasealert.dev… and I get spammed all day every day because the devs keep pushing all sorts of updates to old branches… or because those sites aren’t configured well.
I agree that you’ll want to figure out inter-pod networking.
In Docker, you can create a specific “external” network (external to the compose stack, as I understand it), then attach each compose stack to that network and have containers talk to each other using their hostnames.
Personally, I would avoid host network mode, as you expose those containers to the world (good if you want that, bad if you don’t)… possibly the same with using the public IP address of your instance.
You could alternatively bind the ports to 127.0.0.1, which would restrict them from being exposed to the internet… (see above)
So just depends on how you want to approach it.
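To sketch the external-network and loopback-binding approaches together — the network name, service name, image, and ports below are just examples:

```yaml
# Create the shared network once, outside of any stack:
#   docker network create shared-net
services:
  app:
    image: nginx:alpine        # example service
    networks:
      - shared-net
    ports:
      - "127.0.0.1:8080:80"    # reachable from the host only, not the internet
networks:
  shared-net:
    external: true             # other compose stacks can join this same
                               # network and reach "app" by its hostname
```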
I am running AdGuard Home DNS, not Pi-hole… but same idea. I have AGH running in two LXCs (containers) on Proxmox. I have all DHCP zones configured to point to both instances, and I never reboot both at the same time. Additionally, I check that the service is back up on one instance before I reboot the other.
Outside of that, there’s really no other approach.
You would still need at least two DNS servers, but you could set up some sort of virtual IP or load-balancing IP and configure DHCP to point to that IP, so when one instance goes down it fails over to the other.
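One common way to do the virtual-IP part is keepalived (VRRP). A minimal sketch, assuming a made-up VIP of 192.168.1.53 and interface eth0:

```
# /etc/keepalived/keepalived.conf on the primary DNS host;
# the backup host uses "state BACKUP" and a lower priority.
vrrp_instance DNS_VIP {
    state MASTER
    interface eth0            # assumed interface name
    virtual_router_id 53      # must match on both hosts
    priority 150              # higher wins; the backup might use 100
    advert_int 1
    virtual_ipaddress {
        192.168.1.53/24       # the VIP your DHCP hands out as the DNS server
    }
}
```

DHCP then advertises only 192.168.1.53; whichever host currently holds the VIP answers queries, and the other takes over if it goes down.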
I’ve got qbit running in the following configuration…
Core i5-8500T hardware > Proxmox > Ubuntu virtual machine > Docker > qBit. Currently seeding nearly 500 torrents and using less than 4 GB of RAM. qBit itself runs on an SSD, while all torrent files live on hard drives over an SMB share. Something is up with your system.