Links are almost always base64-encoded now, and the online URL decoders always produce garbage. I was wondering if there is a project out there that would allow me to self-host this type of tool?
I’d probably network this container through gluetun because, yanno, privacy.
Edit to add: It doesn’t have to be specifically base64-focused. Any link decoder that I can use in a privacy-respecting way would be welcome.
Edit 2: See if your solution will decode this link (the one in the image): https://link.sfchronicle.com/external/41488169.38548/aHR0cHM6Ly93d3cuaG90ZG9nYmlsbHMuY29tL2hhbWJ1cmdlci1tb2xkcy9idXJnZXItZG9nLW1vbGQ_c2lkPTY4MTNkMTljYzM0ZWJjZTE4NDA1ZGVjYSZzcz1QJnN0X3JpZD1udWxsJnV0bV9zb3VyY2U9bmV3c2xldHRlciZ1dG1fbWVkaXVtPWVtYWlsJnV0bV90ZXJtPWJyaWVmaW5nJnV0bV9jYW1wYWlnbj1zZmNfYml0ZWN1cmlvdXM/6813d19cc34ebce18405decaB7ef84e41 (it should decode to this page: https://www.hotdogbills.com/hamburger-molds)
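For context, the decoding step itself is pretty simple: the destination URL is hidden in one path segment as URL-safe base64 with the padding stripped. Here's a rough Python sketch of what I'd want such a tool to do under the hood (just an illustration, not any particular project's code):

```python
# Rough sketch: find the path segment that is a URL-safe-base64-encoded URL and decode it.
import base64
import sys
from urllib.parse import urlparse

def decode_embedded_url(link: str) -> str | None:
    for segment in urlparse(link).path.split("/"):
        # These redirectors use the URL-safe alphabet (- and _) and drop the '=' padding,
        # so add the padding back before decoding.
        padded = segment + "=" * (-len(segment) % 4)
        try:
            decoded = base64.urlsafe_b64decode(padded).decode("utf-8")
        except Exception:
            continue  # not base64, or not text: try the next segment
        if decoded.startswith(("http://", "https://")):
            return decoded
    return None

if __name__ == "__main__":
    # Pass the tracking link as the first command-line argument.
    print(decode_embedded_url(sys.argv[1]))
```

Run against the sfchronicle link above, that should print the full hotdogbills.com URL, query string and all.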
I’ve been working on a URL cleaning tool for almost 2 years now and just committed support for that type of URL. I’ll release it to crates.io shortly after Rust 1.90 on the 18th.
https://github.com/Scripter17/url-cleaner
It has 3 frontends right now: a CLI, an HTTP server with a userscript that cleans every URL on every webpage you visit, and a Discord bot. If you want any other integration, let me know and I’ll see what I can do.
Also, amusingly, you decoded the base64 wrong. You forgot to change the _ to / and thus missed the /burger-dog-mold and the tracking parameter garbage at the end. I made sure to remove the tracking parameters.
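To illustrate with a made-up URL (not the one from the post): URL-safe base64 swaps / for _ and + for -, so a standard-alphabet decoder trips over the _ unless you swap it back first.

```python
# Made-up example showing the URL-safe vs. standard base64 alphabet mix-up.
import base64

plain = b"https://example.com/abc?x=1"      # just an illustration, not the link from the post
encoded = base64.urlsafe_b64encode(plain)   # b'aHR0cHM6Ly9leGFtcGxlLmNvbS9hYmM_eD0x'

# A standard-alphabet decoder rejects the '_' outright (or silently drops data,
# depending on the tool):
try:
    base64.b64decode(encoded, validate=True)
except Exception as err:
    print("standard decode failed:", err)

# Mapping '_' back to '/' (which is all urlsafe_b64decode does) recovers the full URL,
# query string and all:
print(base64.urlsafe_b64decode(encoded))    # b'https://example.com/abc?x=1'
```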