Implementing a form of website whitelisting today, is it actually possible?
from dontblink@feddit.it to programming@programming.dev on 29 Jan 22:07
https://feddit.it/post/14437356

With all the CDNs and content being served from several locations for a single web page, for example, would it be possible to implement a maintainable whitelist in something like a proxy? Does it make sense? Or would I break half of the websites?

#programming

onlinepersona@programming.dev on 29 Jan 22:45

What is your goal? I’m not sure I understand. Are you trying to rebuild something like Decentraleyes?

Anti Commercial-AI license

dontblink@feddit.it on 30 Jan 09:52

No, it’s more of a user management thing; I would need users to access only a certain list of whitelisted websites…

Maybe a proxy or DNS? I’ve been looking into Squid proxy, but it looks fairly complicated, especially if I want to be able to access it from the WAN… But I don’t know if with DNS I could block IPs as well. Setting up a hosts file seems like a lot of continuous work, since I would have to specify entries for each IP address associated with a domain… Maybe a firewall?

moonpiedumplings@programming.dev on 30 Jan 20:08

Yeah, you probably want a proxy-based solution: have a network with no internet access except through a proxy that you control.

You would also have to lock DNS down. The problem with DNS-based blocks is that things like DNS over HTTPS allow people to use an alternative DNS server. But if you control the devices you are managing, then you can also control which DNS server they use.
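For illustration, here is a minimal sketch of the proxy-side allowlist described above, written as a mitmproxy addon. The domain list is a placeholder, and mitmproxy itself is only an assumption here (Squid or any other forward proxy could play the same role); intercepting HTTPS this way also requires installing mitmproxy's CA certificate on the managed devices.

```python
# allowlist.py - mitmproxy addon: refuse any request whose host is not allowlisted.
# Run with: mitmdump -s allowlist.py
from mitmproxy import http

# Placeholder allowlist; replace with the sites you actually want to permit.
ALLOWED_DOMAINS = {"example.com", "wikipedia.org"}


def _is_allowed(host: str) -> bool:
    # Allow the domain itself and any subdomain (covers CDN subdomains such as
    # static.example.com without listing each one separately).
    return any(host == d or host.endswith("." + d) for d in ALLOWED_DOMAINS)


def request(flow: http.HTTPFlow) -> None:
    if not _is_allowed(flow.request.pretty_host):
        # Answer locally with a 403 instead of forwarding the request upstream.
        flow.response = http.Response.make(
            403, b"Blocked by allowlist\n", {"Content-Type": "text/plain"}
        )
```

Clients would then be forced through the proxy (for example by a default-deny egress rule on the gateway), so the allowlist cannot be bypassed by simply ignoring the proxy setting.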

dontblink@feddit.it on 30 Jan 21:39

Do you think a proxy would be better in this regard compared to a firewall? I was trying to watch the ufw logs today to see if I could do something there, but there are A LOT of incoming and outgoing connections, and I would essentially like to whitelist both per domain and per IP.

How much maintenance would this require? I wonder how often IPs change today, but with all the NAT, dynamic DNS and CDNs around, maintaining a whitelist with only IP addresses looks like a nightmare…

Maybe Squid proxy with SquidGuard would be a better option than trying to work with a firewall?
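To make the maintenance problem concrete, below is a small sketch (plain Python; the domain names and the exact ufw rule form are illustrative) that resolves each allowlisted domain to whatever addresses DNS returns right now and prints matching ufw rules for a default-deny outgoing policy. CDN-backed domains rotate addresses, so something like this would have to be re-run on a schedule, which is exactly why filtering by hostname at a proxy tends to be less brittle than filtering by IP at a firewall.

```python
# resolve_allowlist.py - sketch: turn a domain allowlist into firewall rules by
# resolving each domain to the addresses DNS returns right now. CDN-backed
# domains rotate addresses, so this would need to be re-run regularly (cron),
# which is the maintenance burden in question.
import socket

# Placeholder allowlist.
ALLOWED_DOMAINS = ["example.com", "wikipedia.org"]


def current_ips(domain: str) -> set[str]:
    # getaddrinfo returns (family, type, proto, canonname, sockaddr) tuples;
    # sockaddr[0] is the IPv4 or IPv6 address.
    infos = socket.getaddrinfo(domain, 443, proto=socket.IPPROTO_TCP)
    return {info[4][0] for info in infos}


if __name__ == "__main__":
    # Assuming a default-deny egress policy ("ufw default deny outgoing"),
    # print an allow rule for each address currently backing an allowed domain.
    for domain in ALLOWED_DOMAINS:
        for ip in sorted(current_ips(domain)):
            print(f"ufw allow out to {ip} port 443 proto tcp  # {domain}")
```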

MajorHavoc@programming.dev on 29 Jan 22:55

If you’re looking to keep website content from running without permission, browsing with JavaScript disabled generally gets the job done. Each site I enable JavaScript for is effectively “whitelisted” to serve me app-like features beyond static content.

furrowsofar@beehaw.org on 30 Jan 00:55

The NoScript browser extension is one example. DNS filtering is another.

Kissaki@programming.dev on 30 Jan 16:33

Yes, it would be possible.

dontblink@feddit.it on 30 Jan 17:31

Any suggestions on the how?

Kissaki@programming.dev on 30 Jan 19:19

Not enough context. Depending on what you proxy, you can allowlist DNS names and IPs. You can use DNS to query domains for alternative delivery sources.
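As a sketch of the “query DNS for alternative delivery sources” idea: the snippet below (using the third-party dnspython package; the hostname is a placeholder) follows a site’s CNAME chain and its current A records, which reveals the CDN hosts and addresses that would also need to be on the allowlist.

```python
# discover_sources.py - sketch: use DNS to see which delivery hosts (CDNs, etc.)
# actually serve an allowlisted site, so they can be added to the allowlist too.
# Requires the third-party dnspython package: pip install dnspython
import dns.resolver

# Placeholder hostname to inspect.
SITE = "www.example.com"


def delivery_chain(name: str) -> None:
    # A site fronted by a CDN typically aliases its hostname (CNAME) to the
    # CDN's domain, which is what would also need to be allowed.
    try:
        for rr in dns.resolver.resolve(name, "CNAME"):
            print(f"{name} is an alias for {rr.target}")
    except dns.resolver.NoAnswer:
        print(f"{name} has no CNAME record")
    # The addresses the name resolves to at this moment.
    for rr in dns.resolver.resolve(name, "A"):
        print(f"{name} currently resolves to {rr.address}")


if __name__ == "__main__":
    delivery_chain(SITE)
```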

moonpiedumplings@programming.dev on 30 Jan 20:05

NoScript. uBlock Origin strict/expert mode, where you must manually accept connections. DNS filtering. A SOCKS proxy. A VPN. Etc.; there are many ways.

But which of these methods is best depends on what you are trying to do.