SCIENTIFIC-LINUX-USERS Archives

May 2012

SCIENTIFIC-LINUX-USERS@LISTSERV.FNAL.GOV

Subject:
From: Corey Quinn <[log in to unmask]>
Reply-To: Corey Quinn <[log in to unmask]>
Date: Fri, 11 May 2012 12:41:48 -0700
Content-Type: multipart/signed
Parts/Attachments: text/plain (1249 bytes), signature.asc (496 bytes)

On May 11, 2012, at 12:33 PM, Christopher Tooley wrote:

> Hello All,
> 
> I've been asked to whitelist websites for a local user here; apparently the internet is extremely distracting for work, save for certain sites. Has anyone done something like this before? I know I could put IPs and website addresses in /etc/hosts, but I don't want to have to fix the hosts file whenever those IPs change.
> 
> This will be entirely for one computer.

My response is probably off topic from a strictly technical standpoint. If you're interested only in technical answers, you should stop reading now.

That being said, it sounds an awful lot like somebody is trying to solve a political problem via technical means, and that doesn't "work." More to the point, you're going to invest a lot of time updating the whitelist, chasing DNS changes, whitelisting CDNs, and troubleshooting odd connectivity failures when sites assume their third-party dependencies are reachable.
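To make the maintenance problem concrete: the /etc/hosts variant you mention amounts to a hand-pinned name list, something like the sketch below (the addresses are illustrative, and this only approximates a whitelist if you also point /etc/resolv.conf away from any working DNS server so that unlisted names fail to resolve):

    # /etc/hosts -- hand-maintained pins; each entry goes stale
    # the moment the site renumbers
    93.184.216.34    www.example.com
    198.51.100.7     docs.example.org

Every IP change means editing this file again, which is exactly the churn you said you want to avoid.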

If you're set on going this route, I'd suggest an actual content filtering package rather than reinventing the wheel; a purpose-built tool holds up better, and you get to pass the support burden off to a vendor (if you go commercial) or to the community (if you go open source). One common open-source shape for this is sketched below.
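For example, a whitelist-only Squid proxy. The ACL syntax here is standard Squid, but treat the file paths and port as defaults you'd want to verify rather than a tested configuration:

    # /etc/squid/squid.conf -- allow only listed domains, deny the rest
    acl whitelist dstdomain "/etc/squid/whitelist.txt"
    http_access allow whitelist
    http_access deny all
    http_port 3128

where /etc/squid/whitelist.txt holds one domain per line (a leading dot matches subdomains, e.g. ".fnal.gov"). You'd then point the user's browser at the proxy, or force it with a firewall rule, and the policy lives in one file instead of in /etc/hosts.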

-- Corey
