More Configuration Details
Controlling who may use you as a proxy server is only one of the functions of ACLs. ACLs are also used for cache hierarchies. You first define an ACL, and then deny or allow access to a function of the cache. In 99% of cases this function will be "http_access", which allows or denies a web browser access to your cache. We will use this as the example throughout, though the same principles apply to the other options (such as "icp_access").
Squid works its way through the http_access list from top to bottom when deciding which class you fall into, and hence whether you are denied or allowed access. So if you have a /24 network (commonly, if not quite correctly, called a class C) and you want to allow only those machines access to the web through the proxy, you would use the following (assuming that you want the range 196.4.160.0 - 196.4.160.255 to have access):
acl ourallowedhosts src 196.4.160.0/255.255.255.0
acl all src 0.0.0.0/0.0.0.0

http_access allow ourallowedhosts
http_access deny all
The "src" option on the first line is one of the options you can use to decide which acl a client falls into. You can also match on things like the current time or the site that the client is going to. For more options have a look at the /usr/local/squid/etc/squid.conf.default file that squid installs on your system.
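As a brief sketch of two such options (the acl names "workhours" and "newssites" and the domain are invented for the example - they are not defaults):

# Match requests made during office hours, Monday to Friday
acl workhours time MTWHF 08:00-17:00
# Match requests destined for a particular domain
acl newssites dstdomain cnn.com

You could then use these names on http_access lines in exactly the same way as "ourallowedhosts" above.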
If a user from 196.4.160.* connects using TCP and requests a URL, Squid works its way through the list of http_access lines (since it is a TCP connection, the client will use HTTP to request the object). It works through this list from TOP to BOTTOM, stopping after the FIRST match. In this case Squid matches on the first http_access line; since the policy on that line is allow, squid proceeds to allow the request.
Re-writing the acl list above as follows:
acl ourallowedhosts src 196.4.160.0/255.255.255.0
acl all src 0.0.0.0/0.0.0.0

http_access deny all
http_access allow ourallowedhosts
will not work, as squid will match every request on the first "http_access" line and deny all users access.
More advanced options
The default squid.conf has a few default settings in another form:
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl all src 0.0.0.0/0.0.0.0

http_access deny manager !localhost
http_access allow all
The "proto" field in the first line means that the acl matches a specific protocol, in this case the "cache_object" protocol. It could just as easily be the "ftp" or "http" protocol. If you haven't heard of the "cache_object" protocol, don't worry - it's a squid-only protocol that returns information to the requester about how the cache is configured and how it is running. It falls in the "http_access" section of the config because it is essentially an HTTP request to squid, but instead of connecting somewhere else to fetch the page the URL refers to, Squid manufactures the information itself.
The above example therefore says: if a connection tries to use the cache_object protocol (as defined in the manager acl), deny it unless it comes from the localhost acl. Thus a program running on the cache server itself can get information about squid's internal status, but no outside machine can. (Remember that the character "!" means NOT - so we are saying "deny manager NOT localhost".) We then allow client machines on any network access.
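In practice you would normally tighten that final permissive line. As a sketch, combining the defaults with the earlier /24 example (the 196.4.160.0 range is just the network used throughout this section):

acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl ourallowedhosts src 196.4.160.0/255.255.255.0
acl all src 0.0.0.0/0.0.0.0

http_access deny manager !localhost
http_access allow ourallowedhosts
http_access deny all

Only the local cache server can query the manager interface, only our own network can browse, and everyone else is denied.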
Destination address based acls
One thing that people quite often need is to deny access to a list of sites that are "inappropriate". Squid is NOT optimised to do this for a large number of sites, but it will handle a sufficient number for most people without much of a problem.
acl adults dstdomain playboy.com sex.com
acl ourallowedhosts src 196.4.160.0/255.255.255.0
acl all src 0.0.0.0/0.0.0.0
http_access deny adults
http_access allow ourallowedhosts
http_access deny all
This means that machines requesting URLs at playboy.com or sex.com will match on the first http_access deny line: when the http_access lists are checked, all machines requesting adult sites (even machines that are not ours) are denied. If a request passes that rule, we check whether it comes from our IP address range; if so, it is allowed access. If it isn't in our address range it matches the all acl, hits the last http_access rule, and is denied.
There is a problem with acls like this, though: if the person specifies the machine (like www.playboy.com) by its IP address, the acls won't match (they are looking for the DOMAIN in the request), so you also need a rule using a dst acl, and deny anything that matches it.
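As a sketch of such a rule (the acl name "adultips" and the 192.0.2.0 range are invented for illustration - you would have to look up the real addresses of the sites you want to block, and keep the list up to date yourself):

# Match by destination IP address as well as by domain name
acl adultips dst 192.0.2.0/255.255.255.0

http_access deny adults
http_access deny adultips
http_access allow ourallowedhosts
http_access deny all

Requests are now denied whether the client asks for the site by name or by address, and the rest of the list behaves as before.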
The Squid Users guide is copyright Oskar Pearson email@example.com
If you like the layout (I do), I can only thank William Mee and hope he forgives me for stealing it