Say I have three Squid proxies, running in accelerator mode, out in front
of a single webserver.
My goal: When a proxy receives a request for a document, it first checks
the others to see if there's a cached copy available. If yes, it retrieves
the document from the other proxy, stores a copy, and responds to the
request. If (and only if) no, it fetches the document directly from the
webserver.
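To pin down exactly what I mean, here's a toy model of the lookup order I'm after. The function and cache structures are hypothetical stand-ins for what Squid does internally, just to make the intent unambiguous:

```python
def handle_request(url, local_cache, sibling_caches):
    """Serve url: local cache first, then siblings (ICP), else the origin."""
    if url in local_cache:                    # local cache HIT
        return local_cache[url], "LOCAL_HIT"
    for sibling in sibling_caches:            # ICP query to each sibling
        if url in sibling:                    # ICP_HIT: fetch from sibling,
            local_cache[url] = sibling[url]   # store a copy locally,
            return sibling[url], "SIBLING_HIT"
    doc = "origin:" + url                     # only now hit the webserver
    local_cache[url] = doc
    return doc, "DIRECT"
```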
Security and performance aside, from the documentation I've found, I think
you'd start with a squid.conf like this:
------------------------------------------
# proxy1 192.168.168.1
# proxy2 192.168.168.2
# proxy3 192.168.168.3 <--me!
# query the other two proxies as ICP siblings (HTTP port 80, ICP port 3130)
cache_peer 192.168.168.1 sibling 80 3130
cache_peer 192.168.168.2 sibling 80 3130

# accelerator (reverse-proxy) setup
http_port 80
httpd_accel_port 80
httpd_accel_host CACHE
httpd_accel_with_proxy on
httpd_accel_uses_host_header on
redirect_rewrites_host_header off
emulate_httpd_log on

# don't cache dynamic content, and fetch it directly rather than via peers
hierarchy_stoplist cgi-bin ?
acl QUERY urlpath_regex cgi-bin \?
no_cache deny QUERY

refresh_pattern . 3600 100% 3600 ignore-reload

# accept HTTP requests from anyone; ICP queries only from the siblings
acl all src 0.0.0.0/0.0.0.0
acl siblings src 192.168.168.1/255.255.255.255
acl siblings src 192.168.168.2/255.255.255.255
http_access allow all
http_reply_access allow all
icp_access allow siblings
------------------------------------------
. . . but it doesn't work. As far as I can tell from the logs, the proxy
handling the web client's request does query the other proxies via ICP, but
it always retrieves the document from the webserver, even when a cached
copy is available on another proxy.
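For what it's worth, here's the quick sketch I use to tally how requests were forwarded, from the hierarchy field of a native-format access.log. (This assumes emulate_httpd_log is off; the Apache-style log enabled in the config above drops that field.)

```python
from collections import Counter

def tally_hierarchy(lines):
    """Count Squid hierarchy codes (DIRECT, SIBLING_HIT, ...) in a
    native-format access.log, whose ninth field is e.g. DIRECT/10.0.0.1."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        if len(fields) >= 9:
            hierarchy = fields[8].split("/")[0]  # strip the peer address
            counts[hierarchy] += 1
    return counts
```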
Am I missing something really obvious?!
thanks,
jg
Received on Wed Jan 21 2004 - 18:40:00 MST