<?xml version="1.0" encoding="UTF-8" ?>
<!-- $LastChangedRevision$ -->
<!--
Licensed to the Apache Software Foundation (ASF) under one or more
contributor license agreements. See the NOTICE file distributed with
this work for additional information regarding copyright ownership.
The ASF licenses this file to You under the Apache License, Version 2.0
(the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
-->
<manualpage metafile="rewrite_guide.xml.meta">
<parentdocument href="./">Rewrite</parentdocument>
<title>URL Rewriting Guide</title>
<summary>
<p>This document supplements the <module>mod_rewrite</module>
reference documentation. It describes how one can use Apache's
<module>mod_rewrite</module> to solve typical URL-based problems
with which webmasters are commonly confronted. We give detailed
descriptions on how to solve each problem by configuring URL
rewriting rulesets.</p>
<note type="warning">ATTENTION: Depending on your server configuration
it may be necessary to slightly change the examples for your
situation, e.g., adding the <code>[PT]</code> flag when
additionally using <module>mod_alias</module> and
<module>mod_userdir</module>, etc. Or rewriting a ruleset
to fit in <code>.htaccess</code> context instead
of per-server context. Always try to understand what a
particular ruleset really does before you use it. This
avoids many problems.</note>
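<p>As a sketch of that last point (the paths and names here are
hypothetical, chosen only for illustration): a per-server rule
matches the full URL-path and keeps its leading slash, while the
same rule moved into a <code>.htaccess</code> file matches relative
to that directory and typically needs a
<directive module="mod_rewrite">RewriteBase</directive>:</p>
<example><pre>
# per-server context (httpd.conf):
RewriteRule   ^/docs/(.*)   /newdocs/$1   [R]

# equivalent rule in a .htaccess file at the document root:
RewriteEngine on
RewriteBase   /
RewriteRule   ^docs/(.*)    /newdocs/$1   [R]
</pre></example>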
</summary>
<seealso><a href="../mod/mod_rewrite.html">Module
documentation</a></seealso>
<seealso><a href="rewrite_intro.html">mod_rewrite
introduction</a></seealso>
<seealso><a href="rewrite_guide_advanced.html">Advanced
useful examples</a></seealso>
<section id="canonicalurl">
<title>Canonical URLs</title>
<dl>
<dt>Description:</dt>
<dd>
<p>On some webservers there is more than one URL for a
resource. Usually there are canonical URLs (which should be
actually used and distributed) and those which are just
shortcuts, internal ones, etc. Independent of which URL the
user supplied with the request, he should finally see the
canonical one only.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We do an external HTTP redirect for all non-canonical
URLs to fix them in the location view of the Browser and
for all subsequent requests. In the example ruleset below
we replace <code>/~user</code> by the canonical
<code>/u/user</code> and fix a missing trailing slash for
<code>/u/user</code>.</p>
<example><pre>
RewriteRule ^/<strong>~</strong>([^/]+)/?(.*) /<strong>u</strong>/$1/$2 [<strong>R</strong>]
RewriteRule ^/u/(<strong>[^/]+</strong>)$ /u/$1<strong>/</strong> [<strong>R</strong>]
</pre></example>
</dd>
</dl>
</section>
<section id="canonicalhost"><title>Canonical Hostnames</title>
<dl>
<dt>Description:</dt>
<dd>The goal of this rule is to force the use of a particular
hostname, in preference to other hostnames which may be used to
reach the same site. For example, if you wish to force the use
of <strong>www.example.com</strong> instead of
<strong>example.com</strong>, you might use a variant of the
following recipe.</dd>
<dt>Solution:</dt>
<dd>
<p>For sites running on a port other than 80:</p>
<example><pre>
RewriteCond %{HTTP_HOST} !^www\.example\.com [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteCond %{SERVER_PORT} !^80$
RewriteRule ^/?(.*) http://www.example.com:%{SERVER_PORT}/$1 [L,R,NE]
</pre></example>
<p>And for a site running on port 80</p>
<example><pre>
RewriteCond %{HTTP_HOST} !^www\.example\.com [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/?(.*) http://www.example.com/$1 [L,R,NE]
</pre></example>
<p>
If you wanted to do this generically for all domain names - that
is, if you want to redirect <strong>example.com</strong> to
<strong>www.example.com</strong> for all possible domain names, you
could use the following recipe:</p>
<example><pre>
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/?(.*) http://www.%{HTTP_HOST}/$1 [L,R,NE]
</pre></example>
<p>These rulesets will work either in your main server configuration
file, or in a <code>.htaccess</code> file placed in the <directive
module="core">DocumentRoot</directive> of the server.</p>
</dd>
</dl>
</section>
<section id="moveddocroot">
<title>Moved <code>DocumentRoot</code></title>
<dl>
<dt>Description:</dt>
<dd>
<p>Usually the <directive module="core">DocumentRoot</directive>
of the webserver directly relates to the URL "<code>/</code>".
But often this data is not really of top-level priority. For example,
you may wish for visitors, on first entering a site, to go to a
particular subdirectory <code>/about/</code>. This may be accomplished
using the following ruleset:</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We redirect the URL <code>/</code> to
<code>/about/</code>:
</p>
<example><pre>
RewriteEngine on
RewriteRule <strong>^/$</strong> /about/ [<strong>R</strong>]
</pre></example>
<p>Note that this can also be handled using the <directive
module="mod_alias">RedirectMatch</directive> directive:</p>
<example>
RedirectMatch ^/$ http://example.com/about/
</example>
<p>Note also that the example rewrites only the root URL. That is, it
rewrites a request for <code>http://example.com/</code>, but not a
request for <code>http://example.com/page.html</code>. If you have in
fact changed your document root - that is, if <strong>all</strong> of
your content is in fact in that subdirectory, it is greatly preferable
to simply change your <directive module="core">DocumentRoot</directive>
directive, or move all the content up one directory, rather than
rewriting URLs.</p>
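<p>A minimal sketch of that preferred fix (the filesystem path shown
is hypothetical): point
<directive module="core">DocumentRoot</directive> at the
subdirectory itself, so no rewriting is needed at all:</p>
<example><pre>
# httpd.conf - serve the subdirectory directly as the site root
DocumentRoot "/usr/local/apache/htdocs/about"
</pre></example>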
</dd>
</dl>
</section>
<section id="trailingslash">
<title>Trailing Slash Problem</title>
<dl>
<dt>Description:</dt>
<dd><p>The vast majority of "trailing slash" problems can be dealt
with using the techniques discussed in the <a
href="http://httpd.apache.org/docs/misc/FAQ-E.html#set-servername">FAQ
entry</a>. However, occasionally, there is a need to use mod_rewrite
to handle a case where a missing trailing slash causes a URL to
fail. This can happen, for example, after a series of complex
rewrite rules.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>The solution to this subtle problem is to let the server
add the trailing slash automatically. To do this
correctly we have to use an external redirect, so the
browser correctly requests subsequent images etc. If we
only did an internal rewrite, this would only work for the
directory page, but would go wrong when any images are
included in this page with relative URLs, because the
browser would request an in-lined object. For instance, a
request for <code>image.gif</code> in
<code>/~quux/foo/index.html</code> would become
<code>/~quux/image.gif</code> without the external
redirect!</p>
<p>So, to do this trick we write:</p>
<example><pre>
RewriteEngine on
RewriteBase /~quux/
RewriteRule ^foo<strong>$</strong> foo<strong>/</strong> [<strong>R</strong>]
</pre></example>
<p>Alternately, you can put the following in a
top-level <code>.htaccess</code> file in the content directory.
But note that this creates some processing overhead.</p>
<example><pre>
RewriteEngine on
RewriteBase /~quux/
RewriteCond %{REQUEST_FILENAME} <strong>-d</strong>
RewriteRule ^(.+<strong>[^/]</strong>)$ $1<strong>/</strong> [R]
</pre></example>
</dd>
</dl>
</section>
<section id="movehomedirs">
<title>Move Homedirs to Different Webserver</title>
<dl>
<dt>Description:</dt>
<dd>
<p>Many webmasters have asked for a solution to the
following situation: They want to redirect all
homedir URLs on one webserver to another webserver. Such a
setup is typically needed when establishing a newer
webserver which will replace the old one over time.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>The solution is trivial with <module>mod_rewrite</module>.
On the old webserver we just redirect all
<code>/~user/anypath</code> URLs to
<code>http://newserver/~user/anypath</code>:</p>
<example><pre>
RewriteEngine on
RewriteRule ^/~(.+) http://<strong>newserver</strong>/~$1 [R,L]
</pre></example>
</dd>
</dl>
</section>
<section id="multipledirs">
<title>Search for pages in more than one directory</title>
<dl>
<dt>Description:</dt>
<dd>
<p>Sometimes it is necessary to let the webserver search
for pages in more than one directory. Here MultiViews or
other techniques cannot help.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We program an explicit ruleset which searches for the
files in the directories.</p>
<example><pre>
RewriteEngine on
# first try to find it in dir1/...
# ...and if found stop and be happy:
RewriteCond %{DOCUMENT_ROOT}/<strong>dir1</strong>/%{REQUEST_URI} -f
RewriteRule ^(.+) %{DOCUMENT_ROOT}/<strong>dir1</strong>/$1 [L]
# second try to find it in dir2/...
# ...and if found stop and be happy:
RewriteCond %{DOCUMENT_ROOT}/<strong>dir2</strong>/%{REQUEST_URI} -f
RewriteRule ^(.+) %{DOCUMENT_ROOT}/<strong>dir2</strong>/$1 [L]
# else go on for other Alias or ScriptAlias directives,
# etc.
RewriteRule ^(.+) - [PT]
</pre></example>
</dd>
</dl>
</section>
<section id="setenvvars">
<title>Set Environment Variables According To URL Parts</title>
<dl>
<dt>Description:</dt>
<dd>
<p>Perhaps you want to keep status information between
requests and use the URL to encode it. But you don't want
to use a CGI wrapper for all pages just to strip out this
information.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We use a rewrite rule to strip out the status information
and remember it via an environment variable which can be
later dereferenced from within XSSI or CGI. This way a
URL <code>/foo/S=java/bar/</code> gets translated to
<code>/foo/bar/</code> and the environment variable named
<code>STATUS</code> is set to the value "java".</p>
<example><pre>
RewriteEngine on
RewriteRule ^(.*)/<strong>S=([^/]+)</strong>/(.*) $1/$3 [E=<strong>STATUS:$2</strong>]
</pre></example>
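<p>The variable can then be read from the delivered page. For
instance, an XSSI document (assuming <module>mod_include</module>
is enabled for it; the markup below is just an illustrative
sketch) can echo the value set by the rule above:</p>
<example><pre>
&lt;!--#echo var="STATUS" --&gt;
</pre></example>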
</dd>
</dl>
</section>
<section id="uservhosts">
<title>Virtual Hosts Per User</title>
<dl>
<dt>Description:</dt>
<dd>
<p>Assume that you want to provide
<code>www.<strong>username</strong>.host.com</code>
for the homepage of <em>username</em> via just DNS A records to the
same machine and without any virtualhosts on this
machine.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>For HTTP/1.0 requests there is no solution, but for
HTTP/1.1 requests which contain a <code>Host:</code> HTTP header
we can use the following ruleset to rewrite
<code>http://www.<em>username</em>.host.com/anypath</code>
internally to <code>/home/<em>username</em>/anypath</code>:</p>
<example><pre>
RewriteEngine on
RewriteCond %{<strong>HTTP_HOST</strong>} ^www\.<strong>([^.]+)</strong>\.host\.com$
RewriteRule ^(.*) /home/<strong>%1</strong>$1
</pre></example>
<p>Parentheses used in a <directive
module="mod_rewrite">RewriteCond</directive> are captured into the
backreferences <code>%1</code>, <code>%2</code>, etc, while parentheses
used in <directive module="mod_rewrite">RewriteRule</directive> are
captured into the backreferences <code>$1</code>, <code>$2</code>,
etc.</p>
</dd>
</dl>
</section>
<section id="redirecthome">
<title>Redirect Homedirs For Foreigners</title>
<dl>
<dt>Description:</dt>
<dd>
<p>We want to redirect homedir URLs to another webserver
<code>www.somewhere.com</code> when the requesting user
does not stay in the local domain
<code>ourdomain.com</code>. This is sometimes used in
virtual host contexts.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>Just a rewrite condition:</p>
<example><pre>
RewriteEngine on
RewriteCond %{REMOTE_HOST} <strong>!^.+\.ourdomain\.com$</strong>
RewriteRule ^(/~.+) http://www.somewhere.com/$1 [R,L]
</pre></example>
</dd>
</dl>
</section>
<section id="redirectanchors">
<title>Redirecting Anchors</title>
<dl>
<dt>Description:</dt>
<dd>
<p>By default, redirecting to an HTML anchor doesn't work,
because mod_rewrite escapes the <code>#</code> character,
turning it into <code>%23</code>. This, in turn, breaks the
redirection.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>Use the <code>[NE]</code> flag on the
<code>RewriteRule</code>. NE stands for No Escape.
</p>
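<p>For instance (the page names here are hypothetical), a redirect
that must land on a named anchor keeps its <code>#</code> intact
when <code>[NE]</code> is given:</p>
<example><pre>
RewriteRule ^/oldsection$ /manual.html#section3 [<strong>NE</strong>,R]
</pre></example>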
</dd>
</dl>
</section>
<section id="time-dependent">
<title>Time-Dependent Rewriting</title>
<dl>
<dt>Description:</dt>
<dd>
<p>When content should vary with the time of day, a lot of
webmasters still use CGI scripts which, for instance,
redirect to specialized pages. How can it be done
via <module>mod_rewrite</module>?</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>There are a lot of variables named <code>TIME_xxx</code>
for rewrite conditions. In conjunction with the special
lexicographic comparison patterns <code>&lt;STRING</code>,
<code>&gt;STRING</code> and <code>=STRING</code> we can
do time-dependent redirects:</p>
<example><pre>
RewriteEngine on
RewriteCond %{TIME_HOUR}%{TIME_MIN} &gt;0700
RewriteCond %{TIME_HOUR}%{TIME_MIN} &lt;1900
RewriteRule ^foo\.html$ foo.day.html
RewriteRule ^foo\.html$ foo.night.html
</pre></example>
<p>This provides the content of <code>foo.day.html</code>
under the URL <code>foo.html</code> from
<code>07:00-19:00</code> and at the remaining time the
contents of <code>foo.night.html</code>. Just a nice
feature for a homepage...</p>
</dd>
</dl>
</section>
<section id="backward-compatibility">
<title>Backward Compatibility for YYYY to XXXX migration</title>
<dl>
<dt>Description:</dt>
<dd>
<p>How can we make URLs backward compatible (still
existing virtually) after migrating
<code>document.YYYY</code> to <code>document.XXXX</code>,
e.g. after translating a
bunch of <code>.html</code> files to <code>.phtml</code>?</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We just rewrite the name to its basename and test for
existence of the new extension. If it exists, we take
that name, else we rewrite the URL to its original state.</p>
<example><pre>
# backward compatibility ruleset for
# rewriting document.html to document.phtml
# when and only when document.phtml exists
# but no longer document.html
RewriteEngine on
RewriteBase /~quux/
# parse out basename, but remember the fact
RewriteRule ^(.*)\.html$ $1 [C,E=WasHTML:yes]
# rewrite to document.phtml if exists
# Note: This is a per-directory example, so %{REQUEST_FILENAME} is the full
# filesystem path as already mapped by the server.
RewriteCond %{REQUEST_FILENAME}.phtml -f
RewriteRule ^(.*)$ $1.phtml [S=1]
# else reverse the previous basename cutout
RewriteCond %{ENV:WasHTML} ^yes$
RewriteRule ^(.*)$ $1.html
</pre></example>
</dd>
</dl>
</section>
<section id="old-to-new">
<title>From Old to New (intern)</title>
<dl>
<dt>Description:</dt>
<dd>
<p>Assume we have recently renamed the page
<code>foo.html</code> to <code>bar.html</code> and now want
to provide the old URL for backward compatibility. Actually
we don't want users of the old URL to even recognize that
the page was renamed.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We rewrite the old URL to the new one internally via the
following rule:</p>
<example><pre>
RewriteEngine on
RewriteBase /~quux/
RewriteRule ^<strong>foo</strong>\.html$ <strong>bar</strong>.html
</pre></example>
</dd>
</dl>
</section>
<section id="old-to-new-extern">
<title>From Old to New (extern)</title>
<dl>
<dt>Description:</dt>
<dd>
<p>Assume again that we have recently renamed the page
<code>foo.html</code> to <code>bar.html</code> and now want
to provide the old URL for backward compatibility. But this
time we want the users of the old URL to get hinted to
the new one, i.e. their browser's Location field should
change, too.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We force a HTTP redirect to the new URL which leads to a
change of the browser's and thus the user's view:</p>
<example><pre>
RewriteEngine on
RewriteBase /~quux/
RewriteRule ^<strong>foo</strong>\.html$ <strong>bar</strong>.html [<strong>R</strong>]
</pre></example>
</dd>
</dl>
</section>
<section id="static-to-dynamic">
<title>From Static to Dynamic</title>
<dl>
<dt>Description:</dt>
<dd>
<p>How can we transform a static page
<code>foo.html</code> into a dynamic variant
<code>foo.cgi</code> in a seamless way, i.e. without notice
by the browser/user?</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We just rewrite the URL to the CGI-script and force the
handler to be <strong>cgi-script</strong> so that it is
executed as a CGI program. This way a request to
<code>/~quux/foo.html</code> internally leads to the
invocation of <code>/~quux/foo.cgi</code>.</p>
<example><pre>
RewriteEngine on
RewriteBase /~quux/
RewriteRule ^foo\.<strong>html</strong>$ foo.<strong>cgi</strong> [H=<strong>cgi-script</strong>]
</pre></example>
</dd>
</dl>
</section>
<section id="blocking-of-robots">
<title>Blocking of Robots</title>
<dl>
<dt>Description:</dt>
<dd>
<p>How can we block a really annoying robot from
retrieving pages of a specific webarea? A
<code>robots.txt</code> file containing entries of the
"Robot Exclusion Protocol" is typically not enough to get
rid of such a robot.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We use a ruleset which forbids the URLs of the webarea
<code>/~quux/foo/arc/</code> (perhaps a very deep
directory-indexed area where the robot traversal would
create big server load). We have to make sure that we
forbid access only to the particular robot, i.e. just
forbidding the host where the robot runs is not enough.
This would block users from this host, too. We accomplish
this by also matching the User-Agent HTTP header
information.</p>
<example><pre>
RewriteCond %{HTTP_USER_AGENT} ^<strong>NameOfBadRobot</strong>.*
RewriteCond %{REMOTE_ADDR} ^<strong>123\.45\.67\.[8-9]</strong>$
RewriteRule ^<strong>/~quux/foo/arc/</strong>.+ - [<strong>F</strong>]
</pre></example>
</dd>
</dl>
</section>
<section id="blocked-inline-images">
<title>Forbidding Image "Hotlinking"</title>
<dl>
<dt>Description:</dt>
<dd>
<p>The following technique forbids the practice of other sites
including your images inline in their pages. This practice is
often referred to as "hotlinking", and results in
your bandwidth being used to serve content for someone else's
site.</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>This technique relies on the value of the
<code>HTTP_REFERER</code> variable, which is optional. As
such, it's possible for some people to circumvent this
limitation. However, most users will experience the failed
request, which should, over time, result in the image being
removed from that other site.</p>
<p>There are several ways that you can handle this
situation.</p>
<p>In this first example, we simply deny the request, if it didn't
initiate from a page on our site. For the purpose of this example,
we assume that our site is <code>www.example.com</code>.</p>
<example><pre>
RewriteCond %{HTTP_REFERER} <strong>!^$</strong>
RewriteCond %{HTTP_REFERER} !www.example.com [NC]
RewriteRule <strong>\.(gif|jpg|png)$</strong> - [F,NC]
</pre></example>
<p>In this second example, instead of failing the request, we display
an alternate image instead.</p>
<example><pre>
RewriteCond %{HTTP_REFERER} <strong>!^$</strong>
RewriteCond %{HTTP_REFERER} !www.example.com [NC]
RewriteRule <strong>\.(gif|jpg|png)$</strong> /images/go-away.png [R,NC]
</pre></example>
<p>In the third example, we redirect the request to an image on some
third-party site.</p>
<example><pre>
RewriteCond %{HTTP_REFERER} <strong>!^$</strong>
RewriteCond %{HTTP_REFERER} !www.example.com [NC]
RewriteRule <strong>\.(gif|jpg|png)$</strong> http://other.example.com/image.gif [R,NC]
</pre></example>
<p>Of these techniques, the last two tend to be the most effective
in getting people to stop hotlinking your images, because they will
simply not see the image that they expected to see.</p>
</dd>
</dl>
</section>
<section id="proxy-deny">
<title>Proxy Deny</title>
<dl>
<dt>Description:</dt>
<dd>
<p>How can we forbid a certain host or even a user of a
special host from using the Apache proxy?</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>We first have to make sure <module>mod_rewrite</module>
is below(!) <module>mod_proxy</module> in the Configuration
file when compiling the Apache webserver. This way it gets
called <em>before</em> <module>mod_proxy</module>. Then we
configure the following for a host-dependent deny...</p>
<example><pre>
RewriteCond %{REMOTE_HOST} <strong>^badhost\.mydomain\.com$</strong>
RewriteRule !^http://[^/.]+\.mydomain\.com.* - [F]
</pre></example>
<p>...and this one for a user@host-dependent deny:</p>
<example><pre>
RewriteCond %{REMOTE_IDENT}@%{REMOTE_HOST} <strong>^badguy@badhost\.mydomain\.com$</strong>
RewriteRule !^http://[^/.]+\.mydomain\.com.* - [F]
</pre></example>
</dd>
</dl>
</section>
<section id="external-rewriting">
<title>External Rewriting Engine</title>
<dl>
<dt>Description:</dt>
<dd>
<p>A FAQ: How can we solve the FOO/BAR/QUUX/etc.
problem? There seems no solution by the use of
<module>mod_rewrite</module>...</p>
</dd>
<dt>Solution:</dt>
<dd>
<p>Use an external <directive module="mod_rewrite"
>RewriteMap</directive> program, i.e. a program which acts
like a <directive module="mod_rewrite"
>RewriteMap</directive>. It is run once on startup of Apache,
receives the requested URLs on <code>STDIN</code> and has
to put the resulting (usually rewritten) URL on
<code>STDOUT</code> (same order!).</p>
<example><pre>
RewriteEngine on
RewriteMap quux-map <strong>prg:</strong>/path/to/map.quux.pl
RewriteRule ^/~quux/(.*)$ /~quux/<strong>${quux-map:$1}</strong>
</pre></example>
<example><pre>
#!/path/to/perl

# disable buffered I/O which would lead
# to deadloops for the Apache server
$| = 1;

# read URLs one per line from stdin and
# generate substitution URL on stdout
while (&lt;&gt;) {
    s|^foo/|bar/|;
    print $_;
}
</pre></example>
<p>This is a demonstration-only example and just rewrites
all URLs <code>/~quux/foo/...</code> to
<code>/~quux/bar/...</code>. Actually you can program
whatever you like. But notice that while such maps can be
<strong>used</strong> also by an average user, only the
system administrator can <strong>define</strong> them.</p>
</dd>
</dl>
</section>
</manualpage>