Friday, 28 August 2009

Redirect Using Friendly 301

The .htaccess file can seem daunting, especially when it comes to implementing search-engine-friendly 301 permanent redirects.

Not any longer.

Why implement them in the first place? Because it's better to have your website indexed under one canonical name than under two or even three.

How does this come about? In your root folder the homepage is normally labelled 'index.html', 'index.php' or similar. So when you click on the homepage link of your website, instead of reading www.mywebsite.com you get www.mywebsite.com/index.html. Google sees this too and actually indexes the two addresses separately, even though they are exactly the same page. This isn't so bad until your back-links start to get affected: there's nothing worse than the /index.html page hogging more of the links than the plain .com page.

But that's not all; it never is with the web. What about sites with two TLDs (top-level domains), for example .co.uk and .com? You want the search engines to index both of these as one site, don't you? Likewise, when someone links back to your site without the 'www', I presume you want those links to count towards your one chosen domain as well.

Now I'm no programmer, but I have managed to implement said 301 redirects using some really simple snippets of code that you pop into your .htaccess file. I say simple; all three use regular expressions, but you don't really have to learn them, just copy and paste.

But as with all changes to your root folder, do ensure you've backed everything up before you carry out any changes ;-)

301 redirect .co.uk to .com


RewriteEngine On
# if the host is the .co.uk domain (with or without www; [NC] makes the match case-insensitive)...
RewriteCond %{HTTP_HOST} ^(www\.)?mywebsite\.co\.uk$ [NC]
# ...permanently redirect the same path to the .com
RewriteRule (.*) http://mywebsite.com/$1 [R=301,L]

You only have to type 'RewriteEngine On' once, at the top of the file; the two snippets below assume it's already there.

301 redirect non-www to www


# if the host doesn't start with 'www.'...
RewriteCond %{HTTP_HOST} !^www\.
# ...permanently redirect to the same host with 'www.' added
RewriteRule (.*) http://www.%{HTTP_HOST}/$1 [R=301,L]

301 redirect index.html to just the domain: www.mywebsite.com


# match requests whose original URL ends in index.html (or index.htm)...
RewriteCond %{THE_REQUEST} ^.*\/index\.html?
# ...and permanently redirect to the same path minus the file name
RewriteRule ^(.*)index\.html?$ http://www.mywebsite.com/$1 [R=301,L]

Remember to replace 'mywebsite' with the name of your own website.
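
Putting the three together, a complete block might look something like this. It's only a sketch, assuming your canonical address is www.mywebsite.com, so do test it on a copy of your site first:

RewriteEngine On

# 1. send the .co.uk domain (with or without www) to the .com
RewriteCond %{HTTP_HOST} ^(www\.)?mywebsite\.co\.uk$ [NC]
RewriteRule (.*) http://www.mywebsite.com/$1 [R=301,L]

# 2. add the missing 'www' to any remaining bare host
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule (.*) http://www.%{HTTP_HOST}/$1 [R=301,L]

# 3. strip index.html (or index.htm) from the end of the URL
RewriteCond %{THE_REQUEST} ^.*\/index\.html?
RewriteRule ^(.*)index\.html?$ http://www.mywebsite.com/$1 [R=301,L]

Note that rule 1 goes straight to the www address, so .co.uk visitors aren't bounced through two redirects on their way to the canonical page.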

The end product: after implementing all three 301 permanent redirects, you have a website rendering under one roof, with nothing to confuse the search engine spiders or your potential customers.

Tuesday, 25 August 2009

Bing 'o' with a Difference



The new face of Bing 'o' (couldn't resist the pun) in the South West.

Friday, 21 August 2009

Bing Frustrating

To ensure that one of my UK-based .com sites gets indexed in Bing as a UK site, I added the following meta data:

<meta http-equiv="content-language" content="en-gb" />
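
For reference, it belongs in the head of each page; a minimal sketch, with a placeholder title:

<head>
  <title>My UK Site</title>
  <meta http-equiv="content-language" content="en-gb" />
</head>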

Excellent, I thought. Not so.

On checking keyword rankings in the Bing SERPs, I find that some pages of the site rank highly under 'show all' and not at all under 'Only from the United Kingdom', while for other pages it's vice versa.

This problem has been ongoing with MSN for some time now, and really it's about time they sorted it out, just as Google has in its Webmaster Tools.

Thursday, 20 August 2009

PageRank of 6 & Never Indexed

How can a site (http://tinyurl.com/nywamr) have a PageRank of 6, yet never have been indexed by Google?

I've carried out some checks, namely whether the page contains a cloaking script along these lines:

<?php
/* cloaking script to detect whether the visitor is Googlebot */
if (strpos($_SERVER['HTTP_USER_AGENT'], 'Googlebot') !== false) {
    // a special version of the page served only to Googlebot
    echo '<!-- Googlebot-only content here -->';
    exit();
}
else {
    // what everyone else sees
    echo '<h1>My PR10 page!</h1>';
}
?>

No, it doesn't. So how can a site generate such a wonderful PageRank yet never have had the pleasure of Google's bots? Answers, please.

Wednesday, 19 August 2009

Půjčovna Šlapadel



Ah, swimming in the rain with not a thought of anything except a půjčovna šlapadel (a pedal-boat hire)!!

Friday, 7 August 2009

Excessive JavaScript and Computer Meltdown

Home users are the people who really suffer when your website has an excessive number of JavaScript include files written into the page itself, i.e. loaded client side on every visit.

Is there a solution to halting this meltdown of someone's computer as it staggeringly attempts to take in all those lovely new media APIs?

Yes, of course there is, but alas there aren't many designers either willing or able to implement the changes.

So what do you need to do? Simply convert as many of those embedded JavaScript blocks as you can into externally called JavaScript files, as illustrated below; this will solve a lot of the meltdown, but it definitely won't solve all of it.
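
Here's the idea in miniature; the file and function names are made up purely for the example:

<!-- before: the script sits in the page itself, so the browser must
     download and parse it again with every single page view -->
<script type="text/javascript">
function showMenu() { /* ...lots of code... */ }
</script>

<!-- after: the same code lives in one external file, which the
     browser can fetch once and then serve from its cache -->
<script type="text/javascript" src="/javascript/menu.js"></script>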

Next step: you need to condense these externally called JavaScript files into as few fetches as possible. This is the tricky bit, but below I will offer some code you can experiment with.

To explain this in simple terms: imagine you're out for a walk with your dog and you've a great big bag of sticks. If you make the dog collect every stick individually after you've thrown them, it could take some while and the dog's going to get mighty tired. But if you were to put the sticks into, say, two or three bags and throw those instead, each bag may take a little longer to fetch, but I can assure you, nowhere near as long as gathering all the sticks thrown one by one.

So wrap up your 21 files of JavaScript into a few neat packages and improve your potential customers' experience of your website; who knows, you may even get a few more sales or sign-ups.

Okay, here's the test code, but do remember it is test code, so use it with care and tell me how you get on.

And credit where credit is due: this isn't my code, it's from a clever chap called Niels Leenheer. He calls it the 'CSS and Javascript Combinator 0.5' and it was copyrighted in 2006.

CODE:

<?php
$cache = true;
$cachedir = dirname(__FILE__) . '/cache';
$cssdir = dirname(__FILE__) . '/css';
$jsdir = dirname(__FILE__) . '/javascript';

// Determine the directory and type we should use
switch ($_GET['type']) {
case 'css':
$base = realpath($cssdir);
break;
case 'javascript':
$base = realpath($jsdir);
break;
default:
header ("HTTP/1.0 503 Not Implemented");
exit;
};

$type = $_GET['type'];
$elements = explode(',', $_GET['files']);

// Determine last modification date of the files
$lastmodified = 0;
foreach ($elements as $element) {
$path = realpath($base . '/' . $element);

if (($type == 'javascript' && substr($path, -3) != '.js') ||
($type == 'css' && substr($path, -4) != '.css')) {
header ("HTTP/1.0 403 Forbidden");
exit;
}

if (substr($path, 0, strlen($base)) != $base || !file_exists($path)) {
header ("HTTP/1.0 404 Not Found");
exit;
}

$lastmodified = max($lastmodified, filemtime($path));
}

// Send Etag hash
$hash = $lastmodified . '-' . md5($_GET['files']);
header ("Etag: \"" . $hash . "\"");

if (isset($_SERVER['HTTP_IF_NONE_MATCH']) &&
stripslashes($_SERVER['HTTP_IF_NONE_MATCH']) == '"' . $hash . '"')
{
// Return visit and no modifications, so do not send anything
header ("HTTP/1.0 304 Not Modified");
header ('Content-Length: 0');
}
else
{
// First time visit or files were modified
if ($cache)
{
// Determine supported compression method
$gzip = strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'gzip');
$deflate = strstr($_SERVER['HTTP_ACCEPT_ENCODING'], 'deflate');

// Determine used compression method
$encoding = $gzip ? 'gzip' : ($deflate ? 'deflate' : 'none');

// Check for buggy versions of Internet Explorer
if (!strstr($_SERVER['HTTP_USER_AGENT'], 'Opera') &&
preg_match('/^Mozilla\/4\.0 \(compatible; MSIE ([0-9]\.[0-9])/i', $_SERVER['HTTP_USER_AGENT'], $matches)) {
$version = floatval($matches[1]);

if ($version < 6)
$encoding = 'none';

if ($version == 6 && !strstr($_SERVER['HTTP_USER_AGENT'], 'EV1'))
$encoding = 'none';
}

// Try the cache first to see if the combined files were already generated
$cachefile = 'cache-' . $hash . '.' . $type . ($encoding != 'none' ? '.' . $encoding : '');

if (file_exists($cachedir . '/' . $cachefile)) {
if ($fp = fopen($cachedir . '/' . $cachefile, 'rb')) {

if ($encoding != 'none') {
header ("Content-Encoding: " . $encoding);
}

header ("Content-Type: text/" . $type);
header ("Content-Length: " . filesize($cachedir . '/' . $cachefile));

fpassthru($fp);
fclose($fp);
exit;
}
}
}

// Get contents of the files
$contents = '';
foreach ($elements as $element) {
$path = realpath($base . '/' . $element);
$contents .= "\n\n" . file_get_contents($path);
}

// Send Content-Type
header ("Content-Type: text/" . $type);

if (isset($encoding) && $encoding != 'none')
{
// Send compressed contents
$contents = gzencode($contents, 9, $gzip ? FORCE_GZIP : FORCE_DEFLATE);
header ("Content-Encoding: " . $encoding);
header ('Content-Length: ' . strlen($contents));
echo $contents;
}
else
{
// Send regular contents
header ('Content-Length: ' . strlen($contents));
echo $contents;
}

// Store cache
if ($cache) {
if ($fp = fopen($cachedir . '/' . $cachefile, 'wb')) {
fwrite($fp, $contents);
fclose($fp);
}
}
}
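
To actually use it, save the script as something like combine.php (the name is my assumption), with a writable cache folder plus the css and javascript folders sitting alongside it, then request it with the type and files parameters the code reads from $_GET. The three file names below are made-up examples:

<script type="text/javascript" src="/combine.php?type=javascript&files=menu.js,slider.js,tracking.js"></script>

Leenheer's original write-up also pairs the script with a couple of .htaccess rewrite rules so the combined files can be requested under tidier URLs; worth a look if you don't fancy the query string.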

There you go then, all done.