Moving website to a new URL

By : Administrator
Published 12th March 2012
Read latest comment - 12th April 2012

Ok, ok, I'm doing the unthinkable, all those years of carefully crafted links and pagerank...

But I managed to get my hands on mylocalservices.com, as opposed to the current live US site, which is mylocalservices.us.

So we have been upgrading it and giving it a facelift, and are about to move over to the new URL.

So we will obviously 301 redirect old to new, and then get the new domain verified and the site address changed in Webmaster Tools.

What could go wrong?.........

It'll be interesting to see how long any associated PageRank takes to move over, and how long it takes our keywords to recover.

Shall update when we've cut over and let you know what I forgot!

Steve Richardson
Gaffer of My Local Services
My Local Services | Me on LinkedIn
Comments
Good luck. I look forward to seeing the facelift.

Thanks,
Dreamraven

I moved a site to another domain back at the beginning of January, because we re-branded the business. The old links are all where they were (or have risen, despite no further SEO being done on that URL), and the new domain is now not far behind for all the original keywords we were chasing; most are about 10 positions back, or less.

Now I'm going to start some SEO on the new URL, but I wanted to see how it did on its own out of curiosity, having never done it before myself. Obviously you have to keep the old domain hosted for quite a long time, until the new domain achieves the same ranking positions, so I reckon I could ditch it by the end of the year. However, what I now have is 2 results in Google for my keywords, so I'm not too sure that I even want to dump the old domain... might as well just leave it there, since it redirects to the new website and I get 2 bites of the cherry.

indizine

Rrmartin
4th April 2012 3:28 PM
You should do a "RedirectMatch" using .htaccess to preserve the rankings and positions.
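For anyone following along, a sketch of what that looks like. The host names match the thread, but the exact rules (and the commented path pattern) are illustrative assumptions, to be checked against the real site structure before going live:

```apache
# Illustrative .htaccess for the old .us domain: send everything,
# path intact, to the same path on the .com with a permanent 301.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?mylocalservices\.us$ [NC]
RewriteRule ^(.*)$ http://www.mylocalservices.com/$1 [R=301,L]

# RedirectMatch (mod_alias) also works when a regex path mapping is enough:
# RedirectMatch 301 ^/(.*)$ http://www.mylocalservices.com/$1
```

The R=301 flag is what tells search engines the move is permanent, so link equity should follow the redirect rather than staying with the old URL.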

Well, we are a couple of weeks live, and apart from the usual bug fixing you get with lots of new code, the actual URL change has gone pretty smoothly.

There were a few days where all the performing keywords danced about, but it seems to have settled down. It's early days, but the mobile site seems to be warming up nicely, and overall traffic is up marginally, which is a result, as I assumed we'd come crashing down.

Alexa rank (I know, I know, but it's a good yardstick) has dropped from 12 million to 287,624 as of today, and is heading back to the old URL's level of just under 100,000.

PageRank has gone to zero, but I'm pretty sure that will come back on the next update, and it's only vanity anyway, as business signups are pretty constant, and so is revenue.

Overall, a lot less scary than I expected, and Google's happily indexing away at just under 200,000 pages a day, so hopefully in a month it should have refreshed its cache.

Time will tell if moving from .us to .com was worth the hassle, but the fact the .com was registered in 2001 versus 2008 will hopefully be the main bonus; plus everyone in the US seems to prefer a .com.

Ahem... Feel free to visit the home page and give us a Facebook Recommend or a G +1, or leave a review, as it's looking a bit feedback-bare lol

Steve Richardson
Gaffer of My Local Services
My Local Services | Me on LinkedIn

Rrmartin
11th April 2012 1:43 PM
I couldn't see an SE sitemap... Do you have one?
It should help Google know what to index, and speed indexing up a little.

I think you're right about the .com switch... Great move to get an extra 7 years on the domain age. I find that people tend to type in .com first, then search if it fails for some reason.


Edit:
+1'd

I couldn't see an SE sitemap... Do you have one?
It should help Google know what to index, and speed indexing up a little.

There was a technical hitch with the SE sitemap. We set it up, and it took 28 hours to run and consumed the server's resources!

We run the sitemap on our UK site daily, and it takes about an hour, but the US database is huge.

Personally I'm not 100% convinced of the need for a Google sitemap, as according to WMT we are averaging 200,000 pages a day in the crawl stats. But as the traffic begins to grow, we are looking at the option of putting in another server that will hold the master database, run any cron jobs and build the sitemap, leaving a slave database to actually run the site.

All ideas at the moment though
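One way to keep a sitemap run from eating the server is to stream the URLs and write the files in 50,000-URL chunks rather than building one huge document in memory. A minimal sketch, assuming the URLs can be iterated row by row out of the database (the generator passed in is a stand-in for that query):

```python
# Sketch: chunked sitemap generation. Only one chunk of URLs is held
# in memory at a time, so a huge listings table never has to be
# materialised all at once.
from xml.sax.saxutils import escape

CHUNK_SIZE = 50000  # the sitemap protocol's per-file URL limit


def render_sitemap(urls):
    """Render one <urlset> document for a chunk of URLs."""
    entries = "".join(f"<url><loc>{escape(u)}</loc></url>" for u in urls)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">'
            f'{entries}</urlset>')


def write_sitemaps(url_stream, chunk_size=CHUNK_SIZE):
    """Yield (filename, xml) pairs, one per chunk of at most
    chunk_size URLs drawn lazily from url_stream."""
    chunk, n = [], 0
    for url in url_stream:
        chunk.append(url)
        if len(chunk) == chunk_size:
            n += 1
            yield f"sitemap_{n}.xml", render_sitemap(chunk)
            chunk = []
    if chunk:  # final partial chunk
        yield f"sitemap_{n + 1}.xml", render_sitemap(chunk)
```

Running this nightly from a cron job on a separate box, as described above, would keep the live database out of the loop entirely.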

Steve Richardson
Gaffer of My Local Services
My Local Services | Me on LinkedIn

Rrmartin
11th April 2012 2:32 PM
There was a technical hitch with the SE sitemap. We set it up, and it took 28 hours to run and consumed the server's resources!

We run the sitemap on our UK site daily, and it takes about an hour, but the US database is huge.

Personally I'm not 100% convinced of the need for a Google sitemap, as according to WMT we are averaging 200,000 pages a day in the crawl stats. But as the traffic begins to grow, we are looking at the option of putting in another server that will hold the master database, run any cron jobs and build the sitemap, leaving a slave database to actually run the site.

All ideas at the moment though

I look at it this way:
Google allocates a set amount of resources to each site it crawls (processor cycles, download limits and whatnot). By giving a large site a set of XML sitemaps, Google will spend less of its resources trying to figure out the content hierarchy and more on checking the content.

Having one big sitemap will do more harm than good on a big site, so I'd have separate ones for listings and pages, and maybe split the categories up if there are a lot of listings.

There is also a bonus to XML sitemaps... they can be submitted as feeds to a lot of services, such as aggregators, which is another nice way of reaching a slightly wider audience.


Edit:
Just seen the "15,785,323 Businesses signed up. Are you listed?" in your footer... That's impressive.
Also probably why creating sitemaps and other tasks are taking so long hehe.

Having one big sitemap will do more harm than good on a big site, so I'd have separate ones for listings and pages

Yup, in the UK we have sitemap_index.xml and then multitudes of sitemaps broken up into 50k segments, but this approach failed in the US, so like I say, we'll probably have to put another server in.
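For reference, a sitemap index of that shape is just a list of the child sitemap files; the domain and file names below are illustrative, not the real ones:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>http://www.example.com/sitemap_listings_1.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap_listings_2.xml</loc></sitemap>
  <sitemap><loc>http://www.example.com/sitemap_pages_1.xml</loc></sitemap>
</sitemapindex>
```

Each child file is capped at 50,000 URLs by the sitemap protocol, and the index itself can reference up to 50,000 child sitemaps, so the 50k-segment approach scales a long way before a single index runs out of room.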

Steve Richardson
Gaffer of My Local Services
My Local Services | Me on LinkedIn

This Thread is now closed for comments