Welcome to our blog

Our Latest News


Google Makes Tweaks For Geo Targeted Websites

One of the major issues with our global online marketplace is the need to serve geo-targeted content dependent on the country that the visitor is searching from. Historically this has posed great challenges for optimisation and organic search. However, Google has today announced that it wants to overcome these difficulties by enabling more multi-language sites to rank in its search results for different languages and countries.


Geo targeting is widely recognised as an accepted feature of most multinational websites, and as a feature that naturally improves the user's experience by letting them read the website content in their native and most comfortable language. The problem is that implementing this vital facility for international visitors can actually create a barrier that causes search engines real problems when they are crawling for indexing purposes. The issue is that Google crawls as an American user, from the United States with a California IP address. It follows that this will trigger the English version of the website alone, thereby not allowing the other languages to be crawled and indexed. This means that potential visitors in other countries, searching in their own languages, will find it extremely difficult, if not impossible, to find that website through their native version of Google.

If you have been using geo targeting on your website and have been concerned about the low levels of organic search results from outside English-speaking countries, then this may well be the issue that you needed Google to address. Thankfully, now they are.

According to Google’s latest announcement on geo-targeted websites, the new configurations include:

Geo-distributed crawling – Googlebot will start to use IP addresses that appear to be coming from countries other than the United States. Google will also still use its original IP addresses that originate from the United States.

Language-dependent crawl – this is where Googlebot will start to crawl a website with an Accept-Language HTTP header in the request.
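As a purely illustrative sketch (the URL and language value are hypothetical), a language-dependent crawl simply means the crawler's request carries a language header, along these lines:

```
GET /products/widget HTTP/1.1
Host: www.example.com
User-Agent: Googlebot/2.1 (+http://www.google.com/bot.html)
Accept-Language: de
```

A geo-targeted server reading that header could then respond with its German-language version of the page, allowing that version to be crawled and indexed.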

There is excellent news for website owners, in as much as these new geo-targeted search crawls will be fully automated by Google, and the configurations will be deployed as and when required.

However, Google’s recent changes do not mean that you can take your foot off the gas with regard to your usual optimisation procedures. You must make sure that you have optimised your website with the required location and language information. Although we are very pleased to hear of this change to the Google configuration, it must be borne in mind that the website owner cannot rely wholly on the good efforts and works of the search engine. It also needs to be noted that it is only Google that has made this type of release; the other search engines remain silent as to how they are dealing with geo-targeted websites. Furthermore, this will be the first in a long line of configuration changes for Google in this area, and this release must be seen as fledgling steps towards those improvements.

From our understanding, there are four distinct areas of a geo-targeted website that need consideration immediately.

Separate URLs – it is best practice to place content which is specific to a country or language into a specific directory, and on specific pages, within the website. For example, the same product that is available both in England and the United States should have the core language-based pages within the same directory, but any specific offers, discounts, shipping information, pricing, descriptions, etc. should be in separate directories, each specified for the country concerned.
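As an illustrative sketch (the domain and paths here are hypothetical), that kind of structure might look like:

```
example.com/widget/            # core, shared product content
example.com/en-gb/widget/      # UK-specific offers, pricing and shipping
example.com/en-us/widget/      # US-specific offers, pricing and shipping
```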

ccTLDs – hosting content on country-code top-level domains for a specified country and language sends a clear message to a search engine as it crawls. For example, when Google comes across a website with a .de domain it will fairly assume that the website is for German-speaking people, most likely living in Germany. There is a downside to this, though: splitting content over multiple sites can weaken the original and main site by reducing the amount of relevant content that is published through it.

Hreflang – this is Google’s preferred method of identifying the country and language for the website. The hreflang tags are embedded within the head of each page, are visible only to crawlers, and are also only supported by Google. Bing and other search engines use the meta language tag at present.
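As an illustration of what those tags look like (the URLs are hypothetical), each page lists the alternate country/language versions of itself within its head:

```html
<head>
  <!-- Illustrative hreflang annotations; every listed page should
       carry the same set of tags, including a reference to itself. -->
  <link rel="alternate" hreflang="en-gb" href="http://example.com/en-gb/widget/" />
  <link rel="alternate" hreflang="en-us" href="http://example.com/en-us/widget/" />
  <link rel="alternate" hreflang="de-de" href="http://example.com/de-de/widget/" />
</head>
```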

Targeting In Webmaster Tools - both Google and Bing have a geographic targeting facility within their respective webmaster tools accounts. Although not a definitive solution it does give both search engines a further level of corroboration when identifying a country and language.

I hope this article has been of some use in light of this news release from Google. As I mentioned within the article, Google is taking some fledgling steps here, albeit much-needed and much overdue ones.

If you would like to add to this discussion please feel free to use the commenting facility below.

Google Earth Pro Is Now Available For Free

It has been reported today that Google has reduced its Google Earth Pro product from $399 to free.


Some of you may be wondering why this is such a big deal, as there has always been a free download version of Google Earth, which was launched on 25 May 2007 following Google's purchase of the CIA-funded company Keyhole Inc. There are some major differences between the original free version and the previously paid-for version, some of which are listed below:

  • Google Earth allows the user to print screen-resolution pictures, whereas Google Earth Pro gives the user premium high-definition pictures.
  • The free version of Google Earth only allows manual geolocation searches, whereas the Pro version automates this process.
  • The standard version of Google Earth only allows the importing of images with a certain texture depth, whereas the Pro version offers super-dynamic textured imagery that goes way beyond that allowed by the standard version.

Interestingly, Google Earth Pro uses the same imagery as the standard version, but what you can do with it makes the real difference between the two versions. For example, the Pro version offers facilities to businesses such as video creation and animation, as well as the ability to map and measure polygons. The Pro version also allows you to map more than one area at once, giving detailed access to graphical, demographic and traffic-related data.

“Over the last 10 years, businesses, scientists and hobbyists from all over the world have been using Google Earth Pro for everything from planning hikes to placing solar panels on rooftops. Google Earth Pro has all the easy-to-use features and detailed imagery of Google Earth, along with advanced tools that help you measure 3D buildings, print high-resolution images for presentations or reports, and record HD movies of your virtual flights around the world.”
Stafford Marquardt, Google Earth Pro Product Manager

Obtaining Google Earth Pro is not as simple as just making a download. You need to apply for a Google Earth Pro key from Google; the application page can be reached by clicking here. The email address you supply will then become your Google Earth Pro username, and Google will email you your Google Earth Pro key.

Making WordPress Work Faster

WordPress is the most popular blogging and content management system on the web today. It powers around 90 million websites and offers a very easy platform for publishing content in a format that is very search-engine friendly. The problem is that WordPress can end up being slow to load into a visitor's browser. With Google identifying page load speed as an increasingly important metric, having a slow website can cause you problems, not only with visitor experience but also as a black mark against your WordPress website in the eyes of Google.

One of the main benefits of WordPress is the ability for anyone to purchase, or utilise for free, a theme that allows the look and feel of a blog or website to be changed and adapted to suit any taste. You will find that many of the most popular and most visited websites are built with a WordPress spine and a theme overlay. The final result is that anyone with a modicum of coding knowledge can create a very attractive website that is easy to update, and can publish blog posts and articles that generate traffic and bring visitors back to the site time and time again, with the end result of turning them into clients.

One of the major issues with WordPress websites is that they are very server-resource hungry. The themes that are used for the website design make a number of external HTTP calls, load jQuery, and can call in stylesheets from more than one source. One of the major benefits of using WordPress is the ability to add plug-ins, which are small pieces of code that enhance the website experience. Again, although plug-ins allow for wider flexibility on a website, they can and often do increase the demands placed upon the server. The end result is that a WordPress website can function slower than a hand-coded development that is not reliant upon the database functionality of WordPress.

The Internet is full of tips and hints on how to speed up a WordPress website, but many of these involve additional plug-ins and the use of third-party content distribution networks (“CDNs”), such as CloudFlare and MaxCDN. The additional plug-ins are generally based around caching, which means that the plug-in generates a static-type page to replace the database-driven page that is normally delivered by WordPress. Although this can make a difference, the plug-in itself and the work required to generate the static page can again cause extra work for the server.

Turning to servers, many people and small businesses host their websites on shared servers, which means that your website may well be on the same server as 1,000 other websites. All these websites, like yours, are vying for the resources that are shared across every site. There is no problem in using shared hosting, as it means the cost of hosting a website is dramatically lower than paying for a dedicated server.

The question then is how do you speed up your WordPress website without having to use any additional plug-ins? In fact in a perfect world you would want to limit your use of plug-ins to the absolute minimum.

JETPACK

WordPress provides its own plug-in called Jetpack. Jetpack is a highly sophisticated plug-in that provides a wealth of services, from stats to contact forms to galleries, and many more besides. The issue with Jetpack is that it needs to be linked to WordPress.com in order for it to function properly. Not only is it a very resource-hungry plug-in in its own right, but by making the extra calls to WordPress.com you can find that your WordPress website will not fully load until all the data required for Jetpack has been retrieved from WordPress.com. In real terms this can add a good few seconds onto your website or webpage load time.

Most people use Jetpack for its ease of reporting stats when simply logged into your WordPress website. In analysing the traffic on your WordPress website, you may be better placed using Google Analytics rather than relying upon Jetpack. The data from Google Analytics is much more in-depth and will give you a much more comprehensive reporting facility. Adding Google Analytics to your WordPress website is very simple and does not require any plug-in, provided you have a basic grounding in coding.

I am going to take it as read that you know how to open a Google analytics account and that you know where to get the tracking code from within the account. I will be writing a further article on this and will link this paragraph into that article.

Quite simply, copy and paste the Google Analytics tracking code into the header.php file within your WordPress website. This file can be found via the WordPress dashboard by going to APPEARANCE >> EDITOR. At this point you will see a list of editable files on the right-hand side of the page, wherein header.php is included. In adding the Google Analytics tracking code, it is my experience that it is best placed straight after the opening <head> tag. Please note that when editing any of your WordPress website files it is always best practice to make a copy of either the file or the original coding content, so that if you make a mistake, or it simply does not work for you, then you can revert back to the original easily.
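As a rough sketch of that placement (the property ID UA-XXXXXXXX-X is a placeholder; the actual snippet should be copied from your own Analytics account), the top of header.php would look something like:

```html
<head>
  <meta charset="<?php bloginfo( 'charset' ); ?>" />
  <!-- Google Analytics tracking code, placed straight after the opening
       <head> tag. Replace UA-XXXXXXXX-X with your own property ID. -->
  <script>
    (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
    (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
    m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
    })(window,document,'script','//www.google-analytics.com/analytics.js','ga');
    ga('create', 'UA-XXXXXXXX-X', 'auto');
    ga('send', 'pageview');
  </script>
  <?php wp_head(); ?>
</head>
```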

Once you have managed this and the Google Analytics tracking code is in place, deactivate Jetpack and delete it. This by itself will reduce the server workload and will increase your WordPress website's page load speed.

.HTACCESS & COMPRESSION

The vast majority of WordPress websites are hosted on Apache-based servers. The key file that controls the efficiency of a WordPress website is the .htaccess file, which is located in the root domain. In order to edit this file you will need FTP access to your server and a moderate level of coding experience. Owing to the importance of the contents of this file, it is vital that you make a copy immediately before you make any edits at all. Getting it wrong in the .htaccess file can mean that your website does not work at all.

A key method of improving your WordPress website's page load speed is applying compression to the website, so that a much smaller version of it is served to your visitor's browser. There is more than one method of applying compression: either on-page, “on the fly” compression through PHP, or making that compression site-wide from the outset via the .htaccess file.
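For completeness, the on-page PHP route can be sketched with PHP's built-in output buffering; this would sit at the very top of a page template, before any output is sent:

```php
<?php
// Compress the page output on the fly with gzip. ob_gzhandler checks the
// browser's Accept-Encoding header and falls back to sending the page
// uncompressed if the visitor's browser does not support compression.
ob_start('ob_gzhandler');
?>
```

The .htaccess method below achieves the same result site-wide without touching individual templates.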

In utilising compression through the .htaccess file, you would add the following lines of code at the start of the file.

<IfModule mod_deflate.c>
# Insert filters
AddOutputFilterByType DEFLATE text/plain
AddOutputFilterByType DEFLATE text/html
AddOutputFilterByType DEFLATE text/xml
AddOutputFilterByType DEFLATE text/css
AddOutputFilterByType DEFLATE application/xml
AddOutputFilterByType DEFLATE application/xhtml+xml
AddOutputFilterByType DEFLATE application/rss+xml
AddOutputFilterByType DEFLATE application/javascript
AddOutputFilterByType DEFLATE application/x-javascript
AddOutputFilterByType DEFLATE application/x-httpd-php
AddOutputFilterByType DEFLATE application/x-httpd-fastphp
AddOutputFilterByType DEFLATE image/svg+xml
</IfModule>

Once these have been added and your .htaccess has been saved, you can test the compression levels by visiting http://www.whatsmyip.org/http-compression-test/ and entering your domain name. A short test will be run and the result will show by how much your site is now compressed.

IMAGES

It goes without saying that you will want images and pictures on your website. You will already have noticed that photographs are usually very large with respect to their file sizes. By not optimising your images for the web, you will find that this dramatically increases your page load time as your server tries to deliver each file to your visitor's browser. To reduce the size you will need to use a photo editor such as Photoshop or Photoshop Elements. When saving your image or photograph you will want to click “Save for Web & Devices”, and this will dramatically reduce the file size of any image.

Yahoo Aims To Become The Default Search Engine On Apple Devices

Yahoo is apparently attempting to persuade Apple to make it the default search engine on Apple devices like the iPad and iPhone. Google is the current default choice, although users can change to Yahoo or Bing by changing the settings of their device. Yahoo has a good relationship with Apple, as they already provide stock and weather information for Apple apps, but Marissa Mayer is said to have prepared a presentation and has already sounded out a number of executives from the tech giant.

Safari currently offers Google as its default search engine, and while it is possible to change the site that is used to conduct searches, the majority of users will simply stick with the default. Users can change by selecting Settings, clicking on Safari, and then locating the section labelled Search Engine under the General tab. Clicking the name of the current Search Engine, typically Google, will bring up a list of options including Yahoo. Make the selection and a check mark will appear next to the new choice.

Switching from Google to Yahoo is a simple process, but the vast majority of people will stick to the default choice, either because it is the one they prefer, because web browsers may have been optimised for an improved experience with that particular search engine, or simply because they don’t want to mess around in the settings section of their phone.

Yahoo has undergone considerable changes since Marissa Mayer took over as CEO, and they have been particularly active acquiring and taking over a number of social and mobile companies and apps. The mobile sector is clearly important to what was once the world’s leading search engine, and while iOS may not be the most popular mobile operating system any longer, it is a good place to start, and it seems far more likely that Mayer and her crew will be able to persuade Apple to make the move than they would Google.

Capturing the Apple corner of the mobile browser market would be a major coup for Yahoo, and it would represent a sign of intent by the company. Rather than continuing to buy new companies, they would be placing a marker down in the sand and presenting something of a challenge for the likes of Google.

What Is The Google Penguin 3.0 Refresh?

For those that frequent webmaster forums and discussion boards, it will come as little surprise to learn that Google has released the latest update in the Penguin algorithm history. However, while it is an update in the literal sense, it is not an update in the algorithmic sense, and Googlers have been keen to point out that it is merely a refresh. Like hitting the F5 button, Google has simply run the most recent Penguin algorithm again, without making any changes, in order to level the results.

The Penguin algorithm takes a direct sweep at unnatural and black hat linking techniques. In particular, it has penalised websites for the use of spammy links, and during the first and second rounds of Penguin updates, this saw many pages drop a long way down rankings. Webmasters were given the opportunity to make amends, following Penguin 2.0 and 2.1, by manually having links removed, or using the Google disavow tool.

This refresh, which has come just over a year since the last one, will have some negative connotations for a small number of sites. Those with unnatural links last time around will have already seen the impact of the penalty, so it is only those sites that have recently adopted black hat techniques, or new sites that have started on the wrong foot, that will be hit.

However, for those webmasters that have been diligently contacting site owners and asking to have links removed, while using the disavow tool, the news could be much more positive. Because the Penguin algorithm does not run automatically, any link removal and disavowing that has been done will not yet have yielded results. However, Google's clicking of the refresh button means that those changes will now lead to a reversal in fortunes for site owners that experienced a drop in rankings.

The refresh isn’t yet complete, with Google stating that it could take a few weeks. It is too late to effect any positive changes for this round of updates, but it is possible that another refresh will occur at a later date, so webmasters that are affected this time around, or that have not taken action since the last update, should still look to manually remove poor-quality links.