1.
In which of the following situations might you recommend a client delivers a 503 HTTP header response?
Correct Answer
C. During site maintenance down time.
Explanation
A 503 HTTP header response indicates that a server is temporarily unable to handle a request. Of the given options, recommending that a client deliver a 503 response during site maintenance downtime is the appropriate choice: it tells both users and search engine crawlers that the site is temporarily unavailable rather than gone, so crawlers will retry later instead of de-indexing the pages.
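A minimal sketch in Python of such a maintenance response, using the standard WSGI interface (the 3600-second Retry-After value is an arbitrary example):

```python
# Minimal WSGI app that answers every request with 503 while a site is
# down for maintenance. Retry-After hints to crawlers when to come back;
# 3600 seconds is an arbitrary example value.
def maintenance_app(environ, start_response):
    body = b"<h1>Down for maintenance - back soon</h1>"
    start_response(
        "503 Service Unavailable",
        [
            ("Content-Type", "text/html"),
            ("Retry-After", "3600"),
            ("Content-Length", str(len(body))),
        ],
    )
    return [body]
```

An app like this could be mounted behind any WSGI server while the real application is switched off.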
2.
What do you feel is closest to the % click through rate (of all search traffic) received by sites ranking in position 1 of the natural search results?
Correct Answer
B. 25%
Explanation
Of the options given, 25% is closest to the click-through rate typically reported for sites ranking in position 1 of the natural search results. In other words, roughly a quarter of all search traffic clicks on the first organic result, with the share falling away sharply for lower positions.
3.
In Excel, which function or formula is best for combining a categorised list of keywords from one source and an uncategorised list of keyword rankings from another, in order to arrive at a categorised list of keyword rankings?
Correct Answer
B. =vlookup
Explanation
The VLOOKUP function in Excel is the best choice for combining a categorised list of keywords from one source and an uncategorised list of keyword rankings from another. This function allows you to search for a value in one column and return a corresponding value from another column. By using VLOOKUP, you can match the keywords from the categorised list with the uncategorised list and retrieve the corresponding keyword rankings, thus creating a categorised list of keyword rankings.
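The Excel formula itself would look something like `=VLOOKUP(A2,Categories!$A:$B,2,FALSE)`. The same join can be sketched in Python with a dictionary lookup (the keywords, categories, and rankings below are made-up example data):

```python
# Made-up example data: one source maps keywords to categories,
# the other maps keywords to rankings.
categories = {
    "running shoes": "Footwear",
    "trail boots": "Footwear",
    "rain jacket": "Outerwear",
}
rankings = {"running shoes": 3, "rain jacket": 7, "trail boots": 12}

# Equivalent of a VLOOKUP against the category table, with "#N/A"
# standing in for Excel's not-found result.
categorised_rankings = {
    kw: (categories.get(kw, "#N/A"), rank) for kw, rank in rankings.items()
}
```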
4.
You've implemented a 301 redirect from Page A to Page B. Which of the following outcomes is most likely to happen?
Correct Answer
C. Over time, most of Page A's PageRank will be migrated to Page B.
Explanation
Over time, most of Page A's PageRank will be migrated to Page B because a 301 redirect is a permanent redirect that signals to search engines that Page A has permanently moved to Page B. This means that search engines will transfer the majority of the ranking signals, including PageRank, from Page A to Page B. However, it is important to note that not all of Page A's PageRank will be migrated, as there may be some loss of ranking signals during the redirect process.
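A sketch of the mechanism in Python, with hypothetical paths, showing the status code and Location header a server would return for the redirected URL:

```python
# Hypothetical redirect map: Page A has permanently moved to Page B.
REDIRECTS = {"/page-a": "/page-b"}

def handle_request(path):
    """Return (status_code, headers) for a request path (sketch only)."""
    if path in REDIRECTS:
        # 301 = Moved Permanently: crawlers update their index and,
        # over time, transfer most ranking signals to the new URL.
        return 301, {"Location": REDIRECTS[path]}
    return 200, {}
```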
5.
Which of the following is NOT a notable Google algorithm update?
Correct Answer
C. Caffeine
Explanation
Caffeine is the correct answer because it was not an algorithm update but a rewrite of Google's indexing infrastructure, designed to improve the speed and freshness of search results. Vince and Florida, by contrast, are both well-known algorithm updates: Vince, sometimes called the "brand update", gave more visibility to established brands in search results, while Florida was a major 2003 update that targeted low-quality and spammy optimisation tactics, causing significant changes in rankings.
6.
Which of the following is NOT a notable Google infrastructure update?
Correct Answer
B. Jonas
Explanation
Big Daddy and Caffeine are both notable Google infrastructure updates, widely discussed at the time they rolled out. Jonas, however, is not the name of any recognised Google infrastructure update.
7.
When optimising a web page, of the following options which is the least important area to include your target keywords?
Correct Answer
B. Meta Keywords
Explanation
Meta keywords are the least important area to include target keywords when optimizing a web page. This is because search engines no longer consider meta keywords as a ranking factor. In the past, meta keywords were used by search engines to understand the content of a web page. However, due to keyword stuffing and manipulation, search engines now rely on other factors such as page titles, internal links, and body copy to determine the relevance and quality of a web page. Therefore, including target keywords in the meta keywords section is no longer necessary for SEO purposes.
8.
Which of the following is NOT a notable Google acquisition?
Correct Answer
D. Teoma (search engine)
Explanation
Teoma is not a notable Google acquisition because it was actually acquired by Ask Jeeves (now known as Ask.com) in 2001, not by Google. Google has made several notable acquisitions over the years, including Applied Semantics, DoubleClick, and BeatThatQuote, but Teoma is not one of them.
9.
What is the most visible website in SEO in the financial comparison sector?
Correct Answer
B. Money Supermarket
Explanation
Money Supermarket is the most visible website in SEO in the financial comparison sector because it consistently ranks highly in search engine results for relevant keywords. It has a strong online presence and a large number of backlinks from reputable websites, which helps to improve its visibility. Additionally, Money Supermarket invests in SEO strategies such as optimizing its website for search engines, creating high-quality content, and regularly updating its site, which further enhances its visibility in the financial comparison sector.
10.
Who owns the social bookmarking website Delicious?
Correct Answer
B. AVOS
Explanation
AVOS is the correct answer because in 2011, the founders of YouTube, Chad Hurley and Steve Chen, acquired Delicious from Yahoo and formed a new company called AVOS to manage it. AVOS aimed to revitalize and improve the social bookmarking site, but eventually, in 2014, they sold Delicious to Science Inc., a technology investment firm. Therefore, AVOS briefly owned Delicious before transferring ownership to Science Inc.
11.
Which of the following sites famously applies the nofollow tag to all of its outbound external links?
Correct Answer
B. Wikipedia
Explanation
Wikipedia famously applies the nofollow tag to all of its outbound external links. The nofollow value is an HTML link attribute that tells search engines not to follow the link or pass any link equity to the destination site. This means that when Wikipedia links out to another website, that site receives no SEO benefit from the link. Wikipedia adopted this policy to deter link spam and protect the integrity of its content.
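A rough Python sketch of such a policy: adding rel="nofollow" to any anchor that points off-site. The regex approach and the host name are illustrative only; a real implementation would use a proper HTML parser.

```python
import re

SITE_HOST = "en.wikipedia.org"  # illustrative host

def nofollow_external_links(html):
    """Add rel="nofollow" to anchors pointing off-site (naive regex sketch)."""
    def fix(match):
        tag, href = match.group(0), match.group(1)
        # Absolute URLs to other hosts are treated as external;
        # relative links stay untouched.
        is_external = href.startswith("http") and SITE_HOST not in href
        if is_external and "rel=" not in tag:
            return tag[:-1] + ' rel="nofollow">'
        return tag
    return re.sub(r'<a\s+href="([^"]*)"[^>]*>', fix, html)
```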
12.
Which of the following could NOT be used to conduct backlink analysis?
Correct Answer
C. Majestic 12
Explanation
Majestic-12 is a distributed search engine and web-crawling project, not a tool designed for backlink analysis. Majestic SEO (whose link index is built on Majestic-12 crawl data) and Open Site Explorer are both well-known backlink analysis tools, but Majestic-12 itself does not offer that functionality, so it could not be used for conducting backlink analysis.
13.
Which of the following SEO tools could you employ to extract the title tags from a website?
Correct Answer
B. Xenu's Link Sleuth
Explanation
Xenu's Link Sleuth is an SEO tool that can be used to extract the title tags from a website. It crawls through the website and analyzes the HTML code to identify the title tags used on each page. This information can be helpful in understanding how the website is structured and optimizing the title tags for better search engine visibility. Open Site Explorer is a different SEO tool that focuses on backlink analysis, while Word Tracker is a keyword research tool.
14.
One of the following algorithms is NOT owned by Google. Which one?
Correct Answer
A. TrustRank
Explanation
TrustRank is the algorithm not owned by Google. PageRank was developed by Google's founders, and Google acquired the Hilltop algorithm in 2003, whereas TrustRank was developed by researchers at Stanford University working with Yahoo!. TrustRank combats web spam by propagating trust outward from a manually reviewed seed set of reputable websites, and it is not part of Google's ownership or its search algorithm.
15.
Which firm patented the approach to web page segmentation and analysis known as "block level analysis"?
Correct Answer
C. Microsoft
Explanation
Microsoft patented the approach to web page segmentation and analysis known as "block level analysis". The work, which came out of Microsoft Research's vision-based page segmentation (VIPS) research, divides a rendered page into visual blocks so that content and links can be weighted according to the block they appear in, rather than treating the page as a single unit.
16.
What server operating system would you expect a .htaccess file to be used on?
Correct Answer
B. Apache
Explanation
The correct answer is Apache. .htaccess files are commonly used on Apache web servers. They allow for configuration and customization of the server's behavior for specific directories or files.
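A small illustrative .htaccess fragment (file names and paths are made-up examples) showing two common uses on Apache: a permanent 301 redirect, and a 503 maintenance response via mod_rewrite (the R=503 flag requires Apache 2.4):

```apache
# Example .htaccess fragment (illustrative paths and file names)

# Permanent redirect: Page A has moved to Page B
Redirect 301 /page-a.html /page-b.html

# Answer 503 for everything while a maintenance flag file exists
RewriteEngine On
RewriteCond %{DOCUMENT_ROOT}/maintenance.flag -f
RewriteRule ^ - [R=503,L]
```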
17.
Approximately what is the maximum number of characters (including spaces) that Google will display from a page title in the SERPS?
Correct Answer
B. 65
Explanation
Google typically displays up to about 65 characters, including spaces, from a page title in the SERPs (search engine results pages). Going beyond this limit results in truncation, where the title is cut off and only a portion of it is shown to users. Strictly speaking the cut-off is determined by pixel width rather than a fixed character count, but roughly 65 characters is the commonly used rule of thumb, so website owners and SEO professionals should write page titles with this limit in mind.
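A Python sketch of the rule of thumb; as noted, real truncation is by pixel width, so a character count is only an approximation:

```python
def serp_title(title, limit=65):
    """Approximate how a SERP snippet truncates a long page title.
    Character-based sketch; Google's real cut-off is by pixel width."""
    if len(title) <= limit:
        return title
    # Cut on the last word boundary that fits, then append an ellipsis.
    cut = title[: limit - 1].rsplit(" ", 1)[0]
    return cut + "…"
```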
18.
Of the following options which one typically has the biggest impact when it comes to improving page load speeds?
Correct Answer
A. Minimising HTTP requests by combining JavaScript and CSS files.
Explanation
Minimising HTTP requests by combining JavaScript and CSS files typically has the biggest impact when it comes to improving page load speeds. This is because each HTTP request made by the browser to fetch a file adds overhead and increases the time it takes for the page to load. By combining multiple JavaScript and CSS files into a single file, the number of HTTP requests can be reduced, resulting in faster page load times. This technique helps to optimize the loading process and improve the overall performance of the webpage.
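A sketch of the idea in Python: concatenating several asset files into one bundle so the browser makes a single request instead of many (file names are examples):

```python
from pathlib import Path

def bundle_assets(paths, out_path):
    """Concatenate CSS (or JS) files into one file to cut HTTP requests."""
    combined = "\n".join(Path(p).read_text() for p in paths)
    Path(out_path).write_text(combined)
    return combined
```

In practice, build tools perform this step automatically and usually minify the combined file as well.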
19.
Which of the following is the LEAST appropriate tool for dealing with duplication caused by excess URL parameters?
Correct Answer
C. Robots.txt
Explanation
Robots.txt is the least appropriate tool for dealing with duplication caused by excess URL parameters. Robots.txt is a text file that instructs search engine crawlers on how to crawl and index a website. It is used to block certain pages or directories from being crawled. However, it does not specifically address the issue of duplication caused by excess URL parameters. The other two options, Google Webmaster Tools and Canonical Tag, are more suitable for handling this problem. Google Webmaster Tools provides various features to manage URL parameters and identify duplicate content, while Canonical Tags are used to indicate the preferred version of a webpage when multiple versions with different URLs exist.
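For illustration, a Python sketch of generating a canonical tag by stripping excess query parameters from a URL (the example URL is made up):

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_tag(url):
    """Emit a <link rel="canonical"> for the parameter-free version of a URL."""
    parts = urlsplit(url)
    clean = urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
    return '<link rel="canonical" href="%s">' % clean
```

Placing the emitted element in the page head tells search engines to consolidate all parameterised variants onto the clean URL.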
20.
Which of the following is not a known factor in the algorithm used to rank products in Google Product Search?
Correct Answer
B. Price consistency between your product feed and your website.
Explanation
The algorithm used to rank products in Google Product Search takes into account various factors such as the amount of product detail specified in your product feed and the keyword use in the product title and description. However, price consistency between your product feed and your website is not considered a known factor in the algorithm. This means that even if there is a difference in the prices mentioned in your product feed and your website, it will not directly affect the ranking of your products in Google Product Search.