Deepcrawl Company: Trivia Quiz!

35 Questions


Do you know what the Deepcrawl company is? Deepcrawl Inc. operates a software development platform that analyzes a website's technical landscape to understand and control issues that affect SEO performance. For this quiz, you should be familiar with indexation, what HTTPS is, what to do if your site is not showing up on Google, and what is considered slow page speed. Take this quiz and see how much you know about the Deepcrawl company.


Questions and Answers
  • 1. 
    What is HTTPS?
    • A. 

      High Transmit Tech Protocol Service

    • B. 

      Hypertext Transfer Protocol Secure (HTTPS) is the secure version of HTTP. It is a secure way to send data between a web server and a web browser.

    • C. 

      HTTPS is a web certificate

  • 2. 
    What is indexation?
    • A. 

      Indexation is a technique to adjust income payments by means of a price index, in order to maintain the purchasing power of the public after inflation, while de-indexation is the unwinding of indexation.

    • B. 

      Indexation is the process we use to archive customer data during crawls to help increase the speed of the crawler.

    • C. 

      In order for your site's contents to be included in the results of your custom search engine, they need to be included in the Google index. When Google visits your site, it detects new and updated pages and updates the Google index.

  • 3. 
    How do you check if your site is indexed?
    • A. 

      Type the following into Google’s search bar: “site:yoursitename.com” and instantly view the count of indexed pages for your site.

    • B. 

      Run a crawl.

    • C. 

      Call Google.

    • D. 

      Go to your website's sitemap and view the stats.

  • 4. 
    Which meta tag blocks pages from being indexed?
    • A. 

      NOINDEX

    • B. 

      NOFOLLOW

    • C. 

      NOARCHIVE

  • 5. 
    What should you do if your site is not showing up on Google?
    • A. 

      Use Search Console to save the website URL

    • B. 

      Create a Sitemap 

    • C. 

      Submit a Sitemap of your website to Google via Search Console

    • D. 

      Advertise the website
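
    For reference, the sitemap mentioned in question 5 is an XML file that lists the URLs you want Google to crawl; you submit its location through Search Console. A minimal sketch, using the placeholder domain example.com and made-up pages:

      <?xml version="1.0" encoding="UTF-8"?>
      <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
        <!-- One <url> entry per page you want discovered and indexed -->
        <url>
          <loc>https://www.example.com/</loc>
          <lastmod>2024-01-01</lastmod>
        </url>
        <url>
          <loc>https://www.example.com/products/blue-widget</loc>
        </url>
      </urlset>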

  • 6. 
    What is a Robots.txt file?
    • A. 

      Robots.txt is a text file webmasters create to instruct robots (typically search engine robots) how to crawl & index pages on their website.

    • B. 

      Robots.txt is a file that allows a search engine robot access to the site.

    • C. 

      Robots.txt is a file that instructs the robot to crawl HTTPS pages only.

  • 7. 
    An improperly configured robots.txt file...
    • A. 

      Increases traffic.

    • B. 

      Will break your site.

    • C. 

      Destroys your organic site traffic.

  • 8. 
    Which term tells all robots to stay out of a website?
    • A. 

      User-agent: *
      Block: /

    • B. 

      User-agent: *
      Disallow: /

    • C. 

      User-agent: *
      Guard: /

    • D. 

      User-agent: *
      Stop: /
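
    For context on question 8, the correct pair of directives sits in a plain-text robots.txt file served from the site root. A minimal sketch (example.com is a placeholder):

      # robots.txt at https://www.example.com/robots.txt
      # "User-agent: *" addresses every robot; "Disallow: /" keeps them out of the whole site
      User-agent: *
      Disallow: /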

  • 10. 
    When is it common to use a NOINDEX tag?
    • A. 

      To hide old products.

    • B. 

      When hiding developer pages.

    • C. 

      When a site is in development.

  • 11. 
    If you see a “NOINDEX” or “NOFOLLOW” in your source code for no known reason, what should you do?
    • A. 

      Have your developer change it to read <dsv345y5g3458-FOLLOWALL-asdfv54v34£$vre>.

    • B. 

      Have your developer change it to read <meta name="robots" content=" INDEX, FOLLOW"> or remove the tag altogether.

    • C. 

      Have your developer change it to read <meta name="BOT" content=" INDEX, ALLOW">.

  • 12. 
    What is considered slow page speed?
    • A. 

      3 seconds or less

    • B. 

      5 seconds or less

    • C. 

      1 second or less

  • 13. 
    When detecting specific speed problems with your site, be sure to check...
    • A. 

      Desktop as well as Linux

    • B. 

      Desktop as well as mobile performance.

    • C. 

      Android and iOS

  • 14. 
    Common site speed solutions can include...
    • A. 

      Server optimization and server caching.

    • B. 

      Javascript caching, server response time minifying and removing images.

    • C. 

      Image optimization, browser caching improvement, server response time improvement and JavaScript minifying.
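
    One of the browser-caching improvements from question 14 can be sketched as server configuration. This is only an illustration, assuming an Apache server with mod_expires enabled; the lifetimes shown are arbitrary examples, not recommendations:

      # .htaccess sketch: let browsers cache static assets instead of re-downloading them
      <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpeg "access plus 1 month"
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType text/css "access plus 1 week"
        ExpiresByType application/javascript "access plus 1 week"
      </IfModule>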

  • 15. 
    If you discover multiple indexed versions of your homepage, you should...
    • A. 

      Set up 302 redirects

    • B. 

      Set up 401 redirects

    • C. 

      Set up 301 redirects.

  • 16. 
    What is the difference between a 301 redirect and a canonical attribute?
    • A. 

      A 301 Redirect signals to the search engine that the page has been moved permanently, so it should remove the old page from the index and pass any acquired SEO credit to the new page. A Canonical Attribute signals to the search engine that multiple versions of the page (or its content) exist.

    • B. 

      A 301 Redirect signals to the server that the page has been deleted, so it should remove the page from the index and pass any acquired SEO credit to the new page. A Canonical Attribute signals to the search engine that only one version of the page (or its content) exists.

    • C. 

      A 301 Redirect signals to the search engine that the page has been moved temporarily, so it should remove the page from the index and pass any acquired SEO credit to the new page. A Canonical Attribute signals to the server that up to 10 versions of the page (or its content) exist.
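
    To make question 16 concrete, a 301 redirect is usually set up on the server. A minimal sketch, assuming an Apache server and placeholder URLs (a canonical tag example follows question 18):

      # Permanently move the old URL; search engines transfer its credit to the new one
      Redirect 301 /old-page.html https://www.example.com/new-page.html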

  • 17. 
    Canonical tags are typically important for what types of sites?
    • A. 

      Publishing sites

    • B. 

      E-commerce sites

    • C. 

      Online community sites

  • 18. 
    Why does canonicalization matter?
    • A. 

      Many sites automatically add tags, allow multiple paths (and URLs) to the same content, and add URL parameters for searches, sorts, currency options, etc.

    • B. 

      It’s usually a good idea to put a canonical tag on your homepage template to prevent unforeseen problems.

    • C. 

      If search crawlers have to wade through too much duplicate content, they may miss some of your unique content. Large-scale duplication may dilute your ranking ability. Search engines may pick the wrong URL as the "original."
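
    As an illustration of question 18, parameterized or duplicate URLs can all point search engines at one preferred version with a canonical tag. A minimal sketch with placeholder URLs:

      <!-- On https://www.example.com/shoes?sort=price&currency=usd -->
      <head>
        <link rel="canonical" href="https://www.example.com/shoes">
      </head>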

  • 19. 
    What are the main reasons for duplicate content?
    • A. 

      E-commerce store items appear on multiple versions of the same URL. Printer-only web pages repeat content from the main page. The same content appears in multiple languages on an international site.

    • B. 

      Ecommerce site store items appear on multiple versions of the same URL. Printer-only web pages repeat multiple languages on an international site. The same content appears in multiple versions from the main page.

  • 20. 
    What is the main problem with duplicate content?
    • A. 

      The problem with duplicate content is that it may slow down the loading speed of your overall site.

    • B. 

      The problem with duplicate content is that it may “confuse” search engine crawlers and prevent the correct content from being served to your target audience.

    • C. 

      The problem with duplicate content is that it may increase the size of your site map which will confuse Google bot and reduce organic traffic.

  • 21. 
    What is the benefit of an Alt Tag?
    • A. 

      It’s a simple way to boost organic SEO traffic.

    • B. 

      The image alt tag attribute helps search engines index a page by telling the bot what the image is all about.

    • C. 

      The alt tag helps you find duplicate images on pages.
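
    For question 21, the alt text is an attribute on the image tag itself. A minimal sketch with a placeholder image:

      <!-- The alt attribute tells bots (and screen readers) what the image shows -->
      <img src="/images/blue-widget.jpg" alt="Blue widget with a chrome handle">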

  • 22. 
    Why are broken links important?
    • A. 

      Broken links create a messy site that a crawler will simply avoid.

    • B. 

      Broken links instantly create a high ranking but will reduce your traffic and visibility.

    • C. 

      Broken links create a poor user experience and reflect lower quality content, a factor that can affect page ranking.

  • 23. 
    What is 'structured data'?
    • A. 

      Structured data allows the crawler to be more efficient with crawling as it helps with easy archiving for each crawl.

    • B. 

      Structured data is a way for crawl data to be presented to the user.

    • C. 

      Structured data is a simple way to help Google search crawlers understand the content and data on a page.
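
    Structured data of the kind described in question 23 is commonly added as JSON-LD in the page's <head>. A minimal sketch using schema.org's Product type and made-up values:

      <script type="application/ld+json">
      {
        "@context": "https://schema.org",
        "@type": "Product",
        "name": "Blue Widget",
        "description": "A placeholder product used only for illustration.",
        "brand": { "@type": "Brand", "name": "ExampleCo" }
      }
      </script>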

  • 24. 
    What is mobile-first indexing?
    • A. 

      Mobile-first indexing means Google predominantly uses the mobile version of the content for indexing and ranking.

    • B. 

      Mobile-first indexing means Google crawls the mobile version first and then your desktop version.

    • C. 

      Mobile-first indexing means Google predominantly uses the mobile version of the content for comparing against your Desktop site.

  • 25. 
    What is an m-Dot?
    • A. 

      An m-Dot (mdot, m.) is a way of converting a responsive site to a mobile friendly site.

    • B. 

      An m-Dot (mdot, m.) site is a website that’s specifically designed for mobile devices and exists on a separate subdomain.

    • C. 

      An m-Dot (mdot, m.) site is a website that is specifically designed for mobile devices but is linked directly to the desktop site, which makes life easier for Google when crawling.
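
    When a separate m-dot site like the one in question 25 exists, the desktop and mobile URLs are typically cross-annotated so search engines can pair them. A hedged sketch with placeholder URLs, following the commonly documented rel="alternate" / rel="canonical" pairing:

      <!-- On the desktop page https://www.example.com/page -->
      <link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

      <!-- On the mobile page https://m.example.com/page -->
      <link rel="canonical" href="https://www.example.com/page">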