Friday, 24 February 2017

Top 26 SEO Interview Questions You Should Be Prepared For in 2017




1. Define SEO and its types?


Ans: SEO is an acronym for Search Engine Optimization, the process of improving a web page's or website's position in search engine results. This is done through the use of keywords and phrases.

The two main types of SEO are:

- On Page Optimization
- Off Page Optimization

2. What SEO tools do you prefer using?


Ans: I presently use Google Analytics, Alexa, keyword research tools, Open Site Explorer, and Google Webmaster Tools.

3. What is Backlink?


Ans: A backlink is an incoming link to your website or web page from another site.

4. What are outbound links?


Ans: Outbound links are links from your website that point to another web page or website.

5. What is Googlebot?


Ans: Googlebot is the software Google uses to discover and index web pages. It collects details from each page it visits and handles the following:

- Caching
- Crawling
- Indexing
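
Googlebot identifies itself in the user-agent string, and a visit that claims to be Googlebot can be verified with the reverse-plus-forward DNS lookup that Google documents. Below is a minimal Python sketch of that check; the IP address is only an example:

```python
import socket

def is_real_googlebot(ip_address):
    """Verify a claimed Googlebot visit with a reverse + forward DNS check."""
    try:
        host = socket.gethostbyaddr(ip_address)[0]          # reverse lookup
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        forward_ips = socket.gethostbyname_ex(host)[2]      # forward lookup
        return ip_address in forward_ips
    except (socket.herror, socket.gaierror):
        return False

print(is_real_googlebot("66.249.66.1"))  # example address from a Googlebot range
```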

6. Explain Cross linking and its function?


Ans: Cross linking is the process of linking one site to another so that users can move between them. It serves several functions, such as giving users reference sites with content relevant to their search and helping pages get displayed in search engines through SEO techniques. A website's ranking is calculated on the basis of its relevance before it appears in search results, and cross linking is commonly built through reciprocal links and inbound links.

7. Why are keywords used in SEO?


Ans: A keyword is a single word, and a combination of keywords forms a phrase. Search engines use both keywords and phrases to index and surface content on the internet. Keywords are stored in the search engine's database, and when a search is conducted, the engine returns the best possible matches for them.

8. Explain body content relevance and its function?


Ans: Body content relevance refers to the text on a web page excluding images; it is also known as non-image text. Relevant body content ensures efficient optimization of a website and boosts its search engine ranking.

9. What are Spiders, Robots and Crawlers? Explain their functions?


Ans: There is no difference between spiders, robots, and crawlers; they are the same thing referred to by different names. They are software programs that follow, or "crawl", links across the internet, grab content from the sites they visit, and add it to the search engine's index.

10. How will you check the impact of your SEO campaign? How will you come to know if it is working?


Ans: Whether offering SEO services to a small business or an industry giant, I would check the website statistics, which show where the traffic originates. Another way is to search for the relevant keywords and key phrases and assess the results; the site's position in those results will tell me whether the SEO campaign is working.

11. What do you know about competitive analysis?


Ans: Competitive analysis compares the website being optimized with the websites that rank highly in search results for the same keywords.

12. What if your SEO technique does not work?


Ans: I will first assess the root cause of the problem and resolve the issues one by one:

- If the project is new, I will re-check the keywords.
- I will search for more relevant keywords.
- If all pages of the website are indexed properly but the site does not appear in the first 10 pages of the search engine results, I will make changes to the titles, page text, and descriptions.
- If the website is not indexed well or has been dropped from the index, it has serious issues that I will focus on and re-work.

13. Define PPC?


Ans: PPC is an acronym for Pay Per Click. It is an advertising campaign model, such as the ad campaigns hosted by Google, and it is categorised into two different modules:

- CPC (Cost per click) – flat rate
- CPM (Cost per thousand impressions) – bidding

With CPC, the advertiser is charged only when a user clicks on the advert.
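
As a quick illustration of the two billing modules, here is a small Python sketch of the spend calculation; the rates and volumes are made-up numbers:

```python
# Illustrative spend calculation for the two PPC billing modules.
clicks = 200
cost_per_click = 0.45               # CPC: a set price for each click
cpc_spend = clicks * cost_per_click

impressions = 50_000
cost_per_thousand = 2.10            # CPM: a price per 1,000 ad impressions
cpm_spend = (impressions / 1000) * cost_per_thousand

print(f"CPC spend: ${cpc_spend:.2f}")   # CPC spend: $90.00
print(f"CPM spend: ${cpm_spend:.2f}")   # CPM spend: $105.00
```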

14. What do you know about 301 redirect?


Ans: A 301 redirect is a technique that sends users from an old page URL to a new one. It is a permanent redirect, and it also passes the link juice from the old URL to the new URL.
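
For illustration, here is a minimal sketch of a permanent redirect using Flask; the routes are hypothetical, and any web server or framework can do the same:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-page")
def old_page():
    # A 301 status tells browsers and search engines the move is permanent,
    # so the old URL's link value is passed on to the new URL.
    return redirect("/new-page", code=301)
```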

15. What are Webmaster tools?


Ans: Google Webmaster Tools (now Google Search Console) is a free service from Google that provides data on crawl errors, backlinks, indexing, search queries, click-through rate (CTR), and more.

16. Define keyword density. What is the formula for determining keyword density?


Ans: Keyword density measures how often a keyword appears in your content relative to its total length, and it helps your content stand out from other websites. The formula for determining it is:

Keyword density = (number of times the keyword appears / total number of words in the article) × 100
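
Expressed as a small Python function (the sample text and keyword are only for illustration):

```python
def keyword_density(text, keyword):
    """Return (keyword occurrences / total words) * 100."""
    words = text.lower().split()
    if not words:
        return 0.0
    return (words.count(keyword.lower()) / len(words)) * 100

sample = "seo tips and seo tools for better seo"
print(keyword_density(sample, "seo"))  # 3 of 8 words -> 37.5
```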

17. What is robots.txt? 


Ans: Robots.txt is a text file that gives instructions to search engine crawlers about the indexing and caching of a specific web page, domain, or website file.
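
For illustration, the Python standard library can parse a robots.txt file and answer whether a given crawler may fetch a URL; example.com below is a placeholder domain:

```python
from urllib import robotparser

# A typical robots.txt might contain lines such as:
#   User-agent: *
#   Disallow: /private/
#   Sitemap: https://www.example.com/sitemap.xml
parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")
parser.read()

# Is Googlebot allowed to crawl this page?
print(parser.can_fetch("Googlebot", "https://www.example.com/private/page.html"))
```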

18. Assume that the company website you are working for decides to shift all contents to a new domain. How will you handle this situation?


Ans: First, update the previous site with permanent (301) redirects to the corresponding new pages for all URLs. Second, remove the previous content from the search engine index to avoid issues related to duplicate content.
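
One way to handle the first step is to generate permanent-redirect rules from a mapping of old paths to new URLs. The sketch below writes Apache-style rules from such a mapping; the paths and domain are placeholders:

```python
# Hypothetical mapping from old paths to their new-domain equivalents.
OLD_TO_NEW = {
    "/about": "https://new-domain.example/about",
    "/blog/seo-tips": "https://new-domain.example/blog/seo-tips",
}

# Write one "Redirect 301 <old-path> <new-url>" rule per page.
with open("redirects.conf", "w") as f:
    for old_path, new_url in OLD_TO_NEW.items():
        f.write(f"Redirect 301 {old_path} {new_url}\n")
```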

19. Is it possible to optimize a website with millions of pages?


Ans: Yes, by adding the extra SEO elements a dynamic website of that size needs. These include a good internal link structure, dynamic XML sitemap generation, and dynamic generation of titles and descriptions.
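
For example, a dynamic XML sitemap can be generated from the site's URL inventory. The sketch below assumes the URLs come from a database query or similar source; a site with millions of pages would split the output across several files:

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    # The sitemap protocol caps each file at 50,000 URLs, so very large
    # sites emit multiple sitemap files plus a sitemap index.
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for page_url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = page_url
    return ET.tostring(urlset, encoding="unicode")

print(build_sitemap(["https://www.example.com/", "https://www.example.com/page-1"]))
```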

20. What are the latest updates in SEO?


Ans: Panda and Penguin are the latest major updates in SEO.

21. What is the meaning of “pizza box” in terms of Google?


Ans: “Pizza box” refers to a Google server housed in a thin, flat standard case that resembles a pizza box.

22. What is the difference between soft 404 and 404 error?


Ans: A soft 404 is an error page returned without the HTTP 404 status code, typically a "page not found" page served with a 200 (OK) status. A normal 404 error, on the other hand, is reported with the HTTP 404 status code itself.
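
A rough way to spot the difference is to look at the status code a missing page returns. The check below uses the requests library and a placeholder URL, and the body test is only a heuristic:

```python
import requests

response = requests.get("https://www.example.com/this-page-does-not-exist")

if response.status_code == 404:
    print("Hard 404: the server reports the missing page with a 404 status.")
elif response.status_code == 200 and "not found" in response.text.lower():
    print("Possible soft 404: a 'page not found' page served with status 200.")
else:
    print(f"Returned status {response.status_code}")
```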

23. What are the key aspects of the Panda update?


Ans: Panda is a Google update designed to improve the quality of search results. The latest version of Panda rewards high-quality content, excellent design, fast loading speed, appropriate use of images, and more.

24. Explain the key aspects of the Penguin update?


Ans: Penguin is the code name for a Google algorithm update. Its main target is to lower the ranking of websites that violate the Google Webmaster Guidelines, typically by using black hat techniques such as keyword stuffing and cloaking.

25. What is the size limit for a robots.txt file?


Ans: At present, Google enforces a size limit of 500 kilobytes (KB) on robots.txt files.
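
A quick way to see how close a file is to that limit (requests library and a placeholder URL assumed):

```python
import requests

LIMIT_BYTES = 500 * 1024  # Google's documented 500 KB robots.txt limit

robots = requests.get("https://www.example.com/robots.txt")
size = len(robots.content)
print(f"robots.txt is {size} bytes ({size / LIMIT_BYTES:.1%} of the limit)")
```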

26. What is the best way to neutralize a toxic link to a website?


Ans: Using a backlink quality checker will tell you who links to your website. From there, go to the 'Toxic Links' report, which lists the links that are harmful to your site. Any link in the toxic links report that points to your website can be neutralized with the help of the Google Disavow tool, as sketched below.
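
The disavow file itself is plain text with one URL or domain: entry per line and # for comments. The sketch below builds one from a hypothetical list of toxic links:

```python
# Hypothetical links flagged by a toxic link report.
toxic_links = [
    "http://spammy-directory.example/link-page.html",
    "domain:low-quality-links.example",
]

# Write the plain-text file accepted by the Google Disavow tool.
with open("disavow.txt", "w") as f:
    f.write("# Links flagged as toxic in the backlink audit\n")
    for entry in toxic_links:
        f.write(entry + "\n")
```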

Thanks.....