Wednesday, April 25, 2007

Getting your site indexed

After registering a domain and creating a website, the next thing almost everybody wants is to get it indexed by Google and rank high. Since we started supporting webmasters in the Portuguese language market in 2006, we have seen growing speculation about how Google indexes and ranks websites. The Portuguese language market is one of the biggest web content generators and is still developing in terms of SEO, so we decided to shed some light on the most debated questions.

We have noticed that it is very popular among Portuguese webmasters to engage in massive link exchange schemes and to build partner pages exclusively for the sake of cross-linking, disregarding the quality of the links, their sources, and the long-term impact they will have on their sites; other popular issues involve over-concern with PageRank and how often Google crawls their websites.

Generally, our advice is to consider what you have to offer before you create your own website or blog. The recipe for a successful site is unique and original content where users find valuable, up-to-date information corresponding to their needs.

To address some of these concerns, we have compiled some hints for Portuguese webmasters:

  • Be an authority on the subject. Being experienced in the subject you write about will naturally drive to your site users who search for that specific subject. Don't be too concerned about back-links and PageRank; both will grow naturally as your site becomes a reference. If users find your site useful and of good quality, they will most likely link to it, return to it, and/or recommend it to other users. This also influences how relevant your site is to Google: if it's relevant for users, then it's likely relevant to Google as well.
  • Submit your content to Google and update it on a frequent basis. This is another key factor in how frequently your site will be crawled. If your content is not frequently updated or if your site is not relevant to the subject, most likely you will not be crawled as often as you would like. If you wonder why Google doesn't crawl your site on a frequent or constant basis, then maybe this is a hint that you should update your site more often. Apart from that, Webmaster Central offers webmaster tools to help you get your site crawled.
  • Don't engage in link exchange schemes. Be aware that link exchange programs or deals that promise to boost your site's visibility with minimal effort might entail corrective action from Google. Our Google Webmaster Guidelines clearly address this issue under "Quality Guidelines – basic principles". Avoid engaging in these kinds of schemes and don't build pages specifically for exchanging links. Bear in mind that it is not the number of links pointing to your site that matters, but the quality and relevance of those links.
  • Avoid pure affiliations. In the Latin American market there is a massive number of sites created just for pure affiliation purposes, such as pure mercadolivre catalogs. There is no problem in being an affiliate as long as you create some added value for your users and produce valuable content that a user can't find anywhere else, like product reviews and ratings.
  • Use AdSense wisely. Monetizing original and valuable content will generate more revenue from AdSense than directories with no added value. Be aware that sites without added value will turn users away before they ever click on an AdSense ad.

You should bear in mind that indexing, and how Google crawls your site, involves many variables, and in many cases your site won't come up in the SERPs as quickly as you expected. If you are not sure about a particular issue, consider visiting the Google Webmaster Guidelines or seek guidance in your community. In most cases you will get good advice and positive feedback from more experienced users. One of the recommended places to start is the Google discussion group for webmasters (in English), as well as the recently launched Portuguese discussion group for webmasters, which we will monitor on a regular basis.

Tuesday, April 24, 2007

Come out to SMX Advanced in Seattle and party with Webmaster Central

Our team at Webmaster Central is always looking for ways to communicate with you, the webmaster community. We do this by providing tools that tell you more about your site and let you give us input about it, talking to you in our discussion forums, reading what you have to say across the blogs and forums on the web, blogging here, and talking to you in person at conferences. We can't talk to as many of you in person as we can reach through other means, such as this blog, but we find meeting face-to-face invaluable.

So, we're very excited about an upcoming conference in our hometown, Seattle -- SMX Advanced, June 4-5. Since it's nearby, many from our team can attend and we're hoping to hear more about what you like and what you'd like to see us do in the coming year. We're participating in two summits at this conference. Summits are a great way to find out exactly what issues you're facing and explore ways we can solve them together. We can weigh the alternatives and make sure we understand the obstacles from your perspective. The recent robots.txt summit was a great opportunity for all the search engines to get together and brainstorm with you, the webmaster. We came away from that with lots of great ideas and a better understanding of what you're looking for most with the evolution of robots.txt. We hope to do the same with the two summits at SMX Advanced.

At the Duplicate Content Summit, I'd love to talk to you about the types of situations you're facing with your site. Are you most concerned about syndicating your content? Using dynamic URLs with changing parameters? Providing content for sites in multiple countries? For each issue, we'll talk about ways we can tackle them. What solutions can we offer that will work best for you? I'm very excited about what we can accomplish at this summit, although I'm not quite as excited about the 9am start time. Fortunately, our party isn't the night before.

At the Penalty Box Summit, Matt Cutts will be on hand to talk to you about all the latest with our guidelines and reinclusion procedures. And he'll want to hear from you. What concerns do you have about the guidelines? How can we better communicate violations to you? Unfortunately, our party is the night before this session, but I'm sure there will be lots of coffee on hand.

And speaking of the party... since conference attendees are coming all the way to Seattle, we thought we should throw one. The Google Seattle/Kirkland office and the Webmaster Central team are hosting SMX After Dark: Google Dance NW on Monday night. We want to say thanks to you for this great partnership, as well as give you the chance to learn more about what we've been up to. We'll have food, drinks, games (Pacman and Dance Dance Revolution anyone?), and music. Talk to the Webmaster Central engineers, as well as engineers from our other Kirkland/Seattle product teams, such as Talk, Video, and Maps. We may even have a dunk tank! Who would you most like to try your hand at dunking?

Thursday, April 19, 2007

We were in Madrid

On March 8th and 9th we attended the OJOBuscador conference in Madrid. The event was very interesting for us since we had the chance to listen to presentations from the main search engines and to discuss with webmasters their main concerns regarding Spanish search engine positioning. One key point mentioned frequently, both in the SEO sessions and in the informal chats, was the disadvantage of working in an SEO market where several companies use sneaky methods that go against the official Webmaster Guidelines.

Some of the techniques we have observed include generating satellite domains and creating thousands of irrelevant pages with the sole purpose of gaining traffic from search queries that are not necessarily related to the site's content. Another phenomenon we have observed is the steady influx of domains which only have content from affiliate sites without adding any unique value or relevance.

We are going to be stricter against the techniques discussed above, as we consider it very important not to deceive users. At the same time, we think the ultimate responsibility for the contents of a website belongs to the webmaster, who should watch over site quality and verify that the pages are made for the user.

Aiming to enhance communication with the webmaster community, we would like to announce that going forward Google will participate and monitor the Spanish Webmaster Discussion Forum.

Wednesday, April 18, 2007

More insight into anchor text

Last month, we replaced the individual anchor text words that we showed for your site in webmaster tools with a list of full anchor phrases. This report shows you the top phrases that other sites use to link to the pages of your site. Now, we've enhanced the information we show you in the following ways:
  • We've expanded the number of phrases we show to 200.
  • You can now see the variations of each phrase (for instance, with different capitalization and punctuation).
  • More sites now have access to the anchor phrase report. So, if you didn't have this report before, you may have it now.
  • We've brought back the report showing the most common individual words in anchor text (you asked; we delivered!).
  • We've expanded the number of common words in anchor text and common words in your site that we show to 100 each.
To view this information, click the Page analysis link from the Statistics tab.


In addition, we've updated our robots.txt analysis tool to correctly interpret the new Sitemap instruction that we announced support for last week.

We hope this additional insight is helpful in learning how others view your site. Keep your suggestions coming! We're listening.

Tuesday, April 17, 2007

Requesting removal of content from our index

Note: The user-interface of the described features has changed.

As a site owner, you control what content of your site is indexed in search engines. The easiest way to let search engines know what content you don't want indexed is to use a robots.txt file or robots meta tag. But sometimes, you want to remove content that's already been indexed. What's the best way to do that?

As always, the answer begins: it depends on the type of content that you want to remove. Our webmaster help center provides detailed information about each situation. In general, once you've removed or blocked the content and we recrawl the page, we'll remove the content from our index automatically. But if you'd like to expedite the removal rather than wait for the next crawl, the way to do that has just gotten easier.

For sites that you've verified ownership for in your webmaster tools account, you'll now see a new option under the Diagnostic tab called URL Removals. To get started, simply click the URL Removals link, then New Removal Request. Choose the option that matches the type of removal you'd like.



Individual URLs
Choose this option if you'd like to remove a URL or image. In order for the URL to be eligible for removal, one of the following must be true: the URL is blocked with a robots.txt file, the page contains a robots noindex meta tag, or the page returns a 404 or 410 status code.
Once the URL is ready for removal, enter the URL and indicate whether it appears in our web search results or image search results. Then click Add. You can add up to 100 URLs in a single request. Once you've added all the URLs you would like removed, click Submit Removal Request.
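One way to make a page eligible for removal (and keep it out of the index going forward) is a robots meta tag in the page's head. A minimal sketch (the title and body here are placeholders):

```html
<html>
  <head>
    <!-- Tell search engine robots not to index this page -->
    <meta name="robots" content="noindex">
    <title>Page pending removal</title>
  </head>
  <body>...</body>
</html>
```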

A directory
Choose this option if you'd like to remove all files and folders within a directory on your site. For instance, if you request removal of the following:

http://www.example.com/myfolder

this will remove all URLs that begin with that path, such as:

http://www.example.com/myfolder
http://www.example.com/myfolder/page1.html
http://www.example.com/myfolder/images/image.jpg

In order for a directory to be eligible for removal, you must block it using a robots.txt file. For instance, for the example above, http://www.example.com/robots.txt could include the following:

User-agent: Googlebot
Disallow: /myfolder
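If you want to double-check that your robots.txt rule really covers the whole directory before filing a removal request, you can evaluate the rules locally, for instance with Python's standard urllib.robotparser (a sketch using the example rules above; the URLs are illustrative):

```python
from urllib import robotparser

# The robots.txt rules from the example above
rules = """\
User-agent: Googlebot
Disallow: /myfolder
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Everything under /myfolder is blocked for Googlebot...
blocked = rp.can_fetch("Googlebot", "http://www.example.com/myfolder/page1.html")
# ...while other paths remain crawlable.
allowed = rp.can_fetch("Googlebot", "http://www.example.com/other.html")
print(blocked, allowed)  # False True
```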

Your entire site
Choose this option only if you want to remove your entire site from the Google index. This option will remove all subdirectories and files. Do not use this option to remove the non-preferred version of your site's URLs from being indexed. For instance, if you want all of your URLs indexed using the www version, don't use this tool to request removal of the non-www version. Instead, specify the version you want indexed using the Preferred domain tool (and do a 301 redirect to the preferred version, if possible). To use this option, you must block the site using a robots.txt file.
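For instance, the non-preferred hostname can be 301-redirected at the server level. Assuming an Apache server with mod_rewrite enabled (the host names are illustrative), an .htaccess sketch might look like:

```
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

With this in place, requests for the non-www host receive a permanent redirect to the www version, which matches the preference you set with the Preferred domain tool.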

Cached copies

Choose this option to remove cached copies of pages in our index. You have two options for making pages eligible for cache removal.

Using a meta noarchive tag and requesting expedited removal
If you don't want the page cached at all, you can add a meta noarchive tag to the page and then request expedited cache removal using this tool. By requesting removal using this tool, we'll remove the cached copy right away, and by adding the meta noarchive tag, we will never include the cached version. (If you change your mind later, you can remove the meta noarchive tag.)
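The meta noarchive tag itself is a one-line addition to the page's head:

```html
<!-- Ask search engines not to keep a cached copy of this page -->
<meta name="robots" content="noarchive">
```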

Changing the page content
If you want to remove the cached version of a page because it contained content that you've since removed and don't want indexed, you can request the cache removal here. We'll check that the content on the live page is different from the cached version and, if so, we'll remove the cached copy. We'll automatically make the latest cached version of the page available again after six months (by which point we likely will have recrawled the page, so the cached version will reflect the latest content). If you see that we've recrawled the page sooner than that, you can use this tool to request that we reinclude the cached version earlier.

Checking the status of removal requests
Removal requests show as pending until they have been processed, at which point, the status changes to either Denied or Removed. Generally, a request is denied if it doesn't meet the eligibility criteria for removal.


To reinclude content
If a request is successful, it appears in the Removed Content tab and you can reinclude it any time simply by removing the robots.txt or robots meta tag block and clicking Reinclude. Otherwise, we'll exclude the content for six months. After that six month period, if the content is still blocked or returns a 404 or 410 status message and we've recrawled the page, it won't be reincluded in our index. However, if the page is available to our crawlers after this six month period, we'll once again include it in our index.

Requesting removal of content you don't own
But what if you want to request removal of content that's located on a site that you don't own? It's just gotten easier to do that as well. Our new Webpage removal request tool steps through the process for each type of removal request.

Since Google indexes the web and doesn't control the content on web pages, we generally can't remove results from our index unless the webmaster has blocked or modified the content or removed the page. If you would like content removed, you can work with the site owner to do so, and then use this tool to expedite the removal from our search results.

If you have found search results that contain specific types of personal information, you can request removal even if you've been unable to work with the site owner. For this type of removal, provide your email address so we can work with you directly.



If you have found search results that shouldn't be returned with SafeSearch enabled, you can let us know using this tool as well.

You can check on the status of pending requests, and as with the version available in webmaster tools, the status will change to Removed or Denied once it's been processed. Generally, the request is denied if it doesn't meet the eligibility criteria. For requests that involve personal information, you won't see the status available here, but will instead receive an email with more information about next steps.

What about the existing URL removal tool?
If you've made previous requests with this tool, you can still log in to check on the status of those requests. However, make any new requests with this new and improved version of the tool.

Wednesday, April 11, 2007

What's new with Sitemaps.org?

What has the Sitemaps team been up to since we announced sitemaps.org? We've been busy trying to get Sitemaps adopted by everyone and to make the submission process as easy and automated as possible. To that end, we have three new announcements to share with you.

First, we're making the sitemaps.org site available in 18 languages! We know that our users are located all around the world and we want to make it easy for you to learn about Sitemaps, no matter what language you speak. Here is a link to the Sitemap protocol in Japanese and the FAQ in German.

Second, it's now easier for you to tell us where your Sitemaps live. We wondered if we could make it so easy that you wouldn't even have to tell us and every other search engine that supports Sitemaps. But how? Well, every website can have a robots.txt file in a standard location, so we decided to let you tell us about your Sitemap in the robots.txt file. All you have to do is add a line like

Sitemap: http://www.mysite.com/sitemap.xml

to your robots.txt file. Just make sure you include the full URL, including the http://. That's it. Of course, we still think it's useful to submit your Sitemap through Webmaster tools so you can make sure that it was processed without any issues and get additional statistics about your site.
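As a rough illustration of the discovery step, here is how a crawler might pick the Sitemap line out of a robots.txt file. This is a Python sketch with a hypothetical helper; real parsers handle more edge cases:

```python
def sitemap_urls(robots_txt):
    """Collect the URLs given on 'Sitemap:' lines (the key is case-insensitive)."""
    urls = []
    for line in robots_txt.splitlines():
        # Split at the first colon only, so the colon in http:// is preserved
        key, sep, value = line.partition(":")
        if sep and key.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls

robots = """User-agent: *
Disallow: /private/
Sitemap: http://www.mysite.com/sitemap.xml
"""
print(sitemap_urls(robots))  # ['http://www.mysite.com/sitemap.xml']
```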

Last but not least, Ask.com is now also supporting the Sitemap protocol. And with the ability to discover your Sitemaps from your robots.txt file, Ask.com and any other search engine that supports this change to robots.txt will be able to find your Sitemap file.

Friday, April 6, 2007

Drop by and see us at SES NY

If you're planning to attend the Search Engine Strategies conference next week in New York, be sure to come by and say hi! A whole bunch of us from the Webmaster Central team will be there, looking to talk to you, get your feedback, and answer your questions. Be sure to join us for lunch on Tuesday, April 10th, where we'll spend an hour answering any question you may have. And then come by our other sessions, or find us in the expo hall or the bar.

Tuesday, April 10

11:00am - 12:30pm

Ads in a Quality Score World
Nick Fox, Group Business Product Manager, Ads Quality

12:45pm - 1:45pm

Lunch Q&A with Google Webmaster Central
Vanessa Fox, Product Manager, Webmaster Central
Trevor Foucher, Software Engineer
Jonathan Simon, Webmaster Trends Analyst
Maile Ohye, Sitemaps Developer Support Engineer
Nikhil Gore, Test Engineer
Amy Lanfear, Technical Writer
Susan Mowska, International Test Engineer
Evan Roseman, Software Engineer

Wednesday, April 11

10:30am - 12:00pm

Web Analytics & Measuring Success
Brett Crosby, Product Marketing Manager, Google Analytics

Sitemaps & URL Submission
Maile Ohye, Sitemaps Developer Support Engineer

1:30pm - 2:45pm

Duplicate Content & Multiple Site Issues
Vanessa Fox, Product Manager, Webmaster Central

Meet the Search Ad Networks
Brian Schmidt, Online Sales and Operations Manager

3:15pm - 4:30pm

Earning Money from Contextual Ads
Gavin Bishop, GBS Sales Manager, AdSense

4:45pm - 6:00pm

Landing Page Testing & Tuning
Tom Leung, Product Manager, Google Website Optimizer

robots.txt Summit
Dan Crow, Product Manager

Thursday, April 12

9:00am - 10:15am

Meet the Crawlers
Evan Roseman, Software Engineer

Search Arbitrage Issues
Nick Fox, Group Business Product Manager, Ads Quality

11:00am - 12:15pm

Images & Search Engines
Vanessa Fox, Product Manager, Webmaster Central

4:00pm - 5:15pm

Auditing Paid Listings & Click Fraud Issues
Shuman Ghosemajumder, Business Product Manager, Trust and Safety

Friday, April 13

12:30pm - 1:45pm

Search Engine Q&A on Links
Evan Roseman, Software Engineer

CSS, Ajax, Web 2.0 and Search Engines
Dan Crow, Product Manager

Thursday, April 5, 2007


Linking

One popular way to optimize webpages for search engines, especially among Polish webmasters, is link exchanges or buying high-PageRank links. Unfortunately, in the choice of link partners, some webmasters' priority has not always been what is best for the user. This causes some people to link to totally unrelated pages or engage in link exchanges with spammy sites. This kind of linking does not provide additional value to the page's visitors and is an SEO method that, like hiding text, can be considered spammy. Google's webmaster guidelines refer clearly to methods of this type under "quality guidelines".

Because we care about our Polish users, Google recently improved its algorithms and methods of link validation for Polish search results. We do this because we want to provide our users with the best SERPs (Search Engine Result Pages) possible.

How to link in order not to violate Google’s webmaster guidelines?
If you want to increase your PageRank and to improve your position in the SERPs, you should always be thinking about your visitors’ needs. This refers to content as much as to linking.

Linking to and from related sites is still very much appreciated by Google and will have a positive impact on your position in the index. At the same time, Google will work to stop the impact of excessive off-topic link exchanges and bought links, including automated link exchange programs.

How to create relevant links?
The best way to gain relevant links is to create unique, relevant content that can quickly gain popularity in the Internet community, especially among those who are interested in the topic, such as blog publishers. Also, look for editorially given links based on merit, since naturally earned links tend to exist longer and will pass the test of time. Therefore, the best way to go is to focus on your visitors' needs, whether that concerns content or linking. Before making any decision, ask yourself: is this going to be beneficial for my page's visitors?