Tuesday, 2 July 2013

New York becomes first U.S. city to get unique Web domain

Casting .com, .net, and .org aside, New Yorkers will soon be able to get .nyc for their top-level domain names.
New York City's information Web site for the .nyc Web address.
(Credit: Screenshot by Dara Kerr/CNET)
It appears a new domain landgrab has begun, kicking off with New York City becoming the first place in the U.S. to get its own top-level domain: .nyc.
New York City Mayor Michael Bloomberg announced the news Tuesday, saying this new URL will greatly help residents and businesses establish themselves as true New Yorkers.
"Having our own unique, top-level domain -- .nyc -- puts New York City at the forefront of the digital landscape and creates new opportunities for our small businesses," Mayor Bloomberg said in a statement. "They'll now be able to identify themselves as connected to New York City, one of the world's strongest and most prestigious brands."
Besides the name association, specific Web addresses also make it easier for people anywhere to find search results in particular locations.
New York's new domain was approved by the Internet Corporation for Assigned Names and Numbers (ICANN) in May. The organization has been working for years to expand generic top-level domains, like .com, .org, .net, and .edu, to more localized addresses for cities, countries, organizations, businesses, and more.
ICANN announced in February that it plans to roll out hundreds of new top-level domains this year, which will make for the largest growth of Internet addresses since the 1980s. Foreign languages were the first to start getting the new monikers, and brand names like .cadillac and regional addresses like .nyc are next in line.
New Yorkers will have to register for a .nyc domain, which comes with a list of rules, such as a physical address within the city limits. Registration for the Web address is expected to open in late 2013 and the cost for individuals and businesses is still being determined.
"Online search is increasingly driven not only by what a business does but also where it is located," said Ken Hansen, general manager for .nyc Registry Services for Neustar, which will operate .nyc on behalf of New York City. "A .nyc address will enable New Yorkers to easily find local businesses, services, and information online."

Ubisoft hacked, users' e-mails and passwords exposed

The video game developer, known for creating Assassin's Creed, announces that its account database was breached and that all users should reset their passwords.
Ubisoft, the maker of Assassin's Creed, announces that its online network was breached by hackers.
(Credit: Ubisoft)
Anyone who has an account with video game developer Ubisoft is being asked to change their password immediately. The game maker announced Tuesday that its user account database was breached by hackers who gained access to user names, e-mail addresses, and encrypted passwords.
"We recently discovered that one of our Web sites was exploited to gain unauthorized access to some of our online systems," Ubisoft wrote in a statement. "During this process, we learned that data had been illegally accessed from our account database."
The game maker emphasized that the company doesn't store personal payment information, so no credit or debit card information was stolen. However, since passwords could have been stolen, the company is recommending that users change their passwords on any Web site where they used the same or a similar password.
Ubisoft makes hit video games such as Assassin's Creed, Just Dance, and Tom Clancy's The Division. The company won't specify how hackers breached its system; it only said, "credentials were stolen and used to illegally access our online network."
This isn't the first time that Ubisoft's network has been hacked. In 2010, a group of hackers known as Skid Row claimed responsibility for breaching Ubisoft's Web site in protest of a policy that required gamers to have a constant Internet connection to play their games. That hack didn't affect users' personal information; instead, it removed the company's digital rights management technology from PC games.
As for the most recent hack, Ubisoft said it is investigating the breach with the "relevant authorities" and working to restore its systems.
"Ubisoft's security teams are exploring all available means to expand and strengthen our security measures in order to better protect our customers," the game maker wrote. "Unfortunately, no company or organization is completely immune to these kinds of criminal attacks."

Sunday, 30 June 2013

IT companies eye opportunity in Indian education



Indian tech companies are flocking to the education sector, seeing a rare promising market created by liberal government spending.

MUMBAI: Indian technology companies are flocking to the education sector, seeing in it a rare promising market created because of liberal government spending.

Providers of computing hardware, education-related software and content are eager to tap into business as the state and central governments prepare to step up spending to take education beyond cities and towns to villages.

"India is spending Rs 2.5 lakh crore on learning at a macro level. It's a market that continues to have a double-digit growth year after year. At present, only about 5% of that is technology-enabled," HCL Infosystems CEO Harsh Chitale said.

According to technology market researcher IDC, about 5% of the $40 billion (Rs 2.4 lakh crore) IT market in India, including hardware, software and IT services, was in the education sector. The market is expected to grow at nearly 12% through 2017.

The Indian government's education programmes already have some technology component, but that is tiny compared to the overall spending. In 2013-2014, the government allocated more than Rs 27,000 crore to the Sarva Shiksha Abhiyan, with a provision to spend Rs 50 lakh in every district on computer-aided learning. This will grow in the coming years.

"A large part of our school or education infrastructure is in the public sector and so far most of the effort was around improving enrolment and infrastructure. But there is a greater focus now on delivering quality education. And that is where technology will play a role," Nilaya Varma, management consulting lead for public services at Accenture, told ET.

Maharashtra has some projects running in pilot phase, and is also seeking proposals for education-related projects, Varma said. Last July, HCL Infosystems acquired content company Edurix to tap the growing market for digital content-based learning.

Chitale hopes that the market, which is now limited to city schools, will reach all of India as some of the technologies have demonstrated improvements in teaching outcomes.

Cisco, which provided remote learning services to schools in Raichur in rural Karnataka as part of a corporate social responsibility programme, is now offering similar services for a price as part of its inclusive business division. The division launched the Cisco Education Enabled Development Solution earlier this year at a monthly fee of $1 (Rs 60) for every student.

Arvind Sitaraman, president of Cisco's inclusive growth division, said the intent is to help government bring inclusive growth to rural areas using the latest technology at affordable prices.

Intel, which runs a teacher training programme in villages and has an education product that is sold through computing devices, is also looking at this market.

"I think the interest in the government is high. India has a vast geography and you can't go everywhere to provide facilities, so this will happen through internet broadband and remote learning," said Sandeep Aurora, director of market development at Intel South Asia.

Some Indian companies are already running for-profit digital learning models in the villages. SREI Sahaj, part of the Kolkata-based Kanoria group, runs an e-learning portal delivered through the government's common service centres, which have been created at the panchayat level. The learning on the systems is certified by the Indira Gandhi National Open University.

ICANN close to opening new generic top-level domains

The agency in charge of website addresses passed a major milestone Friday on the path to broadening the world of domain names by the end of this year.

The board of US-based Internet Corporation for Assigned Names and Numbers (ICANN) touted freshly-approved benefits and responsibilities for registrars that essentially act as domain name wholesalers.

Changes to contractually enforceable rules include requiring registrars to confirm phone numbers or addresses of those buying domain names within 15 days.

"People who have stolen an identity or have criminal backgrounds obviously don't want to give you their name and address if their intentions are not kosher," said Cyrus Namazi, ICANN's vice president of industry engagement.

"The intent here is to weed out bad actors."

Prior to new rules outlined in the Registrar Accreditation Agreement, there were "loose checks and balances" to make sure aliases weren't being used by people buying domain names, according to Namazi.

"It is a very serious and significant milestone in moving toward new gTLDs (generic Top-Level domains)," he said.

ICANN is considering more than 1,800 requests for new web address endings, ranging from the general such as ".shop" to the highly specialized like ".motorcycles."

Many of the requests are from large companies such as Apple, Mitsubishi and IBM -- with Internet giant Google alone applying for more than 100, including .google, .youtube, and .lol -- Internet slang for "laugh out loud."

California-based ICANN says the huge expansion of the Internet, with some two billion users around the world, half of them in Asia, means new names are essential.

There are currently just 22 gTLDs, of which .com and .net comprise the lion's share of online addresses.

"We spent a long time negotiating very thorny issues," Akram Atallah, ICANN's generic domains division head, said in an online video.

"The new agreement achieves everything we wished for in order to roll out the new gTLD program."

The first new website address endings should be available in the final quarter of this year, according to Namazi.

The revamped agreement will affect more than 1,000 domain name registrars around the world.

ICANN has been negotiating with domain handlers for more than two years on agreement revisions, with interests of governments and law enforcement agencies among those factored into changes, according to Namazi.

"Law enforcement agencies played a big role in it, because Internet crime is one of the biggest factors out there," he said.

"Governments are actively involved because the Internet is one thing that connects all the governments of the world and some want to control it."

The agreement doesn't require domain operators to go beyond legal limits regarding information that must be supplied to law enforcement officials, according to ICANN.

6 Things You Should Never Do On Social Networks

6 Types Of Facebook Posts Employers Don't Want To See: Survey 

It's important to be careful with what you put on Facebook and Twitter.
One day you may be looking for a job and your potential boss may get a gander at your Facebook page. Will they like what they see?
In a new survey from Harris Interactive and CareerBuilder.com, more than 2,000 hiring managers were asked how candidates' social media posts affect their chances of getting a job.
The survey found that 39 percent of companies use sites like Facebook and Twitter to research job candidates. And hiring managers identified six types of posts that made employers less likely to hire a candidate:

  • Provocative/inappropriate photos
  • Information about drinking or doing drugs
  • Bad-mouthing a previous employer
  • Poor communication skills
  • Racist, sexist or anti-religious remarks
  • Lying

Friday, 28 June 2013

What Is SEO / Search Engine Optimization?

SEO stands for “search engine optimization.” It is the process of getting traffic from the “free,” “organic,” “editorial” or “natural” listings on search engines. All major search engines such as Google, Yahoo and Bing have such results, where web pages and other content such as videos or local listings are shown and ranked based on what the search engine considers most relevant to users. Payment isn’t involved, as it is with paid search ads.

More SEO Advice For Newbies

For more basic but also in-depth advice, our Periodic Table Of SEO Success Factors introduces you to all the key concepts you need to know.


Chapter 1: Types Of Search Engine Success Factors

There are three major groups covered by Search Engine Land’s Periodic Table Of SEO Success Factors:

  • On The Page SEO
  • Off The Page SEO
  • Violations
Within each group are subgroups, as each chapter of this guide will explain. These subgroups contain one or more individual SEO factors with a specific weight or importance.
Violations, while a group unto themselves, are displayed under the group and subgroup with which they're associated.
(Image: Periodic Table Of SEO Success Factors)
Those two-letter symbols you see on the chart? That's our play on the periodic table of elements and the letter representation, or symbol, of each element. You may have had to remember that the symbol for gold was Au or that iron's was Fe.
We’ve tried to make it slightly more intuitive. The first letter of each “SEO element” comes from the subgroup that it’s in and the second letter stands for the individual factor.

Factors Work In Combination

No single SEO factor will guarantee search engine rankings. Having a great HTML title won’t help if a page has low quality content. Having many links won’t help if they are all low in quality. Having several positive factors can increase the odds of success while the presence of negative factors can worsen those odds.

On The Page Factors

On The Page search ranking factors are those that are entirely within the publisher’s own control. What type of content do you publish? Are you providing important HTML clues that help search engines (and users) determine relevancy? How does your site architecture help or hinder search engines?

Off The Page Factors

Off The Page ranking factors are those that publishers do not directly control. Search engines use these because they learned early on that relying on publisher controlled signals alone didn’t always yield the best results. For instance, some publishers may try to make themselves seem more relevant than they are in reality.
With billions of web pages to sort through, looking only at ‘on the page’ clues isn’t enough. More signals are needed to return the best pages for any particular search.

Violations

Make no mistake, search engines want people to perform SEO because it can help improve their search results. Search engines provide help in the form of guidelines, blog posts and videos to encourage specific SEO techniques.
However, there are some techniques that search engines deem “spam” or “black hat”, which could result in your pages receiving a ranking penalty or, worse, being banned from the search engines entirely.
Violations are generally tactics meant to deceive or manipulate a search engine’s understanding of a site’s true relevancy and authority.

Weighting

All the factors we show are weighted on a scale of one to three, as shown in the top right corner of each factor as well as reflected in the hue of that factor. A weighting of three is most important and is something you should pay special attention to because it has a bigger impact than other factors.
That doesn’t mean that factors weighted two or one aren’t important. They are. It’s just that they are of less importance, relatively speaking, in terms of the other factors on the chart. Violations are also weighted, but in negative numbers, with negative three being the worst and potentially most harmful to your SEO success.
The weighting is based on a combination of what search engines have said, surveys of the SEO community as well as our own expertise and experience in watching the space over time. We don’t expect them to be perfect. Not everyone will agree. Your mileage may vary.
But we’re confident it is a useful general guide.

“Missing” Factors & The Guide’s Philosophy

Experienced SEOs may be wondering why some factors aren’t shown. How come ALT text or bolding words aren’t included as HTML factors, for example?
The answer? We don’t think those things are as important, relatively speaking. We’re not trying to boil the ocean and encompass every possible signal (Google has over 200 of them) and sub-signals (Google has over 10,000 of those).
Instead, the goal of the Periodic Table Of SEO Success Factors and this online companion guide is to help those new to SEO focus on the big picture and perhaps allow experienced SEOs to hit the “reset” button if they’ve gotten lost staring at specific trees in the SEO forest.
That’s why this guide doesn’t address having your most important keywords be at the beginning or end of an HTML title tag. Nor are we trying to assess how much more weight an H1 header tag carries than an H2 tag.
We’re purposely avoiding being ultra specific because such things often distract and pull us down the rabbit hole. Instead, we hope you gain an understanding that pages should have descriptive titles, that indicating page structure with header tags may help, and topping things off with structured data is a good idea.
Do these things well and you’ve probably addressed 90% of the most important HTML factors.
Similarly, it’s not whether a good reputation on Twitter is worth more than on Facebook. Instead, we’re trying to help people understand that having social accounts that are reputable in general, which attract a good following and generate social shares, may ultimately help you achieve search success.

Chapter 2: Content & Search Engine Success Factors

Content is king. You'll hear that phrase over and over again when it comes to SEO success. Indeed, that's why the Periodic Table Of SEO Success Factors begins with the content "elements," with the very first element being about content quality.
Get your content right, and you’ve created a solid foundation to support all of your other SEO efforts.

Cq: Content Quality

More than anything else, are you producing quality content? If you’re selling something, do you go beyond being a simple brochure with the same information that can be found on hundreds of other sites?
Do you provide a reason for people to spend more than a few seconds reading your pages?
Do you offer real value, something of substance to visitors, that is unique, different, useful and that they won’t find elsewhere?
These are just some of the questions to ask yourself in assessing whether you’re providing quality content. This is not the place to skimp since it is the cornerstone upon which nearly all other factors depend.

Cr: Content Research / Keyword Research

Perhaps the most important SEO factor after creating good content is good keyword research. There are a variety of tools that allow you to discover the specific ways that people may be searching for your content.
You want to create content using those keywords, the actual search terms people are using, so you can produce content that effectively “answers” that query.
For example, a page about “Avoiding Melanoma” might use technical jargon to describe ways to prevent skin cancer. But a search engine might skip or not rank that page highly if people are instead searching for “skin cancer prevention tips”. Your content needs to be written in the right ‘language’ – the language your customer or user is using when searching.

Cw: Content Words / Use Of Keywords

Having done your keyword research (you did that, right?), have you actually used those words in your content? Or if you’ve already created some quality content before doing research, perhaps it’s time to revisit that material and do some editing.
Bottom line, if you want your pages to be found for particular words, it’s a good idea to actually use those words in your copy.
How often? Repeat each word you want to be found for at least five times or seek out a keyword density of 2.45%, for best results.
No no no, that was a joke! There’s no precise number of times. Even if “keyword density” sounds scientific, even if you hit some vaunted “ideal” percentage, that would guarantee absolutely nothing.
Just use common sense. Think about the words you want a page to be found for, the words you feel are relevant from your keyword research. Then use them naturally on the page. If you commonly shift to pronouns on second and later references, consider using the actual noun again here and there, rather than a pronoun.

Ce: Content Engagement

Quality content should produce meaningful interactions with users. Search engines may try to measure this interaction – engagement – in a variety of ways.
For example, how long do users stay on your page? Did they search, clickthrough to your listing but then immediately “bounce” back to the results to try something else? That “pogosticking” behavior can be measured by search engines and could be a sign that your content isn’t engaging.
Conversely, are people spending a relatively long time reviewing your content, in relation to similar content on other sites? That "time on site" metric, or "long click," is another type of engagement that search engines can measure and use to assess the relative value of content.
Social gestures such as comments, shares and “likes” represent another way that engagement might be measured. We’ll cover these in greater detail in the Social section of this guide.
Search engines are typically cagey about the use of engagement metrics, much less the specifics of those metrics. However, we do believe engagement is measured and used to inform search results.

Cf: Content Freshness

Search engines love new content. That’s usually what we mean when we say ‘fresh’.
So you can’t update your pages (or the publish date) every day thinking that will make them ‘fresh’ and more likely to rank. Nor can you just add new pages constantly, just for the sake of having new pages, and think that gives you a freshness boost.
However, Google does have something it calls “Query Deserved Freshness (QDF)”. If there’s a search that is suddenly very popular versus its normal activity, Google will apply QDF to that term and look to see if there’s any fresh content on that topic. If there is, that new or fresh content is given a boost in search results.

Chapter 3: HTML Code & Search Engine Success Factors

HTML is the underlying code used to create web pages. Search engines can pick up ranking signals from specific HTML elements. Below are some of the most important HTML elements to achieve SEO success.

Ht: HTML Title Tag

Imagine that you wrote 100 different books but gave them all the same exact title. How would anyone understand that they are all about different topics?
Imagine that you wrote 100 different books, and while they did have different titles, they weren’t very descriptive — maybe just a single word or two. Again, how would anyone know, at a glance, what the books are about?
HTML titles have always been, and remain, the most important HTML signal that search engines use to understand what a page is about. Bad titles on your pages are like bad book titles in the examples above. In fact, if your HTML titles are deemed bad, Google may rewrite them.
So think about what you hope each page will be found for, relying on the keyword research you’ve already performed. Then craft unique, descriptive titles for each of your pages. 
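
For instance, here is a minimal sketch of what that looks like in a page's HTML (the clinic and wording are invented for illustration, borrowing the keyword example from the content chapter):

    <head>
      <!-- A unique, descriptive title based on keyword research -->
      <title>Skin Cancer Prevention Tips | Example Dermatology Clinic</title>
    </head>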

Hd: The Meta Description Tag

The meta description tag, one of the oldest supported HTML elements, allows you to suggest how you’d like your pages to be described in search listings. If the HTML title is the equivalent to a book title, the meta description is like the blurb on the back describing the book.
SEO purists will argue that the meta description tag isn’t a “ranking factor” and that it doesn’t actually help your pages rank higher. Rather, it’s a “display factor,” something that helps how you look if you appear in the top results due to other factors.
Technically, that’s correct. And it’s one of the reasons we decided to call these “success” factors instead of ranking factor.
A meta description that contains the keywords searched for (shown in bold) may catch the user's eye, and a well-crafted meta description may help "sell" that result to the user. Both can result in additional clicks to your site. As such, it makes sense for the meta description tag to be counted as a success factor.
Be forewarned, having a meta description tag doesn’t guarantee that your description will actually get used. Search engines may create different descriptions based on what they believe is most relevant for a particular query. But having one increases the odds that what you prefer will appear. And it’s easy to do. So do it.
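
As a rough sketch (site and copy invented), the tag sits alongside the title in the page's <head>:

    <head>
      <title>Skin Cancer Prevention Tips | Example Dermatology Clinic</title>
      <!-- A suggested snippet; search engines may still substitute their own -->
      <meta name="description" content="Dermatologist-reviewed tips for preventing skin cancer, from daily sunscreen habits to yearly skin checks.">
    </head>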

Hh: Header Tags

See the headline up at the top of this page? Behind the scenes, HTML code is used to make that a header tag. In this case, an H1 tag.
See the sub-headlines on the page? Those also use header tags. Each of them is the next “level” down, using H2 tags.
Header tags are a formal way to identify key sections of a web page. Search engines have long used them as clues to what a page is about. If the words you want to be found for are in header tags, you have a slightly increased chance of appearing in searches for those words.
Naturally, this knowledge has caused some people to go overboard. They'll put entire paragraphs in header tags. That doesn't help. Header tags are as much for making content easy to read for users as they are for search engines.
Header tags are useful when they reflect the logical structure (or outline) of a page. If you have a main headline, use an H1 tag. Relevant subheads should use an H2 tag. Use headers as they make sense and they may reinforce other ranking factors.
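
A minimal sketch of that logical outline (headings invented for illustration):

    <!-- One H1 for the main headline, H2s for the sub-sections beneath it -->
    <h1>Skin Cancer Prevention Tips</h1>
    <h2>Wear Sunscreen Every Day</h2>
    <h2>Schedule Regular Skin Checks</h2>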

Hs: Structured Data

What if you could tell search engines what your content was about in their own "language"? Behind the scenes, sites can use specific mark-up (code) that makes it easy for search engines to understand the details of the page content and structure.
The result of structured data often translates into what is called a "rich snippet," a search listing that has extra bells and whistles that make it more attractive and useful to users. The most common rich snippet you're likely to encounter is reviews/ratings, which usually include eye-catching stars.
While the use of structured data may not be a direct ranking factor, it is clearly a success factor. All things being equal, a listing with a rich snippet will get more clicks than one without. And search engines are eager for site owners to embrace structured data, providing new and easier ways for less tech-savvy webmasters to participate.
Structured data has been around for quite some time in various forms. But recently search engines have begun to rely on it more with the advent of Google’s Knowledge Graph and Bing’s Snapshot.
This element enters the periodic table for the first time and we suspect it may become more important over time.
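
As an illustration, here is a small sketch of the reviews/ratings mark-up mentioned above, using schema.org microdata (the product and numbers are invented):

    <div itemscope itemtype="http://schema.org/Product">
      <span itemprop="name">Acme Sunscreen SPF 50</span>
      <!-- The aggregate rating is what can surface as eye-catching stars -->
      <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
        Rated <span itemprop="ratingValue">4.6</span>/5
        based on <span itemprop="reviewCount">87</span> reviews
      </div>
    </div>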

Chapter 4: Site Architecture & Search Engine Success Factors

The last major On The Page group in the Periodic Table Of SEO Success Factors is site architecture. The right site structure can help your SEO efforts flourish while the wrong one can cripple them.

Ac: Site Crawlability

Search engines “crawl” web sites, going from one page to another incredibly quickly, acting like hyperactive speed readers. They make copies of your pages that get stored in what’s called an “index,” which is like a massive book of the web.
When someone searches, the search engine flips through this big book, finds all the relevant pages and then picks out what it thinks are the very best ones to show first. To be found, you have to be in the book. To be in the book, you have to be crawled.
Most sites generally don’t have crawling issues, but there are things that can cause problems. For example, JavaScript or Flash can potentially hide links, making the pages those links lead to hidden from search engines. And both can potentially cause the actual words on pages to be hidden.
Each site is given a crawl budget, an approximate amount of time or pages a search engine will crawl each day, based on the relative trust and authority of a site. Larger sites may seek to improve their crawl efficiency to ensure that the ‘right’ pages are being crawled more often. The use of robots.txt, internal link structures and specifically telling search engines to not crawl pages with certain URL parameters can all improve crawl efficiency.
However, for most, crawl problems can be easily avoided. In addition, it’s good practice to use sitemaps, both HTML and XML, to make it easy for search engines to crawl your site.
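
For example, a robots.txt file along these lines (paths and sitemap URL are hypothetical) can steer crawlers away from low-value URLs and point them at your sitemap:

    User-agent: *
    # Keep filtered/parameter URLs from eating the crawl budget
    # (wildcard rules like these are a widely supported extension, not part of the original standard)
    Disallow: /search/
    Disallow: /*?sort=
    # Tell crawlers where the XML sitemap lives
    Sitemap: http://www.example.com/sitemap.xml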

Ad: Duplication / Canonicalization

Sometimes that big book, the search index, gets messy. Flipping through it, a search engine might find page after page after page of what looks like virtually the same content, making it more difficult for it to figure out which of those many pages it should return for a given search. This is not good.
It gets even worse if people are actively linking to different versions of the same page. Those links, an indicator of trust and authority, are suddenly split between those versions. The result is a distorted (and lower) perception of the true value users have assigned that page. That’s why canonicalization is so important.
You only want one version of a page to be available to search engines.
There are a number of different ways that duplicate versions of a page can creep into existence. A site may have www and non-www versions of the site instead of redirecting one to the other. An e-commerce site may allow search engines to index its paginated pages, but no one searches for "page 9 red dresses". Or filtering parameters might be appended to a URL, making it look (to a search engine) like a different page.
For as many ways as there are to inadvertently create URL bloat, there are ways to address it. Proper implementation of 301 redirects, the use of rel=canonical tags, managing URL parameters and effective pagination strategies can all help ensure you’re running a tight ship.
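
For instance, here is a minimal sketch of two of those techniques, placed in the <head> of the relevant pages (URLs invented):

    <!-- On a duplicate variant (say, a tracking-parameter URL), point to the preferred version -->
    <link rel="canonical" href="http://www.example.com/red-dresses/">

    <!-- On page 2 of a paginated listing, rel=prev/next declares the sequence instead -->
    <link rel="prev" href="http://www.example.com/red-dresses/page/1/">
    <link rel="next" href="http://www.example.com/red-dresses/page/3/">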

As: Site Speed

Google wants to make the web a faster place and has declared that speedy sites get a small ranking advantage over slower sites.
However, making your site blistering fast isn’t a guaranteed express ride to the top of search results. Speed is a minor factor that impacts just 1 in 100 queries according to Google.
But speed can reinforce other factors and may actually improve some of them. We're an impatient bunch of folks these days, so engagement (and conversion) on a site may improve based on a speedy load time.
Speed up your site! Search engines and humans will both appreciate it.

Au: Are Your URLs Descriptive?

Yes. Having the words you want to be found for within your domain name or page URLs can help your ranking prospects. It’s not a major factor but if it makes sense to have descriptive words in your URLs, do so.
Aside from helping a bit from a ranking perspective, various research reports over the years have shown that searchers are more likely to select pages with short, descriptive URLs over other pages in search results.
It’s also notable that all major search engines have moved the URL up just under the title in their search listings. And structured data can be used to transform the URL into a breadcrumb, giving users more ways to navigate to a site directly from a search result.
If the URL wasn’t important, why do search engines pay so much attention to it?

Chapter 5: Link Building & Ranking In Search Engines

Links were the first major “Off The Page” ranking factor used by search engines. Google wasn’t the first search engine to count links as “votes,” but it was the first search engine to rely heavily on link analysis (or the Link Graph) as a way to improve relevancy.

Despite the chatter around other signals, links remain the most important external signal for search rankings. But as you’ll find, some links are more valuable than others.

Lq: Link Quality

If you were sick, which would you trust more: the advice from five doctors, or from fifty random people who offered their advice as you walked down the street?
Unless you’ve had a really bad experience with doctors, you’ll probably trust the advice from the doctors. Even though you’re getting fewer opinions, you’re getting those opinions from experts. The quality of their opinions carries more weight.
It works the same way with search engines. They’ll count all the links pointing at web sites (except those blocked using nofollow or other methods), but they don’t count them all equally. They give more weight to the links that are considered to be of better quality.
What’s a quality link? It’s one of those “you’ll know it when you see it” types of things in many cases. But a link from any large, respectable site is going to be higher on the quality scale than a link you might get from commenting on a blog. In addition,  links from those in your “neighborhood”, sites that are topically relevant to your site, may also count more.

Lt: Link Text / Anchor Text

Amazon has millions of links pointing at it. Yet, it doesn’t rank for “cars.” It does rank for “books.” Why? Many of those links pointing at Amazon say the word “books” within the links while relatively few say “cars,” since Amazon doesn’t sell cars.
The words within a link — the link text or “anchor text” — are seen by search engines as the way one web site is describing another. It’s as if someone’s pointing at you in real life and saying “books” and declaring you to be an expert on that topic.
You often can’t control the words people use to link to you, so capitalize on your opportunities to influence anchor text, within reason.
Link text is a powerful factor but has been decreased from a weight of 3 to 2 in this edition of the table. The downgrade revolves around the efforts Google has made, via the Penguin Updates, to identify over-optimized anchor text and a desire by search engines to see more ‘natural’ link text patterns.

Ln: Number Of Links

Plenty of sites have found that getting a whole lot of links can add up to SEO success. Even more so if you're getting a lot of links from many different sites. All things being equal, 1,000 links from 1 site will mean far less than 1,000 links from 1,000 sites.
Long ago, the sheer number of links was far more important, but its weight has decreased steadily as search engines have learned how to better evaluate the quality of links.
Tactics such as viral link-bait campaigns, badges and widgets can all be effective at securing large numbers of links, and even search engine representatives have suggested them.
But in your quest for links, don’t fire up automated software and begin spamming blogs. That’s a bad thing, in many ways, as we’ll explore later in this guide.

Chapter 6: Social Media & Ranking In Search Results

Using links as an Off The Page ranking factor was a great leap forward for search engines. But over time, links have lost some of their value for a variety of reasons. Some sites are stingy about linking out. Others block links to help fight spam. And links get bought and sold, making them less trustworthy.

Enter social media. If links were a way for people to “vote” in favor of sites, social media sharing represents a way for that voting behavior to continue. Social signals are emerging as ranking factors as search engines determine how to leverage our social interaction and behavior.

Sr: Social Reputation

Just as search engines don’t count all links equally, they don’t view all social accounts as being the same. This makes sense, since anyone can create a new account on a social network. What’s to prevent someone from making 100 different accounts in order to manufacture fake buzz?
Nothing, really, other than that fake accounts like these are often easy to spot. They may only have a handful of "quality" friends in their network, and few might pass along material they share.
Ideally, you want to gain references from social accounts with good reputations. Having your own social presence that is well regarded is important. So participate on relevant social platforms in a real, authentic way, just as you would with your web site, or with customers in an offline setting.

Ss: Social Shares

Similar to links, getting quality social shares is ideal, but being shared widely on social networks is still helpful. Good things happen when more people see your site or brand.
Again, participation in social sharing sites is crucial. If you don’t have a Twitter account, a Facebook fan page or Google+ Page you’re missing out. You’re not building up a network that can help spread (aka share) your content, site and brand.
Nowhere is this more apparent than on Google+, where Ripples show how content spreads through the platform, and where sharing winds up personalizing more and more search results.

Chapter 7: Trust, Authority, Identity & Search Rankings

If search engines can decide to trust links or social accounts, can they learn to trust web sites? Absolutely. Many SEOs believe that site trust plays a big role in whether a site will succeed or fail from a search perspective.

Ta: Authority

Is your site an authority? Is it a widely recognized leader in its field, area, business or in some other way? That’s the goal.
No one knows exactly how search engines calculate authority and, in fact, there are probably multiple "authority" signals. The type of links your site receives (lots of quality or "neighborhood" links?), social references (from respected accounts?) and engagement metrics (long clicks?) may all play a role in site authority. Of course, negative sentiment and reviews may hurt site authority, as covered below.
There’s little doubt that search engines try to assess authority. One only needs to look through the questions Google told publishers to ask themselves in building high-quality sites that should be immune to “Panda” updates.

Th: History

Since search engines are constantly visiting your web site, they can get a sense of what’s “normal” or how you’ve behaved over time.
Are you suddenly linking out to what the search engines euphemistically call “bad neighborhoods?” Are you publishing content about a topic you haven’t typically covered? Such things might raise alarm bells.
Then again, sites do change just like people do, and often for the better. Changes aren’t taken in isolation. Other factors are also assessed to determine if something worrisome has happened.
Similarly, a site with a history of violating guidelines and receiving multiple penalties may find it more difficult to work its way back to search prominence.
In the end, a good overall track record may help you. An older, established site may find it can keep cruising along with search success, while a new site may have to “pay its dues,” so to speak, for weeks, months or even longer to gain respect.

Ti: Identity

Is that the real Benedict Cumberbatch on Twitter or someone impersonating him? Is a site claiming to be the "official" site really official? Who is the person giving legal advice on their blog? Are they even a lawyer!?
In the offline world it's easier to figure these things out. We spot Benedict at a local coffee shop (no pictures, please), we can tell instantly when we're in a real Target store, and we'll likely take a look at the degrees on the wall when visiting a lawyer.
Search engines have found it increasingly important to ensure they’re dealing with the ‘right’ data. Amit Singhal, who oversees Google’s search efforts, made the importance of identity clear:
"A good product can only be built where we understand who's who and who is related to whom. Relationships are also important alongside content. To build a good product, we have to do all types of processing. But fundamentally, it's not just about content. It's about identity, relationships and content."
Identity takes many forms, from Google's Authorship program to social profile verification on platforms such as Twitter and Facebook. While there's clearly a debate around the balance of privacy and anonymity, search engines continue to seek out those willing to stand up for, and behind, the content they produce.
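
At the time of writing, Google's Authorship program works roughly like this sketch: a page declares its author by linking to a Google+ profile (the profile URL below is a placeholder):

    <!-- Ties the page's content to a verified author profile -->
    <link rel="author" href="https://plus.google.com/YOUR-PROFILE-ID">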

Chapter 8: Personalization & Search Engine Rankings

Years ago, everyone saw exactly the same search results. Today, no one sees exactly the same search results, not on Google, not on Bing. Everyone gets a personalized experience.
Of course, there’s still a lot commonality. It’s not that everyone sees completely different results. Instead, everyone sees many of the same “generic” listings. But there will also be some listings appearing because of where someone is, who they know or how they surf the web.

Pc: Country

One of the easiest personalization ranking factors to understand is that people are shown results relevant to the country they’re in.
Someone in the US searching for “football” will get results about American football; someone in the UK will get results about the type of football that Americans would call soccer.
If your site isn’t deemed relevant to a particular country, then you’ve got no chance of showing up when country personalization happens. If you feel you should be relevant, then you’ll probably have to work on your international SEO.

Pl: Locality

Search engines don’t stop personalizing at the country level. They’ll tailor results to match the city or metropolitan area based on the user’s location.
As with country personalization, if you want to appear when someone gets city-specific results, you need to ensure your site is relevant to that city.
This is increasingly important as search becomes more prevalent on mobile devices and geolocation becomes a primary way of delivering more relevant results. In addition, Google's Venice Update placed far more emphasis on sites that are physically located in the user's area.
Today, if you’re looking for a dentist, you’ll find more individual dental practice sites in your area populating your search results rather than national directory sites.

Ph: Personal History

What has someone been searching for and clicking on from their search results? What sites do they regularly visit? Have they "Liked" a site using Facebook, shared it via Twitter or perhaps +1'd it?
This type of personal history is used by both Google and Bing to influence search results. Unlike country or city personalization, there’s no easy way to try and make yourself more relevant.
Instead, it places more importance on first impressions and brand loyalty. When a user clicks on a “regular” search result, you want to ensure you’re presenting a great experience so they’ll come again. Over time, they may seek out your brand in search results, clicking on it despite it being below other listings.
This behavior reinforces your site as one that should be shown more frequently to that user. Even better if they initiate a social gesture, such as a Like, +1 or Tweet, that indicates a greater affinity for your site or brand.
History is even more important in new search interfaces such as Google Now, which will proactively present "cards" to users based on explicit preferences (i.e., which sports teams or stocks you track) and search history.

Ps: Social Connections

What do someone’s friends think about a web site? This is one of the newer ranking factors to impact search results. Someone’s social connections can influence what they see on Google and Bing.
Those connections are what truly matter because search engines view those connections as a user’s personal set of advisors. Offline, you might trust and ask your friends to give you advice on a restaurant or gardening.
Increasingly, search engines are trying to emulate that offline scenario. So if a user is connected to a friend, and that friend has reviewed a restaurant or shared an article on growing tomatoes, then that restaurant and that article may rank higher for that user.
If someone can follow you, or easily share your content, that helps get your site into their circle of trust and increases the odds that others they know will find you. Nowhere is this more transformative than Google+, where circling a site’s Google+ Page will change the personalized search results for that user.
And if the rising percentage of (not provided) keyword data is any indication, the number of people getting personalized search results on Google is growing fast.

Chapter 9: Violations & Search Engine Spam Penalties


So far, we’ve discussed the positive signals that make up the Periodic Table Of SEO Success Factors. But there are also some negative factors to avoid.

A word of reassurance: very few people who believe they've spammed a search engine have actually done so. It's hard to accidentally spam, and search engines look at a variety of signals before deciding whether someone deserves a harsh penalty.
That said, let’s talk about things not to do!

Vt: “Thin” or “Shallow” Content

Responding to a drumbeat of complaints about poor search results, Google rolled out its "Panda" update in February 2011. Panda targets what is described as "thin" or "shallow" content: content that is lacking in substance.
This domain-level penalty targets sites with a predominant amount of so-so content and essentially treats them much like sites using overt spam techniques.
Today it’s no longer a question of whether the content is simply relevant but whether it is valuable to the user.

Va: Ads / Top Heavy Layout

Have you ever been on a site and found it hard to find the actual content amid a slew of ads? Where's the beef?
That’s what the Page Layout algorithm was meant to address. Matt Cutts, Google’s head of Webspam described it as follows:
… we’ve heard complaints from users that if they click on a result and it’s difficult to find the actual content, they aren’t happy with the experience. Rather than scrolling down the page past a slew of ads, users want to see content right away. So sites that don’t have much content “above-the-fold” can be affected by this change. If you click on a website and the part of the website you see first either doesn’t have a lot of visible content above-the-fold or dedicates a large fraction of the site’s initial screen real estate to ads, that’s not a very good user experience. Such sites may not rank as highly going forward.
Often referred to as Top Heavy, this penalty is reserved for sites that frustrate the user experience by placing an overabundance of ads before content. So don't make your users search for the content.

Vs: Keyword Stuffing

It’s one of the oldest spam tactics on the books. Search engines say to use words you want to be found for on your pages. OK, I’ll give them those words over and over again! How about 100 times. In a row? That work for you, Google?
Actually, no, it doesn’t. But “keyword stuffing” like this could get you penalized.
How often is too often? There’s no correct answer here, but you’d really have to go to extremes to cause this penalty to kick in. It’s most likely to happen to non-SEOs who just don’t know better and might decide to paste a word many times in a row, typically at the bottom of a web page.

Vh: Hidden Text

Once you decide to keyword stuff, your next thought will probably be, "Why don't I hide all this text that no human wants to see?" You might make the text white, so it blends with a page's background. In doing so, you will have spammed a search engine.
Search engines don’t like anything hidden. They want to see everything that a user sees. Don’t hide text, whether it be using styles, fonts, display:none or any other means that means a typical user can’t see it.

Vc: Cloaking

Let’s talk sophisticated hiding. How about rigging your site so that search engines are shown a completely different version than what humans see?
That’s called cloaking. Search engines really don’t like it. It’s one of the worst things you could do. Heck, Google’s even banned itself for cloaking. Seriously.
While most people are unlikely to accidentally spam a search engine, it's nearly impossible to cloak by accident. That's why cloaking draws such a heavy penalty if you're caught doing it. It's a bait-and-switch, and seen as a deliberate attempt to manipulate search results.

Vp: Paid Links

Speaking of Google banning itself, it also banned Google Japan when that division was found to be buying links. For 11 months.
That’s longer than JC Penney was penalized (3 months) in 2011. But JC Penney suffered another penalty after having its paid link purchase splashed across a giant New York Times article. So did several large online florists. And Overstock got hammered via a Wall Street Journal article.
The debate over whether Google should act so aggressively against those who buy and sell links has gone on for years. The bottom line is to rank on Google, you have to follow Google’s rules — and the rules say no buying or selling links in a way that passes on search engine ranking credit.
If you choose to ignore Google's rules, be prepared for little mercy if caught. And don't believe programs that tell you their paid links are undetectable. They're not, especially when so many of the cold-call ones are run by idiots.
As for Bing, officially it doesn’t penalize for paid links, but it frowns on the practice.
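
If you do carry paid placements, the accepted practice is to mark those links so they pass no ranking credit, for example (advertiser URL invented):

    <!-- nofollow tells search engines not to count this link as an editorial vote -->
    <a href="http://advertiser.example.com/" rel="nofollow">Sponsored: Advertiser Name</a>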

Vl: Link Spam

Tempted to run around and drop links on forums and blogs, all with highly optimized anchor text (like "louis vuitton handbags 2013"), with the help of automated software?
You suck.
You’re also not doing SEO, though sadly, all the people who hate the spam you leave behind get the impression that’s what SEO is about. So SEOs hate you too – with a passion.
If you do go ahead with it, most of the links won’t give you the credit you were thinking they would. On top of that, you can find yourself on the sharp end of a penalty.
This penalty has been given more weight in this version of the table based on the efforts Google has made in neutralizing and penalizing link spam and, in particular, the launch of the “Penguin” update.
If you’ve been caught dabbling on the dark side, or if a fly-by-night “SEO” company got your site in hot water you can disavow those links on both Google and Bing in hopes of redemption and a clean start.

Vd: Piracy / DMCA Takedowns

The "Pirate" update targeted sites infringing on copyright law. Under pressure from the Recording Industry Association of America (RIAA), Hollywood powerhouses and governments, Google began to penalize sites that receive a large number of Digital Millennium Copyright Act (DMCA) "takedown" requests.
It’s unlikely that most sites will have to deal with these issues, but you should handle any DMCA takedown notifications that show up in your Google Webmaster Tools account.