2011: The Year Google & Bing Took Away From SEOs & Publishers

Sunday, 8 January 2012


Increasingly over the years, search engines — Google in particular — have given more and more support to SEOs and publishers. But 2011 marked the first significant reversal that I can recall, with both linking and keyword data being withheld. Here’s what happened, why it matters and how publishers can push back if Google and Bing don’t change things.

Where We Came From

Some might believe that search engines hate SEOs, hate publishers and have done little over the years to help them. They are mistaken, either deliberately ignoring the gains or, more likely, simply unaware of how far things have come.
When I first started writing about SEO issues nearly 16 years ago, in 1996, we had little publisher support beyond add URL forms. Today, we have entire toolsets like Google Webmaster Central and Bing Webmaster Tools, along with standalone features and options, which allow and provide:
  • Ability to submit & validate XML sitemaps
  • Ability to view crawling & indexing errors
  • Ability to create “rich” listings & manage sitelinks
  • Ability to migrate a domain
  • Ability to indicate a canonical URL or preferred domain
  • Ability to set crawl rates
  • Ability to manage URL parameters
  • Ability to view detailed linkage information to your site
  • Ability to view keywords used to reach your site
  • Notifications of malware or spam issues with your site
There’s even more beyond what I’ve listed above. The support publishers enjoy today was simply unimaginable to many veteran SEOs who were working in the space a decade ago.
The advancement has been welcomed. It has helped publishers better manage their placement in those important venues of the web, the search engines. It has helped search engines with errors and problems that would hurt their usability and relevancy.
That’s why 2011 was so alarming to me. After years of moving forward, the search engines took a big step back.

The Loss Of Link Data

One of the most important ways that search engines determine the relevancy of a web page is through link analysis. This means examining who links to a page and what the text of the link — the anchor text — says about the page.
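Conceptually, the core of this analysis is just a tally: for every link, record which page it points at and what the anchor text says about it. Here's a minimal sketch of that counting step, with entirely made-up pages and anchors:

```python
from collections import Counter, defaultdict

# Toy link graph: (source_page, target_page, anchor_text).
# All URLs and anchor phrases here are invented for illustration.
links = [
    ("blog.example.com/a", "example.org/", "santorum"),
    ("forum.example.net/t1", "example.org/", "santorum"),
    ("news.example.com/x", "example.org/", "spreading santorum"),
    ("bbc.example.co.uk/y", "example.org/about", "rick santorum"),
]

# Tally how often each anchor phrase is used to describe each target page.
anchors = defaultdict(Counter)
for source, target, text in links:
    anchors[target][text] += 1

# The most common anchor text is a strong hint about what the page
# should rank for.
print(anchors["example.org/"].most_common(1))
# -> [('santorum', 2)]
```

Real search engines weigh far more than raw counts, of course, but this is the kind of per-page anchor data Google collects and declines to show.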
However, for years Google has deliberately suppressed the ability for outsiders to see what links tell it about any particular page. Want to know why THAT result shows up for Santorum? Why Google was returning THAT result for “define English person” searches? Sorry.
Google won’t help you understand how links have caused these things. It refuses to show all the links to a particular page, or the words used within those links to describe a page, unless you are the page’s owner.
Why? Google’s rationale has been that providing this information would make it harder for it to fight spam. Potentially, bad actors might figure out some killer linking strategy by using Google’s own link reporting against it.
It’s a poor argument. Despite withholding link data, it’s painfully easy to demonstrate how sites can gain good rankings in Google for competitive terms such as “SEO” itself by simply dropping links into forums, onto client pages or into blog templates.
Given this, it’s hard to understand what Google thinks it’s really protecting by concealing the data. But until 2011, there was an easy alternative. Publishers and others could turn to Google-rival Yahoo to discover how people might be linking to a page.

Goodbye Yahoo Site Explorer

Yahoo launched its “Yahoo Site Explorer” back in September 2005, both as part of a publicity push to win people away from Google and to provide information to publishers. The tool allowed anyone to see what link data Yahoo had about any page in its listings.
Today, Yahoo still supposedly wants to win people away from Google. But because Yahoo’s web search results are now powered by Bing, Yahoo has little reason to provide tools to support publishers. That’s effectively Bing’s problem now.
Yahoo closed Yahoo Site Explorer at the end of last November, saying, as it still does on the site now:
Yahoo! Search has merged Site Explorer into Bing Webmaster Tools. Webmasters should now be using the Bing Webmaster Tools to ensure that their websites continue to get high quality organic search traffic from Bing and Yahoo!.
That’s not true. Yahoo Site Explorer was not merged into Bing Webmaster Tools. It was simply closed. Bing Webmaster Tools doesn’t provide the ability to check on the backlinks to any page in the way that Yahoo Site Explorer allowed.
The closure supposedly came after Yahoo “listened to your feedback” about what publishers wanted, as it posted earlier this year. I don’t know what feedback Yahoo was hearing, but what I’ve heard has been people desperately pleading with Yahoo or Bing to maintain the same exact features that Yahoo Site Explorer provided — and pleading for well over a year.

Yahoo-Bing Deal Has Reduced Competition & Features

When the US Department Of Justice granted its approval for Yahoo to partner with Microsoft, that was supposed to ensure that the search space stayed competitive. Here's what the Department Of Justice said in 2010:
After a thorough review of the evidence, the division has determined that the proposed transaction is not likely to substantially lessen competition in the United States, and therefore is not likely to harm the users of Internet search, paid search advertisers, Internet publishers, or distributors of search and paid search advertising technology.
I’d say dropping Yahoo Site Explorer did harm to both users of internet search and internet publishers. Yahoo Site Explorer was a distinctive tool that only Yahoo offered, allowing both parties named by the DOJ to better understand the inner workings of the search engines they depend on. Its closure also reduced the competitive pressure on Google to offer its own tool.
Indeed, things have gotten worse since Yahoo Site Explorer closed. At the end of last December, Bing officially confirmed in its help forum that it no longer supports the link command.

Next To Go, The Link Command?

The link command allows you to enter any page’s web address prefaced by “link:” in order to find links that point at that page. It’s a long-standing command that has worked for many major search engines as far back as late 1995, when AltaVista launched.
Google still supports this command to show some (but not all) of the links it knows about that point at pages. I’d link to Google’s documentation of this, but the company quietly dropped that some time around May 2008.
Here’s how the command still works at Google. Below, I used it to see what links Google says point to the home page of the official Rick Santorum campaign web site:
The first arrow shows you how the command is being used. The second arrow shows you how Google is reporting there are 111 links pointing to the page. Imagine that. Rick Santorum, currently a major Republican candidate for US president, and Google says only 111 other pages link to his web site’s home page.
The reality is that many more pages probably link to it. Google’s counting them but not showing the total number to people who care about what exactly is being considered. I’ll demonstrate this more in a moment, but look at the worse situation on Bing:
One link. That’s all Bing reports that it knows about to those in the general public who may care to discover how many links are pointing to the Rick Santorum web site.

It’s Not Just An SEO Thing

People do care, believe me. I actually started writing this article last Monday and got interrupted when I had to cover how Google might have been involved with a link buying scheme to help its Chrome browser rank better in Google’s own search results.
I doubted that was really the main intent of the marketing campaign that Google authorized (Google did err on the side of caution and punished itself), but the lack of decent link reporting tools from Google itself left me unable to fully assess this as an independent third party.
As soon as that story was over, renewed attention was focused on why Rick Santorum’s campaign web site wasn’t outranking a long-standing anti-Santorum web site that defines “santorum” as a by-product of anal sex.
Major media outlets were all over that story. My analysis was cited by The Economist, CNN, The Telegraph, The New York Times, MSNBC and Marketplace, to name only some.
But again, I — or anyone who really cared — was unable to see the full links that Google knew about pointing at both sites, much less the crucial anchor text that people were using to describe those sites. Only Google really knew what Google knew.

Third Party Options Good But Not The Solution

If you haven’t heard more complaints over the closure of Yahoo Site Explorer, and the pullback on link data in general, that’s because there are third-party alternatives such as Majestic Site Explorer or the tool I often use, SEOmoz’s Open Site Explorer.
These tools highlight just how little the search engines themselves show you. Consider this backlink report from Open Site Explorer for the Rick Santorum campaign’s home page:
The first arrow shows how 3,581 links are seen pointing at the page. Remember Google, reporting only 111? Or Bing, reporting only 1?
The next two arrows show the “external” links pointing at both the Santorum home page and the anti-Santorum home page. These are links from outsiders, pointing at each page. You can see that the anti-Santorum page has four times as many links pointing at it as the Santorum campaign page, a clue as to why it does so much better for a search on “santorum.”
But it’s not just number of links. Using other reports, I can see that thousands of links leading to both sites have the text “santorum” in the links themselves, which is why they both are in the top results for that word.
Because the anti-site has so many more links that say “santorum” and “spreading santorum,” that probably helps it outrank the campaign site on the single word. But because the official site has a healthy number from sources including places like the BBC saying “rick santorum” in the links, that — along with its domain name of ricksantorum.com — might help it rank better for “rick santorum.”
It’s nice that I can use a third party tool to perform this type of analysis, but I shouldn’t have to. It’s simply crazy — and wrong — that both Google and Bing send searchers and publishers away from their own search engines to understand this.
For one, the third party tools don’t actually know exactly what the search engines themselves are counting as links. They’re making their own estimates based on their own crawls of the web, which doesn’t exactly match what Google and Bing know (though it can be pretty good).

Not Listing Links Is Like Not Listing Ingredients

For another, the search engines should simply be telling people directly what they count. Links are a core part of the “ingredients” used to create the search engine’s results. If someone wants to know if those search results are healthy eating, then the ingredients should be shared.
Yes, Google and Bing will both report link data about a publisher’s own registered site. But it’s time for both of them to let anyone look up link data about any site.
The Blekko search engine does this, allowing anyone logged in to see the backlinks to a listed page. Heck, Blekko will even give you a badge you can place on your page to show off your links, just as Yahoo used to. But for Google, it’s “normal” for its link command to not show all the links to a page. Seriously, that’s what Google’s help page says.
Google, in particular, has made much of wanting people to report spam found in its search results. If it really wants that type of help, then it needs to ensure SEOs have better tools to diagnose the spam. That means providing link data for any URL, along with anchor text reporting.
Google has also made much of the need for companies to be open, in particular pushing the idea that social connections should be visible. Google has wanted that because, until Google+ was launched, Google had a tough time seeing the type of social connections that Facebook knew about.
Links are effectively the social connections that Google measures between pages. If social connections should be shared with the world, then Google should be sharing link connections too, rather than coming off as hypocritical.
Finally, it doesn’t matter if only a tiny number of Google or Bing users want to do this type of link analysis. That’s often the pushback when this issue comes up: that so few make these types of requests.
Relatively few people might read the ingredients labels on the food they eat. But for the few that do, or for anyone who suddenly decides they want to know more, that label should be provided. So, too, should Google and Bing provide link data about any site.

Goodbye Keyword Referrer Data

While I’m concerned about the pullback on link data, I’m more concerned about how last October, Google stopped reporting to publishers the keywords people used to find their web sites, for times when those people were logged into Google.
Link data has long been suppressed by Google. Holding back on keyword data is a new encroachment.
Google has said this was done to protect user privacy. I have no doubt many in the company honestly believe this. But if it was really meant to protect privacy, then Google shouldn’t have deliberately left open a giant hole that continues to provide this data to its paid advertisers.
Worse, if Google were really serious about protecting the privacy of search terms, then it would disable the passing of referrers in its Chrome browser. That hasn’t happened.
Unlike the long examination of link data above, I’ll be far more brief about the situation with Google withholding keyword data. That’s because I already wrote over 3,000 words looking at the situation in depth last October, and that analysis still holds up. So please see my previous article, Google Puts A Price On Privacy, to understand more.
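For those unfamiliar with the mechanics: keyword data reaches publishers through the referrer URL that browsers pass along when someone clicks a search result. A quick sketch of what any web analytics package does with that referrer, using only Python's standard library (the URLs are illustrative):

```python
from urllib.parse import urlparse, parse_qs

def search_terms(referrer):
    """Extract the search query from a Google-style referrer URL, if any."""
    query = parse_qs(urlparse(referrer).query)
    return query.get("q", [None])[0]

# A classic, unblocked Google referrer carries the search terms in "q=":
print(search_terms("http://www.google.com/search?q=income+tax+evasion"))
# -> income tax evasion

# After the October 2011 change, clicks from signed-in searchers arrive
# with the query stripped, so the publisher sees nothing:
print(search_terms("https://www.google.com/"))
# -> None
```

This is why the change matters: the second case is now what publishers see for signed-in searchers, while clicks on ads still deliver the terms through AdWords reporting.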

Google’s Weak Defense

Since my October story, the best defense that Google’s been able to concoct for withholding keyword data from non-advertisers is a convoluted, far-fetched argument that makes its case worse, not better.
Google says that potentially, advertisers might buy ads for so many different keywords that even if referrer data was also blocked for them, the advertisers could still learn what terms were searched for by looking through their AdWords campaign records.
For example, let’s say someone did a search on Google for “Travenor Johannisoon income tax evasion settlement.” I’ve made this up. As I write this, there are no web pages matching a Google search for “Travenor Johannisoon” at all. But…
  • If this were a real person, and
  • someone did that search, and
  • if a page appeared in Google’s results, and
  • someone clicked on that page…
then the search terms would be passed along to the web site hosting the page.
Potentially, this could reveal to a publisher looking at their web analytics that there might be a settlement for income tax evasion involving a “Travenor Johannisoon.” If the publisher started poking around, perhaps they might uncover this type of information.
Of course, it could be that there is no such settlement at all. Maybe it’s just a rumor. Anyone can search for anything; that doesn’t make it a fact.
More likely, the search terms are so buried in all the web analytics data that the site normally receives that this particular search isn’t noticed at all, much less investigated.

Extra Safe Isn’t Extra Safe

Still, to be extra safe, Google has stopped passing along keyword data when people are signed-in. Stopped, except to its advertisers. Like I said, Google argues that potentially advertisers might still see this information even if they were also blocked.
For instance, say someone runs an ad matching any searches with “income tax evasion” in them. If someone clicked on the ad after doing a search for “Travenor Johannisoon income tax evasion settlement,” those terms would be passed along through the AdWords system to the advertiser, even though the referrer might pass nothing to the advertiser’s web analytics system.
So, why bother blocking?
Yes, this could happen. But if the point is to make things more private, then blocking referrers for both advertisers and non-advertisers would still make things harder. Indeed, Google still has other “holes” where “Travenor Johannisoon” might find his privacy exposed just as happens potentially with AdWords.
For example, if someone did enough searches on the topic of Travenor and tax evasion, that might cause it to appear as one of Google Instant’s suggested searches.
So why bother blocking?
Also, while Google blocks search terms from logged-in users in referrer data, those same searches are not blocked from the keyword data it reports to publishers through Google Webmaster Central. That means the Travenor search terms could show up there.
So why bother blocking?
Nothing has changed my view that, despite Google’s good intentions, its policy of blocking referrers only for non-advertisers is incredibly hypocritical. Google purports this is done to protect privacy, but it puts its own needs and its advertisers’ desires above privacy.
Blocking referrers is a completely separate issue from encrypting the search results themselves. That’s good and should be continued. But Google is deliberately breaking how such encryption works to pass along referrer data to its advertisers. Instead, Google should block them for everyone or block them for no one. Don’t play favorites with your advertisers.

What Google & Bing Should Do

Made it this far? Then here’s the recap and action items for moving forward.
Bing should restore its link command, if not create a new Bing Site Explorer. Google should make sure that its link command reports links fully and consider its own version of a Google Site Explorer. With both, the ability to get anchor text reports about any site is a must.
If there are concerns about scraping or server load, make these tools you can only use when logged in. But Yahoo managed to provide such a tool. Blekko is providing such statistics. Tiny third-party companies are doing it. The major search engines can handle it.
As for the referrer data, Google needs to immediately expand the amount of data that Google Webmaster Central reports. Currently, up to 10,000 terms (Google says up to 1,000, but we believe that’s wrong) for the past 30 days are shown.
In November, the head of Google’s spam team Matt Cutts — who’s also been involved with the encryption process — said at the Pubcon conference that Google is considering expanding the time period to 60 days or the queries to 2,000 (as said, we think — heck, we can see — they already provide more than this). Slightly more people wanted more time than more keywords shown.
I think Google should do more than 60 days. I think it should be providing continuous reporting and holding that data historically on behalf of sites, if it’s going to block referrers. Google is already destroying historical benchmarks that publishers have maintained. It has already allowed data to be lost for publishers who didn’t go in each day and download the latest information.
So far, all Google’s done is provide a Python script to make downloading easier. That’s not enough. Google should provide historical data, covering a big chunk of the terms that a site receives. It’s the right thing to do, and it should have been done already.
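For publishers stuck doing their own archiving in the meantime, the routine is simple to sketch. To be clear, this is not Google's script: it's a hypothetical example of merging each day's keyword export into a local history file, and the file names and CSV columns ("query", "clicks") are my own assumptions, not Google's actual export format.

```python
import csv
import datetime
import pathlib

HISTORY = pathlib.Path("keyword_history.csv")

def archive_daily_export(export_path):
    """Append one day's keyword export to a running history file,
    stamping each row with today's date so benchmarks survive the
    30-day reporting window."""
    today = datetime.date.today().isoformat()
    write_header = not HISTORY.exists()
    with open(export_path, newline="") as src, HISTORY.open("a", newline="") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        if write_header:
            writer.writerow(["date", "query", "clicks"])
        for row in reader:
            writer.writerow([today, row["query"], row["clicks"]])

# Simulate one day's download with made-up data, then archive it.
pathlib.Path("export_today.csv").write_text("query,clicks\nseo,120\nlink building,45\n")
archive_daily_export("export_today.csv")
```

The point of the sketch is that none of this should be the publisher's job; Google could hold the history itself.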

What Publishers Can Do

An anti-SOPA-like effort, such as the one that targeted GoDaddy, isn’t going to work with the search engines. That’s because the two biggest things that publishers could “transfer” out of Google and Bing are their ads and their web sites. But there’s no place to transfer these to that wouldn’t hurt the publishers with incredible amounts of lost traffic.
This doesn’t mean that publishers are powerless, however.
Bing is desperate to be seen as the “good” search engine against “evil” Google. Publishers should, whenever relevant, remind Bing that it’s pretty evil not to have maintained its own version of Yahoo Site Explorer, much less to have killed the link command.
Mention it in blog posts. Mention it in tweets. Bring it up at conferences. Don’t let it die. Ask Bing why it can’t do what little Blekko can.
As for Google, pressure over link data is probably best expressed in terms of relevancy. Why is Google deliberately preventing this type of information from being studied? Is it more afraid that doing so will reveal weaknesses in its relevancy, rather than potential spam issues? Change the debate to relevancy, and that gets Google’s attention — plus the attention of non-publishers.
There’s also the issue of openness. Google shouldn’t be allowed to preach being “open” selectively, staying closed when it suits Google, without some really good arguments for remaining closed. On withholding link data, those “closed” arguments no longer stand up.
As for the referrer data, Google should be challenged in three ways.
First, the FTC will be talking to publishers as part of its anti-trust investigation into Google’s business practices. Publishers, if asked, should note that by withholding referrer data from everyone except Google’s advertisers, Google is potentially harming competing retargeting services that publishers might prefer to use. Anti-trust allegations seem to really get Google’s attention, so make that wheel squeak.
Second, question why Google is deliberately leaving a privacy hole open for the searchers it’s supposedly trying to protect. If Google’s really worried about what search terms reveal, the company needs a systematic way to scrub potentially revealing queries from everything: suggested searches, reporting in Google Webmaster Central, AdWords reporting as well as referrer data.
Finally, withhold your own data. Are you opted-in to the data sharing on Google Analytics that launched back in 2008? Consider opting-out, if so:
To opt out, log in, select an account, then select “Edit Analytics Account” next to the name of the account in the Overview window; you’ll then see the data sharing options, as explained on Google’s help page.
Opting out means you can’t use the benchmarking feature (fair enough, and no loss if you don’t use it) or Conversion Optimizer. If you still want Conversion Optimizer, don’t opt out; alternatively, tell Google that you should have a choice to share data solely for use with that product but not other Google products.
There might be other drawbacks to not sharing that I’m missing. But we haven’t been sharing here at Search Engine Land since the beginning of the year. So far, we’re not having any problems.
Google loves data. Withholding your own is another way for publishers to register their displeasure about having data withheld from them. And it’s the type of thing that Google just might notice.

Blogs that Will Make You More Money

Thursday, 5 January 2012

15 Copywriting and Content Marketing Blogs that Will Make You More Money
Image of a globe made of dollar bills
As you may have seen on Tuesday, we were disappointed to see that there were no copywriting blogs in last year’s Top 10 Blogs for Writers.
We think persuasive writers — content marketers and copywriters — are as worthy of cheers and accolades as our fiction-writing brothers and sisters.
So today I put together a list of 15 writing blogs I think you’ll get a lot out of.
I got lots of great suggestions for blogs to check out (thank you all), and it was tough to narrow them down to a manageable few.
When winnowing down the list, I had a few rough criteria.
First, writing advice had to be a key element of the blog.
There are hundreds of terrific social media and business blogs, and they’re wonderful resources, but we wanted to focus on sites that would make you a better writer.
I defined better writer in two ways — either as “a writer who ethically and effectively convinces customers to buy more stuff” or “a writer who’s landing more and better clients.”
We also didn’t include the big “name brand” sites — we wanted to focus on some smaller sites you might not have seen yet. Not surprisingly, we’ve got a good sample of Copyblogger guest writers here, but also plenty of folks you haven’t seen here. (Not yet, anyway).
By the way, when you click through, notice how most of these blogs make great use of their tag lines to tell you exactly how they can help solve a specific problem. Smart copywriters. :)
BenSettle.com
Ben Settle
If you’ve heard Ben speak on our radio show or you’ve read his Copyblogger posts, you know he isn’t wishy-washy. He likes to sell, and he likes to make money. He uses email marketing to do those things, and he has a lot of strong, sharp advice for email marketers. If you’re still nervous about selling, reading Ben Settle might freak you out. Which may be a good and useful thing for you.
Copylicious
Kelly Parkinson
A January post makes us optimistic that Kelly will start writing actively again for this smart, funny writing blog. From her bio: “ … this is not really about copy. This is about improving your whole business.” We couldn’t agree more.
Direct Creative
Dean Rieck
Dean has been one of our most popular guest writers here on Copyblogger, because he knows his stuff. His blog delivers no-nonsense tips and advice on how to improve your direct response copy. If you want to improve your persuasive writing chops, Dean’s site is a must-read.
The Domino Project
Seth Godin
This is a small blog around Seth’s Domino Project, a digital publishing experiment. Seth’s published articles here about digital publishing, ebooks, and how they affect writers and publishing. If you’ve considered publishing a book in this century, you should probably take a look at this site.
Ghostwriter Dad
Sean Platt
Sean has gone from a sweet, enthusiastic fledgling ghostwriter to a sweet, enthusiastic, and really, really successful marketing writer (as well as launching a thriving fiction series. He’s a busy dude). He’s publishing lots of great advice about how he made that journey, and how you can, too.
Good Copy, Bad Copy
Clare Lynch and David Pollack
A charming blog about “good business writing and bad. Especially the bad. Because there’s so much more of the bad.” If you ever help corporate clients communicate with their customers, you need this blog.
Harrison Amy Copywriting
Amy Harrison
Amy doles out copywriting advice for professional writers and businesspeople alike. She has some nice resources on the site, including a good guide on getting your sales page done if you aren’t a professional writer. (Or maybe even if you are.)
Jeff Sexton Writes
Jeff Sexton
If you want to get really good as a copywriter, you have to read Jeff Sexton. He’s not afraid to dive into the thorny, complicated tangle of what makes for truly effective copywriting. Jeff’s a pro, and he writes for pros. This is a great site.
Make a Living Writing
Carol Tice
The name of Carol’s blog says it all — she keeps a tight focus on professional writers and how they can make a better living. Her blog’s got writing tips, plus business and marketing advice.
Men with Pens
James Chartrand
The times certainly have changed. For example, now there are actual men writing for Men with Pens. What hasn’t changed is a site that mixes business and writing advice for content marketers, pulled together by James Chartrand’s no-nonsense approach to online marketing.
Success Works
Heather Lloyd-Martin
Heather’s bio describes her as “split between watching the search engines dance and pinpointing the exact direct response copywriting strategies that make people buy.” That dual focus shows up consistently in sharp, well-written articles and videos by her and her team about the art and science of SEO copywriting.
The Rant
John Carlton
The name of the blog gives you fair warning — John Carlton does enjoy the sound of his own voice. But he’s also an excellent copywriter and a terrific copywriting teacher. Look to the “Must Read” and “Popular Posts” sidebars for some classic writing advice, given with a healthy dose of … well, ranting.
RicardoBueno.com
Ricardo Bueno
Ricardo specializes in content marketing for real estate professionals, and he’s got lots of resources for using blogs, social media, and content to create effective marketing for that market. I love this example of a content marketer working within a well-defined niche. (If you’re a writer struggling to stand out, think about the niche you could be serving.)
The Well-Fed Writer Blog
Peter Bowerman
Peter’s written some great books on going from being a starving writer to a well-fed one, and his blog continues that tradition with savvy business advice for professional copywriters. No writing advice here — it’s all about how to build your copywriting business, not your writing chops.
Words That Begin With You
Justin Lambert
Justin combines copywriting insights with content marketing advice, wrapped up in a strong writing voice. We like that! Lots of good articles here on becoming a better content marketer.

And one bonus

This isn’t an active blog, but it’s a wonderful resource that no content marketer or copywriter should overlook … Gary Bencivenga’s wonderful Marketing Bullets.
Gary’s one of the most successful copywriters in the history of the business, and he has a lot of simple (but not always easy) advice about mastering the craft of persuasive writing. We’re big Bencivenga fans and we think you will be, too. I have all of these printed out in a binder, and I refer back to them often.

How to Add Sound to an HTML5 Web Page

Tuesday, 3 January 2012

Using the HTML5 Audio Element
  
HTML5 makes it easy to add sound and music to your web pages with the AUDIO element. In fact, the hardest thing to do is create the multiple sources you need to make sure that your sound files play on the widest variety of browsers. But the benefit of using HTML5 is that you can embed sound with just a couple of tags. Then browsers play the sound just like they would display an image when you use an IMG element.

Difficulty: Average
Time Required: 10 minutes

Here's How:

1. First you need a sound file. It's best to record the file as an MP3 (.mp3), as this has high sound quality and is supported by the widest range of browsers (Android 2.3+, Chrome 6+, IE 9+, iOS 3+, and Safari 5+).

2. Convert your file to Vorbis format (.ogg) to add support for Firefox 3.6+ and Opera 10.5+. You can use a converter like the one found on Vorbis.com. You can also convert your MP3 to WAV format (.wav) to get Firefox and Opera support. I recommend posting your file in all three formats, just to be safe, but at minimum you need the MP3 and one other type.

3. Upload all the audio files to your web server and make a note of the directory you stored them in. It's a good idea to place them in a sub-directory just for audio files, like most designers save images in an images directory.

4. Add the AUDIO element to your HTML file where you want the sound file controls to be displayed:
<audio controls>

5. Place SOURCE elements inside the AUDIO element for each audio file you uploaded:
<source src="/audio/sound.mp3">
<source src="/audio/sound.ogg">
<source src="/audio/sound.wav">

6.                 Any HTML inside the AUDIO element will be used as a fallback for browsers that
don't support the AUDIO element. So add some HTML. The easiest way is to add HTML to
 let them download the file, but you can also use HTML 4.01 embeding methods to play the sound.
Here is a simple fallback:

<p>Your browser does not support audio playback, download the file:
<a href="/audio/sound.mp3">MP3</a>,
<a href="/audio/sound.ogg">Vorbis</a>,
<a href="/audio/sound.wav">WAV</a>

7.                 The last thing you need to do is close your AUDIO element:
</audio>

8. When you're done, your HTML should look like this:
<audio controls>
  <source src="/audio/sound.mp3">
  <source src="/audio/sound.ogg">
  <source src="/audio/sound.wav">
  <p>Your browser does not support audio playback, download the file:
  <a href="/audio/sound.mp3">MP3</a>,
  <a href="/audio/sound.ogg">Vorbis</a>, or
  <a href="/audio/sound.wav">WAV</a>.</p>
</audio>
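One optional refinement to the example above: each SOURCE element can also carry a type attribute, which lets a browser skip formats it can't play without downloading them first. A minimal sketch, assuming the standard MIME types for these three formats:

```html
<!-- The type attribute declares each file's format up front,
     so the browser only fetches a source it can actually play -->
<audio controls>
  <source src="/audio/sound.mp3" type="audio/mpeg">
  <source src="/audio/sound.ogg" type="audio/ogg">
  <source src="/audio/sound.wav" type="audio/wav">
</audio>
```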

Tips:

1. Be sure to use the HTML5 doctype (<!doctype html>) so that your HTML will validate.
2. Review the attributes available for the AUDIO element to see what other options you can add to your element.
3. Note that I set up the HTML to include controls by default and have autoplay turned off. You can, of course, change that, but remember that many people find sound that starts automatically, and that they can't control, annoying at best, and they will often just leave the page when that happens.
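As the second tip suggests, the AUDIO element supports a few other attributes worth reviewing. A sketch of two common ones (preload and loop are standard HTML5 attributes, though browser support varies):

```html
<!-- preload="metadata" asks the browser to fetch only duration/track info,
     not the whole file; loop restarts playback automatically when it ends -->
<audio controls preload="metadata" loop>
  <source src="/audio/sound.mp3">
  <source src="/audio/sound.ogg">
</audio>
```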

What You Need:

· HTML Editor
· Sound file (preferably in MP3 format)
· Sound file converter




Jennifer Kyrnin

Web Design / HTML Guide


Why Is It Not What It Is Predicted to Be?


SearchEngineLand.com's Eric Ward 'predicts' that the fate of directory submission in 2012 is to join the archive of extinct SEO strategies. Is it fair to say so? Let's find out!


Directory Submissions 2012

Directory submission is the most sought-after SEO service, and the one that has attracted failed predictions of its demise since time unknown. It has held its ground amid the on-and-off SEO trends that are constantly under scrutiny in quality online web promotion. For all those who are not aware of the history and recurring events in the life of poor old directory submission, here is a preview:


1993 - 1996 - Not every business was registered online, and search engine dinosaurs like Lycos and WebCrawler had yet to mark their presence, so the only resort for existing websites was to be listed in online web directories. Then came Yahoo search and the Yahoo directory, which worked in coalition with each other. DMOZ.org (the Open Directory) gained high recognition in this period and was a huge hit among websites looking for online traffic. It was a historic start to organizing online data on the internet.


1997 - 1998 - The search engine giant of present times, Google, entered the internet search arena. Google realized that depending on directories like Yahoo and DMOZ cost tremendous manpower and labor hours, so instead it came up with something independent and pre-programmed that could filter searches and provide relevant results. Sure, there were speculations that this was the end of directory submission, only for everyone to realize it wasn't.


1999 - 2006 - Web directories slowly lost their glory to automated search engine algorithms, but they remained an indirect source of links that was eventually counted as a 'green signal' toward the presence of a website listed in them. The web directories became established, with a huge traffic of reciprocal links flowing in. During this period paid directories were frowned upon by Google, as the search engine giant believed search engine ranking via directory submission mustn't be priced.


2007 - 2009 - Web directories gained larger importance in helping sites that catered to specific niche markets. This was a transition phase for directories, from a stagnant presence to a well-categorized ability to attract target visitors to a website. Spam directories were slowly penalized by Google, and quality directory lists were constantly updated to offer meaningful online listings. Apart from niche web directories, country-specific directories also helped promote local businesses during the growing trend of local business optimization online.


2010 - 2011 - While webmasters had been predicting web directory submission's fate on a moderate scale, the speculation seemed to skyrocket courtesy of the Google Panda update in 2011. However, anyone who really understood the Panda update would realize that apart from some directory networks offering directories with spammed content, the most popular and seemingly high-PageRank directories remained untouched. Eric Ward predicted that directory submission would die a fatal death in 2011, only to correct himself today and re-predict its final demise in 2012. Another predictable prediction that seems likely to resurface yet another year (unless the concept of Doomsday stands true).


The truth, however, runs much deeper than what these webmasters predict, and what they predict is only an assumption, rather like a weather forecast made without a satellite.


Learn why directory submission is still here to stay, the 'extant species' in the survival of the fittest that just does not wish to disappear from the face of the SEO Earth yet. Following is a quick overview of the information that can prove these early predictions about directory submission wrong:


High-Ranked Directories: If directory submission were facing its end, why are most directories still ranked high by the major search engines? Here is an online web directory PageRank report to prove my point. It is a known fact that some directories were quarantined by Panda, but the rest of the high-PR directories remain untouched and still provide prominent assistance in getting websites indexed by search engines through various directory submission services.


Least Cost, Most ROI: Directory submission costs the least among the various SEO services currently quoted in the SEO industry. As these submissions are available at throwaway prices, the ROI on this particular SEO strategy remains higher than on any other. The ROI can be enhanced even further by increasing approval rates, which in turn depend on the following factors:


a.) Relevant homepage content that helps the editor decide almost immediately that the site offers professional and genuine business/info to its visitors.

b.) Appropriately written titles and descriptions that do not stuff in keywords, but instead share information relevant to the content of the site


c.) Websites submitted to relevant categories


d.) Familiarity with the rules of the different online directory submission sites


e.) Absence of unnecessary ads on the homepage, which generally give the impression of a site created solely to host those ads.


Source of Rich Snippets: Matt Cutts of Google has said himself that Google takes snippets from high-PR directories like DMOZ.org to display in search results when it is unable to crawl a webpage to answer a search query.


Determine the right quality of directories you want your website listed in, and you will have no problem choosing this essential SEO ingredient for your website, which is apparently NOT DEAD!

What's up for Apple in 2012 - DotTechnologies

Monday, 2 January 2012

A new iPad, an updated iPhone and bridges to the enterprise are on the horizon

Computerworld - 2011 was a big year for Apple. The company continued to dominate the tablet market, with no rival coming close to the iPad in sales. It also released Lion, an update to OS X that delivered hundreds of new features; pushed out a major update to iOS that finally cut the cord for backups and syncing; launched its new cloud service, iCloud (albeit not without some issues); and continued to rack up record sales of Macs.

And, of course, 2011 was the year that Steve Jobs stepped down as CEO because of declining health just weeks before his death on Oct. 5.

On several fronts, Apple seems likely to capitalize on its successes in the coming year. Fueling continued growth in 2012 will be several trends that began this year within the company, the consumer market and enterprise IT.

Bridges to the enterprise

Apple isn't the first company that comes to mind when you think about enterprise technology vendors. Indeed, Apple has never released an enterprise product road map like Cisco, HP, Dell and other vendors do. Despite its unique approach (or lack thereof) to the enterprise, Apple has historically created its own enterprise offerings, like its OS X Server platform and the Xserve rack-mount server.

Although these could be integrated with common enterprise infrastructures, Apple has focused more on producing end-to-end Apple solutions -- an approach that has worked well in Mac-centric niche markets and small businesses, but not in the larger corporate world.

Over the past couple of years, however, Apple seems to have learned that it doesn't have to be the only option for bringing its technologies into the workplace. The company has been quietly building bridges to the enterprise by establishing partnerships with third parties to advance the use of iOS devices and Macs in business environments.

Letting mobile device management (MDM) vendors lead when it comes to securing and managing iPhones, iPads and other mobile products is a great example of Apple putting the underlying technology in place and letting others devise scalable, multiplatform systems.

At the same time, Apple has scaled back some of its own enterprise technology. The last orders for the Xserve rack-mounted server shipped in the spring, and OS X Server was recast as a small business system with the release of Lion Server as a simple add-on to Lion.

This is a trend that will continue through 2012. Apple will quietly boost its support for iOS and OS X in the enterprise by more deeply integrating its systems with existing enterprise standards and technologies. The number of client and system life-cycle management tools that now support Macs and iOS devices is evidence of this.

But Apple will also work to ensure that its systems are implemented and supported as effectively as possible. It will achieve that goal through partnerships with IT consulting firms, by offering training and educational resources, and by refocusing its existing set of certifications to address the needs of the market. Apple has already taken such steps with the new Mac Integration 10.7 certification for deploying Macs in Windows environments.

Apple's own efforts will be helped by the growing trend of companies allowing employees to use their own devices for work -- the bring-your-own-device (BYOD) movement -- and a tablet market that has yet to produce a true iPad competitor.

Embracing LTE and the A6 for next iPad/iPhone

Without a doubt, we'll see new iPads and iPhones in the coming year (and a real update to the iPod Touch). Those devices will be based around next-generation ARM processors. There will almost certainly be improvements to the iPad's screen, bringing it in line with the retina display of the iPhone (though maybe not to full parity when it comes to pixels per inch).

Likewise, there's little doubt that Apple will introduce LTE support. The question is more about whether Apple will offer separate 3G and LTE models of the iPad 3 and iPhone 5 (along the lines of the Verizon iPhone 4, which launched as separate GSM and CDMA handsets) or combine the technology in a single model.

The inclusion of near field communication (NFC) technology and support for mobile payments seems more in doubt. Obviously, Apple is proving that iPhones make great mobile payment systems via the self-checkout feature recently instituted by its retail stores. And it already has a user-payment mechanism in place through the iTunes Store. However, Apple sometimes waits on features (like LTE) until it feels they can work as flawlessly as possible. That could mean another year before NFC comes to iOS.

I strongly suspect that 2012 will also be the year when Apple begins to diversify iPad price points. The company isn't likely to introduce a 7-in. or limited-feature model, however. Instead, it will more likely follow the approach that it has used with the iPhone and continue producing past iterations and selling them at lower prices.

Siri evolves 

It's a given that the iPhone 4S's virtual assistant, Siri, will improve and evolve throughout 2012. And there are a handful of steps that Apple is sure to take to enhance the voice-controlled technology. For starters, we know that additional languages and localized results will be added in the coming months. We also know that Apple is amalgamating a lot of crowd-sourced data that will improve Siri's speech recognition and understanding of colloquial phrases. And it seems a sure thing that Apple will deploy Siri on a number of devices, including future iPads and iPod Touches (and potentially TVs, which I'll get to in a bit).

It's also likely that Apple will begin linking Siri to additional reference and recommendation services. Currently, Siri can get information from Wolfram Alpha, Yelp, and Google Maps. As the brouhaha over Siri's inability to locate abortion services proved, Siri is at the mercy of the services it connects with. No doubt, Apple is working to identify and develop partnerships with additional content sources, particularly in regional and local markets around the world. The company may even be looking at buying one or more outright. I also wouldn't be surprised to see Apple allow users to select which services Siri uses to locate information or in what order it should query those sources.

Beyond putting more data at Siri's virtual fingertips, Apple will also likely extend the technology's capabilities in terms of what data and apps Siri can interact with on an iOS device. Any new capabilities would be implemented in iOS itself or with Apple's apps at first. That said, I'm sure Apple will eventually open up at least a few APIs so developers can plug into Siri in a variety of ways. I expect this will be a key feature for developers in iOS 6.

One big question: Will Apple eventually introduce additional voice/personality packs for Siri? Given that Apple has already done so to some extent with the localized language support in the Siri beta, it seems almost certain that the company can offer additional choices, just as most GPS navigation systems do. The questions are: When will we see them, and will they be free?

As with developer access to Siri, I'm betting that those options will arrive with iOS 6 and that users will be able to purchase them like ringtones.

iOS 6

It seems a foregone conclusion that Apple will release iOS 6 in 2012, given that the company has put out iOS updates on a yearly basis since the iPhone shipped in 2007. Probably the biggest changes will center around Siri's expanded feature set, but Apple is likely to have some other surprises in store.

Two particular areas where Apple may shake things up involve maps/navigation and search. Over the past couple of years, Apple has quietly been building better map-related features; those efforts have included the creation of its own Wi-Fi hotspot database. And it's been buying mapping software vendors, indicating an eventual shift away from Google Maps. Likewise, particularly with Siri's search-related capabilities, Apple may look to develop relationships that reduce its reliance on Google overall.

Apple will probably also improve the notification center introduced in iOS 5 this fall. For people who have been critical of Apple's decision not to allow Android-like widgets on the home screen, this may be the company's first step in that direction. The Weather and Stocks apps currently can display more information than just alerts, and it's reasonable to assume Apple could expand this type of display.

Less dramatic steps Apple is sure to take will include increasing the capabilities of the features (hardware and OS-level tools) that developers can access under iOS and improving mobility management. Apple has given developers access to a greater range of APIs in each iOS release and that seems destined to continue -- both for Siri and other technologies.

The company will be looking to continue to improve device and data security, as well as the capabilities of MDM systems for monitoring and managing iOS devices in business environments -- an area where Apple has an advantage over most Android devices at the moment.

iCloud improves

It seems impossible to count out an update to iCloud in 2012. The big issue Apple needs to address is syncing between Macs and PCs. Right now, syncing third-party app data between iOS devices is decent, but limited. If Apple is going to challenge other companies in the cloud, it needs to be able to integrate with desktops better. And what Mac user wouldn't want to see iCloud bring back Mobile Me's ability to sync system settings?

Apple TV becomes more than a hobby

With a recent estimate that 8% of U.S. households have an Apple TV (with Apple capturing 32% of the connected TV market), it's hard to call the current Apple TV a hobby -- though Apple has done so since the original model was announced five years ago. 2012 seems poised to be the year that the Apple TV becomes a major product and revenue source for Apple.

Rumors have been rampant that Apple would eventually produce its own HDTV line instead of the current Apple TV set-top box. Those rumors began flying more than ever this fall after Walter Isaacson's biography of Steve Jobs was released -- largely because of a brief passage in which Jobs says that he "finally cracked it" in reference to the television market and/or industry. That comment -- along with the release of Siri on the iPhone 4S (and its ability to control TV content when paired with Siri Proxy) -- has been taken by many to mean that Apple is planning a next-generation HDTV that utilizes voice control. Many HDTV manufacturers have actually begun work on voice-controlled models as a result of those reports.

If Apple does develop its own HDTV line, 2012 will probably see a small rollout of a limited number of models to test the market.

Regardless of whether Apple is planning its own line of televisions, it's pretty certain that we'll get a new version of the set-top box, probably with an updated interface and features. The current Apple TV user experience hasn't really changed much in a few years, and I can envision Apple revamping it. Given Netflix's gaffes this year, I also wouldn't be surprised to see Apple offer additional streaming services like those from Hulu+ or possibly Amazon.

I also wouldn't be surprised to see Apple add AirPlay-based gaming and limited app support -- after all, AirPlay gaming can already be done on the iPad 2 and the iPhone 4S. I do think that Apple will be very cautious about game or app support and I doubt such features would run directly on the Apple TV without a supporting iOS device.

While there may be a voice-control element based around Siri included with a new Apple TV or Apple HDTV, it will probably be an optional feature, and traditional navigation using an Apple remote or an iOS device and the Remote app will be included. In fact, voice-control systems could always be offered with complementary remote-control devices or apps to avoid the problems that would arise when background noise or loud conversations render voice commands ineffective.

Previews of Lion's successor at WWDC

There's been a lot of speculation about what the next version of OS X will look like. The idea that Lion might be the last release before the Mac and iOS platforms merge completely has even been floated since the announcement of Lion last fall. About the only assumption I can really make about OS X 10.8 is that it will be more iCloud-ready; there are plenty of ways Apple can -- and probably will -- make Macs more cloud-centric. Beyond that, the only thing we can expect is a limited preview at Apple's annual Worldwide Developers Conference.

The patent battles continue

Over the course of 2011, Apple became embroiled in patent lawsuits with Android manufacturers in countries all over the world. Keeping track of the ongoing skirmishes in what can only be called an all-out patent war has been like tracking division games during the NFL season. It's pretty clear that this war is far from over and that Apple is not interested in making money (or peace) by licensing its array of mobile technology patents. The company seems to be intent on crushing Android, which Jobs, on multiple occasions, emphatically described as stolen.

While I'd like to see cooler heads prevail, I don't see Apple backing down. What that will ultimately mean for any of the companies involved -- or even for the patent processes around the world -- isn't clear. One thing that it will almost certainly do is engender more antagonism between companies and even end users, something Apple would be better off without.

If there is any upside to this situation, it may well be that it exposes the need for an overhaul of the patent processes in the U.S. and around the world. After all, the system for registering patents was set up long before the advent of today's technologies, and it doesn't always seem to offer sensible and consistent guidelines for innovations in the 21st century.

Ryan Faas is a freelance writer and technology consultant specializing in Mac and multiplatform network issues. He has been a Computerworld columnist since 2003 and is a frequent contributor to Peachpit.com. Faas is also the author of iPhone for Work (Apress, 2009). You can find out more about him at RyanFaas.com and follow him on Twitter (@ryanfaas).
