Current and Future Search Trends: What the Top Internet Search Engines Are Doing

By Scott Buresh

The future of search is unclear – what is clear is that change is happening rapidly for all of the top Internet search engines. Google, as always, is the frontrunner for many of these search trends, but even little guys like Ask.com are making waves. In this article, I will attempt to cover some of the more interesting search trends occurring today with the top Internet search engines – but I am by no means being comprehensive about the subject. Things are changing on a weekly, or sometimes even daily, basis, and future articles will cover additional developments in depth.

Universal Search

In May 2007, Google – the leader among top Internet search engines – got people talking (again) when it rolled out its latest search concept, Universal Search. Universal Search was Google’s attempt to create a single page of search results, rather than separate pages for each type of result, such as videos, images, maps, and websites. When it was first introduced, many search engine optimization firms raced around exclaiming that this was one of those search trends that would change everything and that new optimization rules should be created and followed immediately.

I published an article in early 2007 in which I noted, “The problem with Universal Search is that it can muddy the results, and it can also introduce irrelevant results that a searcher cannot use.”1 I also wrote, “Clearly, Universal Search will change how an SEO campaign is run if it catches on. But this is a real if – users’ search habits are hard to change overnight, even if you are Google and you essentially define what searching is and how it works.”2

And in fact, Universal Search didn’t quite take off the way Google had hoped. A post on MediaPost’s Search Insider by Mark Simon boldly states, “Universal Search will probably not be viewed as the greatest Google fiasco since Google Video, but it’s clear that it’s failed to deliver on the vaunted promises made by Marissa Mayer back in May.”3 So will we see more of Universal Search, or will it be quietly put to the side? Will other top Internet search engines want to use it for themselves? Only time will tell, but it seems like Google needs to do a lot more work before users really warm up to it.

Personalization and Personalized Search

Personalization, on the other hand, seems to be one of the search trends working very well for Google and many of the other top Internet search engines. In an article I wrote a few months ago, I said, “The basic principle behind personalized search is simple. When you go to Google and type in a search query, Google stores the data. As you return to the engine, a profile of your search habits is built up over time. With this information, Google can understand more about your interests and serve up more relevant search results.”4
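The basic principle quoted above can be sketched in a few lines of JavaScript. This is a toy illustration with invented names and weights, not Google’s actual algorithm: a profile counts the topics a user has clicked on before, and results matching a frequent interest get a score boost.

```javascript
// Toy re-ranking sketch (invented names and weights, not Google's
// actual algorithm). The profile counts past clicks per topic; results
// whose topic matches a frequent interest are boosted up the list.
function rerank(results, profile) {
  return results
    .map(r => ({ ...r, score: r.score + (profile[r.topic] || 0) * 0.1 }))
    .sort((a, b) => b.score - a.score);
}

// A user whose history is mostly fishing pages searches for "bass":
const profile = { fishing: 12, music: 1 };
const results = [
  { title: "Bass guitar basics", topic: "music",   score: 1.0 },
  { title: "Bass fishing tips",  topic: "fishing", score: 0.9 },
];
console.log(rerank(results, profile)[0].title); // "Bass fishing tips"
```

Without the profile, the music page would rank first; the stored search history is what tips the result toward the fish.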

As it works right now, if you use a Google product (Gmail, Google toolbar, AdWords, etc.), Google is keeping track of what you search for and what websites you visit, and it’s then tailoring your results appropriately. Search for “bass,” and Google will know whether you mean the fish or the instrument. As I pointed out, though, there are major issues with search trends like personalization:

Privacy issues that arise from personalized search are also a big question. The EU recently announced that it is probing into how long Google stores user information (this probe was subsequently extended to include all search engines). AOL recently committed a serious blunder when it released search data from 500,000 of its users, and it was discovered that it was fairly easy to identify many people by the search terms that they use…5

Yet if nobody makes a fuss about this, then it’s very likely Google – and the other top Internet search engines – will start tracking everyone behind the scenes, whether they use a Google product or not.

It’s actually already starting – right now, the cookies Google places on your machine (did you even know they did that?) are set to expire in two years – but they won’t really expire at all. According to the official Google blog:

In the coming months, Google will start issuing our users cookies that will be set to auto-expire after 2 years, while auto-renewing the cookies of active users during this time period. In other words, users who do not return to Google will have their cookies auto-expire after 2 years. Regular Google users will have their cookies auto-renew, so that their preferences are not lost. And, as always, all users will still be able to control their cookies at any time via their browsers.6
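The policy described in that quote can be sketched as follows. This is an illustration with invented helper names; real cookie lifetimes are set via HTTP headers, not application code like this.

```javascript
// Sketch of the auto-expire/auto-renew policy described in the quote
// (invented helper names; real cookie lifetimes are set via HTTP
// headers, not code like this).
const TWO_YEARS_MS = 2 * 365 * 24 * 60 * 60 * 1000;

// Every visit pushes the expiry two years past "now"; a user who never
// returns keeps the original date, and the cookie eventually lapses.
function renewCookie(cookie, now) {
  return { ...cookie, expires: now + TWO_YEARS_MS };
}

function isExpired(cookie, now) {
  return now > cookie.expires;
}

const issued = { name: "PREF", expires: Date.UTC(2009, 6, 1) };
// An inactive user, checked in January 2010: the cookie has lapsed.
console.log(isExpired(issued, Date.UTC(2010, 0, 1))); // true
// An active user who visited in January 2009 is renewed into 2011:
const renewed = renewCookie(issued, Date.UTC(2009, 0, 1));
console.log(isExpired(renewed, Date.UTC(2010, 0, 1))); // false
```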

Seems it won’t be long before Google knows what you’re searching for before you do.

Expanding “Sneak Peeks”

Ask, one of the smaller of the top Internet search engines, has been using sneak peeks to entice searchers for a while now. Searchers who use Ask.com can mouse over an icon next to many results and see a screenshot of the website. No clicking needed. Google, always watching for search trends, seems to have noticed, because they’ve filed a patent for expanding their own snippets.7 Soon searchers on Google may be able to read expanded summaries of pages, or longer clips of page text. This tactic appeals to searchers who are now demanding more and more information faster and faster from the top Internet search engines, and who don’t want to waste precious seconds clicking on a link and then on the back button to find just the right site for their needs.

Syntax Queries

When Ask was Ask Jeeves, the butler was supposed to listen to your search queries in the form of questions and then get answers for you. The problem was, this never worked exactly the way it was supposed to. Instead of answering the question based on syntax, the engine still responded to searches in the same way others did, by analyzing the words and returning a list. Jeeves was retired with a bit of fanfare, and the engine handles queries in the more traditional manner for now. But all of the top Internet search engines have continued to work on this concept, with Google again leading the way since it has the manpower and brainpower to do so. I expect that within the next year, this will be one of the search trends that the engines will want to focus on with a greater push toward answering questions rather than just returning related results.

Speech Recognition and the Mobile Market

Speech recognition is really going to be one of the huge search trends in the coming months and years for the top Internet search engines. In an interview from this past summer, Peter Norvig, director of Google Research, noted, “[Google] wanted speech technology that could serve as an interface for phones and also index audio text. After looking at the existing technology, we decided to build our own. We thought that, having the data and computational resources that we do, we could help advance the field.”8 With speech recognition in place, one could go to Google (or another of the top Internet search engines) and use a microphone to ask a question aloud, or just say some keyphrases, and get a list back immediately.

And speech recognition has the biggest benefit for top Internet search engines when it comes to users of mobile devices. Let’s face it, as advanced as those keyboards may have gotten, they’re still a pain to use and it’s time-consuming to type in more than a few sentences. (That’s y txt msgs r lk ths, u c?). Norvig is on top of that too, noting, “In general, it looks like things are moving more toward the mobile market, and we thought it was important to deal with the market where you might not have access to a keyboard or might not want to type in search queries.”9

More to Come

As I noted in the beginning, this is just a small sampling of the search trends for the top Internet search engines today. Google, Yahoo, and even Ask are all working tirelessly to get your business and to make search easier, faster, and more accurate. Keep checking back for future articles covering some of the other trends and following up on the ones I’ve already discussed.

About the Author: Scott Buresh

Scott Buresh is the CEO of Medium Blue Search Engine Marketing, which was recently named the number one search engine optimization company in the world by PromotionWorld. Scott has contributed content to many publications including Building Your Business with Google For Dummies (Wiley, 2004), MarketingProfs, ZDNet, WebProNews, DarwinMag, SiteProNews, ISEDB.com, and Search Engine Guide. Medium Blue serves local and national clients, including Boston Scientific, Cirronet, and DS Waters.

Rank High for your Keywords With Diverse Anchor Texts

Author: Todd Daon

In the world of search engine optimization, we all know the value of back-links, which can best be described as votes of trust pointing back to our website. There are several types of links, and the search engines rank them differently; a few are one-way links, reciprocal links, and three-way links. The most important type is the one-way link, since reciprocal links have somewhat lost their value due to the incredible link spamming which took place a few years ago.

The reason many reciprocal links lost value is that they were mostly placed on what were called “link pages.” These pages contained huge numbers of links, and many of them pointed to websites completely outside the linking site’s niche; in other words, pharmacy websites were linking to real estate sites, sports sites were linking to technology sites, and so on. As you can see, this linking structure offered no benefit to the visitor and was specifically designed to manipulate the search engines.

Google quickly adapted to these changes, and reciprocals were downgraded, which gave one-way links more value as far as SEO. These types of links are hard to get, since webmasters need to write about a specific source and cite it on their sites by providing a link to it; this represented a serious problem for webmasters who relied heavily on automated reciprocal link acquisition.

Link exchange networks, some of which still operate today, made it easier for the search engines to detect linking patterns and anchor text usage; this last factor helped determine whether the reciprocals were in fact spam or good links. Site owners who use link networks to spam the search engines have only one way to specify anchor text: if the keyword they want to rank for is “search engine optimization,” they enter that term into the automated system, and it becomes the anchor text linked to their sites from the thousands of sites participating in the network.

By running a simple back-link analysis on such sites, it is easy to see that they have participated in networks designed to manipulate the search engine result pages, since their main term comes up repeatedly hundreds of thousands of times. At this point you may be wondering, “Well, how can we rank for a specific keyword without looking spammy?” The answer is simple: vary the anchor text, rearrange the words, or even build long-tail phrases around the main term. For instance, if the term you are trying to rank for is “apple pie,” you can use the following anchor text variations to rank for long-tail keywords as well as for the main term: “delicious apple pies”, “home made apple pies”, “gourmet apple pies”, “apple pie making secrets” and so on.

As you can see, the main keyword is mentioned in each anchor phrase, but the phrases are not redundant, which works much better in terms of gaining rank and traffic from the search engines, especially Google.
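The diversification idea above can be sketched in a few lines of JavaScript (an invented helper, not part of any SEO tool): keep the main keyword inside every phrase, but never repeat the exact same anchor text.

```javascript
// Invented helper illustrating anchor-text diversification: each
// template keeps the main keyword but wraps it in different modifiers,
// so no two links carry an identical phrase.
function anchorVariations(keyword, templates) {
  return templates.map(t => t.replace("{kw}", keyword));
}

const anchors = anchorVariations("apple pie", [
  "delicious {kw}s",
  "home made {kw}s",
  "gourmet {kw}s",
  "{kw} making secrets",
]);
console.log(anchors[0]); // "delicious apple pies"
console.log(anchors.every(a => a.includes("apple pie"))); // true
```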

Article Source: http://www.articlesbase.com/seo-articles/rank-high-for-your-keywords-with-diverse-anchor-texts-285630.html

About the Author:

If you’re looking for a reliable and affordable link building service, check out Manual Directory Submissions or try their Article Submission Service today.


The Web 2.0 Effect: the Characteristics of a Web2.0 Website

The term web2.0 was originally coined by O’Reilly Media (a well-known media company publishing books and websites on various computer technology topics). It’s a term that refers to a new generation of websites (social networking websites, wiki-based websites, etc.). These websites take advantage of web application technologies and give web users the ability to collaborate and share their experiences, views, opinions and interests while they surf the web.

The web2.0 is a revolutionary phenomenon. Let’s talk about the most basic characteristics of the websites using the web2.0 concept:

– A web2.0 website should be completely interactive and dynamic with a friendly user-interface based on the latest web2.0 technologies like AJAX.

– Web2.0 websites should deliver web based applications to Internet users, allowing them to make use of these applications through a web browser.

– A web2.0 website should implement social networking capabilities allowing users to interact with each other and create friend lists.

– A web2.0 website should be a democratic website where users will be able to add value by interacting with the web based application.

– A web2.0 website should allow its users to exercise various controls over the website’s data and content (adding/deleting/editing content).

The conclusion is that web2.0 websites are built on participatory web based applications, focused primarily on user experience and collaboration.

Examples of successful web2.0 websites

Although this new Internet revolution or trend is not yet widespread among web developers or Internet marketers, millions of users are actually participating in such websites. Few of them are aware of the web2.0 concept, but they are already an active part of it.

Here are some super-successful websites utilizing the web2.0 technologies:

– YouTube.com: The concept of YouTube is very simple. It allows Internet users to share their favorite video files with the entire world. YouTube gained enormous popularity in a very short time. Everyone was surprised when the search giant Google bought YouTube for $1.65 billion!

– Wikipedia: The most famous online encyclopedia. It’s free, it’s huge, it’s quite a resource for everyone, and it’s updated every single minute, since anyone can edit its contents – which is why it became such a popular web destination.

– Social bookmarking websites like Digg.com: These websites offer users the ability to create friend lists and share their favorite websites, opinions, stories, etc. with people all over the globe. The popularity of these social bookmarking websites is increasing every day, making the website owners rich!

– MySpace.com: I bet you’ve heard of MySpace.com. This website will allow you to create your own profile, friend list, and personal homepage, adding whatever you want to it (text, images, videos, links, etc.). It will also allow you to share your profile and web page with other MySpace users. Amazingly simple but so clever. MySpace.com is now one of the most visited websites on the entire Internet.

What do all these websites have in common?

The web2.0 concept. These websites are active web based applications. They all allow Internet users to actively participate and customize the way the website looks and feels, thus giving the online community the pleasure and impression of collaborating. The web2.0 is so revolutionary because of its simplicity, and it will become even more widespread among website designers and Internet marketers.

About the Author:
M.Markell is a webmaster of DigitalStarProducts Product Directory

Ajax — Bring your Website to Life

Chances are that whether or not you are a technology guru, you keep hearing talk of internet technologies in terms of a never ending stream of acronyms. XHTML, CSS, J2EE, PHP, ASP.NET, JSP, SQL, HTTP, SSL, VOIP, P2P, XML, RSS… Sound familiar — or a bit like Klingon?

Well, this article is about another really cool acronym, only this time for the average Joe. I’ll be focusing on only one acronym in this article, and only because it’s one you’ll be hearing a lot more about, and because it can really breathe life into your business’ website.

AJAX. Asynchronous JavaScript and XML, if you must know what it stands for. Don’t let that put you off. Even if it is composed of technical words and yet another acronym, what matters more is what it is, and why you need it.

Remember the internet up until recently? Virtually all web pages were lifeless objects, similar to pages in a magazine, with no intelligence. In order to get a response from a website, you would click a link or press a submit button, and your computer would send a message to the computer hosting the site that it needs to send you back the next page. A little bit like turning to a new colourful but lifeless page in your magazine.

Sure, there were some really great new technologies that came along to make the internet more interesting. Animated images, Flash animations, small embedded applications, and scripts downloaded with the page certainly made the pages a lot more interesting. Yet, when compared to the highly responsive environment of your desktop computer, a web page really could not compete. Click anywhere in your local pc environment and you get an immediate and logical, virtually instantaneous response from the object you clicked, without your entire screen flashing blank and reappearing again from scratch. Do the same on a web page until recently, and usually at best you would have to wait a few seconds for a completely new page to reappear, a newly turned page in your virtual magazine.

Until Ajax. Technically Ajax is actually not a new technology, but a clever grouping of a few existing technologies. What has made Ajax a workable solution is that most people have started using browsers capable of supporting these technologies, often without even realising that their browser wields these dormant superpowers.

Ajax allows your internet page to respond to a user very similarly to the way your local computer desktop environment would. In a well-developed Ajax-based site, when you click on a page object, it responds almost immediately. Any changes to the page happen there and then, without the page disappearing and a new one replacing it. The result? A much more pleasant user experience. Once more, computers are becoming more interactive and responsive.

The concept can be a bit difficult to grasp initially, so here’s a simple example from the RealmSurfer website to demonstrate the concept: Ajax example: search page. At first glance this looks like any other search page. Notice that advanced search link underneath the search box? When you click it, notice that you instantly get a whole lot more functionality inserted into the page — no page refresh. Click the link again and the process reverses itself.

This is just a very simple example of the benefits of building Ajax functionality into your site. To get a look at some really powerful applications, have a look at a few popular Ajax-enabled websites:

Google Calendar — a really great online calendar and event scheduler. Without Ajax, this would be one of the most irritating developments since email spam; however, with its Ajax capabilities it’s well on its way to becoming an online Outlook killer. Kiko is another excellent site utilising virtually the same concept.

Flickr — one of the first sites to popularise the Ajax phenomenon, it’s still one of the best ways to share pictures with your friends and family online. Yahoo was quick to realise the potential, and purchased the company a while back.

Protopage — create your own customised, interactive home page. From the moment you land on the site, you can move objects around the page as if they are windows on your desktop, and interact with them just as easily.

There are plenty more Ajax-powered sites, and every day a whole lot more become available. What makes this technology so special? Here are a few good contributors:

* It uses relatively well established browser technology — no plugins are needed, and most browsers these days are fully capable of supporting Ajax-enhanced pages.

* For the most part pages are still recognisable by search engines, meaning that unlike technologies like Flash, search engines will still understand what your page is about.

* It’s economical. Instead of downloading a whole page when all you want to know is the weather, a well designed Ajax-based site will only download the information you need, leaving everything else perfectly intact.

* It’s fast and responsive. Because much of the programming code that makes an Ajax-based page so useful is downloaded onto your local computer, much of your functionality is very fast and responsive, the same as any locally installed program.

* It’s non-proprietary technology. Ajax is a combination of established web standards. You will never have to pay to use it, and neither will the people who program it for you.
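The first point above is worth a concrete note: the “established browser technology” at the heart of Ajax is the XMLHttpRequest object, which older Internet Explorer exposed only as an ActiveX object. A minimal cross-browser sketch:

```javascript
// Classic cross-browser request constructor. Older Internet Explorer
// exposed XMLHttpRequest only as an ActiveX object; everything else
// provides it natively. Returns null where neither exists (for
// example, outside a browser).
function createRequest() {
  if (typeof XMLHttpRequest !== "undefined") {
    return new XMLHttpRequest();
  }
  if (typeof ActiveXObject !== "undefined") {
    try {
      return new ActiveXObject("Microsoft.XMLHTTP");
    } catch (e) {
      // fall through to the null return below
    }
  }
  return null;
}
```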

A new era in internet-based computing has begun. Many have dubbed this new wave of powerful and responsive technologies Web 2.0 (fortunately not another acronym this time!). What does this mean for you as a business owner? The tools for turning your business website into a highly interactive, responsive and powerful application are now here. And your customers will love the fact that it’s more responsive than their local telco representative…

About the Author:
David Malan is an expert author and business owner. He owns and runs RealmSurfer Consulting, an internet marketing and web design, development and consulting business based in Perth, Western Australia.
Website: Web Design Perth


What is the Ajax Enabled Google Tool-kit?

First things first, Asynchronous JavaScript and XML (AJAX) is not a technology. AJAX is a technique that has brought about a great change in the world of web development.

The AJAX technique comes in response to the increasing demand for interactive web applications. With AJAX, web pages exchange small amounts of data with the server behind the scenes. This means that every time the user enters a new piece of data, or requests a change, the entire page does not have to be reloaded. Usability is also greatly improved thanks to AJAX. After all, AJAX creates conditions conducive to complex scenarios that are both data-centric and user-centric. The difference between web pages and desktop applications has been thinned down with the help of AJAX.

As already mentioned, AJAX is not a technology; the technique fuses together various existing technologies such as XHTML (or HTML), CSS, the DOM, XMLHttpRequest (or alternatively an IFrame), and XML.

Here is how these individual technologies play a role in AJAX:

• XHTML (or HTML) and CSS are used for mark up and styling information.

• The DOM (Document Object Model) is employed for the actual interaction that happens with the information that is presented.

• The exchange of data asynchronously with the web server happens with the use of XMLHttpRequest, although in many cases an IFrame object is used in its place.

• Although even preformatted HTML would work, XML is the format most often used for the transfer of data between the server and the client.
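A minimal sketch of how these pieces combine (the URL and element id here are hypothetical): XMLHttpRequest fetches data in the background, and a DOM call splices it into the page while everything else stays put.

```javascript
// Hypothetical example: fetch a fragment from the server and update a
// single element via the DOM, with no full page reload.
function loadFragment(url, elementId) {
  var xhr = new XMLHttpRequest();
  xhr.onreadystatechange = function () {
    // readyState 4 = request done; status 200 = OK
    if (xhr.readyState === 4 && xhr.status === 200) {
      document.getElementById(elementId).innerHTML = xhr.responseText;
    }
  };
  xhr.open("GET", url, true); // true = asynchronous
  xhr.send(null);
}

// In a page containing <div id="weather"></div>, a link such as
// <a onclick="loadFragment('/weather-fragment', 'weather')">Update</a>
// would refresh only that one div.
```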

The advantages and disadvantages of using AJAX are in fact open for interpretation. Here are some of the reasons that are cited as advantages of using AJAX.

• The main reason for using AJAX is to enhance the user experience, and to make web pages behave more like standalone applications.

• AJAX-enabled pages load faster because HTML is generated within the browser. The net result of the page loading in a staggered manner is that bandwidth consumption for a web page is considerably reduced.

• The third advantage is widely critiqued because of a common misconception about AJAX – that it is a mix-and-match of various techniques leaving no room for consistency. Yet with AJAX, programmers tend to create a distinct separation between the methods and formats employed for information delivery – in other words, a separation between the content to be delivered, the structure and style elements of the webpage, and the functionality of the webpage.

On the flip side are the disadvantages that people associate with the use of AJAX.

• Given that, with AJAX, the page does not register with the history engine of the browser, the user is often unable to use the ‘Back’ function of the browser. Additionally, AJAX also makes it difficult for users to ‘Bookmark’ a page at a certain stage of use. The solutions created to tackle these problems have not been adequate, and these issues remain unresolved for the most part.

• The possible delay between user request and server response is an obvious drawback of AJAX. This lag, known as network latency, is made worse by a phenomenon that has nothing to do with the technologies involved. When a page is rendered in its entirety, the human eye naturally re-adjusts itself to identifying the changed elements of the refreshed page. On the other hand, when smaller portions of the page are rendered individually, the user may not see the change immediately and may perceive latency when it in fact does not exist.

• Another possible problem is that search engines cannot execute the JavaScript that is a part of the AJAX functionality. It is important to note that this particular problem is not restricted to AJAX.

• Yet another issue with AJAX is compatibility. JavaScript, which AJAX depends on, may be implemented differently by different browsers.
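One common (if partial) workaround for the Back-button and bookmarking problem in the first point above is to mirror the application state in the URL’s fragment identifier, which browsers do record in history and bookmarks. A sketch with invented state names:

```javascript
// Fragment-identifier workaround (invented state names). After each
// AJAX update, the page would set location.hash = stateToHash(state);
// on load, it would call hashToState(location.hash) to restore the view.
function stateToHash(state) {
  return "#" + encodeURIComponent(state);
}

function hashToState(hash) {
  return decodeURIComponent(hash.replace(/^#/, ""));
}

console.log(stateToHash("inbox/page2"));              // "#inbox%2Fpage2"
console.log(hashToState(stateToHash("inbox/page2"))); // "inbox/page2"
```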

On the face of it, the disadvantages seem to outweigh the advantages, making AJAX seem a less viable option for developers. There is no doubt that AJAX is complex, and there are still not many developers who are acquainted with its techniques. Yet a change has been brought about by Google slotting AJAX into its applications.

Google’s move is a landmark event in the web development arena. Google applied compilers to help carry out this mammoth task. Compilers give developers the chance to code in a higher-level language, which is then converted into a lower-level language the computer understands. A Java-to-JavaScript compiler was created so that developers could work in the former and leave it to the compiler to convert their work into the latter. This technology was freely shared with the developer community and is known as the Google Web Toolkit (GWT).

The GWT development cycle is rather straightforward:

1. Use Java to design, develop, debug, and test. In this process you may or may not choose to employ GWT libraries that seem of use. You are free to use any of the Java tools that you feel comfortable with – Eclipse, IntelliJ, JProfiler, JUnit.

2. Use the GWT compiler to distill the application from Java into a set of JavaScript and HTML files that can work with any web server.

3. Ensure compatibility of the application with the browsers you want to support.

GWT can be run in two modes – hosted mode and web mode.

Hosted mode: Most of the development time would ordinarily be spent in this mode, because your application runs as Java bytecode within the Java Virtual Machine (JVM), giving you the benefit of Java’s debugging facilities.

Web mode: In this mode, the application is run as pure JavaScript and HTML.

If AJAX is meant to ease the surfing experience of users, GWT is meant to ease the process of development to the farthest possible limit. And GWT has made it easy for developers to use AJAX for creating applications. For instance, common errors that occur with JavaScript, such as typos and type mismatches, can be identified at compile time. There is often a conflict between what is easy for developers to do and what is beneficial for users. This conflict, needless to say, must end in favor of what is beneficial for users. And the net result of using GWT and making things more convenient for developers would of course be a better web experience for users.

The main features of the Google Web Toolkit are:

• Even though, unlike traditional HTML web applications, GWT applications do not need to fetch new HTML pages as they execute, they do in fact need to get data from the server. Also referred to as a server call, this mechanism is better known as Remote Procedure Call (RPC) and enables interaction with the server across a network.

• The presence of dynamic and reusable UI (User Interface) frameworks. The key difference between UI frameworks in GWT in comparison to others is the way widgets (Java classes on the client-side that are used to build user interface) are rendered.

• Full-featured debugging in the hosted mode.

• Allows for the appropriate management of browser history.

• Automatic compatibility with different browsers is yet another attractive feature of GWT applications.

• Yet another feature of the GWT is that it helps you internationalize your applications and libraries.

• GWT allows you to unit test in a debugger and browser.

• With the help of the JavaScript Native Interface (JSNI) you can add handwritten JavaScript in the Java code.

• The most important feature of the GWT is the fact that it is completely open source code.

For the uninitiated, all this may sound too technical. But the very purpose of GWT is to extract developers from the web of technicalities and give them space to create something that speaks to their end-users. And the demand for interactive spaces online is only going to increase. The AJAX trend is catching on, and thanks to GWT, developers are slowly but surely getting over their initial apprehensions about the difficulties that AJAX poses. The role of developers in the development lifecycle of a web application cannot be overstated, but with AJAX-enabled GWT, their role ceases to be just that of tying together back-end operations. Google Maps is an excellent example of the advantages of working with AJAX within the GWT framework: it is a definitive example of something dynamic, attractive, and completely user-friendly. Finding locations and using functionalities such as instantaneous zoom in/out is a tremendous advancement. Imagine having to wait interminably for the page to reload every time you click on a location or search for it in the search bar. The very purpose of having the map would be defeated if it took just as much time to look for a specific location online as it would on a printed map.

AJAX-enabled GWT is the practical way forward. End-users may take this for granted, but the work that goes into creating this ultimate user experience pays off. And indeed, GWT has made ease of development possible without losing out on user satisfaction. With techniques like AJAX and systems like GWT, the future of web development holds a lot of promise for users and developers alike!

About the Author:
Munish Rathee works on divorce-related sites; the main related topics are new jersey family law attorney, Cleveland Divorce Attorney, Connecticut divorce attorneys, and relationships after divorce.