continuations

Debunking SEO

Posted in General, Software, Thoughts by Joshua K on October 2, 2011

SEO — otherwise known as Search Engine Optimization — is the process of improving a website so that search engines can more easily index its relevant content, with the goal of ranking it higher in search results. Online (and sometimes offline) contractors abuse the term to sell services that promise to improve your “SEO ratings”. About ninety-eight percent of those so-called contractors — and maybe more — are quacks, yet companies waste significant amounts of both time and money on them. It’s all quackery, and that contractor is hoping to fool you into paying him for a rain dance.

Let me reiterate in case you missed the last paragraph: in most cases SEO is a pile of crap. If you’re an executive paying someone to improve your company’s website “SEO”, I can almost guarantee you’re wasting money that you could be spending on your employees. Most of these contractors know it’s all quackery and treat that as a business opportunity to drain you of as much money as they possibly can. Why is this? Keep reading.

We already know what SEO is, but let’s define some other terms so we’re not arguing semantics. One half of a search engine builds an index of all the websites it has found on the Internet. The other half is what the user primarily interacts with: the user types in a set of keywords (a query) and the search engine tries to find information in its index relevant to those keywords (a set of results). This creates a bottleneck: the search engine is the gateway through which users find websites. SEO, as discussed earlier, is the process of optimizing a website so a search engine can return it as a more relevant result for a user’s query.

The key to optimizing a website, then — as trite and obvious as it seems — is relevant and useful content. That easy, you say? Not really, no. Machines have an extremely hard time determining what is and isn’t relevant to users. They can do massive amounts of statistical analysis on the data they collect, but that takes time, and users are impatient. So most modern search engines also lean on behavioral signals: if a user clicks on a result and then immediately hits the “back button” in their browser, the search engine sees the results page requested again and concludes that the website clearly wasn’t useful to that user. There are other metrics, but overall a search engine’s results reflect how useful your content is to users. If a user doesn’t like what they see and leaves, not only will they likely never come back, but the search engine notices as well, compounding the effect of useless and irrelevant content.

There are many metrics, but it all boils down to a bit of common sense and an understanding that your users are more than countable business goals. You might have a default blame-deflecting phrase like “That’s not a problem for us! We have good content!”, and your SEO contractor will likely agree with you. The fact is, however, that if your statistics say your content is useless or irrelevant to your users, no amount of paying a contractor to “improve SEO” will solve the problem. In case I haven’t driven it home yet, I will do so now: if your users find your website in any way irrelevant or useless for their direct and immediate needs, they will leave and they will never come back.

It’s important for users to see relevant and useful content, and the prerequisite for that is understanding who your users actually are. If you claim to target everybody indiscriminately and you’re not Facebook or some other Web 2.0 social-media big shot, you’re likely heading for a world of hurt. I challenge you to step outside your business mindset and think about all the people who come to your website. Put yourself in their shoes. What are they coming to you for? You can’t target your content at an undefinable market segment and expect any success.

At this point you’re probably asking, “Okay, this is all great information, but what can we do to fix it?” That’s a tough question, and the answer, like so many things in this world, is “it depends”. I don’t have a degree in marketing or business administration, but if I know your business, I can definitely point out ways your content is failing to deliver, usually within a page or two of reading. Again, step outside of your business mindset and hear me out.

Your website likely isn’t delivering what you expected because its content sucks. It’s broken because nobody wants to read it, and stuffing more keywords, alt tags, JPEG images, and Flash media into the page won’t fix that. Improve the site by adding real, useful content. What counts as “useful” depends entirely on who you’re targeting. Real content targeted at real people tends to deliver real results.

Most people visit a website to answer a question they have. They may be looking for the solution to a problem, seeking advice about a subject or information about an industry, or looking to buy a product, in that order. If you can’t answer their question, they leave. Mission failed. Most corporate websites focus exclusively on the last segment and ignore all else. They likely see it as the quickest return on investment: a fast, easy way to make a buck from the Internet. That short-sighted thinking means that when the one segment they exclusively targeted evaporates — and it will, when nobody can find the website — the website becomes costly dead weight instead of an active revenue stream. At that point, most executives will look at the reports from Google Analytics and blame the website’s “SEO” without thinking about what that actually means. So fire the website developers and get new ones… right? (I’m kidding, of course.)

Many websites also suffer from ego-driven development. Its biggest symptom is company-centric content that talks about “our commitment”, “our products”, or “our services”. It flatters the alpha-dog manager or business owner who demands it be included, but it provides no value whatsoever to the visitor. Many otherwise savvy business owners believe that marketing and sales copy is all about their business, but nothing could be farther from the truth. Visitors don’t care about your company. They only care about what the company can do for them, now.

In case I haven’t driven the point home just yet, there are other ways website content fails to deliver. Many websites like to list lots of facts about their products. (Some of those facts are actually opinions, but that’s a different issue entirely.) As any marketer worth their salt (or any behavioral psychologist) will tell you: people react to emotional stimuli and then rationalize their feelings with facts afterward. Facts are great for post-purchase rationalization, but they don’t connect with people on an emotional level. Save the facts for the optional flat-sheet product PDFs. Tell people about the benefits your product provides.

This ties in with the ego-driven development issue. Often a company will take the facts supplied by an in-house expert and paste them directly into the website in an attempt to attract like-minded, technical people. What they miss is that those like-minded people will likely never see the website, because the executive who wanted a demo of the product was scared away by the information overload first. This isn’t to say you should strip all factual content from your website. On the contrary: facts increase credibility and authority, but benefits increase the number of customers, because benefits appeal to their self-interest.

The greatest way to improve your SEO isn’t to hire an outside contractor to perform magic voodoo; it’s to target more than just the currently-buying market segment and to improve your content. An analogy applies here: website content is like a good novel. Yes, you can write a masterpiece, but it won’t matter if it never gets published. A search engine is merely a scout for a massive publishing house (your website visitors) to find the good writers. If no one wants to read your novel because it doesn’t answer the questions they have, it won’t get published. Mission status: failure.

How does this all relate back to SEO contractors? In short, most of those so-called experts are not actually experts at all. They know how to manipulate the statistics you see in Google Analytics to produce a seemingly positive, temporary ratings boost, for a premium fee. It’ll give the company a temporary ego boost and may attract some temporary visitors, but it won’t increase your revenue in the long term. In short, they’re gaming you for money and selling you a bogus service.

Don’t waste your money on SEO contractors. Spend it on building useful, customer-centric content that keeps visitors around and attracts every segment of the market. The more reasons people have to stay on your website, and the more returning traffic you keep, the less SEO will matter, and the higher your search rankings will climb.

Rewriting Requests in Nginx

Posted in General by Joshua K on September 30, 2011

Recently I’ve done a lot of work on a number of server-side applications that run on Nginx. These applications need the server to rewrite all requests to the index.php script in the server’s document root (or wherever the deployment lives).

Even though Nginx has a wiki filled with very good information, many other sources of documentation (notably not the creator’s) publish incorrect rewrite rules for this purpose. Some snippets claim that conditionally rewriting requests based on the existence (or lack thereof) of a file is the way to do it. In fact, Nginx needs only a single directive in its configuration:

try_files $uri $uri/ /index.php;

That directive tells Nginx to try to serve a file by that name, then a directory by that name, and then to fall back and hand the request to /index.php. That’s it. One line. Note that the final parameter is a URI, which is why it needs the leading slash. (You can see why I advocate Nginx over the mess that is Apache and its configuration files.)
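
For context, here is a minimal sketch of where that directive lives in a full configuration. The server name, document root, and PHP-FPM socket path are all placeholder assumptions for illustration, not details from my deployment:

server {
    listen 80;
    server_name example.com;            # placeholder
    root /var/www/app;                  # wherever the deployment lives

    location / {
        # File first, then directory, then the front controller.
        # The query string survives the internal redirect.
        try_files $uri $uri/ /index.php;
    }

    location = /index.php {
        # Assumes PHP-FPM on a local UNIX socket; adjust the path
        # to whatever your PHP backend actually listens on.
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root/index.php;
        fastcgi_pass unix:/var/run/php-fpm.sock;
    }
}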

Your visitors will thank you when you scale your site in the future.

Messing with Makefiles

Posted in General by Joshua K on September 25, 2011

I like Makefiles, though originally I didn’t.

Before I’d begun releasing anything of significant value, I would try almost any alternative, simply because I’d deemed Makefiles too hard to work with. My prejudice was self-reinforcing: I hadn’t tried using them on any project because they were bad, and because they were bad I wouldn’t use them on any project I had to support. I tried everything from autohell to CMake to SCons. Although the SCons phase lasted a while, I eventually grew frustrated with its shortcomings and pitfalls. So I decided to try building something with a Makefile.

Make, to put it succinctly, is brain-dead. That’s what I like about it. It doesn’t really understand the concept of directories or files. It doesn’t know how anything on the host computer behaves. Shells, environment variables, compilers and their languages and implementations: Make is oblivious to all of them. Free of those assumptions, it thrives on a seemingly simple set of concepts: rules, targets, dependencies, and actions. Because it assumes nothing about the environment, the maintainer is left to assemble a conglomerate of Makefile fragments to construct a piece of software. On some “highly portable” programs, Make can become bloated and slow from the bulk of Makefile fragments carried along in the program’s distribution, and maintaining a monolith of configuration options is usually a nightmare for anyone not skilled at the task.
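
To make those four concepts concrete, here is a minimal sketch of a Makefile for a hypothetical two-file C program (the file names are illustrative, nothing more):

CC     = cc
CFLAGS = -Wall -O2

# A rule reads: target, a colon, its dependencies, then action lines.
# Action lines must begin with a hard tab.

prog: main.o util.o
	$(CC) $(CFLAGS) -o prog main.o util.o

main.o: main.c util.h
	$(CC) $(CFLAGS) -c main.c

util.o: util.c util.h
	$(CC) $(CFLAGS) -c util.c

clean:
	rm -f prog main.o util.o

.PHONY: clean

Make just compares timestamps: if a target is older than any of its dependencies, it runs the actions; otherwise it does nothing. That is the entire model.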

As I’ve often found in my experience of building and fixing software from the ether: the larger or more complex a project is, the worse the maintenance and support mechanisms around it are. (There are exceptions, but I haven’t found many.) When you look at the bigger picture, though, it begins to make sense. The people most devoted to building or contributing to a piece of software just want it to build every time it’s called upon to do so; they don’t bother maintaining the large rule sets and boilerplate. It’s not what they do. After looking more deeply into the tasks Makefiles were made to do, the reasons for their abandonment by most new developers began to make sense:

Nobody used Makefiles because nobody wanted to.
If nobody wanted to use them, there must have been some major disadvantage in doing so.
Perhaps that is where my aversion to Makefiles came from.

Ever since coming to that realization, I’ve tried to understand Makefiles as what they are: a piece of oft-written boilerplate that is simply required to build the software it comes paired with. If the code I build doesn’t have a strong set of supports beneath it, it won’t work; the build is just the other half of the code itself. Since then, I’ve gained much greater insight into constructing Makefiles that “just work”. I don’t have any significantly large projects to support at the moment, but now that I have a better understanding of how they work, I can fit a project with a Makefile infrastructure quite easily.

Many people don’t take the time to understand the intricacies of Makefiles, and I don’t blame them: it’s not important to their software. Certain implementations of Make also have notable flaws and hidden complexities; GNU Make supplies many examples, and Microsoft’s “nmake” is even worse. NetBSD’s “bmake”, or Debian’s derivative of it dubbed “pmake”, are better in the system-portability respect, because they ship a lot of the commonly written boilerplate Makefile fragments with the system itself. People who use those two implementations therefore don’t have to ship that added bulk with each new distribution of their software.
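
To illustrate what those shipped fragments buy you, here is a sketch of a complete Makefile for a hypothetical program built with BSD make; the program name and sources are made up, and the system-provided bsd.prog.mk fragment supplies all of the compile, link, install, and clean rules:

PROG=   hello           # hypothetical program name
SRCS=   hello.c util.c  # hypothetical sources

.include <bsd.prog.mk>  # the shared boilerplate does the rest

Four lines, and the heavy lifting lives in one copy of the fragments on the system instead of in every source tree.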

Other build systems, like SCons or CMake, internalize that boilerplate even further, embedding it in (or tying it very closely to) the build-system binaries themselves. For some projects that is useful, but on larger or more intricate projects the maintainers end up working around the built-in functionality. Still other, more obscure build systems don’t internalize the boilerplate but abstract it away into dozens of independent programs and language interpreters; the rules common to certain sets of projects end up shipped with their respective interpreters. Once those layers of abstraction become unmaintainable, people have to write yet more rules to work around the original ones. This is the basis of the GNU autotools suite, or as those who have experienced it so aptly dub it: autohell.

Makefiles are simple. The original implementation from 1977 was meant to do one thing: build software on UNIX systems. It was, in effect, the simplest, stupidest thing that just happened to work. The problem? Most people ignored what it was originally intended to do and tacked on whatever features appeared useful to them. A presentation called “UNIX Style, or cat -v Considered Harmful” [1] explains how the same fate befell various other UNIX utilities over time: people modified the perfectly good tools they already had instead of simply composing the tools they already possessed to carry out their tasks.

My argument is that the same thing has happened to Make and its fractured implementations, and that’s why nobody wants it. It does one job and it does it right. But people don’t want one tool doing one job; they want a multi-tool that will do anything and everything for them and hold their hand when things go wrong. The minimalism isn’t there anymore, because convenience overrides it.

Convenience comes at a price.
It takes hard work to build good software and it takes hard work to support good software.

[1] http://harmful.cat-v.org/cat-v/unix_prog_design.pdf
