Web Usability Case Study

I was recently involved in a case study within our company for our new website design project. One of the most important things we wanted to test and research was scrolling. We wanted to gauge how many visitors we were losing or gaining if we made the user scroll the page.

Scroll bars and page resolution have been studied extensively in the past. One very good instance of this research gave us a lot of insight into what needed to be done. However, some of the senior people in the company were not convinced by the amount of data available for this particular problem. I personally felt that the data we had from past usability testing was not convincing either.

We therefore ran a test with 27 participants to get direction on whether our website should or shouldn't have scroll bars.


  • Participants: 27
  • Age group: 24-37
  • Level: intermediate web users
  • Total questions: 4
  • Website niche: web design and web development. Also Eurosolar.

Test Aim

  • To find how users reacted if they had to scroll for information.
  • To find whether a website in our niche performed better without scroll bars.
  • To find the direct relation between page height and bounce rate.

Further Testing

  • PPC campaigns were run on two different design layouts (one with a scroll bar and one without) to see which one performed better.

The Process

  1. Users were asked to answer four questions.
  2. Question one involved finding a specific link on a particular page. Mouse movements and completion time were tracked.
  3. Question two involved filling out the quote form, which was kept at least two clicks away from the entrance page.
  4. Question three involved writing a short description of everything they remembered about the page.
  5. Question four involved rating the pages in order of preference, so the page they liked most was ranked 1.


Question 1
  • Users were shown a series of websites before being presented with sample layout 1 and asked to find a web link and click on it.
  • They were again put through a few websites before being presented with sample layout 2 and asked to find a link and click on it.


  • Users found the web link more easily on the layout without scroll bars.
  • The layout without scroll bars had at least a 0.44-second lead time over the layout with scroll bars.
  • At least 40% of users found it easier to find the link on the layout without scroll bars.

Question 2

The purpose of this question was to find how easily a customer can locate the required information.


  • Users found the quote form easily.
  • Users filled in the quote form with an average lead time of 4.8 seconds.
  • 22 out of 27 users gained a lead time on the page without scroll bars; on average, users took 4.8 seconds longer to fill in the quote form on pages with scroll bars.

Question 3

This was a descriptive question, and most users gave better reviews to the layout without scroll bars.

Question 4

Following is the graph of responses for user ratings of both layouts.

  • 67% of users preferred the layout with no scroll bars.

The data suggested that layouts with scroll bars made users scroll, increasing the total time needed to complete a set goal. A user is most likely to bounce if he or she cannot find the wanted information on the landing page.

The PPC campaign with the layout without scroll bars delivered 0.15% more ROI than the one with scroll bars. The PPC data was not factored into the decision, as the difference was not strongly motivating.

A lot of websites with overdone footers suffer from higher bounce rates for the same reason. The landing page, or any page of the website, should serve its purpose as simply as possible. Any comments or feedback are appreciated.




Two Things a Blogging Platform Should Have!

Talk about a blog and most people will start on "how awesome WordPress is…". Yes, I know. I use WordPress too. A number of blogging platforms are coming up nowadays, and most of them disappear before making a sound. WordPress, it seems, is monopolizing the whole blogging space.

However, I personally think WordPress lacks a few core features that could push it to become the most preferred blogging system.

1) Social Mash

Think about all the kinds of plugins and modules you have installed on your website for bookmarking and tweeting your posts. After writing content on your website, you need to go out and market it.

What if there were a social mash that pooled all the content on a given platform and, based on a user's query, returned the most relevant results? I know this is the job of a search engine, but it does not need to be that complex. A social mash is a collection of content in a given sphere where users can cross-link, refer to, or cite a fellow user, which in turn helps both of them grow. Ghost blogging has increasingly grown to satisfy such requirements. The social mash would obviously be cross-platform compliant: any content you share on the social mash gets automatically synced with multiple channels, through automatic status updates, link-backs, pings, and much more.

Some core benefits include:

  • No plugins or modules to install
  • Social mediums designed into the architecture
  • The social mash may create user groups
  • Easy serving of content based on user organization
  • Rich and effective content pushed up by users

2) Core Analytics / Insights

Google and Yahoo do a great job of indexing content. They also do a great job of showing you what your visitors are doing. It would be nice for WordPress, or any other blogging platform, to pick up the cue and integrate analytics into its core system.

The benefits of doing so will include

  • Actionable insights right where you need to apply them
  • Live user interaction
  • Live suggestions – if you walk into a shop, a salesman helps you out; similarly, you could serve your visitors.

So, forum, what are your thoughts?

WordPress Optimization

A faster-loading website is simply good business. Period. The speed of a website depends on a lot of factors, including your hosting environment, the size of your database, and the number and size of your images. With a lot of businesses preferring WordPress these days, a definitive guide to speeding up WordPress is required.

Recently, a client of ours built a website with WP e-Commerce and was not satisfied with its load times. I decided to get my hands dirty and gather all the information I could on the topic. After a lot of experiments on an existing database of over 330 MB and a home page weighing 1.1 MB, I was able to speed the website up to a satisfactory level.

The following items were considered in the experiments, with their impact on speed described along with some data below.

WordPress’s Own Caching Engine

WordPress has a built-in object caching mechanism and an API. The mechanism is used to store complex objects, including HTML structures, that are resource-consuming and have to be generated on the fly. The engine is implemented in PHP memory, which means objects do not persist between requests; it is useful if you fetch the same object repeatedly within a request. The procedure to activate the built-in default caching system is very simple.

1) In wp-config.php, turn on caching: define( 'ENABLE_CACHE', true );

2) Create a cache directory in wp-content and CHMOD it to 777.

If, after browsing through the WordPress site, you do not see any cache files in your cache directory, recheck your folder permissions; some hosts require the permission set to 755. You can skip enabling this cache if you are going to use W3 Total Cache.
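Put together, the setup amounts to a one-line change in wp-config.php plus a directory; this is a sketch, and the paths below assume a standard WordPress install:

```php
<?php
// wp-config.php – enable WordPress's built-in cache
// (the ENABLE_CACHE constant applies to older WordPress versions).
define( 'ENABLE_CACHE', true );

// Then, from the shell, create the cache directory:
//   mkdir wp-content/cache
//   chmod 777 wp-content/cache   # some hosts want 755 instead
```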

You may find this WordPress reference useful for further clarification.

Flush The Buffer

The buffer is a big memory-resident string. When a user requests a page, the server starts calling various components to weave together the entire web page. Flushing the buffer allows the browser to fetch partially ready HTML.

Imagine you flush the buffer just after the </head> tag. While the server is still generating the body content, the browser can start loading CSS and favicons simultaneously. The impact on speed is not phenomenal, but it can shave a few milliseconds off your load time. This is especially useful on busy back ends.

Just insert the following line in your header.php after the closing </head> tag.

<?php  flush();  ?>  

Some users have reported conflicts with the W3 Total Cache plugin; flushing may interfere with your caching plugin. On a different note, you can combine gzip compression and flushing.
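As a rough sketch of combining the two, you can gzip the buffered output with ob_gzhandler and still flush early after the head (the file names below are placeholders):

```php
<?php ob_start( 'ob_gzhandler' ); // gzip everything buffered from here on ?>
<html>
<head>
  <title>Example page</title>
  <link rel="stylesheet" href="style.css" />
</head>
<?php
// Push the head out now so the browser can start fetching
// style.css while the body is still being generated.
ob_flush();
flush();
?>
<body>
  <!-- slow, database-heavy content goes here -->
</body>
</html>
```

Test this against your caching plugin first, since W3 Total Cache may already be handling compression.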

JavaScript in the Footer

Scripts block parallel downloads. Using a cookie-free domain to serve static content like HTML or images is hence a good idea: it allows the browser more than two downloads at the same time, speeding up fetching.

The problem with scripts is that even if you spread them over different host names, a script blocks all other downloads until it has completed. Unless your page uses document.write at the start, all other scripts can be moved to the bottom.

If you have declared all your JavaScripts properly, an excellent plugin by Vladimir Prelovac called Footer Javascripts will do the trick.
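For scripts you enqueue yourself, WordPress can do this without a plugin: the fifth argument of wp_enqueue_script asks for the script to be printed before </body> instead of in the head. The handle and file path below are assumptions for illustration:

```php
function mytheme_enqueue_scripts() {
    wp_enqueue_script(
        'mytheme-main',                               // handle (assumed name)
        get_template_directory_uri() . '/js/main.js', // assumed file location
        array( 'jquery' ),                            // dependencies
        '1.0',                                        // version
        true                                          // $in_footer: load before </body>
    );
}
add_action( 'wp_enqueue_scripts', 'mytheme_enqueue_scripts' );
```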

Put Style sheets at the Top

Putting style sheets at the top helps the page render progressively. If all the graphical elements referenced in the CSS and the HTML are loaded before the scripts, the user experience improves in a very satisfying way. Users care far more about visual feedback than about the complete render; they can start reading fully rendered text while the JavaScripts are still loading.

You might often have seen a blank white page before the next page loads. This is often caused by loading CSS at the bottom. A lot of browsers, especially Internet Explorer, avoid rendering until the CSS has loaded, in order to prevent redrawing elements on the page.

This rule has been explicitly mentioned in the HTML specifications.

Remove Unnecessary JavaScripts from Pages

A classic example of this is Contact Form 7. Its JavaScript is only required on the contact page, yet the plugin loads it on all pages. An extra 50 KB of overhead on a page adds measurably to the load time.

There are quite a few examples on the web detailing how to include particular JavaScripts only on particular pages.

Here is a sample code to add to functions.php.

add_action( 'wp_print_scripts', 'deregister_cf7_javascript', 100 );

function deregister_cf7_javascript() {
    if ( ! is_page( 100 ) ) {
        wp_deregister_script( 'contact-form-7' );
    }
}

Check your theme for any custom JavaScript it adds on every page and verify whether each page really requires it. The same procedure can be applied to CSS.
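As a sketch of the same idea for CSS, wp_dequeue_style can drop a stylesheet everywhere except the page that needs it; the handle and the page ID 100 mirror the JavaScript example above and are assumptions:

```php
function mytheme_dequeue_cf7_style() {
    // Keep the Contact Form 7 stylesheet only on the contact page.
    if ( ! is_page( 100 ) ) {
        wp_dequeue_style( 'contact-form-7' );
    }
}
add_action( 'wp_enqueue_scripts', 'mytheme_dequeue_cf7_style', 100 );
```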

WP Smush.it

WP Smush.it optimizes your images by:

  • stripping meta data from JPEGs
  • optimizing JPEG compression
  • converting certain GIFs to indexed PNGs
  • stripping the un-used colours from indexed images

Yahoo has an online tool that does a similar job. Using this tool on images of around 80-85 KB, I found an average saving of 10-20 KB. If you use the WP Smush.it bulk tool, make sure you run it when the load on your server is lowest.

Consider a WP e-Commerce category page with 35 products, each with a 20 KB image. Smushing those images with the plugin could bring significant savings for your website.
Also make sure you have enough max_connections set on your SQL database so bulk smushing runs smoothly. Most shared hosts allow up to 100.

Do Not Scale Images Using HTML

A lot of webmasters, and especially users of WP e-Commerce, still scale images the old way, inline. <img src="" width="500"> is not only resource-consuming but also loads the larger image in the first place. It is good practice either to use a separate thumbnail for that particular page or to use TimThumb to load the image.
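One way to do this is to let WordPress generate a real thumbnail and serve that instead of the full-size file; 'product-thumb' and its dimensions below are arbitrary choices for illustration:

```php
// In functions.php – register a dedicated thumbnail size.
add_theme_support( 'post-thumbnails' );
add_image_size( 'product-thumb', 150, 150, true ); // hard-crop to 150x150

// In the template loop – output the pre-scaled file, not the original.
if ( has_post_thumbnail() ) {
    the_post_thumbnail( 'product-thumb' );
}
```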

Use a Favicon

The browser will request favicon.ico whether you have one or not. If you do not have a favicon, the browser gets a 404 Not Found, and a 404 often causes a delay while the browser waits for the resource response. It is best to place a favicon in the root to ensure the browser never gets that 404. The browser also sends cookies every time the favicon is requested.

Best practice is to make the favicon small and cacheable.
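A minimal setup is an explicit reference in your header plus a long cache lifetime; the Apache directives below assume mod_expires is available on your host:

```
<!-- header.php -->
<link rel="shortcut icon" href="/favicon.ico" type="image/x-icon" />

# .htaccess – cache the favicon for a month
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/x-icon "access plus 1 month"
</IfModule>
```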

Revision Control

Revision Control is a great plugin to stop bloating your database with unnecessary old revisions of pages and posts. Every time you change a post, WordPress stores a revision in the database; the change can be as small as a spelling fix.

The Revision Control plugin lets you specify a global number of revisions WordPress will keep per post. It serves a great purpose: it stops your WordPress database from bloating because of the unnecessary number of revisions it stores for your site.
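If you would rather not install a plugin, the same cap can be set with a constant in wp-config.php; the limit of 3 is an arbitrary example:

```php
// Keep at most 3 revisions per post; set to false to disable revisions entirely.
define( 'WP_POST_REVISIONS', 3 );
```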

Using a CDN

A CDN (content delivery network) is basically a collection of servers that store copies of your website's static assets, such as CSS and JavaScripts, in order to serve your website quickly to the end user. Each end user is served by the nearest CDN node rather than every user hitting the same central server. A CDN will definitely give your blog a performance boost. CloudFlare offers a free account, and W3 Total Cache has built-in functionality for the CloudFlare CDN. Try CloudFlare to see the difference in performance and bandwidth usage.

Removing unnecessary plugins

The WordPress Optimization guide does recommend deactivating all unnecessary plugins to avoid performance loss. One other thing you can do to measure a plugin's cost is to selectively disable it and compare the site's performance with the plugin turned on and off. A lot of plugins are coded inefficiently, so they should be checked to verify whether they are causing slow speed.

Optimize the Database

There are countless guides and plugins out there that will optimize the database for you, and doing so is indeed useful. I haven't been able to measure any speed impact after optimizing the database on my own server, but it seems like a good activity, especially if your database is large.

As for plugins that can optimize the database for you: go through the plugin directory and you will find a lot of plugins that do this job fairly well.

Head Cleaner

Head Cleaner basically cleans up tags in your header and footer to speed up the loading of JavaScripts and CSS. It has a lot of other features, like W3 Total Cache. It is a great tool for optimizing some of your theme files and JavaScripts.

Remove 404 elements from your page

If you reference elements on a page via a direct link and the link returns a 404, it can slow down the response, leading to a poor user experience. If you call an external JavaScript that returns a 404, the browser will not only look into the body of that 404 page for JavaScript code but will also block parallel downloads in the meantime.

<link> over @import

In IE, @import behaves the same as a <link> placed at the bottom of the page. The reasoning was covered earlier when discussing loading CSS at the top for progressive rendering. It is best not to use @import.
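The two forms side by side (styles.css is a placeholder name):

```
<!-- Preferred: the browser can begin the download immediately -->
<link rel="stylesheet" href="styles.css" />

<!-- Avoid: in IE this behaves like a <link> placed at the bottom of the page -->
<style>
  @import url("styles.css");
</style>
```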

External Caching

W3 Total Cache

From 0 to 45 Visits a Day – My Experiments And Mistakes

My blogging journey has just begun. I have failed at blogging seven times before, with countless domains and endless PHP installations, in the hope of getting it right.

The thing with failure is that it is simply "incredible". You fail, and you become even more determined to stand up and run. OK, successive failure means you are a dumb-ass (in the words of Red Foreman). Seven times means you are no blogging "expert", rather an "enthusiast".

Meanwhile, let me put forward some really good things I have learned from blogging and how this may impact you.

Let’s look at my analytics graph and then analyse.

The graph reflects the data for a total of 25 days, giving me an average of:

  • 30 visitors a day (this has climbed in the last few days)
  • 3.2 pages/visit
  • 70% bounce rate
  • A tremendous amount of time on the website

Now let’s look at some of the lessons learned.

1) Kissing ass does work – But wait

You know how you get carried away after tasting a small piece of success: completely overwhelmed because your trick worked, you start overdoing it. Don't.

The first thing I did was blog about what Mallika can teach you about social media. It was a good article with some very valid points, but in short, it was kissing ass. Mallika, a Bollywood diva, liked it and retweeted it, which is the huge influx of traffic you see on the 18th of October.

What did I gain from it?

Nothing. In fact, I lost more than I gained. The bounce rate affected the overall metrics, and now my bounce rate is ridiculously higher than the normal benchmark.

  • Relevance is the key to great blogging. Even more important is staying in your niche. Attracting visitors from the general spectrum makes you lose more and gain less.

OK. So after learning this very important lesson, I blogged about SERPD. Now let's look at some numbers.

A total of 15 unique views, which provided me with:

  • An average time on page of 10.26.
  • A 30% bounce rate.
  • 12 FeedBurner subscribers.

Would I say the second article was more insightful than the first? No. It is simply that I was talking to people who speak my language.

2) Know what I am selling, rather than selling what Google wants.

Money is a great motivator. If you have been blogging, or started blogging, because of some awesome monthly income reports, STOP and think for a while.

Talking about things Google likes will get you one-time visitors. Speaking passionately about things you love will get you an audience. This is a very important lesson I have learned, albeit after failing seven times.

3) Engage or be a part of the community. You will be invited elsewhere.

I know this sounds very generic, but I say it with the utmost sincerity. If you spend an hour writing content, spend another participating in, appreciating, reviewing, or commenting on other people's work. Blogging is more of a social exercise than a monologue.

We humans are a subset of a community. If we cannot actively leverage this to our advantage, we might as well start mocking the very nature of civilization. Make sure you get to know the people who read your content, other experts in your industry, and people who have learned a lot from their mistakes.

For example, I can gladly say that today I know a little bit about a few people (in no particular order), and vaguely know of a few others I look forward to conversing with one-on-one.

4) Stay away from "list" articles, unless you really have a real list.

Don’t just compile a list because you think it will draw visitors. For example, look at this, this, and this article. A miserable failure; they do not even show in my top content in analytics.

Unless you have a truly incredible list and a strong readership, stay away from lists. They will fetch you visitors, not an audience.

5) Experiment and Fail – is the only ladder to success.

You would have to be pretty lucky to become famous overnight. Yes, you read that right. The problem with people who become suddenly famous is that they do not value failure.

Failure, I say, is more important than success. In fact, in my dictionary, failure equates to success. Experiment with something and measure the results. You may have to go through some really crappy designs, some really weird writing styles, and some incredibly stupid conversations. But if you succeed the first time, you will miss out on the opportunities that stem from failure.

6) Brand yourself

If you have returned to my website today, you know it has been rebranded as "Quantum Entanglement". This is the very first step to being unique. This was indeed my first mistake, and I hope you, my readers, can learn from it.

So, forum! I implore you to speak, to comment, to suggest, and to engage. What are your thoughts?

Nesting Sub/Child Pages Under Parent Page Navigation – WordPress

I was designing a WordPress theme for one of our clients. The client wanted all the child pages of a particular page to be listed below the parent page. I looked around WordPress support and, combining a few support threads, finally arrived at the following code.

This may already have been mentioned at WordPress, but I am posting it to help anyone else who may find this information useful.

The code for the menu is:

<ul id="nav">
<?php wp_list_pages( 'title_li=&depth=1' ); ?>
</ul>
<?php
// Collect children of the current page's parent (or of the page itself).
if ( $post->post_parent ) {
    $children = wp_list_pages( 'title_li=&child_of=' . $post->post_parent . '&echo=0' );
} else {
    $children = wp_list_pages( 'title_li=&child_of=' . $post->ID . '&echo=0' );
}
if ( $children && is_page() ) { ?>
<ul id="childnav">
<?php echo $children; ?>
</ul>
<?php } ?>


And I am sure you can style it up the way you want.

If you have any concerns or if this helped you, please leave a comment below.

Why Following Avinash Kaushik Is Bad For Your SEO

SEO (Search Engine Optimisation) is an art, and art requires detailed vision. You may have all the data and insights you need, but only a detailed, tedious implementation of that plan will yield results.

If you have been following the SEO or analytics landscape, you are probably aware of Avinash Kaushik. He is the author of the widely read Web Analytics: An Hour a Day, and he is also the Analytics Evangelist for Google. Now, back to why following this industry expert can be bad for your SEO.

1) Avinash will quite simply entice you to become a follower

There are industry experts who will inspire you once in a while, some who will truly make you think, and then those who will make you their followers (without you even knowing it). His articles are aimed at people who do their own SEO, people who do it for their company, and those who do it for their clients.

He will tell you exactly which string to pull and when. He may even clear up some myths, and if you are lucky, he will answer your question once in a while.

His blog articles are insightful, intricately detailed, and minutely filtered for the most critical information. He may even ridicule you and your ego, "if one exists", over all the errors you have been making following SEO trends.

Meanwhile, why is it bad for your SEO?

Take, for example:

Frequency with which Avinash posts: once a fortnight.

Number of times you will check his blog in sheer curiosity: once a day (a gazillion times if you have just stumbled upon him for the first time).

2) Avinash will tell you, why you are wrong.

The thing with most technical people is that they fail to speak your language. It is difficult for a backend developer to go to a client and explain why the client should have his website designed and developed through the developer's company. Similarly, it becomes increasingly difficult for a salesman to explain to an IT administrator why his company is best at web analytics. Avinash successfully manages to put technical jargon and concepts into an easy-to-understand process.

You could start reading Avinash from any page on his website and it will all make sense, because he makes everything amply clear for the first-time reader.

You may be entirely right in how you interpreted the data, but his way of looking at it, and his excellent articles on ROI calculation, will most likely give you pointers you had not yet considered.

In Summary

Is that enough to convince you? Let me challenge you to go read one of his insightful posts; before you know it, you will be scouting a billion other useful links he has put in that article. All in all, you will have spent four hours of your day in awe of the gazillion bytes of information he has provided.

You were only doing a simple search on "how to set up simple 301 redirects on your WordPress blog" and stumbled upon Avinash Kaushik. Huh, it took you five hours to set up those redirects. Totally bad for your SEO project.

Meanwhile, you can follow Avinash here on Twitter.

Gain Access to Remote Email

How do you hack a Gmail password? This question has repeatedly been brought up on the Internet. First: if you are a beginner, do not even think about hacking into Gmail's servers.

Disclaimer: all the information presented here is for informational purposes only. This post aims to show what can be done with technology. What you do with this information, and how you choose to use it, is simply your concern and business. Awesmm will not be held responsible for what you do with it.

This post will be quite a bit of a read so why don’t you – get a cup of coffee and settle down.

The Chinese carried out an advanced intrusion into Gmail's servers quite recently. It is indeed not difficult to hack into Gmail or steal someone's password, but for beginners with no hands-on experience, such an exercise can be a little daunting. With no knowledge of IP addresses or network administration, the main choice available to end users is a RAT (remote administration tool). There are numerous methods you can use to get into someone's Gmail. Let's start with the basics.

1) If you have access to the victim's computer

Keylogger

A keylogger logs all the strokes on the keyboard, keeping track of every website a person visits by writing the keystrokes to a log file (a TXT document).

  • A keylogger can upload files to a remote server.
  • A keylogger can send files to a predefined email address.
  • A keylogger can open ports on the victim's computer to retrieve the log files.
  • A keylogger (hardware-based) can transmit the data wirelessly to a specified destination.

The specialty of these keyloggers is that they are completely hidden from the computer system; the victim has no clue that a keylogger is installed. Sophisticated keyloggers cannot be detected even after the system is rebooted.

A simple Google search for the keyword "keylogger" will yield a lot of good keyloggers at reasonable prices.

2) If you are connected to the same router or network

Packet sniffer

A packet sniffer is basically a network traffic logger that analyses and logs all traffic over a particular network. As data passes over the network, a packet sniffer captures the raw packets, decodes them, and presents the values of the various fields. Network analysers are basically administration tools for network administrators.

Usernames and passwords are often passed over the network as plain text rather than in encrypted form. A packet sniffer can capture the traffic, which can then be analysed to recover the victim's username and password.

Cain and Abel is a good packet sniffer available for free.

3) If you do not have access to the victim's computer

If you do not have access to the victim's computer, there are still some alternatives you can try to get into a particular user's Gmail. Remote administration tools, login-page cloning, and Trojans hidden by compression are some of the simplest and most effective. You could also try to steal their cookies in a live session and fake their authentication.

Let us go through these methods one by one.

RAT (Remote administration tool)

RATs are basically tools that allow a remote operator to take complete control of the victim's system. If you have used TeamViewer or Remote Desktop Assistance, you know the remote operator can control everything on your computer, including the mouse and the keyboard.

There are quite a number of tools that can do this job without the user's consent, most of them malicious in nature. Poison Ivy is a good, if very basic, example. I will let you do your own research on RATs.

RAT by combining two files – using binders

Binders basically bind two different files. Simple!

You take a remote administration tool and bind it with a JPEG file. Send the JPEG to your victim and ask them to open it. As soon as someone opens the JPEG, the RAT automatically executes itself and installs on the victim's computer. This method was used successfully in the past, but most antivirus products now pick up the Trojan in JPEG files, and even email providers scan documents before offering them for download. You can still trick not-so-computer-savvy people into falling for this method.

There are literally thousands of binders and patchers out there that can help you combine a RAT with a JPEG file. Just Google them; you will have to dig a bit to find your preference.

Cloning Login page on Remote Servers

This is a tedious trick and not extremely effective, but it can get the job done with an unsuspecting victim.

You will need a tool like BackTrack, which is easily available and can be installed on a USB stick.

The video at the bottom explains how a Facebook page can be cloned. This same trick can be applied to Gmail or Yahoo.

So, as I said earlier, it is not impossible to hack a Gmail password. Hacking a Gmail account may be far easier than you think, and it depends entirely on the type of victim. You can even social-engineer your way into someone's Gmail account; we will talk about that some other day. I hope you had fun reading this. Share your experiences.