Usability Evaluation

On 31 July 2015, work began on a survey to determine how many individuals used e-readers and what their general opinion of the technology was. After a week and a half of refinement, the survey was completed and opened to a general audience on 11 August. This report presents the results of that survey, provides background on the design choices behind it, and offers opinions on both the survey process and the tools used to conduct it.

As stated previously, the survey began with the goal of seeking opinions on “whether e-readers would be appealing to people over 65” (Preece, Rogers, & Sharp, 2011), along with a set of questions provided by the coursework. The first concern was reaching a target audience of people 65 or older and phrasing the questions accordingly; because that audience could not be reached quickly and easily through an online questionnaire, the survey and its questions were broadened to be generic rather than age-specific as initially planned. Once that decision was made, the example questions were transferred to the chosen survey tool, QuestionPro (http://www.questionpro.com), as a trial of the functionality the tool provided with its free account. The interface proved pleasant to use and offered many question styles and built-in templates, which made building the survey nearly effortless; most of the effort went into design choices rather than learning the features offered.

Once the initial questions were entered into the tool with some default answers, several of which also came from the source material, the survey was previewed to form a first impression of how the questions were presented. Many changes came at this point: the personal evaluation was repeated several times while answers were altered, questions were rephrased, and questions were added or broken into separate ones. One set of eyes on the questionnaire proved inadequate, however, so other users were asked to review the initial questionnaire and, rather than answering the questions themselves, give feedback on how the survey was structured. Several further changes came from these secondary evaluations, including the addition of optional questions collecting the age and gender of each participant.

The usability of the tools on the website made it simple to implement these and other changes: a description was added at the beginning of the survey to clarify some vocabulary and the survey's purpose, and a page break was inserted to separate the required questions from the optional census-style questions. These changes took place over the previously mentioned ten days, allowing assessment and revision over time, but eventually the survey had to be finalized and posted so that data would be available to analyze.

The website's tools, once again, made sharing the survey simple and elegant. The link was first posted on a personal social media account in the hope of reaching a varied audience among the people who follow the account and would be willing to participate in a short survey. One key reason for posting the survey online and on social media was that “more than 80% of the U.S. population is currently online” (Taylor, 2012), along with the personal belief that a majority of that population also uses social media. The response from social media proved lower than anticipated, however, so additional methods were adopted, including sharing the survey directly with friends and co-workers and asking them to pass it on to anyone they felt would have varied opinions on the questions. This brought in additional completed surveys, though the total reached only eight individuals, which was more than adequate for the requirements of the assignment but not the large, varied audience originally anticipated and desired. For future surveys, additional methods would be used to attract more participants, such as following up with individuals who were personally asked to take the survey or advertising it in a way that encourages sharing with a larger audience.

The census data also showed that 87.5% of the participants (seven of the eight) were between the ages of 25 and 34 and that all participants were male, which did not give the varied perspective initially desired; this was largely a consequence of the audiences the survey reached. Despite the small number of participants and their similarity in age and gender, the data revealed some differing opinions on each question.

The first question presented was “Have you used an e-reader before?” Half of the participants (four of eight) answered “Yes, but only once or twice,” a quarter (two) answered “No, have never used,” and “Yes, use daily” and “Yes, use less than weekly” received one response each. The only answers with no responses were “Yes, use weekly” and “Don't know / Not sure.” In future iterations of this question, the last answer could likely be dropped, as it is better suited to questions where the participant may be unsure which answer is correct. These results show that three quarters of the participants had used an e-reader before, though most had only tried one briefly rather than spending extended time with one.
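For transparency, the proportions quoted above can be reproduced from the raw counts. The following is a minimal sketch in Python using the counts as reported here; it is illustrative only, since the free QuestionPro account offered no data export to work from.

    # Question 1 responses as reported above (eight participants, one answer each).
    q1_counts = {
        "Yes, but only once or twice": 4,
        "No, have never used": 2,
        "Yes, use daily": 1,
        "Yes, use less than weekly": 1,
        "Yes, use weekly": 0,
        "Don't know / Not sure": 0,
    }

    total = sum(q1_counts.values())  # 8
    for answer, count in q1_counts.items():
        print(f"{answer}: {count} of {total} ({count / total:.1%})")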

The second question asked which e-readers the participants had used, both to see which platforms were most popular and to validate the first question in case a participant was unsure which devices count as e-readers. One result did not match, most likely because of poor wording: three individuals selected “None of the Above,” which was intended to mean the individual had never used an e-reader and so should have matched the two “never used” responses from the first question. With clearer wording, the final “Other” option would likely have received two responses instead of the one individual who wrote in a Sony e-reader, which was not included in the answer list. The remaining results were three answers for using an application on a multi-purpose tablet rather than a device dedicated to loading and displaying books, and one answer for the Amazon Kindle. The results were surprising in two ways: every individual selected only one device even though the question allowed multiple selections, and most individuals used e-reader applications on another device instead of a dedicated e-reader. This question would benefit most from a wider and larger audience to get a more general sense of which devices are actually used, along with clarification that multiple responses may be selected and a change of “None of the Above” to “Never used an e-reader device.” A simple consistency check between the two questions, sketched below, illustrates the kind of validation intended.
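As a rough illustration of that cross-validation, per-participant records could be compared as follows. The record layout and field names (q1, q2) are assumptions made for illustration, not an export from the tool.

    # Hypothetical per-participant records; the free account offered no export,
    # so these field names and values are illustrative assumptions.
    responses = [
        {"q1": "No, have never used", "q2": ["None of the Above"]},
        {"q1": "Yes, but only once or twice", "q2": ["None of the Above"]},  # mismatch
    ]

    for i, r in enumerate(responses, start=1):
        never_used_q1 = r["q1"] == "No, have never used"
        never_used_q2 = "None of the Above" in r["q2"]
        if never_used_q1 != never_used_q2:
            print(f"Participant {i}: Q1 and Q2 disagree on e-reader use")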

The third question was a yes-or-no question asking whether the individual enjoyed reading books on an e-reader; the majority said yes, with five individuals answering yes and one answering no. The other two individuals answered “Don't Know / Not Sure,” which matches the first question, where two individuals had never used an e-reader. This question was paired with the fourth, which asked whether an e-reader was easy to handle or cumbersome, providing another binary question in preparation for the more in-depth questions to follow. Surprisingly, the six individuals who had used an e-reader before were unanimous that e-readers were easy to handle, again with the two expected “Don't Know / Not Sure” responses.

The fifth question asked all users what they liked most about e-readers in general, offering a list of features commonly found enjoyable. Like the second and sixth questions, it allowed multiple selections so that one individual could register several likes, yet it was surprisingly the only question to receive more than one selection per user: there were thirteen selections from the eight participants, although the one user who selected “Other” typed in an answer that was more comedic than serious, leaving twelve genuine answers. Over half of those answers, seven of the twelve, said the devices were easy to carry, implying that all but one individual selected this answer. The rest of the results were three for ease of purchasing new books and one each for keeping up with technology and for the text-to-speech feature available on some models. The lack of any answers for “Do not like anything about e-readers” was surprising, since even the two users who had never used an e-reader and the one who disliked them all found something they felt was enjoyable about them.
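The “easy to carry” share quoted above can be checked with a similar tally, this time expressed both as a share of all answers and as a share of participants, since the question allowed multiple selections. The counts below are as reported, with the comedic “Other” entry excluded; this is a sketch, not tool output.

    # Question 5: 12 genuine selections across 8 participants.
    q5_counts = {
        "Easy to carry": 7,
        "Ease of purchasing new books": 3,
        "Keeping up with technology": 1,
        "Text to speech in some models": 1,
    }

    participants = 8
    total_selections = sum(q5_counts.values())  # 12
    for answer, count in q5_counts.items():
        print(f"{answer}: {count} selections "
              f"({count / total_selections:.0%} of answers, "
              f"{count / participants:.0%} of participants)")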

The sixth question was the opposite of the fifth, asking what the user liked least about e-readers in general. As noted with question five, this question allowed multiple selections but received only one answer per user. Half of the users answered that they did not dislike anything about e-readers; there were two answers for battery life, one for missing the physical feel of a book, and one “Other” answer in which the individual noted that the device they used was slow to turn on. There were no responses for disliking technology in general or for difficulty reading the screen, but both complaints have been heard personally during work hours, so this question too would benefit from a more varied audience. In a future iteration of the survey, it would also be advisable to add the user-supplied answer about the device being slow.

The seventh and final question before the census questions was a free-text question asking the user to describe a time when they found an e-reader uncomfortable or difficult to use. The results were very similar to the previous question: four participants gave some equivalent of “never” or stated they had not used one. The other four responses were unique and could be added to the sixth question's answer list: one individual felt e-readers were harder to use on a crowded bus than a paperback book, one reiterated that their old tablet tended to be slow, one mentioned that a device they used to own would lock up or freeze on a consistent basis, and the last reported trouble using an e-reader in sunny spots.

Overall, the results gathered from the survey showed not only the potential of the devices analyzed but also lessons in how questions should be phrased in questionnaires generally, including being clearer about the meaning of each answer and specifying when a question allows multiple responses. The only concern directly related to the tools was that, although the analysis view gave detailed information, the option to export the data into a format usable by outside programs was available only to paid users; free users could only view the information online or print the page. These lessons will serve as good guidance for future efforts to gather information from a large audience using the questionnaire or survey style of data gathering.
