Evaluation of AIMS: the Academic Information Management System

Wills GB, Hughes GV, Hall W.

Multimedia Research Group Technical Report No. M99-5

ISBN 0854327002

Copyright © November 1999, University of Southampton. All rights reserved.

 

Contents

Abstract

1. Introduction
2. Methodology for the hypermedia application
3. Results from the AIMS User Evaluation
   3.1 Contextual Review
   3.2 Expert Review
       3.2.1 Reviewers' Comments Scenario: Task 1 Enter a new document
       3.2.2 Reviewers' Comments Scenario: Task 2 Update a document
       3.2.3 Reviewers' Comments Scenario: Task 3 Search for a document
       3.2.4 Reviewers' Comments Scenario: Task 4 Edit document details
       3.2.5 Expert Scenario: Task 5 Following links with WebCosm
       3.2.6 General comments from reviewers
   3.3 User Trials
       3.3.1 General comments raised by the users
       3.3.2 User Questionnaires
             Comments from questionnaire
             Scores from the Questionnaires
       3.3.3 Time taken to complete tasks
4. Discussion
5. Conclusions

Acknowledgement

References

Appendix A AIMS Questionnaire


    Abstract

    The evaluation process reported here used academic secretarial staff from the Electronics and Computer Science (ECS) department to evaluate the Academic Information Management System (AIMS). The evaluation focused on the subjective opinion of the users and measured the time taken to perform the major functions of the system. This report presents the methodology and describes the rationale behind the approach used. The results and conclusions of the evaluation are reported.

    1. Introduction.

The successful introduction of any new equipment or system is largely governed by its acceptance by the users. There is therefore a need to evaluate the effectiveness of the Human-Computer Interaction (HCI) within the office environment, along with the design of the user interface and navigational aids. For the purpose of this research, a user is defined as any person who would normally have access to the information in the course of their daily work.

Evaluation is concerned with gathering information about the usability (or potential usability) of a system, in order to improve features within an interface and its supporting material, or to assess a complete interface [Preece 93]. Owing to resource and cost limitations, traditional experimental and usability testing involving a specialist usability laboratory was not an option for evaluating interface issues in this research.

The methodology used to evaluate AIMS is presented, along with the rationale for, and a brief description of, the methods used. The report then presents the results and conclusions of the evaluation.



    2. Methodology for the hypermedia application.

The methodology used for evaluating AIMS is principally based on the methodology used in the Factory Information Resource Management (FIRM) project [Wills 99], which in turn used the SUE (Systematic Evaluation of Hypermedia) framework [Garzotto 97] for the user task evaluation. SUE was originally designed with the Hypermedia Design Methodology (HDM) and museum applications in mind; hence, the actual abstract tasks and evaluation methods in the execution phase are not applicable to this research. The principles that can be distilled from the SUE approach, and used in this research, are that the model primitives or design criteria must be explicitly stated, and the usability criteria and user tasks defined. The evaluation of AIMS takes place in three stages:

1. A contextual review of the current office practice in which AIMS is to be used.
2. A structured expert review, carried out using discount usability engineering.
3. The users perform predefined tasks using AIMS to evaluate the effectiveness of the application. After completing the tasks, the users fill out a questionnaire, the results of which are analysed.

    2.1 Choice of users.

It is essential that the evaluation of a hypermedia application takes place within the environment in which it is to be used, with the appropriate end users [Yamada 95]. In addition, the choice of subjects is vital to the success of any experiment [Dix 98]. Hence, when carrying out the evaluation, different personnel are required for each stage. This is especially applicable to the expert review of AIMS, where both the content and the user interface are to be evaluated.

When evaluating the user interface, experts with computer science and HCI backgrounds should be used. When evaluating the content of the application, it is advisable to use experts who are working, or have worked, in an academic environment, preferably connected with administration. However, as each department has its own culture, actual work practices and subtle differences in terminology, personnel should ideally come from the actual working environment when evaluating the content of the application.

The users for the time trials were the actual personnel (or a subgroup) from the academic secretaries within ECS who contribute to the information held on AIMS.

    2.2 Contextual Review.

Any developed hypermedia application should be relevant and specific to the application domain in which it is to be used. Therefore, in order to produce AIMS it was essential that the current work practices and documentation structures, together with how they affect the content of the application, were understood. To achieve this, an audit of the information structure and working practices was undertaken, using the contextual inquiry method [Preece 94]. Contextual inquiry is an evaluation method that comes under the general heading of interpretative evaluation. It is used to collect and analyse data on how people use technology in their natural situations. The rationale behind this type of evaluation method is that the working environment and the practical tasks required of the user will affect their behaviour. Contextual inquiry is used to aid the developers of a hypermedia application in understanding the working environment, to uncover hidden work practices, to spot implicit work practices, and so on.

    2.3 Structured Expert Reviews.

The resources available for the evaluation lend themselves to a structured expert review [Preece 94], in particular discount usability evaluation. Discount usability evaluation is a heuristic evaluation, first proposed by Nielsen [Nielsen 89] and subsequently refined [Molich 90, Nielsen 92, Nielsen 94]. It can be carried out with few resources, and consists of scenarios, simplified think-aloud and heuristic evaluation. The whole cycle needs only a few evaluators, as more do not produce any significant benefit. The method uses small scenarios, which can be easily changed, and as the think-aloud is carried out informally it does not require psychologists to analyse the results. The advantage of this method is that the majority of errors can be found without having to waste users' time. However, the method does rely on the ability of 'experts' to judge the reactions of the users, and hence not all the problems will be found. Only a few guidelines are required for the heuristic evaluation.

However, Wills et al [Wills 97] noted that an area that is particularly applicable to hypermedia, and not covered by these guidelines, is navigation through the information space. Therefore, another principle needs to be added: 'Provide Navigational Aids'.



    2.4 Defining the evaluation tasks.

The AIMS model is used to guide the analysis of the application and to provide a common language and set of definitions for the abstract tasks. The main primitives or design concepts that describe the essential features of AIMS are shown in Table 2-1. The usability evaluation of a hypermedia application can generally be divided into three main areas:

1. The general look and feel of the user interface. This includes dialogue styles, menus, the shape of buttons, the use of colours, window design and the actions the operator has to perform, e.g. clicking on a button with the mouse.
2. Evaluation of the representation of the information structure, that is, revision control, access methods and general navigation features.
3. Evaluation of the application-specific information, that is, does it make sense, can the users find the information they need, etc.
Primitives          Structural   Navigational   Presentational
Nodes (Documents)       *                             *
Sidebar                               *               *
Menus                                 *               *
Data Entry Forms        *
Links                   *             *
Clusters                              *
Security                *             *
Versioning                            *               *
Searching                             *               *

Table 2-1 The design model primitives.

The majority of the evaluation issues are concerned with information structure and information content. However, intrinsic to the evaluation is the look and feel of the application as a whole, with special attention given to those areas of the interface specifically designed for the application, such as the toolbars, menus and data entry forms. The usability criteria used in this research are similar to those suggested by Garzotto [Garzotto 97]: reuse, intuitiveness, consistency, accessibility and orientation (see Table 2-2).

    2.5 Evaluation Tasks.

The object of the time trials was to measure the effectiveness and ease of use of AIMS in entering documents and locating information. The tasks were chosen to reflect this, using a set of prepared documents.

Defining the tasks also provides uniformity in the evaluation process, especially when different evaluators are involved. Hence, the evaluation process depends more on the procedure used and less on the evaluator's ability. In addition, the task descriptions identify the critical areas to be evaluated and the appropriate usability criteria. A list of tasks is given in Table 2-2.

Task 1: Enter a new document
  Activity: You are to enter a new set of minutes into AIMS. (A particular set of minutes will be given to the user, as will the location of a specially prepared document.)
  Design criteria: Forms; Security; Menus; Sidebar; Links; Nodes
  Usability criteria: Reuse; Intuitive; Consistency; Accessibility
  Additional comments: The form used is basically similar to the one used in Tasks 2 and 4. The user needs to access the form from the correct menu.

Task 2: Update a document
  Activity: Update a version of the set of minutes entered earlier. (The title of the minutes and the location of the new version will be given on the day of the trial.)
  Design criteria: Versioning; Forms; Security; Menus; Sidebar; Links; Nodes
  Usability criteria: Reuse; Intuitive; Consistency; Accessibility
  Additional comments: Is it intuitive for the users to distinguish on the menu between the links to the forms for new and updated documents?

Task 3: Search for a document
  Activity: Enter the search term for a particular document type. (Titles to be given on the day of the trials.)
  Design criteria: Search; Security; Menus; Sidebar; Links; Nodes
  Usability criteria: Accessibility; Orientation; Intuitive
  Additional comments: Can the users find the documents that they have permission for? Also checks the error messages shown to users trying to view unauthorised documents.

Task 4: Edit document details
  Activity: You are to edit the subtitle of the set of minutes. (A number of document titles will be given to the user on the day of the trial.)
  Design criteria: Forms; Security; Menus; Sidebar; Links; Nodes
  Usability criteria: Reuse; Intuitive; Consistency; Accessibility
  Additional comments: Can the user navigate back to, or find, the correct document and edit its information?

Task 5: Hyperlinks
  Activity: Follow the initials of a person on the minutes. (Initials to be given on the day of the trials.)
  Design criteria: Clusters; Security; Menus; Sidebar; Links; Nodes
  Usability criteria: Accessibility; Orientation; Intuitive; Consistency
  Additional comments: Has the automatically generated clustering facility worked?

Table 2-2 A list of the tasks for the expert reviews and timed trials.



    2.6 Interviews and questionnaires.

In this research, both flexible interviews and questionnaires were used to gather and analyse qualitative data, that is, the users' opinions. The questionnaires were designed to measure the scales of Impression, Command, Effectiveness, Learnability, Helpfulness and Navigability (see Table 2-3 for definitions). The first five scales are based on the definitions used in the Software Usability Measurement Inventory (SUMI) [Hirst 95]. The last, Navigability, is a scale added to measure an area specific to hypermedia.

Scale          Definition
Impression     The user's feelings or emotions when using the software.
Command        The measure to which the user feels that they are in control.
Effectiveness  The degree to which the user feels that they can complete the task while using the system.
Learnability   The degree to which the user feels that the application is easy to become familiar with.
Helpfulness    The degree to which the application assists the user to resolve a situation.
Navigability   The degree to which the user can move around and through the information space.

Table 2-3 Questionnaire scales and definitions.

The questions used to measure the criteria in Table 2-3 can be mapped onto the Technology Acceptance Model (TAM) [Davis 93, Davis 96]. TAM is used to measure the user's intention to use, perceived usefulness, and perceived ease of use of a system.

Two of the questions used to measure the Impression criterion were mapped onto intention to use; all the questions used to measure the Effectiveness criterion were mapped onto perceived usefulness; and perceived ease of use was mapped from all the questions in Command and Learnability.
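
This mapping can be written down directly; a minimal sketch (in Python) follows, in which the question identifiers are hypothetical labels for the Appendix A items rather than names used in this report.

    # A sketch of the TAM mapping described above. The question identifiers
    # are hypothetical labels for the questionnaire items in Appendix A.
    TAM_MAPPING = {
        "intention_to_use":      ["impression_q2", "impression_q3"],  # two Impression items
        "perceived_usefulness":  [f"effectiveness_q{i}" for i in range(1, 7)],
        "perceived_ease_of_use": [f"command_q{i}" for i in range(1, 7)]
                                 + [f"learnability_q{i}" for i in range(1, 7)],
    }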

The questions were designed by the authors to reflect the specific application domain. The questions on the questionnaire were arranged so that answering them by putting a cross down a single column would produce a score of zero. The users were forced to make a choice, as there was no neutral answer. Because the users were forced to make a choice with each question, the number of questions was limited; see Appendix A for the full questionnaire.
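
The scoring scheme is illustrated by the short sketch below. The weights are those shown in Appendix A; the four-question example is hypothetical and simply shows how a straight column of crosses cancels out when positively and negatively phrased questions are balanced.

    # A minimal sketch of the questionnaire scoring described above.
    # Weights follow Appendix A: positively phrased questions score
    # -2..+2 from Strongly Disagree to Strongly Agree; negatively
    # phrased questions have the polarity reversed.
    POSITIVE = {"strongly disagree": -2, "disagree": -1, "agree": 1, "strongly agree": 2}
    NEGATIVE = {answer: -score for answer, score in POSITIVE.items()}

    def score(polarity: str, answer: str) -> int:
        table = POSITIVE if polarity == "positive" else NEGATIVE
        return table[answer.lower()]

    # Hypothetical balanced set of questions: crossing the same column
    # throughout ("Agree" here) sums to zero, as described in the text.
    polarities = ["positive", "negative", "positive", "negative"]
    print(sum(score(p, "Agree") for p in polarities))  # -> 0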



    3. Results from the AIMS User Evaluation.

    3.1 Contextual Review.

During the evaluation, many if not all of the secretarial PCs were swapping memory to disk, indicating that these machines do not have enough memory and, in some cases, not enough disk space. This may be the result of upgrading to Windows NT 4 and Office 97.

    Sharing information.

One of the secretaries does not type minutes; her role is to enter the minutes into AIMS from a floppy disc. This highlights some of the problems in sharing information in ECS. While there are workgroups set up for each group, the secretaries are not trained in how to share information through workgroups. Therefore, some of the secretaries share information by e-mailing each other, or by copying files onto floppies and passing them to each other. In addition, as there is no enforced departmental standardisation of word-processing packages, sharing information can be difficult.

Archiving information.

The main method of archiving is on paper. Some secretarial staff keep their own electronic archive, normally on their hard drive, lasting only a couple of years (approximately six sets of minutes); older files are deleted or used as templates for new ones.

In contrast, in the Electrical Engineering Department each of the secretarial machines has an initial predefined directory structure, so that administrative information can be easily located and archived.

Distribution of Minutes.

The minutes and any accompanying material are photocopied and mailed to all members of the committee. However, academic staff frequently ask for another copy just before the next meeting. On the surface this seems a straightforward process; however, this is not always the case, for example with the Board Minutes or Academic Committee Minutes, where there can be a considerable amount of additional material. This additional material is often only available to the secretaries in paper format, and photocopying and collating the information can be very time consuming. Photocopying and collating the documents, putting them into envelopes, sticking on the address labels (which have previously been printed) and then walking them to each group takes between six and eight hours.

    Use of the Campus network.

The secretaries mainly use the network for e-mail, for finding people's e-mail addresses, and for general web browsing (outside working hours).



    3.2 Expert Review

The reviewers were Dr L Carr and Dr K Martinez from the Multimedia Research Group at the University of Southampton. The expert reviewers have considerable experience of HCI and user interface design. The following subsections present the comments from the expert reviewers.

3.2.1 Reviewers' Comments Scenario: Task 1 Enter a new document

3.2.2 Reviewers' Comments Scenario: Task 2 Update a document

3.2.3 Reviewers' Comments Scenario: Task 3 Search for a document

3.2.4 Reviewers' Comments Scenario: Task 4 Edit document details

3.2.5 Expert Scenario: Task 5 Following links with WebCosm

3.2.6 General comments from reviewers

    3.3 User Trials

    3.3.1 General comments raised by the users.

    Comments raised by the users during the evaluation were:

    3.3.2 User Questionnaires

A list of the questions, showing the number of users agreeing or disagreeing and the number who did not answer, is given in Table 3-1.

    Comments from questionnaire.

Very few additional comments were made. Those that were made came from users who had been at the university for approximately one year.

    Impression- user's feelings or emotions when using the software.

    Command - the measure to which the user feels that they are in control:

    Navigability - the degree to which the user can move around the application.

    Learnability - the degree to which the user feels that the application is easy to become familiar with.

    Helpfulness - the degree to which the application assists the user to resolve a situation

    Effectiveness - the degree to which the user feels that they can complete the task while using the system.

    Other comments.

(Columns: D = strongly disagreed or disagreed; A = strongly agreed or agreed; NR = no response)

                                                                                  D   A   NR
Impression
  I found the AIMS system awkward to use.                                         7   1   1
  The system is one that I would want to use on a regular basis.                  1   6   1
  I enjoyed working with the AIMS system.                                         1   6   1
  I would not recommend the AIMS system to my colleagues.                         8   0   0

Command
  I was unsure if I was using the right command.                                  3   4   1
  I felt that I was in control when using the AIMS system.                        2   4   2
  The system was responsive to my inputs.                                         0   6   2
  I found the interaction with AIMS cumbersome.                                   5   1   2
  The AIMS system reacted quickly enough to my selections.                        0   7   1
  I found it easy to make AIMS do what I needed it to do.                         1   6   1

Navigability
  In AIMS there are too many steps required to get to the information I needed.   7   1   0
  I was able to move around the information in AIMS easily.                       3   5   0
  The AIMS side toolbar provided useful short cuts.                               0   7   1
  I found the AIMS menus and content pages useful.                                1   7   0
  I often became lost/disoriented when using AIMS.                                8   0   0
  There were plenty of ways to find the information I needed.                     1   7   0
  The actions associated with the options on the side toolbar were easily
  understood.                                                                     2   5   1
  The information displayed was inconsistent.                                     7   1   1
  I did not understand the icons in the menus.                                    6   2   0
  The screen became cluttered and confusing.                                      6   1   1

Learnability
  Learning to use the system was easy.                                            2   6   0
  I did not have enough time to learn to use the AIMS system.                     6   2   0
  The AIMS system was difficult to learn to use.                                  7   1   1
  It was difficult to learn more than the basic functions of the AIMS system.     7   1   1
  Enough guidance was given before using AIMS.                                    0   7   1
  I felt at ease trying different ways to get to the information I needed.        1   6   1

Helpfulness
  The error messages were not easy to understand.                                 3   0   5
  There was not enough information on how to respond/proceed to error messages.   3   0   5
  The system was awkward to use if I wanted to do something out of the ordinary.  1   1   6
  The system messages were helpful when coping with an error.                     0   3   5
  I understood and was able to act on the messages provided by this system.       0   2   6
  The system help files provided enough information to use the system.            0   2   6

Effectiveness
  Using AIMS would NOT be of use to me in my job.                                 7   1   0
  Using AIMS would get in the way of the task I was undertaking.                  7   1   0
  When using AIMS I found it difficult to obtain the information I needed.        8   0   0
  Using AIMS will enable me to do my job effectively.                             3   5   0
  When using AIMS it is straightforward to get to the information I needed.       0   7   1
  Using AIMS allows me to accomplish the task more quickly.                       1   5   2

Note: shading in the original table indicated a negative response.

Table 3-1 Questions and user responses, by category.



Scores from the Questionnaires.

The questions were grouped under the appropriate criteria; the scores are shown in Table 3-2, which also shows the normalised scores. The scores were normalised by dividing the score by the number of respondents and the number of questions they answered. The maximum normalised score is +1, meaning the users very strongly agreed with all positively phrased questions (and -1 means they very strongly disagreed, and vice versa for the negatively phrased questions). A score of 0.5 indicates that all the users agreed with a positive statement, i.e. they agreed with 'The AIMS system is one that I would want to use on a regular basis.'
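
One reading of this normalisation, consistent with the figures in Tables 3-1 and 3-2, is sketched below: each answered question contributes between -2 and +2, so dividing the summed score by twice the number of question-responses maps it onto the -1 to +1 range. The exact formula is an assumption here, as it is not stated in this form above.

    # A minimal sketch of the normalisation described above (the exact
    # formula is an assumption, but it reproduces the published figures).
    def normalise(total_score: float, num_responses: int) -> float:
        return total_score / (2 * num_responses)

    # Impression: 8 users x 4 questions with 3 non-responses (Table 3-1)
    # gives 29 answered questions; the summed score in Table 3-2 is 29.
    print(normalise(29, 29))  # -> 0.5, the normalised score in Table 3-2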

Criteria        Score   Average (SD)   Median   Normalised Score   Possible Range +/-
Impression        29     3.6  (1.6)      4.0         0.50                 64
Effectiveness     41     3.6  (3.3)      4.0         0.46                 96
Learnability      39     8.3  (5.6)     10.0         0.44                 96
Navigability      66     6.0  (3.8)      6.0         0.43                160
Helpfulness       13     1.6  (2.4)      0.0         0.43                 96
Command           29     5.1  (3.8)      6.0         0.37                 96
Overall          217    27.1 (15.5)     32.0         0.44                576

Table 3-2 Scores from the questionnaire.

    The TAM criteria scores are shown in Table 3-3.

TAM Criteria                                        Score   Normalised Score
Intention to use                                      17         0.57
Perceived Usefulness (Effectiveness)                  41         0.46
Perceived Ease of Use (Command and Learnability)      80         0.41

Table 3-3 Technology Acceptance Model scores.

The questionnaire also showed that the scores given were not affected by the users' access to the Internet from home, or by their preference for using the Internet. However, the overall scores from those users who had been at the university for approximately a year were lower than those from users who had been working at the university for a longer period; see Table 3-4. In addition, all of these users were over 35 years old; however, other personnel in this age bracket rated the system highly.

Length of service             No. of Users   Overall Score (normalised)   Mean (SD)     Median
Service less than 1.1 years        3              34 (0.20)               11.3 (13.8)      6
Service 1.1 years or more          5             183 (0.55)               36.6  (5.3)     37

Table 3-4 The scores according to length of service.

The difference in length of service also affected the perceived usefulness and ease of use; see Table 3-5. The normalised scores are calculated in the same way as for Table 3-2.

Length of service             Average Score      Average Score             Average Score
                              Intention to Use   Perceived Effectiveness   Perceived Usefulness
Service less than 1.1 years        0.4                 0.23                     0.13
Service 1.1 years or more          0.7                 0.57                     0.53

Table 3-5 The effect of length of service on user perception (normalised scores).
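
The length-of-service split behind Tables 3-4 and 3-5 amounts to a simple grouping, sketched below; the user records shown are hypothetical placeholders rather than the actual trial data.

    # A sketch of the length-of-service grouping used in Tables 3-4 and 3-5;
    # the records below are hypothetical, not the real respondents.
    from statistics import mean, median

    users = [
        {"service_years": 0.8, "overall_score": 6},
        {"service_years": 1.0, "overall_score": 11},
        {"service_years": 2.5, "overall_score": 37},
        {"service_years": 6.0, "overall_score": 36},
    ]

    for label, group in (
        ("less than 1.1 years", [u for u in users if u["service_years"] < 1.1]),
        ("1.1 years or more",   [u for u in users if u["service_years"] >= 1.1]),
    ):
        scores = [u["overall_score"] for u in group]
        print(label, sum(scores), round(mean(scores), 1), median(scores))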



    3.3.3 Time taken to complete tasks.

    The average time to complete the tasks is shown in Table 3-6.

Task                                    Average time, seconds (SD)   Median
Task 1 Enter a new document                    98.6 (10.0)            98.5
Task 2 Enter the next set of minutes           60.8  (7.7)            63.5
Task 3 Edit document details                   33.0  (6.3)            35.0
Task 4 Search for a document                   40.9 (12.0)            43.5
Task 5 Using Webcosm                           31.5  (8.3)            32.0

Table 3-6 Average times to complete the tasks.
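
The summary statistics in Table 3-6 are straightforward to compute from the raw timings; a minimal sketch follows, with hypothetical sample times rather than the recorded ones.

    # A sketch of how the Table 3-6 statistics could be derived from raw
    # task timings; the sample values below are illustrative only.
    from statistics import mean, median, stdev

    task1_times = [88.0, 92.0, 95.0, 98.0, 99.0, 100.0, 105.0, 112.0]  # seconds
    print(f"mean={mean(task1_times):.1f}s  "
          f"sd={stdev(task1_times):.1f}  median={median(task1_times):.1f}")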



4. Discussion.

The general comments from the users showed that they were interested in the idea of a central repository for administrative information. This was reflected in the high scores from the questionnaires for the Impression category (Table 3-2) and the intention-to-use category from the TAM criteria (Table 3-3). All the users replied that they would recommend the system to their colleagues, and seven of the eight users said that they would want to use the system regularly.

However, nearly all of them could not envision its use beyond the current usage. In addition, there appears to be little encouragement from management or academic staff to use the system beyond its current scope; this will need to be driven from above, with an active programme by senior staff. Furthermore, the staff felt that they needed more practice in using the system.

The lack of memory in the PCs hinders the progress of many tasks; this, coupled with the lack of IT training, especially in the area of sharing information, impedes the efficiency of the secretarial staff.

The users generally thought that the AIMS system would be of use in their work, as reflected in the Effectiveness/Perceived Usefulness criteria scores shown in Table 3-2 and Table 3-3. Similarly, these tables show that the majority of the users found the system easy to use, represented by the Command and Learnability categories. The low score in the Helpfulness category was because not all the users experienced problems during the evaluation (see Table 3-1); hence some of them felt that they were unable to comment on this section.

The results from the questionnaires showed that the majority of users found that they could navigate the information space easily; see Table 3-1 and Table 3-2.

The users' perception of the system varied with their length of service within the university; see Table 3-4 and Table 3-5. Although new to the university, these users were experienced secretarial staff. Nevertheless, their perception, especially of the usefulness of the system, differed significantly from that of staff who had been at the university for some time.

The times taken to complete the tasks were generally consistent throughout the user group. The obvious saving of time comes from the ability to publish a set of minutes extremely quickly compared with the current method. The ability to disseminate information quickly would be of benefit to the department. However, this will only be an advantage if there is a paradigm shift: academic members of staff will need to read the minutes on line or, as a last resort, print them off to read.



5. Conclusions

As a method of archiving and disseminating academic administrative information, AIMS is superior to the method presently used: it is faster at disseminating information, and a wider set of documentation is more easily accessible to all secretarial staff.

Generally, the users' perception is that the system would be useful to them in performing their job, and they intend to use the system. However, this perception was weaker among those who had worked at the university for a relatively short period of time.

There is a requirement for a programme of expansion of the current document set used in AIMS. In addition, a review of the computing administrative support and policy for secretarial staff is required if they are to continue to operate and share information efficiently.

     

    Acknowledgement

The authors acknowledge the Joint Information Systems Committee (JISC) for funding the work under project number 462.



    References

[Davis 93] Davis FD. User acceptance of information technology: system characteristics, user perceptions and behavioural impacts. International Journal of Man-Machine Studies (1993) Vol. 38, pp. 475-487.
[Davis 96] Davis FD. A critical assessment of potential measurement biases in the technology acceptance model: three experiments. International Journal of Human-Computer Studies (1996) Vol. 45, pp. 19-45.
[Dix 98] Dix AJ, Finlay JE, Abowd GD, Beale R. Human-Computer Interaction. Prentice Hall, 1998.
[Garzotto 97] Garzotto F, Paolini P. Systematic Evaluation of Hypermedia Applications. Tutorial at the Eighth ACM Conference on Hypertext, HT'97, Southampton, UK, 9-11 April 1997.
[Hirst 95] Hirst SJ, Corthouts J. Hyperlib Deliverable 5.1: Evaluation of the Hyperlib Interfaces. Hyperlib Electronic Document Store, University of Antwerp, 1995. Available at http://143.169.20.1/MAN/WP51/root.html
[Molich 90] Molich R, Nielsen J. Improving a Human-Computer Dialogue. Communications of the ACM, Vol. 33, No. 3, March 1990, pp. 338-348.
[Nielsen 89] Nielsen J. Usability Engineering at a Discount. In: Designing and Using Human-Computer Interfaces and Knowledge Based Systems (Salvendy G & Smith MJ, eds.). Elsevier, 1989, pp. 394-401.
[Nielsen 92] Nielsen J. Finding Usability Problems Through Heuristic Evaluation. In: Human Factors in Computing Systems, CHI'92 Conference Proceedings (Bauersfeld P, et al., eds.), pp. 373-380. New York: ACM Press, 1992.
[Nielsen 94] Nielsen J. Enhancing the Explanatory Power of Usability Heuristics. In: Human Factors in Computing Systems, CHI'94, Boston, Massachusetts, USA, April 24-28, 1994, pp. 152-158.
[Preece 93] Preece J, et al. A Guide to Usability: Human Factors in Computing. Addison-Wesley, The Open University, 1993.
[Preece 94] Preece J, Rogers Y, Sharp H, Benyon D, Holland S, Carey T. Human-Computer Interaction. Addison-Wesley, 1994, Chapters 29-34.
[Wills 97] Wills GB, Heath I, Crowder RM, Hall W. Evaluation of a User Interface Developed for Industrial Applications. University of Southampton Technical Report No. M97-4, ISBN 085432 6901. Available at http://www.mmrg.ecs.soton.ac.uk/publications.html
[Wills 99] Wills GB, Heath I, Crowder RM, Hall W. User Evaluation of an Industrial Hypermedia Application. University of Southampton Technical Report No. 99-3, March 1999, ISBN 0854326723. Available at http://www.mmrg.ecs.soton.ac.uk/publications.html
[Yamada 95] Yamada S, Hong JK, Sugita S. Development and Evaluation of Hypermedia for Museum Education: Validation of Metrics. ACM Transactions on Computer-Human Interaction, Vol. 2, No. 4, December 1995, pp. 284-307.



    Appendix A AIMS Questionnaire

    Master Questionnaire, used to aid data entry

    Name: ...........................................................e-mail name......................

    User status: Missing Value 9

    Administrator Secretary Other
    1 2 3

    Age:

    24 or younger 25-34 35-44 45-54 55 or older
    1 2 3 4 5

    Approximate Length of service within University: ..............Years...........Months

    e.g. 0 years 4 months

    Experience of using the Internet/World Wide Web:

    Do you use the Web at home:

    Daily Weekly Once a Month Rarely Never
    5 4 3 2 1

    Do you use the Web at work (not just for work related information):

    Daily Weekly Once a Month Rarely Never
    5 4 3 2 1
             
    Do you find the: Agree Disagree Neither
    high-tech route to information is intimidating 3 2 1
    paper-based system easier to use 3 2 1
    paper-based system a more effective means of sharing information 3 2 1
    Why do you use the Network/Web? Agree Disagree Neither
    It is the only method available to retrieve the information 3 2 1
    Convenience 3 2 1
    Easier searching 3 2 1
    Speed to retrieve information 3 2 1
    Easier to use 3 2 1
    Prefer to use 3 2 1

Do you use ftp to transfer files?

    Yes 3 No 2 Do not Know 1

Do you use an Internet search engine? e.g. Yahoo, Alta Vista, etc.

    Yes 3 No 2 Do not Know 1

     



Impression - the user's feelings or emotions when using the software. (Perceived Intention to Use)
(Scores: Strongly Disagree / Disagree / Agree / Strongly Agree)

  I found the AIMS system awkward to use.                                     2   1  -1  -2
  The system is one that I would want to use on a regular basis.             -2  -1   1   2
  I enjoyed working with the AIMS system.                                    -2  -1   1   2
  I would not recommend the AIMS system to my colleagues.                     2   1  -1  -2

Additional comments about your feelings or emotions when using the software:-

Command - the measure to which the user feels that they are in control. (Perceived Ease of Use)
(Scores: Strongly Disagree / Disagree / Agree / Strongly Agree)

  I was unsure if I was using the right command.                              2   1  -1  -2
  I felt that I was in control when using the AIMS system.                   -2  -1   1   2
  The system was responsive to my inputs.                                    -2  -1   1   2
  I found the interaction with AIMS cumbersome.                               2   1  -1  -2
  The AIMS system reacted quickly enough to my selections.                   -2  -1   1   2
  I found it easy to make AIMS do what I needed it to do.                    -2  -1   1   2

Additional comments about whether you felt in control:-

Navigability - the degree to which the user can move around the application.
(Scores: Strongly Disagree / Disagree / Agree / Strongly Agree)

  In AIMS there are too many steps required to get to the information
  I needed.                                                                   2   1  -1  -2
  I was able to move around the information in AIMS easily.                  -2  -1   1   2
  The AIMS side toolbar provided useful short cuts.                          -2  -1   1   2
  I found the AIMS menus and content pages useful.                           -2  -1   1   2
  I often became lost/disoriented when using AIMS.                            2   1  -1  -2
  There were plenty of ways to find the information I needed.                -2  -1   1   2
  The actions associated with the options on the side toolbar were easily
  understood.                                                                -2  -1   1   2
  The information displayed was inconsistent.                                 2   1  -1  -2
  I did not understand the icons in the menus.                                2   1  -1  -2
  The screen became cluttered and confusing.                                  2   1  -1  -2

Additional comments on how easy you found it to locate the information:-

Learnability - the degree to which the user feels that the application is easy to become familiar with.
(Scores: Strongly Disagree / Disagree / Agree / Strongly Agree)

  Learning to use the system was easy.                                       -2  -1   1   2
  I did not have enough time to learn to use the AIMS system.                 2   1  -1  -2
  The AIMS system was difficult to learn to use.                              2   1  -1  -2
  It was difficult to learn more than the basic functions of the
  AIMS system.                                                                2   1  -1  -2
  Enough guidance was given before using AIMS.                               -2  -1   1   2
  I felt at ease trying different ways to get to the information I needed.   -2  -1   1   2

Additional comments about how easy you felt the software was to become familiar with:-

Helpfulness - the degree to which the application assists the user to resolve a situation.
(Scores: Strongly Disagree / Disagree / Agree / Strongly Agree)

  The error messages were not easy to understand.                             2   1  -1  -2
  There was not enough information on how to respond/proceed to error
  messages.                                                                   2   1  -1  -2
  The system was awkward to use if I wanted to do something out of the
  ordinary.                                                                   2   1  -1  -2
  The system messages were helpful when coping with an error.                -2  -1   1   2
  I understood and was able to act on the messages provided by this system.  -2  -1   1   2
  The system help files provided enough information to use the system.       -2  -1   1   2

Additional comments about how helpful the system is in assisting you to resolve a situation:-

Effectiveness - the degree to which the user feels that they can complete the task while using the system. (Perceived Usefulness)
(Scores: Strongly Disagree / Disagree / Agree / Strongly Agree)

  Using AIMS would NOT be of use to me in my job.                             2   1  -1  -2
  Using AIMS would get in the way of the task I was undertaking.              2   1  -1  -2
  When using AIMS I found it difficult to obtain the information I needed.    2   1  -1  -2
  Using AIMS will enable me to do my job effectively.                        -2  -1   1   2
  When using AIMS it is straightforward to get to the information I needed.  -2  -1   1   2
  Using AIMS allows me to accomplish the task more quickly.                  -2  -1   1   2

Additional comments about how effective you feel the software was:-
     

Any other additional comments you wish to make?
