Evaluation of a User Interface Developed for Industrial Applications.

Wills G.B., Heath I., Crowder R.M., Hall W.

Abstract

The user interface on factory floor computers is in most cases limited to a fixed-page, non-window format. While this approach is acceptable for data entry, it cannot be used for multimedia applications such as those envisaged in current projects. As reported previously [1,2], we have demonstrated that the use of hypermedia in a factory floor environment is a viable solution to information dissemination. We noted, however, that acceptance of the approach was dictated by the design of the user interface. As part of our current work, we have conducted a formal evaluation of the initial user interface developed for the Pirelli Cables, Aberdare application. The study was conducted using a discount usability method of evaluation. The results are summarised in this report and will provide guidelines for the development of user interfaces for industrial hypermedia applications. The report considers approaches to enhance the evaluation process, and concludes with a review of user interface requirements for industrial hypermedia applications.

Multimedia Research Group Technical report No M97-4

ISBN-0854326499

Copyright © 1997, University of Southampton. All rights reserved.


Contents.

1. Introduction.

2. The Project.

3. Evaluation Method Used

3.1 The procedure used.

4. Results

5. Comments on the procedures used for evaluation trials

6. Observations Relating to Industrial User Interfaces

6.1 Fixed Window Layouts.

6.2 Screen Management for a flexible environment.

7. Conclusions

References

Appendix A. Form For Expert Review.


1. Introduction.

The use of computers on the factory floor has been limited to data entry terminals based on a fixed-page, non-windows format. While there has been a move to using Programmable Logic Controllers (PLCs) with multi-colour displays, only data is displayed, and it is limited to fixed-page screen formats. This contrasts sharply with the trend towards networked PCs and workstations within the office environment. More recently, complex systems have been supplied with electronic manuals to aid fault finding [3]. However, the screen layout is still fixed-page and the information is limited to a narrow data set.

The Aberdare case study was used to demonstrate the use of hypermedia in a factory floor environment [1,2]. Following its success, a further project, FIRM (Factory Information Resource Management), is now underway to apply hypermedia across the whole factory [4]. Before the requirements for that application's user interface were specified, a formal evaluation of the previous project's user interface was considered necessary. The aims of the work detailed in this report are:

This report presents the methodology used and discusses the results obtained. Finally, a set of general rules for user interfaces for industrial hypermedia applications is presented.



2. The Project.

One of the main aims of the initial project, Multimedia Systems as an Interface to Advanced Manufacturing Technology, funded by the EPSRC, was:

To demonstrate that the concepts of the project were viable, a case study was undertaken. The study was centred on the cable packaging line at Pirelli Cables, Aberdare, with particular emphasis on maintenance and operator set-up procedures.

It was envisaged that the developed application would be used by the operators and maintainers of the manufacturing process line. The operators would use it as an aid to setting up the process line and for a limited amount of training. The maintenance technicians would use the system to support maintenance procedures, together with line re-commissioning.

Image of the Pirelli process line

Figure 1 User Interface for the Operator Set-up

The developed application used a pen-based portable computer, with a screen resolution of 640 x 480, to deliver the information. The users found this easy and natural to use [5]. To select a link the user circled the object on the screen with the pen, and by tapping the pen on the screen the user was able to select a menu action.

The hypermedia application developed for Pirelli Cables, Aberdare used an interface layer on top of the hypermedia system. A hypermedia system is one in which the concepts of hypertext are applied to multimedia information resources. Hypertext is a term coined by Ted Nelson in the 1960s; it was applied to unstructured text, with associations between pieces of text being made with links. In practice, hypermedia allows associations to be made between different types of media, for example text, video, digitised photographs, databases, engineering drawings and spreadsheets.

Microcosm is a commercial hypermedia software package used in the case study. Microcosm was developed at the University of Southampton and is an open hypermedia system [6] that separates the links from the information resources (text documents, video clips, photographs, tables, technical drawings, audio files, etc.). This enables the documents to be stored in different locations, in their original format. Microcosm allows the user to navigate through the document resource base using a number of different link mechanisms. The links can be arranged in link databases (linkbases) to represent different cognitive and pedagogical structures. The user's requests for actions (new document, follow link, etc.) can come from the context of the display, a menu selection or a button on a toolbar.
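To make the separation of links from documents concrete, the following sketch shows how a selection in one document might be resolved against a separately stored linkbase. It is an illustration only, written in Python with hypothetical names and document paths; it does not reflect Microcosm's actual implementation.

    # Minimal sketch of an "open hypermedia" linkbase: the links are stored
    # apart from the documents they connect, so the documents stay in their
    # original format. All names here are illustrative, not Microcosm's API.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Link:
        selection: str             # text or object the user selects, e.g. "gearbox"
        source_doc: Optional[str]  # None means a generic link, valid in any document
        target_doc: str            # document to open: text, video, drawing, ...

    class Linkbase:
        def __init__(self, links):
            self.links = list(links)

        def follow(self, selection, current_doc):
            """Return the target documents linked from a selection in the current document."""
            sel = selection.lower()
            return [l.target_doc for l in self.links
                    if l.selection == sel and l.source_doc in (None, current_doc)]

    # A generic link on "gearbox" works from any document; a specific link
    # applies only within the named source document.
    lb = Linkbase([
        Link("gearbox", None, "drawings/gearbox_assembly.dwg"),
        Link("gearbox", "manuals/r2_setup.rtf", "video/gearbox_removal.avi"),
    ])
    print(lb.follow("Gearbox", "manuals/r2_setup.rtf"))
    # -> ['drawings/gearbox_assembly.dwg', 'video/gearbox_removal.avi']

Because the documents themselves carry no link markup, a different linkbase (for example an operator linkbase rather than a maintainer one) could in principle be substituted without touching the information resources.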

This interface layer consisted of an image taken from a CAD model of the production line (Figure 1) and a toolbar (Figure 2). This still allowed the underlying functionality of Microcosm to be used; however, in practice, some of the Microcosm options were masked from the user.

Image of the Toolbar

Figure 2 Tool Bar

The buttons on the toolbar allow the user to access Microcosm functionality. The toolbar incorporates the following features:

Image of select a document Window

Figure 3 Microcosm's Select A Document Window

3. Evaluation Method Used.

Evaluation of the user interface must be considered an integral part of the hypermedia application development process. Hence, the evaluation should not be limited to the start and/or end of the development cycle; additional user evaluations can be undertaken during the development cycle. The evaluation process is generally concerned with gathering information about the usability or potential usability of a system, in order to improve features within an interface and its supporting material, or to assess a complete interface [7].

The aim of an HCI evaluation is to assess the following [8]:

In addition, this evaluation was undertaken to provide information from which future user interfaces can be developed, to ensure that any previous errors in presentation and structure are not repeated.

As this evaluation was to be carried out on a completed project, the option of using personnel from Pirelli Cables, Aberdare was not available. This ruled out traditional experimental and usability testing [6] as an evaluation method, and another method of evaluation was sought. Considering the skills and number of the people available to carry out the evaluation, some form of structured expert reviewing [9] was favoured, in particular Discount Usability Evaluation, first proposed by Nielsen [10,11,12]. The procedure combines scenarios, thinking aloud and heuristic evaluation, with any problems found being addressed through a design-evaluate-redesign cycle [9]. The whole cycle needs only a few reviewers, as using more reviewers produces no significant additional benefit. The features that result in important discounts include:

The nine guidelines suggested by Nielsen for the heuristic evaluation are listed below; a sketch of how review observations might be recorded against them follows the list:

  1. Use simple and natural dialogue.
  2. Speak the user’s language.
  3. Minimise user memory load.
  4. Be consistent.
  5. Provide feedback.
  6. Provide clearly marked exits.
  7. Provide shortcuts.
  8. Provide good error messages.
  9. Prevent errors.
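As a minimal illustration of how observations from such a review might be recorded and tallied against these heuristics, the sketch below uses a hypothetical structure written in Python; the review itself used the paper form reproduced in Appendix A.

    # Sketch of recording expert-review observations against the usability
    # heuristics. The structure is purely illustrative; the actual review
    # used the paper form reproduced in Appendix A.

    from collections import Counter

    HEURISTICS = [
        "Use simple and natural dialogue", "Speak the user's language",
        "Minimise user memory load", "Be consistent", "Provide feedback",
        "Provide clearly marked exits", "Provide shortcuts",
        "Provide good error messages", "Prevent errors",
        "Provide navigational aids",   # additional principle proposed in Section 4
    ]

    def record(observations, scenario, heuristic, comment, positive=False):
        """Append one reviewer comment, tagged by scenario and heuristic."""
        assert heuristic in HEURISTICS, f"Unknown heuristic: {heuristic}"
        observations.append({"scenario": scenario, "heuristic": heuristic,
                             "comment": comment, "positive": positive})

    def summarise(observations):
        """Count the problems (not the positive comments) raised per heuristic."""
        return Counter(o["heuristic"] for o in observations if not o["positive"])

    obs = []
    record(obs, 1, "Be consistent",
           "Operator and maintainer options are not distinguished")
    record(obs, 1, "Provide navigational aids",
           "No visible history of the documents already visited")
    record(obs, 2, "Provide shortcuts",
           "Toolbar gives quick access to the resource base", positive=True)
    print(summarise(obs))
    # -> Counter({'Be consistent': 1, 'Provide navigational aids': 1})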

3.1 The procedure used.

The review consisted of four expert reviewers, divided into two groups.

Group 1:

Group 2:

None of the reviewers had visited the production line or been briefed on the details of its operation. Hence, only the user interface aspects of the human-computer interaction were evaluated. The reviewers were given the following scenarios, one at a time.

Operator Scenarios.

* This refers to the individual elements of the cable packaging line

Maintenance Scenarios. 

The expert reviewers explored the presentation of the information required to carry out the tasks. The scenarios were chosen to ensure that the reviewers would visit the different user interfaces of the system. The reviewers were asked to examine the flow of information from screen to screen, and on each screen to evaluate the HCI in accordance with the principles set out in Nielsen's Discount Usability Engineering.

Any situation encountered (any limitations, possible confusions or shortcomings in the HCI of the system) and any ideas for improvement were to be written down for later analysis and discussion. While the reporting system used was classified as 'unstructured reporting' [8], a predefined form was used to record the observations. The form clearly stated the number and title of the scenario to be undertaken (see Appendix A for an example of the form used). To ensure the rest of the group was aware of what was going on, a think-aloud approach was encouraged. In addition, to ensure that good aspects of the design were carried forward into future projects, the reviewers were asked to comment on the positive as well as the negative aspects of the HCI design.



4. Results.

The nine essential principles suggested by Nielsen, described above, give a framework for good human-computer interaction. Since these general principles were first put forward, there has been a rapid growth in Graphical User Interfaces (GUIs); Nielsen's original work was on text-based user interfaces. While style guides for general-purpose GUIs exist, this report evaluates the specific application of the Microcosm industrial user interface. Many of the points raised by the two groups (95 in total) are unique to the application, and would have been fed back to the original design team had the project not ceased. However, they can be summarised and grouped under the nine principles, amplifying the principles given by Nielsen and making them applicable to any industrial hypermedia application using Microcosm. The results are summarised below: the bold headings are Nielsen's principles and the bullet points summarise the usability problems found.

Use simple and natural dialogue.

Speak the user’s language.

Minimise user memory load.

Be consistent.

  1. There should be a distinction between the operators’ and maintainers’ requirements.
  2. The options should not appear if the user is not able to select those options.

Provide feedback.

Provide Clearly Marked exits.

Provide shortcuts.

Provide good error messages.

Prevent errors.

An area that is particularly applicable to hypermedia and not covered above is navigation through the resource base. Therefore, another principle needs to be added:

Provide Navigational Aids.

While navigating a large information space, it is quite easy to get lost in hyperspace, even with a good memory. Sellen and Nicol [13] explain that users make mental maps to help them navigate; therefore, the system should make these maps explicit. They also point out that the maps will take differing forms depending on the hypermedia application. Sellen and Nicol called this 'reducing the memory load', which was one of Nielsen's principles. However, in both cases navigation was only a minor aspect of the user interface; in hypermedia it becomes a major principle. Therefore, reducing the memory load is not sufficient: the system must provide the user with utilities to navigate the information space without getting lost. Hence, any evaluation of a large hypermedia system needs to explicitly assess the level to which the system provides navigational aids, in order to reduce the possibility of getting lost in hyperspace.
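One way of making such a map explicit is to keep a visible trail of the documents the user has visited and the links followed. The sketch below illustrates the idea in Python with hypothetical names; it does not describe Microcosm's own history facility.

    # Sketch of a simple navigational aid: an explicit trail of visited
    # documents that can be displayed to the user and used to jump back.
    # Hypothetical design, not Microcosm's actual history mechanism.

    class NavigationTrail:
        def __init__(self):
            self._trail = []          # ordered list of (document, link_followed)

        def visit(self, document, via_link=None):
            self._trail.append((document, via_link))

        def breadcrumb(self):
            """Return a human-readable trail, e.g. for display in a side panel."""
            return " > ".join(doc for doc, _ in self._trail)

        def back(self):
            """Drop the current document and return the previous one, if any."""
            if not self._trail:
                return None
            if len(self._trail) > 1:
                self._trail.pop()
            return self._trail[-1][0]

    trail = NavigationTrail()
    trail.visit("Line overview (3D model)")
    trail.visit("R2 set-up procedure", via_link="R2")
    trail.visit("Gearbox drawing", via_link="gearbox")
    print(trail.breadcrumb())  # Line overview (3D model) > R2 set-up procedure > Gearbox drawing
    print(trail.back())        # R2 set-up procedure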



5. Comments on the procedures used for evaluation trials.

Group 1 completed only the first three scenarios, taking one and a half hours, while group 2 completed all five scenarios in just over two hours. During this time, the comments of the expert reviewers were noted. It was originally intended that the experts would write down their own observations, but it was quickly realised that this would not be practical if ideas were to flow freely.

Wharton et al. [14] report similar problems, as their reviewers would often get into discussions and forget to write items down. They also suggest using a person to oversee the sessions and ensure that the focus of the discussion is kept on track. However, we found that it was necessary for the observer to slow the discussion in order for the comments to be written down. In addition, the observer was in danger of directing the reviewers to particular errors while justifying others. Therefore, when the observer is part of the development team, the observer should have minimal involvement in the discussion, except to ensure that ideas flow and the discussion remains on track, so as not to bias the reviewers' comments. It may therefore be necessary to video the review sessions; while this would take longer to analyse, it would ensure that the expert reviewers could converse freely. Alternatively, an observer with shorthand skills could be used for evaluations that take place on the factory floor, where video cameras may be awkward to position and sound recording may be difficult.

The length of time spent on each scenario varied, with the greatest time spent on the first scenario and the least on the last. This was in part because the points raised in the first scenario were often fundamental principles that applied to the remaining scenarios. The time spent also varied in part due to the tiredness of the reviewers; therefore, either the reviewers should take short breaks during the evaluation or the session should be limited to two hours. The scenarios allowed the reviewers to converse more freely and suggest possible design solutions. Careful thought about the scenarios, and allowing time for the groups to discuss the user interface layouts, ensured that the scenarios were completed in a reasonable time. This overcame some of the problems found by Rowley and Rhoades [15], who could not get the bulk of their evaluation completed in the time available (90 minutes) as the sessions took longer than planned. However, they were using a Cognitive Walkthrough approach, whereas we used scenarios. Walkthroughs involve carefully designed tasks, normally taken from the system specification [9], telling the reviewers exactly what to do in a step-by-step approach. Karat et al. [16] found that their groups also 'favoured scenarios over self-guided exploration in identifying usability problems.'

Nielsen has evaluated several sets of usability heuristics [17]. His aim was not to find the best single set, but to produce an amalgam of usability heuristics that would find the usability problems in real systems. The result was two lists of usability heuristics: the first covered 85% of the 249 usability problems found in previous case studies, while the second covered 95% of the 82 serious usability problems. It is worth pointing out that the original set of usability heuristics, which provided the most comprehensive coverage of the usability problems (82%), was the set used in the evaluation of the Aberdare case study. In that evaluation, one more criterion, Help and Documentation, was added to the original nine usability heuristics. However, none of the sets of heuristics covered the problem of navigation, which we have shown to be a key area when evaluating hypermedia interfaces and systems.

Rowley has carried out usability testing in the field [18]. These tests were carried out in a company's regional offices in the USA and Europe. While many of the 13 lessons stated in the paper can be attributed to careful planning prior to carrying out the field test, the lessons that were not due to planning are:

6. Observations Relating to Industrial User Interfaces.

The industrial environment brings together users with different and varying computer skills, all of whom need to be supported. Most of the information required by the operator in setting up a machine will involve the use of text, photographs, diagrams, tables, etc. The Microcosm RTF viewer allows the user to view text and embedded graphics. This ensures that sketches that appear in the original paper document also appear in the correct place within a hypermedia-authored document. This is of benefit where there is a short piece of text referring to an embedded graphic.

However, in the information normally used within an industrial or technical environment (for example manuals, work packages and instruction sheets), the text will quite often refer to a diagram on another page, often several pages away. This can lead to errors while scrolling through the pages looking for the diagram. The problem is similar to that experienced with a paper-based system when the user is required to cross-reference information on different pages. In addition, in an industrial hypermedia application the text will not only refer to another page, but may also refer to information stored in different forms of media. It is therefore essential to be able to display the text next to the information it refers to.

6.1 Fixed Window Layouts.

The advantage of a fixed window layout (FWL) in hypermedia systems is that it ensures that the most appropriate window layout for the information is seen, at least in the first instance. It allows text and engineering drawings or video to be placed together in pre-defined areas of the screen. Microcosm runs under the Microsoft Windows environment. Due to the modular architecture of the Microcosm system, a number of processes can have interfaces (a document viewer, a toolbar or a dialogue box) on the screen at the same time. The interfaces (processes) all act independently of each other in terms of screen management and can therefore overlap or obscure one another. This can cause problems for inexperienced users. Similarly, inexperienced users may have problems with basic screen management, such as inadvertently moving windows off the screen or being unable to restore a minimised window. Hence, novice users can soon clutter the screen, losing track of where they are going or, worse still, moving a document off the viewing area never to be seen again. Therefore, a fixed window layout is the preferred method for displaying information to the factory floor operator, where operators tend to be inexperienced computer users and where a sequence of events is to be followed. Within the FIRM project, the need for a fixed window layout is most likely to occur with new operators accessing information about set-up tasks and faults encountered.
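A fixed window layout can be expressed as a simple mapping from media type to a predefined screen region, so that every document of a given type always opens in the same place. The Python sketch below is illustrative only; the region names and sizes are assumptions, not those used in the Aberdare application.

    # Sketch of a fixed window layout (FWL): each media type is always shown
    # in the same predefined region of a 640 x 480 screen, so novice users
    # cannot lose or bury windows. Region names and sizes are assumptions.

    SCREEN = (640, 480)

    # region -> (x, y, width, height)
    LAYOUT = {
        "text":    (0,   0,   320, 420),   # procedure text on the left
        "graphic": (320, 0,   320, 420),   # drawing / photo / video on the right
        "toolbar": (0,   420, 640, 60),    # toolbar fixed along the bottom
    }

    def place(document_type):
        """Return the fixed region in which a document of this type is displayed."""
        try:
            return LAYOUT[document_type]
        except KeyError:
            raise ValueError(f"No fixed region defined for '{document_type}'")

    # A text step and the drawing it refers to are always shown side by side,
    # which addresses the cross-referencing problem described above.
    print(place("text"), place("graphic"))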

The three-dimensional model gave an effective gateway to the information resource database. The model can take a significant time to produce; however, any number of screen shots from different angles are easily produced from a single model. These images are then overlaid with links, as with any digitised photograph or picture. It was found that users preferred the idealised 3D model of the line to a photograph for the interface [3]. Hence, this type of interface will be reused within FIRM.

The toolbar approach is a considerable improvement over the basic Microcosm user interface. The toolbar eases navigation by providing short cuts to the underlying information resource database. A significant number of the reviewers' comments centred on the fact that the toolbar and menus did not distinguish between users. This distinction is necessary so as not to confuse operators with options they could not use (even if 'greyed out').

In a fixed window environment, experienced Windows users are no longer free to position and size windows as they desire. Hence, they lose the flexibility of having complete control over their working environment. A case in point is fault finding, where multiple windows are often preferable.

Therefore, a separate style of interface is required for the maintenance technician:

6.2 Screen Management for a flexible environment.

The major advantage of hypermedia is the ability to navigate through the information resource database. Screen management is concerned with how information is presented to the user on the screen, and with the tools that enable the user to manipulate this information using the interface provided. The user should then be able to adjust the screen and navigate freely through the information domain.

A new solution for tackling the problems created by providing for users of differing abilities has been designed and implemented by Hall et al. [19]. The architecture, which promotes the disclosure of state information by the individual processes and allows screen management processes to modify the state of the interface components, has been named SHEP (Screen Handler Enabling Process). This evaluation, along with the HiDES project [20] within the Multimedia Research Group (MMRG), led to the development of SHEP. The SHEP solution gives the designers of hypermedia applications flexibility when creating interfaces for users of different and varying abilities when using Microcosm; the designers no longer need to target the interface at a specific user group to the exclusion of other users. The SHEP solution allows the designers of industrial hypermedia applications to provide:
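The SHEP architecture itself is described in [19]. As a rough illustration of the state-disclosure idea, the Python sketch below (with hypothetical names) shows viewer processes reporting their window state to a central screen handler, which repositions them according to the current user profile.

    # Illustrative sketch of the state-disclosure idea behind SHEP: each viewer
    # discloses its window state to a central screen handler, which may adjust
    # it according to the current user profile. All names are hypothetical;
    # see [19] for the actual SHEP architecture.

    class Viewer:
        def __init__(self, name, media_type):
            self.name, self.media_type = name, media_type
            self.geometry = None                  # (x, y, w, h) once placed

        def disclose_state(self):
            return {"name": self.name, "type": self.media_type,
                    "geometry": self.geometry}

        def set_geometry(self, geometry):         # called by the screen handler
            self.geometry = geometry

    class ScreenHandler:
        def __init__(self, fixed_layout=None):
            self.viewers, self.fixed_layout = [], fixed_layout

        def register(self, viewer):
            self.viewers.append(viewer)
            self.arrange()

        def arrange(self):
            """Novice profile: enforce the fixed layout; expert profile: leave windows alone."""
            if self.fixed_layout is None:
                return                            # experts manage their own windows
            for v in self.viewers:
                if v.media_type in self.fixed_layout:
                    v.set_geometry(self.fixed_layout[v.media_type])

    handler = ScreenHandler(fixed_layout={"text": (0, 0, 320, 420),
                                          "graphic": (320, 0, 320, 420)})
    handler.register(Viewer("R2 set-up procedure", "text"))
    handler.register(Viewer("Gearbox drawing", "graphic"))
    print([v.disclose_state() for v in handler.viewers])

The same handler, given no fixed layout, leaves experienced users free to manage their own windows, which is the flexibility discussed in Section 6.1.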

7. Conclusions.

The evaluation of the Aberdare case study has provided guidelines for future industrial Microcosm interfaces. While the evaluation did find a number of problems, it also showed the use of the toolbar to be an effective method of providing short cuts. The following conclusions were drawn:



Acknowledgements.

The authors acknowledge the EPSRC (Engineering and Physical Sciences Research Council) for funding the work under grant numbers GR/H/43038 and GR/L/10482, and Pirelli Cables, Aberdare for allowing us to use their site for a case study.

References.

1. Crowder RM, McManus P. Information Live. Manufacturing Engineering, Vol. 74, No. 5, October 1995, pp 227-229.

2. Crowder RM, Hall W, Heath I, Bernard R, Gaskell D. A Hypermedia Maintenance Information System. IEE Computing and Control Engineering Journal, Vol. 7, No. 3, June 1996, pp 121-128.

3. Stobart RK. Design for Diagnostic and Condition Monitoring. Conference Proceedings, Maintenance: The Business Challenge, I Mech E, Birmingham, UK, 16-17 June 1992, pp 109-113.

4. Crowder RM, et al. Requirements Specification: FIRM: Factory Information Resource Management. EPSRC Grant GR/L/10482, University of Southampton.

5. Crowder RM, et al. ITE - Multimedia Information Systems as an Operational Interface within the Advanced Manufacturing Environment, Final Report. SERC Grant GR/H/43038, University of Southampton, October 1995.

6. Davis HC, Hall W, Heath I, Hill G, Wilkins R. Towards an Integrated Environment with Open Hypermedia Systems. Proceedings of the ACM Conference on Hypertext, ECHT '92, Milan, Italy, December 1992, pp 181-190. ACM Press, 1992.

7. Preece J, et al. A Guide to Usability: Human Factors in Computing. Addison-Wesley / The Open University, 1993.

8. Preece J, Rogers Y, Sharp H, Benyon D, Holland S, Carey T. Human-Computer Interaction. Addison-Wesley, 1994, Chapter 29.

9. Preece J, Rogers Y, Sharp H, Benyon D, Holland S, Carey T. Human-Computer Interaction. Addison-Wesley, 1994, Chapter 33.

10. Nielsen J. Usability Engineering at a Discount. In: Salvendy G, Smith MJ (eds). Designing and Using Human-Computer Interfaces and Knowledge Based Systems. Elsevier, 1989, pp 394-401.

11. Molich R, Nielsen J. Improving a Computer Dialogue. Communications of the ACM, Vol. 33, No. 3, March 1990, pp 338-348.

12. Nielsen J. Finding Usability Problems Through Heuristic Evaluation. In: Bauersfeld P, et al. (eds). Human Factors in Computing Systems, CHI '92 Conference Proceedings. ACM Press, New York, 1992, pp 373-380.

13. Sellen A, Nicol A. Building User-Centered On-line Help. In: Laurel B (ed). The Art of Human-Computer Interface Design. Addison-Wesley, 1990, pp 143-153.

14. Wharton C, Bradford J, Franzke M. Applying Cognitive Walkthroughs to More Complex User Interfaces: Experiences, Issues and Recommendations. CHI '92, ACM Conference on Human Factors in Computing Systems, Monterey, California, May 3-7, 1992, pp 381-388.

15. Rowley DE, Rhoades DG. The Cognitive Jogthrough: A Fast-Paced User Interface Evaluation Procedure. CHI '92, ACM Conference on Human Factors in Computing Systems, Monterey, California, May 3-7, 1992, pp 381-388.

16. Karat CM, Campbell R, Fiegel T. Comparison of Empirical Testing and Walkthrough Methods in User Interface Evaluation. CHI '92, ACM Conference on Human Factors in Computing Systems, Monterey, California, May 3-7, 1992, pp 381-388.

17. Nielsen J. Enhancing the Explanatory Power of Usability Heuristics. Human Factors in Computing Systems, CHI '94, Boston, Massachusetts, USA, April 24-28, 1994, pp 152-158.

18. Rowley DE. Usability Testing in the Field: Bringing the Laboratory to the User. Human Factors in Computing Systems, CHI '94, Boston, Massachusetts, USA, April 24-28, 1994, pp 252-257.

19. Hall W, Weal M, Heath I, Wills GB, Crowder RM. Flexible Interfaces in the Industrial Environment. International Conference on Managing Enterprises - Stakeholders, Engineering, Logistics and Achievement (ME-SELA '97), Loughborough, UK, 22-24 July 1997, pp 453-460.

20. Hall W, Colson F. Multimedia Teaching with Microcosm-HiDES: Viceroy Mountbatten and the Partition of India. History and Computing, Vol. 3, No. 2, 1991, pp 89-98.

Appendix A. Form For Expert Review.

HCI Expert Review of Pirelli Project

Scenario 1: Setting up of the cable laying on machine R2.

Reviewer:            Date:            Sheet:

At each run through, the reviewers are to look at and comment on:
• The flow of information from screen to screen, and on each screen evaluate:
• Use simple and natural dialogue.
• Speak the user's language.
• Minimise user memory load.
• Be consistent.
• Provide feedback.
• Provide clearly marked exits.
• Provide shortcuts.
• Provide good error messages.
• Prevent errors.
Comments:


