Comparing Compatible Semiotic Perspectives for the Analysis of Interactive Media Devices

Shaleph O’Neill

Interactive Design Lab

School of Design/Division of Applied Computing

University of Dundee

s.j.oneill@dundee.ac.uk



Abstract

The purpose of this paper is to explore existing semiotic techniques in order to identify their strengths and weaknesses in analysing interactive media systems. Three individual studies are compared, using variations of product semiotics, visual semiotics and Eco's revised KF (Katz & Fodor) model. Taken individually, none of these studies provides a wholly satisfactory solution to the problems of evaluation. However, when considered together, the possibility of an integrated semiotic theory becomes an attractive proposition as an evaluation method. This paper suggests that older semiotic approaches, while useful, are not enough in themselves. In order to be useful to HCI (Human Computer Interaction), the relevant aspects of semiotic theory must be integrated with an understanding of interactive interpretation in such a way as to produce a semiotics of new interactive media that is capable of articulating its specific characteristics.

 

1   Introduction

Semiotics, the study of sign systems, is a promising candidate for helping us to understand cognitive ergonomics. The concerns of both cognition and culture are central to semiotic analysis, with the interface being seen as a message sent from designer to user (de Souza, 2001; de Souza, 1993). A product must reveal what it does and how it does it through the collection of physical and perceptual characteristics and behaviours that constitute its user interface. Of course, how people interpret these things depends heavily not just on the appearance of the device, but also on the background of the observer and the context within which the interaction takes place.

This paper attempts to address the problem of applying semiotic theory to the practicalities of interacting with interactive media artefacts. Mobile phones have been chosen here because they sit somewhere between traditional HCI artefacts (i.e. task based interaction) and new media (culturally diverse and convergent media appliances). While these particular phones are now quite old, as the forerunners of a new wave of convergent media technology, they still exhibit the kind of digital abstraction and media convergence that is becoming increasingly evident in new media artefacts. They are interactive sign systems, the interfaces of which must be interpreted in order to operate them. The aim here is to apply semiotic approaches from older media domains to an investigation of these kinds of systems, in order to highlight which aspects of semiotic theory are appropriate for developing a broad, culturally informed, critical approach to interface design and evaluation.









Current Formal Methods

Cognitive ergonomics takes the view of HCI, according to de Haan (2000), as ‘a matter of knowledge representation and information processing’. Current user-centred design methodologies such as ETAG-based design (de Haan, 1994, 1997) and Display-Based HCI (Kitajima and Polson, 1992, 1995), developed as formal methods, focus largely on the use of knowledge to operate computers based on the execution of tasks to achieve goals in an action/evaluation cycle (Norman, 1988). ETAG is a formal notation system that is used in line with an all-encompassing view of the design life cycle from a user perspective. Display-Based HCI focuses on the mental modelling process of the user’s action/evaluation cycle at the interface, offering a different perspective on how mental models work. Other standard approaches to HCI evaluation include user observation studies, GOMS models and cognitive architectures (Kitajima and Polson, 1997; Byrne, 2003; Kieras, 2003). While all of these approaches are valid and in many cases very successful, they suffer from two main problems. Firstly, approaches based on user observations and on cognitive psychology, while scientifically sound, can take a very long time to develop in relation to system design and evaluation; this is not always cost effective (Byrne, 2003). Secondly, standard approaches, particularly cognitive models, by their very nature need to be constrained to specific aspects of interaction. This limits the reusability of such methods, as cognitive models have to be built for each interactive situation before the system can be thoroughly evaluated (Byrne, 2003). As HCI moves further into exploring interactive technology within the home and leisure applications, models must either become more detailed or quicker alternatives must be sought that concentrate on the broad aspects of these new types of interaction (Karat, 2003; van der Veer and del Carmen Puerta Melguizo, 2003).

Semiotics does not dispute the need for an understanding of the cognitive processes involved in interaction. Indeed, concepts such as Norman’s action/evaluation cycle remain relevant to research in this area, arguably because semiosis is dependent on some sort of perceptual process (Eco, 2000). However, in terms of knowledge representation and sense-making, semiotics brings with it new ways to understand how interfaces are interpreted during interaction. Where ETAG is unable to describe the ‘presentation interface’ (de Haan, 1994), a semiotic approach offers a method directly related to the syntagmatic structuring of signs and their semantic content. When one examines the notions of Display-Based HCI in semiotic terms, one begins to see striking similarities between ‘associative networks’ (Kitajima and Polson, 1992) and the wider ranging notion of ‘semiosis’ in semiotics (Eco, 1976).



 

1.1    Semiotic Theory and Interactive Systems

Semiotics’ concern with the nature and use of signs makes it a good starting point for approaching interactive systems, because some of the central concerns of HCI parallel those already present in semiotics. The notion of the sender and reader in semiotics is not dissimilar to the notion of designer and user, or system and user, in HCI. Taking the view that computers are machines built on signification and code, Mihai Nadin points out that ‘One cannot not interact… one cannot avoid semiotics’ (Nadin, 2001). The whole process of interaction can be seen as an act of manipulating and understanding the signs in an interface.

Current strands of research being developed within the HCI/semiotics community consist of an approach based on the classic structuralist concerns of Peter Bogh Andersen (Andersen, 1990, 1999); radical reassessments of paradigms such as navigation (Benyon, 2000); the development of software (Nadin, 2000); and approaches to user-centred design in the form of Semiotic Engineering (de Souza, 1993, 2001). Both Semiotic Engineering (de Souza, 1993) and Computer Semiotics (Andersen, 1990) have been around for over a decade without successfully entering into the mainstream of HCI methods; this may be due to the diverse nature of the research so far. O’Neill and Benyon take these ideas further by exploring semiotic theory in relation to the broader cultural issues of interacting with new media in a range of domains (O’Neill, 2005; O’Neill and Benyon, forthcoming; O’Neill and Benyon, 2003a, 2003b; O’Neill et al., 2002). The research presented in this paper is a continuation of these ideas, focusing on a comparison of three different types of analysis in an attempt to identify the strengths and weaknesses of different semiotic approaches. Important issues regarding the development of a semiotic approach to understanding interaction in new domains are thus identified.









1.2    Analysis Techniques

The first analysis is made using an adapted version of Vihma’s product semiotics (Vihma, 1995). Phones are excellent examples of media that exist as products that can be bought ‘off the shelf’, and as such, it would seem most appropriate to use Vihma’s analysis techniques to establish which aspects of semiotic theory can be said to manifest in such products. The second analysis is made using Kress and van Leeuwen’s visual semiotic method (Kress and van Leeuwen, 1996). This has been chosen as a complementary approach to Vihma’s, because the operation of mobile phones relies heavily on visual representation and Kress and van Leeuwen’s method provides the most in-depth analysis of visual grammar. The third and final analysis is made using Umberto Eco’s modified KF model (Eco, 1976), where it is employed specifically to focus on the process of interaction with the phones’ interfaces. This is a much more detailed analysis, as it considers interaction over time and is a first attempt to understand how the structuring of dynamic signs affects interpretation. Photographs of the three phones are given below and should be used as a constant reference throughout the analyses.

 


Figure 1.1 Three phones: the Panasonic GD35, Nokia 5110, and the Nokia 6150




2     Product Semiotics

Susann Vihma uses semiotic theory to explore the nature of product design in order to develop an analytic method that is derived from the sign categories of Peirce (Vihma, 1995). Vihma sees the form of the object, i.e. its overall physical construction, as part of that which has been designed and therefore as having been designed by some person, in line with its purpose and functionality. By exploring the structure of products in semiotic terms, Vihma manages to articulate the parts of designed products that can be viewed as communicative of their purpose and function. In this way, she categorizes aspects of designed products in functional terms with a semiotic framework that sees a designed product as a bundle of concurrent messages, or a text, not unlike Barthes’ analysis of newspaper articles (Barthes, 1977).

The analysis method employed here is based on Vihma’s product semiotics, which has been adapted and tested by Bosse Westerlund in relation to analysing websites (Westerlund, 2002). Essentially, the technique proposed by Westerlund involves two distinct stages. The first is an analysis of functionality, which is employed to discern what a product does or, in the case of design, to establish what a product should do. The second stage is to employ Vihma’s classification of sign types, and their various elements, in analysing the product. The concern in this study is to focus only on the second stage, as it is only this stage that is inherently semiotic.

Below is a table (Table 2.1) that describes each of Vihma’s sign types along with the characteristics that are important to analysing product design. It should be noted here that Vihma’s sign types are based on Peirce’s conception of the sign and his subsequent identification of icons, indices and symbols.









 

Sign type

Description

Icons

 

The tradition of form

Normally used as a reference for the design of new products. Conformity with a product tradition and especially any divergence from it will be noted and can function as a sign.

Colour

May signify a quality: e.g. white can refer to cleanliness. (Connotative)

Material

May often refer to a quality, e.g. gilding indicates wealth; concrete, emotional coldness. (Connotative)

Metaphor

The resemblance of a particular object to some other object from another domain, often not a designed object. For example the front of a car might resemble a face.

Style

Period styles like art nouveau, the 1950s, etc.; also geometric classifications such as “spherical” vs. “square” styles. Here again, conformance with and divergence from well-known styles (if any) will be salient.

Environment

Some industrial products are designed for a specific environment, e.g. kitchen, bathroom etc.; others may have the (false) appearance of being so designed, e.g. a sports car appearance.

 

 

Indices

 

A pointing form

Arrows and pointers are often found on operating buttons of machines; sometimes the product itself has such a form.

Traces of tools

Characteristic marks from tools used to make the product in manufacturing e.g. the seam on plastic parts from injection moulding.

Marks of use

Abrasions, dents, flaws, dirt etc.

Other traces

Rust and corrosion; drops of water on the surface of a bottle indicating a cool drink.

Light and sound

Often indicate the technical functions of appliances and computers.

Noise

The sound of a product in use.

Smell

The distinctive smell of certain products e.g. leather.

Touch

The feel of a certain material may indicate quality; by lifting a container you can find out whether or not it is empty.

Graphic figures

Graphic figures that are integral parts of the product itself, e.g. the yardstick with a scale and numbers to indicate measurements.

Symbols

 

Graphic symbols

Logotypes, on-off buttons, washing instructions on textiles etc.

Symbolic colour

A red carpet signifies the high society of fame, fortune and royalty. (Connotative)

Symbolic form

Uniforms often denote type of job, rank etc.

Position and
posture

Compositional arrangement e.g. closeness, above or below, etc.

Material

The quality of a certain type of material used in dress making for example may signify social status or the character of an event.

Table 2.1 Vihma's analysis framework



2.1    Analysing the Phones with Product Semiotics

The process of analysis is simply to use Vihma’s sign type characteristics as a checklist by which to evaluate a product. One simply moves through the list, establishing whether or not the product displays any of the characteristics described, and notes down what they are. Westerlund points out that not all of these elements are useful in relation to the analysis of websites. Similarly, not all of these elements are expected to be useful in the analysis of the phones attempted here. However, it is expected that the phones, as products, should be quite susceptible to this type of analysis. In this study each individual phone is analysed separately and comparisons are discussed afterwards. It should be noted that this technique is not considered to be an exhaustive one, but one that provides vital semiotic information about products to designers in a standardised format.
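As an aside, the checklist itself is simple enough to capture in a small tool. The sketch below is a hypothetical illustration in Python (it is not part of Vihma’s or Westerlund’s method): it holds the sign types of Table 2.1 as a data structure and records the analyst’s notes against whichever characteristics a product displays.

```python
# A minimal sketch (Python 3.9+) of Vihma's checklist as a data structure.
# The sign types and characteristics follow Table 2.1; the recording mechanism
# and function names are hypothetical.
VIHMA_CHECKLIST = {
    "Icons": ["Tradition of form", "Colour", "Material", "Metaphor", "Style", "Environment"],
    "Indices": ["Pointing form", "Traces of tools", "Marks of use", "Other traces",
                "Light and sound", "Noise", "Smell", "Touch", "Graphic figures"],
    # Note: 'Material' appears under both Icons and Symbols in Table 2.1.
    "Symbols": ["Graphic symbols", "Symbolic colour", "Symbolic form",
                "Position and posture", "Material"],
}

def analyse_product(observations: dict[str, str]) -> dict[str, dict[str, str]]:
    """Walk through the checklist and keep only the characteristics the analyst
    has noted for this product, grouped by sign type."""
    report = {}
    for sign_type, characteristics in VIHMA_CHECKLIST.items():
        noted = {c: observations[c] for c in characteristics if c in observations}
        if noted:
            report[sign_type] = noted
    return report

# Example: a few of the GD35 observations from section 2.1.1.
gd35_report = analyse_product({
    "Colour": "Black; connotes corporate seriousness",
    "Pointing form": "Central button with four directional arrows",
    "Graphic symbols": "Screen and button symbols denoting menus and functions",
})
print(gd35_report)
```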









2.1.1   Analysing the GD35


Sign type

Description

Icons

 

The tradition of form

This phone follows the form of traditional telephones, mobile and otherwise, in that it has the earpiece at the top and the mouthpiece at the bottom, although these are not made overtly obvious. Similarly, in the traditional mobile phone configuration, it has an aerial top left, a screen and a keypad.

Colour

Black, in this context is associated with the corporate world in which mobile phones were born, connoting a sense of seriousness and professionalism.

Style

Rounded curves

Environment

Again the corporate world is associated with the mobile phone, even though such phones are now utterly pervasive in modern society.

Indices

 

A pointing form

Pointing forms are evident on a centrally located button, indicating four directions of menu navigation.

Light and sound

This phone has a number of ring tones that signify incoming calls and the delivery of text messages, as well as a vibrate function that performs the same indicative functions silently.

Graphic figures

There are two graphical figures on this phone that behave in an indexical way: the first indicates the amount of power that the battery has left and the second indicates how good the reception of the phone is. One is a graphical representation of a battery that empties as power reduces, the other is a segmented representation of an aerial that reduces or increases as signal strength varies.

Symbols

 

Graphic symbols

There are a number of graphical symbols evident on this phone. These can be broken down into two groups: those that belong to the screen and those that belong to the buttons.

 

The screen symbols generally denote menu choices or the display of information such as date/time and the input of data such as writing text messages.

 

The button symbols generally denote the function of pressing that button: e.g. the phone logo indicates the button to press to call someone, and the signs on the keypad denote which numbers appear on screen when they are activated. Some of the buttons, however, do not clearly denote their function: the buttons directly below the screen and the button stamped with the letter ‘C’.

 

 

Additionally, there is a logo printed on this phone below the screen.

Symbolic colour

The central button on the phone is a different colour from all the others reinforcing its significance in the operation of the phone.

Position and posture

The most salient aspect of the phone is obviously the screen, signifying its importance to the operation of the phone. The screen elements are arranged with a logo central, battery power top left corner, signal strength top right corner and menu options in both bottom corners.

 

Interestingly, the biggest button on the interface is positioned centrally on the phone, also indicating its importance to the phone’s operation.










2.1.2  Analysing the Nokia 5110

Sign type

Description

Icons

 

The tradition of form

Again this phone follows the form of traditional telephones both mobile and otherwise. The earpiece at the top is made quite obvious and is one of the most salient aspects of the phone. There is no evidence of a mouthpiece at the bottom. In the tradition of mobile phones it has an aerial top right, a screen and a keypad, as well as an on/off switch (top right).

Colour

Dark grey; again this colour tends to denote the sobriety of the corporate world.

Style

Rounded curves

Environment

Again the corporate world

Indices

 

A pointing form

There are two pointing forms situated to the right hand side of the phone, on two separate buttons that are above and below each other, indicating up and down menu navigation.

Light and sound

This phone has a number of ring tones that signify incoming calls and the delivery of text messages, as well as a vibrate function that performs the same indicative functions silently.

Graphic figures

There are also two graphical figures on this phone that behave in an indexical way: the first indicates the amount of power that the battery has left and the second indicates how good the reception of the phone is. Battery power is indicated by a small battery logo with segmented bar above it that reduces as battery power fades. Similarly signal strength is indicated by an aerial graphic with a segmented bar above it which alters in relation to signal strength.

Symbols

 

Graphic symbols

There are a number of graphical symbols evident on this phone. These can again be broken down into two groups: those that belong to the screen and those that belong to the buttons.

 

The screen symbols generally denote menu choices or the display of information such as date/time and the input of data such as writing text messages.

 

There are two buttons on this phone that do not clearly denote their function: one is the central button, which has a blue bar across it; the other is the button to its left, which has the letter ‘C’ stamped on it.

 

The keypad symbols generally denote which numbers appear on screen when they are activated.

 

Additionally, there is a logo printed on this phone above the screen.

Symbolic colour

The on/off switch is red indicating its importance. The symbol on the central button is blue identifying it as different from all the other buttons.

Position and posture

Interestingly the most salient features of this phone, apart from the screen, are the earpiece situated above it and the largest button situated in the middle of the phone.

 

The screen elements themselves are arranged with a logo central, battery power to the right, signal strength to the left and menu option below centre.










2.1.3  Analysing the Nokia 6150

Sign type

Description

Icons

 

The tradition of form

Once again this phone follows the form of traditional telephones both mobile and otherwise. The form of the earpiece differs from the other two and is linear rather than round. Again there is no evidence of a mouthpiece at the bottom. It has an aerial top right, a screen and a keypad, as well as an on/off switch (top right).

Colour

Black, again this colour tends to denote the sobriety of the corporate world.

Style

Rounded curves.

Environment

Again the corporate world.

Indices

 

A pointing form

There are two pointing forms situated centrally in the phone on two separate buttons that are above and below each other, indicating up and down menu navigation.

Light and sound

This phone has a number of ring tones that signify incoming calls and the delivery of text messages, as well as a vibrate function that performs the same indicative functions silently.

Graphic figures

There are also two graphical figures on this phone that behave in an indexical way: the first indicates the amount of power that the battery has left and the second indicates how good the reception of the phone is. These signs are absolutely identical to those found in the 5110.

Symbols

 

Graphic symbols

Again there are a number of graphical symbols evident on this phone, which can again be broken down into two groups.

 

The screen symbols generally denote menu choices or the display of information such as date/time and the input of data such as writing text messages.

 

The keypad symbols generally denote which numbers appear on screen when they are activated.

 

There are also two buttons on this phone that do not clearly denote their function. Each one is a mirror image of the other and carries the same blue sign as the central button on the 5110. They are also positioned centrally at either side of the phone, in the manner of the unidentified buttons on the GD35.

 

Additionally, there is a logo printed on this phone above the screen.

Symbolic colour

The on/off button is red indicating its importance. Blue bars on the unidentified buttons indicate their importance.

Position and posture

There are no overtly salient features to this phone as all of the buttons are of a similar size.

 

The screen elements themselves are arranged with a logo central, battery power to the left, signal strength to the right and menu options in both bottom corners.




2.2    Findings

There are many obvious similarities in form and function across these three mobile phones. The somewhat repetitive nature of each analysis makes this patently clear. All show the same iconic features that are part of the telephone tradition; they are all similar in colour, and they all have earpieces, aerials, screens, and keypads, which are organised in the same sort of configurations. They also all exhibit indexical features of remarkable sameness. They all have pointing forms, ring tones and graphical features that perform the same indexical functions, even though their form might vary from phone to phone. Lastly, the symbolic features of all three phones are again remarkably similar. They all exhibit evidence of similar symbols, both in the screen elements that display menu choices and information and in the buttons of the keypad.







While the similarities between these phones identify the presence of icons, indices and symbols in the interface, it is in the differences between the signifiers, and the arrangement of signifiers, of these three phones that semiotic analysis comes into its own. For example, where one phone signifies its navigational elements with a large central button containing four arrows, the other two have only an up and down navigational signification. The arrows used to represent these functions also vary while the function remains the same. Similarly, the organization of these elements varies from phone to phone while functionality and meaning are maintained. Some signs indicate the same meanings across phones while having a different syntactic structure or position. Here we have found evidence not only of icons, indices and symbols but also of the arrangement of signs into concurrent syntagmatic relationships that vary slightly from phone to phone. This is also exhibited on the screens, where battery life and signal strength are displayed in similar ways across the phones but occupy completely different areas of the screen. In short, the organisation of phone elements, both those that follow a traditional pattern and those that vary from phone to phone, exhibits evidence of what Eco identified as ‘rhetorical’ forms (Eco, 1986).

However, this particular analysis technique does nothing to engage with the problem of interactivity. It stops short at uncovering the structure and meaning of interface elements and does nothing to articulate how meaning unfolds as users interact with the phone. While this technique could be extended to analyse the signs in each screen during interaction, there is no structure to the approach by which to do so, as the technique focuses on identifying sign types rather than configurations of signs.




3    Visual Semiotics

In Reading Images (Kress & van Leeuwen, 1996) Gunther Kress and Theo van Leeuwen concern themselves primarily with the task of isolating and defining the different methods of construction used in image making that allow meaning to be conveyed. Their in-depth study of all kinds of images leads them away from traditional semiotic evaluation, in the sense of procuring meaning through the relationships between the various signifiers in an image and into a deeper concern with the syntactic construction, or grammar, of images as a whole.

Visual grammar considers the composition of spatial syntagms with regard to the ‘informational value’ of the positioning of elements within an image. From a particularly Western perspective, Kress and van Leeuwen propose that the ‘left side’ of an image is the ‘Given’ side: the already known side, the start of an idea, as in the headline or opening paragraphs of a magazine article, for example. The right side is the ‘New’ side, often a photograph in the case of magazines; it usually demands attention or is problematic in some way. The left-to-right direction of reading also forms a kind of narrative that is linked to sequential syntagms. Obviously, this does not apply in cultures where signs and symbols are arranged to be read up and down, or from the back of a book to the front, as in Chinese or other Asian cultures.

For Kress and van Leeuwen, elements that are spatially organised in the top section of an image are considered to be ‘Ideal’, ‘Good’ or ‘Whole’, while elements in the lower sections are considered to be ‘Real’, ‘Base’ or generally more down to earth. This is particularly true of paintings that contain religious motifs. Finally, when a pictorial element is presented in the centre of an image, it is presented as a nucleus of information around which all other elements become marginalized, subservient or dependent. These ideas are closely related to notions about embodied understanding that Lakoff and Johnson define in their work on metaphor theory (Lakoff & Johnson, 1980, 1999), where they consider orientation metaphors in relation to conceptual understanding of the world. The spatial organization of syntagms, then, derives much of its meaning in relation to bodily understanding and orientation in the world.

 


Figure 3.1 Kress and van Leeuwen’s Visual Grammar (Kress and van Leeuwen, 1996, p. 208)

 







Other aspects of visual structure that Kress and van Leeuwen discuss as important to analysis are the salience of objects, e.g. size, sharpness of focus, tonal contrast, colour contrast and placement; framing, e.g. the degree to which units of information are demarcated as independent from others; and the linear/nonlinear composition of texts, e.g. the use of subheadings, emphatic devices, numbered lines, tables, diagrams and so on that encourage readers to scan or skip-read the information instead of reading it in a standard sequential mode. Hypertext is a perfect example of this.




3.1    Analysing the Phones with Visual Semiotics

This study uses the visual semiotic techniques as a method for evaluating the three mobile phones. Bearing in mind that Kress and van Leeuwen’s theories are geared more towards the analysis of visual surfaces such as paintings and diagrams, rather than mobile phones, it is not expected that all of their techniques will be useful in this analysis, although some results in certain areas are expected to be revealing. Employing the techniques of visual semiotic analysis involves studying the three phones in relation to: narrative processes, conceptual representations, analytical processes, representations and modality, as well as informational value and salience. Unlike the study above, where each phone was treated separately, all of the phones are discussed together in relation to these concepts.



3.1.1  Narrative Processes

Narrative processes are indicators of visual narrative. Kress and van Leeuwen specifically identify the diagonal tension between organised visual elements to express this. On serious contemplation of the surfaces of the phones, it becomes evident that there is no trace of such visual narrative processes. That is to say that, within the inter-relationships of the components of the interfaces, there is no strong diagonal which attempts to convey transactional actions to the observer. Of course, it is evident that there are many different components in the interface, which are related to one another in some way. However, it would appear that within their organisational structure there are no stories to be told and therefore no narrative processes. Where arrows do appear on the phones, e.g. on some of the buttons, they do not fall into the category of narrative process as defined by Kress and van Leeuwen, which specifically focuses on directional relationships between pictorial elements. The arrows on the phones do not point to other elements of the phone directly. However, there are some narrative processes hidden within the functionality of the phone. For instance, the navigational aspects hinted at by the arrow buttons move the user into an interactive narrative, which they construct by interacting with the phone. This narrative thread brings the understanding of interface signs into contact with the external narratives of people’s lives, such as calling a sick friend or texting someone a football score line. While these may be considered as goals in themselves, it is interesting to consider how such a call or message fits into a wider meaning-making context, made available by the mobile phone itself.



3.1.2  Conceptual Representations

Within all three phones it is apparent that there is an underlying structure described by Kress and van Leeuwen as a “multileveled overt taxonomy”. This is where elements are organised into a visual hierarchy. The space occupied by the screen in all three phones is the superordinate element of the taxonomy; all other elements are therefore subordinate to the screen. This is emphasised by the screen being placed above the other button elements, even though they are related to the operation of the information on the screen. On the GD35 the large silver button in the centre of the phone, directly beneath the screen, is set above the smaller operational buttons that occupy the rest of the interface in importance. So we have a three-tier taxonomy. The other two phones are similar; although the size and position of the central button varies, there is still a cluster of buttons surrounding a central larger one. This in itself is another visual hierarchical organization of elements (Figure 3.2).

 


Figure 3.2 Multilevel taxonomy in the GD35 phone

 



3.1.3   Analytical Processes

It might be argued that the different component parts of the phones are organized through what Kress and van Leeuwen call an ‘exhaustive analytic process’, i.e. the component parts of the phones may be organized in such a way as to make obvious all of their features. This is true in terms of indexical signs, such as the battery life and signal strength icons identified in the previous study. However, this point of view breaks down in terms of interface structure because many of the phones’ features are hidden away from sight, in the memories and central processing units of the phones. These are accessed at different times through the screen and are therefore not displayed in a one-to-one relationship. That is to say that the component parts of the interface do not correspond directly to the phone’s features. Instead, it is the combination of spatially organised signs, coupled with the interactive possibility of restructuring them into various ‘commands’, that points to the functionality of the phone.









3.1.4  Representation & Interaction

Kress and van Leeuwen’s definitions in this section are taken largely from the study of paintings, photographs and TV. Although their ideas of interaction seem to be applicable, they are couched in terminology that is entirely inappropriate for dealing with Human Computer Interaction, i.e. interaction defined as the active manipulation of a computer interface. Therefore, no significant data about the phones’ interfaces can be determined by the application of this particular set of definitions.



3.1.5  Modality

As in the section above, the definitions outlined by Kress and van Leeuwen seem inapplicable to this form of analysis.



3.1.6  Informational Value

Looking for evidence of a ‘left/right’ organisational structure, one immediately notices that this is a prevailing method of organisation across all the phones in a number of ways. Firstly, there are the logos (Panasonic, Vodafone, Nokia, etc.) above and below the screens. As obvious as this may seem, the positioning of these two logos has an important effect. The positioning of these words, next to the most salient component of the interface, ensures maximum impact in displaying the manufacturer’s and the network provider’s names, whereby no interaction, or even a cursory glance at the phone, passes without their authority. More subtly though, the left/right direction set up by the logos next to the screens primes users for information delivered in left/right fashion by the screen itself. Secondly, the keypad buttons are also numbered from left to right, as are the letters. This is of noticeable significance when you take into account the function of the buttons when sending text messages.

Looking for a ‘top to bottom’ organisational structure within the phones’ interfaces, what becomes apparent is the taxonomic structure as outlined in the conceptual representation section: screen, control panel, keypad. Using Kress and van Leeuwen’s definitions, the screens are the ‘Ideal’ top part of the phones and the keys are the ‘Real’ lower half. This is interesting because the focus of attention is on the screen when using a phone, but the keys, being in the ‘Real’ section of the phone, are given to the user as tangible, usable artefacts. It should also be noted here that this arrangement has a long history of development through typewriters, TV remote controls and computers, and on into mobile phones, as a fairly standard form of representation during text manipulation. It also further separates the graphics on screen from the graphics on the buttons: the buttons are interactive, while the screen elements are only there to display information. Centrally there is the control panel, with a central button in all the phones. The other buttons are out in the margins of the central band. This ensures the relationship of dominance/subservience outlined earlier in conceptual representations.



3.1.7  Salience

As noted earlier, the most salient components of the phones’ interfaces are the screens. This is because of their size (they take up about a third of the available space), their position, and the surrounding facing that stands out against the body of the phone. In the case of the GD35 this facing is silver; in the other two it is black. The second most salient components of the phones are the central control buttons. In keeping with the taxonomic structure outlined earlier, the position, size and colour of these buttons give them very important characteristics. Their centrality means that they are a strong point of focus. When the phone is held ready to be used, the user’s thumb sits directly over them, suggesting that they might be the most used buttons on the interface. Their size, usually the biggest on the interface, gives them a tremendous pull in terms of usability. They almost want to be pressed. Clearly there is a strong relationship between these buttons and the screen above.

The subordinate function buttons in the mid section of the phone are arranged around the central button and marginalized. In terms of salience they seem less significant as a grouping than the keypad. However, their position above the keypad gives them a sense of hierarchical power within the taxonomic structure. They are clearly related to the function of the central button. An interesting aspect of this is the difference between the three phones here. On the GD35 the central button has arrows on it that seem to denote navigation or direction. Similarly, the 6150 has central buttons that show arrows but only up and down. On the 5110 these directional buttons are to the left of a larger central button, still forming a central group that are interrelated. On each phone the screen is situated at the top and below are the keypads.

The square formed by the regimental grouping of the keys into the rows and columns of the keypad takes up a significant portion of the space on the interface. Thus, due to its form, as a group, it has its own sense of weight within the interface structure as a whole, forming the culturally recognisable unit of the telephone touch-tone keypad, common to all manner of modern phones. This last aspect in itself produces a strong sense of salience in that it is a very recognisable form.







Considering the suitability of this type of analysis in relation to the mobile phones, it is apparent that there are only a few concepts that are of any use in evaluating them. This is because the technique is derived from concerns with the visual aspects of images rather than interactive objects. Nevertheless, a number of concepts, when applied, do provide some data about the semiotic nature of the phones.

Rather than offering terminology for categorising sign types per se, Kress and van Leeuwen’s theories concentrate on uncovering the organisational, or as they term them ‘grammatical’, relationships between visual elements. In this way, aspects of the hierarchical relationships between phone elements are identified on two occasions. Moreover, a propensity is identified within all of the phone structures to promote a left-to-right reading of elements, both on the screen and in the organisation of the buttons themselves.

The two types of organisational structuring identified in this study then appear to be equivalent to the structuring of interface signs as concurrent syntagms. It is the similarities between phones that confirm this, whereby the various forms of elements that differ from phone to phone point to similar sorts of functionality across phones. In short, while the signifiers alter in look and location across the three phones, their meanings stay the same. It is only possible to understand this from a semiotic perspective if the interfaces of the phones are considered as a paradigmatic langue, as Saussure would describe it (Saussure, 1966), of phone signs organised into concurrent syntagmatic relationships. Again, this type of analysis fails to really get to grips with the process of interaction. While it uncovers hints of it within the signs, e.g. the navigational buttons, it does not have a mechanism by which to deal with the sequential aspects of interaction analysis.




4    Eco’s Revised KF Model

The widened horizon of social and cultural semiotics developed by Barthes (1977) is what gives Umberto Eco the background to his unifying theory of semiotics. Also based on much of the research performed by Hjelmslev (1961), his “Theory of Semiotics” (Eco, 1976) is a highly developed re-evaluation of the major branches of semiotics from both the Saussurean and Peircean schools of thought. Eco produces not so much a new definition of the sign as a definition of the sign that takes into account the myriad social, cultural and contextual issues that underlie every instance of sign use. In doing so, Eco proposes a theory of semiotics in terms of the use of signs as acts of coding and decoding messages with reference to sets of culturally defined conventions. The socio-cultural aspects of semiotics and the importance of context in evaluating meaning are central to his theory.

Based on the work of Katz and Fodor, Eco develops a dynamic model of the semantic aspects of signification that takes into account the circumstances and contexts on which the denotation and connotation of signs are so dependent. Eco’s conception of signs as aspects of codes, which run along and across the various social groups which make up society as a whole, is based on the notion that, for a sign to be understood, the reader has to be ‘in possession’ of the correct code to interpret it. It is this coding and decoding of signs that Eco attempts to model in his revised Katz and Fodor (KF) model (Figure 4.1).

 

 


Figure 4.1 The revised Katz and Fodor (KF) model (Eco, 1976)









In explanation: a sign vehicle /s-v/ is a signifier which is formed by a set of syntactic markers (sm). This sign vehicle then has a meaning <<sememe>> that can be either a denotation d or a connotation c depending on the context (other signs within its system (cont)) and circumstances (signs outside of its specific system [circ]) with which it is encountered. The contextual and circumstantial parameters in which the sign vehicle is encountered affect the type of meaning that the sign vehicle may pertain to. In other words, the denotative and connotative meanings that a sign vehicle might have alter depending on when and where the sign vehicle is encountered (Eco, 1976, p. 105). For example, the word ‘blue’ might be encountered in relation to ‘sky’, ‘grass’ and ‘feeling’. Each alternative word alters the meaning of ‘blue’, offering different denotations and connotations. ‘Blue sky’ simply denotes the colour of the sky. ‘Blue grass’ is a type of American folk music. ‘Feeling blue’ connotes an emotional state.
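To make the mechanics of the model more concrete, the sketch below models a sign vehicle whose denotative and connotative readings are selected by the contextual and circumstantial markers with which it is encountered, using the ‘blue’ example above. This is an interpretive illustration in Python rather than Eco’s own formalism; the class names and selection logic are assumptions made purely for the example.

```python
# An interpretive sketch of the revised KF model: a sign vehicle /s-v/ whose
# readings are selected by context (cont) and circumstance [circ].
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Reading:
    """One branch of the semantic field: the context/circumstance selectors plus
    the denotation (and optional connotation) they license."""
    context: str          # (cont): other signs within the same system
    circumstance: str     # [circ]: signs outside the specific system
    denotation: str
    connotation: str = ""

@dataclass
class SignVehicle:
    """A signifier with its syntactic markers and a field of possible readings."""
    signifier: str
    syntactic_markers: list[str]
    readings: list[Reading] = field(default_factory=list)

    def interpret(self, context: str, circumstance: str = "") -> Optional[Reading]:
        """Return the reading selected by the given context and circumstance,
        mimicking a path through the revised KF tree."""
        for r in self.readings:
            if r.context == context and (not r.circumstance or r.circumstance == circumstance):
                return r
        return None

# The 'blue' example from the text: meaning shifts with the accompanying sign.
blue = SignVehicle(
    signifier="blue",
    syntactic_markers=["adjective"],
    readings=[
        Reading("sky", "", "the colour of the sky"),
        Reading("grass", "American music", "bluegrass, a type of American folk music"),
        Reading("feeling", "", "sadness", "an emotional state"),
    ],
)

print(blue.interpret("sky").denotation)                       # the colour of the sky
print(blue.interpret("grass", "American music").denotation)   # bluegrass, a type of American folk music
print(blue.interpret("feeling").connotation)                  # an emotional state
```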




4.1   Analysing the Phones with the Revised KF Model


This study differs from the other two in that it is not taken from a domain of applied semiotics but relies on Eco’s theory of semiotics directly. Like the previous two studies the aim is to evaluate the elements of the phone interfaces in relation to the concepts identified as relevant to new media. The premise here is that Eco’s revised KF model, where denotations and connotations are dependent on the context and the circumstances of organisational structuring, should be directly applicable to the phone elements as signifiers that bear meaning. Therefore, the established method of analysis equates context (Cont() in the diagrams presented below) with the sequence of screens that are displayed through interaction and circumstance (Cir[]) with the concurrent structuring of signs on each screen. In doing so, the revised KF model is adapted to analysing sequential and concurrent syntagms within phone interfaces. It is important to point out here that in the diagrams presented below connotations are omitted purely because of the number of images and the space they would take up. Where possible connotations are at least alluded to if not explicated in the supporting text.



4.1.1  The perceived meanings of phone signs

According to semiotic theory (Eco, 1976; Hjelmslev, 1966; Andersen, 1990), semantic fields are essentially the range of possible meanings that are associated with any particular sign. In this respect, the semantic field of a specific word contains the synonyms of that word, e.g. beautiful: good-looking, gorgeous, stunning, attractive. All of these words are related in meaning, and the use of the sign ‘beautiful’ may refer more closely to one of these other words depending on its context of use.

In relation to the signs on the phones, it is interesting to explore their semantic fields because they are relatively new cultural phenomena that consist of new and old signs. With this in mind, it is the control section of the phones and the screen that are most interesting. Buttons are designed with symbolic codes that are supposed to communicate their function, while screens consist of integrated symbols that communicate information to the user about the functioning of the phone.

On the 5110 and the 6150 there are two buttons one above the other that have arrows printed on them. On the 5110 they are to the right hand side of the phone. On the 6150 they are in the centre of the phone. On the GD35 there is a large central button that has arrows pointing up, down, left and right. The possible meanings of this sign in relation to different contexts are outlined below using Eco’s Revised KF model (Figure 4.2).



Figure 4.2 The semantic field of the arrow signs on the GD35



In this diagram the button is presented on the left hand side. The arrows point towards the contextual situations in which the sign might be encountered and the subsequent denotative meanings associated with them, i.e. the semantic field of the sign. Culturally, then, this sign has a number of different meanings associated with it that are brought to an interaction. It is not exactly clear what the button is for, but the signs on the button denote something about navigation or direction. The same can be said of the arrow buttons on the two other phones (see Figure 1.1). This particular sign carries connotations with it too, inasmuch as the directional aspect of its compass form brings to mind the NATO logo.







Similarly, on the GD35 and the 5110 there is a button with a single ‘C’ symbol printed on it. The possible meanings of this symbol, outlined as a semantic field, are given below. Obviously there may be other contexts in which the symbols in these diagrams might have different meanings. The examples given are purely illustrative and not exhaustive.



Figure 4.3 The Semantic Field of ‘C’ on the GD35

 

Figure 4.3 shows how the different contexts that the ‘C’ symbol is used in can offer multiple meanings. It may purely denote the verbal or written phoneme ‘C’. It may denote the ‘clear’ function on a calculator or it might denote the element of Carbon in chemistry.


Figure 4.4 The extended semantic Field of ‘C’ on the GD35

 

In the case of the mobile phones though, it becomes apparent through interaction that ‘C’ stands for the ‘Cancel’ function (Figure 4.4). Therefore the semantic field for ‘C’ is extended and while it denotes ‘Cancel’ in the phones it retains these other possible meanings, the context of the mobile phone interface perhaps connoting meanings associated with a keyboard or calculator.




4.1.2  Analysis of Interaction Using Eco’s Revised KF Model

Taking a single symbol as a starting point, an exploration can be made, using Eco’s revised KF model, of its functional meaning throughout the process of interaction. The sign (Figure 4.5) only appears on the GD35, while buttons in a similar position on the 6150 have a small blue dash on them, as does the central button under the screen on the 5110. It is not at all clear what this sign means or indeed whether the buttons on the GD35 perform the same functions as those on the other phones.


Figure 4.5 The semantic Field of an unknown sign



At first glance this symbol is very difficult to decipher, as it bears no relation to any of the external codes used on the phone. Therefore it must be unique to the interface (although similarities may exist in other mobile phone interfaces). It is constructed from a large rectilinear shape, which has a smaller dark filled rectilinear shape occupying its bottom right hand corner. The problem here is that Eco’s formula seems largely useless because the sign is so new. However, if we follow his rationale and attempt to shed light on this rectilinear sign by exploring the context of its use we can pick up some clues as to its meaning.

What needs to be taken into account in relation to interactivity is the fact that, as Andersen points out (Andersen, 1990), it is not until a button is pressed that its functional meaning is ultimately revealed. Therefore, the proof of what the sign is attempting to communicate is only verified through interacting with it. An entirely new form of semiotic model may be required to account for this type of activity but this does not mean that Eco’s formula cannot be used at this stage.







Presumably, considering the nature of activating buttons, when the button is pressed some activity will occur within the phones system. This gives us two distinct semiotic phases, if we consider that by pressing this particular button this activity will be registered on the screen. So what does the change in state say about the functionality of the button and does this shed any light on the meaning of the sign printed on the button?

Given that the meaning of a sign in Eco’s model is entirely reliant on its denotations and connotations in relation to differing contextual and circumstantial variables, it follows that the model can capture the complex alterations in meaning that arise when a sign is viewed in different situations. This flexibility in Eco’s model provides an opportunity to view those alterations. That is to say, it allows the mapping of the different stages of meaning that are denoted and connoted by the signs of a system as the state of that system is altered during an interaction.

Choosing the interaction goal of ‘check messages’ for the study of the mobile phones limits the number of stages mapped. Concentrating only on the GD35 interface, the meanings of the active button signs are mapped here in relation to the circumstances and context denoted by the other signs in the interface. Initially, it is not clear what some of the control buttons signify, but through studying the interaction with the device it becomes apparent that they hold a close relationship with the signs displayed on screen and with each other.


Figure 4.6 The Main screen of the GD35 (Note: all diagrams omit connotations due to space restrictions)



Figure 4.6 shows what the initial screen looks like at the start of interaction along with the corresponding semantic field of the right hand button in relation to the displayed information. By focusing on the meaning of the active sign at each stage of the interaction, it is possible to see how the operation of the device is conveyed through the relationships between its signs. As the interaction takes place the signs change and meanings are altered. Sign elements come together over time in order to form sequential syntagms that the user interprets as the interaction takes place. On the ‘Main Screen’ the function of the right hand button is denoted by [menu] in the black square at the bottom right of the screen. Eco’s model shows how the action of pressing the button results in the function <<go to menu>>.

 

·    User Action: Press the right hand button.

·   System State: The Screen now shows a list of functions with an arrowhead pointing from left to right at the beginning of the first heading on the screen.




Figure 4.7 Menu screen



Figure 4.7 shows how the functional meaning of the right hand button has altered. Pressing the button will not result in the same operation because the circumstances have changed. These changes are indicated by the presence of a list, an indicator arrow to the left of the screen and [Enable] in the bottom right of the screen. Here the interaction moves to a different input button that performs a different function which has a meaning all of its own. Looking at both buttons with Eco’s model begins to show how complex the relationships are across the signs. Both signs convey information in terms of the focus of action, switching the operative mode between them through the signs on the interface. The right hand button denotes <<Enable Key Guard>> as its function. This is not an action that will move the user closer to achieving the goal, i.e. the combination of displayed signs does not mean what the user wants to communicate to the phone. The user is forced to look elsewhere on the interface for a possible action that will move them further forwards. This is where the information supplied by the signs in relation to the Navigation button fulfils the intention of ‘checking messages’. The signs have to be organised in the correct syntagmatic structure in order to proceed correctly, otherwise an alternative function of the phone will be brought into operation.







·    User Action: Use the Navigation button to scroll down the choices one at a time.

·   System State: The indicator arrow moves down to the next heading on the list and the number at the top of the screen changes from 1 to 2. ‘Menu’ is displayed instead of ‘Enable’.

 


Figure 4.8 The menu screen stage 2

 

The next screen (Figure 4.8) shows how the meanings of the two buttons interrelate with one another. The function <<Move Down>> is directly related to the change in state of the right hand button <<display Phonebook Menu>>. Again this is not the meaning the user is looking for. The concurrent syntagm created between the signs on the screen and two button signs generates the meaning <<Move Down>>, which in turn moves focus to the down arrow at the bottom of the navigation button in order to get to ‘Messages’.

 

·   User Action: Use the Navigation button to move down to ‘Messages’.

·   System State: The indicator moves down to point at ‘Messages’. The number shows three.

 


Figure 4.9 The menu screen stage 3



Having now reached the ‘Messages’ option (Figure 4.9), the right hand button becomes the focus of activity again. The indicator arrow is pointing at the choice the user wants to make. Also, it is apparent that if the Navigation button is pressed again the user will go too far down the list. By using Eco’s formula at this level it can be seen how the various functional meanings of the signs are created as the user moves through the interaction. Moreover, it is possible to see how meaning is made through understanding the relationships between the signs that are constructed within the interface. In other words, it is the user’s syntagmatic structuring of the signs in the interface that is a central part of interaction.
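Read in this way, the walkthrough above behaves like a small state machine in which the functional meaning of each button is re-coded at every screen. The sketch below is a loose reconstruction of the ‘check messages’ sequence on the GD35 as described around Figures 4.6 to 4.9; the state names, button labels and transition table are illustrative assumptions paraphrased from the text, not taken from the phone itself.

```python
# A hedged reconstruction of the GD35 'check messages' interaction as a state machine.
# (screen state, button pressed) -> (next screen state, functional meaning of the press)
TRANSITIONS = {
    ("main", "right"):                 ("menu:1 Key Guard", "go to menu"),
    ("menu:1 Key Guard", "right"):     ("key guard", "enable key guard"),       # not the user's goal
    ("menu:1 Key Guard", "nav_down"):  ("menu:2 Phonebook", "move down"),
    ("menu:2 Phonebook", "right"):     ("phonebook", "display phonebook menu"), # not the user's goal
    ("menu:2 Phonebook", "nav_down"):  ("menu:3 Messages", "move down"),
    ("menu:3 Messages", "right"):      ("messages", "display messages menu"),
}

def run(state: str, presses: list[str]) -> None:
    """Trace how the meaning of each button press depends on the current screen."""
    for button in presses:
        state, meaning = TRANSITIONS[(state, button)]
        print(f"pressed {button!r:>10} -> <<{meaning}>> (now on '{state}')")

# The goal-directed sequence from the text: open the menu, scroll down twice, select Messages.
run("main", ["right", "nav_down", "nav_down", "right"])
```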



4.1.3  Comparisons Across Three Phones

Looking at just one sign it is possible to see how Eco’s model can act as a map of its various functions across the different contexts and circumstances within the interactive possibilities of the phone. Figure 4.10 shows the map of related functions that the right hand button has in the various stages of the ‘Check Messages’ interaction. The structure of the semantic tree becomes more complex as various configurations of signs are revealed according to the differing Circumstances and Contexts that occur throughout the interaction. In all the phones examined, similar screen states and relationships between the signs in the interface were encountered.


Figure 4.10 A functional model of meaning for the Panasonic GD35 Select button









If this mapping were to continue over every interaction that the button has a relationship with, it could be shown just how complex the construction of meaning with regard to this sign is within the device as a whole. This can be seen as a map of the operation of the button, from which a concept of its operation is derived. In general terms, while there are a number of different denoted meanings related to this button, a close look reveals that many of them are similar. What is learnt about the button is that it operates across the signs in the interface as a ‘Select’ button. The abstract concept derived from its operation is one that allows the user to make a selection based on the complex arrangement of interface signs at any moment throughout the interaction. Thereby we have some notion, attached to the sign, of the concept with which it operates.

[Image: 5110selectmodelcopy.gif]

Figure 4.11 A functional model for the Nokia 5110 Select button

In the other phones (Figure 4.11), similar patterns of meaning structures emerge through analysing interactions within their specific systems of signs. The middle button on the 5110 and two buttons on the 6210 have blue lines across them. Despite the differences in the position, size and form of the symbols on these buttons, they behave just like the select buttons on the GD35.


[Image: navmodelgd35copy.gif]

Figure 4.12 A functional model of the GD35 Navigation button


In the same way, other buttons across the phones behave similarly. For example, consider the arrow buttons mentioned earlier (Figure 4.12). Although they appear graphically distinct on each phone, they not only have similar semantic fields but in practice operate in very similar ways across the phones (Figure 4.13).


[Image: navmodel6150copy.gif]

Figure 4.13 A functional Model of the Nokia 6150 navigation button


Both the GD35 and the 5110 have a button marked with a ‘C’, but the 6210 does not. The function of this button is not immediately apparent, but after some investigation it becomes clear that it generally functions as a ‘Cancel’ button (Figure 4.14).


[Image: ecocmodel2.gif]

Figure 4.14 The contextual functional relationships of the ‘C’ button








It operates in different ways depending on the length of time it is held down, but in general its function is to erase operations that have just been performed, returning to the main screen. The function of this button is identical on both phones (Figure 4.15).
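
As a purely hypothetical illustration of this circumstantial variation, the short sketch below distinguishes a short press of the ‘C’ button from a long press; the 1.5 second threshold and the exact wording of the effects are assumptions, and only the general ‘erase what was just done, or return to the main screen’ behaviour comes from the analysis.

```python
def c_button(hold_seconds: float) -> str:
    """Hypothetical effect of the 'C' (Cancel) button for a given hold duration."""
    if hold_seconds >= 1.5:   # assumed threshold for a 'long' press
        return "Clear the current operation and return to the main screen"
    return "Erase the operation just performed"

print(c_button(0.3))  # Erase the operation just performed
print(c_button(2.0))  # Clear the current operation and return to the main screen
```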

Figure 4.15 Metasymbolic aspects of Select signs

Different phones use different symbols, yet once a symbol is understood on one phone, the principle of its application transfers across phones. While not strictly across domains, this is essentially the principle of metaphor at work, where it becomes possible to substitute the signifier from one phone for the signifier on another while maintaining the same functional meaning (see Figure 4.15 and Figure 4.16).

[Image: metanavigationcopy.gif]

Figure 4.16 Metasymbolic aspects of navigation signs
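
The metasymbolic substitution illustrated in Figures 4.15 and 4.16 can be sketched as a simple table from phone to ‘Select’ signifier; the signifier descriptions below are paraphrased assumptions, but the point is that any of them can stand in for the others because they share the same functional meaning.

```python
# Phone -> the signifier that carries the 'Select' meaning (descriptions assumed)
SELECT_SIGNIFIERS = {
    "Panasonic GD35": "right hand soft key",
    "Nokia 5110":     "middle button with a blue line",
    "Nokia 6210":     "buttons marked with blue lines",
}

def transfer(known_phone: str, new_phone: str) -> str:
    """Substitute one phone's 'Select' signifier for another's, keeping the functional meaning."""
    return (f"The {SELECT_SIGNIFIERS[known_phone]} on the {known_phone} plays the same role "
            f"as the {SELECT_SIGNIFIERS[new_phone]} on the {new_phone}: both signify <<Select>>.")

print(transfer("Panasonic GD35", "Nokia 5110"))
```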




2.2   Findings

Consideration of the first part of this study reveals how some signifiers carry a certain amount of cultural baggage with them, offering a host of potential meanings in the form of a semantic field. Given a certain context and/or circumstance, only certain meanings become appropriate. This is how coding and decoding work. Essentially, Eco’s theories, and the revised KF model in particular, offer an opportunity to articulate notions of decoding in relation to denotations and connotations that depend on the context and circumstance in which they are encountered. Indeed, as the second part of this study shows, new signs often have no cultural frame of reference; they are under-coded, and work commences to establish what they mean in relation to the signs around them.

The second part of the study concentrates on exploring the possible meanings of a sign in relation to the changing contexts and circumstances that occur during interaction with the phone. At each stage of interaction, the relationships between signs and the resultant meanings are laid bare. This section in particular highlights how the concepts of syntagmatic structuring are articulated during an interaction. Each screen is itself a grouping of concurrent signs, based on the internal paradigmatic structuring of functionality within the phone. The concurrent arrangements of signs in screens allude to interactive possibilities, while the resultant changes in those signs become manifest through the sequential nature of interaction. Manipulating the phone’s controls to produce the correct sequential syntagmatic relationships out of the concurrent arrangement of signs on each screen is what achieves the end goal.

While this is only a very short task, the study shows how complex these relationships are, as well as how time-consuming it might be to analyse the whole interface. With this in mind, section 4.1.3 considers how the semantic fields of signs become extended through establishing a working concept of a sign within a particular set of contexts and circumstances. Moreover, relationships between signifiers can be established when it becomes apparent that they are associated with the same working concept.

A difficulty in applying Eco’s formula to interactive interfaces is that it is far more difficult to establish the context and circumstance of a sign within an interface, owing to the hidden depths of its functionality within the operation of the device as a whole. Context and circumstance in written texts are constituted by the other words that surround the word or phrase being analysed. Within an interactive interface this remains true, but there may also be many hidden operations that alter the meanings of a sign and only become apparent over time.










3    Discussion

Each of the analytic techniques used in this paper is different, but each has been used in an attempt to apply some aspect of semiotic theory to the analysis and interpretation of interface elements. While each has its own particular strengths and weaknesses, the important thing to consider is how relevant aspects of semiotic theory have been identified and explored in relation to interactive systems and new media.

For example, it is quite apparent in the product analysis, which focuses on sign types, that icons, indices and symbols are evident across all three phones. This type of analysis was able to identify subtle differences in the construction of these sign types, which, while often looking different, conveyed the same sorts of meanings across all three phones.

All three studies provide evidence of the concurrent arrangement of signs across the three phones. The first analysis identifies not only similar syntagmatic structures of signs across each phone, but also different signs with similar meanings, which are arranged differently in the interface. Similarly, the visual analysis also uncovers these differences, along with the fundamental organisation of mobile phone forms, such as the hierarchical visual structure of screen above control keys. However, this type of analysis is fairly limited in relation to uncovering aspects of meaning over and above the structure of the interface.

The third analysis again discusses concurrent syntagmatic relationships but, more importantly, it uncovers the way in which signs are manipulated sequentially in order to establish meanings. Interaction across all three phones relies on the ability of a user to interpret and manipulate signs in both concurrent and sequential relationships. Through exploring semantic fields, along with the contextual and circumstantial relationships that establish denotations and connotations, it becomes apparent that many signs in the phones’ interfaces have many potential meanings. While the analysis predominantly explores how a sign’s functional meaning is established through interaction, attention is drawn to the place of this meaning in relation to a wider cultural context. This particular aspect of the analysis highlights the role of the reader as an interactive interpreter of the interface. Meanings are derived through the manipulation of signs within the structure of the interface, as well as from the realm of potential meanings brought to the interaction by the user. Obviously, in interfaces such as these, which attempt to communicate their functionality through signs and symbols, there is a real potential for misunderstanding. Signs do not always have a one-to-one relationship with functionality, and it is only through learning the language of the interface signs that the full potential of the phone can be unlocked. Clearly, some elements of these three studies highlight this problem and offer the possibility of identifying combinations of signs that might prove confusing. Thus, this exploration of semiotic analysis highlights that:

·   It is not always clear what signs initially mean. Many signs in interfaces are culturally defined and may bring confusing multiple meanings with them from external domains. Similarly, some signs are ‘phone’ specific and are thus under-coded, making it difficult to understand what they denote.

·   The functionality of the devices is expressed by signs that have to be learned. The under-coding of some signs means that phone interfaces have a language of their own, which is difficult for novice users to pick up and interpret. Signs tend to be symbolic rather than iconic or indexical; opaque rather than transparent.

·   It is the sequential process of interacting with the dynamic interface that creates meaning, even when the interface signs are not fully understood. It is the concurrent and sequential structuring of elements by the user that allows meaning to be produced and functions to be communicated. The designer and user are both authors and readers of interactive texts.




4    Conclusions

Icons, indices and symbols, as well as concurrent and sequential syntagms, are all evident in each of the phone studies presented here. For example, we see the abstraction of battery life and signal strength into indexical signs that communicate the phone’s state. Moreover, in newer versions of these phones, concepts such as address books and tools for personalising the phone are abstracted and represented by icons, such as a picture of an address book or of screwdrivers and spanners respectively. Semiotic theory therefore does indeed connect the technologies of new media to the critique of older media. In doing so, this semiotic approach provides useful terminology that helps to articulate the problems of conveying meaning. It is without doubt very useful to think of new interactive media from this perspective.







Furthermore, new phones continue the progression of convergent technology, where sign systems from different domains of functionality are integrated into one device. For example, many new phones contain sign systems that require an understanding of cameras, media players and Internet connections, as well as the now familiar texting and calling options. This convergence of functionality leads to more complicated interfaces and thus a greater need to develop understandable and meaningful signs to represent functionality. A semiotics of interactive media must address this problem by establishing what kinds of signs it hopes to identify and how people make sense of interacting with them.

Older semiotic theory is, for the most part, aimed at analysing ‘static’ texts; that is, texts constructed by an author in which the constituent parts have been organised into a structure that does not change over time. As the studies presented here show, this is not the case in interactive media. Although designers are still responsible for organising the structure of software applications, the very thing that makes them interactive, namely the introduction of the agency of the user, makes semiotic analysis with traditional static methods problematic. The possibility of personalisation and self-organisation, as controlled by the user in some applications, confounds the traditional static semiotic approach even further. A semiotics of new media has to be able to cope with dynamic texts that alter over time as users interact with interfaces and content.

Taken individually, the three semiotic analyses presented here do not provide a complete account of the relevant semiotic theory needed for analysing interactive systems; none of them is singularly effective in describing the phones. However, each analysis provides evidence for at least one aspect of relevant semiotic theory, with significant crossover between them all. When they are considered together, the possibility of an integrated semiotic theory, drawn from older media, for analysing convergent media artefacts becomes an attractive proposition. Using semiotic analysis in this way highlights that elements of older semiotic approaches, while useful, are not enough in themselves. In order to understand interactions with new media, the relevant aspects of semiotic theory must be combined in such a way as to produce a semiotics of new media that is capable of articulating its specific characteristics. This is particularly relevant for interactive systems outside the work domain, where there is a great deal more speculation surrounding interpretation and meaning, e.g. interactive TV, games and interactive artwork.



References

Andersen, P. B. (1990) A Theory of Computer Semiotics, Cambridge, Cambridge University Press.

Andersen, Peter Bøgh, (2001) What Semiotics can and cannot do for HCI, Knowledge-Based Systems, Volume 14, Issue 8, 1 December 2001, pp. 419-424.

Barthes, R. (1977) Image, Music, Text, Fontana Press, London.

Benyon, D. (1994) A Functional Model of Interacting Systems: A semiotic Approach. In J. H. Connelly & E. A. Edmonds (Eds.), CSCW and Artificial Intelligence, London: Springer Verlag, pp. 105-125.

Benyon, D. (2001) The new HCI? Navigation of information space, Knowledge-Based Systems, Volume 14, Issue 8, 1 December 2001, pp. 425-430.

Bolter, J. D. and Grusin, R. (1999) Remediation: Understanding New Media, The MIT Press, Cambridge, Massachusetts.

Byrne, M.D. (2003) Mental Models, in The Human-Computer Interaction Handbook: Evolving Technologies and Emerging Applications, J.A. Jacko and A. Sears (Eds), Lawrence Erlbaum Associates, New Jersey, pp. 97-117.

de Haan, G. (1994) An ETAG-based Approach to the Design of User Interfaces. In: Tauber, M.J., Traunmüeller, R. and Kaplan, S. (eds.) Proceedings Interdisciplinary approaches to System Analysis and Design. Schärding, Austria, 24-26 May, 1994.

de Haan, G. (1997) How to Cook ETAG and Related Dishes: Uses of a Notational Language for User-Interface Design. Cognitive Systems 4, 3-4, 353-379.

de Haan, G. (2000) ETAG, a Formal Model of Competence Knowledge for User Interface Design. Doctoral thesis. Department of Mathematics and Computer Science, Free University Amsterdam, October 2000.

de Souza, C. S., Barbosa, S. D. J. and Prates, R. O. (2001) A semiotic engineering approach to user interface design, Knowledge-Based Systems, Volume 14, Issue 8, 1 December 2001, pp. 461-46.

Eco, U. (2000) Kant and the Platypus: Essays on Language and Cognition, Vintage, London.







Eco, U. (1976) A Theory of Semiotics, Indiana University Press, Indiana.

Eco, U. (1986) Function and Sign: Semiotics of Architecture. In M. Gottdiener & A. Lagopoulos (Eds.), The City and the Sign, Columbia University Press, New York.

Halliday, M. A. K. (1978) Language as social semiotic, The social interpretation of language and meaning, Edward Arnold, London.

Hjelmslev, L. (1961) Prolegomena to a Theory of Language. Madison: University of Wisconsin Press.

Karat, J. (2003) Mental Models, in The Human-Computer Interaction Handbook: Evolving Technologies and Emerging Applications, J.A. Jacko and A. Sears (Eds), Lawrence Erlbaum Associates, New Jersey, pp. 1152-1164.

Kieras, D. (2003) Mental Models, in The Human-Computer Interaction Handbook: Evolving Technologies and Emerging Applications, J.A. Jacko and A. Sears (Eds), Lawrence Erlbaum Associates, New Jersey, pp. 1139-1151.

Kitajima, M. and Polson, P. G. (1992) "A Process Model of Display-Based Human Computer Interaction." CHI 1992 Research Symposium.

Kitajima, M. and Polson, P. G. (1995). Measuring the Gulf of Evaluation in Display-Based HCI. A position paper presented at the workshop on Cognitive Architecture and HCI in CHI'95

Kress, G. and van Leeuwen, T. (1996) Reading Images (The grammar of visual design), Routledge, London.

Lakoff, G. and Johnson, M. (1980) Metaphors We Live By, University of Chicago Press, Chicago.

Lakoff, G. and Johnson, M. (1999) Philosophy in the Flesh, Basic Books, New York.

McCullough, M. (1996) Abstracting Craft: the practiced digital hand, The MIT Press, Cambridge, Massachusetts.

Nadin, M. (2001) One cannot not interact, Knowledge-Based Systems, Volume 14, Issue 8, 1 December 2001, pp. 437-440.

Norman, D. (1988). The Design of Everyday Things. The MIT Press, London, England.

O’Neill, S. J. (2005) Exploring a Semiotics of New Media, Doctoral Thesis, Napier University, Edinburgh.

O'Neill, S. J. and Benyon, D. R. (2003a). An Exploration of a Semiotic Model of Interaction Through Interactive Media. Paper presented at the International HCI, Arts and Humanities workshop, 21st July 2003, University of York, York, UK.

O'Neill, S. and Benyon, D. (2003b). A Semiotic Approach to Investigating Presence. Paper presented at COSIGN-2003, Middlesbrough.

O'Neill, S., Benyon, D. R., and Turner, S. R. (2002). Semiotics and Interaction Analysis. Paper presented at the ECCE 11, Catania, Sicily.

O'Neill, S., Benyon, D. R., and Turner, S. R. (forthcoming). The Semiotics of Interactive Systems. To appear in Cognition, Technology and Work.

Peirce, C. S. (1931-1958) Collected Papers of Charles Sanders Peirce (Vol. 1). Harvard University Press, Cambridge.

Prates R, de Souza C. S. and Barbosa S. D. J. (2000) A method for evaluating the communicability of User Interfaces, Interactions Jan/Feb, pp. 31-38.

Saussure, F. (1966) Course in General Linguistics, McGraw Hill, New York.

van der Veer, G. and del Carmen Puerta Melguizo, M. (2003) Mental Models, in The Human-Computer Interaction Handbook: Evolving Technologies and Emerging Applications, J.A. Jacko and A. Sears (Eds), Lawrence Erlbaum Associates, New Jersey, pp. 52-80.

Vihma, S. (1995) Products as Representations - a semiotic and aesthetic study of design products. Helsinki: University of Art and Design Helsinki.

Westerlund, B. (2002) Form is Function. Paper presented at the DIS2002, London.

 

 

 

 













© 2005, Applied Semiotics / Sémiotique appliquée