L01 Introduction 2006


Published on June 27, 2007

Author: TSBAIM

Source: authorstream.com

Methodology & Explanation
Richard Joiner
MSc Human Communication & Computing
Lecture 1: Introduction

Introduction
Course aim
Course objectives
Course assessment
Rationale
Laboratory studies
Field studies

Aim
To give students an introductory understanding of the research methods used in human-computer interaction and communication research.
To raise students' awareness of the scientific and engineering methods used in the context of human-human and human-computer interaction.

Objectives
Apply appropriate techniques for the interpretation of material, including observational and ethnographic material.
Develop a critical understanding of the assumptions that underpin the development and application of models.
Apply methods of analysis, experimentation and model building.
Distinguish between descriptive, predictive and prescriptive models.
Design and carry out empirical studies, including experimental and observational approaches.
Apply analytical techniques to the analysis of human-human and human-computer interaction.
Construct descriptive, qualitative, quantitative and explanatory accounts of human-human and human-computer interaction.

Assessment
Four practical assignments, submitted as a portfolio at the end of the unit, for a total value of 80%.
An oral presentation on an assigned reading of your choice (first-come basis), in which the materials and approaches used are presented in a critical manner: 20%.

Moodle
The course materials are located in a VLE called Moodle: http://www.bath.ac.uk/e-learning/

Course Text Book
Preece, J., Rogers, Y. & Sharp, H. (2002) Interaction Design: Beyond Human-Computer Interaction. Chapter 11. London: Wiley.

Methodology & Explanation
Richard Joiner
MSc Human Communication & Computing
Lecture 1: DECIDE, a framework for evaluation

Objectives
Describe the evaluation paradigms and techniques used in interaction design.
Discuss the conceptual, practical and ethical issues to be considered when planning an evaluation.
Introduce the DECIDE framework to help you plan your evaluation.

Introduction
Evaluation is not just about desktop computing. What other kinds of technologies are we evaluating?
Collaborative technologies
Immersive technologies
Tangible interfaces
Mobile and wireless

Evaluation Paradigms
Any evaluation is guided by a set of beliefs and practices, known as an evaluation paradigm. Each paradigm has a set of techniques associated with it.
There are a number of evaluation paradigms:
Quick and dirty evaluation
Usability testing
Field studies
Predictive evaluation

Quick and Dirty Evaluation
Informal feedback from the users.
The emphasis is on speed.
Usually descriptive and informal.
Often consultants are used.

Usability Testing
Measuring user performance; measures include time taken and number of errors.
Strongly controlled by the evaluator and typically carried out in a laboratory.
Quantitative data is collected.

Field Studies
Carried out in natural settings.
Help identify opportunities for new technology, determine requirements for design, and facilitate the introduction of a new design.
Two distinct approaches: outsider and insider.

Predictive Evaluation
Experts apply their knowledge of typical users.
Often takes the form of heuristic evaluation.
Quick and relatively inexpensive.

Evaluation Techniques
Observing users
Asking users
Asking experts
Testing users' performance
Modelling users' performance

(Slide 24: a yes/no matrix mapping evaluation techniques to paradigms; the individual cell values are not recoverable from this transcript.)

DECIDE Framework
Well-planned evaluations are driven by clear goals and appropriate questions. We are going to use the DECIDE framework.

Determine the overall goals
What are the overall goals? Who wants the evaluation, and why?
An evaluation to determine users' needs is different from one to fine-tune an interface, to find the best metaphor for a conceptual design, or to see how technology will change work practices.

Explore the questions
Overall goals need to be broken down into questions that can be answered to satisfy them.
For example: why do people not use a computerised calendar? Perhaps they are too busy, or they don't want others to know what they are doing.

Choose the evaluation paradigm
Having chosen the goals and the main questions, the next step is to choose the evaluation paradigm and techniques.
The evaluation paradigm determines the techniques used.
Practical and ethical decisions must be considered.

Identify practical issues
There are many practical issues. These include:
Users
Facilities and equipment
Schedule and budget constraints
Expertise

Decide how to deal with ethical issues
The BPS and the ACM have ethical codes, which they expect their members to abide by.
Don't do anything that you would not like done to others.

Evaluate, interpret and present data
Reliability
Validity
Biases
Scope (generalisability, or external validity)
Ecological validity

Summary
Describe the evaluation paradigms and techniques used in interaction design.
Discuss the conceptual, practical and ethical issues to be considered when planning an evaluation.
Introduce the DECIDE framework to help you plan your evaluation.
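The six DECIDE steps above can be sketched as a simple checklist data structure. This is a hypothetical illustration only, not part of the course materials; all field and method names are invented for the sketch.

```python
from dataclasses import dataclass, field


@dataclass
class EvaluationPlan:
    """A minimal checklist sketch of the DECIDE framework.

    Each field mirrors one of the six DECIDE steps; the names are
    illustrative, not taken from the lecture.
    """
    goals: list = field(default_factory=list)             # Determine the overall goals
    questions: list = field(default_factory=list)         # Explore the questions
    paradigm: str = ""                                    # Choose the evaluation paradigm
    practical_issues: list = field(default_factory=list)  # Identify practical issues
    ethical_issues: list = field(default_factory=list)    # Decide how to deal with ethics
    findings: dict = field(default_factory=dict)          # Evaluate, interpret, present data

    def is_ready_to_run(self) -> bool:
        """A plan is ready once goals and questions are stated, a paradigm
        is chosen, and ethical issues have been considered."""
        return bool(self.goals and self.questions
                    and self.paradigm and self.ethical_issues)


# Example: planning the calendar study mentioned in "Explore the questions".
plan = EvaluationPlan(
    goals=["Determine users' needs around calendar tools"],
    questions=["Why do people not use a computerised calendar?"],
    paradigm="field study",
    ethical_issues=["Obtain informed consent (per BPS/ACM codes)"],
)
print(plan.is_ready_to_run())  # True: practical issues can still be filled in later
```

The point of the sketch is simply that the first five steps are planning inputs gathered before any data collection, while the final step (`findings`) is filled in last.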
