Which aspects have to be taken into consideration in choosing elicitation techniques for a product data management system?

Requirements for the Wheels case study system

Carol Britton, Jill Doake, in A Student Guide to Object-Oriented Development, 2005

Exercises

2.1

Design an interview plan for an interview with Naresh, the chief mechanic at Wheels.

2.2

List five questions that it would be useful to ask Naresh during the interview.

2.3

This chapter includes a list of requirements that have been identified during the requirements elicitation process. Suggest two further requirements that it would be useful to have in the new Wheels system.

2.4

Mike, the owner of Wheels, is wondering whether to expand the hire business to include sports items such as skateboards, surfboards, golf clubs and tennis rackets. He wants to find out whether this would be attractive to his current customers.

Using the example in Figure 2.3 as a guide, design a questionnaire to find out whether there is a market for hire of sports items among Wheels’ current customers.

2.5

Following the example scenarios in Figures 2.4 and 2.5, write a scenario to illustrate what happens when a customer, Sheena Patel, returns a bike on the due date, but there is some damage to the front wheel which is charged at £10.

2.6

Following the layout of the problems and requirements list in Figure 2.7, document the details of a requirement to keep records of customer details. You can assume that this requirement has come from Annie and that it is essential. It is related to the second requirement shown in Figure 2.7 to record details of previous bikes hired.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9780750661232500025

Requirements Engineering

Bashar Nuseibeh, Steve Easterbrook, in Encyclopedia of Physical Science and Technology (Third Edition), 2003

III Eliciting Requirements

The elicitation of requirements is perhaps the activity most often regarded as the first step in the RE process. The term “elicitation” is preferred to “capture,” to avoid the suggestion that requirements are out there to be collected simply by asking the right questions. Information gathered during requirements elicitation often has to be interpreted, analyzed, modeled, and validated before the requirements engineer can feel confident that a sufficiently complete set of requirements for a system has been collected. Therefore, requirements elicitation is closely related to other RE activities—to a great extent, the elicitation technique used is driven by the choice of modeling scheme, and vice versa: many modeling schemes imply the use of particular kinds of elicitation techniques.

III.A Requirements to Elicit

One of the most important goals of elicitation is to find out what problem needs to be solved, and hence identify system boundaries. These boundaries define, at a high level, where the final delivered system will fit into the current operational environment. Identifying and agreeing a system's boundaries affects all subsequent elicitation efforts. The identification of stakeholders and user classes, of goals and tasks, and of scenarios and use cases all depend on how the boundaries are chosen.

Identifying stakeholders—individuals or organizations who stand to gain or lose from the success or failure of a system—is also critical. Stakeholders include customers or clients (who pay for the system), developers (who design, construct, and maintain the system), and users (who interact with the system to get their work done). For interactive systems, users play a central role in the elicitation process, as usability can only be defined in terms of the target user population. Users themselves are not homogeneous, and part of the elicitation process is to identify the needs of different user classes, such as novice users, expert users, occasional users, disabled users, and so on.

Goals denote the objectives a system must meet. Eliciting high-level goals early in the development process is crucial. However, goal-oriented requirements elicitation is an activity that continues as development proceeds, as high-level goals (such as business goals) are refined into lower-level goals (such as technical goals that are eventually operationalized in a system). Eliciting goals focuses the requirements engineer on the problem domain and the needs of the stakeholders, rather than on possible solutions to those problems.

It is often the case that users find it difficult to articulate their requirements. In such cases, a requirements engineer can elicit information about the tasks users currently perform and those that they might want to perform. These tasks can often be represented in use cases that describe the outwardly visible interactions of users and systems. More specifically, the requirements engineer may choose a particular path through a use case, a scenario, in order to better understand some aspect of using a system.

III.B Elicitation Techniques

The choice of elicitation technique depends on the time and resources available to the requirements engineer, and of course, the kind of information that needs to be elicited. We distinguish a number of classes of elicitation technique:

Traditional techniques include a broad class of generic data gathering techniques. These include the use of questionnaires and surveys, interviews, and analysis of existing documentation such as organizational charts, process models or standards, and user or other manuals of existing systems.

Group elicitation techniques aim to foster stakeholder agreement and buy-in, while exploiting team dynamics to elicit a richer understanding of needs. They include brainstorming and focus groups, as well as RAD/JAD workshops (using consensus-building workshops with an unbiased facilitator).

Prototyping has been used for elicitation where there is a great deal of uncertainty about the requirements, or where early feedback from stakeholders is needed. Prototyping can also be readily combined with other techniques, for instance, by using a prototype to provoke discussion in a group elicitation meeting, or as the basis for a questionnaire or think-aloud protocol.

Model-driven techniques provide a specific model of the type of information to be gathered, and use this model to drive the elicitation process. These include goal-based and scenario-based methods.

Cognitive techniques include a series of techniques originally developed for knowledge acquisition for knowledge-based systems. Such techniques include protocol analysis (in which an expert thinks aloud while performing a task, to provide the observer with insights into the cognitive processes used to perform the task), laddering (using probes to elicit the structure and content of stakeholder knowledge), card sorting (asking stakeholders to sort cards into groups, each card bearing the name of some domain entity), and repertory grids (constructing an attribute matrix for entities by asking stakeholders for the attributes applicable to each entity and the values of its cells); a small illustrative sketch of a repertory grid follows this list.

Contextual techniques emerged in the 1990s as an alternative to both traditional and cognitive techniques. These include the use of ethnographic techniques such as participant observation. They also include ethnomethodology and conversation analysis, both of which apply fine-grained analysis to identify patterns in conversation and interaction.
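
As a concrete illustration of one of the cognitive techniques above, the sketch below shows how an elicited repertory grid might be represented and reviewed in Python. The entities, attributes, and ratings are invented for illustration; a real grid would be filled in with a stakeholder.

# Hypothetical repertory-grid data: stakeholder ratings (1 = low, 5 = high)
# of each attribute for each domain entity. All names and values are invented.
ratings = {
    "invoice":          {"frequency of use": 5, "sensitivity": 3, "lifetime": 2},
    "purchase order":   {"frequency of use": 4, "sensitivity": 2, "lifetime": 3},
    "customer account": {"frequency of use": 3, "sensitivity": 5, "lifetime": 5},
}
attributes = ["frequency of use", "sensitivity", "lifetime"]

# Print the grid as an entity-by-attribute matrix for review with the stakeholder.
print(f"{'entity':<18}" + "".join(f"{a:>18}" for a in attributes))
for entity, row in ratings.items():
    print(f"{entity:<18}" + "".join(f"{row[a]:>18}" for a in attributes))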

III.C The Elicitation Process

With a plethora of elicitation techniques available to the requirements engineer, some guidance on their use is needed. Methods provide one way of delivering such guidance. Each method itself has its strengths and weaknesses, and is normally best suited for use in particular application domains.

Of course, in some circumstances a full-blown method may not be required. Instead, the requirements engineer simply needs to select the technique or techniques most suitable for the elicitation process in hand. In such situations, technique-selection guidance is more appropriate than a rigid method.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B0122274105008553

Elicitation of Probabilities and Probability Distributions

L.J. Wolfson, in International Encyclopedia of the Social & Behavioral Sciences, 2001

4 Eliciting Utilities

The ultimate goal of eliciting utilities is to make decisions that minimize expected loss (see Decision Theory: Bayesian), under the constraints of uncertainty specified by the prior distribution and likelihood function. Determining expected loss means constructing loss functions that consist of outcomes based on a quantity of interest, L, and the loss associated with each outcome. The loss can be fixed or random. In general, there is a set A of possible actions that can be taken, and for each element of A, there are losses that may or may not depend on L. The set A must be exhaustive; in other words, all possible actions must be enumerated, and each action must be exclusive of every other action.

The first stage of the elicitation process should be to choose the action set A and the quantity of interest L. The set of actions should be well defined and must contain at least two elements. These actions are usually dependent on some quantity of interest. For example, if the action set consists of choosing which family planning program to implement, the quantity of interest might be some quantification of the state of infant mortality. In choosing the quantity of interest, it is important to make sure that L is a quantity that incorporates the uncertainty inherent in the problem under study. Generally, L will be a random quantity defined by a statistical model, and as such, it can have several dimensions.

Once the action set A and the quantity of interest L have been determined, the second stage of the elicitation is to construct loss functions based on them. This is the most difficult stage of the elicitation process. If several stakeholders are involved in the decision-making process, they may construct these separately or as a group; regardless, those assessing the loss functions must give careful thought to the consequences resulting from each action in A for any value of L. For useful overviews of loss function elicitation strategies, see works by Clemen (1996), Zeleny (1982), Edwards (1992), and Saaty (1980). In general, it is necessary to specify the form the loss functions should take; this is sometimes specified in advance, but it is advisable to consider this as part of the elicitation process.

There are several ‘standard’ loss functions (see Decision Theory: Classical for a discussion of them). One of the best-known examples of a loss function is the 0–1 loss function generally applied to statistical hypothesis testing. In this case, the set of actions is given by A={a1: reject the null hypothesis, a2: do not reject the null hypothesis} (Berger 1985). An extension of this is the 0–k loss function, which is typically written (letting L=θ)

$$L(\theta, a_i) = \begin{cases} 0 & \theta \in \Theta_i \\ k_i & \theta \notin \Theta_i \end{cases}, \qquad i = 1, 2$$

There are many possible variants on this loss function that could be considered, and identifying the ‘form’ of the loss function is the more controversial of the tasks that must be performed in eliciting utilities. Given the conceptual framework of a 0–k loss function, consider the next elicitation task—identifying the values of k1, k2. Because k1, k2 can represent rather intangible quantities, it is often advantageous to specify their relative rather than absolute values. Consider that the ‘optimal’ decision in this example is to choose a1 when

$$\frac{k_1}{k_2} > \frac{p(\theta \in \Theta_2 \mid x)}{p(\theta \in \Theta_1 \mid x)} = \frac{p(\theta \in \Theta_2 \mid x)}{1 - p(\theta \in \Theta_2 \mid x)}$$

The remaining elicitation task needed for decision-making is then to elicit the ratio k1/k2, which is simpler than eliciting k1 and k2 directly or independently. The ratio on the right-hand side above can be viewed as the ratio of the posterior probability that the parameter lies outside the ‘rejection region’ (θ∈Θ2) to the posterior probability that it lies within the region (θ∈Θ1). From this, a general relationship can be drawn between the functional form of the model and the two quantities for which preferences need to be elicited; this is illustrated in Fig. 2.


Figure 2. A decision maker should choose a1 or a2 based on the values of P(θ∈Θ1) and k1/k2
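
To make the rule concrete, here is a minimal Python sketch of the decision criterion above; the posterior probability and the elicited loss ratio used in the example calls are made-up values, not elicited ones.

def choose_action(p_in_region2: float, k1_over_k2: float) -> str:
    """Choose a1 when the elicited ratio k1/k2 exceeds the posterior odds
    p(theta in Theta2 | x) / (1 - p(theta in Theta2 | x))."""
    posterior_odds = p_in_region2 / (1.0 - p_in_region2)
    return "a1" if k1_over_k2 > posterior_odds else "a2"

# With posterior probability 0.2 the odds are 0.25, so a loss ratio of 0.5 favors a1;
# raising the posterior probability to 0.6 (odds 1.5) flips the decision to a2.
print(choose_action(0.2, 0.5))  # -> a1
print(choose_action(0.6, 0.5))  # -> a2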

Exploiting the relationships between the functional form of the model and loss function and the quantities about which preferences and value judgments must be elicited is a useful strategy. Many examples of the application of evaluating utilities and preferences are found in journals in the fields of risk analysis, accounting, and economics.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B0080430767004125

User-Centered Requirements Definition

Alistair Sutcliffe, Jan Gulliksen, in Usability in Government Systems, 2012

Case Study: The Swedish National Tax Board (1994-2004)

We have had extensive cooperation with public authorities to improve their computerized work environment and development processes to better fit the need for change in society and technology. In a 10-year cooperation project, Uppsala University (Sweden) helped a public authority change its development processes toward a more user-centered approach for ongoing large-scale projects. That project yielded a number of lessons learned about the requirements elicitation process (Gulliksen et al., 2003):

Unsurprisingly, specifying requirements as use cases or using UML made them difficult to understand for the user organization, which was asked to confirm these requirements before development (Gulliksen, Göransson, & Lif, 2001).

Prototypes served as excellent tools to help clarify the understanding of the requirements for both the development organization and for the user representatives.

Usability designers played an important role as “go-betweens” between the user and the developer organization, to explain difficult terminology and to focus specifically on quality aspects from the user perspective.

The RE process was cumbersome and lasted a very long time (up to a year for a larger project was not uncommon). The need to change or add requirements late in the process was therefore considered a major problem that risked further delays, and such changes were forbidden in the requirements revision process, even though everybody acknowledged the need for modification.

Very large and long development projects that lacked the ability to adapt to changing requirements demonstrated a need to leave the requirements focus behind during the later phases of the project and to focus instead on the product at hand.

One of the things we noticed was the lack of interest in understanding and capturing all aspects of the work environment during the process of software development. Much attention was given to the development of the public face of these systems, and to the increased opportunity for citizens to interact directly with the government information that they needed. The changing role of the work conducted by the public authorities received less attention than the user-facing aspects, not only during development but also during requirements elicitation. One reason may be that while the public face of the systems receives much media attention, the effects that these systems could have on the staff working with them receive inadequate attention or understanding. In the planning of the public-facing projects, the initial expectation was that they would not necessarily result in a change of system for the staff. We noticed that, in one particular case, the developers logged into the public-facing systems, as the information provided by those systems differed from what was available in the internal case-handling systems.

One important aspect was the necessity to understand and work with the actual competencies and procedures that were already well established. See, for example, Figure 18.6; note the number of documents and “stickies” pasted around the display and on the bookshelves.


Figure 18.6. A typical work environment from a Swedish public authority (not the Tax Board)

Traditionally, requirements communicated the needs determined for the computerized system without considering the full breadth of the work environment that provided the context for such requirements. From looking at the work environment, it was obvious that the established work routines, the organization of the work, and the skills available also provided important information for requirements definition. We needed to understand and evaluate the context of use, so that we could engineer it to meet any new requirements that might be imposed (e.g., from legislative changes, new technology, or organizational development and improvement goals).

To enable us to understand the complexity of the context of use, we conducted a series of observation-interviews (see Figure 18.7), with the aim of gaining a better understanding of which aspects of the context of use should be maintained and which should be elaborated in greater detail in the development work. The results of the contextual interviews were analyzed and turned into new requirements in that process. A large amount of the knowledge gathered in such field studies, however, does not end up as new requirements; instead, it is maintained as a common repository of knowledge to be shared among those involved in the development work. Contextual knowledge helps developers interpret requirements and refine designs to suit users’ needs; not all aspects of users’ needs and goals can be captured as formal requirements.


Figure 18.7. Contextual interviews or field studies to gather requirements

The users themselves are always the people most familiar with, and the most knowledgeable about, the actual procedures in their work; but often they have little or no knowledge about the process of formulating requirements. Many user representatives have participated in requirements gathering work, but they often report a lack of influence, both in terms of what impact their participation actually had on the resulting system and on the contribution they could make because of the methods and tools used.

We used a collaborative prototyping activity to elicit the user requirements. We worked with low-fidelity tools and techniques, with which the users themselves drew sketches and diagrams. Then the users acted out operational sequences in their future system in cooperation with usability designers and system developers. In this way knowledge and understanding of the needs and requirements of the future computerized work situations were increased (see Figure 18.8). Both the users and the developers were very happy with the outcome, in that it clearly communicated the requirements in such a way that the users could understand and relate to them, and the developers could clearly see what the system should look like. The low-fidelity prototypes (see Figure 18.9), the scenarios, and the user profiles then became a part of the formal requirements in a way that felt rewarding for all involved. It also clearly gave more control to the business and user side of the organization in comparison to previous methodologies, in which the users felt like hostages under the control of the software developers. Users, usability designers, and developers participated on an equal footing to design and develop their future vision of the system.


Figure 18.8. Collaborative prototyping at the Swedish National Tax Board


Figure 18.9. Low-fidelity prototypes illustrating screen designs and operational sequences

Working with low-fidelity prototypes to elicit requirements provided a mutual language that evened out the power relations between the development team and users; furthermore, it made the team focus on developing future work tasks, rather than getting stuck in terminological or detailed graphical design issues that higher-level prototypes can engender.

In our evaluation, it turned out that the prototype-based requirements documents gave the developers, for the first time, a clear understanding and overview of the work to be done and a general feel for the requirements. It also worked well as a tool to provide the users with real power in the development phase, and as a result the subsequent system fulfilled its purpose much better than systems produced by previous development projects.

A prototype-driven development process where prototypes play an important role as a replacement or complement to the standard requirements specifications is more effective in meeting development goals; it decreases the risk of project failure or abandonment and increases quality and a sense of participation among the users.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B978012391063900050X

“Filling in the blanks”

Luigi Buglione, ... Andrea Herrmann, in Software Quality Assurance, 2016

7.3.3 Working with Stakeholders (with the Proper Requirements)

Another contextual aspect related to requirements processes concerns stakeholders and their interactions. The fewer perspectives taken into account, the fewer the potential (valid) requirements to be filtered, discussed, and then approved. Stakeholder Management and Requirements Elicitation are two areas on which many guides and frameworks have focused in recent years (PMI, 2013). For completeness and perspective from the various viewpoints, requirements should be grouped by stakeholders, as many RE textbooks indicate (Lauesen, 2002). In the original QFD, the left side of the matrix represented the “Customer” viewpoint. However, more often than not, there is no single customer, but many involved in the Requirements Elicitation process (PMI, 2013; Herrmann and Daneva, 2008). To account for this, our proposal for improving the project estimation process begins by improving the requirements processes. We leverage a technique we created (Buglione and Abran, 1999) called QF (Quality Factor), which calculates and manages the overall quality value of a software product from several viewpoints. More specifically, we account for three perspectives, which we refer to as E/S/T: Economical, Social, and Technical. We defined them using the ISO 9126 QM (now ISO 25010, ISO/IEC 25010:2011, 2011), expressed as a normalized value (from 0 to 1) on a ratio scale:

“E” represents the viewpoint of management;

“S” expresses the viewpoint of the final users;

“T” represents the viewpoint of the technical team working on the project.

The QF technique assists in matching a viewpoint against one or more QM sub-characteristics, which may lead an analyst to ask questions that elicit a requirement not previously noted. For example, “accessibility” is an attribute useful for both business users and developers, while “modularity” would typically be of interest to developers only. Moreover, functional requirements and functionality features are more likely to be selected by business managers and/or users.
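
As a rough illustration of the idea (not the published QF calculation itself), the sketch below aggregates normalized sub-characteristic ratings per E/S/T viewpoint into a single quality value; the sub-characteristic scores and viewpoint weights are invented.

# Normalized (0..1) ratings of quality sub-characteristics, one set per viewpoint:
# E = management, S = final users, T = technical team. All values are hypothetical.
ratings = {
    "E": {"functional suitability": 0.8, "maintainability": 0.5},
    "S": {"functional suitability": 0.9, "usability": 0.7},
    "T": {"maintainability": 0.9, "modularity": 0.8},
}
weights = {"E": 0.4, "S": 0.4, "T": 0.2}  # assumed relative importance of viewpoints

def viewpoint_score(subchars):
    # Average the sub-characteristic ratings for one viewpoint.
    return sum(subchars.values()) / len(subchars)

qf = sum(weights[v] * viewpoint_score(r) for v, r in ratings.items())
print(f"overall quality factor: {qf:.2f}")  # normalized value on a 0..1 ratio scale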

Creating such associations between requirements and their sources facilitates the creation of “profiles” (or “patterns”) that can be stored in an organizational project repository and made available for later use, for example, for comparing past and new project estimates (ISO 15939 refers to this as the MEB—Measurement Experience Base). This could be used, for instance, where a project manager needs to check whether the correct project scope has been defined, starting from a proper requirements elicitation involving the right (primary and secondary) stakeholders.

The application of a QM standard taxonomy (using one of the QMs previously discussed) may provide support during the requirement elicitation step in “filling in the blanks.” Therefore the QF contribution is to propose the grouping of customer requirements using at least two more criteria:

by customer perspective

by a “standard” QM taxonomy (at least for the nonfunctional side).

As previously discussed, this may reduce the uncertainty of the scope defined and applied at the estimation stage, as measured against a better-defined scope (Kassab et al., 2007) at the end of the Requirements stage and, ultimately, at the end of the project.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9780128023013000077

Applying Software Data Analysis in Industry Contexts

Madeline Diep, ... Forrest Shull, in The Art and Science of Analyzing Software Data, 2015

12.2.4 Applying Software Measurement in Practice—The General Approach

In this section, we briefly discuss Fraunhofer’s general approach to implementing a measurement program with an industry or government partner. The steps of this process reflect, as we will discuss in Section 12.3, that many of the challenges facing an industry measurement program are not matters of processing or analyzing data, but rather of working with people and organizations.

As exemplified in the QIP, our general approach to measurement comprises three phases performed iteratively: (1) requirements gathering; (2) metrics planning and formalization; and (3) metric program execution, consisting of implementation, interpretation, communication, and response (Figure 12.1).


Figure 12.1. General applied software measurement approach.

In the requirements gathering phase, we identify the relevant stakeholders and elicit their business needs. Through the elicitation process, the available assets (e.g., existing data, process, and insight) as well as constraints and limitations (e.g., data availability and access, personnel engaged in measurement, etc.) are discovered. We also obtain the stakeholders’ commitments by defining their roles and responsibilities in the measurement program.

In the planning and formalization phase, we articulate the business needs as measurement goals—specifying the purpose, object, focus, and context of the measurement. We also outline how the measures shall be analyzed and interpreted against the goal using GQM. We use the measurement goals, constraints, and limitations to define a measurement plan with specific metrics to gather, and the process (who, when, where, etc.) for gathering them. The formalization of the measurement plan includes the standardization of the vocabulary adopted in the measurement program, which alleviates the problems that arise when dealing with a heterogeneous set of stakeholders.
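
A measurement goal of this kind can be captured in a small data structure. The sketch below is a hypothetical illustration of the purpose/object/focus/context template with GQM-style questions and metrics attached; it is not Fraunhofer’s actual tooling, and the example goal, question, and metric are invented.

from dataclasses import dataclass, field

@dataclass
class MeasurementGoal:
    purpose: str   # e.g., "characterize", "evaluate", "improve"
    object: str    # the entity being measured
    focus: str     # the quality attribute of interest
    context: str   # where the measurement applies
    questions: list = field(default_factory=list)  # questions that refine the goal
    metrics: list = field(default_factory=list)    # metrics that answer the questions

goal = MeasurementGoal(
    purpose="evaluate",
    object="code review process",
    focus="defect detection effectiveness",
    context="maintenance projects at the partner organization",
)
goal.questions.append("What fraction of defects is found during review?")
goal.metrics.append("defects found in review / total defects found")
print(goal.purpose, goal.object, goal.focus, sep=" | ")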

The execution phase consists of four main activities: (1) implementation; (2) analysis and interpretation; (3) communication; and (4) response. In implementation, the measurement plan is executed and data is collected. During data gathering, unanticipated changes and/or roadblocks may occur and need to be addressed. Next, the gathered data is analyzed and interpreted with respect to the business goals with the help of Subject Matter Experts (SMEs). Results are then communicated to the relevant stakeholders in an easy-to-understand format. Finally, we gather the organization’s responses to the measurement program findings to define organizational improvement activities as well as evaluations and improvements to the measurement program. New business needs may be identified, and the measurement process may be repeated.

In the remainder of this chapter, our discussion of challenges and lessons learned follows this progression from measurement requirement gathering, to formalization, to the many phases of implementation.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9780124115194000124

Requirement Management

Jean-Louis Boulanger, in Certifiable Software Applications 1, 2016

10.2.3.2 Requirement analysis phase

10.2.3.2.1 Objectives

The requirement analysis phase aims to collate all stakeholders’ requirements. Note that the analysis phase enables us to identify the limits of the product (see Figure 10.9) and characterize interfaces with other products.


Figure 10.9. Environment of the software application

Figure 10.9 shows the environment of a system and/or application that comprises three inputs (Ei), two outputs (Sj) and three interfaces (Ik) with existing means (reused systems, power supply, etc.).

Stakeholders’ requirements form the basis for acceptance (or rejection) of a system, for negotiation and agreement on the project, for system development, and for managing changes to requirements. The requirements define the result expected by stakeholders; it is therefore necessary to elicit them as completely as possible.

The following must be identified within the system:

interfaces with the environment (see Figure 10.9); these interfaces can be electrical, mechanical, software-related, pneumatic, etc.

states: stopped, running, degraded state, etc. (see, for example, Figure 10.10). The concept of state introduces a partition between proper functioning, fall-back states and hazardous states;


Figure 10.10. Evolution of status of a system

the concept of correct behavior, degraded behavior and dangerous behavior;

the concept of functional and non-functional requirement.

With regard to the system states, Figure 10.10 identifies the correct and incorrect states, but we must go further and introduce all the states attainable by the system that characterize specific behaviors (fall-back, maintenance, degradation, etc.) as shown in Figure 10.11.


Figure 10.11. Different states of the system

During the elicitation phase, it is necessary to consider the non-functional requirements and implement analyses related to dependability, to define the safety requirements but also the availability, reliability and maintainability requirements.

Figure 10.12 shows a process taking into account elicitation of non-functional requirements related to RAMS.


Figure 10.12. Elicitation process with RAMS analysis

10.2.3.2.2 Elicitation techniques

The objective of elicitation techniques is to help discover conscious, unconscious and subconscious requirements of the stakeholders. These techniques are selected based on risk factors, human and organizational constraints, business sector and the expected level of detail for requirements. Elicitation techniques can also be selected according to the requirement document to be drafted.

Based on the list of stakeholders and sources, it is necessary to establish a requirement acquisition process. There are various elicitation techniques, such as [ZOW 05]:

investigative techniques: surveys, questionnaires, etc.;

interview techniques;

creative techniques: brainstorming, storyboarding;

animation techniques: brown paper, role-plays, use cases;

observation techniques: field, learning;

prototyping and simulation techniques.

In appendix A of Meinadier [MEI 02], the author presents the methodological aspects related to requirements engineering. Requirements engineering is one of the tools that systems engineering offers, as shown in the standard EIA-632 [EIA 98].

The best result is achieved when the analyst combines several of these techniques.

As it is not possible to cover all of these techniques exhaustively in this book, the following subsections introduce only two of them; in practice, the technique applied should be chosen with respect to the project.

10.2.3.2.3 Interview techniques

An interview consists of the following steps:

Questioning: questions are asked and answers are taken into account.

Pause: during breaks, most people will find something to say and/or explain, which helps to capture additional requirements.

Summary and/or reformulation phase: this phase is important because it enables the interviewer to verify the understanding of the answers.

Interviewing involves a preparation phase, in which a set of questions is drawn up to identify the requirements of the different stakeholders. This set of questions includes:

Open questions: they require answers other than yes or no;

Closed questions: they elicit yes or no answers. Closed questions are used to obtain a definitive answer (after reformulation, for example).

Note that during the interview, it is possible to add questions based on the responses and after the reformulation of answers step.

All the identified stakeholders must be interviewed, and they must be made aware that their requirements will be considered as a whole (as a system). It is essential to take stakeholders seriously and not to pass judgment on the expressed requirements. During interviews, it is necessary to treat all elements as requirements, document the importance of each requirement for each stakeholder, document the results of the interviews, and formalize the stakeholders’ acceptance of those results (notes, documents, etc.).

Conducting interviews requires the skill to stimulate and encourage stakeholders to respond. The person in charge of the interviews must be skilled at managing a discussion and keeping it going, while also knowing when to organize pauses.

10.2.3.2.4 Prototyping and simulation techniques

A model is an initial, simplified representation of a problem. It captures certain aspects of the problem to be solved with more or less accuracy. When the model handles actual elements of the system (such as data files), it is called a prototype.

Models and/or prototypes are a way to confront the vision of various stakeholders and expected behaviors (see Figure 10.13).


Figure 10.13. Prototype and expression of requirement

A prototype can be static, where we try to model the interactions between various elements, including actors, but it can also be dynamic, where simplified behaviors are modeled, and it is possible to run scenarios.

Developing a prototype is a good way to facilitate understanding of requirements, but it involves costs, and the prototype is often treated as the beginning of a solution.

It must be noted that, nowadays, more and more prototypes are used to validate a concept (a website, etc.). The challenge with a prototype is not to lose sight of the fact that it is a prototype (however successful) and not the design of the final product.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9781785481178500106

What’s it all about?

Ian H. Witten, ... Christopher J. Pal, in Data Mining (Fourth Edition), 2017

Diagnosis

Diagnosis is one of the principal application areas of expert systems. Although the hand-crafted rules used in expert systems often perform well, machine learning can be useful in situations in which producing rules manually is too labor intensive.

Preventative maintenance of electromechanical devices such as motors and generators can forestall failures that disrupt industrial processes. Technicians regularly inspect each device, measuring vibrations at various points to determine whether the device needs servicing. Typical faults include shaft misalignment, mechanical loosening, faulty bearings, and unbalanced pumps. A particular chemical plant uses more than 1000 different devices, ranging from small pumps to very large turbo-alternators, which used to be diagnosed by a human expert with 20 years of experience. Faults are identified by measuring vibrations at different places on the device’s mounting and using Fourier analysis to check the energy present in three different directions at each harmonic of the basic rotation speed. This information, which is very noisy because of limitations in the measurement and recording procedure, can be studied by the expert to arrive at a diagnosis. Although handcrafted expert system rules had been elicited for some situations, the elicitation process would have to be repeated several times for different types of machinery; so a learning approach was investigated.

Six hundred faults, each comprising a set of measurements along with the expert’s diagnosis, were available, representing 20 years of experience. About half were unsatisfactory for various reasons and had to be discarded; the remainder were used as training examples. The goal was not to determine whether or not a fault existed, but to diagnose the kind of fault, given that one was there. Thus there was no need to include fault-free cases in the training set. The measured attributes were rather low level and had to be augmented by intermediate concepts, i.e., functions of basic attributes, which were defined in consultation with the expert and embodied some causal domain knowledge. The derived attributes were run through an induction algorithm to produce a set of diagnostic rules. Initially, the expert was not satisfied with the rules because he could not relate them to his own knowledge and experience. For him, mere statistical evidence was not, by itself, an adequate explanation. Further background knowledge had to be used before satisfactory rules were generated. Although the resulting rules were quite complex, the expert liked them because he could justify them in light of his mechanical knowledge. He was pleased that a third of the rules coincided with ones he used himself and was delighted to gain new insight from some of the others.

Performance tests indicated that the learned rules were slightly superior to the handcrafted ones that had previously been elicited from the expert, and this result was confirmed by subsequent use in the chemical factory. It is interesting to note, however, that the system was put into use not because of its good performance but because the domain expert approved of the rules that had been learned.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9780128042915000015

Economic Models and Value-Based Approaches for Product Line Architectures

Anil Kumar Thurimella, T. Maruthi Padmaja, in Economics-Driven Software Architecture, 2014

2.5 Relevant value-based approaches for SPL

In the following, the value-based approaches mentioned in Table 2.2 are reviewed in detail. At the end of this section, we provide a discussion and summary of the reviewed approaches.

2.5.1 Framework for product line practice

The framework for product line practice is based on patterns that have been traditionally used to solve recurrent problems in software engineering. For example, design patterns have been used for designing and implementing various software modules. The framework uses patterns to solve the product line engineering problems at three different levels:

The context is the organizational situation.

The problem is what part of a software product line effort needs to be accomplished.

The solution is the grouping of practice areas and the relations among them that together address the problem for that context.

Overall, the framework uses 12 different patterns (Clements and Northrop, 2002). For example, the Essentials Coverage pattern gives a mapping of each practice area to each of the three essential product line activities: core asset development, product development, and management. The Product Builder pattern consists of practice areas that should be used whenever any product in the product line is being developed. The economic models discussed in the previous section can be applied to the relevant patterns. For example, the SIMPLE economic model can be applied to calculate the ROI of developing new products in the Product Builder pattern.

2.5.2 Feature models enriched with assumptions

Design decisions often determine the development, deployment, and evolution of a system and provide the earliest place for assessing architecture. Lago and van Vliet (Lago and van Vliet, 2005) model assumptions to reason about variability and invariability as well as architectural design decisions. The authors argue that enriching variability with assumptions provides a more complete picture of the SPL. For example, the assumption “modularity is business driven” creates variability by impacting the design of the functionality for common services and service specific functionality.

The approach (Lago and van Vliet, 2005) models relationships between features and assumptions. For example, an assumption can impact a feature. Similarly, a feature can realize an assumption. Modeling assumptions provides several advantages.

Assumptions provide a strong link between requirements and design. Therefore, modeling assumptions improves forward and backward traceability.

Modeling assumptions hinders the implementation of changes that go against assumptions made at architecting time.

Assumptions are useful for improving management of architectural knowledge. For example, externalizing the documentation assumptions is helpful in their reuse for multiple projects.

2.5.3 Value-based elicitation of variability

Bridging business and technology is receiving increasing attention in the software engineering community. For example, value-based software engineering (VBSE) (Biffl et al., 2005) aims to associate a value (e.g., business value) for artifacts. In this context, value-based variability modeling would mean considering the business value and the associated risks during decision making on variability.

Extracting tacit variability knowledge about product line architecture is a collaborative process (Dhungana et al., 2006). In particular, software engineers/architects involved in developing the reusable assets, as well as people from business and marketing (e.g., people involved in selling these assets), are involved in the process.

The value-based process (Rabiser et al., 2008) for eliciting product line variability aims at integrating three research areas:

SPLE, with a focus on variability modeling and management

VBSE, particularly the question of how much is enough in variability modeling

Collaborative software engineering (Finkelstein et al., 2010) with patterns of collaboration that enable different people working together to produce mutually satisfactory results

The value-based elicitation process (Rabiser et al., 2008) was implemented using structured workshops. Based on the workshops, various lessons were learned. Here are the relevant findings for software architecture:

Different levels of variability decisions are identified, which include decisions on customer and sales, systemwide, subsystemwide, component level, and low-level parameterization

Complementary results with variability recovery tools such as parsers analyzing existing architecture models and configuration files

2.5.4 Issue-based variability management

Issue-based variability management methodology (IVMM) (Thurimella and Bruegge, 2012) supports decision making on variability management based on concepts such as questions, options, and criteria (QOC). In addition, the methodology enables capture of the rationale behind variability decisions. In the IVMM meta-model, a criterion is modeled based on a business goal (e.g., price) or a nonfunctional requirement (e.g., maintainability). Therefore IVMM relates value-based considerations to variability at the modeling level. IVMM supports decision making on variability in three different use cases:

UC1: Variability identification: For deciding on the creation of variation points, stakeholders evaluate possible variations against a set of criteria. In particular, stakeholders collaborate and decide on mandatory, optional, and alternative variations.

UC2: Variability instantiation: Instantiating variability models involves selecting a subset of variants for a new product of a product line. For instantiating variation points, stakeholders evaluate the possible options of a variability model against product-specific quality concerns. For example, stakeholders would be able to reason about the value of a feature for a particular market segment.

UC3: Variability evolution: Stakeholders evaluate alternative change requests for changing assets and reason about the variability changes.

IVMM allows capture of rationale during the decision-making process. The captured rationale information is reused to solve similar issues that occur during variability management. For example, a variation point is instantiated for multiple products of a product line. The tacit knowledge that is captured while instantiating a variation point for a product could be reused for instantiating the same variation point for other products. Similarly, the tacit knowledge captured is also reused for the evolution of variability models.
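
The QOC concepts lend themselves to a simple data model. The following sketch is a hypothetical illustration of how questions, options, and criteria might be represented and scored; it is not the IVMM meta-model itself, and all names, weights, and scores are invented.

from dataclasses import dataclass

@dataclass
class Criterion:
    name: str      # derived from a business goal (e.g., price) or an NFR (e.g., maintainability)
    weight: float  # assumed relative importance

@dataclass
class Option:
    name: str
    scores: dict   # criterion name -> how well this option satisfies it (0..1)

@dataclass
class Question:
    text: str
    options: list
    criteria: list

    def best_option(self):
        # Pick the option with the highest weighted score; the whole structure
        # (question, options, criteria, scores) can be stored as rationale for reuse.
        def value(opt):
            return sum(c.weight * opt.scores.get(c.name, 0.0) for c in self.criteria)
        return max(self.options, key=value).name

q = Question(
    text="Should payment handling be a variation point?",
    options=[
        Option("mandatory feature", {"price": 0.5, "maintainability": 0.9}),
        Option("alternative variants", {"price": 0.8, "maintainability": 0.6}),
    ],
    criteria=[Criterion("price", 0.6), Criterion("maintainability", 0.4)],
)
print(q.best_option())  # -> alternative variants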

2.5.5 Value-based portfolio optimization

Value-based portfolio optimization (Muller, 2011) addresses the problem of identifying and prioritizing the most important features to be realized by applying an optimization technique called simulated annealing. The basic idea is to understand how entities such as features, assets, customers, and market segments affect profits. Based on this idea, the approach models a mathematical utility function for profit, with dependent variables such as customers, market segments, features, and their costs. Based on this mathematical optimization, the approach suggests (as illustrated in the sketch after this list)

the features that should be implemented and

the starting price for the products.
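
To illustrate the optimization idea, here is a toy simulated-annealing sketch that searches for a profitable feature subset. The features, costs, and revenues are invented, and the actual approach (Muller, 2011) models profit in terms of customers and market segments rather than this simplified per-feature margin.

import math
import random

random.seed(42)

# Hypothetical candidate features: name -> (cost, expected revenue).
features = {"export": (3.0, 5.0), "sync": (4.0, 4.5), "audit": (2.0, 4.0),
            "themes": (1.0, 0.5), "sso": (5.0, 7.0)}
names = list(features)

def profit(selected):
    # Simplified utility: total revenue minus total cost of the chosen features.
    return sum(features[f][1] - features[f][0] for f in selected)

def neighbor(selected):
    # Flip one randomly chosen feature in or out of the portfolio.
    return selected ^ {random.choice(names)}

current = best = frozenset()
temp = 1.0
for _ in range(2000):
    candidate = neighbor(current)
    delta = profit(candidate) - profit(current)
    # Always accept improvements; accept worse moves with a probability
    # that shrinks as the temperature cools, to escape local optima.
    if delta >= 0 or random.random() < math.exp(delta / temp):
        current = candidate
    if profit(current) > profit(best):
        best = current
    temp = max(0.01, temp * 0.995)

print(sorted(best), f"profit = {profit(best):.1f}")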

2.5.6 Discussion and summary

The Framework for Product Line Practice (Clements and Northrop, 2002) addresses product line architecture issues based on patterns. Here, architects have the possibility to apply a set of patterns that give value and economic benefits to the organization.

Identifying the value for creating variation points and making the value explicit are other important topics. Value-based elicitation adds value to artifacts (Biffl et al., 2005) during variability elicitation. Value-based portfolio optimization (Muller, 2011) helps to decide on features to be implemented and their initial price. This technique uses mathematical optimization.

Architectural knowledge management is a related area. Issue-based variability management (Thurimella and Bruegge, 2012) supports decision making and rational management during the identification, instantiation, and evolution of variability. The approach can incorporate value-based criteria. Enriching feature models with assumptions (Lago and van Vliet, 2005) provides value by improving traceability, preventing unanticipated changes, and aiding architectural knowledge management.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9780124104648000027

Automatic recognition of self-reported and perceived emotions

Biqiao Zhang, Emily Mower Provost, in Multimodal Behavior Analysis in the Wild, 2019

20.4 Collection and annotation of labeled emotion data

The collection of emotionally expressive data and the acquisition of emotion labels are essential for creating automatic systems for recognizing self-reported or perceived emotions. In this section, we briefly discuss the methods commonly used for emotion-elicitation and the annotation tools for collecting the two types of labels.

20.4.1 Emotion-elicitation methods

The difficulty of obtaining emotionally expressive data is a well-known problem in the field of automatic emotion recognition. Researchers have adopted various methods that allow for different levels of control and naturalness when eliciting emotions, such as acting, emotion induction using images, music, or videos, and interactions (human–machine interactions, human–human conversations).

One of the earliest emotion-elicitation methods was to ask actors to perform using stimuli with fixed lexical content [12,37] or to create posed expressions [50,62,133,134]. One advantage of this approach is that researchers have more control over the microphone and camera positions, the lexical content, and the distribution of emotions. A criticism of acted or posed emotion is that there are differences between spontaneous and acted emotional displays [19,122].

However, many of these criticisms can be mitigated by changing the manner in which acted data are collected. Supporters have argued that the main problem may not be the use of actors themselves, but the methodologies and materials used in the early acted corpora [15]. They argue that the quality of acted emotion corpora can be enhanced by changing the acting styles used to elicit the data and by making a connection to real-life scenarios, including human interaction. Recent acted emotion datasets have embraced these alternative collection paradigms and have moved towards increased naturalness and subtlety [14,17]. One example is the popular audiovisual emotion corpus, Interactive Emotion Motion Capture (IEMOCAP), which was collected by recruiting trained actors, creating contexts for the actors using a dyadic setting, and allowing for improvisation [14]. The same elicitation strategy has been successfully employed in the collection of the MSP-Improv corpus [17].

Various methods have been proposed for eliciting spontaneous emotional behavior. One popular approach is the passive emotion-induction method [54,58,100,111,121,129,131]. The goal is to cause a participant to feel a given emotion using selected images, music, short videos, or movie clips known a priori to induce certain emotions. This method is more suitable for analysis of modalities aside from audio, because the elicitation process often does not involve active verbal responses from the participants. A more active version is induction by image/music/video using emotion-induction events. For example, Merkx et al. used a multi-player first-person shooter computer game for eliciting emotional behaviors [69]. The multi-player setting of the game encouraged conversation between players, and the game was embedded with emotion-inducing events.

Another method often adopted by audiovisual emotion corpora is emotion-elicitation through interaction, including human–computer interaction and human–human interaction. For example, participants were asked to interact with a dialog system [57,59]. The FAU-Aibo dataset collected the interactions between children and a robot dog [6]. The Sensitive Artificial Listener (SAL) portion of the SEMAINE dataset used four agents with different characters, played by human operators, for eliciting different emotional responses from the participants [67]. Examples of human–human interaction include interviews on topics related to personal experience [91] and collaborative tasks [88].

An emerging trend is that researchers are moving emotional data collections outside of laboratory settings, a paradigm referred to as in the wild. For example, the Acted Facial Expressions In The Wild (AFEW) dataset selected video clips from movies [26], the Vera am Mittag (VAM) dataset [40] and the Belfast Naturalistic dataset [110] used recordings from TV chat shows. User-generated content on social networks has become another source for creating text-emotion datasets. Researchers curate the contents published on social networks, such as blogs, Twitter, and Facebook for emotion detection [86,87,89].

Finally, researchers use combinations of known “best practices” for collecting behaviors associated with different emotions. For example, in the BP4D-Spontaneous dataset, the spontaneous emotion of the participants was elicited using a series of tasks, including interview, video-clip viewing and discussion, startle probe, improvisation, threat, cold pressor, insult, and smell [139]. These tasks were intended to induce happiness or amusement, sadness, surprise or startle, embarrassment, fear or nervousness, physical pain, anger or upset, and disgust, respectively.

20.4.2 Data annotation tools

Emotion annotation is a critical component of emotion recognition. Yet, it is time-consuming and potentially error-prone [136]. Researchers have developed several tools designed to improve the quality of emotion labels and to reduce the annotators' cognitive load. These tools are generally designed to annotate either self-report or perceived emotion (see Table 20.1), but they can be used in either context.

Table 20.1. Common emotion annotation tools. Label type: the type of label the tool was originally designed to collect (SR: self-report; PE: perceived emotion); Emotion descriptor: the type of emotion descriptor (i.e., categorical or dimensional); Verbal descriptions: whether the tool includes verbal descriptions; Graphical illustrations: whether the tool uses graphical illustrations; Continuous: whether the tool is used for collecting time-continuous labels

Tool            Label type   Emotion descriptor   Verbal descriptions   Graphical illustrations   Continuous
SAM [55]        SR           dim.                 N                     Y                         N
GEW [104]       SR           both                 Y                     Y                         N
MECAS [25]      PE           cat.                 Y                     N                         N
Feeltrace [22]  PE           dim.                 Y                     Y                         Y
EMuJoy [74]     PE           dim.                 N                     Y                         Y

Self-Assessment Manikins (SAMs) [55] are among the most widely used tools for collecting dimensional emotion descriptions. SAMs are a pictorial grounding of N-point Likert scales. The common set covers three dimensions: valence (positive vs. negative), activation/arousal (calm vs. excited), and dominance (dominant vs. submissive). SAMs reduce variability in emotion annotation [39] and make it easier to obtain evaluations across cultures and languages [72].

The Geneva Emotion Wheel (GEW), originally proposed in [104] and updated to the most recent version in [105], is available in several languages. GEW arranges twenty emotion families according to their levels of valence and activation, with an intensity scale for each emotion family. GEW effectively combines categorical and dimensional descriptions of emotion and enables annotation using a wide range of emotions, together with intensity.

The Multi-level Emotion and Context Annotation Scheme (MECAS) is a hierarchical framework for collecting categorical emotion labels [25]. In this scheme, a set of emotion labels is organized into different layers of granularity (e.g., coarse-grained and fine-grained). This hierarchy mitigates challenges due to the scarcity of the fine-grained emotions.

Feeltrace is a tool that permits continuous annotation of emotion on the valence-activation space [22]. It provides visual feedback to the annotators by displaying a trace of their ratings. A more general version of Feeltrace called “General trace” was proposed in [23]; it allows people to create their own dimensions. EMuJoy [74] is a similar annotation approach that has been used in the music community.

Read full chapter

URL: https://www.sciencedirect.com/science/article/pii/B9780128146019000274

What factors should be considered when selecting elicitation techniques?

The five factors are project, stakeholder, analyst, organization and software development process.

Which three aspects must first be considered when choosing suitable requirements elicitation techniques?

Availability of stakeholders, project deadlines and budget.

Why is it important to choose the correct elicitation technique(s)?

The type of elicitation technique used can dictate the thoroughness of the search and the value of the information found. It is therefore important for analysts to use the most appropriate techniques to gather complete, concise and clear requirements from all relevant stakeholders.

What is the first step of requirements elicitation?

Requirements gathering is the first step in the process of requirements elicitation.