Model-Based Testing
There is Nothing More Practical than a Good Theory

Jan Tretmans
TNO – ESI, Eindhoven, NL
Radboud University, Nijmegen, NL
Jan Tretmans
TNO – Embedded Systems Innovation (ESI), Eindhoven, The Netherlands
Radboud University (RU), Nijmegen, The Netherlands

ESI:
• research cooperation with leading Dutch high-tech multinational industries & SMEs
  (consumer electronics, medical systems; research and applied technologies; industrial and academic networks)
• research cooperation with all Dutch universities with embedded-systems research
• research cooperation in EU projects
Overview
• Model-Based Testing - State of the Art
  – Motivation
  – What
  – Theory: Labelled Transition Systems and ioco
  – Tools
  – Discussion
• Model-Based Testing - Next Generation
  – Motivation
  – Requirements for Model-Based Testing:
    • State + Data
    • Composition and Representation
  – Example
  – Application
  – Discussion
Systems, Quality, Testing

Embedded Systems
or: What do Dutch Dykes have to do with Model-Based Testing?
Trend: Software in Embedded Systems
[Chart: relative development effort over time, 1970-2000; the share of software (SW) grows steadily at the expense of physics/chemistry, mechanics, and electronics. Source: Progress 2006]
Quality of Embedded Systems
Software is the brain of the system:
• software controls, connects, and monitors almost every aspect of system behaviour
• the majority of innovation is in software
Software determines the quality and reliability of the embedded system:
• often more than 50% of system defects are software bugs
Quality Software

Model-Based Testing
Software Testing
Checking or measuring some quality characteristics of an executing software object,
by performing experiments in a controlled way, w.r.t. a specification.
Here: specification-based, active, black-box testing of functionality.
[Diagram: a tester, guided by the specification, experiments on the SUT (System Under Test)]
Model-Based Testing (MBT)
[Diagram: a system model feeds model-based test generation; generated tests are executed against the system, yielding pass/fail]
MBT is the next step in test automation: on top of automated test execution it adds
automated test generation and result analysis.
Test Automation in Four Steps
1. Manual testing: a tester exercises the SUT by hand and judges pass/fail.
2. Scripted testing: test cases (e.g., in TTCN) are executed automatically against the SUT.
3. Keyword-driven testing: tests are written in a high-level test notation, from which executable test scripts are derived.
4. Model-based testing: test cases are generated automatically from a system model and then executed against the SUT.
MBT : Validation, Verification, Testing
[Diagram: ideas and wishes are validated against abstract models (math) and against concrete realizations; verification relates the model to its properties; testing relates the SUT to the model]
A Theory of Model-Based Testing
with Labelled Transition Systems
MBT : Model-Based Testing
The aim: SUT passes tests  ⇔  SUT conforms to model
(soundness: conformance implies passing; exhaustiveness: passing implies conformance)
[Diagram: the system model drives model-based test generation; generated test cases (e.g., TTCN) are executed against the SUT, yielding pass/fail]
MBT : Example Models
[Diagram: a coffee-machine LTS with transitions ?coin, ?button, !coffee and ?button, !alarm]
Models: Labelled Transition Systems
A Labelled Transition System is a tuple ⟨ S, LI, LU, T, s0 ⟩:
• S : states, with initial state s0
• LI : input actions (written ?a)
• LU : output actions (written !x)
• T : transitions
[Example: the coffee machine with transitions ?coin, ?button, !coffee, !alarm]
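The tuple definition maps directly onto a small data structure. A minimal Python sketch (the class and the encoding of the coffee-machine diagram are illustrative, not from the slides; the diagram's exact state structure is an assumption):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class LTS:
    """A labelled transition system <S, LI, LU, T, s0>."""
    states: frozenset   # S
    inputs: frozenset   # LI, labels written "?a"
    outputs: frozenset  # LU, labels written "!x"
    trans: frozenset    # T, triples (state, label, state)
    init: str           # s0

    def step(self, state, label):
        """All states reachable from `state` by one `label` transition."""
        return {s2 for (s1, l, s2) in self.trans if s1 == state and l == label}

# One plausible reading of the coffee-machine diagram: after ?coin and
# ?button the machine gives !coffee; ?button without a coin gives !alarm.
coffee_machine = LTS(
    states=frozenset({"s0", "s1", "s2", "s3"}),
    inputs=frozenset({"?coin", "?button"}),
    outputs=frozenset({"!coffee", "!alarm"}),
    trans=frozenset({("s0", "?coin", "s1"), ("s1", "?button", "s2"),
                     ("s2", "!coffee", "s0"), ("s0", "?button", "s3"),
                     ("s3", "!alarm", "s0")}),
    init="s0",
)
```

The same explicit-triple encoding is reused below to make the ioco definitions executable.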
MBT : Labelled Transition Systems
SUT passes tests  ⇔  SUT ioco model
(⇐ sound, ⇒ exhaustive)
[Diagram: an LTS model drives ioco test generation, yielding a set of LTS test cases; test execution runs them against the SUT, which is assumed to behave as an input-enabled LTS; verdict: pass/fail. ioco is the input/output conformance relation.]
Conformance : ioco
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
where:
• s is a labelled transition system
• i is (assumed to be) an input-enabled LTS
• p is quiescent, p =δ⇒ p, iff no output or internal action is enabled: ∀ !x ∈ LU ∪ {τ} : p –!x/→
• Straces(s)  =  { σ ∈ (L ∪ {δ})* | s =σ⇒ }
• p after σ  =  { p' | p =σ⇒ p' }
• out(P)  =  { !x ∈ LU | p –!x→, p ∈ P }  ∪  { δ | p =δ⇒ p, p ∈ P }
Conformance : ioco
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
Intuition: i ioco-conforms to s iff
• if i produces output x after trace σ, then s can produce x after σ
• if i cannot produce any output after trace σ, then s cannot produce any output after σ (quiescence δ)
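For finite models without internal actions, the definitions above can be executed directly. A bounded-depth sketch in Python (all helper names are mine; δ is modelled as the label "delta", and an LTS is a set of transition triples plus an initial state):

```python
# Labels "?a" are inputs, "!x" outputs; no internal actions assumed.
DELTA = "delta"  # the quiescence observation (the slides' delta)

def _enabled_outputs(trans, p):
    return {l for (s1, l, _) in trans if s1 == p and l.startswith("!")}

def out(trans, states):
    """out(P): all enabled outputs, plus delta for quiescent states."""
    result = set()
    for p in states:
        en = _enabled_outputs(trans, p)
        result |= en if en else {DELTA}
    return result

def after(trans, states, label):
    """One suspension-trace step: 'states after label'."""
    if label == DELTA:  # delta is a self-loop on quiescent states
        return {p for p in states if not _enabled_outputs(trans, p)}
    return {s2 for (s1, l, s2) in trans if s1 in states and l == label}

def ioco_upto(i_trans, i_init, s_trans, s_init, labels, depth):
    """Check out(i after sigma) <= out(s after sigma) for every suspension
    trace sigma of s up to the given length: a bounded ioco check."""
    frontier = [({i_init}, {s_init})]
    for _ in range(depth + 1):
        nxt = []
        for pi, ps in frontier:
            if not ps:  # sigma is not a suspension trace of s: no obligation
                continue
            if not out(i_trans, pi) <= out(s_trans, ps):
                return False
            for l in labels | {DELTA}:
                nxt.append((after(i_trans, pi, l), after(s_trans, ps, l)))
        frontier = nxt
    return True

# s offers !tea or !coffee after ?dub; i only ever produces !tea.
s_trans = {("s0", "?dub", "s1"), ("s1", "!tea", "s2"), ("s1", "!coffee", "s3")}
i_trans = {("i0", "?dub", "i1"), ("i1", "!tea", "i2")}
```

Here `ioco_upto(i_trans, "i0", s_trans, "s0", {"?dub", "!tea", "!coffee"}, 4)` holds while the reverse check fails, matching the intuition that an implementation may resolve the specification's non-determinism but may not add outputs.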
Example: ioco
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
[Diagram: an implementation i and a specification model s over inputs ?dime, ?quart and outputs !coffee, !tea, !choc; the example illustrates that ioco copes with non-determinism (uncertainty) and with under-specification]
Example: ioco
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
[Diagram: implementation i and specification s, both over input ?dub and outputs !tea, !coffee]
i ioco s, but not s ioco i:
• out(i after ?dub.?dub) = out(s after ?dub.?dub) = { !tea, !coffee }
• out(i after ?dub.δ.?dub) = { !coffee }  ⊆  out(s after ?dub.δ.?dub) = { !tea, !coffee }
MBT : Labelled Transition Systems (recap)
SUT passes tests  ⇔  SUT ioco model
(⇐ sound, ⇒ exhaustive)
[Diagram as before: LTS model → ioco test generation → set of LTS test cases → test execution against the SUT (input-enabled LTS) → pass/fail]
Test Case
A test case is itself a labelled transition system:
– finite, deterministic, tree-structured
– with final states pass and fail
– with a 'quiescence' / 'time-out' label θ
– from each state other than pass and fail: either one input !a, or all outputs ?x plus θ
[Diagram: a test tree over stimuli !dub, !kwart and observations ?tea, ?coffee, θ, with pass and fail leaves. Note the mirrored view: inputs of the SUT are outputs of the test case and vice versa.]
Test Generation Algorithm : ioco
To generate a test case t(S) from a set of specification states S, with S ≠ ∅ (initially S = s0 after ε), apply the following steps recursively, non-deterministically:
1. End the test case: verdict pass.
2. Supply an input !a and continue with t(S after ?a); outputs ?y arriving meanwhile are judged as in step 3.
3. Observe all outputs: each allowed output ?x (i.e., !x ∈ out(S)), and θ if δ ∈ out(S), continues with t(S after !x); each forbidden output ?y (i.e., !y ∉ out(S)) leads to fail.
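The three steps can also be exercised on-the-fly: instead of building the whole test tree, pick steps at random while tracking the set S of specification states. A Python sketch under the same finite-LTS encoding (triples plus initial state); all names are illustrative, and the simulated SUT stands in for a real test harness:

```python
import random

DELTA = "delta"  # quiescence / time-out observation (theta on the test side)

def _outs(trans, states):
    """out(S): enabled outputs plus delta if some state is quiescent."""
    en = {l for (s1, l, _) in trans if s1 in states and l.startswith("!")}
    if any(not any(s1 == p and l.startswith("!") for (s1, l, _) in trans)
           for p in states):
        en |= {DELTA}
    return en

def _after(trans, states, label):
    if label == DELTA:
        return {p for p in states
                if not any(s1 == p and l.startswith("!") for (s1, l, _) in trans)}
    return {s2 for (s1, l, s2) in trans if s1 in states and l == label}

class SimSut:
    """Simulates an SUT from a transition set (stands in for a harness)."""
    def __init__(self, trans, init, rng):
        self.trans, self.state, self.rng = trans, init, rng
    def give(self, label):            # supply an input to the SUT
        succ = [s2 for (s1, l, s2) in self.trans
                if s1 == self.state and l == label]
        if succ:
            self.state = self.rng.choice(succ)
    def observe(self):                # observe an output, or delta
        outs = [(l, s2) for (s1, l, s2) in self.trans
                if s1 == self.state and l.startswith("!")]
        if not outs:
            return DELTA
        l, s2 = self.rng.choice(sorted(outs))
        self.state = s2
        return l

def run_test(spec_trans, spec_init, sut, steps, rng):
    """One random on-the-fly test run following the algorithm's steps."""
    S = {spec_init}
    for _ in range(steps):
        inputs = sorted({l for (s1, l, _) in spec_trans
                         if s1 in S and l.startswith("?")})
        if inputs and rng.random() < 0.5:   # step 2: supply an input
            a = rng.choice(inputs)
            sut.give(a)
            S = _after(spec_trans, S, a)
        else:                               # step 3: observe all outputs
            x = sut.observe()
            if x not in _outs(spec_trans, S):
                return "fail"               # forbidden output
            S = _after(spec_trans, S, x)
    return "pass"                           # step 1: end the test case
```

Against a specification "?dime then !tea or !coffee", an implementation that answers !choc is eventually caught, while one that answers !tea always passes.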
Example: ioco Test Generation
i ioco s  =def  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
[Diagram: from specification model s (?dime, then !tea or !coffee) a test case t is generated: stimulate !dime, then observe; ?tea and ?coffee lead to pass, ?choc and θ lead to fail]
The implementation i (?dime, then !tea or !choc) fails t: it can produce !choc, which is not in out(s after ?dime). Indeed, i does not ioco-conform to s.
MBT with ioco is Sound and Exhaustive
Test assumption: ∀ SUT ∈ IMP . ∃ mSUT ∈ IOTS . ∀ t ∈ TESTS :
    SUT passes t  ⇔  mSUT passes t
For the test generation function gen : LTS → ℘(TTS), prove soundness and exhaustiveness:
    ∀ m ∈ IOTS : ( ∀ t ∈ gen(s) . m passes t )  ⇔  m ioco s
Then: SUT passes gen(s)  ⇔  SUT conforms to s
(⇐ exhaustive, ⇒ sound)
Model-Based Testing
with Labelled Transition Systems
Background Theory
Testing Equivalences
S1 ≈ S2  ⇔  ∀ e ∈ E : obs(e, S1) = obs(e, S2) ?
[Diagram: two systems S1 and S2 are testing equivalent iff no environment e can observe a difference between them]
MBT: Test Assumption
Test assumption: ∀ SUT . ∃ mSUT ∈ MODELS . ∀ t ∈ TEST :
    SUT passes t  ⇔  mSUT passes t
[Diagram: running test t on the SUT yields the same verdict as running t on its assumed model mSUT]
MBT : Completeness
Does SUT passes Ts  ⇔  SUT conforms to s ?
SUT passes Ts
  ⇔def  ∀ t ∈ Ts . SUT passes t
  ⇔     ∀ t ∈ Ts . mSUT passes t     (test hypothesis: ∀ t ∈ TEST . SUT passes t ⇔ mSUT passes t)
  ⇔     mSUT imp s                   (prove: ∀ m ∈ MOD . ( ∀ t ∈ Ts . m passes t ) ⇔ m imp s)
  ⇔     SUT conforms to s            (define: SUT conforms to s iff mSUT imp s)
Genealogy of ioco
[Diagram: ioco descends from the theory of Labelled Transition Systems and Testing Equivalences (Preorders), via the Trace Preorder, the Canonical Tester (conf), Refusal Equivalence (Preorder), the Quiescent Trace Preorder, and the Repetitive Quiescent Trace Preorder (Suspension Preorder), combined with input-output models: IOTS (IOA, IA, IOLTS)]
Model-Based Testing
with Labelled Transition Systems
Variations
Models with data, time, hybrid behaviour, and action refinement
[Diagram: a vending-machine model extended with data (? money ⟨ n : int ⟩, guards [ n ≥ 35 ] -> ?button1 and [ n ≥ 50 ] -> ?button2), with time and hybrid behaviour (continuous variables, e.g. Vt := 0, dVt/dt = 3, [ Vt = 15 ] -> !tea, and Vc := 0, dVc/dt = 2, [ Vc = 10 ] -> !coffee), and with action refinement (?coin1, ?coin2, ?coin3 refining ?money)]
Variations on a Theme
• i ioco s     ⇔  ∀σ ∈ Straces(s) : out(i after σ) ⊆ out(s after σ)
• i ior s      ⇔  ∀σ ∈ (L ∪ {δ})* : out(i after σ) ⊆ out(s after σ)
• i ioconf s   ⇔  ∀σ ∈ traces(s) : out(i after σ) ⊆ out(s after σ)
• i iocoF s    ⇔  ∀σ ∈ F : out(i after σ) ⊆ out(s after σ)
• i uioco s    ⇔  ∀σ ∈ Utraces(s) : out(i after σ) ⊆ out(s after σ)
• i mioco s    : multi-channel ioco
• i wioco s    : non-input-enabled ioco
• i eco e      : environmental conformance
• i sioco s    : symbolic ioco
• i (r)tioco s : (real) timed tioco (Aalborg, Twente, Grenoble, Bordeaux, ...)
• i rioco s    : refinement ioco
• i dioco s    : distributed ioco
• i hioco s    : hybrid ioco
• i qioco s    : quantified ioco
• i poco s     : partially observable game ioco
• i stiocoD s  : real time and symbolic data
• ......
Model-Based Testing
Tools
MBT : Ingredients (off-line MBT)
[Diagram: ideas and requirements lead to a system model; model-based test generation produces test cases (e.g., TTCN); test execution runs them through a test harness against the SUT; the verdict (pass/fail) feeds test-result analysis]
MBT : Ingredients (on-the-fly MBT)
[Diagram: model-based test generation and execution are combined into one step: each test step is generated from the system model on the fly, applied through the test harness to the SUT, and its result analysed immediately (pass/fail)]
MBT : Many Tools
AETG, Agatha, Agedis, Autolink, Axini Test Manager, Conformiq, Cooper, Cover, DTM, Gst, Gotcha, Graphwalker, JTorX, MaTeLo, MBT suite, M-Frame, MISTA, NModel, OSMO, ParTeG, Phact/The Kit, QuickCheck, Reactis, Recover, RT-Tester, SaMsTaG, Smartesting CertifyIt, Spec Explorer, Statemate, STG, Temppo, TestGen (Stirling), TestGen (INT), TestComposer, TestOptimal, TGV, Tigris, TorX, TorXakis, T-Vec, Tveda, Uppaal-Cover, Uppaal-Tron, ...
Model-Based Testing
Discussion
MBT : Benefits !
• Automatic test generation
– automation of test generation + execution + result analysis
– test fast, test often
• More, longer, and more diversified test cases
– more variation in test flow and in test data
• Model is precise and consistent test basis
– unambiguous analysis of test results
– early error detection during model construction and analysis
– link to model-based system development
• Test maintenance by maintaining models
– improved regression testing
• Expressing test coverage by model coverage
MBT : Benefits ???
Promising, emerging technology ... but:
if doing MBT is so smart, why ain't you rich?
• Many companies experiment, but who's really doing it?
• Application of MBT is still disappointing
• Organizational issues: education, adoption, a shift in the development process
• Technical issues: theory, algorithms, tools
Testing : Trends & Challenges
[Diagram: Model-Based Testing at the centre, surrounded by trends: complexity and size, heterogeneous components, model-driven development and continuous *, the quest for quality, connectivity and systems-of-systems, uncertainty]
49
MBT : Next Generation Challenges
abstraction
concurrency
complexityparallelism
size
state +
complex data
heterogeneous
components
Model
Based
Testing
model
composition
model-driven
test
selection
development,
criteria
continuous *
usage
profiles for
quest testing
for
quality
scalability
connectivity,
link to
MBSD
systems-of-systems
multiple uncertainty
uncertainty
paradigms
nondeterminism
integration
50
MBT : Next Generation Challenges
concurrency
parallelism
abstraction
usage
profiles for
testing
state +
complex data
State of
the Art
MBT Tools
model
composition
scalability
test
selection
criteria
link to
MBSD
multiple
paradigms
integration
uncertainty
nondeterminism
51
Model-Based Testing
Next Generation
TorXakis
Next Generation MBT : TorXakis
[Diagram: TorXakis sits between a model and the system under test, addressing: abstraction, concurrency and parallelism, usage profiles for testing, state + complex data, model composition, scalability, test selection criteria, link to MBSD, multiple paradigms, integration, uncertainty and nondeterminism]
Yet Another MBT Tool : TorXakis
[Lineage: TorX → JTorX → TorXakis]
Yet Another MBT Tool : TorXakis
[Diagram: the ingredients of TorXakis and what they address:
• LTS (Labelled Transition Systems), ioco, testing equivalences: soundness and completeness
• STS (Symbolic Transition Systems): uncertainty, non-determinism, abstraction, under-specification
• ADT (Algebraic Data Types) with equational reasoning: state + complex data
• Process algebra: concurrency, parallelism, model composition
• SMT (Satisfiability Modulo Theories)]
Model-Based Testing
State + Data
TorXakis : A Black-Box View on Systems
[Diagram: the SUT is a black box with input channels a, b and output channels x, y; its behaviour is modelled as a state-transition system with parameterized actions such as a?n, b?m, a?k, x!n+1, x!42, y!`yes`, y!`no`]
TorXakis : A Black-Box View on Systems
The model is a labelled transition system with parameterized actions on input and output channels (a?n, b?m, b?s, a?k, x!n+1, x!42, y!`yes`, y!`no`).
Not (yet) in TorXakis:
• real-time
• probabilities
• derivatives (hybrid)
STS : Symbolic Transition Systems
An STS is a model with data:
[Diagram: a vending machine: money ? n :: int; then [[ n ≥ 35 ]] -> button1, tea, return ! n - 35; or [[ n ≥ 50 ]] -> button2, coffee, return ! n - 50]
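The guarded transitions can be sketched as a plain Python function. The thresholds 35 and 50 follow the slide, but the guard operator (≥) was lost in the slide text and is an assumption here:

```python
def sts_step(n, button):
    """After money ? n, pressing a button fires the transition whose guard
    holds; returns (drink, change) or None if the guard is false."""
    if button == "button1" and n >= 35:   # [[ n >= 35 ]] -> button1, tea
        return ("tea", n - 35)
    if button == "button2" and n >= 50:   # [[ n >= 50 ]] -> button2, coffee
        return ("coffee", n - 50)
    return None                           # guard false: transition blocked
```

The point of the symbolic view is that one such guarded transition stands for infinitely many concrete transitions, one per value of n.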
MBT : Nondeterminism, Underspecification
[Diagram: a square-root-like specification model: on ? x [[ x >= 0 ]] respond ! y with [[ | y·y - x | < ε ]]; the behaviour for ? x [[ x < 0 ]] is left open in the specification, while SUT models make it explicit (e.g., ! -x or !error)]
ioco with such models supports:
• non-determinism
• under-specification
• specification of properties rather than construction
STS : Symbolic Transition Systems (semantics)
[Diagram: the STS with  in ? n :: int [[ n ≠ 0 ]]  and  out ! m :: int [[ 0 < m < n ]]  (respectively [[ 0 < m < -n ]] for negative n) unfolds into an infinite concrete LTS with transitions ..., in-4, in-3, in-2, in-1, in1, in2, in3, in4, ..., each followed by the corresponding finite choice of out1, out2, out3, ...]
TorXakis: Data Types
• Standard types: Int, Bool, String
• Algebraic data types:
TYPEDEF Colour  ::= Red | Yellow | Blue
TYPEDEF IntList ::= Nil
                  | Cons { hd :: Int , tl :: IntList }
TorXakis: Functions
• Functions: name, parameters, type, (recursive) value expression
• Overloading
• Standard functions for: Int, Bool, String
FUNCDEF add ( x, y :: Int ) :: Int ::= x + y
FUNCDEF ++ ( s :: IntList; x :: Int ) :: IntList ::=
    IF isNil ( s )
    THEN Cons ( x, Nil )
    ELSE Cons ( hd ( s ), tl ( s ) ++ x )
    FI
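The IntList ADT and the recursive ++ function can be mirrored in Python to show what the TorXakis definitions denote (encoding Nil as a node with hd = None is my choice, not from the slides):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class IntList:
    """Mirror of TYPEDEF IntList ::= Nil | Cons { hd :: Int , tl :: IntList }.
    hd is None encodes Nil."""
    hd: Optional[int] = None
    tl: Optional["IntList"] = None

NIL = IntList()

def is_nil(s):
    return s.hd is None

def cons(x, s):
    return IntList(x, s)

def append(s, x):
    """Mirror of FUNCDEF ++ : put x at the end of list s."""
    if is_nil(s):
        return cons(x, NIL)
    return cons(s.hd, append(s.tl, x))

def to_python(s):
    """Convert to a plain Python list, for inspection."""
    return [] if is_nil(s) else [s.hd] + to_python(s.tl)
```

As in the TorXakis version, ++ is defined by structural recursion over the constructors Nil and Cons.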
TorXakis : Process with Data
[Diagram: a queue process parameterized by buf, with transitions:
  Enq ? x, updating buf := buf ++ x;
  Deq [[ not (isNil (buf)) ]], then Value ! hd(buf), updating buf := tl(buf);
  and a Deq branch guarded by [[ isNil (buf) ]] ]
More Complex Data
Test data generation from XSD (XML) descriptions with constraints.
TorXakis : Lift Test Generation
[Diagram: via the semantics mappings STS → LTS and SYMB-TEST → TEST, symbolic test generation on the STS level corresponds to ioco test generation on the LTS level]
[Diagram: in practice, symbolic-concrete test generation works directly on the STS and produces concrete TESTs, consistent with ioco test generation on the underlying LTS]
Model-Based Testing
Composition and Representation
Compositionality – Representing LTS
[Diagram: several example transition trees and graphs over actions a, b, c, d, showing that explicit state diagrams quickly become large and repetitive]
Compositionality – Representing LTS
• Explicit: ⟨ { S0, S1, S2, S3 }, { 10c, coffee, tea }, { (S0, 10c, S1), (S1, coffee, S2), (S1, tea, S3) }, S0 ⟩
• Transition tree / graph: S0 –10c→ S1; S1 –coffee→ S2; S1 –tea→ S3
• Language: S ::= 10c >-> ( coffee ## tea )
Compositionality – Representing LTS
[Diagram: language expressions and the transition trees they denote:
  a >-> b ## a >-> c   : a choice between two a-prefixed branches
  a >-> ( b ## c )     : a, then a choice between b and c
  a >-> b ||| c >-> d  : the interleaving of a >-> b and c >-> d]
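These operators are easy to prototype: represent a finite process as a list of (action, continuation) pairs, with the empty list as termination (the encoding and function names are mine):

```python
def prefix(a, p):
    """a >-> p : action prefix."""
    return [(a, p)]

def choice(p, q):
    """p ## q : choice between the initial actions of p and q."""
    return p + q

def interleave(p, q):
    """p ||| q : free interleaving, no synchronization."""
    return ([(a, interleave(p2, q)) for (a, p2) in p] +
            [(b, interleave(p, q2)) for (b, q2) in q])

def traces(p):
    """All complete traces of a finite process."""
    if not p:
        return [[]]
    return [[a] + t for (a, p2) in p for t in traces(p2)]
```

Note that `a >-> (b ## c)` and `a >-> b ## a >-> c` denote different trees but have the same traces, which is exactly the kind of distinction testing equivalences care about; `a >-> b ||| c >-> d` yields the six interleavings of the two sequences.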
Representation of LTS
[Diagram: infinite transition systems defined by recursion:
  P where P ::= a >-> P           : an unbounded sequence of a's
  Q where Q ::= a >-> ( b ||| Q ) : after every a, a b may still follow, interleaved with further behaviour]
TorXakis : Defining Behaviour - LTS
Basic behaviour is a transition system; complex behaviour combines transition systems:
• named behaviour definition and use
• sequence
• choice
• parallel
• communication
• exception
• interrupt
• hiding
[Diagram: two small transition systems over a, b, x, y being combined]
Model-Based Testing
Example :
Dispatcher-Processing System
Example: Dispatcher-Processing System
[Diagram: input jobs arrive at a dispatcher, which distributes them over processors 1-4; processed jobs leave the system]
Example: Dispatcher-Processing System
[State-transition system of one processor: in state Idle, Start (accepting a job) leads to Processing; Finish (delivering the processed job) leads back to Idle]
Example: Dispatcher-Processing System
Parallel composition of four processors, interleaving their Start/Finish actions:
processors ::= processor(1) ||| processor(2) ||| processor(3) ||| processor(4)
Example: Dispatcher-Processing System
Composition and abstraction: the dispatcher hands each Job to a free processor by synchronizing on Start, and Start is then hidden:
dispatch_procs ::= HIDE [ Start ]
                   IN processors |[ Start ]| dispatcher
                   NI
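The two composition operators can be prototyped on explicit LTSs: |[gates]| synchronizes on the gate labels and interleaves everything else, and HIDE renames hidden labels to an internal action. A Python sketch (encoding and names are mine; a single two-state processor and a toy dispatcher stand in for the full model):

```python
def sync_compose(t1, i1, t2, i2, gates):
    """Parallel composition |[ gates ]| of two LTSs given as transition
    sets; states of the product are pairs, built by reachability."""
    trans, init = set(), (i1, i2)
    frontier, seen = [init], {init}
    while frontier:
        p, q = frontier.pop()
        steps = []
        for (s1, l, s2) in t1:
            if s1 != p:
                continue
            if l in gates:  # synchronize: both sides must move together
                steps += [(l, (s2, q2)) for (r1, m, q2) in t2
                          if r1 == q and m == l]
            else:           # interleave a move of the first component
                steps.append((l, (s2, q)))
        for (r1, m, q2) in t2:
            if r1 == q and m not in gates:  # interleave the second component
                steps.append((m, (p, q2)))
        for l, nxt in steps:
            trans.add(((p, q), l, nxt))
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return trans, init

def hide(trans, gates, tau="tau"):
    """HIDE [ gates ]: make the gate labels internal."""
    return {(s1, tau if l in gates else l, s2) for (s1, l, s2) in trans}

# Toy instance: one processor, and a dispatcher that accepts a Job and
# then offers Start.
proc = {("Idle", "Start", "Proc"), ("Proc", "Finish", "Idle")}
disp = {("D", "Job", "D1"), ("D1", "Start", "D")}
composed, init = sync_compose(disp, "D", proc, "Idle", {"Start"})
```

After `hide(composed, {"Start"})`, only Job and Finish remain observable, as in dispatch_procs: the hand-over from dispatcher to processor becomes an internal step.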
Example: Dispatcher-Processing System
[Slide shows the complete TorXakis model of the dispatcher-processing system]
Example: Dispatcher-Processing System (state + data)
The processor now carries data: Start ? job :: JobData [[ isValidJob(job) ]], then Finish ! gcd ( job.x, job.y ).

TYPEDEF JobData ::= JobData { jobId    :: Int
                            ; jobDescr :: String
                            ; x, y     :: Int
                            }

FUNCDEF gcd ( a, b :: Int ) :: Int ::=
    IF a == b
    THEN a
    ELSE IF a > b
         THEN gcd ( a - b, b )
         ELSE gcd ( a, b - a )
         FI
    FI

FUNCDEF isValidJob ( jobdata :: JobData ) :: Bool ::=
    jobdata.jobId > 0
    /\ strinre ( jobdata.jobDescr, REGEX('[A-Z][0-9]{2}[a-z]+') )
    /\ . . . . .
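The gcd function transcribes directly to Python (subtraction-based Euclid; it assumes positive arguments, as the recursion requires), and the regex check in isValidJob has a direct counterpart:

```python
import re

def gcd(a, b):
    """Subtraction-based gcd, mirroring the FUNCDEF; assumes a, b > 0."""
    if a == b:
        return a
    if a > b:
        return gcd(a - b, b)
    return gcd(a, b - a)

def is_valid_job(job_id, job_descr):
    """Sketch of isValidJob: positive id and a description matching the
    slide's regex [A-Z][0-9]{2}[a-z]+ (the remaining conjuncts are elided
    in the slide and therefore omitted here)."""
    return job_id > 0 and re.fullmatch(r"[A-Z][0-9]{2}[a-z]+", job_descr) is not None
```

Constraints like these are exactly what an MBT tool hands to an SMT solver to generate valid (and invalid) test data.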
Running TorXakis and SUT
SPECDEF Spec ::=
    CHAN IN   Stim
    CHAN OUT  Resp
    BEHAVIOUR Stimulus >-> Response
ENDDEF

ADAPDEF Adap ::=
    CHAN IN  Stim
    CHAN OUT Resp
    SUT IN   In  :: String
    SUT OUT  Out :: String
    MAP IN   Stimulus -> In ! ""
    MAP OUT  Out ? s  -> Response
ENDDEF

SUTDEF Sut ::=
    SUT IN   In  :: String
    SUT OUT  Out :: String
    SOCK IN  In  HOST "localhost" PORT 7890
    SOCK OUT Out HOST "localhost" PORT 7890
ENDDEF

[Diagram: TorXakis (the client), running model.txs, talks through adapters over a socket (host, port number) to the SUT (the server)]
Demo: Dispatcher-Processing System
Next Generation MBT : Status
[Diagram: the TorXakis ingredients again: LTS with ioco and testing equivalences (soundness, completeness); STS (uncertainty, non-determinism, abstraction, under-specification); ADT with equational reasoning (state + complex data); process algebra (concurrency, parallelism, model composition); SMT]
Next Generation MBT : Status
[Diagram: the challenge map revisited: usage profiles, abstraction, state + complex data, scalability, model composition, concurrency and parallelism, test selection criteria, multiple-paradigm integration, uncertainty and nondeterminism, link to MBSD]
Discussion
Perspective
MBT : Benefits
(the benefits listed earlier: automatic test generation; more, longer, and more diversified test cases; a precise and consistent test basis; test maintenance by maintaining models; test coverage expressed as model coverage)
Model-Based Testing
There is Nothing More Practical than a Good Theory
• Arguing about validity of test cases and correctness of test generation algorithms
• Explicit insight into what has been tested, and what not
• Use of complementary validation techniques: model checking, theorem proving, static analysis, runtime verification, ...
• Implementation relations for nondeterministic, concurrent, partially specified, loose specifications
• Comparison of MBT approaches and their error-detection capabilities
Model-Based Testing
There is Nothing More Practical than a Good Theory
[Diagram: the MBT loop once more: system model, model-based test generation, test execution, system, pass/fail]
Questions?