Preface

The area of Reliability has become a very important and active area of research. This is clearly evident from the large body of literature that has developed in the form of books, volumes and research papers since 1988, when the previous Handbook of Statistics volume on this area was prepared by P. R. Krishnaiah and C. R. Rao. This is the reason we felt that this is indeed the right time to dedicate another volume in the Handbook of Statistics series to highlighting some recent advances in the area of Reliability.
With this purpose in mind, we solicited articles from leading experts working in
the area of Reliability from both academia and industry. This, in our opinion, has
resulted in a volume with a nice blend of articles (33 in total) dealing with
theoretical, methodological and applied issues in Reliability.
For the convenience of readers, we have divided this volume into 13 parts as
follows:
I Reliability Models
II Life Distributions
III Reliability Properties
IV Reliability Systems
V Progressive Censoring
VI Analysis for Repairable Systems
VII Analysis for Masked Data
VIII Analysis for Warranty Data
IX Accelerated Testing
X Destructive Testing
XI Test Plans
XII Software Reliability
XIII Inferential Methods
We hope that this broad coverage of the area of Reliability will not only provide the readers with a general overview of the area but also inform them of the current state of each of the topics listed above.
We express our sincere thanks to all the authors for their fine contributions and
for helping us in bringing out this volume in a timely manner. Our special thanks
go to Ms. Nicolette van Dijk for taking a keen interest in this project and also for
helping us with the final production of this volume.
N. Balakrishnan
C. R. Rao
Contributors

J. A. Achcar, ICMC, University of São Paulo, C.P. 668, 13560-970, São Carlos, SP, Brazil, e-mail: [email protected] (Ch. 29)
R. Aggarwala, Department of Mathematics and Statistics, University of Calgary, 2500 University Drive N.W., Calgary, Alberta, Canada T2N 1N4, e-mail: [email protected] (Ch. 13)
R. Agrawal, GE Corporate Audit Staff, Fairfield, CT 06432-1008, USA, e-mail:
[email protected] (Ch. 27)
P. A. Akersten, Center for Dependability and Maintenance, Luleå University of Technology, Luleå, Sweden, e-mail: [email protected] (Ch. 16)
E. K. Al-Hussaini, Department of Mathematics, University of Assiut, Assiut 71516, Egypt, e-mail: [email protected] (Ch. 5)
S. Aki, Division of Mathematical Science, Osaka University, Graduate School of Engineering Science, Toyonaka, Osaka 560-8531, Japan, e-mail: [email protected] (Ch. 11)
M. Asadi, Department of Statistics, University of Isfahan, Isfahan 81744, Iran,
e-mail: [email protected] (Ch. 7)
N. Balakrishnan, Department of Mathematics and Statistics, McMaster University, Hamilton, Ontario, Canada L8S 4K1, e-mail: [email protected] (Chs. 1, 14, 23)
U. Balasooriya, Department of Statistics and Applied Probability, National
University of Singapore, Lower Kent Ridge Road, Singapore 119260, e-mail:
[email protected] (Ch. 15)
M. Banerjee, Center for Health Care Effectiveness Research, Wayne State University, Detroit, MI 48201, USA, e-mail: [email protected] (Ch. 19)
A. P. Basu, Department of Statistics, University of Missouri at Columbia,
Columbia, MO 65211-0001, USA, e-mail: [email protected] (Ch. 2)
S. Basu, Division of Statistics, Northern Illinois University, DeKalb, IL 60115-
2854, USA, e-mail: [email protected] or [email protected] (Ch. 19)
B. Bergman, Division of Total Quality Management, Chalmers University of Technology, Gothenburg, Sweden, e-mail: [email protected] (Ch. 16)
W. R. Blischke, Emeritus Professor, Department of Information and Operations
Management, University of Southern California, Los Angeles, CA 90089-1421,
USA, e-mail: [email protected] (Ch. 20)


A. Chatterjee, Department of Statistics, Burdwan University, Burdwan 713104, W. Bengal, India, e-mail: [email protected] or [email protected] (Ch. 4)
E. Cramer, Department of Mathematics, University of Oldenburg, D-26111
Oldenburg, Germany, e-mail: [email protected] (Ch. 12)
N. Doganaksoy, GE Corporate Research and Development, K14C35, Niskayuna, NY 12309, USA, e-mail: [email protected] (Chs. 26, 27)
N. Ebrahimi, Division of Statistics, Northern Illinois University, DeKalb, IL
60115-2854, USA, e-mail: [email protected] (Ch. 31)
B. J. Flehinger, IBM Research Division, Mathematical Sciences Department, Thomas J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598, USA (Ch. 18)
S. K. Ghosh, Department of Statistics, University of North Carolina, Raleigh, NC 27695, USA, e-mail: [email protected] (Ch. 31)
E. Gouno, Department of Applied Statistics (SABRES), University of South
Brittany, Rue Yves Mainguy, Tohannic, F56 000 Vannes, France, e-mail:
[email protected] (Ch. 23)
G. Y. Hong, Institute of Information and Mathematical Sciences, Massey
University, Auckland, New Zealand, e-mail: [email protected] (Ch. 28)
K. Hussein, University of Wisconsin, Department of Mathematics, 400 University
Drive, Fond du Lac, WI 54935 USA, e-mail: [email protected] (Ch. 33)
R. A. Johnson, Department of Statistics, University of Wisconsin, 1210 W. Dayton
Street, Madison, WI 53706-1685, USA, e-mail: [email protected] (Ch. 24)
U. Kamps, Department of Mathematics, University of Oldenburg, D-26111
Oldenburg, Germany, e-mail: [email protected] (Ch. 12)
N. Kannan, Department of Mathematics and Statistics, University of Texas at San Antonio, San Antonio, TX 78249, USA, e-mail: [email protected] (Ch. 14)
Md. R. Karim, Department of Statistics, University of Rajshahi, Rajshahi- 6205,
Bangladesh, e-mail: [email protected] (Ch. 21)
L. B. Klebanov, Faculty of Mathematics and Mechanics, Division of Statistics and
Probability, St. Petersburg State University, St. Petersburg 198904, Russia,
e-mail: [email protected] (Ch. 9)
B. Klefsjö, Center for Dependability and Maintenance, Luleå University of Technology, Luleå, Sweden, e-mail: [email protected] (Ch. 16)
K. B. Kulasekera, Department of Mathematical Sciences, Clemson University,
Clemson, SC 29634-0975, USA, e-mail: [email protected] (Ch. 30)
C. D. Lai, Institute of Information Sciences and Technology, Massey University,
Palmerston North, New Zealand, e-mail: [email protected] (Ch. 3)
N. Limnios, Département Génie Informatique, Division Mathématiques Appliquées,
Université de Technologie de Compiègne, B.P. 20 529, 60205 Compiègne Cedex,
France, e-mail: [email protected] (Ch. 1)
W. Lu, Department of Statistics, University of Wisconsin, 1210 W. Dayton Street,
Madison, WI 53706-1685, USA, e-mail: [email protected] (Ch. 24)
M. Mazumdar, Department of Industrial Engineering, University of Pittsburgh,
Pittsburgh, PA 15261, USA, e-mail: [email protected] (Ch. 25)

G. C. McDonald, Director, Enterprise Systems Lab, General Motors Research and Development Center, MC #480-106-359, 30500 Mound Road, Warren, MI 48090-9055, USA, e-mail: [email protected] (Ch. 17)
J. Mi, Department of Statistics, Florida International University, University Park,
Miami, FL 33199, USA, e-mail: [email protected] (Ch. 8)
N. A. Mokhlis, Department of Mathematics, Faculty of Science, Ain-Shams University, Cairo, Egypt, e-mail: [email protected] (Ch. 10)
S. P. Mukherjee, Department of Statistics, Calcutta University, Calcutta 700019, W. Bengal, India, e-mail: [email protected] or [email protected] (Ch. 4)
D. N. P. Murthy, Department of Mechanical Engineering, University of Queensland, Brisbane, Queensland 4072, Australia, e-mail: [email protected] (Chs. 3, 20)
P. R. Nelson, Department of Mathematical Sciences, Clemson University,
Clemson, SC 29634-0975, USA, e-mail: [email protected] (Ch. 30)
W. Nelson, Consultant, 739 Huntingdon Drive, Schenectady, NY 12309, USA, e-mail: [email protected] (Ch. 22)
S. Panchapakesan, Department of Mathematics, Southern Illinois University at Carbondale, Carbondale, IL 62901-4408, USA, e-mail: [email protected] (Ch. 33)
C. Papadopoulos, Département Génie Informatique, Division Mathématiques Appliquées, Université de Technologie de Compiègne, B.P. 20 529, 60205 Compiègne Cedex, France, e-mail: [email protected] (Ch. 1)
J. Rajgopal, Department of Industrial Engineering, University of Pittsburgh,
Pittsburgh, PA 15261, USA, e-mail: [email protected] (Ch. 25)
B. Reiser, Department of Statistics, University of Haifa, Haifa, Israel (Ch. 18)
S. E. Rigdon, Department of Mathematics and Statistics, Southern Illinois
University at Edwardsville, Edwardsville, IL 62026-1653, USA, e-mail:
[email protected] (Ch. 2)
A. Sen, Department of Mathematics and Statistics, Oakland University, Rochester,
MI 48309, USA, e-mail: [email protected] (Ch. 19)
M. Shaked, Department of Mathematics, University of Arizona, Tucson, AZ 85721, USA, e-mail: [email protected] (Ch. 6)
D. N. Shanbhag, Department of Probability and Statistics, University of Sheffield, Sheffield, S3 7RH, England, UK, e-mail: [email protected] (Ch. 7)
F. Spizzichino, Dipartimento di Matematica, Università degli Studi di Roma "La Sapienza", Piazzale Aldo Moro, 2, 00185 Roma, Italy, e-mail: [email protected] (Ch. 6)
J. Stein, G.E. Corporate Research and Development, Building K1-4C27A, One Research Circle, Niskayuna, NY 12309, USA, e-mail: [email protected] (Ch. 26)
K. S. Sultan, Department of Mathematics, Al-Azhar University, Nasr City, Cairo 11884, Egypt (Ch. 5)

K. Suzuki, Department of Systems Engineering, The University of Electro-Communications, 1-5-1 Chofugaoka, Chofu-city, Tokyo 182-8585, Japan, e-mail: [email protected] (Ch. 21)
G. J. Szekely, Department of Mathematics and Statistics, Math Science Building,
Bowling Green State University, Bowling Green, OH 43403-0221, USA, e-mail."
[email protected] (Ch. 9)
L. Wang, Graduate School of Information Systems, University of Electro-
Communications, Tokyo 182-8585, Japan, e-mail: [email protected]
(Ch. 21)
M. Xie, Department of Industrial and Systems Engineering, The National University of Singapore, 10 Kent Ridge Crescent, Singapore 119260, e-mail: [email protected] or [email protected] (Chs. 3, 28)
E. Yashchin, IBM Research Division, Mathematical Sciences Department, Thomas J. Watson Research Center, P.O. Box 218, Yorktown Heights, NY 10598, USA, e-mail: [email protected] (Ch. 18)
S. Zacks, Department of Mathematical Sciences, Binghamton University, Binghamton, NY 13902-6000, USA, e-mail: [email protected] (Ch. 32)
N. Balakrishnan and C. R. Rao, eds., Handbook of Statistics, Vol. 20
© 2001 Elsevier Science B.V. All rights reserved.

Basic Probabilistic Models in Reliability

N. Balakrishnan, N. Limnios and C. Papadopoulos

Notation
SMP semi-Markov process
EMC embedded Markov chain
r.v. random variable
RE relative error
R(t) reliability function
A(t) pointwise availability function
A limit availability
M(t) maintainability function
λ(t) system failure rate function
MTTF mean time to failure
MTTR mean time to repair
MUT mean up time
MDT mean down time
MTBF mean time between failures
N(t) number of jumps of the semi-Markov process in the time interval (0, t]
Ni(t) number of visits of the semi-Markov process to the state i in the time interval (0, t]
Qij(t) semi-Markov kernel: discrete state space case; i ∈ E, j ∈ E, t ∈ ℝ+
P(i,j) transition function of the Markov chain (Jn): discrete time case
Hi(t) distribution function of the sojourn time in the state i, i ∈ E
ψij(t) Markov renewal function: discrete state space case; i ∈ E, j ∈ E, t ∈ ℝ+
α initial law
μij mean hitting time of the SMP into the state j, starting in state i
μ*ij mean hitting time of the EMC into the state j, starting in state i
mi mean first jump time under ℙi or mean sojourn time in state i
σ(Xt; t ∈ I) the σ-algebra generated by the family of random variables (Xt; t ∈ I)
Q1 * Q2 Stieltjes convolution of two semi-Markov kernels on E
Q^(n) nth fold Stieltjes convolution of the semi-Markov kernel Q, n ∈ ℕ
* Stieltjes convolution product

* matrix Stieltjes convolution product


a.s. almost surely
ℕ the set of natural numbers: {0, 1, 2, ...}
ℕ* the set of positive natural numbers: {1, 2, ...}
ℝ the set of real numbers
ℝ+ the set of nonnegative real numbers: [0, ∞)
1A indicator (or characteristic) function of a subset A: 1A(x) = 1 if x ∈ A, 0 if x ∉ A
1(x) the Heaviside function on ℝ: 1(x) = 1 if x ≥ 0, 0 if x < 0
1s,r an s-dimensional column vector (1, ..., 1, 0, ..., 0)', with r 1's and s − r 0's; if r = s, we write 1s (or simply 1) in place of 1s,s
→d convergence in distribution
→a.s. almost sure convergence
⇒ weak convergence of random elements
N(0, 1) standard normal r.v. (mean μ = 0, variance σ² = 1)

1. Introduction

The aim of this chapter is to present the basic probabilistic models in reliability. We thus present discrete time Markov chains (DTMC), continuous time Markov chains (CTMC) and the semi-Markov model in continuous time. In the case of DTMC, where reliability is modeled in discrete time, the formulation is simple and can be used to model reliability as a first approach. Moreover, in most cases the numerical accuracy of this formulation is good. In the case of CTMC we give explicit formulae for reliability-related indicators in continuous time. As far as semi-Markov processes are concerned, we give an explicit formulation of reliability-related indicators in continuous time and with a finite state space. We also give statistical estimators for reliability and availability.
We then continue with the basics of Monte Carlo methods, which are used quite often in reliability theory. The simplicity of a Monte Carlo method, its efficiency in higher dimensions, as well as its ability to model any arbitrary system, are the main advantages of this method. We present the basic idea of the Monte Carlo method, which was originally used to estimate integrals, and we continue with the presentation of simple algorithms for the simulation of discrete and continuous random variables (r.v.), as well as DTMC and CTMC. We discuss the problem of rare event estimation and we briefly review the basic principles of the well-known variance reduction methods. Importance sampling is also quite useful in reliability systems, where some basic system parameters can be estimated by simulating the corresponding model over regenerative cycles.

We conclude with two algorithms for the simulation of semi-Markov processes and the results obtained for a simple three-state system.
This chapter principally consists of two parts: the first one contains the basic probabilistic models in reliability, while the second one deals mainly with Monte Carlo simulation of reliability systems.

2. Discrete time Markov chains and reliability

2.1. Basics of Markov chains

2.1.1. Definitions
Let X = (Xn, n ∈ ℕ) be an E-valued stochastic process defined on (Ω, ℱ, ℙ).

DEFINITION 2.1. The stochastic process X is a discrete time Markov chain (DTMC) if, for all n ∈ ℕ, and for all i, j, i0, i1, ..., i_{n−1} in E, we have

ℙ(X_{n+1} = j | X0 = i0, X1 = i1, ..., X_{n−1} = i_{n−1}, Xn = i) = ℙ(X_{n+1} = j | Xn = i) = Pn(i,j) .   (1)

The family of probabilities Pn(i,j) is called the transition probability function.
In the case where this function is independent of the time n, we say that the Markov chain X is a (time) homogeneous Markov chain and we put P(i,j) := Pn(i,j) for all i, j ∈ E. In the finite case we will refer to the transition probability family P mostly as the transition matrix P, where the (i,j) entry is P(i,j).
The Chapman-Kolmogorov equation can be written as follows:

P^{n+m}(i,j) = Σ_{k∈E} P^n(i,k) P^m(k,j) ,   (2)

where P^n(i,j) is the transition probability from state i to state j in n steps of time, i.e.,

P^n(i,j) = ℙ(Xn = j | X0 = i) .

Let us also denote by α the initial distribution of X, i.e., for any i ∈ E,

α(i) = ℙ(X0 = i) .

PROPOSITION 2.1. For all n ≥ 1 and all i0, i1, ..., in ∈ E, we have:
1. ℙ(X0 = i0, X1 = i1, ..., X_{n−1} = i_{n−1}, Xn = in) = α(i0) P(i0, i1) ... P(i_{n−1}, in);
2. ℙ(X_{n+1} = i1, ..., X_{n+k−1} = i_{k−1}, X_{n+k} = ik | Xn = i0) = P(i0, i1) ... P(i_{k−1}, ik);
3. ℙ(X_{n+m} = j | Xm = i) = ℙ(Xn = j | X0 = i) = P^n(i,j).

PROPOSITION 2.2.
1. The sojourn time of the system in the state i ∈ E is a geometric r.v. with parameter P(i,i).
2. The probability that the system enters state j when it leaves state i is P(i,j)/(1 − P(i,i)).

The above two propositions allow us to simulate a Markov chain by Monte Carlo methods.

EXAMPLE 2.1. (Binary component) Consider a binary component (or system) having an up (functioning) state and a down (failure) state that we denote by 1 and 0, respectively. At time n = 0 the component starts functioning until its failure at a random time, S1 say, where it is replaced instantaneously by an identical one which lasts a random time, S2 say, and so on. The lifetime distribution of the components is θ = (θn, n ∈ ℕ*), i.e. ℙ(S1 = n) = θn. The r.v.'s U1, U2, ... defined by Un = Sn − S_{n−1} (n ≥ 1, S0 = 0) are i.i.d. and represent the lifetimes of the successive components.
Assume now that the distribution θ is the geometric one with parameter p. For a time n ≥ 0, define the r.v. Xn with values in E = {0, 1}. The event {Xn = 1} means that the component is functioning at time n and the event {Xn = 0} means that the component is failed at time n.
Then it is clear that the stochastic process X = (Xn, n ≥ 0) is a Markov chain with state space E, transition matrix P

P = ( 1 − p     p   )
    (   q     1 − q )

and initial distribution (α, 1 − α) (α ∈ [0, 1]). After some calculus we obtain the following spectral representation for the powers of the transition matrix P:

P^n = 1/(p + q) ( q  p )  +  (1 − p − q)^n/(p + q) (  p  −p )
                ( q  p )                            ( −q   q )   (3)

The state probability vector is

P(n) = (P1(n), P2(n)) ,   (4)

P1(n) = q/(p + q) + (1 − p − q)^n/(p + q) (pα − q(1 − α)) .   (5)

2.1.2. Recurrent and transient states


A state of a Markov chain may be either transient or recurrent. This is a fundamental notion in the study of Markov chains.
Let j be a fixed state and let the r.v.'s S1^j, S2^j, ... represent the times of the 1st, 2nd, ... return to state j. Define also Un^j = Sn^j − S_{n−1}^j, n = 1, 2, ... (take S0^j = S0), to be the times between two consecutive returns to state j, which are called recurrence times of state j, and

Nj^n := Σ_{k=1}^{n} 1{Xk = j} ,   (6)

Nij^n := Σ_{k=1}^{n} 1{X_{k−1} = i, Xk = j} ,   (7)

Nj := Nj^∞ ,   (8)

Nij := Nij^∞ ,   (9)

ρij := ℙi(S1^j < ∞) ,   (10)

μij := 𝔼i S1^j .   (11)

DEFINITION 2.2. A state i is called a recurrent state if ρii = 1. On the other hand, if ρii < 1, then i is called a transient state. If i is a recurrent state then either μii < ∞, and it is called a positive recurrent state, or μii = ∞, and it is called a null recurrent state.

PROPOSITION 2.3.
1. A state i ∈ E is transient iff (if and only if) ℙi(Ni = +∞) = 0 or iff Σ_n P^n(i,i) < ∞.
2. A state i ∈ E is recurrent iff ℙi(Ni = +∞) = 1 or iff Σ_n P^n(i,i) = ∞.

PROPOSITION 2.4. If i and j are two recurrent states, then

Ni^n / n →a.s. 1/μii , as n → ∞ ,   (12)

Nij^n / n →a.s. P(i,j)/μii , as n → ∞ .   (13)

2.1.3. Stationary probability and asymptotic behavior


DEFINITION 2.3. A probability distribution π on E is called stationary or invariant with respect to the Markov chain X (or to the transition probability P), if for all j ∈ E, we have

Σ_{i∈E} π(i) P(i,j) = π(j) .   (14)

Eq. (14) can be written in matrix form as

πP = π .

PROPOSITION 2.5. For every recurrent state i, we have: π(i) = 1/μii.

For each state i, define di = g.c.d. {n : n ≥ 1, P^n(i,i) > 0}.

DEFINITION 2.4. A state i ∈ E is said to be periodic if di > 1, and aperiodic if di = 1.

DEFINITION 2.5. A positive recurrent and aperiodic state is called an ergodic state. An irreducible Markov chain with all its states aperiodic is called an ergodic Markov chain.

PROPOSITION 2.6 (Ergodic theorem for Markov chains). For an ergodic Markov chain we have

P^n(i,j) → π(j), as n → ∞ .

PROPOSITION 2.7. If E is finite and the chain is irreducible and aperiodic, then π exists and P^n converges toward Π = 1π with an exponential rate.

2.2. Reliability in countable time


2.2.1. Introduction
The reliability of a system can be studied in countable time too, for example in ℕ instead of ℝ+. The appeal of this approach lies in its easy formulation and calculus. This countable formulation, which can be used as a first approach, may also be, in many cases, sufficient.
Let us consider a component (of a system) that is observed at times n = 0, 1, 2, ... and suppose that at time n it occupies state x among a number of possible states given by its state space. In the time interval (n, n + 1), where the component is not observed, it may change its state or not, with probabilities p and q = 1 − p, respectively. We can then define the "failure rate", the reliability, etc.
Let us define the ℕ-valued r.v. T which denotes the lifetime of the above component; then the failure rate is defined as follows:

λ(n) := ℙ(T = n | T ≥ n)   (15)

for any n ∈ ℕ. The sequence λ = (λ(n), n ∈ ℕ) is a failure rate of a component or a system if 0 ≤ λ(n) ≤ 1 and Σ_{n≥0} λ(n) = +∞.
Moreover, the probability of failure at time n ≥ 0 is

f(n) = ℙ(T = n) = (∏_{i=0}^{n−1} [1 − λ(i)]) λ(n)   (16)

with the convention that the empty product ∏_{i=0}^{−1}(·) = 1.


The reliability at time n > 0 is given by

R(n) = ℙ(T > n) = ∏_{i=0}^{n} [1 − λ(i)] .   (17)

If R(m) > 0, we have

λ(n) = f(n)/R(n − 1)   (18)

for all 0 < n < m.
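As a quick illustration, here is a minimal Python sketch of Eqs. (16) and (17); the function name and the constant failure-rate example are ours, not from the chapter.

```python
import numpy as np

def reliability_from_failure_rate(lam):
    """Given a discrete failure rate lam[0..N], return (f, R) where
    f[n] = P(T = n) and R[n] = P(T > n), following Eqs. (16)-(17)."""
    lam = np.asarray(lam, dtype=float)
    R = np.cumprod(1.0 - lam)                 # prod_{i=0}^{n} (1 - lam[i])
    # f(n) = (prod_{i=0}^{n-1} (1 - lam[i])) * lam[n]; the empty product is 1
    prev = np.concatenate(([1.0], R[:-1]))
    f = prev * lam
    return f, R

# Example: constant failure rate lam(n) = 0.1 (geometric lifetime)
f, R = reliability_from_failure_rate([0.1] * 10)
print(R[:5])   # 0.9, 0.81, 0.729, ...
```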

2.2.2. Multistate Markov systems
Consider a c-order system (C, φ), where C = {1, ..., c} stands for the set of its components (c ∈ ℕ*) and φ for its structure function. Let S be the state space of the system and of its components. (There is no loss of generality in considering the same state space for all components and for the system.) For example S = {0, 1, ..., M}, where M is the perfect state of the components and of the system and 0 the complete failure. When M = 1 we say that the system is binary, with 1 the working state and 0 the failure state.
Let X^i = (Xn^i, n ∈ ℕ) be an S-valued stochastic process describing the behavior of the component i, and (X^1, ..., X^c) be the S^c-valued stochastic process. This last process describes jointly the states of the system components. Consider now a one-to-one mapping h : S^c → E, where E = {1, ..., s}, s = #S^c, and the process Xn = h(Xn^1, ..., Xn^c). In general we take h to be the lexicographic order of the set S^c.

EXAMPLE 2.2. Consider a second-order binary system. We then have: c = 2, S = {0, 1}, S^2 = {(0,0), (0,1), (1,0), (1,1)} and E = {1, 2, 3, 4}.
If the components are independent and the above processes X^i are Markov, then the process X is a Markov chain too. The following proposition gives a more precise description.

PROPOSITION 2.8. Consider two independent Markov chains, X^1 and X^2 say, S^1- and S^2-valued with transition functions P^1 and P^2, respectively. Then (X^1, X^2) is an S^1 × S^2-valued Markov chain with transition function P, with P((i,j),(k,ℓ)) = P^1(i,k) P^2(j,ℓ), for all i, k ∈ S^1 and j, ℓ ∈ S^2.
The above proposition can be generalized for more than two processes.
In the case of binary systems, we have to partition the whole state space E into disjoint sets, U and D say, where U includes the working states (up states) and D the failure states (down states) (i.e. U ∪ D = E, U ∩ D = ∅, U ≠ ∅, D ≠ ∅).
The reliability-related indicators at time n ≥ 0 become:
• Reliability: R(n) = ℙ(∀v ∈ [0,n] ∩ ℕ, Xv ∈ U).
• Availability (pointwise): A(n) = ℙ(Xn ∈ U).
• Maintainability: M(n) = 1 − ℙ(∀v ∈ [0,n] ∩ ℕ, Xv ∈ D).
Consider now a binary multistate system (BMS) with state space E = {1, ..., s}, up states U = {1, ..., r} and down states D = {r + 1, ..., s}, described by an E-valued Markov chain X, with transition probability matrix P and initial distribution vector α.

Let us consider the following partition of P and α corresponding to the U and D sets:

P = ( P11  P12 )
    ( P21  P22 )

α = (α1, α2) .

We have the following results.

PROPOSITION 2.9. For the above BMS, we have:
• Availability: A(n) = α P^n 1s,r.
• Reliability: R(n) = α1 P11^n 1r.
• Maintainability: M(n) = 1 − α2 P22^n 1s−r.
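A minimal numerical sketch of Proposition 2.9 in Python (the helper name and the 3-state example matrix are ours, purely illustrative):

```python
import numpy as np

def dtmc_indicators(P, alpha, r, n):
    """Availability, reliability and maintainability at time n for a BMS
    whose up states are the first r states (0-based), per Proposition 2.9."""
    s = P.shape[0]
    ones_U = np.concatenate((np.ones(r), np.zeros(s - r)))    # vector 1_{s,r}
    A = alpha @ np.linalg.matrix_power(P, n) @ ones_U
    P11, P22 = P[:r, :r], P[r:, r:]
    R = alpha[:r] @ np.linalg.matrix_power(P11, n) @ np.ones(r)
    M = 1.0 - alpha[r:] @ np.linalg.matrix_power(P22, n) @ np.ones(s - r)
    return A, R, M

# Illustrative 3-state system: states {1, 2} up, state {3} down
P = np.array([[0.90, 0.05, 0.05],
              [0.10, 0.80, 0.10],
              [0.20, 0.00, 0.80]])
alpha = np.array([1.0, 0.0, 0.0])
print(dtmc_indicators(P, alpha, r=2, n=10))
```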

2.2.3. Hitting times


Besides the main quantities of interest in reliability theory, other quantities, such as the mean hitting times of the failure states, are also important. In what follows we will define these mean hitting times in the framework of Markov chains.
• Mean time to failure (MTTF).
• Mean time to repair (MTTR).
• Mean up time (MUT).
• Mean down time (MDT).
• Mean time between failures (MTBF).
Let T and Y be the hitting times of the sets D and U, respectively, i.e.,

T = inf{k ≥ 0 : Xk ∈ D}

and

Y = inf{k ≥ 0 : Xk ∈ U}

with inf ∅ = ∞. We have the following explicit formulas.

PROPOSITION 2.10.

MTTF := 𝔼T = α1 (I − P11)^{−1} 1r ,   (19)

MTTR := 𝔼Y = α2 (I − P22)^{−1} 1s−r .   (20)

If the Markov chain describing the above BMS is an ergodic chain, with stationary probability row vector π = (π1, π2) (a partition following the U and D sets), we have the following result.

PROPOSITION 2.11.

MUT := 𝔼β1[T] = π1 1r / (π2 P21 1r) ,   (21)

MDT := 𝔼β2[Y] = π2 1s−r / (π1 P12 1s−r) ,   (22)

where β1 and β2 are the input distributions to the sets U and D, respectively, under π.
For the MTBF we have the following representation:

MTBF = MUT + MDT .   (23)

PROPOSITION 2.12. The variance of the hitting time T of the set D is given by

Vari(T) = V(i) − (L1(i))² ,   (24)

where V = [𝔼1(T²), ..., 𝔼m(T²)]' = (I − P11)^{−1}[1 + 2 P11 (I − P11)^{−1} 1] and L1 = [𝔼1(T), ..., 𝔼m(T)]' = (I − P11)^{−1} 1, where a' means the transpose of the vector a.
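A short Python sketch of Propositions 2.10 and 2.12 (illustrative helper name, same up/down convention as the previous snippet):

```python
import numpy as np

def hitting_time_moments(P, alpha, r):
    """MTTF, MTTR and the per-state variance of the hitting time of D,
    following Propositions 2.10 and 2.12 (up states are the first r states)."""
    s = P.shape[0]
    P11, P22 = P[:r, :r], P[r:, r:]
    M11 = np.linalg.inv(np.eye(r) - P11)          # (I - P11)^{-1}
    M22 = np.linalg.inv(np.eye(s - r) - P22)      # (I - P22)^{-1}
    mttf = alpha[:r] @ M11 @ np.ones(r)
    mttr = alpha[r:] @ M22 @ np.ones(s - r)
    L1 = M11 @ np.ones(r)                          # E_i(T) for i in U
    V = M11 @ (np.ones(r) + 2 * P11 @ L1)          # E_i(T^2) for i in U
    var_T = V - L1**2
    return mttf, mttr, var_T
```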

EXAMPLE 2.3. In the case of the binary component presented in the example of Section 2.1.1, we have, for the initial distribution vector (α, 1 − α):

Reliability: R(n) = α(1 − q)^n ,

Availability: A(n) = q/(p + q) + (1 − p − q)^n/(p + q) (pα − q(1 − α)) ,

Maintainability: M(n) = 1 − (1 − α)(1 − q)^n ,

MTTF = MUT = 1/(1 − p) ,

MTTR = MDT = 1/(1 − q) .

2.3. Non-homogeneous Markov chains and reliability


Consider a Markov chain X = (Xn, n ∈ ℕ) whose transition function depends on the time n, i.e.

Pn(i,j) = ℙ(X_{n+1} = j | Xn = i)

for all i, j ∈ E and n ∈ ℕ. Then X is called a non-homogeneous discrete time Markov chain (NHDTMC).
Define also, for m ≥ n ≥ 0, the transition probabilities

Pn,m(i,j) = ℙ(Xm = j | Xn = i)

and, for n ≥ 0,

Pn,n(i,j) = 1{i=j} .

We have Pn,n+1(i,j) = Pn(i,j). Moreover, the Chapman-Kolmogorov equation is

Pn,m(i,j) = Σ_{k∈E} Pn,r(i,k) Pr,m(k,j)

for n ≤ r ≤ m.
Define now the matrices Pn = (Pn(i,j); i, j ∈ E) and Pn,m = (Pn,m(i,j); i, j ∈ E). From the Chapman-Kolmogorov equation we can easily obtain

Pn,n+r = ∏_{k=0}^{r−1} P_{n+k} .

Let α be an initial distribution on E. Then, using the conditional probability formula and the Markov property, the following relations can be easily derived:

PROPOSITION 2.13. For all n ≥ 1 and all i0, i1, ..., in ∈ E, we have:
1. ℙ(X0 = i0, X1 = i1, ..., X_{n−1} = i_{n−1}, Xn = in) = α(i0) P0(i0, i1) ... P_{n−1}(i_{n−1}, in).
2. ℙ(X_{n+1} = i1, ..., X_{n+k−1} = i_{k−1}, X_{n+k} = ik | Xn = i0) = Pn(i0, i1) ... P_{n+k−1}(i_{k−1}, ik).

EXAMPLE 2.4. Consider an NHDTMC X with transition probabilities

Pn = n+2
1 1+ •
n+--~ ~
Then for any initial distribution α, we have
(i 11@+ /
Pn,m = n+21 21
n+2 2 q- ~ /
The following results can also be found in Platis et al. (1998).

PROPOSITION 2.14. The availability of an NHDTMC system is given by

A(n) = α ∏_{k=0}^{n−1} Pk 1s,r ,  n ≥ 1 ;   A(0) = Σ_{j∈U} α(j) .

PROPOSITION 2.15. The reliability of an NHDTMC system is given by

R(n) = α1 ∏_{k=0}^{n−1} Pk^U 1r ,  n ≥ 1 ;   R(0) = Σ_{j∈U} α(j) ,

where Pk^U denotes the restriction of Pk to the up states U (and Pk^D below its restriction to D).

PROPOSITION 2.16. The maintainability of an NHDTMC system is given by

M(n) = 1 − α2 ∏_{k=0}^{n−1} Pk^D 1s−r ,  n ≥ 1 ;   M(0) = 1 − Σ_{j∈D} α(j) .

PROPOSITION 2.17. The MTTF and MTTR of an NHDTMC system are given by

MTTF = α1 ( I + Σ_{n≥1} ∏_{k=0}^{n−1} Pk^U ) 1r ,

MTTR = α2 ( I + Σ_{n≥1} ∏_{k=0}^{n−1} Pk^D ) 1s−r .

EXAMPLE 2.5. Consider an absorbing NHDTMC with transition probability matrices, for n = 0, 1, 2, ...

( (n+l)a
, (n+l)a
~ 1_~)
Pn = 1 1
(n+l)b0 (n+l)b0 1 ~ j ,

where a > 1 and b > 1. The state space partition is: U = {1, 2} and D = {3}. Then, we have:

2(a + b) ~-I
A(n)=R(n)- n[anbù_l , n> 1

and A(0) = R(0) = 1. Moreover,

2b .1+~
MTTF=I+ [e~ - 1 ] .
a+b

3. Continuous time Markov chains and reliability

3.1. Definition
This section deals with continuous time Markov chains, i.e. I = ℝ+. As in the previous section we will consider here a probability space (Ω, ℱ, ℙ) on which we define an E-valued stochastic process X = (X(t), t ∈ ℝ+). The state space E here is at most a countable space.

DEFINITION 3.1. The stochastic process X is a continuous time Markov chain (CTMC) if, for all j ∈ E, and all t, h ≥ 0, we have

ℙ(X(t + h) = j | X(u), u ≤ t) = ℙ(X(t + h) = j | X(t)), (a.s.) .   (25)

We consider here time homogeneous processes, in which case the transition function P_{t,t+h}(i,j) := ℙ(X(t + h) = j | X(t) = i), i ∈ E, j ∈ E, t, h ≥ 0, is independent of the time t. Thus we can put P_{t,t+h}(i,j) = Ph(i,j). The Chapman-Kolmogorov equation in this case can be written as follows:

P_{t+h}(i,j) = Σ_{k∈E} Pt(i,k) Ph(k,j) ,   (26)

which in matrix form can be written as

P_{t+h} = Pt Ph .   (27)

Thus the family (Pt, t ≥ 0) forms a semi-group. We will study here Markov processes with semi-group satisfying the following property:

lim_{t↓0} Pt(i,j) = 1{i=j}   (28)

for all i, j ∈ E, and the semi-group will then be called standard. If lim_{t↓0} Pt(i,i) = 1 is satisfied uniformly with respect to i ∈ E, then the semi-group will be called uniform.

3.2. Reliability in continuous time


Let us follow the same formulation of the reliability-related indicators as in the case of DTMC. The difference is that the time here is continuous.
Consider a CTMC X, with state space E = {1, ..., s}, generator A, transition function Pt and initial distribution α. As in the case of DTMC, consider the up states set U = {1, ..., r}, the down states set D = {r + 1, ..., s} and the partition of the generator and the initial distribution vector following U and D:

A = ( A11  A12 )
    ( A21  A22 )

α = (α1, α2) .

Availability:
PROPOSITION 3.1. The (pointwise) availability of a CTMC system is given by

A(t) = α e^{tA} 1s,r .   (29)

The steady-state availability, denoted by A∞, is given by

A∞ = Σ_{k∈U} π(k) = π 1s,r .

Reliability:
PROPOSITION 3.2. The system's reliability is R(t) = α1 e^{t A11} 1r.

Maintainability:
PROPOSITION 3.3. The maintainability is given by M(t) = 1 − α2 e^{t A22} 1s−r.

Hitting times:
For the mean hitting times that were defined in the previous section we have the
following:

PROPOSITION 3.4.

MTTF = −α1 A11^{−1} 1r ,
MTTR = −α2 A22^{−1} 1s−r .

PROPOSITION 3.5. If the CTMC X is ergodic, then

MUT = π1 1r / (π2 A21 1r) ,
MDT = π2 1s−r / (π1 A12 1s−r)

and MTBF = MUT + MDT.

PROPOSITION 3.6.

Var(T) = 2 α1 A11^{−2} 1r − (α1 A11^{−1} 1r)² .
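A small Python sketch of Propositions 3.1, 3.2 and 3.4 using the matrix exponential (illustrative names and example generator; scipy's expm is assumed to be available):

```python
import numpy as np
from scipy.linalg import expm

def ctmc_indicators(A, alpha, r, t):
    """Pointwise availability, reliability and MTTF of a CTMC system whose
    up states are the first r states (Propositions 3.1, 3.2 and 3.4)."""
    s = A.shape[0]
    ones_U = np.concatenate((np.ones(r), np.zeros(s - r)))
    avail = alpha @ expm(t * A) @ ones_U
    A11 = A[:r, :r]
    rel = alpha[:r] @ expm(t * A11) @ np.ones(r)
    mttf = -alpha[:r] @ np.linalg.inv(A11) @ np.ones(r)
    return avail, rel, mttf

# Illustrative 2-state repairable component: failure rate 0.01, repair rate 0.5
A = np.array([[-0.01, 0.01],
              [ 0.50, -0.50]])
alpha = np.array([1.0, 0.0])
print(ctmc_indicators(A, alpha, r=1, t=10.0))
```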

3.3. Distributions of phase type


Phase-type distributions (of Ph-distributions) play a very important role in
reliability theory for two reasons. The first one is because of a property making
this family to be a dense set in the set of all distributions on IR+, while the second
one is due to the fact that we have an explicit formulation of the basic system
operation in reliability as parallel system, series system, redundant cold standby
system, etc. (see Neuts, 1981; Asmussen, 1987).
Consider a M a r k o v process, X say, with generator A and stare space E =
{1,..,N+ 1} with the following partition: U = { 1 , . . , N } and D = { N + 1}.
Suppose that U is a transient class and state N + 1 is an absorbing state. Fol-
lowing U and D the generator and the initial distribution vector can be parti-
tioned as follows:

A:(~ A0
0)
and (c~,~ZN+I).

DEFINITION 3.2. A distribution function x ↦ F(x) on [0, ∞) is of phase type (Ph-distribution) if it is the distribution function of the absorption time of a CTMC defined as above. The couple (α, T) is called a representation of F.

The distribution of the absorption time in the state N + 1 is given, for x ≥ 0, by

F(x) = 1 − α e^{xT} 1 .

Properties (Neuts, 1981):



1. It possesses an atom at x = 0, which is equal to α_{N+1}. The absolutely continuous part has density f, given by f(x) = F'(x) = α e^{xT} T0, x > 0.
2. The Laplace-Stieltjes transform F̃ of F is given by

F̃(s) = α_{N+1} + α(sI − T)^{−1} T0, for Re(s) ≥ 0 .   (30)

3. The moments (non-centered) are (−1)^n n! (α T^{−n} 1), n ≥ 0.
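A brief Python sketch (our own illustrative helper, with a hypothetical 2-phase representation) that evaluates F(x), the density and the first moment of a Ph-distribution from a representation (α, T):

```python
import numpy as np
from scipy.linalg import expm

def ph_cdf_pdf_mean(alpha, T, x):
    """CDF, density and mean of a phase-type distribution with
    representation (alpha, T); T0 = -T @ 1 is the exit-rate vector."""
    ones = np.ones(T.shape[0])
    T0 = -T @ ones
    E = expm(x * T)
    cdf = 1.0 - alpha @ E @ ones              # F(x) = 1 - alpha e^{xT} 1
    pdf = alpha @ E @ T0                      # f(x) = alpha e^{xT} T0
    mean = -alpha @ np.linalg.inv(T) @ ones   # first moment: -alpha T^{-1} 1
    return cdf, pdf, mean

# Hypothetical Erlang(2) with rate 1: alpha = (1, 0), T = [[-1, 1], [0, -1]]
alpha = np.array([1.0, 0.0])
T = np.array([[-1.0, 1.0], [0.0, -1.0]])
print(ph_cdf_pdf_mean(alpha, T, 2.0))
```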

PROPOSITION 3.7 (Convolution of two distributions of phase type - Neuts, 1981). Consider two distributions of phase type, F and G say, with representations (α, T) and (β, S) of orders M and N; then their convolution F * G is a phase-type distribution with representation (γ, L), where

L = ( T   T0 β )
    ( 0     S  )

and γ = (α, α_{M+1} β).
Let us now define some elements of Kronecker's algebra useful for the next proposition. The operation ⊕ is the Kronecker sum of two matrices. Denote by ℳmn the space of matrices of dimension m × n and let A ∈ ℳnn and B ∈ ℳmm; the Kronecker sum is defined as follows:

A ⊕ B = (A ⊗ Im) + (In ⊗ B)

with ⊗ the Kronecker product. Let A ∈ ℳkl and B ∈ ℳmn; we have A ⊗ B ∈ ℳkm×ln and

A ⊗ B = ( a11 B  ...  a1l B )
        (  ...         ...  )
        ( ak1 B  ...  akl B )

PROPOSITION 3.8 (Formation of series and parallel systems - Neuts, 1981). Consider two phase-type distributions F and G with representations (α, T) and (β, S) of order M and N, respectively; then:
1. The distribution K given by K(x) = F(x)G(x) is a Ph-distribution of representation (γ, L) of order MN + M + N with

L = ( T ⊕ S   I ⊗ S0   T0 ⊗ I )
    (   0        T        0   )
    (   0        0        S   )

and γ = [α ⊗ β, β_{N+1} α, α_{M+1} β].
2. The distribution W defined by W(x) = 1 − (1 − F(x))(1 − G(x)) is a Ph-distribution of representation (α ⊗ β, T ⊕ S).
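For instance, the Kronecker sum used in part 2 of the proposition can be formed directly with numpy (a small illustrative check, not from the chapter):

```python
import numpy as np

def kron_sum(A, B):
    """Kronecker sum A ⊕ B = A ⊗ I_m + I_n ⊗ B for A (n x n) and B (m x m)."""
    n, m = A.shape[0], B.shape[0]
    return np.kron(A, np.eye(m)) + np.kron(np.eye(n), B)

# Series system of two independent exponential components (rates 1 and 2):
# the system lifetime min(X1, X2) has representation (alpha ⊗ beta, T ⊕ S).
T = np.array([[-1.0]]); S = np.array([[-2.0]])
print(kron_sum(T, S))   # sub-generator of the "both still working" phase
```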

PROPOSITION 3.9 (Asymptotic behavior - Neuts, 1981). Let F be a Ph-distribution with representation (α, T). If T is irreducible, then F is asymptotically exponential, i.e.,

1 − F(x) = K e^{−λx} + o(e^{−λx}) ,

K > 0, λ > 0, with −λ the eigenvalue of T having the greatest real part and K = αv, where v is the right eigenvector of T corresponding to the eigenvalue −λ.

4. Semi-Markov processes and reliability

Some systems satisfy a Markov property not for all points of time but only for a
special family of increasing stopping times. These times are the state change times
of the considered stochastic process.

4.1. Basic results and definition of a Markov renewal process


4.1.1. Definition
Consider an at most countable set, E say, and a two-dimensional stochastic process (J, S) = (Jn, Sn, n ∈ ℕ), where the r.v. Jn take values in E. Consider also that the r.v. Sn take values in ℝ+ and satisfy 0 = S0 ≤ S1 ≤ S2 ≤ ...

DEFINITION 4.1. The stochastic process (J, S) is called a Markov renewal process (MRP) if it satisfies the following relation:

ℙ(J_{n+1} = j, S_{n+1} ≤ t | J0, J1, S1, ..., Jn, Sn) = ℙ(J_{n+1} = j, S_{n+1} ≤ t | Jn, Sn) =: Q_{Jn j}(t − Sn) , (a.s.)

for all n ∈ ℕ, j ∈ E and t ∈ ℝ+.


The set E is called the state space of the MRP.
From these relations it is clear that (Jn) is a Markov chain with state space E and transition probability P(i,j) = Qij(∞). It is called the embedded Markov chain. For every i ∈ E, we have P(i,i) = 0.
Let us now define the counting process (N(t), t ≥ 0) associated to the point process (Sn, n ≥ 0), i.e., for each time t ≥ 0 the r.v. N(t) is

N(t) := sup{n : Sn ≤ t}

and define the continuous time process Z = (Z(t), t ∈ ℝ+) by

Z(t) := J_{N(t)} .

Then, the process Z is called a semi-Markov process. Define also


Pij(t) = ℙ(Z(t) = j | Z(0) = i) ,

Hi(t) = Σ_{j∈E} Qij(t) ,

mi = ∫_0^∞ [1 − Hi(u)] du .

The MRP considered here will be strongly regular, i.e., for every t ∈ ℝ+, N(t) < ∞ (a.s.), and for all j ∈ E, mj < ∞.

4.1.2. Particular cases


1. Discrete time Markov chain

Qij(t) = P(i,j) 1{t≥1} for all i, j ∈ E, and t ≥ 0 .   (31)

2. Continuous time Markov chain

Qij(t) = P(i,j)(1 − e^{−λ(i)t}) for all i, j ∈ E, and t ≥ 0 .   (32)

3. Renewal processes:
(a) Ordinary: it is an MRP with two states E = {0, 1}, P(0,1) = P(1,0) = 1 and Q01(·) = Q10(·) = F(·), where F is the common distribution of the inter-arrival times of the renewal process.
(b) Modified or delayed: it is an MRP with E = {0, 1, 2}, P(0,1) = 1, P(1,2) = P(2,1) = 1 and 0 elsewhere, and Q01(·) = F0(·), Q12(·) = Q21(·) = F(·), where F0 is the distribution function of the first arrival time and F is the common distribution function of the other inter-arrival times of the renewal process.
(c) Alternating: E = {0, 1}, P(0,1) = P(1,0) = 1, and 0 elsewhere, and Q01(·) = F(·), Q10(·) = G(·), where F and G are the distribution functions corresponding to the odd and even inter-arrival times.

4.1.3. Markov renewal equation


By a renewal argument, we obtain

Pij(t) = 1{i=j}(1 − Hi(t)) + Σ_{k∈E} ∫_0^t Qik(ds) Pkj(t − s) ,

which in matrix form becomes

P(t) = [I − H(t)] + Q * P(t) .

Solution:
If sup_j Hj(t) < 1 for some t > 0 and max_{i,j} sup_x |(I − H(x))(i,j)| ≤ 1, the solution of the above MRE exists, is unique and is given by

P(t) = (I − Q(t))^{(−1)} * (I − H(t)) ,


where

ψ(t) = (I − Q(t))^{(−1)} = Σ_{n≥0} Q^{(n)}(t)

is the Markov renewal function. Moreover,

Qij^{(n)}(t) = Σ_{k∈E} ∫_0^t Qik^{(n−1)}(t − u) Qkj(du),  if t ≥ 0 ;  = 0, if t < 0 ,

and

Qij^{(0)}(t) = 1{i=j} 1{t≥0} ,

Qij^{(1)}(t) = Qij(t) .

4.1.4. Limit distributions and theorems


Here we give some limit theorems useful for the reliability analysis. More
importantly, these theorems give an extension of the basic classical limit theorems
in probability theory to the semi-Markov setting.

THEOREM 4.1 (Steady-state distribution, Taga, 1963).

πj = lim_{t→∞} Pij(t) = νj mj / Σ_{i∈E} νi mi ,

where ν = (νi) is an invariant measure of (Jn) and m = (mi) with mi = 𝔼i[S1] = ∫_0^∞ (1 − Hi(t)) dt.
The following two theorems are straightforward applications of the Blackwell
renewal theorem and of the key renewal theorem, respectively.

THEOREM 4.2 (Blackwell type theorem, Çinlar, 1969).

ψij(t) − ψij(t − c) → c/μjj , as t → ∞ .

THEOREM 4.3 (Key renewal type theorem, Çinlar, 1969). If i is a non-periodic persistent state and h a directly Riemann integrable function, then

∫_0^t ψii(dy) h(t − y) → (1/μii) ∫_0^∞ h(y) dy , as t → ∞ .

THEOREM 4.4 (Law of large numbers, Taga, 1963).

(1/n) Σ_{k=1}^{n} Xk → 𝔼[X1] , as n → ∞ .

THEOREM 4.5 (Central limit theorem, Taga, 1963).

(X1 + ... + Xn − n 𝔼[X1]) / (σi √n) → N(0, 1) ,

under the hypotheses that 𝔼i[X1] < ∞ and σi² = 𝔼i([Y^i − S1^i 𝔼(X1)]²) < ∞, where Y^i = Σ_{j = Sn^i + 1}^{S_{n+1}^i} Xj, S0^i = 0, n = 1, 2, ..., and Sn^i is the nth recurrence time of state i for the Markov chain (Jn).

4.1.5. A central limit theorem


Let f be a real measurable function defined on E × E × ℝ. Define, for each t ≥ 0, the functional Wf(t) by

Wf(t) = Σ_{i,j} Σ_{n=1}^{Nij(t)} f(i, j, Xijn)

when the series converges. Put:

Aij = ∫_0^∞ f(i,j,x) dQij(x) ,   Ai = Σ_{j=1}^{s} Aij ,

Bij = ∫_0^∞ (f(i,j,x))² dQij(x) ,   Bi = Σ_{j=1}^{s} Bij

and
s

mi = ZAJ/
j=l
ii 5

= - + Br i;/ )j
r--1
s

+ 2E Z E AreAklti*i(#*ti + #i*k- I~e*k)/(#~r#*kk),


r=l g,~-i k~-i

mi ~2
mf~--~ B f - -_ - - z .
#ii ~lii

THEOREM 4.6 (Pyke and Schaufele, 1964). Under the hypotheses that the above moments are finite, we have that, as t → ∞,

t^{−1/2} [Wf(t) − t·mf] →d N(0, Bf) .



4.1.6. A functional central limit theorem


Let

W(t) = ∫_0^t g(Z(u)) du

be a functional of the semi-Markov process (Z(t), t ≥ 0).


Hypothesis (H):
• the EMC (Jn) has a unique invariant measure ν;
• ||P^{(n)}(x, ·) − ν|| ≤ φ(n);
• ν(g) = Σ_{x∈E} ν(x) g(x) ∫_{ℝ+} y H(x, dy) < ∞.

THEOREM 4.7 (Limnios and Oprisan, 1999a, b). Suppose that (H) is fulfilled, Σ_{n≥1} φ(n)^{1/2} < ∞ and Var_ν X1 < ∞. Put

Sn = Σ_{k=1}^{n} [g(J_{k−1}) Xk − ν(g)] ,

θn(t) = S_{[nt]} / (σ √n)

and

σ² = Var_ν[g(J0) X1] + 2 Σ_{k≥2} Cov_ν(g(J0) X1, g(J_{k−1}) Xk) .

Then σ² < ∞ and, for all probability measures μ on E,

ℙμ ∘ θn^{−1} ⇒ W ,

provided that σ² > 0, where W is the Wiener measure and ⇒ denotes weak convergence.

4.1.7. Statistical inference


Consider an observation of a sample path of the semi-Markov process on [0, T], where T is a fixed time:

ℋT = {J0, J1, ..., J_{N(T)}, X1, ..., X_{N(T)}} .

Put

Ni = Ni(T) := Σ_{k≥1} 1{J_{k−1} = i, Sk ≤ T} ,

Nij = Nij(T) := Σ_{k≥1} 1{J_{k−1} = i, Jk = j, Sk ≤ T} .

The empirical estimator of the semi-Markov kernel is:

Q̂ij(t, T) = (1/Ni) Σ_{k=1}^{N(T)} 1{J_{k−1} = i, Jk = j, Xk ≤ t} .
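As an illustration, a small Python sketch (our own helper name; the tiny example path is invented) that computes Ni and the empirical kernel estimator Q̂ij(t, T) from an observed path {J0, ..., J_{N(T)}, X1, ..., X_{N(T)}}:

```python
import numpy as np

def empirical_sm_kernel(J, X, s, t):
    """Empirical semi-Markov kernel estimator Q_hat[i, j] at time t from an
    observed path: J = visited states (length n+1), X = sojourn times (length n)."""
    J, X = np.asarray(J), np.asarray(X)
    N_i = np.zeros(s)
    Q_hat = np.zeros((s, s))
    for k in range(len(X)):
        i, j = J[k], J[k + 1]
        N_i[i] += 1
        if X[k] <= t:
            Q_hat[i, j] += 1.0
    # divide each row by the number of visits to i (rows never visited stay 0)
    nonzero = N_i > 0
    Q_hat[nonzero] /= N_i[nonzero, None]
    return Q_hat

# Tiny illustrative path on states {0, 1, 2}
J = [0, 1, 0, 2, 0, 1]
X = [1.2, 0.4, 2.5, 0.9, 1.1]
print(empirical_sm_kernel(J, X, s=3, t=1.0))
```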

Estimators of other quantities are defined by replacing the semi-Markov kernel


in the analytical expressions by the above estimator.

THEOREM 4.8 (Moore and Pyke, 1961). The empirical estimator of the semi-Markov kernel Q̂ij(t, T) is uniformly strongly consistent, i.e.

max_{i,j} sup_{0≤t≤T} |Q̂ij(t, T) − Qij(t)| →a.s. 0 , as T → ∞ .

THEOREM 4.9 (Ouhbi and Limnios, 1997). For all L ∈ ℝ+,

max_{i,j} sup_{0≤t≤L} |ψ̂ij(t, T) − ψij(t)| →a.s. 0 , as T → ∞ .

THEOREM 4.10 (Ouhbi and Limnios, 1997).

T^{1/2} [ψ̂ij(t, T) − ψij(t)] →d N(0, σ²(t)) , as T → ∞ ,

where

σ²(t) = Σ_{q=1}^{s} Σ_{k=1}^{s} μqq { (ψiq * ψkj)² * Qqk(t) − [(ψiq * ψkj * Qqk)(t)]² } .

THEOREM 4.11 (Ouhbi and Limnios, 1997).

Availability:

sup_{0≤t≤L} |Âij(t, T) − Aij(t)| →a.s. 0 , as T → ∞ .

Reliability:

sup_{0≤t≤L} |R̂ij(t, T) − Rij(t)| →a.s. 0 , as T → ∞

(L ∈ ℝ+).

4.2. Safety and reliability modeling


In reliability problems, the state space E = {1, ..., s} is usually partitioned into the following two sets:
• U = {1, ..., r}, containing the up states, and
• D = {r + 1, ..., s}, containing the down states.
Thus, the reliability-related indicators will be expressed as in Limnios (1996):

Reliability:

R(t) = α1 (I − Q11(t))^{(−1)} * (I − H1(t)) 1r .

REMARK 4.1. The same formula as above gives the safety of the system when the D set contains only unsafe states.

Availability:

A(t) = α (I − Q(t))^{(−1)} * (I − H(t)) 1s,r .

Mean time to failure:

MTTF = α1 (I − P11)^{−1} m1 .

Mean time to repair:

MTTR = α2 (I − P22)^{−1} m2 .

Mean up time:

MUT = π1 m1 / (π2 P21 1U) .

Mean down time:

MDT = π2 m2 / (π1 P12 1D) .

5. Monte Carlo methods in reliability

5.1. Introduction

A Monte Carlo method is usually used to provide an approximate solution to the following estimation problem: let h be a real function, h : ℝ → ℝ. We want to find the value of the integral

I = ∫_0^1 h(x) dx .   (33)

Even if the problem of estimating I has no probabilistic content at all, it can easily be transformed into a problem with an intrinsic probabilistic structure, i.e.

I = ∫_0^1 h(x) dx = ∫_0^1 h(x) fX(x) dx ,   (34)

where fX(x) = 1[0,1](x) is the density of a uniform U(0,1) random variable X.
Since I = 𝔼[h(X)], the estimation of the integral can be carried out using probabilistic arguments. This fact constitutes one of the greatest advantages of the Monte Carlo simulation method, together with its efficiency in higher dimensions and, of course, its obvious simplicity.
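A minimal Python sketch of this idea (the integrand h in the example is our own illustrative choice):

```python
import numpy as np

def mc_integral(h, n=100_000, rng=None):
    """Estimate I = integral of h over [0, 1] as the sample mean of h(U),
    U ~ Uniform(0, 1), together with the standard error of the estimate."""
    rng = np.random.default_rng() if rng is None else rng
    values = h(rng.random(n))
    return values.mean(), values.std(ddof=1) / np.sqrt(n)

# Example: h(x) = exp(x), exact value e - 1 ≈ 1.71828
print(mc_integral(np.exp))
```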
A Monte Carlo simulation method uses pseudo-random numbers that have the property of being almost uniformly distributed in a given interval. These pseudo-random numbers are generated by a deterministic, and not at all random, sequence constituting the pseudo-random number generator. The most common pseudo-random number generators are the linear congruential ones, described by the following recursive formula:

x0 initial value,
xn = (a x_{n−1} + c) modulo m , n ≥ 1 ,

where the initial value x0 is called the seed of the generator, and a and c are given positive integers. The elements of the generated sequence xn will then be approximately uniformly distributed in the set {0, 1, ..., m − 1}. Consequently, m is called the period of the generator and surely, after a large number of trials (less than m), the sequence of numbers will be reproduced exactly the same. It is desirable, therefore, to choose a, c and m in such a way that the period of the generator is the largest possible, and in fact constructing "good" pseudo-random number generators constitutes an active research area.
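A tiny Python sketch of such a generator (the specific constants below are illustrative, not a recommendation):

```python
def lcg(seed, a=1103515245, c=12345, m=2**31):
    """Linear congruential generator: x_n = (a * x_{n-1} + c) mod m.
    Yields numbers scaled to [0, 1)."""
    x = seed
    while True:
        x = (a * x + c) % m
        yield x / m

gen = lcg(seed=42)
print([round(next(gen), 4) for _ in range(5)])
```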
As far as reliability theory is concerned, Monte Carlo simulation is a very efficient tool since it does not make any restrictive assumptions on the system model. We can thus simulate without any particular difficulty DTMC, CTMC and semi-Markov processes, as well as non-Markovian ones. The output of the simulation of a stochastic process is normally the trajectories describing the evolution of the system in time, i.e. the state occupied by the system at any instant. No doubt, these simulated trajectories will be close to the real trajectories of the system, depending of course on the system model and our ability to describe it clearly. Consequently, a number of simulated trajectories can be used in order to estimate system parameters such as the reliability, the availability, the MTTF, the stationary distribution, etc.
In the following, we give an algorithm describing how to simulate a discrete random variable. This algorithm will be useful to construct an algorithm for the simulation of DTMC.

5.1.1. Simulating discrete random variables


In order to simulate a discrete random variable X taking values in a set E = {1, 2, ..., s}, with distribution pi, i ∈ E, i.e. pi = ℙ(X = i), the following algorithm can be used:

Algorithm 1 - Inverse transform method
Input data: The probability vector p = (pi, i = 1, ..., s).
1. Generate a uniform random number in the interval (0,1), say u;
2. Find k, k ≥ 1, such that

Σ_{i=1}^{k−1} pi ≤ u < Σ_{i=1}^{k} pi ,

using the convention that Σ_{i=1}^{0} = 0;
3. Set X = k.
The previous algorithm can be improved or modified depending on the distribution of the random variable that we want to simulate (see Ross, 1990).
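Algorithm 1 in a few lines of Python (illustrative only; numpy's searchsorted performs the step-2 search):

```python
import numpy as np

def sample_discrete(p, rng=None):
    """Inverse transform sampling of a discrete r.v. on {1, ..., s}
    with probability vector p (Algorithm 1)."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.random()
    cum = np.cumsum(p)
    # smallest k with u < sum_{i<=k} p_i; +1 for 1-based state labels
    return int(np.searchsorted(cum, u, side="right")) + 1

print(sample_discrete([0.2, 0.5, 0.3]))
```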

5.1.2. Simulating discrete time Markov chains


Algorithm 2
Input data: The state space of the DTMC, its initial law μ(·), as well as its transition matrix P.
1. Sample a r.v. X ~ μ and put n = 0, Xn(ω) = X(ω);
2. Generate a uniform random number in the interval (0,1), say u;
3. Find k such that

Σ_{j=1}^{k−1} P(Xn(ω), j) ≤ u < Σ_{j=1}^{k} P(Xn(ω), j) ;

4. Set n = n + 1, Xn(ω) = k;
5. Repeat steps 2-4 for the number of jumps of the chain needed;
6. Output the sequence of states visited, described by Xn(ω).
Different implementations of the previous algorithm are also possible. For example, given that we are in state i at time n, one could generate a geometric random variable having parameter 1 − P(i,i) to find out when the chain will exit the current state i, and then use the transformed transition matrix to find the state into which the process will finally jump.
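A sketch of Algorithm 2 in Python (illustrative helper names; states are 1-based as in the text, and the example matrix is the binary component of Example 2.1 under our reconstructed convention):

```python
import numpy as np

def simulate_dtmc(P, mu, n_steps, rng=None):
    """Simulate a trajectory of a DTMC with transition matrix P and initial
    law mu for n_steps jumps (Algorithm 2); states are returned 1-based."""
    rng = np.random.default_rng() if rng is None else rng
    P, mu = np.asarray(P), np.asarray(mu)
    # inverse-transform step (Algorithm 1) applied to a probability vector
    draw = lambda p: int(np.searchsorted(np.cumsum(p), rng.random(), side="right")) + 1
    path = [draw(mu)]
    for _ in range(n_steps):
        path.append(draw(P[path[-1] - 1]))
    return path

# Binary component of Example 2.1 with p = 0.1, q = 0.5
P = np.array([[0.9, 0.1], [0.5, 0.5]])
print(simulate_dtmc(P, mu=[1.0, 0.0], n_steps=10))
```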
Consider again the binary system described in Section 2.1.1. One typical (simulated) trajectory of the corresponding Markov chain is given in Figure 1. Moreover, independent simulations can be carried out in order to estimate the principal system parameters.

Fig. 1. A typical trajectory of the binary system (state occupied by the system plotted against time).

5.1.3. Simulating continuous random variables


PROPOSITION 5.1 (see Ross, 1990). Let U be a uniform (0, 1) random variable. For any continuous distribution function F, the random variable X defined by X = F^{−1}(U) has distribution F (here, F^{−1}(u) is defined to be the minimum value of x such that F(x) = u).
For example, if we want to generate a sample from an exponential random variable with distribution function F(x) = 1 − e^{−λx}, x ≥ 0, λ > 0, all we have to do is to first generate a uniform random number u in (0,1) and then take

x = F^{−1}(u) = −(1/λ) log(1 − u) .

This is very useful in the simulation of CTMC, where the sojourn times in the different states have indeed an exponential distribution.

5.1.4. Simulating continuous time Markov chains


Algorithm 3
Input data: The state space of the CTMC, its initial law μ(·), the exit rates at each state of the state space (q(x)), as well as the transition matrix P of the embedded Markov chain.
1. Sample a r.v. X ~ μ and set t = 0, X(t, ω) = X(ω);
2. Generate τ, an exponential random variable with parameter q(X(t, ω));
3. Using the transition probability matrix of the embedded Markov chain, find the state y into which the chain will jump;
4. Set t = t + τ, X(t, ω) = y;
5. Repeat steps 2-4 for the number of jumps of the chain needed, or until the time t becomes greater than the observation period T.
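A compact Python sketch of Algorithm 3 (illustrative; the exit rates are passed as a vector q indexed by state, and the example rates are ours):

```python
import numpy as np

def simulate_ctmc(P_embedded, q, mu, T, rng=None):
    """Simulate a CTMC trajectory on [0, T] (Algorithm 3): embedded jump chain
    P_embedded, exponential sojourn times with state-dependent rates q."""
    rng = np.random.default_rng() if rng is None else rng
    draw = lambda p: int(np.searchsorted(np.cumsum(p), rng.random(), side="right")) + 1
    times, states = [0.0], [draw(mu)]
    t = 0.0
    while True:
        i = states[-1]
        t += rng.exponential(1.0 / q[i - 1])      # sojourn time ~ Exp(q(i))
        if t > T:
            break
        times.append(t)
        states.append(draw(P_embedded[i - 1]))
    return times, states

# Two-state repairable component: failure rate 0.01, repair rate 0.5
P_emb = np.array([[0.0, 1.0], [1.0, 0.0]])
q = [0.01, 0.5]
print(simulate_ctmc(P_emb, q, mu=[1.0, 0.0], T=100.0))
```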

6. Variance reduction methods

Consider now the class of systems whose failure is an event of small probability. Examples of this type of system are computer systems and networks, nuclear stations, communication systems, etc. In order to analyze the behavior of such systems, our principal tool will be the variance reduction methods.
Throughout this chapter the term "rare" will imply a probability that is "sufficiently" small. For example, in the communication systems field we are interested in the probability of error during the transmission of bits. This probability is often smaller than 10^{-6}, and doing the analysis of such a system by taking into account events of this type becomes an extraordinary task. We have to underline, however, that a clear definition of a rare event does not exist, since it depends not only on the probability of the event but also on the system model. Moreover, the term "rare" reflects our difficulty and the effort required in obtaining estimates of the associated measure. Thus, the event of a failure of a two-component system having a failure probability of 10^{-9} may be stated as "less rare" than the failure of a 300-component system whose corresponding failure probability is 10^{-8}. Moreover, the notion of rare event changes as years pass by.
It may also be the case that the system under question is quite complex, containing a large number of components. To model a system having n components we need a state space of at least 2^n different states. The analysis becomes more complicated when the system at hand is, furthermore, non-Markovian in nature.
In the case of a large state space, state lumping or state aggregation methods can be used (cf. Goyal et al., 1987; Kemeny and Snell, 1976), allowing us to reduce the dimension of the problem and treat a system S1 instead of the original S. However, these techniques run into practical difficulties as a considerable amount of computer time and memory is still required. Furthermore, they are not easily applied to complicated system models, and even if this is possible it may sometimes be quite difficult to assess the error incurred through the state aggregation process.
Existing analytical methods are not efficient in such settings and we are quickly obliged to use simulation in order to estimate the system's performance. In the case of rare events, direct simulation is not efficient since, in order to get an idea about the probability of the event, we have to hit it several times and thus use a large number of samples. For instance, if we want a 20% error and a 95% confidence interval for the estimation of a probability of the order of 10^{-6}, we will certainly need at least 10^8 realizations.

Given that direct simulation is too expensive in terms of the number of samples
needed in order to obtain a given confidence level, other simulation methods have
appeared to face this problem. These methods are known as variance reduction
methods, and are quicker than the standard Monte Carlo simulation scheine and
can be principally classified into four separate categories (see Bratley et al., 1987;
Fishman, 1996; Ripley, 1987; Ross, 1990):
1. correlation methods;
2. conditioning methods;
3. importance sampling methods;
4. other methods.
In the first category, we can find the method of antithetic variables together with
the one of control variables. The underlying principal idea consists in using
correlated (negatively or positively) random variables in order to reduce as much as possible the variance of the corresponding estimator.
The second category comprises the method of conditioning as well as the
method of stratified sampling, both of which are based on the formula of the
conditional expectation/variance.
The third one deals with the method of importance sampling, a method that is actually used to estimate system parameters, especially in highly reliable Markovian systems (see Glynn and Iglehart, 1989; Heidelberger, 1995). This method consists in modifying the original probabilistic dynamics of the system and carrying out the simulation using a new distribution. A compensatory factor, called the likelihood ratio, will remove the bias introduced to the estimator by this change of measure. The aim of this method is to choose a new distribution that will result in a variance reduction. Normally, this distribution has to favor the rare event in question and make it happen more frequently. In such cases, a significant amount of variance reduction may be obtained. However, a bad choice of this new distribution may give rise to an infinite variance and therefore we have to be very careful when making such changes of measure.

6.1. Correlation methods


6.1.1. Antithetic variables
Suppose that we are interested in estimating a parameter θ associated to a given system model, and consider that θ can be expressed as the expected value of a function of a random variable X. The natural way to estimate θ would be to consider independent replications of the random variable X, say X_1, X_2, ..., X_n, and take as estimator of θ the quantity

   θ̂_n = (1/n) Σ_{i=1}^{n} X_i ,

which is an unbiased estimator for θ. Moreover, its variance is given by

   Var[θ̂_n] = Var[X_1]/n .

Consider now the case when n = 2k, with k ∈ N, and the estimator

   θ̂'_n = (1/k) Σ_{i=1}^{k} (X_{i1} + X_{i2})/2 ,

which is also an unbiased estimator for θ and whose variance is given by

   Var[θ̂'_n] = Var[X_{11} + X_{12}]/(4k) .

In such cases it will be advantageous to have the random variables X_{11}, X_{12} negatively correlated, since then we will have that Var[θ̂'_n] < Var[θ̂_n]. This is exactly the case when X_{11} can be expressed as a function of m random numbers, X_{11} = h(U_1, U_2, ..., U_m), while X_{12} = h(1 − U_1, 1 − U_2, ..., 1 − U_m). Of course the two random variables have the same distribution and, given that the function h is monotone with respect to all of its arguments, they are negatively correlated. In reliability systems, the function h normally stands for the structure function of the system, which in most cases is monotone. Thus, using the method of antithetic variables we have the following two benefits:
• the estimator has smaller variance than the direct estimator;
• only half of the random numbers are necessary to carry out the simulation.
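As a hedged illustration of these two points, the Python sketch below compares the direct and the antithetic estimators for a monotone function h of m uniform random numbers; the particular h used here (the lifetime of a 2-out-of-3 system with uniform component lifetimes) is only an example and is not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

def h(u):
    """Monotone function of m = 3 uniforms: the lifetime of a 2-out-of-3 system
    whose component lifetimes are U(0, 1) (a purely illustrative choice)."""
    return np.sort(u)[1]          # second smallest value = time of the 2nd failure

m, k = 3, 5000

# direct estimator based on 2k independent replications
direct = np.array([h(rng.uniform(size=m)) for _ in range(2 * k)])

# antithetic estimator: k pairs (U, 1 - U), i.e. only k * m random numbers
antithetic = np.array([0.5 * (h(u) + h(1.0 - u))
                       for u in (rng.uniform(size=m) for _ in range(k))])

print("direct    :", direct.mean(), " variance ~", direct.var(ddof=1) / (2 * k))
print("antithetic:", antithetic.mean(), " variance ~", antithetic.var(ddof=1) / k)
```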

6.1.2. Control variables


Consider again the problem of estimation of θ = E[X] and let Y be a supplementary random variable for which the expected value μ_Y = E[Y] is already known. The method of control variables consists in using the information we have about the r.v. Y in order to improve our estimates of θ. Notice that the variance of the following unbiased estimator for θ:

   X + c(Y − μ_Y) ,  (35)

is equal to

   Var[X + c(Y − μ_Y)] = Var[X] + c² Var[Y] + 2c Cov[X, Y]  (36)

and thus, the value of c minimizing (36) is given by

   c* = − Cov[X, Y]/Var[Y] .

Moreover, using such a value for the parameter c, the variance of the estimator (35) becomes

   Var[X + c*(Y − μ_Y)] = Var[X] − (Cov[X, Y])²/Var[Y] ,

which is evidently smaller than the variance of X itself.



Even if using the method of control variables we have the benefit of effectively reducing the variance of the estimator, this method has the inconvenience of requiring some additional information. We need to know, for example, Var(Y) and Cov(X, Y), which is not always the case. In practice, these quantities have to be estimated using the first samples of the simulation and the estimates obtained can be used to give an approximate value for c*. We can then carry out the rest of the simulation using this value.
The name of control variables stems from the fact that the random variable Y and the samples obtained by the simulation play the role of the correction/control factor. In other words, when the simulated Y value is greater than its already known expected value, then, if X and Y are positively correlated, X will also have the tendency to be greater than its mean (θ). Moreover, in this case Cov(X, Y) will be positive, thus making c* negative, which will consequently adjust the value of X + c*(Y − μ_Y) closer to θ than X itself. Similar arguments can be given for the case when X and Y are negatively correlated or the observed value of Y is smaller than its known mean.
No doubt, the efficiency of the method depends principally on the chosen control variable Y as well as on the precision we have in estimating the c* value.
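A small Python sketch of this procedure, under purely illustrative assumptions (X = exp(Z) with Z standard normal, and Y = Z as the control variable with known mean 0), might look as follows; c* is estimated from the first samples of the simulation, as described above.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

z = rng.normal(size=n)
x = np.exp(z)          # quantity of interest, theta = E[X]
y = z                  # control variable with known mean mu_Y = 0
mu_y = 0.0

# pilot estimates of Cov(X, Y) and Var(Y) give an approximate optimal c*
pilot = slice(0, 1000)
c_star = -np.cov(x[pilot], y[pilot])[0, 1] / np.var(y[pilot], ddof=1)

cv = x + c_star * (y - mu_y)          # control-variate estimator, Eq. (35)
print("plain estimate   :", x.mean(),  " variance ~", x.var(ddof=1) / n)
print("control variates :", cv.mean(), " variance ~", cv.var(ddof=1) / n)
```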

6.2. Conditioning methods


6.2.1. Conditioning
The conditioning method is based on the following expression for the variance:

   Var[X] = E(Var[X | Y]) + Var(E[X | Y]) .  (37)

Since the variance is always non-negative, the terms on the right of this last expression are both non-negative, thus implying that

   Var[X] ≥ Var(E[X | Y]) .  (38)

However, since E{E[X | Y]} = E[X] = θ, E[X | Y] can also be used as an unbiased estimator of θ, and it is preferable to do so since its variance will be smaller than the one of the direct estimator, as indicated by (38). In fact, in this case we prefer to simulate the random variable Y and, by observing its values, draw conclusions about X. In this, we suppose that E[X | Y] is known or can be easily determined from the simulation run.
This method may be quite efficient in the case where we are interested in estimating the failure probability of a k-out-of-n system. The variable Y will represent n − 1 of the components and, if we know the state of these components, we will probably be in a position to determine the state of the system.
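The sketch below illustrates this idea on a hypothetical 2-out-of-3:F system (the system fails when at least two of its three independent components fail); the component failure probabilities are made-up values, and the conditioning variable Y is taken to be the state of the first two components.

```python
import numpy as np

rng = np.random.default_rng(2)

p = np.array([0.05, 0.10, 0.15])   # illustrative component failure probabilities
k, n_runs = 2, 20_000              # system fails when at least k of the 3 components fail

direct, conditioned = [], []
for _ in range(n_runs):
    # direct simulation of all three components
    fails = rng.uniform(size=3) < p
    direct.append(1.0 if fails.sum() >= k else 0.0)

    # conditioning: simulate only the first two components (the variable Y)
    y = rng.uniform(size=2) < p[:2]
    m = int(y.sum())
    if m >= k:
        conditioned.append(1.0)        # the system has already failed
    elif m == k - 1:
        conditioned.append(p[2])       # E[1{failure} | Y]: decided by the last component
    else:
        conditioned.append(0.0)        # the system can no longer fail

direct, conditioned = np.array(direct), np.array(conditioned)
print("direct      :", direct.mean(),      " variance ~", direct.var(ddof=1) / n_runs)
print("conditioned :", conditioned.mean(), " variance ~", conditioned.var(ddof=1) / n_runs)
```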

6.2.2. Stratified sampling


Stratified sampling is a variance reduction method based on the same formula (37) as the conditioning method. The idea behind this method is to separate our original sample space into different strata and carry out independent simulations in each of these. Then, by taking the expected value of the output of the simulations over all of the strata, we can obtain the desired estimate of the system's parameter.
However, the principal difference with the method of conditioning is the fact that the former uses (38) to prove the variance reduction, while in the case of stratified sampling the basic argument is that Var[X] ≥ E(Var[X | Y]). Moreover, it is sometimes difficult to separate the sample space into strata and to define how many samples we have to take from each one in order to have the largest possible variance reduction. This depends clearly on the problem at hand and may be difficult to do in the general case, except in problems having an intrinsic layered structure. See Ross (1990) for the details and examples of this method.
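As a hedged illustration, the following Python sketch stratifies the unit interval into equal-width strata with proportional allocation when estimating E[h(U)] for a uniform U; the integrand h is purely illustrative and not taken from the text.

```python
import numpy as np

rng = np.random.default_rng(3)

def h(u):
    return np.exp(u)               # illustrative integrand; E[h(U)] = e - 1

n, n_strata = 10_000, 10
per = n // n_strata                # proportional allocation: n/n_strata samples per stratum

# plain Monte Carlo estimate
plain = h(rng.uniform(size=n)).mean()

# stratified estimate: average the per-stratum means of the equal-width strata
strata_means = []
for j in range(n_strata):
    u = rng.uniform(j / n_strata, (j + 1) / n_strata, size=per)
    strata_means.append(h(u).mean())
stratified = np.mean(strata_means)

print("plain      :", plain)
print("stratified :", stratified)
```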

6.3. Importance sampling


6.3.1. Relative error
Consider now the problem of estimation of the following quantity:

   γ = E_f[h(X)] = ∫_{−∞}^{+∞} h(x) f(x) dx ,  (39)

where X is a real random variable defined on a probability space (Ω, F, P), having density f, and h: ℝ → ℝ. In order to estimate γ using Monte Carlo simulation we can generate an n-sample (ω_1, ..., ω_n) drawn from f and consider the following unbiased estimator:

   γ̂_n = (1/n) Σ_{i=1}^{n} h(ω_i) .

In the case where h(x) = 1_A(x), with 1_A(x) = 1 if x ∈ A and 0 otherwise, then γ = P(X ∈ A). Moreover, the variance of the γ̂_n estimator is equal to γ(1 − γ)/n, while the associated 100 × (1 − δ)% confidence interval will be

   γ̂_n ± z_{δ/2} √( γ̂_n (1 − γ̂_n)/n ) ,

where z_{δ/2} is defined by the equation δ/2 = P(Z > z_{δ/2}) and Z denotes a random variable having the standard normal distribution N(0, 1). If we are interested in constructing a confidence interval for γ, the natural way will be to continue the simulation until the interval's half width becomes less than κ (κ ∈ ]0, 1[) times the value of the parameter that we are trying to estimate. Thus, the stopping criterion for our simulation will be

   z_{δ/2} √( γ̂_n (1 − γ̂_n)/n ) < κ γ̂_n ,  which implies that  z_{δ/2} √( (1 − γ̂_n)/(n γ̂_n) ) < κ .  (40)

Note, however, that the relative error (RE) of the estimator γ̂_n, which is defined to be the ratio of its standard deviation to its expected value, will be given by (as n → +∞)

   RE(γ̂_n) = z_{δ/2} √( (1 − γ̂_n)/(n γ̂_n) ) ≈ z_{δ/2}/√(nγ) ,  since γ̂_n → γ .

It is this last equation that clearly illustrates the inconvenience of using direct simulation: the relative error of the estimator grows without bound as the event becomes rarer and rarer (i.e., RE(γ̂_n) → +∞ when γ → 0). It also means that, in order for Eq. (40) to be satisfied and thus obtain the desired relative precision of estimation, we have to considerably increase the size n of the sample. In other words, in order to estimate γ up to a certain level of precision, one has to increase the number of simulation runs as the probability of the event becomes smaller and smaller.
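As a quick numerical check of this point (not from the text), the smallest n satisfying the stopping rule (40) is n ≥ z²_{δ/2}(1 − γ)/(κ²γ); the small Python snippet below reproduces the order of magnitude quoted earlier, about 10^8 runs for γ ≈ 10^-6 with κ = 0.2 and 95% confidence.

```python
from scipy.stats import norm

def required_runs(gamma, kappa=0.2, delta=0.05):
    """Smallest sample size n satisfying the stopping rule (40),
    i.e. z_{delta/2} * sqrt((1 - gamma) / (n * gamma)) < kappa."""
    z = norm.ppf(1.0 - delta / 2.0)
    return z ** 2 * (1.0 - gamma) / (kappa ** 2 * gamma)

print(f"{required_runs(1e-6):.3g}")   # roughly 1e8 realizations
```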

6.3.2. Some background theory


Importance sampling is a method that may help us to overcome the previous difficulty. The basic idea of the method is to change the original probabilistic dynamics of the system, and modify at the same time the function to be integrated.
This change of measure is illustrated as follows:

   γ = ∫_{−∞}^{+∞} h(x) (f(x)/f'(x)) f'(x) dx = E_{f'}[h(X) L(X)] ,  (41)

where L(X) = f(X)/f'(X) represents the corresponding likelihood ratio and the subscript f' means that the expected value is now taken with respect to the new density f'.
The name given to this method is due to the fact that the process is sampled in the areas that are more important for the estimation of γ; in the case where h(x) = 1_A(x), these are the areas where the event {X ∈ A} is realized. Consequently, the new density f' has to be chosen in a way to make the rare event under consideration more likely to occur. Since this change of measure introduces a bias to our estimation, the results obtained by the simulation have to be multiplied by the appropriate likelihood ratio. This term plays the role of the compensatory factor, since the system has been simulated using a probability measure that is not directly associated to the system's model.
Eq. (41) is valid only in the case that f'(x) > 0 for every x ∈ ℝ with f(x) > 0 and h(x) > 0, which implies that a possible value of X under f is also possible under f'. It is possible, however, to have f'(x) = 0 and f(x) > 0 for any x ∈ ℝ with h(x) = 0. By making this change of measure, the new unbiased estimator of γ will be γ̂_n(f') = (1/n) Σ_{i=1}^{n} h(ω_i) L(ω_i), where the new n-sample (ω_1, ..., ω_n) has now been generated using density f'. Its corresponding variance is given by

   Var_{f'}[γ̂_n(f')] = (1/n) { ∫_{−∞}^{+∞} h²(x) L(x) f(x) dx − γ² } = (1/n) { E_f[h²(X) L(X)] − γ² } .  (42)

The main aim of importance sampling is to find a suitable - and easily implementable - new density f' in order to minimize the variance of γ̂_n(f') and, by doing this, reduce the cost of the estimation procedure. Thus, using importance sampling the rare event has to be realized more often, meaning that its new probability must be greater than the original one. The corresponding L term in Eq. (42) has to be kept as small as possible. In the case where h(x) = 1_A(x) and the L term is uniformly less than one, then E_f[h²(X)L(X)] ≤ γ, so Var_{f'}[γ̂_n(f')] < (γ − γ²)/n = Var_f[γ̂_n(f)] and we will certainly obtain a variance reduction. Another alternative would be to choose f' in a way that E_f[h²(X)L(X)] is of the same order of magnitude as γ². In such cases the associated change of measure is sometimes called asymptotically efficient or asymptotically optimal (see the survey of Heidelberger, 1995 for a discussion on this matter).
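To make the mechanics concrete, here is a small self-contained Python sketch (not taken from the text) that estimates the rare tail probability γ = P(X > a) for X ~ Exp(1) by sampling from a heavier exponential density f' and weighting each sample by the likelihood ratio L = f/f'.

```python
import numpy as np

rng = np.random.default_rng(4)

a, n = 20.0, 100_000                      # gamma = P(X > a) = exp(-a) ~ 2e-9 for X ~ Exp(1)

# direct simulation: with n = 1e5 samples the event is essentially never hit
x = rng.exponential(1.0, size=n)
direct = (x > a).mean()

# importance sampling: draw from f'(x) = lam * exp(-lam * x) with lam = 1/a,
# so that the event {X > a} is no longer rare under f'
lam = 1.0 / a
y = rng.exponential(1.0 / lam, size=n)
L = np.exp(-y) / (lam * np.exp(-lam * y))     # likelihood ratio f(y)/f'(y)
est = np.where(y > a, L, 0.0)

print("exact              :", np.exp(-a))
print("direct simulation  :", direct)
print("importance sampling:", est.mean(), "+/-", 1.96 * est.std(ddof=1) / np.sqrt(n))
```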
An optimal change of measure is defined to be a measure that results in a zero variance estimator for the unknown quantity (see Kuruganti and Strickland, 1995), and it always exists. For our example, this corresponds to choosing

   f*(x) = h(x) f(x)/γ .  (43)

Using the optimal change of measure for the simulation, the exact value of the parameter will be obtained in the first simulation run. Unfortunately, it has the disadvantage of containing γ, the parameter that we are trying to estimate, thus making it not directly exploitable. Nevertheless, in some special cases we can explicitly construct this optimal change of measure, which will enable us not only to estimate γ at a minimum cost, but also - and more importantly - to find its exact value, as a by-product of the intermediate calculations (see Kuruganti and Strickland, 1995, 1997).
The conditions on the applicability as well as the theoretical framework behind
importance sampling are given in Glynn and Iglehart (1989). In their work, im-
portance sampling is extended to problems arising in the simulation of both
discrete time and continuous time Markov processes, as well as in generalized
semi-Markov processes. In the same paper, the authors discuss the problem of
steady-state quantities estimation, that can be carried out by exploiting the re-
generative structure of the Markov chain, as well as the estimation of transient
quantities, where a different approach has to be used.

6.3.3. The optimal change of measure


Let us consider again expression (43) and let h(x) = 1_A(x). In this case, the choice

   f*(x) = f(x) 1_A(x)/γ  (44)

corresponds to the optimal change of measure associated to the estimation of γ = P(X ∈ A). Even though Eq. (44) seems at first sight to be of no use, since it contains the unknown parameter, it has the benefit of providing us with a very useful insight concerning the choice of the new density f'(·). Indeed, as is indicated in Strickland (1993), the following hold:
• All the mass of the probability is concentrated on the rare event {X ∈ A}, and consequently only those samples that correspond to the realization of this event will be produced when the optimal change of measure is used to carry out the simulation.
• On A, the new density is exactly the conditional density of X, given that {X ∈ A} has occurred:

   f*(x) = f(x) 1_A(x)/P(X ∈ A) = f(x | X ∈ A) if x ∈ A, and 0 otherwise .

Therefore, the relative likelihood of the values of X on A is exactly the same for the original as well as the new distribution,

   dF*(x_1)/dF*(x_2) = dF(x_1)/dF(x_2) ,  where x_1, x_2 ∈ A ,

and F(·) represents the cdf corresponding to the density f(·).

6.4. Markovian systems and importance sampling

Consider now a continuous time Markov chain X = {X_t : t ≥ 0}, with state space E = {0, 1, ..., s}, s < +∞, infinitesimal (conservative) generator Q = {q(x, y) : x, y ∈ E}, and initial law μ(·). The quantity q(x) = −q(x, x) represents the total rate out of state x. Suppose also that the state space of the system is divided into two disjoint subsets, U and F, with U ∪ F = E and U ∩ F = ∅. The set U = {0, 1, ..., m} represents the set of operational states, while F = {m + 1, ..., s} stands for the set of failed states of the system. In state 0, all components are considered new. Define also 0 = T_0 < T_1 < ... < T_n < ..., the sequence of the successive jump times of the chain X. Then Y = {Y_n, n ≥ 0}, defined by

   Y_n = X_{T_n} ,  n = 0, 1, ... ,

will be the embedded discrete time Markov chain associated to X_t. The elements of its transition matrix P = {P(x, y) : x, y ∈ E} are given by P(x, y) = q(x, y)/q(x) when x ≠ y, and 0 otherwise.
Importance sampling can be easily extended to the case of Markovian sys-
tems. In order to modify the probabilistic dynamics of the system, one has to
basically modify the transition probability matrix or the generator of the pro-
cess (see Glynn and Iglehart, 1989) and/or the initial distribution of the process.

The estimation problems may concern transient or steady-state quantities. In


the first case, the regenerative structure of the chain is employed and the esti-
mation problem is transformed to its analog over the regenerative cycles of the
chain (see the following discussion), while in the second case the process is
simulated for a given time horizon. Moreover, in case of simulation of a con-
tinuous time Markov chain, one part of the simulation is devoted to the sim-
ulation of the corresponding embedded Markov chain where the sequence of
states visited by the process is generated. Then, given the sequence of states
visited by the chain, the second part of the simulation concerns the generation
of the associated (conditional) sojourn times in these states. This discrete time
conversion always results in a variance reduction (see Goyal et al., 1992; Fox
and Glynn, 1986).

6.4.1. Regenerative simulation


Let τ be a stopping time for {Y_n : n ≥ 0}, which means that the realization or not of the event {τ = n} may be determined by Y^n = (Y_0, ..., Y_n). Denote also by E_n the set of all possible paths of the chain Y until time n,

   E_n = { y^n = (y_0, y_1, ..., y_n) : y_i ∈ E } .

Then, the probability associated to any y^n ∈ E_n is given by

   P(y^n) = μ(y_0) P(y_0, y_1) ... P(y_{n−1}, y_n) ,

where μ(y_0) = P(Y_0 = y_0) is the initial law of the chain. Moreover, let B_n ⊂ E_n stand for the set of all paths for which {τ = n}. We have the following proposition:

PROPOSITION 6.1 (Goyal et al., 1992). Consider a discrete time Markov chain with transition probability matrix P. Let P be the probability measure associated with the different trajectories of the chain and τ a stopping time which is finite under P with probability 1. Denote also by Z a measurable function of Y^τ for which E_P[|Z(Y^τ)|] < ∞. Let P' be a new probability measure for which τ is also finite with probability 1 and, for any y^n ∈ B_n, P'(y^n) ≠ 0 whenever Z(y^n) P(y^n) ≠ 0. Then E_P[Z(Y^τ)] = E_{P'}[Z(Y^τ) L(Y^τ)], with L(y^n) = P(y^n)/P'(y^n), for any y^n ∈ B_n.

Remark, however, that in this case it is not necessary for the new importance sampling measure to correspond to a time-homogeneous Markov chain. A different measure P' given by

   P'(y^n) = P'(y_0) P'(y_1 | y_0) ... P'(y_n | y_0, ..., y_{n−1})

can also be used, and it is called a "Dynamic Importance Sampling measure" (DIS, see Shahabuddin and Nakayama, 1993 and references therein). In this, P'(y_n | y_0, ..., y_{n−1}) represents the likelihood of the transition Y_n = y_n given that Y^{n−1} = (y_0, ..., y_{n−1}).
Consider now that we are interested in estimating the steady-state unavailability of the system, α, that represents the fraction of time for which the system is considered failed. Let h(y) = 1/q(y) be the mean sojourn time in state y and let g(y) = 1_F(y) h(y). Then, we can write (see Crane and Iglehart, 1975)

   α = E_P[ Σ_{k=0}^{τ_0 − 1} g(Y_k) ] / E_P[ Σ_{k=0}^{τ_0 − 1} h(Y_k) ] .  (45)

Let us now define T_B = inf{t > 0 : X_t ∈ B}, the hitting time for B ⊂ E (with the convention that inf ∅ = +∞), and τ_B = inf{n > 0 : Y_n ∈ B}, the number of jumps for Y to enter B ⊂ E. A somewhat similar representation holds for the mean time to failure (MTTF) of the system, which may be written as (see Goyal et al., 1992)

   MTTF = E_P[T_F] = E_P[min(T_0, T_F)] / P(τ_F < τ_0) = E_P[ Σ_{k=0}^{min(τ_0, τ_F) − 1} h(Y_k) ] / E_P[ 1_{τ_F < τ_0} ] ,  (46)

where P(τ_F < τ_0) = E_P[1_{τ_F < τ_0}] stands for the probability of a failure during a cycle (a cycle is defined to be the period between two consecutive instants when the system is in state 0 and a component failure occurs).
This last expression is the key relation for the MTTF estimation using Monte Carlo simulation. Thus, in both cases the general problem of estimation boils down to the estimation of the ratio of two expected values,

   η = E_P[G] / E_P[H] ,
where G and H for the case of steady-state unavailability and MTTF estimation are given above. In the case of MTTF estimation, our main difficulty is the estimation of the denominator that corresponds to a rare event, while for the numerator we can simply use direct simulation. In the case of unavailability estimation, importance sampling can be used to estimate the numerator while direct simulation may be suitable for the denominator.
Thus, the two parts of the ratio can be simulated in an independent manner and a different number of simulation runs can be used to obtain estimates for E_P[G] and E_P[H]. This is the principle behind the "Measure Specific Dynamic Importance Sampling" (MSDIS, see Goyal et al., 1987, 1992). Suppose that a total number of n regenerative cycles will be used for the simulation, where the first ⌊ζn⌋ cycles, with 0 < ζ < 1, will be generated using P_1, while the remaining ⌈(1 − ζ)n⌉ cycles will use P_2 (P_1 ≠ P_2). Let also L represent the likelihood ratio between the original and the new measure and denote by G_j, L_j, and H_j the samples of G, L, and H, respectively, from the j-th regenerative cycle. We can then construct the following estimator for η:

   η̂(n, ζ) = ( Σ_{j=1}^{⌊ζn⌋} G_j L_j / ⌊ζn⌋ ) / ( Σ_{j=1}^{⌈(1−ζ)n⌉} H_j L_j / ⌈(1 − ζ)n⌉ ) .

This is a consistent estimator for η, and we have (see Goyal et al., 1992):

   lim_{n→∞} η̂(n, ζ) = η

with probability 1. Moreover:

   √n ( η̂(n, ζ) − η ) ⇒ N( 0, σ²(P_1, P_2)/(E_P[H])² ) ,  (47)

where "⇒" denotes convergence in distribution and

   σ²(P_1, P_2) = Var_{P_1}[G L]/ζ + η² Var_{P_2}[H L]/(1 − ζ) .  (48)

Thus, in the case of steady-state unavailability estimation we can take P_1 ≠ P and P_2 = P, while in the case of MTTF estimation we can choose P_1 = P and P_2 ≠ P.
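A schematic Python sketch of this measure specific ratio estimation is given below; the arrays GL and HL stand for the per-cycle samples G_j L_j and H_j L_j produced by the two independent simulations. How those samples are generated is system specific, so the comments only indicate the MTTF case of Eq. (46); nothing here is a result from the text.

```python
import numpy as np

def msdis_estimate(GL, HL):
    """Ratio estimator eta_hat(n, zeta): the numerator samples G_j * L_j come
    from the cycles simulated under P1, the denominator samples H_j * L_j
    from the cycles simulated under P2."""
    return np.mean(GL) / np.mean(HL)

# Schematic use for the MTTF of a highly reliable system, following Eq. (46):
#   GL[j] : sum of the holding times h(Y_k) up to min(tau_0, tau_F) in cycle j,
#           simulated under the original measure P (so L_j = 1),
#   HL[j] : 1{tau_F < tau_0} * L_j in cycle j, simulated under a failure
#           biasing measure P2, with L_j the corresponding likelihood ratio.
```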

6.4.2. Estimation of P("CF < "Co)


Consider now Markovian systems where the failure rates of individual compo-
nents are rauch smaller than the corresponding repair rates. For these systems the
entrance into the failed subset of states may be considered as a rare event and a
quantity of great interest is the probability P("CF < "CO)of failure during a cycle.
The last one is associated to the M T T F of the system (see (46)), as well as to the
unreliability of the system at time t (see Shahabuddin and Nakayama, 1993).
In particular, the following proposition holds.

PROPOSITION 6.2 (Shahabuddin, 1994). The probability 7 = P('CF < "Co) may be
represented as aoer + o(er), where ao and r are positive constants.
In this, e stands for the maximum failure rate of the components in the system
and it is exactly this parameter that reflects the highly reliable nature of the
system. For this reason, we call e the rarity parameter of the system. Conse-
quently, Markovian systems can be classified in balanced systems, where all
components have failure rates of the same order of magnitude and unbalanced
systems otherwise. See Nakayama (1994, 1995, 1996) for a detailed description of
the system model.
Thus for the estimation of the probability P("CF < "Co), good importance sam-
pling schemes are those making the first term in the expression of the variance of
the importance sampling estimator to be close enough to e2r, the order of mag-
nitude of 72. In such cases, the corresponding method will be said to possess the
bounded relative error property, since the relative error of the importance sam-
pling estimator will be bounded by a constant. The methods actually used in
practice, where the total failure (repair) probability at each state is increased
(decreased), are called failure biasing methods. They are distinguished from each
other from the way the new failure probability is allocated to individual transi-
tions. See the references for the different failure biasing methods.
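One widely used scheme of this family, balanced failure biasing, can be sketched as follows; this is an assumption-laden illustration rather than the exact construction of any particular reference. In every state that has both failure and non-failure transitions, a fixed total probability rho is given to the failure transitions, split evenly among them ("balanced"), and the remaining 1 - rho is spread over the other transitions proportionally to their original probabilities. The choice rho = 0.9 mirrors the BFB(0.9) scheme used in the example below.

```python
import numpy as np

def balanced_failure_biasing(P, is_failure, rho=0.9):
    """Build a biased embedded-chain matrix P' for importance sampling.

    P          : original transition matrix of the embedded Markov chain
    is_failure : boolean matrix, True where the transition (x, y) is a failure
    rho        : total probability assigned to failure transitions in every
                 state that has both failure and non-failure transitions
    """
    P_new = P.astype(float).copy()
    for x in range(P.shape[0]):
        fail = is_failure[x] & (P[x] > 0)
        other = (~is_failure[x]) & (P[x] > 0)
        if fail.any() and other.any():
            P_new[x, fail] = rho / fail.sum()                      # even split over failures
            P_new[x, other] = (1.0 - rho) * P[x, other] / P[x, other].sum()
    return P_new

# The likelihood ratio of a simulated path is then the product, over its
# transitions (x, y), of P[x, y] / P_new[x, y].
```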
Nakayama (1996) gave necessary and sufficient conditions for a failure biasing
method to have bounded relative error. These conditions are difficult to verify in

practice, since one has to find the order of magnitude of a large number of paths
of the process.
Note also that the results obtained by any arbitrary simulation scheme can be improved by eliminating all transitions to state 0. This is a direct consequence of the optimal change of measure (see Kuruganti and Strickland, 1997).

EXAMPLE. Consider a 3-out-of-5:F Markovian system that is considered failed if at least 3 of its 5 independent components are failed. The state space of the system is E = {0, 1, 2, 3}, where state 0 is the perfect state of the system (all components are operational) and state 3 is the failed state (F = {3}). The transition diagram of this system is illustrated in Figure 2, where the transition rates are equal to: λ_01 = 5 × 10^-4, λ_12 = 4 × 10^-4, λ_23 = 3 × 10^-4, μ_10 = 1 and μ_21 = 2. Note that the repair rate in state 3 is of no use, since the simulation ends the moment we enter state 3. We want to estimate the probability P(τ_F < τ_0) as well as the MTTF of the system using the ratio formula (46).
The simulation results are given in Figure 3. At the left-hand side, we give the estimation results for P(τ_F < τ_0) together with the ones obtained by an analytical method. In the same figure, we also represent the estimation results obtained by the same method if, before changing the transition probability matrix according to the Balanced Failure Biasing method, we eliminate all transitions to state 0. In both cases a significant amount of variance reduction is obtained. At the right-hand side we represent the estimation of the MTTF of the system using the same methods. All the results presented correspond to a 95% confidence interval with 1% error level.

7. Simulation for semi-Markov systems

In what follows, we present two different algorithms for the simulation of semi-Markov systems: the method of competing risk and the embedded Markov chain method. We also give simulation results for a simple 3-state semi-Markov system.

7.1. The method of competing risk

The method of competing risk is based on the following proposition (see Oprisan, 1999).

Fig. 2. A 3-out-of-5 system.

[Figure: two panels titled "Simulation results for the 3-out-of-5 system", plotting the estimates against the number of iterations for BFB(0.9), BFB(0.9) with no transitions to state 0, and the exact method.]

Fig. 3. Simulation results for the 3-out-of-5 system.



PROPOSITION 7.1 (Korolyuk and Turbin, 1982). For all i ∈ E, there exists a family of independent random variables {τ_ik, k ∈ E}, taking values in ℝ_+, with the distribution functions

   A_ik(t) = 1 − exp( − ∫_0^t Q_ik(du)/(1 − H_i(u)) )  if t > 0,  and 0 otherwise .  (49)

The semi-Markov matrix Q_ij(t) is given by

   Q_ij(t) = ∫_0^t h_ij(u) A_ij(du) ,  i, j ∈ E ,

where

   h_ij(u) = [ (1 − H_i(u)) / (1 − A_ij(u)) ] E[ I_ij | τ_ij = u ]

and I_ij is the indicator function of the event {min_{k∈E} τ_ik = τ_ij}.
The algorithm for the realization of one sample trajectory of the process is the following.

Algorithm 4.
Input data: The state space of the process, its initial law μ(·), and the distribution functions A_ij.
1. Sample a r.v. X ~ μ and set t = 0, X(t, ω) = X(ω);
2. Set i = X(t, ω); generate the τ_ij's (j ∈ E) using the distribution functions A_ij;
3. Set τ = min_{j∈E} τ_ij, t = t + τ, and set X(t, ω) = arg min_{j∈E} τ_ij;
4. Repeat steps 2-3 for the number of jumps of the process needed, or until the time t becomes greater than the observation period T (a Python sketch follows).
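A hedged Python sketch of Algorithm 4 is given below; the dictionary samplers maps each allowed transition (i, j) to a routine that draws one value of τ_ij from A_ij, and is supplied by the user (it is not specified in the text).

```python
import numpy as np

def simulate_semi_markov_cr(samplers, mu, T, rng=np.random.default_rng()):
    """Simulate one semi-Markov trajectory by the competing risk method.

    samplers : dict mapping (i, j) to a function rng -> one draw of tau_ij
               distributed according to A_ij (only allowed transitions present)
    mu       : initial distribution over the states 0, ..., len(mu) - 1
    T        : observation period
    """
    x = rng.choice(len(mu), p=mu)             # step 1: initial state
    t = 0.0
    states, times = [x], [0.0]
    while True:
        # step 2: one competing clock tau_xj for every possible transition out of x
        clocks = {j: draw(rng) for (i, j), draw in samplers.items() if i == x}
        if not clocks:                        # no transition out of x: absorbing state
            break
        # step 3: the smallest clock gives both the holding time and the next state
        y, tau = min(clocks.items(), key=lambda kv: kv[1])
        t += tau
        if t > T:                             # step 4: stop at the observation period
            break
        x = y
        states.append(x); times.append(t)
    return states, times
```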

7.2. Embedded Markov chain method


This method consists simply in using the transition probabilities of the embedded Markov chain in order to find the next state of the system, and generating the corresponding holding times in the states visited using the distribution functions F_ij(t). This algorithm is similar to the one used for the simulation of a CTMC and is given below.

Fig. 4. A 3-state semi-Markov system.

[Figure: two panels, "Availability for the semi-Markov system" and "Reliability for the semi-Markov system", plotted against time for the competing risk method, the embedded Markov chain method, and the analytical solution.]

Fig. 5. Graphical comparison of the simulation methods.




Algorithm 5.
Input data: The state space of the process, its initial law μ(·), the transition probabilities p_ij and the distribution functions F_ij.
1. Sample a r.v. X ~ μ and set t = 0, X(t, ω) = X(ω).
2. Set i = X(t, ω); using the transition matrix of the embedded Markov chain, find the state j into which the process will jump.
3. Generate τ, the holding time in state i, using the distribution function F_ij(t).
4. Set t = t + τ, X(t, ω) = j.
5. Repeat steps 2-4 for the number of jumps of the process needed, or until the time t becomes greater than the observation period T.

EXAMPLE. Consider the semi-Markov system whose state diagram is given in Figure 4. The state space of this system is E = {0, 1, 2}, with U = {0, 1} and F = {2}. The different laws governing the sojourn time of the process in the different states are indicated in the figure. Thus, the sojourn law of the system in state 0 is exponential with parameter λ_1 = 0.001. When the system is in state 1 and jumps towards state 0, the sojourn time has a Weibull distribution with parameters α_1 = 2.0, β_1 = 10.0, while when the system goes towards state 2, the sojourn time has a Weibull distribution with parameters α_2 = 0.7, β_2 = 2.0. When the system is in state 2, only one transition is possible, the one going to state 0. In this case, the sojourn time in state 2 has an exponential law with parameter λ_2 = 0.01.
We have simulated the system described previously in order to estimate its availability A(t) = P(X(t) ∈ U), as well as its reliability R(t) = P(T_F > t). The results of the simulation obtained for 100 iterations of the algorithm are given in Figure 5, where the results obtained by the previous two simulation methods are compared to the results obtained using an analytical method. For the embedded Markov chain the transition probabilities were taken to be p_10 = 0.1048 and p_12 = 0.8952. Note, however, that the time step of the algorithm was 0.2 time units.

Hitting times:
Mean time to failure: MTTF = 1119.7,
Mean time to repair: MTTR = 100,
Mean up time: MUT = 1117.1,
Mean down time: MDT = 100.

References

Asmussen, S. (1987). Applied Probability and Queues. Wiley, New York.


Bratley, P., B. L. Fox and L. E. Schrage (1987). A Guide to Simulation. Springer, Berlin.

Çinlar, E. (1969). Markov renewal theory. Adv. Appl. Probab. 1, 123-187.
Crane, M. A. and D. L. Iglehart (1975). Simulating stable stochastic systems III, regenerative processes and discrete event simulation. Oper. Res. 23, 33-45.
Fishman, G. S. (1996). Monte Carlo. Concepts, Algorithms and Applications. Springer Series in Operations Research, Springer, New York.
Fox, B. L. and P. W. Glynn (1986). Discrete time conversion for simulating semi-Markov processes. Oper. Res. Lett. 5, 191-196.
Glynn, P. W. and D. L. Iglehart (1989). Importance sampling for stochastic simulations. Manage. Sci. 35, 1367-1392.
Goyal, A., P. Heidelberger and P. Shahabuddin (1987). Measure specific dynamic importance sam-
pling for availability simulations. In 1987 Winter Simulation Conference Proceedings, pp. 351-357.
IEEE Press.
Goyal, A., P. Shahabuddin, P. Heidelberger, V. F. Nicola and P. W. Glynn (1992). A unified
framework for simulating Markovian models of highly reliable systems. IEEE Trans. Comput.
41(1), 36-51.
Goyal, A., S. S. Lavenberg and K. S. Trivedi (1987). Probabilistic modeling of computer system
availability. Ann. Oper. Res. 8, 285-306.
Hammersley, J. M. and D. C. Handscomb (1964). Monte Carlo Methods. London, Methuen.
Heidelberger, P. (1995). Fast simulation of rare events in queueing and reliability models. ACM Trans. Modeling Comput. Simul. 5(1), 43-85.
Ionescu, D. C. and N. Limnios, (Eds.) (1999). Statistical and Probabilistic Models in Reliability.
Birkhäuser, Boston.
Janssen, J. and N. Limnios, (Ed.) (1999). Semi-Markov Models and Applications. Kluwer Academic
Publishers, Dordrecht, The Netherlands.
Kemeny, J. G. and J. L. Snell (1976). Finite Markov Chains. Springer, Berlin.
Korolyuk, V. S. and A. F. Turbin (1982). Markov Renewal Processes in Problems of Systems
Reliability. Naukova Dumka, Kiev (in Russian).
Kuruganti, I. and S. G. Strickland (1995). Optimal importance sampling for Markovian systems. In
Proceedings of the 1995 IEEE Systems, Man and Cybernetics Conference.
Kuruganti, I. and S. G. Strickland (1997). Optimal importance sampling for Markovian systems with
applications to tandem queues. Math. Comput. Simul. 44(1), 61-80.
Limnios, N. (1996). Dependability analysis of semi-Markov systems. Reliab. Eng. and Syst. Safety.
Limnios, N. and G. Oprisan (1997a). A general framework for reliability and performability analysis
of semi-Markov systems. In Eighth International Conference on ASMDA. Anacapri (Napoli). Italy,
June 1997.
Limnios, N. and G. Oprisan (1997b). A general framework for reliability and performability analysis
of semi-Markov systems. Appl. Stochast. Models Data AnaL (to appear).
Limnios, N. and G. Oprisan (1997c). Semi-Markov process to regard of their application. World Energy Syst. J. 1(1), 64-75.
Limnios, N. and G. Oprisan (1999a). Invariance principle for an additive functional of a semi-Markov process. Rev. Roumaine Math. Pures Appl. 44(1), 75-83.
Limnios, N. and G. Oprisan (1999b). Semi-Markov Processes and Reliability. Birkhäuser (to appear).
Nakayama, M. K. (1994). A characterization of the simple failure biasing method for simulations of highly reliable Markovian systems. ACM Trans. Modeling Comput. Simul. 4(1), 52-88.
Nakayama, M. K. (1995). Asymptotics for likelihood ratio derivative estimators in simulations of highly reliable Markovian systems. Manag. Sci. 41, 524-554.
Nakayama, M. K. (1996). General conditions for bounded relative error in simulations of highly
reliable Markovian systems. Adv. Appl. Prob. 28.
Neuts, M. F. (1981). Matrix-Geometric Solutions in Stochastic Models. The Johns Hopkins University
Press, Baltimore, MD.
Oprisan, G. (1999). On the failure rate. In Statistical and Probabilistic Models in Reliability (Eds.
Ionescu and Limnios).

Ouhbi, B. and N. Limnios (1996). Non-parametric estimation for semi-Markov kernels with
application to reliability analysis. Appl. Stoch. Models Data Anal. 12, 209-220.
Ouhbi, B. and N. Limnios (1997). Estimation of kernels, Availability and Reliability functions of
semi-Markov Systems, In Statistical and Probabilistic Models in Reliability (Eds. Ionescu and
Limnios).
Platis, A., N. Limnios and M. Le Du (1998). Hitting time in a finite non-homogeneous Markov chain
with applications. Applied Stoch. Models Data Anal. 14, 241-253.
Pyke, R. (1961a). Markov renewal processes: definitions and preliminary properties. Ann. Math. Stat.
32, 1231-1242.
Pyke, R. (1961b). Markov renewal processes with finitely many states. Ann. Math. Stat. 32, 1243-1259.
Pyke, R. and R. Schaufele (1964). Limit theorems for Markov renewal processes. Ann. Math. Stat. 35,
1746-1764.
Ripley, B. D. (1987). Stochastic Simulation. Wiley, New York.
Ross, S. M. (1990). A Course in Simulation. Maxwell Macmillan International Editions.
Shahabuddin, P. (1994). Importance sampling for the simulation of highly reliable Markovian systems.
Manag. Sci. 40, 333-352.
Shahabuddin, P. and M. K. Nakayama (1993). Estimation of reliability and its derivatives for large
time horizons in Markovian systems. In 1993 Winter Simulation Conference Proceedings,
pp. 422-429. IEEE Press.
Strickland, S. G. (1993). Optimal importance sampling for quick simulation of highly reliable
Markovian systems. In 1993 Winter Simulation Conference Proceedings, pp. 437-444. IEEE Press.
Taga, Y. (1963). On the limiting distributions in Markov renewal processes with finitely many states.
Ann. Inst. Stat. Math. 15, 1-10.
N. Balakrishnan and C. R. Rao, eds., Handbook of Statistics, Vol. 20
© 2001 Elsevier Science B.V. All rights reserved.

The Weibull Nonhomogeneous Poisson Process

Asit P. Basu and Steven E. Rigdon

The Weibull hazard function is often parametrized as

   h(x) = (β/θ)(x/θ)^(β−1) ,  x > 0  (1)

or

   h(x) = λβ x^(β−1) ,  x > 0 .  (2)

The nonhomogeneous Poisson process that has an intensity function of form (1) or (2) is often called the Weibull process, or more commonly, the power law process. Such a process is often used to model the occurrence of events in time, and in particular, to model the failure times of repairable systems. We therefore begin with a discussion of the homogeneous and nonhomogeneous Poisson processes.

1. The Poisson processes

Let N(t) denote the number of events that occur at or before time t. Such a random variable is called a counting process. When the argument to N is an interval, such as (a, b], then N(a, b] is defined to be the number of events that occur in that interval. Thus, N(t) = N(0, t]. A counting process N(t) is said to be a Poisson process if:
1. N(0) = 0.
2. For any a < b ≤ c < d the random variables N(a, b] and N(c, d] are independent. This property is called the independent increments property.
3. There is a function λ, called the intensity function, such that

   λ(t) = lim_{Δt→0} P(N(t, t + Δt] = 1)/Δt .

4.
   lim_{Δt→0} P(N(t, t + Δt] ≥ 2)/Δt = 0 .

The last property precludes the possibility of simultaneous failures.


These four properties, as minimal as they seem, are enough to establish the property that the number of failures in the interval (a, b] has a Poisson distribution with mean equal to

   E(N(a, b]) = ∫_a^b λ(t) dt .

The proof involves recursively solving a system of differential equations; see Rigdon and Basu (2000) for the derivation. The function

   Λ(t) = ∫_0^t λ(x) dx ,

which gives the expected number of events through time t, is called the mean function for the process. Clearly, Λ'(t) = λ(t).
The nonhomogeneous Poisson process having an intensity function of the form

   λ(t) = (β/θ)(t/θ)^(β−1) ,  t > 0  (3)

or

   λ(t) = λβ t^(β−1) ,  t > 0

is called the power law process or the Weibull nonhomogeneous Poisson process. This model has gone by many other names as well, including and most notably, the Weibull process. When β < 1 the intensity is a decreasing function of t. In this case, failures will become less frequent as the system ages; this is reliability improvement. When β > 1 the intensity is an increasing function of t, and in this case failures will become more frequent as the system ages. This is called deterioration. When β = 1, the intensity function is a constant. Thus, the homogeneous Poisson process is a special case of the power law process. The power law process can therefore be used to model systems that improve, deteriorate, or remain steady over time, but it cannot be used to model systems that improve for some intervals of t and deteriorate for other intervals.
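Although the chapter does not discuss simulation here, a short Python sketch may help fix ideas: since the time-transformed sequence Λ(T_1), Λ(T_2), ... of a nonhomogeneous Poisson process forms a homogeneous Poisson process of rate 1, the successive failure times of a power law process can be generated by inverting the mean function Λ(t) = (t/θ)^β. The parameter values below are arbitrary illustrations.

```python
import numpy as np

def power_law_failure_times(beta, theta, t_max, rng=np.random.default_rng()):
    """Failure times on (0, t_max] of the power law process with intensity
    lambda(t) = (beta / theta) * (t / theta) ** (beta - 1)."""
    times, total = [], 0.0
    while True:
        total += rng.exponential(1.0)           # next event of the rate-1 process
        t = theta * total ** (1.0 / beta)       # invert Lambda(t) = (t / theta) ** beta
        if t > t_max:
            return np.array(times)
        times.append(t)

# A deteriorating system (beta > 1): failures become more frequent with age.
print(power_law_failure_times(beta=2.0, theta=50.0, t_max=200.0))
```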
Some repairable systems have an intensity function that has the bathtub shape
as shown in Figure 1. For small values of t, that is, when the system is young, the
rate of occurrence of failures (ROCOF) is high and failures are frequent. After the
bugs are removed, or after some of the weakest components fail, the R O C O F will
be smaller, and it will remain at this level throughout its useful life. Then as the
system ages, the R O C O F begins to increase. At this stage, the system is deteri-
orating.
The two functions in Figure 1 look nearly identical, but there is an important
difference in their interpretations. The bathtub intensity function indicates that
the system will initially experience reliability growth. A few early failures will be
followed by the useful life when failures occur at roughly a constant rate.

[Figure: left panel, the bathtub intensity function λ(t) with phases "early failures", "constant", "deterioration"; right panel, the bathtub hazard function h(x) with phases "burn-in", "useful life", "wearout".]

Fig. 1. Bathtub intensity and bathtub hazard functions.

Eventually, as the system ages, the failures become more frequent. On the other
hand, the bathtub hazard function indicates that there is a high chance that the
system will fail (for the first and only time) early in its life. A few of the systems
have serious defects that will cause early failures. Eventually, a working system
will begin to wear out and the failure will ensue. The hazard function is the limit
of a conditional probability. For a system that is wearing out, the probability of failure in (x_0, x_0 + Δx] conditioned on survival past time x_0 will be smaller than the probability of failure in (x_1, x_1 + Δx] conditioned on survival past time x_1, provided x_0 < x_1. There are therefore two bathtub curves: a bathtub intensity func-
tion for repairable systems and a bathtub hazard function for non-repairable
systems. The bathtub hazard function expresses conditional probabilities of the
one and only failure of the system. The bathtub intensity function indicates that
the system will experience many failures early in its life, which will be followed by
a time when the R O C O F is constant; finally, as the system ages, the failures will
become more frequent.
Although the power law process cannot model the bathtub-shaped intensity
function, the bathtub curve concept helps to illustrate the difference between the
interpretations of the intensity and hazard functions. In particular, it helps to
illustrate the difference between the power law process (i.e., the Weibull non-
homogeneous Poisson process) and the Weibull distribution. The power law
process is the nonhomogeneous Poisson process with intensity function
2(t) = (~/O)(t/O) p 1 and the Weibull distribution is that distribution having a
hazard function of the form h(x) = (~/O)(x/O) ~ 1.

2. Models for the reliability of repairable systems

The power law process can be used to model the occurrence of events in time. The
most common application has been to model the failures of a repairable system.
We assume that a failed unit is immediately repaired, or that the repair time is not
counted in the operating time. If we also assume that a failed unit is brought back
to exactly the same condition as it was just before the failure, then it is clear that
the nonhomogeneous Poisson process is the appropriate model for the failure
behangsel met pauwenpatroon; Peafowl = pauw; Peahen =
pauwin.

Pea-jacket, pîdžakət, pijjekker.

Peak, pîk, subst. piek, spits, punt, klep (van pet of hoed); Peak
verb. kwijnen, er ziekelijk uitzien, (eene ra) optoppen: A peaked
cap = met klep; Peaked beard = puntbaard; A peaked look
= kwijnend; Peaked-up persons = stijve, opgeprikte personen;
Peaking = ziekelijk; geniepig; Peakish = ziekelijk; Peaky = spits,
ziekelijk uitziend.

Peal, pîl, subst. knal, slag, salvo, geratel, donderslag, klokgelui,


aantal klokken; Peal verb. luide weerklinken of weergalmen: The
peals were rung = er werd op de klokken gespeeld, klokkenspel
weerklonk; Peals of laughter, of thunder = schaterend gelach,
ratelende donderslagen; The bells pealed forth their merriest
sounds = de klokken deden hare vroolijkste klanken hooren.

Pear, pêə, peer; Pear-tree.

Pearce, pîəs.

Pearl, pɐ̂ l, parel, paarlemoer; pil; staar, (ook Pearl-eye),


diamantletter; ook adj.; Pearl verb. met parelen bezetten,
beperelen, parelen: Mother of pearl = paarlemoer; That’s casting
pearls to (before) the swine = dat is paarlen voor de zwijnen
gooien; The girl strung her beads and pearls = reeg hare kralen
en paarlen; Pearl-ash = parelasch; Pearl-barley = geparelde
gerst; Pearl-disease = parelziekte; Pearl-diver = parelvisscher;
Pearl-eye(d) = (met) een vlek (staar) op het oog; Pearl-fishery
= parelvisscherij, de plaats hiervoor; Pearl-oyster = pareloester;
Pearl-sago; Pearl-shell = parelschelp; Pearl-studded = met
paarlen bezet; Pearl-white = parelwit; Pearly = parelachtig, vol
paarlen.

Pears, pîəz; Pearson, pîəs’n.

Peasant, pez’nt, subst. en adj. boer(sch); Peasantry =


boerenstand, landvolk.

Pease, pîz, erwten.


Peat, pît, turf; ook = A lump of peat = een turf; Peat-bog = laag
veen; Peat-cutter = turfsteker; Peat-drag = bagger; Peat-moor
= hoog veen; Peat-moss = veenmos; Peat-stacks; Peatery =
veenderij; Peaty = uit turf bestaande, turf bevattende.

Pebble, peb’l, kiezelsteen (= Pebble-stone), agaat, bergkristal (=


Pebble-crystal); Pebbled, Pebbly = vol kiezelsteenen.

Peccability, pekəbiliti, zondigheid; Peccable, pekəb’l, zondig;


Peccadillo, pekədilou, kleine zonde, pekelzonde; Peccancy,
pek’nsi, zondige staat; Peccant = zondig, bedorven, slecht;
Peccavi, pekeivai, “ik heb gezondigd”, schuldbelijding: To cry
peccavi = schuld bekennen, om vergiffenis vragen.

Peck, pek, subst. maat van ± 9 L.; pik, hap, kus, voer, kost; groote
hoeveelheid; Peck verb. pikken, vitten, eten: There are pecks and
pecks of = hoopen; It gave me a peck of trouble = een heele
portie last; We had to keep them peck and perch, all the year
round = we moesten hen onderhouden; She pecked at her chop =
knabbelde aan, at met kleine beetjes van; She pecked up all the
crumbs = raapte (pikte) op; Peck-alley = keel(gat); Pecker =
specht; snavel: To keep up one’s pecker = den moed (of eetlust)
niet verliezen; To go to Peckham = gaan eten; All holiday at
Peckham = Schraalhans is keukenmeester; Peckish = hongerig.

Pectin, pektin, pectine.

Pectinate, pektənit, kamvormig.

Pectoral, pektər’l, borst - -; subst. borstmiddel, borstlap,


borstschild of kruis van priesters, borstspier; borstvin = Pectoral
fin.
Peculate, pekjuleit, (geld) verduisteren; Peculation: Peculation
was rife, and abuses were rampant = verduistering was algemeen;
Peculator.

Peculiar, pəkjûliə, eigenaardig, bijzonder, origineel; ook subst.: This


wine has a peculiar flavour = een gansch bijzonderen smaak (en
geur); Peculiarity, pəkjûliariti, eigenaardigheid, bijzonderheid.

Pecuniary, pəkjûniəri, geldelijk, geld(s) …: In pecuniary


difficulties, troubles = in geldelijke moeilijkheden.

Ped, ped, pedaal; voetganger; groote mand.

Pedagogue, pedəgog, pedagoog, schoolvos; Pedagogical,


pedəgodžik(’l), pedagogisch; Pedagogics, pedəgodžiks,
pedagogiek.

Pedal, ped’l, pîd’l, voet - -; subst. pedaal; Pedal verb. het pedaal
gebruiken; peddelen, fietsen; Pedal-note = aangehouden toon;
Pedal-pipes (of an organ); Pedal(l)er = fietser.

Pedant, ped’nt, schoolvos; Pedantic(al), pədantik(’l), pedant;


Pedantry = muggenzifterij, pedanterie.

Peddle, ped’l, venten, met eene mars loopen; zich met beuzelarijen
ophouden: Peddler = marskramer (Z. Pedlar); Peddling =
beuzelachtig, onbeteekenend.

Pedestal, pedəst’l, voetstuk: Pedestal writing-table = “bureau


ministre”; To put on a pedestal (fig.).

Pedestrian, pədestriən, voet - -; gewoon, alledaagsch, prozaïsch;


subst. voetganger: Pedestrian journey, Pedestrian tour =
voetreis; Pedestrianism = het wandelen, wandelwedstrijd,
hardloopwedstrijd; alledaagschheid.
Pedicel, pedisel, bloemsteel; Pedicellate, pediselit, gesteeld.

Pedigree, pedigrî, stam- of geslachtsboom, afkomst: Pedigree


cattle = stamboekvee.

Pediment, pediment, pediment, gevelveld.

Pedipalp, pedipalp, schorpioenspin; taster (van een spin).

Pedireme, pedirîm, waterwants.

Pedlar, Pedler, pedlə, marskramer; Pedlary = marskramerij;


bedriegerij.

Pedlington, pedliŋt’n: This is the Little Pedlington view of the


matter = kleingeestige, bekrompen kijk op.

Pedometer, pədomətə, pedometer.

Peduncle, pədɐŋk’l, (bloem)steel; Peduncular, pədɐŋkjulə,


bloemsteel …; steel …; Pedunculate, pədɐŋkjulit, gesteeld.

Peek, pîk, verb. gluren: Peek-a-boo = Peep-bo: Peek-a-boo


blouse = blouse met transparant halsstuk; Peeker.

Peel, pîl, subst. schil; schieter (van bakkers), versterkte toren (op
de grens van Eng. en Schotl.); Peel verb. schillen, afschillen,
afschilferen, ontkleeden, plunderen: Candied peel = sukade;
Orange peel; To keep one’s eyes peeled = zijne oogen open
houden (fig.); Peelings = schillen.

Peel, pîl: Peeler = klabak.

Peep, pîp, subst. gepiep; het eerste dagen of verschijnen, gluren,


sluwe blik; Peep verb. piepen; gluren (naar = at), gloren,
aanbreken, sluw kijken: At the peep of dawn = bij het gloren van
den dag; To take a peep at = gluren naar; Peep-bo = kiekeboe!
Peep-hole = kijkgaatje; Peep-o’-day boys = Iersche (Protest.)
opstandelingen (1784); Peep-show = kijkkast; Peeper = pas
geboren kuiken; loervogel (fig.); oog: Painted peepers = blauwe
(geslagen) oogen; Single peeper = éénoog; Peeping Tom =
loervogel.

Peer, pîə, gluren, kijken, te voorschijn komen.

Peer, pîə, subst. pair, gelijke; Peerage = pairschap, de hooge adel;


adelboek; Peeress = adellijke dame, vrouw van een peer;
Peerless(ness) weergaloos(heid).

Peevish, pîviš, gemelijk, knorrig; subst. Peevishness.

Peewit, pîwit, kievit.

Peg, peg, subst. pen of pin, nagel, klemhoutje, schroef; tand, voet,
bepaalde slag; cognac met spuitwater; Peg verb. met pinnen
vastmaken of merken, afranselen (into), geducht werken of eten
(away), met pennen uitzetten (out), uitknijpen (out), hard loopen:
Pegs = broek van boven zeer wijd en van onderen erg nauw; He is
a square (round) peg in a round (square) hole = niet op zijne
plaats; To come down a peg or two = een toontje lager zingen;
I’ll take (pull) him down a peg or two = ik zal hem een toon of
wat lager doen zingen; I have pegged two for you = twee voor u
aangezet; He was pegging away at his translation = werkte hard
aan; peg-top = priktol; peg-tops = broek, wijd van boven en
nauw om de enkels.

Pegasus, pegəsɐs, Pegasus; Peggoty, pegəti; Pegram, pîgrəm;


Peile, pîl; Peirce, pîəs, pɐ̂ s; Pekin, pəkin; Pekin(g), pikin(g).

Pekoe, pekou, pîkou, Pecco-thee.


Pelagic, piladžik, zee …

Pelargonium, peləgounj’m, pelargonium.

Pelerine, pelərîn, pelərîn, pelerine.

Pelf, pelf, vuil; “duiten”: It was the pelf that got her a husband =
zij kreeg een man om haar “moppen”.

Pelican, pelik’n, pelikaan; tang, haak; distilleerkolf.

Pelion, pîliən: To pile Ossa on Pelion = zich bovenmenschelijk


inspannen.

Pelisse, pəlîs, soort mantel, pelsmantel.

Pellet, pelət, balletje, propje, kogel(tje), grove hagelkorrel; Pellet


verb. met balletjes (hagelkorrels) schieten op.

Pellicle, pelik’l, huidje, vliesje.

Pellitory, pelitəri, glaskruid.

Pell-mell, pelmel, in groote verwarring, blindelings; subst.


verwarring, handgemeen.

Pellucid, pəl(j)ûsid, helder; doorschijnend; subst. Pellucidity,


pel(j)usiditi = Pellucidness.

Peloponnesian, peləpənîsiən; subst. Peloponnesus.

Pelt, pelt, gooien, werpen, kletterend neerkomen; subst. werpen


enz.; vel, vacht, ruwe huid: To pelt along = voortjakkeren
(snellen); Pelt-monger = handelaar in huiden en vellen; Pelt-
wool = wol (van doode schapen); Pelter = hagel (fig.), plasregen;
geweer, pistool, klein oorlogschip; Pelterer = handelaar in vellen;
Peltry = pelterij, huiden.

Pelvic, pelvik, bekken …; Pelvis, pelvis, bekken.

Pembroke, pembruk: Pembroke table = kleptafel.

Pem(m)ican, pemik’n, gedroogd gemalen en met vet tot koeken


geperst vleesch: This booklet is historical pem(m)ican = droog
maar degelijk.

Pen, pen, subst. schaapskooi, hok, perk; pen, veder, stijl; Pen verb.
opsluiten, stuwen; schrijven, neerpennen: Slip of the pen =
schrijffout, vergissing; The last stories from her pen = van hare
hand; These pens have hard, soft nibs = dit zijn pennen met
harde, zachte punten; I want this pen mended = ik wou deze
pen vermaakt hebben; To put the pen through = de pen halen
door; To set pen to paper: Pen-case = pennenkoker; Pen-driver
= pennelikker; Pen-fish = pijl-inktvisch; Penholder =
pennehouder: Penknife = pennemes; Pen-man = schoonschrijver;
Penmanship = schrijfkunst, manier van schrijven; Pen-name =
pseudoniem; Penwiper = pennenwisscher.

Penal, pîn’l, straf …, strafbaar: Penal code = wetboek van


strafrecht; Penal colony = strafkolonie; Penal laws = strafwetten;
Penal servitude = dwangarbeid; Penal settlement =
strafkolonie; Penalty, pen’lti, wettige straf of boete; extra gewicht
op een renpaard; bij een handicap; Penal kick = strafschop;
Penalize = strafbaar stellen: Abstention of voting ought to be
penalized = onthouding van stemming moest strafbaar gesteld
worden; Penance, pen’ns, boetedoening, zoenstraf, boetekleed;
ook verb.: To do penance.
Penang, pinaŋ: Penang-lawyer (Zie Lawyer); Penang-nut =
betelnoot.

Penarth, pînəth.

Penates, pəneitîz, Penaten of huisgoden.

Pence, pens, stuivers (als verzamelwoord): St. Peter’s Pence =


Pieterspenning; Bad sixpence always turns up = onkruid vergaat
niet.

Pencil, pensil, subst. penseel, potlood, bundel; Pencil verb.


schilderen, teekenen, met een potlood opschrijven: A pencil of
rays = stralenbundel; Pencil-case = potloodhouder; Pencil-
colo(u)rs = crayon; Pencil-compass = Pencil-pointer =
potloodscherper of -punter.

Pendant, pend’nt, oorhanger, hanger, wimpel, pendant, luster.

Pendency, pend’nsi, het hangen, onzekerheid, aanhangig zijn;


Pendent = hangend; vrij zwevend; Pendent bridge.

Pendennis, pendenis.

Pending, pendiŋ, adj. hangende, onbeslist, prep. gedurende:


Pending affairs = loopende zaken; He smiled, with life and
death pending = zwevende tusschen dood en leven; Pending the
inquiry = gedurende het onderzoek.

Pendragon, pendrag’n, opperste veldheer bij de Oude Britten;


Pendragonship.

Pendulous, pendjulɐs, hangend, schommelend, trillend: He had a


pendulous excrescence on his nose = een slingerend uitwas;
subst. Pendulousness; Pendulum, pendjulɐm, slinger;
Pendulum-clock.

Penelope, pəneləpî.

Penetrability, penətrəbiliti, subst. v. Penetrable, penətrəb’l,


doordringbaar, ontvankelijk; subst. Penetrableness; Penetralia,
penətreiljə, binnenste van huis of tempel; geheimen; Penetrant =
doordringend; Penetrate, penətreit, doordringen, doorgronden;
Penetrating = doordringend, fijn onderscheidend, scherp;
Penetration, penətreiš’n, het doordringen of doorgronden,
doordringingsvermogen, scherpte, scherpzinnigheid; Penetrative,
penətrətiv, doordringend: Penetrative effect (van een schot);
subst. Penetrativeness.

Penguin, pengwin, Penguin, vetgans.

Peninsula, pəninsiulə, schiereiland; adj. Peninsular: The


Peninsular War = de oorlog in Spanje en Portugal tegen Napoleon
I (1808–1814).

Penitence, penitens, berouw, boete; Penitent, boetvaardig, boete


doende; subst. boetvaardige, boeteling; Penitential, penitenšl,
subst. boeteboek (ten behoeve van biechtvaders); adj.
berouwhebbend, boetvaardig: Penitential Psalm = boetpsalm;
Penitentiary, penitenšəri, boete …, boetvaardig; subst.
verbeteringsgesticht; tuchthuis; penitentiaris (hoofd van het
pauselijk college dat over dispensaties en vrijspraak beslist).

Pennant, pen’nt, vlaggetje, wimpel.

Pennate(d), penit(id), gevleugeld = Pennigerous.

Penniless, peniles, arm; subst. Pennilessness.


Pennon, pen’n, wimpel, vaantje; wiek.

Pennsylvania, pensilveinjə; adj. en subst. Pennsylvanian.

Penny, peni, 1⁄12 van een shilling, stuiver, geld: A penny saved is
a penny gained = een stuiver gespaard is een stuiver gewonnen;
In for a penny, in for a pound = komt men over den hond, dan
komt men over den staart; A penny at a time = zachtjes aan,
geleidelijk; To make a penny = geld verdienen; To turn an
honest penny = een eerlijk stuk brood verdienen; Penny-a-liner
= loon-, broodschrijver; Penny-a-lining = broodschrijverij;
Twopenny people = proleten; Twopenny halfpenny godliness
(picnic) penny goedkoope; Penny-in-the-slot machine =
automaat; Penny-gaff = tjingeltangel; Penny royal = polei;
Penny-wedding = bruiloftsfeest, waar de gasten bijdragen tot de
onkosten; Penny-weight = ± 1.55 gram; Penny-wise =
krenterig: Their penny-wise precautions = prullerige
voorzorgsmaatregelen; To be penny-wise and pound-foolish =
de zuinigheid de wijsheid laten bedriegen; Pennyworth, penəth,
peniwɐ̂ th, voor de waarde van een stuiver, iets ge- of verkochts,
koop, kleinigheid: You have got there a poor pennyworth = ge
hebt u daar leelijk in den nek laten zien.

Pensile, pens(a)il, hangend; subst. Pensileness.

Pension, penš’n, subst. pensioen, jaargeld, geld betaald in plaats


van tienden; met Fr. uitspr. pension; Pension verb. pensionneeren;
in pension zijn: Old Age Pensions Act = wet op
ouderdomspensioen; Not for a pension = voor geen geld ter
wereld; To return home on a pension; He was pensioned off on
another post = kreeg pensioen en werd in eene andere betrekking
geplaatst; Pensionable = gerechtigd tot (rechtgevend op) het
verkrijgen van een pensioen; Pensionary, subst. gepensionneerde,
pensionaris; adj. pensioen(s)…; pensioen trekkend: Grand
pensionary; Pensioner = gepensionneerde, iemand die jaargeld
geniet: Chelsea pensioner = invalide uit het Chelsea Hospital;
Greenwich pensioner = invalide (matroos) uit het Hospital te
Greenwich; Gentleman pensioner = een lid van het corps der
Gentlemen-at-Arms, een oude paleis-eerewacht.

Pensive, pensiv, peinzend, somber, zwaarmoedig; subst.


Pensiveness.

Penstock, penstok, verlaat; hydrant.

Pent, pent, opgesloten; ook = Penthouse.

Pentachord, pentəköd, vijfsnarige lier.

Pentacle, pentək’l, soort tooverzegel tegen heksen.

Pentadactyl(ous), pentədaktil(əs), met vijf vingers of teenen.

Pentagon, pentəgon, vijfhoek, fort met vijf bastions; Pentagonal,


pentagən’l, vijfhoekig.

Pentagraph, pentəgraf, teekenaap.

Pentahedral, pentəhîdr’l, pentəhedr’l, met vijf gelijke zijden;


Pentahedron, pentəhîdr’n, pentəhedr’n, regelmatige vijfhoek.

Pentameter, pentamətə, vijfvoetige versregel.

Pentarchy, pentəki, vijfmanschap.

Pentastich, pentəstik, vijfregelig gedicht.

Pentateuch, pentətjûk, vijf boeken van Moses.


Pentecost, pentəkost, pinkster(feest); Pentecostal, pentəkost’l,
Pinkster …

Penthouse, penthaus, afdak; ook verb.

Penult, pînɐlt, pinɐlt, Penultimate, tweede lettergreep van achter.

Penumbra, pinɐmbrə, bijschaduw.

Penurious, pənjûriəs, vrekkig, gierig; schraal, karig, behoeftig;


subst. Penuriousness; Penury, penjuri, diepe armoede, volslagen
gebrek.

Penzance, penzâns.

Peon, pîən, Indisch soldaat (bode, politieagent); Mexicaansche slaaf


of arbeider; pion (schaaksp.); Peonage = Peonism = dienstbaarheid.

Peony, pîəni, pioen.

People, pîp’l, subst. natie, volk, geslacht, ras, menschen,


bedienden; People verb. (zich) bevolken: The people = de groote
hoop, gepeupel; The peoples of antiquity = volkeren; My people
= mijne familie; One’s own people = iemands bloedverwanten.

Pepin, pepin.

Pepper, pepə, subst. peper; Pepper verb. peperen; bombardeeren,


er op los schieten of ranselen (away): Ground pepper = gemalen
peper; Whole (Round) pepper; Pepper and salt = zwart en wit,
gemengd-kleurig; I’ll give you pepper = ik zal het je inpeperen;
Pepper-box = peperbus, driftkop; Pepper-caster = peperbus;
revolver; Pepper-cake = peperkoek; Pepper-corn = peperkorrel;
iets van zeer geringe waarde: Pepper-corn rent = nominale pacht,
waardoor een eigenlijke lease-hold feitelijk tot een freehold wordt;
Pepper-mill; Peppermint = pepermunt (Pepper-drop,
Pepper-lozenge); Pepper-wort = peperkruid; Pepperer =
driftkop; Peppery = gepeperd; scherp; driftig, heet gebakerd.

Pepsin(e), pepsin, pepsine.

Peptic, peptik, de spijsvertering bevorderend; middel ter


bevordering der spijsvertering: Peptics = spijsverteringsorganen;
Peptone, peptoun, pepton.

Pepys, peps, pips, pepis.

Per, pɐ̂ , per, door: Per advance = vooruit; Per advice = volgens
bericht; Per annum = per jaar; Per cent. = percent; As per
margin = volgens aanteekening op den rand.

Peradventure, peradventšə, misschien, toevallig; subst. twijfel,


onzekerheid.

Perambulate, pərambjuleit, door-, rondwandelen; Perambulation =


doorwandeling, rondgang, inspectie, schouw; Perambulator =
kinderwagen, melkwagentje.

Perceivable, Perceive, pəsîv, waarnemen, bemerken, inzien,


onderscheiden.

Percentage, pəsentidž, percentage.

Percept, pɐ̂ sept, waarneembaar iets, iets reëels: Snow is a


percept, the white of snow a concept = sneeuw is iets reëels,
de witte kleur ervan bestaat niet op zichzelf, maar wordt er aan
waargenomen; Perceptibility = waarneembaarheid; Perceptible,
pəseptib’l waarneembaar (voor = to); Perception, pəsepš’n,
waarneming, gewaarwording, begrip: Range of perception =
gezichtskring (fig.); Perceptive = vatbaar voor waarneming of
gewaarwording: Perceptive faculty = Perceptiveness =
Perceptivity = waarnemings-, gewaarwordingsvermogen.

Perch, pɐ̂ tš, subst. baars; stok, prik, hooge bok of zitplaats; maat
van 5,029 M.; ook 1⁄160 acre; Perch verb. als een vogel zitten of
gaan zitten, hoog zitten, zetten op: I am off to perch = ik ga op
stok (naar kooi); The bird was perched there; I was perched on
the roof = zat; Percher = vogel, die op takken pleegt te zitten;
groote altaarkaars; Perching-stick = prik.

Perchance, pətšans, misschien, bijgeval.

Percipience, pəsipj’ns, Percipiency, gewaarwording, waarneming;


Percipient, waarnemend, bespeurend; ook subst.

Percolate, pɐ̂ kəleit, doorzijgen, filtreeren, zuiveren; subst.


Percolation; Percolator = filter; filtreerkoffiekan.

Percuss, pəkɐs, stooten; percuteeren; Percussion, pəkɐš’n, schok,


slag; percussie (Med.): Percussion-cap = slaghoedje;
Percussion-lock = slot met slaghoedje; Percussive = schokkend,
slaand, slag- -, schok- -.

Percy, pɐ̂ si; Perdita, pɐ̂ ditə.

Perdie, pɐ̂ dî = par Dieu.

Perdition, pədiš’n, vernietiging, ondergang; verdoemenis: Go to


perdition = loop naar den duivel.

Peregrinate, perəgrineit, zwerven, rondtrekken; Peregrination,


perəgrineiš’n, rondzwerving, verblijf buitenslands; Peregrinator.

Peregrine, perəgrin.
Peregrine(-falcon), perəgrin(fôk’n), edelvalk.

Peremptoriness, perəm(p)tərinəs, subst. v. Peremptory,


per’m(p)təri, volstrekt, beslissend, afdoende, meesterachtig,
vasthoudend: His commands are peremptory = dulden geene
tegenspraak; To take a peremptory pipe = nog een laatste pijpje
stoppen.

Perennial, pərenj’l, subst. vaste, overblijvende plant; adj. een vol


jaar durend; overblijvend, onafgebroken.

Perfect, pɐ̂ fəkt, volmaakt, zuiver, zonder gebreken, rolvast; subst.


Perfectum; Perfect verb. pɐ̂ fəkt, pəfekt, volmaken, voleindigen,
volledig onderrichten: To be perfect in = goed kennen = To have
a thing perfect; Practice makes perfect = al doende leert men;
Perfecter; Perfectibility = volmaakbaarheid; Perfectible,
pəfektib’l, volmaakbaar; Perfection, pəfekš’n, volmaaktheid,
uitstekendheid: That approaches perfection = komt de
volmaaktheid nabij; To bring to perfection; He performed to
perfection = hij speelde uitstekend; Perfectionist = die op
zedelijke volmaaktheid boogt, of zedelijke volmaaktheid bereikbaar
acht; lid eener Amer. sekte; Perfectness = volmaaktheid,
volkomenheid.

Perfervid, pəfɐ̂ vid, gloedvol, vurig; subst. Perfervidness.

Perfidious, pəfidjəs, verraderlijk, trouweloos, bedriegelijk; subst.


Perfidiousness; Perfidy, pɐ̂ fidi, trouweloosheid, trouwbreuk.

Perforate, pɐ̂ fərit, adj. doorboord, geperforeerd; Perforate verb.


(pɐ̂ fəreit) doorboren; Perforation = doorboring, gaatje; Perforator =
perforeermachine, schedelboor.

Perforce, pəfös, met geweld, gedwongen.


Perform, pəföm, volvoeren, uitvoeren, volbrengen, vervullen,
nakomen, spelen, opvoeren; Performable = uitvoerbaar;
Performance = uitvoering, vervulling, opvoering, verrichting, daad:
No performance to-day = heden geene uitvoering (voorstelling);
Morning-performance = morgenvoorstelling; Promises without
performances = onvervulde beloften; Performer = uitvoerder,
volbrenger, acteur, zanger, speler, gymnast, etc.: A good promiser
but a bad performer.

Perfume, pɐ̂ fjûm, pəfjûm, subst. geur, reukwerk; Perfume verb.


geuren, met geuren doortrekken: Perfume-fountain = spuitje;
Perfumer, pəfjûmə, parfumeur; Perfumery, pəfjûməri,
reukwerk(en), parfumerie.

Perfunctoriness, pəfɐŋktərinəs, subst. v. Perfunctory, pəfɐŋktəri,


zorgeloos, oppervlakkig, nonchalant, slordig: He held perfunctory
receptions = maakte niet veel werk van het houden van recepties.

Perfuse, pəfjûz, besprenkelen, overgieten, vervullen; subst.


Perfusion, pəfjûž’n.

Perhaps, pəhaps, pəraps, misschien.

Peri, pîri, gevallen engel.

Perianth, perianth, bloembekleedselen.

Periapt, periapt, amulet, behoedmiddel.

Pericardiac, perikâdiək, Pericardial, Pericardian, Pericardic = tot


het hartzakje behoorende; Pericarditis, perikâdaitis, ontsteking
van het hartzakje; Pericardium = hartzakje.

Pericarp, perikâp, zaadhuisje.


Pericles, periklîz.

Pericranium, perikreinj’m, schedel; schedelhuid.

Peridot, peridot, chrysoliet (goudsteen).

Perigee, peridžî, perigeum.

Perihelion, perihîlj’n, perihelium.

Peril, peril, subst. gevaar, risico; Peril verb. in gevaar brengen,


wagen: You do it at your own peril = op eigen verantwoording en
risico; He perilled his happiness upon it = waagde zijn geluk er
aan; Perilous = gevaarlijk; subst. Perilousness.

Period, pîriəd, periode, tijdkring, omloopstijd, grens, slot, zin, punt:


Girl of the period = modern meisje; Poets of the period =
dichters van onzen tijd; That put a period to our labours = maakte
een einde aan = brought them to a period; Periodic, pîriodik,
periodiek, kring …; Periodical, subst. tijdschrift; adj. periodiek;
Periodicity = het periodiek zijn of terugkeeren.

Perioeci, periîsai, omwoners = Perioecians.

Peripatetic, peripətetik, rondwandelend, peripatetisch; subst.


wandelaar; volgeling van Aristoteles; Peripateticism =
wijsbegeerte van A.

Periphery, pərifəri, omtrek, oppervlakte.

Periphrase, perifreiz, omschrijving; Periphrase verb. omschrijven;


Periphrasis, pərifrəsis, omschrijving; Periphrastic(al) conjugation
= omschrijvende vervoeging (gramm.).

Periscii, pərišiai, bewoners der Poolstreken.


Periscope, periskoup, soort objectief; reflector in een onderzeesch
vaartuig.

Perish, periš, omkomen, sterven, vervallen, vergaan, doen


omkomen; Perishable = vergankelijk, aan bederf onderhevig; ook
subst. Perishableness = vergankelijkheid, enz.; Perisher = drank,
proleet.

Peristaltic, peristaltik, gekronkeld.

Peristyle, peristail, peristyl.

Peritoneum, peritənîəm, buikvlies; Peritonitis, peritənaitis,


buikvliesontsteking.

Periwig, periwig, pruik; ook verb.; Periwig-maker.

Periwinkle, periwiŋk’l, alikruik; maagdepalm.

Perjure, pɐ̂ džə, een valschen eed doen: He perjured himself


= was meineedig; You are perjured = gij hebt een meineed
gedaan; Perjurer = meineedige; Perjurious = meineedig; Perjury,
pɐ̂ džəri, meineed, meineedigheid, eedschennis: Subornation of
perjury = omkoopen (of overhalen) der getuigen om een valschen
eed te doen.

Perk, pɐ̂ k, mooi, net, verwaand, brutaal; Perk verb. den neus in
den wind steken, zich uitrekken, uitsteken, mooi maken: She
perked up her cap = maakte zich mooi om te behagen;
Perkiness = brutaliteit; Perky = Perk.

Perkin, pɐ̂ kin, lichte ciderwijn.

Perlustration, pɐ̂ lɐstreiš’n, bezichtiging, monstering.


Permanence, pɐ̂ mənens, duurzaamheid, bestendigheid;
Permanent, pɐ̂ mənent, duurzaam, bestendig: Permanent
colours = vaste kleuren; The permanent way was broken up =
viaduct (Amer.).

Permeability, pɐ̂ miəbiliti, subst. v. Permeable, pɐ̂ miəb’l,


doordringbaar; Permeate, pɐ̂ mieit, doordringen, indringen; subst.
Permeation.

Permissible, pəmisib’l, toelaatbaar; Permission, pəmiš’n,


vergunning, verlof: By (with) your permission = met uw verlof;
Permissive = toestaand, veroorloovend: Permissive Bill =
wetsontwerp, waarbij twee derden van de belastingschuldigen eener
gemeente het vergunningsrecht binnen haar grondgebied kunnen
weigeren.

Permit, pɐ̂ mit, verlof, vergunning, toegangsbewijs; geleibiljet,


consent (voor uitvoer) ter terugverkrijging v. d. accijns.

Permit, pəmit, veroorloven, toelaten: I may be permitted to say


= het zij mij vergund te zeggen; The medical man would not
permit of any noise in the sickroom = wilde niet hebben.

Permutable, pəmjûtəb’l, verwisselbaar; subst. Permutableness;


Permutation, pɐ̂ mjuteiš’n, verwisseling, omzetting, permutatie:
Permutation-lock = letterslot.

Pernambuco, pernambûkou.

Pernicious, pənišəs, schadelijk, verderfelijk; subst.


Perniciousness.

Pernickety, pənikəti, peuterig.


Perorate, perəreit, zijne redevoering eindigen; doordraven; subst.
Peroration.

Peroxide, pəroks(a)id, peroxyde; Peroxidize, pəroksidaiz, tot den


hoogsten graad oxideeren.

Perpendicular, pɐ̂ p’ndikjulə, loodrecht, steil, rechtop, staande;


subst. loodlijn, enz., staande boterham: To erect, To let fall a
perpendicular = eene loodlijn oprichten, neerlaten; Out of the
perpendicular = uit het lood; To preserve one’s perpendicular;
Perpendicularity, pɐ̂ p’ndikjulariti, loodrechte stand.

Perpetrate, pɐ̂ pitreit, bedrijven, uithalen: He perpetrated a bad


(huge) joke; subst. Perpetration; Perpetrator = bedrijver,
schuldige.

Perpetual, pəpetjuəl, eeuwigdurend, levenslang, vast: Perpetual


imprisonment; Perpetual motion = perpetuum mobile;
Perpetual screw = schroef zonder eind; Perpetuate, pəpetjueit,
laten voortbestaan, vereeuwigen; subst. Perpetuation; Perpetuity,
pɐ̂ pətjûiti, eeuwigheid, eeuwige duur, levenslang bezit, levenslange
rente: For (In, To) perpetuity = in eeuwigheid.

Perplex, pəpleks, verwarren, verlegen maken, bemoeielijken; adj.


Perplexed; subst. Perplexedness = Perplexity = verlegenheid,
verwarring, moeielijkheid.

Perquisite, pɐ̂ kwizit, fooi, emolument, bijkomende voordeelen: The


nurse takes her evenings and Sundays, which is her lawful
perquisite and due = waarop ze wettig recht heeft.

Perroquet, perəket = Parakeet.

Perry, peri, perewijn.


Persecute, pɐ̂ sikjût, vervolgen, lastig vallen; subst. Persecution:
Persecution of Christians; Persecutor, pɐ̂ sikjutə, Persecutrix.

Perseus, pɐ̂ siûz, Perseus.

Perseverance, pɐ̂ səvîrəns, subst. v. Persevere, pɐ̂ səvîə,


volharden.

Persia, pɐ̂ šə, Perzië; Persian, subst. en adj. Pers, Perzisch(e taal);
soort v. dunne zijde: Persian blinds = persiennes, soort
zonneblind; Persian cat = cypersche kat; Persian powder =
insectenpoeder; Persian wheel = rad met aangehechte emmers
om water te putten.

Persico, pɐ̂ sikou, persico.

Persimmon, pəsim’n, dadelpruim.

Persist, pəsist, krachtig volharden, volhouden: He persisted with


that subject = hield zich hardnekkig aan; Persistence,
Persistency = groote volharding, koppigheid; Persistent =
volhardend, hardnekkig, herhaald, niet afvallend (v. bladeren).

Person, pɐ̂ s’n, persoon, persoonlijkheid, iemand, vrouwspersoon,


meisje: Artificial person = zedelijk lichaam; For our persons =
wat ons aangaat; He came in person = in eigen persoon; There is
no respect of persons with God = God kent geen aanzien des
persoons: Without respect of persons = zonder aanzien des
persoons; Personable = flink, knap van uiterlijk; Personage,
pɐ̂ sənidž, gewichtig of voornaam persoon; persoon (theater):
Though a head-waiter, he is a personage = een heel heer;
Personal, persoonlijk; subst. roerend goed: Personal estate =
roerend goed; Personality = persoonlijkheid; Personization =
verpersoonlijking; Personalize = verpersoonlijken; Personalty:
The new duty will be charged on both realty and personalty =
zoowel van vast als van roerend goed; Personate, pɐ̂ səneit,
voorstellen, de rol spelen van, in eens anders plaats optreden, zich
uitgeven voor: There are no actresses, women being always
personated = vrouwenrollen worden altijd door jongens of mannen
vervuld; False personation = het optreden in eens anders plaats
met bedriegelijke oogmerken; Personator, pɐ̂ s’neitə, iemand die in
eens anders rol of plaats optreedt; Personification, subst. v.
Personify, pəsonifai, verpersoonlijken.

Perspective, pəspektiv, pɐ̂ spektiv, perspectivisch; subst. verschiet,


perspectief, perspectivische voorstelling; verrekijker.

Perspicacious, pɐ̂ spikeišəs, schrander, scherpziend; subst.


Perspicaciousness = Perspicacity, pɐ̂ spikasiti, helderheid,
duidelijkheid, doorzichtigheid; Perspicuity = duidelijkheid;
Perspicuous, pəspikjuəs, duidelijk; subst. Perspicuousness.

Perspirability, pəspairəbiliti, subst. v. Perspirable, pəspairəb’l,


wat uitgewasemd kan worden; Perspiration, pɐ̂ spireiš’n,
uitwaseming, zweet; Perspirative = uitwasemend; Perspiratory,
uitwaseming bevorderend, zweet - -; Perspire = uitwasemen,
zweeten.

Persuadable, pəsweidəb’l, (licht) te overreden; Persuade,


pəsweid, overreden, bepraten, brengen tot: I am persuaded that I
am lost = ik ben zeker; He could persuade himself of almost
everything = zich haast alles wijsmaken; Persuader = overreder,
zweepslag; Persuasibility, pəsweisibiliti, pəsweizibiliti,
overreedbaarheid; Persuasible, pəsweisib’l, pəsweizib’l,
overreedbaar; subst. Persuasibleness; Persuasion, pəsweiž’n,
overreding(skracht), geloof, vaste overtuiging; Persuasive,
pəsweisiv, subst. beweeggrond, beweegreden; adj. overredend;
subst. Persuasiveness.
Pert, pɐ̂ t, brutaal, vrijpostig; gezond, monter (Amer.); subst.
Pertness.

Pertain, pətein, behooren, betrekking hebben: “Social” is a word,


pertaining to society = in verband staande met.

Pertinacious, pɐ̂ tineišəs, hardnekkig, eigenzinnig, vastbesloten;


subst. Pertinacity, pɐ̂ tinasiti.

Pertinence, pɐ̂ tin’ns, gepastheid, geschiktheid, voegzaamheid;


Pertinent = voegzaam, passend bij: That is not pertinent to the
matter under discussion = heeft niets te maken.

Perturb, pətɐ̂ b, verstoren, verwarren, verontrusten; subst.


Perturbation; Perturbative = storend; Perturber.

Pertussis, pətɐsis, kinkhoest.

Peru, pərû, Peru.

Peruvian, pərûvj’n, subst. en adj. Peruviaan(sch): Peruvian bark


= kinabast.

Peruke, pərûk, pruik.

Perusal, pərûz’l, nauwkeurige kennisneming of lezing; Peruse,


pərûz, nauwkeurig doorlezen of nagaan; Peruser.

Pervade, pəveid, doordringen, zich uitstrekken tot: His all


pervading goodness = zijne zich tot alles uitstrekkende goedheid;
subst. Pervasion; adj. Pervasive.

Perverse, pəvɐ̂ s, verkeerd, slecht, verdorven, onhandelbaar,


afschuwelijk, pervers; spiegelbeeld; subst. Perverseness =
Perversity; Perversion, pəvɐ̂ š’n, verdraaiing, afwijking,
halsstarrigheid; spiegelbeeld; Perversive = verderfelijk.

Pervert, pɐ̂ vət, afvallige, goddelooze.

Pervert, pəvɐ̂ t, verdraaien, in ’t verderf storten, verleiden; slecht


worden, afvallig worden: It all came from love Perverted = uit
verdwaalde, in verkeerde richting geleide liefde; Perverter;
Pervertible = (licht) te verdraaien of te bederven.

Pervious, pɐ̂ vjəs, doordringbaar, toegankelijk: Pervious to the


eye = zichtbaar; subst. Perviousness.

Pesade, pəzeid, pəseid, steigeren.

Peshawar, Peschawur, pəšauə.

Pesky, peski, verduiveld; bovenmate (Am.).

Pessimism, pesimizm, pessimisme; Pessimist, subst. pessimist;


adj. pessimistisch = Pessimistic.

Pest, pest, plaag, onkruid; pest(ilentie): He’s a regular pest;


Pest-house = pesthuis, lazaret; Pester = kwellen, lastig vallen;
Pesterer = kwelgeest, plager; Pestiferous, pestifərɐs =
verpestend, besmettelijk, schadelijk, pest - -; Pestilence = pest;
besmetting; Pestilent = verpestend, schadelijk; lastig;
Pestilential, pestilenš’l = Pestiferous.

Pestle, pes’l, subst. stamper; Pestle verb. stampen.

Pet, pet, subst. lievelingsdier, lieveling, snoes; aanval van booze


luim of gemelijkheid; adj. lievelings - -, geliefd, vertroeteld; Pet
verb. vertroetelen; uit zijn humeur zijn: Of course Maggie is the pet
= lieveling; He invariably got in a pet at that juncture = werd altijd