
2009 International Joint Conference on Artificial Intelligence

Study on Fault Diagnosis of Rolling Bearing Based on K-L Transformation and Lagrange Support Vector Regression

Xu Yangwen
College of Information Engineering
JinHua College of Profession & Technology
Jinhua, China
[email protected]

Abstract—Based on the vibration signal of rolling bearings, a new fault diagnosis method built on the K-L transformation and Lagrange support vector regression is presented. Multidimensional correlated variables are transformed into low-dimensional independent eigenvectors by means of the K-L transformation, and pattern recognition and nonlinear regression are carried out by Lagrange support vector regression, which can recognize faults after being trained on example data. Theory and experiment show that fault diagnosis of rolling bearings based on the K-L transformation and Lagrange support vector regression recognizes fault patterns accurately and provides a new approach to intelligent fault diagnosis.

Keywords: K-L transformation; Lagrange support vector regression; rolling bearing; fault diagnosis

I. INTRODUCTION

Rotating machinery is developing toward large scale, high speed, light structure, automation and heavy load, and accurate diagnosis of its many complex faults is key to guaranteeing system performance. According to statistics, about 30% of rotating machinery faults are caused by rolling bearing faults. Bearing vibration signals carry a wealth of information: when the vibration signal transits from stationary to non-stationary, it chiefly reflects the impacts, vibration, structural changes, and changes in clearance or cracks caused by a bearing fault. Since the vibration signal differs under different operating conditions, the task of pattern recognition for rolling bearing faults is to extract, process and classify the vibration signals in order to deduce the operating state of the bearing [1-2]. Because of the complexity of the rolling bearing system and the diversity of fault types, no definite function exists between the vibration signals and the state information; instead there is a complex nonlinear mapping between the signal set and the state set, which makes rolling bearing fault pattern recognition difficult and complex.

The support vector machine, originated by Vapnik, is a new machine learning technique. Unlike traditional neural networks, which rest on the empirical risk minimization principle, it is based on the structural risk minimization principle. A large number of experiments have shown that, compared with traditional neural networks, the support vector machine has not only a simpler structure but also better performance, especially better generalization ability [3-6]. At the same time, it converts the optimization problem into a convex quadratic programming problem whose solution is the unique global optimum. Lagrange support vector regression was proposed by O. L. Mangasarian as an effective support vector algorithm for the case of solving linear programming problems. In this paper, the Lagrange method for solving linear complementarity problems is introduced into the nonlinear support vector regression machine, yielding an efficient iterative algorithm [7-8].

Addressing the existing problems of rolling bearing fault diagnosis (the difficulty of obtaining a large number of fault data samples, the difficulty of acquiring diagnosis knowledge, weak reasoning ability, difficult feature extraction, and so on), this paper presents a new fault diagnosis method based on the K-L transformation and Lagrange support vector regression. Multidimensional correlated variables are transformed into low-dimensional independent eigenvectors by the K-L transformation; pattern recognition and nonlinear regression are achieved by Lagrange support vector regression. Theory and experiment show that the method successfully identifies and classifies bearing failure modes with satisfactory results.

II. CALCULATING THE PRINCIPAL EIGENVALUES OF THE K-L TRANSFORM

Suppose the vibration signals of a rolling bearing can be expressed by the n-dimensional vector model X = {x_i}, i = 1, 2, ..., n, which corresponds to a point in the original space, and that there exists an orthogonal function set A = {A_j(i)}, i, j = 1, 2, ..., n, such that

Y = \sum_{j=1}^{n} x_j A_j = AX    (1)

The eigenvector after the transformation is Y = (y_1, y_2, ..., y_n)^T. The transpose of formula (1) is

978-0-7695-3615-6/09 $25.00 © 2009 IEEE 333


DOI 10.1109/JCAI.2009.60

Y^T = X^T A^T    (2)

Multiplying formula (1) by formula (2) and taking the mathematical expectation gives

E[Y Y^T] = A E[X X^T] A^T    (3)

C_Y = A C_X A^T    (4)

where C_X and C_Y are the covariance matrices of X and Y, respectively. Using the K-L linear transformation we can select an appropriate transformation matrix A that makes each component y_i independent and C_Y diagonal, that is,

C_Y = diag(\lambda_1, \lambda_2, ..., \lambda_n)    (5)

In this way the correlated eigenvector X is converted into the independent eigenvector Y, completing the analysis of the principal eigenvalues (\lambda_1, \lambda_2, ..., \lambda_n).

Because C_X is a real symmetric matrix, C_Y is a diagonal matrix composed of the n positive characteristic roots \lambda_i (i = 1, 2, ..., n) of C_X, with \lambda_1 > \lambda_2 > ... > \lambda_n. By the definition of covariance, \lambda_i equals the variance of the ith component of Y, and \sum_{i=1}^{n} \lambda_i represents the overall variance of the vibration signals.

In general, the principal features corresponding to the m largest eigenvalues are selected to form an m-dimensional feature space within the n-dimensional vector space, such that

\sum_{i=1}^{m} \lambda_i / \sum_{i=1}^{n} \lambda_i > 85%

The eigenvector composed of the m principal eigenvalues still retains enough information about the original signal, so the different states of the bearing can be studied through the principal eigenvalues \lambda_i to achieve fault pattern recognition.

III. LAGRANGE SUPPORT VECTOR REGRESSION MACHINE

The support vector regression machine method began with solving classification problems. The support vector regression problem is generally described as follows: given a training sample set (x_1, y_1), (x_2, y_2), ..., (x_m, y_m), where x_i \in R^n are input values and y_i \in R the corresponding target values, the aim is to find the regression function y = f(x) from these training samples. In the linear case, suppose f(x) = wx + b, where w is the normal of the optimal hyperplane and b is the threshold, which determines the distance from the optimal hyperplane to the origin. In the nonlinear case, support vector regression machines implicitly map the samples into a high-dimensional space by introducing a kernel function, so that linear regression analysis can be carried out in that space while avoiding the "curse of dimensionality."

Common kernel functions are:

(1) the ordinary polynomial kernel

K(x, y) = (x^T y + c)^p,  p \in N,  c >= 0    (6)

(2) the Gaussian radial basis kernel

K(x, y) = exp(-||x - y||^2 / (2\sigma^2))    (7)

(3) the sigmoid kernel

K(x, y) = tanh(\kappa (x^T y) + c),  \kappa > 0,  c < 0    (8)

In this paper we use the Lagrange support vector regression machine. In the linear case it is formulated as

min  (1/2)(||w||^2 + b^2) + (C/2)(\xi^T \xi + \hat\xi^T \hat\xi)    (9)

Aw + be - y <= \varepsilon e + \xi    (10)

y - Aw - be <= \varepsilon e + \hat\xi    (11)

Here \xi and \hat\xi are the allowable-error (slack) variables; C is the penalty or regularization parameter, which reflects the compromise between model complexity and training error; A is an m x n matrix whose ith row is x_i^T; e is the vector of appropriate dimension whose components are all 1; and \varepsilon is the free parameter of the insensitive loss function L(y, f(x)) = L(|y - f(x)|).

Solving the dual of model (9)-(11), we obtain

min  (1/2)(\hat\alpha - \alpha)^T (A A^T + e e^T)(\hat\alpha - \alpha) - y^T(\hat\alpha - \alpha) + \varepsilon e^T(\hat\alpha + \alpha) + (1/(2C))(\hat\alpha^T \hat\alpha + \alpha^T \alpha)    (12)

\hat\alpha, \alpha >= 0    (13)

Here \hat\alpha and \alpha are the Lagrange multipliers corresponding to the inequality constraints of model (9)-(11). From model (12)-(13) we can see that, as in the classification case, the model contains only simple nonnegativity bounds; there are no equality constraints or box constraints, so this simple problem is easy to solve. At the same time, the primal variables can be derived directly from the solution of the dual problem.

w = A^T(\hat\alpha - \alpha),  b = e^T(\hat\alpha - \alpha)    (14)

Solving problem (12)-(13) is therefore the key step: once it is solved, the optimal regression surface f(x) = wx + b can be expressed directly through formula (14).

Suppose the optimal solution of the problem is (\hat\alpha^*, \alpha^*); then w^* = A^T(\hat\alpha^* - \alpha^*), and the points corresponding to \hat\alpha^* != 0 or \alpha^* != 0 are the support vectors.

The above problem is a quadratic program bounded only by simple lower bounds, so it can be solved by the Lagrange method for linear complementarity problems, giving a simple and fast iterative algorithm.

\hat y = f(x) = x^T A^T(\hat\alpha - \alpha) + e^T(\hat\alpha - \alpha)    (15)

As mentioned earlier, the nonlinear regression function can be expressed by introducing a kernel function of the same form: for arbitrary x \in R^n,

\hat y = K([x^T  1], [A  e]^T)(\hat\alpha - \alpha)    (16)

Because the support vector regression machine is formulated for two-class problems, it must be extended to multi-class division in practical applications, since real problems are generally multi-class. For example, the classification of rolling bearings includes several states: normal, outer ring fault, inner ring fault and rolling element fault. Different combination rules yield different classification algorithms. In this paper we adopt a "one-to-many" scheme, in which a multi-fault classifier is composed of four Lagrange support vector regression machines (LSVRs).

IV. FAULT DIAGNOSIS OF ROLLING BEARING

To verify the reasonableness and feasibility of this method for rolling bearing fault diagnosis, we tested rolling bearings on a bearing vibration experiment platform. In the experiment we took 40 bearing samples: 10 normal bearings, 10 with outer ring faults, 10 with inner ring faults and 10 with rolling element faults; each bearing was sampled 10 times, with 1024 data points gathered each time.

Of these samples, 6 bearings of each class (normal, outer ring fault, inner ring fault and rolling element fault) were used to train the LSVRs, and the remaining bearings were used as test samples after training. The working speed of the bearings was 10000 r/min, the sampling frequency 30 kHz, the axial load 500 N and the radial load 100 N. Through experiment, when m = 12 it satisfies

\sum_{i=1}^{12} \lambda_i / \sum_{i=1}^{40} \lambda_i > 85%

So we choose 12 principal eigenvalues and use them to construct the eigenvector x = (\lambda_1, \lambda_2, ..., \lambda_{12}), which serves as the input of the four classification functions f_1(x), f_2(x), f_3(x), f_4(x) of the support vector regression machines. The classifier is composed of four Lagrange support vector regression machines: LSVR1 judges normal bearings, LSVR2 judges outer ring fault bearings, LSVR3 judges inner ring fault bearings, and LSVR4 judges rolling element fault bearings.

First the selected samples are used to train the machines. To train LSVR1, which determines whether a sample is normal, the training samples belonging to normal bearings are taken as one category and labeled 1, while the remaining samples are labeled -1; the optimal coefficients are then computed in LSVR order and the corresponding normal-bearing LSVR1 is established. The same approach is applied to train LSVR2, LSVR3 and LSVR4. The four classification functions f_1(x), f_2(x), f_3(x) and f_4(x) of the classifiers then judge the input samples.

If the output of LSVR1 is 1, the sample is a normal bearing; if the output of LSVR2 is 1, it has an outer ring fault; if the output of LSVR3 is 1, an inner ring fault; and if the output of LSVR4 is 1, a rolling element fault. If the output of LSVR1 is 1 and the output of any of LSVR2, LSVR3 or LSVR4 is also 1, the result is a misjudgment.

We fed the 40 samples into the classifiers; part of the calculated principal eigenvector values are shown in Table 1. The accuracy of the trained LSVR classifiers in diagnosing normal bearings, outer ring fault bearings, inner ring fault bearings and rolling element fault bearings exceeds 95%.

Compared with a BP neural network, the method of this paper needs fewer fault diagnosis samples and requires neither prior empirical knowledge of bearing fault classification nor data preprocessing.

V. CONCLUSIONS

In this paper, a new method of fault diagnosis based on the K-L transformation and Lagrange support vector
Table 1. Part of the principal characteristic vectors

Eigenvalue        1         2         3         4         5
\lambda_1    0.78899   0.62807   0.04546   0.21667   0.32123
\lambda_2    0.77762   0.62415   0.03899   0.21353   0.28399
\lambda_3    0.57662   0.58034   0.03840   0.19452   0.28027
\lambda_4    0.56486   0.55428   0.03233   0.19011   0.25998
\lambda_5    0.56368   0.54585   0.03135   0.17717   0.24812
\lambda_6    0.55869   0.51096   0.02949   0.17649   0.24313
\lambda_7    0.55369   0.50665   0.02312   0.16277   0.24244
\lambda_8    0.51517   0.48029   0.02243   0.15963   0.22059
\lambda_9    0.51508   0.47480   0.02047   0.13680   0.21814
\lambda_10   0.50596   0.46578   0.01978   0.12611   0.17776
\lambda_11   0.50439   0.46020   0.01831   0.12121   0.17345
\lambda_12   0.48979   0.45588   0.01782   0.11690   0.17266
Fault type    Normal    Normal   Outer ring fault   Inner ring fault   Rolling element fault

regression machine is presented. The K-L transformation method and the Lagrange support vector regression machine model are introduced, applying recent achievements of artificial intelligence to the fault diagnosis of rolling bearings. The reliability and accuracy of the method are proved by fault diagnosis examples, from which we draw the following conclusions:

(1) The principal characteristic vectors constructed after the signals are decomposed by the K-L transformation accurately reflect how the energy of the faulty bearing's vibration signal changes with the state information. These principal characteristic vectors are input to the fault classifier composed of four Lagrange support vector regression machines; the algorithm is simple and efficient.

(2) The reliability and accuracy of the method are proved by the fault diagnosis examples, and it can be applied to other fault diagnosis systems.

REFERENCES

[1] Lu Shuang, Zhang Zida, Li Meng. Fault Pattern Recognition of Rolling Bearing Based on Radial Basis Function Neural Networks. Chinese Engineering Science, 2004, 6(2): 56-60.
[2] He Zhengjia, Meng Qingfeng. Fault Diagnosis Principle and Application of Non-stationary Signals of Machinery and Equipment. Beijing: Higher Education Press, 2001.
[3] Oussar Y, Rivals I, Personnaz L. Training Wavelet Neural Networks for Nonlinear Dynamic Input-Output Modeling. Neurocomputing, 1998, 20: 173-188.
[4] Zhang Xuegong. Introduction to Statistical Learning Theory and Support Vector Machine. Journal of Automation, 2000, 26(1): 32-42.
[5] Duan Jiangtao, Li Lingjun, Zhou Shengsuo. Application of Support Vector Machine in Multi-fault Classification of Mechanical System. Journal of Agricultural Machinery, 2004, 35(4): 144-147.
[6] Hsu Chih-wei, Lin Chih-jen. A Comparison of Methods for Multiclass Support Vector Machines. IEEE Trans. on Neural Networks, 2002, 13(2): 415-425.
[7] Mangasarian O L, Musicant D R. Lagrangian Support Vector Machines. Journal of Machine Learning Research, 2001(1): 161-177.
[8] Lee Y J, Mangasarian O L. SSVM: A Smooth Support Vector Machine for Classification. Computational Optimization and Applications, 2000, 20(1): 5-22.

