
Department of Biomedical Equipment, Azerbaijan Technical University, Baku, Azerbaijan

**Corresponding Author:** Abdullaev NT, Associate Professor, Department of Biomedical Equipment, Azerbaijan Technical University, Baku, Azerbaijan

**Tel:** +994126383383

**E-mail:** [email protected]

**Received Date:** July 11, 2017; **Accepted Date:** July 19, 2017; **Published Date:** July 23, 2017

**Citation:** Abdullaev NT, Dyshin OA, Ibragimova ID (2017) Prediction of the Cardiovascular System on the Basis of an Assessment of Repeated Extreme Values Heartbeat Intervals and Times to Achieve them in the Light of Short-term and Long-term Relationships. J Biomed Eng Med Devic 2: 126. doi: 10.4172/2475-7586.1000126

**Copyright:** © 2017 Abdullaev NT, et al. This is an open-access article distributed
under the terms of the Creative Commons Attribution License, which permits
unrestricted use, distribution, and reproduction in any medium, provided the
original author and source are credited.


A procedure is proposed for predicting large heartbeat intervals, i.e., those whose record exceeds a certain high threshold Q (Q-intervals). The procedure combines a transformation of the original series of signals into a series of times to reach a predetermined change threshold with a method for predicting large return intervals whose record exceeds Q (Q-events), based on an estimate of the probability W_{Q}(t;Δt) that within the time Δt there will be at least one Q-event, given that t time units have elapsed since the last Q-event. This procedure speeds up the construction of the forecast and increases its reliability.

**Keywords:** Cardiovascular system; Forecasting; Heartbeat intervals; Records; Assessment

In recent years, the study of rare (extreme) events has attracted much attention [1-3]. Rare events, whose values greatly exceed the mean, are usually assumed to be independent, since the typical time between them is very large. In recent years, however, it has become increasingly clear that this assumption is not always satisfied. Rare events are usually quantified by the interval of time between the appearance of successive events above (or below) a certain threshold Q. One investigates the probability distribution of these return intervals as a function of the threshold, as well as their long-term dependence (autocorrelation, conditional return periods, etc.). In numerical analysis, one deliberately considers moderately high thresholds Q, which still provide good statistical estimates of the return intervals, and then tries to extrapolate these results to very high thresholds, for which the statistics are very poor.
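As an illustration of this quantification, a minimal sketch (with illustrative names, not code from the cited works) extracts the return intervals between successive exceedances of a threshold q from a series:

```python
import numpy as np

def return_intervals(x, q):
    """Return intervals r between successive events exceeding threshold q."""
    idx = np.flatnonzero(np.asarray(x) > q)  # positions of the Q-events
    return np.diff(idx)                      # times between successive Q-events

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
r = return_intervals(x, q=2.0)
# For independent data the intervals r are approximately exponentially
# distributed with mean R_Q = 1 / P(x > q).
```

For long-term correlated data, by contrast, the intervals obtained this way cluster and deviate from the exponential law, which is the effect discussed below.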

For independent data sets the return intervals are independent and (according to Poisson statistics) exponentially distributed. Clustering of rare events indicates the existence of a certain memory in the return intervals and, as recent studies have shown [4-6], this type of memory is a consequence of long-term dependence in the time series. Long-term memory may be (i) linear, (ii) nonlinear, or (iii) a combination of linear and nonlinear characteristics of some other process. In the first case, often called "monofractal" (linear), the autocorrelation function C_{x}(s) of the input data decays with the time lag s as a power law, and its exponent γ fully describes the correlations between the records (extreme values). In this case both the return intervals and the records exhibit long-term correlations, and their distribution on large scales is characterized by a stretched exponential with exponent γ, while on small scales it obeys a power law with exponent γ-1 [4,6,7]. Such phenomena are observed in long-term climate records [4] and in the volatility (variability) of financial records [5], even though volatility memory includes a nonlinear component and therefore belongs to case (iii).

In the second case, when the records form a "multifractal", the linear autocorrelation function C_{x}(s) vanishes for s>0, and the records are characterized by nonlinear multifractal correlations that cannot be described by a single exponent. Using the generation of a multiplicative random cascade (MRC), it was shown in [8] that the nonlinear correlations inherent in such time series give the statistics of return intervals a peculiar character, manifested in power-law probability density functions (PDFs), autocorrelation functions (ACFs), and conditional return periods (CRPs), which contradicts both the independence property and the monofractal case with long-term correlations in the original data [2]. The exponents of the corresponding power laws depend essentially on the selected threshold level, i.e., the return intervals behave differently at high and low thresholds. Consequently, direct extrapolation of the laws governing return intervals at low thresholds is not legitimate for a quantitative description of return intervals at high thresholds.

In [9], based on 24-hour Holter monitoring data, it was shown that the linear and nonlinear long-term memory inherent in heartbeat intervals leads to power-law changes of the PDF. As a result, a power law is satisfied by the probability W_{Q}(t;Δt) that, within Δt time units after the last return interval with an extreme event (record) larger than the threshold Q (briefly, a Q-interval), there will be at least one Q-interval, given that t time units have elapsed since the last Q-interval appeared.

In this paper, prediction of large heartbeat intervals (those with a record exceeding a certain high threshold Q, i.e., Q-intervals) is carried out using the forecasting procedure of [9] together with the return-interval statistics estimates of [10], and with a preliminary selection of Q-intervals with persistent (steadily increasing) records based on the transformation [11] of the original signals (in this case, the return Q-intervals) into a series of times to reach a predetermined change threshold.

1. Prediction of large return intervals of the heart with a large record.

Pre-existing strategies for forecasting return intervals are based, as a rule, on a short-term pre-history and on the construction of a training sample of precursors y_{n,k} consisting of the k events preceding an extreme event y_{n}>Q. These strategies follow mainly two approaches. The first approach considers the large events together with the corresponding precursors and their frequencies, determined from the posterior probability P(y_{n,k}|y_{n}>Q). The second approach considers all precursors y_{n,k}, each of which precedes some record y_{n}, and calculates the probability P(y_{n}>Q|y_{n,k}) that y_{n,k} is a precursor of the extreme event y_{n}>Q [12,13]. The second approach is more comprehensive because it takes into account the precursor information of all events, thus providing additional information about the studied series contained in the short- and long-term correlations of the original data [14,15].

A more direct way of treating the problem is the method proposed in [12], in which one seeks the precursor that precedes the extreme event under consideration with the highest probability and generates an alarm upon the appearance of such a precursor. For physiological extreme events (records) that appear in nonlinear complex systems (to which, in particular, the cardiovascular system belongs), this precursor may not be representative, because many other precursors may determine, with comparable probability, the heartbeat interval following the extreme event y_{n}. In this case it is advisable to generate an alarm signal in accordance with a preliminary estimate of the probability P with which extreme events exceed the threshold Q_{P}. The choice of Q_{P} is usually made optimally, by minimizing the total loss associated with forecast errors, including false alarms and missed events, with a preliminary specification of the losses from a false alarm and from a missed event (which can vary greatly in different tasks [12]).

In this regard, a third approach to the problem of predicting large return intervals was developed in [9]; it requires less information and is therefore more convenient than the conventional approaches. This approach, which uses long-term memory and is named in [9] the RIA approach (return intervals approach), is based on the use of return-interval statistics and is especially useful in the study of records having nonlinear (multifractal) long-term memory. From PDF-based estimates of the return intervals of size r between events with a value greater than Q (briefly, this PDF is denoted P_{Q}(r); its properties are considered in [11]), two quantities are obtained that are essential for predicting a record (extreme event) with a value greater than Q (briefly, a Q-event) appearing after the last Q-event. The first of these quantities, τ_{Q}(t), is the expected number of time units after which the next Q-event will occur, given that t time units have elapsed since the last Q-event. By definition, τ_{Q}(0) is equal to the mean return interval R_{Q} between Q-events (briefly, the Q-interval). In general, τ_{Q}(t) is related to P_{Q}(r) by the ratio:

τ_{Q}(t) = ∫_{t}^{∞}(r − t)P_{Q}(r)dr / ∫_{t}^{∞}P_{Q}(r)dr (1)

and, for multifractal data, τ_{Q}(t) satisfies the scaling relation

τ_{Q}(t)/R_{Q} = f(t/R_{Q}) (2)

The PDF of return intervals following an interval of size r>r_{0} is denoted briefly by P_{Q}(r|r_{0}). Generalizing τ_{Q}(t) to the Q-events contained in return intervals of size r>r_{0} (such events will be denoted Q(r_{0})-events) leads to the quantity τ_{Q}(t|r_{0}), in whose definition the Q-event is replaced by a Q(r_{0})-event. **Figure 1a** and **1c** of [9] show the global quantity τ_{Q}(t) and the conditional quantity τ_{Q}(t|r_{0}), the expected number of time units until the next event, with numerical values obtained for the MRC model: (a) for R_{Q}=10 and (c) for R_{Q}=70. Only the values r_{0}=1 and r_{0}=3 were considered. From **Figure 1** it follows that τ_{Q}(t) satisfies a power law:

τ_{Q}(t)/R_{Q} ∼ (t/R_{Q})^{ξ(Q)} (3)

where the exponent ξ(Q) decreases with increasing Q (ξ=0.6 for R_{Q}=10 and ξ=0.47 for R_{Q}=70). The conditional expected number of time units τ_{Q}(t|r_{0}) for r_{0}=1 can also be described by a power law with approximately the same exponent ξ(Q,r_{0}) as the global value ξ(Q). On the contrary, for r_{0}=3 the quantity τ_{Q}(t|r_{0}) deviates significantly from the power law at small values of the argument t/R_{Q}. At large values of t/R_{Q}, the τ_{Q}(t|r_{0}) curves for both r_{0}=1 and r_{0}=3 come close to collapsing (merging). W_{Q}(t;Δt) is the probability that within Δt time units after the last Q-event there will be at least one Q-heartbeat interval, given that t time units have elapsed since the last Q-event. This quantity is related to P_{Q}(r) by the ratio:

W_{Q}(t;Δt) = ∫_{t}^{t+Δt}P_{Q}(r)dr / ∫_{t}^{∞}P_{Q}(r)dr (4)
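Discretizing this ratio over an observed set of return intervals gives a simple empirical estimator of W_{Q}(t;Δt). A minimal sketch (function name illustrative) counts, among intervals longer than t, the fraction that terminate within the next Δt units:

```python
import numpy as np

def w_q(intervals, t, dt):
    """Empirical W_Q(t; dt): among observed return intervals longer than t,
    the fraction that end within the next dt time units (a discretized
    version of the ratio in eq. (4); names are illustrative)."""
    r = np.asarray(intervals, dtype=float)
    surviving = r > t                    # intervals still open after t units
    if surviving.sum() == 0:
        return float("nan")              # no statistics at this depth t
    ending = surviving & (r <= t + dt)   # ...that terminate within (t, t+dt]
    return ending.sum() / surviving.sum()
```

As t approaches the largest observed interval, the estimator relies on ever fewer samples, which is the finite-size issue discussed below.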

Since the value of W_{Q}(t;Δt) is bounded by 1 as t/R_{Q}→0, it can satisfy a power law only for sufficiently large t/R_{Q} and is written as [9]:

W_{Q}(t;Δt) ≈ (δ(Q) − 1)Δt/t, Δt ≪ t, (5)

where δ(Q) is the exponent of the power-law PDF P_{Q}(r) ∼ (r/R_{Q})^{−δ(Q)}.
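Under the assumption of a power-law PDF P_{Q}(r) ∝ r^{−δ(Q)} (an assumed form; δ = 2.5 below is illustrative), the ratio (4) can be evaluated in closed form and compared with the small-Δt power law:

```python
# Assumed power-law PDF P_Q(r) ~ r**(-delta) with delta = delta(Q) > 1
# (delta = 2.5 here is illustrative). Then the ratio (4) integrates to
#   W_Q(t; dt) = 1 - ((t + dt) / t) ** (1 - delta),
# which for dt << t reduces to the power law (delta - 1) * dt / t of eq. (5).
delta, t, dt = 2.5, 100.0, 1.0
w_exact = 1 - ((t + dt) / t) ** (1 - delta)
w_approx = (delta - 1) * dt / t
# The two agree to about 1% when dt / t = 0.01.
```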

**Figure 1c** and **1d** show the W_{Q} graphs for records of the MRC model, characterized by a power spectrum of the form 1/f, for R_{Q}=10 and R_{Q}=70, respectively. For such records the PDF is better described by a gamma distribution than by a power law, from which it deviates significantly on large scales, so obtaining an analytical expression for W_{Q} in this case is very difficult. Empirically, [9] shows that the best estimate of W_{Q} in this case is obtained if the denominator t of the fraction (5) is replaced by the expected residual time τ_{Q}(t), which leads to the estimate:

W_{Q}(t;Δt) ≈ (δ(Q) − 1)Δt/τ_{Q}(t) (6)

For large t/R_{Q} there are strong finite-size effects, which manifest themselves especially at high R_{Q} (**Figure 1d**). These finite-size effects decrease with decreasing R_{Q} and with increasing length l of the time series; they understate the denominator in (5) and thereby artificially inflate the estimate of W_{Q}.

The simplest forecast is obtained by choosing the estimate (6) with a fixed value Δt=1 and a high probability threshold. These results agree well with the corresponding results for the MRC model. To build a more accurate prediction, an algorithm is proposed in [9] that compares the W_{Q} estimates at various fixed values of Q_{P} and calculates the corresponding risk probabilities. For a fixed value of Q_{P}, two indicators are determined: the sensitivity Sens, which characterizes the proportion of correctly predicted Q-events, and the specificity Spec, which characterizes the proportion of correctly predicted non-Q-events. Larger values of Sens and Spec provide a better prognosis. To increase the efficiency of the forecast analysis, the receiver operating characteristic (ROC analysis) is used, in which Sens is plotted against Spec for all possible values of Q_{P}. By definition, at Q_{P}=0, Sens=1 and Spec=0, while at Q_{P}=1, Sens=0 and Spec=1. For 0<Q_{P}<1 the ROC curve runs from the top left to the bottom right corner of the (Sens, Spec) plane. When there is no memory in the data, Sens+Spec=1 and the ROC curve is the straight line connecting the two corners (dashed lines in **Figure 2** of [9]). A general measure of forecast accuracy is PP, 0<PP<1, the integral over the ROC curve, which equals 1 for an absolutely accurate forecast and 1/2 for random data. To estimate the risk probabilities, one can use a "training" sample or a model of the observed records, such as the MRC model. The traditional technique of recognizing given patterns (templates, standards), the so-called PRT technique (pattern recognition technique), is based on short-term memory: a database of all possible patterns y_{n,k} of the k preceding events is built with a sliding window, dividing the full range of possible record values y_{i} into l levels with the same number of values, so that the total number of patterns equals l^{k}. Then, for each precursor pattern y_{n,k}, the probability P(y_{n}>Q|y_{n,k}) that the following event y_{n} exceeds Q is estimated. The main difficulty here is the need for multiple adjustments in order to find the optimal values of the parameters l and k that yield a high forecast accuracy. In the alternative RIA technique, which uses long-term memory, the probability W_{Q}(t;Δt) is determined from the observed records using equation (4) or the analytical expression (6). As shown in [9], the RIA approach to forecasting Q-intervals, which does not require limiting the statistics used, gives the best result in all cases.
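The Sens/Spec bookkeeping behind the ROC analysis can be sketched as follows (illustrative names; the alarm rule `prob >= Q_P` stands in for the risk-probability comparison described above):

```python
import numpy as np

def roc_points(prob, events, thresholds):
    """Sensitivity and specificity of the alarm rule `prob >= Q_P` for each
    threshold Q_P in `thresholds`; `events` flags the actual Q-events."""
    prob = np.asarray(prob, dtype=float)
    events = np.asarray(events, dtype=bool)
    pts = []
    for qp in thresholds:
        alarm = prob >= qp
        sens = (alarm & events).sum() / max(events.sum(), 1)       # hit rate
        spec = (~alarm & ~events).sum() / max((~events).sum(), 1)  # correct rejections
        pts.append((float(sens), float(spec)))
    return pts
```

Sweeping `thresholds` over the whole range of predicted probabilities traces the ROC curve, whose integral gives the accuracy measure PP mentioned above.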

**Figure 2** of [9] shows that for R_{Q}=10 both approaches give similar results in three representative cases of patterns, k=2, k=3 and k=6, while for R_{Q}=70 the ROC curve lies systematically above the RIA curve, especially near Sens=1. Experimental studies suggest [9] that PRT forecasts using "training" on the observed sample of records are usually more accurate than forecasts obtained through the records of the MRC model. The reason for this is the limited ability of the MRC model to describe the short-term dynamics of heartbeat intervals, including individual variations in physiological regulation. At the same time, the high sensitivity of the RIA technique results in significantly fewer false alarms than the PRT technique.

Thus, the use of the long-term memory inherent in heartbeat-interval records when studying the events that appear after the last Q-event has an undoubted advantage over the PRT technique, which uses only short-term memory. The main disadvantage of the RIA approach is that it typically cannot predict the first event of a Q-event cluster, since for the large number of heartbeat intervals t involved, W_{Q}(t;Δt) becomes low. However, owing to the multifractality of the records, extreme events cluster, and the benefit from the improved prediction of the events following the first Q-event in a cluster, together with the reduction of false alarms in the RIA approach, is much greater than the loss from the weak predictability of the first event in the cluster, as confirmed by ROC analysis. In addition, the RIA approach does not require multiple training and test-pattern procedures, which facilitates its numerical implementation compared to the PRT approach.

To improve the efficiency of the Q-event prediction obtained by the RIA method, we will use a combination of this method with the method of reaching a change threshold [10]. The basis of this method is the transformation of the original signal into a series of times to reach the change threshold p [16]. It allows, firstly, aggregating the signal without losing significant information about it and, secondly, predicting not the next signal value but the time within which the signal change exceeds the known threshold p. The prediction is made by a two-layer perceptron. Accordingly, the original signal is transformed as follows:

x = {x_{i}}, i = 1, 2, …, N (7)

x′ = {x_{i_k}}, k = 1, 2, …, N′, i_{1} = 1 (8)

i_{k} = min{i > i_{k−1} : |x_{i} − x_{i_{k−1}}|/x_{i_{k−1}} > p} (9)

τ_{k} = i_{k} − i_{k−1}, k = 2, 3, …, N′ (10)

where N is the number of samples in the original signal; x′ is the transformed signal, in which only the values whose relative difference exceeds the threshold p are retained; N′ is the number of samples in the transformed signal; and τ is the series of times to reach the predetermined change threshold, each value of which is the time it took the signal change to exceed the threshold p. Furthermore, this method enables a combined prediction, which includes:

1) an estimate of the next value of the transformed signal x′;

2) an estimate of the next value of the time series τ.

Agreement of the signs of the differences between successive values of x′ and between successive values of τ indicates persistence (steady growth) of both the values and the times between them. Applied to heart-rate records (extreme events), this means steady growth of the records and of the intervals between consecutive records.
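Assuming the interpretation above (each retained sample is the first one whose relative change from the previously retained sample exceeds p), the transformation into the series x′ and τ can be sketched as follows; dropping the absolute value in the comparison gives the persistent (increases-only) variant:

```python
import numpy as np

def threshold_of_change(x, p, increases_only=False):
    """Keep only the samples whose relative change from the previously kept
    sample exceeds p, and record the times tau needed to reach that change
    (an assumed reading of the transformation; names are illustrative)."""
    x = np.asarray(x, dtype=float)
    kept = [0]                                    # start from the first sample
    for i in range(1, len(x)):
        change = (x[i] - x[kept[-1]]) / x[kept[-1]]
        if (change if increases_only else abs(change)) > p:
            kept.append(i)
    return x[kept], np.diff(kept)                 # x' and tau
```

The `increases_only` flag is an illustrative way of selecting only steadily growing records, as used for the pre-selection described below.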

The transformation (7)-(10), with condition (9) replaced by

i_{k} = min{i > i_{k−1} : (x_{i} − x_{i_{k−1}})/x_{i_{k−1}} > p} (9′)

can be applied for the preliminary selection of persistent heart-rate records above the threshold Q in the RIA-based forecasting process at large Q, which speeds up the latter procedure and increases the reliability of the prognosis.

- Bunde A, Kropp J, Schellnhuber HJ (2003) The science of disasters. Berlin, Heidelberg, New York: Springer, p: 453.
- Bogachev MI, Eichner JF, Bunde A (2007) Effect of nonlinear correlations on the statistics of return intervals in multifractal data sets. Physical Review Letters 99: 240601.
- Bogachev MI (2009) On forecasting extreme values of time series with fractal properties using information on the linear and nonlinear components of long-term dependence. Proceedings of Russian Universities: Electronics 5: 31-40.
- Bunde A, Eichner JF, Kantelhardt JW, Havlin S (2005) Long-term memory: A natural mechanism for the clustering of extreme events and anomalous residual times in climate records. Physical Review Letters 94: 048701.
- Yamasaki K, Muchnik L, Havlin S, Bunde A, Stanley HE (2005) Proc Nat Acad Sci 102: 9424.
- Eichner JF, Kantelhardt JW, Bunde A, Havlin S (2007) Statistics of return intervals in long-term correlated records. Physical Review E 75: 011128.
- Altmann EG, Kantz H (2005) Recurrence time analysis, long-term correlations, and extreme events. Physical Review E 71: 056106.
- Bogachev MI, Bunde A (2008) Memory effects in the statistics of interoccurrence times between large returns in financial records. Phys Rev E 78: 036114.
- Bogachev MI, Kireenkov IS, Nifontov EM, Bunde A (2009) Statistics of return intervals between long heartbeat intervals and their usability for online prediction of disorders. New Journal of Physics 11: 063036.
- Abdullaev NT, Dyshin OA, Ibragimova ID (2016) Diagnosis of heart disease based on the statistics of repeated intervals between extreme events in heart rate. Proceedings FREME, Vladimir.
- Sidorkina IG, Egoshin FV, Shumkov DS, Kudrin PA (2010) Prediction of complex signals based on the selection of boundaries of dynamic-system realizations. Scientific and Technical Herald of St. Petersburg State University of Information Technologies, Mechanics and Optics 2: 49-53.
- Bernardo JM, Smith AFM (1994) Bayesian Theory. New York: Wiley.
- Bishop CM (1995) Neural Networks for Pattern Recognition. Oxford University Press.
- Hallerberg S, Altmann EG, Holstein D, Kantz H (2007) Precursors of extreme increments. Physical Review E 75: 016706.
- Hallerberg S, Kantz H (2008) Influence of the event magnitude on the predictability of an extreme event. Physical Review E 77: 011108.
- Egoshin AV (2007) Analysis of time to achieve the threshold signal changes in the problem of neural network time series prediction. Information-computing Technologies and their Applications, pp: 74-76.
