Title:

On nonstationarity from operational and environmental effects in structural health monitoring bridge data

Structural Health Monitoring (SHM) describes a set of activities that can be followed in order to collect data from an existing structure, generate data-based information about its current condition, identify the presence of any signs of abnormality and forecast its future response. These activities include, among others, instrumentation, data acquisition, processing, generation of diagnostic tools, as well as transmission of information to engineers, owners and authorities. SHM and, more specifically, continuous monitoring can provide numerous measures, which can be broadly classified into three categories: vibration-based, which includes natural frequencies, mode shapes and damping ratios; component-based, such as strains, tensions and deflections; and environmental and operational variations (EOVs), associated with temperature, wind, traffic, humidity and others. One of the main technical problems that SHM has to tackle is that of data normalisation. In abstract terms, this describes the impact that EOVs can have on SHM measures. In many cases, with interest placed on bridges, it has been observed that EOVs introduce nonstationarity into SHM signals that can mask the variability associated with the presence of damage, making damage detection attempts difficult. Hence, it is desirable to quantify the impacts of EOVs on damage-sensitive features and project them out, using methods such as cointegration, Principal Component Analysis (PCA) or others, in order to achieve a stationary signal. This type of signal can be assessed over time using tools such as statistical process control (SPC) charts, to identify the existence of novelty, which can be linked with damage. As one can understand from the above, it is important to detect the presence of nonstationarity in SHM signals and identify its sources. However, this is not a straightforward procedure, and one important question that needs to be answered is: how can one judge whether a signal is stationary or not?
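The data-normalisation idea outlined above — project out the common EOV-driven variation from a set of damage-sensitive features and monitor the residual with an SPC chart — can be illustrated with a minimal NumPy sketch. All series and parameters here are hypothetical, invented only to show the mechanics:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical damage-sensitive features: four natural frequencies whose
# common variation is driven by a simulated temperature cycle (an EOV).
n = 500
temperature = 10.0 * np.sin(2 * np.pi * np.arange(n) / 100)
features = (np.array([3.0, 5.2, 7.1, 9.8])[None, :]
            + 0.02 * temperature[:, None]
            + 0.005 * rng.standard_normal((n, 4)))

# PCA via SVD on the mean-centred features; the first principal component
# captures the common EOV-driven variation, so projecting it out leaves a
# residual that is closer to stationary.
X = features - features.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
residual = X - np.outer(X @ Vt[0], Vt[0])

# Shewhart-style SPC limits (mean +/- 3 sigma) on one residual series;
# persistent excursions outside the limits would flag novelty.
r = residual[:, 0]
ucl, lcl = r.mean() + 3 * r.std(), r.mean() - 3 * r.std()
print("variance before/after projection:", X[:, 0].var(), r.var())
print("points outside control limits:", int(np.sum((r > ucl) | (r < lcl))))
```

The choice of how many principal components to discard is itself a modelling decision; here one component suffices because a single EOV drives all four features.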
In this work, this question is discussed, focusing on the definition of weak stationarity and the assumptions under which this judgement holds. In particular, the data coming from SHM are finite samples. Therefore, the mean and variance of a signal can be tracked using a sequence of moving windows, something that requires prior determination of the window width. However, the major concern here is that SHM signals can be characterised as periodically-correlated, or cyclostationary. In such cases, it is better to use more advanced statistical tools to assess a signal's nonstationarity. More specifically, nonstationarity tests coming from the context of econometrics and time-series analysis can be employed. In order to use such proxies more extensively, one should build trust in their indications by understanding the mechanism by which they operate. This work concentrates on the Augmented Dickey-Fuller (ADF) test, and emphasis is placed on the hypothesis (the presence of a unit root) under which it performs its assessment. In brief, a series of simulations is generated and, based on dimensional analysis, it is shown that the ADF test essentially counts the number of cycles/periods of the dominant periodic component. Its indications depend on the number of observations/cycles, the normalised frequency of the signal, the sampling rate and the signal-to-noise ratio (SNR). The most important conclusion is that, knowing the sampling frequency of any given signal, a critical frequency in Hz can be found, derived from the critical normalised one as a function of the number of cycles, which can be used directly to judge whether the signal is stationary or not. In other words, this investigation provides an answer to the question: after how many cycles of continuous monitoring (i.e. days) can an SHM signal be judged as stationary? As well as considering nonstationarity in a general way, this thesis returns to the main issue of data normalisation.
To begin with, a laboratory test is performed, in the Jonas lab of the University of Sheffield, on an aluminium truss bridge model manufactured there. In particular, this involved vibration analysis of the truss bridge inside an environmental chamber, which simulated varying temperature conditions from 10 to 20 degrees Celsius, while damage was introduced on the structure by the removal of bolts and connecting brackets at two locations of the model. This experiment provided interesting results with which to discuss further the impact of EOVs on data coming from the monitoring of a small-scale structure. After that, the thesis discusses the use of Johansen's approach to cointegration in the context of SHM, demonstrates its use on the laboratory truss bridge data and provides a review of the available methods that can be used to monitor the cointegration residual. The latter is the stationary signal provided by cointegration, which is free from EOVs and suitable for novelty detection. The methodologies reviewed are various SPC charts, while the use of the ADF test is also explored, with extensive discussion. Furthermore, an important conclusion from the SHM literature is that the impact of EOVs on SHM signals can occur on widely disparate time scales. Therefore, the quantification and elimination of these impacts from signals is not an easy procedure, and prior knowledge is needed. For such purposes, refined means originating from the field of signal processing can be used within SHM. Of particular interest here is the concept of multiresolution analysis (MRA), which has been used in SHM to decompose a given signal into its frequency components (different time scales) and evaluate the damage sensitivity of each one, employing Johansen's approach to cointegration, which is able to project out the impact of EOVs from multiple SHM series. A more principled way to perform MRA is proposed here, in order to decompose SHM signals, by introducing two additional steps.
The first step is the ADF test, which can be used to assess each one of the MRA levels in terms of nonstationarity. In this way, a critical decomposition level (L*) can be found and used to decompose the original SHM signal into a nonstationary and a stationary part. The second step introduced includes the use of autocorrelation functions (ACFs) in order to test the stationary MRA levels and identify those that can be considered delta-correlated. These levels can be used to form a noisy component inside the stationary one. Assuming that all the aforementioned steps are confirmed, the original signal can now be decomposed into a stationary, a mean, a nonstationary and a noisy component. The proposed decomposition can be of great interest not only for SHM purposes, but also in the general context of time-series analysis, as it provides a principled way to perform MRA. The proposed analysis is demonstrated on natural frequency and temperature data of the Z24 Bridge. All in all, the thesis tries to answer the following questions: 1) How can an SHM signal be judged as nonstationary/stationary, and under which assumptions? 2) After how many cycles of continuous monitoring does an SHM signal that is initially nonstationary become stationary? 3) What are the main drivers of this nonstationarity (i.e. EOVs, abnormality/damage or others)? 4) How can one distinguish the effect of EOVs from that of abnormality/damage? 5) How can one project out the confounding influence of EOVs from an SHM signal and provide a signal that is suitable for novelty detection? 6) Is it possible to decompose an SHM signal and study each one of its components separately? 7) Which of these components are mostly affected by EOVs, which by damage, and which do not include important information in terms of novelty detection?
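The two-step idea can be sketched in NumPy using a simple Haar-style averaging scheme as a stand-in for the wavelet MRA used in the thesis, with the lag-1 ACF as a crude delta-correlation check on each level (the signal and all parameters below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)

def haar_mra(x, levels):
    """Haar-style multiresolution: split x into detail components at
    successively coarser scales plus a smooth approximation; every
    component is upsampled to the original length, so they sum to x."""
    approx, components, scale = x.astype(float), [], 1
    for _ in range(levels):
        avg = (approx[0::2] + approx[1::2]) / 2        # pairwise smoothing
        components.append(np.repeat(approx - np.repeat(avg, 2), scale))
        approx, scale = avg, scale * 2
    components.append(np.repeat(approx, scale))        # coarsest level
    return components

# Hypothetical SHM-like signal: slow trend + periodic component + noise.
n, levels = 512, 4
t = np.arange(n)
signal = 0.5 * t / n + np.sin(2 * np.pi * t / 128) + 0.2 * rng.standard_normal(n)
comps = haar_mra(signal, levels)
assert np.allclose(sum(comps), signal)                 # exact decomposition

def acf1(x):
    """Lag-1 autocorrelation; a value near zero at a fine level suggests
    that level can be lumped into the noisy component."""
    x = x - x.mean()
    return float(np.dot(x[:-1], x[1:]) / np.dot(x, x))

for i, c in enumerate(comps):
    name = f"detail {i + 1}" if i < levels else "approximation"
    print(f"{name:>13}: lag-1 ACF = {acf1(c):+.3f}")
```

In the full scheme each level would also be passed through the ADF test to locate the critical level L*; here only the ACF step is shown, and the fine/coarse split is read off from how quickly each level's autocorrelation decays.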
Understanding and answering all the aforementioned questions can help in identifying signals that can be monitored over time, or in data windows, ensuring that stationarity is achieved, and employing methodologies such as statistical process control (SPC) for novelty detection.
