906 results for statistical techniques


Relevance:

20.00%

Publisher:

Abstract:

The use of Wireless Sensor Networks (WSNs) for vibration-based Structural Health Monitoring (SHM) has become a promising approach due to advantages such as low cost and fast, flexible deployment. However, inherent technical issues such as data asynchronicity and data loss have prevented these systems from being extensively used. Recently, several SHM-oriented WSNs have been proposed and are believed to overcome a large number of these technical uncertainties. Nevertheless, there is limited research verifying the applicability of those WSNs with respect to demanding SHM applications like modal analysis and damage identification. Based on a brief review, this paper first reveals that Data Synchronization Error (DSE) is the most inherent of the uncertainties affecting SHM-oriented WSNs. The effects of this factor on the outcomes and performance of the most robust Output-only Modal Analysis (OMA) techniques are then investigated for the case of merging data from multiple sensor setups. The two OMA families selected for this investigation are Frequency Domain Decomposition (FDD) and data-driven Stochastic Subspace Identification (SSI-data), since both have been widely applied in the past decade. Accelerations collected by a wired sensory system on a large-scale laboratory bridge model are used as benchmark data after a certain level of noise is added to account for the greater presence of noise in SHM-oriented WSNs. From this source, a large number of simulations were run to generate multiple DSE-corrupted datasets and so facilitate statistical analyses. The results of this study show the robustness of FDD and the precautions needed for the SSI-data family when dealing with DSE at a relaxed level. Finally, a combination of the preferred OMA techniques, together with channel projection for the time-domain OMA technique to cope with DSE, is recommended.
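As a point of reference for readers unfamiliar with FDD, the sketch below illustrates the core of the technique in Python: a cross-power spectral density matrix is assembled from multi-channel accelerations and decomposed by SVD at each frequency line, so that peaks in the first singular value indicate candidate modes. It is a minimal, generic illustration (the array shapes, sampling rate and segment length are assumptions), not the implementation used in the study.

```python
# Minimal sketch of the Frequency Domain Decomposition (FDD) idea (not the
# study's implementation): build the cross-power spectral density (CPSD)
# matrix of multi-channel accelerations, then take an SVD at each frequency
# line; peaks in the first singular value suggest natural frequencies.
import numpy as np
from scipy.signal import csd

def fdd_first_singular_value(acc, fs, nperseg=1024):
    """acc: (n_samples, n_channels) acceleration array (hypothetical input)."""
    n_ch = acc.shape[1]
    f, _ = csd(acc[:, 0], acc[:, 0], fs=fs, nperseg=nperseg)
    # CPSD matrix G(f) with shape (n_freq, n_ch, n_ch)
    G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
    for i in range(n_ch):
        for j in range(n_ch):
            _, G[:, i, j] = csd(acc[:, i], acc[:, j], fs=fs, nperseg=nperseg)
    # First singular value spectrum; the corresponding first singular vectors
    # approximate the mode shapes at the identified peaks.
    s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
    return f, s1
```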

Relevance:

20.00%

Publisher:

Abstract:

Recurrence relations in mathematics form a very powerful and compact way of looking at a wide range of relationships. Traditionally, the concept of recurrence has often been a difficult one for the secondary teacher to convey to students. Closely related to the powerful proof technique of mathematical induction, recurrences can capture many relationships in formulas that are much simpler than so-called direct or closed formulas. In computer science, recursive coding often has a similar compactness property and, perhaps not surprisingly, suffers from classroom problems similar to those of recurrences: students often find both the basic concepts and the practicalities elusive. Using models designed to illuminate the relevant principles for students, we offer a range of examples that use the modern spreadsheet environment to illustrate the great expressive and computational power of recurrences.
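A small, generic example (not taken from the paper) of the compactness the abstract refers to: a savings balance with interest and a regular deposit is one line as a recurrence, mirroring a spreadsheet fill-down, while the equivalent closed formula is noticeably less transparent.

```python
# Illustrative only: the recurrence b_n = (1 + r) * b_{n-1} + d versus its
# closed form. The recurrence reads like a spreadsheet cell that refers to
# the cell above it; the closed form requires the annuity algebra.
def balance_recursive(n, b0=1000.0, r=0.05, d=100.0):
    if n == 0:
        return b0
    return (1 + r) * balance_recursive(n - 1, b0, r, d) + d

def balance_closed_form(n, b0=1000.0, r=0.05, d=100.0):
    # Equivalent direct formula, noticeably less transparent than the recurrence
    return b0 * (1 + r) ** n + d * ((1 + r) ** n - 1) / r

assert abs(balance_recursive(20) - balance_closed_form(20)) < 1e-6
```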

Relevance:

20.00%

Publisher:

Abstract:

We first classify state-of-the-art solutions to the stream authentication problem in the multicast environment and group them into Signing and MAC approaches. A new approach for authenticating digital streams using Threshold Techniques is then introduced. The main advantages of the new approach are that it tolerates packet loss up to a threshold number of packets and that it has minimal space overhead. It is most suitable for multicast applications running over lossy, unreliable communication channels while, at the same time, preserving the security requirements. We use linear equations based on Lagrange polynomial interpolation and Combinatorial Design methods.
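For readers unfamiliar with the underlying primitive, the sketch below shows a generic Shamir-style threshold reconstruction based on Lagrange interpolation over a prime field: any k of the n shares suffice to rebuild the protected value, which is what gives tolerance to packet loss up to a threshold. The field modulus, share layout and parameters are illustrative assumptions, not the authors' exact packet-encoding scheme.

```python
# Generic threshold recovery via Lagrange interpolation over a prime field
# (a Shamir-style sketch, not the paper's packet construction).
import random

P = 2_147_483_647  # illustrative prime modulus for the field

def make_shares(secret, k, n):
    # Random degree-(k-1) polynomial with the secret as constant term
    coeffs = [secret] + [random.randrange(P) for _ in range(k - 1)]
    poly = lambda x: sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, poly(x)) for x in range(1, n + 1)]

def recover(shares):
    # Lagrange interpolation evaluated at x = 0 gives back the secret
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

shares = make_shares(123456, k=3, n=5)
assert recover(shares[:3]) == 123456  # any 3 of the 5 shares are enough
```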

Relevance:

20.00%

Publisher:

Abstract:

The occurrence of extreme water level events along low-lying, highly populated and/or developed coastlines can lead to devastating impacts on coastal infrastructure. It is therefore very important that the probabilities of extreme water levels are accurately evaluated to inform flood and coastal management and future planning. The aim of this study was to provide estimates of present-day extreme total water level exceedance probabilities around the whole coastline of Australia, arising from combinations of mean sea level, astronomical tide and storm surges generated by both extra-tropical and tropical storms, but exclusive of surface gravity waves. The study was undertaken in two main stages. In the first stage, a high-resolution (~10 km along the coast) depth-averaged hydrodynamic model was configured for the whole coastline of Australia using the Danish Hydraulic Institute's MIKE 21 modelling suite. The model was forced with astronomical tidal levels, derived from the TPXO7.2 global tidal model, and meteorological fields, from the US National Centers for Environmental Prediction's global reanalysis, to generate a 61-year (1949 to 2009) hindcast of water levels. This model output was validated against measurements from 30 tide gauge sites around Australia with long records. At each of the model grid points located around the coast, time series of annual maxima and the several highest water levels for each year were derived from the multi-decadal water level hindcast and fitted to extreme value distributions to estimate exceedance probabilities. Stage 1 provided a reliable estimate of the present-day total water level exceedance probabilities around southern Australia, which is mainly impacted by extra-tropical storms. However, because the meteorological fields used to force the hydrodynamic model only weakly include the effects of tropical cyclones, the resulting water level exceedance probabilities were underestimated around western, northern and north-eastern Australia at higher return periods. Even if the resolution of the meteorological forcing were adequate to represent tropical cyclone-induced surges, multi-decadal periods yield insufficient instances of tropical cyclones to enable the use of traditional extreme value extrapolation techniques. Therefore, in the second stage of the study, a statistical model of tropical cyclone tracks and central pressures was developed using historic observations. This model was then used to generate synthetic events representing 10,000 years of cyclone activity for the Australian region, with characteristics based on the observed tropical cyclones of the last ~40 years. Wind and pressure fields, derived from these synthetic events using analytical profile models, were used to drive the hydrodynamic model and predict the associated storm surge response. A random time period during the tropical cyclone season was chosen, and astronomical tidal forcing for this period was included to account for non-linear interactions between the tidal and surge components. For each model grid point around the coast, annual maximum total levels for these synthetic events were calculated and used to estimate exceedance probabilities. The exceedance probabilities from stages 1 and 2 were then combined to provide a single estimate of present-day extreme water level probabilities around the whole coastline of Australia.
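The extreme-value step can be illustrated in a few lines of Python: annual maximum levels are fitted to a Generalized Extreme Value (GEV) distribution and return levels are read off from its inverse survival function. The data below are synthetic stand-ins, not the study's hindcast, and the use of scipy's genextreme is simply one convenient choice.

```python
# Generic illustration of fitting annual maxima to a GEV distribution and
# computing T-year return levels (synthetic data, not the study's hindcast).
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
annual_maxima = genextreme.rvs(c=-0.1, loc=1.2, scale=0.15, size=61,
                               random_state=rng)  # stand-in for a 61-year hindcast

c, loc, scale = genextreme.fit(annual_maxima)
for T in (10, 50, 100):
    # Return level: the water level with annual exceedance probability 1/T
    level = genextreme.isf(1.0 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} m")
```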

Relevance:

20.00%

Publisher:

Abstract:

This article presents field applications and validations of the controlled Monte Carlo data generation scheme. This scheme was previously derived to help the Mahalanobis squared distance–based damage identification method cope with data-shortage problems, which often cause inadequate data multinormality and unreliable identification outcomes. To do so, real vibration datasets from two actual civil engineering structures with such data (and identification) problems are selected as the test objects, which are then shown to be in need of enhancement to consolidate their condition. By utilizing the robust probability measures of the data condition indices in controlled Monte Carlo data generation, together with a statistical sensitivity analysis of the Mahalanobis squared distance computational system, well-conditioned synthetic data generated by an optimal controlled Monte Carlo data generation configuration can be evaluated without bias against data generated by other set-ups and against the original data. The analysis results reconfirm that controlled Monte Carlo data generation is able to overcome the shortage of observations, improve data multinormality and enhance the reliability of the Mahalanobis squared distance–based damage identification method, particularly with respect to false-positive errors. The results also highlight the dynamic structure of controlled Monte Carlo data generation, which makes the scheme well adapted to any type of input data with any (original) distributional condition.
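For context, the Mahalanobis squared distance check that the identification method is built on can be sketched as below: features from the healthy state define a baseline mean and covariance, and unusually large distances for new observations flag potential damage. This is a generic novelty-detection sketch with synthetic features, not the controlled Monte Carlo generation scheme itself.

```python
# Generic Mahalanobis squared distance (MSD) novelty check with synthetic
# features (not the paper's scheme): baseline statistics come from the
# healthy state; large MSD values for new observations suggest damage.
import numpy as np

def msd(baseline, observations):
    """baseline: (n, p) healthy-state feature matrix; observations: (m, p)."""
    mu = baseline.mean(axis=0)
    cov = np.cov(baseline, rowvar=False)
    cov_inv = np.linalg.pinv(cov)  # pseudo-inverse guards against ill-conditioning
    diff = observations - mu
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

rng = np.random.default_rng(1)
healthy = rng.normal(size=(200, 6))                        # synthetic baseline features
test = rng.normal(loc=[0, 0, 0, 0, 0, 2.5], size=(5, 6))   # shifted features -> larger MSD
print(msd(healthy, test))
```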

Relevance:

20.00%

Publisher:

Abstract:

In this paper the renormalization group (RG) method of Chen, Goldenfeld, and Oono [Phys. Rev. Lett., 73 (1994), pp. 1311-1315; Phys. Rev. E, 54 (1996), pp. 376-394] is presented in a pedagogical way to increase its visibility in applied mathematics and to argue favorably for its incorporation into the corresponding graduate curriculum. The method is illustrated by some linear and nonlinear singular perturbation problems. © 2012 Society for Industrial and Applied Mathematics.
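As an indication of the kind of linear problem such a presentation typically works through (a standard textbook illustration, not necessarily one of the paper's own examples), the weakly damped oscillator shows how the RG step removes secular growth by promoting the amplitude to a slowly varying function:

```latex
% Canonical linear illustration of the RG step (standard in the literature,
% not reproduced from the paper): the weakly damped oscillator.
\begin{align*}
  &y'' + \varepsilon y' + y = 0, \qquad 0 < \varepsilon \ll 1,\\
  &\text{naive expansion:}\quad
    y \sim A\cos(t+\phi) - \frac{\varepsilon}{2}\,A\,t\cos(t+\phi) + O(\varepsilon^2),
    \quad\text{secular in } t,\\
  &\text{renormalized amplitude } A = A(\tau),\ \tau = \varepsilon t:\qquad
    \frac{dA}{d\tau} = -\frac{A}{2}
    \;\Longrightarrow\; y \sim A_0\, e^{-\varepsilon t/2}\cos(t+\phi).
\end{align*}
```

This reproduces, to leading order, the slowly decaying envelope of the exact solution.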

Relevance:

20.00%

Publisher:

Abstract:

Following the derivation of amplitude equations through a new two-time-scale method [O'Malley, R. E., Jr. & Kirkinis, E. (2010) A combined renormalization group-multiple scale method for singularly perturbed problems. Stud. Appl. Math. 124, 383-410], we show that a multi-scale method may often be preferable to the method of matched asymptotic expansions for solving singularly perturbed problems. We illustrate this approach with 10 singularly perturbed ordinary and partial differential equations. © 2011 Cambridge University Press.

Relevance:

20.00%

Publisher:

Abstract:

In this paper we introduce a new technique to obtain the slow-motion dynamics in nonequilibrium and singularly perturbed problems characterized by multiple scales. Our method is based on a straightforward asymptotic reduction of the order of the governing differential equation and leads to amplitude equations that describe the slowly varying envelope of a uniformly valid asymptotic expansion. This may constitute a simpler and, in certain cases, more general approach to the derivation of asymptotic expansions than other mainstream methods such as the method of Multiple Scales or Matched Asymptotic Expansions, because of its relation to the Renormalization Group. We illustrate our method with a number of singularly perturbed problems for ordinary and partial differential equations and recover certain results from the literature as special cases. © 2010 - IOS Press and the authors. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

With nine examples, we seek to illustrate the utility of the Renormalization Group approach as a unification of other asymptotic and perturbation methods.

Relevance:

20.00%

Publisher:

Abstract:

This article elucidates and analyzes the fundamental underlying structure of the renormalization group (RG) approach as it applies to the solution of any differential equation involving multiple scales. The amplitude equation derived through the elimination of secular terms, arising from a naive perturbation expansion of the solution to these equations by the RG approach, is reduced to an algebraic equation expressed in terms of the Thiele semi-invariants or cumulants of the eliminant sequence {Z_i}, i = 1, 2, .... Its use is illustrated through the solution of both linear and nonlinear perturbation problems, and certain results from the literature are recovered as special cases. The fundamental structure that emerges from the application of the RG approach is not the amplitude equation but the aforementioned algebraic equation. © 2008 The American Physical Society.

Relevance:

20.00%

Publisher:

Abstract:

A catchment-scale multivariate statistical analysis of hydrochemistry enabled assessment of interactions between alluvial groundwater and Cressbrook Creek, an intermittent drainage system in southeast Queensland, Australia. Hierarchical cluster analyses and principal component analysis were applied to time-series data to evaluate the hydrochemical evolution of groundwater during periods of extreme drought and severe flooding. A simple three-dimensional geological model was developed to conceptualise the catchment morphology and the stratigraphic framework of the alluvium. The alluvium forms a two-layer system with a basal coarse-grained layer overlain by a clay-rich low-permeability unit. In the upper and middle catchment, alluvial groundwater is chemically similar to streamwater, particularly near the creek (reflected by high HCO3/Cl and K/Na ratios and low salinities), indicating a high degree of connectivity. In the lower catchment, groundwater is more saline with lower HCO3/Cl and K/Na ratios, notably during dry periods. Groundwater salinity substantially decreased following severe flooding in 2011, notably in the lower catchment, confirming that flooding is an important mechanism for both recharge and maintaining groundwater quality. The integrated approach used in this study enabled effective interpretation of hydrological processes and can be applied to a variety of hydrological settings to synthesise and evaluate large hydrochemical datasets.
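To make the multivariate workflow concrete, the sketch below applies the same two techniques named above, hierarchical cluster analysis and principal component analysis, to a synthetic samples-by-ions matrix. The data, the preprocessing choices (log transform and standardisation) and the three-group cut are illustrative assumptions, not the study's configuration.

```python
# Generic hydrochemistry workflow sketch (synthetic data, not the study's):
# standardise a samples-by-ions matrix, group samples by hierarchical
# clustering, and project them onto principal components to look for water types.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

ions = ["Na", "K", "Ca", "Mg", "Cl", "HCO3", "SO4"]
rng = np.random.default_rng(42)
X = rng.lognormal(mean=1.0, sigma=0.5, size=(60, len(ions)))  # stand-in concentrations

Xs = StandardScaler().fit_transform(np.log10(X))   # log-transform and standardise
Z = linkage(Xs, method="ward")                     # hierarchical cluster analysis
clusters = fcluster(Z, t=3, criterion="maxclust")  # cut the tree into 3 water groups

scores = PCA(n_components=2).fit_transform(Xs)     # first two principal components
for k in np.unique(clusters):
    print(f"cluster {k}: {np.sum(clusters == k)} samples, "
          f"mean PC1 = {scores[clusters == k, 0].mean():.2f}")
```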

Relevance:

20.00%

Publisher:

Abstract:

The objective of this chapter is to provide an overview of traffic data collection that can and should be used for the calibration and validation of traffic simulation models. There are big differences in the availability of data from different sources. Some types of data, such as loop detector data, are widely available and widely used. Others, such as travel time data from GPS probe vehicles, can be measured with additional effort. Still others, such as trajectory data, are available only in rare situations such as research projects.

Relevance:

20.00%

Publisher:

Abstract:

For renewable energy sources whose outputs vary continuously, the Z-source current-type inverter has been proposed as a possible buck-boost alternative for grid interfacing. With a unique X-shaped LC network connected between its dc power source and the inverter circuitry, the Z-source current-type inverter is, however, expected to suffer from compounded resonant complications in addition to those associated with its second-order output filter. To improve its damping performance, this paper proposes the careful integration of Posicast or three-step compensators before the inverter pulse-width modulator for damping triggered resonant oscillations. In total, two compensators are needed to wave-shape the inverter boost factor and modulation ratio, and they can conveniently be implemented using first-in first-out stacks and the embedded timers of modern digital signal processors widely used in motion control applications. Both techniques are found to damp the resonance of the ac filter well, but when transiting from current-buck to boost state the three-step technique is less effective, due to the sudden intermediate discharging interval introduced by its non-monotonic stepping (unlike the monotonic stepping of Posicast damping). These findings have been confirmed in both simulations and experiments using a laboratory prototype.
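The Posicast idea referred to above can be sketched with a generic underdamped second-order plant: the command step is split into two smaller, monotonically increasing steps so that the second cancels the oscillation excited by the first. The plant parameters and the use of scipy below are illustrative assumptions and do not reproduce the paper's FIFO/DSP implementation or the inverter model.

```python
# Generic half-cycle Posicast input shaping on an underdamped second-order
# plant (a stand-in for a resonant LC stage, not the paper's implementation).
import numpy as np
from scipy import signal

wn, zeta = 2 * np.pi * 100.0, 0.05          # assumed resonant frequency and damping
plant = signal.TransferFunction([wn**2], [1, 2 * zeta * wn, wn**2])

delta = np.exp(-zeta * np.pi / np.sqrt(1 - zeta**2))   # step-response overshoot
Td = 2 * np.pi / (wn * np.sqrt(1 - zeta**2))            # damped oscillation period

t = np.linspace(0, 0.1, 5000)
plain_step = np.ones_like(t)
posicast = np.where(t < Td / 2, 1 / (1 + delta), 1.0)    # two monotonic steps

_, y_plain, _ = signal.lsim(plant, plain_step, t)
_, y_shaped, _ = signal.lsim(plant, posicast, t)
print(f"overshoot without shaping: {y_plain.max() - 1:.2%}")
print(f"overshoot with Posicast:   {y_shaped.max() - 1:.2%}")
```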

Relevance:

20.00%

Publisher:

Abstract:

A probabilistic method is proposed to evaluate the voltage quality of grid-connected photovoltaic (PV) power systems. The random behavior of solar irradiation is described in statistical terms, and the resulting voltage fluctuation probability distribution is then derived. The reactive power capabilities of the PV generators are then analyzed, and their operation under constant power factor mode is examined. It is shown that, by utilizing the reactive power capability of the PV generators to the full, network voltage quality can be greatly enhanced.
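A toy Monte Carlo version of the idea is sketched below: irradiance is sampled from an assumed distribution, the resulting active power is mapped to a voltage change at the connection point through the common short-feeder approximation ΔV ≈ (PR + QX)/V, and a constant power factor setting is used to absorb reactive power. The feeder impedance, PV rating and irradiance model are illustrative assumptions, not the paper's analytical derivation.

```python
# Toy Monte Carlo sketch of irradiance-driven voltage fluctuation at a PV
# connection point, comparing unity power factor with a constant lagging
# power factor (illustrative parameters, not the paper's model).
import numpy as np

rng = np.random.default_rng(7)
R, X, V_nom = 0.5, 0.4, 11e3           # assumed feeder impedance (ohm) and nominal voltage (V)
P_rated = 2e6                           # assumed PV rating (W)

irradiance = np.clip(rng.beta(2.0, 2.0, size=100_000), 0, 1)   # stand-in irradiance model
P = P_rated * irradiance

def voltage_rise(P, power_factor):
    Q = -P * np.tan(np.arccos(power_factor))   # negative: PV absorbs reactive power
    return (P * R + Q * X) / V_nom             # approximate voltage rise (V)

for pf in (1.00, 0.95):
    dV = voltage_rise(P, pf)
    print(f"pf={pf:.2f}: 95th percentile voltage rise = "
          f"{np.percentile(dV, 95) / V_nom:.2%} of nominal")
```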