983 results for nonlinear regression
Abstract:
Hot spot identification (HSID) aims to identify potential sites—roadway segments, intersections, crosswalks, interchanges, ramps, etc.—with disproportionately high crash risk relative to similar sites. An inefficient HSID methodology may either flag a safe site as high risk (false positive) or a high-risk site as safe (false negative), and consequently lead to misuse of available public funds, poor investment decisions, and inefficient risk management practice. Current HSID methods suffer from issues such as underreporting of minor-injury and property damage only (PDO) crashes, the difficulty of incorporating crash severity into the methodology, and the selection of a proper safety performance function to model crash data that is often heavily skewed by a preponderance of zeros. Addressing these challenges, this paper proposes a combination of a PDO equivalency calculation and quantile regression to identify hot spots in a transportation network. In particular, issues related to underreporting and crash severity are tackled by incorporating equivalent PDO crashes, whilst concerns related to the non-count nature of equivalent PDO crashes and the skewness of crash data are addressed by the non-parametric quantile regression technique. The proposed method identifies covariate effects on various quantiles of a population, rather than on the population mean as in most methods in practice, which corresponds more closely with how black spots are identified in the field. The proposed methodology is illustrated using rural road segment data from Korea and compared against the traditional EB method with negative binomial regression.
Application of a quantile regression model to equivalent PDO crashes enables identification of a set of high-risk sites that reflects the true safety cost to society, simultaneously reduces the influence of under-reported PDO and minor-injury crashes, and overcomes the limitations of the traditional NB model in dealing with a preponderance of zeros and right-skewed data.
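As a rough illustration of the idea (not the paper's actual model or data), the sketch below fits linear quantile regression by minimising the pinball loss on synthetic, right-skewed "crash" data and flags sites lying above the fitted 90th-percentile curve; all variables and parameters are invented for the example.

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, X, y, tau):
    """Mean pinball (check) loss for quantile level tau."""
    r = y - X @ beta
    return np.mean(np.where(r >= 0, tau * r, (tau - 1) * r))

def fit_quantile_regression(X, y, tau):
    """Minimise the pinball loss; Powell copes with the non-smooth objective."""
    beta0 = np.zeros(X.shape[1])
    return minimize(pinball_loss, beta0, args=(X, y, tau), method="Powell").x

# Synthetic "crash" data: risk grows with exposure x, with right-skewed,
# heteroscedastic errors so that upper quantiles diverge from the mean.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 500)
y = 1.0 + 0.8 * x + rng.exponential(0.5 + 0.3 * x)
X = np.column_stack([np.ones_like(x), x])

beta_median = fit_quantile_regression(X, y, tau=0.5)
beta_upper = fit_quantile_regression(X, y, tau=0.9)
# "Hot spots": sites whose observed value exceeds the fitted 90th percentile.
hot = y > X @ beta_upper
```

Because the error scale grows with exposure, the fitted 90th-percentile slope exceeds the median slope, which is exactly the kind of covariate-quantile effect a mean-based model cannot show.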
Abstract:
Visual localization in outdoor environments is often hampered by the natural variation in appearance caused by such things as weather phenomena, diurnal fluctuations in lighting, and seasonal changes. Such changes are global across an environment and, in the case of global light changes and seasonal variation, the change in appearance occurs in a regular, cyclic manner. Visual localization could be greatly improved if it were possible to predict the appearance of a particular location at a particular time, based on the appearance of the location in the past and knowledge of the nature of appearance change over time. In this paper, we investigate whether global appearance changes in an environment can be learned sufficiently to improve visual localization performance. We use time of day as a test case, and generate transformations between morning and afternoon using sample images from a training set. We demonstrate that the learned transformation can be generalized from training data and show that the resulting visual localization on a test set is improved relative to raw image comparison. The improvement in localization remains when the area is revisited several weeks later.
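A minimal sketch of the underlying idea, under the assumption (ours, not necessarily the paper's exact formulation) that a morning-to-afternoon appearance change can be captured by a single linear transform learned from training image pairs; the "patches" and the hidden lighting change are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy stand-in for image patches: rows are vectorised patches.
# A hidden global lighting change maps morning -> afternoon.
true_W = np.eye(16) * 0.7 + 0.02            # dimming plus a slight colour mix
morning = rng.uniform(0, 1, (200, 16))
afternoon = morning @ true_W.T + rng.normal(0, 0.01, (200, 16))

# Learn one global transform from the training pairs by least squares.
W, *_ = np.linalg.lstsq(morning, afternoon, rcond=None)
W = W.T

# Localisation test: transform a morning query, then match it against
# the afternoon reference images by nearest neighbour.
query = morning[:50] @ W.T
def best_match(q, refs):
    return int(np.argmin(np.linalg.norm(refs - q, axis=1)))
hits = sum(best_match(query[i], afternoon) == i for i in range(50))
```

With the transform applied, each query lands essentially on its true afternoon counterpart, whereas raw morning-vs-afternoon comparison must absorb the full appearance change.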
Abstract:
Due to knowledge gaps in relation to urban stormwater quality processes, an in-depth understanding of model uncertainty can enhance decision making. Uncertainty in stormwater quality models can originate from a range of sources such as the complexity of urban rainfall-runoff-stormwater pollutant processes and the paucity of observed data. Unfortunately, studies relating to epistemic uncertainty, which arises from the simplification of reality, are limited, and it is often deemed mostly unquantifiable. This paper presents a statistical modelling framework for ascertaining epistemic uncertainty associated with pollutant wash-off under a regression modelling paradigm using Ordinary Least Squares Regression (OLSR) and Weighted Least Squares Regression (WLSR) methods with a Bayesian/Gibbs sampling statistical approach. The study results confirmed that WLSR assuming probability-distributed data provides more realistic uncertainty estimates of the observed and predicted wash-off values than OLSR modelling. It was also noted that the Bayesian/Gibbs sampling approach is superior to the classical statistical and deterministic approaches most commonly used in water quality modelling. The study outcomes confirmed that the prediction error associated with wash-off replication is relatively high due to limited data availability. The uncertainty analysis also highlighted the variability of the wash-off modelling coefficient k as a function of complex physical processes, which is primarily influenced by surface characteristics and rainfall intensity.
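The OLSR/WLSR contrast can be sketched in closed form; the power-law wash-off model, the heteroscedastic noise, and all coefficients below are hypothetical stand-ins, and the paper's Bayesian/Gibbs machinery is not reproduced.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares: all observations weighted equally."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def wls(X, y, w):
    """Weighted least squares: down-weights high-variance observations."""
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

rng = np.random.default_rng(2)
# Toy wash-off data: load = k * intensity^b, i.e. linear in log space.
intensity = rng.uniform(5, 100, 300)
k_true, b_true = 0.5, 0.8
# Heteroscedastic noise: low-intensity events are measured less precisely.
sigma = 0.5 / np.sqrt(intensity)
log_load = np.log(k_true) + b_true * np.log(intensity) + rng.normal(0, sigma)

X = np.column_stack([np.ones_like(intensity), np.log(intensity)])
beta_ols = ols(X, log_load)
beta_wls = wls(X, log_load, 1.0 / sigma**2)   # inverse-variance weights
```

Both estimators recover the coefficients here, but only the weighted fit's implied uncertainty is consistent with the unequal measurement precision — the point the study makes in a fully Bayesian setting.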
Jacobian-free Newton-Krylov methods with GPU acceleration for computing nonlinear ship wave patterns
Abstract:
The nonlinear problem of steady free-surface flow past a submerged source is considered as a case study for three-dimensional ship wave problems. Of particular interest is the distinctive wedge-shaped wave pattern that forms on the surface of the fluid. By reformulating the governing equations with a standard boundary-integral method, we derive a system of nonlinear algebraic equations that enforce a singular integro-differential equation at each midpoint on a two-dimensional mesh. Our contribution is to solve the system of equations with a Jacobian-free Newton-Krylov method together with a banded preconditioner that is carefully constructed with entries taken from the Jacobian of the linearised problem. Further, we are able to utilise graphics processing unit acceleration to significantly increase the grid refinement and decrease the run-time of our solutions in comparison to schemes that are presently employed in the literature. Our approach provides opportunities to explore the nonlinear features of three-dimensional ship wave patterns, such as the shape of steep waves close to their limiting configuration, in a manner that has been possible in the two-dimensional analogue for some time.
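A minimal Jacobian-free Newton-Krylov example using SciPy's `newton_krylov` (which approximates Jacobian-vector products by finite differences and solves each Newton step with a Krylov method) on a toy nonlinear boundary-value problem — not the free-surface formulation of the paper, and without the banded preconditioner or GPU acceleration.

```python
import numpy as np
from scipy.optimize import newton_krylov

n = 50
h = 1.0 / (n + 1)

def residual(u):
    """Discretised nonlinear BVP: u'' = u**3 - 1 on (0,1), u(0)=u(1)=0."""
    upad = np.concatenate([[0.0], u, [0.0]])
    return (upad[:-2] - 2 * upad[1:-1] + upad[2:]) / h**2 - (u**3 - 1.0)

# Jacobian-free: no Jacobian is ever formed; J v is approximated
# internally by finite differences of the residual.
u0 = np.zeros(n)
sol = newton_krylov(residual, u0, f_tol=1e-8)
```

For the full ship-wave system, the paper's contribution is precisely in making the inner Krylov solves cheap via a banded preconditioner built from the linearised problem.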
Abstract:
We report a new approach that uses the single-beam Z-scan technique to discriminate between excited state absorption (ESA) and two- and three-photon nonlinear absorption. By measuring the apparent delay or advance of the pulse in reaching the detector, the nonlinear absorption can be unambiguously identified as either instantaneous or transient. The simple method does not require a large range of input fluences or a sophisticated pump-probe experimental apparatus. The technique is easily extended to any absorption process dependent on pulse width and to nonlinear refraction measurements. We demonstrate, in particular, that the large nonlinear absorption in ZnO nanocones exposed to nanosecond 532 nm pulses is due mostly to ESA rather than pure two-photon absorption.
Abstract:
A nonlinear finite element analysis was carried out to investigate the viscoplastic deformation of solder joints in a ball grid array (BGA) package under temperature cycling. The effects of constraint on the printed circuit board (PCB) and of substrate stiffness on the deformation behaviour of the solder joints were also studied. A relative damage stress was adopted to analyze the potential failure sites in the solder joints. The results indicated that high inelastic strain and strain energy density developed in the joints close to the package center. On the other hand, high constraint and high relative damage stress were associated with the joint closest to the edge of the silicon chip. That joint was regarded as the most susceptible failure site if cavitation instability is the dominant failure mechanism. Increasing the external constraint on the PCB causes a slight increase in stress triaxiality (σm/σeq) and relative damage stress in the joint closest to the edge of the silicon die. The relative damage stress is not sensitive to the Young's modulus of the substrate.
Abstract:
This paper presents two novel nonlinear models of u-shaped anti-roll tanks for ships, and their linearizations. In addition, a third simplified nonlinear model is presented. The models are derived using Lagrangian mechanics. This formulation not only simplifies the modeling process, but also allows one to obtain models that satisfy energy-related physical properties. The proposed nonlinear models and their linearizations are validated using model-scale experimental data. Unlike other models in the literature, the nonlinear models in this paper are valid for large roll amplitudes. Even at moderate roll angles, the nonlinear models have three orders of magnitude lower mean square error relative to experimental data than the linear models.
Abstract:
Parametric roll is a critical phenomenon for ships, whose onset may cause roll oscillations up to ±40 degrees, leading to very dangerous situations and possibly capsizing. Container ships have been shown to be particularly prone to parametric roll resonance when they are sailing in moderate to heavy head seas. A Matlab/Simulink parametric roll benchmark model for a large container ship has been implemented and validated against a wide set of experimental data. The model is a part of a Matlab/Simulink Toolbox (MSS, 2007). The benchmark implements a 3rd-order nonlinear model in which the roll dynamics are strongly coupled with the heave and pitch dynamics. The implemented model has shown good accuracy in predicting the container ship motions, both in the vertical plane and in the transverse plane. Parametric roll has been reproduced for all the data sets in which it occurred, and the model provides realistic results which are in good agreement with the model tank experiments.
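A toy 1-DOF Mathieu-type roll equation illustrates the parametric resonance mechanism — the restoring term modulated at twice the natural roll frequency by passing head waves. All parameters are hypothetical, and this is a stand-in for, not a reproduction of, the 3rd-order coupled benchmark.

```python
import numpy as np
from scipy.integrate import solve_ivp

omega_n = 0.5            # natural roll frequency [rad/s] (hypothetical)
zeta = 0.01              # linear damping ratio (hypothetical)
h = 0.3                  # wave-induced GM variation amplitude (hypothetical)
omega_e = 2 * omega_n    # encounter frequency at principal resonance

def roll_ode(t, y):
    """Damped roll with wave-modulated nonlinear restoring (sin softening)."""
    phi, p = y
    restoring = omega_n**2 * (1 + h * np.cos(omega_e * t)) * np.sin(phi)
    return [p, -2 * zeta * omega_n * p - restoring]

# Start from a tiny disturbance; parametric resonance makes it grow.
sol = solve_ivp(roll_ode, (0, 600), [0.01, 0.0], max_step=0.05, rtol=1e-8)
phi = sol.y[0]
```

Starting from a 0.01 rad disturbance, the roll amplitude grows by orders of magnitude before the nonlinear restoring saturates it — the hallmark onset behaviour the benchmark reproduces in full 3-DOF coupling.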
Abstract:
In this paper, we consider a passivity-based approach for the design of a control law for multiple ship-roll gyro-stabiliser units. We extend previous work on control of ship roll gyro-stabilisation by considering the problem within a nonlinear framework. In particular, we derive an energy-based model using port-Hamiltonian theory and then design an active precession controller using passivity-based control interconnection and damping assignment. The design considers the possibility of having multiple gyro-stabiliser units, and the desired potential energy of the closed-loop system is chosen to behave like a barrier function, which allows us to enforce constraints on the precession angle of the gyros.
Abstract:
This paper presents a nonlinear observer for estimating parameters associated with the restoring term of a roll motion model of a marine vessel in longitudinal waves. Changes in restoring, also referred to as transverse stability, can be the result of changes in the vessel's centre of gravity due to, for example, water on deck, and also of changes in buoyancy triggered by variations in the water-plane area produced by longitudinal waves, which propagate along the fore-aft direction of the hull. These variations in the restoring can dramatically change the dynamics of the roll motion, leading to dangerous resonance. Therefore, it is of interest to estimate and detect such changes.
Abstract:
This paper presents a method for the estimation of thrust model parameters of uninhabited airborne systems using specific flight tests. Particular tests are proposed to simplify the estimation. The proposed estimation method is based on three steps. The first step uses a regression model in which the thrust is assumed constant. This allows us to obtain biased initial estimates of the aerodynamic coefficients of the surge model. In the second step, a robust nonlinear state estimator is implemented using the initial parameter estimates, and the model is augmented by treating the thrust as a random walk. In the third step, the estimate of the thrust obtained by the observer is used to fit a polynomial model in terms of the propeller advance ratio. We consider a numerical example based on Monte Carlo simulations to quantify the sampling properties of the proposed estimator under realistic flight conditions.
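The third step can be sketched as an ordinary polynomial fit of the observer's thrust estimates against the advance ratio; the thrust-curve coefficients and noise level below are invented for illustration, and the observer itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

# Stand-in for the observer output: thrust samples over a range of
# propeller advance ratios J, following a quadratic curve plus noise.
J = rng.uniform(0.1, 0.8, 120)                 # advance ratio samples
ct0, ct1, ct2 = 0.40, -0.30, -0.15             # "true" curve (hypothetical)
thrust_est = ct0 + ct1 * J + ct2 * J**2 + rng.normal(0, 0.005, J.size)

# Ordinary polynomial least squares (numpy returns highest degree first).
c2, c1, c0 = np.polyfit(J, thrust_est, deg=2)
```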
Abstract:
This research quantifies the lag effects of temperature on cardiovascular disease (CVD) mortality and the vulnerability of subpopulations in Changsha, a city in the subtropical climate zone of China. A Poisson regression model within a distributed lag nonlinear model framework was used to examine the lag effects of cold- and heat-related CVD mortality. The lag effect for heat-related CVD mortality was just 0–3 days. In contrast, we observed a statistically significant association at 10–25 lag days for cold-related CVD mortality. Low temperatures with 0–2 lag days increased the mortality risk for those ≥65 years and for females. For all ages, the cumulative effect of cold-related CVD mortality was 6.6% (95% CI: 5.2%–8.2%) over 30 lag days, while that of heat-related CVD mortality was 4.9% (95% CI: 2.0%–7.9%) over 3 lag days. We found that in Changsha the lag effect of hot temperatures is short while the lag effect of cold temperatures is long. Females and older people were more sensitive to extreme hot and cold temperatures than males and younger people.
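A simplified stand-in for the modelling approach: a Poisson GLM with plain lagged-temperature regressors, fitted by iteratively reweighted least squares. The actual distributed lag nonlinear model uses cross-basis splines, which are not reproduced here, and all coefficients are hypothetical.

```python
import numpy as np

def poisson_irls(X, y, n_iter=30):
    """Poisson GLM (log link) fitted by iteratively reweighted least squares."""
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean())                # start near a sensible intercept
    for _ in range(n_iter):
        eta = X @ beta
        mu = np.exp(eta)
        z = eta + (y - mu) / mu               # working response
        W = mu                                # IRLS weights for Poisson
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta

rng = np.random.default_rng(4)
n, L = 3000, 5
temp = rng.normal(0, 1, n + L)
# Mortality responds to current and lagged temperature (coefficients invented).
lag_beta = np.array([0.10, 0.08, 0.05, 0.02, 0.01])
X = np.column_stack([np.ones(n)] + [temp[L - k:n + L - k] for k in range(L)])
eta = 3.0 + X[:, 1:] @ lag_beta
y = rng.poisson(np.exp(eta)).astype(float)
beta_hat = poisson_irls(X, y)
```

The fitted lag coefficients recover the declining lag structure, which is the quantity summed to obtain cumulative effects like those reported above.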
Abstract:
Background: Heatwaves can cause excess deaths in a local area ranging from tens to thousands within a couple of weeks. The excess mortality due to a special event (e.g., a heatwave or an epidemic outbreak) is estimated by subtracting the mortality figure under ‘normal’ conditions from the historical daily mortality records. The calculation of excess mortality is a scientific challenge because of the stochastic temporal pattern of daily mortality data, which is characterised by (a) long-term changing mean levels (i.e., non-stationarity) and (b) the non-linear temperature-mortality association. The Hilbert-Huang Transform (HHT) algorithm is a novel method originally developed for analysing non-linear and non-stationary time series data in the field of signal processing; however, it has not been applied in public health research. This paper aims to demonstrate the applicability and strength of the HHT algorithm in analysing health data. Methods: Special R functions were developed to implement the HHT algorithm to decompose the daily mortality time series into trend and non-trend components in terms of the underlying physical mechanism. The excess mortality is calculated directly from the resulting non-trend component series. Results: The Brisbane (Queensland, Australia) and Chicago (United States) daily mortality time series data were used to calculate the excess mortality associated with heatwaves. The HHT algorithm estimated 62 excess deaths related to the February 2004 Brisbane heatwave. To calculate the excess mortality associated with the July 1995 Chicago heatwave, the HHT algorithm needed to handle the mode-mixing issue. The HHT algorithm estimated 510 excess deaths for the 1995 Chicago heatwave event.
To exemplify potential applications, the HHT decomposition results were used as the input data for a subsequent regression analysis, using the Brisbane data, to investigate the association between excess mortality and different risk factors. Conclusions: The HHT algorithm is a novel and powerful analytical tool for time series data analysis. It has real potential for a wide range of applications in public health research because of its ability to decompose a nonlinear and non-stationary time series into trend and non-trend components consistently and efficiently.
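The excess-mortality logic can be sketched without a full HHT implementation: decompose the series into a trend and a non-trend component, then sum the non-trend part over the event window. Below, a one-year moving average stands in for the HHT-derived trend, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Three years of synthetic daily deaths: baseline 30/day plus a 7-day
# "heatwave" adding 20 excess deaths/day (all numbers hypothetical).
n = 3 * 365
y = rng.poisson(30.0, n).astype(float)
event = slice(500, 507)
y[event] += 20.0

# Stand-in for the HHT trend: a one-year centred moving average.
kernel = np.ones(365) / 365
trend = np.convolve(y, kernel, mode="same")
nontrend = y - trend

# Excess mortality read directly off the non-trend component.
excess = nontrend[event].sum()   # about 7 days x 20 deaths/day
```

The advantage the paper claims for HHT over such fixed smoothers is that its decomposition adapts to non-stationary trends rather than assuming a window length.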
Abstract:
For users of germplasm collections, the purpose of measuring characterization and evaluation descriptors, and subsequently using statistical methodology to summarize the data, is not only to interpret the relationships between the descriptors, but also to characterize the differences and similarities between accessions in relation to their phenotypic variability for each of the measured descriptors. The set of descriptors for the accessions of most germplasm collections consists of both numerical and categorical descriptors. This poses problems for a combined analysis of all descriptors because few statistical techniques deal with mixtures of measurement types. In this article, nonlinear principal component analysis was used to analyze the descriptors of the accessions in the Australian groundnut collection. It was demonstrated that the nonlinear variant of ordinary principal component analysis is an appropriate analytical tool because subspecies and botanical varieties could be identified on the basis of the analysis and characterized in terms of all descriptors. Moreover, outlying accessions could be easily spotted and their characteristics established. The statistical results and their interpretations provide users with a more efficient way to identify accessions of potential relevance for their plant improvement programs and encourage and improve the usefulness and utilization of germplasm collections.
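A crude analogue of the mixed-measurement analysis (not the optimal-scaling nonlinear PCA used in the article): one-hot code the categorical descriptor, standardise all columns, and run ordinary PCA via the SVD. The "accessions" and descriptors are synthetic.

```python
import numpy as np

rng = np.random.default_rng(6)

# Mixed descriptors for 100 "accessions": two numerical and one
# categorical with three levels (think botanical variety).
group = rng.integers(0, 3, 100)
num1 = group * 2.0 + rng.normal(0, 0.3, 100)   # tracks the variety
num2 = rng.normal(0, 1, 100)                   # uninformative descriptor
onehot = np.eye(3)[group]                      # categorical coding

Z = np.column_stack([num1, num2, onehot])
Z = (Z - Z.mean(0)) / Z.std(0)                 # put all columns on one scale

# PCA via SVD of the standardised descriptor matrix.
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * s
explained = s**2 / np.sum(s**2)
```

As in the groundnut analysis, groups that differ across both descriptor types separate along the leading component, so subgroups (and outliers) can be read off the score plot.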
Abstract:
This paper develops a semiparametric estimation approach for mixed count regression models based on series expansion for the unknown density of the unobserved heterogeneity. We use the generalized Laguerre series expansion around a gamma baseline density to model unobserved heterogeneity in a Poisson mixture model. We establish the consistency of the estimator and present a computational strategy to implement the proposed estimation techniques in the standard count model as well as in truncated, censored, and zero-inflated count regression models. Monte Carlo evidence shows that the finite sample behavior of the estimator is quite good. The paper applies the method to a model of individual shopping behavior.
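The gamma baseline density around which the Laguerre expansion is taken can be illustrated by simulation: mixing a Poisson conditional mean over gamma heterogeneity with unit mean yields the negative binomial margin with Var(y) = μ + μ²/a. The expansion itself is not reproduced; all values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(7)

# Gamma baseline for unobserved heterogeneity, normalised to E[v] = 1.
a = 2.0                            # gamma shape; scale = 1/a
v = rng.gamma(a, 1.0 / a, 200_000)

mu = 2.0                           # conditional Poisson mean (fixed here,
y = rng.poisson(mu * v)            # exp(x'b) in the regression setting)

# Marginally, y is negative binomial: mean mu, variance mu + mu**2 / a,
# i.e. overdispersed relative to a pure Poisson with the same mean.
```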