955 results for statistical lip modelling


Relevance:

30.00%

Publisher:

Abstract:

Reliability of electronic parts is a major concern for many manufacturers, since early failures in the field can cost an enormous amount to repair - in many cases far more than the original cost of the product. Manufacturers expend a great deal of effort determining the failure rate of a process, or the fraction of parts that will fail within a given period of time. It is widely recognized that the traditional approach to reliability prediction for electronic systems is not suitable for today's products: based on statistical methods alone, it does not address the physics governing the failure mechanisms in electronic systems. This paper discusses virtual prototyping technologies which can predict the physics taking place and relate it to the appropriate failure mechanisms. Simulation results illustrate the effect of temperature on the assembly process of an electronic package and on the lifetime of a flip-chip package.

Relevance:

30.00%

Publisher:

Abstract:

This paper details and demonstrates integrated optimisation-reliability modelling for predicting the performance of solder joints in electronic packaging. This integrated modelling approach is used to identify quickly and efficiently the most suitable design parameters for solder joint performance during thermal cycling, and is demonstrated on flip-chip components using “no-flow” underfills. To implement the “optimisation in reliability” approach, the finite element simulation tool PHYSICA is coupled with optimisation and statistical tools. The resulting framework is capable of performing design optimisation procedures in an entirely automated and systematic manner.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes a computational strategy for the virtual design and prototyping of electronic components and assemblies. The design process is formulated as a design optimisation problem, whose solution identifies not only a design that meets certain user-specified requirements but also the design with the maximum possible improvement in particular aspects such as reliability, cost, etc. The modelling approach exploits numerical techniques for computational analysis (finite element analysis) integrated with numerical methods for approximation, statistical analysis and optimisation. A software framework of modules incorporating the required numerical techniques is developed and used to carry out design optimisation modelling of fine-pitch flip-chip lead-free solder interconnects.
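The integration of expensive computational analysis with cheap approximation and optimisation that this abstract describes can be sketched in miniature: a response surface is fitted to a handful of analysis runs and then optimised in its place. Everything below (the single coded design variable, the stand-in "FE" function, the quadratic surrogate) is a hypothetical illustration, not the paper's actual framework:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Stand-in "expensive" analysis: a hypothetical damage-style response of
# a solder joint versus one design variable coded to [0, 1] -- purely
# illustrative, not a real finite element model.
def fe_analysis(x):
    return (x - 0.63) ** 2 + 0.1

# 1) Run the expensive model at a few design points
xs = np.linspace(0.0, 1.0, 5)
ys = np.array([fe_analysis(x) for x in xs])

# 2) Fit a cheap quadratic surrogate (response surface) to the samples
a, b, c = np.polyfit(xs, ys, 2)

# 3) Optimise the surrogate instead of the expensive model
res = minimize_scalar(lambda x: a * x**2 + b * x + c,
                      bounds=(0, 1), method="bounded")
x_opt = res.x
```

In a real framework each `fe_analysis` call would be a full FE solve, so minimising calls to it is the entire point of the surrogate.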

Relevance:

30.00%

Publisher:

Abstract:

Ocean biogeochemistry (OBGC) models span a wide variety of complexities, including highly simplified nutrient-restoring schemes, nutrient–phytoplankton–zooplankton–detritus (NPZD) models that crudely represent the marine biota, models that represent a broader trophic structure by grouping organisms as plankton functional types (PFTs) based on their biogeochemical role (dynamic green ocean models) and ecosystem models that group organisms by ecological function and trait. OBGC models are now integral components of Earth system models (ESMs), but they compete for computing resources with higher resolution dynamical setups and with other components such as atmospheric chemistry and terrestrial vegetation schemes. As such, the choice of OBGC in ESMs needs to balance model complexity and realism alongside relative computing cost. Here we present an intercomparison of six OBGC models that were candidates for implementation within the next UK Earth system model (UKESM1). The models cover a large range of biological complexity (from 7 to 57 tracers) but all include representations of at least the nitrogen, carbon, alkalinity and oxygen cycles. Each OBGC model was coupled to the ocean general circulation model Nucleus for European Modelling of the Ocean (NEMO) and results from physically identical hindcast simulations were compared. Model skill was evaluated for biogeochemical metrics of global-scale bulk properties using conventional statistical techniques. The computing cost of each model was also measured in standardised tests run at two resource levels. No model is shown to consistently outperform all other models across all metrics. Nonetheless, the simpler models are broadly closer to observations across a number of fields and thus offer a high-efficiency option for ESMs that prioritise high-resolution climate dynamics. However, simpler models provide limited insight into more complex marine biogeochemical processes and ecosystem pathways, and a parallel approach of low-resolution climate dynamics and high-complexity biogeochemistry is desirable in order to provide additional insights into biogeochemistry–climate interactions.
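The bulk-property skill evaluation described above typically reduces to conventional statistics such as bias, root-mean-square error and correlation against an observational field. A minimal sketch, using toy arrays as stand-ins for the modelled and observed fields (not actual UKESM1 candidate output):

```python
import numpy as np

def skill_metrics(model, obs):
    """Bias, RMSE and Pearson correlation between a modelled field
    and observations, both supplied as flattened 1-D arrays."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    bias = np.mean(model - obs)
    rmse = np.sqrt(np.mean((model - obs) ** 2))
    corr = np.corrcoef(model, obs)[0, 1]
    return bias, rmse, corr

# Toy stand-in fields (hypothetical values, not real tracer output)
obs = np.array([1.0, 2.0, 3.0, 4.0])
mod = np.array([1.1, 1.9, 3.2, 4.3])
bias, rmse, corr = skill_metrics(mod, obs)
```

In practice each field would be area-weighted and masked for missing observations before these statistics are taken.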

Relevance:

30.00%

Publisher:

Abstract:

The work in this paper is of particular significance since it considers the problem of modelling cross- and auto-correlation in statistical process monitoring. The presence of both types of correlation can lead to fault insensitivity or false alarms, although in published literature to date, only autocorrelation has been broadly considered. The proposed method, which uses a Kalman innovation model, effectively removes both correlations. The paper (and Part 2 [2]) has emerged from work supported by EPSRC grant GR/S84354/01 and is of direct relevance to problems in several application areas including chemical, electrical, and mechanical process monitoring.
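The innovation idea at the heart of the proposed method can be illustrated in a scalar setting: when a Kalman filter is driven by a correctly specified state-space model, its one-step-ahead prediction errors (innovations) are white even though the raw measurements are autocorrelated. The AR(1) model and parameter values below are an illustrative assumption, not the multivariate model of the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical scalar state-space model:
#   x_{k+1} = a*x_k + w_k,   y_k = x_k + v_k
a, q, r = 0.9, 1.0, 0.5
n = 5000
x = np.zeros(n)
y = np.zeros(n)
for k in range(1, n):
    x[k] = a * x[k - 1] + rng.normal(0.0, np.sqrt(q))
    y[k] = x[k] + rng.normal(0.0, np.sqrt(r))

# Kalman filter: the innovation e_k = y_k - a*x_hat_{k-1} should be white
x_hat, p = 0.0, 1.0
innov = np.zeros(n)
for k in range(n):
    x_pred = a * x_hat                 # time update
    p_pred = a * a * p + q
    innov[k] = y[k] - x_pred           # innovation (prediction error)
    gain = p_pred / (p_pred + r)       # measurement update
    x_hat = x_pred + gain * innov[k]
    p = (1.0 - gain) * p_pred

def lag1(z):
    """Lag-1 sample autocorrelation."""
    z = z - z.mean()
    return np.dot(z[:-1], z[1:]) / np.dot(z, z)

rho_y, rho_e = lag1(y), lag1(innov)   # high for y, near zero for innovations
```

Monitoring charts applied to `innov` rather than `y` then avoid the false alarms and fault insensitivity that correlation causes.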

Relevance:

30.00%

Publisher:

Abstract:

We present results from a time-dependent gas-phase chemical model of a hot core based on the physical conditions of G305.2+0.2. While the cyanopolyyne HC3N has been observed in hot cores, the longer-chained species HC5N, HC7N and HC9N have not typically been considered hot-core species. We present results which show that these species can be formed under hot-core conditions. We discuss the important chemical reactions in this process and, in particular, show that their abundances are linked to the parent species acetylene, which is evaporated from icy grain mantles. The cyanopolyynes show promise as ‘chemical clocks’ which may aid future observations in determining the age of hot-core sources: the abundance of the larger cyanopolyynes increases and decreases over relatively short time-scales, ~10^2.5 yr. We present results from a non-local thermodynamic equilibrium statistical equilibrium excitation model as a series of density-, temperature- and column-density-dependent contour plots which show both the line intensities and several line ratios. These aid in the interpretation of spectral-line data, even when limited line information is available. In particular, non-detections of HC5N and HC7N in Walsh et al. are analysed and discussed.

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the application of multivariate regression techniques to the Tennessee Eastman benchmark process for modelling and fault detection. Two methods are applied: linear partial least squares (PLS), and a nonlinear variant of this procedure using a radial basis function (RBF) inner relation. The performance of the RBF networks is enhanced through the use of a recently developed training algorithm which uses quasi-Newton optimization to ensure an efficient and parsimonious network; details of this algorithm can be found in this paper. The PLS and PLS/RBF methods are then used to create on-line inferential models of delayed process measurements. As these measurements relate to the final product composition, these models suggest that on-line statistical quality control analysis should be possible for this plant. The generation of ‘soft sensors’ for these measurements has the further effect of introducing a redundant element into the system, redundancy which can then be used to generate a fault detection and isolation scheme for these sensors. This is achieved by arranging the sensors and models in a manner comparable to the dedicated estimator scheme of Clarke et al. 1975, IEEE Trans. Aerosp. Electron. Syst., AES-14, 465-473. The effectiveness of this scheme is demonstrated on a series of simulated sensor and process faults, with full detection and isolation shown to be possible for sensor malfunctions, and detection feasible in the case of process faults. Suggestions for enhancing the diagnostic capacity in the latter case are covered towards the end of the paper.

Relevance:

30.00%

Publisher:

Abstract:

There is an increasing need to identify the rheological properties of cement grout using simple tests to determine fluidity, as well as other properties relevant to underwater applications such as washout resistance and compressive strength. This paper reviews statistical models, developed using a factorial design, of the influence of key parameters on properties affecting the performance of underwater cement grout. The responses included fluidity (mini-slump and flow time measured by Marsh cone), washout resistance, unit weight, and compressive strength. The models are valid for mixes with 0.35–0.55 water-to-binder ratio (W/B), 0.053–0.141% of antiwashout admixture (AWA), by mass of water, and 0.4–1.8% (dry extract) of superplasticizer (SP), by mass of binder. Two types of underwater grout were tested: the first made with cement only and the second made with 20% pulverised fuel ash (PFA) replacement, by mass of binder. Also presented are the derived models, which enable the identification of the underlying primary factors and their interactions that influence the modelled responses of underwater cement grout. Such parameters can be useful to reduce the test protocol needed for proportioning underwater cement grout. This paper also attempts to demonstrate the usefulness of the models for understanding trade-offs between parameters and for comparing the responses obtained from the various test methods that are highlighted.
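The factorial-design modelling approach can be sketched on a hypothetical two-level factorial: coded ±1 settings of W/B, AWA and SP, a synthetic response, and a least-squares fit of main effects plus two-factor interactions. The numbers are illustrative, not the paper's measured grout data:

```python
import numpy as np
from itertools import product

# Hypothetical 2^3 factorial in coded units (-1/+1) for W/B, AWA and SP
levels = np.array(list(product([-1, 1], repeat=3)), dtype=float)  # 8 runs
wb, awa, sp = levels.T

# Synthetic "flow time" response: three main effects plus one
# two-factor interaction (invented coefficients for illustration)
flow = 40 + 6 * wb - 3 * awa + 2 * sp + 1.5 * wb * awa

# Design matrix: intercept, main effects, two-factor interactions
Xd = np.column_stack([np.ones(8), wb, awa, sp,
                      wb * awa, wb * sp, awa * sp])
coef, *_ = np.linalg.lstsq(Xd, flow, rcond=None)
```

Because the two-level design is orthogonal, each fitted coefficient is an independent estimate of its effect, which is what makes factorial models so economical for reducing a test protocol.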

Relevance:

30.00%

Publisher:

Abstract:

To quantify how much of the coronary heart disease (CHD) mortality decline in Northern Ireland between 1987 and 2007 could be attributed to medical and surgical treatments and how much to changes in population cardiovascular risk factors.

Relevance:

30.00%

Publisher:

Abstract:

Motivated by the need to solve ecological problems (climate change, habitat fragmentation and biological invasions), there has been increasing interest in species distribution models (SDMs). Predictions from these models inform conservation policy, invasive species management and disease-control measures. However, predictions are subject to uncertainty, the degree and source of which are often unrecognized. Here, we review the SDM literature in the context of uncertainty, focusing on three main classes of SDM: niche-based models, demographic models and process-based models. We identify sources of uncertainty for each class and discuss how uncertainty can be minimized or included in the modelling process to give realistic measures of confidence around predictions. Because this has typically not been performed, we conclude that uncertainty in SDMs has often been underestimated and a false precision assigned to predictions of geographical distribution. We identify areas where development of new statistical tools will improve predictions from distribution models, notably the development of hierarchical models that link different types of distribution model and their attendant uncertainties across spatial scales. Finally, we discuss the need to develop more defensible methods for assessing predictive performance, quantifying model goodness-of-fit and assessing the significance of model covariates.

Relevance:

30.00%

Publisher:

Abstract:

The use of joint modelling approaches is becoming increasingly popular when an association exists between survival and longitudinal processes. Widely recognized for their gain in efficiency, joint models also offer a reduction in bias compared with naïve methods. With the increasing popularity comes a constantly expanding literature on joint modelling approaches. The aim of this paper is to give an overview of recent literature relating to joint models, in particular those that focus on the time-to-event survival process. A discussion is provided on the range of survival submodels that have been implemented in a joint modelling framework. A particular focus is given to the recent advancements in software used to build these models. Illustrated through the use of two different real-life data examples that focus on the survival of end-stage renal disease patients, the use of the JM and joineR packages within R are demonstrated. The possible future direction for this field of research is also discussed. © 2013 International Statistical Institute.

Relevance:

30.00%

Publisher:

Abstract:

Human occupants within indoor environments are not always stationary and their movement will lead to temporal channel variations that strongly affect the quality of indoor wireless communication systems. This paper describes a statistical channel characterization, based on experimental measurements, of human body effects on line-of-sight indoor narrowband propagation at 5.2 GHz. The analysis shows that, as the number of pedestrians within the measurement location increases, the Ricean K-factor that best fits the empirical data tends to decrease proportionally, ranging from K=7 with 1 pedestrian to K=0 with 4 pedestrians. Level crossing rate results were Rice distributed, while average fade duration results were significantly higher than theoretically computed Rice and Rayleigh, due to the fades caused by pedestrians. A novel CDF that accurately characterizes the 5.2 GHz channel in the considered indoor environment is proposed. For the first time, the received envelope CDF is explicitly described in terms of a quantitative measurement of pedestrian traffic within the indoor environment.
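An envelope-based Ricean K-factor of the sort reported above can be estimated from measured samples by the classical second/fourth-moment method; below is a sketch on simulated data with a known K (the sample size and parameterisation are assumptions, not the 5.2 GHz measurement set):

```python
import numpy as np

def k_factor_moment(env):
    """Moment-based Ricean K-factor estimate from envelope samples,
    using the second and fourth moments of the envelope."""
    p = np.asarray(env, dtype=float) ** 2      # instantaneous power
    gamma = np.var(p) / np.mean(p) ** 2
    root = np.sqrt(max(1.0 - gamma, 0.0))      # equals K/(K+1) in theory
    return root / (1.0 - root)

# Simulated Ricean envelope with a known K (illustrative only)
rng = np.random.default_rng(2)
K_true, sigma = 7.0, 1.0
nu = np.sqrt(2.0 * K_true) * sigma             # LOS amplitude for this K
n = 200_000
env = np.abs(nu + rng.normal(0, sigma, n) + 1j * rng.normal(0, sigma, n))
K_est = k_factor_moment(env)
```

The estimator follows from the identity Var[r^2] / E[r^2]^2 = (2K + 1)/(K + 1)^2 for a Ricean envelope r, so the ratio of moments pins down K without fitting the full distribution.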

Relevance:

30.00%

Publisher:

Abstract:

Identifying processes that shape species geographical ranges is a prerequisite for understanding environmental change. Currently, species distribution modelling methods do not offer credible statistical tests of the relative influence of climate factors and typically ignore other processes (e.g. biotic interactions and dispersal limitation). We use a hierarchical model fitted with Markov Chain Monte Carlo to combine ecologically plausible niche structures using regression splines to describe unimodal but potentially skewed response terms. We apply spatially explicit error terms that account for (and may help identify) missing variables. Using three example distributions of European bird species, we map model results to show sensitivity to change in each covariate. We show that the overall strength of climatic association differs between species and that each species has considerable spatial variation in both the strength of the climatic association and the sensitivity to climate change. Our methods are widely applicable to many species distribution modelling problems and enable accurate assessment of the statistical importance of biotic and abiotic influences on distributions.