46 results for model validation


Relevance: 30.00%

Abstract:

The aim of this study was to develop a predictive model for adverse drug events (ADEs) in elderly patients. Socio-demographic and medical data were collected from chart reviews, computerised information and a patient interview for a population of 929 elderly patients (aged 65 years or over) whose admission to the Waveney/Braid Valley Hospital in Northern Ireland was not scheduled. A further 204 patients formed a validation group. An ADE score was assigned to each patient using a modified Naranjo algorithm scoring system. The ADE scores ranged from 0 to 8. For the purposes of developing a risk model, scores of 4 or more were considered to constitute a high risk of an ADE.
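
The cutoff rule described above reduces to a simple threshold on the score. A minimal sketch (the scoring itself comes from the modified Naranjo algorithm; the function name and structure here are illustrative, not the study's code):

```python
def classify_ade_risk(ade_score: int, cutoff: int = 4) -> str:
    """Classify a patient's ADE risk from a modified-Naranjo-style score.

    Scores in the study ranged from 0 to 8; scores of 4 or more were
    treated as high risk when building the prediction model.
    """
    if not 0 <= ade_score <= 8:
        raise ValueError("ADE score expected in the range 0-8")
    return "high" if ade_score >= cutoff else "low"
```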

Relevance: 30.00%

Abstract:

Computer-assisted topology predictions are widely used to build low-resolution structural models of integral membrane proteins (IMPs). Experimental validation of these models by traditional methods is labor intensive and requires modifications that might alter the IMP native conformation. This work employs oxidative labeling coupled with mass spectrometry (MS) as a validation tool for computer-generated topology models. ·OH exposure introduces oxidative modifications in solvent-accessible regions, whereas buried segments (e.g., transmembrane helices) are non-oxidizable. The Escherichia coli protein WaaL (O-antigen ligase) is predicted to have 12 transmembrane helices and a large extramembrane domain (Pérez et al., Mol. Microbiol. 2008, 70, 1424). Tryptic digestion and LC-MS/MS were used to map the oxidative labeling behavior of WaaL. Met and Cys exhibit high intrinsic reactivities with ·OH, making them sensitive probes for solvent accessibility assays. Overall, the oxidation pattern of these residues is consistent with the originally proposed WaaL topology. One residue (M151), however, undergoes partial oxidation despite being predicted to reside within a transmembrane helix. Using an improved computer algorithm, a slightly modified topology model was generated that places M151 closer to the membrane interface. On the basis of the labeling data, it is concluded that the refined model more accurately reflects the actual topology of WaaL. We propose that the combination of oxidative labeling and MS represents a useful strategy for assessing the accuracy of IMP topology predictions, supplementing data obtained in traditional biochemical assays. In the future, it might be possible to incorporate oxidative labeling data directly as constraints in topology prediction algorithms.
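
The consistency check behind this validation can be sketched as a comparison between predicted burial and observed labeling: a residue predicted to sit inside a transmembrane helix should not be oxidized. The helix ranges and residue positions below are illustrative placeholders, not the actual WaaL topology:

```python
def flag_inconsistent_residues(tm_helices, oxidized_positions):
    """Return residue positions that show oxidative modification yet are
    predicted to lie inside a transmembrane (TM) helix.

    tm_helices: list of (start, end) residue ranges predicted to be TM.
    oxidized_positions: positions of Met/Cys residues with observed
    oxidation, i.e. solvent-accessible in the labeling experiment.
    """
    def buried(pos):
        return any(start <= pos <= end for start, end in tm_helices)
    return sorted(p for p in oxidized_positions if buried(p))

# Illustrative data only: a residue at position 151 is oxidized yet
# predicted buried, mirroring the M151 discrepancy discussed above.
helices = [(10, 32), (140, 162)]
oxidized = [5, 151, 200]
```

A non-empty result flags positions where the topology model may need refinement, which is how the M151 observation motivated the revised model.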

Relevance: 30.00%

Abstract:

A reconfigurable reflectarray which exploits the dielectric anisotropy of liquid crystals (LC) has been designed to operate in the frequency range from 96 to 104 GHz. The unit cells are composed of three unequal-length parallel dipoles placed above an LC substrate. The reflectarray has been designed using an accurate model which includes the effects of anisotropy and inhomogeneity. An effective permittivity that accounts for the real effects of the LC has also been used to simplify the analysis and design of the unit cells. The geometrical parameters of the cells have been adjusted to simultaneously improve the bandwidth, maximize the tunable phase range and reduce the sensitivity to the angle of incidence. The performance of the LC-based unit cells has been experimentally evaluated by measuring the reflection amplitude and phase of a reflectarray consisting of 52 × 54 identical cells. The good agreement between measurements and simulations validates the analysis and design techniques and demonstrates the capabilities of the proposed reflectarray to provide beam scanning in F band.
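
A common simplification for LC cells is to treat the tunable effective permittivity as moving between the two principal values (perpendicular and parallel) with the bias state, which in turn shifts the cell's resonance and hence its reflection phase. The linear mixing rule and numbers below are illustrative assumptions for a sketch, not the effective-permittivity model used in the paper:

```python
import math

def effective_permittivity(eps_perp, eps_par, tuning):
    """Linear mixing between the two principal LC permittivities.

    tuning = 0 -> unbiased state (eps_perp); tuning = 1 -> fully
    biased state (eps_par). A linear law is a crude simplification.
    """
    if not 0.0 <= tuning <= 1.0:
        raise ValueError("tuning must lie in [0, 1]")
    return eps_perp + tuning * (eps_par - eps_perp)

def resonance_estimate(f0, eps_ref, eps_new):
    """First-order estimate: resonant frequency scales as 1/sqrt(eps)."""
    return f0 * math.sqrt(eps_ref / eps_new)
```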

Relevance: 30.00%

Abstract:

Fuel economy has become an important consideration in forklift truck design, particularly in Europe. A simulation of the fuel consumption and performance of a forklift truck has been developed, validated and subsequently used to determine the energy consumed by individual powertrain components during drive cycles.
The truck used in this study has a rated lifting capacity of 2500 kg and is powered by a 2.6 litre naturally aspirated diesel engine with a fuel pump containing a mechanical variable-speed governor. The drivetrain consists of a torque converter, hydraulic clutch and single-speed transmission.
AVL Cruise was used to simulate the vehicle powertrain, with coupled MathWorks Simulink models simulating the hydraulic and control systems and the governor. The vehicle has been simulated on several performance and fuel consumption drive cycles, with the main focus being the VDI 2198 fuel consumption drive cycle.
To validate the model, a truck was instrumented and measurements taken to compare the performance and instantaneous fuel consumption to simulated values. The fuel injector pump was modified and calibrated to enable instantaneous fuel flow to be measured.
The model has been validated to within acceptable limits and has been used to investigate the effect that four different torque converters have on the fuel consumption and performance of the forklift truck. The study demonstrates how the model can be used to compare fuel consumption and performance trade-offs when selecting drivetrain components.
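
At its simplest, the drive-cycle fuel figure comes from integrating instantaneous fuel flow over the cycle. The trapezoidal sketch below illustrates that step only; it is not the AVL Cruise/Simulink co-simulation itself:

```python
def cycle_fuel_consumption(times_s, fuel_flow_lph):
    """Integrate instantaneous fuel flow (litres/hour) over a drive
    cycle sampled at times_s (seconds), using the trapezoidal rule.

    Returns the total fuel used in litres; the /3600 converts the
    flow from per-hour to per-second before integrating.
    """
    total = 0.0
    for (t0, q0), (t1, q1) in zip(zip(times_s, fuel_flow_lph),
                                  zip(times_s[1:], fuel_flow_lph[1:])):
        total += 0.5 * (q0 + q1) * (t1 - t0) / 3600.0
    return total
```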

Relevance: 30.00%

Abstract:

Model selection between competing models is a key consideration in the discovery of prognostic multigene signatures. The use of appropriate statistical performance measures, as well as verification of the biological significance of the signatures, is imperative to maximise the chance of external validation of the generated signatures. Current approaches in time-to-event studies often use only a single measure of performance in model selection, such as log-rank test p-values, or dichotomise the follow-up times at some phase of the study to facilitate signature discovery. In this study we improve the prognostic signature discovery process through the application of the multivariate partial Cox model combined with the concordance index, hazard ratio of predictions, independence from available clinical covariates and biological enrichment as measures of signature performance. The proposed framework was applied to discover prognostic multigene signatures from early breast cancer data. The partial Cox model, combined with the multiple performance measures, was used both in guiding the selection of the optimal panel of prognostic genes and in predicting risk within cross-validation, without dichotomising the follow-up times at any stage. The signatures were successfully externally cross-validated in independent breast cancer datasets, yielding a hazard ratio of 2.55 [1.44, 4.51] for the top-ranking signature.
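
The concordance index used here as one of the performance measures can be computed, in its basic censored form, as the fraction of usable patient pairs whose predicted risks are ordered consistently with their survival times. A plain-Python sketch (pairs with tied event times are skipped for brevity):

```python
def concordance_index(times, events, risks):
    """Harrell-style concordance index for right-censored data.

    times: observed follow-up times.
    events: 1 if the event (e.g. relapse/death) occurred, 0 if censored.
    risks: predicted risk scores (higher = worse prognosis).
    A pair is usable when the shorter observed time ends in an event;
    it is concordant when that patient also has the higher risk.
    """
    concordant, tied, usable = 0, 0, 0
    n = len(times)
    for i in range(n):
        for j in range(i + 1, n):
            # order the pair so 'a' has the shorter observed time
            a, b = (i, j) if times[i] < times[j] else (j, i)
            if times[a] == times[b] or events[a] == 0:
                continue  # pair not usable (simplified tie handling)
            usable += 1
            if risks[a] > risks[b]:
                concordant += 1
            elif risks[a] == risks[b]:
                tied += 1
    return (concordant + 0.5 * tied) / usable
```

A value of 0.5 corresponds to random ordering and 1.0 to perfect concordance, which is why it complements a single log-rank p-value in model selection.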

Relevance: 30.00%

Abstract:

Background: A previously described economic model was based on average values for patients diagnosed with chronic periodontitis (CP). However, tooth loss varies among treated patients, and factors for tooth loss include CP severity and risk. The model was refined to incorporate CP severity and risk, to determine the cost of treating a specific level of CP severity and risk that is associated with the benefit of tooth preservation.

Methods: A population that received and another that did not receive periodontal treatment were used to determine treatment costs and tooth loss. The number of teeth preserved was the difference of the number of teeth lost between the two populations. The cost of periodontal treatment was divided by the number of teeth preserved for combinations of CP severity and risk.
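
The core arithmetic of the model is a division of treatment cost by teeth preserved, where teeth preserved is the difference in tooth loss between the untreated and treated populations. A minimal sketch (the figures used in any call are placeholders, not data from the study):

```python
def cost_per_tooth_preserved(treatment_cost, teeth_lost_untreated,
                             teeth_lost_treated):
    """Cost of periodontal treatment per tooth preserved.

    Teeth preserved = teeth lost in the untreated population minus
    teeth lost in the treated population (for a given CP severity
    and risk stratum).
    """
    preserved = teeth_lost_untreated - teeth_lost_treated
    if preserved <= 0:
        raise ValueError("treatment preserved no teeth in this stratum")
    return treatment_cost / preserved
```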

Results: The cost of periodontal treatment divided by the number of teeth preserved ranged from (US) $1,405 to $4,895 for high or moderate risk combined with any severity of CP, and was more than $8,639 for low risk combined with mild CP. The cost of a three-unit bridge was $3,416, and the cost of a single-tooth replacement was $4,787.

Conclusion: Periodontal treatment could be justified on the sole basis of tooth preservation when CP risk is moderate or high regardless of disease severity.

Relevance: 30.00%

Abstract:

An intralaminar damage model (IDM), based on continuum damage mechanics, was developed for the simulation of composite structures subjected to damaging loads. The model can capture the complex intralaminar damage mechanisms, accounting for mode interactions, and delaminations. Its development is driven by a requirement for reliable crush simulations to design composite structures with a high specific energy absorption. The IDM was implemented as a user subroutine within the commercial finite element package Abaqus/Explicit [1]. In this paper, the validation of the IDM is presented using two test cases. Firstly, the IDM is benchmarked against published data for a blunt-notched specimen under uniaxial tensile loading, comparing the failure strength as well as the predicted damage. Secondly, the crush response of a set of tulip-triggered composite cylinders was obtained experimentally. The crush loading and the associated energy of the specimens are compared with the FE model prediction. These test cases show that the developed IDM is able to capture the structural response with satisfactory accuracy.
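
In continuum damage mechanics formulations of this kind, a damage variable degrades the material stiffness as loading progresses. The one-dimensional scalar sketch below illustrates the general idea only; the paper's actual constitutive law tracks separate damage modes and their interactions:

```python
def degraded_stress(strain, youngs_modulus, damage):
    """1-D damaged elasticity: sigma = (1 - d) * E * eps.

    The damage variable d grows from 0 (intact) to 1 (fully failed);
    real intralaminar models track one d per failure mode.
    """
    if not 0.0 <= damage <= 1.0:
        raise ValueError("damage variable must lie in [0, 1]")
    return (1.0 - damage) * youngs_modulus * strain

def update_damage(current_d, trial_d):
    """Damage is irreversible: it can only grow, capped at 1."""
    return max(current_d, min(trial_d, 1.0))
```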

Relevance: 30.00%

Abstract:

This paper reports the detailed description and validation of a fully automated, computer-controlled analytical method to spatially probe the gas composition and thermal characteristics in packed bed systems. The method has been designed to limit the invasiveness of the probe, a characteristic assessed using CFD. The thermocouple is aligned with the sampling holes to enable simultaneous recording of the gas composition and temperature profiles. The analysis technique has been validated by studying CO oxidation over a 1% Pt/Al2O3 catalyst. The resultant profiles have been compared with a micro-kinetic model to further assess the strength of the technique.
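
As a toy counterpart to the micro-kinetic comparison, a first-order plug-flow expression gives the kind of axial conversion profile that this spatial probing resolves. The kinetics below are a generic illustration, not the Pt/Al2O3 micro-kinetic model from the paper:

```python
import math

def co_conversion_profile(k, residence_times):
    """First-order plug-flow conversion X(tau) = 1 - exp(-k * tau).

    k: illustrative first-order rate constant (1/s).
    residence_times: cumulative gas residence time at each axial
    sampling position along the packed bed (s).
    """
    return [1.0 - math.exp(-k * tau) for tau in residence_times]
```

Comparing such a modelled profile point-by-point with the sampled gas composition is the essence of the validation described above.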

Relevance: 30.00%

Abstract:

The paper addresses the issue of the choice of bandwidth in the application of semiparametric estimation of the long memory parameter in a univariate time series process. The focus is on the properties of forecasts from the long memory model. A variety of cross-validation methods based on out-of-sample forecasting properties are proposed. These procedures are used for the choice of bandwidth and subsequent model selection. Simulation evidence is presented that demonstrates the advantage of the proposed new methodology.
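
The out-of-sample idea can be sketched as rolling-origin cross-validation: for each candidate tuning parameter (standing in for the bandwidth), produce one-step-ahead forecasts and keep the parameter with the lowest forecast error. The exponential-smoothing "model" below is a deliberate stand-in, not the semiparametric long-memory estimator from the paper:

```python
def one_step_forecasts(series, alpha):
    """Simple exponential smoothing used as a stand-in forecaster.

    Returns forecasts[i], the prediction for series[i + 1].
    """
    level = series[0]
    forecasts = []
    for y in series[1:]:
        forecasts.append(level)          # forecast for the next point
        level = alpha * y + (1 - alpha) * level
    return forecasts

def select_by_forecast_cv(series, candidates, holdout):
    """Pick the candidate parameter minimising mean squared one-step
    forecast error over the last `holdout` observations."""
    def mse(alpha):
        f = one_step_forecasts(series, alpha)[-holdout:]
        actual = series[-holdout:]
        return sum((a - b) ** 2 for a, b in zip(actual, f)) / holdout
    return min(candidates, key=mse)
```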

Relevance: 30.00%

Abstract:

The term 'fatigue loads' on the Oyster Oscillating Wave Surge Converter (OWSC) is used to describe hydrostatic loads due to water surface elevation with quasi-static changes of state. A procedure to implement hydrostatic pressure distributions in finite element analysis of the structure is therefore desired. Currently available experimental methods enable one to measure time-variant water surface elevation at discrete locations either on or around the body of the scale model during tank tests. This paper discusses the development of a finite element analysis procedure to implement time-variant, spatially distributed hydrostatic pressure derived from discretely measured water surface elevation. The developed method can process differently resolved (temporal and spatial) input data and approximate the elevation over the flap faces with user-defined properties. The structural loads, namely the forces and moments on the body, can then be investigated by post-processing the numerical results. This method offers the possibility to process surface elevation or hydrostatic pressure data from computational fluid dynamics simulations and can thus be seen as a first step towards a fluid-structure interaction model.
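
The pressure mapping itself is straightforward hydrostatics, p = rho * g * (eta - z) below the instantaneous surface elevation eta. The linear interpolation between discrete gauges below is an illustrative stand-in for the paper's approximation scheme:

```python
RHO_WATER = 1000.0  # kg/m^3, fresh water in a wave tank (assumption)
G = 9.81            # m/s^2

def interpolate_elevation(x, gauge_x, gauge_eta):
    """Linearly interpolate surface elevation between wave gauges
    positioned at gauge_x (m) reading elevations gauge_eta (m)."""
    if not gauge_x[0] <= x <= gauge_x[-1]:
        raise ValueError("x outside the instrumented span")
    for (x0, e0), (x1, e1) in zip(zip(gauge_x, gauge_eta),
                                  zip(gauge_x[1:], gauge_eta[1:])):
        if x0 <= x <= x1:
            w = 0.0 if x1 == x0 else (x - x0) / (x1 - x0)
            return e0 + w * (e1 - e0)

def hydrostatic_pressure(x, z, gauge_x, gauge_eta):
    """Quasi-static pressure (Pa) at height z (m, from still water
    level); zero above the instantaneous free surface."""
    eta = interpolate_elevation(x, gauge_x, gauge_eta)
    return max(0.0, RHO_WATER * G * (eta - z))
```

Evaluating this at the FE mesh nodes of the flap faces, per time step, gives the time-variant load field the abstract describes.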

Relevance: 30.00%

Abstract:

Statistical downscaling (SD) methods have become a popular, low-cost and accessible means of bridging the gap between the coarse spatial resolution at which climate models output climate scenarios and the finer spatial scale at which impact modellers require these scenarios, with various SD techniques used for a wide range of applications across the world. This paper compares the Generator for Point Climate Change (GPCC) model and the Statistical DownScaling Model (SDSM), two contrasting SD methods, in terms of their ability to generate precipitation series under non-stationary conditions across ten contrasting global climates. The mean, maximum and a selection of distribution statistics, as well as the cumulative frequencies of dry and wet spells for four different temporal resolutions, were compared between the models and the observed series for a validation period. Results indicate that both methods can generate daily precipitation series that generally closely mirror observed series for a wide range of non-stationary climates. However, GPCC tends to overestimate higher precipitation amounts, whilst SDSM tends to underestimate these. This implies that GPCC is more likely to overestimate the effects of precipitation on a given impact sector, whilst SDSM is likely to underestimate the effects. GPCC performs better than SDSM in reproducing wet and dry day frequency, which is a key advantage for many impact sectors. Overall, the mixed performance of the two methods illustrates the importance of users performing a thorough validation in order to determine the influence of simulated precipitation on their chosen impact sector.
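
Validation statistics of the kind compared here are simple to compute from a daily series. The 1 mm wet-day threshold below is a common convention and an assumption for this sketch, not necessarily the threshold used in the paper:

```python
def wet_day_frequency(daily_precip_mm, threshold=1.0):
    """Fraction of days with precipitation at or above the threshold."""
    wet = sum(1 for p in daily_precip_mm if p >= threshold)
    return wet / len(daily_precip_mm)

def spell_lengths(daily_precip_mm, threshold=1.0, wet=True):
    """Lengths of consecutive wet (or, with wet=False, dry) spells."""
    lengths, run = [], 0
    for p in daily_precip_mm:
        if (p >= threshold) == wet:
            run += 1
        elif run:
            lengths.append(run)
            run = 0
    if run:
        lengths.append(run)
    return lengths
```

Computing these for both the downscaled and observed series, and comparing their cumulative frequencies, is the style of check the validation period supports.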

Relevance: 30.00%

Abstract:

This article examines the influence on the engineering design process of the primary objective of validation, whether that is proving a model, a technology or a product. Through the examination of a number of stiffened panel case studies, the relationships between simulation, validation, design and the final product are established and discussed. The work demonstrates the complex interactions between the original (or anticipated) design model, the analysis model, the validation activities and the product in service. The outcome clearly shows some unintended consequences. High-fidelity validation test simulations require a different set of detailed parameters to accurately capture behaviour. By doing so, there is a divergence from the original computer-aided design model, intrinsically limiting the value of the validation with respect to the product. This work represents a shift from the traditional perspective of encapsulating and controlling errors between simulation and experimental test to consideration of the wider design-test process. Specifically, it is a reflection on the implications of how models are built and validated, and the effect on results and understanding of structural behaviour. The article then identifies key checkpoints in the design process and how these should be used to update the computer-aided design system parameters for a design. This work strikes at a fundamental challenge in understanding the interaction between design, certification and operation of any complex system.

Relevance: 30.00%

Abstract:

An experimental investigation is carried out to verify the feasibility of using an instrumented vehicle to detect and monitor bridge dynamic parameters. The low-cost method consists of the use of a moving vehicle fitted with accelerometers on its axles. In the laboratory experiment, the vehicle–bridge interaction model consists of a scaled two-axle vehicle model crossing a simply supported steel beam. The bridge model also includes a scaled road surface profile. The effects of varying the vehicle model configuration and speed are investigated. A finite element beam model is calibrated using the experimental results, and a novel algorithm for the identification of global bridge stiffness is validated. Using measured vehicle accelerations as input to the algorithm, the beam stiffness is identified with a reasonable degree of accuracy.
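
For a simply supported beam the relation between the first natural frequency and the flexural stiffness EI is closed-form, which is the basic idea behind back-calculating global stiffness from measured response. The formulas below are the textbook Euler-Bernoulli result; the paper's identification algorithm, driven by axle accelerations, is considerably more involved:

```python
import math

def first_natural_frequency(EI, mass_per_length, length):
    """First bending frequency (Hz) of a simply supported
    Euler-Bernoulli beam: f1 = (pi / (2 L^2)) * sqrt(EI / m)."""
    return (math.pi / (2.0 * length ** 2)) * math.sqrt(EI / mass_per_length)

def stiffness_from_frequency(f1, mass_per_length, length):
    """Invert the relation to identify EI from a measured f1 (Hz)."""
    return mass_per_length * (2.0 * length ** 2 * f1 / math.pi) ** 2
```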

Relevance: 30.00%

Abstract:

In this paper, a novel and effective lip-based biometric identification approach using the Discrete Hidden Markov Model Kernel (DHMMK) is developed. Lips are described by shape features (both geometrical and sequential) on two different grid layouts: rectangular and polar. These features are then specifically modeled by a DHMMK and learnt by a support vector machine classifier. Our experiments are carried out in a ten-fold cross-validation fashion on three different datasets: the GPDS-ULPGC Face Dataset, the PIE Face Dataset and the RaFD Face Dataset. Results show that our approach achieved average classification accuracies of 99.8%, 97.13% and 98.10% on these three datasets, respectively, using only two training images per class. Our comparative studies further show that the DHMMK achieved a 53% improvement over the baseline HMM approach. The comparative ROC curves also confirm the efficacy of the proposed lip-contour-based biometrics learned by the DHMMK. We also show that the performance of linear and RBF SVMs is comparable under the framework of the DHMMK.
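
The ten-fold protocol can be sketched as a plain partition of sample indices into disjoint test folds. This generic split is illustrative and independent of the DHMMK/SVM pipeline itself (shuffling and per-class stratification are omitted for brevity):

```python
def k_fold_indices(n_samples, k=10):
    """Partition sample indices 0..n-1 into k (train, test) splits.

    Fold sizes differ by at most one, and every index appears in
    exactly one test fold.
    """
    folds = [list(range(i, n_samples, k)) for i in range(k)]
    splits = []
    for i, test in enumerate(folds):
        train = [idx for j, f in enumerate(folds) if j != i for idx in f]
        splits.append((sorted(train), test))
    return splits
```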

Relevance: 30.00%

Abstract:

Models of ground source heat pump (GSHP) systems are used as an aid to the correct design and optimization of the system. For this purpose, it is necessary to develop models which correctly reproduce the dynamic thermal behavior of each component on a short-term basis. Since the borehole heat exchanger (BHE) is one of the main components, special attention should be paid to ensuring good accuracy in the prediction of the short-term response of the boreholes. The BHE models found in the literature which are suitable for short-term simulations usually present high computational costs. In this work, a novel TRNSYS type implementing a borehole-to-ground (B2G) model, developed for modeling the short-term dynamic performance of a BHE at low computational cost, is presented. The model has been validated against experimental data from a GSHP system located at Universitat Politècnica de València, Spain. Validation results show the ability of the model to reproduce the short-term behavior of the borehole, both for a step test and under normal operating conditions.
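
Short-term BHE models of this kind typically reduce the borehole to a small thermal network. The single-node resistance-capacitance step below is a generic sketch of that idea, not the B2G model's actual discretisation:

```python
def step_borehole_node(T_node, T_fluid, T_ground, R_fluid, R_ground,
                       capacitance, dt):
    """Explicit-Euler update of one borehole-filling temperature node.

    Heat flows in from the circulating fluid through resistance
    R_fluid and out to the surrounding ground through R_ground (K/W);
    the node stores heat in `capacitance` (J/K) over time step dt (s).
    """
    q_in = (T_fluid - T_node) / R_fluid
    q_out = (T_node - T_ground) / R_ground
    return T_node + dt * (q_in - q_out) / capacitance
```

Chaining several such nodes radially (grout, borehole wall, near-field ground) is one common way to capture the short-term dynamics that long-time-scale g-function models miss.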