205 results for external validation

in QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast


Relevance: 100.00%

Abstract:

To independently evaluate and compare the performance of the Ocular Hypertension Treatment Study-European Glaucoma Prevention Study (OHTS-EGPS) prediction equation for estimating the 5-year risk of open-angle glaucoma (OAG) in four cohorts of adults with ocular hypertension.
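The independent evaluation described above hinges on how well the prediction equation discriminates between those who do and do not develop OAG in a new cohort. A minimal sketch of one standard discrimination measure, the c-statistic, is given below; the risk values and outcomes are illustrative, and the OHTS-EGPS coefficients themselves are not reproduced.

```python
# Hedged sketch: discrimination of a 5-year risk equation in an external
# cohort, measured with the c-statistic. Data below are illustrative.

def c_statistic(risks, outcomes):
    """Probability that a randomly chosen case received a higher
    predicted risk than a randomly chosen non-case (ties count half)."""
    cases = [r for r, y in zip(risks, outcomes) if y == 1]
    controls = [r for r, y in zip(risks, outcomes) if y == 0]
    concordant = ties = 0
    for rc in cases:
        for rn in controls:
            if rc > rn:
                concordant += 1
            elif rc == rn:
                ties += 1
    return (concordant + 0.5 * ties) / (len(cases) * len(controls))

# Illustrative external cohort: predicted 5-year OAG risks and observed onset.
risks = [0.05, 0.12, 0.30, 0.08, 0.45, 0.35]
outcomes = [0, 0, 1, 0, 1, 0]
print(c_statistic(risks, outcomes))  # 0.875
```

A c-statistic near 0.5 would indicate no better than chance discrimination in the new cohort; values toward 1.0 indicate that the equation ranks risks correctly.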

Relevance: 100.00%

Abstract:

To test the applicability of the sex-specific 2008 Framingham general cardiovascular risk equation for coronary heart disease (CHD) and stroke in European middle-aged men from Ireland and France.
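Testing the applicability of a risk equation in a new population usually includes a calibration check: do predicted risks match observed event rates? The sketch below shows one common summary, the expected-to-observed (E/O) ratio, with illustrative numbers rather than actual Framingham outputs.

```python
# Hedged sketch: calibration of a risk equation in a new cohort via the
# expected/observed (E/O) event ratio. Predicted risks are illustrative,
# not computed from the Framingham coefficients.

def expected_observed_ratio(predicted_risks, events):
    """Sum of predicted probabilities over the observed event count.
    A ratio > 1 suggests the equation over-predicts risk in this cohort;
    < 1 suggests under-prediction."""
    expected = sum(predicted_risks)
    observed = sum(events)
    return expected / observed

predicted = [0.10, 0.05, 0.20, 0.15, 0.30]  # illustrative CHD risks
events = [0, 0, 1, 0, 1]                     # observed CHD events
print(expected_observed_ratio(predicted, events))  # ~0.4: under-prediction
```

In a cross-population comparison such as Ireland versus France, the E/O ratio can differ by country even when discrimination is similar, which is precisely why applicability must be tested.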

Relevance: 60.00%

Abstract:

Model selection between competing models is a key consideration in the discovery of prognostic multigene signatures. The use of appropriate statistical performance measures, as well as verification of the biological significance of the signatures, is imperative to maximise the chance of external validation of the generated signatures. Current approaches in time-to-event studies often use only a single measure of performance in model selection, such as log-rank test p-values, or dichotomise the follow-up times at some phase of the study to facilitate signature discovery. In this study we improve the prognostic signature discovery process through the application of the multivariate partial Cox model combined with the concordance index, the hazard ratio of predictions, independence from available clinical covariates, and biological enrichment as measures of signature performance. The proposed framework was applied to discover prognostic multigene signatures from early breast cancer data. The partial Cox model combined with the multiple performance measures was used both to guide the selection of the optimal panel of prognostic genes and to predict risk within cross-validation, without dichotomising the follow-up times at any stage. The signatures were successfully externally validated in independent breast cancer datasets, yielding a hazard ratio of 2.55 [1.44, 4.51] for the top-ranking signature.
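The concordance index used as one of the performance measures above handles censored follow-up without dichotomising it. A minimal sketch of Harrell's concordance index follows; the times, censoring indicators and risk scores are illustrative.

```python
# Hedged sketch: Harrell's concordance index for censored survival data,
# one of the signature-performance measures combined in the abstract.
# All data below are illustrative.

def concordance_index(times, events, scores):
    """Fraction of usable pairs in which the subject with the shorter
    observed event time has the higher predicted risk score. A pair is
    usable only if the earlier of the two times is an observed event."""
    concordant = ties = usable = 0
    n = len(times)
    for i in range(n):
        for j in range(n):
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if scores[i] > scores[j]:
                    concordant += 1
                elif scores[i] == scores[j]:
                    ties += 1
    return (concordant + 0.5 * ties) / usable

times = [5, 10, 12, 18, 25]          # months to event or censoring
events = [1, 1, 0, 1, 0]             # 1 = event observed, 0 = censored
scores = [2.1, 0.9, 1.7, 1.2, 0.3]   # predicted risk from a signature
print(concordance_index(times, events, scores))  # 0.75
```

A value of 0.5 is chance-level ranking; the closer to 1.0, the better the signature orders patients by event time, which is why it complements the hazard ratio rather than a single log-rank p-value.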

Relevance: 60.00%

Abstract:

In the context of products from certain regions or countries being banned because of an identified or non-identified hazard, proof of geographical origin is essential with regard to feed and food safety issues. Usually, the product labeling of an affected feed lot shows origin, and the paper documentation shows traceability. Incorrect product labeling is common in embargo situations, however, and alternative analytical strategies for controlling feed authenticity are therefore needed. In this study, distillers dried grains with solubles (DDGS) were chosen as the product on which to base a comparison of analytical strategies aimed at identifying the most appropriate one. Various analytical techniques were investigated for their ability to authenticate DDGS, including spectroscopic and spectrometric techniques combined with multivariate data analysis, as well as proven techniques for authenticating food, such as DNA analysis and stable isotope ratio analysis. An external validation procedure (called the system challenge) was used to analyze sample sets blind and to compare analytical techniques. All the techniques were adapted so as to be applicable to the DDGS matrix. They produced positive results in determining the botanical origin of DDGS (corn vs. wheat), and several of them were able to determine the geographical origin of the DDGS in the sample set. The maintenance and extension of the databanks generated in this study through the analysis of new authentic samples from a single location are essential in order to monitor developments and processing that could affect authentication.
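The spectroscopic strategies above pair measured spectra with multivariate classification to assign origin. As a toy illustration of the idea (not the study's actual chemometric pipeline, which would use full spectra and methods such as PLS-DA), the sketch below classifies a blind sample by nearest class centroid on three invented intensity bands.

```python
# Hedged sketch: multivariate classification of botanical origin
# (corn vs. wheat DDGS) from spectral features. The 3-band "spectra",
# class names and values are all illustrative toy data.

def centroid(spectra):
    """Mean spectrum of a training class."""
    n = len(spectra)
    return [sum(s[k] for s in spectra) / n for k in range(len(spectra[0]))]

def classify(sample, centroids):
    """Assign the class whose centroid is nearest in Euclidean distance."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))

# Toy training spectra (3 intensity bands) for each botanical origin.
corn_train = [[0.9, 0.2, 0.4], [0.8, 0.3, 0.5]]
wheat_train = [[0.3, 0.7, 0.6], [0.2, 0.8, 0.7]]
centroids = {"corn": centroid(corn_train), "wheat": centroid(wheat_train)}

# A blind sample, as in the "system challenge" external validation.
print(classify([0.85, 0.25, 0.45], centroids))  # corn
```

The "system challenge" in the abstract plays the role of the blind sample here: the classifier is judged on samples it never saw during databank construction.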

Relevance: 60.00%

Abstract:

Poor sleep is increasingly recognised as an important prognostic parameter of health. Patients with suspected sleep disorders are referred to sleep clinics, which guide treatment. However, sleep clinics are not always a viable option due to their high cost, a lack of experienced practitioners, lengthy waiting lists and an unrepresentative sleeping environment. A home-based non-contact sleep/wake monitoring system may be used as a guide for treatment, potentially stratifying patients by clinical need or highlighting longitudinal changes in sleep and nocturnal patterns. This paper presents the evaluation of an under-mattress sleep monitoring system for non-contact sleep/wake discrimination. A large dataset of sensor data with concomitant sleep/wake state was collected from both younger and older adults participating in a circadian sleep study. A thorough training/testing/validation procedure was configured, and optimised feature-extraction and sleep/wake discrimination algorithms were evaluated both within and across the two cohorts. An accuracy, sensitivity and specificity of 74.3%, 95.5% and 53.2% are reported over all subjects using an external validation dataset (71.9%, 87.9% and 56%, and 77.5%, 98% and 57%, are reported for younger and older subjects respectively). These results compare favourably with similar research; however, this system provides an ambient alternative suitable for long-term continuous sleep monitoring, particularly amongst vulnerable populations.
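The reported accuracy, sensitivity and specificity follow directly from the epoch-level confusion counts. The sketch below shows the standard definitions with illustrative counts chosen to mirror the headline figures; they are not the study's data.

```python
# Hedged sketch: accuracy, sensitivity and specificity for a sleep/wake
# classifier from confusion-matrix counts. Epoch counts are illustrative.

def classifier_metrics(tp, fn, tn, fp):
    """Sleep taken as the positive class: sensitivity = sleep epochs
    correctly scored as sleep; specificity = wake epochs scored as wake."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return accuracy, sensitivity, specificity

# Illustrative: 955 of 1000 sleep epochs scored as sleep; 532 of 1000
# wake epochs scored as wake (the rest mis-scored as sleep).
acc, sens, spec = classifier_metrics(tp=955, fn=45, tn=532, fp=468)
print(acc, sens, spec)  # 0.7435 0.955 0.532
```

Note how a high sensitivity with modest specificity, as reported, is characteristic of actigraphy-style monitors: wake spent lying still is easily mistaken for sleep.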

Relevance: 30.00%

Abstract:

Knowledge production in entrepreneurship requires inclusivity as well as diversity and pluralism in research perspectives and approaches. In this article, the authors address concerns about interpretivist research regarding validity, reliability, objectivity, generalizability, and communicability of results that militate against its more widespread acceptance. Following the nonfoundationalist argument that all observation is theory-laden, context specific, and that there are no external criteria against which to assess research design and execution and the data produced, the authors propose that quality must be internalized within the underlying research philosophy rather than something to be tested upon completion. This requires a shift from the notion of validity as an outcome to validation as a process. To elucidate this, they provide a guiding framework and present a case illustration that will assist an interpretivist entrepreneurship researcher to establish and demonstrate the quality of their work.

Relevance: 30.00%

Abstract:

Photographs have been used to enhance consumer reporting of meat doneness preference; however, the use of photographs has not been validated for this purpose. This study used standard cooking methods to produce steaks of five different degrees of doneness (rare, medium, medium well, well done and very well done) to study consumers' perception of doneness, from both the external and internal surface of the cooked steak and also from corresponding photographs of each sample. Consumers evaluated each surface of the cooked steaks in relation to doneness for acceptability, 'just about right' and perception of doneness. Data were analysed using a split-plot ANOVA and a least significant difference test. Perception scores (for both external and internal surfaces) were not significantly different between presentation methods (steak samples and corresponding photographs) (p > 0.05). The result indicates that photographs can be used as a valid approach for assessing preference for meat doneness.

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the impedance cardiogram recorded by an automated external defibrillator during cardiac arrest, with the aim of facilitating emergency care by lay persons. Lay persons are poor at emergency pulse checks (sensitivity 84%, specificity 36%); guidelines recommend that they should not be performed. The impedance cardiogram (dZ/dt) is used to indicate stroke volume. Can an impedance cardiogram algorithm in a defibrillator rapidly determine circulatory arrest and facilitate prompt initiation of external cardiac massage?

DESIGN: Clinical study.

SETTING: University hospital.

PATIENTS: Phase 1 patients attended for myocardial perfusion imaging. Phase 2 patients were recruited during cardiac arrest. This group included nonarrest controls.

INTERVENTIONS: The impedance cardiogram was recorded through defibrillator/electrocardiographic pads oriented in the standard cardiac arrest position.

MEASUREMENTS AND MAIN RESULTS: Phase 1: Stroke volumes from gated myocardial perfusion imaging scans were correlated with parameters from the impedance cardiogram system (dZ/dt(max) and the peak amplitude of the Fast Fourier Transform of dZ/dt between 1.5 Hz and 4.5 Hz). Multivariate analysis was performed to fit stroke volumes from gated myocardial perfusion imaging scans with linear and quadratic terms for dZ/dt(max) and the Fast Fourier Transform to identify significant parameters for incorporation into a cardiac arrest diagnostic algorithm. The square of the peak amplitude of the Fast Fourier Transform of dZ/dt was the best predictor of reduction in stroke volumes from gated myocardial perfusion imaging scans (range = 33-85 mL; p = .016). Having established that the two-pad impedance cardiogram system could detect differences in stroke volumes from gated myocardial perfusion imaging scans, we assessed its performance in diagnosing cardiac arrest. Phase 2: The impedance cardiogram was recorded in 132 "cardiac arrest" patients (53 training, 79 validation) and 97 controls (47 training, 50 validation): the diagnostic algorithm indicated cardiac arrest with sensitivities and specificities (± exact 95% confidence intervals) of 89.1% (85.4-92.1) and 99.6% (99.4-99.7; training) and 81.1% (77.6-84.3) and 97% (96.7-97.4; validation).

CONCLUSIONS: The impedance cardiogram algorithm is a significant marker of circulatory collapse. Automated defibrillators with an integrated impedance cardiogram could improve emergency care by lay persons, enabling rapid and appropriate initiation of external cardiac massage.
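The Phase 1 predictor above is the peak spectral amplitude of dZ/dt in the 1.5-4.5 Hz band. The sketch below shows how such a parameter can be extracted with an FFT; the synthetic signal, sampling rate and amplitudes are illustrative, not the study's recordings or algorithm.

```python
# Hedged sketch: peak FFT amplitude of dZ/dt in the 1.5-4.5 Hz band,
# the parameter whose square best predicted stroke volume in Phase 1.
# The signal and sampling rate below are illustrative.
import numpy as np

def peak_fft_amplitude(dzdt, fs, f_lo=1.5, f_hi=4.5):
    """Peak one-sided spectral amplitude of dZ/dt between f_lo and f_hi Hz."""
    n = len(dzdt)
    spectrum = np.abs(np.fft.rfft(dzdt)) / n   # amplitude per sample
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[band].max()

fs = 100.0                       # Hz, illustrative sampling rate
t = np.arange(0, 10, 1 / fs)     # 10 s record
# Illustrative dZ/dt: a 2 Hz cardiac component plus slow baseline drift.
dzdt = 0.8 * np.sin(2 * np.pi * 2.0 * t) + 0.2 * np.sin(2 * np.pi * 0.3 * t)
peak = peak_fft_amplitude(dzdt, fs)
print(round(peak, 2))  # 0.4: half the 0.8 amplitude (one-sided spectrum)
```

In a circulatory arrest the cardiac component vanishes, so this band-limited peak collapses toward noise level, which is the physical basis for using it as an arrest marker.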

Relevance: 30.00%

Abstract:

A single-step lateral flow immunoassay (LFIA) was developed and validated for the rapid screening of paralytic shellfish toxins (PSTs) from a variety of shellfish species, at concentrations relevant to regulatory limits of 800 μg STX-diHCl equivalents/kg shellfish meat. A simple aqueous extraction protocol was performed within several minutes from sample homogenate. The qualitative result was generated after a 5 min run time using a portable reader which removed subjectivity from data interpretation. The test was designed to generate noncompliant results with samples containing approximately 800 μg of STX-diHCl/kg. The cross-reactivities in relation to STX, expressed as mean ± SD, were as follows: NEO: 128.9% ± 29%; GTX1&4: 5.7% ± 1.5%; GTX2&3: 23.4% ± 10.4%; dcSTX: 55.6% ± 10.9%; dcNEO: 28.0% ± 8.9%; dcGTX2&3: 8.3% ± 2.7%; C1&C2: 3.1% ± 1.2%; GTX5: 23.3% ± 14.4% (n = 5 LFIA lots). There were no indications of matrix effects from the different samples evaluated (mussels, scallops, oysters, clams, cockles) nor interference from other shellfish toxins (domoic acid, okadaic acid group). Naturally contaminated sample evaluations showed no false negative results were generated from a variety of different samples and profiles (n = 23), in comparison to reference methods (MBA method 959.08, LC-FD method 2005.06). External laboratory evaluations of naturally contaminated samples (n = 39) indicated good correlation with reference methods (MBA, LC-FD). This is the first LFIA which has been shown, through rigorous validation, to have the ability to detect most major PSTs in a reliable manner and will be a huge benefit to both industry and regulators, who need to perform rapid and reliable testing to ensure shellfish are safe to eat.
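The cross-reactivities above are expressed relative to STX. One common way to compute such figures (an assumption here; the study's exact protocol may differ) is as the ratio of the STX concentration to the analogue concentration producing the same assay response, such as 50% signal inhibition. The midpoint values below are illustrative.

```python
# Hedged sketch: immunoassay cross-reactivity relative to STX, computed
# as the ratio of concentrations giving equal assay response (e.g. 50%
# inhibition). Midpoint concentrations below are illustrative toy values.

def cross_reactivity_percent(c50_stx, c50_analogue):
    """100% means the analogue is recognised as strongly as STX itself;
    values above 100% mean stronger recognition than STX."""
    return 100.0 * c50_stx / c50_analogue

# Illustrative 50%-inhibition concentrations in ng/mL (not study data).
c50 = {"STX": 1.0, "NEO": 0.78, "dcSTX": 1.8, "GTX1&4": 17.5}
for analogue in ("NEO", "dcSTX", "GTX1&4"):
    cr = cross_reactivity_percent(c50["STX"], c50[analogue])
    print(analogue, round(cr, 1))
```

This framing explains the abstract's pattern: an analogue bound more avidly than STX (here NEO) shows cross-reactivity above 100%, while weakly recognised congeners such as GTX1&4 fall to single digits.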

Relevance: 20.00%

Abstract:

This paper describes the detailed validation of a computer model designed to simulate the transient light-off in a two-stroke oxidation catalyst. A plug flow reactor is employed to provide measurements of temperature and gas concentration at various radial and axial locations inside the catalyst. These measurements are recorded at discrete intervals during a transient light-off in which the inlet temperature is increased from ambient to 300 °C at rates of up to 6 °C/s. The catalyst formulation used in the flow reactor, and its associated test procedures, are then simulated by the computer and a comparison made between experimental readings and model predictions. The design of the computer model to which this validation exercise relates is described in detail in a separate technical paper. The first section of the paper investigates the warm-up characteristics of the substrate and examines the validity of the heat transfer predictions between the wall and the gas in the absence of chemical reactions. The predictions from a typical single-component CO transient light-off test are discussed in the second section and are compared with experimental data. In particular, the effect of the temperature ramp on the light-off curve and reaction zone development is examined. An analysis of the C3H6 conversion is given in the third section, while the final section examines the accuracy of the light-off curves which are produced when both CO and C3H6 are present in the feed gas. The analysis shows that the heat and mass transfer calculations provided reliable predictions of the warm-up behaviour and post light-off gas concentration profiles. The self-inhibition and cross-inhibition terms in the global rate expressions were also found to be reasonably reliable, although the surface reaction rates required calibration with experimental data.
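The global rate expressions with self-inhibition mentioned above are typically of Voltz type: an Arrhenius numerator divided by an inhibition term that grows with CO concentration. A deliberately minimal, isothermal 1-D plug-flow sketch of such a light-off calculation follows; every kinetic constant and the residence-time parameter are illustrative assumptions, not the validated model's calibrated values.

```python
# Hedged sketch: 1-D plug-flow integration of CO conversion with a
# Voltz-type global rate (Arrhenius numerator, self-inhibition in the
# denominator). All constants are illustrative; the paper's actual model
# is transient, non-isothermal and multi-species.
import math

def co_conversion(T, c_in=0.01, length=0.1, n_cells=200, tau=25.0):
    """CO conversion through an isothermal channel at wall temperature T (K).
    tau is an assumed residence time per unit length (s/m)."""
    A, Ea = 1.0e9, 90e3        # pre-exponential (1/s), activation energy (J/mol)
    K_inh = 500.0              # self-inhibition constant (illustrative)
    R = 8.314                  # J/(mol K)
    dx = length / n_cells
    c = c_in
    for _ in range(n_cells):
        k = A * math.exp(-Ea / (R * T))
        rate = k * c / (1.0 + K_inh * c) ** 2   # global rate with self-inhibition
        c = max(c - rate * tau * dx, 0.0)       # explicit march along the channel
    return 1.0 - c / c_in

# Light-off behaviour: conversion rises steeply with inlet temperature.
for T in (400, 500, 600):
    print(T, round(co_conversion(T), 3))
```

Even this toy version reproduces the qualitative light-off curve: near-zero conversion at low temperature, a steep rise once the Arrhenius term overcomes the CO self-inhibition, then near-complete conversion, which is why the inhibition terms matter so much to the validated model.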