847 results for Cut-off operation


Relevance: 80.00%

Abstract:

Alpine heavy precipitation events often affect small catchments, although the circulation pattern leading to the event extends over the entire North Atlantic. The various scale interactions involved are particularly challenging for the numerical weather prediction of such events. Unlike previous studies focusing on the southern Alps, here a comprehensive study of a heavy precipitation event in the northern Alps in October 2011 is presented, with particular focus on the role of the large-scale circulation in the North Atlantic/European region. During the event, exceptionally high amounts of total precipitable water occurred in and north of the Alps. This moisture was initially transported along the flanks of a blocking ridge over the North Atlantic. Subsequently, a strong and persistent northerly flow became established on the upstream flank of a trough over Europe and steered the moisture towards the northern Alps. Lagrangian diagnostics reveal that a large fraction of the moisture emerged from the West African coast, where a subtropical upper-level cut-off low served as an important moisture collector. Wave activity flux diagnostics show that the ridge was initiated as part of a low-frequency, large-scale Rossby wave train, while convergence of fast transients helped to amplify it locally in the North Atlantic. A novel diagnostic for advective potential vorticity tendencies sheds more light on this amplification and further emphasizes the role of the ridge in amplifying the trough over Europe. Operational forecasts misrepresented the amplitude and orientation of this trough. For the first time, this study documents an important pathway for northern Alpine flooding, in which the interaction of synoptic- to large-scale weather systems with long-range moisture transport from the tropics is dominant. Moreover, the trapping of moisture in a subtropical cut-off low near the West African coast is found to be a crucial precursor to the observed European high-impact weather.

Relevance: 80.00%

Abstract:

Ab initio calculations of Afρ are presented using Mie scattering theory and a Direct Simulation Monte Carlo (DSMC) dust outflow model in support of the Rosetta mission and its target 67P/Churyumov-Gerasimenko (CG). These calculations are performed for particle sizes ranging from 0.010 μm to 1.0 cm. The present status of our knowledge of various differential particle size distributions is reviewed, and a variety of particle size distributions is used to explore their effect on Afρ and the dust mass production rate ṁ. A new, simple two-parameter particle size distribution that curtails the effect of particles below 1 μm is developed. The contributions of all particle sizes are summed to get a resulting overall Afρ. The resultant Afρ could not easily be predicted a priori and turned out to be considerably more constraining regarding the mass loss rate than expected. It is found that a proper calculation of Afρ combined with a good Afρ measurement can constrain the dust/gas ratio in the coma of comets as well as other methods presently available. Phase curves of Afρ versus scattering angle are calculated and produce good agreement with observational data. The major conclusions of our calculations are:
– The original definition of A in Afρ is problematical and Afρ should be: qsca(n,λ) × p(g) × f × ρ. Nevertheless, we keep the present nomenclature of Afρ as a measured quantity for an ensemble of coma particles.
– The ratio between Afρ and the dust mass loss rate ṁ is dominated by the particle size distribution.
– For most particle size distributions presently in use, small particles in the range from 0.10 to 1.0 μm contribute a large fraction of Afρ.
– Simplifying the calculation of Afρ by considering only large particles and approximating qsca does not represent a realistic model. Mie scattering theory or, if necessary, more complex scattering calculations must be used.
– For the commonly used particle size distributions, dn/da ∼ a^-3.5 to a^-4, there is a natural cut-off in the Afρ contribution for both small and large particles.
– The scattering phase function must be taken into account for each particle size; otherwise the contribution of large particles can be over-estimated by a factor of 10.
– Using an imaginary index of refraction of i = 0.10 does not produce sufficient backscattering to match observational data.
– A mixture of dark particles with i ⩾ 0.10 and brighter silicate particles with i ⩽ 0.04 matches the observed phase curves quite well.
– Using current observational constraints, we find the dust/gas mass-production ratio of CG at 1.3 AU is confined to a range of 0.03–0.5, with a reasonably likely value around 0.1.
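A back-of-envelope check illustrates why sub-micron grains dominate under the common dn/da ∼ a^-3.5 law. This sketch deliberately uses the crude constant-qsca shortcut that the study warns is unrealistic for quantitative work; it is shown only to make the size-distribution argument concrete, and is not the paper's DSMC/Mie pipeline.

```python
import numpy as np

# Hypothetical illustration (NOT the paper's DSMC/Mie pipeline): weight a
# power-law size distribution dn/da ~ a^-3.5 by the geometric cross-section
# pi*a^2 to see which grain sizes dominate the scattered light. A constant
# qsca = 2 (large-particle limit) stands in for the full Mie calculation
# that the study stresses is required for realistic Afrho values.
a = np.logspace(-2, 4, 4000)            # grain radius in micrometres (0.01 um .. 1 cm)
dn_da = a ** -3.5                       # differential size distribution
qsca = 2.0                              # crude geometric-optics stand-in
contrib = qsca * np.pi * a**2 * dn_da   # per-size scattering contribution

# Fraction of the total contributed by sub-micron grains:
total = np.sum(contrib[:-1] * np.diff(a))
small = np.sum((contrib * (a <= 1.0))[:-1] * np.diff(a))
frac_sub_micron = small / total
```

Because the cross-section-weighted integrand scales as a^-1.5, the integral is dominated by its small-size end: roughly 90% of the toy total comes from grains below 1 μm, consistent with the conclusion that small particles contribute a large fraction of Afρ.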

Relevance: 80.00%

Abstract:

Currently, most cosmic ray data are obtained by detectors on satellites, aircraft and high-altitude balloons, and on the ground (neutron monitors). In our work, we examined whether Liulin semiconductor spectrometers (simple silicon planar diode detectors with spectrometric properties) located at high-mountain observatories could contribute new information to the monitoring of cosmic rays, by analyzing data from selected solar events between 2005 and 2013. The decision thresholds and detection limits of these detectors placed at the Jungfraujoch (Switzerland; 3475 m a.s.l.; vertical cut-off rigidity 4.5 GV) and Lomnický štít (Slovakia; 2633 m a.s.l.; vertical cut-off rigidity 3.84 GV) high-mountain observatories were determined. The data showed that only the strongest variations of the cosmic ray flux in this period were detectable. The main limitation in the performance of these detectors is their small sensitive volume and the low sensitivity of the PIN photodiode to neutrons.
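For counting instruments with a measured background, decision thresholds and detection limits of the kind mentioned above are commonly computed with Currie-style formulas. The sketch below assumes that standard formulation (the study's exact procedure may differ) with a paired background measurement and k = 1.645 for ~95% one-sided confidence.

```python
import math

# Currie-style decision threshold (L_C) and detection limit (L_D) for a
# counting detector with a paired background measurement. This is an
# assumed textbook formulation, not necessarily the study's exact one:
#   L_C = k * sqrt(2B)    net counts above which a signal is "detected"
#   L_D = k^2 + 2 * L_C   true net counts detectable with ~95% power
def currie_limits(background_counts, k=1.645):
    l_c = k * math.sqrt(2.0 * background_counts)
    l_d = k * k + 2.0 * l_c
    return l_c, l_d

# e.g. 100 background counts per measurement interval (invented number):
l_c, l_d = currie_limits(100.0)
```

Because L_C grows only with the square root of the background, a detector with a small sensitive volume (few counts) has a proportionally coarse detection limit, which is why only the strongest flux variations were detectable.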

Relevance: 80.00%

Abstract:

BACKGROUND Intravenous anaesthetic drugs are the primary means of producing general anaesthesia in equine practice. The ideal drug for intravenous anaesthesia is highly reliable, with pharmacokinetic properties indicating short elimination and a lack of accumulation when administered for prolonged periods. Induction of general anaesthesia with racemic ketamine preceded by profound sedation already has an established place in equine field anaesthesia. Due to potential advantages over racemic ketamine, S-ketamine has been employed in horses to induce general anaesthesia, but its optimal dose remains under investigation. The objective of this study was to evaluate whether 2.5 mg/kg S-ketamine could be used as a single intravenous bolus to provide short-term surgical anaesthesia in colts undergoing surgical castration, and to report its pharmacokinetic profile.
RESULTS After premedication with romifidine and L-methadone, the combination of S-ketamine and diazepam achieved surgical anaesthesia in all 28 colts. Induction of anaesthesia and recovery were good to excellent in the majority of colts (n = 22 and 24, respectively). Seven horses required additional administration of S-ketamine to prolong the duration of surgical anaesthesia. Redosing did not compromise recovery quality. The plasma concentration of S-ketamine decreased rapidly after administration, following a two-compartment model, suggesting considerable elimination of the unchanged parent compound into the urine alongside its conversion to S-norketamine. The observed plasma concentrations of S-ketamine at the time of first movement varied widely and did not support the definition of a clear cut-off value to predict the termination of the drug effect.
CONCLUSIONS The administration of 2.5 mg/kg IV S-ketamine after adequate premedication provided good quality of induction and recovery and a duration of action similar to that reported for racemic ketamine at a dose of 2.2 mg/kg. Until further data become available, close monitoring to adapt drug delivery is mandatory, particularly once the first 10 minutes after injection have elapsed. Given the rapid elimination of S-ketamine, the significant inter-individual variability and the rapid loss of effect over a narrow range of concentrations, a sudden return of consciousness has to be foreseen.
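The two-compartment disposition described above implies a bi-exponential plasma decay, C(t) = A·e^(−αt) + B·e^(−βt). The sketch below uses purely illustrative placeholder parameters (not the study's fitted estimates) to show why the concentration falls fast early on, which is what drives the short duration of effect.

```python
import math

# Bi-exponential (two-compartment) plasma decay after an IV bolus.
# A, alpha (distribution phase) and B, beta (elimination phase) below are
# invented placeholder values, NOT the study's fitted estimates.
def concentration(t_min, A=2.0, alpha=0.20, B=0.5, beta=0.02):
    """Plasma concentration t_min minutes after the bolus (arbitrary units)."""
    return A * math.exp(-alpha * t_min) + B * math.exp(-beta * t_min)

# The fast distribution phase dominates early, the slow elimination phase
# late -- hence the rapid initial fall in plasma S-ketamine:
early_drop = concentration(0) - concentration(10)   # first 10 minutes
late_drop = concentration(60) - concentration(70)   # same interval, later
```

With these placeholder constants the concentration lost in the first ten minutes is more than an order of magnitude larger than over an equal interval an hour later, mirroring the clinical advice to monitor closely once the first 10 minutes have elapsed.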

Relevance: 80.00%

Abstract:

People with sequence-space synesthesia (SSS) report stable visuo-spatial forms corresponding to numbers, days, and months (amongst others). This type of synesthesia has intrigued scientists for over 130 years, but the lack of an agreed-upon tool for assessing it has held back research on the phenomenon. The present study builds on previous tests by measuring the consistency of spatial locations, a measure known to discriminate synesthetes from controls. We document, for the first time, the sensitivity and specificity of such a test and suggest a diagnostic cut-off point for discriminating between the groups, based on the area bounded by different placement attempts with the same item.
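One natural way to operationalise "the area bounded by different placement attempts" is the area of the convex hull of those attempts: a synesthete's repeated placements cluster tightly (small area), a control's scatter widely (large area). The published test's exact scoring may differ; this is an illustrative sketch.

```python
# Sketch of a spatial-consistency score: the area enclosed by repeated
# placements of the same item, taken here as the convex hull of the
# attempts (an interpretation -- the published scoring may differ).
def convex_hull_area(points):
    """Area of the convex hull of 2-D points (monotone chain + shoelace)."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return 0.0

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                     # build lower hull left-to-right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):           # build upper hull right-to-left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    hull = lower[:-1] + upper[:-1]

    area = 0.0                        # shoelace formula over the hull
    for i in range(len(hull)):
        x1, y1 = hull[i]
        x2, y2 = hull[(i + 1) % len(hull)]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

# Tight cluster (synesthete-like) vs. scattered placements (control-like):
tight = convex_hull_area([(0, 0), (1, 0), (0, 1), (1, 1)])
scattered = convex_hull_area([(0, 0), (8, 1), (3, 9)])
```

A diagnostic cut-off on this area then separates the two groups: scores below the cut-off suggest SSS, scores above suggest a control.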

Relevance: 80.00%

Abstract:

The most commonly used method for formally assessing grapheme-colour synaesthesia (i.e., experiencing colours in response to letter and/or number stimuli) involves selecting colours from a large colour palette on several occasions and measuring the consistency of the colours selected. However, the ability to diagnose synaesthesia using this method depends on several factors that have not been directly contrasted. These include the type of colour space used (e.g., RGB, HSV, CIELUV, CIELAB) and different measures of consistency (e.g., city-block and Euclidean distance in colour space). This study aims to find the most reliable way of diagnosing grapheme-colour synaesthesia by maximising sensitivity (i.e., the ability of a test to identify true synaesthetes) and specificity (i.e., the ability of a test to identify true non-synaesthetes). We show, applying ROC (receiver operating characteristic) analysis to the binary classification of a large sample of self-declared synaesthetes and non-synaesthetes, that the consistency criterion (i.e., cut-off value) for diagnosing synaesthesia is considerably higher than the current standard in the field. We also show that methods based on the perceptual CIELUV and CIELAB colour models (rather than RGB and HSV colour representations) and Euclidean distances offer even greater sensitivity and specificity than most currently used measures. Together, these findings offer improved heuristics for the behavioural assessment of grapheme-colour synaesthesia.
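The consistency measure at the heart of this test can be sketched as the mean pairwise Euclidean distance between repeated colour choices for the same grapheme, computed in a perceptual space such as CIELAB. The sketch below assumes the coordinates have already been converted from RGB and does not reproduce the published cut-off value.

```python
import math

# Sketch of a consistency score: mean pairwise Euclidean distance between
# repeated colour choices for one grapheme, in (L*, a*, b*) coordinates.
# Conversion from RGB to CIELAB is assumed to have happened upstream;
# the published diagnostic cut-off is deliberately not reproduced here.
def consistency_score(choices):
    """Mean pairwise Euclidean distance between colour coordinates."""
    dists = []
    for i in range(len(choices)):
        for j in range(i + 1, len(choices)):
            dists.append(math.dist(choices[i], choices[j]))
    return sum(dists) / len(dists)

# Three trials for one grapheme (invented values): a synaesthete-like
# tight cluster vs. a non-synaesthete-like scatter in CIELAB.
consistent = consistency_score([(50, 20, 30), (52, 21, 29), (51, 19, 31)])
inconsistent = consistency_score([(50, 20, 30), (80, -40, 10), (30, 60, -20)])
```

Lower scores mean more consistent choices; diagnosis then reduces to comparing the score against the chosen cut-off, which is exactly the criterion the study shows should be set higher than current practice.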

Relevance: 80.00%

Abstract:

AIMS A non-invasive gene-expression profiling (GEP) test for rejection surveillance of heart transplant recipients originated in the USA. A European-based study, the Cardiac Allograft Rejection Gene Expression Observational II Study (CARGO II), was conducted to further clinically validate the GEP test's performance. METHODS AND RESULTS Blood samples for GEP testing (AlloMap®, CareDx, Brisbane, CA, USA) were collected during post-transplant surveillance. The reference standard for rejection status was based on histopathology grading of tissue from endomyocardial biopsy. The area under the receiver operating characteristic curve (AUC-ROC) and the negative (NPV) and positive (PPV) predictive values for the GEP scores (range 0-39) were computed. Taking a GEP score of 34 as the cut-off (>6 months post-transplantation), 95.5% (381/399) of GEP tests were true negatives, 4.5% (18/399) were false negatives, 10.2% (6/59) were true positives, and 89.8% (53/59) were false positives. Based on 938 paired biopsies, the GEP test score AUC-ROC for distinguishing ≥3A rejection was 0.70 and 0.69 for 2-6 and >6 months post-transplantation, respectively. Depending on the chosen threshold score, the NPV and PPV range from 98.1 to 100% and 2.0 to 4.7%, respectively. CONCLUSION For 2-6 and >6 months post-transplantation, CARGO II GEP score performance (AUC-ROC = 0.70 and 0.69) is similar to the CARGO study results (AUC-ROC = 0.71 and 0.67). The low prevalence of acute cellular rejection (ACR) contributes to the high NPV and limited PPV of GEP testing. The choice of threshold score for practical use of GEP testing should consider the overall clinical assessment of the patient's baseline risk for rejection.
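The confusion counts reported at the score-34 cut-off translate directly into predictive values. Note that at this particular cut-off the implied NPV is about 95.5% and the PPV about 10.2%; the 98.1-100% and 2.0-4.7% ranges quoted above refer to other threshold choices.

```python
# Predictive values from the confusion counts reported at the GEP
# cut-off of 34 (>6 months post-transplant): TN=381, FN=18, TP=6, FP=53.
def predictive_values(tp, fp, tn, fn):
    """NPV = TN/(TN+FN); PPV = TP/(TP+FP)."""
    npv = tn / (tn + fn)
    ppv = tp / (tp + fp)
    return npv, ppv

npv, ppv = predictive_values(tp=6, fp=53, tn=381, fn=18)
```

The asymmetry (high NPV, low PPV) is the low-prevalence effect the conclusion describes: with rejection rare, a negative GEP score is reassuring, while a positive score is usually a false alarm.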

Relevance: 80.00%

Abstract:

Polymorbid patients, diverse diagnostic and therapeutic options, more complex hospital structures, financial incentives, benchmarking, as well as perceptional and societal changes put pressure on medical doctors, specifically when medical errors surface. This is particularly true in the emergency department setting, where patients face delayed or erroneous initial diagnostic or therapeutic measures and costly hospital stays due to sub-optimal triage. A "biomarker" is any laboratory tool with the potential to better detect and characterise diseases, to simplify complex clinical algorithms and to improve clinical problem solving in routine care. Biomarkers must be embedded in clinical algorithms to complement, not replace, basic medical skills. Unselected ordering of laboratory tests and shortcomings in test performance and interpretation contribute to diagnostic errors. Test results may be ambiguous, with false positive or false negative results, and generate unnecessary harm and costs. Laboratory tests should only be ordered if the results have clinical consequences. In studies, we must move beyond the observational reporting and meta-analysis of diagnostic accuracies for biomarkers. Instead, specific cut-off ranges should be proposed and intervention studies conducted to prove outcome-relevant impacts on patient care. The focus of this review is to exemplify the appropriate use of selected laboratory tests in the emergency setting for which randomised controlled intervention studies have proven clinical benefit. Herein, we focus on initial patient triage and the allocation of treatment opportunities in patients with cardiorespiratory diseases in the emergency department.
The following biomarkers will be discussed: proadrenomedullin for prognostic triage assessment and site-of-care decisions, cardiac troponin for acute myocardial infarction, natriuretic peptides for acute heart failure, D-dimers for venous thromboembolism, C-reactive protein as a marker of inflammation, and procalcitonin for antibiotic stewardship in infections of the respiratory tract and sepsis. For these markers we provide an overview of their pathophysiology, the historical evolution of the evidence, and their strengths and limitations for rational implementation into clinical algorithms. We critically discuss results from key intervention trials that led to their use in clinical routine, as well as potential future indications. The rationale for the use of all these biomarkers is to tackle, first, diagnostic ambiguity and the defensive medicine that follows from it; second, delayed and sub-optimal therapeutic decisions; and third, prognostic uncertainty with misguided triage and site-of-care decisions, all of which contribute to the waste of our limited health care resources. A multifaceted approach to a more targeted management of medical patients from emergency admission to discharge, including biomarkers, will translate into better resource use, shorter length of hospital stay, reduced overall costs, and improved patient satisfaction and outcomes in terms of mortality and re-hospitalisation. Hopefully, the concepts outlined in this review will help readers to improve their diagnostic skills and become more parsimonious laboratory test requesters.

Relevance: 80.00%

Abstract:

We describe and test a nonperturbatively improved single-plaquette lattice action for 4-d SU(2) and SU(3) pure gauge theory, which suppresses large fluctuations of the plaquette, without requiring the naive continuum limit for smooth fields. We tune the action parameters based on torelon masses in moderate cubic physical volumes, and investigate the size of cut-off effects in other physical quantities, including torelon masses in asymmetric spatial volumes, the static quark potential, and gradient flow observables. In 2-d O(N) models similarly constructed nearest-neighbor actions have led to a drastic reduction of cut-off effects, down to the permille level, in a wide variety of physical quantities. In the gauge theories, we find significant reduction of lattice artifacts, and for some observables, the coarsest lattice result is very close to the continuum value. We estimate an improvement factor of 40 compared to using the Wilson gauge action to achieve the same statistical accuracy and suppression of cut-off effects.

Relevance: 80.00%

Abstract:

BACKGROUND: This study focused on the descriptive analysis of cattle movements and of farm-level parameters derived from cattle movements, which are considered to be generically suitable for risk-based surveillance systems in Switzerland for diseases where animal movements constitute an important risk pathway. METHODS: A framework was developed to select farms for surveillance based on a risk score summarizing five parameters. The proposed framework was validated using data from the bovine viral diarrhoea (BVD) surveillance programme in 2013. RESULTS: A cumulative score was calculated per farm from the following parameters: the maximum monthly ingoing contact chain (in 2012), the average number of animals per incoming movement, the use of mixed alpine pastures and the number of weeks in 2012 in which a farm had movements registered. The final score for a farm depended on the distribution of the parameters. Different cut-offs (the 50th, 90th, 95th and 99th percentiles) were explored. The final scores ranged between 0 and 5. Validation of the scores against results from the 2013 BVD surveillance programme gave promising results when the cut-off for each of the five selected farm-level criteria was set at the 50th percentile. Restricting testing to farms with a score ≥ 2 would have resulted in the same number of detected BVD-positive farms as testing all farms, i.e., the outcome of the 2013 surveillance programme could have been reached with a smaller survey. CONCLUSIONS: The seasonality and time dependency of the activity of single farms in the networks require a careful assessment of the time period used to determine the farm-level criteria. Nevertheless, the selection of farms for risk-based surveillance can be optimized with the proposed scoring system, which was validated using data from the BVD eradication programme. The proposed method is a promising framework for selecting farms according to the risk of infection based on animal movements.
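The percentile scoring can be sketched as: each parameter contributes one point if the farm lies above the chosen percentile cut-off, giving a 0-5 score, and farms scoring ≥ 2 are selected for testing. The data below are invented, and since the abstract names only four of the five parameters, the fifth ("out_degree") is a placeholder, not the study's actual criterion.

```python
# Sketch of the percentile-based risk score. Each of five movement
# parameters adds one point if the farm exceeds the 50th-percentile
# cut-off for that parameter; farms with score >= 2 are selected.
# All values below are invented, and "out_degree" is a placeholder for
# the fifth (unnamed) parameter.
def risk_scores(farms, cutoffs):
    """farms: {farm_id: {param: value}}; cutoffs: {param: percentile value}."""
    return {
        farm: sum(1 for p, v in params.items() if v > cutoffs[p])
        for farm, params in farms.items()
    }

cutoffs = {"ingoing_chain": 3, "animals_per_move": 2.5,
           "alpine_pasture": 0, "weeks_with_moves": 10, "out_degree": 4}
farms = {
    "A": {"ingoing_chain": 8, "animals_per_move": 5.0,
          "alpine_pasture": 1, "weeks_with_moves": 30, "out_degree": 2},
    "B": {"ingoing_chain": 1, "animals_per_move": 1.0,
          "alpine_pasture": 0, "weeks_with_moves": 2, "out_degree": 1},
}
scores = risk_scores(farms, cutoffs)
selected = [f for f, s in scores.items() if s >= 2]
```

Here farm A (score 4) would be sampled and farm B (score 0) skipped, mirroring how the full BVD detection result could be reproduced with a smaller survey.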

Relevance: 80.00%

Abstract:

INTRODUCTION The aim of the study was to identify the level of the Charlson comorbidity index (CCI) at which older patients (>70 years) with high-risk prostate cancer (PCa) achieve a survival benefit following radical prostatectomy (RP). METHODS We retrospectively analyzed 1008 older patients (>70 years) who underwent RP with pelvic lymph node dissection for high-risk prostate cancer (preoperative prostate-specific antigen >20 ng/mL, clinical stage ≥T2c or Gleason score ≥8) at 14 tertiary institutions between 1988 and 2014. The study population was grouped into CCI < 2 and CCI ≥ 2 for analysis. Survival rates for each group were estimated with the Kaplan-Meier method, and competing-risks Fine-Gray regression was used to estimate the best explanatory multivariable model. The area under the curve (AUC) and the Akaike information criterion were used to identify the ideal cut-off for the CCI. RESULTS The clinical and cancer characteristics were similar between the two groups. Kaplan-Meier comparison of the two groups for non-cancer death, with survival estimates at 5 and 10 years, showed significantly worse outcomes for patients with CCI ≥ 2. In the multivariable model used to decide the appropriate CCI cut-off point, a CCI of 2 had the better AUC and p-value in the log-rank test. CONCLUSION Older patients with fewer comorbidities harboring high-risk PCa appear to benefit from RP. Sicker patients are more likely to die of causes unrelated to prostate cancer and are less likely to benefit from RP.
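The cut-off search can be sketched as scanning candidate CCI thresholds and ranking each dichotomisation by how well it discriminates non-cancer death (here with a Mann-Whitney AUC; the study additionally used the Akaike criterion and log-rank tests). The patient data below are invented for illustration.

```python
# Sketch of choosing a comorbidity cut-off by scanning candidates and
# scoring each dichotomisation with a Mann-Whitney AUC. The CCI values
# below are invented toy data, not the study's cohort.
def auc(cases, controls):
    """Probability that a randomly chosen case outranks a control
    (Mann-Whitney formulation of the ROC area; ties count half)."""
    wins, ties = 0, 0
    for c in cases:
        for k in controls:
            if c > k:
                wins += 1
            elif c == k:
                ties += 1
    return (wins + 0.5 * ties) / (len(cases) * len(controls))

# Invented CCI values: non-cancer deaths vs. survivors.
cci_dead = [2, 3, 4, 2, 5, 3]
cci_alive = [0, 1, 1, 0, 2, 1, 0, 2]

# Scan candidate cut-offs 1..5, keeping the best-discriminating one:
best_cut = max(range(1, 6), key=lambda c: auc(
    [int(v >= c) for v in cci_dead],
    [int(v >= c) for v in cci_alive]))
```

On this toy cohort the scan lands on a cut-off of 2, the same threshold the study identified for separating patients likely to benefit from RP from those likely to die of competing causes.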

Relevance: 80.00%

Abstract:

INTRODUCTION The neural correlates of impaired performance of gestures are currently unclear. Lesion studies have shown variable involvement of the ventro-dorsal stream, particularly the left inferior frontal gyrus (IFG), in gesture performance on command. However, findings cannot easily be generalized, as lesions may be biased by the architecture of the vascular supply and involve brain areas beyond the critical region. The neuropsychiatric syndrome of schizophrenia shares apraxia-like errors and altered brain structure without macroanatomic lesions. Schizophrenia may therefore qualify as a model disorder for testing the neural correlates of gesture impairments. METHODS We included 45 schizophrenia patients and 44 healthy controls in the study to investigate the structural brain correlates of defective gesturing in schizophrenia using voxel-based morphometry. Gestures were tested in two domains: meaningful gestures (transitive and intransitive) on verbal command and imitation of meaningless gestures. Cut-off scores were used to separate patients with deficits, patients without deficits and controls. Group differences in gray matter (GM) volume were explored in an ANCOVA. RESULTS Patients performed worse than controls in each gesture category (p < .001). Patients with deficits in producing meaningful gestures on command had reduced GM volume predominantly in the left IFG, with additional involvement of the right insula and anterior cingulate cortex. Patients with deficits differed from patients without deficits in the right insula, inferior parietal lobe (IPL) and superior temporal gyrus. CONCLUSIONS Impaired performance of meaningful gestures on command was linked to volume loss predominantly in the praxis network in schizophrenia. Thus, the behavioral similarities between apraxia and schizophrenia are paralleled by structural alterations. However, few associations between behavioral impairment and structural brain alterations appear specific to schizophrenia.

Relevance: 80.00%

Abstract:

Diagnosis of primary ciliary dyskinesia (PCD) lacks a "gold standard" test and is therefore based on combinations of tests including nasal nitric oxide (nNO), high-speed video microscopy analysis (HSVMA), genotyping and transmission electron microscopy (TEM). There are few published data on the accuracy of this approach.
Using prospectively collected data from 654 consecutive patients referred for PCD diagnostics, we calculated sensitivity and specificity for individual and combination testing strategies. Not all patients underwent all tests.
HSVMA had excellent sensitivity and specificity (100% and 93%, respectively). TEM was 100% specific, but 21% of PCD patients had normal ultrastructure. nNO (30 nL·min⁻¹ cut-off) had good sensitivity and specificity (91% and 96%, respectively). Simultaneous testing using HSVMA and TEM was 100% sensitive and 92% specific.
In conclusion, combination testing was found to be a highly accurate approach for diagnosing PCD. HSVMA alone has excellent accuracy, but requires significant expertise, and repeated sampling or cell culture is often needed. TEM alone is specific but misses 21% of cases. nNO (≤30 nL·min⁻¹) contributes well to the diagnostic process. In isolation, nNO screening at this cut-off would miss ∼10% of cases, but in combination with HSVMA it could reduce unnecessary further testing. Standardisation of testing between centres is a future priority.
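The reported figures can be sanity-checked with the textbook rule for "simultaneous" (parallel) testing, where a patient is called positive if either test is positive. Under an independence assumption (a simplification; the study reports observed values), and taking TEM sensitivity as 0.79 from the 21% normal-ultrastructure figure:

```python
# Parallel ("simultaneous") combination of two tests: positive if either
# test is positive. Independence of the two tests is assumed here, which
# is a simplification -- the study reports empirically observed values.
def parallel_combination(sens1, spec1, sens2, spec2):
    sens = 1 - (1 - sens1) * (1 - sens2)   # a case is missed only if BOTH miss
    spec = spec1 * spec2                   # a false alarm if EITHER fires
    return sens, spec

# HSVMA (sens 1.00, spec 0.93) combined with TEM (sens 0.79, spec 1.00):
sens, spec = parallel_combination(1.00, 0.93, 0.79, 1.00)
```

This gives combined sensitivity 1.00 and specificity 0.93, close to the observed 100%/92%, showing why pairing the perfectly specific TEM with the perfectly sensitive HSVMA works so well.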

Relevance: 80.00%

Abstract:

Symptoms of primary ciliary dyskinesia (PCD) are nonspecific and guidance on whom to refer for testing is limited. Diagnostic tests for PCD are highly specialised, requiring expensive equipment and experienced PCD scientists. This study aims to develop a practical clinical diagnostic tool to identify patients requiring testing.
Patients consecutively referred for testing were studied. Information readily obtained from the patient history was correlated with diagnostic outcome. Using logistic regression, the predictive performance of the best model was tested by receiver operating characteristic curve analyses. The model was simplified into a practical tool (PICADAR) and externally validated in a second diagnostic centre.
Of 641 referrals with a definitive diagnostic outcome, 75 (12%) were positive. PICADAR applies to patients with persistent wet cough and has seven predictive parameters: full-term gestation, neonatal chest symptoms, neonatal intensive care admittance, chronic rhinitis, ear symptoms, situs inversus and congenital cardiac defect. Sensitivity and specificity of the tool were 0.90 and 0.75 for a cut-off score of 5 points. The area under the curve for the internally and externally validated tool was 0.91 and 0.87, respectively.
PICADAR represents a simple diagnostic clinical prediction rule with good accuracy and validity, ready for testing in respiratory centres referring to PCD centres.
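A clinical prediction rule of this shape reduces to summing weighted yes/no answers and comparing against the cut-off of 5. The point weights below are invented placeholders for illustration only; the published tool derives its own weights from the regression model.

```python
# Illustrative scoring in the spirit of PICADAR: seven yes/no predictors
# combine into a score compared against a cut-off of 5. The weights below
# are HYPOTHETICAL placeholders, not the published tool's point values.
WEIGHTS = {
    "term_gestation": 1, "neonatal_chest_symptoms": 1, "neonatal_icu": 1,
    "chronic_rhinitis": 1, "ear_symptoms": 1,
    "situs_inversus": 2, "congenital_cardiac_defect": 2,
}

def picadar_like_score(answers):
    """answers: {predictor: bool}; returns the summed score."""
    return sum(w for k, w in WEIGHTS.items() if answers.get(k, False))

# A child with persistent wet cough and four positive predictors:
patient = {"term_gestation": True, "neonatal_chest_symptoms": True,
           "neonatal_icu": True, "situs_inversus": True}
score = picadar_like_score(patient)
refer_for_testing = score >= 5   # cut-off of 5 points, as in the study
```

The appeal of such a rule is that every input comes straight from the patient history, so referral decisions need no specialised equipment.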

Relevance: 80.00%

Abstract:

Linkage and association studies are major analytical tools in the search for susceptibility genes for complex diseases. With the availability of large collections of single nucleotide polymorphisms (SNPs) and the rapid progress of high-throughput genotyping technologies, together with the ambitious goals of the International HapMap Project, genetic markers covering the whole genome will be available for genome-wide linkage and association studies. In order not to inflate the type I error rate when performing genome-wide linkage and association studies, adjustment of the significance level for each independent linkage and/or association test is required, and this has led to suggested genome-wide significance cut-offs as low as 5 × 10⁻⁷. Almost no linkage and/or association study can meet such a stringent threshold with standard statistical methods, so new statistics with higher power are urgently needed. This dissertation proposes and explores a class of novel test statistics that can be used with both population-based and family-based genetic data by employing a completely new strategy: nonlinear transformations of the sample means are used to construct test statistics for linkage and association studies. Extensive simulation studies illustrate the properties of the nonlinear test statistics. Power calculations are performed using both analytical and empirical methods. Finally, real data sets are analyzed with the nonlinear test statistics. Results show that the nonlinear test statistics have correct type I error rates, and most of the studied nonlinear test statistics have higher power than the standard chi-square test. This dissertation introduces a new idea for designing novel test statistics with high power and might open new ways of mapping susceptibility genes for complex diseases.
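The 5 × 10⁻⁷ cut-off follows directly from a Bonferroni correction: a family-wise error rate of 0.05 divided over roughly 100,000 independent genome-wide tests. A minimal sketch:

```python
# Bonferroni correction: to keep the family-wise type I error rate at
# alpha over n independent tests, each test is held to alpha / n.
def bonferroni_threshold(alpha, n_tests):
    return alpha / n_tests

# 0.05 spread over ~100,000 independent genome-wide tests gives the
# 5e-7 genome-wide significance cut-off cited above:
per_test_alpha = bonferroni_threshold(0.05, 100_000)
```

This is exactly the stringency problem the dissertation attacks: with the per-test bar this low, standard chi-square statistics rarely clear it, motivating the higher-powered nonlinear alternatives.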