916 results for "Failure time analysis"


Relevance:

100.00%

Publisher:

Abstract:

23rd International Conference on Real-Time Networks and Systems (RTNS 2015), 4–6 November 2015, Main Track, Lille, France. Best Paper Award Nominee.

Magdeburg, University, Faculty of Economic Sciences, Dissertation, 2011

A capillary microtrap thermal desorption module is developed for near real-time analysis of volatile organic compounds (VOCs) at sub-ppbv levels in air samples. The device allows direct injection of the thermally desorbed VOCs into a chromatographic column. It does not use a second cryotrap to focus the adsorbed compounds before they enter the separation column, thereby reducing the formation of artifacts. Connecting the microtrap to a GC–MS allows the quantitative determination of VOCs in less than 40 min, with detection limits between 5 and 10 pptv (25 °C and 760 mmHg), corresponding to 19–43 ng m−3, using sampling volumes of 775 cm3. The microtrap is applied to the analysis of environmental air contamination in different laboratories of our faculty. The results indicate that most volatile compounds diffuse easily through the air and may contaminate surrounding areas even when the usual safety precautions (e.g., working under fume hoods) are followed during the manipulation of solvents. The application of the microtrap to the analysis of VOCs in breath samples suggests that 2,5-dimethylfuran may be a strong indicator of a person's smoking status.
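
The reported detection limits can be cross-checked with the ideal gas law, which converts a mixing ratio in pptv into a mass concentration. A minimal sketch; the molar masses of toluene and xylene are illustrative assumptions, since the abstract does not name which compounds bound the 19–43 ng m−3 range:

```python
# Convert VOC detection limits from pptv to ng m^-3 via the ideal gas law.
# Toluene and xylene molar masses are illustrative assumptions only.

R = 8.314      # J mol^-1 K^-1
T = 298.15     # 25 degrees C
P = 101325.0   # 760 mmHg in Pa

def pptv_to_ng_m3(pptv, molar_mass_g_mol):
    air_mol_m3 = P / (R * T)                  # ~40.9 mol of air per m^3
    voc_mol_m3 = pptv * 1e-12 * air_mol_m3    # mole fraction -> mol m^-3
    return voc_mol_m3 * molar_mass_g_mol * 1e9  # g m^-3 -> ng m^-3

low = pptv_to_ng_m3(5, 92.14)     # 5 pptv of a toluene-like VOC
high = pptv_to_ng_m3(10, 106.17)  # 10 pptv of a xylene-like VOC
print(round(low), round(high))    # ~19 and ~43 ng m^-3
```

With these assumed molar masses the stated pptv limits reproduce the 19–43 ng m−3 range.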

Accelerated failure time models with a shared random component are described and used to evaluate the effect of explanatory factors and of different transplant centres on survival times following kidney transplantation. Different combinations of the distribution of the random effects and the baseline hazard function are considered, and the fit of such models to the transplant data is critically assessed. A mixture model that combines short- and long-term components of a hazard function is then developed, providing a more flexible model for the hazard function. The model can incorporate different explanatory variables and random effects in each component. The model is straightforward to fit using standard statistical software, and is shown to be a good fit to the transplant data. Copyright (C) 2004 John Wiley & Sons, Ltd.
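
The structure of such a model can be sketched by simulation. This assumes a lognormal AFT form with a shared per-centre random effect and made-up coefficients, none of which come from the paper:

```python
import numpy as np

# Sketch of an accelerated failure time model with a shared random (frailty)
# component per transplant centre:
#   log T_ij = b0 + b1 * x_ij + u_i + sigma * eps_ij,   u_i ~ N(0, tau^2)
# All coefficients and the lognormal baseline are illustrative assumptions.

rng = np.random.default_rng(0)
n_centres, n_per = 30, 200
b0, b1, tau, sigma = 2.0, -0.5, 0.3, 0.5

centre = np.repeat(np.arange(n_centres), n_per)
u = rng.normal(0, tau, n_centres)             # shared centre effect
x = rng.binomial(1, 0.5, n_centres * n_per)   # e.g. a binary covariate
log_t = b0 + b1 * x + u[centre] + rng.normal(0, sigma, x.size)

# With no censoring, a least-squares fit on log T recovers b1.
X = np.column_stack([np.ones_like(log_t), x])
beta_hat = np.linalg.lstsq(X, log_t, rcond=None)[0]
print(beta_hat)   # approximately [2.0, -0.5]
```

Real transplant data are right-censored, so the actual fit requires a survival likelihood rather than least squares; the simulation only shows the model structure.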

In a previous work, Vieira Neto & Winter (2001) numerically explored the capture times of particles as temporary satellites of Uranus. The study was made in the framework of the spatial, circular, restricted three-body problem. Regions of the initial-condition space whose trajectories are apparently stable were determined, the criterion being that the trajectories do not escape from the planet during an integration of 10^5 years. These regions occur for a wide range of initial orbital inclinations (i). The present work studies the reason for the existence of such stable regions. The stability of the planar retrograde trajectories is due to a family of simple periodic orbits and the associated quasi-periodic orbits that oscillate around them. These planar stable orbits had already been studied (Henon 1970; Huang & Innanen 1983); their results are reviewed using Poincaré surfaces of section. The stable non-planar retrograde trajectories, 110° ≤ i < 180°, are found to be three-dimensional quasi-periodic orbits around the same family of periodic orbits found for the planar case (i = 180°). No periodic orbit out of the plane associated with such quasi-periodic orbits was found. The largest region of stable prograde trajectories occurs at i = 60°. Trajectories in this region behave as quasi-periodic orbits evolving similarly to the stable retrograde trajectories that occur at i = 120°.
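
A minimal sketch of the planar circular restricted three-body problem underlying such studies follows. The Sun-Uranus mass ratio and the retrograde initial condition near the planet are illustrative assumptions; conservation of the Jacobi constant, the quantity behind the surfaces of section, serves as the integration check:

```python
import numpy as np

# Planar circular restricted three-body problem in the rotating frame, with
# the Sun at (-mu, 0) and the planet at (1 - mu, 0). The mass ratio and the
# retrograde start 0.01 Sun-planet distances from the planet are assumptions.

mu = 4.37e-5  # approximate Uranus/(Sun + Uranus) mass ratio

def deriv(s):
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)        # distance to the Sun
    r2 = np.hypot(x - 1 + mu, y)    # distance to the planet
    ax = 2*vy + x - (1-mu)*(x+mu)/r1**3 - mu*(x-1+mu)/r2**3
    ay = -2*vx + y - (1-mu)*y/r1**3 - mu*y/r2**3
    return np.array([vx, vy, ax, ay])

def jacobi(s):
    x, y, vx, vy = s
    r1 = np.hypot(x + mu, y)
    r2 = np.hypot(x - 1 + mu, y)
    return x*x + y*y + 2*(1-mu)/r1 + 2*mu/r2 - vx*vx - vy*vy

def rk4_step(s, dt):
    k1 = deriv(s); k2 = deriv(s + 0.5*dt*k1)
    k3 = deriv(s + 0.5*dt*k2); k4 = deriv(s + dt*k3)
    return s + dt*(k1 + 2*k2 + 2*k3 + k4)/6

s = np.array([1 - mu + 0.01, 0.0, 0.0, -np.sqrt(mu/0.01)])  # retrograde start
c0 = jacobi(s)
for _ in range(20000):               # ~20 rotating-frame time units
    s = rk4_step(s, 1e-3)
drift = abs(jacobi(s) - c0)
print(drift)                         # stays tiny for a well-resolved integration
```

The retrograde particle stays bound near the planet over this span, consistent with the stability of planar retrograde orbits discussed above.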

We study the problem of gravitational capture in the framework of the Sun-Uranus-particle system. Part of the space of initial conditions is systematically explored, and the duration of temporary gravitational capture is measured. The location and size of different capture-time regions are given in terms of diagrams of initial semimajor axis versus eccentricity. The other initial orbital elements - inclination (i), longitude of the node (Ω), argument of pericenter (ω), and time of pericenter passage (τ) - are first taken to be zero. Then we investigate the cases with ω = 90°, 180°, and 270°. We also present a sample of results for Ω = 90°, considering the cases i = 60°, 120°, 150°, and 180°. Special attention is given to the influence of the initial orbital inclination, taking orbits initially in opposition at pericenter. In this case, the initial inclination is varied from 0° to 180° in steps of 10°. The success of the final stage of the capture problem, which involves the transformation of temporary captures into permanent ones, is highly dependent on the initial conditions associated with the longest capture times. The largest regions of the initial-conditions space with the longest capture times occur at inclinations of 60°-70° and 160°. The regions of possible stability as a function of initial inclination are also delimited. These regions include not only a known set of retrograde orbits, but also a new sort of prograde orbit with inclinations greater than zero.
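
Scans like this are organized by orbital elements, so converting a set (a, e, i, Ω, ω, ν) into a Cartesian state is the first step before integration. A standard conversion, sketched with normalized units (GM = 1) as an assumption:

```python
import numpy as np

# Keplerian elements (a, e, i, Omega, omega, true anomaly nu) to a Cartesian
# state vector, via the perifocal frame and a 3-1-3 rotation. GM = 1 units
# are an illustrative assumption.

def kepler_to_state(a, e, i, Omega, omega, nu, gm=1.0):
    p = a * (1 - e*e)                 # semi-latus rectum
    r = p / (1 + e*np.cos(nu))
    r_pf = r * np.array([np.cos(nu), np.sin(nu), 0.0])
    v_pf = np.sqrt(gm/p) * np.array([-np.sin(nu), e + np.cos(nu), 0.0])
    cO, sO = np.cos(Omega), np.sin(Omega)
    co, so = np.cos(omega), np.sin(omega)
    ci, si = np.cos(i), np.sin(i)
    rot = np.array([[cO*co - sO*so*ci, -cO*so - sO*co*ci,  sO*si],
                    [sO*co + cO*so*ci, -sO*so + cO*co*ci, -cO*si],
                    [so*si,             co*si,             ci]])
    return rot @ r_pf, rot @ v_pf

# An orbit initially at pericenter (nu = 0) with i = 60 deg, Omega = 90 deg.
r, v = kepler_to_state(1.0, 0.3, np.radians(60), np.radians(90), 0.0, 0.0)
h = np.cross(r, v)                                   # angular momentum
rnorm = float(np.linalg.norm(r))                     # should equal a(1 - e)
inc_deg = float(np.degrees(np.arccos(h[2] / np.linalg.norm(h))))
print(rnorm, inc_deg)
```

The pericenter distance a(1 − e) and the inclination recovered from the angular momentum vector confirm the conversion.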

The goal of this dissertation is the experimental characterization and quantitative description of the hybridization of complementary nucleic acid strands with surface-bound capture molecules, for the development of integrated biosensors. In contrast to solution-based methods, microarray substrates allow many nucleic acid combinations to be investigated in parallel. The actin gene, universally expressed in eukaryotes, from different plant species was used as a biologically relevant evaluation system. This test system makes it possible to characterize closely related plant species on the basis of small differences in the gene sequence (SNPs). Building on this well-studied model of a housekeeping gene, a comprehensive microarray system was realized, consisting of short and long oligonucleotides (with incorporated LNA molecules), cDNAs, and DNA and RNA targets. This yielded a test system with high signal intensities optimized for online measurement. Based on the results, the entire signal path from nucleic acid concentration to digital value was modeled. The insights into the kinetics and thermodynamics of hybridization gained from the development work and the experiments are summarized in three publications that form the backbone of this dissertation. The first publication describes the improvement in the reproducibility and specificity of microarray results achieved by online measurement of kinetics and thermodynamics, compared with endpoint-based measurements on standard microarrays. Two algorithms were developed for the evaluation of the enormous amounts of data: a reaction-kinetic modeling of the isotherms and a description of the melting transition based on Fermi-Dirac statistics. These algorithms are described in the second publication.
By realizing identical sequences in the chemically different nucleic acids (DNA, RNA, and LNA), it is possible to investigate defined differences in the conformation of the ribose ring and in the C5 methyl group of the pyrimidines. The competitive interaction of these different nucleic acids of identical sequence, and its effects on kinetics and thermodynamics, is the subject of the third publication. Beyond the molecular-biological and technological developments in sensing hybridization reactions of surface-bound nucleic acid molecules, the automated evaluation and modeling of the resulting data volumes, and the associated improved quantitative description of the kinetics and thermodynamics of these reactions, the results contribute to a better understanding of the physico-chemical structure of this most elementary biological molecule and of its still not fully understood specificity.
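
The Fermi-Dirac-type description of the melting transition amounts to fitting a two-state sigmoid to the hybridized fraction as a function of temperature. A sketch with synthetic data; the melting temperature Tm and width w below are illustrative assumptions, not values from the dissertation:

```python
import numpy as np

# Fermi-Dirac-type melt curve: hybridized fraction
#   theta(T) = 1 / (1 + exp((T - Tm)/w)).
# Tm = 55 C and w = 2 C are illustrative assumptions. The fit linearizes
# the sigmoid: logit(theta) = (Tm - T)/w, then uses least squares.

Tm_true, w_true = 55.0, 2.0
T = np.linspace(40, 70, 61)
theta = 1.0 / (1.0 + np.exp((T - Tm_true) / w_true))

mask = (theta > 1e-6) & (theta < 1 - 1e-6)   # numerically safe points
y = np.log(theta[mask] / (1 - theta[mask]))  # logit transform
slope, intercept = np.polyfit(T[mask], y, 1)
w_fit = -1.0 / slope
Tm_fit = intercept * w_fit
print(Tm_fit, w_fit)   # recovers Tm = 55.0 and w = 2.0
```

On noiseless data the linearized fit recovers both parameters exactly; with real array data a weighted or nonlinear fit would be preferred.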

The positive and negative predictive values are standard measures used to quantify the predictive accuracy of binary biomarkers when the outcome being predicted is also binary. When biomarkers are instead used to predict a failure-time outcome, there is no standard way of quantifying predictive accuracy. We propose a natural extension of the traditional predictive values to accommodate censored survival data. We discuss not only quantifying predictive accuracy using these extended predictive values, but also rigorously comparing the accuracy of two biomarkers in terms of their predictive values. Using a marginal regression framework, we describe how to estimate differences in predictive accuracy and how to test whether the observed difference is statistically significant.
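
One natural form of such an extension is a time-dependent predictive value, PPV(t) = P(T ≤ t | marker positive), estimated from a Kaplan-Meier curve among marker-positive subjects. A sketch with toy data; the product-limit estimator shown here is the standard one, not necessarily the paper's exact estimator:

```python
import numpy as np

# Time-dependent positive predictive value from censored data:
#   PPV(t) = 1 - S_pos(t), with S_pos the Kaplan-Meier survival curve
# among marker-positive subjects. The toy data are illustrative.

def kaplan_meier(time, event, t):
    """S(t) from right-censored data (event=1 means failure observed)."""
    s = 1.0
    for u in np.unique(time[event == 1]):
        if u > t:
            break
        at_risk = np.sum(time >= u)
        d = np.sum((time == u) & (event == 1))
        s *= 1.0 - d / at_risk
    return s

# Marker-positive subjects: failure times with some right-censoring.
time = np.array([2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0])
event = np.array([1,   1,   0,   1,   1,   0,   1,   1])
ppv_5 = 1.0 - kaplan_meier(time, event, 5.0)
print(ppv_5)   # estimated P(failure by t = 5 | marker positive)
```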

BACKGROUND: The beneficial effects of beta-blockers and aldosterone receptor antagonists are now well established in patients with severe systolic chronic heart failure (CHF). However, it is unclear whether beta-blockers are able to provide additional benefit in patients already receiving aldosterone antagonists. We therefore examined this question in the COPERNICUS study of 2289 patients with severe CHF receiving the beta1-beta2/alpha1 blocker carvedilol compared with placebo. METHODS: Patients were divided post hoc into subgroups according to whether they were receiving spironolactone (n = 445) or not (n = 1844) at baseline. Consistency of the effect of carvedilol versus placebo was examined for these subgroups with respect to the predefined end points of all-cause mortality, death or CHF-related hospitalizations, death or cardiovascular hospitalizations, and death or all-cause hospitalizations. RESULTS: The beneficial effect of carvedilol was similar among patients who were or were not receiving spironolactone for each of the 4 efficacy measures. For all-cause mortality, the Cox model hazard ratio for carvedilol compared with placebo was 0.65 (95% CI 0.36-1.15) in patients receiving spironolactone and 0.65 (0.51-0.83) in patients not receiving spironolactone. Hazard ratios for death or all-cause hospitalization were 0.76 (0.55-1.05) versus 0.76 (0.66-0.88); for death or cardiovascular hospitalization, 0.61 (0.42-0.89) versus 0.75 (0.64-0.88); and for death or CHF hospitalization, 0.63 (0.43-0.94) versus 0.70 (0.59-0.84), in patients receiving and not receiving spironolactone, respectively. The safety and tolerability of treatment with carvedilol were also similar, regardless of background spironolactone. 
CONCLUSION: Carvedilol remained clinically efficacious in the COPERNICUS study of patients with severe CHF when added to background spironolactone, in patients nearly all of whom were receiving angiotensin-converting enzyme inhibitor (or angiotensin II antagonist) therapy. Therefore, the use of spironolactone in patients with severe CHF does not obviate the need for additional treatment that interferes with the adverse effects of sympathetic activation, specifically beta-blockade.
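
Subgroup consistency of this kind can be checked from published numbers alone: each standard error is recoverable from the 95% CI of the hazard ratio, and the log hazard ratios can then be compared with a z-test. The interaction test below is our illustration applied to the reported all-cause-mortality figures, not an analysis performed in the paper:

```python
import math

# Recover the standard error of a log hazard ratio from its 95% CI and
# compare two subgroups with a z-test. The HRs and CIs are the all-cause
# mortality figures quoted above; the test itself is illustrative.

def log_hr_and_se(hr, lo, hi):
    se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
    return math.log(hr), se

b1, se1 = log_hr_and_se(0.65, 0.36, 1.15)  # receiving spironolactone
b2, se2 = log_hr_and_se(0.65, 0.51, 0.83)  # not receiving spironolactone
z = (b1 - b2) / math.sqrt(se1**2 + se2**2)
print(round(z, 3))   # ~0: no evidence the carvedilol effect differs
```

The identical point estimates give z = 0, matching the paper's conclusion that the benefit was similar in both subgroups; the wider CI in the spironolactone group simply reflects its smaller size.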

CONTEXT: It is uncertain whether intensified heart failure therapy guided by N-terminal brain natriuretic peptide (BNP) is superior to symptom-guided therapy. OBJECTIVE: To compare 18-month outcomes of N-terminal BNP-guided vs symptom-guided heart failure therapy. DESIGN, SETTING, AND PATIENTS: Randomized controlled multicenter Trial of Intensified vs Standard Medical Therapy in Elderly Patients With Congestive Heart Failure (TIME-CHF) of 499 patients aged 60 years or older with systolic heart failure (ejection fraction ≤45%), New York Heart Association (NYHA) class II or greater, prior hospitalization for heart failure within 1 year, and N-terminal BNP level of 2 or more times the upper limit of normal. The study had an 18-month follow-up and was conducted at 15 outpatient centers in Switzerland and Germany between January 2003 and June 2008. INTERVENTION: Uptitration of guideline-based treatments to reduce symptoms to NYHA class II or less (symptom-guided therapy), or to reduce both the BNP level to 2 times or less the upper limit of normal and symptoms to NYHA class II or less (BNP-guided therapy). MAIN OUTCOME MEASURES: Primary outcomes were 18-month survival free of all-cause hospitalizations and quality of life as assessed by structured validated questionnaires. RESULTS: Heart failure therapy guided by N-terminal BNP and symptom-guided therapy resulted in similar rates of survival free of all-cause hospitalizations (41% vs 40%, respectively; hazard ratio [HR], 0.91 [95% CI, 0.72-1.14]; P = .39). Patients' quality-of-life metrics improved over 18 months of follow-up, but these improvements were similar in the N-terminal BNP-guided and symptom-guided strategies. Compared with the symptom-guided group, survival free of hospitalization for heart failure, a secondary end point, was higher among those in the N-terminal BNP-guided group (72% vs 62%, respectively; HR, 0.68 [95% CI, 0.50-0.92]; P = .01).
Heart failure therapy guided by N-terminal BNP improved outcomes in patients aged 60 to 75 years but not in those aged 75 years or older (P < .02 for interaction). CONCLUSION: Heart failure therapy guided by N-terminal BNP did not improve overall clinical outcomes or quality of life compared with symptom-guided treatment. TRIAL REGISTRATION: isrctn.org Identifier: ISRCTN43596477.

OBJECTIVES The aim of this prospective cohort trial was to perform a cost/time analysis for implant-supported single-unit reconstructions in the digital workflow compared with the conventional pathway. MATERIALS AND METHODS A total of 20 patients were included for rehabilitation with 2 × 20 implant crowns in a crossover study design, each treated consecutively with customized titanium abutments plus CAD/CAM zirconia suprastructures (test: digital) and with standardized titanium abutments plus PFM crowns (control: conventional). Starting with the prosthetic treatment, clinical and laboratory work steps were analyzed, including costs in Swiss Francs (CHF), productivity rates, and cost minimization for first-line therapy. Statistical calculations were performed with the Wilcoxon signed-rank test. RESULTS Both protocols worked successfully for all test and control reconstructions. Direct treatment costs were significantly lower for the digital workflow (1815.35 CHF) than for the conventional pathway (2119.65 CHF) [P = 0.0004]. In the subprocess evaluation, total laboratory costs were 941.95 CHF for the test group and 1245.65 CHF for the control group [P = 0.003]. The clinical dental productivity rate amounted to 29.64 CHF/min (digital) and 24.37 CHF/min (conventional) [P = 0.002]. Overall, the cost-minimization analysis showed an 18% cost reduction within the digital process. CONCLUSION The digital workflow was more efficient than the established conventional pathway for implant-supported crowns in this investigation.
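
The paired cost comparison uses the Wilcoxon signed-rank test. A minimal sketch of the statistic on made-up per-patient cost pairs (not the trial data), without tie handling:

```python
import numpy as np

# Wilcoxon signed-rank statistic for paired samples: rank the absolute
# paired differences, then sum the ranks of the positive and negative
# differences; the smaller sum is the test statistic W. The eight cost
# pairs are illustrative assumptions, and ties are not handled.

def wilcoxon_w(x, y):
    d = np.asarray(x, float) - np.asarray(y, float)
    d = d[d != 0]
    ranks = np.argsort(np.argsort(np.abs(d))) + 1.0  # ranks of |d|
    w_pos = ranks[d > 0].sum()
    w_neg = ranks[d < 0].sum()
    return min(w_pos, w_neg)   # small W signals a systematic difference

digital      = [1790, 1820, 1805, 1830, 1800, 1815, 1810, 1825]
conventional = [2100, 2124, 2110, 2150, 2090, 2131, 2105, 2140]
w_stat = wilcoxon_w(digital, conventional)
print(w_stat)   # 0: every digital cost is below its conventional pair
```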

In situ and simultaneous measurement of the three most abundant isotopologues of methane using mid-infrared laser absorption spectroscopy is demonstrated. A field-deployable, autonomous platform is realized by coupling a compact quantum cascade laser absorption spectrometer (QCLAS) to a preconcentration unit, called trace gas extractor (TREX). This unit enhances CH4 mole fractions by a factor of up to 500 above ambient levels and quantitatively separates interfering trace gases such as N2O and CO2. The analytical precision of the QCLAS isotope measurement on the preconcentrated (750 ppm, parts-per-million, µmole mole−1) methane is 0.1 and 0.5 ‰ for δ13C- and δD-CH4 at 10 min averaging time. Based on repeated measurements of compressed air during a 2-week intercomparison campaign, the repeatability of the TREX–QCLAS was determined to be 0.19 and 1.9 ‰ for δ13C- and δD-CH4, respectively. In this intercomparison campaign the new in situ technique is compared to isotope-ratio mass spectrometry (IRMS) based on glass flask and bag sampling, and to real-time CH4 isotope analysis by two commercially available laser spectrometers. Both laser-based analyzers were limited to methane mole fraction and δ13C-CH4 analysis, and only one of them, a cavity ring-down spectrometer, was capable of delivering meaningful data for the isotopic composition. After correcting for scale offsets, the average differences between TREX–QCLAS data and bag/flask sampling–IRMS values are within the extended WMO compatibility goals of 0.2 and 5 ‰ for δ13C- and δD-CH4, respectively. This also demonstrates the potential to improve interlaboratory compatibility based on the analysis of a reference air sample with accurately determined isotopic composition.
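
The δ values reported here are per-mil deviations of an isotope ratio from a reference standard. A sketch of the calculation; the VPDB 13C/12C ratio below is one commonly used literature value, taken as an assumption:

```python
# Delta notation: delta = (R_sample / R_standard - 1) * 1000 permil.
# The VPDB 13C/12C reference ratio below is one commonly used value,
# assumed here for illustration.

R_VPDB = 0.0111802   # 13C/12C of the VPDB standard (assumed value)

def delta_permil(r_sample, r_standard):
    return (r_sample / r_standard - 1.0) * 1000.0

# A methane sample depleted in 13C relative to VPDB:
r_sample = R_VPDB * (1.0 - 47.0 / 1000.0)   # corresponds to -47 permil
d13c = delta_permil(r_sample, R_VPDB)
print(d13c)   # -47.0
```

The quoted precisions of 0.1 ‰ (δ13C) and 0.5 ‰ (δD) are uncertainties on exactly this kind of per-mil quantity.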

Background. The purpose of this study was to describe the risk factors and demographics of persons with salmonellosis and shigellosis and to investigate both seasonal and spatial variations in the occurrence of these infections in Texas from 2000 to 2004, using time-series analyses and geographic information system (GIS) digital mapping methods.
Methods. Spatial analysis: MapInfo software was used to map the distribution of age-adjusted rates of reported shigellosis and salmonellosis in Texas from 2000 to 2004 by zip code. Census data on poverty status, household income, highest level of educational attainment, race, ethnicity, and urban/rural community status were obtained from the 2000 Decennial Census for each zip code. The zip codes in the upper 10% and lower 10% were compared using t-tests and logistic regression to identify potential risk factors. Temporal analysis: Seasonal patterns in the prevalence of infections in Texas from 2000 to 2003 were determined by performing time-series analysis on the numbers of cases of salmonellosis and shigellosis. A linear regression was also performed to assess trends in the incidence of each disease, along with auto-correlation and multi-component cosinor analysis.
Results. Spatial analysis: Analysis by general linear model showed a significant association between infection rates and age, with children aged less than 5 years and those aged 5-9 years having an increased risk of infection for both diseases. The data demonstrated that populations with high percentages of people who attained more than a high school education were less likely to be represented in zip codes with high rates of shigellosis. However, for salmonellosis, logistic regression models indicated that, compared with populations with high percentages of non-high-school graduates, having a high school diploma or equivalent increased the odds of having a high rate of infection.
Temporal analysis: For shigellosis, multi-component cosinor analyses were used to determine the approximating cosine curve, which provided a statistically significant representation of the time-series data for all age groups by sex. The shigellosis results show 2 peaks, with a major peak occurring in June and a secondary peak appearing around October. Salmonellosis results showed a single peak and trough in all age groups, with the peak occurring in August and the trough occurring in February.
Conclusion. The results from this study can be used by public health agencies to determine the timing of public health awareness programs and interventions to prevent salmonellosis and shigellosis. Because young children depend on adults for their meals, it is important to increase the awareness of day-care workers and new parents about modes of transmission and hygienic methods of food preparation and storage.
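
The cosinor method fits cosine and sine terms by ordinary linear regression; with a single 12-month component, the peak month is recovered from the fitted coefficients. A sketch on synthetic counts peaking in August, not the Texas surveillance data:

```python
import numpy as np

# Single-component cosinor fit on monthly case counts:
#   y(t) = M + b*cos(w*t) + c*sin(w*t),  w = 2*pi/12,
# fitted by least squares; the acrophase (peak) comes from (b, c).
# The synthetic series peaking in August is an illustrative assumption.

t = np.arange(48)                 # four years of monthly data, Jan = index 0
peak_month = 8.0                  # August
y = 100 + 30*np.cos(2*np.pi*(t - (peak_month - 1))/12)

w = 2*np.pi/12
X = np.column_stack([np.ones_like(t), np.cos(w*t), np.sin(w*t)])
m, b, c = np.linalg.lstsq(X, y, rcond=None)[0]
phase = np.arctan2(c, b)          # y peaks where w*t = phase (mod 2*pi)
fitted_peak = (phase / w) % 12 + 1  # back to 1-12 month numbering
print(round(fitted_peak, 6))      # 8.0 -> August
```

A multi-component fit, as used in the study, simply adds cos/sin columns for further harmonics (e.g. a 6-month term) to the same regression.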

This paper discusses some issues which arise in the dataflow analysis of constraint logic programming (CLP) languages. The basic technique applied is that of abstract interpretation. First, some types of optimizations possible in a number of CLP systems (including efficient parallelization) are presented, and the information that has to be obtained at compile time in order to implement such optimizations is considered. Two approaches are then proposed and discussed for obtaining this information for a CLP program: one based on an analysis of a CLP metainterpreter using standard Prolog analysis tools, and a second one based on direct analysis of the CLP program. For the second approach, an abstract domain which approximates groundness (also referred to as "definiteness") information (i.e., constrained to a single value) and the related abstraction functions are presented.
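
The groundness ("definiteness") idea can be illustrated with a toy fixpoint computation: for a linear constraint over n variables, any one variable is constrained to a single value once the other n − 1 are ground. The sketch below illustrates that propagation only; it is not the paper's abstract domain:

```python
# Toy groundness ("definiteness") propagation for CLP constraints.
# Each constraint is a tuple of its variables; a variable becomes ground
# when all other variables in some constraint containing it are ground.
# This fixpoint iteration is an illustration, not the paper's domain.

def groundness(constraints, initially_ground):
    """Return the set of variables inferred to be ground."""
    ground = set(initially_ground)
    changed = True
    while changed:
        changed = False
        for vars_ in constraints:
            for v in vars_:
                others = [u for u in vars_ if u != v]
                if v not in ground and all(u in ground for u in others):
                    ground.add(v)
                    changed = True
    return ground

# X = Y + Z,  Z = 3,  W = X * 2  (each written as its variable tuple)
cs = [("X", "Y", "Z"), ("Z",), ("W", "X")]
gset = groundness(cs, {"Y"})
print(sorted(gset))   # Y and Z ground make X, and then W, ground
```

Information of exactly this kind (which variables are definitely constrained to a single value at a program point) is what enables the optimizations discussed above, such as safe parallelization.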