914 results for time to event analysis
Abstract:
Hydrologic risk (and the closely related hydro-geologic risk) is, and has always been, a highly relevant issue, given the severe human and economic losses that floods, and water in general, can cause. Floods are natural, often catastrophic, phenomena that cannot be avoided, but their damage can be reduced if they are predicted sufficiently in advance. For this reason, flood forecasting plays an essential role in hydro-geological and hydrological risk prevention. Thanks to the development of sophisticated meteorological, hydrologic and hydraulic models, flood forecasting has made significant progress in recent decades; nonetheless, models are imperfect, which means that we are still left with residual uncertainty about what will actually happen. This type of uncertainty is what this thesis discusses and analyzes. In operational problems, the ultimate aim of a forecasting system is not to reproduce the river's behavior: that is only a means of reducing the uncertainty about what will happen as a consequence of a precipitation event. In other words, the main objective is to assess whether or not preventive interventions should be adopted and which operational strategy represents the best option. The main problem for a decision maker is to interpret model results and translate them into an effective intervention strategy. To make this possible, it is necessary to define clearly what is meant by uncertainty, since the literature is often confused on this issue. The first objective of this thesis is therefore to clarify this concept, starting from a key question: should the choice of intervention strategy be based on evaluating the model prediction by its ability to represent reality, or on evaluating what will actually happen, given the information provided by the model forecast?
Once the previous idea is made unambiguous, the other main concern of this work is to develop a tool that provides effective decision support, making objective and realistic risk evaluations possible. In particular, such a tool should provide as accurate an uncertainty assessment as possible. This primarily means three things: it must correctly combine all the available deterministic forecasts, it must assess the probability distribution of the predicted quantity, and it must quantify the flooding probability. Furthermore, given that the time available to implement prevention strategies is often limited, the flooding probability has to be linked to the time of occurrence. It is therefore necessary to quantify the flooding probability within a time horizon related to the time required to implement the intervention strategy, and also to assess the probability distribution of the flooding time.
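The two quantities described above (flooding probability within a horizon, and the distribution of the flooding time) can be illustrated with a minimal ensemble-based sketch. This is not the thesis's actual method; the function name, data and threshold are invented for illustration.

```python
def flood_risk(ensemble, threshold, horizon):
    """Illustrative sketch: estimate the probability of flooding within a
    time horizon from an ensemble of forecast hydrographs.

    ensemble  : list of forecast level sequences, one value per time step
    threshold : flood level
    horizon   : number of time steps available to intervene
    Returns (probability of exceedance within the horizon,
             first-exceedance times of the members that flood).
    """
    first_times = []
    for member in ensemble:
        for t, level in enumerate(member[:horizon]):
            if level >= threshold:
                first_times.append(t)
                break
    prob = len(first_times) / len(ensemble)
    return prob, first_times

# Toy ensemble: 4 forecast members, 6 time steps each (invented numbers).
ensemble = [
    [1.0, 1.5, 2.2, 3.1, 2.0, 1.0],   # exceeds 3.0 at t = 3
    [1.0, 1.2, 1.4, 1.5, 1.4, 1.2],   # never exceeds
    [1.0, 2.0, 3.5, 4.0, 3.0, 2.0],   # exceeds 3.0 at t = 2
    [1.0, 1.1, 1.3, 1.6, 1.8, 1.9],   # never exceeds
]
prob, times = flood_risk(ensemble, threshold=3.0, horizon=6)
```

The empirical distribution of `times` is the (crude) assessed distribution of the flooding time within the horizon.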
Abstract:
BACKGROUND The use of combination antiretroviral therapy (cART) comprising three antiretroviral medications from at least two classes of drugs is the current standard treatment for HIV infection in adults and children. Current World Health Organization (WHO) guidelines for antiretroviral therapy recommend early treatment, regardless of immunologic thresholds or clinical condition, for all infants (less than one year of age) and children under the age of two years. For children aged two to five years, current WHO guidelines recommend (based on low-quality evidence) that clinical and immunological thresholds be used to identify those who need to start cART (advanced clinical stage, or CD4 count ≤ 750 cells/mm(3), or per cent CD4 ≤ 25%). This Cochrane review summarizes the currently available evidence regarding the optimal time for treatment initiation in children aged two to five years, with the goal of informing the revision of the WHO 2013 recommendations on when to initiate cART in children. OBJECTIVES To assess the evidence for the optimal time to initiate cART in treatment-naive, HIV-infected children aged 2 to 5 years. SEARCH METHODS We searched the Cochrane Central Register of Controlled Trials (CENTRAL), MEDLINE, EMBASE, the AEGIS conference database, specific relevant conferences, www.clinicaltrials.gov, the World Health Organization International Clinical Trials Registry Platform and reference lists of articles. The date of the most recent search was 30 September 2012. SELECTION CRITERIA Randomised controlled trials (RCTs) that compared immediate with deferred initiation of cART, and prospective cohort studies that followed children from enrolment to the start of cART and on cART.
DATA COLLECTION AND ANALYSIS Two review authors considered studies for inclusion in the review, assessed the risk of bias, and extracted data on the primary outcome of death from all causes and several secondary outcomes, including incidence of CDC category C and B clinical events and per cent CD4 cells (CD4%) at study end. For RCTs we calculated relative risks (RR) or mean differences with 95% confidence intervals (95% CI). For cohort data, we extracted relative risks with 95% CI from adjusted analyses. We combined results from RCTs using a random-effects model and examined statistical heterogeneity. MAIN RESULTS Two RCTs in HIV-positive children aged 1 to 12 years were identified. One trial was the pilot study for the larger second trial, and both compared initiation of cART regardless of clinical-immunological conditions with initiation deferred until per cent CD4 dropped to <15%. The two trials were conducted in Thailand, and in Thailand and Cambodia, respectively. Unpublished analyses of the 122 children enrolled at ages 2 to 5 years were included in this review. There was one death in the immediate cART group and no deaths in the deferred group (RR 2.9; 95% CI 0.12 to 68.9). In the subgroup analysis of children aged 24 to 59 months, there was one CDC C event in each group (RR 0.96; 95% CI 0.06 to 14.87) and 8 and 11 CDC B events in the immediate and deferred groups, respectively (RR 0.95; 95% CI 0.24 to 3.73). In this subgroup, the mean difference in CD4 per cent at study end was 5.9% (95% CI 2.7 to 9.1). One cohort study from South Africa, which compared the effect of delaying cART for up to 60 days in 573 HIV-positive children starting tuberculosis treatment (median age 3.5 years), was also included. The adjusted hazard ratio for the effect on mortality of delaying ART for more than 60 days was 1.32 (95% CI 0.55 to 3.16).
AUTHORS' CONCLUSIONS This systematic review shows that there is insufficient evidence from clinical trials in support of either early or CD4-guided initiation of ART in HIV-infected children aged 2 to 5 years. Programmatic issues such as the retention in care of children in ART programmes in resource-limited settings will need to be considered when formulating WHO 2013 recommendations.
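The relative risks with 95% CIs reported in the results above are conventionally computed from 2×2 counts with a Katz log-normal interval. A generic sketch of that calculation follows; it does not reproduce the review's exact numbers (those came from trial data and a random-effects model), and all counts below are invented. Note that when one arm has zero events, as in the mortality comparison above, a 0.5 continuity correction is conventionally added to every cell first.

```python
import math

def relative_risk(events1, n1, events2, n2, z=1.96):
    """Relative risk of group 1 vs. group 2 with a Katz log-normal CI.
    events1/n1 and events2/n2 are event counts over group sizes."""
    rr = (events1 / n1) / (events2 / n2)
    # Standard error of log(RR) under the Katz approximation.
    se = math.sqrt(1/events1 - 1/n1 + 1/events2 - 1/n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Invented example: 10/100 events vs. 5/100 events.
rr, lo, hi = relative_risk(10, 100, 5, 100)
```

Here the point estimate is RR = 2.0, but the interval spans 1, so the comparison would not be statistically significant, mirroring the wide intervals in the small trials above.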
Abstract:
The present dataset includes results of the analysis of 227 zooplankton samples taken in and off Sevastopol Bay in the Black Sea in 1976, 1979-1980, 1989-1990, 1995-1996 and 2002-2003. Exact coordinates for stations 1, 4, 5 and 6 are unknown and were estimated using the Google Earth program. Data on the Ctenophora Mnemiopsis leidyi and Beroe ovata are not included. Juday net: vertical tows of a Juday net with a mouth area of 0.1 m**2 and a mesh size of 150 µm. Tows were performed over discrete depth layers, at a towing speed of about 0.5 m/s. Samples were preserved in a 4% formaldehyde sea-water buffered solution. The sampled volume was estimated by multiplying the mouth area by the wire length. The collected material was analysed using the method of portions (Yashnov, 1939). Samples were brought to a volume of 50-100 ml, depending on zooplankton density, and mixed intensively until all organisms were distributed randomly in the sample volume. Then 1 ml of the sample was taken with a calibrated Stempel pipette; this operation was performed twice. If the counts of the two subsamples diverged by more than 30%, one more subsample was examined. Large (>1 mm body length) and scarce species were counted in a 1/2, 1/4, 1/8, 1/16 or 1/32 part of the sample. Organisms were counted and measured in a Bogorov chamber under a stereomicroscope, to the lowest taxon possible. The number of organisms per sample was calculated as the simple average of the two subsample counts multiplied by the number of subsample volumes in the whole sample. The total abundance of mesozooplankton was calculated as the sum of the taxon-specific abundances, and the total abundance of copepods as the sum of the copepod taxon-specific abundances.
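The counting arithmetic described above can be sketched as follows. The numbers and variable names are illustrative, not taken from the dataset:

```python
def abundance_per_m3(subsample_counts, sample_volume_ml,
                     subsample_volume_ml, mouth_area_m2, wire_length_m):
    """Abundance (individuals per m**3) following the method of portions:
    average the subsample counts, scale up to the whole concentrated
    sample, then divide by the volume of water filtered by the net
    (mouth area x wire length)."""
    per_sample = (sum(subsample_counts) / len(subsample_counts)
                  * sample_volume_ml / subsample_volume_ml)
    filtered_m3 = mouth_area_m2 * wire_length_m
    return per_sample / filtered_m3

# Example: 20 and 24 organisms in two 1-ml subsamples of a sample brought
# to 50 ml, from a 10-m tow with a 0.1 m**2 net mouth (1 m**3 filtered).
n = abundance_per_m3([20, 24], 50, 1, 0.1, 10)
```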
Abstract:
Real-time scheduling usually considers worst-case values for the parameters of task (or message stream) sets, in order to provide safe schedulability tests for hard real-time systems. However, worst-case conditions introduce a level of pessimism that is often inadequate for a certain class of (soft) real-time systems. In this paper we provide an approach for computing the stochastic response time of tasks, where tasks have inter-arrival times described by discrete probability distribution functions instead of minimum inter-arrival time (MIT) values.
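A basic building block in this kind of stochastic analysis is combining discrete probability distributions by convolution, for instance to obtain the distribution of the total demand of two independent jobs. The sketch below is a generic illustration of that operation, not the paper's actual response-time algorithm:

```python
from collections import defaultdict

def convolve(pmf_a, pmf_b):
    """Convolve two discrete PMFs given as {value: probability} dicts,
    i.e. the distribution of the sum of two independent random variables."""
    out = defaultdict(float)
    for va, pa in pmf_a.items():
        for vb, pb in pmf_b.items():
            out[va + vb] += pa * pb
    return dict(out)

# Two jobs whose execution times are 1 or 2 time units with equal
# probability; `demand` is the PMF of their combined demand.
demand = convolve({1: 0.5, 2: 0.5}, {1: 0.5, 2: 0.5})
```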
Abstract:
Over the past decades several approaches to schedulability analysis have been proposed for both uni-processor and multi-processor real-time systems. Although different techniques are employed, very little has been put forward on the use of formal specifications, with the consequent possibility of misinterpretations or ambiguities in the problem statement. Using a logic-based approach to schedulability analysis in the design of hard real-time systems eases the synthesis of correct-by-construction procedures for both static and dynamic verification processes. In this paper we propose a novel approach to schedulability analysis based on a timed temporal logic with time durations. Our approach subsumes classical methods for uni-processor schedulability analysis over compositional resource models by providing the developer with counter-examples, and by ruling out schedules that cause safety violations in the system. We also provide an example showing the effectiveness of our proposal.
Abstract:
Monitoring organic environmental contaminants is of crucial importance to ensure public health. This requires simple, portable and robust devices to carry out on-site analysis. For this purpose, a low-temperature co-fired ceramics (LTCC) microfluidic potentiometric device (LTCC/μPOT) was developed for the first time for an organic compound: sulfamethoxazole (SMX). Sensory materials relied on newly designed plastic antibodies. Sol–gel, self-assembled monolayer and molecular-imprinting techniques were merged for this purpose. Silica beads were amine-modified and linked to SMX via glutaraldehyde modification. Condensation polymerization was conducted around SMX to fill the vacant spaces. SMX was then removed, leaving behind imprinted sites of complementary shape. The obtained particles were used as ionophores in plasticized PVC membranes. The most suitable membrane composition was selected in steady-state assays. Its suitability for flow analysis was verified in flow-injection studies with regular tubular electrodes. The LTCC/μPOT device integrated a bidimensional mixer, an embedded reference electrode based on Ag/AgCl, and an Ag-based contact screen-printed under a micromachined cavity of 600 μm depth. The sensing membranes were deposited over this contact and acted as indicating electrodes. Under optimum conditions, the SMX sensor displayed slopes of about −58.7 mV/decade in a range from 12.7 to 250 μg/mL, providing a detection limit of 3.85 μg/mL and a sampling throughput of 36 samples/h with a reagent consumption of 3.3 mL per sample. The system was later extended to multiple-analyte detection by including a second potentiometric cell on the LTCC/μPOT device. No additional reference electrode was required. This concept was applied to trimethoprim (TMP), always administered concomitantly with sulphonamide drugs, and tested in fish-farming waters. The biparametric microanalyzer displayed Nernstian behaviour, with average slopes of −54.7 (SMX) and +57.8 (TMP) mV/decade.
To demonstrate the microanalyzer capabilities for real applications, it was successfully applied to single and simultaneous determination of SMX and TMP in aquaculture waters.
Abstract:
BACKGROUND: Excision and primary midline closure for pilonidal disease (PD) is a simple procedure; however, it is frequently complicated by infection and prolonged healing. The aim of this study was to analyze risk factors for surgical site infection (SSI) in this context. METHODS: All consecutive patients undergoing excision and primary closure for PD from January 2002 through October 2008 were retrospectively assessed. The end points were SSI, as defined by the Centers for Disease Control and Prevention, and time to healing. Univariable and multivariable risk factor analyses were performed. RESULTS: One hundred thirty-one patients were included [97 men (74%), median age = 24 (range 15-66) years]. SSI occurred in 41 (31%) patients. Median time to healing was 20 days (range 12-76) in patients without SSI and 62 days (range 20-176) in patients with SSI (P < 0.0001). In univariable and multivariable analyses, smoking [OR = 2.6 (95% CI 1.02, 6.8), P = 0.046] and lack of antibiotic prophylaxis [OR = 5.6 (95% CI 2.5, 14.3), P = 0.001] were significant predictors of SSI. Adjusted for SSI, age over 25 was a significant predictor of prolonged healing. CONCLUSION: This study suggests that the rate of SSI after excision and primary closure of PD is higher in smokers and could be reduced by antibiotic prophylaxis. SSI significantly prolongs healing time, particularly in patients over 25 years.
Abstract:
BACKGROUND AND STUDY AIMS Colon capsule endoscopy (CCE) was developed for the evaluation of colorectal pathology. In this study, our aim was to assess whether a dual-camera analysis using CCE allows better evaluation of the whole gastrointestinal (GI) tract than a single-camera analysis. PATIENTS AND METHODS We included 21 patients (12 males, mean age 56.2 years) who underwent a CCE examination. After standard colon preparation, the colon capsule endoscope (PillCam Colon™) was swallowed after reinitiation from its "sleep" mode. Four physicians performed the analysis: two reviewed both video streams at the same time (dual-camera analysis); one analyzed images from one side of the device ("camera 1"); and the other reviewed the opposite side ("camera 2"). We compared the numbers of findings from different parts of the entire GI tract and the level of agreement among reviewers. RESULTS A complete evaluation of the GI tract was possible in all patients. Dual-camera analysis provided 16% and 5% more findings compared to camera 1 and camera 2 analysis, respectively. Overall agreement was 62.7% (kappa = 0.44, 95% CI: 0.373-0.510). Esophageal (kappa = 0.611) and colorectal (kappa = 0.595) findings had a good level of agreement, while the small bowel (kappa = 0.405) showed moderate agreement. CONCLUSION The use of dual-camera analysis with CCE for the evaluation of the GI tract is feasible and detects more abnormalities than single-camera analysis.
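The agreement statistic used above is Cohen's kappa, which corrects the observed agreement for the agreement expected by chance. A minimal sketch follows; the chance-agreement value of 0.334 is my own invented figure, chosen only so the toy numbers are consistent with the 62.7% observed agreement and kappa ≈ 0.44 reported above:

```python
def cohens_kappa(observed_agreement, chance_agreement):
    """Cohen's kappa: observed inter-rater agreement corrected for the
    agreement expected by chance alone."""
    return ((observed_agreement - chance_agreement)
            / (1 - chance_agreement))

# 62.7% observed agreement; assume 33.4% expected by chance.
k = cohens_kappa(0.627, 0.334)
```

In practice the chance agreement is computed from the raters' marginal category frequencies, not assumed.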
Abstract:
INTRODUCTION Tolerability and convenience are crucial aspects for the long-term success of combined antiretroviral therapy (cART). The aim of this study was to investigate the impact, in routine clinical practice, of switching to the single-tablet regimen (STR) RPV/FTC/TDF in patients with intolerance to previous cART, in terms of patients' well-being, assessed by several validated measures. METHODS Prospective, multicenter study. Adult HIV-infected patients with a viral load under 1,000 copies/mL while receiving stable ART for at least the previous three months, who switched to RPV/FTC/TDF due to intolerance of the previous regimen, were included. Analyses were performed by intention-to-treat (ITT). Presence/magnitude of symptoms (ACTG-HIV Symptom Index), quality of life (EQ-5D, EUROQoL & MOS-HIV), adherence (SMAQ), treatment preference and perceived ease of medication (ESTAR) were assessed through 48 weeks. RESULTS An interim analysis of 125 patients with 16 weeks of follow-up was performed. One hundred (80%) were male; mean age was 46 years. Mean CD4 count at baseline was 629.5±307.29 and 123 (98.4%) had a viral load <50 copies/mL; 15% were HCV co-infected. Ninety-two (73.6%) patients switched from an NNRTI (84.8% from EFV/FTC/TDF) and 33 (26.4%) from a PI/r. The most frequent reasons for switching were psychiatric disorders (51.2%), CNS adverse events (40.8%), gastrointestinal (19.2%) and metabolic disorders (19.2%). At the time of this analysis (week 16), four patients (3.2%) had discontinued treatment: one due to adverse events, two virologic failures and one with no data. A total of 104 patients (83.2%) were virologically suppressed (<50 copies/mL). The average degree of discomfort in the ACTG-HIV Symptom Index decreased significantly from baseline (21±15.55) to week 4 (10.89±12.36) and week 16 (10.81±12.62), p<0.001. In all patients, quality-of-life tools showed a significant benefit in well-being (Table 1).
Adherence to therapy increased significantly and progressively (SMAQ), from baseline (54.4%) to week 4 (68%), p<0.001, and to week 16 (72.0%), p<0.001. CONCLUSIONS Switching to RPV/FTC/TDF from another ARV regimen due to toxicity significantly improved the quality of life of HIV-infected patients, in both mental and physical components, and improved adherence to therapy while maintaining a good immune and virological response.
Abstract:
Because of the various matrices available for forensic investigations, the development of versatile analytical approaches allowing the simultaneous determination of drugs is challenging. The aim of this work was to assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) platform allowing the rapid quantification of colchicine in body fluids and tissues collected in the context of a fatal overdose. For this purpose, filter paper was used as a sampling support and was associated with an automated 96-well plate extraction performed by the LC autosampler itself. The developed method features a 7-min total run time including automated filter paper extraction (2 min) and chromatographic separation (5 min). The sample preparation was reduced to a minimum regardless of the matrix analyzed. This platform was fully validated for dried blood spots (DBS) in the toxic concentration range of colchicine. The DBS calibration curve was applied successfully to quantification in all other matrices (body fluids and tissues) except for bile, where an excessive matrix effect was found. The distribution of colchicine for a fatal overdose case was reported as follows: peripheral blood, 29 ng/ml; urine, 94 ng/ml; vitreous humour and cerebrospinal fluid, < 5 ng/ml; pericardial fluid, 14 ng/ml; brain, < 5 pg/mg; heart, 121 pg/mg; kidney, 245 pg/mg; and liver, 143 pg/mg. Although filter paper is usually employed for DBS, we report here the extension of this alternative sampling support to the analysis of other body fluids and tissues. The developed platform represents a rapid and versatile approach for drug determination in multiple forensic media.
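Quantification against a calibration curve, as used for the DBS above, amounts to fitting a line to calibration standards and inverting it for the unknown. A generic least-squares sketch follows; the concentrations and responses below are invented, not the paper's data:

```python
def fit_line(x, y):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx

def back_calculate(response, slope, intercept):
    """Invert the calibration line to get a concentration."""
    return (response - intercept) / slope

# Hypothetical calibration standards (ng/ml) vs. detector responses,
# lying exactly on resp = 2*conc + 1 for clarity.
conc = [5, 10, 25, 50, 100]
resp = [11, 21, 51, 101, 201]
a, b = fit_line(conc, resp)
unknown = back_calculate(59, a, b)   # response measured for an unknown
```

A validated method would also check matrix effects (as the authors did for bile, where the curve could not be applied).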
Abstract:
We conducted a preliminary, questionnaire-based, retrospective analysis of training and injury in British National Squad Olympic-distance (OD) and Ironman-distance (IR) triathletes. The main outcome measures were training duration and frequency, and injury frequency and severity. The number of overuse injuries sustained over a 5-year period did not differ between OD and IR. However, the proportions of OD and IR athletes affected by injury to particular anatomical sites differed (p < 0.05). Also, fewer OD athletes (16.7 vs. 36.8%, p < 0.05) reported that their injury recurred. Although OD sustained fewer running injuries than IR (1.6 +/- 0.5 vs. 1.9 +/- 0.3, p < 0.05), more subsequently stopped running (41.7 vs. 15.8%) and for longer (33.5 +/- 43.0 vs. 16.7 +/- 16.6 days, p < 0.01). In OD, the number of overuse injuries sustained correlated inversely with the percentage of training time, and the number of sessions, spent doing bike hill repetitions (r = -0.44 and -0.39, respectively, both p < 0.05). The IR overuse injury number correlated with the amount of intensive sessions done (r = 0.67, p < 0.01 and r = 0.56, p < 0.05 for duration of "speed run" and "speed bike" sessions). Coaches should note that training differences between triathletes who specialize in OD or IR competition may lead to differential risk of injury to specific anatomical sites. It is also important to note that cycle and run training may have a "cumulative stress" influence on injury risk. Therefore, the tendency of some triathletes to modify rather than stop training when injured (usually by increasing the load in a discipline other than that in which the injury first occurred) may increase both their risk of injury recurrence and the time to full rehabilitation.
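The training-injury associations above are Pearson correlation coefficients. As a brief reminder of the computation, here is a generic sketch with toy data loosely mirroring the negative association reported (the numbers are invented):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy data: overuse injuries decreasing as hill-repetition sessions rise.
r = pearson_r([1, 2, 3, 4, 5], [5, 4, 4, 2, 1])
```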
Abstract:
Accurate diagnosis of orthopedic device-associated infections can be challenging. Culture of tissue biopsy specimens is often considered the gold standard; however, there is currently no consensus on the ideal incubation time for specimens. The aim of our study was to assess the yield of a 14-day incubation protocol for tissue biopsy specimens from revision surgery (joint replacements and internal fixation devices) in a general orthopedic and trauma surgery setting. Medical records were reviewed retrospectively in order to identify cases of infection according to predefined diagnostic criteria. From August 2009 to March 2012, 499 tissue biopsy specimens were sampled from 117 cases. In 70 cases (59.8%), at least one sample showed microbiological growth. Among them, 58 cases (82.9%) were considered infections and 12 cases (17.1%) were classified as contaminations. The median time to positivity in the cases of infection was 1 day (range, 1 to 10 days), compared to 6 days (range, 1 to 11 days) in the cases of contamination (P < 0.001). Fifty-six (96.6%) of the infection cases were diagnosed within 7 days of incubation. In conclusion, the results of our study show that incubation of tissue biopsy specimens beyond 7 days is not productive in a general orthopedic and trauma surgery setting. However, prolonged 14-day incubation might be of interest in particular situations in which the prevalence of slow-growing microorganisms and anaerobes is higher.
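The cumulative-yield figure above (the fraction of infections diagnosed within 7 days) is straightforward to compute from a list of times to positivity. A minimal sketch with invented times:

```python
def yield_by_day(times_to_positivity, day):
    """Fraction of culture-positive cases flagged on or before `day`."""
    return (sum(t <= day for t in times_to_positivity)
            / len(times_to_positivity))

# Hypothetical times to positivity (days) for ten culture-positive cases.
times = [1, 1, 1, 2, 2, 3, 4, 6, 7, 10]
week_one = yield_by_day(times, 7)   # cumulative yield at day 7
```

Plotting this yield for each cut-off day makes the diminishing return of incubation beyond one week directly visible.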
Abstract:
This tutorial review details some of the recent advances in signal analyses applied to event-related potential (ERP) data. These "electrical neuroimaging" analyses provide reference-independent measurements of response strength and response topography that circumvent statistical and interpretational caveats of canonical ERP analysis methods while also taking advantage of the greater information provided by high-density electrode montages. Electrical neuroimaging can be applied across scales ranging from group-averaged ERPs to single-subject and single-trial datasets. We illustrate these methods with a tutorial dataset and place particular emphasis on their suitability for studies of clinical and/or developmental populations.
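One widely used reference-independent measure of response strength in electrical neuroimaging is the global field power (GFP): the spatial standard deviation of the scalp potential across all electrodes at a given time point. The sketch below is a generic illustration (the electrode values are invented), not code from the review:

```python
import math

def global_field_power(voltages):
    """Global field power at one time point: the spatial standard
    deviation of the potential across all electrodes. Adding any
    constant (i.e. changing the reference) leaves it unchanged."""
    n = len(voltages)
    mean = sum(voltages) / n
    return math.sqrt(sum((v - mean) ** 2 for v in voltages) / n)

# Invented potentials (µV) at 8 electrodes for a single time point.
volts = [2.0, -1.0, 0.5, 3.0, -2.5, 1.0, -0.5, 0.5]
gfp = global_field_power(volts)
```

Because only deviations from the spatial mean enter the formula, re-referencing the montage does not change the result, which is the sense in which such measures are reference-independent.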
Abstract:
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where recent years have been extremely disastrous. It is thus possible to compare a hazard assessment based on data prior to those years with an analysis that includes them. With our approach, no significant change is detected once the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The occurrence of events in time is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event, and also independent of their occurrence times. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of derived quantities such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
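Under the peaks-over-threshold model described above, the return period of a wave height x combines the Poisson event rate with the GPD exceedance probability: T(x) = 1 / (rate × P(X > x)). A minimal sketch of that point calculation follows; the parameter values are invented for illustration and the full Bayesian analysis would propagate parameter uncertainty rather than plug in single values:

```python
import math

def gpd_survival(x, threshold, scale, shape):
    """P(X > x) for excesses over `threshold` following a GPD."""
    z = (x - threshold) / scale
    if shape == 0:
        return math.exp(-z)          # exponential limit of the GPD
    return max(0.0, 1.0 + shape * z) ** (-1.0 / shape)

def return_period(x, threshold, scale, shape, rate):
    """Mean years between events exceeding x, given `rate` events/year."""
    return 1.0 / (rate * gpd_survival(x, threshold, scale, shape))

# Invented parameters: 5 m threshold, 1 m scale, shape 0.1, 2 events/yr.
T = return_period(7.0, threshold=5.0, scale=1.0, shape=0.1, rate=2.0)
```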