973 results for Early decision


Relevance: 30.00%

Abstract:

INTRODUCTION: The decision of when to start dialysis in patients with acute kidney injury (AKI) presenting with uremic syndrome is well established; however, with urea < 200 mg/dl, the best moment to start dialysis becomes uncertain. OBJECTIVE: This study aimed to evaluate mortality and recovery of renal function in patients with AKI whose dialysis was started at different urea levels. METHODS: Retrospective study carried out at a teaching hospital in the state of São Paulo, Brazil, involving 86 patients undergoing dialysis. RESULTS: Dialysis was started with urea ≤ 150 mg/dl in 23 patients (group I) and with urea > 150 mg/dl in 63 patients (group II). Hypervolemia was more frequent in group I (65.2 vs. 14.2%, p < 0.05), whereas mortality was higher in group II (39.1 vs. 68.9%, p < 0.05). Among survivors, renal recovery was greater in group I (71.4 vs. 36.8%, p < 0.05). Multivariate analysis showed an independent risk of mortality associated with sepsis, age > 60 years, peritoneal dialysis, and urea > 150 mg/dl at the start of dialysis. CONCLUSION: Lower mortality and greater renal recovery are associated with dialysis started early, at lower urea levels, in patients with AKI.

Relevance: 30.00%

Abstract:

Background: Leptospirosis is an important zoonotic disease associated with poor areas of urban settings in developing countries, where early diagnosis and prompt treatment may prevent disease. Although rodents are reportedly considered the main reservoirs of leptospirosis, dogs may develop the disease, may become asymptomatic carriers, and may be used as sentinels for disease epidemiology. The use of Geographical Information Systems (GIS) combined with spatial analysis techniques allows the mapping of the disease and the identification and assessment of health risk factors. Besides GIS and spatial analysis, the data mining technique of decision trees has great potential to find patterns in the behavior of the variables that determine the occurrence of leptospirosis. The objective of the present study was to apply Geographical Information Systems and data mining (decision tree) to evaluate the risk factors for canine leptospirosis in an area of Curitiba, PR.

Materials, Methods & Results: The present study was performed in Vila Pantanal, a poor urban community in the city of Curitiba. A total of 287 dog blood samples were randomly obtained house-by-house in a two-day sampling in January 2010. In addition, a questionnaire was applied to owners at the time of sampling. Geographical coordinates for each household with a tested dog were obtained using a Global Positioning System (GPS) to map the spatial distribution of dogs reagent and non-reagent to leptospirosis. For the decision tree, risk factors included the results of the microagglutination test (MAT) on dog serum, previous disease in the household, contact with rats or other dogs, dog breed, outdoor access, feeding, trash around the house or backyard, open-sewer proximity, and flooding. A total of 189 samples (about 2/3 of all samples) were randomly selected for the training file and the consequent decision rules; the remaining 98 samples were used for the testing file. The seroprevalence showed a spatial distribution covering the whole Pantanal area, without clustering of reagent animals. Regarding the data mining, of the 189 samples used in the decision tree, 165 (87.3%) were correctly classified, yielding a Kappa index of 0.413. Of the 159 non-reagent samples, 154 (96.8%) were correctly classified and only 5 (3.2%) were misclassified. On the other hand, only 11 (36.7%) of the reagent samples were correctly classified, with 19 (63.3%) misclassified.

Discussion: The spatial distribution covering the whole Pantanal area showed that all animals in the area are at risk of infection by Leptospira spp. Although most samples were classified correctly by the decision tree, seropositive animals proved difficult to separate, with only 36.7% classified correctly. This may occur because the number of seronegative animals greatly exceeds the number of seropositive ones, masking differences in the pattern of variable behavior. Data mining helped to evaluate the most important risk factors for leptospirosis in a poor urban community of Curitiba. The variables selected by the decision tree reflected important factors in the occurrence of the disease (lack of sewerage, presence of rats and rubbish, and dogs with free access to the street). The analyses showed the multifactorial character of the epidemiology of canine leptospirosis.
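For readers unfamiliar with the classification workflow described above (a roughly 2/3 training, 1/3 testing split evaluated with a Kappa index), a minimal Python sketch using scikit-learn follows. The feature set and labels are synthetic stand-ins, not the study's data, and the tree settings are arbitrary.

```python
# Minimal sketch of the decision-tree workflow described above:
# ~2/3 of samples for training, the rest for testing, with a Kappa
# index to assess agreement. All data here are synthetic.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import cohen_kappa_score, accuracy_score

rng = np.random.default_rng(0)
n = 287  # total sampled dogs reported in the abstract

# Hypothetical binary risk factors (rat contact, open sewer, street access, ...)
X = rng.integers(0, 2, size=(n, 6))
y = rng.integers(0, 2, size=n)  # 1 = MAT-reagent, 0 = non-reagent (synthetic)

# 189 training / 98 testing, mirroring the split reported in the abstract
X_train, X_test, y_train, y_test = train_test_split(
    X, y, train_size=189, random_state=0)

tree = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
y_pred = tree.predict(X_test)

print("accuracy:", accuracy_score(y_test, y_pred))
print("kappa:   ", cohen_kappa_score(y_test, y_pred))
```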

Relevance: 30.00%

Abstract:

This study examines the influence of early experience with different forms of aggressive behaviour on the fighting behaviour of young fish. Fry of the cichlid fish Oreochromis niloticus were raised from hatching in small groups consisting of a normal individual (the test fish) and either mutant conspecifics lacking the dorsal fin, and thereby the ability to perform fin displays, or normal ones. Following a 63-day period of development in groups, the test fish were confronted in their home tanks with an unfamiliar normal fish for 10 min. The fighting behaviour of the test fish was analyzed considering their previous group type (mutant or normal) and rank (alpha or beta). There was no difference between test fish in the rate and sequence of behaviour patterns used in fighting. However, test fish that had developed in mutant groups were rarely the first to bite in contests and had a longer latency to biting following the first bite of the stimulus fish than test fish with normal experience. This finding is attributable to the form of aggressive behaviour experienced by the test fish during development, and not to existing differences in the amount of aggression previously experienced, nor to previous rank, sex, or size relative to the stimulus fish. The results suggest that early experience influenced decision making by the test fish during the fight. The involvement of the fin displays and the possible mechanism of this influence are discussed.

Relevance: 30.00%

Abstract:

Colorectal cancer (CRC) is the most common tumour type in both sexes combined in Western countries. Although screening programmes, including the implementation of the faecal occult blood test and colonoscopy, may be able to reduce mortality by removing precursor lesions and by enabling diagnosis at an earlier stage, the burden of disease and mortality is still high. Improvements in diagnostic and treatment options have increased staging accuracy, functional outcome for early stages, and survival. Although high-quality surgery is still the mainstay of curative treatment, the management of CRC must be a multi-modal approach performed by an experienced multi-disciplinary expert team. Optimal choice of the individual treatment modality, according to disease localization and extent, tumour biology, and patient factors, can maintain quality of life and enable long-term survival, and even cure, in selected patients through a combination of chemotherapy and surgery. Treatment decisions must be based on the available evidence, which has been the basis for this consensus conference-based guideline delivering a clear proposal for diagnostic and treatment measures in each stage of rectal and colon cancer and in individual clinical situations. This ESMO guideline is recommended for use as the basis for treatment and management decisions.

Relevance: 30.00%

Abstract:

Aims: Cardiac grafts from non-heart-beating donors (NHBDs) could significantly increase organ availability and reduce waiting-list mortality. Reluctance to exploit hearts from NHBDs arises from obligatory delays in procurement, leading to periods of warm ischemia and possible subsequent contractile dysfunction. Means for early prediction of graft suitability prior to transplantation are thus required for the development of heart transplantation programs with NHBDs.

Methods and Results: Hearts (n = 31) isolated from male Wistar rats were perfused with modified Krebs-Henseleit buffer aerobically for 20 min, followed by global, no-flow ischemia (32°C) for 30, 50, 55 or 60 min. Reperfusion was unloaded for 20 min, and then loaded, in working mode, for 40 min. Left ventricular (LV) pressure was monitored using a micro-tip pressure catheter introduced via the mitral valve. Several hemodynamic parameters measured during early, unloaded reperfusion correlated significantly with LV work after 60 min of reperfusion (p < 0.001). Coronary flow and the production of lactate and lactate dehydrogenase (LDH) also correlated significantly with outcomes after 60 min of reperfusion (p < 0.05). Based on early reperfusion hemodynamic measures, a composite, weighted predictive parameter, incorporating heart rate (HR), developed pressure (DP) and end-diastolic pressure, was generated and evaluated against the HR*DP product after 60 min of reperfusion. Effective discriminating ability for this novel parameter was observed for four HR*DP cut-off values, particularly for ≥ 20 × 10³ mmHg·beats·min⁻¹ (p < 0.01).

Conclusion: Upon reperfusion of an NHBD heart, early evaluation, at the time of organ procurement, of cardiac hemodynamic parameters, as well as easily accessible markers of metabolism and necrosis, seems to accurately predict subsequent contractile recovery and could thus potentially be of use in guiding the decision to accept an ischemic heart for transplantation.
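The idea of a composite, weighted predictive parameter scored against a later HR*DP recovery cutoff can be illustrated with a short sketch. The weights and data below are hypothetical, chosen only to show the mechanics of such a discriminating parameter; the study's actual weighting scheme is not reproduced.

```python
# Sketch of a composite predictor built from early-reperfusion hemodynamics
# (HR, developed pressure DP, end-diastolic pressure EDP), tested against the
# HR*DP recovery cutoff quoted above. Weights and data are hypothetical.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 31  # number of isolated hearts in the study

hr = rng.normal(250, 30, n)               # beats/min, early unloaded reperfusion
dp = rng.normal(60, 20, n)                # developed pressure, mmHg
edp = rng.normal(15, 5, n)                # end-diastolic pressure, mmHg
hr_dp_60min = hr * rng.normal(80, 25, n)  # HR*DP after 60 min (synthetic)

# Hypothetical weights: higher HR and DP favor recovery, high EDP does not
composite = 0.5 * hr + 1.0 * dp - 2.0 * edp

# Dichotomize the outcome at the cutoff reported in the abstract
recovered = hr_dp_60min >= 20e3  # >= 20 x 10^3 mmHg*beats/min

print("AUC of composite score:", roc_auc_score(recovered, composite))
```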

Relevance: 30.00%

Abstract:

OBJECTIVE: To assess the virological outcome of patients with undetectable human immunodeficiency virus (HIV) viremia switched to tenofovir (TDF)-containing nucleoside-only (NUKE-only) treatments, and to investigate the factors influencing the physicians' decision to apply a non-established therapy. METHOD: Patients' characteristics and history were taken from the cohort database. To study the decision-making process, questionnaires were sent to all treating physicians. RESULTS: 49 patients were changed to TDF-containing NUKE-only treatment and 46 had a follow-up measurement of HIV viremia. Virological failure occurred in 16 (35%) patients. Virological failure was associated with previous mono or dual therapy and with a regimen including didanosine or abacavir. No failure occurred in the 15 patients without these predisposing factors. The main reasons for the change to TDF-containing NUKE-only treatment were side effects and a presumed favorable toxicity profile. The rationale behind this decision was mainly analogy to the zidovudine/lamivudine/abacavir maintenance therapy. CONCLUSION: TDF-containing NUKE-only treatment is associated with high early failure rates in patients with previous nucleoside reverse transcriptase inhibitor mono or dual therapy and in drug combinations containing didanosine or abacavir, but not in patients without these predisposing factors. In HIV medicine, treatment strategies that are not evidence-based are followed by a minority of experienced physicians and are driven by patients' needs, mainly to minimize treatment side effects.

Relevance: 30.00%

Abstract:

Studies with chronic schizophrenia patients have demonstrated that patients fluctuate between rigid and unpredictable responses in decision-making situations, a phenomenon which has been called dysregulation. The aim of this study was to investigate whether schizophrenia patients already display dysregulated behavior at the beginning of their illness. Thirty-two patients with first-episode schizophrenia or schizophreniform disorder and 30 healthy controls performed the two-choice prediction task. The decision-making behavior of first-episode patients was characterized by a high degree of dysregulation, accompanied by low metric entropy and a tendency towards increased mutual information. These results indicate that behavioral abnormalities during the two-choice prediction task are already present during the early stages of the illness.
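The two sequence measures mentioned above, metric entropy and mutual information, can be computed for a binary choice series as in the following sketch; the response sequence is synthetic and the block length is an arbitrary choice, not the study's exact estimator.

```python
# Sketch of two sequence statistics for a binary choice series from a
# two-choice prediction task: metric (block) entropy and mutual information
# between consecutive choices. The sequence below is synthetic.
import numpy as np
from collections import Counter

def block_entropy(seq, k=2):
    """Shannon entropy (bits) of overlapping length-k blocks, per symbol."""
    blocks = [tuple(seq[i:i + k]) for i in range(len(seq) - k + 1)]
    counts = np.array(list(Counter(blocks).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum() / k

def mutual_information(seq):
    """Mutual information (bits) between choice t and choice t+1."""
    x, y = np.asarray(seq[:-1]), np.asarray(seq[1:])
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            pxy = np.mean((x == a) & (y == b))
            px, py = np.mean(x == a), np.mean(y == b)
            if pxy > 0:
                mi += pxy * np.log2(pxy / (px * py))
    return mi

rng = np.random.default_rng(0)
choices = rng.integers(0, 2, 200)  # hypothetical left/right responses
print("metric entropy:    ", block_entropy(choices, k=3))
print("mutual information:", mutual_information(choices))
```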

Relevance: 30.00%

Abstract:

Most recently, discussion about the optimal treatment for different subsets of patients suffering from coronary artery disease has re-emerged, mainly because of the uncertainty among doctors and patients regarding the phenomenon of unpredictable early and late stent thrombosis. Surgical revascularization using multiple arterial bypass grafts has repeatedly proven its superiority compared to percutaneous intervention techniques, especially in patients suffering from left main stem disease and coronary 3-vessel disease. Several prospective randomized multicenter studies comparing early and mid-term results following percutaneous coronary intervention (PCI) and coronary artery bypass grafting (CABG) have been very restrictive with respect to patient enrollment, with less than 5% of all patients treated during the same time period being enrolled. Coronary artery bypass grafting allows the most complete revascularization in one session, because all target coronary vessels larger than 1 mm can be bypassed in their distal segments. Once the patient has been referred for surgery, surgeons have to consider the most complete arterial revascularization in order to decrease the long-term necessity for re-revascularization; for instance, the patency rate of the left internal thoracic artery grafted to the distal part of the left anterior descending artery may be as high as 90-95% after 10 to 15 years. Early mortality following isolated CABG operations has been as low as 0.6 to 1% in the most recent period (reports from the University Hospital Berne and the University Hospital of Zurich). Besides these excellent results, the CABG option seems to be less expensive than PCI over time, since the necessity for additional PCI is rather high following initial PCI, and the price of stent devices is still very high, particularly in Switzerland. Patients, insurers and experts in health care should be better and more honestly informed concerning the risks and costs of PCI and CABG procedures, as well as about the much higher rate of subsequent interventions following PCI. A team approach for all patients in whom both options could be offered seems mandatory to avoid unbalanced information being given to patients. Looking at the recent developments in transcatheter valve treatments, the revival of cardiological-cardiosurgical conferences seems to be a good option to optimize the cooperation between the two medical specialties: cardiology and cardiac surgery.

Relevance: 30.00%

Abstract:

BACKGROUND Randomized controlled trials (RCTs) stopped early for benefit (truncated RCTs) are increasingly common and, on average, overestimate the relative magnitude of benefit by approximately 30%. Investigators stop trials early when they consider it no longer ethical to enroll patients in a control group. The goal of this systematic review is to determine how investigators of ongoing or planned RCTs respond to the publication of a truncated RCT addressing a similar question. METHODS/DESIGN We will conduct systematic reviews to update the searches of 210 truncated RCTs to identify similar trials that were ongoing at the time of publication of the truncated trials, or that started subsequently ('subsequent RCTs'). Reviewers will determine in duplicate the similarity between the truncated and subsequent trials. We will analyze the epidemiology, distribution, and predictors of subsequent RCTs. We will also contact the authors of subsequent trials to determine their reasons for beginning, continuing, or prematurely discontinuing their own trials, and the extent to which they rely on the estimates from truncated trials. DISCUSSION To the extent that investigators begin or continue subsequent trials, they implicitly disagree with the decision to stop the truncated RCT because of an ethical mandate to administer the experimental treatment. The results of this study will help guide future decisions about when to stop RCTs early for benefit.

Relevance: 30.00%

Abstract:

OBJECT After subarachnoid hemorrhage (SAH), seizures occur in up to 26% of patients. The impact of seizures on outcome has been studied, yet their impact on grading is unknown. The authors evaluated the impact of early-onset seizures (EOS) on the grading of spontaneous SAH and on outcome. METHODS This retrospective analysis included consecutive patients with SAH who were treated at the NeuroCenter, Inselspital, University Hospital Bern, Switzerland, between January 2005 and December 2010. Demographic data, clinical data, and reports of EOS were recorded. EOS were defined as seizures occurring within 24 hours after ictus. Patients were graded according to the World Federation of Neurosurgical Societies (WFNS) scale pre- and postresuscitation and dichotomized into good (WFNS I-III) and poor (WFNS IV-V) grades. Outcome was assessed at 6 months by using the modified Rankin Scale (mRS); an mRS score of 0-3 was considered a good outcome and an mRS score of 4-6 a poor outcome. RESULTS Forty-one of 425 patients with SAH had EOS. Twenty-seven of those 41 patients (65.9%) had a poor WFNS grade. Twenty-eight (68.3%) achieved a good outcome, 11 (26.8%) had a poor outcome, and 2 (4.9%) were lost to follow-up. Early-onset seizures were proven in 9 of 16 electroencephalograms. EOS were associated with poor WFNS grade (OR 2.81, 97.5% CI 1.14-7.46; p = 0.03) and with good outcome (OR 4.01, 97.5% CI 1.63-10.53; p = 0.03). Increasing age, hydrocephalus, intracerebral hemorrhage, and intraventricular hemorrhage were associated with poor WFNS grade, whereas only age, intracerebral hemorrhage (p < 0.001), and poor WFNS grade (p < 0.001) were associated with poor outcome. CONCLUSIONS Patients with EOS were classified significantly more often in a poor grade initially, yet they significantly more often achieved a good outcome. The authors conclude that EOS can negatively influence grading. This might influence decision making in the care of patients with SAH, so the grading of patients with EOS should be interpreted with caution.

Relevance: 30.00%

Abstract:

OBJECTIVES To determine life expectancy for older women with breast cancer. DESIGN Prospective longitudinal study with 10 years of follow-up data. SETTING Hospitals or collaborating tumor registries in four geographic regions (Los Angeles, California; Minnesota; North Carolina; Rhode Island). PARTICIPANTS Women aged 65 and older at the time of breast cancer diagnosis with Stage I to IIIA disease and with measures of self-rated health (SRH) and walking ability at baseline (N = 615; 17% aged ≥80, 52% Stage I, 58% with ≥2 comorbidities). MEASUREMENTS Baseline SRH, baseline self-reported walking ability, and all-cause and breast cancer-specific estimated probability of 5- and 10-year survival. RESULTS At the time of breast cancer diagnosis, 39% of women reported poor SRH, and 28% reported limited ability to walk several blocks. The all-cause survival curves appear to separate after approximately 3 years, and the difference in survival probability between those with low SRH and limited walking ability and those with high SRH and no walking-ability limitation was significant (0.708 vs 0.855 at 5 years, P ≤ .001; 0.300 vs 0.648 at 10 years, P < .001). There were no differences between the groups in breast cancer-specific survival at 5 and 10 years (P = .66 at 5 years, P = .16 at 10 years). CONCLUSION The combination of low SRH and limited ability to walk several blocks at diagnosis is an important predictor of worse all-cause survival at 5 and 10 years. These self-report measures, easily assessed in clinical practice, may be an effective strategy to improve treatment decision-making in older adults with cancer.
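As a minimal sketch of how such group-wise survival probabilities are typically derived, the following code implements a Kaplan-Meier product-limit estimator on synthetic data. The group sizes, hazards, and censoring scheme are hypothetical illustrations, not the study cohort.

```python
# Kaplan-Meier product-limit estimator comparing a low-SRH / limited-walking
# group with the remainder. All data are synthetic.
import numpy as np

def kaplan_meier(time, event):
    """Return (event time, survival estimate S(t)) pairs."""
    order = np.argsort(time)
    time, event = time[order], event[order]
    rows, s = [], 1.0
    for t in np.unique(time[event == 1]):
        at_risk = (time >= t).sum()
        deaths = ((time == t) & (event == 1)).sum()
        s *= 1 - deaths / at_risk          # product-limit update
        rows.append((t, s))
    return np.array(rows)

rng = np.random.default_rng(0)
# Hypothetical: higher hazard in the low-SRH / limited-walking group,
# administrative censoring at 10 years of follow-up
t_low = rng.exponential(9.0, 170);   e_low = (t_low < 10).astype(int)
t_high = rng.exponential(20.0, 445); e_high = (t_high < 10).astype(int)

km_low = kaplan_meier(np.minimum(t_low, 10), e_low)
km_high = kaplan_meier(np.minimum(t_high, 10), e_high)
print("S(5y), low-SRH group: ", km_low[km_low[:, 0] <= 5][-1, 1])
print("S(5y), high-SRH group:", km_high[km_high[:, 0] <= 5][-1, 1])
```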

Relevance: 30.00%

Abstract:

BACKGROUND Uncertainty about the presence of infection results in unnecessary and prolonged empiric antibiotic treatment of newborns at risk for early-onset sepsis (EOS). This study evaluates the impact of this uncertainty on the diversity of management. METHODS A web-based survey with questions addressing the management of infection risk-adjusted scenarios was performed in Europe, North America, and Australia. Published national guidelines (n = 5) were reviewed and compared with the results of the survey. RESULTS 439 clinicians (68% neonatologists) from 16 countries completed the survey. In the low-risk scenario, 29% would start antibiotic therapy and 26% would not, both without laboratory investigations; 45% would start if laboratory markers were abnormal. In the high-risk scenario, 99% would start antibiotic therapy. In the low-risk scenario, 89% would discontinue antibiotic therapy before 72 hours. In the high-risk scenario, 35% would discontinue therapy before 72 hours, 56% would continue therapy for 5 to 7 days, and 9% for more than 7 days. Laboratory investigations were used in 31% of scenarios for the decision to start antibiotic treatment, and in 72% for the decision to discontinue it. National guidelines differ considerably regarding the decision to start therapy in low-risk situations and the decision to continue therapy in higher-risk situations. CONCLUSIONS There is broad diversity of clinical practice in the management of EOS and a lack of agreement between current guidelines. The results of the survey reflect the diversity of the national guidelines. Prospective studies on the management of neonates at risk of EOS, with safety endpoints, are needed.

Relevance: 30.00%

Abstract:

Group sequential methods and response-adaptive randomization (RAR) procedures have been applied in clinical trials for economic and ethical reasons. Group sequential methods can reduce the average sample size by allowing early stopping, but patients are allocated equally, so half are assigned to the inferior arm. RAR procedures tend to allocate more patients to the better arm; however, they require a larger sample size to attain a given power. This study intended to combine these two procedures. We applied the Bayesian decision theory approach to define our group sequential stopping rules and evaluated the operating characteristics under the RAR setting. The results showed that the Bayesian decision theory method was able to preserve the type I error rate and achieve favorable power; furthermore, by comparison with the error spending function method, we concluded that the Bayesian decision theory approach was more effective at reducing the average sample size.
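A minimal sketch of combining the two procedures is given below: Beta-binomial posteriors drive both the response-adaptive allocation and a group-sequential stopping rule. This is a generic posterior-probability rule on hypothetical data, not the specific loss-based Bayesian decision theory criterion evaluated in the study; the response rates, group size, and threshold are all assumptions.

```python
# Response-adaptive randomization (RAR) with group-sequential stopping,
# using Beta-binomial posteriors for a two-arm binary endpoint.
import numpy as np

rng = np.random.default_rng(0)
p_true = (0.35, 0.55)            # hypothetical true response rates
group_size, max_groups = 20, 10
stop_threshold = 0.99            # stop when P(arm 1 better | data) is extreme

s = np.zeros(2)                  # successes per arm
f = np.zeros(2)                  # failures per arm

def prob_arm1_better(draws=100_000):
    """Monte Carlo estimate of P(p1 > p0 | data) under Beta(1+s, 1+f) posteriors."""
    p0 = rng.beta(1 + s[0], 1 + f[0], draws)
    p1 = rng.beta(1 + s[1], 1 + f[1], draws)
    return (p1 > p0).mean()

for g in range(max_groups):
    pb = prob_arm1_better()
    # RAR: allocate each patient to arm 1 with probability pb
    alloc = rng.binomial(1, pb, group_size)
    outcomes = rng.binomial(1, np.where(alloc == 1, p_true[1], p_true[0]))
    for arm in (0, 1):
        mask = alloc == arm
        s[arm] += outcomes[mask].sum()
        f[arm] += mask.sum() - outcomes[mask].sum()
    # Group-sequential check after each cohort
    pb = prob_arm1_better()
    if max(pb, 1 - pb) > stop_threshold:
        print(f"stopped after group {g + 1}: P(arm 1 better) = {pb:.3f}")
        break
else:
    print(f"no early stop: P(arm 1 better) = {prob_arm1_better():.3f}")
```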

Relevance: 30.00%

Abstract:

My dissertation focuses mainly on Bayesian adaptive designs for phase I and phase II clinical trials. It includes three specific topics: (1) proposing a novel two-dimensional dose-finding algorithm for biological agents, (2) developing Bayesian adaptive screening designs to provide more efficient and ethical clinical trials, and (3) incorporating missing late-onset responses into an early stopping decision.

Treating patients with novel biological agents is becoming a leading trend in oncology. Unlike cytotoxic agents, for which toxicity and efficacy monotonically increase with dose, biological agents may exhibit non-monotonic patterns in their dose-response relationships. Using a trial with two biological agents as an example, we propose a phase I/II trial design to identify the biologically optimal dose combination (BODC), defined as the dose combination of the two agents with the highest efficacy and tolerable toxicity. A change-point model is used to reflect the fact that the dose-toxicity surface of the combined agents may plateau at higher dose levels, and a flexible logistic model is proposed to accommodate a possible non-monotonic pattern in the dose-efficacy relationship. During the trial, we continuously update the posterior estimates of toxicity and efficacy and assign patients to the most appropriate dose combination. We propose a novel dose-finding algorithm to encourage sufficient exploration of untried dose combinations in the two-dimensional space. Extensive simulation studies show that the proposed design has desirable operating characteristics in identifying the BODC under various patterns of dose-toxicity and dose-efficacy relationships.

Trials of combination therapies for the treatment of cancer play an increasingly important role in the battle against this disease. To handle more efficiently the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to select simultaneously among possible treatment combinations involving multiple agents. Our design formulates the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During trial conduct, we use the current posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multi-arm balanced factorial trial design: it allocates more patients to better treatments while providing higher power to identify the best treatment at the end of the trial. The design is most appropriate for trials that combine multiple agents and screen for the efficacious combination to be investigated further.

Phase II studies are usually single-arm trials conducted to test the efficacy of experimental agents and to decide whether an agent is promising enough to be sent to a phase III trial. Interim monitoring is employed to stop a trial early for futility, to avoid assigning an unacceptable number of patients to inferior treatments. We propose a Bayesian single-arm phase II design with continuous monitoring for estimating the response rate of the experimental drug. To address the issue of late-onset responses, we use a piecewise exponential model to estimate the hazard function of time-to-response data and handle the missing responses using a multiple imputation approach. We evaluate the operating characteristics of the proposed method through extensive simulation studies and show that it reduces the total trial duration and yields desirable operating characteristics for different physician-specified lower bounds of the response rate, across different true response rates.
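As a rough illustration of the continuous-monitoring idea, the sketch below implements a Beta-binomial futility rule that stops when the posterior probability of exceeding a physician-specified lower bound drops below a cutoff. The prior, bound, and cutoff are hypothetical, and the piecewise exponential model and multiple imputation for late-onset responses are not reproduced here.

```python
# Single-arm futility monitoring: stop when P(response rate > p0 | data)
# falls below a cutoff, under a Beta(a, b) prior on the response rate.
from scipy.stats import beta

def futility_stop(responses, enrolled, p0=0.20, a=1.0, b=1.0, cutoff=0.05):
    """Return (stop?, posterior tail P(p > p0 | data)) for a Beta-binomial model."""
    posterior_tail = beta.sf(p0, a + responses, b + enrolled - responses)
    return posterior_tail < cutoff, posterior_tail

# Example: 2 responses observed among 25 enrolled patients,
# with a hypothetical lower bound of interest p0 = 0.20
stop, prob = futility_stop(responses=2, enrolled=25)
print(f"P(p > 0.20 | data) = {prob:.3f}; stop for futility: {stop}")
```

In an actual continuously monitored trial, this check would be re-run as each response (or imputed late-onset response) accrues, rather than once at a fixed interim look.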