935 results for Duration of studies


Relevance: 100.00%

Abstract:

Rubber tree yield is reduced by weeds competing for environmental resources; hence the timing and duration of weed control influence the degree of weed interference. The objectives of this study were to evaluate the growth of rubber tree (Hevea brasiliensis) plants, to determine the critical period for weed control, and to assess the growth recovery of rubber trees that coexisted with weeds for different periods after planting. Two groups of treatments were established under field conditions in the first year of the investigation: one group comprised increasing periods of weed infestation, while the other comprised increasing periods of weed control, also including a weed-free check and a fully weed-infested check. In the second year of the investigation, weeds were fully controlled. Urochloa decumbens was the dominant weed (more than 90% cover). Crop growth was greatly reduced by weed interference. Plant height declined faster than any other trait. Plant height, leaf dry mass, and leaf area decreased by 99%, 97%, and 96%, respectively, and were the most reduced traits. Plant height also recovered faster than any other trait when the weed-control period was extended. However, stem dry mass increased by 750%, making it the trait that recovered most. The critical period for weed control was between 4 and 9.5 months after planting in the first year; nevertheless, the rubber trees showed marked growth recovery when weeds were controlled throughout the second year.
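The critical-period concept described above is usually located numerically: one curve relates growth to the length of the initial weedy period, another relates growth to the length of the initial weed-free period, and the critical period runs between the points where each crosses an acceptable-loss threshold. A minimal sketch of that calculation, using linear interpolation and entirely hypothetical data points (the study itself reports a 4 to 9.5 month window):

```python
# Sketch: locating a critical period for weed control from two treatment
# series via linear interpolation. All numbers below are hypothetical
# illustrations, not values from the study.

def crossing(xs, ys, threshold):
    """Return the x where the piecewise-linear curve ys(xs) first
    crosses `threshold` (ys assumed monotonic across the crossing)."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ys), zip(xs[1:], ys[1:])):
        if (y0 - threshold) * (y1 - threshold) <= 0 and y0 != y1:
            return x0 + (threshold - y0) * (x1 - x0) / (y1 - y0)
    return None

months = [0, 2, 4, 6, 8, 10, 12]
# Relative growth (% of weed-free check) after increasing periods of
# weed infestation from planting (declines the longer weeds remain)...
growth_weedy = [100, 98, 95, 80, 60, 45, 35]
# ...and after increasing periods of weed-free maintenance (rises).
growth_controlled = [35, 50, 70, 88, 96, 99, 100]

tolerance = 95  # accept at most a 5% growth reduction

begin = crossing(months, growth_weedy, tolerance)     # loss becomes unacceptable
end = crossing(months, growth_controlled, tolerance)  # control no longer needed
print(f"critical period: {begin:.1f} to {end:.1f} months after planting")
```

The two crossing points bracket the interval during which the crop must be kept weed-free to stay within the chosen loss tolerance.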

Relevance: 100.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 100.00%

Abstract:

This trial aimed to compare the dialysis complications occurring during different durations of extended daily dialysis (EDD) sessions in critically ill AKI patients. We included patients older than 18 years with AKI associated with sepsis, admitted to the intensive care unit, and receiving noradrenaline at doses ranging from 0.3 to 0.7 μg/kg/min. Patients were randomly divided into two groups: G1 received 6-h sessions and G2 received 10-h sessions. Seventy-five patients were treated with 195 EDD sessions over 18 consecutive months. The prevalence of hypotension, filter clotting, hypokalaemia, and hypophosphataemia was 82.6, 25.3, 20, and 10.6%, respectively. G1 and G2 were similar in male predominance and SOFA score. There was no significant difference between the two groups in hypotension, filter clotting, hypokalaemia, or hypophosphataemia. However, in the group treated with 10-hour sessions, hypotension was more often refractory to clinical measures and dialysis sessions were interrupted more often. Metabolic control and fluid balance were similar between G1 and G2. In conclusion, intradialytic hypotension was common in AKI patients treated with EDD. There was no difference in the prevalence of dialysis complications in patients undergoing different durations of EDD.

Relevance: 100.00%

Abstract:

Background: In the British Isles, control of cattle tuberculosis (TB) is hindered by persistent infection of wild badger (Meles meles) populations. A large-scale field trial, the Randomised Badger Culling Trial (RBCT), previously showed that widespread badger culling produced modest reductions in cattle TB incidence during culling, which were offset by elevated TB risks for cattle on adjoining lands. Once culling was halted, beneficial effects inside culling areas increased, while detrimental effects on adjoining lands disappeared. However, a full assessment of the utility of badger culling requires information on the duration of culling effects. Methodology/Principal Findings: We monitored cattle TB incidence in and around RBCT areas after culling ended. We found that benefits inside culled areas declined over time and were no longer detectable by three years post-culling. On adjoining lands, a trend suggesting beneficial effects immediately after the end of culling was not statistically significant and disappeared after 18 months post-culling. From completion of the first cull to the loss of detectable effects (an average five-year culling period plus 2.5 years post-culling), cattle TB incidence was 28.7% lower (95% confidence interval [CI]: 20.7 to 35.8% lower) inside ten 100 km² culled areas than inside ten matched no-culling areas, and comparable (11.7% higher, 95% CI: 13.0% lower to 43.4% higher, p = 0.39) on lands ≤2 km outside culled and no-culling areas. The financial costs of culling an idealized 150 km² area would exceed the savings achieved through reduced cattle TB by factors of 2 to 3.5. Conclusions/Significance: Our findings show that the reductions in cattle TB incidence achieved by repeated badger culling were not sustained in the long term after culling ended and did not offset the financial costs of culling. These results, combined with evaluation of alternative culling methods, suggest that badger culling is unlikely to contribute effectively to the control of cattle TB in Britain.

Relevance: 100.00%

Abstract:

Objective: We investigated the relation between duration of dual antiplatelet therapy (DAPT) and clinical outcomes up to 12 months after Genous™ endothelial progenitor cell capturing R stent™ placement in patients from the e-HEALING registry. Background: Cessation of DAPT has been shown to be associated with the occurrence of stent thrombosis (ST). After Genous placement, 1 month of DAPT is recommended. Methods: Patients were analyzed according to continuation or discontinuation of DAPT at a 30-day and a 6-month landmark, excluding patients with events before the landmark. Each landmark was a new baseline, and outcomes were followed up to 12 months after stenting. The main outcome for our current analysis was target vessel failure (TVF), defined as target vessel-related cardiac death or myocardial infarction and target vessel revascularization. Secondary outcomes included ST. (Un)adjusted hazard ratios (HR) for TVF were calculated with Cox regression. Results: No difference was observed in the incidence of TVF in patients continuing DAPT (n = 4,249) at 30 days versus patients who had stopped (n = 309) [HR: 1.03; 95% confidence interval (CI): 0.65-1.65, p = 0.89], or in patients continuing DAPT (n = 2,654) at 6 months versus patients who had stopped (n = 1,408) [HR: 0.82; 95% CI: 0.55-1.23, p = 0.34]. Furthermore, no differences were observed in ST. Even after addition of identified independent predictors for TVF, adjusted TVF hazards were comparable. Conclusions: In a post-hoc analysis of e-HEALING, duration of DAPT was not associated with the occurrence of the outcomes TVF or ST. The Genous stent may be an attractive treatment, especially in patients at increased risk for (temporary) cessation of DAPT or bleeding. © 2011 Wiley Periodicals, Inc.
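The landmark method described in the abstract has a simple mechanical core: patients whose event (or censoring) occurred before the landmark are excluded, the landmark becomes the new time zero, and exposure status (DAPT continued vs. stopped) is fixed at the landmark. A minimal sketch of that grouping step, with hypothetical field names and records rather than the e-HEALING data:

```python
# Sketch of landmark-analysis grouping: exclude patients with an event
# or censoring before the landmark, reset time zero to the landmark,
# and classify by DAPT status at the landmark. Illustrative data only.
from dataclasses import dataclass

@dataclass
class Patient:
    event_day: int      # day of TVF event, or day of censoring
    had_event: bool     # True if event_day is a real TVF event
    dapt_stop_day: int  # day DAPT was stopped (large value if never stopped)

def landmark_groups(patients, landmark):
    """Split patients at `landmark` days; drop those with an event or
    censoring on/before the landmark, and shift the time origin."""
    continued, stopped = [], []
    for p in patients:
        if p.event_day <= landmark:  # event/censoring before new baseline
            continue
        shifted = Patient(p.event_day - landmark, p.had_event, p.dapt_stop_day)
        (stopped if p.dapt_stop_day <= landmark else continued).append(shifted)
    return continued, stopped

cohort = [
    Patient(event_day=20, had_event=True, dapt_stop_day=10),    # excluded
    Patient(event_day=365, had_event=False, dapt_stop_day=25),  # 'stopped'
    Patient(event_day=200, had_event=True, dapt_stop_day=999),  # 'continued'
]
cont, stop = landmark_groups(cohort, landmark=30)
print(len(cont), len(stop))
```

A Cox model would then be fitted on the shifted times within each landmark cohort, which is what produces the hazard ratios quoted above.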

Relevance: 100.00%

Abstract:

Three-month anticoagulation is recommended to treat provoked or first distal deep-vein thrombosis (DVT), and indefinite-duration anticoagulation should be considered for patients with unprovoked proximal, unprovoked recurrent, or cancer-associated DVT. In the prospective Outpatient Treatment of Deep Vein Thrombosis in Switzerland (OTIS-DVT) Registry of 502 patients with acute, objectively confirmed lower extremity DVT (59% provoked or first distal DVT; 41% unprovoked proximal, unprovoked recurrent, or cancer-associated DVT) from 53 private practices and 11 hospitals, we investigated the planned duration of anticoagulation at the time of treatment initiation. The decision to administer limited-duration anticoagulation therapy was made in 343 (68%) patients, with a median duration of 107 (interquartile range 91-182) days for provoked or first distal DVT and 182 (interquartile range 111-184) days for unprovoked proximal, unprovoked recurrent, or cancer-associated DVT. Among patients with provoked or first distal DVT, anticoagulation was recommended for < 3 months in 11%, 3 months in 63%, and for an indefinite period in 26%. Among patients with unprovoked proximal, unprovoked recurrent, or cancer-associated DVT, anticoagulation was recommended for < 6 months in 22%, 6-12 months in 38%, and for an indefinite period in 40%. Overall, indefinite-duration therapy was planned more frequently by hospital physicians than by private practice physicians (39% vs. 28%; p = 0.019). Considerable inconsistency in planning the duration of anticoagulation therapy mandates an improvement in risk stratification of outpatients with acute DVT.
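The summary statistics quoted in the registry (median with interquartile range) are straightforward to reproduce; a minimal sketch with a hypothetical list of planned durations, not the OTIS-DVT data:

```python
# Median and interquartile range of planned anticoagulation durations,
# the summary used in the registry report. Durations are illustrative.
import statistics

durations_days = [90, 91, 95, 107, 120, 150, 182, 184, 200]

median = statistics.median(durations_days)
# "inclusive" quartiles interpolate between the min and max of the data,
# matching the common spreadsheet/NumPy convention.
q1, _, q3 = statistics.quantiles(durations_days, n=4, method="inclusive")
print(f"median {median} days (IQR {q1:.0f}-{q3:.0f})")
```

Note that the `method="exclusive"` default of `statistics.quantiles` would give slightly wider quartiles on small samples; which convention a given paper used is rarely stated.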

Relevance: 100.00%

Abstract:

The objective of this article was to record reporting characteristics related to study quality of research published in the major specialty dental journals with the highest impact factors (Journal of Endodontics, Journal of Oral and Maxillofacial Surgery, American Journal of Orthodontics and Dentofacial Orthopedics, Pediatric Dentistry, Journal of Clinical Periodontology, and International Journal of Prosthetic Dentistry). The included articles were classified into three broad subject categories: (1) cross-sectional (snapshot), (2) observational, and (3) interventional. Multinomial logistic regression was conducted for effect estimation, using the journal as the response and randomization, sample size calculation, discussion of confounding, multivariate analysis, effect measurement, and confidence intervals as the explanatory variables. The results showed that cross-sectional studies were the dominant design (55%), whereas observational investigations accounted for 13% and interventions/clinical trials for 32%. Reporting on quality characteristics was low for all variables: random allocation (15%), sample size calculation (7%), confounding issues/possible confounders (38%), effect measurements (16%), and multivariate analysis (21%). Eighty-four percent of the published articles reported a statistically significant main finding, and only 13% presented confidence intervals. The Journal of Clinical Periodontology showed the highest probability of including quality characteristics in reported results among all the dental journals.

Relevance: 100.00%

Abstract:

Objectives: Neurofunctional alterations are correlates of vulnerability to psychosis, as well as of the disorder itself. How these abnormalities relate to different probabilities of later transition to psychosis is unclear. We investigated vulnerability-related versus disease-related versus resilience biomarkers of psychosis during working memory (WM) processing in individuals with an at-risk mental state (ARMS). Experimental design: Patients with first-episode psychosis (FEP, n = 21), short-term ARMS (ARMS-ST, n = 17), long-term ARMS (ARMS-LT, n = 16), and healthy controls (HC, n = 20) were investigated with an n-back WM task. We examined functional magnetic resonance imaging (fMRI) and structural magnetic resonance imaging (sMRI) data in conjunction, using the biological parametric mapping (BPM) toolbox. Principal observations: There were no differences in accuracy, but the FEP and ARMS-ST groups had longer reaction times than the HC and ARMS-LT groups. With the 2-back > 0-back contrast, we found reduced functional activation in ARMS-ST and FEP compared with the HC group in parietal and middle frontal regions. Relative to ARMS-LT individuals, FEP patients showed decreased activation in the bilateral inferior frontal gyrus and insula, and in the left prefrontal cortex. Compared with the ARMS-LT subjects, the ARMS-ST subjects showed reduced activation in the right inferior frontal gyrus and insula. Reduced insular and prefrontal activation was associated with gray matter volume reduction in the same areas in the ARMS-LT group. Conclusions: These findings suggest that vulnerability to psychosis is associated with neurofunctional alterations in fronto-temporo-parietal networks during a WM task. Neurofunctional differences within the ARMS were related to the different durations of the prodromal state and to resilience factors.

Relevance: 100.00%

Abstract:

High-resolution measurements of chemical impurities and methane concentrations in Greenland ice core samples from the early glacial period allow the extension of annual-layer-counted chronologies and the improvement of the gas age-ice age difference (Δage) essential to the synchronization of ice core records. We report high-resolution measurements of a 50 m section of the NorthGRIP ice core and the corresponding annual layer thicknesses in order to constrain the duration of Greenland Stadial 22 (GS-22) between Greenland Interstadials (GIs) 21 and 22, for which inconsistent durations and ages have been reported from Greenland and Antarctic ice core records as well as European speleothems. Depending on the chronology used, GS-22 occurred between approximately 89 (end of GI-22) and 83 kyr b2k (onset of GI-21). From annual layer counting, we find that GS-22 lasted between 2696 and 3092 years and was followed by a GI-21 precursor event lasting between 331 and 369 yr. Our layer-based counting agrees with the duration of stadial 22 as determined from the NALPS speleothem record (3250 ± 526 yr) but not with that of the GICC05modelext chronology (2620 yr) or an alternative chronology based on gas-marker synchronization to the EPICA Dronning Maud Land ice core. These results show that GICC05modelext overestimates accumulation and/or underestimates thinning in this early part of the last glacial period. We also revise the possible ranges of NorthGRIP Δdepth (5.49 to 5.85 m) and Δage (498 to 601 yr) at the warming onset of GI-21, as well as the Δage range at the onset of the GI-21 precursor warming (523 to 654 yr), observing that temperature (represented by the δ15N proxy) increases before CH4 concentration by no more than a few decades.

Relevance: 100.00%

Abstract:

OBJECTIVE: To examine the duration of methicillin-resistant Staphylococcus aureus (MRSA) carriage and its determinants, and the influence of eradication regimens. DESIGN: Retrospective cohort study. SETTING: A 1,033-bed tertiary care university hospital in Bern, Switzerland, in which the prevalence of methicillin resistance among S. aureus isolates is less than 5%. PATIENTS: A total of 116 patients with first-time MRSA detection identified at University Hospital Bern between January 1, 2000, and December 31, 2003, were followed up for a mean duration of 16.2 months. RESULTS: Sixty-eight patients (58.6%) cleared colonization, with a median time to clearance of 7.4 months. Independent determinants of shorter carriage duration were the absence of any modifiable risk factor (receipt of antibiotics, use of an indwelling device, or presence of a skin lesion) (hazard ratio [HR], 0.20 [95% confidence interval {CI}, 0.09-0.42]), absence of immunosuppressive therapy (HR, 0.49 [95% CI, 0.23-1.02]), and hemodialysis (HR, 0.08 [95% CI, 0.01-0.66]) at the time MRSA was first detected, and the administration of a decolonization regimen in the absence of a modifiable risk factor (HR, 2.22 [95% CI, 1.36-3.64]). Failure of decolonization treatment was associated with the presence of risk factors at the time of treatment (P=.01). Intermittent screenings that were negative for MRSA were frequent (26% of patients), occurred early after first detection of MRSA (median, 31.5 days), and were associated with a lower probability of clearing colonization (HR, 0.34 [95% CI, 0.17-0.67]) and an increased risk of MRSA infection during follow-up. CONCLUSIONS: Risk factors for MRSA acquisition should be carefully assessed in all MRSA carriers and should inform infection control policies, such as the timing of decolonization treatment, the definition of MRSA clearance, and the decision of when to suspend isolation measures.
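A statement like "58.6% cleared colonization, with a median time to clearance of 7.4 months" in a cohort with censored follow-up is typically derived with a Kaplan-Meier estimator. A minimal sketch of that estimator (simplified to distinct event times), using a small hypothetical dataset rather than the Bern cohort:

```python
# Sketch of a Kaplan-Meier median time-to-clearance, the usual tool
# behind a median carriage duration with censored follow-up.
# The dataset below is hypothetical and simplified to distinct times.

def km_median(times, cleared):
    """times: follow-up in months; cleared: True if colonization was
    cleared at that time, False if the patient was censored.
    Returns the first time the KM survival curve drops to <= 0.5."""
    surv = 1.0
    at_risk = len(times)
    for t, event in sorted(zip(times, cleared)):
        if event:
            surv *= (at_risk - 1) / at_risk  # KM step at an event time
            if surv <= 0.5:
                return t
        at_risk -= 1  # both events and censorings leave the risk set
    return None  # median not reached within follow-up

times   = [2, 3, 5, 6, 8, 9, 12, 15, 18, 20]
cleared = [True, False, True, True, False, True, True, False, True, False]
print(km_median(times, cleared))
```

Censored patients still contribute to the risk set up to their last observation, which is why a naive median of event times alone would be biased downward.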