189 results for Reduced-Impact Logging
in the Biblioteca Digital da Produção
Abstract:
There is more to sustainable forest management than reduced impact logging. Partnerships between multiple actors are needed in order to create the institutional context for good forest governance and sustainable forest management and stimulate the necessary local community involvement. The idea behind this is that the parties would be able to achieve more jointly than on their own by combining assets, knowledge, skills and political power of actors at different levels of scale. This article aims to demonstrate by example the nature and variety of forest-related partnerships in Brazilian Amazonia. Based on the lessons learned from these cases and the authors' experience, the principal characteristics of successful partnerships are described, with a focus on political and socioeconomic aspects. These characteristics include fairly negotiated partnership objectives, the active involvement of the public sector as well as impartial brokers, equitable and cost-effective institutional arrangements, sufficient and equitably shared benefits for all the parties involved, addressing socioeconomic drawbacks, and taking measures to maintain sustainable exploitation levels. The authors argue that, in addition to product-oriented partnerships which focus on sustainable forest management, there is also a need for politically oriented partnerships based on civil society coalitions. The watchdog function of these politically oriented partnerships, their awareness-raising campaigns regarding detrimental policies and practices, and advocacy for good forest governance are essential for the creation of the appropriate legal and political framework for sustainable forest management. (C) 2008 Elsevier B.V. All rights reserved.
Abstract:
Using data from a logging experiment in the eastern Brazilian Amazon region, we develop a matrix growth and yield model that captures the dynamic effects of harvest system choice on forest structure and composition. Multinomial logistic regression is used to estimate the growth transition parameters for a 10-year time step, while a Poisson regression model is used to estimate recruitment parameters. The model is designed to be easily integrated with an economic model of decision-making to perform tropical forest policy analysis. The model is used to compare the long-run structure and composition of a stand arising from the choice of implementing either conventional logging techniques or more carefully planned and executed reduced-impact logging (RIL) techniques, contrasted against a baseline projection of an unlogged forest. Results from log-and-leave scenarios show that a stand logged according to Brazilian management requirements will require well over 120 years to recover its initial commercial volume, regardless of the logging technique employed. Implementing RIL, however, accelerates this recovery. Scenarios imposing a 40-year cutting cycle raise the possibility of sustainable harvest volumes, although at significantly lower levels than is implied by current regulations. Meeting current Brazilian forest policy goals may require an increase in the planned total area of permanent production forest or the widespread adoption of silvicultural practices that increase stand recovery and volume accumulation rates after RIL harvests. Published by Elsevier B.V.
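A minimal sketch of the matrix projection at the core of such a growth and yield model, assuming four hypothetical diameter classes and invented 10-year transition probabilities (in the study these parameters come from the multinomial logistic and Poisson regressions fitted to the experimental data):

```python
import numpy as np

# Hypothetical 10-year transition matrix for four diameter classes.
# Column j holds the probability that a tree in class j stays put or
# moves up one class; the column's shortfall from 1.0 is mortality.
G = np.array([
    [0.70, 0.00, 0.00, 0.00],
    [0.20, 0.75, 0.00, 0.00],
    [0.00, 0.15, 0.80, 0.00],
    [0.00, 0.00, 0.10, 0.85],
])
R = np.array([25.0, 0.0, 0.0, 0.0])  # illustrative ingrowth into class 1 (stems/ha per step)

def project(n, steps, cut_every=None, cut_frac=0.9):
    """Project stems/ha per class over 10-year steps, optionally removing
    a fraction of the largest (merchantable) class on a fixed cycle."""
    for t in range(1, steps + 1):
        n = G @ n + R
        if cut_every and t % cut_every == 0:
            n[-1] *= 1.0 - cut_frac
    return n

stand = np.array([120.0, 60.0, 30.0, 15.0])
print(project(stand, steps=12))               # unlogged baseline after 120 years
print(project(stand, steps=12, cut_every=4))  # hypothetical 40-year cutting cycle
```

Coupling such a projection to an economic objective then amounts to choosing the harvest rule (here `cut_every` and `cut_frac`) that the policy analysis evaluates.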
Abstract:
Tropical forests are characterized by diverse assemblages of plant and animal species compared to temperate forests. Corollary to this general rule is that most tree species, whether valued for timber or not, occur at low densities (<1 adult tree ha⁻¹) or may be locally rare. In the Brazilian Amazon, many of the most highly valued timber species occur at extremely low densities yet are intensively harvested with little regard for impacts on population structures and dynamics. These include big-leaf mahogany (Swietenia macrophylla), ipe (Tabebuia serratifolia and Tabebuia impetiginosa), jatoba (Hymenaea courbaril), and freijo cinza (Cordia goeldiana). Brazilian forest regulations prohibit harvests of species that meet the legal definition of rare (fewer than three trees per 100 ha) but treat all species populations exceeding this density threshold equally. In this paper we simulate logging impacts on a group of timber species occurring at low densities that are widely distributed across eastern and southern Amazonia, based on field data collected at four research sites since 1997, asking: under current Brazilian forest legislation, what are the prospects for second harvests on 30-year cutting cycles given observed population structures, growth, and mortality rates? Ecologically 'rare' species constitute majorities in commercial species assemblages in all but one of the seven large-scale inventories we analyzed from sites spanning the Amazon (range 49-100% of total commercial species). Although densities of only six of the 37 study species populations met the Brazilian legal definition of a rare species, timber stocks of five of the six timber species declined substantially at all sites between first and second harvests in simulations based on legally allowable harvest intensities. Reducing species-level harvest intensity by increasing minimum felling diameters or increasing seed tree retention levels improved prospects for second harvests of those populations with a relatively high proportion of submerchantable stems, but did not dramatically improve projections for populations with relatively flat diameter distributions. We argue that restrictions on logging very low-density timber tree populations, such as the current Brazilian standard, provide inadequate minimum protection for vulnerable species. Population declines, even if reduced-impact logging (RIL) is eventually adopted uniformly, can be anticipated for a large pool of high-value timber species unless harvest intensities are adapted to timber species population ecology, and silvicultural treatments are adopted to remedy poor natural stocking in logged stands. (C) 2008 Elsevier B.V. All rights reserved.
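A toy illustration of the two harvest levers these simulations vary, minimum felling diameter and seed-tree retention, applied to an invented list of stem diameters (the study's simulations additionally apply species-specific growth and mortality between cutting cycles):

```python
# Invented stem diameters (cm) for one low-density species on a block.
dbh_cm = [25, 38, 52, 55, 61, 70, 84, 95]

def harvestable(dbh, mfd_cm=50, retain_fraction=0.1):
    """Stems that may be felled under a minimum felling diameter (MFD)
    and a seed-tree retention quota (the largest stems are retained here)."""
    eligible = sorted(d for d in dbh if d >= mfd_cm)
    n_retain = max(1, round(retain_fraction * len(eligible)))
    return eligible[:-n_retain] if n_retain < len(eligible) else []

print(harvestable(dbh_cm))                        # baseline rules
print(harvestable(dbh_cm, mfd_cm=60))             # raising the MFD spares more stems
print(harvestable(dbh_cm, retain_fraction=0.3))   # higher seed-tree retention
```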
Abstract:
Objective: The purpose of the present study was to investigate the influence that education and depression have on the performance of elderly people in neuropsychological tests. Methods: The study was conducted at the Institute of Psychiatry, University of Sao Paulo School of Medicine, Hospital das Clinicas. All of the individuals evaluated were aged 60 or older. The study sample consisted of 59 outpatients with depressive disorders and 51 healthy controls. We stratified the sample by level of education: low = 1-4 years of schooling; high = 5 or more years of schooling. Evaluations consisted of psychiatric assessment, cognitive assessment, laboratory tests and cerebral magnetic resonance imaging. Results: We found that level of education influenced all the measures of cognitive domains investigated (intellectual efficiency, processing speed, attention, executive function and memory) except the Digit Span Forward and Fuld Object Memory Evaluation (immediate and delayed recall), whereas depressive symptoms influenced some measures of memory, attention, executive function and processing speed. Although the combination of a low level of education and depression had a significant negative influence on Stroop Test part B, Trail Making Test part B and Logical Memory (immediate recall), we found no other significant effects of the interaction between level of education and depression. Conclusion: The results of this study underscore the importance of considering the level of education in the analysis of cognitive performance in depressed elderly patients, as well as the relevance of developing new cognitive function tests in which level of education has a reduced impact on the results.
Abstract:
INTRODUCTION: Excessive group 2 carbapenem use may result in decreased bacterial susceptibility. OBJECTIVE: We evaluated the impact of a carbapenem stewardship program restricting imipenem and meropenem use. METHODS: Ertapenem was mandated for ESBL-producing Enterobacteriaceae infections in the absence of non-fermenting Gram-negative bacilli (GNB) from April 2006 to March 2008. Group 2 carbapenems were restricted for use against GNB infections susceptible only to carbapenems and suspected GNB infections in unstable patients. Cumulative susceptibility tests were done for nosocomial pathogens before and after restriction using Clinical and Laboratory Standards Institute (CLSI) guidelines. Identification was performed with the Vitek System or conventional methods, and susceptibility testing was done by disk diffusion according to CLSI. Antibiotic consumption (t-test) and susceptibilities (McNemar's test) were compared. RESULTS: The defined daily doses (DDD) of group 2 carbapenems declined from 61.1 to 48.7 DDD/1,000 patient-days two years after ertapenem introduction (p = 0.027). Mean ertapenem consumption after restriction was 31.5 DDD/1,000 patient-days. Following ertapenem introduction, no significant susceptibility changes were noticed among Gram-positive cocci. The most prevalent GNB were P. aeruginosa, Klebsiella pneumoniae, and Acinetobacter spp. There was no change in P. aeruginosa susceptibility to carbapenems. Significantly improved P. aeruginosa and K. pneumoniae ciprofloxacin susceptibilities were observed, perhaps due to decreased group 2 carbapenem use. K. pneumoniae susceptibility to trimethoprim-sulfamethoxazole also improved. CONCLUSION: Preferential use of ertapenem resulted in reduced group 2 carbapenem use, with a positive impact on P. aeruginosa and K. pneumoniae susceptibility.
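For readers unfamiliar with the consumption metric, a minimal worked example of the DDD/1,000 patient-days calculation; the dispensed amount and patient-days below are invented, and the 3 g daily dose for parenteral meropenem is an assumption about the WHO DDD value used:

```python
# Worked example of the antibiotic-consumption metric used above.
grams_dispensed = 1_860.0   # total meropenem dispensed in the period (g) -- invented
ddd_grams = 3.0             # assumed WHO-defined daily dose for parenteral meropenem
patient_days = 12_700       # occupied-bed days in the same period -- invented

ddd_per_1000_pd = (grams_dispensed / ddd_grams) / patient_days * 1000
print(f"{ddd_per_1000_pd:.1f} DDD/1,000 patient-days")  # -> 48.8
```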
Abstract:
Background: This study aimed to evaluate the association between the total suspended particles (TSP) generated from burning sugar cane plantations and the incidence of hospital admissions for hypertension in the city of Araraquara. Methods: The study was an ecological time-series study. Total daily records of hypertension (ICD-10 I10-15) were obtained from admitted patients of all ages in a hospital in Araraquara, Sao Paulo State, Brazil, from 23 March 2003 to 27 July 2004. The daily concentration of TSP (μg/m³) was obtained using a Handi-Vol sampler placed in downtown Araraquara. The local airport provided daily measures of temperature and humidity. In generalised linear Poisson regression models, the daily number of hospital admissions for hypertension was the dependent variable and the daily TSP concentration the independent variable. Results: TSP presented a lagged effect on hypertension admissions, which was first observed 1 day after a TSP increase and remained almost unchanged for the following 2 days. A 10 μg/m³ increase in the 3-day moving average of TSP, lagged by 1 day, led to an increase in hypertension-related hospital admissions during the harvest period (12.5%, 95% CI 5.6% to 19.9%) that was almost 30% higher than during non-harvest periods (9.0%, 95% CI 4.0% to 14.3%). Conclusions: Increases in TSP concentrations were associated with hypertension-related hospital admissions. Despite the benefits of reduced urban air pollution achieved by using ethanol produced from sugar cane to power automobiles, areas where the sugar cane is produced and harvested were found to have an increased public health risk.
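A sketch of the core regression described, a Poisson GLM of daily admissions on the 1-day-lagged 3-day moving average of TSP, run here on synthetic data (the published models also adjust for temperature, humidity, and seasonality; all numbers below are invented):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Synthetic daily series standing in for the Araraquara data.
rng = np.random.default_rng(0)
days = 490
tsp = rng.gamma(shape=4.0, scale=20.0, size=days)  # daily TSP, ug/m3
exposure = pd.Series(tsp).rolling(3).mean().shift(1)
lam = np.exp(0.5 + 0.012 * exposure.fillna(exposure.mean()))
df = pd.DataFrame({"admissions": rng.poisson(lam), "tsp": tsp})

# 3-day moving average of TSP lagged by 1 day, as in the abstract.
df["tsp_ma3_lag1"] = df["tsp"].rolling(3).mean().shift(1)

fit = smf.glm("admissions ~ tsp_ma3_lag1", data=df.dropna(),
              family=sm.families.Poisson()).fit()

# Percent increase in admissions per 10 ug/m3 rise in the exposure.
pct = (np.exp(10 * fit.params["tsp_ma3_lag1"]) - 1) * 100
print(f"{pct:.1f}% per 10 ug/m3")
```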
Abstract:
Background: The effects of renal denervation on cardiovascular reflexes and markers of nephropathy in diabetic-hypertensive rats have not yet been explored. Aim: To evaluate the effects of renal denervation on nephropathy development mechanisms (blood pressure, cardiovascular autonomic changes, renal GLUT2) in diabetic-hypertensive rats. Methods: Forty-one male spontaneously hypertensive rats (SHR) weighing approximately 250 g were injected with streptozotocin (STZ) or not; 30 days later, surgical renal denervation (RD) or a sham procedure was performed; 15 days later, glycemia and albuminuria (ELISA) were evaluated. Catheters were implanted into the femoral artery to evaluate arterial pressure (AP) and heart rate variability (spectral analysis) one day later in conscious animals. Animals were killed, kidneys removed, and cortical renal GLUT2 quantified (Western blotting). Results: Higher glycemia (p < 0.05) and lower mean AP were observed in diabetics vs. nondiabetics (p < 0.05). Heart rate was higher in renal-denervated hypertensive and lower in diabetic-hypertensive rats (384.8 ± 37, 431.3 ± 36, 316.2 ± 5, and 363.8 ± 12 bpm in SHR, RD-SHR, STZ-SHR, and RD-STZ-SHR, respectively). Heart rate variability was higher in renal-denervated diabetic-hypertensive rats (55.75 ± 25.21, 73.40 ± 53.30, and 148.4 ± 93 in RD-SHR, STZ-SHR, and RD-STZ-SHR, respectively; p < 0.05), as was the LF component of AP variability (1.62 ± 0.9, 2.12 ± 0.9, and 7.38 ± 6.5 in RD-SHR, STZ-SHR, and RD-STZ-SHR, respectively; p < 0.05). Renal GLUT2 content was higher in all groups vs. SHR. Conclusions: Renal denervation in diabetic-hypertensive rats improved previously reduced heart rate variability. GLUT2, equally overexpressed by diabetes and renal denervation, may represent a maximal derangement effect of each condition.
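The variability measures reported here are typically obtained by integrating a power spectral density over fixed frequency bands; a generic sketch on a synthetic signal (the band limits, sampling rate, and signal are illustrative assumptions, not the study's processing pipeline):

```python
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

# Synthetic, evenly resampled pulse-interval/arterial-pressure series.
fs = 10.0                                  # resampling rate (Hz), illustrative
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(1)
signal = 0.5 * np.sin(2 * np.pi * 0.4 * t) + rng.normal(0, 0.3, t.size)

freqs, psd = welch(signal, fs=fs, nperseg=1024)

# Integrate power over an assumed low-frequency (LF) band for rats.
lf_lo, lf_hi = 0.2, 0.75                   # Hz; band limits are an assumption
band = (freqs >= lf_lo) & (freqs <= lf_hi)
lf_power = trapezoid(psd[band], freqs[band])
print(f"LF power: {lf_power:.4f}")
```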
Abstract:
Background: Worldwide, a high proportion of HIV-infected individuals enter into HIV care late. Here, our objective was to estimate the impact that late entry into HIV care has had on AIDS mortality rates in Brazil. Methodology/Principal Findings: We analyzed data from information systems regarding HIV-infected adults who sought treatment at public health care facilities in Brazil from 2003 to 2006. We initially estimated the prevalence of late entry into HIV care, as well as the probability of death in the first 12 months, the percentage of the risk of death attributable to late entry, and the number of avoidable deaths. We subsequently adjusted the annual AIDS mortality rate by excluding such deaths. Of the 115,369 patients evaluated, 50,358 (43.6%) had entered HIV care late, and 18,002 died in the first 12 months, representing a 16.5% probability of death in the first 12 months (95% CI: 16.3-16.7). By comparing patients who entered HIV care late with those who gained timely access, we found that the risk ratio for death was 49.5 (95% CI: 45.1-54.2). The percentage of the risk of death attributable to late entry was 95.5%, translating to 17,189 potentially avoidable deaths. Averting those deaths would have lowered the 2003-2006 AIDS mortality rate by 39.5%. Including asymptomatic patients with CD4+ T cell counts >200 and ≤350 cells/mm³ in the group who entered HIV care late increased this proportion by 1.8%. Conclusions/Significance: In Brazil, antiretroviral drugs reduced AIDS mortality by 43%. Timely entry would reduce that rate by a similar proportion, as well as resulting in a 45.2% increase in the effectiveness of the program for HIV care. The World Health Organization recommendation that asymptomatic patients with CD4+ T cell counts ≤350 cells/mm³ be treated would not have a significant impact on this scenario.
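The reported 95.5% is consistent with the standard population attributable fraction computed from the figures in the abstract, with prevalence of late entry $p = 0.436$ and risk ratio $RR = 49.5$:

$$\mathrm{PAF} = \frac{p\,(RR-1)}{1 + p\,(RR-1)} = \frac{0.436 \times 48.5}{1 + 0.436 \times 48.5} \approx \frac{21.15}{22.15} \approx 0.955$$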
Abstract:
Background: There are several studies in the literature depicting measurement error in gene expression data and also several others about regulatory network models. However, only a small fraction describe a combination of measurement error in mathematical regulatory networks and show how to identify these networks under different rates of noise. Results: This article investigates the effects of measurement error on the estimation of the parameters in regulatory networks. Simulation studies indicate that, in both time series (dependent) and non-time series (independent) data, measurement error strongly affects the estimated parameters of the regulatory network models, biasing them as predicted by the theory. Moreover, when testing the parameters of the regulatory network models, p-values computed by ignoring the measurement error are not reliable, since the rate of false positives is not controlled under the null hypothesis. In order to overcome these problems, we present an improved version of the Ordinary Least Squares estimator for independent (regression models) and dependent (autoregressive models) data when the variables are subject to noise. Measurement error estimation procedures for microarrays are also described. Simulation results also show that both corrected methods perform better than the standard ones (i.e., ignoring measurement error). The proposed methodologies are illustrated using microarray data from lung cancer patients and mouse liver time series data. Conclusions: Measurement error seriously affects the identification of regulatory network models and thus must be reduced or taken into account in order to avoid erroneous conclusions. This could be one of the reasons for the high biological false positive rates identified in actual regulatory network models.
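A minimal sketch of the attenuation problem and the flavor of correction proposed, shown for a simple regression with a known measurement-error variance (all values synthetic; the article develops analogous corrections for the autoregressive network models):

```python
import numpy as np

rng = np.random.default_rng(42)
n, beta_true = 5000, 0.8
x = rng.normal(0, 1, n)                   # true regulator expression
y = beta_true * x + rng.normal(0, 0.5, n)
sigma_u = 0.6                             # measurement-error SD, assumed known
w = x + rng.normal(0, sigma_u, n)         # observed, noisy expression

# Naive OLS on the noisy regressor is attenuated toward zero.
beta_naive = np.cov(w, y)[0, 1] / np.var(w, ddof=1)

# Correcting by the reliability ratio recovers the true slope.
reliability = (np.var(w, ddof=1) - sigma_u**2) / np.var(w, ddof=1)
beta_corrected = beta_naive / reliability

print(f"naive: {beta_naive:.3f}  corrected: {beta_corrected:.3f}  true: {beta_true}")
```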
Abstract:
The influence of guar and xanthan gums, alone and combined, on dough proofing rate and calorimetric properties was investigated. Fusion enthalpy, which is related to the amount of frozen water, was influenced by frozen dough formulation and storage time; specifically, gum addition reduced the fusion enthalpy relative to the control formulation (76.9 J/g for the formulation with both gums vs. 81.2 J/g for the control at day 28). Other calorimetric parameters, such as Tg and freezable water content, were also influenced by frozen storage time. For all formulations, the proofing rate of dough after freezing, frozen storage, and thawing decreased in comparison to non-frozen dough, indicating that the freezing process itself was more detrimental to the proofing rate than storage time. For all formulations, the mean proofing rate was 2.97 ± 0.24 cm³ min⁻¹ per 100 g of non-frozen dough and 2.22 ± 0.12 cm³ min⁻¹ per 100 g of frozen dough. The proofing rate of non-frozen dough with xanthan gum also decreased significantly relative to dough without gums and dough with only guar gum. Optical microscopy showed that gas cell production after the frozen storage period was reduced, in agreement with the proofing rate results. (C) 2008 Elsevier Ltd. All rights reserved.
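The link between fusion enthalpy and frozen water can be made concrete: dividing the measured enthalpy by the latent heat of fusion of ice (about 334 J/g) estimates the mass of freezable water per gram of dough. This is a standard approximation, not a calculation from the abstract:

$$m_{\text{ice}} \approx \frac{\Delta H_{\text{fusion}}}{334\ \mathrm{J/g}}:\qquad \frac{76.9}{334} \approx 0.23\ \text{g/g (both gums)},\qquad \frac{81.2}{334} \approx 0.24\ \text{g/g (control)}$$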
Abstract:
Financial institutions are directly exposed to credit risk, that is, the risk that borrowers will not fulfill their obligations by paying their debts within the previously established periods. Banks provide for this type of risk, including it in their balance sheets. In 2006/2007 a new financial crisis spread around the world, known as the subprime crisis. The objective of this study is to analyze whether provisions for credit risk (doubtful loans) increased with the emergence of the subprime crisis in the ten largest national banks, chosen according to their total assets. To answer this question, the balance sheets of each of these banks for the period 2005 to 2007 were analyzed. As for its objectives, this research is characterized as descriptive; as for its procedures, as documentary research; its approach is qualitative. The results show that the subprime crisis had little impact on the credit risk provisions of the analyzed institutions. A slight increase in the provision indicators was noticed at the peak of the crisis in 2006. These percentages were reduced in 2007, probably reflecting the economic stability of Brazil and the stagnation of the subprime crisis in that year, at least in relation to our country.
Abstract:
Objective: To investigate the effects of the rate of airway pressure increase and duration of recruitment maneuvers on lung function and activation of inflammation, fibrogenesis, and apoptosis in experimental acute lung injury. Design: Prospective, randomized, controlled experimental study. Setting: University research laboratory. Subjects: Thirty-five Wistar rats subjected to acute lung injury induced by cecal ligation and puncture. Interventions: After 48 hrs, animals were randomly distributed into five groups (seven animals each): 1) nonrecruited (NR); 2) recruitment maneuvers (RMs) with continuous positive airway pressure (CPAP) for 15 secs (CPAP15); 3) RMs with CPAP for 30 secs (CPAP30); 4) RMs with stepwise increase in airway pressure (STEP) to targeted maximum within 15 secs (STEP15); and 5) RMs with STEP within 30 secs (STEP30). To perform STEP RMs, the ventilator was switched to a CPAP mode and the positive end-expiratory pressure level was increased stepwise. At each step, airway pressure was held constant. RMs were targeted to 30 cm H₂O. Animals were then ventilated for 1 hr with a tidal volume of 6 mL/kg and positive end-expiratory pressure of 5 cm H₂O. Measurements and Main Results: Blood gases, lung mechanics, histology (light and electron microscopy), and interleukin-6, caspase 3, and type 3 procollagen mRNA expression in lung tissue were assessed. All RMs improved oxygenation and lung static elastance and reduced alveolar collapse compared to NR. STEP30 resulted in optimal performance, with: 1) improved lung static elastance vs. NR, CPAP15, and STEP15; 2) reduced alveolar-capillary membrane detachment and type 2 epithelial and endothelial cell injury scores vs. CPAP15 (p < .05); and 3) reduced gene expression of interleukin-6, type 3 procollagen, and caspase 3 in lung tissue vs. the other RMs. Conclusions: Longer-duration RMs with a slower airway pressure increase efficiently improved lung function while minimizing the biological impact on the lungs. (Crit Care Med 2011; 39:1074-1081)
Abstract:
The aim of this study was to evaluate the frequency of polymorphisms in the TYMS, XRCC1, and ERCC2 DNA repair genes in pediatric patients with acute lymphoblastic leukemia using polymerase chain reaction (PCR) and PCR-restriction fragment length polymorphism (RFLP) approaches. The study was conducted in 206 patients and 364 controls from a Brazilian population. No significant differences were observed among the analyzed groups regarding XRCC1 codon 399 and codon 194 and ERCC2 codon 751 and codon 312 polymorphisms. The TYMS 3R variant allele was significantly associated with a reduced risk of childhood ALL, represented by the sum of heterozygous and polymorphic homozygous genotypes (odds ratio 0.60; 95% confidence interval 0.37-0.99). The results suggest that polymorphism in TYMS may play a protective role against the development of childhood ALL.
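As a generic illustration of how an odds ratio and its 95% confidence interval are obtained from genotype counts (the 2x2 counts below are invented; only the method matches the abstract):

```python
import math

# Invented counts: carriers of the variant allele (heterozygous + homozygous)
# vs. non-carriers, in cases (childhood ALL) and controls.
a, b = 70, 136    # cases: carriers, non-carriers
c, d = 160, 204   # controls: carriers, non-carriers

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)    # Woolf's method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```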
Abstract:
Background: Occupational risk due to airborne disease challenges healthcare institutions. Environmental measures are effective, but their cost-effectiveness is still debatable and most capacity planning is based on occupational rates. Better indices to plan and evaluate capacity are needed. Goal: To evaluate the impact of installing an exclusively dedicated respiratory isolation room (EDRIR) in a tertiary emergency department (ED), determined by a time-to-reach-facility method. Methods: A group of patients in need of respiratory isolation was first identified (group I: 2004; 29 patients; 44.1 ± 3.4 years), and the occupational rate and time intervals (arrival to diagnosis, diagnosis to respiratory isolation indication, and indication to effective isolation) were determined; it was estimated that adding an EDRIR would have a significant impact on the time to isolation. After implementing the EDRIR, a second group of patients was gathered in the same period of the year (group II: 2007; 50 patients; 43.4 ± 1.8 years), and demographic and functional parameters were recorded to evaluate time to isolation. Cox proportional hazards models adjusted for age, gender, and in-hospital respiratory isolation room availability were obtained. Results: Implementing an EDRIR decreased the time from arrival to indication of respiratory isolation (27.5 ± 9.3 vs. 3.7 ± 2.0; p = 0.0180) and from indication to effective respiratory isolation (13.3 ± 3.0 vs. 2.94 ± 1.06; p = 0.003), but not the respiratory isolation duration or total hospital stay. The impact on crude isolation rates was very significant (8.9 vs. 75.4 per 100,000 patients; p < 0.001). The hazard ratio for effective respiratory isolation was 26.8 (95% CI 7.42 to 96.9; p < 0.001), favoring 2007. Conclusion: Implementing an EDRIR in a tertiary ED significantly reduced the time to respiratory isolation.
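A sketch of the kind of adjusted time-to-event model reported, using the lifelines library on synthetic records; the column names, covariates, and all values are stand-ins, not the study's data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 79  # roughly groups I + II combined
df = pd.DataFrame({
    "hours_to_isolation": rng.exponential(10, n),  # time from arrival
    "isolated": rng.integers(0, 2, n),             # 1 = effective isolation (event)
    "age": rng.normal(44, 12, n),
    "male": rng.integers(0, 2, n),
    "year_2007": rng.integers(0, 2, n),            # post-EDRIR period indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="hours_to_isolation", event_col="isolated")
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```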
Abstract:
In this study, we investigated the hematopoietic response of rats pretreated with Chlorella vulgaris (CV) and exposed to acute escapable, inescapable, or psychogenic stress on responsiveness to an in vivo challenge with Listeria monocytogenes. No consistent changes were observed after exposure to escapable footshock. Conversely, the impact of uncontrollable stress (inescapable and psychogenic) was manifested by an early onset and increased severity and duration of the myelosuppression produced by the infection. Small CFU-GM colonies and increased numbers of clusters were observed, concurrently with a greater expansion of the more mature population of bone marrow granulocytes. No differences were observed between the responses to the two uncontrollable stress regimens. CV prevented the myelosuppression caused by stress/infection by increasing the numbers of CFU-GM in the bone marrow. Colonies of tightly packed cells with very condensed nuclei were observed, in association with a greater expansion of the more immature population of bone marrow granulocytes. Investigation of the production of colony-stimulating factors revealed increased colony-stimulating activity (CSA) in the serum of normal and infected/stressed rats treated with the alga. CV treatment restored or enhanced the changes produced by stress/infection in total and differential bone marrow and peripheral cell counts. Further studies demonstrated that IFN-gamma is significantly reduced, whereas IL-10 is significantly increased, after exposure to uncontrollable stress. Treatment with CV significantly increased IFN-gamma levels and diminished IL-10 levels. Uncontrollable stress reduced the protection afforded by CV against a lethal dose of L. monocytogenes, with survival rates falling from 50% in infected rats to 20% in infected/stressed rats. Altogether, our results suggest Chlorella treatment as an effective tool for the prophylaxis of post-stress myelosuppression, including the detrimental effect of stress on the course and outcome of infections. (C) 2008 Elsevier Inc. All rights reserved.