Abstract:
Background: The effects of renal denervation on cardiovascular reflexes and markers of nephropathy in diabetic-hypertensive rats have not yet been explored. Aim: To evaluate the effects of renal denervation on mechanisms of nephropathy development (blood pressure, cardiovascular autonomic changes, renal GLUT2) in diabetic-hypertensive rats. Methods: Forty-one male spontaneously hypertensive rats (SHR) weighing approximately 250 g were injected with streptozotocin (STZ) or left uninjected; 30 days later, surgical renal denervation (RD) or a sham procedure was performed; 15 days later, glycemia and albuminuria (ELISA) were evaluated. Catheters were implanted into the femoral artery, and arterial pressure (AP) and heart rate variability (spectral analysis) were evaluated one day later in conscious animals. The animals were then killed, the kidneys removed, and cortical renal GLUT2 quantified (Western blotting). Results: Higher glycemia (p < 0.05) and lower mean AP were observed in diabetic vs. nondiabetic rats (p < 0.05). Heart rate was higher in renal-denervated hypertensive rats and lower in diabetic-hypertensive rats (384.8 ± 37, 431.3 ± 36, 316.2 ± 5, and 363.8 ± 12 bpm in SHR, RD-SHR, STZ-SHR, and RD-STZ-SHR, respectively). Heart rate variability was higher in renal-denervated diabetic-hypertensive rats (55.75 ± 25.21, 73.40 ± 53.30, and 148.4 ± 93 in RD-SHR, STZ-SHR, and RD-STZ-SHR, respectively; p < 0.05), as was the low-frequency (LF) component of AP variability (1.62 ± 0.9, 2.12 ± 0.9, and 7.38 ± 6.5 in RD-SHR, STZ-SHR, and RD-STZ-SHR, respectively; p < 0.05). Renal GLUT2 content was higher in all groups vs. SHR. Conclusions: Renal denervation in diabetic-hypertensive rats improved the previously reduced heart rate variability. The equal overexpression of GLUT2 induced by diabetes and by renal denervation may represent a maximal derangement effect of each condition.
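The frequency-domain (spectral) analysis of heart rate variability mentioned above is commonly computed from an evenly resampled pulse-interval series. The following is a minimal sketch of that kind of computation, assuming a Welch periodogram; the band limits, sampling rate, and synthetic data are illustrative assumptions and are not taken from the study.

```python
# Minimal sketch of frequency-domain heart rate variability analysis:
# Welch periodogram of an evenly resampled pulse-interval series.
# Band limits below are illustrative values, not taken from the study.
import numpy as np
from scipy.signal import welch

def hrv_band_power(rr_ms, fs=10.0, lf=(0.2, 0.75), hf=(0.75, 3.0)):
    """Return LF and HF spectral power (ms^2) of a pulse-interval series."""
    t = np.cumsum(rr_ms) / 1000.0                 # beat times (s)
    grid = np.arange(t[0], t[-1], 1.0 / fs)       # uniform time grid
    rr_interp = np.interp(grid, t, rr_ms)         # evenly resampled series
    f, pxx = welch(rr_interp - rr_interp.mean(), fs=fs, nperseg=256)
    df = f[1] - f[0]
    lf_power = pxx[(f >= lf[0]) & (f < lf[1])].sum() * df
    hf_power = pxx[(f >= hf[0]) & (f < hf[1])].sum() * df
    return lf_power, hf_power

# Example with synthetic pulse intervals around 160 ms (~375 bpm)
rng = np.random.default_rng(0)
rr = 160 + 5 * rng.standard_normal(2000)
lf_power, hf_power = hrv_band_power(rr)
print(f"LF power: {lf_power:.2f} ms^2, HF power: {hf_power:.2f} ms^2")
```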
Abstract:
As a contribution to the Large-Scale Biosphere-Atmosphere Experiment in Amazonia - Cooperative LBA Airborne Regional Experiment (LBA-CLAIRE-2001) field campaign in the heart of the Amazon Basin, we analyzed the temporal and spatial dynamics of the urban plume of Manaus City during the wet-to-dry season transition period in July 2001. During the flights, we performed vertical stacks of crosswind transects in the urban outflow downwind of Manaus City, measuring a comprehensive set of trace constituents including O₃, NO, NO₂, CO, VOC, CO₂, and H₂O. Aerosol loads were characterized by concentrations of total aerosol number (CN) and cloud condensation nuclei (CCN), and by light scattering properties. Measurements over pristine rainforest areas during the campaign showed low levels of pollution from biomass burning or industrial emissions, representative of wet-season background conditions. The urban plume of Manaus City was found to be joined by plumes from power plants south of the city, all showing evidence of very strong photochemical ozone formation. One episode is discussed in detail, in which a threefold increase in ozone mixing ratios within the atmospheric boundary layer occurred within a 100 km travel distance downwind of Manaus. Observation-based estimates of the ozone production rate in the plume reached 15 ppb h⁻¹. Within the plume core, aerosol concentrations were strongly enhanced, with ΔCN/ΔCO ratios about one order of magnitude higher than those observed in Amazon biomass burning plumes. ΔCN/ΔCO ratios tended to decrease with increasing transport time, indicative of a significant reduction in particle number by coagulation, without substantial new particle nucleation occurring within the time/space observed. While in the background atmosphere a large fraction of the total particle number served as CCN (about 60-80% at 0.6% supersaturation), the CCN/CN ratios within the plume indicated that only a small fraction (16 ± 12%) of the plume particles were CCN. The fresh plume aerosols showed relatively weak light scattering efficiency. The CO-normalized CCN concentrations and light scattering coefficients increased with plume age in most cases, suggesting particle growth by condensation of soluble organic or inorganic species. We used a Single Column Chemistry and Transport Model (SCM) to infer the urban pollution emission fluxes of Manaus City consistent with the observed mixing ratios of CO, NOₓ, and VOC. The model can reproduce the temporal/spatial distribution of ozone enhancements in the Manaus plume, both with and without accounting for the distinct (high-NOₓ) contribution of the power plants, thereby allowing the sensitivity of ozone production to changes in NOₓ emission rates to be examined. The VOC reactivity in the Manaus region was dominated by a high burden of biogenic isoprene from the background rainforest atmosphere, and therefore NOₓ control is assumed to be the most effective ozone abatement strategy. Both observations and model results show that clustering NOₓ emission sources, such as power plants, within a limited area can decrease the ozone production efficiency in the near field of the urban populated cores. On the other hand, remote areas downwind of the city then bear the brunt, being exposed to increased ozone production and nitrogen deposition. The simulated maximum stomatal ozone uptake fluxes were 4 nmol m⁻² s⁻¹ close to Manaus and decreased only to about 2 nmol m⁻² s⁻¹ within a travel distance of >1500 km downwind of Manaus, clearly exceeding the critical threshold level for broadleaf trees. Likewise, the simulated nitrogen deposition close to Manaus was about 70 kg N ha⁻¹ a⁻¹, decreasing only to about 30 kg N ha⁻¹ a⁻¹ after three days of simulation.
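The ΔCN/ΔCO and CO-normalized quantities above are excess ratios, i.e., plume enhancements over background air normalized by the CO enhancement. A minimal sketch of that standard normalization follows; the function name and the example values are illustrative assumptions, not campaign data.

```python
# Minimal sketch of a CO-normalized excess ratio (e.g., ΔCN/ΔCO), computed as
# the plume enhancement of a species divided by the CO enhancement, both
# relative to background air. All numbers are illustrative, not campaign data.
import numpy as np

def excess_ratio(x_plume, x_bg, co_plume, co_bg):
    """Return Δx/ΔCO, the background-corrected enhancement ratio."""
    dx = np.asarray(x_plume) - x_bg
    dco = np.asarray(co_plume) - co_bg
    return dx / dco

# Hypothetical transect values: particle number (cm^-3) and CO (ppb)
cn_plume = [9500.0, 12000.0, 15000.0]
co_plume = [130.0, 145.0, 160.0]
cn_background, co_background = 400.0, 100.0

print(excess_ratio(cn_plume, cn_background, co_plume, co_background))
# -> ΔCN/ΔCO in cm^-3 ppb^-1 at each transect point
```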
Abstract:
In extensions of the standard model with a heavy fourth generation, one important question is what makes the fourth-generation lepton sector, particularly the neutrinos, so different from the lighter three generations. We study this question in the context of models of electroweak symmetry breaking in warped extra dimensions, where the flavor hierarchy is generated by choosing the localization of the zero-mode fermions in the extra dimension. In this setup the Higgs sector is localized near the infrared brane, whereas the Majorana mass term is localized at the ultraviolet brane. As a result, light neutrinos are almost entirely Majorana particles, whereas the fourth-generation neutrino is mostly a Dirac fermion. We show that it is possible to obtain heavy fourth-generation leptons in regions of parameter space where the light neutrino masses and mixings are compatible with observation. We study the impact of these bounds, as well as the ones from lepton flavor violation, on the phenomenology of these models.
Abstract:
In this paper we study the one- and two-loop contributions to the free energy in QED with Lorentz symmetry breaking, introduced via constant CPT-even Lorentz-breaking parameters, in the high-temperature limit. We determine the impact of the Lorentz-violating term on the free energy and carry out a numerical estimate for the Lorentz-breaking parameter.
Abstract:
The addition of transition metals to III-V semiconductors radically changes their electronic, magnetic, and structural properties. We show by ab initio calculations that, in contrast to conventional semiconductor alloys, the lattice parameter in magnetic semiconductor alloys, including those with dilute concentrations, strongly deviates from Vegard's law. We find a direct correlation between the magnetic moment and the anion-transition-metal bond lengths and derive a simple and general formula that determines the lattice parameter of a particular magnetic semiconductor from both the composition and the magnetic moment. This dependence can explain some experimentally observed anomalies and stimulate further investigations.
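For reference, Vegard's law, the linear baseline from which these magnetic alloys are found to deviate, interpolates the lattice parameter of an alloy A(1-x)B(x)C between its end members:

```latex
% Vegard's law: linear interpolation of the alloy lattice parameter
% between the end-member compounds AC and BC.
\begin{equation}
  a_{A_{1-x}B_{x}C}(x) \;=\; (1-x)\, a_{AC} \;+\; x\, a_{BC}
\end{equation}
```

The formula derived in the paper additionally accounts for the magnetic moment; its explicit form is not reproduced here.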
Abstract:
We report results of magnetoacoustic studies in the quantum spin-chain magnet NiCl₂-4SC(NH₂)₂ (DTN) having a field-induced ordered antiferromagnetic (AF) phase. In the vicinity of the quantum critical points (QCPs) the acoustic c₃₃ mode manifests a pronounced softening accompanied by energy dissipation of the sound wave. The acoustic anomalies are traced up to T > T_N, where the thermodynamic properties are determined by fermionic magnetic excitations, the "hallmark" of one-dimensional (1D) spin chains. On the other hand, as established in earlier studies, the AF phase in DTN is governed by bosonic magnetic excitations. Our results suggest the presence of a crossover from a 1D fermionic to a three-dimensional bosonic character of the magnetic excitations in DTN in the vicinity of the QCPs.
Abstract:
We consider a polling model with multiple stations, each with Poisson arrivals and a queue of infinite capacity. The service regime is exhaustive and there is Jacksonian feedback of served customers. What is new here is that when the server comes to a station it chooses the service rate and the feedback parameters at random; these remain valid during the whole stay of the server at that station. We give criteria for recurrence, transience, and existence of the s-th moment of the return time to the empty state for this model. This paper generalizes the model considered in [Ann. Appl. Probab. 17 (2007) 1447-1473], in which only two stations accept arriving jobs. Our results are stated in terms of Lyapunov exponents for random matrices. From the recurrence criteria it can be seen that the polling model with parameter regeneration can exhibit the unusual phenomenon of null recurrence over a thick region of parameter space.
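The criteria above are stated in terms of Lyapunov exponents for products of random matrices. A minimal sketch of the standard Monte Carlo estimate of the top Lyapunov exponent, lambda = lim (1/n) log ||M_n ... M_1||, is shown below; the 2x2 matrix distribution is purely illustrative and is not derived from the polling model's parameters.

```python
# Minimal sketch: Monte Carlo estimate of the top Lyapunov exponent of a
# product of i.i.d. random matrices, lambda = lim (1/n) log ||M_n ... M_1||.
# The 2x2 matrix distribution below is illustrative only.
import numpy as np

def top_lyapunov_exponent(sample_matrix, n_steps=100_000, dim=2, seed=0):
    rng = np.random.default_rng(seed)
    v = np.ones(dim) / np.sqrt(dim)      # track a direction, renormalizing
    log_growth = 0.0
    for _ in range(n_steps):
        v = sample_matrix(rng) @ v
        norm = np.linalg.norm(v)
        log_growth += np.log(norm)
        v /= norm
    return log_growth / n_steps

def random_matrix(rng):
    # Illustrative distribution: random positive entries.
    return rng.uniform(0.1, 1.0, size=(2, 2))

print(f"estimated top Lyapunov exponent: {top_lyapunov_exponent(random_matrix):.4f}")
```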
Abstract:
Background: Worldwide, a high proportion of HIV-infected individuals enter into HIV care late. Here, our objective was to estimate the impact that late entry into HIV care has had on AIDS mortality rates in Brazil. Methodology/Principal Findings: We analyzed data from information systems regarding HIV-infected adults who sought treatment at public health care facilities in Brazil from 2003 to 2006. We initially estimated the prevalence of late entry into HIV care, as well as the probability of death in the first 12 months, the percentage of the risk of death attributable to late entry, and the number of avoidable deaths. We subsequently adjusted the annual AIDS mortality rate by excluding such deaths. Of the 115,369 patients evaluated, 50,358 (43.6%) had entered HIV care late, and 18,002 died in the first 12 months, representing a 16.5% probability of death in the first 12 months (95% CI: 16.3-16.7). Comparing patients who entered HIV care late with those who gained timely access, the risk ratio for death was 49.5 (95% CI: 45.1-54.2). The percentage of the risk of death attributable to late entry was 95.5%, translating to 17,189 potentially avoidable deaths. Averting those deaths would have lowered the 2003-2006 AIDS mortality rate by 39.5%. Including asymptomatic patients with CD4+ T cell counts >200 and ≤350 cells/mm³ in the group considered to have entered HIV care late increased this proportion by 1.8%. Conclusions/Significance: In Brazil, antiretroviral drugs have reduced AIDS mortality by 43%. Timely entry into care would reduce that rate by a similar proportion and would increase the effectiveness of the HIV care program by 45.2%. The World Health Organization recommendation that asymptomatic patients with CD4+ T cell counts ≤350 cells/mm³ be treated would not have a significant impact on this scenario.
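The reported 95.5% is consistent with the standard population attributable fraction computed from the prevalence of late entry (43.6%) and the risk ratio (49.5). A minimal sketch of that arithmetic, assuming Levin's formula PAF = p(RR - 1)/(1 + p(RR - 1)), is given below; whether this is exactly the estimator used in the study is an assumption.

```python
# Minimal sketch: population attributable fraction (Levin's formula),
# using the prevalence of late entry and the risk ratio reported in the abstract.
def population_attributable_fraction(prevalence, risk_ratio):
    """PAF = p*(RR - 1) / (1 + p*(RR - 1))."""
    excess = prevalence * (risk_ratio - 1)
    return excess / (1 + excess)

paf = population_attributable_fraction(prevalence=0.436, risk_ratio=49.5)
print(f"PAF = {paf:.1%}")   # ~95.5%, matching the reported attributable risk
```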
Abstract:
Background: There are several studies in the literature depicting measurement error in gene expression data and also several others about regulatory network models. However, only a small fraction describe a combination of measurement error in mathematical regulatory networks and show how to identify these networks under different rates of noise. Results: This article investigates the effects of measurement error on the estimation of the parameters in regulatory networks. Simulation studies indicate that, in both time series (dependent) and non-time series (independent) data, the measurement error strongly affects the estimated parameters of the regulatory network models, biasing them as predicted by the theory. Moreover, when testing the parameters of the regulatory network models, p-values computed by ignoring the measurement error are not reliable, since the rate of false positives is not controlled under the null hypothesis. In order to overcome these problems, we present an improved version of the Ordinary Least Squares estimator for independent (regression models) and dependent (autoregressive models) data when the variables are subject to noise. Moreover, measurement error estimation procedures for microarrays are also described. Simulation results also show that both corrected methods perform better than the standard ones (i.e., ignoring measurement error). The proposed methodologies are illustrated using microarray data from lung cancer patients and mouse liver time series data. Conclusions: Measurement error seriously affects the identification of regulatory network models and must therefore be reduced or taken into account in order to avoid erroneous conclusions. This could be one of the reasons for the high biological false positive rates identified in actual regulatory network models.
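The bias referred to above is the classical attenuation of regression slopes under measurement error, and a standard remedy is a method-of-moments correction that divides the naive slope by the reliability ratio. The sketch below illustrates this generic errors-in-variables correction under an assumed known noise variance; it is not the authors' exact estimator.

```python
# Minimal sketch of attenuation bias under classical measurement error and a
# method-of-moments correction: beta_corrected = beta_naive / reliability,
# with reliability = var(x_true) / (var(x_true) + var(noise)). The noise
# variance is assumed known (e.g., from technical replicates); this is a
# generic errors-in-variables illustration, not the paper's estimator.
import numpy as np

rng = np.random.default_rng(42)
n = 5000
beta_true, noise_var = 2.0, 0.5

x_true = rng.normal(0.0, 1.0, n)                          # latent expression levels
y = beta_true * x_true + rng.normal(0.0, 0.3, n)          # response
x_obs = x_true + rng.normal(0.0, np.sqrt(noise_var), n)   # noisy measurements

# Naive OLS slope using the noisy regressor (attenuated toward zero).
beta_naive = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)

# Method-of-moments correction assuming noise_var is known.
reliability = (np.var(x_obs, ddof=1) - noise_var) / np.var(x_obs, ddof=1)
beta_corrected = beta_naive / reliability

print(f"naive OLS slope:     {beta_naive:.3f}")     # ~ beta_true / (1 + noise_var) here
print(f"corrected OLS slope: {beta_corrected:.3f}") # ~ beta_true
```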
Abstract:
The Community Climate Model (CCM3) from the National Center for Atmospheric Research (NCAR) is used to investigate the effect of South Atlantic sea surface temperature (SST) anomalies on the interannual to decadal variability of South American precipitation. Two ensembles composed of multidecadal simulations forced with monthly SST data from the Hadley Centre for the period 1949 to 2001 are analysed. A statistical treatment based on the signal-to-noise ratio and Empirical Orthogonal Functions (EOF) is applied to the ensembles in order to reduce the internal variability among the integrations. The ensemble treatment shows a spatial and temporal dependence of reproducibility. A high degree of reproducibility is found in the tropics, while the extratropics are apparently less reproducible. Austral autumn (MAM) and spring (SON) precipitation appears to be more reproducible over the South America-South Atlantic region than summer (DJF) and winter (JJA) rainfall. While the Inter-tropical Convergence Zone (ITCZ) region is dominated by external variance, the South Atlantic Convergence Zone (SACZ) over South America is predominantly determined by internal variance, which makes it a difficult phenomenon to predict. In contrast, the SACZ over the western South Atlantic appears to be more sensitive to the subtropical SST anomalies than over the continent. An attempt is made to separate the atmospheric response forced by the South Atlantic SST anomalies from that associated with the El Niño-Southern Oscillation (ENSO). Results show that both the South Atlantic and Pacific SSTs modulate the intensity and position of the SACZ during DJF. In particular, the subtropical South Atlantic SSTs are more important than ENSO in determining the position of the SACZ over the southeast Brazilian coast during DJF. On the other hand, the ENSO signal seems to influence the intensity of the SACZ not only in DJF but especially that of its oceanic branch during MAM. Both local and remote influences, however, are confounded by the large internal variance in the region. During MAM and JJA, the South Atlantic SST anomalies affect the magnitude and the meridional displacement of the ITCZ. In JJA, ENSO has relatively little influence on the interannual variability of the simulated rainfall. During SON, however, ENSO seems to counteract the effect of the subtropical South Atlantic SST variations on convection over South America.
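The reproducibility and external/internal variance partition referred to above are commonly quantified by comparing the variance of the ensemble-mean time series with the spread among ensemble members. A minimal sketch of that decomposition, with synthetic data standing in for the simulated precipitation, is shown below.

```python
# Minimal sketch of an ensemble signal-to-noise decomposition: the externally
# forced (SST-driven) variance is estimated from the ensemble-mean time series,
# the internal variance from the spread among members. Data are synthetic.
import numpy as np

rng = np.random.default_rng(7)
n_members, n_years = 4, 53                               # e.g., 1949-2001

forced_signal = rng.normal(0.0, 1.0, n_years)            # common SST-forced part
internal_noise = rng.normal(0.0, 1.5, (n_members, n_years))
ensemble = forced_signal + internal_noise                # member time series

ensemble_mean = ensemble.mean(axis=0)
external_var = ensemble_mean.var(ddof=1)                 # forced + residual noise
internal_var = ensemble.var(axis=0, ddof=1).mean()       # spread among members

snr = external_var / internal_var
print(f"external variance: {external_var:.2f}")
print(f"internal variance: {internal_var:.2f}")
print(f"signal-to-noise ratio: {snr:.2f}")   # low values -> poorly reproducible
```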
Abstract:
Although the catalytic electro-oxidation of most small organic molecules exhibits oscillatory kinetics under certain conditions, there are few systematic studies concerning the influence of experimental parameters on the oscillatory dynamics. Of the studies available, most are devoted to C1 molecules, and only scattered data are available for C2 molecules. We present in this work a comprehensive study of the electro-oxidation of ethylene glycol (EG) on polycrystalline platinum surfaces in alkaline media. The system was studied by means of electrochemical impedance spectroscopy, cyclic voltammetry, and chronoamperometry, and the impact of parameters such as applied current, ethylene glycol concentration, and temperature was investigated. As in other parent systems, the instabilities in this system were associated with a hidden negative differential resistance, as identified from the impedance data. Very rich and robust dynamics were observed, including harmonic and mixed-mode oscillations and chaotic states, in some parameter regions. Oscillation frequencies of about 16 Hz characterized the fastest oscillations ever reported for the electro-oxidation of small organic molecules. Those high frequencies were strongly influenced by the electrolyte pH and far less affected by the EG concentration. The system showed a regular dependence on temperature under voltammetric conditions but was rather temperature-independent within the oscillatory regime.
Abstract:
There is more to sustainable forest management than reduced-impact logging. Partnerships between multiple actors are needed in order to create the institutional context for good forest governance and sustainable forest management and to stimulate the necessary local community involvement. The idea behind this is that the parties can achieve more jointly than on their own by combining the assets, knowledge, skills, and political power of actors at different levels of scale. This article aims to demonstrate by example the nature and variety of forest-related partnerships in Brazilian Amazonia. Based on the lessons learned from these cases and the authors' experience, the principal characteristics of successful partnerships are described, with a focus on political and socioeconomic aspects. These characteristics include fairly negotiated partnership objectives, the active involvement of the public sector as well as impartial brokers, equitable and cost-effective institutional arrangements, sufficient and equitably shared benefits for all the parties involved, addressing socioeconomic drawbacks, and taking measures to maintain sustainable exploitation levels. The authors argue that, in addition to product-oriented partnerships, which focus on sustainable forest management, there is also a need for politically oriented partnerships based on civil society coalitions. The watchdog function of these politically oriented partnerships, their awareness-raising campaigns regarding detrimental policies and practices, and their advocacy for good forest governance are essential for the creation of an appropriate legal and political framework for sustainable forest management.
Abstract:
Objective: We carry out a systematic assessment of a suite of kernel-based learning machines applied to the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data, whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values of the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of the wavelet basis on the quality of the extracted features; four wavelet basis functions were considered in this study. Then, we provide the average accuracy values (estimated by cross-validation) delivered by 252 kernel machine configurations; in particular, 40% and 35% of the best-calibrated models of the standard and least squares SVMs, respectively, reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing more consistently. Moreover, the choice of the kernel function and parameter value, as well as the choice of the feature extractor, are critical decisions, although the choice of the wavelet family seems not to be so relevant. Also, the statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged among all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality).
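A minimal sketch of the kind of sensitivity analysis described above (cross-validated accuracy of a standard RBF-kernel SVM swept over the kernel parameter) is given below; synthetic two-class feature vectors stand in for the EEG-derived wavelet/Lyapunov features, and the parameter grid is illustrative.

```python
# Minimal sketch: cross-validated accuracy of a standard RBF-kernel SVM as a
# function of the kernel parameter, in the spirit of the sensitivity analysis
# described above. Synthetic features stand in for the EEG-derived ones.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Two-class problem standing in for normal vs. epileptic EEG segments.
X, y = make_classification(n_samples=200, n_features=21, n_informative=8,
                           random_state=0)

for gamma in [0.001, 0.01, 0.1, 1.0, 10.0]:        # kernel "radius" sweep
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma=gamma, C=1.0))
    scores = cross_val_score(model, X, y, cv=5)
    print(f"gamma={gamma:<6} mean CV accuracy: {scores.mean():.3f}")
```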
Abstract:
The Random Parameter model was proposed to explain the structure of the covariance matrix in problems where most, but not all, of the eigenvalues of the covariance matrix can be explained by Random Matrix Theory. In this article, we explore the scaling properties of the model, as observed in the multifractal structure of the simulated time series. We use the Wavelet Transform Modulus Maxima technique to obtain the dependence of the multifractal spectrum on the parameters of the model. The model shows a scaling structure compatible with the stylized facts for a reasonable choice of the parameter values.
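The Random Matrix Theory benchmark referred to above is typically the Marchenko-Pastur law, under which the eigenvalues of the correlation matrix of purely random series fall (asymptotically) within [(1 - sqrt(q))^2, (1 + sqrt(q))^2] with q = N/T; eigenvalues outside this band carry genuine structure. A minimal sketch of that comparison with synthetic data follows.

```python
# Minimal sketch: compare the eigenvalues of an empirical correlation matrix
# against the Marchenko-Pastur band [(1 - sqrt(q))^2, (1 + sqrt(q))^2], q = N/T,
# which describes purely random (noise) eigenvalues. Data are synthetic.
import numpy as np

rng = np.random.default_rng(1)
n_assets, n_obs = 50, 500
q = n_assets / n_obs
lower, upper = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2

# i.i.d. returns plus one common "market" factor to create a large eigenvalue.
market = rng.normal(0.0, 1.0, n_obs)
returns = rng.normal(0.0, 1.0, (n_obs, n_assets)) + 0.5 * market[:, None]

corr = np.corrcoef(returns, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)

outside = eigvals[(eigvals < lower) | (eigvals > upper)]
print(f"Marchenko-Pastur band: [{lower:.2f}, {upper:.2f}]")
print(f"eigenvalues outside the band (structure): {np.round(outside, 2)}")
```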
Impact of cancer-related symptom synergisms on health-related quality of life and performance status
Abstract:
To identify the impact of multiple symptoms and their co-occurrence on health-related quality of life (HRQOL) dimensions and performance status (PS), 115 outpatients with cancer, who were not receiving active cancer treatment and were recruited from a university hospital in São Paulo, Brazil, completed the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire-C30, the Beck Depression Inventory, and the Brief Pain Inventory. Karnofsky Performance Status scores were also recorded. Application of TwoStep Cluster analysis resulted in two distinct patient subgroups based on 113 patient experiences with pain, depression, fatigue, insomnia, constipation, lack of appetite, dyspnea, nausea, vomiting, and diarrhea. One subgroup had multiple, severe symptoms, and the other had fewer symptoms of lower severity. The multiple and severe symptom subgroup had worse PS, role functioning, and physical, emotional, cognitive, social, and overall HRQOL. The multiple and severe symptom subgroup was also six times as likely as the lower-severity subgroup to have poor role functioning; five times as likely to have poor emotional HRQOL; four times as likely to have poor PS and poor physical and overall HRQOL; and three times as likely to have poor cognitive and social HRQOL, independent of gender, age, level of education, and economic condition. Classification and Regression Tree analyses were undertaken to identify which co-occurring symptoms would best determine reductions in HRQOL and PS. Pain and fatigue were identified as indicators of reduction in physical HRQOL and PS. Fatigue and insomnia were associated with reduction in cognitive HRQOL; depression and pain with reduction in social HRQOL; and fatigue and constipation with reduction in role functioning. Only depression was associated with reduction in overall HRQOL. These data demonstrate that there is a synergistic effect among distinct cancer symptoms that results in reductions in HRQOL dimensions and PS.
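A minimal sketch of the overall analysis pattern (clustering patients by symptom severity and comparing the odds of a poor outcome between the resulting subgroups) is shown below; KMeans stands in for the TwoStep Cluster procedure, and all data and probabilities are synthetic illustrations, not study results.

```python
# Minimal sketch: cluster patients by symptom-severity scores and compare the
# odds of poor functioning between the resulting subgroups. KMeans stands in
# for the TwoStep Cluster procedure used in the study; data are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n = 115
# Synthetic 0-10 severity scores for ten symptoms (pain, fatigue, insomnia, ...).
symptoms = np.clip(rng.normal(4, 2.5, (n, 10)), 0, 10)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(symptoms)
severe = labels == np.argmax([symptoms[labels == k].mean() for k in (0, 1)])

# Synthetic binary outcome: poor role functioning, more common with high severity.
poor_functioning = rng.random(n) < np.where(severe, 0.6, 0.2)

def odds_ratio(outcome, exposed):
    a = np.sum(outcome & exposed); b = np.sum(~outcome & exposed)
    c = np.sum(outcome & ~exposed); d = np.sum(~outcome & ~exposed)
    return (a * d) / (b * c)

print(f"odds ratio (severe vs. milder subgroup): {odds_ratio(poor_functioning, severe):.2f}")
```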