985 results for error rates


Relevance: 20.00%

Abstract:

Mental health problems are common in primary health care, particularly anxiety and depression. This study aims to estimate the prevalence of common mental disorders and their associations with socio-demographic characteristics in primary care in Brazil (Family Health Strategy). It involved a multicenter cross-sectional study with patients from Rio de Janeiro, São Paulo, Fortaleza (Ceará State) and Porto Alegre (Rio Grande do Sul State), assessed using the General Health Questionnaire (GHQ-12) and the Hospital Anxiety and Depression Scale (HAD). The rates of mental disorders in patients from Rio de Janeiro, São Paulo, Fortaleza and Porto Alegre were 51.9%, 53.3%, 64.3% and 57.7%, respectively, with significant differences for Porto Alegre and Fortaleza relative to Rio de Janeiro after adjusting for confounders. Mental health problems were especially prevalent among females, the unemployed, those with less education and those with lower incomes. In the context of the Brazilian government's moves toward developing primary health care and reorganizing mental health policies, it is relevant to consider common mental disorders as a priority alongside other chronic health conditions.

Relevance: 20.00%

Abstract:

Melatonin (MEL) acts as a powerful scavenger of free radicals, and direct gonadal responses to melatonin have been reported in the literature. Few studies, however, have evaluated the effect of MEL during in vitro maturation (IVM) on bovine embryos. This study tested whether adding MEL to a maturation medium (MM) without gonadotropins affects nuclear maturation and embryo development rates and the incidence of DNA damage in the resulting embryos. Cumulus-oocyte complexes were aspirated from abattoir ovaries and cultured in MM (TCM-199 medium supplemented with 10% fetal calf serum, FCS) at 39 °C and 5% CO₂ in air. After 24 hours of culture in MM supplemented with 0.5 µg mL⁻¹ FSH and 5.0 µg mL⁻¹ LH; 10⁻⁹ M MEL; or 10⁻⁹ M MEL plus 0.5 µg mL⁻¹ FSH and 5.0 µg mL⁻¹ LH, the oocytes were stained with Hoechst 33342 to evaluate the nuclear maturation rate. After in vitro fertilization and embryo culture, development rates were evaluated and the blastocysts were assessed for DNA damage by Comet assay. Melatonin added to the MM, alone or in combination with gonadotropins, had no effect on nuclear maturation, cleavage or blastocyst rates, which ranged from 88% to 90%, 85% to 88% and 42% to 46%, respectively. The extent of DNA damage in embryos was likewise unaffected by MEL supplementation during IVM. Thus, the addition of 10⁻⁹ M MEL to the MM did not improve nuclear maturation or embryo development rates, nor reduce the incidence of DNA damage in the resulting embryos, but it was able to properly substitute for gonadotropins during IVM.

Relevance: 20.00%

Abstract:

This work evaluated the effects of Tris(hydroxymethyl)aminomethane (TRIS) buffer and its interaction with nutrient concentration on the development of Gracilaria birdiae, a common species on the Brazilian coast that has been exploited for agar production. Responses to the different conditions were assessed through growth rates and pigment content (chlorophyll a, phycoerythrin, phycocyanin and allophycocyanin). Provasoli's nutrient solution, with and without TRIS, was tested at concentrations of 12.5%, 25% and 50%, and the pH was monitored. G. birdiae grew best in the absence of TRIS and at the lower nutrient concentrations of 12.5% and 25% (growth rates of 10.8-11.3% day⁻¹). Higher contents of phycoerythrin and chlorophyll a were also observed without TRIS at 12.5% and 25% (phycoerythrin, 649.6-698.0 μg g⁻¹ fresh biomass; chlorophyll a, 156.0-168.6 μg g⁻¹ fresh biomass). These findings highlight the deleterious effect of TRIS on growth and on phycoerythrin and chlorophyll a content. They also demonstrate the importance of choosing a nutrient concentration appropriate to the intrinsic characteristics of each species for laboratory cultures.

Relevance: 20.00%

Abstract:

Tolerance to the combined effects of temperature and salinity was investigated in the interstitial isopod Coxicerberus ramosae (Albuquerque, 1978), a species of the intertidal zone of sandy beaches in Rio de Janeiro, Brazil. The animals were collected at Praia Vermelha beach. The experiments lasted 24 h and used nine salinities and seven temperatures, for a total of 63 combinations, with thirty animals tested in each combination. The species showed high survival in most of the combinations. A temperature of 35 °C was lethal, and at 5 °C the animals tolerated only a narrow range of salinities. Statistical analyses showed significant effects of temperature and salinity on survival, confirming the euryhalinity and eurythermy of this species.

Relevance: 20.00%

Abstract:

As part of the Tigecycline Evaluation and Surveillance Trial (T.E.S.T.), Gram-positive and Gram-negative bacterial isolates were collected from 33 centers in Latin America (in Argentina, Brazil, Chile, Colombia, Guatemala, Honduras, Jamaica, Mexico, Panama, Puerto Rico, and Venezuela) from January 2004 to September 2007. Argentina and Mexico were the greatest contributors of isolates to this study. Susceptibilities were determined according to Clinical and Laboratory Standards Institute guidelines. Resistance levels were high for most key organisms across Latin America: 48.3% of Staphylococcus aureus isolates were methicillin-resistant, while 21.4% of Acinetobacter spp. isolates were imipenem-resistant. Extended-spectrum β-lactamases were reported in 36.7% of Klebsiella pneumoniae and 20.8% of E. coli isolates. Tigecycline was the most active agent against Gram-positive isolates. It was also highly active against all Gram-negative organisms, with the exception of Pseudomonas aeruginosa, against which piperacillin-tazobactam was the most active agent tested (79.3% of isolates susceptible). The in vitro activity of tigecycline against both Gram-positive and Gram-negative isolates indicates that it may be a useful tool for the treatment of nosocomial infections, even those caused by organisms that are resistant to other antibacterial agents.

Relevance: 20.00%

Abstract:

This study proposes a simplified mathematical model to describe the processes occurring in an anaerobic sequencing batch biofilm reactor (ASBBR) treating lipid-rich wastewater. The reactor, subjected to rising organic loading rates, contained biomass immobilized on cubic polyurethane foam matrices and was operated at 32 ± 2 °C using 24-h batch cycles. During the adaptation period, the reactor was fed a synthetic substrate for 46 days and operated without agitation. Once agitation was raised to 500 rpm, the organic loading rate (OLR) rose from 0.3 to 1.2 g chemical oxygen demand (COD) L⁻¹ day⁻¹. The ASBBR was then fed fat-rich (dairy) wastewater over an operation period of 116 days, during which four operational conditions (OCs) were tested: 1.1 ± 0.2 (OC1), 4.5 ± 0.4 (OC2), 8.0 ± 0.8 (OC3), and 12.1 ± 2.4 g COD L⁻¹ day⁻¹ (OC4). The bicarbonate alkalinity (BA)/COD supplementation ratio was 1:1 at OC1, 1:2 at OC2, and 1:3 at OC3 and OC4. Total COD removal efficiencies were higher than 90%, with constant production of bicarbonate alkalinity, in all OCs tested. After the process reached stability, temporal profiles of substrate consumption were obtained. A simplified first-order model was fitted to these experimental data, making it possible to infer kinetic parameters. A simplified mathematical model correlating soluble COD with volatile fatty acids (VFA) was also proposed, from which the consumption rates of intermediate products such as propionic and acetic acids were inferred. The results showed that the microbial consortium worked properly and high efficiencies were obtained even with high initial substrate concentrations, which led to the accumulation of intermediate metabolites and caused low specific consumption rates.
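
The abstract does not give the fitted equations, so the following is only a minimal sketch of how a first-order substrate-decay model can be fitted to a temporal COD profile of this kind; the data points and starting values are hypothetical, for illustration only.

# Minimal sketch: fitting a first-order substrate-consumption model
# C(t) = C_res + (C0 - C_res) * exp(-k1 * t) to a temporal COD profile.
# All data below are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c_res, c0, k1):
    """First-order decay toward a residual concentration c_res."""
    return c_res + (c0 - c_res) * np.exp(-k1 * t)

t = np.array([0, 2, 4, 8, 12, 16, 20, 24])                    # h within one batch cycle
cod = np.array([4.1, 3.0, 2.2, 1.2, 0.7, 0.45, 0.35, 0.30])   # g COD L^-1

params, _ = curve_fit(first_order, t, cod, p0=[0.3, 4.0, 0.2])
c_res, c0, k1 = params
print(f"residual COD = {c_res:.2f} g/L, initial COD = {c0:.2f} g/L, k1 = {k1:.3f} h^-1")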

Relevance: 20.00%

Abstract:

This paper presents a new statistical algorithm to estimate rainfall over the Amazon Basin using the Tropical Rainfall Measuring Mission (TRMM) Microwave Imager (TMI). The algorithm relies on empirical relationships, derived for different rain-type systems, between coincident measurements of surface rainfall rate and 85-GHz polarization-corrected brightness temperature as observed by the precipitation radar (PR) and TMI on board the TRMM satellite. The scheme includes rain/no-rain area delineation (screening) and system-type classification routines for rain retrieval. The algorithm is validated against independent measurements from the TRMM-PR and from S-band dual-polarization Doppler radar (S-Pol) surface rainfall data for two different periods. Moreover, the performance of this rainfall estimation technique is evaluated against well-known methods, namely the TRMM-2A12 [the Goddard profiling algorithm (GPROF)], the Goddard scattering algorithm (GSCAT), and the National Environmental Satellite, Data, and Information Service (NESDIS) algorithms. The proposed algorithm shows a normalized bias of approximately 23% for both the PR and S-Pol ground-truth datasets and a mean error of 0.244 mm h⁻¹ (PR) and −0.157 mm h⁻¹ (S-Pol). For rain volume estimates using the PR as reference, a correlation coefficient of 0.939 and a normalized bias of 0.039 were found. With respect to rainfall distributions and rain area comparisons, the results showed that the proposed formulation is efficient and compatible with the physics and dynamics of the observed systems over the area of interest. Among the other algorithms, GSCAT presented low normalized bias for rain areas and rain volume [0.346 (PR) and 0.361 (S-Pol)], and GPROF showed a rainfall distribution similar to that of the PR and S-Pol but with a bimodal shape. Last, the five algorithms were evaluated during the TRMM Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) 1999 field campaign to verify the precipitation characteristics observed during the easterly and westerly Amazon wind-flow regimes. The proposed algorithm presented a cumulative rainfall distribution similar to the observations during the easterly regime, but underestimated during the westerly period for rainfall rates above 5 mm h⁻¹. NESDIS(1) overestimated in both wind regimes but gave the best westerly representation. NESDIS(2), GSCAT, and GPROF underestimated in both regimes, but GPROF was closest to the observations during the easterly flow.
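
For readers unfamiliar with the quoted scores, the sketch below computes normalized bias and mean error under their usual definitions (relative difference of totals and average residual, respectively); the paper may define them slightly differently, and the rain-rate pairs are hypothetical.

# Sketch of the validation metrics quoted above, under common definitions.
import numpy as np

def normalized_bias(estimated, reference):
    """Relative difference of total rainfall (dimensionless)."""
    return (np.sum(estimated) - np.sum(reference)) / np.sum(reference)

def mean_error(estimated, reference):
    """Average residual, in the units of the inputs (mm/h here)."""
    return np.mean(np.asarray(estimated) - np.asarray(reference))

# Hypothetical rain-rate pairs (mm/h): algorithm estimate vs. PR ground truth.
est = np.array([1.2, 0.0, 5.4, 2.1, 0.8])
ref = np.array([1.0, 0.2, 4.5, 2.0, 0.5])
print(normalized_bias(est, ref), mean_error(est, ref))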

Relevance: 20.00%

Abstract:

Bee males (drones) of stingless bees tend to congregate near entrances of conspecific nests, where they wait for virgin queens initiating their nuptial flight. We observed that the Neotropical solitary wasp Trachypus boharti (Hymenoptera, Crabronidae) specifically preys on males of the stingless bee Scaptotrigona postica (Hymenoptera, Apidae); these wasps captured up to 50 males per day near the entrance of a single hive. Over 90% of the wasp attacks were unsuccessful; such erroneous attacks often involved conspecific wasps and worker bees. After capturing non-male prey, wasps almost immediately released these individuals unharmed and continued hunting. A simple behavioral experiment showed that at short distances wasps were neither specifically attracted to S. postica males nor repelled by workers of the same species. Likely, short-range prey detection near the bees' nest is achieved mainly by vision, whereas close-range prey recognition is based principally on chemical and/or mechanical cues. We argue that the dependence on the wasp's visual perception during attack, together with the crowded and dynamic hunting conditions, caused wasps to make many failed predation attempts. Two wasp-density-related factors, wasp-prey distance and wasp-wasp encounters, may account for the fact that the highest rates of male capture and of unsuccessful wasp-bee encounters occurred at intermediate wasp numbers.

Relevance: 20.00%

Abstract:

Background: Worldwide, a high proportion of HIV-infected individuals enter into HIV care late. Here, our objective was to estimate the impact that late entry into HIV care has had on AIDS mortality rates in Brazil. Methodology/Principal Findings: We analyzed data from information systems regarding HIV-infected adults who sought treatment at public health care facilities in Brazil from 2003 to 2006. We first estimated the prevalence of late entry into HIV care, the probability of death in the first 12 months, the percentage of the risk of death attributable to late entry, and the number of avoidable deaths. We then adjusted the annual AIDS mortality rate by excluding such deaths. Of the 115,369 patients evaluated, 50,358 (43.6%) had entered HIV care late, and 18,002 died in the first 12 months, representing a 16.5% probability of death in the first 12 months (95% CI: 16.3-16.7). Comparing patients who entered HIV care late with those who gained timely access, we found a risk ratio for death of 49.5 (95% CI: 45.1-54.2). The percentage of the risk of death attributable to late entry was 95.5%, translating to 17,189 potentially avoidable deaths. Averting those deaths would have lowered the 2003-2006 AIDS mortality rate by 39.5%. Including asymptomatic patients with CD4+ T-cell counts > 200 and ≤ 350 cells/mm³ in the group that entered HIV care late increased this proportion by 1.8%. Conclusions/Significance: In Brazil, antiretroviral drugs have reduced AIDS mortality by 43%. Timely entry into care would reduce that rate by a similar proportion and would increase the effectiveness of the program for HIV care by 45.2%. The World Health Organization recommendation that asymptomatic patients with CD4+ T-cell counts ≤ 350 cells/mm³ be treated would not have a significant impact on this scenario.
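
For orientation, the snippet below applies the textbook attributable-fraction formula to the reported risk ratio; the paper's 95.5% figure comes from its own adjusted analysis, so this simple calculation is only a rough cross-check, not the authors' method.

# Textbook attributable-fraction arithmetic, for orientation only.
rr = 49.5                        # reported risk ratio for death, late vs. timely entry
af_exposed = (rr - 1.0) / rr     # attributable fraction among the exposed
print(f"AF among exposed = {af_exposed:.1%}")   # ~98%, of the same order as the 95.5% reported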

Relevance: 20.00%

Abstract:

Background: There are several studies in the literature depicting measurement error in gene expression data, and several others on regulatory network models. However, only a small fraction combine measurement error with mathematical regulatory networks and show how to identify these networks under different rates of noise. Results: This article investigates the effects of measurement error on the estimation of the parameters of regulatory networks. Simulation studies indicate that, in both time series (dependent) and non-time-series (independent) data, measurement error strongly affects the estimated parameters of the regulatory network models, biasing them as predicted by theory. Moreover, when testing the parameters of regulatory network models, p-values computed by ignoring the measurement error are not reliable, since the rate of false positives is not controlled under the null hypothesis. To overcome these problems, we present an improved version of the Ordinary Least Squares estimator for independent (regression models) and dependent (autoregressive models) data when the variables are subject to noise. Measurement error estimation procedures for microarrays are also described. Simulation results show that both corrected methods perform better than the standard ones (i.e., those ignoring measurement error). The proposed methodologies are illustrated using microarray data from lung cancer patients and mouse liver time series data. Conclusions: Measurement error seriously affects the identification of regulatory network models and must therefore be reduced or taken into account in order to avoid erroneous conclusions. This could be one of the reasons for the high biological false positive rates found in actual regulatory network models.
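
As a minimal sketch of the attenuation bias described here, the snippet below simulates a noisy regressor and applies the standard method-of-moments correction assuming a known noise variance (as might be estimated from microarray replicates). This is the classical errors-in-variables fix, not necessarily the paper's exact estimator.

# Attenuation bias of naive OLS under measurement error, and the
# standard method-of-moments correction for a known noise variance.
import numpy as np

rng = np.random.default_rng(0)
n, beta, sigma_u2 = 5000, 2.0, 0.5
x = rng.normal(size=n)                                    # true regulator expression
y = beta * x + rng.normal(scale=0.3, size=n)              # target gene expression
x_obs = x + rng.normal(scale=np.sqrt(sigma_u2), size=n)   # noisy measurement of x

beta_naive = np.cov(x_obs, y)[0, 1] / np.var(x_obs, ddof=1)
# Divide out the attenuation factor Var(x)/Var(x_obs):
beta_corr = beta_naive * np.var(x_obs, ddof=1) / (np.var(x_obs, ddof=1) - sigma_u2)
print(beta_naive, beta_corr)   # naive ~ beta/(1 + sigma_u2), corrected ~ beta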

Relevance: 20.00%

Abstract:

1. Little is known about the role of deep roots in the nutrition of forest trees and their ability to provide a safety-net service by taking up nutrients leached from the topsoil. 2. To address this issue, we studied the potential uptake of N, K and Ca by Eucalyptus grandis trees (6 years of age, 25 m mean height) in Brazil as a function of soil depth, texture and water content. We injected NO₃⁻-¹⁵N, Rb⁺ (an analogue of K⁺) and Sr²⁺ (an analogue of Ca²⁺) tracers simultaneously in solution through plastic tubes at depths of 10, 50, 150 and 300 cm in a sandy and a clayey Ferralsol. A complete randomized design was set up with three replicates of paired trees per injection depth and soil type. Recently expanded leaves were sampled at various times after tracer injection in the summer, and the experiment was repeated in the winter. Soil water contents were continuously monitored at the different depths in the two soils. 3. Determination of foliar Rb and Sr concentrations and ¹⁵N atom % made it possible to estimate the relative uptake potential (RUP) of tracer injections from the four soil depths, and the specific RUP (SRUP), defined as the RUP per unit of fine-root length density in the corresponding soil layer. 4. The highest tracer uptake rates were found in the topsoil, but contrasting RUP distributions were observed for the three tracers. Whilst the RUP was higher for NO₃⁻-¹⁵N than for Rb⁺ and Sr²⁺ in the upper 50 cm of soil, the highest SRUP values for Sr²⁺ and Rb⁺ were found at a depth of 300 cm in the sandy soil, as well as in the clayey soil when gravitational solutions reached that depth. 5. Our results suggest that the fine roots of E. grandis trees exhibit contrasting potential uptake rates with depth, depending on the nutrient. This functional specialization of roots might contribute to the high growth rates of E. grandis, efficiently providing the large amounts of nutrients required throughout the development of these fast-growing plantations.

Relevance: 20.00%

Abstract:

Objective: We carry out a systematic assessment of a suite of kernel-based learning machines on the task of epilepsy diagnosis through automatic electroencephalogram (EEG) signal classification. Methods and materials: The kernel machines investigated include the standard support vector machine (SVM), the least squares SVM, the Lagrangian SVM, the smooth SVM, the proximal SVM, and the relevance vector machine. An extensive series of experiments was conducted on publicly available data whose clinical EEG recordings were obtained from five normal subjects and five epileptic patients. The performance levels delivered by the different kernel machines are contrasted in terms of predictive accuracy, sensitivity to the kernel function/parameter value, and sensitivity to the type of features extracted from the signal. For this purpose, 26 values for the kernel parameter (radius) of two well-known kernel functions (namely, Gaussian and exponential radial basis functions) were considered, as well as 21 types of features extracted from the EEG signal, including statistical values derived from the discrete wavelet transform, Lyapunov exponents, and combinations thereof. Results: We first quantitatively assess the impact of the choice of wavelet basis on the quality of the extracted features; four wavelet basis functions were considered in this study. We then provide the average accuracy (i.e., cross-validation error) values delivered by 252 kernel machine configurations; in particular, 40%/35% of the best-calibrated standard and least squares SVM models reached a 100% accuracy rate for the two kernel functions considered. Moreover, we show the sensitivity profiles exhibited by a large sample of the configurations, whereby one can visually inspect their levels of sensitivity to the type of feature and to the kernel function/parameter value. Conclusions: Overall, the results show that all kernel machines are competitive in terms of accuracy, with the standard and least squares SVMs prevailing most consistently. The choice of kernel function and parameter value, as well as the choice of feature extractor, are critical decisions, although the choice of wavelet family seems less relevant. The statistical values calculated over the Lyapunov exponents were good sources of signal representation, but not as informative as their wavelet counterparts. Finally, a typical sensitivity profile emerged across all types of machines, involving regions of stability separated by zones of sharp variation, with some kernel parameter values clearly associated with better accuracy rates (zones of optimality).
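
The snippet below sketches the kind of evaluation described: cross-validated accuracy of an RBF-kernel SVM swept over a grid of kernel parameter values. It uses stand-in random features and labels (the paper used wavelet and Lyapunov features from real EEG recordings), and scikit-learn's gamma parameter as a proxy for the kernel radius; it is not the paper's exact protocol.

# Cross-validated grid search over an RBF kernel parameter for an SVM,
# on placeholder data standing in for EEG-derived feature vectors.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 21))            # placeholder feature vectors (21 features)
y = rng.integers(0, 2, size=200)          # placeholder normal/epileptic labels

search = GridSearchCV(SVC(kernel="rbf"),
                      {"gamma": np.logspace(-3, 2, 26)},  # 26 kernel parameter values
                      cv=10)
search.fit(X, y)
print(search.best_params_, search.best_score_)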

Relevance: 20.00%

Abstract:

In this study, the innovation approach is used to estimate the total measurement error associated with power system state estimation. This is required because the power system equations are highly correlated with each other and, as a consequence, part of the measurement error is masked. For that purpose an innovation index (II), which quantifies how much new information a measurement contains, is proposed. A critical measurement is the limiting case of a measurement with low II: it has a zero II, and its error is totally masked; in other words, that measurement brings no innovation to the gross error test. Using the II of a measurement, the gross error masked by the state estimation is recovered, and the total gross error of that measurement is composed. Instead of the classical normalized measurement residual amplitude, the corresponding normalized composed measurement residual amplitude is used in the gross error detection and identification test, but with m degrees of freedom. The gross error processing turns out to be very simple to implement, requiring only a few adaptations to existing state estimation software. The IEEE 14-bus system is used to validate the proposed gross error detection and identification test.
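
For context, the snippet below sketches the classical normalized-residual test that the composed-residual test extends; this is the textbook formulation, not the paper's innovation-index method. Note that for a critical measurement the residual variance goes to zero (the normalization is undefined), which mirrors the paper's point that such a measurement carries no innovation and its error is fully masked.

# Classical normalized-residual test for gross-error detection in
# WLS power system state estimation (textbook version).
import numpy as np

def normalized_residuals(z, H, x_hat, R):
    """z: measurement vector, H: measurement Jacobian at the estimate,
    x_hat: WLS state estimate, R: measurement error covariance."""
    r = z - H @ x_hat                        # measurement residuals
    G = H.T @ np.linalg.inv(R) @ H           # gain matrix
    omega = R - H @ np.linalg.inv(G) @ H.T   # residual covariance matrix
    # Flag measurements with |r_N| > 3 as suspect; for a critical
    # measurement omega_ii -> 0 and the test breaks down.
    return r / np.sqrt(np.diag(omega))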

Relevance: 20.00%

Abstract:

With the relentless quest for improved performance driving ever tighter manufacturing tolerances, machine tools are sometimes unable to meet the desired requirements. One option for improving the accuracy of machine tools is to compensate for their errors. Among all possible sources of machine tool error, thermally induced errors are, in general for newer machines, the most important. The present work demonstrates the evaluation and modelling of the thermal error behaviour of a CNC cylindrical grinding machine during its warm-up period.
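
A common way to model warm-up behaviour is a first-order rise toward a steady-state drift; the sketch below fits such a model to hypothetical displacement readings. Both the model form and the data are assumptions for illustration, not taken from the paper.

# Fitting a first-order warm-up drift model
# delta(t) = delta_max * (1 - exp(-t / tau)) to hypothetical readings.
import numpy as np
from scipy.optimize import curve_fit

def warmup_drift(t, delta_max, tau):
    """Exponential approach to a steady-state thermal drift delta_max."""
    return delta_max * (1.0 - np.exp(-t / tau))

t = np.array([0, 10, 20, 40, 60, 90, 120])                   # min after start-up
drift = np.array([0.0, 6.2, 10.8, 16.0, 18.9, 20.8, 21.5])   # µm, hypothetical

params, _ = curve_fit(warmup_drift, t, drift, p0=[20.0, 30.0])
print(f"delta_max = {params[0]:.1f} µm, tau = {params[1]:.1f} min")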

Relevance: 20.00%

Abstract:

An accurate estimate of machining time is very important for predicting delivery time and manufacturing costs, and also helps production process planning. Most commercial CAM software systems estimate the machining time of milling operations simply by dividing the entire tool path length by the programmed feed rate. This estimate differs drastically from the real process time because the feed rate is not always constant, owing to machine and computer numerical control (CNC) limitations. This study presents a practical mechanistic method for estimating milling time when machining free-form geometries. The method considers a variable called machine response time (MRT), which characterizes the real CNC machine's capacity to move at high feed rates in free-form geometries. MRT is a global performance feature that can be obtained for any type of CNC machine configuration by carrying out a simple test. To validate the methodology, a workpiece was used to generate NC programs for five different types of CNC machines, and a practical industrial case study was also carried out. The results indicated that MRT, and consequently the real machining time, depends on the CNC machine's potential; furthermore, the greater the MRT, the larger the difference between predicted and real milling time. The proposed method achieved an error ranging from 0.3% to 12% of the real machining time, whereas the CAM estimates erred by 211% to 1244%. The MRT-based procedure is also suggested as an instrument to help in machine tool benchmarking.
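
The abstract does not give the MRT formulation itself, so the sketch below only contrasts the naive CAM estimate with a stand-in per-segment model in which every short move costs at least one response time; the segment lengths and the mrt value are hypothetical.

# Naive CAM time estimate (path length / programmed feed) versus a
# response-time-limited per-segment estimate. Stand-in model, not the
# paper's actual MRT formulation.
def naive_time(seg_lengths, feed):
    """CAM-style estimate: total length (mm) / programmed feed (mm/min)."""
    return sum(seg_lengths) / feed

def mrt_time(seg_lengths, feed, mrt):
    """Each segment costs at least `mrt` minutes, so the effective feed
    drops on short free-form moves."""
    return sum(max(length / feed, mrt) for length in seg_lengths)

segments = [0.05] * 2000                           # 2000 short free-form moves, mm
print(naive_time(segments, feed=3000))             # optimistic CAM estimate (min)
print(mrt_time(segments, feed=3000, mrt=0.002))    # response-time-limited (min)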