54 results for data fitting
in SciELO Saúde Pública - SP
Abstract:
The soil P sorption capacity has been studied for many years, but little attention has been paid to the rate of this process, which is relevant in the planning of phosphate fertilization. The purpose of this experiment was to assess the kinetics of P sorption in 12 representative soil profiles of the State of Paraíba (Brazil), select the best data fitting among four equations and relate their coefficients to soil properties. Samples of 12 soils with a wide diversity of physical, chemical and mineralogical properties were agitated in a horizontal shaker, with a 10 mmol L-1 CaCl2 solution containing 6 and 60 mg L-1 P, for periods of 5, 15, 30, 45, 60, 90, 120, 420, 720, 1,020, and 1,440 min. After each shaking period, the P concentration in the equilibrium solution was measured and four equations were fitted (three based on the Freundlich equation and one on the Elovich equation) to determine which soil had the highest sorption rate (kinetics) and which soil properties correlated with this rate. The kinetics of P sorption in soils with high maximum P adsorption capacity (MPAC) was fast for 30 min at the lower initial P concentration (6 mg L-1). No difference was observed between soils at the higher initial P concentration (60 mg L-1). The P adsorption kinetics were positively correlated with clay content, MPAC and the amount of Al extracted with dithionite-citrate-bicarbonate. The data fitted well to the Freundlich-based equations, whose coefficients can be used to predict P adsorption rates in soils.
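As a rough illustration of this kind of kinetic data fitting (not the authors' code or data), the sketch below fits a Freundlich-type power function and a simplified Elovich equation to hypothetical sorption-versus-time values with SciPy; the times and sorbed amounts are made up.

```python
# Minimal sketch, not the authors' code: fit a Freundlich-type power function
# and a simplified Elovich equation to hypothetical P-sorption kinetic data.
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical shaking times (min) and sorbed P (mg kg-1) for one soil sample.
t = np.array([5, 15, 30, 45, 60, 90, 120, 420, 720, 1020, 1440], dtype=float)
q = np.array([12, 20, 27, 31, 34, 38, 41, 52, 56, 58, 60], dtype=float)

def freundlich_kinetic(t, a, b):
    # Power-function (Freundlich-type) kinetic model: q = a * t**b
    return a * t ** b

def elovich(t, alpha, beta):
    # Simplified Elovich equation: q = (1/beta) * ln(1 + alpha*beta*t)
    return (1.0 / beta) * np.log(1.0 + alpha * beta * t)

for name, model, p0 in [("Freundlich-type", freundlich_kinetic, (5.0, 0.3)),
                        ("Elovich", elovich, (5.0, 0.1))]:
    popt, _ = curve_fit(model, t, q, p0=p0, maxfev=10000)
    ss_res = np.sum((q - model(t, *popt)) ** 2)
    ss_tot = np.sum((q - q.mean()) ** 2)
    print(f"{name}: parameters = {popt}, R2 = {1 - ss_res / ss_tot:.3f}")
```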
Abstract:
Acid-base properties of mixed species of the microalgae Spirulina were studied by potentiometric titration in media of 0.01 and 0.10 mol L-1 NaNO3 at 25.0 ± 0.1 °C, using modified Gran functions or nonlinear regression techniques for data fitting. The discrete site distribution model was used, permitting the characterization of five classes of ionizable sites in both ionic media. This fact suggests that the chemical heterogeneity of the ionizable sites on the cell surface plays a major role in the acid-base properties of the suspension in comparison to electrostatic effects due to charge-charge interactions. The total concentrations of ionizable sites were 1.75±0.10 and 1.86±0.20 mmol g-1 in ionic media of 0.01 and 0.10 mol L-1 NaNO3, respectively. A major contribution of carboxylic groups was observed, with an average of 34 and 22% of the ionizable sites being titrated with conditional pcKa of 4.0 and 5.4, respectively. The remaining 44% of ionizable sites were divided into three classes with average conditional pcKa of 6.9, 8.7 and 10.12, which may be assigned respectively to imidazolic, aminic, and phenolic functionalities.
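The discrete site distribution idea can be sketched as a nonlinear regression of a sum of Henderson-Hasselbalch terms; the sketch below uses only two assumed site classes and synthetic titration data, not the five classes or measurements reported above.

```python
# Minimal sketch with two assumed site classes and synthetic data, not the
# five classes characterized in the study.
import numpy as np
from scipy.optimize import curve_fit

def deprotonated(pH, c1, pK1, c2, pK2):
    # Discrete-site model: each class follows a Henderson-Hasselbalch term.
    return c1 / (1 + 10 ** (pK1 - pH)) + c2 / (1 + 10 ** (pK2 - pH))

pH = np.linspace(3, 11, 25)
rng = np.random.default_rng(0)
# Synthetic "titration" data generated from assumed parameters plus noise.
obs = deprotonated(pH, 0.6, 4.0, 0.4, 8.7) + rng.normal(0, 0.01, pH.size)

popt, _ = curve_fit(deprotonated, pH, obs, p0=(0.5, 4.5, 0.5, 9.0))
print("site concentrations (mmol g-1) and conditional pKa values:", popt)
```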
Abstract:
In this study, we concentrate on modelling gross primary productivity using two simple approaches to simulate canopy photosynthesis: "big leaf" and "sun/shade" models. Two approaches for calibration are used: scaling up of canopy photosynthetic parameters from the leaf to the canopy level, and fitting canopy biochemistry to eddy covariance fluxes. Validation of the models is achieved by using eddy covariance data from the LBA site C14. Comparing the performance of both models, we conclude that numerically (in terms of goodness of fit) and qualitatively (in terms of residual responses to different environmental variables) the sun/shade model does a better job. Compared to the sun/shade model, the big leaf model shows a lower goodness of fit and fails to respond to variations in the diffuse fraction, also having skewed responses to temperature and VPD. The separate treatment of sun and shade leaves, in combination with the separation of the incoming light into direct beam and diffuse components, makes sun/shade a strong modelling tool that captures more of the observed variability in canopy fluxes as measured by eddy covariance. In conclusion, the sun/shade approach is a relatively simple and effective tool for modelling photosynthetic carbon uptake that could be easily included in many terrestrial carbon models.
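A minimal sketch of the second calibration strategy (fitting canopy parameters to eddy covariance fluxes) is shown below; it replaces the actual big leaf and sun/shade formulations with a simple rectangular-hyperbola light response and uses synthetic PAR/GPP data, so it only illustrates the fitting step.

```python
# Minimal sketch: calibrate a simple "big leaf" light-response curve to
# hypothetical eddy covariance GPP data (a stand-in for the real models).
import numpy as np
from scipy.optimize import curve_fit

def big_leaf_gpp(par, alpha, gpp_max):
    # Rectangular-hyperbola light response for the whole canopy.
    return alpha * par * gpp_max / (alpha * par + gpp_max)

par = np.linspace(0.0, 2000.0, 50)               # hypothetical PAR, umol m-2 s-1
rng = np.random.default_rng(1)
gpp_obs = big_leaf_gpp(par, 0.05, 30.0) + rng.normal(0, 1.0, par.size)

popt, _ = curve_fit(big_leaf_gpp, par, gpp_obs, p0=(0.03, 20.0))
print("fitted quantum efficiency and GPPmax:", popt)
```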
Abstract:
Digital information generates the possibility of a high degree of redundancy in the data available for fitting the predictive models used in Digital Soil Mapping (DSM). Among these models, the Decision Tree (DT) technique has been increasingly applied due to its capacity to deal with large datasets. The purpose of this study was to evaluate the impact of the data volume used to generate the DT models on the quality of soil maps. An area of 889.33 km² was chosen in the Northern region of the State of Rio Grande do Sul. The soil-landscape relationship was obtained from field resurvey (reambulation) of the study area and the alignment of the map units on the 1:50,000 topographic map. Six predictive covariates linked to the soil formation factors relief and organisms, together with data sets of 1, 3, 5, 10, 15, 20 and 25% of the total data volume, were used to generate the predictive DT models in the data mining program Waikato Environment for Knowledge Analysis (WEKA). In this study, sample densities below 5% resulted in models with lower power to capture the complexity of the spatial distribution of the soils in the study area. The balance between the data volume to be handled and the predictive capacity of the models was best for samples between 5 and 15%. For the models based on these sample densities, the collected field data indicated an accuracy of the predictive mapping close to 70%.
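The sample-density experiment can be sketched as below; the original work used WEKA, but the same idea, training decision trees on increasing fractions of the data and checking accuracy, can be illustrated with scikit-learn and synthetic covariates.

```python
# Minimal sketch with scikit-learn and synthetic covariates instead of WEKA
# and the real terrain data: train decision trees on growing sample fractions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Synthetic stand-in for six predictive covariates and a few soil classes.
X, y = make_classification(n_samples=20000, n_features=6, n_informative=4,
                           n_classes=4, random_state=0)
X_pool, X_test, y_pool, y_test = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)

for frac in (0.01, 0.03, 0.05, 0.10, 0.15, 0.20, 0.25):
    n = int(frac * len(X_pool))
    tree = DecisionTreeClassifier(random_state=0).fit(X_pool[:n], y_pool[:n])
    acc = accuracy_score(y_test, tree.predict(X_test))
    print(f"sample fraction {frac:.0%}: accuracy {acc:.2f}")
```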
Abstract:
Some aspects of the application of electrochemical impedance spectroscopy to studies of the solid electrode/solution interface, in the absence of faradaic processes, are analysed. To perform this analysis, gold electrodes with (111) and (210) crystallographic orientations were used in an aqueous solution containing 10 mmol dm-3 KF as supporting electrolyte and a pyridine concentration varying from 0.01 to 4.6 mmol dm-3. The experimental data were analysed using the EQUIVCRT software, which employs non-linear least squares routines, attributing to the solid electrode/solution interface the behaviour of an equivalent circuit with a resistance in series with a constant phase element. The results of this fitting procedure were analysed through the dependence on the electrode potential of two parameters: the pre-exponential factor, Y0, and the exponent n, related to the phase-angle shift. This analysis showed that pyridine adsorption is strongly affected by the crystallographic orientation of the electrode surface and that the extent of deviation from ideal capacitive behaviour is mainly of interfacial origin.
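A minimal sketch of this kind of fit (not EQUIVCRT itself) is shown below: a complex nonlinear least-squares adjustment of the impedance of a resistance in series with a constant phase element, Z(ω) = Rs + 1/(Y0(jω)^n), to a synthetic spectrum.

```python
# Minimal sketch, not EQUIVCRT: complex nonlinear least squares fit of
# Z(w) = Rs + 1/(Y0 * (j*w)**n) to a synthetic impedance spectrum.
import numpy as np
from scipy.optimize import least_squares

freq = np.logspace(0, 4, 40)            # hypothetical frequencies, Hz
omega = 2 * np.pi * freq

def z_model(params, omega):
    rs, y0, n = params
    return rs + 1.0 / (y0 * (1j * omega) ** n)

# Synthetic "measured" spectrum generated from assumed parameters.
z_obs = z_model((50.0, 2e-5, 0.92), omega)

def residuals(params):
    dz = z_model(params, omega) - z_obs
    return np.concatenate([dz.real, dz.imag])

fit = least_squares(residuals, x0=(10.0, 1e-5, 1.0))
print("fitted Rs, Y0, n:", fit.x)
```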
Abstract:
Rural electrification is characterized by geographical dispersion of the population, low consumption, high investment per consumer and high cost. Solar radiation, moreover, constitutes an inexhaustible source of energy, and photovoltaic panels are used to convert it into electricity. In this study, the equations presented by the manufacturer for the current and power of small photovoltaic systems were adjusted to field conditions. The mathematical analysis was performed on the I-100 rural photovoltaic system from ISOFOTON, with a power of 300 Wp, located at the Lageado Experimental Farm of FCA/UNESP. To develop these equations, the equivalent circuit of the photovoltaic cells was studied, and iterative numerical methods were applied to determine the electrical parameters and the possible errors of the equations reported in the literature when applied to real conditions. A simulation of a photovoltaic panel was therefore proposed through mathematical equations adjusted to the local radiation data. The results yielded equations that provide realistic answers to the user and may assist in the design of these systems, since the calculated maximum power limit ensures the supply of the generated energy. This realistic sizing helps establish the possible applications of solar energy for the rural producer and indicates the real possibilities of generating electricity from the sun.
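The iterative determination of panel current can be sketched with the standard single-diode equation, which is implicit in the current and therefore solved numerically for each voltage; the parameters below are assumed for illustration and are not those of the I-100 system.

```python
# Minimal sketch with assumed parameters (not the I-100 system): solve the
# implicit single-diode I-V equation of a PV panel numerically.
import numpy as np
from scipy.optimize import brentq

I_ph, I_0 = 5.0, 1e-7          # photocurrent and saturation current (A), assumed
R_s, R_sh = 0.3, 200.0         # series and shunt resistances (ohm), assumed
n, V_t, N_s = 1.3, 0.0257, 36  # ideality factor, thermal voltage (V), cells in series

def current(V):
    # f(I) = Iph - I0*(exp((V + I*Rs)/(n*Ns*Vt)) - 1) - (V + I*Rs)/Rsh - I = 0
    f = lambda I: (I_ph - I_0 * (np.exp((V + I * R_s) / (n * N_s * V_t)) - 1.0)
                   - (V + I * R_s) / R_sh - I)
    return brentq(f, -2 * I_ph, 2 * I_ph)

voltages = np.linspace(0.0, 21.0, 100)
powers = [V * current(V) for V in voltages]
print("estimated maximum power for this parameter set: %.1f W" % max(powers))
```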
Abstract:
In vivo proton magnetic resonance spectroscopy (¹H-MRS) is a technique capable of assessing biochemical content and pathways in normal and pathological tissue. In the brain, ¹H-MRS complements the information given by magnetic resonance images. The main goal of the present study was to assess the accuracy of ¹H-MRS for the classification of brain tumors in a pilot study comparing results obtained by manual and semi-automatic quantification of metabolites. In vivo single-voxel ¹H-MRS was performed in 24 control subjects and 26 patients with brain neoplasms that included meningiomas, high-grade neuroglial tumors and pilocytic astrocytomas. Seven metabolite groups (lactate, lipids, N-acetyl-aspartate, glutamate and glutamine group, total creatine, total choline, myo-inositol) were evaluated in all spectra by two methods: a manual one consisting of integration of manually defined peak areas, and the advanced method for accurate, robust and efficient spectral fitting (AMARES), a semi-automatic quantification method implemented in the jMRUI software. Statistical methods included discriminant analysis and the leave-one-out cross-validation method. Both manual and semi-automatic analyses detected differences in metabolite content between tumor groups and controls (P < 0.005). The classification accuracy obtained with the manual method was 75% for high-grade neuroglial tumors, 55% for meningiomas and 56% for pilocytic astrocytomas, while for the semi-automatic method it was 78, 70, and 98%, respectively. Both methods classified all control subjects correctly. The study demonstrated that ¹H-MRS accurately differentiated normal from tumoral brain tissue and confirmed the superiority of the semi-automatic quantification method.
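The classification step (discriminant analysis with leave-one-out cross-validation) can be sketched as below with scikit-learn and synthetic metabolite concentrations; it does not reproduce the jMRUI/AMARES quantification itself.

```python
# Minimal sketch with synthetic metabolite values, not jMRUI/AMARES output:
# linear discriminant analysis with leave-one-out cross-validation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
# 50 hypothetical subjects x 7 metabolite groups; two classes (control/tumor).
X = rng.normal(size=(50, 7))
y = np.repeat([0, 1], 25)
X[y == 1] += 0.8  # shift the "tumor" group so the classes are separable

scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean())
```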
Abstract:
Knowledge management encompasses every way of generating, storing, distributing and using knowledge, which makes the use of information technologies necessary to support this process, given the large increase in data volume. Knowledge discovery in databases is a methodology that attempts to solve this problem, and data mining is a technique that is part of this methodology. This article develops, applies and analyses a data mining tool to extract knowledge about the scientific production of the people involved with research at the Universidade Federal de Lavras. The methodology involved bibliographic research, documentary research and the case study method. The limitations found in the analysis of the results indicate that the way the Lattes curricula are filled in still needs to be standardized in order to refine the analyses and thereby establish indicators. The contribution was the generation of a structured database, which is part of a larger process of developing science and technology indicators to support the formulation of new policies for scientific and technological management and the improvement of the Brazilian higher education system.
Abstract:
The nutritional status according to anthropometric data was assessed in 756 schoolchildren from five low-income state schools and one private school in the same part of Rio de Janeiro, Brazil. The prevalence of stunting and wasting (cut-off points: <90% height-for-age and <80% weight-for-height) ranged in the public schools from 6.2 to 15.2% and from 3.3 to 24.0%, respectively, whereas the figures for the private school were 2.3 and 3.5%, respectively. Obesity was much more frequent in the private school (18.0%) than in the state schools (0.8-6.2%). Nutritional problems seem to become more severe with the increasing age of the children. It therefore appears advisable to assess schoolchildren within the context of a nutritional surveillance system.
Abstract:
OBJECTIVE: A cross-sectional population-based study was conducted to assess, in active smokers, the relationship of the number of cigarettes smoked and other characteristics to salivary cotinine concentrations. METHODS: A random sample of active smokers aged 15 years or older was selected using a stepwise cluster sampling strategy in the year 2000 in Rio de Janeiro, Brazil. The study included 401 subjects. Salivary cotinine concentration was determined using gas chromatography with nitrogen-phosphorus detection. A standard questionnaire was used to collect demographic and smoking behavior data. The relation between the number of cigarettes smoked in the last 24 h and cotinine level was examined by means of robust locally weighted regression, a nonparametric fitting technique. RESULTS: Significantly (p<0.05) higher adjusted mean cotinine levels were found in subjects smoking their first cigarette within five minutes after waking up, and in those smoking 1-20 cigarettes in the last 24 h who reported inhaling more than half the time. In those smoking 1-20 cigarettes, the slope was significantly higher for subjects waiting more than five minutes before smoking their first cigarette after waking up, and for those smoking "light" cigarettes, when compared with their counterparts. These heterogeneities became negligible and non-significant when subjects with cotinine >40 ng/mL per cigarette were excluded. CONCLUSIONS: A positive association was found between smoking within five minutes after waking up, inhaling more than half the time, and higher cotinine levels; these can be markers of dependence and higher nicotine intake. Salivary cotinine proved to be a useful biomarker of recent smoking and can be used in epidemiological studies and smoking cessation programs.
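A minimal sketch of the robust locally weighted regression (LOWESS) step, using statsmodels and synthetic cotinine data rather than the study sample, is shown below.

```python
# Minimal sketch with synthetic data: robust locally weighted regression
# (LOWESS) of salivary cotinine on cigarettes smoked in the last 24 h.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
cigarettes = rng.integers(1, 40, size=400)                 # hypothetical counts
cotinine = 8.0 * cigarettes + rng.normal(0, 40, size=400)  # hypothetical ng/mL
cotinine = np.clip(cotinine, 0, None)

# it=3 applies the robustifying iterations that down-weight outliers.
smoothed = lowess(cotinine, cigarettes, frac=0.5, it=3)
print(smoothed[:5])  # columns: cigarettes (sorted), fitted cotinine
```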
Abstract:
The objective of this study was to compare information collected through face-to-face interviews at baseline and six years later in a city in Southeastern Brazil. In 1998, 32 mothers of children aged 20 to 30 months answered a face-to-face interview with structured questions regarding their children's brushing habits. Six years later, the same interview was repeated with the same mothers. Both interviews were compared for overall agreement, kappa and weighted kappa. Overall agreement between the two interviews varied from 41 to 96%. Kappa values ranged from 0.00 to 0.65 (very poor to good), without any significant differences. The results showed a lack of agreement when the same interview was conducted six years later, indicating that recall bias can be a methodological problem of interviews.
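Overall agreement, kappa and weighted kappa for paired interview answers can be computed as in the sketch below, which uses made-up ordinal responses rather than the study data.

```python
# Minimal sketch with made-up ordinal answers: overall agreement, Cohen's
# kappa and weighted kappa for the two interviews.
import numpy as np
from sklearn.metrics import cohen_kappa_score

first  = np.array([0, 1, 2, 2, 1, 0, 2, 1, 1, 0, 2, 2])  # hypothetical 1998 answers
second = np.array([0, 2, 2, 1, 1, 0, 2, 0, 1, 1, 2, 2])  # hypothetical repeat answers

agreement = np.mean(first == second)
kappa = cohen_kappa_score(first, second)
w_kappa = cohen_kappa_score(first, second, weights="quadratic")
print(f"agreement = {agreement:.0%}, kappa = {kappa:.2f}, "
      f"weighted kappa = {w_kappa:.2f}")
```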
Abstract:
OBJECTIVE: To estimate the spatial intensity of urban violence events using wavelet-based methods and emergency room data. METHODS: Information on victims attended at the emergency room of a public hospital in the city of São Paulo, Southeastern Brazil, from January 1, 2002 to January 11, 2003 was obtained from hospital records. The spatial distribution of 3,540 events was recorded, and a uniform random procedure was used to allocate records with incomplete addresses. Point process and wavelet analysis techniques were used to estimate the spatial intensity, defined as the expected number of events per unit area. RESULTS: Of all georeferenced points, 59% were accidents and 40% were assaults. There is a non-homogeneous spatial distribution of the events, with high concentration in two districts and along three large avenues in the southern area of the city of São Paulo. CONCLUSIONS: Hospital records combined with methodological tools to estimate the intensity of events are useful to study urban violence. The wavelet analysis is useful in the computation of the expected number of events and their respective confidence bands for any sub-region and, consequently, in the specification of risk estimates that could be used in decision-making processes for public policies.
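A rough sketch of wavelet-based intensity estimation for point data is given below: events are binned into a grid and the count surface is denoised by thresholding wavelet coefficients (PyWavelets); the coordinates are uniform random stand-ins, not the hospital records, and the thresholding rule is only one common choice, not necessarily the one used in the study.

```python
# Rough sketch with uniform random points, not the hospital records: bin the
# events and denoise the count surface by wavelet coefficient thresholding.
import numpy as np
import pywt

rng = np.random.default_rng(0)
x, y = rng.uniform(0, 50, 3540), rng.uniform(0, 50, 3540)  # hypothetical km coords

counts, _, _ = np.histogram2d(x, y, bins=64, range=[[0, 50], [0, 50]])

coeffs = pywt.wavedec2(counts, "haar", level=3)
# Universal soft threshold estimated from the finest-scale detail coefficients.
sigma = np.median(np.abs(coeffs[-1][-1])) / 0.6745
thr = sigma * np.sqrt(2 * np.log(counts.size))
coeffs = [coeffs[0]] + [tuple(pywt.threshold(d, thr, "soft") for d in level)
                        for level in coeffs[1:]]
intensity = pywt.waverec2(coeffs, "haar") / (50 / 64) ** 2  # events per km^2
print("estimated peak intensity:", round(float(intensity.max()), 2))
```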
Abstract:
OBJECTIVE: To estimate the basic reproduction number (R0) of dengue fever including both imported and autochthonous cases. METHODS: The study was based on epidemiological data from the 2003 dengue epidemic in Brasília, Brazil. The basic reproduction number was estimated from the epidemic curve by fitting a straight line to the initial increase in cases. To simulate an epidemic with both autochthonous and imported cases, a "susceptible-infectious-resistant" compartmental model was designed, in which the imported cases were treated as an external forcing. The ratio between the R0 of imported versus autochthonous cases was used as an estimator of the real R0. RESULTS: The comparison of both reproduction numbers (autochthonous only versus all cases) showed that considering all cases as autochthonous yielded an R0 above one, although the real R0 was below one. The same results were seen when the method was applied to simulated epidemics with fixed R0. This method was also compared with methods previously proposed by other authors, which were shown to underestimate R0 values. CONCLUSIONS: It was shown that the inclusion of both imported and autochthonous cases is crucial for modelling the epidemic dynamics, and thus provides critical information for decision makers in charge of the prevention and control of this disease.
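The two ingredients of the method, a linear fit to the log of the initial case counts and an SIR-type model with imported cases as an external forcing, can be sketched as below with illustrative numbers that are not the Brasília data.

```python
# Minimal sketch with illustrative numbers, not the Brasília data: linear fit
# to log case counts, plus an SIR model with imported cases as a forcing term.
import numpy as np
from scipy.integrate import odeint

weeks = np.arange(8)
cases = np.array([3, 5, 9, 14, 22, 37, 60, 95])        # hypothetical weekly counts
growth_rate = np.polyfit(weeks, np.log(cases), 1)[0]   # slope of log-linear fit
print("initial exponential growth rate per week:", round(growth_rate, 2))

def sir(state, t, beta, gamma, imported):
    # Standard SIR with a constant inflow of imported infectious cases.
    s, i, r = state
    return [-beta * s * i, beta * s * i - gamma * i + imported, gamma * i]

t = np.linspace(0, 52, 500)
# beta/gamma = 0.8, i.e. an autochthonous R0 below one, sustained by importation.
sol = odeint(sir, [0.999, 0.001, 0.0], t, args=(0.4, 0.5, 1e-4))
print("final fraction ever infected:", round(float(sol[-1, 2]), 3))
```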
Abstract:
OBJECTIVE: To develop an assessment tool to evaluate the efficiency of federal university general hospitals. METHODS: Data envelopment analysis, a linear programming technique, creates a best practice frontier by comparing observed production given the amount of resources used. The model is output-oriented and considers variable returns to scale. Network data envelopment analysis considers link variables belonging to more than one dimension (in the model, medical residents, adjusted admissions, and research projects). Dynamic network data envelopment analysis uses carry-over variables (in the model, the financing budget) to analyze frontier shifts in subsequent years. Data were gathered from the information system of the Brazilian Ministry of Education (MEC), 2010-2013. RESULTS: The mean scores for health care, teaching and research over the period were 58.0%, 86.0%, and 61.0%, respectively. In 2012, the best performance year, for all units to reach the frontier it would be necessary to have a mean increase of 65.0% in outpatient visits; 34.0% in admissions; 12.0% in undergraduate students; 13.0% in multi-professional residents; 48.0% in graduate students; 7.0% in research projects; besides a decrease of 9.0% in medical residents. In the same year, an increase of 0.9% in the financing budget would be necessary to improve the care output frontier. In the dynamic evaluation, there was progress in teaching efficiency, oscillation in medical care and no variation in research. CONCLUSIONS: The proposed model generates public health planning and programming parameters by estimating efficiency scores and making projections to reach the best practice frontier.
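An output-oriented, variable-returns-to-scale DEA model of the kind described can be sketched as one linear program per hospital (DMU); the toy inputs and outputs below are assumptions, and SciPy's linprog stands in for dedicated DEA software (the network and dynamic extensions are not included).

```python
# Minimal sketch with toy data and SciPy's linprog instead of dedicated DEA
# software: one output-oriented, variable-returns-to-scale LP per DMU.
import numpy as np
from scipy.optimize import linprog

X = np.array([[10, 20], [15, 25], [12, 18], [20, 30]], float)  # DMUs x inputs
Y = np.array([[100, 5], [140, 6], [90, 7], [160, 9]], float)   # DMUs x outputs
n_dmu = X.shape[0]

for o in range(n_dmu):
    # Decision variables: lambda_1..lambda_n, phi; maximize phi (minimize -phi).
    c = np.r_[np.zeros(n_dmu), -1.0]
    A_ub = np.vstack([
        np.c_[X.T, np.zeros(X.shape[1])],   # sum(lambda * x) <= x_o
        np.c_[-Y.T, Y[o]],                  # phi * y_o <= sum(lambda * y)
    ])
    b_ub = np.r_[X[o], np.zeros(Y.shape[1])]
    A_eq = np.r_[np.ones(n_dmu), 0.0].reshape(1, -1)  # sum(lambda) = 1 (VRS)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n_dmu + 1))
    print(f"DMU {o}: output-oriented efficiency = {1 / res.x[-1]:.2f}")
```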