15 results for C51 - Model Construction and Estimation
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The aim of this paper is to verify the influence of the composition variability of recycled aggregates (RA) from construction and demolition waste (CDW) on the performance of concretes. Performance was evaluated by building mathematical models for compressive strength, modulus of elasticity and drying shrinkage. To obtain such models, an experimental program comprising 50 concrete mixtures was carried out. Specimens were cast and tested, and the results for compressive strength, modulus of elasticity and drying shrinkage were statistically analyzed. The model inputs are the CDW compositions observed in seven Brazilian cities. The results confirm that using RA from CDW for concrete production is quite feasible, regardless of its composition, since compressive strength and modulus of elasticity still reached considerable values. We conclude that the variability presented by recycled aggregates from CDW does not compromise their use in concrete production. However, this information must be used with caution, and experimental tests should always be performed to certify concrete properties.
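As a rough illustration of the kind of model described above (not the authors' actual equations), the sketch below fits a linear model of compressive strength against recycled-aggregate composition fractions; all data, phase names and coefficients are synthetic placeholders.

    # Minimal sketch, assuming a linear strength-composition relation.
    # Composition fractions, coefficients and noise level are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 50  # the paper's program used 50 mixtures; values here are synthetic
    # Hypothetical RA composition: mass fractions of concrete, ceramic, mortar
    X = rng.dirichlet(alpha=[2.0, 2.0, 2.0], size=n)
    beta = np.array([35.0, 20.0, 28.0])           # assumed phase effects (MPa)
    fc = X @ beta + rng.normal(0.0, 2.0, size=n)  # simulated strength

    # Ordinary least squares (no intercept: the fractions already sum to one)
    coef, *_ = np.linalg.lstsq(X, fc, rcond=None)
    r2 = 1 - np.sum((fc - X @ coef) ** 2) / np.sum((fc - fc.mean()) ** 2)
    print("fitted phase effects (MPa):", np.round(coef, 1), " R^2:", round(r2, 3))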
Abstract:
Defining pharmacokinetic parameters and depletion intervals for antimicrobials used in fish provides important guidelines for the future regulation by Brazilian agencies of the use of these substances in fish farming. This article presents a depletion study of oxytetracycline (OTC) in tilapias (Oreochromis niloticus) farmed under tropical conditions during the winter season. A high-performance liquid chromatography method with fluorescence detection for the quantitation of OTC in tilapia fillets and medicated feed was developed and validated. The depletion study with fish was carried out under monitored environmental conditions. OTC was administered in the feed for five consecutive days at a daily dosage of 80 mg/kg body weight. Groups of ten fish were slaughtered at 1, 2, 3, 4, 5, 8, 10, 15, 20, and 25 days after medication. After the 8th day post-treatment, OTC concentrations in the tilapia fillets were below the limit of quantitation (13 ng/g) of the method. The linear regression of the mathematical model used in the data analysis presented a coefficient of 0.9962. The elimination half-life of OTC in tilapia fillet and the withdrawal period were 1.65 and 6 days, respectively, considering the 99th percentile with 95% confidence and a maximum residue limit of 100 ng/g. Even though the study was carried out in the winter under practical conditions in which the water temperature varied, the results obtained are similar to those from studies conducted under controlled temperature.
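Since the abstract reports a first-order depletion fit, the relation half-life = ln(2)/k recovers the elimination half-life from the slope of a log-concentration regression. A minimal sketch with illustrative (not measured) values:

    # Hedged sketch: log-linear depletion fit and elimination half-life.
    # Times and concentrations below are hypothetical, chosen only to give
    # a half-life near the reported 1.65 days.
    import numpy as np

    t = np.array([1, 2, 3, 4, 5, 8], dtype=float)             # days post-treatment
    c = np.array([820, 540, 360, 240, 155, 45], dtype=float)  # OTC in fillet, ng/g

    slope, intercept = np.polyfit(t, np.log(c), 1)
    k = -slope  # first-order elimination rate constant (1/day)
    print(f"half-life = {np.log(2) / k:.2f} days")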
Abstract:
Polychlorinated biphenyls (PCBs) and organochlorine pesticides are compounds that do not occur naturally in the environment and are not easily degraded by chemical or microbiological action. In the present work, these compounds were analysed in unhatched penguin eggs and in whole krill collected in Admiralty Bay, King George Island, Antarctica, in the austral summers of 2004-2005 and 2005-2006. In most of the egg samples, the compounds found at the highest levels (on a wet weight basis) were the PCBs (2.53-78.7 ng g⁻¹), DDTs (2.07-38.0 ng g⁻¹) and HCB (4.99-39.1 ng g⁻¹), and a Kruskal-Wallis ANOVA suggested that their occurrence is species-specific within the Pygoscelis genus. In all cases, the levels found were not higher than those in Arctic birds at a similar trophic level. The analysis of the krill samples allowed the estimation of biomagnification factors (up to 363 for HCB, one order of magnitude higher than for DDTs and chlordanes and two orders of magnitude higher than for the other groups) for the compounds found in the eggs, whose only source of contamination is female-to-offspring transfer. (C) 2009 Elsevier Ltd. All rights reserved.
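For context, a biomagnification factor of the kind estimated here is simply the ratio of the contaminant level in the consumer to that in its prey; a minimal sketch with hypothetical concentrations:

    # BMF = C(consumer) / C(prey), both on a comparable (wet weight) basis.
    # The two levels below are invented for illustration, not the paper's data.
    hcb_egg = 20.0     # ng/g in penguin egg (hypothetical)
    hcb_krill = 0.055  # ng/g in krill (hypothetical)
    print(f"BMF(HCB) = {hcb_egg / hcb_krill:.0f}")  # same order as the reported 363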
Abstract:
This work reports the analytical application of surface-enhanced Raman spectroscopy (SERS) to the trace analysis of organophosphorus pesticides (trichlorfon and glyphosate) and of model organophosphorus compounds (dimethyl methylphosphonate and O-ethyl methylphosphonothioate) bearing different functional groups. SERS measurements were carried out using Ag nanocubes with an edge length of ca. 100 nm as substrates. Density functional theory (DFT) with the B3LYP functional was used for the optimization of ground-state geometries and the simulation of the Raman spectra of the organophosphorus compounds and their silver complexes. Adsorption geometries and marker bands were identified for each of the investigated compounds. The results indicate the usefulness of the SERS methodology for the sensitive analysis of organophosphorus compounds through vibrational spectroscopy.
Abstract:
A multi-element analysis of honey samples was carried out with the aim of developing a reliable method for tracing the origin of honey. Forty-two chemical elements were determined (Al, Cu, Pb, Zn, Mn, Cd, Tl, Co, Ni, Rb, Ba, Be, Bi, U, V, Fe, Pt, Pd, Te, Hf, Mo, Sn, Sb, P, La, Mg, I, Sm, Tb, Dy, Sd, Th, Pr, Nd, Tm, Yb, Lu, Gd, Ho, Er, Ce, Cr) by inductively coupled plasma mass spectrometry (ICP-MS). Three machine learning tools for classification and two for attribute selection were then applied in order to show that data mining tools can be used to identify the region where a honey originated. Our results clearly demonstrate the potential of the Support Vector Machine (SVM), Multilayer Perceptron (MLP) and Random Forest (RF) chemometric tools for honey origin identification. Moreover, the selection tools allowed a reduction from 42 trace element concentrations to only 5. (C) 2012 Elsevier Ltd. All rights reserved.
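A minimal scikit-learn sketch of this kind of workflow (not the authors' exact pipeline or data), using the three classifiers named above plus a Random-Forest importance ranking to shrink 42 elements to 5:

    # Synthetic stand-in for the ICP-MS table: 120 samples x 42 elements.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.neural_network import MLPClassifier
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    X = rng.lognormal(size=(120, 42))
    y = rng.integers(0, 3, size=120)  # 3 hypothetical regions of origin
    X[:, 0] *= 1.0 + y                # inject a weak signal for illustration

    for name, clf in [("SVM", make_pipeline(StandardScaler(), SVC())),
                      ("MLP", make_pipeline(StandardScaler(), MLPClassifier(max_iter=2000))),
                      ("RF", RandomForestClassifier(random_state=0))]:
        print(name, cross_val_score(clf, X, y, cv=5).mean().round(2))

    # Attribute selection: keep the 5 elements the forest finds most informative
    rf = RandomForestClassifier(random_state=0).fit(X, y)
    print("top 5 element indices:", np.argsort(rf.feature_importances_)[-5:])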
Abstract:
We present a generalized test case generation method, called the G method. Although inspired by the W method, the G method, in contrast, allows for test suite generation even in the absence of characterization sets for the specification models. Instead, the G method relies on knowledge about the index of certain equivalences induced at the implementation models. We show that the W method can be derived from the G method as a particular case. Moreover, we discuss some naturally occurring infinite classes of FSM models over which the G method generates test suites that are exponentially more compact than those produced by the W method.
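For reference, the W method that the G method generalizes builds its suite by concatenating a transition cover P with a characterization set W. A toy sketch under the usual assumption that the implementation has no extra states (so the suite is P.W); the machine and W below are invented for illustration, not taken from the paper:

    # Toy W-method construction (the baseline, not the paper's G method).
    from collections import deque
    from itertools import product

    inputs = ["a", "b"]
    # Toy deterministic Mealy machine: delta = next state, out = output
    delta = {("s0", "a"): "s1", ("s0", "b"): "s0",
             ("s1", "a"): "s2", ("s1", "b"): "s0",
             ("s2", "a"): "s2", ("s2", "b"): "s1"}
    out = {("s0", "a"): 0, ("s0", "b"): 0,
           ("s1", "a"): 1, ("s1", "b"): 0,
           ("s2", "a"): 0, ("s2", "b"): 1}
    states = {s for s, _ in delta}

    # State cover: shortest access sequence to each state from s0 (BFS)
    cover = {"s0": ()}
    queue = deque(["s0"])
    while queue:
        s = queue.popleft()
        for a in inputs:
            t = delta[(s, a)]
            if t not in cover:
                cover[t] = cover[s] + (a,)
                queue.append(t)

    # Transition cover P: every access sequence extended by each input
    P = {()} | {cover[s] + (a,) for s in states for a in inputs}
    # Characterization set W, checked by hand to separate all three states
    # through their output responses
    W = [("a",), ("b", "a")]
    suite = sorted({p + w for p, w in product(P, W)})

    def run(seq):
        # Expected output word of the specification for a test sequence
        s, word = "s0", []
        for a in seq:
            word.append(out[(s, a)])
            s = delta[(s, a)]
        return word

    print(len(suite), "tests; expected outputs of first:", run(suite[0]))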
Abstract:
Objective: The aim of this study was to construct and validate a measure of the consequences of domestic violence on women's health during the climacterium. Methods: A questionnaire was administered at the Outpatient Climacterium Clinic to 124 women aged 40 to 65 years who were victims of domestic and/or sexual violence (experimental group). They were divided into three groups: (1) those who were victims of violence exclusively during childhood/adolescence, (2) those who were victims of violence exclusively during adulthood, and (3) those who were victims of violence throughout their lives. The instrument included 34 items evaluating the beginning, frequency, and type of violence; the search for health assistance and the reporting of the violence; violence and the number of comorbidities; and violence and the Kupperman Menopausal Index. We also included a control group composed of perimenopausal and postmenopausal women who had not experienced any violence (n = 120). Results: The instrument presented a Cronbach's alpha of 0.82, good inter-examiner reliability (0.80), and good reproducibility. The mean age at menopause was 45.4 years, whereas in the control group it was 48.1 years. Group 1 showed a mean of 5.1 comorbidities, group 2 had 4.6, and group 3 had 4.4. Sexual violence (43.5%) and the other types of violence both presented an average of 4.60 comorbidities but represented a significant impairment in the victims' sexual lives. Group 3 showed a significant association with a high Kupperman Menopausal Index score. In the experimental group, 80.6% did not seek health services because of the violence they experienced. Conclusions: The questionnaire presented good internal consistency and a validated construction. It can be easily reproduced and is indicated for evaluating the consequences of domestic and/or sexual violence on women's health during the climacterium.
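The internal-consistency figure reported here follows the standard formula alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch on synthetic item scores (not the study data):

    # Cronbach's alpha for a 124-respondent x 34-item score matrix.
    import numpy as np

    rng = np.random.default_rng(2)
    latent = rng.normal(size=(124, 1))                      # shared trait
    items = latent + rng.normal(scale=0.8, size=(124, 34))  # synthetic items

    k = items.shape[1]
    alpha = k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                           / items.sum(axis=1).var(ddof=1))
    print(f"Cronbach's alpha = {alpha:.2f}")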
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables, and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, as opposed to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. The results of the simulation study indicated that our approach performs as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is considered jointly with the development of model-fit assessment tools, and the results are compared with those obtained by Azevedo et al. The results indicate that the hierarchical approach allows us to implement MCMC algorithms more easily, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
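The hierarchical structure exploited here is Henze's representation: with delta = lambda/sqrt(1 + lambda^2) and independent U0, U1 ~ N(0,1), Z = delta*|U0| + sqrt(1 - delta^2)*U1 is skew-normal with shape lambda. A minimal sampler in the direct parameterization (the paper works with the centred parameterization of the same family):

    # Henze (1986) stochastic representation of the skew-normal.
    import numpy as np

    def rskewnormal(lam, size, rng=np.random.default_rng(3)):
        delta = lam / np.sqrt(1.0 + lam**2)
        u0 = np.abs(rng.normal(size=size))  # half-normal component
        u1 = rng.normal(size=size)
        return delta * u0 + np.sqrt(1.0 - delta**2) * u1

    z = rskewnormal(lam=5.0, size=100_000)
    print("positive sample skewness:", float(np.mean((z - z.mean())**3)) > 0)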
Abstract:
The Sznajd model is a sociophysics model used to describe opinion propagation and consensus formation in societies. Its main feature is that its rules favor larger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models, applied this modification to the Sznajd model, and presented some preliminary results. The present work extends what we did in that paper. We present results linking many of the properties of the mean-field fixed points to only a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points to the notion of strongly connected graphs and the stability of fixed points to the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations on Barabasi-Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of q > 2 agreeing agents, instead of a pair, be formed before they attempt to convince other sites (for the mean field, this would coincide with the q-voter model).
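For readers unfamiliar with the model, the basic (unbiased, pairwise) Sznajd update can be sketched in a few lines; the generalized confidence rules studied in the paper modify how a site accepts the pair's opinion:

    # Basic one-dimensional Sznajd dynamics on a ring (binary opinions).
    import numpy as np

    rng = np.random.default_rng(4)
    N = 200
    opinions = rng.choice([-1, 1], size=N)

    for _ in range(50_000):
        i = rng.integers(N)
        j = (i + 1) % N
        if opinions[i] == opinions[j]:           # an agreeing pair...
            opinions[(i - 1) % N] = opinions[i]  # ...converts its outer
            opinions[(j + 1) % N] = opinions[j]  # neighbours
    print("magnetization:", opinions.mean())     # +-1 indicates consensus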
Abstract:
The direction of care delivery goes from the action to the being: a process built from professional experience, which takes on special characteristics when the service is delivered by telephone. The goal of this research was to understand the interaction between professionals and users in a remote care service; to this end, a study is presented using Grounded Theory and Symbolic Interactionism as theoretical frameworks. Data were collected through eight interviews with professionals who deliver care by telephone. The theoretical understanding permitted the creation of the theoretical model of the Imaginative Construction of Care, which shows the interaction processes the professional experiences when delivering care by telephone. In this model, individual and social facts are brought together, showing the links between the concepts, with special emphasis on uncertainty, sensitivity and professional responsibility as essential components of this experience.
Abstract:
Long-term survival models have historically been considered for analyzing time-to-event data with a fraction of long-term survivors. However, situations in which a fraction (1 - p) of systems is subject to failure from independent competing causes, while the remaining proportion p is cured or does not present the event of interest during the time period of the study, have not been fully considered in the literature. In order to accommodate such situations, we present in this paper a new long-term survival model. The maximum likelihood estimation procedure is discussed, as well as interval estimation and hypothesis tests. A real dataset illustrates the methodology.
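A minimal sketch of the standard structure such a model builds on (the paper's formulation may differ in detail): with cure probability p and J independent competing latent causes with survival functions S_j(t), the population survival is

    \[
      S_{\mathrm{pop}}(t) = p + (1-p)\prod_{j=1}^{J} S_j(t),
      \qquad \lim_{t\to\infty} S_{\mathrm{pop}}(t) = p,
    \]

so the survival curve levels off at the long-term survivor fraction p.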
Abstract:
Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, in the animal and human nutrition sectors, and in biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness and farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply, while considering the uncertainties of daily climate data. Accordingly, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the shape of the above-ground dry matter curve over time, as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution for the average daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity: the most likely grain productivity was found to be 10,604 kg ha⁻¹, very similar to the productivity simulated by the deterministic model and to that observed under real conditions in a field experiment.
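A minimal sketch of the stochastic layer described above: draw the harvest index from a triangular distribution and (radiation, temperature) from a bivariate normal, then push each draw through a deterministic yield function. The response function and every parameter below are hypothetical placeholders, not the paper's calibration:

    import numpy as np

    rng = np.random.default_rng(5)
    n = 10_000
    hi = rng.triangular(0.40, 0.48, 0.55, size=n)    # harvest index (assumed)
    mean = [20.0, 24.0]              # daily solar radiation (MJ m-2), air temp (C)
    cov = [[4.0, 1.2], [1.2, 2.25]]  # assumed covariance
    rad, temp = rng.multivariate_normal(mean, cov, size=n).T

    # Hypothetical above-ground dry matter response to the sampled drivers
    biomass = 900.0 * rad * np.exp(-0.5 * ((temp - 25.0) / 6.0) ** 2)
    yield_kg_ha = hi * biomass
    print(f"most likely productivity ~ {np.median(yield_kg_ha):,.0f} kg/ha")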
Abstract:
In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic model of radiation carcinogenesis: latent time distributions and their properties. Math Biosci 1993; 113: 51-75] and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism behind the occurrence of the event of interest, as it includes a destructive process of tumour cells after an initial treatment, or the capacity of an individual exposed to irradiation to repair altered cells that would result in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells that was not eliminated by the treatment or repaired by the individual's repair system. Markov Chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. We also present some discussion of model selection and an illustration with the cutaneous melanoma data set analysed by Rodrigues et al.
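For reference, the promotion time cure model against which the proposal is compared takes N ~ Poisson(theta) initiated cells, each progressing to a detectable tumour with c.d.f. F(t), giving

    \[
      S_{\mathrm{pop}}(t) = \exp\{-\theta F(t)\},
      \qquad p_0 = \lim_{t\to\infty} S_{\mathrm{pop}}(t) = e^{-\theta};
    \]

replacing the Poisson count by a compound weighted Poisson law, as done here, relaxes the Poisson's rigid mean-variance relationship and so gains flexibility in dispersion.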
Abstract:
Background: The importance of the lung parenchyma in the pathophysiology of asthma has previously been demonstrated. Considering that nitric oxide synthases (NOS) and arginases compete for the same substrate, it is worthwhile to elucidate the effects of complex NOS-arginase dysfunction in the pathophysiology of asthma, particularly in relation to distal lung tissue. We evaluated the effects of arginase and iNOS inhibition on distal lung mechanics and oxidative stress pathway activation in a model of chronic pulmonary allergic inflammation in guinea pigs. Methods: Guinea pigs were exposed to repeated ovalbumin inhalations (twice a week for 4 weeks). The animals received 1400W (an iNOS-specific inhibitor) for 4 days beginning at the last inhalation. Afterwards, the animals were anesthetized and exsanguinated; a slice of the distal lung was then evaluated by oscillatory mechanics, with an arginase inhibitor (nor-NOHA) or vehicle infused in a Krebs solution bath. Tissue resistance (Rt) and elastance (Et) were assessed before and after ovalbumin challenge (0.1%), and lung strips were submitted to histopathological studies. Results: Ovalbumin-exposed animals presented an increase in the maximal Rt and Et responses after antigen challenge (p < 0.001), in the number of iNOS-positive cells (p < 0.001), and in the expression of arginase 2, 8-isoprostane and NF-κB (p < 0.001) in distal lung tissue. Administration of 1400W reduced all of these responses (p < 0.001) in the alveolar septa. Ovalbumin-exposed animals that received nor-NOHA showed reductions in Rt and Et after antigen challenge, in the number of iNOS-positive cells, and in 8-isoprostane and NF-κB (p < 0.001) in lung tissue. The activity of arginase 2 was reduced only in the groups treated with nor-NOHA (p < 0.05). There was a reduction of 8-isoprostane expression in OVA-NOR-W compared to OVA-NOR (p < 0.001). Conclusions: In this experimental model, increased arginase content and iNOS-positive cells were associated with the constriction of the distal lung parenchyma. This functional alteration may be due to a high expression of 8-isoprostane, which has a procontractile effect. The mechanism involved in this response is likely related to the modulation of NF-κB expression, which contributed to the activation of the arginase and iNOS pathways. The association of both inhibitors potentiated the reduction of 8-isoprostane expression in this animal model.
Abstract:
Background: Over the last years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks have been designed considering the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring two main technical skills: (i) knowing the syntax details of the programming language employed to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only be initiated once the development process reaches the implementation phase, preventing it from starting earlier. Method: In order to solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be automatically generated. Results: We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was observed in the effort required to maintain the applications. Conclusion: Using the approach presented herein, it was possible to conclude the following: (i) it is possible to automate the instantiation of CFs, and (ii) developer productivity is improved when a model-based instantiation approach is used.
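As a rough analogy (in Python rather than an aspect language, and not the paper's Persistence CF), a crosscutting concern is behaviour factored out once and applied across otherwise unrelated application code; white-box reuse corresponds to having to understand the helper's internals before wiring it in:

    # Hypothetical example of a crosscutting concern (call logging) reused
    # declaratively; CFs encapsulate concerns such as persistence this way,
    # typically with aspect-oriented constructs rather than decorators.
    import functools
    import logging

    logging.basicConfig(level=logging.INFO)

    def logged(func):
        """Reusable crosscutting behaviour applied at a join point."""
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            logging.info("entering %s", func.__name__)
            result = func(*args, **kwargs)
            logging.info("leaving %s", func.__name__)
            return result
        return wrapper

    @logged
    def save_order(order_id: int) -> None:
        print(f"order {order_id} persisted")  # placeholder business logic

    save_order(42)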