929 results for C51 - Model Construction and Estimation
Abstract:
The use of bamboo as a construction material and as a raw material for manufactured products can be considered a feasible alternative to the overuse of steel, concrete and oil byproducts. It can also reduce the pressure on wood from native and planted forests. Although thousands of bamboo species are spread around the world and Brazil itself has hundreds of native species, basic knowledge of their characteristics and applications is still scarce and poorly disseminated. This paper's main objective is to introduce the species, the management phases, the physical and mechanical characteristics and the experiences in using bamboo in design and civil construction within the Bamboo Project, implemented at UNESP, Bauru campus, since 1994. The results are divided into: a) field activities - description of the species of technological interest, production chain flows, types of preservative treatments and clump management practices for the development, adaptation and production of different species of culms; b) lab experiments - physical and mechanical characterization of culms processed as laminated strips and as composite material (glue-laminated bamboo – glubam); c) uses in projects - experiences with natural bamboo and glubam in design, architecture and civil construction projects. In the final remarks, the study demonstrates, through practical and laboratory results, the material's multi-functionality and the feasibility of using bamboo as a sustainable material.
Abstract:
The objective of this paper is to compare common traffic lights (CTL) with three different types of traffic lights with countdown displays (SCD) and to assess their effects on road safety and capacity. This comparison is needed because the results found in the literature diverge among countries and cities, and one of the SCDs analyzed in our study differs from the SCDs used worldwide. An observational before-after study was conducted to evaluate safety and capacity over a period of one year before and one year after the implementation of the SCDs in three Brazilian cities. The results indicate that SCD models 1 and 3 showed a reduction of around 35%±14% in the total number of accidents, whereas model 2 showed no significant reduction. To perform the capacity analysis, a framework for data collection and an adaptation for estimating the initial lost time in each phase were developed. The capacity analysis showed a reduction in lost time of around 11% for SCD model 1, 7% for SCD model 2 and 3% for SCD model 3. However, the implications for capacity are negligible, owing to a small increase in the average headways for all SCD models compared to the CTL.
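To see why a lower lost time need not raise capacity, a back-of-the-envelope check helps (a minimal sketch; the cycle, green, lost-time and headway figures below are hypothetical, not taken from the study):

```python
# Approach capacity per cycle ~ effective green divided by saturation
# headway. All numbers here are illustrative, not the study's data.

green = 40.0  # displayed green time (s), hypothetical

# Common traffic light (CTL): baseline lost time and average headway
lost_ctl, headway_ctl = 4.0, 2.00            # s, s/veh (hypothetical)
cap_ctl = (green - lost_ctl) / headway_ctl   # vehicles served per cycle

# Countdown display (SCD model 1): ~11% less lost time,
# but a slightly longer average headway
lost_scd, headway_scd = 4.0 * 0.89, 2.04
cap_scd = (green - lost_scd) / headway_scd

print(f"CTL: {cap_ctl:.1f} veh/cycle, SCD: {cap_scd:.1f} veh/cycle")
# Both come out near 18 veh/cycle: the gain from the shorter lost
# time is almost cancelled by the small headway increase.
```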
Abstract:
Preparing a building budget quickly and accurately is a challenge faced by companies in the sector. The cost estimation process is based on the quantity takeoff, and this quantification has historically been performed through the analysis of the project, the scope of work and project information contained in 2D drawings, text files and spreadsheets. This method often proves flawed, affecting management decision-making, since it is closely coupled to time and cost management. In this scenario, this work presents a critical analysis of the conventional quantity takeoff process, based on quantification from 2D drawings, compared with the use of the software Autodesk Revit 2016, which applies building information modeling concepts to automate quantity takeoff from a 3D construction model. It is noted that the 3D modeling process should be aligned with the goals of budgeting. The use of BIM programs provides several benefits over the traditional quantity takeoff process, representing gains in productivity, transparency and assertiveness.
Abstract:
The Sznajd model is a sociophysics model used to describe opinion propagation and consensus formation in societies. Its main feature is that its rules favor bigger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models. In that work, we applied this modification to the Sznajd model and presented some preliminary results. The present work extends what we did in that paper. We present results linking many of the properties of the mean-field fixed points to only a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points with the notion of strongly connected graphs, and the stability of fixed points with the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations on Barabási-Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of size q > 2, instead of a pair, of agreeing agents be formed before they attempt to convince other sites (for the mean field, this would coincide with the q-voter model).
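For readers unfamiliar with the underlying dynamics, the following is a minimal sketch of the standard pair-update Sznajd rule on a one-dimensional ring (the baseline rule only; the paper's generalized confidence rule, biases and network topology are not reproduced here):

```python
import random

def sznajd_step(opinions):
    # One update of the basic 1D Sznajd model on a ring: if a randomly
    # chosen adjacent pair of agents agrees, both outer neighbors adopt
    # the pair's opinion, so agreeing groups tend to grow.
    n = len(opinions)
    i = random.randrange(n)
    j = (i + 1) % n
    if opinions[i] == opinions[j]:
        opinions[(i - 1) % n] = opinions[i]
        opinions[(j + 1) % n] = opinions[j]

# Usage: random +/-1 opinions on a ring of 100 agents
ops = [random.choice([-1, 1]) for _ in range(100)]
for _ in range(100_000):
    sznajd_step(ops)
print(sum(ops))  # close to +100 or -100 once consensus is approached
```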
Abstract:
The direction of care delivery goes from the action to the being: a process built from professional experience, which takes on special characteristics when the service is delivered by telephone. The goal of this research was to understand the interaction between professionals and users in a remote care service. To that end, a study is presented using Grounded Theory and Symbolic Interactionism as theoretical references. Data were collected through eight interviews with professionals who deliver care by telephone. The theoretical understanding permitted the creation of the theoretical model of the Imaginative Construction of Care, which shows the interaction processes professionals experience when delivering care by telephone. In this model, individual and social facts are brought together, showing the links between the concepts, with special emphasis on uncertainty, sensitivity and professional responsibility as essential components of this experience.
Abstract:
Long-term survival models have historically been considered for analyzing time-to-event data with a fraction of long-term survivors. However, situations in which a fraction (1 - p) of systems is subject to failure from independent competing causes, while the remaining proportion p is cured or does not present the event of interest during the study period, have not been fully considered in the literature. In order to accommodate such situations, we present in this paper a new long-term survival model. A maximum likelihood estimation procedure is discussed, as well as interval estimation and hypothesis tests. A real dataset illustrates the methodology.
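For context, the population survival function of a standard mixture cure model with independent competing causes can be written as follows (a generic formulation with assumed notation, not necessarily the paper's exact parametrization):

```latex
% A cured proportion p never fails; the susceptible fraction 1-p is
% subject to J independent competing causes with survival S_j(t):
S_{\mathrm{pop}}(t) = p + (1 - p)\prod_{j=1}^{J} S_j(t),
\qquad \lim_{t \to \infty} S_{\mathrm{pop}}(t) = p > 0.
```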
Abstract:
Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness or farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the shape of the above-ground dry matter curve over time, as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution for the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity: the most likely grain productivity was found to be 10,604 kg ha⁻¹, very similar to the productivity simulated by the deterministic model and to the real conditions observed in a field experiment.
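A minimal sketch of the kind of Monte Carlo sampling described, in Python (the distribution families match those named in the abstract, but every parameter and the toy yield response are hypothetical placeholders, not the paper's calibrated model):

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo draws

# Bivariate normal for averaged daily solar radiation (MJ/m2) and air
# temperature (deg C); means, covariances are hypothetical.
mean = [18.0, 24.0]
cov = [[9.0, 2.5],
       [2.5, 4.0]]
rad, temp = rng.multivariate_normal(mean, cov, size=n).T

# Triangular distribution for the harvest index (min, mode, max hypothetical).
hi = rng.triangular(0.40, 0.50, 0.55, size=n)

# Toy yield response: biomass increases with radiation and is penalized
# as temperature departs from an optimum; grain yield = biomass * HI.
biomass = 1200.0 * rad * np.exp(-((temp - 25.0) / 8.0) ** 2)  # kg/ha
grain = biomass * hi

print(f"most likely grain yield ~ {np.median(grain):.0f} kg/ha")
```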
Abstract:
In this article, we propose a new Bayesian flexible cure rate survival model, which generalises the stochastic model of Klebanov et al. [Klebanov LB, Rachev ST and Yakovlev AY. A stochastic model of radiation carcinogenesis - latent time distributions and their properties. Math Biosci 1993; 113: 51-75], and has much in common with the destructive model formulated by Rodrigues et al. [Rodrigues J, de Castro M, Balakrishnan N and Cancho VG. Destructive weighted Poisson cure rate models. Technical Report, Universidade Federal de Sao Carlos, Sao Carlos-SP, Brazil, 2009 (accepted in Lifetime Data Analysis)]. In our approach, the accumulated number of lesions or altered cells follows a compound weighted Poisson distribution. This model is more flexible than the promotion time cure model in terms of dispersion. Moreover, it possesses an interesting and realistic interpretation of the biological mechanism of the occurrence of the event of interest, as it includes a destructive process of tumour cells after an initial treatment or the capacity of an individual exposed to irradiation to repair altered cells that result in cancer induction. In other words, what is recorded is only the damaged portion of the original number of altered cells not eliminated by the treatment or repaired by the individual's repair system. Markov chain Monte Carlo (MCMC) methods are then used to develop Bayesian inference for the proposed model. Finally, some discussion of model selection and an illustration with the cutaneous melanoma data set analysed by Rodrigues et al. are presented.
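For reference, the promotion time cure model that this class of models extends admits a compact closed form (standard formulation with assumed notation, not taken from the paper): with N ~ Poisson(θ) latent altered cells, each promoted to a detectable event at a time with distribution F(t),

```latex
S_{\mathrm{pop}}(t) = E\bigl[\{1 - F(t)\}^{N}\bigr] = \exp\{-\theta F(t)\},
\qquad p = \lim_{t \to \infty} S_{\mathrm{pop}}(t) = e^{-\theta},
```

so the cure fraction is e^{-θ}; replacing the Poisson law of N by a compound weighted Poisson distribution, as the abstract describes, changes the dispersion of N and hence the flexibility of the population survival function.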
Abstract:
Background: The importance of the lung parenchyma in the pathophysiology of asthma has previously been demonstrated. Considering that nitric oxide synthases (NOS) and arginases compete for the same substrate, it is worthwhile to elucidate the effects of complex NOS-arginase dysfunction in the pathophysiology of asthma, particularly in relation to distal lung tissue. We evaluated the effects of arginase and iNOS inhibition on distal lung mechanics and oxidative stress pathway activation in a model of chronic pulmonary allergic inflammation in guinea pigs. Methods: Guinea pigs were exposed to repeated ovalbumin inhalations (twice a week for 4 weeks). The animals received 1400W (an iNOS-specific inhibitor) for 4 days beginning at the last inhalation. Afterwards, the animals were anesthetized and exsanguinated; a slice of the distal lung was then evaluated by oscillatory mechanics, with an arginase inhibitor (nor-NOHA) or vehicle infused in a Krebs solution bath. Tissue resistance (Rt) and elastance (Et) were assessed before and after ovalbumin challenge (0.1%), and lung strips were submitted to histopathological studies. Results: Ovalbumin-exposed animals presented an increase in the maximal Rt and Et responses after antigen challenge (p<0.001), in the number of iNOS-positive cells (p<0.001) and in the expression of arginase 2, 8-isoprostane and NF-κB (p<0.001) in distal lung tissue. The administration of 1400W reduced all of these responses (p<0.001) in the alveolar septa. Ovalbumin-exposed animals that received nor-NOHA showed reductions in Rt and Et after antigen challenge, in iNOS-positive cells, and in 8-isoprostane and NF-κB (p<0.001) in lung tissue. The activity of arginase 2 was reduced only in the groups treated with nor-NOHA (p<0.05). There was a reduction of 8-isoprostane expression in OVA-NOR-W compared to OVA-NOR (p<0.001). Conclusions: In this experimental model, increased arginase content and iNOS-positive cells were associated with constriction of the distal lung parenchyma. This functional alteration may be due to a high expression of 8-isoprostane, which has a procontractile effect. The mechanism involved in this response is likely related to the modulation of NF-κB expression, which contributed to the activation of the arginase and iNOS pathways. The association of both inhibitors potentiated the reduction of 8-isoprostane expression in this animal model.
Abstract:
Background: Over recent years, a number of researchers have investigated how to improve the reuse of crosscutting concerns. New possibilities have emerged with the advent of aspect-oriented programming, and many frameworks were designed considering the abstractions provided by this new paradigm. We call this type of framework a Crosscutting Framework (CF), as it usually encapsulates a generic and abstract design of one crosscutting concern. However, most of the proposed CFs employ white-box strategies in their reuse process, requiring mainly two technical skills: (i) knowing the syntax details of the programming language employed to build the framework and (ii) being aware of the architectural details of the CF and its internal nomenclature. Another problem is that the reuse process can only be initiated once the development process reaches the implementation phase, preventing it from starting earlier. Method: In order to solve these problems, we present in this paper a model-based approach for reusing CFs which shields application engineers from technical details, letting them concentrate on what the framework really needs from the application under development. To support our approach, two models are proposed: the Reuse Requirements Model (RRM) and the Reuse Model (RM). The former is used to describe the framework structure, and the latter is in charge of supporting the reuse process. As soon as the application engineer has filled in the RM, the reuse code can be generated automatically. Results: We also present the results of two comparative experiments using two versions of a Persistence CF: the original one, whose reuse process is based on writing code, and the new one, which is model-based. The first experiment evaluated productivity during the reuse process, and the second evaluated the effort of maintaining applications developed with both CF versions. The results show an improvement of 97% in productivity; however, little difference was perceived regarding the effort required to maintain the applications. Conclusion: By using the approach presented herein, it was possible to conclude that: (i) the instantiation of CFs can be automated, and (ii) developer productivity is improved when a model-based instantiation approach is used.