936 results for Predicting model


Relevance:

30.00%

Publisher:

Abstract:

As a new modeling method, support vector regression (SVR) has been regarded as the state-of-the-art technique for regression and approximation. In this study, SVR models were developed to predict body and carcass-related characteristics of 2 strains of broiler chicken. To evaluate the prediction ability of the SVR models, we compared their performance with that of neural network (NN) models. Evaluation of the prediction accuracy of the models was based on R², MS error, and bias. The model outputs of interest were BW, empty BW, carcass, breast, drumstick, thigh, and wing weight in the Ross and Cobb strains, predicted from dietary nutrient intake, including ME (kcal/bird per week), CP, TSAA, and Lys (all in grams per bird per week). A data set composed of 64 measurements taken from each strain was used for this analysis, with 44 records used for model training and the remaining 20 used to test the created models. The results of this study revealed that it is possible to satisfactorily estimate the BW and carcass parts of broiler chickens from their dietary nutrient intake. According to the statistical criteria used to evaluate the performance of the SVR and NN models, the overall results demonstrate that both models can accurately predict the body and carcass-related characteristics investigated here. However, the SVR method achieved better accuracy and generalization than the NN method, indicating that SVR can be used as an alternative modeling tool to NN models. Further evaluation of this algorithm is nevertheless suggested.
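The study's data are not reproduced in the abstract; the sketch below is only a minimal illustration of the kind of comparison described, using scikit-learn's SVR and MLPRegressor on synthetic data with the 44/20 train/test split mentioned above. All feature names, coefficients and hyperparameters are invented placeholders, not the authors' settings.

```python
# Minimal sketch (not the authors' implementation): compare SVR and a neural
# network on a synthetic 64-record data set split 44/20, as in the abstract.
import numpy as np
from sklearn.svm import SVR
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
# Hypothetical inputs: ME, CP, TSAA, Lys intake per bird per week.
X = rng.uniform([900, 100, 4, 6], [2100, 250, 10, 15], size=(64, 4))
# Hypothetical output: body weight as an arbitrary function of intake + noise.
y = 0.9 * X[:, 0] + 4.0 * X[:, 1] + rng.normal(0, 40, 64)

X_train, X_test, y_train, y_test = X[:44], X[44:], y[:44], y[44:]

models = {
    "SVR": make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=0.1)),
    "NN": make_pipeline(StandardScaler(), MLPRegressor(hidden_layer_sizes=(8,),
                                                       max_iter=5000, random_state=0)),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    bias = np.mean(pred - y_test)
    print(f"{name}: R2={r2_score(y_test, pred):.3f} "
          f"MSE={mean_squared_error(y_test, pred):.1f} bias={bias:.1f}")
```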

Relevance:

30.00%

Publisher:

Abstract:

Many museums, botanic gardens and herbaria keep data on biological collections, and, using computational tools, researchers digitize these data and provide access to them through data portals. The replication of databases to portals can be accomplished through the use of protocols and data schemas. However, implementing this solution demands a large amount of time, both for transferring the data fragments and for processing the data within the portal. As institutions digitize more data, this problem tends to worsen, making it hard to keep the records on the portals up to date. As an original contribution, this research proposes analysing the data replication process in order to evaluate the performance of portals. The Inter-American Biodiversity Information Network (IABIN) biodiversity data portal of pollinators was used as a case study, since it supports both conventional replication of specimen occurrence records and replication of the interactions between them. With the results of this research, it is possible to simulate a situation before its implementation, thus predicting the performance of replication operations. Additionally, these results may contribute to future improvements to this process, in order to decrease the time required to make the data available in portals. © Rinton Press.
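The abstract does not give the measured timings; purely as a rough illustration of what "predicting the performance of replication operations" can mean, the sketch below estimates the total replication time from the record count and assumed per-page transfer and per-record processing costs. All numbers and names are invented, not measurements from the IABIN portal.

```python
# Illustrative sketch only: estimate how long a full replication run would take
# from assumed per-record costs, before actually running it against the portal.
def replication_time(n_records, page_size, t_transfer_per_page, t_process_per_record):
    """Return the estimated total time (s) to replicate n_records."""
    n_pages = -(-n_records // page_size)           # ceiling division
    transfer = n_pages * t_transfer_per_page       # harvesting the data fragments
    processing = n_records * t_process_per_record  # indexing/storing inside the portal
    return transfer + processing

# Hypothetical figures: 250,000 occurrence records, 100 records per page,
# 1.2 s to fetch a page, 0.03 s to process each record in the portal.
total = replication_time(250_000, 100, 1.2, 0.03)
print(f"estimated replication time: {total / 3600:.1f} h")
```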

Relevance:

30.00%

Publisher:

Abstract:

This paper presents a numerical approach to model the complex failure mechanisms that define the ultimate rotational capacity of reinforced concrete beams. The behavior in tension and compression is described by a constitutive damage model derived from a combination of two specific damage models [1]. The nonlinear behavior of the compressed region is treated by the compressive damage model, based on the Drucker-Prager criterion written in terms of the effective stresses. The tensile damage model employs a failure criterion based on the strain energy associated with the positive part of the effective stress tensor. This model is used to describe the behavior of very thin bands of strain localization, which are embedded in finite elements to represent the multiple cracks that occur in the tensioned region [2]. The softening law establishes a dissipation energy compatible with the fracture energy of the concrete. The reinforcing steel bars are modeled by truss elements with elastic-perfectly plastic behavior. It is shown that the resulting approach is able to predict the different stages of the collapse mechanism of beams with distinct sizes and reinforcement ratios. The tensile damage model and the embedded-crack finite element approach describe the stiffness reduction due to concrete cracking in the tensile zone, the truss elements reproduce the effects of steel yielding and, finally, the compressive damage model describes the nonlinear behavior of the compressive zone until the complete collapse of the beam due to crushing of the concrete. The proposed approach predicts well the plastic rotation capacity of tested beams [3], including size-scale effects.
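The constitutive details are in the cited references; as a hedged sketch of the one ingredient the abstract spells out, the code below computes the positive (tensile) part of an effective stress tensor by spectral decomposition and an associated strain-energy-like scalar, of the kind such a tensile damage criterion is typically compared against a threshold. The energy expression, material constants and threshold are illustrative assumptions, not the authors' exact model.

```python
# Sketch (assumed form, not the model of the paper): tensile damage criterion
# driven by the strain energy associated with the positive part of the
# effective stress tensor sigma_eff.
import numpy as np

def positive_part(sigma):
    """Positive (tensile) part of a symmetric stress tensor via its spectrum."""
    w, v = np.linalg.eigh(sigma)
    w_pos = np.clip(w, 0.0, None)                 # keep tensile eigenvalues only
    return v @ np.diag(w_pos) @ v.T

def tensile_energy(sigma_eff, E=30e9, nu=0.2):
    """Strain-energy-like scalar built from the positive part of sigma_eff."""
    s_pos = positive_part(sigma_eff)
    tr = np.trace(s_pos)
    # Isotropic elastic complementary energy of the positive part (illustrative choice).
    return ((1 + nu) * np.tensordot(s_pos, s_pos) - nu * tr**2) / (2 * E)

sigma_eff = np.array([[2.5e6, 0.5e6, 0.0],
                      [0.5e6, -1.0e6, 0.0],
                      [0.0,   0.0,    0.3e6]])   # Pa, arbitrary example state
Y = tensile_energy(sigma_eff)
Y0 = 40.0   # hypothetical damage threshold (J/m^3)
print(f"energy = {Y:.1f} J/m^3 -> damage grows: {Y > Y0}")
```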

Relevance:

30.00%

Publisher:

Abstract:

A second-order closure is developed for predicting turbulent flows of viscoelastic fluids described by a modified generalised Newtonian fluid model, which incorporates a nonlinear viscosity that depends on a strain-hardening Trouton ratio as a means to capture some of the effects of viscoelasticity on turbulent flows. Its performance is assessed by comparing its predictions for fully developed turbulent pipe flow with experimental data for four different dilute polymer solutions, and also with two sets of direct numerical simulation data for fluids theoretically described by the finitely extensible nonlinear elastic-Peterlin (FENE-P) model. The model is based on a Newtonian Reynolds stress closure for Newtonian fluid flows, which incorporates low-Reynolds-number damping functions to properly deal with wall effects and to handle fluid viscoelasticity more effectively. This new turbulence model captured well the drag reduction of various viscoelastic fluids over a wide range of Reynolds numbers and performed better than previously developed models for the same type of constitutive equation, even though the streamwise and wall-normal turbulence intensities were underpredicted.

Relevance:

30.00%

Publisher:

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance:

30.00%

Publisher:

Abstract:

Given the importance of Guzera breeding programs for milk production in the tropics, the objective of this study was to compare alternative random regression models for the estimation of genetic parameters and the prediction of breeding values. Test-day milk yield records (TDR) were collected monthly, with a maximum of 10 measurements per cow. The database included 20,524 first-lactation records from 2816 Guzera cows. TDR data were analyzed by random regression models (RRM) considering additive genetic, permanent environmental and residual effects as random, and the effects of contemporary group (CG), calving age as a covariate (linear and quadratic effects) and the mean lactation curve as fixed. The additive genetic and permanent environmental effects were modeled in the RRM using the Wilmink function, the Ali and Schaeffer function, cubic B-spline functions and Legendre polynomials. Residual variances were considered as heterogeneous classes, grouped differently according to the model used. For further comparison, a multi-trait analysis of the test-day records using finite-dimensional models (FDM) and a conventional single-trait analysis of 305-day milk yield (P305) were also carried out, using the restricted maximum likelihood method. According to the statistical criteria adopted, the best RRM was the one that used the cubic B-spline function with five random regression coefficients for the additive genetic and permanent environmental effects. However, the models using the Ali and Schaeffer function, or Legendre polynomials of second and fifth order for the additive genetic and permanent environmental effects, respectively, can also be adopted, as little variation was observed in the genetic parameter estimates compared with those from the B-spline models. Therefore, owing to the lower complexity of the (co)variance estimation, the model using Legendre polynomials represented the best option for the genetic evaluation of the Guzera lactation records. An increase of 3.6% in the accuracy of the estimated breeding values was observed when using RRM. The rankings of animals were very close regardless of the RRM used to predict breeding values. For P305, the results indicated only small to medium differences in the animals' ranking based on breeding values predicted by the conventional model or by RRM. Therefore, the sum of the RRM-predicted breeding values along the lactation period (RRM305) can be used as a selection criterion for 305-day milk production. (c) 2014 Elsevier B.V. All rights reserved.
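The comparison above hinges on the covariate functions used to model the lactation curve. The sketch below builds Legendre polynomial covariables for a set of test-day ages (days in milk standardized to [-1, 1]), which is the usual way such regressors enter a random regression model; the orders follow those mentioned in the abstract (second for additive genetic, fifth for permanent environmental effects), while the test-day ages and lactation limits are illustrative assumptions.

```python
# Sketch: Legendre polynomial covariables for test-day records, as commonly
# used to build random-regression design matrices (orders follow the abstract:
# 2 for additive genetic, 5 for permanent environmental effects).
import numpy as np
from numpy.polynomial import legendre

def legendre_covariables(days_in_milk, order, dim_min=5, dim_max=305):
    """Normalized Legendre covariables evaluated at standardized test days."""
    t = 2.0 * (np.asarray(days_in_milk, float) - dim_min) / (dim_max - dim_min) - 1.0
    cols = []
    for k in range(order + 1):
        coeffs = np.zeros(k + 1)
        coeffs[k] = 1.0
        # sqrt((2k+1)/2) gives the orthonormal form often used in RRM software.
        cols.append(np.sqrt((2 * k + 1) / 2.0) * legendre.legval(t, coeffs))
    return np.column_stack(cols)

test_days = [5, 35, 65, 95, 125, 155, 185, 215, 245, 275]  # monthly TDR, up to 10
Z_genetic = legendre_covariables(test_days, order=2)   # additive genetic part
Z_pe = legendre_covariables(test_days, order=5)        # permanent environment part
print(Z_genetic.shape, Z_pe.shape)                     # (10, 3) (10, 6)
```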

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

This study aimed to determine the optimal intake of lysine and threonine for broiler breeder hens. Two experiments were conducted to evaluate the responses of the birds to digestible lysine (Lys) and threonine (Thr). Eight treatments were assessed in each experiment, with six replicates of eight birds in the Lys experiment and of ten birds in the Thr experiment. The dietary levels of Lys and Thr were obtained by a dilution technique. The experimental period was ten weeks for each amino acid studied, comprising six weeks of adaptation and four weeks of data collection. The relationship between amino acid intake, egg mass and body weight was fitted using a Reading model. Based on the model coefficients, the cost of the synthetic amino acid sources and the price of fertile eggs, the intake of each amino acid that maximized the economic return was determined. The minimum intakes of Lys and Thr reduced egg production by 40 and 30%, respectively, and reduced egg weight by 12 and 9%, respectively. The models generated for predicting Lys and Thr intake were: Lys = 11 × E + 31 × W and Thr = 9.5 × E + 32 × W, where E = egg mass (g/bird per day) and W = body weight (kg/bird). Based on these models, 3 kg birds producing an egg mass of 50 g/day require 643 mg/bird per day of Lys and 569 mg/bird per day of Thr. The optimum economic intakes were calculated at 954 and 834 mg/bird per day for Lys and Thr, respectively, corresponding to dietary concentrations of 0.636% Lys and 0.556% Thr at a feed intake of 150 g/bird per day. (C) 2015 Elsevier B.V. All rights reserved.
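Since the intake equations are stated explicitly, the worked example can be checked directly; the short sketch below does so for a 3 kg hen producing 50 g of egg mass per day (the rounded coefficients give 643 mg/day of Lys exactly and about 571 mg/day of Thr, close to the 569 mg/day reported, the small gap presumably being coefficient rounding).

```python
# Intake equations as reported in the abstract (E in g/bird per day, W in kg/bird).
def lys_intake(egg_mass_g, body_weight_kg):
    return 11.0 * egg_mass_g + 31.0 * body_weight_kg   # mg/bird per day

def thr_intake(egg_mass_g, body_weight_kg):
    return 9.5 * egg_mass_g + 32.0 * body_weight_kg    # mg/bird per day

E, W = 50.0, 3.0   # worked example from the abstract
print(lys_intake(E, W))   # 643.0 mg/bird per day
print(thr_intake(E, W))   # 571.0 mg/bird per day (abstract reports 569; rounding)
```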

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Composites are engineered materials that take advantage of the particular properties of each of their two or more constituents. They are designed to be stronger, lighter and longer-lasting, which can lead to safer protection gear, more fuel-efficient transportation and more affordable materials, among other examples. This thesis presents a numerical and analytical verification of an in-house multiscale model for predicting the mechanical behavior of composite materials in various configurations subjected to impact loading. The verification is done by comparing analytical and numerical reference solutions with the results obtained using the model. The model accounts for material heterogeneity that is only apparent at smaller length scales and relies strictly on the fundamental structural properties of each of the composite's constituents; it can therefore potentially reduce or eliminate the need for costly and time-consuming material-characterization experiments. Results from simulations using the multiscale model were compared against direct simulations using highly refined (overkill) meshes that resolved all heterogeneities explicitly at the global scale, indicating that the model is an accurate and fast tool for modeling composites under impact loads. Advisor: David H. Allen

Relevance:

30.00%

Publisher:

Abstract:

When a scaled structure (model or replica) is used to predict the response of a full-size structure (prototype), the model's geometric dimensions should relate to the corresponding prototype dimensions through a single scaling factor. However, owing to manufacturing restrictions, this condition cannot be met for some of the dimensions of real structures. The distorted geometry then no longer complies with the overall geometric scaling factor, violating the Pi theorem requirements for complete dynamic similarity. In the present study, a method that takes such geometrical distortions into account is introduced, leading to a model that remains similar to the prototype. To assess the performance of this method, three analytical problems of structures subjected to dynamic loads are analysed. It is shown that a replica developed with this technique can accurately predict the behaviour of the full-size structure even when some of the model dimensions are severely distorted. (C) 2012 Elsevier Ltd. All rights reserved.
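The distortion-handling method itself is not detailed in the abstract. As background only, the sketch below lists the classical replica scaling factors for a geometrically scaled model made of the same material (assuming strain-rate-insensitive behaviour and neglecting gravity), which is the undistorted baseline the Pi theorem provides and which distorted dimensions violate.

```python
# Classical replica scaling factors (same material, geometric scale factor lam),
# shown here only as the undistorted baseline implied by the Pi theorem.
def replica_scaling(lam):
    return {
        "length": lam,
        "displacement": lam,
        "time": lam,
        "velocity": 1.0,
        "strain": 1.0,
        "stress": 1.0,
        "force": lam**2,
        "energy": lam**3,
        "frequency": 1.0 / lam,
    }

# Example: a 1/4-scale model; each entry is the model/prototype ratio.
for quantity, factor in replica_scaling(0.25).items():
    print(f"{quantity:12s} model/prototype ratio = {factor:g}")
```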

Relevance:

30.00%

Publisher:

Abstract:

Knowing which individuals can be more efficient at spreading a pathogen throughout a given environment is a fundamental question in disease control. Indeed, in recent years the spread of epidemic diseases and its relationship with the topology of the underlying system has been a recurrent topic in complex network theory, considering both network models and real-world data. In this paper we explore possible correlations between the heterogeneous spread of an epidemic disease governed by the susceptible-infected-recovered (SIR) model and several attributes of the originating vertices, considering Erdős-Rényi (ER), Barabási-Albert (BA) and random geometric graphs (RGG), as well as a real case study, the US air transportation network, which comprises the 500 busiest airports in the US along with their interconnections. Initially, the heterogeneity of the spreading is introduced in the RGG networks, for which we analytically derive an expression for the distribution of the spreading rates among the established contacts by assuming that such rates decay exponentially with the distance separating the individuals. The same distribution is then considered for the ER and BA models, where we observe topological effects on the correlations. In the case of the airport network, the spreading rates are defined empirically, assumed to be directly proportional to seat availability. For both the theoretical and the real networks considered, we observe a high correlation between the total epidemic prevalence and the degree, strength and accessibility of the epidemic sources. For attributes such as the betweenness centrality and the k-shell index, however, the correlation depends on the topology considered.
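As a hedged illustration of the setup described (not the authors' code), the sketch below runs a discrete-time SIR process on a random geometric graph in which the per-contact transmission probability decays exponentially with the Euclidean distance between the endpoints, and then correlates the final outbreak size from each seed with the seed's degree. Network size, rates and the decay constant are arbitrary choices.

```python
# Illustrative sketch: SIR spreading on a random geometric graph (RGG) with
# per-edge transmission probabilities decaying exponentially with distance,
# then correlating outbreak size with the degree of the seed vertex.
import math
import random
import networkx as nx
import numpy as np

random.seed(1)
G = nx.random_geometric_graph(300, radius=0.12, seed=1)
pos = nx.get_node_attributes(G, "pos")
for u, v in G.edges():
    d = math.dist(pos[u], pos[v])
    G[u][v]["beta"] = 0.9 * math.exp(-20.0 * d)   # assumed decay with distance

def sir_outbreak(G, seed, gamma=0.3, steps=200):
    """Discrete-time SIR; returns the final number of recovered vertices."""
    status = {n: "S" for n in G}
    status[seed] = "I"
    for _ in range(steps):
        new_status = dict(status)
        for n in G:
            if status[n] == "I":
                for m in G.neighbors(n):
                    if status[m] == "S" and random.random() < G[n][m]["beta"]:
                        new_status[m] = "I"
                if random.random() < gamma:
                    new_status[n] = "R"
        status = new_status
        if all(s != "I" for s in status.values()):
            break
    return sum(s == "R" for s in status.values())

degrees = np.array([G.degree(n) for n in G])
sizes = np.array([sir_outbreak(G, n) for n in G])
print("degree vs outbreak size correlation:", np.corrcoef(degrees, sizes)[0, 1])
```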

Relevance:

30.00%

Publisher:

Abstract:

Although praised for their rationality, humans often make poor decisions, even in simple situations. In the repeated binary choice experiment, an individual has to choose repeatedly between the same two alternatives, where a reward is assigned to one of them with fixed probability. The optimal strategy is to perseverate with choosing the alternative with the best expected return. Whereas many species perseverate, humans tend to match the frequencies of their choices to the frequencies of the alternatives, a sub-optimal strategy known as probability matching. Our goal was to find the primary cognitive constraints under which a set of simple evolutionary rules can lead to such contrasting behaviors. We simulated the evolution of artificial populations, wherein the fitness of each animat (artificial animal) depended on its ability to predict the next element of a sequence made up of a repeating binary string of varying size. When the string was short relative to the animats' neural capacity, they could learn it and correctly predict the next element of the sequence. When it was long, they could not learn it, turning to the next best option: to perseverate. Animats from the last generation then performed the task of predicting the next element of a non-periodical binary sequence. We found that, whereas animats with smaller neural capacity kept perseverating with the best alternative as before, animats with larger neural capacity, which had previously been able to learn the pattern of repeating strings, adopted probability matching, being outperformed by the perseverating animats. Our results demonstrate how the ability to make predictions in an environment endowed with regular patterns may lead to probability matching under less structured conditions. They point to probability matching as a likely by-product of adaptive cognitive strategies that were crucial in human evolution, but may lead to sub-optimal performances in other environments.
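The sub-optimality of probability matching follows from a one-line expected-reward calculation: for a reward probability p on the better alternative, perseveration earns p per trial while matching earns p^2 + (1 - p)^2, which is strictly smaller for any p other than 0.5. The snippet below just prints both values for an illustrative p.

```python
# Expected reward per trial in the repeated binary choice task.
p = 0.7                                   # illustrative reward probability
perseverate = p                           # always pick the better alternative
match = p * p + (1 - p) * (1 - p)         # pick each side with its own frequency
print(perseverate, match)                 # 0.7 vs 0.58: matching is sub-optimal
```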

Relevance:

30.00%

Publisher:

Abstract:

The study introduces a new regression model developed to estimate hourly values of diffuse solar radiation at the surface. The model is based on the relationship between the clearness index and the diffuse fraction, and includes the effects of cloud (cloudiness and cloud type), traditional meteorological variables (air temperature, relative humidity and atmospheric pressure observed at the surface) and air pollution (concentration of particulate matter observed at the surface). The new model predicts hourly values of diffuse solar radiation better than previously developed ones (R² = 0.93 and RMSE = 0.085). A simpler version with broader applicability, taking into consideration cloud effects only (cloudiness and cloud height), is also proposed and achieves R² = 0.92. (C) 2011 Elsevier Ltd. All rights reserved.
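The paper's regression coefficients are not given in the abstract; the sketch below only shows the general shape of such a model: the hourly diffuse fraction regressed on the clearness index plus cloud, meteorological and particulate-matter predictors, fitted by ordinary least squares on synthetic data. All variable names, ranges and coefficients are placeholders.

```python
# Shape of the model only (the paper's coefficients are not reproduced here):
# diffuse fraction kd regressed on clearness index kt plus cloud, meteorological
# and air-pollution predictors, using ordinary least squares on fake data.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

rng = np.random.default_rng(42)
n = 500
kt = rng.uniform(0.1, 0.8, n)          # clearness index
cloud = rng.uniform(0, 10, n)          # cloudiness (tenths)
temp = rng.uniform(10, 35, n)          # air temperature (deg C)
rh = rng.uniform(30, 95, n)            # relative humidity (%)
pm = rng.uniform(10, 80, n)            # particulate matter (ug/m3)
# Synthetic "true" diffuse fraction, just to have something to fit.
kd = 0.95 - 0.9 * kt + 0.02 * cloud + 0.001 * pm + rng.normal(0, 0.03, n)

X = np.column_stack([kt, cloud, temp, rh, pm])
model = LinearRegression().fit(X, kd)
print("R2 =", round(r2_score(kd, model.predict(X)), 3))
```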

Relevance:

30.00%

Publisher:

Abstract:

Background: Tuberculosis (TB) remains a public health issue worldwide. The lack of specific clinical symptoms for diagnosing TB makes the decision to admit patients to respiratory isolation a difficult task for the clinician. Isolation of patients without the disease is common and increases health costs. Decision models for the diagnosis of TB in patients attending hospitals can increase the quality of care and decrease costs, without increasing the risk of hospital transmission. We present a model for predicting pulmonary TB in hospitalized patients in a high-prevalence area, in order to contribute to a more rational use of isolation rooms without increasing the risk of transmission. Methods: Cross-sectional study of patients admitted to CFFH from March 2003 to December 2004. A classification and regression tree (CART) model was generated and validated. The area under the ROC curve (AUC), sensitivity, specificity, and positive and negative predictive values were used to evaluate the performance of the model. Validation of the model was performed with a different sample of patients admitted to the same hospital from January to December 2005. Results: We studied 290 patients admitted with clinical suspicion of TB. The diagnosis was confirmed in 26.5% of them. Pulmonary TB was present in 83.7% of the patients with TB (62.3% with positive sputum smear), and HIV/AIDS was present in 56.9% of the patients. The validated CART model showed a sensitivity, specificity, positive predictive value and negative predictive value of 60.00%, 76.16%, 33.33% and 90.55%, respectively. The AUC was 79.70%. Conclusions: The CART model developed for these hospitalized patients with clinical suspicion of TB had fair to good predictive performance for pulmonary TB. The most important variable for the prediction of a TB diagnosis was the chest radiograph result. Prospective validation is still necessary, but our model offers an alternative for deciding whether to isolate patients with clinical suspicion of TB in tertiary health facilities in countries with limited resources.
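The patient data are of course not available here; the sketch below only illustrates the modeling pipeline named in the abstract: a classification tree (CART) trained on one period, validated on a later sample, and summarized by sensitivity, specificity, predictive values and the area under the ROC curve. Feature names, cohort sizes other than the 290 derivation patients, and the data-generating assumptions are invented.

```python
# Sketch of the CART pipeline described in the abstract, on synthetic data:
# train on one period, validate on another, report sensitivity, specificity,
# predictive values and AUC. Feature names are illustrative only.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import confusion_matrix, roc_auc_score

rng = np.random.default_rng(0)

def make_cohort(n):
    xray = rng.integers(0, 2, n)          # suggestive chest radiograph (0/1)
    cough_weeks = rng.integers(0, 12, n)  # weeks of cough
    hiv = rng.integers(0, 2, n)           # HIV status (0/1)
    logit = -2.0 + 2.2 * xray + 0.15 * cough_weeks + 0.4 * hiv
    tb = rng.random(n) < 1 / (1 + np.exp(-logit))
    return np.column_stack([xray, cough_weeks, hiv]), tb.astype(int)

X_train, y_train = make_cohort(290)       # derivation sample (2003-2004 analogue)
X_valid, y_valid = make_cohort(200)       # validation sample (2005 analogue)

cart = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
pred = cart.predict(X_valid)
tn, fp, fn, tp = confusion_matrix(y_valid, pred).ravel()
print("sensitivity", tp / (tp + fn))
print("specificity", tn / (tn + fp))
print("PPV", tp / (tp + fp))
print("NPV", tn / (tn + fn))
print("AUC", roc_auc_score(y_valid, cart.predict_proba(X_valid)[:, 1]))
```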