986 results for Prediction Equations


Relevance: 60.00%

Abstract:

The research was carried out to determine the body composition and the calcium and phosphorus requirements of Santa Ines lambs. Eighteen entire male lambs with an average initial live weight of 15 kg were used. The animals were allotted to three groups: six were slaughtered at the beginning of the experiment to assess the amount of calcium and phosphorus in the body, serving as reference animals for the comparative slaughter technique; six were fed ad libitum; and six were fed at a restricted level (maintenance plus 20%). The ad libitum and restricted-fed animals entered the experimental period in pairs, and both members of a pair were slaughtered when the first reached 25 kg body weight. Body composition was estimated through prediction equations obtained by regressing the logarithm of the amount of calcium and phosphorus in the empty body on the logarithm of the empty body weight. Net calcium and phosphorus requirements for maintenance and the absorption coefficients were obtained from the correlation between the amount of each mineral consumed and the amount retained in the animal body. Net requirements for live weight gain were obtained by differentiating the body composition prediction equations. The net maintenance requirements of calcium and phosphorus for animals from 15 to 25 kg body weight were 305 mg Ca/day and 325 mg P/day, and the net requirements per kg of body weight gain for animals at 15 and 25 kg LW were 11.41 and 10.33 g Ca and 5.72 and 4.94 g P, respectively. The absorption coefficients were estimated to be 0.44 and 0.55 for Ca and P, respectively.
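The net requirement for gain here follows from differentiating the allometric prediction equation: if log10(M) = a + b*log10(EBW), then M = 10^a * EBW^b and dM/dEBW = b * 10^a * EBW^(b-1). A minimal Python sketch of that calculation, using invented slaughter data rather than the study's values, might look like this:

    import numpy as np

    def fit_allometric(ebw_kg, mineral_g):
        """Fit log10(mineral) = a + b*log10(EBW) by least squares."""
        b, a = np.polyfit(np.log10(ebw_kg), np.log10(mineral_g), 1)
        return a, b

    def net_requirement_per_kg_gain(ebw_kg, a, b):
        """Derivative dM/dEBW = b * 10**a * EBW**(b-1), i.e. grams of mineral
        deposited per kg of empty body weight gain at a given EBW."""
        return b * 10**a * ebw_kg**(b - 1)

    # Illustrative (not the study's) slaughter data: EBW (kg) and body Ca (g)
    ebw = np.array([12.1, 13.0, 14.2, 19.5, 20.8, 21.9])
    ca  = np.array([150.0, 160.0, 172.0, 225.0, 236.0, 247.0])

    a, b = fit_allometric(ebw, ca)
    print("Net Ca requirement at 20 kg EBW (g/kg gain):",
          round(net_requirement_per_kg_gain(20.0, a, b), 2))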

Relevance: 60.00%

Abstract:

This work aimed to compare intake prediction equations with values obtained by direct methods, using chopped elephant grass offered to crossbred lactating cows fitted with rumen cannulas. The experimental design was a 3 x 3 Latin square (three animals and three cutting ages: 30, 45 and 60 days). The equations used for intake prediction (y) were: (1) y = -1.19 + 0.035(a+b) + 28.5c; (2) y = [%NDF in DM] * [NDF intake] / [(1 - a - b)/kp + b/(c + kp)]/24; (3) y = -0.822 + 0.0748(a+b) + 40.7c; and (4) equation 2 with intake values measured directly. NDF intake predicted by the equations did not differ among treatments, unlike the values measured directly: 30-day-old grass gave a lower intake (5.29 kg/day) than 45-day-old (6.57 kg/day) and 60-day-old (7.31 kg/day) grass. In general, the equations overestimated DM intake relative to the direct measurement (9.0 kg/cow/day), except for equation 3, which underestimated it (7.7 kg/day). Mean DM intakes given by equations 1 and 2 (13.7 and 13.4 kg/cow/day, respectively) were similar to each other and higher than that given by equation 4 (9.7 kg/cow/day). Intakes measured directly were similar to those from equation 4 and higher than those from equation 3. The mean rumen fill of 7.5 kg measured directly was higher than the 5.2 kg estimated by equation. Prediction equations based on in situ degradability parameters did not provide estimates of DM intake, NDF intake or rumen fill in agreement with the values obtained by direct methods.
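Equations (1) and (3) are given explicitly in the abstract and depend only on the in situ degradation parameters. A small Python sketch of how they would be applied is shown below, assuming the usual Ørskov-McDonald meaning of a (soluble fraction), b (insoluble but degradable fraction) and c (fractional degradation rate), with purely illustrative parameter values:

    def intake_eq1(a, b, c):
        """Equation (1) from the abstract: y = -1.19 + 0.035*(a + b) + 28.5*c.
        a, b in %, c in /h; units of y as reported in the abstract."""
        return -1.19 + 0.035 * (a + b) + 28.5 * c

    def intake_eq3(a, b, c):
        """Equation (3) from the abstract: y = -0.822 + 0.0748*(a + b) + 40.7*c."""
        return -0.822 + 0.0748 * (a + b) + 40.7 * c

    # Hypothetical degradability parameters for 45-day-old elephant grass
    a, b, c = 25.0, 55.0, 0.04   # illustrative values only
    print(intake_eq1(a, b, c), intake_eq3(a, b, c))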

Relevance: 60.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 60.00%

Abstract:

Pós-graduação em Zootecnia - FMVZ

Relevance: 60.00%

Abstract:

Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)

Relevance: 60.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 60.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 60.00%

Abstract:

Pós-graduação em Ciências Odontológicas - FOAR

Relevance: 60.00%

Abstract:

A data set based on 50 studies reporting feed intake and utilization traits was used in a meta-analysis to obtain pooled estimates, accounting for the variance between studies, of genetic parameters for average daily gain (ADG), residual feed intake (RFI), metabolic body weight (MBW), feed conversion ratio (FCR) and daily dry matter intake (DMI) in beef cattle. The data set comprised 128 heritability and 122 genetic correlation estimates published in the literature from 1961 to 2012. The meta-analysis was performed with a random-effects model in which the restricted maximum likelihood estimator was used to evaluate variances among clusters. In addition, cluster analysis was used to group the heritability estimates; two clusters, defined by different variables, were obtained for each trait. For all traits, heterogeneity of variance was significant between clusters and between studies for the genetic correlation estimates. The pooled direct heritability estimates, including the variance between clusters, were 0.32 +/- 0.04 for ADG, 0.39 +/- 0.03 for DMI, 0.31 +/- 0.02 for RFI, 0.31 +/- 0.03 for MBW and 0.26 +/- 0.03 for FCR. Pooled genetic correlation estimates among ADG, DMI, RFI, MBW and FCR ranged from -0.15 to 0.67. These pooled estimates of genetic parameters can be used to solve genetic prediction equations in populations where the data are insufficient for variance component estimation. Cluster analysis is recommended as a statistical procedure for combining results from different studies while accounting for heterogeneity.
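As a rough illustration of the pooling step, the sketch below uses the classical DerSimonian-Laird random-effects estimator as a simpler stand-in for the REML procedure applied in the study; the heritability estimates and standard errors are invented:

    import numpy as np

    def random_effects_pool(estimates, se):
        """Pool estimates with a DerSimonian-Laird random-effects model
        (a simpler stand-in for the REML estimator used in the study)."""
        estimates, se = np.asarray(estimates, float), np.asarray(se, float)
        w = 1.0 / se**2                                 # fixed-effect weights
        fixed = np.sum(w * estimates) / np.sum(w)
        q = np.sum(w * (estimates - fixed)**2)          # Cochran's Q
        df = len(estimates) - 1
        c = np.sum(w) - np.sum(w**2) / np.sum(w)
        tau2 = max(0.0, (q - df) / c)                   # between-study variance
        w_star = 1.0 / (se**2 + tau2)                   # random-effects weights
        pooled = np.sum(w_star * estimates) / np.sum(w_star)
        pooled_se = np.sqrt(1.0 / np.sum(w_star))
        return pooled, pooled_se, tau2

    # Hypothetical heritability estimates for RFI and their standard errors
    h2 = [0.28, 0.35, 0.40, 0.22, 0.33]
    se = [0.05, 0.08, 0.10, 0.06, 0.07]
    print(random_effects_pool(h2, se))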

Relevance: 60.00%

Abstract:

Pós-graduação em Zootecnia - FCAV

Relevance: 60.00%

Abstract:

Background: Few equations have been developed in veterinary medicine, compared with human medicine, to predict body composition. The present study evaluated the influence of weight loss on biometry (BIO), bioimpedance analysis (BIA) and ultrasonography (US) in cats, and proposed equations to estimate fat mass (FM) and lean mass (LM), with dual energy x-ray absorptiometry (DXA) as the reference method. Sixteen gonadectomized obese cats (8 males and 8 females) enrolled in a weight loss program were used. DXA, BIO, BIA and US were performed in the obese state (T0), after 10% weight loss (T1) and after 20% weight loss (T2). Stepwise regression was used to analyze the relationship between the dependent variables (FM, LM) determined by DXA and the independent variables obtained by BIO, BIA and US. The best models were then evaluated by simple regression analysis, and predicted means were compared with those determined by DXA to verify the accuracy of the equations. Results: The independent variables determined by BIO, BIA and US that correlated best (p < 0.005) with the dependent variables (FM and LM) were BW (body weight), TC (thoracic circumference), PC (pelvic circumference), R (resistance) and SFLT (subcutaneous fat layer thickness). Using Mallows' Cp statistic, the p value and r², 19 equations were selected (12 for FM, 7 for LM); however, only seven equations accurately predicted FM, and only one accurately predicted LM. Conclusions: Equations with two variables are preferable because they are effective and offer an alternative method for estimating body composition in the clinical routine. To estimate lean mass, equations combining body weight with biometric measures can be proposed; to estimate fat mass, equations combining body weight with bioimpedance analysis can be proposed.
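A hedged sketch of the kind of two-variable equation the study recommends (fat mass predicted from body weight and BIA resistance), fitted by ordinary least squares; the data and resulting coefficients are purely illustrative, not the published equations:

    import numpy as np

    # Hypothetical data: body weight (kg) and resistance (ohm) from BIA,
    # with fat mass (kg) measured by DXA as the reference (values illustrative).
    bw  = np.array([5.8, 6.4, 7.1, 5.2, 6.9, 7.5, 4.9, 6.1])
    res = np.array([210., 198., 185., 225., 190., 180., 230., 205.])
    fm  = np.array([2.1, 2.5, 2.9, 1.8, 2.8, 3.1, 1.6, 2.3])

    # Fit FM = b0 + b1*BW + b2*R, mirroring the two-variable models
    # the study recommends for clinical use.
    X = np.column_stack([np.ones_like(bw), bw, res])
    coef, *_ = np.linalg.lstsq(X, fm, rcond=None)

    def predict_fm(body_weight_kg, resistance_ohm):
        return coef[0] + coef[1] * body_weight_kg + coef[2] * resistance_ohm

    print(predict_fm(6.0, 200.0))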

Relevance: 60.00%

Abstract:

METHODS: Spirometry datasets from South-Asian children were collated from four centres in India and five within the UK. Records with transcription errors, missing values for height or spirometry, or implausible values were excluded (n = 110). RESULTS: Following exclusions, cross-sectional data were available for 8,124 children (56.3% male; 5-17 years). When compared with GLI-predicted values for White Europeans, forced expiratory volume in 1 s (FEV1) and forced vital capacity (FVC) in South-Asian children were on average 15% lower, ranging from 4% to 19% between centres. By contrast, because the reductions in FEV1 and FVC were proportional in all but two datasets, the FEV1/FVC ratio remained independent of ethnicity. The 'GLI-Other' equation fitted data from North India reasonably well, while the 'GLI-Black' equations provided a better approximation to South-Asian data than the 'GLI-White' equations. However, marked discrepancies in mean lung function z-scores between centres, especially when examined according to socio-economic conditions, precluded derivation of a single South-Asian GLI adjustment. CONCLUSION: Until improved and more robust prediction equations can be derived, we recommend the use of the 'GLI-Black' equations for interpreting most South-Asian data, although 'GLI-Other' may be more appropriate for North Indian data. Prospective data collection using standardised protocols is urgently required to explore potential sources of variation due to socio-economic circumstances, secular changes in growth/predictors of lung function, and ethnicities within the South-Asian classification.
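GLI-type equations express each measurement as a z-score via the LMS method; a minimal sketch, with made-up L, M and S values rather than the published GLI coefficients:

    def lms_zscore(measured, L, M, S):
        """Convert a measured lung-function value to a z-score with the
        LMS method used by GLI-2012-style equations (for L != 0):
        z = ((measured / M)**L - 1) / (L * S)."""
        return ((measured / M) ** L - 1.0) / (L * S)

    # Illustrative (not GLI-published) L, M, S values for the FEV1 of one child
    z = lms_zscore(measured=1.60, L=1.0, M=1.90, S=0.12)
    print(round(z, 2))   # negative z -> below the predicted median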

Relevance: 60.00%

Abstract:

After the devastating 12 January 2010 earthquake that hit Port-au-Prince, Haiti, local authorities, NGOs, and national and international institutions are developing strategies to minimize the country's high seismic risk. Two important tasks toward this objective are, on the one hand, the evaluation of the seismic risk associated with possible future earthquakes, in order to gauge the dimensions of the catastrophe, and, on the other hand, the design of preventive measures and emergency plans to minimize the consequences of such events. In this context, this Master's Thesis provides a detailed estimation of the damage that a possible future earthquake could cause in Port-au-Prince. A methodology to calculate seismic risk is proposed, adapted to the conditions of the study area and calibrated with data from the 2010 earthquake. The work was conducted within the Sismo-Haití cooperative project, supported by the Universidad Politécnica de Madrid (UPM), which started ten months after the 2010 earthquake in response to a request for assistance from the Haitian government. The seismic risk calculation requires two inputs: the seismic hazard (expected ground motion due to a scenario earthquake defined by magnitude and location) and the elements exposed to that hazard (a classification of the building stock into building typologies, together with their vulnerability). Vulnerability is described through damage functions: capacity curves, which represent structural performance against the horizontal forces caused by earthquakes, and fragility curves, which represent the probability of damage as the structure reaches its maximum spectral displacement under that horizontal force. The proposed methodology specifies guidelines and criteria to estimate the ground motion, assign the vulnerability and evaluate the damage, covering the whole process. Firstly, different ground motion prediction equations including local site effects are considered, and those that best fit the observations from the 2010 earthquake are identified. Secondly, the building stock is classified into typologies using information collected during a field campaign together with a database provided by the Ministry of Public Works of Haiti; this database contains relevant information on all the buildings in the city and leads to a total of six typologies. Finally, the damage is estimated with the capacity-spectrum method as implemented in the software SELENA (Molina et al., 2010). Data on the damage caused by the 2010 earthquake were used to calibrate the proposed risk model: four ground motion prediction equations, three soil models and a set of damage functions. With the calibrated model, a deterministic scenario corresponding to a possible earthquake with an epicentre close to Port-au-Prince was then simulated. The results show considerable structural damage, which highlights the high structural vulnerability of the city; the associated economic and human losses would have a great impact on the country. 
These results will be provided to the local authorities, constituting a solid basis for decision making and for the adoption of risk prevention and mitigation policies. It is recommended that efforts be directed towards reducing structural vulnerability, through the reinforcement of vulnerable buildings and the adoption of a seismic-resistant building code, and towards the development of emergency plans.
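The fragility curves used in this kind of capacity-spectrum calculation are typically lognormal in spectral displacement. The following sketch shows how damage-state probabilities could be derived from them for one building typology; the medians and dispersions are illustrative, not values from the study or from SELENA:

    from math import log, sqrt, erf

    def phi(x):
        """Standard normal CDF."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def damage_state_probs(sd, sd_bar, beta):
        """Probability of each discrete damage state (none, slight, moderate,
        extensive, complete) given the spectral displacement demand sd, using
        the lognormal fragility model P(DS >= ds | sd) = Phi(ln(sd/sd_bar)/beta)
        common to HAZUS / Risk-UE type approaches."""
        exceed = [phi(log(sd / m) / b) for m, b in zip(sd_bar, beta)]
        probs = [1.0 - exceed[0]]
        probs += [exceed[i] - exceed[i + 1] for i in range(len(exceed) - 1)]
        probs.append(exceed[-1])
        return probs

    # Illustrative median spectral displacements (cm) and dispersions for a
    # hypothetical unreinforced-masonry typology (not values from the study).
    sd_bar = [0.8, 1.6, 3.5, 7.0]
    beta   = [0.7, 0.8, 0.8, 0.9]
    print(damage_state_probs(sd=2.0, sd_bar=sd_bar, beta=beta))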

Relevance: 60.00%

Abstract:

In this paper we present a global overview of the recent study carried out in Spain for the new hazard map, whose final goal is the revision of the Spanish building code (NCSE-02). The study was carried out by a working group joining experts from the Instituto Geográfico Nacional (IGN) and the Technical University of Madrid (UPM), with the different phases of the work supervised by a committee of national experts from public institutions involved in seismic hazard. The PSHA (Probabilistic Seismic Hazard Assessment) approach was followed, quantifying the epistemic uncertainties through a logic tree and the aleatory uncertainties, linked to the variability of parameters, by means of probability density functions and Monte Carlo simulations. In a first phase, the inputs were prepared, which essentially are: 1) an update of the project catalogue and its homogenization to moment magnitude (Mw); 2) proposed zoning models and source characterization; 3) calibration of ground motion prediction equations (GMPEs) with local data and development of a local model with data collected in Spain for Mw < 5.5. In a second phase, a sensitivity analysis of the different input options on the hazard results was carried out in order to establish criteria for defining the branches of the logic tree and their weights. Finally, the hazard was estimated with the logic tree shown in figure 1, which includes nodes quantifying the uncertainties associated with: 1) the hazard estimation method (zoning and zoneless); 2) the zoning models; 3) the GMPE combinations used; and 4) the regression method used to estimate source parameters. In addition, the aleatory uncertainties in event magnitude, recurrence parameters and maximum magnitude for each zone were considered through probability density functions and Monte Carlo simulations. The main conclusions of the study are presented here, together with the results in terms of PGA and other spectral accelerations SA(T) for return periods of 475, 975 and 2475 years. A map of the coefficient of variation (COV) is also presented to indicate the zones where the dispersion among results is highest and the zones where the results are robust.
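As a schematic of how logic-tree branches are combined, the sketch below weights hypothetical branch results for a single ground-motion level and computes the coefficient of variation mapped in the study; all numbers and weights are illustrative, not the study's:

    import numpy as np

    # Hypothetical annual exceedance rates of PGA = 0.1 g computed with
    # different logic-tree branches (method x zoning x GMPE), and the
    # corresponding branch weights (illustrative values only).
    branch_rates   = np.array([2.1e-3, 1.7e-3, 2.6e-3, 1.9e-3])
    branch_weights = np.array([0.3, 0.2, 0.3, 0.2])

    # Weighted mean hazard over the logic tree and the coefficient of
    # variation (COV) quantifying dispersion among branches.
    mean_rate = np.sum(branch_weights * branch_rates)
    cov = np.sqrt(np.sum(branch_weights * (branch_rates - mean_rate) ** 2)) / mean_rate
    print(mean_rate, round(cov, 3))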

Relevance: 60.00%

Abstract:

A damage scenario model is developed and compared with the damage distribution observed after the 2011 Lorca earthquake. The strong ground motion models considered include five modern ground motion prediction equations (GMPEs) widely used worldwide. Capacity and fragility curves from the Risk-UE project are used to model building vulnerability and expected damage. Damage estimates resulting from different combinations of GMPEs and capacity/fragility curves are compared with the actual damage scenario, establishing the combination that best explains the observed damage distribution. In addition, some recommendations are proposed, including correction factors for the fragility curves in order to better reproduce the observed damage in masonry and reinforced concrete buildings. The lessons learned should contribute to improving the simulation of expected damage from future earthquakes in Lorca and in other Spanish regions with similar attenuation and vulnerability characteristics.
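The selection of the best-fitting combination can, in principle, be expressed as a simple scoring of predicted against observed damage distributions. The sketch below is only an illustration of that idea; the damage fractions and combination labels are invented, not results from the study:

    import numpy as np

    def score(predicted, observed):
        """Mean squared difference between predicted and observed fractions
        of buildings in each damage grade (lower is better)."""
        predicted, observed = np.asarray(predicted), np.asarray(observed)
        return float(np.mean((predicted - observed) ** 2))

    # Observed fractions per damage grade (none..complete) and predictions from
    # two hypothetical GMPE + fragility-curve combinations (illustrative only).
    observed = [0.55, 0.20, 0.12, 0.08, 0.05]
    combos = {
        "GMPE-A + Risk-UE curves": [0.48, 0.22, 0.15, 0.10, 0.05],
        "GMPE-B + Risk-UE curves": [0.65, 0.18, 0.09, 0.05, 0.03],
    }
    best = min(combos, key=lambda k: score(combos[k], observed))
    print(best)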