26 results for Palm Kernel Meal
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
See the abstract at the beginning of the document in the attached file.
Abstract:
In a seminal paper, Aitchison and Lauder (1985) introduced classical kernel density estimation techniques in the context of compositional data analysis. Indeed, they gave two options for the choice of the kernel to be used in the kernel estimator. One of these kernels is based on the use of the alr transformation on the simplex S^D jointly with the normal distribution on R^{D-1}. However, these authors themselves recognized that this method has some deficiencies. A method for overcoming these difficulties, based on recent developments in compositional data analysis and multivariate kernel estimation theory, combining the ilr transformation with the use of the normal density with a full bandwidth matrix, was recently proposed in Martín-Fernández, Chacón and Mateu-Figueras (2006). Here we present an extensive simulation study that compares both methods in practice, thus exploring the finite-sample behaviour of both estimators.
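The ilr transformation mentioned in this abstract maps a composition from the simplex to ordinary Euclidean space, where a standard multivariate kernel estimator can then be applied. A minimal sketch, assuming one standard choice of ilr balances (this is an illustration, not the estimator of Martín-Fernández, Chacón and Mateu-Figueras):

```python
import math

def ilr(x):
    """Isometric log-ratio (ilr) transform of a composition x in the simplex S^D
    into R^{D-1}, using one standard choice of sequential balances."""
    D = len(x)
    z = []
    for i in range(1, D):
        # geometric mean of the first i parts
        g = math.exp(sum(math.log(x[j]) for j in range(i)) / i)
        z.append(math.sqrt(i / (i + 1)) * math.log(g / x[i]))
    return z
```

A density estimate with a full bandwidth matrix would then operate on the transformed coordinates z rather than on the raw composition. Note that the neutral composition (all parts equal) maps to the origin.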
Abstract:
For the standard kernel density estimate, it is known that one can tune the bandwidth such that the expected L1 error is within a constant factor of the optimal L1 error (obtained when one is allowed to choose the bandwidth with knowledge of the density). In this paper, we pose the same problem for variable bandwidth kernel estimates, where the bandwidths are allowed to depend upon the location. We show in particular that for positive kernels on the real line, for any data-based bandwidth, there exists a density for which the ratio of expected L1 error over optimal L1 error tends to infinity. Thus, the problem of tuning the variable bandwidth in an optimal manner is "too hard". Moreover, from the class of counterexamples exhibited in the paper, it appears that placing conditions on the densities (monotonicity, convexity, smoothness) does not help.
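The distinction the abstract draws can be made concrete: a fixed-bandwidth estimate uses one global h, while a variable-bandwidth (balloon) estimate lets h depend on the evaluation point. A minimal sketch with a Gaussian kernel (function names and the example bandwidth rule are illustrative, not from the paper):

```python
import math

def kde_fixed(data, x, h):
    """Standard Gaussian kernel density estimate with one global bandwidth h."""
    n = len(data)
    return sum(math.exp(-0.5 * ((x - xi) / h) ** 2)
               for xi in data) / (n * h * math.sqrt(2 * math.pi))

def kde_variable(data, x, h_of_x):
    """Balloon-type variable-bandwidth estimate: the bandwidth is a function
    of the location x at which the density is evaluated."""
    return kde_fixed(data, x, h_of_x(x))
```

The paper's negative result says that no data-based rule `h_of_x` can be uniformly within a constant factor of the oracle choice in expected L1 error.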
Abstract:
In the fixed design regression model, additional weights are considered for the Nadaraya-Watson and Gasser-Müller kernel estimators. We study their asymptotic behavior and the relationships between the new and classical estimators. For a simple family of weights, and considering the IMSE as the global loss criterion, we show some possible theoretical advantages. An empirical study illustrates the performance of the weighted estimators in finite samples.
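A weighted Nadaraya-Watson estimator of the kind discussed here can be sketched in a few lines; with all weights equal to one it reduces to the classical estimator. This is a generic illustration, assuming a Gaussian kernel (the paper's specific family of weights is not reproduced):

```python
import math

def gauss(u):
    """Standard Gaussian kernel."""
    return math.exp(-0.5 * u * u) / math.sqrt(2 * math.pi)

def nadaraya_watson(xs, ys, x, h, weights=None):
    """Nadaraya-Watson regression estimate at x with bandwidth h.
    `weights` holds the additional per-observation weights; all ones
    recovers the classical (unweighted) estimator."""
    if weights is None:
        weights = [1.0] * len(xs)
    num = sum(w * gauss((x - xi) / h) * yi for w, xi, yi in zip(weights, xs, ys))
    den = sum(w * gauss((x - xi) / h) for w, xi in zip(weights, xs))
    return num / den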
Abstract:
A transformed kernel estimator suitable for heavy-tailed distributions is presented. Using a transformation based on the Beta probability distribution, the choice of the bandwidth parameter is very direct. An application to insurance data is presented, showing how to compute the Value at Risk.
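The general principle behind a transformed kernel estimator is: map the data through a smooth transformation T, run a standard KDE on the transformed scale, and recover the original-scale density via the change-of-variables formula f_X(x) = f_U(T(x)) T'(x). A minimal sketch of that principle, using a logistic cdf as a stand-in transform (the estimator in this work uses a Beta-based transformation, which is not reproduced here):

```python
import math

def logistic_cdf(x):
    """Stand-in transform onto (0, 1); the paper uses a Beta-based one."""
    return 1.0 / (1.0 + math.exp(-x))

def logistic_pdf(x):
    p = logistic_cdf(x)
    return p * (1.0 - p)

def kde(data, u, h):
    """Standard Gaussian kernel density estimate."""
    n = len(data)
    return sum(math.exp(-0.5 * ((u - ui) / h) ** 2)
               for ui in data) / (n * h * math.sqrt(2 * math.pi))

def transformed_kde(data, x, h, T=logistic_cdf, Tprime=logistic_pdf):
    """Transformed kernel density estimate: f_X(x) = f_U(T(x)) * T'(x),
    where U = T(X) lives on (0, 1) and is estimated by a standard KDE."""
    u_data = [T(xi) for xi in data]
    return kde(u_data, T(x), h) * Tprime(x)
```

Because the transformed sample lives on a bounded scale, bandwidth choice becomes much less sensitive to heavy tails, which is the point the abstract makes.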
Abstract:
In this work we show that in the class of assignment games with a dominant diagonal (Solymosi and Raghavan, 2001), the Thompson payoff (which coincides with the tau value) is the unique core point that is maximal with respect to the Lorenz dominance relation, and that it moreover coincides with the solution of Dutta and Ray (1989), also known as the egalitarian solution. Secondly, by means of a condition stronger than the dominant diagonal, we introduce a new class of assignment games in which each agent obtains with their optimal partner at least twice what they would obtain with any other partner. For these assignment games with a 2-dominant diagonal, the Thompson payoff is the unique point of the kernel, and hence the nucleolus.
Abstract:
This study was undertaken in the framework of a larger European project dealing with the characterization of fat co- and by-products from the food chain available for feed uses. In this study, we compare the effects, on the fatty acid (FA) and tocol composition of chicken and rabbit tissues, of the addition to feeds of a palm fatty acid distillate very low in trans fatty acids (TFA), and of two levels of the corresponding hydrogenated by-product, containing intermediate and high levels of TFA. Thus, the experimental design included three treatments, formulated for each species, containing the three levels of TFA defined above. Obviously, due to the use of hydrogenated fats, the levels of saturated fatty acids (SFA) show clear differences between the three dietary treatments. The results show that diets high in TFA (76 g/kg fat), compared with those low in TFA (4.4 g/kg fat), led to a lower content of tocopherols and tocotrienols in tissues, although these differences were not always statistically significant and show a different pattern for rabbit and chicken. The TFA content in meat, liver and plasma increased from low- to high-TFA feeds in both chicken and rabbit. However, the transfer ratios from feed were not proportional to the TFA levels in feeds, reflecting certain differences according to the animal species. Moreover, feeds containing fats higher in TFA induced significant changes in tissue SFA, monounsaturated fatty acid and polyunsaturated fatty acid composition, but different patterns can be described for chicken and rabbit and for each type of tissue.
Abstract:
Report on a project to develop an application for the Android operating system that lets users organize their diet by planning their meals.
Abstract:
Background: Nowadays, combining the different sources of information to improve the available biological knowledge is a challenge in bioinformatics. One of the most powerful methods for integrating heterogeneous data types is the family of kernel-based methods. Kernel-based data integration approaches consist of two basic steps: first, the right kernel is chosen for each data set; second, the kernels from the different data sources are combined to give a complete representation of the available data for a given statistical task. Results: We analyze the integration of data from several sources of information using kernel PCA, from the point of view of reducing dimensionality. Moreover, we improve the interpretability of kernel PCA by adding to the plot the representation of the input variables that belong to any dataset. In particular, for each input variable or linear combination of input variables, we can represent the direction of maximum growth locally, which allows us to identify those samples with higher or lower values of the variables analyzed. Conclusions: The integration of different datasets and the simultaneous representation of samples and variables together give us a better understanding of the biological knowledge.
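The two-step integration the abstract describes (per-source kernels, then combination) can be sketched concretely: a weighted sum of valid kernels over the same samples is itself a valid kernel, and double-centering the combined matrix is the first step of kernel PCA. A minimal sketch with Gaussian kernels (function names and weights are illustrative, not from the paper):

```python
import math

def rbf(u, v, gamma):
    """Gaussian (RBF) kernel between two feature vectors."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(u, v)))

def kernel_matrix(X, gamma):
    """Kernel matrix of one data source over the common set of samples."""
    return [[rbf(x, y, gamma) for y in X] for x in X]

def combine(Ks, weights):
    """Step 2 of kernel-based integration: a weighted sum of per-source
    kernel matrices over the same samples is again a valid kernel."""
    n = len(Ks[0])
    return [[sum(w * K[i][j] for w, K in zip(weights, Ks))
             for j in range(n)] for i in range(n)]

def center(K):
    """Double-centering of the kernel matrix, the first step of kernel PCA."""
    n = len(K)
    row = [sum(K[i]) / n for i in range(n)]
    tot = sum(row) / n
    return [[K[i][j] - row[i] - row[j] + tot for j in range(n)] for i in range(n)]
```

Kernel PCA then proceeds by an eigendecomposition of the centered combined matrix; the abstract's contribution concerns how input variables from each source can be represented on the resulting plot.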
Abstract:
BACKGROUND AND AIMS: Liver stiffness is increasingly used in the non-invasive evaluation of chronic liver diseases. Liver stiffness correlates with the hepatic venous pressure gradient (HVPG) in patients with cirrhosis and holds prognostic value in this population. Hence, accuracy in its measurement is needed. Several factors independent of fibrosis influence liver stiffness, but there is insufficient information on whether meal ingestion modifies liver stiffness in cirrhosis. We investigated the changes in liver stiffness occurring after the ingestion of a standard liquid test meal in this population. METHODS: In 19 patients with cirrhosis and esophageal varices (9 alcoholic, 9 HCV-related, 1 NASH; Child score 6.9±1.8), liver stiffness (transient elastography), portal blood flow (PBF) and hepatic artery blood flow (HABF) (Doppler ultrasound) were measured before and 30 minutes after receiving a standard mixed liquid meal. In 10 patients the HVPG changes were also measured. RESULTS: Post-prandial hyperemia was accompanied by a marked increase in liver stiffness (+27±33%; p<0.0001). Changes in liver stiffness did not correlate with PBF changes, but correlated directly with HABF changes (r = 0.658; p = 0.002). After the meal, patients showing a decrease in HABF (n = 13) had a less marked increase in liver stiffness than patients in whom HABF increased (n = 6; +12±21% vs. +62±29%, p<0.0001). As expected, post-prandial hyperemia was associated with an increase in HVPG (n = 10; +26±13%, p = 0.003), but changes in liver stiffness did not correlate with HVPG changes. CONCLUSIONS: Liver stiffness increases markedly after a liquid test meal in patients with cirrhosis, suggesting that its measurement should be performed under standardized fasting conditions. The hepatic artery buffer response appears to be an important factor modulating post-prandial changes in liver stiffness. The post-prandial increase in HVPG cannot be predicted from changes in liver stiffness.
Abstract:
Bakery products such as biscuits, cookies, and pastries represent a good medium for iron fortification, since they are consumed by a large proportion of the population at risk of developing iron deficiency anemia, mainly children. The drawback, however, is that iron fortification can promote oxidation. To assess the extent of this, palm oil supplemented with heme iron and different antioxidants was used as a model for evaluating the oxidative stability of certain bakery products, such as baked goods containing chocolate. The palm oil samples were heated at 220°C for 10 min to mimic the conditions found during a typical baking process. The selected antioxidants were a free radical scavenger (tocopherol extract (TE), 0 and 500 mg/kg), an oxygen scavenger (ascorbyl palmitate (AP), 0 and 500 mg/kg), and a chelating agent (citric acid (CA), 0 and 300 mg/kg). These antioxidants were combined using a factorial design and were compared with a control sample not supplemented with antioxidants. Primary (peroxide value and lipid hydroperoxide content) and secondary (p-anisidine value, p-AnV) oxidation parameters were monitored over a period of 200 days in storage at room temperature. The combination of AP and CA was the most effective treatment in delaying the onset of oxidation. TE was not effective in preventing oxidation. The p-AnV did not increase during the storage period, indicating that this oxidation marker was not suitable for monitoring oxidation in this model.
Abstract:
We prove upper pointwise estimates for the Bergman kernel of the weighted Fock space of entire functions in $L^{2}(e^{-2\phi})$, where $\phi$ is a subharmonic function with $\Delta\phi$ a doubling measure. We derive estimates for the canonical solution operator to the inhomogeneous Cauchy-Riemann equation, and we characterize the compactness of this operator in terms of $\Delta\phi$.