993 results for Bayesian Estimation
Abstract:
This work aimed to evaluate the physicochemical, physical, chromatic, microbiological, and sensory stability of a non-dairy dessert made with soy, guava juice, and oligofructose over 60 days of refrigerated storage, as well as to estimate its shelf life. Titratable acidity, pH, instrumental color, water activity, ascorbic acid, and physical stability were measured. Panelists (n = 50) from the campus community used a hedonic scale to assess the acceptance, purchase intent, creaminess, flavor, taste, acidity, color, and overall appearance of the dessert during the 60 days. The data showed that these parameters differed significantly (p < 0.05) from the initial time and could be fitted to mathematical equations with coefficients of determination above 71%, making them suitable for prediction purposes. Creaminess and acceptance did not differ statistically over the 60-day period, and taste, flavor, and acidity kept suitable hedonic scores during storage. Moreover, the sample showed good physical stability against gravity and provided more than 15% of the Brazilian Daily Recommended Value of copper, iron, and ascorbic acid. The estimated shelf life of the product was 79 days, considering overall acceptance, acceptance index, and purchase intent.
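The shelf-life estimation described above rests on fitting a degradation equation to an attribute over storage time and extrapolating to an acceptability cutoff. A minimal sketch of that idea, assuming a linear model and using invented hedonic scores (not the study's data):

```python
# Hypothetical illustration: fit a straight line to hedonic scores over
# storage time and extrapolate to an assumed acceptability cutoff.
# All numbers below are invented for the sketch.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

days = [0, 15, 30, 45, 60]
overall_acceptance = [7.8, 7.5, 7.3, 7.0, 6.8]   # invented hedonic scores

a, b = fit_line(days, overall_acceptance)
cutoff = 6.5                                     # assumed acceptability limit
shelf_life = (cutoff - a) / b                    # day the fit crosses the cutoff
print(round(shelf_life))                         # → 77
```

In the study itself several attributes were modeled, and the shelf life was taken from the acceptance-related ones; this sketch shows the mechanics for a single attribute only.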
Abstract:
In a fish paste made with cooked Brazilian flathead (Percophis brasiliensis), glycerol (17%), sodium chloride (1.5%), and potassium sorbate (0.1%), acetic acid was incorporated at 0.2, 0.4, 0.6, 0.8, 1.0, and 1.5% w/w to determine the relationship between added acetic acid and sensorially perceived intensity, as well as the effects of combining sweet and acid tastes. Paired-comparison tests, ranking tests, structured verbal scales for the sweet and acid attributes, and a psychophysical test were carried out. A perceptible difference among samples was found for differences of 0.4 units in acid concentration. The samples indicated as sweeter by 89.47% of the judges were those containing a lower acid concentration. A reduction in glycerol sweetness with increasing acid levels was observed: acetic acid reduced the sweetness of glycerol and, conversely, glycerol reduced the acidity of acetic acid. The data obtained with the magnitude estimation test agree with Stevens' law.
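Stevens' law states that perceived intensity grows as a power of stimulus concentration, ψ = k·φⁿ, so magnitude-estimation data can be checked by a linear fit in log-log space. A sketch under that assumption, with invented panel responses (only the acid concentrations match the abstract):

```python
# Fit Stevens' power law psi = k * phi**n by least squares on logs.
# The perceived-intensity values are invented placeholders.
import math

conc = [0.2, 0.4, 0.6, 0.8, 1.0, 1.5]           # % w/w acetic acid
perceived = [3.1, 5.4, 7.2, 8.9, 10.3, 13.8]    # invented magnitude estimates

lx = [math.log(c) for c in conc]
ly = [math.log(p) for p in perceived]
n_pts = len(lx)
mx = sum(lx) / n_pts
my = sum(ly) / n_pts
# Slope of log(psi) on log(phi) is the Stevens exponent n.
n_exp = (sum((x - mx) * (y - my) for x, y in zip(lx, ly))
         / sum((x - mx) ** 2 for x in lx))
k = math.exp(my - n_exp * mx)
print(f"exponent n = {n_exp:.2f}, constant k = {k:.2f}")
```

A straight log-log fit with a stable exponent is what "agreement with Stevens' law" amounts to operationally; the exponent reported by a real panel would of course differ from this toy value.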
Abstract:
Simultaneous distillation-extraction (SDE) and headspace solid-phase microextraction (HS-SPME) combined with GC-FID and GC-MS were used to analyze volatile compounds from plum (Prunus domestica L. cv. Horvin) and to estimate the most odor-active compounds by application of odor activity values (OAV). The analyses led to the identification of 148 components, including 58 esters, 23 terpenoids, 14 aldehydes, 11 alcohols, 10 ketones, 9 alkanes, 7 acids, 4 lactones, 3 phenols, and 9 other compounds of different structures. According to the results of SDE-GC-MS, SPME-GC-MS, and OAV, ethyl 2-methylbutanoate, hexyl acetate, (E)-2-nonenal, ethyl butanoate, (E)-2-decenal, ethyl hexanoate, nonanal, decanal, (E)-β-ionone, γ-dodecalactone, (Z)-3-hexenyl acetate, pentyl acetate, linalool, γ-decalactone, butyl acetate, limonene, propyl acetate, δ-decalactone, diethyl sulfide, (E)-2-hexenyl acetate, ethyl heptanoate, (Z)-3-hexenol, (Z)-3-hexenyl hexanoate, eugenol, (E)-2-hexenal, ethyl pentanoate, hexyl 2-methylbutanoate, isopentyl hexanoate, 1-hexanol, γ-nonalactone, myrcene, octyl acetate, phenylacetaldehyde, 1-butanol, isobutyl acetate, (E)-2-heptenal, octadecanal, and nerol are characteristic odor-active compounds in fresh plums, since they showed concentrations far above their odor thresholds.
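An odor activity value is simply the measured concentration of a compound divided by its odor threshold; compounds with OAV > 1 are counted as odor-active. A minimal sketch of that calculation, using invented placeholder numbers rather than the plum data:

```python
# OAV = concentration / odor threshold. Values below are invented
# placeholders, not the measured plum concentrations or thresholds.

thresholds_ug_kg = {              # assumed odor thresholds (µg/kg)
    "ethyl 2-methylbutanoate": 0.1,
    "hexyl acetate": 2.0,
    "linalool": 6.0,
}
concentrations_ug_kg = {          # invented concentrations (µg/kg)
    "ethyl 2-methylbutanoate": 45.0,
    "hexyl acetate": 120.0,
    "linalool": 18.0,
}

oav = {name: concentrations_ug_kg[name] / thresholds_ug_kg[name]
       for name in thresholds_ug_kg}
# Rank the odor-active compounds (OAV > 1) by decreasing OAV.
odor_active = sorted((n for n, v in oav.items() if v > 1),
                     key=oav.get, reverse=True)
print(odor_active)
```

The ranking, not the raw concentration, is what identifies the "most odor-active" compounds: a trace compound with a very low threshold can outrank an abundant one.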
Abstract:
This paper presents a methodology for calculating the industrial equilibrium exchange rate, which is defined as the one enabling exporters of state-of-the-art manufactured goods to be competitive abroad. The first section highlights the causes and problems of overvalued exchange rates, particularly the Dutch disease issue, which is neutralized when the exchange rate strikes the industrial equilibrium level. This level is defined by the ratio between the unit labor cost in the country under consideration and in competing countries. Finally, the evolution of this exchange rate in the Brazilian economy is estimated.
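The industrial equilibrium level is defined above as the ratio between domestic and foreign unit labor costs. A toy computation under that definition, with all figures invented for illustration:

```python
# Industrial equilibrium exchange rate as the ratio of unit labor costs
# (ULC = wage per hour / output per hour). All numbers are invented.

wage_domestic = 30.0            # domestic currency per hour (assumed)
productivity_domestic = 12.0    # units of output per hour (assumed)
wage_foreign = 20.0             # foreign currency per hour (assumed)
productivity_foreign = 16.0     # units of output per hour (assumed)

ulc_domestic = wage_domestic / productivity_domestic   # domestic cost/unit
ulc_foreign = wage_foreign / productivity_foreign      # foreign cost/unit

# Domestic currency units per foreign unit that equalize unit labor
# costs, making state-of-the-art manufacturers competitive abroad.
industrial_rate = ulc_domestic / ulc_foreign
print(industrial_rate)   # → 2.0
```

If the market rate sits below this ratio (currency overvalued), domestic manufacturers are priced out abroad even when their productivity-adjusted costs are competitive, which is the Dutch-disease mechanism the paper describes.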
Abstract:
The aim of this paper is to discuss the trend of overvaluation of the Brazilian currency in the 2000s by presenting an econometric model to estimate the real exchange rate (RER) and to determine a reference level of the RER to guide long-term economic policy. In the econometric model, we consider long-term structural and short-term components, both of which may be responsible for the overvaluation trend of the Brazilian currency. Our econometric exercise confirms that the Brazilian currency was persistently overvalued throughout almost the entire period under analysis, and we suggest that the long-term reference level of the real exchange rate was reached in 2004. In July 2014, the average nominal exchange rate should have been around 2.90 Brazilian reais per dollar (against an observed nominal rate of 2.22 Brazilian reais per dollar) to achieve the 2004 real reference level (average of the year). That is, according to our estimates, in July 2014 the Brazilian real was overvalued by 30.6 per cent in real terms relative to the reference level. Based on these findings, we conclude the paper by suggesting a mix of policy instruments that should have been used to reverse the overvaluation trend of the Brazilian real exchange rate, including a medium- and long-run target for the real exchange rate that would favor resource allocation toward more technology-intensive sectors.
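The 30.6 per cent figure follows directly from the two exchange rates quoted in the abstract, measured as the gap between the reference rate and the observed rate relative to the observed rate:

```python
# Overvaluation as the percentage gap between the estimated reference
# rate and the observed nominal rate (both BRL/USD, from the abstract).

reference_rate = 2.90   # rate needed to restore the 2004 real level
observed_rate = 2.22    # average nominal rate, July 2014

overvaluation = (reference_rate - observed_rate) / observed_rate * 100
print(f"{overvaluation:.1f}%")   # → 30.6%
```
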
Abstract:
Since its discovery, chaos has been a very interesting and challenging topic of research, and many great minds have spent their careers trying to bring order to it. Nowadays, thanks to the research of the last century and the advent of computers, it is possible to predict chaotic natural phenomena for a limited amount of time. The aim of this study is to present a recently developed method for parameter estimation of chaotic dynamical system models via the correlation integral likelihood, to give some hints for its more effective use, and to outline a possible industrial application. The main part of our study concerned two chaotic attractors with different general behaviour, in order to capture possible differences in the results. In the various simulations we performed, the initial conditions were varied in a fairly exhaustive way. The results show that, under certain conditions, the method works very well in all cases. In particular, the most important aspect is to be very careful when creating the training set and the empirical likelihood, since a lack of information in this part of the procedure leads to low-quality results.
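At the core of the correlation integral likelihood is the correlation sum C(r): the fraction of pairs of orbit points closer together than a radius r, evaluated over many radii to summarize an attractor's geometry. A minimal, self-contained sketch of C(r) for a short invented one-dimensional orbit (a real application would use trajectories of an attractor such as Lorenz and build an empirical likelihood from many such curves):

```python
# Correlation sum C(r): fraction of point pairs within distance r.
# The orbit below is an invented 1-D placeholder for illustration.

def correlation_sum(points, r):
    n = len(points)
    close = sum(1 for i in range(n) for j in range(i + 1, n)
                if abs(points[i] - points[j]) < r)
    return 2.0 * close / (n * (n - 1))   # normalize by number of pairs

orbit = [0.1, 0.4, 0.35, 0.8, 0.75, 0.2]   # invented orbit samples
print(correlation_sum(orbit, 0.1))
```

In the likelihood method, C(r) computed from a training set of simulated orbits gives the reference distribution against which candidate parameter values are scored, which is why the abstract stresses that an information-poor training set degrades the results.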
Abstract:
A new approach to treating large-Z systems by quantum Monte Carlo has been developed. It naturally leads to the notion of the 'valence energy'. The possibilities of the new approach have been explored by optimizing the wave functions for CuH and Cu and computing the dissociation energy and dipole moment of CuH using variational Monte Carlo. The dissociation energy obtained is about 40% smaller than the experimental value; in this respect the method is comparable with SCF and simple pseudopotential calculations. The dipole moment differs from the best theoretical estimate by about 50%, which is again comparable with other methods (complete active space SCF and pseudopotential methods).
Abstract:
Our objective is to develop a diffusion Monte Carlo (DMC) algorithm to estimate the exact expectation values, ⟨Ψ0|A|Ψ0⟩, of multiplicative operators A, such as polarizabilities and high-order hyperpolarizabilities, for isolated atoms and molecules. The existing forward-walking pure diffusion Monte Carlo (FW-PDMC) algorithm, which attempts this, has a serious bias. On the other hand, the DMC algorithm with minimal stochastic reconfiguration provides unbiased estimates of the energies, but the expectation values ⟨Ψ0|A|ΨT⟩ are contaminated by ΨT, a user-specified approximate wave function, when A does not commute with the Hamiltonian. We modified the latter algorithm to obtain the exact expectation values for these operators while at the same time eliminating the bias. To compare the efficiency of the FW-PDMC and modified DMC algorithms, we calculated simple properties of the H atom, such as various functions of the coordinates and polarizabilities. Using three non-exact wave functions, one of moderate quality and the others very crude, in each case the results are within statistical error of the exact values.
Abstract:
The purpose of this study is to examine the impact of the choice of cut-off points, sampling procedures, and the business cycle on the accuracy of bankruptcy prediction models. Misclassification can result in erroneous predictions leading to prohibitive costs to firms, investors, and the economy. To test the impact of the choice of cut-off points and sampling procedures, three bankruptcy prediction models are assessed: Bayesian, Hazard, and Mixed Logit. A salient feature of the study is that the analysis includes both parametric and nonparametric bankruptcy prediction models. A sample of firms from the Lynn M. LoPucki Bankruptcy Research Database in the U.S. was used to evaluate the relative performance of the three models. The choice of cut-off point and sampling procedures were found to affect the rankings of the various models. In general, the results indicate that the empirical cut-off point estimated from the training sample resulted in the lowest misclassification costs for all three models. Although the Hazard and Mixed Logit models resulted in lower misclassification costs in the randomly selected samples, the Mixed Logit model did not perform as well across varying business cycles. In general, the Hazard model has the highest predictive power. However, the higher predictive power of the Bayesian model when the ratio of the cost of Type I errors to the cost of Type II errors is high is relatively consistent across all sampling methods. Such an advantage may make the Bayesian model more attractive in the current economic environment. This study extends recent research comparing the performance of bankruptcy prediction models by identifying the conditions under which a model performs better. It also addresses the concerns of a range of user groups, including auditors, shareholders, employees, suppliers, rating agencies, and creditors, with respect to assessing failure risk.
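An "empirical cut-off point estimated from the training sample" means scanning candidate thresholds on the model's predicted probabilities and keeping the one that minimizes total misclassification cost, with Type I errors (a bankrupt firm classified as healthy) weighted more heavily than Type II. A sketch of that search under an assumed 10:1 cost ratio, with invented probabilities and labels:

```python
# Choose the cut-off on predicted bankruptcy probabilities that
# minimizes total misclassification cost on a training sample.
# Probabilities, labels, and the 10:1 cost ratio are all invented.

probs  = [0.05, 0.10, 0.30, 0.40, 0.55, 0.70, 0.85, 0.90]
labels = [0,    0,    0,    1,    0,    1,    1,    1]     # 1 = bankrupt
COST_TYPE_I, COST_TYPE_II = 10.0, 1.0

def total_cost(cutoff):
    cost = 0.0
    for p, y in zip(probs, labels):
        predicted_bankrupt = p >= cutoff
        if y == 1 and not predicted_bankrupt:
            cost += COST_TYPE_I    # Type I: bankrupt firm missed
        elif y == 0 and predicted_bankrupt:
            cost += COST_TYPE_II   # Type II: healthy firm flagged
    return cost

# Only the observed scores need to be tried as candidate cut-offs.
best = min(probs, key=total_cost)
print(best, total_cost(best))
```

Because Type I errors dominate the cost, the optimal cut-off sits low enough to catch every bankrupt firm in this toy sample; a naive 0.5 threshold would be far costlier, which is the study's point about cut-off choice affecting model rankings.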