871 results for Terrain traversability estimation


Relevance:

20.00%

Publisher:

Abstract:

This work evaluated the physicochemical, physical, chromatic, microbiological, and sensory stability of a non-dairy dessert made with soy, guava juice, and oligofructose over 60 days of refrigerated storage, and estimated its shelf life. Titratable acidity, pH, instrumental color, water activity, ascorbic acid, and physical stability were measured. Panelists (n = 50) from the campus community used a hedonic scale to assess the acceptance, purchase intent, creaminess, flavor, taste, acidity, color, and overall appearance of the dessert during the 60 days. The data showed that the parameters differed significantly (p < 0.05) from their initial values and could be fitted with mathematical equations with coefficients of determination above 71%, making them suitable for prediction purposes. Creaminess and acceptance did not differ statistically over the 60-day period; taste, flavor, and acidity kept suitable hedonic scores during storage. Moreover, the sample showed good physical stability against gravity separation and provided more than 15% of the Brazilian Daily Recommended Value of copper, iron, and ascorbic acid. The estimated shelf life of the product was 79 days, considering overall acceptance, acceptance index, and purchase intent.
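
The shelf-life figure above comes from fitting storage-time models to the measured attributes and extrapolating to an acceptability limit. The sketch below illustrates that kind of calculation with hypothetical hedonic scores and a hypothetical cutoff of 6 on a 9-point scale; it is not the paper's data or model.

```python
import numpy as np

days = np.array([0, 15, 30, 45, 60])
acceptance = np.array([7.8, 7.5, 7.2, 6.9, 6.6])    # hypothetical hedonic scores
slope, intercept = np.polyfit(days, acceptance, 1)  # linear decline over storage time
r_squared = np.corrcoef(days, acceptance)[0, 1] ** 2

cutoff = 6.0                                        # hypothetical acceptability limit
shelf_life = (cutoff - intercept) / slope           # day the fitted score hits the cutoff
print(f"R^2 = {r_squared:.2f}, estimated shelf life = {shelf_life:.0f} days")
```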

Relevance:

20.00%

Publisher:

Abstract:

Acetic acid was added at 0.2, 0.4, 0.6, 0.8, 1, and 1.5% w/w to a fish paste made with cooked Brazilian flathead (Percophis brasiliensis), glycerol (17%), sodium chloride (1.5%), and potassium sorbate (0.1%) in order to determine the relationship between added acetic acid and the sensorially perceived intensity, as well as the effects of combining sweet and acid tastes. Paired-comparison and ranking tests, structured verbal scales for the sweet and acid attributes, and a psychophysical test were carried out. A difference of 0.4 units in acid concentration was perceptible among samples. The samples indicated as sweeter by 89.47% of the judges were those with the lower acid concentration. A reduction in glycerol sweetness with increasing acid levels was observed: acetic acid reduced the sweetness of glycerol and, conversely, glycerol reduced the perceived acidity of acetic acid. The data obtained with the magnitude estimation test agree with Stevens' law.
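
Stevens' law models perceived intensity as a power function of stimulus concentration, S = k·I^n, which is linear in log-log coordinates. The sketch below fits that relation to magnitude-estimation data; the concentrations match the levels listed above, but the perceived intensities are invented for illustration and are not the paper's measurements.

```python
import numpy as np

acid = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.5])       # % w/w acetic acid
perceived = np.array([1.1, 2.3, 3.2, 4.4, 5.2, 7.9])  # hypothetical magnitude estimates

# Stevens' law S = k * I^n becomes log S = log k + n * log I, a simple linear fit.
n, log_k = np.polyfit(np.log(acid), np.log(perceived), 1)
print(f"exponent n = {n:.2f}, constant k = {np.exp(log_k):.2f}")
```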

Relevance:

20.00%

Publisher:

Abstract:

Simultaneous distillation-extraction (SDE) and headspace solid-phase microextraction (HS-SPME) combined with GC-FID and GC-MS were used to analyze volatile compounds from plum (Prunus domestica L. cv. Horvin) and to estimate the most odor-active compounds by application of odor activity values (OAV). The analyses led to the identification of 148 components, including 58 esters, 23 terpenoids, 14 aldehydes, 11 alcohols, 10 ketones, 9 alkanes, 7 acids, 4 lactones, 3 phenols, and 9 other compounds of different structures. According to the results of SDE-GC-MS, SPME-GC-MS, and OAV, ethyl 2-methylbutanoate, hexyl acetate, (E)-2-nonenal, ethyl butanoate, (E)-2-decenal, ethyl hexanoate, nonanal, decanal, (E)-β-ionone, γ-dodecalactone, (Z)-3-hexenyl acetate, pentyl acetate, linalool, γ-decalactone, butyl acetate, limonene, propyl acetate, δ-decalactone, diethyl sulfide, (E)-2-hexenyl acetate, ethyl heptanoate, (Z)-3-hexenol, (Z)-3-hexenyl hexanoate, eugenol, (E)-2-hexenal, ethyl pentanoate, hexyl 2-methylbutanoate, isopentyl hexanoate, 1-hexanol, γ-nonalactone, myrcene, octyl acetate, phenylacetaldehyde, 1-butanol, isobutyl acetate, (E)-2-heptenal, octadecanal, and nerol are characteristic odor-active compounds in fresh plums, since they showed concentrations far above their odor thresholds.
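
An odor activity value is simply the ratio of a compound's concentration to its odor threshold, with OAV > 1 taken as indicating a likely contribution to the aroma. A minimal sketch with placeholder numbers (not the paper's measured concentrations or published thresholds):

```python
# OAV = concentration / odor threshold; OAV > 1 is treated as odor-active.
# Concentrations and thresholds below are placeholders for illustration only.
thresholds_ug_kg = {"ethyl 2-methylbutanoate": 0.1, "hexyl acetate": 2.0, "linalool": 6.0}
concentrations_ug_kg = {"ethyl 2-methylbutanoate": 25.0, "hexyl acetate": 150.0, "linalool": 12.0}

for compound, conc in concentrations_ug_kg.items():
    oav = conc / thresholds_ug_kg[compound]
    status = "odor-active" if oav > 1 else "below threshold"
    print(f"{compound}: OAV = {oav:.1f} ({status})")
```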

Relevance:

20.00%

Publisher:

Abstract:

Most applications of airborne laser scanner data to forestry require that the point cloud be normalized, i.e., that each point represents height above the ground instead of elevation. To normalize the point cloud, a digital terrain model (DTM), derived from the ground returns in the point cloud, is employed. Unfortunately, extracting accurate DTMs from airborne laser scanner data is a challenging task, especially in tropical forests where the canopy is normally very thick (partially closed), so that only a limited number of laser pulses reach the ground. Robust algorithms for extracting accurate DTMs in low-ground-point-density situations are therefore needed in order to realize the full potential of airborne laser scanner data for forestry.

The objective of this thesis is to develop algorithms for processing airborne laser scanner data in order to: (1) extract DTMs in demanding forest conditions (complex terrain and a low number of ground points) for applications in forestry; (2) estimate canopy base height (CBH) for forest fire behavior modeling; and (3) assess the robustness of LiDAR-based high-resolution biomass estimation models against different field plot designs. Here, the aim is to find out whether field plot data gathered by professional foresters can be combined with field plot data gathered by professionally trained community foresters and used in LiDAR-based high-resolution biomass estimation modeling without affecting prediction performance. The question of interest in this case is whether local forest communities can achieve the level of technical proficiency required for accurate forest monitoring.

The algorithms for extracting DTMs from LiDAR point clouds presented in this thesis address the challenges of extracting DTMs in low-ground-point situations and in complex terrain, while the algorithm for CBH estimation addresses the challenge of variations in the distribution of points in the LiDAR point cloud caused by, for example, differences in tree species and in the season of data acquisition. These algorithms are adaptive with respect to point cloud characteristics and exhibit a high degree of tolerance to variations in the density and distribution of points in the LiDAR point cloud. Comparisons with existing DTM extraction algorithms showed that the algorithms proposed in this thesis performed better with respect to the accuracy of tree heights estimated from airborne laser scanner data. On the other hand, being mostly based on trend surface interpolation, the proposed DTM extraction algorithms cannot preserve small terrain features (e.g., bumps, small hills, and depressions). The DTMs generated by these algorithms are therefore only suitable for forestry applications whose primary objective is to estimate tree heights from normalized airborne laser scanner data. The algorithm for estimating CBH proposed in this thesis is, in turn, based on a moving voxel in which gaps (openings in the canopy) that act as fuel breaks are located and their heights are estimated. Test results showed a slight improvement in CBH estimation accuracy over existing methods based on height percentiles of the airborne laser scanner data. Being based on a moving voxel, however, this algorithm has one main advantage over existing CBH estimation methods in the context of forest fire modeling: it has great potential for providing information about vertical fuel continuity. This information can be used to create vertical fuel continuity maps, which can give a more realistic picture of crown fire risk than CBH alone.
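
The normalization step described at the start of the abstract is straightforward once a DTM exists: interpolate the ground elevation at each point's horizontal position and subtract it. The sketch below illustrates that step only (it is not the thesis's DTM extraction or CBH algorithms), using toy coordinates.

```python
# Minimal sketch of point cloud normalization: interpolate ground-return elevations
# to every point's (x, y) position and subtract, so z becomes height above ground.
import numpy as np
from scipy.interpolate import griddata

def normalize_point_cloud(points, ground_points):
    """points, ground_points: (N, 3) arrays of x, y, z (elevation)."""
    ground_z = griddata(ground_points[:, :2], ground_points[:, 2],
                        points[:, :2], method="linear")
    # Fall back to nearest neighbour where linear interpolation is undefined
    # (points outside the convex hull of the ground returns).
    nearest = griddata(ground_points[:, :2], ground_points[:, 2],
                       points[:, :2], method="nearest")
    ground_z = np.where(np.isnan(ground_z), nearest, ground_z)
    normalized = points.copy()
    normalized[:, 2] = points[:, 2] - ground_z
    return normalized

# Toy example: flat terrain at 100 m elevation, one treetop return at 118 m -> height 18 m.
ground = np.array([[0, 0, 100], [10, 0, 100], [0, 10, 100], [10, 10, 100]], float)
cloud = np.array([[5, 5, 118], [2, 3, 100.2]], float)
print(normalize_point_cloud(cloud, ground))
```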

Relevance:

20.00%

Publisher:

Abstract:

This paper presents a methodology for calculating the industrial equilibrium exchange rate, defined as the rate that enables exporters of state-of-the-art manufactured goods to be competitive abroad. The first section highlights the causes and problems of overvalued exchange rates, particularly the Dutch disease, which is neutralized when the exchange rate reaches the industrial equilibrium level. This level is defined by the ratio between the unit labor cost in the country under consideration and that in competing countries. Finally, the evolution of this exchange rate in the Brazilian economy is estimated.
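
Under the definition above, the calculation itself is a single ratio. A minimal sketch with hypothetical unit labor costs (the function name and the numbers are illustrative, not the paper's estimates):

```python
def industrial_equilibrium_rate(ulc_domestic, ulc_foreign):
    """Exchange rate (domestic currency per unit of foreign currency) given by the
    ratio of the domestic unit labor cost to that of competing countries."""
    return ulc_domestic / ulc_foreign

# Hypothetical example: a domestic ULC of 11.6 reais and a foreign ULC of 4.0 dollars
# per unit of output imply an industrial equilibrium rate of 2.9 reais per dollar.
print(industrial_equilibrium_rate(11.6, 4.0))
```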

Relevance:

20.00%

Publisher:

Abstract:

The aim of this paper is to discuss the overvaluation trend of the Brazilian currency in the 2000s, presenting an econometric model to estimate the real exchange rate (RER) and the reference level of the RER that should guide long-term economic policy. In the econometric model, we consider long-term structural and short-term components, both of which may help explain the overvaluation trend of the Brazilian currency. Our econometric exercise confirms that the Brazilian currency was persistently overvalued throughout almost all of the period under analysis, and we suggest that the long-term reference level of the real exchange rate was reached in 2004. In July 2014, the average nominal exchange rate should have been around 2.90 Brazilian reais per dollar (against an observed nominal rate of 2.22 reais per dollar) to reach the 2004 real reference level (yearly average). That is, according to our estimates, in July 2014 the Brazilian real was overvalued by 30.6 per cent in real terms relative to the reference level. Based on these findings, we conclude the paper by suggesting a mix of policy instruments that should have been used to reverse the overvaluation trend of the Brazilian real exchange rate, including a medium- and long-run real exchange rate target that would favor resource allocation toward more technology-intensive sectors.
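
The 30.6 per cent figure follows directly from the two rates quoted above: (2.90 − 2.22) / 2.22 ≈ 0.306, i.e. the observed rate was roughly 30.6 per cent stronger than the level consistent with the 2004 real reference.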

Relevance:

20.00%

Publisher:

Abstract:

Since its discovery, chaos has been a very interesting and challenging topic of research. Many great minds spent their entire lives trying to give some rules to it. Nowadays, thanks to the research of last century and the advent of computers, it is possible to predict chaotic phenomena of nature for a certain limited amount of time. The aim of this study is to present a recently discovered method for the parameter estimation of the chaotic dynamical system models via the correlation integral likelihood, and give some hints for a more optimized use of it, together with a possible application to the industry. The main part of our study concerned two chaotic attractors whose general behaviour is diff erent, in order to capture eventual di fferences in the results. In the various simulations that we performed, the initial conditions have been changed in a quite exhaustive way. The results obtained show that, under certain conditions, this method works very well in all the case. In particular, it came out that the most important aspect is to be very careful while creating the training set and the empirical likelihood, since a lack of information in this part of the procedure leads to low quality results.
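
At the core of the correlation integral likelihood is the correlation sum C(r) of Grassberger and Procaccia: the fraction of pairs of trajectory points lying within distance r of each other, evaluated over a set of radii and used as a summary statistic of the attractor. The sketch below computes C(r) for a Lorenz trajectory; the system, integration settings, and radii are illustrative rather than the thesis's setup.

```python
import numpy as np
from scipy.integrate import solve_ivp
from scipy.spatial.distance import pdist

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

def correlation_integral(traj, radii):
    """C(r): fraction of point pairs within distance r (Grassberger-Procaccia)."""
    d = pdist(traj)                                   # pairwise Euclidean distances
    return np.array([np.mean(d < r) for r in radii])

# Simulate a trajectory, discard the transient, and evaluate C(r) on log-spaced radii.
sol = solve_ivp(lorenz, (0.0, 100.0), [1.0, 1.0, 1.0],
                t_eval=np.linspace(0.0, 100.0, 5000))
traj = sol.y.T[1000:]
radii = np.logspace(-1, 1.5, 10)
print(correlation_integral(traj, radii))
```

In the method referred to above, such C(r) vectors computed from training data are used to build the empirical likelihood against which candidate parameter values are evaluated, which is why the abstract stresses the care needed in constructing the training set.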