952 results for Doubly robust estimation


Relevance: 20.00%

Abstract:

In numerous motor tasks, muscles around a joint act coactively to generate opposite torques. A variety of indexes based on electromyography signals have been presented in the literature to quantify muscle coactivation. However, it is not known how reliably coactivation can be estimated with such indexes. The goal of this study was to test the reliability of the estimation of muscle coactivation using electromyography. Isometric coactivation was obtained at various muscle activation levels. For this task, any coactivation measurement/index should yield the maximal score (100% coactivation). Two coactivation indexes were applied. In the first, the antagonistic muscle activity (the lower of the electromyographic signals from two muscles that generate opposite joint torques) is divided by the mean of the agonistic and antagonistic muscle activations. In the second, the ratio between antagonistic and agonistic muscle activation is calculated. Moreover, we computed these indexes under different electromyographic amplitude normalization procedures. It was found that the first algorithm, with all signals normalized by their respective maximal voluntary coactivation, generates the index closest to the true value (100%), reaching 92 ± 6%. In contrast, the coactivation index value was 82 ± 12% when the second algorithm was applied and the electromyographic signal was not normalized (P < 0.04). The new finding of the present study is that muscle coactivation is more reliably estimated if the EMG signals are normalized by their respective maximal voluntary contraction obtained during maximal coactivation before dividing the antagonistic muscle activity by the mean of the agonistic and antagonistic muscle activations.
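
A minimal sketch of the two indexes as described above (not code from the study), assuming emg_a and emg_b are already rectified and smoothed amplitude estimates of the two opposing muscles:

```python
import numpy as np

def coactivation_indexes(emg_a, emg_b):
    """Compute the two coactivation indexes from two EMG amplitude
    estimates of opposing muscles. Illustrative helper, not the
    study's code."""
    antagonist = np.minimum(emg_a, emg_b)  # lower of the two signals
    agonist = np.maximum(emg_a, emg_b)
    # Index 1: antagonist activity over the mean of both activations
    index1 = antagonist / ((agonist + antagonist) / 2.0)
    # Index 2: antagonist-to-agonist ratio
    index2 = antagonist / agonist
    return index1, index2

# With equal activations (perfect coactivation) both indexes equal 1 (100%);
# index1 stays closer to 1 when the two activations differ slightly.
i1, i2 = coactivation_indexes(np.array([0.9]), np.array([0.8]))
```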

Relevance: 20.00%

Abstract:

Fluid handling systems such as pump and fan systems have significant potential for energy efficiency improvements. To realize this potential, easily implementable methods are needed to monitor the system output, since such information is required both to identify inefficient operation of the fluid handling system and to control the output of the pumping system according to process needs. Model-based pump or fan monitoring methods implemented in variable speed drives have proven able to give information on the system output without additional metering; however, the current model-based methods may not be usable or sufficiently accurate across the whole operating range of the fluid handling device. To apply model-based system monitoring to a wider selection of systems and to improve the accuracy of the monitoring, this paper proposes a new method for pump and fan output monitoring with variable-speed drives. The method uses a combination of already known operating point estimation methods. Laboratory measurements are used to verify the benefits and applicability of the improved estimation method, and the new method is compared with five previously introduced model-based estimation methods. According to the laboratory measurements, the new estimation method is the most accurate and reliable of the model-based estimation methods.
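
The abstract does not detail how the known methods are combined; as a hedged sketch of one such ingredient method only (a QP-curve flow estimate scaled by the pump affinity laws), with invented curve data:

```python
import numpy as np

# Illustrative QP curve at nominal speed; all figures are made up.
N_NOM = 1450.0                                   # nominal speed, rpm
QP_FLOW = np.array([0., 10., 20., 30., 40.])     # flow, m^3/h
QP_POWER = np.array([2.0, 2.6, 3.1, 3.5, 3.7])   # shaft power, kW

def flow_from_qp(power_kw, speed_rpm):
    """Estimate flow from drive-estimated shaft power via the QP curve,
    scaled to the current speed with the affinity laws (P ~ n^3, Q ~ n)."""
    ratio = speed_rpm / N_NOM
    p_at_nominal = power_kw / ratio**3           # refer power to nominal speed
    q_at_nominal = np.interp(p_at_nominal, QP_POWER, QP_FLOW)
    return q_at_nominal * ratio                  # refer flow back to actual speed

# Where the QP curve is flat this estimate degrades, which is one
# motivation for combining several estimation methods.
q_est = flow_from_qp(power_kw=1.7, speed_rpm=1200.0)
```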

Relevance: 20.00%

Abstract:

Wind turbines based on doubly fed induction generators (DFIG) have become the most popular solution in the high-power wind generation industry. While this topology provides great performance with a reduced power rating of the power converter, it has a more complicated structure than full-rated topologies, which leads to more complex control algorithms and electromechanical processes in the system. The purpose of the presented study is to develop a proper vector control scheme for the DFIG and an overall control for the wind turbine (WT), and to investigate its behavior at different wind speeds and under different grid voltage conditions: voltage sags and magnitude and frequency variations. The key principles of a variable-speed wind turbine were implemented in a simulation model and demonstrated during the study. Then, based on the developed control scheme and mathematical model, a set of simulations was run to analyze the reactive power capabilities of the DFIG wind turbine. Further, the rating of the rotor-side converter was modified not only to generate rated active power but also to fulfill grid codes. Results of modeling and analysis of the DFIG WT behavior at different speeds and under different voltage conditions are presented in the work.
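
Such a vector control scheme typically regulates the rotor currents in a rotating dq frame with inner PI loops; the toy loop below (illustrative gains and per-unit limits, not the study's controller) shows the basic shape of one inner rotor-current loop:

```python
class PI:
    """Simple PI regulator with clamped-integrator anti-windup."""
    def __init__(self, kp, ki, limit):
        self.kp, self.ki, self.limit = kp, ki, limit
        self.integral = 0.0

    def step(self, error, dt):
        self.integral += self.ki * error * dt
        self.integral = max(-self.limit, min(self.limit, self.integral))
        out = self.kp * error + self.integral
        return max(-self.limit, min(self.limit, out))

# q-axis rotor current loop: iq_ref comes from the torque/active power
# reference, the output is a per-unit rotor voltage command.
iq_ctrl = PI(kp=0.5, ki=50.0, limit=1.0)
vq_ref = iq_ctrl.step(error=0.1 - 0.0, dt=1e-4)  # iq_ref - iq_measured
```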

Relevance: 20.00%

Abstract:

This work evaluates the physicochemical, physical, chromatic, microbiological, and sensorial stability of a non-dairy dessert elaborated with soy, guava juice, and oligofructose over 60 days of refrigerated storage, and estimates its shelf life. Titratable acidity, pH, instrumental color, water activity, ascorbic acid, and physical stability were measured. Panelists (n = 50) from the campus community used a hedonic scale to assess the acceptance, purchase intent, creaminess, flavor, taste, acidity, color, and overall appearance of the dessert during the 60 days. The data showed that the parameters differed significantly (p < 0.05) from the initial time and could be fitted to mathematical equations with coefficients of determination above 71%, making them suitable for prediction purposes. Creaminess and acceptance did not differ statistically over the 60-day period; taste, flavor, and acidity kept suitable hedonic scores during storage. Furthermore, the sample showed good physical stability against gravity and provided more than 15% of the Brazilian Daily Recommended Value of copper, iron, and ascorbic acid. The estimated shelf life of the product was 79 days, considering overall acceptance, acceptance index, and purchase intent.
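
A hedged sketch of how a shelf-life figure can be derived from fitted acceptance data: regress the hedonic score on storage time and solve for the day the score crosses an assumed acceptability cutoff. The numbers below are placeholders, not the study's data:

```python
import numpy as np

# Hypothetical mean hedonic scores over refrigerated storage (9-point scale).
days = np.array([0., 15., 30., 45., 60.])
acceptance = np.array([7.8, 7.5, 7.3, 7.0, 6.8])

slope, intercept = np.polyfit(days, acceptance, 1)  # linear decay model
cutoff = 6.0                                        # assumed acceptability limit
shelf_life = (cutoff - intercept) / slope           # days until score hits cutoff
```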

Relevance: 20.00%

Abstract:

Acetic acid was incorporated at 0.2, 0.4, 0.6, 0.8, 1, and 1.5% w/w into a fish paste made with cooked Brazilian flathead (Percophis brasiliensis), glycerol (17%), sodium chloride (1.5%), and potassium sorbate (0.1%) to determine the relationship between added acetic acid and the sensorially perceived intensity, and the effects of combining sweet and acid tastes. Paired-comparison tests, ranking, structured verbal scales for the sweet and acid attributes, and a psychophysical test were carried out. A difference of 0.4 units of acid concentration was perceptible among samples. The samples indicated as sweeter by 89.47% of the judges were those containing a lower acid concentration. A reduction in glycerol sweetness with increasing acid levels was observed: acetic acid reduced the sweetness of glycerol and, inversely, glycerol reduced the acidity of acetic acid. The data obtained with the magnitude estimation test agree with Stevens' law.
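
Stevens' law states that perceived intensity grows as a power of the stimulus, S = k·C^n, which is linear in log-log coordinates; a small sketch of fitting it to hypothetical magnitude-estimation data:

```python
import numpy as np

# Hypothetical data: acid concentration (% w/w) vs mean perceived sourness.
conc = np.array([0.2, 0.4, 0.6, 0.8, 1.0, 1.5])
sourness = np.array([3.1, 5.6, 7.8, 9.7, 11.5, 15.8])

# Fit log(S) = n*log(C) + log(k), i.e., the power-law exponent and constant.
n_exp, log_k = np.polyfit(np.log(conc), np.log(sourness), 1)
k = np.exp(log_k)
# With these toy numbers n_exp is about 0.8: perceived sourness grows
# sublinearly with concentration.
```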

Relevance: 20.00%

Abstract:

Simultaneous distillation-extraction (SDE) and headspace solid-phase microextraction (HS-SPME) combined with GC-FID and GC-MS were used to analyze volatile compounds from plum (Prunus domestica L. cv. Horvin) and to estimate the most odor-active compounds by application of odor activity values (OAV). The analyses led to the identification of 148 components, including 58 esters, 23 terpenoids, 14 aldehydes, 11 alcohols, 10 ketones, 9 alkanes, 7 acids, 4 lactones, 3 phenols, and 9 other compounds of different structures. According to the results of SDE-GC-MS, SPME-GC-MS, and OAV, ethyl 2-methylbutanoate, hexyl acetate, (E)-2-nonenal, ethyl butanoate, (E)-2-decenal, ethyl hexanoate, nonanal, decanal, (E)-β-ionone, γ-dodecalactone, (Z)-3-hexenyl acetate, pentyl acetate, linalool, γ-decalactone, butyl acetate, limonene, propyl acetate, δ-decalactone, diethyl sulfide, (E)-2-hexenyl acetate, ethyl heptanoate, (Z)-3-hexenol, (Z)-3-hexenyl hexanoate, eugenol, (E)-2-hexenal, ethyl pentanoate, hexyl 2-methylbutanoate, isopentyl hexanoate, 1-hexanol, γ-nonalactone, myrcene, octyl acetate, phenylacetaldehyde, 1-butanol, isobutyl acetate, (E)-2-heptenal, octadecanal, and nerol are characteristic odor-active compounds in fresh plums, since they showed concentrations far above their odor thresholds.
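
The odor activity value is simply concentration divided by odor threshold; a tiny sketch with placeholder figures (not the paper's measured values):

```python
# OAV = concentration / odor threshold; compounds with OAV >> 1
# contribute to the perceived aroma. All figures are placeholders.
compounds = {
    # name: (concentration in ug/kg, odor threshold in ug/kg)
    "ethyl 2-methylbutanoate": (50.0, 0.1),
    "hexyl acetate": (400.0, 2.0),
    "linalool": (30.0, 6.0),
}
oav = {name: c / t for name, (c, t) in compounds.items()}
odor_active = [name for name, v in oav.items() if v > 1.0]
```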

Relevance: 20.00%

Abstract:

Most applications of airborne laser scanner data to forestry require that the point cloud be normalized, i.e., that each point represents height above the ground instead of elevation. To normalize the point cloud, a digital terrain model (DTM), derived from the ground returns in the point cloud, is employed. Unfortunately, extracting accurate DTMs from airborne laser scanner data is a challenging task, especially in tropical forests where the canopy is normally very thick (partially closed), so that only a limited number of laser pulses reach the ground. Therefore, robust algorithms for extracting accurate DTMs in low-ground-point-density situations are needed in order to realize the full potential of airborne laser scanner data for forestry. The objective of this thesis is to develop algorithms for processing airborne laser scanner data in order to: (1) extract DTMs in demanding forest conditions (complex terrain and low numbers of ground points) for applications in forestry; (2) estimate canopy base height (CBH) for forest fire behavior modeling; and (3) assess the robustness of LiDAR-based high-resolution biomass estimation models against different field plot designs. Here, the aim is to find out whether field plot data gathered by professional foresters can be combined with field plot data gathered by professionally trained community foresters and used in LiDAR-based high-resolution biomass estimation modeling without affecting prediction performance. The question of interest in this case is whether the local forest communities can achieve the level of technical proficiency required for accurate forest monitoring. The algorithms for extracting DTMs from LiDAR point clouds presented in this thesis address the challenges of extracting DTMs in low-ground-point situations and in complex terrain, while the algorithm for CBH estimation addresses the challenge of variations in the distribution of points in the LiDAR point cloud caused by factors such as variations in tree species and the season of data acquisition. These algorithms are adaptive with respect to point cloud characteristics and exhibit a high degree of tolerance to variations in the density and distribution of points in the LiDAR point cloud. Comparisons with existing DTM extraction algorithms showed that the DTM extraction algorithms proposed in this thesis performed better with respect to the accuracy of estimating tree heights from airborne laser scanner data. On the other hand, the proposed DTM extraction algorithms, being mostly based on trend surface interpolation, cannot retain small artifacts in the terrain (e.g., bumps, small hills, and depressions). Therefore, the DTMs generated by these algorithms are only suitable for forestry applications where the primary objective is to estimate tree heights from normalized airborne laser scanner data. The algorithm for estimating CBH proposed in this thesis is based on the idea of a moving voxel, in which gaps (openings in the canopy) that act as fuel breaks are located and their heights estimated. Test results showed a slight improvement in CBH estimation accuracy over existing CBH estimation methods based on height percentiles in the airborne laser scanner data. However, being based on the idea of a moving voxel, this algorithm has one main advantage over existing CBH estimation methods in the context of forest fire modeling: it has great potential for providing information about vertical fuel continuity. This information can be used to create vertical fuel continuity maps, which can provide more realistic information on the risk of crown fires than CBH alone.
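
Normalizing a point cloud against a DTM amounts to subtracting the interpolated ground elevation from each return's elevation. A generic sketch of that step (not the thesis's DTM extraction algorithms), assuming Nx3 arrays of (x, y, z):

```python
import numpy as np
from scipy.interpolate import griddata

def normalize_point_cloud(points, ground_points):
    """Convert elevations to heights above ground: interpolate a DTM
    from classified ground returns and subtract it from each z."""
    ground_z = griddata(
        ground_points[:, :2], ground_points[:, 2],  # (x, y) -> z
        points[:, :2], method="linear",
    )
    # Points outside the ground-point hull come back NaN; a nearest-
    # neighbour fallback could fill them in a real pipeline.
    heights = points[:, 2] - ground_z
    return np.column_stack([points[:, :2], heights])
```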

Relevance: 20.00%

Abstract:

This paper presents a methodology for calculating the industrial equilibrium exchange rate, which is defined as the one enabling exporters of state-of-the-art manufactured goods to be competitive abroad. The first section highlights the causes and problems of overvalued exchange rates, particularly the Dutch disease issue, which is neutralized when the exchange rate strikes the industrial equilibrium level. This level is defined by the ratio between the unit labor cost in the country under consideration and in competing countries. Finally, the evolution of this exchange rate in the Brazilian economy is estimated.
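
In code-like shorthand, with invented figures rather than estimates for Brazil, the industrial equilibrium rate is just the unit labor cost ratio:

```python
# Industrial equilibrium exchange rate as the unit labor cost (ULC) ratio.
ulc_domestic = 18.0    # local currency per unit of output
ulc_competitors = 7.5  # foreign currency per unit of output
industrial_equilibrium_rate = ulc_domestic / ulc_competitors  # local per foreign, ~2.4
```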

Relevance: 20.00%

Abstract:

The aim of this paper is to discuss the trend of overvaluation of the Brazilian currency in the 2000s, presenting an econometric model to estimate the real exchange rate (RER) and to determine what reference level of the RER should guide long-term economic policy. In the econometric model, we consider long-term structural and short-term components, both of which may be responsible for the overvaluation trend of the Brazilian currency. Our econometric exercise confirms that the Brazilian currency was persistently overvalued throughout almost all of the period under analysis, and we suggest that the long-term reference level of the real exchange rate was reached in 2004. In July 2014, the average nominal exchange rate should have been around 2.90 Brazilian reais per dollar (against an observed nominal rate of 2.22 Brazilian reais per dollar) to achieve the 2004 real reference level (average of the year). That is, according to our estimates, in July 2014 the Brazilian real was overvalued by 30.6 per cent in real terms relative to the reference level. Based on these findings, we conclude the paper by suggesting a mix of policy instruments that should have been used to reverse the overvaluation trend of the Brazilian real exchange rate, including a target for the real exchange rate in the medium and long run that would favor resource allocation toward more technology-intensive sectors.
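
The 30.6 per cent figure follows directly from the two quoted rates:

```python
# Overvaluation implied by the quoted July 2014 rates.
observed = 2.22    # BRL/USD, July 2014 average
reference = 2.90   # estimated reference level (2004 real level)
overvaluation_pct = (reference - observed) / observed * 100  # ~30.6
```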

Relevance: 20.00%

Abstract:

Since its discovery, chaos has been a very interesting and challenging topic of research, and many great minds have spent their careers trying to give some rules to it. Nowadays, thanks to the research of the last century and the advent of computers, it is possible to predict chaotic phenomena of nature for a certain limited amount of time. The aim of this study is to present a recently introduced method for parameter estimation of chaotic dynamical system models via the correlation integral likelihood, to give some guidance on its optimal use, and to outline a possible industrial application. The main part of our study concerned two chaotic attractors whose general behaviors differ, in order to capture possible differences in the results. In the various simulations that we performed, the initial conditions were varied in a fairly exhaustive way. The results obtained show that, under certain conditions, this method works very well in all the cases considered. In particular, it emerged that the most important aspect is to be very careful when creating the training set and the empirical likelihood, since a lack of information in this part of the procedure leads to low-quality results.
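
The correlation integral underlying the likelihood is the fraction of trajectory point pairs closer than a radius r; a minimal sketch of the correlation sum itself (the likelihood construction on top of it is not shown):

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_sum(orbit, radii):
    """C(r) = 2/(N(N-1)) * #{(i, j), i < j : ||x_i - x_j|| < r},
    i.e., the fraction of point pairs of the orbit closer than r."""
    d = pdist(orbit)  # all pairwise distances
    return np.array([(d < r).mean() for r in radii])

# Toy example with a random 2-D "orbit".
rng = np.random.default_rng(0)
curve = correlation_sum(rng.random((500, 2)), radii=np.logspace(-2, 0, 20))
```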

Relevance: 20.00%

Abstract:

This thesis investigates the pricing of liquidity risks on the London Stock Exchange. The liquidity-adjusted capital asset pricing model (LCAPM) developed by Acharya and Pedersen (2005) is applied to test the influence of various liquidity risks on stock returns on the London Stock Exchange; the model provides a unified framework for testing liquidity risks. All common stocks listed and delisted over the period 2000 to 2014 are included in the data sample. The study incorporates three different measures of liquidity: percent quoted spread, the Amihud (2002) measure, and turnover. The reason for applying three different liquidity measures is the multi-dimensional nature of liquidity. A firm fixed effects panel regression is applied for the estimation of the LCAPM, and the results are robust to Fama-MacBeth regressions. The results of the study indicate that liquidity risks in the form of (i) the level of liquidity, (ii) commonality in liquidity, (iii) flight to liquidity, and (iv) the depressed wealth effect, as well as market return and aggregate liquidity risk, are priced on the London Stock Exchange. However, the results are sensitive to the choice of liquidity measures.
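
Of the three liquidity proxies, the Amihud (2002) measure is the average of absolute daily return per unit of traded value; a small sketch with placeholder inputs:

```python
import numpy as np

def amihud_illiquidity(returns, traded_value):
    """Amihud (2002) illiquidity: mean of |daily return| / daily traded
    value. Inputs here are placeholders, not thesis data."""
    return np.mean(np.abs(returns) / traded_value)

illiq = amihud_illiquidity(
    returns=np.array([0.01, -0.02, 0.005]),
    traded_value=np.array([1.2e6, 8.0e5, 1.5e6]),  # e.g., GBP per day
)
```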