913 results for instrumental variables
Abstract:
Variational data assimilation systems for numerical weather prediction rely on a transformation of model variables to a set of control variables that are assumed to be uncorrelated. Most implementations of this transformation are based on the assumption that the balanced part of the flow can be represented by the vorticity. However, this assumption is likely to break down in dynamical regimes characterized by low Burger number. It has recently been proposed that a variable transformation based on potential vorticity should lead to control variables that are uncorrelated over a wider range of regimes. In this paper we test the assumption that a transform based on vorticity and one based on potential vorticity produce an uncorrelated set of control variables. Using a shallow-water model we calculate the correlations between the transformed variables in the different methods. We show that the control variables resulting from a vorticity-based transformation may retain large correlations in some dynamical regimes, whereas a potential-vorticity-based transformation successfully produces a set of uncorrelated control variables. Calculations of spatial correlations show that the benefit of the potential vorticity transformation is linked to its ability to capture more accurately the balanced component of the flow.
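The correlation check described above can be illustrated with a minimal sketch: given two gridded fields (for instance a balanced field and a candidate unbalanced control variable), flatten them and compute a Pearson correlation. The fields below are synthetic placeholders, not shallow-water model output.

```python
import numpy as np

def field_correlation(a, b):
    """Pearson correlation between two 2-D fields, flattened to vectors."""
    a = np.ravel(a) - np.mean(a)
    b = np.ravel(b) - np.mean(b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Synthetic illustration: a "balanced" field and two candidate
# unbalanced residuals, one of which still leaks balanced signal.
rng = np.random.default_rng(0)
balanced = rng.standard_normal((64, 64))
residual_poor = 0.5 * balanced + rng.standard_normal((64, 64))  # retains correlation
residual_good = rng.standard_normal((64, 64))                   # ~uncorrelated

print(field_correlation(balanced, residual_poor))
print(field_correlation(balanced, residual_good))
```

A transformation that truly separates balanced and unbalanced components should drive the second kind of correlation toward zero across regimes.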
Abstract:
The objective of this study was to determine the potential of mid-infrared spectroscopy coupled with multidimensional statistical analysis for the prediction of processed cheese instrumental texture and meltability attributes. Processed cheeses (n = 32) of varying composition were manufactured in a pilot plant. Following two and four weeks' storage at 4 °C, samples were analysed using texture profile analysis, two meltability tests (computer vision; Olson and Price) and mid-infrared spectroscopy (4000–640 cm⁻¹). Partial least squares regression was used to develop predictive models for all measured attributes. Five attributes were successfully modelled with varying degrees of accuracy. The computer vision meltability model allowed for discrimination between high and low melt values (R² = 0.64). The hardness and springiness models gave approximate quantitative results (R² = 0.77), and the cohesiveness (R² = 0.81) and Olson and Price meltability (R² = 0.88) models gave good prediction results. © 2006 Elsevier Ltd. All rights reserved.
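The modelling method used here, partial least squares regression, can be sketched in a few lines. The single-component PLS1 (NIPALS) fit below is a toy version for intuition only; real spectroscopic calibrations use several latent components, which this sketch omits.

```python
import numpy as np

def pls1_fit(X, y):
    """One-component PLS1 via NIPALS: a minimal sketch of the idea
    behind PLS calibration, not a full multi-component model."""
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    w = Xc.T @ yc                   # covariance-driven weight vector
    w /= np.linalg.norm(w)
    t = Xc @ w                      # scores (one latent variable)
    q = (t @ yc) / (t @ t)          # regression of y on the score
    return x_mean, y_mean, w, q

def pls1_predict(model, X_new):
    x_mean, y_mean, w, q = model
    return y_mean + q * ((X_new - x_mean) @ w)
```

For spectra, each row of `X` would be an absorbance spectrum and `y` a measured attribute such as hardness or a meltability score.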
Abstract:
Three batches of oats were extruded under four combinations of process temperature (150 or 180 °C) and process moisture (14.5 or 18%). Two of the extrudates were evaluated by a sensory panel, and three were analyzed by GC-MS. Maillard reaction products, such as pyrazines, pyrroles, furans, and sulfur-containing compounds, were found in the most severely processed extrudates (high-temperature, low-moisture). These extrudates were also described by the assessors as having toasted cereal attributes. Lipid degradation products, such as alkanals, 2-alkenals, and 2,4-alkadienals, were found at much higher levels in the extrudates of the oat flour that had been debranned. It contained lower protein and fiber levels than the others and showed increased lipase activity. Extrudates from these samples also had significantly lower levels of Maillard reaction products, which correlated, in the sensory analysis, with terms such as stale oil and oatmeal. Linoleic acid was added to a fourth oat flour to simulate the result of increased lipase activity, and GC-MS analysis showed both an increase in lipid degradation products and a decrease in Maillard reaction products.
Abstract:
For forecasting and economic analysis many variables are used in logarithms (logs). In time series analysis, this transformation is often considered to stabilize the variance of a series. We investigate under which conditions taking logs is beneficial for forecasting. Forecasts based on the original series are compared to forecasts based on logs. For a range of economic variables, substantial forecasting improvements from taking logs are found if the log transformation actually stabilizes the variance of the underlying series. Using logs can be damaging for the forecast precision if a stable variance is not achieved.
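The condition described, that logs help only when they actually stabilize the variance, can be made concrete with a toy series whose noise is multiplicative, so its variability grows with its level (synthetic data, not one of the paper's economic series):

```python
import numpy as np

# An exponentially growing series with multiplicative noise: the raw
# series becomes more volatile as its level rises, while the log of
# the series has roughly constant local variability.
rng = np.random.default_rng(0)
t = np.arange(200)
x = np.exp(0.05 * t) * (1.0 + 0.1 * rng.standard_normal(200))

def diff_std(series):
    """Std of first differences, a rough local-variability measure."""
    return float(np.std(np.diff(series)))

# Second-half vs first-half variability, before and after taking logs.
ratio_levels = diff_std(x[100:]) / diff_std(x[:100])
ratio_logs = diff_std(np.log(x[100:])) / diff_std(np.log(x[:100]))
print(ratio_levels, ratio_logs)
```

For a series with additive, level-independent noise the same comparison would show no such stabilization, which is the paper's caveat: taking logs can then harm forecast precision.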
Abstract:
This study focuses on the mechanisms underlying water and heat transfer in upper soil layers, and their effects on soil physical prognostic variables and the individual components of the energy balance. The skill of the JULES (Joint UK Land Environment Simulator) land surface model (LSM) to simulate key soil variables, such as soil moisture content and surface temperature, and fluxes such as evaporation, is investigated. The Richards equation for soil water transfer, as used in most LSMs, was updated by incorporating isothermal and thermal water vapour transfer. The model was tested for three sites representative of semi-arid and temperate arid climates: the Jornada site (New Mexico, USA), Griffith site (Australia) and Audubon site (Arizona, USA). Water vapour flux was found to contribute significantly to the water and heat transfer in the upper soil layers. This was mainly due to isothermal vapour diffusion; thermal vapour flux also played a role at the Jornada site just after rainfall events. Inclusion of water vapour flux had an effect on the diurnal evolution of evaporation, soil moisture content and surface temperature. The incorporation of additional processes, such as water vapour flux among others, into LSMs may improve the coupling between the upper soil layers and the atmosphere, which in turn could increase the reliability of weather and climate predictions.
Abstract:
The North Atlantic oscillation (NAO) is, under current climate conditions, the leading mode of atmospheric circulation variability over the North Atlantic region. While the pattern is present during the entire year, it is most important during winter, when it explains a large part of the variability of the large-scale pressure field and is thus a major determinant of weather conditions over the North Atlantic basin and Western Europe. In this study, a review of recent literature on the basic understanding of the NAO, its variability on different time scales and its driving physical mechanisms is presented. In particular, the observed NAO variations and long-term trends are put into a long-term perspective by considering paleo-proxy evidence. A representative number of recently released NAO reconstructions are discussed. While the reconstructions agree reasonably well with observations during the instrumental overlapping period, there is rather high uncertainty among the different reconstructions for the pre-instrumental period, which leads to partially incoherent results, that is, periods where the NAO reconstructions do not agree even in sign. Finally, we highlight the future need for a broader definition of the NAO, assessment of the stability of the teleconnection centers over time, and analysis of the relations to other relevant variables such as temperature and precipitation, as well as of the relevant processes involved.
Abstract:
This study investigates transfer at the third-language (L3) initial state, testing between the following possibilities: (1) the first language (L1) transfer hypothesis (an L1 effect for all adult acquisition), (2) the second language (L2) transfer hypothesis, where the L2 blocks L1 transfer (often referred to in the recent literature as the ‘L2 status factor’; Williams and Hammarberg, 1998), and (3) the Cumulative Enhancement Model (Flynn et al., 2004), which proposes selective transfer from all previous linguistic knowledge. We provide data from successful English-speaking learners of L2 Spanish at the initial state of acquiring L3 French and L3 Italian relating to properties of the Null-Subject Parameter (e.g. Chomsky, 1981; Rizzi, 1982). We compare these groups to each other, as well as to groups of English learners of L2 French and L2 Italian at the initial state, and conclude that the data are consistent with the predictions of the ‘L2 status factor’. However, we discuss an alternative possible interpretation based on (psycho)typologically-motivated transfer (borrowing from Kellerman, 1983), providing a methodology for future research in this domain to meaningfully tease apart the ‘L2 status factor’ from this alternative account.
Abstract:
Forgetting immediate physical reality and being aware of one's location in the simulated world are critical to enjoyment and performance in virtual environments, be it an interactive 3D game such as Quake or an online virtual 3D community space such as Second Life. Answers to the question "where am I?" at two levels, whether the locus is in the immediate real world as opposed to the virtual world, and whether one is aware of the spatial co-ordinates of that locus, hold the key to any virtual 3D experience. While 3D environments, especially virtual environments, and their impact on spatial comprehension have been studied in disciplines such as architecture, it is difficult to determine the relative contributions of specific attributes such as screen size or stereoscopy towards spatial comprehension, since most of these studies treat the technology as a monolith (box-centered). Using the variable-centered approach put forth by Nass and Mason (1990), which breaks the technology down into its component variables and their corresponding values, as its theoretical basis, this paper examines the contributions of five variables (Stereoscopy, screen size, field of view, level of realism and level of detail) common to most virtual environments to spatial comprehension and presence. The variable-centered approach can be daunting, as an increase in the number of variables exponentially increases the number of conditions and resources required. We overcome this drawback by using a fractional factorial design for the experiment. This study has completed the first wave of data collection; the next phase starts in January 2007 and is expected to be completed by February 2007. Theoretical and practical implications of the study are discussed.
Abstract:
The relative contributions of five variables (Stereoscopy, screen size, field of view, level of realism and level of detail) of virtual reality systems on spatial comprehension and presence are evaluated here. Using a variable-centered approach instead of an object-centric view as its theoretical basis, the contributions of these five variables and their two-way interactions are estimated through a 2^(5−1) fractional factorial experiment (screening design) of resolution V with 84 subjects. The experiment design, procedure, measures used, creation of scales and indices, results of statistical analysis, their meaning and agenda for future research are elaborated.
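A 2^(5−1) design of resolution V, as used here, can be generated by crossing four factors fully and defining the fifth through the generator E = ABCD; the sketch below uses generic factor labels rather than the study's five variables.

```python
from itertools import product

# Build a 2^(5-1) fractional factorial: a full 2^4 design in factors
# A-D, with the fifth factor set by the generator E = ABCD. This halves
# the 32-run full factorial to 16 runs while keeping main effects and
# two-way interactions unaliased with each other (resolution V).
runs = []
for a, b, c, d in product((-1, 1), repeat=4):
    e = a * b * c * d              # generator E = ABCD
    runs.append((a, b, c, d, e))

print(len(runs))  # 16 runs instead of 32
```

Each tuple is one experimental condition, with -1 and +1 denoting the low and high level of each factor (e.g. small vs large screen).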
Abstract:
We consider forecasting with factors, variables and both, modeling in-sample using Autometrics so all principal components and variables can be included jointly, while tackling multiple breaks by impulse-indicator saturation. A forecast-error taxonomy for factor models highlights the impacts of location shifts on forecast-error biases. Forecasting US GDP over 1-, 4- and 8-step horizons using the dataset from Stock and Watson (2009) updated to 2011:2 shows factor models are more useful for nowcasting or short-term forecasting, but their relative performance declines as the forecast horizon increases. Forecasts for GDP levels highlight the need for robust strategies, such as intercept corrections or differencing, when location shifts occur as in the recent financial crisis.
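The factor-forecasting step, estimating factors as principal components of a large panel and regressing the target on them, can be sketched as follows. This is a bare-bones illustration; in the paper, factors and individual variables enter jointly, and Autometrics with impulse-indicator saturation handles selection and breaks, none of which is attempted here.

```python
import numpy as np

def principal_components(X, k):
    """First k principal components (factor estimates) of a T x N
    panel, computed via SVD of the demeaned data."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return U[:, :k] * s[:k]        # factor scores, T x k

def factor_forecast(y, F, h=1):
    """OLS of y_{t+h} on factors F_t, then forecast from the last
    observed factor values. A bare-bones direct h-step forecast."""
    Z = np.column_stack([np.ones(len(F) - h), F[:-h]])
    beta, *_ = np.linalg.lstsq(Z, y[h:], rcond=None)
    return float(np.r_[1.0, F[-1]] @ beta)
```

The sign and scale indeterminacy of principal components is harmless here, because the regression coefficients absorb any rescaling of the factors.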
Abstract:
We present an analysis of rapid Keck spectroscopy of the cataclysmic variables (CVs) AM Her (a polar) and SS Cyg (a dwarf nova). We decompose the spectra into constant and variable components and identify different types of variability in AM Her with different characteristic timescales. The variable flickering component of the accretion-disc flux and the observational characteristics of a small flare in SS Cyg are isolated.
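The constant/variable decomposition mentioned can be sketched generically: for a time series of spectra, the constant component is the time-averaged spectrum and the variable component is the rms deviation at each wavelength. This mean/rms split is a standard first approach; the actual analysis may differ in detail.

```python
import numpy as np

def constant_and_variable(spectra):
    """Split a (time x wavelength) stack of spectra into a constant
    (time-averaged) spectrum and an rms variability spectrum."""
    constant = spectra.mean(axis=0)
    variable = spectra.std(axis=0)
    return constant, variable

# Synthetic demo: a flat continuum plus an emission line at pixel 30
# whose strength varies from spectrum to spectrum.
rng = np.random.default_rng(0)
spectra = np.ones((50, 100))
spectra[:, 30] += 2.0 + rng.standard_normal(50)
constant, variable = constant_and_variable(spectra)
```

The rms spectrum highlights which wavelengths (here, the varying line) drive the variability, while the mean spectrum captures the steady emission.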
Abstract:
The flavour profiles of two genotypes of Charentais cantaloupe melons (medium shelf-life and long shelf-life), harvested at two distinct maturities (immature and mature fruit), were investigated. Dynamic headspace extraction (DHE), solid-phase extraction (SPE), gas chromatography–mass spectrometry (GC-MS) and gas chromatography–olfactometry/mass spectrometry (GC-O/MS) were used to determine volatile and semi-volatile compounds. Qualitative descriptive analysis (QDA) was used to assess the organoleptic impact of the different melons and the sensory data were correlated with the chemical analysis. There were significant, consistent and substantial differences between the mature and immature fruit for the medium shelf-life genotype, the less mature giving a green, cucumber character and lacking the sweet, fruity character of the mature fruit. However, maturity at harvest had a much smaller impact on the long shelf-life melons and fewer differences were detected. These long shelf-life melons tasted sweet, but lacked fruity flavours, instead exhibiting a musty, earthy character.
Abstract:
Bayesian analysis is given of an instrumental variable model that allows for heteroscedasticity in both the structural equation and the instrument equation. Specifically, the approach for dealing with heteroscedastic errors in Geweke (1993) is extended to the Bayesian instrumental variable estimator outlined in Rossi et al. (2005). Heteroscedasticity is treated by modelling the variance for each error using a hierarchical prior that is Gamma distributed. The computation is carried out by using a Markov chain Monte Carlo sampling algorithm with an augmented draw for the heteroscedastic case. An example using real data illustrates the approach and shows that ignoring heteroscedasticity in the instrument equation when it exists may lead to biased estimates.
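For context, the classical (non-Bayesian) counterpart of this estimator is two-stage least squares; the sketch below shows that baseline with homoscedastic synthetic data. The paper's approach instead places hierarchical Gamma priors on the error variances and samples by MCMC, which is not reproduced here.

```python
import numpy as np

def two_sls(y, x, z):
    """Two-stage least squares with one endogenous regressor x and one
    instrument z (intercepts included). A frequentist baseline only;
    the estimator discussed in the abstract is Bayesian."""
    Z = np.column_stack([np.ones_like(z), z])
    X = np.column_stack([np.ones_like(x), x])
    # First stage: project the endogenous regressor on the instrument.
    X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
    # Second stage: regress y on the fitted values.
    beta, *_ = np.linalg.lstsq(X_hat, y, rcond=None)
    return beta                                 # [intercept, slope]

# Synthetic data with endogeneity: the confounder u moves both x and y,
# so OLS is biased upward while the instrument z restores consistency.
rng = np.random.default_rng(0)
n = 5000
z = rng.standard_normal(n)
u = rng.standard_normal(n)
x = 0.8 * z + u + rng.standard_normal(n)
y = 1.0 + 2.0 * x + u + rng.standard_normal(n)
beta_iv = two_sls(y, x, z)
print(beta_iv)
```

With a valid instrument, the 2SLS slope estimate lands near the true value of 2, while a plain OLS fit on the same data overstates it.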