846 results for C53 - Forecasting and Other Model Applications
Abstract:
Forecasting tourism demand is crucial for management decisions in the tourism sector. Estimating a vector autoregressive (VAR) model for monthly visitor arrivals, disaggregated by three entry points in Cambodia, for the years 2006–2015, I forecast the number of arrivals for the years 2016 and 2017. The results show that the VAR model fits the data on visitor arrivals for each entry point well. Ex post forecasting shows that the forecasts closely match the observed data for visitor arrivals, supporting the forecasting accuracy of the VAR model. Visitor arrivals at the Siem Reap and Phnom Penh airports are forecast to increase steadily in future periods, with varying fluctuations across months and across the origin countries of foreign tourists.
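The VAR estimation and forecast described above can be reproduced in outline with standard time-series tooling. The sketch below is illustrative only and assumes a hypothetical CSV of monthly arrivals with one column per entry point; it is not the author's code.

```python
# Minimal sketch (not the paper's code): fit a VAR to monthly arrivals by entry
# point and forecast 24 months ahead. File name and column names are hypothetical.
import pandas as pd
from statsmodels.tsa.api import VAR

arrivals = pd.read_csv("arrivals_by_entry_point.csv",   # hypothetical file
                       index_col="month", parse_dates=True)
# e.g. columns: ["siem_reap_airport", "phnom_penh_airport", "other_entry"]

train = arrivals.loc["2006":"2015"]
results = VAR(train).fit(maxlags=12, ic="aic")           # lag order chosen by AIC
forecast = results.forecast(train.values[-results.k_ar:], steps=24)  # 2016-2017
```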
Abstract:
We study the problem of measuring the uncertainty of CGE (or RBC)-type model simulations associated with parameter uncertainty. We describe two approaches for building confidence sets on model endogenous variables. The first one uses a standard Wald-type statistic. The second approach assumes that a confidence set (sampling or Bayesian) is available for the free parameters, from which confidence sets are derived by a projection technique. The latter has two advantages: first, confidence set validity is not affected by model nonlinearities; second, we can easily build simultaneous confidence intervals for an unlimited number of variables. We study conditions under which these confidence sets take the form of intervals and show they can be implemented using standard methods for solving CGE models. We present an application to a CGE model of the Moroccan economy to study the effects of policy-induced increases of transfers from Moroccan expatriates.
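The projection technique described above amounts to propagating a confidence set for the free parameters through the model and taking the range of each endogenous variable. The sketch below is a minimal illustration of that idea, not the authors' implementation; `solve_cge` and `param_confidence_set` are hypothetical placeholders.

```python
# Sketch of projection-based simultaneous intervals (not the authors' code).
# `solve_cge(theta)` is a hypothetical function returning a dict of endogenous
# variables for a parameter vector theta; `param_confidence_set` is assumed to
# be an iterable of parameter draws covering the confidence set.
import numpy as np

def projection_intervals(param_confidence_set, solve_cge):
    values = {}                                   # endogenous name -> list of values
    for theta in param_confidence_set:
        solution = solve_cge(theta)               # solve the CGE model at theta
        for name, value in solution.items():
            values.setdefault(name, []).append(value)
    # Simultaneous intervals: the image of the parameter confidence set under the model.
    return {name: (np.min(v), np.max(v)) for name, v in values.items()}
```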
Abstract:
Recent work shows that a low correlation between the instruments and the included variables leads to serious inference problems. We extend the local-to-zero analysis of models with weak instruments to models with estimated instruments and regressors and with higher-order dependence between instruments and disturbances. This makes this framework applicable to linear models with expectation variables that are estimated non-parametrically. Two examples of such models are the risk-return trade-off in finance and the impact of inflation uncertainty on real economic activity. Results show that inference based on Lagrange Multiplier (LM) tests is more robust to weak instruments than Wald-based inference. Using LM confidence intervals leads us to conclude that no statistically significant risk premium is present in returns on the S&P 500 index, excess holding yields between 6-month and 3-month Treasury bills, or in yen-dollar spot returns.
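Weak-instrument-robust inference of the kind favoured above is typically obtained by inverting an identification-robust test over a grid of hypothesised coefficient values. The sketch below illustrates this with an Anderson-Rubin-style LM statistic (n·R² from regressing the restricted residual on the instruments); it is a generic textbook construction, not the authors' exact statistic, and `y`, `x`, `Z`, and the grid are placeholders.

```python
# Illustrative sketch only: weak-instrument-robust confidence interval for a
# scalar coefficient beta, built by inverting an LM-type test on a grid.
import numpy as np
from scipy import stats

def lm_confidence_interval(y, x, Z, grid, level=0.95):
    n, k = Z.shape
    Zc = np.column_stack([np.ones(n), Z])          # instruments plus constant
    crit = stats.chi2.ppf(level, df=k)             # k exclusion restrictions
    kept = []
    for b0 in grid:
        u = y - b0 * x                             # residual under H0: beta = b0
        coef, *_ = np.linalg.lstsq(Zc, u, rcond=None)
        r2 = 1.0 - np.sum((u - Zc @ coef) ** 2) / np.sum((u - u.mean()) ** 2)
        if n * r2 < crit:                          # LM statistic n*R^2 not rejected
            kept.append(b0)
    return (min(kept), max(kept)) if kept else None
```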
Abstract:
A series of CCSD(T) single-point calculations on MP4(SDQ) geometries and the W1 model chemistry method have been used to calculate ΔH° and ΔG° values for 17 gas-phase deprotonation reactions for which the experimental values have reported accuracies within 1 kcal/mol. These values have been compared with previous calculations using the G3 and CBS model chemistries and two DFT methods. The most accurate CCSD(T) calculations use the aug-cc-pVQZ basis set. Extrapolation of the aug-cc-pVTZ and aug-cc-pVQZ results yields the best agreement with experiment, with a standard deviation of 0.58 kcal/mol for ΔG° and 0.70 kcal/mol for ΔH°. Standard deviations from experiment for ΔG° and ΔH° for the W1 method are 0.95 and 0.83 kcal/mol, respectively. The G3 and CBS-APNO results are competitive with W1 and are much less expensive. Any of the model chemistry methods, or the CCSD(T)/aug-cc-pVQZ method, can serve as a valuable check on the accuracy of experimental data reported in the National Institute of Standards and Technology (NIST) database.
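The abstract does not state which extrapolation formula was used; a commonly applied two-point scheme for correlation energies assumes an inverse-cubic dependence on the basis-set cardinal number, which for the aug-cc-pVTZ/aug-cc-pVQZ pair gives:

```latex
% Assuming E_X = E_CBS + A X^{-3}, with X = 3 (aug-cc-pVTZ) and X = 4 (aug-cc-pVQZ);
% the paper's exact extrapolation scheme may differ.
E_{\mathrm{CBS}} \;=\; \frac{4^{3}\,E_{\mathrm{QZ}} - 3^{3}\,E_{\mathrm{TZ}}}{4^{3} - 3^{3}}
               \;=\; \frac{64\,E_{\mathrm{QZ}} - 27\,E_{\mathrm{TZ}}}{37}
```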
Abstract:
Light emitting polymers (LEPs) are considered the second generation of conducting polymers. A prototype LEP device based on the electroluminescence of poly(p-phenylenevinylene) (PPV) was first assembled in 1990, and LEPs have progressed tremendously over the past 20 years. The development of new LEP derivatives is important because polymer light emitting diodes (PLEDs) can be used to manufacture next-generation displays and other optoelectronic applications such as lasers, photovoltaic cells and sensors. In this context, it is important to understand the thermal, structural, morphological, electrochemical and photophysical characteristics of luminescent polymers. In this thesis the author synthesizes a series of light emitting polymers that emit the three primary colors (RGB) with high efficiency.
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge under www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM, avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained with lower-quality DEMs of 25 m resolution.
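For reference, the sketch below shows the original Holmgren (1994) multiple-flow-direction weighting that the improved spreading algorithm builds on: each downslope neighbour receives a fraction of the flow proportional to its slope raised to an exponent x. This is an illustration under those assumptions, not the Flow-R implementation, and it omits the model's modifications.

```python
# Minimal sketch of Holmgren's (1994) multiple-flow-direction weighting;
# `window` is a 3x3 block of DEM elevations centred on the current cell and
# `cellsize` the DEM resolution in metres.
import numpy as np

def holmgren_weights(window, cellsize, x=4.0):
    centre = window[1, 1]
    weights = np.zeros((3, 3))
    for i in range(3):
        for j in range(3):
            if i == 1 and j == 1:
                continue
            dist = cellsize * np.hypot(i - 1, j - 1)   # cardinal vs diagonal distance
            slope = (centre - window[i, j]) / dist     # tan(beta) to the neighbour
            if slope > 0:                              # only downslope neighbours receive flow
                weights[i, j] = slope ** x
    total = weights.sum()
    return weights / total if total > 0 else weights   # fractions summing to 1
```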
Abstract:
The demand for power generation from non-renewable resources, and the associated production costs, are increasing at an alarming rate. Solar energy is one of the renewable resources with the potential to limit this increase. To date, the utilization of solar energy has concentrated mainly on heating applications. Using solar energy for cooling systems in buildings would contribute greatly to the goal of minimizing non-renewable energy use. The approaches of the solar heating system research conducted by institutions such as the University of Wisconsin–Madison, and of the building heat flow modelling research conducted at Oklahoma State University, can be used to develop and optimize a solar cooling building system. This research uses both approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source to drive the cycle. The software was then put through a number of litmus tests to verify its integrity, conducted on building cooling system data sets from similar applications around the world. The output of the developed software was identical to the established experimental results from the data sets used. Software developed by other research efforts caters to advanced users; the software developed in this research is reliable not only in its code integrity but also, through its integrated approach, accessible to new users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
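As a point of reference for the absorption cooling simulation described above, a basic thermal coefficient of performance and solar heat-input balance can be written as below; the symbols are generic and the relation is a simplified sketch, not the formulation used in the developed software.

```latex
% Simplified sketch: thermal COP of an absorption chiller (pump work neglected)
% and the heat delivered by the solar collector field; symbols are generic
% (Q_evap: cooling delivered, Q_gen: generator heat input, eta_coll: collector
% efficiency, G: solar irradiance, A_coll: collector area).
\mathrm{COP}_{\mathrm{th}} \;=\; \frac{\dot{Q}_{\mathrm{evap}}}{\dot{Q}_{\mathrm{gen}}},
\qquad
\dot{Q}_{\mathrm{gen}} \;\approx\; \eta_{\mathrm{coll}}\, G\, A_{\mathrm{coll}}
```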
Abstract:
The development of susceptibility maps for debris flows is of primary importance due to population pressure in hazardous zones. However, hazard assessment by process-based modelling at a regional scale is difficult due to the complex nature of the phenomenon, the variability of local controlling factors, and the uncertainty in modelling parameters. A regional assessment must consider a simplified approach that is not highly parameter-dependent and that can provide zonation with minimum data requirements. A distributed empirical model has thus been developed for regional susceptibility assessments using essentially a digital elevation model (DEM). The model is called Flow-R for Flow path assessment of gravitational hazards at a Regional scale (available free of charge under http://www.flow-r.org) and has been successfully applied to different case studies in various countries with variable data quality. It provides a substantial basis for a preliminary susceptibility assessment at a regional scale. The model was also found relevant for assessing other natural hazards such as rockfall, snow avalanches and floods. The model allows for automatic source area delineation, given user criteria, and for the assessment of the propagation extent based on various spreading algorithms and simple frictional laws. We developed a new spreading algorithm, an improved version of Holmgren's direction algorithm, that is less sensitive to small variations of the DEM, avoids over-channelization, and so produces more realistic extents. The choice of datasets and algorithms is open to the user, which makes the model adaptable to various applications and levels of dataset availability. Amongst the possible datasets, the DEM is the only one that is really needed for both the source area delineation and the propagation assessment; its quality is of major importance for the accuracy of the results. We consider a 10 m DEM resolution a good compromise between processing time and quality of results. However, valuable results have still been obtained with lower-quality DEMs of 25 m resolution.
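A "simple frictional law" of the kind mentioned above can be illustrated with a constant-friction-angle energy balance along the flow path: per unit mass, kinetic energy grows with the elevation drop and is dissipated in proportion to the horizontal distance travelled, and propagation stops where it reaches zero. The sketch below is such an illustration, not the exact Flow-R formulation; `profile` and `phi` are placeholder inputs.

```python
# Illustrative energy-line runout check (per unit mass). `profile` is a list of
# (horizontal_distance, elevation) points along a flow path; `phi` is the
# friction angle in degrees.
import math

def runout_index(profile, phi, g=9.81):
    """Return the index along the profile where kinetic energy drops to zero."""
    tan_phi = math.tan(math.radians(phi))
    e_kin = 0.0                                   # kinetic energy per unit mass
    for k in range(1, len(profile)):
        dx = profile[k][0] - profile[k - 1][0]    # horizontal increment
        dz = profile[k - 1][1] - profile[k][1]    # elevation drop (positive downhill)
        e_kin += g * dz - g * dx * tan_phi        # potential gain minus friction loss
        if e_kin <= 0:
            return k                              # flow stops here
    return len(profile) - 1                       # flow reaches the end of the profile
```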
Abstract:
The main focus of this research is to design and develop a high-performance linear actuator based on a four-bar mechanism. The present work includes the detailed analysis (kinematics and dynamics), design, implementation and experimental validation of the newly designed actuator. High performance is characterized by the acceleration of the actuator end effector. The principle of the newly designed actuator is to network the four-bar rhombus configuration (where some bars are extended to form an X shape) to attain high acceleration. First, a detailed kinematic analysis of the actuator is presented and its kinematic performance is evaluated through MATLAB simulations. The dynamic equation of the actuator is derived using the Lagrangian formulation, and a SIMULINK control model of the actuator is developed from it. In addition, a Bond Graph methodology is presented for the dynamic simulation; the Bond Graph model comprises individual component models of the actuator along with the control. The required torque was simulated using the Bond Graph model. Results indicate that high acceleration (around 20g) can be achieved with modest (3 N·m or less) torque input. A practical prototype of the actuator was designed using SOLIDWORKS and then produced to verify the proof of concept. The design goal was to achieve a peak acceleration of more than 10g at the mid-point of the travel as the end effector traverses the stroke length (around 1 m). The actuator is primarily designed to operate standalone, and later to be used in a 3RPR parallel robot. A DC motor drives the actuator, with a quadrature encoder attached to the motor to control the end effector. The associated control scheme of the actuator is analyzed and integrated with the physical prototype. In standalone experiments, the end effector achieved around 17g acceleration over a stroke from 0.2 m to 0.78 m. The results indicate that the developed dynamic model is in good agreement with the experimental results. Finally, a Design of Experiments (DOE) based statistical approach is introduced to identify the parametric combination that yields the greatest performance; the data were collected using the Bond Graph model. This approach helps in designing the actuator without much complexity.
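The Lagrangian dynamic formulation referred to above has the general Euler-Lagrange form below; the specific generalized coordinate and torque expressions for the four-bar rhombus actuator are not reproduced here.

```latex
% General Euler-Lagrange form, with L = T - V the Lagrangian, q a generalized
% coordinate (e.g. the input link angle) and Q the generalized force/torque:
\frac{\mathrm{d}}{\mathrm{d}t}\!\left(\frac{\partial L}{\partial \dot{q}}\right)
  - \frac{\partial L}{\partial q} \;=\; Q, \qquad L = T - V
```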
Abstract:
In this review we demonstrate how the algebraic Bethe ansatz is used for the calculation of the energy spectra and form factors (operator matrix elements in the basis of Hamiltonian eigenstates) in exactly solvable quantum systems. As examples we apply the theory to several models of current interest in the study of Bose-Einstein condensates, which have been successfully created using ultracold dilute atomic gases. The first model we introduce describes Josephson tunnelling between two coupled Bose-Einstein condensates. It can be used not only for the study of tunnelling between condensates of atomic gases, but also for solid-state Josephson junctions and coupled Cooper pair boxes. The theory is also applicable to models of atomic-molecular Bose-Einstein condensates, with two examples given and analysed. Additionally, these same two models are relevant to studies in quantum optics. Finally, we discuss the model of Bardeen, Cooper and Schrieffer in this framework, which is appropriate for systems of ultracold fermionic atomic gases, as well as being applicable to the description of superconducting correlations in metallic grains with nanoscale dimensions. In applying all the above models to physical situations, the need for an exact analysis of small-scale systems is established, because large quantum fluctuations render mean-field approaches inaccurate.
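The Josephson-tunnelling model mentioned first is usually written as the canonical two-mode boson Hamiltonian below; the notation may differ from the review's, and it is quoted only as the standard form of such models.

```latex
% Canonical two-mode Hamiltonian for two coupled condensates: K is the
% boson-boson interaction, \Delta\mu the external potential imbalance,
% E_J the tunnelling amplitude, and N_i = a_i^\dagger a_i.
H \;=\; \frac{K}{8}\,(N_1 - N_2)^2 \;-\; \frac{\Delta\mu}{2}\,(N_1 - N_2)
        \;-\; \frac{E_J}{2}\,\bigl(a_1^{\dagger} a_2 + a_2^{\dagger} a_1\bigr)
```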
Abstract:
This work evaluates the possibility of using spent coffee grounds (SCG) for biodiesel production and other applications. An experimental study conducted with different solvents showed that a lipid content of up to 6 wt% can be obtained from SCG. Results also show that, besides biodiesel production, SCG can be used as fertilizer, as it is rich in nitrogen, and as a solid fuel with a higher heating value (HHV) equivalent to that of some agricultural and wood residues. The extracted lipids were characterized in terms of acid value, density at 15 °C, viscosity at 40 °C, iodine number, and HHV, which are negatively influenced by the water content and the solvents used in lipid extraction. The results suggest that, for lipids with high free fatty acid (FFA) content, the best procedure for conversion to biodiesel is a two-step process of acid esterification followed by alkaline transesterification, rather than a single step of direct transesterification with an acid catalyst. The biodiesel was characterized in terms of iodine number, acid value, and ester content. Although these quality parameters were not within the limits of the NP EN 14214:2009 standard, SCG lipids can be blended with higher-quality vegetable oils before transesterification, or the biodiesel produced from SCG can be blended with higher-quality biodiesel or even with fossil diesel, in order to meet the standard requirements.
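The two-step route described above corresponds, in outline, to the stoichiometry below: acid-catalysed esterification converts the free fatty acids, and base-catalysed transesterification converts the triglycerides. The catalysts indicated are typical choices and are not taken from the abstract.

```latex
% Typical overall stoichiometry (R, R' denote fatty-acid chains; the catalysts
% are illustrative, not taken from the abstract):
\text{RCOOH} + \text{CH}_3\text{OH}
  \;\xrightarrow{\;\text{acid}\;}\;
  \text{RCOOCH}_3 + \text{H}_2\text{O}
\qquad
\text{triglyceride} + 3\,\text{CH}_3\text{OH}
  \;\xrightarrow{\;\text{base}\;}\;
  3\,\text{R}'\text{COOCH}_3 + \text{glycerol}
```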
Abstract:
The prevalence of people reporting pain in the shoulder joint complex, with a concomitant limitation in the ability to perform activities of daily living, is high. These prevalence levels burden both patients and society itself. Current scientific evidence points to a relationship between alterations of the scapulothoracic joint and pathologies of the glenohumeral joint. The ability to quantify, kinematically and kinetically, dysfunctions at the scapulothoracic and glenohumeral joints is of great importance to both the biomechanics and the clinical communities. In the course of this thesis, a three-dimensional musculoskeletal model of the shoulder joint complex was developed in the OpenSim software, including representations of the thorax/spine, clavicle, scapula, humerus, radius and ulna, the joints that allow the relative movements of these segments, as well as 16 muscles and 4 ligaments. With a total of 11 degrees of freedom, including a new scapulothoracic joint model, the results show that it can reconstruct scapulothoracic and glenohumeral movements accurately and quickly, using inverse kinematics as well as inverse and forward dynamics. It also features a novel transformation method to determine muscle insertion sites based on subject-specific characteristics. The main motivations underlying this thesis were to contribute to deepening the current knowledge of shoulder joint complex dysfunctions and, at the same time, to provide the clinical community with a freely available biomechanical tool to better support clinical decisions and thereby contribute to more effective practice.
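The inverse- and forward-dynamics analyses mentioned above rest on the standard rigid multibody equation of motion; the sketch below states it in generic notation, without the model-specific mass matrix or muscle force terms.

```latex
% Generic multibody equation of motion: q are the generalized coordinates
% (11 degrees of freedom here), M the mass matrix, C the Coriolis/centrifugal
% terms, g the gravity vector, and \tau the generalized (joint/muscle) forces.
M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + g(q) \;=\; \tau
```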
Abstract:
The II Workshop on Computed Tomography (CT) was held in Monells. The first day was devoted entirely to the use of CT in pig carcass classification, and the second day was open to other CT applications, whether in live animals or in various aspects of the quality of meat and meat products. The workshop was attended by 45 participants from 12 EU countries.
Abstract:
Every year, debris flows cause huge damage in mountainous areas. Due to population pressure in hazardous zones, the socio-economic impact is much higher than in the past. Therefore, the development of indicative susceptibility hazard maps is of primary importance, particularly in developing countries. However, the complexity of the phenomenon and the variability of local controlling factors limit the use of process-based models for a first assessment. A debris flow model has been developed for regional susceptibility assessments using a digital elevation model (DEM) with a GIS-based approach. The automatic identification of source areas and the estimation of debris flow spreading, based on GIS tools, provide a substantial basis for a preliminary susceptibility assessment at a regional scale. One of the main advantages of this model is its flexibility: everything is open to the user, from the choice of data to the selection of the algorithms and their parameters. The Flow-R model was tested in three different contexts, two in Switzerland and one in Pakistan, for indicative susceptibility hazard mapping. It was shown that the quality of the DEM is the most important factor for obtaining reliable propagation results, and also for identifying potential debris flow sources.