897 results for 2447: modelling and forecasting


Relevance: 100.00%

Abstract:

This work introduces joint power amplifier (PA) and I/Q modulator modelling and compensation for Long-Term Evolution (LTE) transmitters using artificial neural networks (ANNs). The proposed solution utilizes a powerful nonlinear autoregressive with exogenous inputs (NARX) ANN architecture, which yields noticeable results for high peak-to-average power ratio (PAPR) LTE signals. Given the ANNs' learning capabilities, this one-step solution, which includes the mitigation of both PA nonlinearity and I/Q modulator impairments, is both accurate and adaptable.
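As a rough illustration of the NARX structure described here (not the paper's implementation), the sketch below fits a NARX-style model to a toy memoryless PA nonlinearity: the regressor stacks the current and delayed inputs together with delayed outputs, and a randomly initialised tanh hidden layer with least-squares output weights stands in for the trained ANN. All signals, dimensions and the real-valued (rather than complex I/Q) drive are invented assumptions.

```python
import numpy as np

# Minimal NARX-structured model sketch: the output depends on current and
# delayed inputs and on delayed outputs. A random tanh hidden layer with
# least-squares output weights (extreme-learning-machine shortcut) stands in
# for the paper's trained ANN; the toy PA is a memoryless compressive
# nonlinearity, purely illustrative.
rng = np.random.default_rng(0)
T, din, dout, H = 4000, 2, 2, 60   # samples, input/output delays, hidden units

x = rng.standard_normal(T)              # toy baseband drive (real-valued)
y = np.tanh(x) - 0.1 * x**3             # toy PA output with compression
y += 0.01 * rng.standard_normal(T)      # measurement noise

# Build NARX regressors: [x_t, ..., x_{t-din}, y_{t-1}, ..., y_{t-dout}]
lag = max(din, dout)
rows = []
for t in range(lag, T):
    rows.append(np.r_[x[t - din:t + 1][::-1], y[t - dout:t][::-1]])
Phi = np.array(rows)
target = y[lag:]

W = rng.standard_normal((Phi.shape[1], H)) / np.sqrt(Phi.shape[1])
Hid = np.tanh(Phi @ W)                  # random tanh hidden layer
w_out = np.linalg.lstsq(Hid, target, rcond=None)[0]

pred = Hid @ w_out
nmse = np.mean((pred - target) ** 2) / np.var(target)
print(f"in-sample NMSE: {10 * np.log10(nmse):.1f} dB")
```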

Relevance: 100.00%

Abstract:

Purpose: To review studies published from 2007 to 2015 on tourism and hotel demand modeling and forecasting, with a view to identifying the emerging topics and methods studied and pointing out future research directions in the field.

Design/methodology/approach: Articles on tourism and hotel demand modeling and forecasting published in both Science Citation Index (SCI) and Social Science Citation Index (SSCI) journals were identified and analyzed.

Findings: The review found that studies on hotel demand are relatively fewer than those on tourism demand. It also observed that more and more studies have moved away from aggregate tourism demand analysis, while disaggregate markets and niche products have attracted increasing attention. Some studies have gone beyond neoclassical economic theory to seek additional explanations of the dynamics of tourism and hotel demand, such as environmental factors, tourist online behavior and consumer confidence indicators, among others. More sophisticated techniques, such as nonlinear smooth transition regression, mixed-frequency modeling and nonparametric singular spectrum analysis, have also been introduced to this research area.

Research limitations/implications: The main limitation of this review is that it covers only the English-language literature; future reviews of this kind should also include articles published in other languages. The review provides a useful guide for researchers interested in future research on tourism and hotel demand modeling and forecasting.

Practical implications: The review provides suggestions and recommendations for improving the efficiency of tourism and hospitality management practices.

Originality/value: The review identifies the current trends in tourism and hotel demand modeling and forecasting research and points out future research directions.

Relevance: 100.00%

Abstract:

PEDRINI, Aldomar; WESTPHAL, F. S.; LAMBERTS, R. A methodology for building energy modelling and calibration in warm climates. Building and Environment, v. 37, p. 903-912, 2002. Accessed: 4 Oct. 2010.

Relevance: 100.00%

Abstract:

In this talk, I will describe various computational modelling and data mining solutions that form the basis of how the office of the Deputy Head of Department (Resources) works to serve you. These include lessons I have learnt about, and from, optimisation issues in resource allocation, uncertainty analysis of league tables, modelling the process of winning external grants, and student satisfaction surveys, some of which I have attempted to inject into our planning processes.

Relevance: 100.00%

Abstract:

This paper reviews the literature on construction risk modelling and assessment, together with the real-world practice of risk assessment. The review yielded significant findings, summarised as follows. There has been a major shift in risk perception from an estimation variance to a project attribute. Although the Probability-Impact risk model prevails, substantial efforts are being made to improve it to reflect the increasing complexity of construction projects. The literature lacks a comprehensive assessment approach capable of capturing risk impact on different project objectives. Obtaining a realistic project risk level demands an effective mechanism for aggregating individual risk assessments. The various assessment tools suffer from low take-up; professionals typically rely on their experience instead. It is concluded that a simple analytical tool that uses risk cost as a common scale and draws on professional experience could be a viable option for closing the gap between the theory and practice of risk assessment.
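The "risk cost as a common scale" idea the paper argues for can be sketched in a few lines: each risk's expected cost is its probability times its cost impact, and the project risk level aggregates the individual assessments. The risks and figures below are invented for illustration.

```python
# Sketch of risk cost as a common scale: expected cost per risk is
# probability x impact cost, aggregated into a project risk level.
# All risks and numbers are hypothetical.
risks = [
    # (description, probability, impact cost if it occurs)
    ("ground conditions worse than surveyed", 0.30, 250_000),
    ("key subcontractor insolvency",          0.10, 400_000),
    ("design change after mobilisation",      0.25, 120_000),
]

expected_costs = [p * impact for _, p, impact in risks]
project_risk_cost = sum(expected_costs)

for (name, p, impact), ec in zip(risks, expected_costs):
    print(f"{name:42s} p={p:.2f} impact={impact:>9,} E[cost]={ec:>9,.0f}")
print(f"{'aggregate project risk cost':42s} {project_risk_cost:>31,.0f}")
```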

Relevance: 100.00%

Abstract:

A new method for evaluating the efficiency of parabolic trough collectors, called the Rapid Test Method, is being investigated at the Solar Institut Jülich. The basic concept is to carry out measurements under stagnation conditions, which makes the process fast and inexpensive because no working fluid is required. In this approach, the temperature reached by the inner wall of the receiver is taken as the stagnation temperature and hence as the average temperature inside the collector. This introduces a systematic error, which can be rectified with a correction factor. A model of the collector is simulated in COMSOL Multiphysics to study the size of the correction factor as a function of collector geometry and working conditions. The resulting values are compared with experimental data obtained at a test rig at the Solar Institut Jülich. These measurements do not match the simulated results, so it was not possible to verify the model. The reliability of both the COMSOL Multiphysics model and the measurements is analysed. The influence of the correction factor on the Rapid Test Method is also studied, as well as the possibility of neglecting it by measuring the receiver's inner-wall temperature where it receives the least solar radiation. The last two chapters analyse the specific heat capacity as a function of pressure and temperature and present some considerations on the uncertainties in the efficiency curve obtained with the Rapid Test Method.
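The logic behind measuring under stagnation can be sketched with a simple energy balance: with no fluid flowing, absorbed optical power equals thermal loss, so a measured stagnation temperature, corrected for the wall-versus-mean-temperature bias the abstract describes, yields an estimate of the loss coefficient. This is a sketch of the idea only, not the Solar Institut Jülich procedure; all values and the linear-loss assumption are illustrative.

```python
# Back-of-the-envelope energy balance at stagnation for a trough receiver:
# with no fluid flow, absorbed optical power equals thermal loss. All
# numbers and the linear-loss model are assumptions for illustration.
G_b = 850.0        # beam irradiance, W/m^2 (assumed)
A_ap = 12.0        # aperture area, m^2 (assumed)
A_rec = 0.7        # receiver surface area, m^2 (assumed)
eta_opt = 0.75     # optical efficiency (assumed)
T_amb = 25.0       # ambient temperature, deg C
T_wall = 310.0     # measured inner-wall temperature at stagnation, deg C
k_corr = 0.97      # correction factor: mean receiver temperature assumed
                   # slightly below the measured wall temperature

# Corrected estimate of the mean receiver temperature.
T_mean = T_amb + k_corr * (T_wall - T_amb)

# Stagnation balance: eta_opt * G_b * A_ap = U_L * A_rec * (T_mean - T_amb)
U_L = eta_opt * G_b * A_ap / (A_rec * (T_mean - T_amb))
print(f"estimated loss coefficient U_L = {U_L:.1f} W/(m^2 K)")
```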

Relevance: 100.00%

Abstract:

My thesis focuses on health policies designed to encourage the supply of health services. Access to health services is a major problem undermining the health systems of most industrialized countries. In Quebec, the median wait between a referral by a general practitioner and an appointment with a specialist was 7.3 weeks in 2012, versus 2.9 weeks in 1993, despite an increase in the number of physicians over the same period. For policy makers observing growing wait times for health care, it is important to understand the structure of physicians' labour supply and how it affects the supply of health services. In this context, I consider two main policies. First, I estimate how physicians respond to monetary incentives and use the estimated parameters to examine how compensation policies can be used to steer the short-run supply of health services. Second, I examine how physicians' productivity is affected by experience, through learning-by-doing, and use the estimated parameters to find the number of inexperienced physicians who must be recruited to replace an experienced physician who retires while keeping the supply of health services constant.

My thesis develops and applies economic and statistical methods to measure physicians' responses to monetary incentives and to estimate their productivity profile (measuring how physicians' productivity varies over their career), using panel data on Quebec physicians drawn from surveys and administrative records. The data contain information on each physician's labour supply, the different types of services provided and their prices, and cover a period during which the Quebec government changed the relative prices of health services. I develop and estimate a structural model of labour supply that allows physicians to be multitasking. In my model, physicians choose their hours of work and the allocation of those hours across the different services offered, with service prices set by the government. The model generates an earnings equation that depends on hours worked and on a price index representing the marginal return to hours when they are allocated optimally across services. The price index depends on service prices and on the parameters of the service production technology, which determine how physicians respond to changes in relative prices. I apply the model to panel data on Quebec physicians' earnings merged with data on their time use.

I use the model to examine two dimensions of the supply of health services. First, I analyse the use of monetary incentives to induce physicians to change their production of the different services. Although previous studies have often compared physician behaviour across compensation systems, relatively little is known about how physicians respond to changes in the prices of health services. Current debates in Canadian health policy circles have focused on the importance of income effects in determining physicians' responses to increases in health service prices. My work contributes to this debate by identifying and estimating the substitution and income effects resulting from changes in the relative prices of health services. Second, I analyse how experience affects physicians' productivity. This has important implications for physician recruitment to meet the growing demand of an ageing population, particularly as the most experienced (most productive) physicians retire.

In the first essay, I estimate the earnings function conditional on hours worked, using instrumental variables to control for possible endogeneity of hours. As instruments I use indicator variables for physicians' age, the marginal tax rate, and the stock-market return together with its square and cube. I show that this yields a lower bound on the direct price elasticity, making it possible to test whether physicians respond to monetary incentives. The results show that the lower bounds of the price elasticities of service supply are significantly positive, suggesting that physicians do respond to incentives: a change in relative prices leads physicians to allocate more hours to the service whose price has increased.

In the second essay, I estimate the full model, unconditionally on hours worked, analysing variation in physicians' hours, the volume of services provided and physicians' incomes, using the simulated method of moments. The results show that the direct substitution price elasticities are large and significantly positive, reflecting a tendency for physicians to increase the volume of the service whose price rose the most, while the cross substitution price elasticities are also large but negative. Moreover, there is an income effect associated with fee increases. Using the estimated parameters of the structural model, I simulate a general 32% increase in service prices. The results show that physicians would reduce their total hours worked (mean elasticity of -0.02), their clinical hours (mean elasticity of -0.07) and the volume of services provided (mean elasticity of -0.05).

Third, I exploit the natural link between a fee-for-service physician's income and productivity to establish physicians' productivity profile. To do so, I modify the model specification to account for the relationship between a physician's productivity and experience. I estimate the earnings equation using unbalanced panel data, correcting for the non-random nature of missing observations with a selection model. The results suggest that the productivity profile is an increasing and concave function of experience. The profile is robust to using effective experience (the quantity of services produced) as a control variable and to relaxing the parametric assumptions. Moreover, one additional year of experience increases a physician's production of services by CAN$1,003. I use the estimated parameters of the model to compute the replacement ratio, the number of inexperienced physicians needed to replace one experienced physician: it is 1.2.
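The first essay's estimation strategy is standard two-stage least squares (2SLS). The sketch below reproduces the mechanics on synthetic data, with a single generic instrument standing in for the thesis's instruments (age dummies, the marginal tax rate, and market-return polynomials); all coefficients are invented.

```python
import numpy as np

# Generic 2SLS sketch on synthetic data: hours are endogenous in the
# earnings equation, and an instrument restores consistent estimation.
rng = np.random.default_rng(7)
n = 5000
z = rng.standard_normal(n)                  # instrument (stand-in)
u = rng.standard_normal(n)                  # unobservable driving endogeneity
hours = 0.8 * z + 0.6 * u + rng.standard_normal(n)   # endogenous regressor
income = 1.5 * hours + u + rng.standard_normal(n)    # true slope: 1.5

X = np.column_stack([np.ones(n), hours])
Z = np.column_stack([np.ones(n), z])

# Stage 1: project the endogenous regressor on the instruments.
X_hat = Z @ np.linalg.lstsq(Z, X, rcond=None)[0]
# Stage 2: regress income on the fitted values.
beta_2sls = np.linalg.lstsq(X_hat, income, rcond=None)[0]
beta_ols = np.linalg.lstsq(X, income, rcond=None)[0]

print(f"OLS  slope: {beta_ols[1]:.3f}  (biased upward by the income shock)")
print(f"2SLS slope: {beta_2sls[1]:.3f} (close to the true 1.5)")
```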

Relevance: 100.00%

Abstract:

This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide tools for univariate applications, while the last two develop multivariate methodologies.

In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models.

In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set.

Chapter 3 is joint work with David Veredas. We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite-sample properties are studied under four data-generating processes, in the presence or absence of microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum-variance portfolio application shows the superiority of this disentangled realized estimator across numerous performance metrics.

Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context, compare it to models used in the industry, and demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
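For context, the sketch below simulates the standard log-linear Realized GARCH(1,1) of Hansen, Huang and Shek, the benchmark the abstract compares FloGARCH against; it is not the FloGARCH model itself, and all parameter values are illustrative.

```python
import numpy as np

# Simulation of the log-linear Realized GARCH(1,1): a GARCH equation driven
# by the lagged realized measure, plus a measurement equation linking the
# realized measure to latent variance. Parameters are illustrative.
rng = np.random.default_rng(1)
T = 2000
omega, beta, gamma = 0.05, 0.70, 0.25          # GARCH equation
xi, phi, tau1, tau2, sigma_u = -0.1, 1.0, -0.05, 0.10, 0.4  # measurement eq.

log_h = np.empty(T)
r = np.empty(T)
log_x = np.empty(T)
# start at the unconditional mean of log h (persistence beta + gamma*phi)
log_h[0] = (omega + gamma * xi) / (1 - beta - gamma * phi)

for t in range(T):
    z = rng.standard_normal()
    r[t] = np.exp(0.5 * log_h[t]) * z          # return
    # realized measure: loads on latent variance plus leverage and noise
    log_x[t] = (xi + phi * log_h[t] + tau1 * z
                + tau2 * (z**2 - 1) + sigma_u * rng.standard_normal())
    if t + 1 < T:
        log_h[t + 1] = omega + beta * log_h[t] + gamma * log_x[t]

print(f"mean conditional vol: {np.exp(0.5 * log_h).mean():.3f}, "
      f"return std: {r.std():.3f}")
```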


Relevance: 100.00%

Abstract:

Ecological network analysis was applied to the Seine estuary ecosystem, northern France, integrating ecological data from the years 1996 to 2002. The Ecopath with Ecosim (EwE) approach was used to model the trophic flows in 6 spatial compartments, leading to 6 distinct EwE models: the navigation channel and the two channel flanks in the estuary proper, and 3 marine habitats in the eastern Seine Bay. Each model included 12 consumer groups, 2 primary producers and one detritus group. Ecological network analysis was performed, including a set of indices, keystoneness and trophic spectrum analysis, to describe the contribution of the 6 habitats to the functioning of the Seine estuary ecosystem. Results showed that the two habitats whose functioning was most related to a stressed state were the northern and central navigation channels, where building works and constant maritime traffic are considered major anthropogenic stressors. The strong top-down control highlighted in the other 4 habitats was absent in the central channel, which showed instead (i) a shift of keystone roles in the ecosystem towards sediment-based, lower trophic levels, and (ii) a higher system omnivory. The southern channel showed the highest system activity (total system throughput), the highest trophic specialisation (low system omnivory) and the lowest indication of stress (low cycling and relative redundancy). Marine habitats showed higher fish biomass proportions and higher transfer efficiencies per trophic level than the estuarine habitats, with a transition area between the two presenting an intermediate ecosystem structure. Modelling the habitats separately made it possible to reveal each habitat's response to the different pressures, based on a priori knowledge of those pressures. Network indices, although non-monotonically, responded to these differences and appear to be a promising operational tool for defining the ecological status of transitional water ecosystems.
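A toy version of one of the network indices used here, total system throughput (TST, the sum of all flows), can be computed directly from a flow matrix. The food web and flow values below are invented; real EwE models also track imports, exports and respiration per group.

```python
import numpy as np

# Toy food-web flow matrix F, where F[i, j] is the flow from compartment i
# to compartment j (e.g. gC m-2 y-1). Values are invented for illustration.
labels = ["detritus", "phytoplankton", "zooplankton", "fish"]
F = np.array([
    [0.0, 0.0, 3.0, 0.5],   # detritus consumed by zooplankton and fish
    [0.0, 0.0, 6.0, 0.0],   # phytoplankton grazed by zooplankton
    [2.0, 0.0, 0.0, 1.5],   # zooplankton to detritus (egestion) and fish
    [0.8, 0.0, 0.0, 0.0],   # fish to detritus
])

# Total System Throughput: the sum of all flows, the activity index the
# abstract reports as highest in the southern channel.
tst = F.sum()

# Per-compartment throughflow (inflow or outflow, whichever is larger).
throughflow = np.maximum(F.sum(axis=0), F.sum(axis=1))

print(f"TST = {tst:.1f}")
for name, t in zip(labels, throughflow):
    print(f"  {name:13s} throughflow {t:.1f}")
```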

Relevance: 100.00%

Abstract:

The deep seismic reflection profile Western Approaches Margin (WAM) cuts across the Goban Spur continental margin, located southwest of Ireland. This non-volcanic margin is characterized by a few tilted blocks parallel to the margin. A volcanic sill has been emplaced on the westernmost tilted block. The shape of the eastern part of this sill is known from seismic data, but neither seismic nor gravity data allow a precise determination of the extent and shape of the volcanic body at depth. Forward modelling and inversion of magnetic data constrain the shape of this volcanic sill and the location of the ocean-continent transition. The volcanic body thickens towards the ocean, and seems to be in direct contact with the oceanic crust. In the contact zone, the volcanic body and the oceanic magnetic layer display approximately the same thickness. The oceanic magnetic layer is anomalously thick immediately west of the volcanic body, and gradually thins to reach more typical values 40 km further to the west. The volcanic sill would therefore represent the very first formation of oceanic crust, just before or at the continental break-up. The ocean-continent transition is limited to a zone 15 km wide. The continental magnetic layer seems to thin gradually oceanwards, as does the continental crust, but no simple relation is observed between their respective thinnings.

Relevance: 100.00%

Abstract:

Part 4: Transition Towards Product-Service Systems

Relevance: 100.00%

Abstract:

Discrete Event Simulation (DES) is a very popular simulation technique in Operational Research. Recently, another technique has emerged, namely Agent Based Simulation (ABS). Although there is a large literature on both DES and ABS, we have found less work that explores the capabilities of each in tackling human behaviour issues. To understand the gap between these two simulation techniques, our aim is therefore to understand how DES and ABS models differ in representing the real-world phenomenon when modelling and simulating human behaviour. To this end, we carried out a case study at a department store. The DES and ABS models are compared on the same problem domain, concerning management policy in a fitting room; the behaviour of staff while working and customer satisfaction are modelled in both, to aid understanding of the two models' behaviour.
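As a flavour of the DES side of the comparison, the sketch below models a fitting room as a queue for a small pool of rooms, assuming the SimPy library; the room count, arrival rate and service rate are illustrative, not taken from the case study.

```python
import random
import simpy

# Minimal DES sketch of a fitting-room queue. All parameters are
# hypothetical, not taken from the department-store case study.
RANDOM_SEED = 42
NUM_ROOMS = 2          # fitting rooms (servers)
ARRIVAL_MEAN = 3.0     # mean minutes between customer arrivals
SERVICE_MEAN = 5.0     # mean minutes spent in a fitting room
SIM_MINUTES = 480      # one 8-hour trading day

wait_times = []

def customer(env, rooms):
    """A customer arrives, queues for a fitting room, uses it, and leaves."""
    arrival = env.now
    with rooms.request() as req:
        yield req                          # queue until a room is free
        wait_times.append(env.now - arrival)
        yield env.timeout(random.expovariate(1.0 / SERVICE_MEAN))

def arrivals(env, rooms):
    """Generate customers with exponential inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(customer(env, rooms))

random.seed(RANDOM_SEED)
env = simpy.Environment()
rooms = simpy.Resource(env, capacity=NUM_ROOMS)
env.process(arrivals(env, rooms))
env.run(until=SIM_MINUTES)

print(f"served {len(wait_times)} customers, "
      f"mean wait {sum(wait_times) / len(wait_times):.1f} min")
```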

Relevance: 100.00%

Abstract:

Planar cell polarity (PCP) occurs in the epithelia of many animals and can lead to the alignment of hairs, bristles and feathers; physiologically, it can organise ciliary beating. Here we present two approaches to modelling this phenomenon. The aim is to discover the basic mechanisms that drive PCP while keeping the models mathematically tractable. We present a feedback and diffusion model, in which adjacent cell sides of neighbouring cells are coupled by a negative feedback loop and diffusion acts within the cell. This approach can give rise to polarity, but also to period-two patterns. Polarisation arises via an instability, provided the feedback is sufficiently strong and the diffusion sufficiently weak. Moreover, we discuss a conservative model in which proteins within a cell are redistributed depending on the amount of proteins in the neighbouring cells, coupled with intracellular diffusion. In this case polarity can arise from weakly polarised initial conditions or via a wave, provided the diffusion is weak enough. Both models can overcome small anomalies in the initial conditions. Furthermore, the range of the effects of groups of cells with properties different from those of the surrounding cells depends on the strength of the initial global cue and on the intracellular diffusion.
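A toy discretisation of the feedback-and-diffusion idea (not the authors' model) is sketched below: each cell in a ring carries protein amounts on its left and right sides, the facing side of the neighbouring cell exerts negative feedback through a decreasing Hill function, and diffusion couples the two sides within a cell. Depending on the feedback strength and diffusion, such a system can settle into uniform polarity or into the period-two patterns mentioned above; the script reports the amplitude of both modes. All parameters are invented.

```python
import numpy as np

# Toy feedback-and-diffusion model on a ring of cells. L[j], R[j] are the
# protein amounts on the left and right sides of cell j. The facing side of
# the neighbouring cell inhibits production (decreasing Hill function f);
# D is weak intracellular diffusion between the two sides of a cell.
N = 40          # cells in a ring (even, so a period-two mode is defined)
D = 0.05        # intracellular diffusion strength
h = 6.0         # Hill coefficient: feedback strength
dt = 0.01       # Euler time step

def f(x):
    """Decreasing Hill function: neighbour's facing side inhibits this side."""
    return 1.0 / (1.0 + x**h)

rng = np.random.default_rng(0)
L = 0.8 + 0.01 * rng.standard_normal(N)   # left-side protein per cell
R = 0.8 + 0.01 * rng.standard_normal(N)   # right-side protein per cell

for _ in range(40000):
    # left side of cell j sees the right side of cell j-1, and vice versa
    dL = f(np.roll(R, 1)) - L + D * (R - L)
    dR = f(np.roll(L, -1)) - R + D * (L - R)
    L += dt * dL
    R += dt * dR

alternating = np.array([1, -1] * (N // 2))
print(f"uniform polarity amplitude: {(R - L).mean():+.3f}")
print(f"period-two amplitude:       {((R + L) * alternating).mean():+.3f}")
```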

Relevance: 100.00%

Abstract:

Observing, modelling and understanding the climate-scale variability of deep water formation (DWF) in the North-Western Mediterranean Sea remains very challenging today. In this study, we first characterize the interannual variability of this phenomenon through a thorough reanalysis of observations in order to establish reference time series. These quantitative indicators include 31 observed years of the yearly maximum mixed layer depth over the period 1980-2013 and a detailed multi-indicator description of the period 2007-2013.

A 1980-2013 hindcast simulation is then performed with a fully-coupled regional climate system model including a high-resolution representation of the regional atmosphere, ocean, land surface and rivers. The simulation reproduces the mean behaviour and the large interannual variability of the DWF phenomenon quantitatively well. The model shows convection deeper than 1000 m in two thirds of the modelled winters and a mean DWF rate of 0.35 Sv, with maximum values of 1.7 Sv in 2013 and 1.6 Sv in 2005. Using the model results, the winter-integrated buoyancy loss over the Gulf of Lions is identified as the primary driver of the interannual variability of DWF, alone explaining around 50% of its variance; it is itself explained by the occurrence of a few stormy days during winter. At the daily scale, the Atlantic ridge weather regime is identified as favourable to strong buoyancy losses and therefore to DWF, whereas the positive phase of the North Atlantic Oscillation is unfavourable. The driving role of the vertical stratification in autumn, a measure of the water column's inhibition of mixing, has also been analyzed. Combining both driving factors explains more than 70% of the interannual variance of the phenomenon, and in particular the occurrence of the five most strongly convective years in the model (1981, 1999, 2005, 2009, 2013).

The model simulates the trends in the deep waters (warming, saltening, increase in dense water volume, increase in bottom water density) qualitatively well, despite underestimating the salinity and density trends. These deep trends stem from heat and salt accumulating during the 1980s and 1990s in the surface and intermediate layers of the Gulf of Lions before being transferred stepwise towards the deep layers when very convective years occur, in 1999 and later. The salinity increase in the surface layers of the nearby Atlantic Ocean seems to be the external forcing that ultimately leads to these deep trends. In the future, our results may help in understanding the behaviour of the DWF phenomenon in Mediterranean Sea simulations run in hindcast, forecast, reanalysis or future climate change scenario modes. However, the robustness of the results obtained must be confirmed in multi-model studies.
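The closing variance decomposition can be illustrated with an ordinary multiple regression on synthetic data: regress a DWF index on winter buoyancy loss and autumn stratification and read off R². Only the structure of the argument is reproduced; all numbers below are invented.

```python
import numpy as np

# Synthetic illustration of the abstract's variance decomposition: regress a
# deep-water-formation (DWF) index on winter-integrated buoyancy loss and
# autumn stratification, then read off the explained variance (R^2).
rng = np.random.default_rng(3)
n_years = 34                                       # 1980-2013
buoyancy_loss = rng.standard_normal(n_years)       # winter driver (standardised)
stratification = rng.standard_normal(n_years)      # autumn driver (standardised)
noise = rng.standard_normal(n_years)
dwf = 0.7 * buoyancy_loss - 0.5 * stratification + 0.5 * noise  # toy DWF index

X = np.column_stack([np.ones(n_years), buoyancy_loss, stratification])
beta, *_ = np.linalg.lstsq(X, dwf, rcond=None)
resid = dwf - X @ beta
r2 = 1.0 - resid.var() / dwf.var()
print(f"R^2 from both drivers: {r2:.2f}")   # around 0.7, echoing the abstract
```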