876 results for 2447: modelling and forecasting


Relevance:

100.00%

Publisher:

Abstract:

This paper analyses the most important mechanisms of the Hungarian economy using a medium-sized quarterly macroeconomic model developed jointly by the Economic Policy Department of the Ministry of Finance and the Institute of Economics of the Hungarian Academy of Sciences. After introducing the fundamental principles of the modelling and the building blocks of the model, the authors present, within a scenario analysis, the effects of the main factors behind the macroeconomic and budgetary processes. The sources of uncertainty - defined in a broad sense - are categorized into three groups: changes in the external environment (e.g. the exchange rate), uncertainties in the behaviour of economic agents (e.g. in the speed of wage adjustment or the extent of consumption smoothing), and economic policy decisions (e.g. an increase in public sector wages). The macroeconomic consequences of these uncertainties are shown not to be independent of each other: for instance, the effects of an exchange rate shock are influenced by the speed of wage adjustment.
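To illustrate the interaction result - that the macro consequences of different uncertainty factors are not additive - here is a deliberately toy simulation (entirely my own construction, not the Ministry of Finance/IE-HAS model; every parameter and equation is an assumption):

```python
# Toy illustration: a two-equation sketch in which the inflation effect of an
# exchange-rate shock depends on the speed of wage adjustment, so the two
# "uncertainty factors" interact rather than simply add up.
import numpy as np

def simulate(depreciation, wage_adjustment_speed, periods=12,
             passthrough=0.3, wage_feedback=0.5):
    """Return cumulative inflation after an exchange-rate depreciation.

    All parameter names and values are illustrative assumptions.
    """
    inflation = np.zeros(periods)
    wage_growth = 0.0
    for t in range(1, periods):
        # prices respond to the imported-cost shock and to wage growth
        inflation[t] = passthrough * depreciation + wage_feedback * wage_growth
        # wages catch up with past inflation at a given adjustment speed
        wage_growth += wage_adjustment_speed * (inflation[t] - wage_growth)
        depreciation *= 0.8  # the shock fades out
    return inflation.sum()

for speed in (0.2, 0.8):
    print(f"wage adjustment speed {speed}: "
          f"cumulative inflation {simulate(0.05, speed):.4f}")
```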

Relevance:

100.00%

Publisher:

Abstract:

Climate change has serious effects on the composition and functioning of natural ecosystems. Even a small rise in temperature can increase the abundance of some species and potentially cause the disappearance of others. In this research, the dispersion of species and the biomass production of a theoretical ecosystem were examined under the effect of temperature change. The responses of ecosystems to climate change can be described by means of global climate models and dynamic vegetation models, but the number and complexity of the calculations usually confine such simulations to supercomputers in large computing centres. By restricting a theoretical ecosystem model to temperature and reproduction, the computational load can be reduced to the level of a PC, and several important theoretical questions can still be answered.
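A PC-scale toy of the kind of theoretical-ecosystem computation described (my assumptions throughout; this is not the paper's actual model):

```python
# A minimal sketch: each species reproduces at a rate that peaks at a
# preferred temperature, so a warming trend shifts relative abundances
# of species competing for a shared habitat.
import numpy as np

def growth_rate(temp, optimum, tolerance=3.0, r_max=0.5):
    """Gaussian temperature response: reproduction is fastest at the optimum."""
    return r_max * np.exp(-((temp - optimum) / tolerance) ** 2)

def simulate(temps, optima, capacity=1000.0):
    """Logistic competition among species under a temperature trajectory."""
    biomass = np.full(len(optima), 50.0)
    for temp in temps:
        r = growth_rate(temp, optima)
        biomass += r * biomass * (1.0 - biomass.sum() / capacity)
        biomass = np.clip(biomass, 0.0, None)
    return biomass

optima = np.array([8.0, 12.0, 16.0])                 # preferred temperatures, deg C
stable = simulate(np.full(50, 10.0), optima)          # constant 10 deg C climate
warmed = simulate(np.linspace(10.0, 14.0, 50), optima)  # +4 deg C over 50 steps
print("stable climate biomass:", stable.round(1))
print("warming climate biomass:", warmed.round(1))
```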

Relevance:

100.00%

Publisher:

Abstract:

Land use and transportation interaction has been a research topic for several decades. There have been efforts to identify the impacts of transportation on land use from several different perspectives. One focus has been the role of transportation improvements in encouraging new land development or the relocation of activities as a result of improved accessibility; the impacts studied have included property values and increased development. Another focus has been on changes in travel behavior due to better mobility and accessibility. Most studies to date have been conducted at the metropolitan level and are thus unable to account for interactions at smaller geographic scales, spatially and temporally. In this study, a framework for studying the temporal interactions between transportation and land use was proposed and applied to three selected corridor areas in Miami-Dade County, Florida. The framework consists of two parts: developing temporal data, and applying time series analysis to these data to identify their dynamic interactions. Temporal GIS databases were constructed and used to compile building permit data and transportation improvement projects. Two types of time series approach were utilized: univariate models and multivariate models. Time series analysis describes the dynamic behaviour of a system by developing models and forecasting its future based on historical trends. Model estimation results from the selected corridors were then compared. It was found that the time series models predicted residential development better than commercial development. It was also found that results from the three study corridors varied in the magnitude of impacts, length of lags, significance of the variables, and model structure. The long-run, cumulative impact of transportation improvements on land development was also measured with time series techniques. The study offered evidence that congestion negatively impacted development and that transportation investments encouraged land development.
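To make the multivariate time-series step concrete, the sketch below (synthetic data; the variable names, lag choices and library are assumptions, not the study's actual specification) fits a small vector autoregression relating transportation investment to building permits and reads off a cumulative long-run response:

```python
# Hedged sketch of a multivariate time-series model of the kind described:
# a two-variable VAR on quarterly data, with the cumulative impulse response
# standing in for the "long-run effect" measure mentioned in the abstract.
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(42)
n = 80  # e.g. 20 years of quarterly observations

# Synthetic series: permits respond to lagged investment, with noise.
investment = rng.normal(10.0, 2.0, n)
permits = np.zeros(n)
for t in range(2, n):
    permits[t] = 0.4 * permits[t - 1] + 0.6 * investment[t - 2] + rng.normal(0, 1)

data = pd.DataFrame({"investment": investment, "permits": permits})

results = VAR(data).fit(maxlags=4, ic="aic")  # lag order chosen by AIC
print(results.summary())

# Cumulative response of each variable to a shock in the other after 12
# quarters, analogous to a cumulated long-run impact.
irf = results.irf(12)
print(irf.cum_effects[-1])
```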

Relevance:

100.00%

Publisher:

Abstract:

Urban growth models have been used for decades to forecast urban development in metropolitan areas. Since the 1990s, cellular automata, with simple computational rules and an explicitly spatial architecture, have been heavily utilized in this endeavor. One such cellular-automata-based model, SLEUTH, has been successfully applied around the world to better understand and forecast not only urban growth but also other forms of land-use and land-cover change; like other models, however, it must be fed important information about which particular lands in the modeled area are available for development. Some of these lands fall into growth-exclusion categories that are difficult to quantify because their function is dictated by policy. One such category comprises voluntary differential assessment programs, whereby farmers agree not to develop their lands in exchange for significant tax breaks. Because enrollment is voluntary, today's excluded lands may become available for development at some point in the future. Mapping the shifting mosaic of parcels enrolled in such programs allows this information to be used in modeling and forecasting. In this study, we added information about California's Williamson Act to SLEUTH's excluded layer for Tulare County. Assumptions about the voluntary differential assessments were used to create a sophisticated excluded layer that was fed into SLEUTH's urban growth forecasting routine. The results demonstrate a successful execution of this method and yielded high goodness-of-fit metrics both for the calibration of enrollment termination and for the urban growth modeling itself.
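A minimal sketch of how such a policy-dependent exclusion might be encoded (my assumptions: SLEUTH reads a 0-100 weighted excluded raster, and `enrollment_persistence` is an invented parameter standing in for the study's enrollment assumptions; SLEUTH itself is a separate program):

```python
# Build a probabilistic "excluded" layer in which parcels enrolled in a
# voluntary differential-assessment program get a partial exclusion weight,
# reflecting that they may leave the program and become developable later.
import numpy as np

FULLY_EXCLUDED = 100   # convention: 100 = never urbanizes
OPEN = 0               # 0 = fully available for development

def build_excluded_layer(water_mask, protected_mask, enrolled_mask,
                         enrollment_persistence=0.7):
    """Combine boolean masks into one excluded layer.

    `enrollment_persistence` (assumed) is the chance an enrolled parcel stays
    enrolled over the forecast horizon; enrolled cells get a proportional,
    rather than absolute, exclusion value.
    """
    layer = np.full(water_mask.shape, OPEN, dtype=np.uint8)
    layer[enrolled_mask] = round(FULLY_EXCLUDED * enrollment_persistence)
    layer[protected_mask] = FULLY_EXCLUDED   # parks etc. always excluded
    layer[water_mask] = FULLY_EXCLUDED
    return layer

# Tiny toy grid.
shape = (4, 4)
water = np.zeros(shape, bool);     water[0, :] = True
protected = np.zeros(shape, bool); protected[3, 3] = True
enrolled = np.zeros(shape, bool);  enrolled[1:3, 1:3] = True
print(build_excluded_layer(water, protected, enrolled))
```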

Relevance:

100.00%

Publisher:

Abstract:

The applications of micro-end-milling operations have increased recently. A Micro-End-Milling Operation Guide and Research Tool (MOGART) package has been developed for the study and monitoring of micro-end-milling operations. It includes an analytical cutting force model, neural-network-based data mapping and forecasting processes, and genetic-algorithm-based optimization routines. MOGART uses neural networks to estimate tool machinability and forecast tool wear from experimental cutting force data, and genetic algorithms together with the analytical model to monitor tool wear, breakage, run-out, and cutting conditions from the cutting force profiles. The performance of MOGART has been tested on data from over 800 experimental cases, and very good agreement has been observed between the theoretical and experimental results. The MOGART package has been applied to a micro-end-milling study at the Engineering Prototype Center of the Radio Technology Division of Motorola Inc.
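As a hedged illustration of the neural-network forecasting step (not MOGART's actual network; the features, architecture and data below are invented):

```python
# Regress tool wear on cutting-force features with a small MLP, mirroring the
# "forecast tool wear from experimental cutting force data" step in spirit.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
n = 800  # the study reports roughly this many experimental cases

# Synthetic stand-in features: mean force, peak force, feed rate, spindle speed.
X = rng.uniform([1, 2, 0.5, 10_000], [10, 25, 5.0, 60_000], size=(n, 4))
wear = 0.02 * X[:, 0] + 0.01 * X[:, 1] + 0.05 * X[:, 2] + rng.normal(0, 0.02, n)

X_train, X_test, y_train, y_test = train_test_split(X, wear, random_state=0)
scaler = StandardScaler().fit(X_train)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(scaler.transform(X_train), y_train)
print("R^2 on held-out cases:", model.score(scaler.transform(X_test), y_test))
```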

Relevance:

100.00%

Publisher:

Abstract:

Quantitative Structure-Activity Relationship (QSAR) analysis has been applied extensively in predicting the toxicity of Disinfection By-Products (DBPs) in drinking water. Among many toxicological properties, the acute and chronic toxicities of DBPs have been widely used in health risk assessment. These toxicities are correlated with molecular properties, which are in turn usually correlated with molecular descriptors. The primary goals of this thesis are: (1) to investigate the effects of molecular descriptors (e.g., chlorine number) on molecular properties such as the energy of the lowest unoccupied molecular orbital (ELUMO) via QSAR modelling and analysis; (2) to validate the models by using internal and external cross-validation techniques; and (3) to quantify the model uncertainties through Taylor and Monte Carlo simulation. One important way to predict molecular properties such as ELUMO is QSAR analysis. In this study, the number of chlorine atoms (NCl) and the number of carbon atoms (NC), as well as the energy of the highest occupied molecular orbital (EHOMO), are used as molecular descriptors. Three approaches are typically used in QSAR model development: (1) Linear or Multi-linear Regression (MLR); (2) Partial Least Squares (PLS); and (3) Principal Component Regression (PCR). In QSAR analysis, a critical step is model validation, after the QSAR models are established and before they are applied to toxicity prediction. The DBPs studied include five chemical classes: chlorinated alkanes, alkenes, and aromatics. In addition, validated QSARs are developed to describe the toxicity of selected groups of DBP chemicals (i.e., chloro-alkanes and aromatic compounds with a nitro or cyano group) to three types of organism (fish, T. pyriformis, and P. phosphoreum), based on experimental toxicity data from the literature. The results show that: (1) QSAR models to predict molecular properties built by MLR, PLS or PCR can be used either to select valid data points or to eliminate outliers; (2) the Leave-One-Out cross-validation procedure by itself is not enough to give a reliable representation of the predictive ability of the QSAR models, but Leave-Many-Out/K-fold cross-validation and external validation can be applied together to achieve more reliable results; (3) ELUMO is shown to correlate highly with NCl for several classes of DBPs; and (4) according to uncertainty analysis using the Taylor method, the uncertainty of the QSAR models comes mostly from NCl for all DBP classes.
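The validation point in result (2) can be illustrated with a short sketch (synthetic descriptors and response; the library and split choices are my assumptions, not the thesis's procedure):

```python
# Compare Leave-One-Out with K-fold cross-validation and an external hold-out
# set for an MLR-style QSAR model: internal CV alone can look optimistic, so
# external validation is kept strictly separate.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import (LeaveOneOut, KFold, cross_val_score,
                                     train_test_split)

rng = np.random.default_rng(7)
n = 60
# Descriptors: N_Cl, N_C, E_HOMO (synthetic stand-ins).
X = np.column_stack([
    rng.integers(1, 6, n),        # number of chlorine atoms
    rng.integers(1, 8, n),        # number of carbon atoms
    rng.normal(-9.0, 0.5, n),     # E_HOMO, eV
]).astype(float)
y = -0.35 * X[:, 0] - 0.05 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 0.1, n)

# Set the external set aside before any internal validation.
X_int, X_ext, y_int, y_ext = train_test_split(X, y, test_size=0.25,
                                              random_state=0)
model = LinearRegression()
loo_mse = -cross_val_score(model, X_int, y_int, cv=LeaveOneOut(),
                           scoring="neg_mean_squared_error").mean()
kfold_q2 = cross_val_score(model, X_int, y_int,
                           cv=KFold(5, shuffle=True, random_state=0),
                           scoring="r2").mean()
model.fit(X_int, y_int)
print("LOO mean squared error:", loo_mse)
print("5-fold Q^2:", kfold_q2)
print("external R^2:", model.score(X_ext, y_ext))
```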

Relevance:

100.00%

Publisher:

Abstract:

A representative and consistent circumpolar wetland map is required for a range of applications, from the upscaling of carbon fluxes and pools to climate modelling and wildlife habitat assessment. Currently available data sets lack sufficient accuracy and/or thematic detail in many regions of the Arctic. Synthetic aperture radar (SAR) data from satellites have already been shown to be suitable for wetland mapping. Envisat Advanced SAR (ASAR) provides global medium-resolution data, which are examined in this study with particular focus on spatial wetness patterns. It was found that winter minimum backscatter values, as well as their differences from summer minimum values, reflect vegetation physiognomy units of certain wetness regimes. Low winter backscatter values are mostly found in areas vegetated by plant communities typical of wet regions in the tundra biome, owing to the low roughness and low volume scattering of the predominant vegetation. Summer-to-winter difference backscatter values, which in contrast to the winter values depend almost solely on soil moisture content, show the expected higher values for wet regions. While the difference-value approach would seem the more reasonable way to delineate wetness patterns, given its direct link to soil moisture, a classification of winter minimum backscatter values proved more applicable in tundra regions because it separates better into wetness classes. Previous approaches to wetland detection have investigated the impact of liquid water in the soil on backscatter; in this study the absence of liquid water is utilized. Owing to a lack of comparable regional to circumpolar data with respect to thematic detail, a potential wetland map cannot be validated directly; its validity can, however, be assessed by comparison with vegetation maps, which hold some information on the wetness status of certain classes. The Envisat ASAR-derived classes were shown to be related to the wetland classes of conventional vegetation maps, indicating the method's applicability; 30% of the land area north of the treeline was identified as wetland, whereas conventional maps record 1-7%.
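A schematic of the winter-minimum classification idea (the dB thresholds below are invented placeholders, not the study's calibrated boundaries):

```python
# Bin winter minimum backscatter (dB) into wetness classes: lower winter
# backscatter indicates vegetation typical of wetter tundra.
import numpy as np

# Hypothetical class boundaries in dB; real values would come from training data.
THRESHOLDS = [-21.0, -17.0, -13.0]
CLASSES = ["wet", "moist", "mesic", "dry"]

def classify_wetness(winter_min_db):
    """Map a winter-minimum backscatter image (dB) to wetness class indices."""
    return np.digitize(winter_min_db, THRESHOLDS)

scene = np.array([[-23.5, -18.2, -12.0],
                  [-16.9, -20.8, -11.5]])
idx = classify_wetness(scene)
print(np.vectorize(CLASSES.__getitem__)(idx))
```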

Relevance:

100.00%

Publisher:

Abstract:

Acknowledgements This work is based on the Ecosystem Land Use Modelling & Soil Carbon GHG Flux Trial (ELUM) project, which was commissioned and funded by the Energy Technologies Institute (ETI). The authors are grateful to Niall McNamara (Centre for Ecology & Hydrology, Lancaster) for coordinating the project and to Dagmar Henner (University of Aberdeen) for project assistance. We are also grateful to staff at the ETI, particularly to Geraldine Newton-Cross, Geraint Evans and Hannah Evans for constructive advice and feedback, and to Jonathan Oxley for project support. The ELUM Software Package contains Ordnance Survey data © Crown copyright and database right 2012.

Relevance:

100.00%

Publisher:

Abstract:

This report is a review of additive and subtractive manufacturing techniques. Additive manufacturing has resided largely in the prototyping realm, as a method of producing complex freeform solid objects directly from a computer model without part-specific tooling or knowledge. But these technologies are evolving steadily and are beginning to encompass related systems of material addition, subtraction, assembly, and insertion of components made by other processes. Furthermore, these various additive processes are starting to evolve from narrowly defined rapid prototyping into rapid manufacturing techniques for mass-customized products. Taken far enough, and several years hence, this could bring a radical restructuring of manufacturing: manufacturing itself would move from a resource base to a knowledge base, and from mass production of single-use products to mass-customized, high-value, life-cycle products. To date, the majority of research and development has focused on advancing existing technologies by improving processing performance, materials, modelling and simulation tools, and design tools, to enable the transition from prototyping to the manufacturing of end-use parts.

Relevance:

100.00%

Publisher:

Abstract:

The purpose of this research is to explore the use of modelling in the field of Purchasing and Supply Management (P/SM). We are particularly interested in identifying the specific areas of P/SM where there are opportunities for the use of modelling-based methods. The paper starts with an overview of the main types of modelling and also provides a categorisation of the main P/SM research themes. Our research shows that there are many opportunities for using descriptive, predictive and prescriptive modelling approaches in all areas of P/SM research, from those focused on the function itself from a purely operational and execution perspective (e.g. purchasing processes and behaviour) to those focused on the organisational level from a more strategic perspective (e.g. strategy and policy). We conclude that future P/SM research needs to explore the value of modelling not just at the functional or operational level, but also at the organisational and strategic levels. We also acknowledge that while using empirical results to inform and improve models has advantages, there are also drawbacks relating to the value, practical relevance and generalisability of modelling-based approaches.

Relevance:

100.00%

Publisher:

Abstract:

The primary objective is to investigate the main factors contributing to GMS expenditure on pharmaceutical prescribing and to project this expenditure to 2026. This study is located in the pharmacoeconomic literature on cost containment and projections. The thesis has five main aims: 1. To determine the main factors contributing to GMS expenditure on pharmaceutical prescribing. 2. To develop a model to project GMS prescribing expenditure in five-year intervals to 2026, using 2006 Central Statistics Office (CSO) Census data and 2007 Health Service Executive-Primary Care Reimbursement Service (HSE-PCRS) sample data. 3. To develop a model to project GMS prescribing expenditure in five-year intervals to 2026, using 2012 HSE-PCRS population data, incorporating cost containment measures, and 2011 CSO Census data. 4. To investigate the impact of demographic factors and the pharmacology of drugs (Anatomical Therapeutic Chemical (ATC) classification) on GMS expenditure. 5. To explore the consequences of GMS policy changes on prescribing expenditure and behaviour between 2008 and 2014. The thesis is centred on three published articles and spans the end of the booming Irish economy in 2007, the recession from 2008 to 2013, and the beginning of the recovery in 2014. The literature identified a number of factors influencing pharmaceutical expenditure, including population growth, population ageing, changes in drug utilisation and drug therapies, age, gender and location. The literature also identified the methods previously used in predictive modelling; consequently, a Monte Carlo Simulation (MCS) model was used to simulate projected expenditure to 2026, and Ordinary Least Squares (OLS) regression was used to determine the demographic and pharmacological factors influencing prescribing expenditure. The study commences against a backdrop of growing GMS prescribing costs, which rose from €250 million in 1998 to over €1 billion by 2007. Using sample 2007 HSE-PCRS prescribing data (n=192,000) and CSO population data from 2008, Conway et al. (2014) estimated that GMS prescribing expenditure could rise to €2 billion by 2026. The cogency of these findings was affected by the global economic crisis of 2008, which resulted in a sharp contraction in the Irish economy and mounting fiscal deficits, leading to Ireland's entry into a bailout programme. The sustainability of funding community drug schemes such as the GMS came under the scrutiny of the EU, IMF and ECB (the Troika), who set stringent targets for reducing drug costs as conditions of the bailout programme. Cost containment measures included the introduction of income eligibility limits for GP visit cards and medical cards for those aged 70 and over, the introduction of co-payments for prescription items, and reductions in wholesale mark-up and pharmacy dispensing fees. Projections for GMS expenditure were re-evaluated using 2012 HSE-PCRS prescribing population data and CSO population data based on Census 2011. Taking into account both cost containment measures and revised population predictions, GMS expenditure is estimated to increase by 64%, from €1.1 billion in 2016 to €1.8 billion by 2026 (Conway-Lenihan and Woods, 2015). In the final paper, a cross-sectional study was carried out on the HSE-PCRS population prescribing database (n=1.63 million claimants) to investigate the impact of demographic factors, and the pharmacology of the drugs, on GMS prescribing expenditure. Those aged over 75 (β = 1.195) and cardiovascular prescribing (β = 1.193) were the greatest contributors to annual GMS prescribing costs. Respiratory drugs (montelukast) recorded the highest proportion and expenditure for GMS claimants under the age of 15. Drugs prescribed for the nervous system (escitalopram, olanzapine and pregabalin) were highest for those between 16 and 64 years, while cardiovascular drugs (statins) were highest for those aged over 65. Female claimants cost more than males and are prescribed more items across the four ATC groups, except among children under 11 (Conway-Lenihan et al., 2016). This research indicates that growth in the proportion of elderly claimants and the associated levels of cardiovascular prescribing, particularly of statins, will present difficulties for Ireland in terms of cost containment. While policies aimed at cost containment (co-payment charges, generic substitution, reference pricing, adjustments to GMS eligibility) can be used to curtail expenditure, health promotion programmes and educational interventions should be given equal emphasis. Policies intended to affect physicians' prescribing behaviour, such as guidelines, information (about prices and less expensive alternatives) and feedback, together with budgetary restrictions, could also yield savings.
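A hedged sketch of the Monte Carlo projection approach named above (the age bands, population figures, growth rates and cost distributions are invented placeholders, not the thesis's inputs):

```python
# Simulate total prescribing expenditure in a target year by drawing
# per-claimant costs per age band and scaling by projected claimant numbers.
import numpy as np

rng = np.random.default_rng(2026)

AGE_BANDS = ["0-15", "16-44", "45-64", "65-74", "75+"]
claimants_2016 = np.array([300_000, 500_000, 350_000, 200_000, 150_000])
annual_pop_growth = np.array([0.000, 0.002, 0.010, 0.025, 0.035])  # assumed
mean_cost = np.array([150.0, 300.0, 700.0, 1200.0, 1800.0])        # EUR, assumed
cost_sd = 0.15 * mean_cost

def project(year_from=2016, year_to=2026, n_sims=10_000):
    """Simulate total GMS prescribing expenditure in the target year."""
    years = year_to - year_from
    pop = claimants_2016 * (1 + annual_pop_growth) ** years
    # One per-claimant average cost draw per band, per simulation run.
    costs = rng.normal(mean_cost, cost_sd, size=(n_sims, len(AGE_BANDS)))
    return (costs * pop).sum(axis=1)

totals = project()
print(f"median 2026 expenditure: EUR {np.median(totals) / 1e9:.2f} bn")
print(f"90% interval (EUR bn): {np.percentile(totals, [5, 95]) / 1e9}")
```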

Relevance:

100.00%

Publisher:

Abstract:

Limited information on East Antarctic Ice Sheet (EAIS) geometry during Marine Isotope Stage 3 (MIS 3; 60-25 ka) restricts our understanding of its behaviour during periods of climate and sea level change. Ice sheet models forced by global parameters suggest an EAIS during MIS 3 that was expanded relative to the Holocene, but field evidence from East Antarctic coastal areas contradicts such modelling and suggests that the ice sheet margins were no more advanced than at present. Here we present a new lake sediment record, and cosmogenic exposure results from bedrock, which confirm that Rauer Group (eastern Prydz Bay) was ice-free for much of MIS 3. We also refine the likely duration of the Last Glacial Maximum (LGM) glaciation in the region. Lacustrine and marine sediments from Rauer Group indicate that the penultimate period of ice retreat predates 50 ka. The lacustrine record indicates a change from warmer/wetter to cooler/drier conditions after ca. 35 ka. Substantive ice sheet re-advance, however, may not have occurred until much closer to 20 ka. Contemporary coastal areas were still connected to the sea during MIS 3, restricting the possible extent of grounded ice on the continental shelf in Prydz Bay. In contrast, relative sea levels (RSL) deduced from field evidence indicate an additional ice load, averaging several hundred metres of extra ice thickness, across the bay between 45 and 32 ka. Thus the ice must either have been thicker immediately inland (with a steeper ice profile), or there were additional ice domes on the shallow banks of the outer continental shelf. Further work is required to reconcile the differences between empirical evidence of past ice sheet histories and the histories predicted by ice sheet models from far-field temperature and sea level records.

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, novel analog-to-digital and digital-to-analog generalized time-interleaved variable bandpass sigma-delta modulators are designed, analysed, evaluated and implemented that are suitable for high-performance data conversion across a broad spectrum of applications. These generalized time-interleaved variable bandpass sigma-delta modulators can perform noise-shaping for any centre frequency from DC to Nyquist. The proposed topologies are well suited to Butterworth, Chebyshev, inverse-Chebyshev and elliptical filters, where designers have the flexibility of specifying the centre frequency and bandwidth as well as the passband and stopband attenuation parameters. The application of the time-interleaving approach, in combination with these bandpass loop-filters, not only overcomes the limitations associated with conventional and mid-band resonator-based bandpass sigma-delta modulators, but also offers an elegant means of increasing the conversion bandwidth, thereby relaxing the need to use faster or higher-order sigma-delta modulators. A step-by-step design technique has been developed for time-interleaved variable bandpass sigma-delta modulators. Using this technique, an assortment of lower- and higher-order single- and multi-path generalized A/D variable bandpass sigma-delta modulators was designed, evaluated and compared in terms of signal-to-noise ratio, hardware complexity, stability, tonality and sensitivity for ideal and non-ideal topologies. Extensive behavioural-level simulations verified that one of the proposed topologies not only used fewer coefficients but also exhibited greater robustness to non-idealities. Furthermore, second-, fourth- and sixth-order single- and multi-path digital variable bandpass sigma-delta modulators were designed using this technique. The mathematical modelling and evaluation of tones caused by the finite wordlengths of these digital multi-path sigma-delta modulators, when excited by sinusoidal input signals, are also derived from first principles and verified using simulation and experimental results. The fourth-order digital variable bandpass sigma-delta modulator topologies were implemented in VHDL and synthesized on a Xilinx® Spartan™-3 Development Kit using fixed-point arithmetic. Circuit outputs were taken via the RS232 connection provided on the FPGA board and evaluated using MATLAB routines developed by the author; these routines included the decimation process as well. The experiments undertaken further validated the design methodology presented in the work. In addition, a novel tunable and reconfigurable second-order variable bandpass sigma-delta modulator has been designed and evaluated at the behavioural level. This topology offers a flexible set of choices for designers and can operate in either single- or dual-mode, enabling multi-band implementations on a single digital variable bandpass sigma-delta modulator. The work is also supported by a novel user-friendly design and evaluation tool, developed in MATLAB/Simulink, that can speed up the design, evaluation and comparison of analog and digital single-stage and time-interleaved variable bandpass sigma-delta modulators. This tool enables the user to specify the conversion type, topology, loop-filter type, path number and oversampling ratio.
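For orientation, here is a behavioural-level sketch of the baseline that variable bandpass designs generalize (this is a plain second-order lowpass modulator, not the thesis's time-interleaved topology; the 0.5 integrator gains are common textbook values, assumed for illustration):

```python
# Second-order lowpass sigma-delta modulator with a 1-bit quantizer.
# Variable bandpass topologies move this noise-shaping notch from DC to an
# arbitrary centre frequency between DC and Nyquist.
import numpy as np

def sigma_delta_2nd_order(x):
    """Two cascaded integrators with unity feedback and a 1-bit quantizer."""
    v = np.empty_like(x)
    i1 = i2 = 0.0
    for n, xn in enumerate(x):
        y = 1.0 if i2 >= 0.0 else -1.0   # 1-bit quantizer decision
        i1 += 0.5 * (xn - y)             # first integrator
        i2 += 0.5 * (i1 - y)             # second integrator
        v[n] = y
    return v

# Half-scale sine input, then a crude moving-average decimator (OSR = 64).
n, f_in, osr = 4096, 0.005, 64
x = 0.5 * np.sin(2 * np.pi * f_in * np.arange(n))
v = sigma_delta_2nd_order(x)
decimated = v.reshape(-1, osr).mean(axis=1)
print("decimated output, first 8 samples:", np.round(decimated[:8], 3))
```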