953 results for Models and Principles


Abstract:

Sight distance plays an important role in road traffic safety. Two types of Digital Elevation Models (DEMs) are used to estimate available sight distance on roads: Digital Terrain Models (DTMs) and Digital Surface Models (DSMs). DTMs, which represent the bare ground surface, are commonly used to determine available sight distance at the design stage. DSMs additionally capture roadside elements such as trees, buildings, walls or even traffic signals, which may reduce available sight distance. This document analyses the influence of three classes of DEMs on the estimation of available sight distance. For this purpose, several roads in the Region of Madrid (Spain) have been studied using software based on geographic information systems. The study shows how each DEM affects the outcome, as well as the pros and cons of each model.
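The core computation behind such estimates is a line-of-sight test against the elevation model. Below is a minimal Python sketch of that test on a gridded DEM; the function name, the nearest-neighbour sampling scheme and the eye/target heights are illustrative assumptions, not the GIS software used in the study.

```python
import numpy as np

def line_of_sight(dem, p_obs, p_tgt, h_obs=1.1, h_tgt=0.5, n=200):
    """Return True if the target is visible from the observer.

    dem         : 2-D array of elevations (a DTM or a DSM)
    p_obs/p_tgt : (row, col) grid positions of observer and target
    h_obs/h_tgt : eye and target heights above the surface (m), assumed
    """
    (r0, c0), (r1, c1) = p_obs, p_tgt
    t = np.linspace(0.0, 1.0, n)
    rows = r0 + t * (r1 - r0)
    cols = c0 + t * (c1 - c0)
    # Nearest-neighbour sampling of the surface along the ray
    ground = dem[rows.round().astype(int), cols.round().astype(int)]
    z0 = ground[0] + h_obs
    z1 = ground[-1] + h_tgt
    sight = z0 + t * (z1 - z0)          # straight sight line
    # Visible if the surface never rises above the sight line in between
    return bool(np.all(ground[1:-1] <= sight[1:-1]))
```

Available sight distance along an alignment is then the distance to the farthest target position for which the test still succeeds; with a DSM, roadside obstacles enter `dem` and typically shorten that distance compared with a DTM.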

Abstract:

We discuss linear Ricardo models with a range of parameters. We show that the exact boundary of the region of equilibria of these models is obtained by solving a simple integer programming problem, and that there is an exact correspondence between many of the equilibria resulting from families of linear models and the multiple equilibria of economies-of-scale models.
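The abstract does not spell out the integer programme, so the following Python sketch only illustrates the kind of small combinatorial problem involved: each good is assigned to one of two countries (the integer decision) and world output is maximised given assumed Ricardian unit labour requirements and labour endowments. All numbers are invented for illustration.

```python
from itertools import product

a = [[1, 2, 4], [3, 3, 2]]   # a[i][j]: labour per unit of good j in country i (assumed)
L = [10, 12]                 # labour endowments (assumed)

best = None
for assign in product([0, 1], repeat=3):     # integer decision: who produces good j
    goods = [[j for j in range(3) if assign[j] == i] for i in range(2)]
    if any(not g for g in goods):
        continue                             # skip assignments where a country is idle
    # Split each country's labour evenly across its assigned goods and sum
    # outputs (all goods valued at the same unit price, a simplification).
    out = sum(L[i] / len(goods[i]) / a[i][j] for i in range(2) for j in goods[i])
    if best is None or out > best[0]:
        best = (out, assign)
print(best)
```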

Abstract:

In recent years, VAR models have become the main econometric tool for testing whether a relationship between variables may exist and for evaluating the effects of economic policies. This thesis studies three different identification approaches starting from reduced-form VAR models (including choices of sampling period, set of endogenous variables, and deterministic terms). For VAR models we use the Granger causality test to verify the ability of one variable to predict another; in the case of cointegration we use VECM models to jointly estimate the long-run and short-run coefficients; and in the case of small datasets and overfitting problems we use Bayesian VAR models with impulse response functions and variance decomposition to analyse the effect of shocks on macroeconomic variables. To this end, the empirical studies are carried out using specific time-series data and formulating different hypotheses. Three VAR models were used: first, to study monetary policy decisions and discriminate among the various post-Keynesian theories of monetary policy, in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015) and the nominal GDP rule in the Euro Area (paper 1); second, to extend the evidence for the money-endogeneity hypothesis by evaluating the effects of bank securitisation on the monetary policy transmission mechanism in the United States (paper 2); third, to evaluate the effects of ageing on health expenditure in Italy in terms of economic policy implications (paper 3). The thesis is introduced by chapter 1, which outlines the context, motivation and purpose of this research, while the structure and summary, as well as the main results, are described in the remaining chapters. Chapter 2 examines, using a first-difference VAR model with quarterly Euro-area data, whether monetary policy decisions can be interpreted in terms of a "monetary policy rule", with specific reference to the so-called "nominal GDP targeting rule" (McCallum 1988; Hall and Mankiw 1994; Woodford 2012). The results show a causal relationship running from the gap between the growth rates of nominal GDP and target GDP to changes in three-month market interest rates. The same analysis does not appear to confirm a significant causal relationship in the opposite direction, from changes in the market interest rate to the gap between the growth rates of nominal GDP and target GDP. Similar results were obtained by replacing the market interest rate with the ECB refinancing rate. This confirmation of only one of the two directions of causality does not support an interpretation of monetary policy based on the nominal GDP targeting rule, and raises more general doubts about the applicability of the Taylor rule and of all conventional monetary policy rules in the case at hand. The results appear instead to be more in line with other possible approaches, such as those based on certain post-Keynesian and Marxist analyses of monetary theory, and in particular the so-called "solvency rule" (Brancaccio and Fontana 2013, 2015).
These lines of research challenge the simplistic thesis that the scope of monetary policy is to stabilize inflation, real GDP or nominal income around a "natural" equilibrium level. Rather, they suggest that central banks actually pursue a more complex purpose, namely the regulation of the financial system, with particular reference to the relations between creditors and debtors and the solvency of economic units. Chapter 3 analyses loan supply, considering the endogeneity of money arising from banks' securitisation activity over the period 1999-2012. Although much of the literature investigates the endogeneity of the money supply, this approach has rarely been adopted to investigate money endogeneity in the short and long run with a study of the United States during the two main crises: the bursting of the dot-com bubble (1998-1999) and the sub-prime mortgage crisis (2008-2009). In particular, we consider the effects of financial innovation on the lending channel, using the securitisation-adjusted loan series in order to verify whether the US banking system is induced to seek cheaper sources of funding, such as securitisation, under restrictive monetary policy (Altunbas et al., 2009). The analysis is based on the monetary aggregates M1 and M2. Using VECM models, we examine a long-run relationship between the variables in levels and evaluate the effects of the money supply by analysing how much monetary policy affects short-run deviations from the long-run relationship. The results show that securitisation affects the impact of loans on M1 and M2. This implies that the money supply is endogenous, confirming the structuralist approach and showing that economic agents are motivated to increase securitisation as a pre-emptive hedge against monetary policy shocks. Chapter 4 investigates the relationship between per capita health expenditure, per capita GDP, the ageing index and life expectancy in Italy over the period 1990-2013, using Bayesian VAR models and annual data drawn from the OECD and Eurostat databases. Impulse response functions and variance decomposition show a positive relationship: from per capita GDP to per capita health expenditure, from life expectancy to health expenditure, and from the ageing index to per capita health expenditure. The impact of ageing on health expenditure is more significant than that of the other variables. Overall, our results suggest that disabilities closely connected with ageing may be the main driver of health expenditure in the short-to-medium run. Good healthcare management helps to improve patient well-being without increasing total health expenditure. However, policies that improve the health status of older people may be needed to lower per capita demand for health and social services.
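As a concrete illustration of the first methodological step used across the three papers, here is a minimal reduced-form VAR with a Granger-causality test in Python (statsmodels). The simulated series and the variable names `gap` and `rate` are placeholders, not the thesis's Euro-area data.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

rng = np.random.default_rng(0)
n = 200
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):                      # by construction, x helps predict y, not vice versa
    x[t] = 0.5 * x[t-1] + rng.normal()
    y[t] = 0.3 * y[t-1] + 0.4 * x[t-1] + rng.normal()

data = pd.DataFrame({"gap": x, "rate": y})  # e.g. nominal-GDP gap and market rate (placeholders)
res = VAR(data).fit(maxlags=4, ic="aic")    # reduced-form VAR, lag order by AIC
print(res.test_causality("rate", ["gap"], kind="f").summary())  # does gap Granger-cause rate?
print(res.test_causality("gap", ["rate"], kind="f").summary())  # does rate Granger-cause gap?
```

Finding the first test significant and the second not would mirror the one-directional causality pattern reported in chapter 2.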

Abstract:

Friction in hydrodynamic bearings is a major source of losses in car engines ([69]). The extreme loading conditions in those bearings lead to contact between the matching surfaces. In such conditions not only the overall geometry of the bearing is relevant; the small-scale topography of the surface also determines the bearing performance. The possibility of shaping the surface of lubricated bearings down to the micrometer ([57]) opened the question of whether friction can be reduced by means of micro-textures, with mixed results. This work focuses on the development of efficient numerical methods to solve thin-film (lubrication) problems down to the roughness scale of measured surfaces. Due to the high velocities and the convergent-divergent geometries of hydrodynamic bearings, cavitation takes place. To treat cavitation in the lubrication problem the Elrod-Adams model is used, a mass-conserving model which has proven in careful numerical ([12]) and experimental ([119]) tests to be essential to obtain physically meaningful results. Another relevant aspect of the modeling is that inertial effects in the bearing are considered, which is necessary to correctly simulate moving textures. As an application, the effects of micro-texturing the moving surface of the bearing were studied. Realistic values are assumed for the physical parameters defining the problems. Extensive fundamental studies were carried out in the hydrodynamic lubrication regime. Mesh-converged simulations considering the topography of real measured surfaces were also run, and the validity of the lubrication approximation was assessed for such rough surfaces.
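For orientation, the underlying thin-film problem is the Reynolds equation. The Python sketch below solves a 1-D steady, incompressible version for a converging-diverging slider by finite differences, with all parameter values invented for illustration. Note that simply clipping negative pressures (half-Sommerfeld, as done in the last line) does not conserve mass; the Elrod-Adams model used in the work replaces this cut-off with a mass-conserving complementarity formulation.

```python
import numpy as np

mu, U, L = 0.01, 10.0, 0.01                 # viscosity (Pa s), speed (m/s), length (m); assumed
n = 201
x = np.linspace(0.0, L, n)
h = 20e-6 - 15e-6 * np.sin(np.pi * x / L)   # converging-diverging film thickness (m); assumed

# Assemble d/dx(h^3 dp/dx) = 6 mu U dh/dx with ambient pressure p = 0 at both ends
dx = x[1] - x[0]
A = np.zeros((n, n))
b = np.zeros(n)
for i in range(1, n - 1):
    hw = ((h[i-1] + h[i]) / 2) ** 3         # h^3 at the west half-node
    he = ((h[i] + h[i+1]) / 2) ** 3         # h^3 at the east half-node
    A[i, i-1], A[i, i], A[i, i+1] = hw, -(hw + he), he
    b[i] = 6 * mu * U * (h[i+1] - h[i-1]) / 2 * dx
A[0, 0] = A[-1, -1] = 1.0                   # Dirichlet boundaries (p = 0)
p = np.linalg.solve(A, b)
p = np.maximum(p, 0.0)                      # crude cavitation cut-off, NOT mass-conserving
```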

Abstract:

Phase equilibrium data regression is an unavoidable task in obtaining appropriate parameter values for any model to be used in separation equipment design and in chemical process simulation and optimization. The accuracy of this process depends on several factors, such as the quality of the experimental data, the selected model and the calculation algorithm. The present paper summarizes the results and conclusions of our research on the capabilities and limitations of the existing GE models and on strategies that can be included in correlation algorithms to improve convergence and avoid inconsistencies. The NRTL model has been selected as a representative local composition model. New capabilities of this model, but also several relevant limitations, have been identified, and some examples of the application of a modified NRTL equation are discussed. Furthermore, a regression algorithm has been developed that allows the advisable simultaneous regression of all the condensed-phase equilibrium regions present in ternary systems at constant T and P. It includes specific strategies designed to avoid some of the pitfalls frequently found in commercial regression tools for phase equilibrium calculations. Most of the proposed strategies are based on the geometrical interpretation of the lowest common tangent plane equilibrium criterion, which allows an unambiguous understanding of the behavior of the mixtures. The paper presents this work as a whole in order to reveal the effort that must still be devoted to overcoming the difficulties that remain in the phase equilibrium data regression problem.
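For reference, the standard binary form of the NRTL model being regressed looks as follows in Python; the parameter values passed in would come from the regression, and the numbers in the docstring example are illustrative only.

```python
import numpy as np

def nrtl_gamma(x1, tau12, tau21, alpha=0.3):
    """Binary NRTL activity coefficients (gamma1, gamma2).

    x1           : mole fraction of component 1
    tau12, tau21 : dimensionless interaction parameters (fitted in regression)
    alpha        : non-randomness parameter (0.3 is a common default)
    """
    x2 = 1.0 - x1
    G12, G21 = np.exp(-alpha * tau12), np.exp(-alpha * tau21)
    ln_g1 = x2**2 * (tau21 * (G21 / (x1 + x2 * G21))**2
                     + tau12 * G12 / (x2 + x1 * G12)**2)
    ln_g2 = x1**2 * (tau12 * (G12 / (x2 + x1 * G12))**2
                     + tau21 * G21 / (x1 + x2 * G21)**2)
    return np.exp(ln_g1), np.exp(ln_g2)
```

A regression algorithm of the kind described would adjust `tau12` and `tau21` to reproduce the experimental equilibrium data while checking the common tangent plane criterion on the resulting Gibbs energy surface to reject thermodynamically inconsistent fits.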

Abstract:

Statistical machine translation (SMT) is an approach to Machine Translation (MT) that uses statistical models whose parameters are estimated from the analysis of existing human translations (contained in bilingual corpora). From a translation student's standpoint, this dissertation aims to explain how a phrase-based SMT system works, to determine the role of the statistical models it uses in the translation process, and to assess the quality of the translations it produces provided the system is trained with in-domain, good-quality corpora. To that end, a phrase-based SMT system based on Moses has been trained and subsequently used for the English-to-Spanish translation of two texts related in topic to the training data. Finally, the quality of the output texts produced by the system has been assessed through a quantitative evaluation, carried out with three different automatic evaluation measures, and a qualitative evaluation based on the Multidimensional Quality Metrics (MQM).
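For the quantitative part, an automatic metric compares the system output against a human reference. The sketch below scores a single made-up Spanish sentence pair with BLEU using the sacrebleu package; the abstract does not name the dissertation's three metrics, so BLEU is only a representative choice here.

```python
import sacrebleu

hypotheses = ["el gato está en la alfombra"]        # system output (illustrative)
references = [["el gato está sobre la alfombra"]]   # one human reference stream
print(sacrebleu.corpus_bleu(hypotheses, references).score)
```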

Abstract:

This paper employs fifteen dynamic macroeconomic models maintained within the European System of Central Banks to assess the size of fiscal multipliers in European countries. Using a set of common simulations, we consider transitory and permanent shocks to government expenditures and to different taxes. We investigate how the baseline multipliers change when monetary policy is transitorily constrained by the zero nominal interest rate bound, when certain crisis-related structural features of the economy change (such as the share of liquidity-constrained households), and when the endogenous fiscal rule that ensures fiscal sustainability in the long run is specified in terms of labour income taxes instead of lump-sum taxes.
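A fiscal multiplier, in this context, is the output response per unit of fiscal impulse read off a model simulation. The sketch below computes an impact multiplier and a discounted cumulative multiplier from made-up response paths; the definitions are standard, but the numbers and the discount rate are illustrative assumptions, not the paper's results.

```python
import numpy as np

dY = np.array([0.8, 0.6, 0.4, 0.2])    # GDP deviation from baseline, % of GDP (invented)
dG = np.array([1.0, 1.0, 1.0, 1.0])    # government spending impulse, % of GDP (invented)

impact_multiplier = dY[0] / dG[0]      # output response on impact per unit of impulse

r = 0.01                               # quarterly discount rate (assumed)
disc = (1 + r) ** -np.arange(len(dY))
pv_multiplier = (disc * dY).sum() / (disc * dG).sum()   # cumulative present-value multiplier

print(impact_multiplier, pv_multiplier)
```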

Abstract:

A society acts sustainably if it ensures the long-term stability and productivity of ecological, sociopolitical and economic systems. In the past, issues of sustainability were typically handled separately, neglecting the effects of individual measures on the other elements implied by a comprehensive conception of sustainability. The challenge ahead is to develop a holistic strategy for sustainable economic activity that takes into account the interdependencies between the various aspects of sustainability, and does not seek to solve problems of sustainability at other aspects' expense.

Abstract:

This paper reviews peer-to-peer (P2P) lending, its development in the UK and other countries, and assesses the business and economic policy issues surrounding this new form of intermediation. P2P platform technology allows the direct matching of borrowers and lenders, with lenders diversifying over a large number of borrowers, without the loans having to be held on an intermediary's balance sheet. P2P lending has developed rapidly in both the US and the UK, but it still represents a small fraction, less than 1%, of the stock of bank lending. In the UK – but not elsewhere – it is an important source of loans for smaller companies. We argue that P2P lending is fundamentally complementary to, and not competitive with, conventional banking. We therefore expect banks to adapt to the emergence of P2P lending, either by cooperating closely with third-party P2P lending platforms or by offering their own proprietary platforms. We also argue that the full development of the sector requires much further work addressing the risks and the business and regulatory issues in P2P lending, including risk communication, orderly resolution of platform failure, control of liquidity risks, and minimisation of fraud, security and operational risks. This will depend on developing reliable business processes, promoting transparency and standardisation to the fullest extent possible, and putting in place appropriate regulation that serves the needs of customers.

Abstract:

Substantial retreat or disintegration of numerous ice shelves has been observed on the Antarctic Peninsula. The ice shelf in the Prince Gustav Channel retreated gradually from the late 1980s onwards and broke up in 1995. Tributary glaciers reacted with speed-up, surface lowering and increased ice discharge, consequently contributing to sea level rise. We present a detailed long-term study (1993-2014) of the dynamic response of Sjögren Inlet glaciers to the disintegration of the Prince Gustav Ice Shelf. We analyzed various remote sensing datasets to observe the reactions of the glaciers to the loss of the buttressing ice shelf. A strong increase in ice surface velocities was observed, with maximum flow speeds reaching 2.82±0.48 m/d in 2007 and 1.50±0.32 m/d in 2004 at Sjögren and Boydell glaciers, respectively. Subsequently, the flow velocities decelerated; however, in late 2014 we still measured about twice the values of our first measurements in 1996. The tributary glaciers retreated 61.7±3.1 km² behind the former grounding line of the ice shelf. In regions below 1000 m a.s.l., a mean surface lowering of -68±10 m (-3.1 m/a) was observed over the period 1993-2014. The lowering rate decreased to -2.2 m/a in recent years. Based on the surface lowering rates, geodetic mass balances of the glaciers were derived for different time steps. A high mass loss rate of -1.21±0.36 Gt/a was found in the earliest period (1993-2001). Due to the dynamic adjustment of the glaciers to the new boundary conditions, the ice mass loss decreased to -0.59±0.11 Gt/a in the period 2012-2014, resulting in an average mass loss rate of -0.89±0.16 Gt/a (1993-2014). Including the retreat of the ice front and grounding line, a total mass change of -38.5±7.7 Gt and a contribution to sea level rise of 0.061±0.013 mm were computed. Analysis of the ice flux revealed that available bedrock elevation estimates at Sjögren Inlet are too shallow and are the major source of uncertainty in ice flux computations. This temporally dense time series analysis of Sjögren Inlet glaciers shows that the adjustment of tributary glaciers to ice shelf disintegration is still ongoing and provides detailed information on the changes in glacier dynamics.
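The geodetic mass-balance step reduces to simple arithmetic once the elevation-change rate is known: mass change rate = lowering rate × area × density. The Python sketch below shows that conversion with the -3.1 m/a rate from the abstract; the area and density values are illustrative assumptions, and the study's actual processing (DEM differencing, hypsometric extrapolation, uncertainty propagation) is considerably more involved.

```python
area_km2 = 100.0   # glacier area below 1000 m a.s.l. (assumed, for illustration)
dh_dt = -3.1       # mean surface elevation change rate (m/a), from the abstract
rho = 900.0        # density assumption for glacier ice (kg/m^3)

# kg/a -> Gt/a: 1 Gt = 1e12 kg; 1 km^2 = 1e6 m^2
dm_dt_gt = dh_dt * (area_km2 * 1e6) * rho / 1e12
print(f"{dm_dt_gt:.2f} Gt/a")
```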

Abstract:

Late Pleistocene signals of calcium carbonate, organic carbon, and opaline silica concentration and accumulation are documented in a series of cores from a zonal/meridional/depth transect in the equatorial Atlantic Ocean to reconstruct the regional sedimentary history. Spectral analysis reveals that maxima and minima in biogenous sedimentation occur with glacial-interglacial cyclicity as a function of both (1) primary production at the sea surface modulated by orbitally forced variation in trade wind zonality and (2) destruction at the seafloor by variation in the chemical character of advected intermediate and deep water from high latitudes modulated by high-latitude ice volume. From these results a pattern emerges in which the relative proportion of signal variance from the productivity signal centered on the precessional (23 kyr) band decreases while that of the destruction signal centered on the obliquity (41 kyr) and eccentricity (100 kyr) periods increases below ~3600-m ocean depth.
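The attribution to orbital bands rests on spectral analysis of the sediment records. The Python sketch below estimates how much variance sits at the precessional (23 kyr), obliquity (41 kyr) and eccentricity (100 kyr) periods using a plain periodogram; the synthetic record, its length and its sample spacing are stand-ins for the core data, and the study's spectral method is not specified in the abstract.

```python
import numpy as np
from scipy.signal import periodogram

dt = 2.0                                   # sample spacing in kyr (assumed)
t = np.arange(0, 600, dt)
rec = (np.sin(2 * np.pi * t / 23)          # synthetic mix of the three orbital bands
       + 0.8 * np.sin(2 * np.pi * t / 41)
       + 1.2 * np.sin(2 * np.pi * t / 100)
       + np.random.default_rng(1).normal(0, 0.5, t.size))

f, pxx = periodogram(rec, fs=1 / dt)       # frequency in cycles/kyr
for period in (23, 41, 100):
    i = np.argmin(np.abs(f - 1 / period))  # nearest frequency bin to each orbital band
    print(f"{period} kyr band power: {pxx[i]:.2f}")
```

Comparing such band powers across core depths is what reveals the shift from precession-dominated (productivity) to obliquity- and eccentricity-dominated (destruction) variance below ~3600 m.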

Abstract:

The high-resolution δ18O and δ13C records of benthic foraminifera from a 150,000-year-long core from the Caribbean Sea indicate generally high δ13C during glaciations and low δ13C during interglaciations. Due to its 1800-m sill depth, the properties of deep water in the Caribbean Sea are similar to those of middepth tropical Atlantic water. During interglaciations, the water filling the deep Caribbean Sea is an admixture of low-δ13C Upper Circumpolar Water (UCPW) and high-δ13C Upper North Atlantic Deep Water (UNADW). By contrast, only high-δ13C UNADW enters during glaciations. Deep ocean circulation changes can influence atmospheric CO2 levels (Broecker and Takahashi, 1985; Boyle, 1988, doi:10.1029/JC093iC12p15701; Keir, 1988, doi:10.1029/PA003i004p00413; Broecker and Peng, 1989, doi:10.1029/GB003i003p00215). By comparing δ13C records of benthic foraminifera from cores lying in Southern Ocean Water, in the Caribbean Sea, and at several other Atlantic Ocean sites, the thermohaline state of the Atlantic Ocean (how close it was to a full glacial or full interglacial configuration) is characterized. A continuum of circulation patterns between the glacial and interglacial extremes appears to have existed in the past. Subtracting the deep Pacific (~mean ocean water) δ13C record from the Caribbean δ13C record yields a record that describes large changes in the Atlantic Ocean thermohaline circulation. The δ13C difference varies as the vertical nutrient distribution changes. This new proxy record bears a striking resemblance to the 150,000-year-long atmospheric CO2 record (Barnola et al., 1987, doi:10.1038/329408a0). This favorable comparison between the new proxy record and the atmospheric CO2 record is consistent with Boyle's (1988a) model in which vertical nutrient redistribution has driven large atmospheric CO2 changes in the past. Changes in the relative contributions of NADW and Pacific outflow water to the Southern Ocean are also consistent with Broecker and Peng's (1989) recent model for atmospheric CO2 changes.
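The proxy construction itself is a simple differencing of two records on a common age scale, as the Python sketch below illustrates. The interpolated values here are placeholders invented for illustration, not the core data; only the operation (Caribbean minus deep-Pacific δ13C) follows the abstract.

```python
import numpy as np

age_kyr = np.arange(0, 151, 3)    # common age model (assumed)
# Placeholder records interpolated onto the common age scale
d13c_carib = np.interp(age_kyr, [0, 75, 150], [1.1, 0.4, 1.0])
d13c_pacific = np.interp(age_kyr, [0, 75, 150], [0.1, -0.4, 0.0])

delta = d13c_carib - d13c_pacific  # the thermohaline-circulation proxy
print(delta.round(2))
```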

Abstract:

Federal Highway Administration, Office of Research and Development, Washington, D.C.