82 results for Dynamic photorefractive volume grating
at Consorci de Serveis Universitaris de Catalunya (CSUC), Spain
Abstract:
Purpose: The objective of this study is to investigate the feasibility of detecting and quantifying 3D cerebrovascular wall motion from a single 3D rotational x-ray angiography (3DRA) acquisition within a clinically acceptable time, and of computing, from the estimated motion field, quantities for further biomechanical modeling of the cerebrovascular wall. Methods: The whole motion cycle of the cerebral vasculature is modeled using a 4D B-spline transformation, which is estimated from a 4D to 2D + t image registration framework. The registration is performed by optimizing a single similarity metric between the entire 2D + t measured projection sequence and the corresponding forward projections of the deformed volume at their exact time instants. The joint use of two acceleration strategies, together with their implementation on graphics processing units, is also proposed so as to reach computation times close to clinical requirements. For further characterizing vessel wall properties, an approximation of the wall thickness changes is obtained through a strain calculation. Results: Evaluation on in silico and in vitro pulsating phantom aneurysms demonstrated an accurate estimation of wall motion curves. In general, the error was below 10% of the maximum pulsation, even when a substantially inhomogeneous intensity pattern was present. Experiments on in vivo data provided realistic aneurysm and vessel wall motion estimates, whereas in regions where motion was neither visible nor anatomically possible, no motion was detected. The use of the acceleration strategies enabled completing the estimation process for one entire cycle in 5-10 min without degrading the overall performance. The strain map extracted from our motion estimation provided a realistic deformation measure of the vessel wall.
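The core idea of the method summarized above — fitting a temporal B-spline motion model by minimizing a similarity metric between the measured projection sequence and forward projections of the deformed model — can be illustrated with a deliberately simplified 1D sketch. Everything here (a single pulsating radius, a parallel "projection" equal to twice the radius, the number of control points) is a toy assumption of mine, not the authors' implementation:

```python
import numpy as np

def b3(u):
    # Uniform cubic B-spline kernel (support |u| < 2)
    a = np.abs(u)
    return np.where(a < 1, (4 - 6 * a**2 + 3 * a**3) / 6,
           np.where(a < 2, (2 - a)**3 / 6, 0.0))

K, N = 8, 100                        # control points per cycle, time samples
t = np.linspace(0, 1, N, endpoint=False)

# Periodic cubic B-spline basis over one cardiac cycle
d = (t[:, None] - np.arange(K)[None, :] / K + 0.5) % 1.0 - 0.5
B = b3(K * d)

# Ground-truth pulsating radius and noisy "projection widths" (toy forward model)
r_true = 1.0 + 0.1 * np.sin(2 * np.pi * t)
rng = np.random.default_rng(0)
p_meas = 2 * r_true + rng.normal(0, 1e-3, N)

# Registration step: least-squares fit of the spline control points so that the
# forward-projected model matches the measured projection sequence (SSD metric)
c, *_ = np.linalg.lstsq(2 * B, p_meas, rcond=None)
r_est = B @ c
err = np.max(np.abs(r_est - r_true))
print(err)
```

In the toy case the forward model is linear in the control points, so the SSD-optimal motion is a closed-form least-squares solution; the paper's setting replaces this with iterative optimization over a full 4D deformation.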
Conclusions: The authors' technique has demonstrated that it can provide accurate and robust 4D estimates of cerebrovascular wall motion within a clinically acceptable time, although it has to be applied to a larger patient population prior to possible wide application to routine endovascular procedures. In particular, for the first time, this feasibility study has shown that in vivo cerebrovascular motion can be obtained intraprocedurally from a 3DRA acquisition. Results have also shown the potential of performing strain analysis using this imaging modality, thus making possible the future modeling of biomechanical properties of the vascular wall.
Abstract:
Despite the important benefits for firms of commercial initiatives on the Internet, e-commerce is still an emerging distribution channel, even in developed countries. Thus, more needs to be known about the mechanisms affecting its development. A large number of works have studied firms' e-commerce adoption from technological, intraorganizational, institutional, or other specific perspectives, but there is a need for adequately tested integrative frameworks. Hence, this work proposes and tests a model of firms' business-to-consumer (B2C) e-commerce adoption that is founded on a holistic vision of the phenomenon. With this integrative approach, the authors analyze the joint influence of environmental, technological, and organizational factors; moreover, they evaluate this effect over time. Using various representative Spanish data sets covering the period 1996-2005, the findings demonstrate the suitability of the holistic framework. Likewise, some lessons are learned from the analysis of the key building blocks. In particular, the current study provides evidence for the debate about the effect of competitive pressure, since the findings show that competitive pressure disincentivizes e-commerce adoption in the long term. The results also show that the development or enrichment of consumers' consumption patterns, the technological readiness of the market forces, the firm's global scope, and its competences in innovation continuously favor e-commerce adoption.
Abstract:
This book is one of the eight IAEG XII Congress volumes, and deals with landslide processes, including: field data and monitoring techniques, prediction and forecasting of landslide occurrence, regional landslide inventories and dating studies, modeling of slope instabilities and secondary hazards (e.g. impulse waves and landslide-induced tsunamis, landslide dam failures and breaching), hazard and risk assessment, earthquake- and rainfall-induced landslides, instabilities of volcanic edifices, remedial works and mitigation measures, development of innovative stabilization techniques and applicability to specific engineering geological conditions, and use of geophysical techniques for landslide characterization and investigation of triggering mechanisms. Focus is given to innovative techniques, well-documented case studies in different environments, critical components of engineering geological and geotechnical investigations, hydrological and hydrogeological investigations, remote sensing and geophysical techniques, modeling of triggering, collapse, runout and landslide reactivation, geotechnical design and construction procedures in landslide zones, interaction of landslides with structures and infrastructures, and the possibility of domino effects. The Engineering Geology for Society and Territory volumes of the IAEG XII Congress, held in Torino from September 15-19, 2014, analyze the dynamic role of engineering geology in our changing world and build on the four main themes of the congress: environment, processes, issues, and approaches.
Abstract:
We quantify the long-time behavior of a system of (partially) inelastic particles in a stochastic thermostat by means of the contractivity of a suitable metric in the set of probability measures. Existence, uniqueness, boundedness of moments and regularity of a steady state are derived from this basic property. The solutions of the kinetic model are proved to converge exponentially as t → ∞ to this diffusive equilibrium in this distance metrizing the weak convergence of measures. Then, we prove a uniform bound in time on Sobolev norms of the solution, provided the initial data has a finite norm in the corresponding Sobolev space. These results are then combined, using interpolation inequalities, to obtain exponential convergence to the diffusive equilibrium in the strong L¹-norm, as well as various Sobolev norms.
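The chain of arguments in this abstract (metric contraction, uniform Sobolev bounds, interpolation) can be summarized schematically; the notation below is my own shorthand for a generic argument of this type, not necessarily the paper's:

```latex
% Contraction of a weak probability metric d and uniform Sobolev regularity:
\[
d\big(f(t),f_\infty\big)\le e^{-\lambda t}\,d\big(f_0,f_\infty\big),
\qquad
\sup_{t\ge 0}\|f(t)\|_{H^k}\le C_k .
\]
% Interpolating between the weak metric and the Sobolev bound yields, for
% some \theta\in(0,1) depending on k,
\[
\|f(t)-f_\infty\|_{L^1}
\;\le\; C\,d\big(f(t),f_\infty\big)^{\theta}\,
\|f(t)-f_\infty\|_{H^k}^{\,1-\theta}
\;\le\; C'\,e^{-\theta\lambda t},
\]
% i.e. exponential convergence in the strong norm at a (possibly reduced)
% rate \theta\lambda; the same interpolation gives decay in intermediate
% Sobolev norms.
```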
Abstract:
"See the abstract at the beginning of the document in the attached file."
Abstract:
We consider a dynamic model where traders in each period are matched randomly into pairs who then bargain about the division of a fixed surplus. When agreement is reached the traders leave the market. Traders who do not come to an agreement return next period in which they will be matched again, as long as their deadline has not expired yet. New traders enter exogenously in each period. We assume that traders within a pair know each other's deadline. We define and characterize the stationary equilibrium configurations. Traders with longer deadlines fare better than traders with short deadlines. It is shown that the heterogeneity of deadlines may cause delay. It is then shown that a centralized mechanism that controls the matching protocol, but does not interfere with the bargaining, eliminates all delay. Even though this efficient centralized mechanism is not as good for traders with long deadlines, it is shown that in a model where all traders can choose which mechanism to
Abstract:
The paper provides a description and analysis of the Hodgskin section of Theories of Surplus Value and the general law section of the first version of Volume III of Capital. It then considers Part III of Volume III, the evolution of Marx's thought, and various interpretations of his theory in the light of this analysis. It is suggested that Marx thought that the rate of profit must fall and, even in the 1870s, hoped to be able to provide a demonstration of this. However, the main conclusions are: 1. Marx's major attempt to show that the rate of profit must fall occurred in the general law section. 2. Part III does not contain a demonstration that the rate of profit must fall. 3. Marx was never able to demonstrate that the rate of profit must fall, and he was aware of this.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Since conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. Monte Carlo results show that the estimator performs well in comparison to other estimators that have been proposed for estimation of general DLV models.
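As a rough illustration of the idea — estimating conditional moments by kernel smoothing a long simulation and then matching them against the data — here is a toy sketch using a plain AR(1) model as a stand-in for a dynamic latent variable model. The model choice, the Gaussian kernel, the Silverman bandwidth rule, and the grid search are all my own simplifications, not the estimator proposed in the paper:

```python
import numpy as np

def simulate(theta, n, rng, burn=200):
    # AR(1): y_{t+1} = theta * y_t + eps  (hypothetical stand-in for a DLV model)
    y = np.zeros(n + burn)
    for t in range(1, n + burn):
        y[t] = theta * y[t - 1] + rng.normal()
    return y[burn:]

# "Observed" data generated at the true parameter
theta_true = 0.6
data = simulate(theta_true, 500, np.random.default_rng(1))
x_d, y_d = data[:-1], data[1:]

def cond_moment(theta, x_eval, rng, S=5000):
    # Kernel-smoothed conditional mean E[y_{t+1} | y_t = x] from a long simulation
    sim = simulate(theta, S, rng)
    xs, ys = sim[:-1], sim[1:]
    h = 1.06 * xs.std() * len(xs) ** (-0.2)        # Silverman bandwidth
    w = np.exp(-0.5 * ((x_eval[:, None] - xs[None, :]) / h) ** 2)
    return (w @ ys) / w.sum(axis=1)                # Nadaraya-Watson estimate

# Moment-matching step: pick the trial parameter whose simulated conditional
# moments best fit the observed transitions (common random numbers across trials)
grid = np.linspace(0.3, 0.9, 61)
obj = [np.mean((y_d - cond_moment(th, x_d, np.random.default_rng(2))) ** 2)
       for th in grid]
theta_hat = grid[int(np.argmin(obj))]
print(theta_hat)
```

Because the conditional mean is recovered by smoothing the simulated pairs rather than by averaging conditional draws, the same recipe applies when the model cannot be simulated conditional on the observed state, which is the point of the approach described above.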
Abstract:
In the literature on risk, one generally assumes that uncertainty is uniformly distributed over the entire working horizon, when the absolute risk-aversion index is negative and constant. From this perspective, the risk is totally exogenous, and thus independent of endogenous risks. The classic procedure is "myopic" with regard to potential changes in the future behavior of the agent due to inherent random fluctuations of the system. The agent's attitude to risk is rigid. Although often criticized, the most widely used hypothesis for the analysis of economic behavior is risk-neutrality. This borderline case must be envisaged with prudence in a dynamic stochastic context. The traditional measures of risk-aversion are generally too weak for making comparisons between risky situations, given the dynamic complexity of the environment. This can be highlighted in concrete problems in finance and insurance, contexts in which the Arrow-Pratt measures (in the small) give ambiguous results.
Abstract:
The objective of this paper is to re-evaluate the attitude to effort of a risk-averse decision-maker in an evolving environment. In the classic analysis, the space of efforts is generally discretized. More realistically, this new approach employs a continuum of effort levels. The presence of multiple possible efforts and performance levels provides a better basis for explaining real economic phenomena. The traditional approach (see Laffont, J. J. & Tirole, J., 1993; Salanie, B., 1997; Laffont, J. J. & Martimort, D., 2002, among others) does not take into account the potential effect of the system dynamics on the agent's attitude to effort over time. In the context of a Principal-agent relationship, not only can the incentives of the Principal lead the private agent to allocate a good effort, but so can the evolution of the dynamic system. The incentives can be ineffective when the environment does not incite the agent to invest a good effort. This explains why some effici
Abstract:
The demand for computational power has been driving the improvement of the High Performance Computing (HPC) area, generally represented by the use of distributed systems such as clusters of computers running parallel applications. In this area, fault tolerance plays an important role in providing high availability by isolating the application from the effects of faults. Performance and availability form an inseparable pair for some kinds of applications. Therefore, fault-tolerant solutions must take these two constraints into consideration when they are designed. In this dissertation, we present a few side effects that some fault-tolerant solutions may exhibit when recovering a failed process. These effects may cause degradation of the system, affecting mainly the overall performance and availability. We introduce RADIC-II, a fault-tolerant architecture for message passing based on the RADIC (Redundant Array of Distributed Independent Fault Tolerance Controllers) architecture. RADIC-II retains as much as possible the RADIC features of transparency, decentralization, flexibility and scalability, while incorporating a flexible dynamic redundancy feature that allows it to mitigate or avoid some recovery side effects.
Abstract:
This paper shows that tourism specialisation can help to explain the observed high growth rates of small countries. For this purpose, two models of growth and trade are constructed to represent the trade relations between two countries. One of the countries is large and rich, has its own source of sustained growth, and produces a tradable capital good. The other is a small, poor economy, which lacks an engine of growth of its own and produces tradable tourism services. The poor country exports tourism services to and imports capital goods from the rich economy. In one model tourism is a luxury good, while in the other the expenditure elasticity of tourism imports is unitary. Two main results are obtained. In the long run, the tourism country overcomes decreasing returns and grows permanently because its terms of trade continuously improve. Since the tourism sector is relatively less productive than the capital good sector, tourism services become relatively scarcer and hence more expensive than the capital good. Moreover, along the transition the growth rate of the tourism economy remains well above that of the rich country for a long time. The growth rate differential between countries is particularly high when tourism is a luxury good. In this case, there is a faster increase in tourism demand. As a result, investment in the small economy is boosted and its terms of trade improve strongly.
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
Dynamic Stackelberg game with risk-averse players: optimal risk-sharing under asymmetric information
Abstract:
The objective of this paper is to clarify the interactive nature of the leader-follower relationship when both players are endogenously risk-averse. The analysis is placed in the context of a dynamic closed-loop Stackelberg game with private information. The case of a risk-neutral leader, very often discussed in the literature, is only a borderline possibility in the present study. Each player in the game is characterized by a risk-averse type which is unknown to his opponent. The goal of the leader is to implement an optimal incentive compatible risk-sharing contract. The proposed approach provides a qualitative analysis of adaptive risk behavior profiles for asymmetrically informed players in the context of dynamic strategic interactions modelled as incentive Stackelberg games.
Abstract:
The objective of this paper is to re-examine risk and effort attitudes in the context of strategic dynamic interactions stated as a discrete-time finite-horizon Nash game. The analysis is based on the assumption that players are endogenously risk- and effort-averse. Each player is characterized by distinct risk- and effort-aversion types that are unknown to his opponent. The goal of the game is the optimal risk- and effort-sharing between the players. It generally depends on the individual strategies adopted and, implicitly, on the players' types or characteristics.