Abstract:
In this project, we have investigated new ways of modelling and analysing the human vasculature from medical images. The research was divided into two main areas: cerebral vasculature analysis and coronary artery modelling. Regarding cerebral vasculature analysis, we have studied cerebral aneurysms, the internal carotid artery and the Circle of Willis (CoW). Aneurysms are abnormal vessel enlargements that can rupture, causing severe cerebral damage or death. Understanding this pathology, together with its virtual treatment and image-based diagnosis and prognosis, requires the identification and detailed measurement of the aneurysms. In this context, we have proposed two automatic aneurysm isolation methods to separate the abnormal part of the vessel from the healthy part, in order to homogenize and speed up the processing pipeline usually employed to study this pathology [Cardenes2011TMI, arrabide2011MedPhys]. The results obtained from both methods have also been compared and validated in [Cardenes2012MBEC]. A second important task was the analysis of the internal carotid [Bogunovic2011Media] and the automatic labelling of the CoW [Bogunovic2011MICCAI, Bogunovic2012TMI]. The second area of research covers the study of coronary arteries, especially coronary bifurcations, since that is where atherosclerotic plaque formation is most common and where intervention is most challenging. We therefore proposed a novel modelling method based on Computed Tomography Angiography (CTA) images combined with Conventional Coronary Angiography (CCA) to obtain realistic vascular models of coronary bifurcations, presented in [Cardenes2011MICCAI] and fully validated, including phantom experiments, in [Cardene2013MedPhys]. The realistic models obtained with this method are being used to simulate stenting procedures and to investigate hemodynamic variables in coronary bifurcations in the works submitted as [Morlachi2012, Chiastra2012]. Additionally, preliminary work on reconstructing the coronary tree from rotational angiography was published in [Cardenes2012ISBI].
Abstract:
Intuitively, music has both predictable and unpredictable components. In this work we assess this qualitative statement in a quantitative way using common time series models fitted to state-of-the-art music descriptors. These descriptors cover different musical facets and are extracted from a large collection of real audio recordings comprising a variety of musical genres. Our findings show that music descriptor time series exhibit a certain predictability not only over short time intervals, but also over mid-term and relatively long intervals. This is observed independently of the descriptor, musical facet and time series model we consider. Moreover, we show that our findings are not only of theoretical relevance but can also have practical impact. To this end we demonstrate that music predictability at relatively long time intervals can be exploited in a real-world application, namely the automatic identification of cover songs (i.e. different renditions or versions of the same musical piece). Importantly, this prediction strategy yields a parameter-free approach for cover song identification that is substantially faster, requires less storage and still maintains highly competitive accuracies when compared to state-of-the-art systems.
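As a rough illustration of the kind of prediction involved (a minimal sketch, not the system evaluated in the paper: the choice of a linear autoregressive model and all parameters are illustrative assumptions), one can fit an autoregressive predictor to a descriptor time series and measure how well it extrapolates h steps ahead:

import numpy as np

def fit_ar(x, p):
    # Least-squares fit of an order-p autoregressive model to the 1-D descriptor series x.
    x = np.asarray(x, dtype=float)
    X = np.column_stack([x[p - k: len(x) - k] for k in range(1, p + 1)])
    a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
    return a

def prediction_error(x, a, h=1):
    # Mean squared error of iterated h-step-ahead forecasts of x under the AR coefficients a.
    x = np.asarray(x, dtype=float)
    p = len(a)
    errs = []
    for t in range(p, len(x) - h):
        window = list(x[t - p: t][::-1])        # most recent value first
        for _ in range(h):                       # iterate the AR recursion h steps ahead
            pred = float(np.dot(a, window[:p]))
            window.insert(0, pred)
        errs.append((pred - x[t + h - 1]) ** 2)
    return float(np.mean(errs))

Comparing such prediction errors when a model fitted on one recording is applied to the descriptor sequence of another recording is one plausible way to turn long-range predictability into a cover song match score; the descriptors and matching scheme actually used in the paper are not reproduced here.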
Abstract:
In this work we describe the use of bilinear statistical models as a means of factoring shape variability into two components, attributed to inter-subject variation and to the intrinsic dynamics of the human heart. We show that it is feasible to reconstruct the shape of the heart at discrete points in the cardiac cycle: provided we are given a small number of shape instances representing the same heart at different points in the same cycle, we can use the bilinear model to achieve this reconstruction. Using a temporal and a spatial alignment step in the preprocessing of the shapes, around half of the reconstruction errors were on the order of the axial image resolution of 2 mm, and over 90% were within 3.5 mm. From this, we conclude that the dynamics were indeed separated from the inter-subject variability in our dataset.
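As a minimal sketch of how such a factorization can be obtained (an asymmetric bilinear decomposition fitted by SVD; the array layout, component count and variable names are illustrative assumptions, not the paper's implementation):

import numpy as np

def fit_asymmetric_bilinear(Y, n_components):
    # Y[s, c, :] holds the flattened landmark coordinates of subject s at cardiac phase c,
    # after the temporal and spatial alignment steps.
    S, C, D = Y.shape
    M = Y.transpose(1, 2, 0).reshape(C * D, S)      # rows: (phase, coordinate), columns: subjects
    U, sv, Vt = np.linalg.svd(M, full_matrices=False)
    W = (U[:, :n_components] * sv[:n_components]).reshape(C, D, n_components)  # phase-specific bases
    A = Vt[:n_components, :]                         # per-subject coefficients
    # Y[s, c] is approximated by W[c] @ A[:, s]: the phase factor W carries the cardiac dynamics,
    # while the subject factor A carries the inter-subject variability.
    return W, A

Reconstructing an unseen heart over the whole cycle then amounts to estimating its subject coefficients by least squares from the few phases available and applying the remaining phase bases.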
Abstract:
This paper presents a new registration algorithm, called Temporal Diffeomorphic Free Form Deformation (TDFFD), and its application to motion and strain quantification from a sequence of 3D ultrasound (US) images. The originality of our approach resides in enforcing time consistency by representing the 4D velocity field as the sum of continuous spatiotemporal B-Spline kernels. The spatiotemporal displacement field is then recovered through forward Eulerian integration of the non-stationary velocity field. The strain tensor is computed locally using the spatial derivatives of the reconstructed displacement field. The energy functional considered in this paper weighs two terms: the image similarity and a regularization term. The image similarity metric is the sum of squared differences between the intensities of each frame and a reference one. Any frame in the sequence can be chosen as reference. The regularization term is based on the incompressibility of myocardial tissue. TDFFD was compared to pairwise 3D FFD and 3D+t FFD, both on displacement and velocity fields, on a set of synthetic 3D US images with different noise levels. TDFFD showed increased robustness to noise compared to these two state-of-the-art algorithms. TDFFD also proved to be more resistant to a reduced temporal resolution when decimating this synthetic sequence. Finally, this synthetic dataset was used to determine optimal settings of the TDFFD algorithm. Subsequently, TDFFD was applied to a database of cardiac 3D US images of the left ventricle acquired from 9 healthy volunteers and 13 patients treated by Cardiac Resynchronization Therapy (CRT). On healthy cases, uniform strain patterns were observed over all myocardial segments, as physiologically expected. On all CRT patients, the improvement in synchrony of regional longitudinal strain correlated with CRT clinical outcome as quantified by the reduction of end-systolic left ventricular volume at follow-up (6 and 12 months), showing the potential of the proposed algorithm for the assessment of CRT.
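Schematically, the quantities described above can be written as follows (a summary of the abstract in generic notation, not the paper's exact formulation):

\[
\mathbf{v}(\mathbf{x},t)=\sum_i \mathbf{p}_i\,\beta\!\Big(\tfrac{\mathbf{x}-\mathbf{x}_i}{\sigma_x}\Big)\,\beta\!\Big(\tfrac{t-t_i}{\sigma_t}\Big),
\qquad
\boldsymbol{\varphi}(\mathbf{x},t)=\mathbf{x}+\int_0^t \mathbf{v}\big(\boldsymbol{\varphi}(\mathbf{x},s),s\big)\,ds,
\]

where \(\beta\) is a B-spline kernel and the \(\mathbf{p}_i\) are its control-point weights; the integral is evaluated by forward Euler steps \(\boldsymbol{\varphi}_{k+1}=\boldsymbol{\varphi}_k+\Delta t\,\mathbf{v}(\boldsymbol{\varphi}_k,t_k)\). The strain then follows from the spatial derivatives of the displacement, e.g. through the deformation gradient \(\mathbf{F}=\partial\boldsymbol{\varphi}/\partial\mathbf{x}\) and \(\mathbf{E}=\tfrac12(\mathbf{F}^{\top}\mathbf{F}-\mathbf{I})\), and the minimized energy balances the frame-to-reference sum of squared intensity differences against the incompressibility regularizer.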
Abstract:
In this paper we propose a subsampling estimator for the distribution of statistics diverging at either known or unknown rates when the underlying time series is strictly stationary and strong mixing. Based on our results we provide a detailed discussion of how to estimate extreme order statistics with dependent data and present two applications to assessing financial market risk. Our method performs well in estimating Value at Risk and provides a superior alternative to Hill's estimator in operationalizing Safety First portfolio selection.
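For reference, the Hill estimator against which the method is benchmarked has a simple closed form; a minimal Python sketch (assuming numpy, positive loss data and a user-chosen number k of tail observations; the subsampling estimator proposed in the paper is not reproduced here):

import numpy as np

def hill_tail_index(losses, k):
    # Classical Hill estimator of the tail index from the k largest observations.
    x = np.sort(np.asarray(losses, dtype=float))[::-1]
    return float(np.mean(np.log(x[:k]) - np.log(x[k])))

def var_from_hill(losses, k, alpha=0.99):
    # Weissman-type extreme quantile (Value at Risk at confidence level alpha) implied by the Hill fit.
    x = np.sort(np.asarray(losses, dtype=float))[::-1]
    gamma = hill_tail_index(losses, k)
    return float(x[k] * (k / (len(x) * (1.0 - alpha))) ** gamma)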
Abstract:
We study the earnings structure and the equilibrium assignment of workers when workers exert intra-firm spillovers on each other. We allow for arbitrary spillovers provided output depends on some aggregate index of workers' skill. Despite the possibility of increasing returns to skills, equilibrium typically exists. We show that the equilibrium will typically be segregated: the skill space can be partitioned into a set of segments such that any firm hires from only one segment. Next, we apply the model to analyze the effect of information technology on segmentation and the distribution of income. There are two types of human capital, productivity and creativity, i.e. the ability to produce ideas that may be duplicated over a network. Under plausible assumptions, inequality rises and then falls as network size increases, and the poorest workers cannot lose. We also analyze the impact of an improvement in worker quality and of an increased international mobility of ideas.
Abstract:
The growth of pharmaceutical expenditure and its prediction are a major concern for policy makers and health care managers. This paper explores different predictive models to estimate future drug expenses, using individual demographic and morbidity information from an integrated healthcare delivery organization in Catalonia for the years 2002 and 2003. The morbidity information consists of codified health encounters grouped through Clinical Risk Groups (CRGs). We estimate pharmaceutical costs using several model specifications with CRGs as risk adjusters, providing an alternative way of obtaining high predictive power, comparable to other estimations of drug expenditure in the literature. These results have clear implications for the use of risk adjustment and CRGs in setting premiums for pharmaceutical benefits.
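Purely as an illustration of this kind of risk adjustment (the column names, the log-cost least-squares specification and the data layout below are hypothetical, not those used in the paper), individual drug cost can be regressed on demographics plus CRG indicator variables:

import numpy as np
import pandas as pd

def fit_crg_cost_model(df):
    # df: one row per enrollee with columns 'cost', 'age', 'sex', 'crg' (hypothetical layout).
    X = pd.get_dummies(df[["age", "sex", "crg"]], columns=["sex", "crg"], drop_first=True)
    X = np.column_stack([np.ones(len(X)), X.to_numpy(dtype=float)])
    y = np.log1p(df["cost"].to_numpy(dtype=float))   # log(1 + cost) tolerates zero spenders
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

The fitted coefficients on the CRG dummies play the role of risk adjusters; predicted (retransformed) costs can then feed into premium setting for pharmaceutical benefits.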
Abstract:
A new algorithm, called the parameterized expectations approach (PEA), for solving dynamic stochastic models under rational expectations is developed, and its advantages and disadvantages are discussed. This algorithm can, in principle, approximate the true equilibrium arbitrarily well. The algorithm also works from the Euler equations, so that the equilibrium does not have to be cast in the form of a planner's problem. Monte Carlo integration and the absence of grids on the state variables mean that computation costs do not grow exponentially when the number of state variables or exogenous shocks in the economy increases. As an application we analyze an asset pricing model with endogenous production. We analyze its implications for the time dependence of the volatility of stock returns and for the term structure of interest rates. We argue that this model can generate hump-shaped term structures.
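To make the mechanics concrete, here is a compact Python sketch of the PEA for a textbook stochastic growth model (log utility, full depreciation and all parameter values are illustrative assumptions, and a log-linear regression stands in for the nonlinear least-squares step of the full algorithm; the paper's asset pricing application is not reproduced):

import numpy as np

rng = np.random.default_rng(0)
alpha, beta_disc, rho, sigma, T = 0.36, 0.95, 0.95, 0.02, 20000

# one fixed draw of the technology shock, reused across iterations
ln_theta = np.zeros(T)
for t in range(1, T):
    ln_theta[t] = rho * ln_theta[t - 1] + rng.normal(0.0, sigma)
theta = np.exp(ln_theta)

def simulate(psi):
    # Simulate consumption and capital when E_t[.] is approximated by exp(psi . [1, ln k, ln theta]).
    k, c = np.empty(T + 1), np.empty(T)
    k[0] = 1.0
    for t in range(T):
        e_hat = np.exp(psi[0] + psi[1] * np.log(k[t]) + psi[2] * ln_theta[t])
        c[t] = min(1.0 / (beta_disc * e_hat), 0.99 * theta[t] * k[t] ** alpha)  # Euler eq. + feasibility
        k[t + 1] = theta[t] * k[t] ** alpha - c[t]
    return k, c

psi = np.zeros(3)
for _ in range(500):
    k, c = simulate(psi)
    # realized value of the expression inside the conditional expectation of the Euler equation
    inside = (alpha * theta[1:] * k[1:T] ** (alpha - 1.0)) / c[1:]
    X = np.column_stack([np.ones(T - 1), np.log(k[:T - 1]), ln_theta[:T - 1]])
    psi_new, *_ = np.linalg.lstsq(X, np.log(inside), rcond=None)
    if np.max(np.abs(psi_new - psi)) < 1e-7:
        break
    psi = 0.5 * psi + 0.5 * psi_new   # damped fixed-point update of the expectation parameters

For this special case the known closed form k_{t+1} = alpha*beta*theta_t*k_t^alpha provides a check on the converged policy; in general the expectation function is only an approximation whose accuracy can be refined by enriching the parameterization.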
Abstract:
This paper resolves three empirical puzzles in outsourcing by formalizing the adaptation cost of long-term performance contracts. Side-trading with a new partner alongside a long-term contract (to exploit an adaptation-requiring investment) is usually less effective than switching to the new partner when the contract expires. So long-term contracts that prevent holdup of specific investments may induce holdup of adaptation investments. Contract length therefore trades off specific and adaptation investments. Length should increase with the importance and specificity of self-investments, and decrease with the importance of adaptation investments for which side-trading is ineffective. My general model also shows how optimal length falls with cross-investments and wasteful investments.
Abstract:
This paper presents and estimates a dynamic choice model in the attribute space considering rational consumers. In light of the evidence of several state-dependence patterns, the standard attribute-based model is extended by considering a general utility function where pure inertia and pure variety-seeking behaviors can be explained in the model as particular linear cases. The dynamics of the model are fully characterized by standard dynamic programming techniques. The model presents a stationary consumption pattern that can be inertial, where the consumer only buys one product, or a variety-seeking one, where the consumer shifts among varied products. We run some simulations to analyze the consumption paths out of the steady state. Under the hybrid utility assumption, the consumer behaves inertially among the unfamiliar brands for several periods, eventually switching to a variety-seeking behavior when the stationary levels are approached. An empirical analysis is run using scanner databases for three different product categories: fabric softener, saltine cracker, and catsup. Non-linear specifications provide the best fit of the data, as hybrid functional forms are found in all the product categories for most attributes and segments. These results reveal the statistical superiority of the non-linear structure and confirm the gradual trend to seek variety as the level of familiarity with the purchased items increases.
Abstract:
This paper proposes to estimate the covariance matrix of stock returns by an optimally weighted average of two existing estimators: the sample covariance matrix and the single-index covariance matrix. This method is generally known as shrinkage, and it is standard in decision theory and in empirical Bayesian statistics. Our shrinkage estimator can be seen as a way to account for extra-market covariance without having to specify an arbitrary multi-factor structure. For NYSE and AMEX stock returns from 1972 to 1995, it can be used to select portfolios with significantly lower out-of-sample variance than a set of existing estimators, including multi-factor models.
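A minimal numpy sketch of the structure of such an estimator, with an equal-weighted index standing in for the market factor and the shrinkage weight delta left as an input (the paper derives an optimal weight rather than taking it as given):

import numpy as np

def shrunk_covariance(returns, delta):
    # Convex combination delta*F + (1-delta)*S of a single-index target F and the sample covariance S.
    T, N = returns.shape
    S = np.cov(returns, rowvar=False)
    market = returns.mean(axis=1)                         # equal-weighted index as the single factor
    var_m = market.var(ddof=1)
    betas = np.array([np.cov(returns[:, i], market)[0, 1] for i in range(N)]) / var_m
    F = var_m * np.outer(betas, betas)                    # covariances implied by the one-factor model
    np.fill_diagonal(F, np.diag(S))                       # keep sample variances on the diagonal
    return delta * F + (1.0 - delta) * S

In the paper the weight is itself estimated from the data so as to minimize expected loss; that estimation step is not reproduced here.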
Abstract:
This paper presents a comparative analysis of linear and mixed models for short-term forecasting of a real data series with a high percentage of missing data. The data are the series of significant wave heights registered at regular periods of three hours by a buoy placed in the Bay of Biscay. The series is interpolated with a linear predictor which minimizes the forecast mean square error. The linear models are seasonal ARIMA models, and the mixed models have a linear component and a non-linear seasonal component. The non-linear component is estimated by a non-parametric regression of the data versus time. Short-term forecasts, no more than two days ahead, are of interest because they can be used by the port authorities to notify the fleet. Several models are fitted and compared by their forecasting behavior.
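A rough Python sketch of one mixed specification of this kind (the kernel bandwidth, ARMA orders, seasonal period of 8 three-hourly observations per day and the crude extension of the seasonal level over the forecast horizon are all illustrative assumptions, not the models selected in the paper):

import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

def kernel_component(t, x, bandwidth):
    # Non-parametric (Nadaraya-Watson) regression of the series x on observation time t.
    fitted = np.empty(len(x))
    for i, ti in enumerate(t):
        w = np.exp(-0.5 * ((t - ti) / bandwidth) ** 2)
        fitted[i] = np.sum(w * x) / np.sum(w)
    return fitted

def forecast_wave_heights(t, x, steps=16):                   # 16 x 3 h = two days ahead
    seasonal = kernel_component(t, x, bandwidth=24.0 * 30)   # slowly varying seasonal level (hours)
    resid = x - seasonal
    fit = SARIMAX(resid, order=(2, 0, 1), seasonal_order=(1, 0, 1, 8)).fit(disp=False)
    return seasonal[-1] + fit.forecast(steps=steps)          # hold the seasonal level over the short horizon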
Abstract:
The goal of this paper is to estimate time-varying covariance matrices. Since the covariance matrix of financial returns is known to change through time and is an essential ingredient in risk measurement, portfolio selection, and tests of asset pricing models, this is a very important problem in practice. Our model of choice is the Diagonal-Vech version of the Multivariate GARCH(1,1) model. The problem is that estimation of the general Diagonal-Vech model is numerically infeasible in dimensions higher than 5. The common approach is to estimate more restrictive models which are tractable but may not conform to the data. Our contribution is to propose an alternative estimation method that is numerically feasible, produces positive semi-definite conditional covariance matrices, and does not impose unrealistic a priori restrictions. We provide an empirical application in the context of international stock markets, comparing the new estimator to a number of existing ones.
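For reference, the Diagonal-Vech GARCH(1,1) specification mentioned above lets every element of the conditional covariance matrix follow its own recursion,

\[
h_{ij,t} = \omega_{ij} + \alpha_{ij}\,\varepsilon_{i,t-1}\varepsilon_{j,t-1} + \beta_{ij}\,h_{ij,t-1},
\qquad 1 \le i \le j \le N,
\]

so an unrestricted N-dimensional system carries 3N(N+1)/2 parameters (already 45 for N = 5), which is what makes direct estimation numerically infeasible in higher dimensions and motivates the alternative estimator proposed here.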