60 results for software asset creation
Abstract:
Recently, morphometric measurements of the ascending aorta have been performed with ECG-gated multidetector computed tomography (MDCT) to support the development of novel transcatheter therapies (TCT); nevertheless, the variability of such measurements remains unknown. Thirty patients referred for ECG-gated CT thoracic angiography were evaluated. Continuous reformations of the ascending aorta, perpendicular to the centerline, were obtained automatically with a commercially available computer-aided diagnosis (CAD) tool. Measurements of the maximal diameter were then performed with the CAD tool and manually, and separately, by two observers. Measurements were repeated one month later. The Bland-Altman method, Spearman coefficients, and a Wilcoxon signed-rank test were used to evaluate the variability, the correlation, and the differences between observers. The interobserver variability for maximal diameter between the two observers was up to 1.2 mm with limits of agreement [-1.5, +0.9] mm, whereas the intraobserver limits were [-1.2, +1.0] mm for the first observer and [-0.8, +0.8] mm for the second observer. The intraobserver CAD variability was 0.8 mm. The correlation between the observers and the CAD tool was good (0.980-0.986); however, significant differences do exist (P<0.001). The maximum variability observed was 1.2 mm and should be considered in reports of measurements of the ascending aorta. The CAD tool is as reproducible as an experienced reader.
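The limits of agreement quoted above come from a standard Bland-Altman analysis. The snippet below is a minimal sketch of that computation (not the study's actual code), assuming paired maximal-diameter measurements from two observers; the data in the example call are made up for illustration.

```python
import numpy as np

def bland_altman_limits(obs1, obs2):
    """Bias and 95% limits of agreement for paired measurements
    (e.g., maximal aortic diameters in mm from two observers)."""
    obs1 = np.asarray(obs1, dtype=float)
    obs2 = np.asarray(obs2, dtype=float)
    diff = obs1 - obs2                 # pairwise differences
    bias = diff.mean()                 # mean difference (systematic bias)
    sd = diff.std(ddof=1)              # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative call with made-up diameters (mm), not study data.
bias, limits = bland_altman_limits([34.1, 36.0, 38.2], [34.5, 35.7, 38.9])
print(bias, limits)
```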
Abstract:
Cloud computing and its three facets (Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS)) are terms that denote new developments in the software industry. In particular, PaaS solutions, also referred to as cloud platforms, are changing the way software is produced, distributed, consumed, and priced. Software vendors have started to consider cloud platforms a strategic option but are struggling to redefine their offerings to embrace PaaS. In contrast to SaaS and IaaS, PaaS allows for value co-creation with partners that develop complementary components and applications. It thus requires multisided business models that bring together two or more distinct customer segments. Understanding how to design PaaS business models that establish a flourishing ecosystem is crucial for software vendors. This doctoral thesis addresses this issue in three interrelated research parts. First, based on case study research, the thesis provides a deeper understanding of current PaaS business models and their evolution. Second, it analyses and simulates consumers' preferences regarding PaaS business models, using a conjoint approach to determine what drives the choice of cloud platforms. Finally, building on the previous research outcomes, the third part introduces a design theory for the emerging class of PaaS business models, grounded in an extensive action design research study with a large European software vendor. Understanding PaaS business models from a market as well as a consumer perspective will, together with the design theory, inform and guide decision makers in their business model innovation plans. The thesis also closes gaps in the research on PaaS business model design and, more generally, on platform business models.
Abstract:
In this article we introduce JULIDE, a software toolkit developed to perform 3D reconstruction, intensity normalization, volume standardization by 3D image registration, and voxel-wise statistical analysis of autoradiographs of mouse brain sections. The toolkit has been developed within the open-source ITK framework and is freely available under a GPL license. The article presents the complete image-processing chain from raw data acquisition to 3D statistical group analysis. Results of a group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool.
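JULIDE itself is implemented on top of ITK; purely as an illustration of what the final voxel-wise statistical group analysis step amounts to, the following Python/NumPy sketch runs a two-sample t-test at every voxel of co-registered volumes. The array shapes and random data are assumptions for demonstration only, not part of the toolkit.

```python
import numpy as np
from scipy import stats

def voxelwise_ttest(group_a, group_b):
    """Voxel-wise two-sample t-test between two groups of
    co-registered 3D volumes, each of shape (n_subjects, X, Y, Z).
    Returns t- and p-value maps of shape (X, Y, Z)."""
    t_map, p_map = stats.ttest_ind(group_a, group_b, axis=0)
    return t_map, p_map

# Illustrative call with random volumes standing in for registered
# autoradiograph data (8 vs. 8 animals, 64^3 voxels).
a = np.random.rand(8, 64, 64, 64)
b = np.random.rand(8, 64, 64, 64)
t_map, p_map = voxelwise_ttest(a, b)
```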
Abstract:
Percutaneous transluminal renal angioplasty (PTRA) is an invasive technique that is costly and involves a risk of complications and renal failure. The ability of PTRA to reduce the administration of antihypertensive drugs has been demonstrated. A potentially greater benefit, which nevertheless remains to be proven, is deferral of the need for chronic dialysis. The aim of the ANPARIA study was to assess the appropriateness of PTRA with respect to its impact on the evolution of renal function. A standardized expert panel method was used to assess the appropriateness of medical treatment alone or medical treatment with revascularization in various clinical situations. The choice of revascularization by either PTRA or surgery was examined for each clinical situation. The analysis was based on a detailed literature review and on systematically elicited expert opinion, obtained during a two-round modified Delphi process. The study provides detailed responses on the appropriateness of PTRA for 1848 distinct clinical scenarios. Depending on the major clinical presentation, the appropriateness of revascularization varied from 32% to 75% of individual scenarios (48% overall). Uncertainty as to revascularization was 41% overall. When revascularization was appropriate, PTRA was favored over surgery in 94% of the scenarios, except in certain cases of aortic atheroma where surgery was the preferred choice. Kidney size ≥7 cm, absence of coexisting disease, acute renal failure, a high degree of stenosis (≥70%), and absence of multiple arteries were identified as predictive of favorable appropriateness ratings. Situations such as cardiac failure with pulmonary edema or acute thrombosis of the renal artery were defined as indications for PTRA. This study identified clinical situations in which PTRA or surgery is appropriate for renal artery disease. We built a decision tree that can be used via the Internet: the ANPARIA software (http://www.chu-clermontferrand.fr/anparia/). In numerous clinical situations, uncertainty remains as to whether PTRA prevents deterioration of renal function.
Abstract:
Purpose: IOL centration and stability after cataract surgery are of high interest to cataract surgeons and IOL-producing companies. We present new imaging software to evaluate the centration of the rhexis and the centration of the IOL after cataract surgery. Methods: We developed, in collaboration with the Biomedical Imaging Group (BIG), EPFL, Lausanne, a new working tool to precisely assess outcomes after IOL implantation, such as an ideal capsulorhexis and IOL centration. The software is a plug-in for ImageJ, a general-purpose image-processing and image-analysis package. The specifications of this software are: evaluation of the rhexis centration and evaluation of the position of the IOL in the posterior chamber. The end points are to analyze the quality of the centration of a rhexis after cataract surgery, the deformation of the rhexis with capsular bag retraction, and the centration of the IOL after implantation. Results: This software delivers tools to interactively measure the distances between the limbus, IOL and capsulorhexis and their changes over time. The user is invited to adjust the nodes of three radial curves for the limbus, the rhexis and the optic of the IOL. The radial distances of the curves are computed to evaluate the IOL implantation. The user is also able to define patterns for an ideal capsulorhexis and optimal IOL centration. We will present examples of calculations after cataract surgery. Conclusions: Evaluation of the centration of the rhexis and of the IOL after cataract surgery is an important end point for optimal IOL implantation. Multifocal or accommodative lenses in particular need a precise position in the bag with good stability over time. This software is able to evaluate these parameters immediately after surgery as well as their changes over time. The results of these evaluations can lead to optimization of surgical procedures and materials.
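The plug-in is written in Java for ImageJ; the snippet below is only a language-agnostic sketch (in Python) of the kind of geometry involved, assuming the user-adjusted nodes of the limbus, rhexis and IOL-optic curves are available as (x, y) coordinates. The function names and the centroid-based centration measure are illustrative assumptions, not the plug-in's actual API.

```python
import numpy as np

def centroid(points):
    """Centroid of a closed curve sampled at user-adjusted nodes."""
    return np.asarray(points, dtype=float).mean(axis=0)

def decentration(curve_a, curve_b):
    """Distance between the centroids of two curves, e.g. the rhexis
    and the IOL optic, as a simple centration measure."""
    return float(np.linalg.norm(centroid(curve_a) - centroid(curve_b)))

def radial_distances(outer, inner):
    """Radial distance from each node of the inner curve to the
    centroid of the outer curve (e.g. rhexis nodes vs. limbus center)."""
    c = centroid(outer)
    return np.linalg.norm(np.asarray(inner, dtype=float) - c, axis=1)
```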
Abstract:
Swain corrects the chi-square overidentification test (i.e., the likelihood ratio test of fit) for structural equation models with or without latent variables. The chi-square statistic is asymptotically correct; however, it does not behave as expected in small samples and/or when the model is complex (cf. Herzog, Boomsma, & Reinecke, 2007). Thus, particularly in situations where the ratio of sample size (n) to the number of parameters estimated (p) is relatively small (i.e., the p to n ratio is large), the chi-square test will tend to overreject correctly specified models. To obtain a closer approximation to the distribution of the chi-square statistic, Swain (1975) developed a correction: a scaling factor, which converges to 1 asymptotically, is multiplied by the chi-square statistic. The correction better approximates the chi-square distribution, resulting in more appropriate Type I error rates (see Herzog & Boomsma, 2009; Herzog et al., 2007).
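As a rough illustration of how such a correction is applied, the sketch below scales an obtained maximum-likelihood chi-square statistic and recomputes its p-value. The expression used for the scaling factor is a reconstruction of Swain's (1975) formula (p observed variables, t free parameters, d degrees of freedom, sample size N) and should be treated as an assumption rather than a verbatim transcription of the original.

```python
from math import sqrt
from scipy.stats import chi2

def swain_factor(num_vars, num_params, df, n_obs):
    # Reconstruction of Swain's (1975) scaling factor (assumption,
    # cf. Herzog & Boomsma, 2009); converges to 1 as n_obs grows.
    p, t, d, n = num_vars, num_params, df, n_obs - 1
    q = (sqrt(1 + 8 * t) - 1) / 2
    return 1 - (p * (2 * p**2 + 3 * p - 1)
                - q * (2 * q**2 + 3 * q - 1)) / (12 * d * n)

def swain_corrected_test(chisq, num_vars, num_params, df, n_obs):
    """Multiply the ML chi-square by the Swain factor and recompute
    the p-value under the same degrees of freedom."""
    s = swain_factor(num_vars, num_params, df, n_obs)
    corrected = s * chisq
    return corrected, chi2.sf(corrected, df)
```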
Abstract:
Barrels are discrete cytoarchitectonic clusters of neurons located in layer IV of the somatosensory cortex of the mouse brain. Each barrel is related to a specific whisker on the mouse snout. The whisker-to-barrel pathway is a part of the somatosensory system that is intensively used to explore sensory-activation-induced plasticity in the cerebral cortex. Different recording methods exist to explore the cortical response induced by whisker deflection in the cortex of anesthetized mice. In this work, we used a method called Single-Unit Analysis, in which we recorded the extracellular electric signals of a single barrel neuron using a microelectrode. After recording, the signal was processed by discriminators to isolate specific waveform shapes (action potentials). The objective of this thesis was to become familiar with barrel cortex recording during whisker deflection and its theoretical background, and to compare two different ways of discriminating and sorting the cortical signal: the Waveform Window Discriminator (WWD) and the Spike Shape Discriminator (SSD). The WWD is an electronic module allowing the selection of a specific electric signal shape. A trigger and a window potential level are set manually; during measurements, every time the electric signal passes through the two levels, a dot is generated on the time line. This was the method used in previous extracellular recording studies in the Département de Biologie Cellulaire et de Morphologie (DBCM) in Lausanne. The SSD is a function provided by the signal analysis software Spike2 (Cambridge Electronic Design). The neuronal signal is discriminated by a complex algorithm allowing the creation of specific templates, each of which is supposed to correspond to a cell response profile. The templates are saved as a number of points (62 in this study) and are set for each new cortical location. During measurements, every time the recorded cortical signal matches a defined proportion of the template points (60% in this study), a dot is generated on the time line. The advantage of the SSD is that multiple templates can be used during a single stimulation, allowing simultaneous recording of multiple signals. Different ways exist to represent the data after discrimination and sorting. The most commonly used in Single-Unit Analysis of the barrel cortex are the time between stimulation and the first cell response (the latency), the Response Magnitude (RM) after whisker deflection corrected for spontaneous activity, and the time distribution of neuronal spikes after whisker stimulation (Peri-Stimulus Time Histogram, PSTH). The results show that the RMs and the latencies in layer IV were significantly different between the WWD- and SSD-discriminated signals. The temporal distribution of the latencies shows that the SSD values were spread between 6 and 60 ms with no peak value, whereas the WWD data were all gathered around a peak of 11 ms (corresponding to previous studies). The scattered distribution of the latencies recorded with the SSD did not correspond to a cell response. The SSD appears to be a powerful tool for signal sorting, but we did not succeed in using it for Single-Unit Analysis extracellular recordings. Further recordings with different SSD template settings and a larger sample size may help to show the utility of this tool in Single-Unit Analysis studies.
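Spike2's SSD implementation is proprietary; the fragment below is only a conceptual sketch of the matching rule described above, assuming a template stored as 62 sample points and the 60% point-match criterion used in this study. The per-point tolerance is an illustrative assumption, not a Spike2 parameter.

```python
import numpy as np

def matches_template(waveform, template, match_fraction=0.60, tol=None):
    """Conceptual point-wise template match: a detected waveform is
    accepted if at least `match_fraction` of the template's sample
    points (62 in the study) lie within `tol` of the waveform."""
    waveform = np.asarray(waveform, dtype=float)
    template = np.asarray(template, dtype=float)
    assert waveform.shape == template.shape
    if tol is None:
        # Illustrative tolerance: 20% of the template's peak-to-peak amplitude.
        tol = 0.2 * np.ptp(template)
    hits = np.abs(waveform - template) <= tol
    return hits.mean() >= match_fraction
```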
Abstract:
The empirical literature on the asset allocation and medical expenditures of U.S. households consistently shows that risky portfolio shares are increasing in both wealth and health whereas health investment shares are decreasing in these same variables. Despite this evidence, most of the existing models treat financial and health-related choices separately. This paper bridges this gap by proposing a tractable framework for the joint determination of optimal consumption, portfolio and health investments. We solve for the optimal rules in closed form and show that the model can theoretically reproduce the empirical facts. Capitalizing on this closed-form solution, we perform a structural estimation of the model on HRS data. Our parameter estimates are reasonable and confirm the relevance of all the main characteristics of the model.
Abstract:
SUMMARY: We present a tool designed for the visualization of large-scale genetic and genomic data, exemplified by results from genome-wide association studies. This software provides an integrated framework to facilitate the interpretation of SNP association studies in their genomic context. Gene annotations can be retrieved from Ensembl, linkage disequilibrium data downloaded from HapMap, and custom data imported in BED or WIG format. AssociationViewer integrates functionalities that enable the aggregation or intersection of data tracks. It implements an efficient cache system and allows the display of several very large genomic datasets. AVAILABILITY: The Java code for AssociationViewer is distributed under the GNU General Public Licence and has been tested on Microsoft Windows XP, MacOSX and GNU/Linux operating systems. It is available from the SourceForge repository, which also includes Java Webstart, documentation and example data files.
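AssociationViewer is written in Java; purely to illustrate what intersecting two data tracks means, the Python sketch below intersects two lists of BED-style (chrom, start, end) intervals. The function and variable names are illustrative, not part of the tool.

```python
def intersect_tracks(track_a, track_b):
    """Intersect two lists of BED-style half-open intervals
    (chrom, start, end); returns the overlapping regions."""
    result = []
    for chrom_a, start_a, end_a in track_a:
        for chrom_b, start_b, end_b in track_b:
            if chrom_a != chrom_b:
                continue
            start, end = max(start_a, start_b), min(end_a, end_b)
            if start < end:
                result.append((chrom_a, start, end))
    return result

# Illustrative use with two tiny tracks.
peaks = [("chr1", 100, 200), ("chr2", 50, 80)]
genes = [("chr1", 150, 400)]
print(intersect_tracks(peaks, genes))   # [('chr1', 150, 200)]
```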
Abstract:
ABSTRACT: Research in empirical asset pricing has pointed out several anomalies, both in the cross section and time series of asset prices and in investors' portfolio choice. This dissertation aims to uncover the forces driving some of these "puzzling" asset pricing dynamics and portfolio decisions observed in financial markets. Throughout the dissertation I construct and study dynamic general equilibrium models of heterogeneous investors in the presence of frictions and evaluate quantitatively their implications for financial-market asset prices and portfolio choice. I also explore the potential roots of puzzles in international finance. Chapter 1 shows that, by jointly introducing endogenous no-default borrowing constraints and heterogeneous beliefs in a dynamic general-equilibrium economy, many empirical features of stock return volatility can be reproduced. While most of the research on stock return volatility is empirical, this paper provides a theoretical framework that is able to reproduce simultaneously the cross-sectional and time-series stylized facts concerning stock returns and their volatility. In contrast to the existing theoretical literature on stock return volatility, I do not impose persistence or regimes in any of the exogenous state variables or in preferences. Volatility clustering, asymmetry in the stock return-volatility relationship, and pricing of multi-factor volatility components in the cross section all arise endogenously as a consequence of the feedback between the binding of the no-default constraints and heterogeneous beliefs. Chapters 2 and 3 explore the implications of differences of opinion across investors in different countries for international asset pricing anomalies. Chapter 2 demonstrates that several international finance "puzzles" can be reproduced by a single risk factor that captures heterogeneous beliefs across international investors. These puzzles include: (i) home equity preference; (ii) the dependence of firm returns on local and foreign factors; (iii) the co-movement of returns and international capital flows; and (iv) abnormal returns around foreign firm cross-listing events in the local market. These are reproduced in a setup with symmetric information and in a perfectly integrated world with multiple countries and independent processes producing the same good. Chapter 3 shows that, by extending this framework to multiple goods and correlated production processes, the "forward premium puzzle" arises naturally as compensation for the heterogeneous expectations about the depreciation of the exchange rate held by international investors. Chapters 2 and 3 propose differences of opinion across international investors as a potential resolution of several international finance "puzzles". In a globalized world where both capital and information flow freely across countries, this explanation seems more appealing than existing asymmetric-information or segmented-markets theories aiming to explain international finance puzzles.