952 results for financial data processing


Relevance:

90.00%

Publisher:

Abstract:

The health system is one sector dealing with very large amounts of complex data. Many healthcare organisations struggle to utilise these volumes of health data effectively and efficiently, so a very effective system is needed to capture, collate and distribute them. A number of technologies have been identified for integrating data from different sources. Data warehousing is one technology that can be used to manage clinical data in healthcare. This paper addresses how data warehousing can assist in improving cardiac surgery decision making, using the cardiac surgery unit at The Prince Charles Hospital (TPCH) as a case study. In order to deal with other units efficiently, it is important to integrate disparate data into a single point of interrogation, and we propose implementing a data warehouse for the cardiac surgery unit at TPCH. The data warehouse prototype was developed using SAS Enterprise Data Integration Studio 4.2, and the data were analysed using SAS Enterprise Edition 4.3. This improves access to integrated clinical and financial data, improves the framing of data in the clinical context, and potentially supports better-informed decision making for both improved management and patient care.
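The "single point of interrogation" idea can be sketched as a minimal star schema in which one fact table joins the clinical and financial views of a surgical episode. The table and column names below are illustrative assumptions, not TPCH's actual schema, and SQLite stands in for the SAS tooling used in the study.

```python
import sqlite3

# Illustrative star schema (hypothetical names): one fact table linking
# clinical measures (length of stay) with financial ones (episode cost).
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_patient   (patient_id INTEGER PRIMARY KEY, age INTEGER);
CREATE TABLE dim_procedure (proc_id    INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE fact_episode (
    episode_id INTEGER PRIMARY KEY,
    patient_id INTEGER REFERENCES dim_patient(patient_id),
    proc_id    INTEGER REFERENCES dim_procedure(proc_id),
    los_days   INTEGER,   -- clinical: length of stay
    cost_aud   REAL       -- financial: episode cost
);
INSERT INTO dim_patient   VALUES (1, 67), (2, 72);
INSERT INTO dim_procedure VALUES (10, 'CABG'), (11, 'Valve repair');
INSERT INTO fact_episode  VALUES (100, 1, 10,  9, 42000.0),
                                 (101, 2, 10, 12, 55000.0),
                                 (102, 2, 11,  7, 38000.0);
""")

# One query now frames the financial data in its clinical context.
rows = con.execute("""
    SELECT p.name, COUNT(*), AVG(f.los_days), AVG(f.cost_aud)
    FROM fact_episode f JOIN dim_procedure p USING (proc_id)
    GROUP BY p.name ORDER BY p.name
""").fetchall()
for name, n, los, cost in rows:
    print(name, n, los, cost)
```

Disparate clinical and financial sources land in one fact table, so a single query answers questions that would otherwise require stitching systems together by hand.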

Relevance:

90.00%

Publisher:

Abstract:

This paper describes a safety data recording and analysis system that has been developed to capture safety occurrences, including precursors, using high-definition forward-facing video from train cabs and data from other train-borne systems. The paper describes the data processing model and how events detected through data analysis are related to an underlying socio-technical model of accident causation. The integrated approach to safety data recording and analysis ensures that systemic factors that condition, influence or potentially contribute to an occurrence are captured for both safety occurrences and precursor events, providing a rich tapestry of antecedent causal factors that can significantly improve learning around accident causation. This can ultimately benefit railways through the development of targeted and more effective countermeasures, better risk models and more effective use and prioritization of safety funds. Level crossing occurrences are a key focus of this paper, with data analysis scenarios describing causal factors around near-miss occurrences. The paper concludes with a discussion of how the system can also be applied to other types of railway safety occurrences.

Relevance:

90.00%

Publisher:

Abstract:

Increasingly large-scale applications are generating an unprecedented amount of data. However, the growing gap between computation and I/O capacity on High End Computing (HEC) machines creates a severe bottleneck for data analysis. Instead of moving data from its source to output storage, in-situ analytics processes output data while simulations are running. However, in-situ data analysis incurs much more contention for computing resources with the simulation, and such contention can severely degrade simulation performance on HEC platforms. Since different data processing strategies have different impacts on performance and cost, there is a consequent need for flexibility in the location of data analytics. In this paper, we explore and analyze several potential data-analytics placement strategies along the I/O path. To find the best strategy for reducing data movement in a given situation, we propose a flexible data analytics (FlexAnalytics) framework. Based on this framework, a FlexAnalytics prototype system was developed for analytics placement. FlexAnalytics enhances the scalability and flexibility of the current I/O stack on HEC platforms and is useful for data pre-processing, runtime data analysis and visualization, as well as for large-scale data transfer. Two use cases – scientific data compression and remote visualization – were applied in the study to verify the performance of FlexAnalytics. Experimental results demonstrate that the FlexAnalytics framework increases data transfer bandwidth and improves application end-to-end transfer performance.
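The placement trade-off that such a framework explores can be caricatured with a back-of-envelope cost model: is it cheaper to ship raw output, or to spend simulation CPU time compressing in situ first? The numbers and the two-option comparison below are illustrative assumptions, not FlexAnalytics' actual decision logic.

```python
def transfer_s(data_bytes, bandwidth_bps):
    """Seconds to move data_bytes over a link of bandwidth_bps bytes/s."""
    return data_bytes / bandwidth_bps

def placement_cost(data_bytes, bandwidth_bps, analysis_s,
                   compress_ratio=1.0, compress_s=0.0):
    """Total time for one analytics step under a given placement.

    compress_ratio > 1 models compressing at the source before moving;
    compress_s is CPU time the compression steals from the simulation.
    """
    return (compress_s
            + transfer_s(data_bytes / compress_ratio, bandwidth_bps)
            + analysis_s)

GB = 1024 ** 3
# Option A: ship raw output to a staging node and analyse it there.
raw = placement_cost(64 * GB, bandwidth_bps=1 * GB, analysis_s=20.0)
# Option B: compress in situ (5x, costing 8 s of simulation time), then ship.
insitu = placement_cost(64 * GB, bandwidth_bps=1 * GB, analysis_s=20.0,
                        compress_ratio=5.0, compress_s=8.0)
print(raw, insitu)  # in-situ compression wins when the link is slow
```

With a fast interconnect the same model flips in favour of shipping raw data, which is exactly why the placement decision needs to be flexible rather than fixed.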

Relevance:

90.00%

Publisher:

Abstract:

During the last few decades there has been a global shift in forest management from a focus solely on timber to ecosystem management that endorses all aspects of forest functions: ecological, economic and social. This has resulted in a paradigm shift from sustained yield to the sustained diversity of values, goods and benefits obtained at the same time, introducing new temporal and spatial scales into forest resource management. The purpose of the present dissertation was to develop methods that enable these spatial and temporal scales to be introduced into the storage, processing, access and utilisation of forest resource data. The methods developed are based on a conceptual view of a forest as a hierarchically nested collection of objects that can have a dynamically changing set of attributes. The temporal aspect of the methods consists of lifetime management for the objects and their attributes, and of a temporal succession linking the objects together. Development of the forest resource data processing method concentrated on the extensibility and configurability of the data content and model calculations, allowing a diverse set of processing operations to be executed within the same framework. The contribution of this dissertation to the utilisation of multi-scale forest resource data lies in the development of a reference data generation method that supports forest inventory methods in approaching single-tree resolution.
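The object model described here, hierarchically nested objects whose attributes have managed lifetimes, might be sketched as follows. The class and field names are hypothetical stand-ins, not the dissertation's actual design.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class Attribute:
    name: str
    value: float
    valid_from: int                  # year the value became valid
    valid_to: Optional[int] = None   # None = still current

@dataclass
class ForestObject:
    """A node in the nested hierarchy, e.g. region -> stand -> tree."""
    kind: str
    children: list = field(default_factory=list)
    attributes: list = field(default_factory=list)
    successor: Optional["ForestObject"] = None  # temporal succession link

    def set_attr(self, name, value, year):
        # Close the lifetime of the current value, then record the new one.
        for a in self.attributes:
            if a.name == name and a.valid_to is None:
                a.valid_to = year
        self.attributes.append(Attribute(name, value, year))

    def attr_at(self, name, year):
        # Return the value whose lifetime covers the requested year.
        for a in self.attributes:
            if (a.name == name and a.valid_from <= year
                    and (a.valid_to is None or year < a.valid_to)):
                return a.value
        return None

stand = ForestObject("stand")
stand.set_attr("mean_height_m", 14.2, year=2000)
stand.set_attr("mean_height_m", 16.8, year=2010)
print(stand.attr_at("mean_height_m", 2005))  # → 14.2
print(stand.attr_at("mean_height_m", 2012))  # → 16.8
```

Storing attribute lifetimes rather than overwriting values is what lets the same structure answer queries at different points in time.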

Relevance:

90.00%

Publisher:

Abstract:

The hot deformation behavior of α brass with zinc contents in the range 3%–30% was characterized using hot compression testing in the temperature range 600–900 °C and strain rate range 0.001–100 s⁻¹. On the basis of the flow stress data, processing maps showing the variation of the efficiency of power dissipation (given by 2m/(m+1), where m is the strain rate sensitivity) with temperature and strain rate were obtained. α brass exhibits a domain of dynamic recrystallization (DRX) at temperatures greater than 0.85Tm and at strain rates lower than 1 s⁻¹. The maximum efficiency of power dissipation increases with increasing zinc content and is in the range 33%–53%. The DRX domain shifts to lower strain rates for higher zinc contents, and the strain rate for peak efficiency is in the range 0.0001–0.05 s⁻¹. The results indicate that DRX in α brass is controlled by the rate of interface formation (nucleation), which depends on the diffusion-controlled process of thermal recovery by climb.
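The efficiency of power dissipation used in these processing maps is η = 2m/(m+1), where the strain rate sensitivity m is the slope of ln σ versus ln ε̇ estimated from flow-stress data. A minimal sketch of that calculation follows; the flow-stress values are made up for illustration, not measurements from the study.

```python
import math

def strain_rate_sensitivity(strain_rates, flow_stresses):
    """Least-squares slope m of ln(sigma) versus ln(strain rate)."""
    xs = [math.log(e) for e in strain_rates]
    ys = [math.log(s) for s in flow_stresses]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    den = sum((x - xbar) ** 2 for x in xs)
    return num / den

def efficiency(m):
    """Efficiency of power dissipation, eta = 2m/(m+1)."""
    return 2 * m / (m + 1)

# Hypothetical flow stresses (MPa) at one temperature, four strain rates,
# chosen so that roughly sigma ∝ (strain rate)^0.2.
rates = [0.001, 0.01, 0.1, 1.0]
stresses = [20.0, 31.7, 50.2, 79.6]
m = strain_rate_sensitivity(rates, stresses)
print(round(m, 3), round(efficiency(m), 3))
```

Repeating this fit at each temperature–strain-rate grid point and contouring η over the grid is what produces the processing map itself.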

Relevance:

90.00%

Publisher:

Abstract:

The effect of zirconium on the hot working characteristics of alpha and alpha-beta brass was studied in the temperature range 500–850 °C and the strain rate range 0.001–100 s⁻¹. On the basis of the flow stress data, processing maps showing the variation of the efficiency of power dissipation (given by [2m/(m+1)], where m is the strain rate sensitivity) with temperature and strain rate were obtained. The addition of zirconium to alpha brass decreased the maximum efficiency of power dissipation from 53% to 39%, increased the strain rate for dynamic recrystallization (DRX) from 0.001 to 0.1 s⁻¹ and improved the hot workability. Alpha-beta brasses with and without zirconium exhibit a domain in the temperature range 550–750 °C and at strain rates lower than 1 s⁻¹, with a maximum efficiency of power dissipation of nearly 50% occurring in the temperature range 700–750 °C and at a strain rate of 0.001 s⁻¹. In this domain, the alpha phase undergoes DRX and controls the hot deformation of the alloy, whereas the beta phase deforms superplastically. The addition of zirconium to alpha-beta brass does not affect the processing maps, as it partitions to the beta phase and does not alter the constitutive behavior of the alpha phase.

Relevance:

90.00%

Publisher:

Abstract:

The constitutive behaviour of α-nickel silver in the temperature range 700–950 °C and strain rate range 0.001–100 s⁻¹ was characterized with the help of a processing map generated on the basis of the principles of the "dynamic materials model" of Prasad et al. Using the flow stress data, processing maps showing the variation of the efficiency of power dissipation (given by 2m/(m+1), where m is the strain-rate sensitivity) with temperature and strain rate were obtained. α-nickel silver exhibits a single domain at temperatures greater than 750 °C and at strain rates lower than 1 s⁻¹, with a maximum efficiency of 38% occurring at about 950 °C and a strain rate of 0.1 s⁻¹. In this domain the material undergoes dynamic recrystallization (DRX). On the basis of a model, it is shown that the DRX is controlled by the rate of interface formation (nucleation), which depends on the diffusion-controlled process of thermal recovery by climb. At high strain rates (10 and 100 s⁻¹) the material undergoes microstructural instabilities, the manifestations of which take the form of adiabatic shear bands and strain markings.

Relevance:

90.00%

Publisher:

Abstract:

The constitutive behaviour of α-β nickel silver in the temperature range 600–850 °C and strain rate range 0.001–100 s⁻¹ was characterized with the help of a processing map generated on the principles of the dynamic materials model. On the basis of the flow-stress data, processing maps showing the variation of the efficiency of power dissipation (given by [2m/(m+1)], where m is the strain-rate sensitivity) with temperature and strain rate were obtained. α-β nickel silver exhibits a single domain at temperatures greater than 700 °C and at strain rates lower than 1 s⁻¹, with a maximum efficiency of power dissipation of about 42% occurring at about 850 °C and 0.1 s⁻¹. In this domain, the α phase undergoes dynamic recrystallization and controls the deformation of the alloy, while the β phase deforms superplastically. Optimum conditions for the processing of α-β nickel silver are 850 °C and 0.1 s⁻¹. The material undergoes unstable flow at strain rates of 10 and 100 s⁻¹ in the temperature range 600–750 °C, manifested in the form of adiabatic shear bands.

Relevance:

90.00%

Publisher:

Abstract:

Based on a computer-integrated, flexible laser processing system, an intelligent measuring sub-system was developed. A novel model was built to compensate for deviations of the main frame, and a newly developed 3-D laser tracker system was applied to adjust the accuracy of the system. By analyzing the characteristics of various automobile dies, the main processing objects of the laser processing system, the types of surface and border to be measured and processed were classified. According to the different types of surface and border, a 2-D adaptive measuring method based on Bézier curves and a 3-D adaptive measuring method based on spline curves were developed. For the data processing, a new 3-D probe compensation method is described in detail. Measuring experiments and laser processing experiments were carried out to verify the methods. All the methods have been applied in the computer-integrated, flexible laser processing system developed by the Institute of Mechanics, CAS.
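The 2-D measuring method builds on Bézier curves, which can be evaluated with the standard de Casteljau recursion; a minimal sketch follows, with arbitrary example control points rather than measurement data from the system.

```python
def de_casteljau(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by
    repeated linear interpolation of the control polygon."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [tuple((1 - t) * a + t * b for a, b in zip(p, q))
               for p, q in zip(pts, pts[1:])]
    return pts[0]

# Quadratic Bezier defined by three control points.
ctrl = [(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)]
print(de_casteljau(ctrl, 0.0))   # → (0.0, 0.0), the start point
print(de_casteljau(ctrl, 0.5))   # → (1.0, 1.0), the curve's midpoint
print(de_casteljau(ctrl, 1.0))   # → (2.0, 0.0), the end point
```

Sampling such a curve at a density adapted to its local curvature is one natural way an adaptive measuring method can place probe points along a die's border.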

Relevance:

90.00%

Publisher:

Abstract:

Owing to the problems experienced in social security throughout its history, governments have promoted structural reforms aimed at resolving these problems and balancing the public accounts. Brazilian social security was shaped into a multi-pillar system comprising a public pension scheme covering private sector workers (RGPS), a complementary pension scheme (RPC) and a public sector pension scheme (RPPS). The RPPS is a scheme with mandatory affiliation and compulsory contributions: its contributors cannot contest their enrolment, which remains beyond question for as long as an employment relationship exists. Understanding how the pension scheme to which one is affiliated works, and one's obligations and rights as contributor and beneficiary, is of vital importance for accepting one's participation, as well as for sharing responsibility for the management of the resources contributed to the system. In this context, this study aimed to evaluate, by means of descriptive research adopting a quantitative method for data treatment, whether the accounting information generated by the municipal pension scheme is useful to the decision-making process of this last class of insured members, the municipal civil servants. The results showed that municipal servants display little interest in obtaining pension information, especially financial and accounting information. This low interest stems from two factors: difficulty of access (65% of respondents) and limited knowledge of related topics such as funding sources and the application of resources (62%). The civil servants' low interest in obtaining information about PREVIRIO/FUNPREVI gives the system's manager freedom to decide the direction to be taken by the pension institution.
Civil servants need to become aware of the management results of the pension system, and this requires two actions: on the one hand, a willingness to act on the part of the servants themselves, taking responsibility for the direction of the system; on the other, a proactive stance from those responsible for managing and preparing the information to be made available to this segment of users.

Relevance:

90.00%

Publisher:

Abstract:

The accurate prediction of time-changing covariances is an important problem in the modeling of multivariate financial data. However, some of the most popular models suffer from a) overfitting problems and multiple local optima, b) failure to capture shifts in market conditions and c) large computational costs. To address these problems we introduce a novel dynamic model for time-changing covariances. Over-fitting and local optima are avoided by following a Bayesian approach instead of computing point estimates. Changes in market conditions are captured by assuming a diffusion process in parameter values, and finally computationally efficient and scalable inference is performed using particle filters. Experiments with financial data show excellent performance of the proposed method with respect to current standard models.
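The particle-filter machinery mentioned here can be illustrated on a much simpler one-dimensional stochastic-volatility model. The model, its parameter values and the bootstrap filter below are a generic textbook sketch under assumed settings, not the paper's covariance model or its inference scheme.

```python
import math, random

random.seed(0)

# Latent log-variance h_t follows a near-random-walk (a crude stand-in for
# diffusion in parameter values); the observed return is y_t ~ N(0, exp(h_t)).
PHI, Q = 0.98, 0.05   # assumed AR coefficient and state-noise variance

def particle_filter(returns, n_particles=500):
    particles = [random.gauss(0.0, 1.0) for _ in range(n_particles)]
    estimates = []
    for y in returns:
        # Propagate each particle through the state transition.
        particles = [PHI * h + random.gauss(0.0, math.sqrt(Q))
                     for h in particles]
        # Weight by the observation likelihood N(y; 0, exp(h)).
        weights = [math.exp(-0.5 * y * y / math.exp(h) - 0.5 * h)
                   for h in particles]
        total = sum(weights)
        weights = [w / total for w in weights]
        # Posterior-mean variance estimate, then multinomial resampling.
        estimates.append(sum(w * math.exp(h)
                             for w, h in zip(weights, particles)))
        particles = random.choices(particles, weights=weights, k=n_particles)
    return estimates

# A calm period followed by a volatile one: the estimate should rise.
returns = ([random.gauss(0, 0.1) for _ in range(50)]
           + [random.gauss(0, 1.0) for _ in range(50)])
est = particle_filter(returns)
print(round(est[40], 4), round(est[-1], 4))
```

Because the filter re-weights and resamples at every step, it adapts online to the regime shift, which is the property the paper exploits to capture changing market conditions.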

Relevance:

90.00%

Publisher:

Abstract:

The prediction of time-changing variances is an important task in the modeling of financial data. Standard econometric models are often limited as they assume rigid functional relationships for the evolution of the variance. Moreover, functional parameters are usually learned by maximum likelihood, which can lead to over-fitting. To address these problems we introduce GP-Vol, a novel non-parametric model for time-changing variances based on Gaussian Processes. This new model can capture highly flexible functional relationships for the variances. Furthermore, we introduce a new online algorithm for fast inference in GP-Vol. This method is much faster than current offline inference procedures and it avoids overfitting problems by following a fully Bayesian approach. Experiments with financial data show that GP-Vol performs significantly better than current standard alternatives.
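GP-Vol's key ingredient is a Gaussian process prior over the variance transition function. As a toy illustration of that ingredient only (not GP-Vol's online inference), here is a bare-bones GP posterior mean with an RBF kernel, fitted to hypothetical (h_t, h_{t+1}) pairs of log-variances; every name and number below is an assumption of ours.

```python
import math

def rbf(x1, x2, ell=1.0, sf=1.0):
    """Squared-exponential (RBF) kernel."""
    return sf * math.exp(-0.5 * (x1 - x2) ** 2 / ell ** 2)

def solve(A, b):
    """Gaussian elimination with partial pivoting, for small systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n + 1):
                M[r][c] -= f * M[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][c] * x[c]
                              for c in range(i + 1, n))) / M[i][i]
    return x

def gp_predict(xs, ys, x_star, noise=1e-4):
    """GP posterior mean at x_star: k(x_star, X) @ (K + noise*I)^-1 @ y."""
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    alpha = solve(K, ys)
    return sum(a * rbf(x, x_star) for a, x in zip(alpha, xs))

# Hypothetical transition data: pairs (h_t, h_{t+1}), roughly the identity.
xs = [-2.0, -1.0, 0.0, 1.0, 2.0]
ys = [-1.9, -1.1, 0.1, 0.9, 1.8]
print(round(gp_predict(xs, ys, 0.5), 3))
```

The point of the non-parametric prior is visible even here: the predicted transition is learned from the data, with no rigid functional form imposed in advance.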

Relevance:

90.00%

Publisher:

Abstract:

Statistical analysis of diffusion tensor imaging (DTI) data requires a computational framework that is both numerically tractable (to account for the high dimensional nature of the data) and geometric (to account for the nonlinear nature of diffusion tensors). Building upon earlier studies exploiting a Riemannian framework to address these challenges, the present paper proposes a novel metric and an accompanying computational framework for DTI data processing. The proposed approach grounds the signal processing operations in interpolating curves. Well-chosen interpolating curves are shown to provide a computational framework that is at the same time tractable and information relevant for DTI processing. In addition, and in contrast to earlier methods, it provides an interpolation method which preserves anisotropy, a central information carried by diffusion tensor data. © 2013 Springer Science+Business Media New York.
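For context, the log-Euclidean interpolation used by earlier Riemannian approaches (the baseline this work improves on for anisotropy preservation, not the paper's new metric) can be sketched for 2×2 symmetric positive-definite tensors, stored here as (a, b, c) for [[a, b], [b, c]]; the helper names are ours.

```python
import math

def eig_sym2(a, b, c):
    """Eigenvalues and rotation angle of the symmetric matrix [[a,b],[b,c]]."""
    gap = math.sqrt((a - c) ** 2 / 4 + b * b)
    l1, l2 = (a + c) / 2 + gap, (a + c) / 2 - gap
    theta = 0.5 * math.atan2(2 * b, a - c)
    return l1, l2, theta

def from_eig(l1, l2, theta):
    """Rebuild (a, b, c) from eigenvalues and rotation angle."""
    co, si = math.cos(theta), math.sin(theta)
    return (l1 * co * co + l2 * si * si,
            (l1 - l2) * co * si,
            l1 * si * si + l2 * co * co)

def log_euclidean_interp(T0, T1, t):
    """Interpolate two SPD tensors via matrix log, linear blend, matrix exp."""
    def logm(T):
        l1, l2, th = eig_sym2(*T)
        return from_eig(math.log(l1), math.log(l2), th)
    def expm(L):
        l1, l2, th = eig_sym2(*L)
        return from_eig(math.exp(l1), math.exp(l2), th)
    L0, L1 = logm(T0), logm(T1)
    return expm(tuple((1 - t) * x + t * y for x, y in zip(L0, L1)))

# Two tensors with the same eigenvalues {2, 0.5} in swapped directions.
mid = log_euclidean_interp((2.0, 0.0, 0.5), (0.5, 0.0, 2.0), 0.5)
print(tuple(round(x, 6) for x in mid))  # → (1.0, 0.0, 1.0)
```

Unlike a plain linear average, the log-Euclidean midpoint keeps the determinant at the geometric mean (here 1) so tensors do not "swell"; note, however, that the midpoint above is isotropic, losing the anisotropy of both endpoints, which is precisely the shortcoming the paper's interpolating-curve approach targets.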

Relevance:

90.00%

Publisher:

Abstract:

Q. Shen and R. Jensen, 'Rough sets, their extensions and applications,' International Journal of Automation and Computing (IJAC), vol. 4, no. 3, pp. 217-218, 2007.