93 results for Data Warehousing Systems
Abstract:
Nowadays, there are several services and applications that allow users to locate and move around different tourist areas using a mobile device. These systems can be used either via the internet or by downloading an application at specific places, such as a visitor centre. Although such applications facilitate locating and searching for points of interest, in most cases these services and applications do not meet the needs of each user. This paper aims to provide a solution by studying the main projects, services and applications, their routing algorithms and their treatment of real geographical data on Android mobile devices, focusing on data acquisition and treatment to improve routing searches in off-line environments.
Abstract:
This paper aims to survey the techniques and methods described in the literature to analyse and characterise voltage sags, and the corresponding objectives of these works. The study has been performed from a data mining point of view.
Abstract:
The main objective of this paper is to develop a methodology that takes into account the human factor extracted from the database used by recommender systems, and which allows the specific problems of prediction and recommendation to be resolved. In this work, we propose to extract the user's human values scale from the user database in order to improve suitability in open environments, such as recommender systems. For this purpose, the methodology is applied to the user's data after they interact with the system. The methodology is exemplified with a case study.
Abstract:
In the B-ISDN there is a provision for four classes of services, all of them supported by a single transport network (the ATM network). Three of these services, the connection-oriented (CO) ones, permit connection admission control (CAC), but the fourth, the connectionless-oriented (CLO) one, does not. Therefore, when the CLO service and CO services have to share the same ATM link, a conflict may arise. This is because a bandwidth allocation that aims to obtain maximum statistical gain can damage the contracted ATM quality of service (QoS); and vice versa, in order to guarantee the contracted QoS, the statistical gain has to be sacrificed. The paper presents a performance evaluation study of the influence of the CLO service on a CO service (a circuit emulation service or a variable bit-rate service) when both share the same link.
Abstract:
Comparison of donor-acceptor electronic couplings calculated within two-state and three-state models suggests that the two-state treatment can provide unreliable estimates of Vda because it neglects multistate effects. We show that in most cases accurate values of the electronic coupling in a π stack, where donor and acceptor are separated by a bridging unit, can be obtained as Ṽda = (E2 − E1)μ12/Rda + [(2E3 − E1 − E2)/2]·(μ13μ23/Rda²), where E1, E2, and E3 are the adiabatic energies of the ground, charge-transfer, and bridge states, respectively, μij is the transition dipole moment between states i and j, and Rda is the distance between the planes of donor and acceptor. In this expression, based on the generalized Mulliken-Hush approach, the first term corresponds to the coupling derived within a two-state model, whereas the second term is the superexchange correction accounting for the bridge effect. The formula is extended to bridges consisting of several subunits. The influence of the donor-acceptor energy mismatch on the excess charge distribution, adiabatic dipole and transition moments, and electronic couplings is examined. A diagnostic is developed to determine whether the two-state approach can be applied. Based on numerical results, we show that the superexchange correction considerably improves estimates of the donor-acceptor coupling derived within a two-state approach. In most cases where the two-state scheme fails, the formula gives reliable results that are in good agreement (within 5%) with the data of the three-state generalized Mulliken-Hush model.
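To make the formula concrete, here is a minimal numerical sketch of the corrected coupling. The function name and all input values are hypothetical stand-ins, with energies in eV, transition dipole moments in e·Å, and Rda in Å, so that μ/Rda is dimensionless and Ṽda comes out in eV.

```python
# Minimal sketch of the two-state generalized Mulliken-Hush coupling
# plus the superexchange (bridge) correction quoted in the abstract.
# All numerical values below are hypothetical illustrations.

def gmh_coupling(E1, E2, E3, mu12, mu13, mu23, R_da):
    """V~_da = two-state GMH term + superexchange correction."""
    two_state = (E2 - E1) * mu12 / R_da
    superexchange = (2.0 * E3 - E1 - E2) * mu13 * mu23 / (2.0 * R_da**2)
    return two_state + superexchange

if __name__ == "__main__":
    # Hypothetical adiabatic energies (ground, charge-transfer, bridge)
    # and transition dipole moments between those states.
    V = gmh_coupling(E1=0.0, E2=0.25, E3=1.8,
                     mu12=0.4, mu13=1.2, mu23=1.5, R_da=6.8)
    print(f"V_da ~ {V:.4f} eV")
```

The second term grows as the bridge energy E3 approaches the tunneling energy (E1 + E2)/2, which is exactly the regime in which the plain two-state estimate becomes unreliable.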
Abstract:
A fundamental question in developmental biology is how tissues are patterned to give rise to differentiated body structures with distinct morphologies. The Drosophila wing disc offers an accessible model to understand epithelial spatial patterning. It has been studied extensively using genetic and molecular approaches. Bristle patterns on the thorax, which arise from the medial part of the wing disc, are a classical model of pattern formation, dependent on a pre-pattern of trans-activators and -repressors. Despite decades of molecular studies, we still only know a subset of the factors that determine the pre-pattern. We are applying a novel and interdisciplinary approach to predict regulatory interactions in this system. It is based on the description of expression patterns by simple logical relations (addition, subtraction, intersection and union) between simple shapes (graphical primitives). Similarities and relations between primitives have been shown to be predictive of regulatory relationships between the corresponding regulatory factors in other systems, such as the Drosophila egg. Furthermore, they provide the basis for dynamical models of the bristle-patterning network, which enable us to make even more detailed predictions on gene regulation and expression dynamics. We have obtained a data set of wing disc expression patterns which we are now processing to obtain average expression patterns for each gene. Through triangulation of the images we can transform the expression patterns into vectors which can easily be analysed by standard clustering methods. These analyses will allow us to identify primitives and regulatory interactions. We expect to identify new regulatory interactions and to understand the basic dynamics of the regulatory network responsible for thorax patterning. These results will provide us with a better understanding of the rules governing gene regulatory networks in general, and provide the basis for future studies of the evolution of the thorax-patterning network in particular.
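As an illustration of the clustering step described above, the sketch below groups fixed-length expression vectors (one value per mesh vertex after triangulation) with standard hierarchical clustering. The gene names and random data are hypothetical stand-ins, not the actual wing-disc data set.

```python
# Minimal sketch: expression patterns sampled onto a common triangulated
# mesh become fixed-length vectors that standard clustering can group.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(0)
genes = ["geneA", "geneB", "geneC", "geneD"]      # hypothetical names
expression = rng.random((len(genes), 500))        # one value per mesh vertex

# Correlation distance groups genes with similar spatial patterns,
# regardless of their absolute expression levels.
Z = linkage(expression, method="average", metric="correlation")
labels = fcluster(Z, t=2, criterion="maxclust")
for gene, label in zip(genes, labels):
    print(gene, "-> cluster", label)
```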
Abstract:
Background: To enhance our understanding of complex biological systems like diseases, we need to put all of the available data into context and use this to detect relations, patterns and rules which allow predictive hypotheses to be defined. Life science has become a data-rich science, with information about the behaviour of millions of entities like genes, chemical compounds, diseases, cell types and organs, which are organised in many different databases and/or spread throughout the literature. Existing knowledge such as genotype-phenotype relations or signal transduction pathways must be semantically integrated and dynamically organised into structured networks that are connected with clinical and experimental data. Different approaches to this challenge exist, but so far none has proven entirely satisfactory. Results: To address this challenge we previously developed a generic knowledge management framework, BioXM™, which allows the dynamic, graphic generation of domain-specific knowledge representation models based on specific objects and their relations, supporting annotations and ontologies. Here we demonstrate the utility of BioXM for knowledge management in systems biology as part of the EU FP6 BioBridge project on translational approaches to chronic diseases. From clinical and experimental data, text-mining results and public databases we generate a chronic obstructive pulmonary disease (COPD) knowledge base and demonstrate its use by mining specific molecular networks together with integrated clinical and experimental data. Conclusions: We generate the first semantically integrated COPD-specific public knowledge base and find that, for the integration of clinical and experimental data with pre-existing knowledge, the configuration-based set-up enabled by BioXM reduced implementation time and effort for the knowledge base compared to similar systems implemented as classical software development projects. The knowledge base enables the retrieval of sub-networks including protein-protein interaction, pathway, gene-disease and gene-compound data, which are used for subsequent data analysis, modelling and simulation. Pre-structured queries and reports enhance usability; establishing their use in everyday clinical settings requires further simplification with a browser-based interface, which is currently under development.
Abstract:
The increasing volume of data describing human disease processes and the growing complexity of understanding, managing, and sharing such data presents a huge challenge for clinicians and medical researchers. This paper presents the @neurIST system, which provides an infrastructure for biomedical research while aiding clinical care, by bringing together heterogeneous data and complex processing and computing services. Although @neurIST targets the investigation and treatment of cerebral aneurysms, the system's architecture is generic enough that it could be adapted to the treatment of other diseases. Innovations in @neurIST include confining the patient data pertaining to aneurysms inside a single environment that offers clinicians the tools to analyze and interpret patient data and make use of knowledge-based guidance in planning their treatment. Medical researchers gain access to a critical mass of aneurysm-related data due to the system's ability to federate distributed information sources. A semantically mediated grid infrastructure ensures that both clinicians and researchers are able to seamlessly access and work on data that is distributed across multiple sites in a secure way, in addition to providing computing resources on demand for performing computationally intensive simulations for treatment planning and research.
Abstract:
This article reviews the methodology of studies on drug utilization, with particular emphasis on primary care. Population-based studies of drug inappropriateness can be done with microdata from electronic health records and e-prescriptions. Multilevel models estimate the influence of factors affecting the appropriateness of drug prescription at different hierarchical levels: patient, doctor, health care organization and regulatory environment. Work by the GIUMAP suggests that patient characteristics are the most important factor in the appropriateness of prescriptions, with significant effects at the general practitioner level.
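As a hedged illustration of the multilevel approach mentioned above, the sketch below fits a two-level mixed-effects model with a random intercept per general practitioner. The column names, synthetic data and continuous outcome are hypothetical stand-ins, not the GIUMAP specification.

```python
# Illustrative two-level model: prescription appropriateness explained
# by patient-level covariates, with doctor-level random intercepts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "appropriateness": rng.normal(0.7, 0.1, n),   # stand-in outcome
    "patient_age": rng.integers(18, 95, n),       # patient level
    "n_comorbidities": rng.integers(0, 6, n),     # patient level
    "gp_id": rng.integers(0, 20, n),              # doctor level
})

model = smf.mixedlm("appropriateness ~ patient_age + n_comorbidities",
                    data=df, groups=df["gp_id"])
print(model.fit().summary())
```

A third level (health care organization) would typically enter through nested grouping or variance components, which mixed-model libraries generally support.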
Abstract:
In this paper we describe the existence of financial illusion in public accounting and we comment on its effects for the future sustainability of local public services. We relate these features to the lack of incentives amongst public managers for improving financial reporting and thus the management of public assets. Financial illusion pays off for politicians and managers since it allows for larger public expenditure increases and managerial slack, these being arguments in their utility functions. This preference is strengthened by the short time perspective of politically appointed public managers. Both factors run against public accountability. This hypothesis is tested for Spain by using a unique sample. We take data from around forty Catalan local authorities with population above 20,000 for the financial years 1993-98. We build this database from the Catalan Auditing Office reports in a way that it can be linked to other local social and economic variables in order to test our assumptions. The results confirm that there is a statistical relationship between the financial illusion index (FI, as constructed in the paper) and higher current expenditure. This is reflected in important overruns and increasing delays in paying suppliers, as well as in greater difficulties in financing capital. Mechanisms for FI creation have to do, among other factors, with delays in paying suppliers (and thereafter higher future financial costs per unit of service), inadequate provision for bad debts and a lack of appropriate capital funding, either for replacement or for new equipment. For this reason, it is crucial to monitor the way in which capital transfers are accounted for in local public balance sheets. As a result, for most of the municipalities we analyse, the funds for guaranteeing continuity and sustainability of public service provision are today at risk. Given the managerial incentives at present in public institutions, we conclude that public regulation recently enforced to assure better information systems in local public management may not be enough to change the current state of affairs.
Abstract:
Gaia is the most ambitious space astrometry mission currently envisaged and is a technological challenge in all its aspects. We describe a proposal for the payload data handling system of Gaia, as an example of a high-performance, real-time, concurrent, and pipelined data system. This proposal includes the front-end systems for the instrumentation, the data acquisition and management modules, the star data processing modules, and the payload data handling unit. We also review other payload and service module elements and we illustrate a data flux proposal.
Abstract:
Drift is an important issue that impairs the reliability of gas sensing systems. Sensor aging, memory effects and environmental disturbances produce shifts in sensor responses that make initial statistical models for gas or odor recognition useless after a relatively short period (typically a few weeks). Frequent recalibrations are needed to preserve system accuracy. However, when recalibrations involve numerous samples they become expensive and laborious. An interesting and lower-cost alternative is drift counteraction by signal processing techniques. Orthogonal Signal Correction (OSC) is proposed for drift compensation in chemical sensor arrays. The performance of OSC is also compared with Component Correction (CC). A simple classification algorithm has been employed for assessing the performance of the algorithms on a dataset composed of measurements of three analytes using an array of seventeen conductive polymer gas sensors over a ten-month period.
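A minimal sketch of the idea behind OSC follows: remove from the sensor matrix X its dominant direction of variance that is orthogonal to the class labels Y, which is where slow drift tends to accumulate. This is a simplified direct variant for illustration, not necessarily the exact algorithm benchmarked in the paper, and the data are synthetic stand-ins.

```python
# Simplified one-component Orthogonal Signal Correction: deflate from X
# the strongest variance direction that carries no information about Y.
import numpy as np

def osc_correct(X, Y, n_components=1):
    """Return X with its dominant Y-orthogonal component(s) removed."""
    # Projector onto the part of sample space orthogonal to Y.
    P = np.eye(len(Y)) - Y @ np.linalg.pinv(Y)
    Z = P @ X                              # variation in X unrelated to Y
    # Leading right singular vectors of Z = candidate drift directions.
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    W = Vt[:n_components].T                # (n_features, n_components)
    T = X @ W                              # scores along those directions
    return X - T @ W.T                     # deflated sensor matrix

rng = np.random.default_rng(2)
X = rng.normal(size=(60, 17))              # 60 samples, 17 polymer sensors
Y = np.eye(3)[rng.integers(0, 3, 60)]      # one-hot labels, 3 analytes
print(osc_correct(X, Y).shape)             # (60, 17), drift component removed
```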
Abstract:
The finite-size-dependent enhancement of pairing in mesoscopic Fermi systems is studied under the assumption that the BCS approach is valid and that the two-body force is size independent. Different systems are investigated, such as superconducting metallic grains and films, as well as atomic nuclei. It is shown that the finite-size enhancement of pairing in these systems is in part due to the presence of a surface, which accounts quite well for the data on nuclei and explains a good fraction of the enhancement in Al grains.
Abstract:
We study theoretical and empirical aspects of the mean exit time (MET) of financial time series. The theoretical modeling is done within the framework of the continuous time random walk. We empirically verify that the mean exit time follows a quadratic scaling law with an associated prefactor that is specific to the analyzed stock. We perform a series of statistical tests to determine which kinds of correlation are responsible for this specificity. The main contribution is associated with the autocorrelation property of stock returns. We introduce and solve analytically both two-state and three-state Markov chain models. The analytical results obtained with the two-state Markov chain model allow us to obtain a data collapse of the 20 measured MET profiles onto a single master curve.
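As an illustration of the quadratic scaling law, the following Monte-Carlo sketch measures the mean exit time of a walk whose step signs follow a two-state Markov chain. The persistence parameter and interval sizes are illustrative choices, not values fitted to the stock data.

```python
# Monte-Carlo sketch: mean exit time (MET) from the interval (-L, L)
# for a walk whose step signs follow a two-state Markov chain.
import numpy as np

def mean_exit_time(L, persist=0.4, n_runs=2000, seed=3):
    """Average number of unit steps before the walk leaves (-L, L).
    persist = probability of repeating the previous step sign."""
    rng = np.random.default_rng(seed)
    total = 0
    for _ in range(n_runs):
        x, sign, t = 0, 1, 0
        while abs(x) < L:
            if rng.random() > persist:   # flip the sign, else keep it
                sign = -sign
            x += sign
            t += 1
        total += t
    return total / n_runs

for L in (5, 10, 20):
    met = mean_exit_time(L)
    # MET/L^2 should be roughly constant: the quadratic scaling law,
    # with a prefactor set by the sign autocorrelation (persist).
    print(f"L={L:3d}  MET={met:8.1f}  MET/L^2={met / L**2:.3f}")
```

Varying `persist` changes the prefactor but not the quadratic exponent, mirroring the stock-specific prefactors reported above.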