989 results for information visualization


Relevance: 20.00%

Publisher:

Abstract:

The impact of information technology (IT) is rarely questioned today, but measuring the value it generates remains a real challenge. Chief Information Officers (CIOs) explain the absence of a solid IT master plan and of a clear mid- and long-term vision by a lack of time and resources, but also by insufficient involvement of senior business management (e.g. CEOs and CFOs). Unable to measure the economic value of the information system precisely, the CIO faces a logic of management by costs, with permanent pressure to justify IT spending and investments, which is detrimental to the IT department's action. Conversely, a measure of the economic value of IT would give senior management the material needed to assess the maturity and contribution of their information system and thus facilitate decision making. The objective of this thesis is to assess the alignment of IT with the business strategy, the quality of performance measurement of the information system, and the organisation and positioning of the IT function within the company. These three key elements of IT governance were measured through two successive survey waves conducted in 2000/2001 (CIOs) and 2002/2003 (CIOs and CEOs) in French-speaking Europe (French-speaking Switzerland, France, Belgium and Luxembourg).

Relevance: 20.00%

Publisher:

Abstract:

Fibrocytes are important for understanding the progression of many diseases because they are present in areas where pathogenic lesions are generated. However, the morphology of fibrocytes and their interactions with parasites are poorly understood. In this study, we examined the morphology of peripheral blood fibrocytes and their interactions with Leishmania (L.) amazonensis. Through ultrastructural analysis, we describe the details of fibrocyte morphology and how fibrocytes rapidly internalise Leishmania promastigotes. The parasites differentiated into amastigotes after 2 h in phagolysosomes, and the infection was completely resolved after 72 h. Early in the infection, we found increased nitric oxide production and large lysosomes with electron-dense material; these factors may regulate the proliferation and death of the parasites. Because fibrocytes are present at the infection site and are directly involved in the development of cutaneous leishmaniasis, they are targets for effective, non-toxic cell-based therapies to control and treat leishmaniasis.

Relevance: 20.00%

Publisher:

Abstract:

Background: Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species, because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predicted geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches in which pseudo-absences are selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results: Models built with true absences had the best predictive power and the best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second-highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion: If ecologists wish to build parsimonious GLM models that allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
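The workflow recommended in the conclusion can be sketched briefly. The snippet below is an illustrative outline only (the virtual species, predictor names, sample sizes and coefficients are invented, not taken from the study): it simulates a species whose presence depends on two of three predictors, fits logistic-regression GLMs on presences plus randomly selected background pseudo-absences, and compares candidate predictor sets by AIC.

```python
# Illustrative sketch only: random background pseudo-absences + AIC-based
# model selection for a virtual species. All names and values are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n_background = 10_000

# Three hypothetical environmental predictors over the study background.
env = np.column_stack([
    rng.normal(15, 5, n_background),      # temperature
    rng.normal(800, 200, n_background),   # precipitation
    rng.uniform(0, 2000, n_background),   # elevation (irrelevant to the species)
])

# "True" model of the virtual species: elevation plays no role.
p_true = 1.0 / (1.0 + np.exp(-(-8.0 + 0.3 * env[:, 0] + 0.003 * env[:, 1])))
presence_idx = np.flatnonzero(rng.uniform(size=n_background) < p_true)

# Strategy (b): pseudo-absences drawn at random from the background.
pseudo_absence_idx = rng.choice(n_background, size=3_000, replace=False)

y = np.concatenate([np.ones(presence_idx.size), np.zeros(pseudo_absence_idx.size)])
X_all = np.vstack([env[presence_idx], env[pseudo_absence_idx]])

# Information-theoretic model selection: lower AIC = better-supported model.
candidates = {
    "temp + precip + elev": [0, 1, 2],
    "temp + precip (true)": [0, 1],
    "temp only": [0],
    "elev only": [2],
}
for name, cols in candidates.items():
    fit = sm.Logit(y, sm.add_constant(X_all[:, cols])).fit(disp=False)
    print(f"{name:22s} AIC = {fit.aic:8.1f}")
```

Run as-is, the AIC comparison favours the predictor set used to generate the virtual species, mirroring the model-selection logic described in the abstract.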

Relevance: 20.00%

Publisher:

Abstract:

Since 2008, the intelligence units of six states in the western part of Switzerland have shared a common database for the analysis of high-volume crime. On a daily basis, events reported to the police are analysed, filtered and classified to detect crime repetitions and to interpret the crime environment. Several forensic outcomes are integrated into the system, such as matches of traces with persons and links between scenes detected by the comparison of forensic case data. Systematic procedures have been established to integrate links inferred mainly from DNA profiles, shoemark patterns and images. A statistical overview of a retrospective dataset of series recorded in the database from 2009 to 2011 shows, for instance, the number of repetitions detected or confirmed, and how this number is increased by forensic case data. The time needed to obtain forensic intelligence, depending on the type of marks processed, is seen as a critical issue. Furthermore, the process of integrating forensic intelligence into the crime intelligence database raised several difficulties regarding the acquisition of data and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis for further research aimed at developing forensic intelligence models.
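To make the link-integration idea concrete, the minimal sketch below uses an entirely hypothetical data model (invented case identifiers and forensic identifiers, not the operational system described above) to group cases into candidate series whenever they share a forensic identifier such as a matched DNA profile or a shoemark pattern.

```python
# Hypothetical sketch: link cases into candidate series when they share a
# forensic identifier (DNA profile, shoemark pattern, image match).
from collections import defaultdict
from itertools import combinations

# Hypothetical case records: case id -> forensic identifiers found at the scene.
cases = {
    "C-2009-014": {"DNA:profile-A", "SHOE:pattern-12"},
    "C-2009-102": {"SHOE:pattern-12"},
    "C-2010-033": {"DNA:profile-A", "IMG:match-7"},
    "C-2011-201": {"DNA:profile-B"},
}

# Index cases by identifier, then emit pairwise links for shared identifiers.
by_identifier = defaultdict(list)
for case_id, identifiers in cases.items():
    for ident in identifiers:
        by_identifier[ident].append(case_id)

links = set()
for ident, linked_cases in by_identifier.items():
    for a, b in combinations(sorted(linked_cases), 2):
        links.add((a, b, ident))

for a, b, ident in sorted(links):
    print(f"{a} <-> {b}  via {ident}")
```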

Relevance: 20.00%

Publisher:

Abstract:

According to the Bethesda Statement on Open Access Publishing and the recommendations of BOAI10, libraries and librarians have an important role to fulfil in the encouragement of open access. Taking into account the Competencies for Information Professionals of the 21st Century, drawn up by the Special Libraries Association, and the Librarians' Competencies Profile for Scholarly Publishing and Open Access, we shall identify the competencies and the new areas of knowledge and expertise involved in the development and upkeep of our institutional repository (Repositorio SSPA).

Relevance: 20.00%

Publisher:

Abstract:

This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data-generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and can alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
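The belief-revision step described above is typically implemented in this literature with recursive least squares (cf. Marcet and Sargent, 1989). The sketch below shows a generic decreasing-gain RLS update for an agent learning a simple perceived law of motion; the regression, gain sequence and parameter values are illustrative assumptions, not the paper's actual algorithm.

```python
# Hedged sketch: a generic recursive least-squares (RLS) belief update of the
# kind standard in adaptive-learning models. All numbers below are invented.
import numpy as np

def rls_update(phi, R, x, y, gain):
    """One RLS step for beliefs phi given regressors x and observed outcome y."""
    R_new = R + gain * (np.outer(x, x) - R)                 # update second-moment matrix
    phi_new = phi + gain * np.linalg.solve(R_new, x) * (y - x @ phi)
    return phi_new, R_new

# Example: an agent learning the link between inflation and its own lag.
rng = np.random.default_rng(0)
phi = np.zeros(2)            # beliefs about [constant, slope]
R = np.eye(2)
pi_prev = 2.0
for t in range(1, 1_000):
    x = np.array([1.0, pi_prev])
    pi = 0.5 + 0.8 * pi_prev + rng.normal(scale=0.1)        # true (unknown) process
    phi, R = rls_update(phi, R, x, pi, gain=1.0 / (t + 1))  # decreasing gain
    pi_prev = pi

print("estimated beliefs [constant, slope]:", phi)          # approaches about [0.5, 0.8]
```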

Relevance: 20.00%

Publisher:

Abstract:

Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
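As a rough illustration of the kind of stochastic inversion involved (not the authors' actual code, forward model, or parameterisation), the sketch below runs a random-walk Metropolis sampler for two van Genuchten parameters against synthetic ZOP traveltimes, using a simple CRIM petrophysical relation to map water content to bulk permittivity; all layer properties, pressure heads and noise levels are assumed for illustration.

```python
# Rough illustration only: Metropolis sampling of van Genuchten (alpha, n)
# from synthetic ZOP GPR traveltimes. All physical values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
c = 0.3                                   # speed of light in vacuum [m/ns]
porosity, eps_w, eps_s = 0.35, 81.0, 5.0  # assumed porosity, water/grain permittivity
borehole_sep = 3.0                        # assumed borehole separation [m]
heads = np.linspace(-0.1, -3.0, 20)       # assumed pressure heads at ZOP depths [m]

def theta_vg(h, alpha, n, theta_r=0.05, theta_s=porosity):
    """van Genuchten retention curve: water content at pressure head h (< 0)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) * (1.0 + (alpha * np.abs(h)) ** n) ** (-m)

def zop_traveltimes(alpha, n):
    """CRIM mixing model -> bulk permittivity -> ZOP traveltime per depth [ns]."""
    theta = theta_vg(heads, alpha, n)
    sqrt_eps = theta * np.sqrt(eps_w) + (porosity - theta) + (1 - porosity) * np.sqrt(eps_s)
    return borehole_sep * sqrt_eps / c

# Synthetic "observed" traveltimes from known parameters plus Gaussian noise.
sigma = 0.2
t_obs = zop_traveltimes(2.0, 1.8) + rng.normal(scale=sigma, size=heads.size)

def log_posterior(params):
    alpha, n = params
    if not (0.1 < alpha < 10.0 and 1.1 < n < 4.0):    # uniform prior bounds
        return -np.inf
    resid = t_obs - zop_traveltimes(alpha, n)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis sampler.
params = np.array([1.0, 1.5])
lp = log_posterior(params)
chain = []
for _ in range(20_000):
    proposal = params + rng.normal(scale=[0.05, 0.02])
    lp_prop = log_posterior(proposal)
    if np.log(rng.uniform()) < lp_prop - lp:          # Metropolis accept/reject
        params, lp = proposal, lp_prop
    chain.append(params)

posterior = np.array(chain[5_000:])                   # discard burn-in
print("posterior mean (alpha, n):", posterior.mean(axis=0))
```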

Relevance: 20.00%

Publisher:

Abstract:

[Table of contents] 1. Introduction. 2. Structure (introduction, hierarchy). 3. Processes (general aspects, client flows, activity flows, resource flows, temporal aspects, accounting aspects). 4. Descriptors (qualification, quantification). 5. Indicators (definitions, productivity, relevance, adequacy, efficacy, effectiveness, efficiency, standards). 6. Bibliography.

Relevance: 20.00%

Publisher:

Abstract:

Monitoring a distribution network implies working with a huge amount of data coming from the different elements that interact in the network. This paper presents a visualization tool that simplifies the task of searching the database for useful information applicable to fault management or preventive maintenance of the network.

Relevance: 20.00%

Publisher:

Abstract:

Food intake increases to a varying extent during pregnancy to provide extra energy for the growing fetus. Measuring the respiratory quotient (RQ) during the course of pregnancy (by quantifying O2 consumption and CO2 production with indirect calorimetry) could be potentially useful, since it gives an insight into the evolution of the proportion of carbohydrate vs. fat oxidized during pregnancy and thus supports recommendations on macronutrients for achieving a balanced (or slightly positive) substrate intake. A systematic search of the literature for papers reporting RQ changes during normal pregnancy identified 10 papers reporting original research. The existing evidence supports an increased RQ of varying magnitude in the third trimester of pregnancy, while the discrepant results reported for the first and second trimesters (i.e. no increase in RQ), explained by limited statistical power (small sample sizes) or fragmentary data, preclude safe conclusions about the evolution of RQ during early pregnancy. From a clinical point of view, measuring RQ during pregnancy requires sophisticated and costly indirect calorimeters and appears to be of limited value outside pure research projects, because of several confounding variables: (1) spontaneous changes in food intake and food composition during the course of pregnancy (which influence RQ); (2) inter-individual differences in weight gain and in the composition of tissue growth; and (3) technical factors, notwithstanding the relatively small contribution of fetal metabolism per se (RQ close to 1.0) to the overall metabolism of the pregnant mother.
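For reference, the respiratory quotient obtained from indirect calorimetry is simply the ratio of carbon dioxide production to oxygen consumption; the worked numbers below are illustrative, not taken from the reviewed studies:

\[
\mathrm{RQ} = \frac{\dot{V}_{\mathrm{CO_2}}}{\dot{V}_{\mathrm{O_2}}}, \qquad
\text{e.g. } \mathrm{RQ} = \frac{200\ \text{ml/min}}{250\ \text{ml/min}} = 0.80,
\]

with values near 0.70 reflecting predominantly fat oxidation and values near 1.00 predominantly carbohydrate oxidation.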