90 results for Information literacy information


Relevance:

40.00%

Publisher:

Abstract:

While the impact of Information Technology (IT) is rarely in doubt, measuring the value it generates is a real challenge. Chief Information Officers (CIOs) explain the absence of solid IT business plans and of a clear mid- and long-term vision by a lack of time and resources, but also by a lack of involvement of senior business management (e.g. CEOs and CFOs). Unable to measure the economic value of the information system precisely, the CIO faces a cost-driven management logic, with permanent cost pressure and cost reductions to justify IT spending and investments, that undermines the IT department's action. Conversely, measuring the economic value of IT would give senior business management the material needed to genuinely assess the maturity and contribution of their information system, and would thereby facilitate decision making. The objective of this thesis is to assess the alignment of IT with the business strategy, the quality of performance measurement of the information system and, last but not least, the organisation and positioning of the IT function within the company. These three key elements of IT governance were measured through two successive survey waves conducted in 2000/2001 (CIOs) and 2002/2003 (CIOs and CEOs) in French-speaking Europe (French-speaking Switzerland, France, Belgium and Luxembourg).

Relevance:

40.00%

Publisher:

Abstract:

Background: Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results: Models built with true absences had the best predictive power and best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion: If ecologists wish to build parsimonious GLMs that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and to perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
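The random pseudo-absence strategy evaluated above can be sketched in a few lines. The landscape, the single environmental predictor and all coefficients below are invented for illustration, and a plain logistic regression stands in for the GLMs of the study:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical landscape: one environmental predictor value per cell.
env = rng.normal(size=5000)

# "True" suitability of the virtual species, known by construction.
p_true = 1.0 / (1.0 + np.exp(-(2.0 * env - 1.0)))
presence = rng.random(5000) < p_true

# Presences plus an equal number of random background pseudo-absences
# (which may accidentally overlap true presences; that is the strategy's cost).
pres_idx = np.flatnonzero(presence)
bg_idx = rng.choice(5000, size=len(pres_idx), replace=False)

X = env[np.concatenate([pres_idx, bg_idx])].reshape(-1, 1)
y = np.concatenate([np.ones(len(pres_idx)), np.zeros(len(bg_idx))])
model = LogisticRegression().fit(X, y)

# Discriminatory power evaluated against the known truth on the full landscape.
auc = roc_auc_score(presence, model.predict_proba(env.reshape(-1, 1))[:, 1])
print(round(auc, 2))
```

Despite the contaminated absence set, the model ranks cells well, which is why the strategy retains high AUC even with modest fit.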

Relevance:

40.00%

Publisher:

Abstract:

Since 2008, the intelligence units of six cantons in western Switzerland have shared a common database for the analysis of high-volume crime. On a daily basis, events reported to the police are analysed, filtered and classified to detect crime repetitions and interpret the crime environment. Several forensic outcomes are integrated in the system, such as matches of traces to persons, and links between scenes detected by the comparison of forensic case data. Systematic procedures have been established to integrate links inferred mainly from DNA profiles, shoemark patterns and images. A statistical overview of a retrospective dataset of series from 2009 to 2011 shows, for instance, the number of repetitions detected or confirmed and augmented by forensic case data. The time needed to obtain forensic intelligence, depending on the type of marks treated, is seen as a critical issue. Furthermore, the process of integrating forensic intelligence into the crime intelligence database raised several difficulties regarding the acquisition of data and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis for many other research efforts aimed at developing forensic intelligence models.
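The link-integration step described above amounts to grouping events that share a forensic identifier into candidate series. A minimal sketch with invented event and profile names, using a union-find structure:

```python
# Sketch of the link-integration step: events sharing a forensic identifier
# (DNA profile, shoemark pattern) are merged into candidate series with
# union-find. Event and profile names are invented for illustration.
events = {
    "E1": {"dna": "P7", "shoemark": None},
    "E2": {"dna": "P7", "shoemark": "S3"},
    "E3": {"dna": None, "shoemark": "S3"},
    "E4": {"dna": "P9", "shoemark": None},
}

parent = {e: e for e in events}

def find(x):
    # Follow parent pointers to the series representative, compressing the path.
    while parent[x] != x:
        parent[x] = parent[parent[x]]
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

seen = {}
for event, marks in events.items():
    for kind in ("dna", "shoemark"):
        value = marks[kind]
        if value is None:
            continue
        if (kind, value) in seen:
            union(event, seen[(kind, value)])  # same identifier seen before: link
        else:
            seen[(kind, value)] = event

series = {}
for event in events:
    series.setdefault(find(event), []).append(event)
print(sorted(sorted(s) for s in series.values()))  # [['E1', 'E2', 'E3'], ['E4']]
```

Note how E1 and E3 end up in the same series without sharing any identifier directly: the DNA link E1-E2 and the shoemark link E2-E3 chain them together, which is the kind of repetition detection the shared database enables.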

Relevance:

40.00%

Publisher:

Abstract:

Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies, as well as the impact of a variety of key and common assumptions, remain unclear. Using a Bayesian Markov chain Monte Carlo (MCMC) stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because it stresses the hydrological system more strongly. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. 
Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
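The MCMC machinery behind such an inversion can be illustrated with a deliberately simplified one-parameter analogue: a Metropolis sampler estimating a single slowness value from synthetic straight-ray traveltimes. The geometry, noise level and linear forward model below are all assumptions for illustration; the actual study inverts for layered VGM parameters through hydrological and petrophysical forward models.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "traveltimes": separation * slowness + Gaussian noise (invented values).
L_dist = np.linspace(2.0, 10.0, 20)   # antenna separations (m)
s_true = 0.012                         # true slowness (s/m)
sigma = 0.005                          # traveltime noise standard deviation
t_obs = s_true * L_dist + rng.normal(0.0, sigma, L_dist.size)

def log_post(s):
    # Flat prior on s > 0; Gaussian likelihood on the traveltime residuals.
    if s <= 0:
        return -np.inf
    r = t_obs - s * L_dist
    return -0.5 * np.sum((r / sigma) ** 2)

# Metropolis sampler: random-walk proposals, accept with ratio of posteriors.
n, step = 20000, 0.0005
chain = np.empty(n)
s = 0.02                               # deliberately poor starting value
lp = log_post(s)
for i in range(n):
    prop = s + rng.normal(0.0, step)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        s, lp = prop, lp_prop
    chain[i] = s

posterior = chain[n // 2:]             # discard the first half as burn-in
print(posterior.mean(), posterior.std())
```

The second half of the chain concentrates near the true slowness, and its spread is a sample from the posterior uncertainty; the structural-error caveat of the abstract corresponds to the case where the forward model inside `log_post` is itself wrong, which this toy setup does not capture.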

Relevance:

40.00%

Publisher:

Abstract:

[Table of contents] 1. Introduction. 2. Structure (introduction, hierarchy). 3. Processes (general points, client flows, activity flows, resource flows, temporal aspects, accounting aspects). 4. Descriptors (qualification, quantification). 5. Indicators (definitions, productivity, relevance, adequacy, efficacy, effectiveness, efficiency, standards). 6. Bibliography.

Relevance:

40.00%

Publisher:

Abstract:

Food intake increases to a varying extent during pregnancy to provide extra energy for the growing fetus. Measuring the respiratory quotient (RQ) during the course of pregnancy (by quantifying O2 consumption and CO2 production with indirect calorimetry) could potentially be useful, since it gives insight into the evolution of the proportion of carbohydrate vs. fat oxidized during pregnancy and thus allows recommendations on macronutrients for achieving a balanced (or slightly positive) substrate intake. A systematic search of the literature for papers reporting RQ changes during normal pregnancy identified 10 papers reporting original research. The existing evidence supports an increased RQ of varying magnitude in the third trimester of pregnancy, while the discrepant results reported for the first and second trimesters (i.e. no increase in RQ), explained by limited statistical power (small sample sizes) or fragmentary data, preclude safe conclusions about the evolution of RQ during early pregnancy. From a clinical point of view, measuring RQ during pregnancy requires sophisticated and costly indirect calorimeters and appears of limited value outside pure research projects, because of several confounding variables: (1) spontaneous changes in food intake and food composition during the course of pregnancy (which influence RQ); (2) inter-individual differences in weight gain and in the composition of tissue growth; (3) technical factors, notwithstanding the relatively small contribution of fetal metabolism per se (RQ close to 1.0) to the overall metabolism of the pregnant mother.
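The quantity at stake is simple to compute: RQ is the ratio of CO2 produced to O2 consumed and, ignoring protein oxidation, can be read as an interpolation between pure fat oxidation (RQ about 0.70) and pure carbohydrate oxidation (RQ about 1.00). A minimal sketch with invented gas-exchange values:

```python
def respiratory_quotient(vco2, vo2):
    """RQ = CO2 produced / O2 consumed (same units, e.g. L/min)."""
    return vco2 / vo2

def carbohydrate_fraction(rq):
    """Crude non-protein interpolation: RQ 0.70 = pure fat, RQ 1.00 = pure carbohydrate."""
    return min(max((rq - 0.70) / 0.30, 0.0), 1.0)

# Illustrative (invented) gas-exchange values, not measured data.
rq = respiratory_quotient(vco2=0.85, vo2=1.00)
print(rq, carbohydrate_fraction(rq))  # 0.85 -> roughly half carbohydrate, half fat
```

A rising third-trimester RQ, in this reading, corresponds to a shift of the oxidized fuel mix toward carbohydrate.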

Relevance:

40.00%

Publisher:

Abstract:

BACKGROUND: The Internet is increasingly used as a source of information for mental health issues. The burden of obsessive-compulsive disorder (OCD) may lead persons with diagnosed or undiagnosed OCD, and their relatives, to search for good-quality information on the Web. This study aimed to evaluate the quality of Web-based information on English-language sites dealing with OCD and to compare the quality of websites found through a general and a medically specialized search engine. METHODS: Keywords related to OCD were entered into Google and OmniMedicalSearch. Websites were assessed on the basis of accountability, interactivity, readability, and content quality. The "Health on the Net" (HON) quality label and the Brief DISCERN scale score were used as possible content quality indicators. Of the 235 links identified, 53 websites were analyzed. RESULTS: The content quality of the OCD websites examined was relatively good. The use of a specialized search engine did not offer an advantage in finding websites with better content quality. A score ≥16 on the Brief DISCERN scale is associated with better content quality. CONCLUSION: This study shows that the content quality of OCD websites is acceptable. There is no advantage in searching for information with a specialized search engine rather than a general one. PRACTICAL IMPLICATIONS: The Internet offers a number of high-quality OCD websites. It remains critical, however, for providers and patients to discuss the information found on the Web.

Relevance:

40.00%

Publisher:

Abstract:

Classical treatments of problems of sequential mate choice assume that the distribution of the quality of potential mates is known a priori. This assumption, made for analytical purposes, may seem unrealistic, opposing empirical data as well as evolutionary arguments. Using stochastic dynamic programming, we develop a model that allows searching individuals to learn about the distribution, and in particular to update its mean and variance during the search. In a constant environment, a priori knowledge of the parameter values brings strong benefits in both the time needed to make a decision and the average value of the mate obtained. Knowing the variance yields more benefits than knowing the mean, and benefits increase with variance. However, the costs of learning become progressively lower as more time is available for choice. When parameter values differ between demes and/or searching periods, a strategy relying on fixed a priori information might lead to erroneous decisions, which confers an advantage on the learning strategy. However, the time available for choice plays an important role as well: if a decision must be made rapidly, a fixed strategy may do better even when the fixed image does not coincide with the local parameter values. These results help delineate the ecological and behavioral context in which learning strategies may spread.
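The contrast between a fixed a priori rule and a learning rule can be illustrated with a toy simulation. The distributions, thresholds and update rule below are invented for illustration (not the stochastic dynamic programming model of the paper): when the environment's true mean differs from the searcher's prior, updating the mean and variance from candidates seen so far yields better mates than a fixed prior-based threshold.

```python
import random
import statistics

random.seed(2)

def search(mu, sd, horizon, threshold_fn):
    """Inspect candidates sequentially; accept the first one above the threshold."""
    seen = []
    for t in range(horizon):
        q = random.gauss(mu, sd)
        seen.append(q)
        if q >= threshold_fn(seen, horizon - t):
            return q
    return seen[-1]  # time runs out: forced to accept the last candidate

def fixed_threshold(prior_mu, prior_sd):
    # Threshold set once from (possibly wrong) a priori parameter values.
    return lambda seen, remaining: prior_mu + prior_sd

def learned_threshold(seen, remaining):
    # Update mean and sd from candidates observed so far (needs a few samples).
    if len(seen) < 3:
        return float("inf")
    return statistics.mean(seen) + statistics.stdev(seen)

# Environment whose true mean (5.0) differs from the searcher's prior (0.0).
mu_true, sd_true, prior_mu = 5.0, 1.0, 0.0
trials = 2000
fixed = statistics.mean(search(mu_true, sd_true, 30, fixed_threshold(prior_mu, 1.0))
                        for _ in range(trials))
learn = statistics.mean(search(mu_true, sd_true, 30, learned_threshold)
                        for _ in range(trials))
print(round(fixed, 2), round(learn, 2))
```

The fixed searcher accepts almost any candidate because its prior-based threshold is far too low, while the learner calibrates its threshold to the local distribution; with a shorter horizon the learner's initial sampling cost weighs more heavily, in line with the abstract's caveat about rapid decisions.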

Relevance:

40.00%

Publisher:

Abstract:

Reliable information is a crucial factor influencing decision-making and, thus, fitness in all animals. A common source of information comes from inadvertent cues produced by the behavior of conspecifics. Here we use a system of experimental evolution with robots foraging in an arena containing a food source to study how communication strategies can evolve to regulate the information provided by such cues. The robots could produce information by emitting blue light, which the other robots could perceive with their cameras. Over the first few generations, the robots quickly evolved to successfully locate the food, while emitting light randomly. This behavior resulted in a high intensity of light near the food, which provided social information allowing other robots to find the food more rapidly. Because robots were competing for food, they were quickly selected to conceal this information. However, they never completely ceased to produce information. Detailed analyses revealed that this somewhat surprising result was due to the strength of selection on suppressing information declining concomitantly with the reduction in information content. Accordingly, a stable equilibrium with low information and considerable variation in communicative behaviors was attained through mutation-selection balance. Because a similar coevolutionary process should be common in natural systems, this may explain why communicative strategies are so variable in many animal species.
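A toy mutation-selection model captures the dynamic described above: the cost of emitting declines with the population-wide information content, so selection against signalling weakens as signalling fades, and mutation keeps it from vanishing entirely. All parameters are invented for illustration and are unrelated to the robot experiments themselves.

```python
import random

random.seed(3)

# Each individual carries a probability s of emitting light near food.
pop = [random.random() for _ in range(200)]

def fitness(s, mean_s):
    # Emitting is costly only insofar as competitors can exploit the cue,
    # and the cue's information value shrinks as population-wide emission fades.
    info_value = mean_s
    return 1.0 - 0.5 * s * info_value

for generation in range(500):
    mean_s = sum(pop) / len(pop)
    weights = [fitness(s, mean_s) for s in pop]
    # Fitness-proportional reproduction with small mutations, clamped to [0, 1].
    pop = [min(max(parent + random.gauss(0.0, 0.02), 0.0), 1.0)
           for parent in random.choices(pop, weights=weights, k=len(pop))]

mean_s = sum(pop) / len(pop)
print(round(mean_s, 2))
```

Mean signalling declines from its initial level but settles at a low, nonzero value: once signalling is rare the cue carries little exploitable information, selection on suppressing it is correspondingly weak, and mutational variation persists.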

Relevance:

40.00%

Publisher:

Abstract:

This paper analyses the outcomes of the EEA and bilateral-agreements votes at the level of the 3025 communes of the Swiss Confederation by simultaneously modelling the vote ("yes" or "no") and participation decisions. The regressions include economic and political factors. The economic variables are the aggregated shares of people employed in the losing, winning and neutral sectors, according to the classification of BRUNETTI, JAGGI and WEDER (1998), which follows a Ricardo-Viner logic, and average education levels, which follow a Heckscher-Ohlin approach. The political factors are those used in the recent literature. The results are remarkably precise and consistent. Most of the variables have the predicted sign and are significant at the 1% level. More than 80% of the variance of the communes' vote is explained by the model, substantially reducing the residuals compared with earlier studies. The political variables also have the expected signs and are significant. Our results underline the importance of the interaction between electoral choice and participation decisions, and the importance of dealing with these issues simultaneously. Finally, they reveal the electorate's high level of information and rationality.
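As a rough sketch of the estimation problem (not the paper's simultaneous model: the coefficients, covariates and noise below are invented, and the two equations are fitted independently by OLS rather than jointly), one can simulate commune-level data and recover the effects of sectoral employment shares and education on the yes-share and on turnout:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic commune-level data; all coefficients are invented for illustration.
n = 3025
winners = rng.uniform(0.1, 0.6, n)    # share employed in "winning" sectors
education = rng.uniform(0.0, 1.0, n)  # average education level

# Latent model: yes-share and turnout depend on the same covariates.
yes = 0.30 + 0.50 * winners + 0.20 * education + rng.normal(0, 0.05, n)
turnout = 0.35 + 0.20 * winners + 0.30 * education + rng.normal(0, 0.05, n)

# Two independent OLS fits (the paper instead models both decisions jointly).
X = np.column_stack([np.ones(n), winners, education])
beta_yes, *_ = np.linalg.lstsq(X, yes, rcond=None)
beta_turn, *_ = np.linalg.lstsq(X, turnout, rcond=None)

r2 = 1 - np.var(yes - X @ beta_yes) / np.var(yes)
print(np.round(beta_yes, 2), round(r2, 2))
```

With 3025 observations the coefficients are recovered accurately; what this sketch omits is precisely the paper's point, namely that vote and participation are interacting decisions and should be estimated simultaneously rather than equation by equation.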