900 results for Data Structures, Cryptology and Information Theory
Abstract:
In elections, majority divisions pave the way to focal manipulations and coordination failures, which can lead to the victory of the wrong candidate. This paper shows how this flaw can be addressed if voter preferences over candidates are sensitive to information. We consider two potential sources of divisions: majority voters may have similar preferences but opposite information about the candidates, or opposite preferences. We show that when information is the source of majority divisions, Approval Voting features a unique equilibrium with full information and coordination equivalence. That is, it produces the same outcome as if both information and coordination problems could be resolved. Other electoral systems, such as Plurality and Two-Round elections, do not satisfy this equivalence. The second source of division is opposite preferences. Whenever the fraction of voters with such preferences is not too large, Approval Voting still satisfies full information and coordination equivalence.
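The coordination failure described above can be illustrated with a toy tally. This is a minimal sketch, not the paper's formal model: the ballot counts and candidate names are hypothetical, and it only shows how a split majority can lose under Plurality while Approval Voting lets majority voters support both of "their" candidates.

```python
# Toy illustration (not the paper's model): a 60% majority is split between
# candidates A and B, while a 40% minority backs C.
from collections import Counter

# Plurality: each voter names exactly one candidate.
plurality_ballots = ["A"] * 35 + ["B"] * 25 + ["C"] * 40
plurality_winner = Counter(plurality_ballots).most_common(1)[0][0]

# Approval: each voter may approve any set of candidates; majority voters
# approve both A and B, so the internal split no longer matters.
approval_ballots = [{"A", "B"}] * 60 + [{"C"}] * 40
approval_counts = Counter(c for ballot in approval_ballots for c in ballot)
approval_winner = approval_counts.most_common(1)[0][0]

print(plurality_winner)  # C wins despite being opposed by the majority
print(approval_winner)   # A (or B) wins once the majority can approve both
```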
Abstract:
A number of important trends currently impact libraries. Academic libraries face a fundamental shift of their collections toward ever-increasing proportions of electronic content; public libraries continue to see vigorous interest in print materials, now supplemented by demand to provide e-books for lending. Breeding will explore these and other trends and describe some of the available and emerging technologies that help libraries meet the challenges involved.
Abstract:
The Exhibitium Project, funded by the BBVA Foundation, is a data-driven project developed by an international consortium of research groups. One of its main objectives is to build a prototype that will serve as the basis for a platform for recording and exploiting data about art exhibitions available on the Internet. Our proposal therefore aims to expose the methods, procedures and decision-making processes that have governed the technological implementation of this prototype, especially with regard to the reuse of WordPress (WP) as a development framework.
Abstract:
The educational system encourages a positivist, ordered, one-sided and universal history through its incorporation of the chronological division of history into four periods. But could students study their own present? My paper aims to show that, as Saab stated, the present is "the point of departure and arrival of history teaching, determining the comings and goings to the past". The usual way of approaching history teaching is comfortable: there are no questions and no discussions. This vision of history, interpreted by the white, Western, heterosexual man, belongs to the project of Enlightenment modernity. Consequently, this history overlooks the fact that we live in a postmodern society of suspicion, of "weak thought". There is also the problem of audiovisual pollution and the way teachers and students are confronted with it daily. It is therefore necessary to reconsider the teaching of this four-part history. Today, media and new technologies are changing human life. It is essential that students know their present history and the historical scenarios of the future. I believe in the need to adopt a didactics of present history, and consequently we must draw on media and information literacy. Teacher training is needed that assumes, as Gadamer said, that "the past and the present meet in a permanent negotiation": teacher training that makes it possible to understand and think about future history, or future histories. In my view, if students understand the complexity of their world and its multiple perspectives, they will become more tolerant and empathetic.
Abstract:
From the Harvard Theological Review, vol. III, July 1910.
Abstract:
With the exponential growth in the usage of web-based map services, web GIS applications have become increasingly popular. Spatial data indexing, search, analysis, visualization and the resource management of such services are becoming increasingly important for delivering the Quality of Service (QoS) users expect. First, spatial indexing is typically time-consuming and is not available to end users. To address this, we introduce TerraFly sksOpen, an open-source Online Indexing and Querying System for Big Geospatial Data. Integrated with the TerraFly Geospatial database [1-9], sksOpen is an efficient indexing and query engine for processing Top-k Spatial Boolean Queries. Further, we provide ergonomic visualization of query results on interactive maps to facilitate the user's data analysis. Second, due to the highly complex and dynamic nature of GIS systems, it is quite challenging for end users to quickly understand and analyze spatial data, and to efficiently share their own data and analysis results with others. Built on the TerraFly Geospatial database, TerraFly GeoCloud is an additional layer running on top of the TerraFly map that efficiently supports many different visualization functions and spatial data analysis models. Furthermore, users can create unique URLs to visualize and share analysis results. TerraFly GeoCloud also provides the MapQL technology, which customizes map visualization using SQL-like statements [10]. Third, map systems often serve dynamic web workloads and involve multiple CPU- and I/O-intensive tiers, which makes it challenging to meet the response-time targets of map requests while using resources efficiently. Virtualization facilitates the deployment of web map services and improves their resource utilization through encapsulation and consolidation. Autonomic resource management allows resources to be automatically provisioned to a map service and its internal tiers on demand. v-TerraFly is a set of techniques that predicts the demand of map workloads online and optimizes resource allocation, considering both response time and data freshness as the QoS targets. The proposed v-TerraFly system is prototyped on TerraFly, a production web map service, and evaluated using real TerraFly workloads. The results show that v-TerraFly predicts workload demands 18.91% more accurately and allocates resources efficiently to meet the QoS target, improving QoS by 26.19% and saving 20.83% in resource usage compared to traditional peak-load-based resource allocation.
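For readers unfamiliar with the query type mentioned above, the following is a minimal brute-force sketch of a Top-k Spatial Boolean Query: objects must satisfy a Boolean keyword predicate and are then ranked by distance to the query point. It is a generic illustration with made-up data, not the sksOpen index, which replaces the linear scan with an efficient spatial-keyword index.

```python
# Brute-force sketch of a top-k spatial Boolean query: filter objects whose
# keyword sets satisfy the Boolean (AND) predicate, then keep the k nearest
# to the query point. An indexing engine like sksOpen avoids the full scan.
import heapq
import math

def top_k_spatial_boolean(objects, query_point, required_keywords, k):
    """objects: iterable of (x, y, keywords) tuples; returns k nearest matches."""
    qx, qy = query_point
    matches = (
        (math.hypot(x - qx, y - qy), (x, y, kws))
        for x, y, kws in objects
        if required_keywords <= kws          # Boolean AND over keywords
    )
    return heapq.nsmallest(k, matches, key=lambda item: item[0])

pois = [
    (10.0, 20.0, {"museum", "wifi"}),
    (11.0, 21.0, {"museum"}),
    (50.0, 60.0, {"museum", "wifi"}),
]
print(top_k_spatial_boolean(pois, (10.5, 20.5), {"museum", "wifi"}, k=2))
```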
Abstract:
Purpose: To assess the compliance of Daily Disposable Contact Lens (DDCL) wearers with replacing lenses at the manufacturer-recommended replacement frequency, and to evaluate the ability of two Health Behavioural Theories (HBTs), the Health Belief Model (HBM) and the Theory of Planned Behaviour (TPB), to predict compliance. Method: A multi-centre survey was conducted using a questionnaire completed anonymously by contact lens wearers during the purchase of DDCLs. Results: Three hundred and fifty-four questionnaires were returned. The sample comprised 58.5% females and 41.5% males (mean age 34 ± 12 years). Twenty-three percent of respondents were non-compliant with the manufacturer-recommended replacement frequency (re-using DDCLs at least once). The main reason for re-using DDCLs was "to save money" (35%). Prediction of compliance behaviour (past behaviour or future intentions) on the basis of the two HBTs was investigated through logistic regression analysis: both TPB factors (subjective norms and perceived behavioural control) were significant (p < 0.01); the HBM was less predictive, with only severity (past behaviour and future intentions) and perceived benefit (past behaviour only) as significant factors (p < 0.05). Conclusions: Non-compliance with DDCL replacement is widespread, affecting 1 out of 4 Italian wearers. Results from the TPB model show that involving persons socially close to the wearers (subjective norms) and improving the procedure of behavioural control of daily replacement (behavioural control) are of paramount importance in improving compliance. With reference to the HBM, it is important to warn DDCL wearers of the severity of a contact-lens-related eye infection, and to underline the possibility of its prevention.
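To make the analysis step concrete, here is a hypothetical sketch of the kind of logistic regression described above, predicting non-compliance from TPB-style predictors. The predictor names, coefficients and simulated data are placeholders, not the study's questionnaire or dataset.

```python
# Hypothetical sketch of the logistic-regression step: predicting
# non-compliance (re-using a daily disposable lens) from TPB-style
# predictors such as subjective norms and perceived behavioural control.
# All data below are simulated for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 354                                          # number of returned questionnaires
subjective_norms = rng.normal(0, 1, n)           # standardised questionnaire scores
behavioural_control = rng.normal(0, 1, n)
X = np.column_stack([subjective_norms, behavioural_control])

# Simulated outcome: higher norms/control -> lower probability of non-compliance.
logits = -1.2 - 0.8 * subjective_norms - 0.6 * behavioural_control
non_compliant = rng.binomial(1, 1 / (1 + np.exp(-logits)))

model = LogisticRegression().fit(X, non_compliant)
print(model.coef_, model.intercept_)
```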
Abstract:
The semiarid region of northeastern Brazil, the Caatinga, is extremely important due to its biodiversity and endemism. Measurements of plant physiology are crucial to the calibration of Dynamic Global Vegetation Models (DGVMs), which are currently used to simulate the responses of vegetation to global change. In field work carried out in an area of preserved Caatinga forest located in Petrolina, Pernambuco, measurements of carbon assimilation (in response to light and CO2) were performed on 11 individuals of Poincianella microphylla, a native species that is abundant in this region. These data were used to calibrate the maximum carboxylation velocity (Vcmax) used in the INLAND model. The calibration techniques used were Multiple Linear Regression (MLR) and data-mining techniques such as Classification and Regression Trees (CART) and K-means. The results were compared to the uncalibrated model. It was found that simulated Gross Primary Productivity (GPP) reached 72% of observed GPP when using the calibrated Vcmax values, whereas the uncalibrated approach accounted for 42% of observed GPP. Thus, this work shows the benefits of calibrating DGVMs using field ecophysiological measurements, especially in areas where field data are scarce or non-existent, such as the Caatinga.
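As a generic illustration of the three calibration techniques named above (MLR, CART and K-means), the sketch below relates hypothetical leaf-level measurements to a Vcmax target. The variables and synthetic values are placeholders, not the Poincianella microphylla data or the INLAND calibration itself.

```python
# Generic sketch of the calibration techniques mentioned in the abstract,
# applied to synthetic placeholder data (not the field measurements).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.tree import DecisionTreeRegressor
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)
n = 11                                        # individuals measured in the study
light = rng.uniform(200, 2000, n)             # hypothetical PAR values
co2 = rng.uniform(200, 800, n)                # hypothetical CO2 concentrations
X = np.column_stack([light, co2])
vcmax = 30 + 0.01 * light + 0.02 * co2 + rng.normal(0, 2, n)  # synthetic target

mlr = LinearRegression().fit(X, vcmax)                     # Multiple Linear Regression
cart = DecisionTreeRegressor(max_depth=3).fit(X, vcmax)    # CART
clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X)  # K-means grouping

print(mlr.coef_, cart.predict(X[:2]), clusters)
```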
Abstract:
In this work we deal with Zipf's law from both an applied and a theoretical point of view. This empirical law states that the rank-frequency (RF) distribution of the words in a text follows a power law with exponent -1. On the theoretical side, we treat two classes of models capable of reproducing power laws in their probability distributions. In particular, we consider generalizations of Polya urns and SSR (Sample Space Reducing) processes, and for the latter we provide a formalization in terms of Markov chains. Finally, we propose a population-dynamics model capable of unifying and reproducing the results of the three SSR processes found in the literature. We then move on to a quantitative analysis of the RF behaviour of the words of a corpus of texts. In this case one observes that the RF does not follow a pure power law but has a twofold behaviour, which can be represented by a power law whose exponent changes. We investigate whether the analysis of the RF behaviour can be linked to the topological properties of a graph. In particular, starting from a corpus of texts we build an adjacency network in which every word is linked to the word that follows it. Carrying out a topological analysis of the structure of this graph, we find results that seem to confirm the hypothesis that its structure is related to the change of slope of the RF. This result may lead to developments in the study of language and of the human mind. Moreover, since the structure of the graph appears to contain components that group words according to their meaning, a deeper study could lead to developments in automatic text understanding (text mining).
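A minimal sketch of the rank-frequency analysis described above: count word frequencies, sort them by rank, and estimate the power-law exponent with a log-log linear fit. The corpus here is a toy placeholder; for a real Zipfian text the fitted slope would be close to -1.

```python
# Rank-frequency (RF) analysis: frequency versus rank on a log-log scale,
# with the Zipf exponent estimated by a linear fit.
import numpy as np
from collections import Counter

text = "the quick brown fox jumps over the lazy dog the fox the dog the the"
freqs = sorted(Counter(text.split()).values(), reverse=True)

ranks = np.arange(1, len(freqs) + 1)
slope, intercept = np.polyfit(np.log(ranks), np.log(freqs), 1)
print(f"estimated Zipf exponent: {slope:.2f}")
```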
Abstract:
The steady increase in average life expectancy and the consequent increase in the number of cases of bone-related diseases have led to a growing interest in the development of materials that can promote bone repair and/or replacement. Among the best candidates are those materials that closely resemble bone in terms of composition, structure, morphology and functionality. Biomineralized tissues, and thus also bones, have three main components: water, an organic matrix and an inorganic deposit. In vertebrates, the inorganic deposit consists of what is called biological apatite, which differs slightly from stoichiometric hydroxyapatite (HA) both in crystallographic terms and in the presence of foreign atoms and species. This justifies the great attention given to calcium phosphates, which show excellent biocompatibility and bioactivity. The performance of the material and the response of the biological tissue can be further improved through functionalization with ions, biologically active molecules and nanostructures. This thesis focuses on several possible functionalizations of calcium phosphates and their effects on chemical properties and biological performance. In particular, the functionalizing agents include several biologically relevant ions, such as cobalt (Co), manganese (Mn), strontium (Sr) and zinc (Zn); two organic molecules, a flavonoid (Quercetin) and a polyphenol (Curcumin); and nanoparticles, namely tungsten oxide (WO3) NPs. Functionalization was carried out on various calcium phosphates: dicalcium phosphate dihydrate (DCPD), dicalcium phosphate anhydrous (DCPA) and hydroxyapatite (HA). Two different functionalization strategies were applied: direct synthesis and adsorption from solution. Finally, a chapter is devoted to a preliminary study on the development of cements based on some of the functionalized phosphates obtained.
Abstract:
Considering exclusively the intrinsic characteristics of the system, both the statistical and the information-theoretic interpretations of the second law are used to provide more comprehensive meanings for the concepts of entropy, temperature, and the Helmholtz and Gibbs energies. The coherence of the Clausius inequality with these concepts is emphasized. The aim of this work is to re-discuss the second law of thermodynamics in accordance with homogeneous-processes thermodynamics, a temporal science that is a very special simplification of continuum mechanics for spatially constant intensive properties.
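As a reminder of the notation behind the quantities named above, the following are the standard textbook forms of the Clausius inequality and of the Helmholtz and Gibbs energies; the paper's own development and sign conventions may differ.

```latex
% Standard forms of the relations referenced in the abstract (textbook
% conventions; the paper's own notation may differ).
\begin{align}
  \oint \frac{\delta Q}{T} &\le 0            && \text{(Clausius inequality)} \\
  dS &\ge \frac{\delta Q}{T}                 && \text{(second law, closed system)} \\
  A &= U - TS                                && \text{(Helmholtz energy)} \\
  G &= H - TS = U + pV - TS                  && \text{(Gibbs energy)}
\end{align}
```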
Abstract:
Alison Macrina is the founder and director of the Library Freedom Project, an initiative that aims to make real the promise of intellectual freedom in libraries. The Library Freedom Project trains librarians on the state of global surveillance, privacy rights, and privacy-protecting technology, so that librarians may in turn teach their communities about safeguarding privacy. In 2015, Alison was named one of Library Journal's Movers and Shakers. Read more about the Library Freedom Project at libraryfreedomproject.org.
Abstract:
This work aims to investigate whether the adoption of electronic documents, a reality that is increasingly mandatory in Brazil, is accompanied by a reduction in companies' compliance costs. The author drew the theoretical framework from several areas of knowledge: Tax Law and Civil Law, Applied Mathematics, Information Technology and, finally, Accounting. From Civil Law came the concepts of document, which, together with concepts from Mathematics and Information Theory, make it possible to construct the notion of the Electronic Document. From Tax Law came the notions relating to taxes in the Brazilian legal system and their associated obligations (principal and accessory). From Accounting came the definitions of compliance costs and transaction costs, so as to assess how much it costs a company to comply with Brazilian accessory tax obligations, especially with regard to the use of electronic fiscal documents. The study was restricted to the Nota Fiscal Eletrônica, which in Brazil must be used in transactions involving the circulation of goods, replacing the Nota Fiscal Modelo 1 or 1-A, a traditional document that has existed for decades in Brazil. Quantitative information was gathered from Brazilian companies, and the final conclusion is that there is evidence supporting the claim that the use of electronic documents is cheaper than the use of paper documents, based on a comparison of the transaction costs associated with the Nota Fiscal Modelo 1 or 1-A and with the Nota Fiscal Eletrônica.