911 results for access and information integration
Abstract:
Digital technologies have profoundly changed not only the ways we create, distribute, access, use and re-use information but also many of the governance structures we had in place. Overall, "older" institutions at all governance levels have grappled with, and often failed to master, the multi-faceted and multi-directional issues of the Internet. Regulatory entrepreneurs have yet to discover and fully mobilize the potential of digital technologies as an influential factor affecting the regulability of the environment and as a potential regulatory tool in themselves. At the same time, we have seen a deterioration of some public spaces and a lower prioritization of public objectives when strong private commercial interests are at play, most tellingly in the field of copyright. Less tangibly, private ordering has taken hold and captured, through contracts, spaces previously regulated by public law. Code embedded in technology often replaces law. Non-state action has in general proliferated and put serious pressure upon conventional state-centered, command-and-control models. Under the conditions of this "messy" governance, the provision of key public goods, such as freedom of information, has been made difficult or is indeed jeopardized. The grand question is how we can navigate this complex multi-actor, multi-issue space and secure the attainment of fundamental public interest objectives. This is also the question that Ian Brown and Chris Marsden seek to answer with their book, Regulating Code, recently published in the "Information Revolution and Global Politics" series of MIT Press. This book review critically assesses this bold effort by Brown and Marsden.
Abstract:
This review examines the overall accuracy of social perception across several research topics and identifies factors that influence the accuracy of social perception. Findings from 14 meta-analyses examining topics such as social/personality judgments, health judgments, legal judgments, and academic/vocational judgments were obtained. Social perception accuracy was generally moderate, yielding an average effect size (r) of .32. However, individual meta-analytic effects varied widely, with some topics yielding small effects (e.g., lie detection, eyewitness identification) and other topics yielding large effects (e.g., educational judgments, health judgments). Several moderators of social perception accuracy were identified, including the nature of the information source, familiarity of the target, type of personality trait, and severity of the outcome being judged. These findings provide a comprehensive summary and novel integration of disparate findings on the accuracy of social perception. Concluding remarks highlight avenues for future research and call for cross-disciplinary collaborations that would enhance our understanding of social perception.
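When effect sizes from multiple studies are combined into an average correlation, as in the review above, a standard approach is to average in Fisher's z space rather than averaging raw r values directly. The sketch below illustrates that arithmetic with made-up effect sizes; the numbers are not the 14 meta-analytic values from the review.

```python
import math

def average_correlation(rs):
    """Average correlations via Fisher's r-to-z transform,
    then back-transform the mean z to a correlation."""
    zs = [0.5 * math.log((1 + r) / (1 - r)) for r in rs]  # Fisher z
    z_mean = sum(zs) / len(zs)
    # Inverse transform: tanh(z) = (e^{2z} - 1) / (e^{2z} + 1)
    return (math.exp(2 * z_mean) - 1) / (math.exp(2 * z_mean) + 1)

# Hypothetical effect sizes for illustration only.
effects = [0.10, 0.20, 0.35, 0.45, 0.55]
print(round(average_correlation(effects), 2))
```

In practice each study's z would also be weighted by its sample size; the unweighted mean above keeps the sketch minimal.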
Abstract:
One of the broad objectives of the Nigerian health service, vigorously being pursued at all levels of government, is to make comprehensive health care available and accessible to the population at the lowest possible cost, within available resources. Some state governments in the federation have already introduced free medical service as a practical way to remove financial barriers to access and in turn to encourage greater utilization of publicly funded care facilities. To aid health planners and decision makers in identifying a shorter corridor through which urban dwellers can gain access to comprehensive health care, a health interview survey of metropolitan Lagos was undertaken. The primary purpose was to ascertain the magnitude of the access problems which urban households face in seeking care from existing public facilities at the time of need. Six categories of illness chosen from the 1975 edition of the International Classification of Disease were used as indicators of health need. Choice of treatment facilities in response to illness episodes was examined in relation to distance, travel time, time of use and transportation experiences. These were graphically described. The overall picture indicated that distance and travel time coexist with transportation problems in preventing a significant segment of those in need of health care from benefitting from the free medical service offered in public health facilities. Within this milieu, traditional medicine and its practitioners became the most preferred alternative. Recommendations were offered for action with regard to decentralization of general practitioner (GP) consultations in general hospitals and integration of traditional medicine and its practitioners into the public health service.
Abstract:
Clinical text understanding (CTU) is of interest to health informatics because critical clinical information, frequently represented as unconstrained text in electronic health records, is extensively used by human experts to guide clinical practice and decision making and to document delivery of care, but is largely unusable by information systems for queries and computations. Recent initiatives advocating for translational research call for the generation of technologies that can integrate structured clinical data with unstructured data, provide a unified interface to all data, and contextualize clinical information for reuse in the multidisciplinary and collaborative environment envisioned by the CTSA program. This implies that technologies for the processing and interpretation of clinical text should be evaluated not only in terms of their validity and reliability in their intended environment, but also in light of their interoperability and their ability to support information integration and contextualization in a distributed and dynamic environment. This vision adds a new layer of information representation requirements that need to be accounted for when conceptualizing implementation or acquisition of clinical text processing tools and technologies for multidisciplinary research. On the other hand, electronic health records frequently contain unconstrained clinical text with high variability in use of terms and documentation practices, and without commitment to the grammatical or syntactic structure of the language (e.g. triage notes, physician and nurse notes, chief complaints, etc.). This hinders the performance of natural language processing technologies, which typically rely heavily on the syntax of language and the grammatical structure of the text.
This document introduces our method to transform unconstrained clinical text found in electronic health information systems into a formal (computationally understandable) representation that is suitable for querying, integration, contextualization and reuse, and is resilient to the grammatical and syntactic irregularities of clinical text. We present our design rationale, method, and the results of an evaluation in processing chief complaints and triage notes from 8 different emergency departments in Houston, Texas. Finally, we discuss the significance of our contribution in enabling the use of clinical text in a practical bio-surveillance setting.
Abstract:
This research aims to diachronically analyze the worldwide scientific production on open access, in the academic and scientific context, in order to contribute to knowledge and visualization of its main actors. As a method, bibliographical, descriptive and analytical research was used, with the contribution of bibliometric studies, especially production indicators, scientific collaboration indicators and indicators of thematic co-occurrence. The Scopus database was used as a source to retrieve the articles on the subject, with a resulting corpus of 1179 articles. Using Bibexcel software, frequency tables were constructed for the variables; Pajek software was used to visualize the collaboration network, and VOSviewer for the construction of the keyword network. As for the results, the most productive researchers come from countries such as the United States, Canada, France and Spain. Journals with higher impact in the academic community have disseminated the newly constructed knowledge. A collaborative network with a few subnets where co-authors are from different countries has been observed. In conclusion, this study allows identifying the themes of debate that mark the development of open access at the international level, and it is possible to state that open access is one of the new emerging and frontier fields of library and information science.
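The thematic co-occurrence indicator mentioned above is typically built by counting, for every pair of author keywords, how many articles list both; the resulting weighted edge list is what tools like VOSviewer visualize. A minimal sketch, using invented keyword lists rather than the study's Scopus corpus:

```python
from collections import Counter
from itertools import combinations

# Hypothetical author-keyword lists for three articles; the actual
# corpus in the study came from Scopus.
articles = [
    ["open access", "repositories", "scholarly communication"],
    ["open access", "bibliometrics", "scholarly communication"],
    ["open access", "repositories"],
]

cooccurrence = Counter()
for keywords in articles:
    # Each unordered keyword pair within one article counts once;
    # sorting gives a canonical key for the pair.
    for pair in combinations(sorted(set(keywords)), 2):
        cooccurrence[pair] += 1

for (a, b), weight in cooccurrence.most_common():
    print(f"{a} -- {b}: {weight}")
```

The edge weights can then be exported as a Pajek `.net` file or fed directly to a network layout library.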
A repository for integration of software artifacts with dependency resolution and federation support
Abstract:
While developing new IT products, reusability of existing components is a key aspect that can considerably improve the success rate. This has become even more important with the rise of the open-source paradigm. However, integrating different products and technologies is not always an easy task. Different communities employ different standards and tools, and most of the time it is not clear which dependencies a particular piece of software has. This is exacerbated by the transitive nature of these dependencies, making component integration a complicated affair. To help reduce this complexity we propose a model-based repository capable of automatically resolving the required dependencies. This repository needs to be expandable, so that new constraints can be analyzed, and must also have federation support for integration with other sources of artifacts. The solution we propose achieves these goals by working with OSGi components and using OSGi itself.
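Transitive dependency resolution of the kind the repository automates can be sketched as a depth-first post-order walk over a component graph: every dependency is emitted before its dependents. The component names and edges below are illustrative, not taken from the paper's OSGi repository, and the sketch handles shared ("diamond") dependencies but not cycles.

```python
# Hypothetical component graph mapping each component to its
# direct dependencies.
dependencies = {
    "app":         ["http-client", "logging"],
    "http-client": ["logging", "tls"],
    "logging":     [],
    "tls":         [],
}

def resolve(component, graph, resolved=None, seen=None):
    """Return an install order in which every dependency precedes
    its dependents (depth-first post-order)."""
    if resolved is None:
        resolved, seen = [], set()
    if component in seen:
        return resolved  # already visited: tolerates shared dependencies
    seen.add(component)
    for dep in graph[component]:
        resolve(dep, graph, resolved, seen)
    resolved.append(component)
    return resolved

print(resolve("app", dependencies))
```

A production resolver such as OSGi's additionally matches version ranges and detects cycles; this sketch shows only the ordering step.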
Abstract:
We introduce an easily computable topological measure which locates the effective crossover between segregation and integration in a modular network. Segregation corresponds to the degree of network modularity, while integration is expressed in terms of the algebraic connectivity of an associated hypergraph. The rigorous treatment of the simplified case of cliques of equal size that are gradually rewired until they become completely merged, allows us to show that this topological crossover can be made to coincide with a dynamical crossover from cluster to global synchronization of a system of coupled phase oscillators. The dynamical crossover is signaled by a peak in the product of the measures of intracluster and global synchronization, which we propose as a dynamical measure of complexity. This quantity is much easier to compute than the entropy (of the average frequencies of the oscillators), and displays a behavior which closely mimics that of the dynamical complexity index based on the latter. The proposed topological measure simultaneously provides information on the dynamical behavior, sheds light on the interplay between modularity and total integration, and shows how this affects the capability of the network to perform both local and distributed dynamical tasks.
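The integration measure described above rests on algebraic connectivity, the second-smallest eigenvalue of the graph Laplacian L = D - A, which is zero exactly when the graph is disconnected and grows as the modules merge. A minimal sketch on two size-3 cliques joined by a single rewired edge (the example graph below is illustrative; the paper works with an associated hypergraph):

```python
import numpy as np

def algebraic_connectivity(adjacency):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A."""
    A = np.asarray(adjacency, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eigenvalues = np.sort(np.linalg.eigvalsh(L))
    return eigenvalues[1]

# Two cliques of size 3 (nodes 0-1-2 and 3-4-5) bridged by edge 2-3.
A = np.zeros((6, 6))
for i, j in [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
print(algebraic_connectivity(A))
```

Adding more edges between the two cliques raises the value toward that of the fully merged graph, which is the crossover the measure tracks.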
Abstract:
All around the ITER vacuum vessel, forty-four ports will provide access to the vacuum vessel for remote handling operations, diagnostic systems, heating, and vacuum systems: 18 upper ports, 17 equatorial ports, and 9 lower ports. Among the lower ports, three will be used for the remote handling installation of the ITER divertor. Once the divertor is in place, these ports will host various diagnostic systems mounted in the so-called diagnostic racks. The diagnostic racks must allow the support and cooling of the diagnostics and the extraction of the required diagnostic signals, and provide access and maintainability while minimizing the leakage of radiation toward the back of the port, where humans are allowed to enter. A fully integrated inner rack, carrying the near-plasma diagnostic components, will be a stainless steel structure, 4.2 m long, with a maximum weight of 10 t. This structure brings water for cooling and baking at a maximum temperature of 240 °C and provides connection with gas, vacuum and electric services. Additional racks (placed away from the plasma and not requiring cooling) may be required for the support of some particular diagnostic components. The diagnostic racks and their associated ex-vessel structures, which are in their conceptual design phase, are being designed to survive the 20-year lifetime of ITER. This paper presents the current state of development, including interfaces, diagnostic integration, operation and maintenance, shielding requirements, remote handling, load cases, and a discussion of the main challenges arising from the severe environment and engineering requirements.
Abstract:
Arabidopsis thaliana, a small annual plant belonging to the mustard family, is the subject of study by an estimated 7000 researchers around the world. In addition to the large body of genetic, physiological and biochemical data gathered for this plant, it will be the first higher plant genome to be completely sequenced, with completion expected at the end of the year 2000. The sequencing effort has been coordinated by an international collaboration, the Arabidopsis Genome Initiative (AGI). The rationale for intensive investigation of Arabidopsis is that it is an excellent model for higher plants. In order to maximize use of the knowledge gained about this plant, there is a need for a comprehensive database and information retrieval and analysis system that will provide user-friendly access to Arabidopsis information. This paper describes the initial steps we have taken toward realizing these goals in a project called The Arabidopsis Information Resource (TAIR) (www.arabidopsis.org).
Abstract:
Cells in adult primary visual cortex are capable of integrating information over much larger portions of the visual field than was originally thought. Moreover, their receptive field properties can be altered by the context within which local features are presented and by changes in visual experience. The substrate for both spatial integration and cortical plasticity is likely to be found in a plexus of long-range horizontal connections, formed by cortical pyramidal cells, which link cells within each cortical area over distances of 6-8 mm. The relationship between horizontal connections and cortical functional architecture suggests a role in visual segmentation and spatial integration. The distribution of lateral interactions within striate cortex was visualized with optical recording, and their functional consequences were explored by using comparable stimuli in human psychophysical experiments and in recordings from alert monkeys. They may represent the substrate for perceptual phenomena such as illusory contours, surface fill-in, and contour saliency. The dynamic nature of receptive field properties and cortical architecture has been seen over time scales ranging from seconds to months. One can induce a remapping of the topography of visual cortex by making focal binocular retinal lesions. Shorter-term plasticity of cortical receptive fields was observed following brief periods of visual stimulation. The mechanisms involved entailed, for the short-term changes, altering the effectiveness of existing cortical connections, and for the long-term changes, sprouting of axon collaterals and synaptogenesis. The mutability of cortical function implies a continual process of calibration and normalization of the perception of visual attributes that is dependent on sensory experience throughout adulthood and might further represent the mechanism of perceptual learning.
Abstract:
The United States was founded on the principles of freedom. Events in recent history have threatened the freedoms we as individuals enjoy. Notably, changes to government legislation and policies regarding access to environmentally sensitive information following September 11, 2001, are troubling. The government has struggled with a difficult balancing act. The public has the right of access to information, yet information some view as sensitive or dangerous must be kept out of the hands of terrorists. This project examines and discusses the information access debate within the United States and how best to provide the public with environmentally sensitive information.
Abstract:
In the last few years, there has been wide development in research on textual information systems. The goal is to improve these systems in order to allow easy localization, treatment and access to the information stored in digital format (digital databases, documental databases, and so on). There are many applications focused on information access (for example, Web search systems like Google or Altavista). However, these applications have problems when they must access cross-language information, or when they need to show information in a language different from that of the query. This paper explores the use of syntactic-semantic patterns as a method to access multilingual information, and reviews, in the case of Information Retrieval, where it is possible and useful to employ patterns for the multilingual and interactive aspects. On the one hand, the multilingual aspects studied are those related to accessing documents in languages different from that of the query, as well as the automatic translation of the document, i.e. a machine translation system based on patterns. On the other hand, this paper goes in depth into the interactive aspects related to the reformulation of a query based on the syntactic-semantic pattern of the request.
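Pattern-based query reformulation of the kind described above can be sketched as matching a query against a source-language pattern and instantiating a target-language template with translated slot fillers. The pattern, template and two-word lexicon below are invented for illustration; the paper's actual syntactic-semantic formalism is far richer than a regular expression.

```python
import re

# Toy bilingual lexicon (hypothetical entries).
lexicon = {"books": "libros", "history": "historia"}

# One toy pattern: English "<noun> about <topic>" maps to the
# Spanish template "<noun> sobre <topic>".
patterns = [
    (re.compile(r"^(\w+) about (\w+)$"), "{0} sobre {1}"),
]

def reformulate(query):
    """Rewrite a query into the target language via the first
    matching pattern; leave it unchanged if no pattern applies."""
    for pattern, template in patterns:
        m = pattern.match(query)
        if m:
            translated = [lexicon.get(w, w) for w in m.groups()]
            return template.format(*translated)
    return query

print(reformulate("books about history"))
```

A real system would tag the query syntactically and attach semantic roles to the slots rather than relying on surface word order.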
Abstract:
In this paper we describe Fénix, a data model for exchanging information between Natural Language Processing applications. The proposed format is intended to be flexible enough to cover both current and future data structures employed in the field of Computational Linguistics. The Fénix architecture is divided into four separate layers: conceptual, logical, persistence and physical. This division provides a simple interface that abstracts users from low-level implementation details, such as the programming languages and data storage employed, allowing them to focus on the concepts and processes to be modelled. The Fénix architecture is accompanied by a set of programming libraries to facilitate the access and manipulation of the structures created in this framework. We also show how this architecture has already been successfully applied in different research projects.