872 results for Protection of personal information
Abstract:
It is well known that invertebrates lack adaptive immune components and rely primarily on innate immunity to defend against pathogens, but recent studies have demonstrated enhanced secondary immune protection in some invertebrates. In the present study, the cumulative mortality of scallops that received two successive Listonella anguillarum stimulations was recorded, and variations in immune parameters, including phagocytosis (phagocytic rate and phagocytic index) and phenoloxidase-like enzyme, acid phosphatase, and superoxide dismutase activities, were also examined. Scallops that received a previous short-term L. anguillarum stimulation were protected against a subsequent long-term stimulation of L. anguillarum. Significantly higher levels of phagocytic activity and acid phosphatase activity were observed in the scallops that received both stimulations compared with those that received only the secondary stimulation. These results indicated that a short-term immersion with L. anguillarum modulated the scallops' immune system and endowed them with enhanced resistance to the secondary bacterial stimulation; phagocytosis and acid phosphatase are suspected to be involved in this protection. (C) 2008 Elsevier Inc. All rights reserved.
Abstract:
We report on a study of how people look for information within email, files, and the Web. When locating a document or searching for a specific answer, people relied on their contextual knowledge of their information target to help them find it, often associating the target with a specific document. They appeared to prefer to use this contextual information as a guide in navigating locally in small steps to the desired document rather than directly jumping to their target. We found this behavior was especially true for people with unstructured information organization. We discuss the implications of our findings for the design of personal information management tools.
Abstract:
This work describes a program, called TOPLE, which uses a procedural model of the world to understand simple declarative sentences. It accepts sentences in a modified predicate calculus symbolism, and uses plausible reasoning to visualize scenes, resolve ambiguous pronoun and noun phrase references, explain events, and make conditional predications. Because it does plausible deduction, with tentative conclusions, it must contain a formalism for describing its reasons for its conclusions and what the alternatives are. When an inconsistency is detected in its world model, it uses its recorded information to resolve it, one way or another. It uses simulation techniques to make deductions about creatures' motivation and behavior, assuming they are goal-directed beings like itself.
Abstract:
Ferré, S. and King, R. D. (2004) BLID: an Application of Logical Information Systems in Bioinformatics. In P. Eklund (editor), 2nd International Conference on Formal Concept Analysis (ICFCA), Feb 2004. LNCS 2961, Springer.
Abstract:
Srinivasan, A., King, R. D. and Bain, M.E. (2003) An Empirical Study of the Use of Relevance Information in Inductive Logic Programming. Journal of Machine Learning Research. 4(Jul):369-383
Abstract:
Urquhart, C., Lonsdale, R., Thomas, R., Spink, S., Yeoman, A., Armstrong, C. & Fenton, R. (2003). Uptake and use of electronic information services: trends in UK higher education from the JUSTEIS project. Program, 37(3), 167-180. Sponsorship: JISC
Abstract:
The problem of the acquisition of first language phonology is dealt with within the general information-processing perspective. In this sense, language acquisition is viewed as a process of biologically founded pattern formation due to information exchanges between an adult and a child. Moreover, the process is cognitive in that the child, as a goal-seeking and error correcting individual, undertakes an intricate task of compressing a huge variety of linguistic stimuli in order to build an effective information code. It is further assumed that the basic mechanism which leads to the establishment of fully articulate linguistic ability is that of simulation. The mechanism works through a compression of a set of initial variables (i.e. initial conditions) into a minimum length algorithm and a subsequent construction of an integrated system of language-specific attractors. It is only then that the language user is capable of participating in an information transaction in a fully developed manner.
Abstract:
BACKGROUND: Implementing new practices, such as health information technology (HIT), is often difficult due to the disruption of the highly coordinated, interdependent processes (e.g., information exchange, communication, relationships) of providing care in hospitals. Thus, HIT implementation may occur slowly as staff members observe and make sense of unexpected disruptions in care. As a critical organizational function, sensemaking, defined as the social process of searching for answers and meaning that drives action, leads to unified understanding, learning, and effective problem solving -- strategies that studies have linked to successful change. Project teamwork is a change strategy increasingly used by hospitals that facilitates sensemaking by providing a formal mechanism for team members to share ideas, construct the meaning of events, and take next actions. METHODS: In this longitudinal case study, we aim to examine project teams' sensemaking and action as the teams prepare to implement new information technology in a tertiary care hospital. Based on the management and healthcare literature on HIT implementation and project teamwork, we chose sensemaking as an alternative to traditional models for understanding organizational change and teamwork. Our methods choices are derived from this conceptual framework. Data on project team interactions will be prospectively collected through direct observation and organizational document review. Through qualitative methods, we will identify sensemaking patterns and explore variation in sensemaking across teams. Participant demographics will be used to explore variation in sensemaking patterns.
DISCUSSION: Outcomes of this research will be new knowledge about sensemaking patterns of project teams, such as: the antecedents and consequences of the ongoing, evolutionary, social process of implementing HIT; the internal and external factors that influence the project team, including team composition, team member interaction, and interaction between the project team and the larger organization; the ways in which internal and external factors influence project team processes; and the ways in which project team processes facilitate team task accomplishment. These findings will lead to new methods of implementing HIT in hospitals.
Abstract:
Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.
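The core idea of scoring directed, lagged influence between recorded channels can be illustrated with a much simpler stand-in for the authors' DBN family score. The sketch below (not the authors' algorithm; all signals and names are invented toy data) ranks candidate "information flow" links by the mutual information between one channel's present and another channel's next time step on binned data, which, unlike a linear correlation, can detect a nonlinear coupling:

```python
import numpy as np

def lagged_mi(x, y, bins=4):
    """Mutual information (nats) between x(t) and y(t+1) on quantile-binned
    data -- a crude stand-in for a DBN parent score, sensitive to nonlinear
    lagged dependence."""
    edges_x = np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1])
    edges_y = np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1])
    xd = np.digitize(x[:-1], edges_x)   # x at time t
    yd = np.digitize(y[1:], edges_y)    # y at time t+1
    joint = np.zeros((bins, bins))
    for a, b in zip(xd, yd):
        joint[a, b] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(0)
# Toy "recordings": channel B is driven nonlinearly by channel A one step
# back; channel C is independent noise.
A = rng.normal(size=5000)
B = np.tanh(np.roll(A, 1)) + 0.1 * rng.normal(size=5000)
C = rng.normal(size=5000)

# The true directed link A -> B scores far above the spurious C -> B link.
print(lagged_mi(A, B), lagged_mi(C, B))
```

A real DBN inference additionally searches over parent sets, penalizes model complexity, and restricts candidate links, which is what lets it prune paths "known not to exist" rather than merely thresholding pairwise scores.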
Abstract:
Photon correlation spectroscopy (PCS) is a light-scattering technique for particle size diagnosis. It has been used mainly in the investigation of hydrosol particles since it is based on the measurement of the correlation function of the light scattered from the Brownian motion of suspended particles. Recently this technique also proved useful for studying soot particles in flames and similar aerosol systems. In the case of a polydispersed system the problem of recovering the particle size distribution can be reduced to the problem of inverting the Laplace transform. In this paper we review several methods introduced by the authors for the solution of this problem. We present some numerical results and we discuss the resolution limits characterizing the reconstruction of the size distributions. © 1989.
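The inverse problem described above has the form g(τ) = Σⱼ K(τᵢ, Γⱼ) G(Γⱼ) once the decay-rate axis is discretized, with the Laplace kernel K = exp(-Γτ). As a minimal sketch of one common regularization strategy (Tikhonov-damped non-negative least squares; the grids, the damping parameter, and the two-peak test distribution are illustrative assumptions, not the specific methods reviewed by the authors):

```python
import numpy as np
from scipy.optimize import nnls

# Decay-rate grid (Gamma, 1/s) and correlator delay times (tau, s); illustrative.
gamma = np.logspace(0, 3, 50)
tau = np.logspace(-4, -1, 80)

# Discretized Laplace kernel: g[i] = sum_j K[i, j] * G[j]
K = np.exp(-np.outer(tau, gamma))

# Synthetic "measured" correlation data from a two-peak decay-rate distribution
G_true = np.exp(-0.5 * ((np.log(gamma) - np.log(30)) / 0.3) ** 2)
G_true += 0.5 * np.exp(-0.5 * ((np.log(gamma) - np.log(300)) / 0.3) ** 2)
g_meas = K @ G_true

# Tikhonov-regularized non-negative inversion:
#   minimize ||K G - g||^2 + alpha^2 ||G||^2   subject to  G >= 0
# The non-negativity constraint and the damping term both fight the severe
# ill-conditioning of Laplace inversion.
alpha = 1e-3
K_aug = np.vstack([K, alpha * np.eye(len(gamma))])
g_aug = np.concatenate([g_meas, np.zeros(len(gamma))])
G_rec, _ = nnls(K_aug, g_aug)
```

The choice of alpha controls the resolution-versus-stability trade-off that the abstract's "resolution limits" refer to: with noisy data, too small a value amplifies noise into spurious peaks, while too large a value merges genuinely distinct size populations.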
Abstract:
The requirement for a very accurate dependence analysis to underpin software tools to aid the generation of efficient parallel implementations of scalar code is argued. The current status of dependence analysis is shown to be inadequate for the generation of efficient parallel code, causing too many conservative assumptions to be made. This paper summarises the limitations of conventional dependence analysis techniques, and then describes a series of extensions which enable the production of a much more accurate dependence graph. The extensions include analysis of symbolic variables; the development of a symbolic inequality disproof algorithm and its exploitation in a symbolic Banerjee inequality test; the use of inference engine proofs; the exploitation of exact dependence and dependence pre-domination attributes; interprocedural array analysis; conditional variable definition tracing; and integer array tracing and division calculations. Analysis case studies on typical numerical code are shown to reduce the total dependencies estimated from conventional analysis by up to 50%. The techniques described in this paper have been embedded within a suite of tools, CAPTools, which combines analysis with user knowledge to produce efficient parallel implementations of numerical mesh based codes.
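The classical tests the paper builds on can be illustrated for the simplest single-index case. The sketch below is a deliberately simplified conservative dependence test (far short of CAPTools' symbolic machinery) for two hypothetical array accesses A[a*i + c1] and A[b*i + c2] inside one loop; it disproves a dependence either when the GCD divisibility condition fails or when a Banerjee-style bounds check shows the dependence equation has no solution within the loop range:

```python
from math import gcd

def may_depend(a, b, c1, c2, lo, hi):
    """Conservative dependence test for accesses A[a*i + c1] and A[b*i + c2]
    in a loop with index range lo..hi (inclusive).

    Dependence equation: a*i1 + c1 = b*i2 + c2, i.e. a*i1 - b*i2 = c2 - c1.
    Returns False only when a dependence is provably impossible; True means
    a dependence must conservatively be assumed."""
    d = c2 - c1
    # GCD test: the linear Diophantine equation has integer solutions
    # only if gcd(a, b) divides d.
    if d % gcd(a, b) != 0:
        return False
    # Banerjee-style bounds test: check that d lies between the extreme
    # values a*i1 - b*i2 can take over the loop bounds.
    lo_val = min(a * lo, a * hi) - max(b * lo, b * hi)
    hi_val = max(a * lo, a * hi) - min(b * lo, b * hi)
    return lo_val <= d <= hi_val

# A[2*i] vs A[2*i + 1]: even vs odd indices, GCD test disproves dependence
print(may_depend(2, 2, 0, 1, 0, 100))
# A[i] vs A[i + 200] over only 100 iterations: bounds test disproves it
print(may_depend(1, 1, 0, 200, 0, 100))
# A[i] vs A[i + 5]: dependence possible, must be assumed
print(may_depend(1, 1, 0, 5, 0, 100))
```

The paper's contribution lies precisely where such classical tests give up: when a, b, c1, or c2 are symbolic rather than integer constants, the symbolic inequality disproof algorithm and inference-engine proofs take over.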