933 results for non-trivial data structures
Abstract:
Infections associated with mechanical ventilation (MV) are common in the intensive care unit (ICU). Two such infections exist: ventilator-associated pneumonia (VAP) and ventilator-associated tracheobronchitis (VAT). VAP negatively affects patient outcomes by increasing morbidity, mortality, and time spent in the ICU and on MV, but the impact of VAT is unknown. The aim of this study was to identify whether there are differences between VAP and VAT. Materials and methods: A cohort study was conducted between 2009 and 2013 in the ICU of the Fundación Neumológica Colombiana. Demographic, epidemiological, microbiological, and outcome data (length of stay in the ICU, on MV, and in hospital, and mortality) were obtained for patients with VAP and VAT. Groups were compared statistically using Student's t-test and the chi-squared test for normally distributed data, and the Mann-Whitney test for non-normal data. Results: Patients with VAP and VAT were similar in their condition on ICU admission. At diagnosis of the infection there were significant differences between groups in oxygenation and in length of hospital, ICU, and MV stay. The microbiology showed a predominance of Gram-negative organisms, with multidrug resistance in 52.5% of cases and no significant differences between groups. Regarding outcomes, differences were observed in total ICU, hospital, and MV times, but not in the times after diagnosis. There were no significant differences in mortality. Conclusions: VAP and VAT have a similar impact on patient course in terms of morbidity, length of stay, and mortality.
Abstract:
In this thesis I have studied the effect of the basis set superposition error (BSSE) on the planarity of some molecules. I have observed that some computational methods, combined with certain basis sets, describe non-planar energy minima for the DNA nucleobases. I have shown that these problems can be fixed by using the counterpoise method to correct for BSSE in the calculations. In this thesis I have also studied the photophysics of thymine, and the results show that there are two relaxation pathways from the excited state that allow ultrafast regeneration of the initial structure.
Abstract:
We construct a mapping from complex recursive linguistic data structures to spherical wave functions using Smolensky's filler/role bindings and tensor product representations. Syntactic language processing is then described by the transient evolution of these spherical patterns whose amplitudes are governed by nonlinear order parameter equations. Implications of the model in terms of brain wave dynamics are indicated.
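The filler/role binding step referred to above can be sketched in a few lines: fillers (symbols) are bound to structural roles by outer products and the bindings are superposed. This is a minimal illustration of Smolensky's tensor product representation, not the paper's spherical-wave construction; the vectors and names below are illustrative, and exact unbinding assumes orthonormal role vectors.

```python
import numpy as np

# Hypothetical filler vectors (symbols) and orthonormal role vectors
# (left/right position in a binary structure); names are illustrative.
fillers = {"A": np.array([1.0, 0.0]), "B": np.array([0.0, 1.0])}
roles = {"left": np.array([1.0, 0.0]), "right": np.array([0.0, 1.0])}

def bind(filler, role):
    # Tensor (outer) product binding of one filler/role pair.
    return np.outer(filler, role)

# The structure (A, B) is the superposition of its bindings.
tree = bind(fillers["A"], roles["left"]) + bind(fillers["B"], roles["right"])

def unbind(structure, role):
    # With orthonormal roles, contracting with a role vector
    # recovers the filler bound to that role exactly.
    return structure @ role

left_filler = unbind(tree, roles["left"])   # recovers fillers["A"]
```

Recursive structures are handled by letting a whole binding serve as the filler of a higher-level role, which is what makes the representation suitable for linguistic trees.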
Abstract:
We have developed a new method for the analysis of voids in proteins (defined as empty cavities not accessible to solvent). This method combines analysis of individual discrete voids with analysis of packing quality. While these are different aspects of the same effect, they have traditionally been analysed using different approaches. The method has been applied to the calculation of total void volume and maximum void size in a non-redundant set of protein domains and has been used to examine correlations between thermal stability and void size. The tumour-suppressor protein p53 has then been compared with the non-redundant data set to determine whether its low thermal stability results from poor packing. We found that p53 has average packing, but the detrimental effects of some previously unexplained mutations to p53 observed in cancer can be explained by the creation of unusually large voids.
Abstract:
A mapping between chains in the Protein Data Bank and Enzyme Classification numbers is invaluable for research into structure-function relationships. Mapping at the chain level is a non-trivial problem, and we present an automatically updated Web server, which provides this link in a queryable form and as a downloadable XML or flat file.
Abstract:
Inverse problems for dynamical system models of cognitive processes comprise the determination of synaptic weight matrices or kernel functions for neural networks or neural/dynamic field models, respectively. We introduce dynamic cognitive modeling as a three-tier top-down approach where cognitive processes are first described as algorithms that operate on complex symbolic data structures. Second, symbolic expressions and operations are represented by states and transformations in abstract vector spaces. Third, prescribed trajectories through representation space are implemented in neurodynamical systems. We discuss the Amari equation for a neural/dynamic field theory as a special case and show that the kernel construction problem is particularly ill-posed. We suggest a Tikhonov-Hebbian learning method as a regularization technique and demonstrate its validity and robustness for basic examples of cognitive computations.
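The regularization idea above can be illustrated in finite dimensions: when prescribed state snapshots are fewer than the state dimension, the normal equations for the weight matrix are rank-deficient, and a Tikhonov term makes them invertible while keeping the Hebbian correlation structure. This is a finite-dimensional sketch under assumed data (the paper works with continuous neural field kernels); the matrices below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative data: columns of X are prescribed states, columns of Y
# the desired successor states along the trajectory.  With fewer
# snapshots than dimensions the problem is ill-posed.
n, T = 5, 3
X = rng.standard_normal((n, T))
Y = rng.standard_normal((n, T))

def tikhonov_hebbian(X, Y, lam=1e-2):
    # Minimizes ||W X - Y||^2 + lam ||W||^2.  The Hebbian correlation
    # Y @ X.T is regularized by lam * I so that X X^T + lam I is
    # invertible even when X X^T is rank-deficient.
    n = X.shape[0]
    return Y @ X.T @ np.linalg.inv(X @ X.T + lam * np.eye(n))

W = tikhonov_hebbian(X, Y)
# W approximately maps each stored state onto its prescribed successor.
```

As lam tends to zero this recovers the pseudoinverse solution; larger lam trades fidelity for stability, which is the usual Tikhonov compromise.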
Abstract:
Comparison-based diagnosis is an effective approach to system-level fault diagnosis. Under the Maeng-Malek comparison model (MM* model), Sengupta and Dahbura proposed an O(N^5) diagnosis algorithm for general diagnosable systems with N nodes. Thanks to lower diameter and better graph embedding capability as compared with a hypercube of the same size, the crossed cube has been a promising candidate for interconnection networks. In this paper, we propose a fault diagnosis algorithm tailored for crossed cube connected multicomputer systems under the MM* model. By introducing appropriate data structures, this algorithm runs in O(N log_2^2 N) time, which is linear in the size of the input. As a result, this algorithm is significantly superior to the Sengupta-Dahbura algorithm when applied to crossed cube systems.
Abstract:
Programming is a skill which requires knowledge of both the basic constructs of the computer language used and techniques employing these constructs. How these are used in any given application is determined intuitively, and this intuition is based on experience of programs already written. One aim of this book is to describe the techniques and give practical examples of the techniques in action - to provide some experience. Another aim of the book is to show how a program should be developed, in particular how a relatively large program should be tackled in a structured manner. These aims are accomplished essentially by describing the writing of one large program, a diagram generator package, in which a number of useful programming techniques are employed. Also, the book provides a useful program, with an in-built manual describing not only what the program does, but also how it does it, with full source code listings. This means that the user can, if required, modify the package to meet particular requirements. A floppy disk is available from the publishers containing the program, including listings of the source code. All the programs are written in Modula-2, using JPI's Top Speed Modula-2 system running on IBM-PCs and compatibles. This language was chosen as it is an ideal language for implementing large programs and it is the main language taught in the Cybernetics Department at the University of Reading. There are some aspects of the Top Speed implementation which are not standard, so suitable comments are given when these occur. Although implemented in Modula-2, many of the techniques described here are appropriate to other languages, like Pascal or C, for example. The book and programs are based on a second year undergraduate course taught at Reading to Cybernetics students, entitled Algorithms and Data Structures.
Useful techniques are described for the reader to use, and applications where they are appropriate are recommended, but detailed analyses of the techniques are not given.
Abstract:
Reconfigurable computing is becoming an important new alternative for implementing computations. Field programmable gate arrays (FPGAs) are the ideal integrated circuit technology to experiment with the potential benefits of using different strategies of circuit specialization by reconfiguration. The final form of the reconfiguration strategy is often non-trivial to determine. Consequently, in this paper, we examine strategies for reconfiguration and, based on our experience, propose general guidelines for the tradeoffs using an area-time metric called functional density. Three experiments are set up to explore different reconfiguration strategies for FPGAs applied to a systolic implementation of a scalar quantizer used as a case study. Quantitative results for each experiment are given. The regular nature of the example means that the results can be generalized to a wide class of industry-relevant problems based on arrays.
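The area-time trade-off named above can be made concrete: functional density divides the work performed by the area-time product, and for a reconfigured design the reconfiguration time is charged against it. A minimal sketch with illustrative numbers (not measurements from the paper):

```python
# Functional density: operations per unit (area * time), used as a
# metric for comparing reconfiguration strategies on FPGAs.

def functional_density(ops, area, exec_time, reconfig_time=0.0):
    # For a reconfigured circuit, the reconfiguration overhead is
    # charged to the total time in the denominator.
    return ops / (area * (exec_time + reconfig_time))

# A specialized circuit is smaller and faster, but only wins if the
# area/speed gain outweighs the reconfiguration overhead.
general = functional_density(ops=1e6, area=100.0, exec_time=1.0)
special = functional_density(ops=1e6, area=60.0, exec_time=0.8,
                             reconfig_time=0.5)
```

With these illustrative figures the specialized design still comes out ahead; increase `reconfig_time` and the ordering flips, which is exactly the trade-off the guidelines in the paper address.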
Abstract:
The benefits and applications of virtual reality (VR) in the construction industry have been investigated for almost a decade. However, the practical implementation of VR in the construction industry has yet to reach maturity owing to technical constraints. The need for effective information management presents challenges: both transfer of building data to, and organisation of building information within, the virtual environment require consideration. This paper reviews the applications and benefits of VR in the built environment field and reports on a collaboration between Loughborough University and South Bank University to overcome constraints on the use of the overall VR model for whole lifecycle visualisation. The work at each research centre is concerned with an aspect of information management within VR applications for the built environment, and both data transfer and internal data organisation have been investigated. In this paper, similarities and differences between computer-aided design (CAD) and VR packages are first discussed. Three different approaches to the creation of VR models during the design stage are identified and described, with a view to providing shared understanding across the interdisciplinary groups involved. The suitable organisation of building information within the virtual environment is then further investigated. This work focused on the visualisation of the degradation of a building, through its lifespan, with a view to providing a visual aid for developing an effective and economic project maintenance programme. Finally, consideration is given to the potential of emerging standards to facilitate an integrated use of VR. The convergence towards similar data structures in VR and other construction packages may enable visualisation to be better utilised in the overall lifecycle model.
Abstract:
Models play a vital role in supporting a range of activities in numerous domains. We rely on models to support the design, visualisation, analysis and representation of parts of the world around us, and as such significant research effort has been invested into numerous areas of modelling, including support for model semantics, dynamic states and behaviour, and temporal data storage and visualisation. Whilst these efforts have increased our capabilities and allowed us to create increasingly powerful software-based models, the process of developing models, supporting tools and/or data structures remains difficult, expensive and error-prone. In this paper we define from the literature the key factors in assessing a model’s quality and usefulness: semantic richness, support for dynamic states and object behaviour, and temporal data storage and visualisation. We also identify a number of shortcomings in both existing modelling standards and model development processes and propose a unified generic process to guide users through the development of semantically rich, dynamic and temporal models.
Abstract:
A novel approach is presented for combining spatial and temporal detail from newly available TRMM-based data sets to derive hourly rainfall intensities at 1-km spatial resolution for hydrological modelling applications. Time series of rainfall intensities derived from 3-hourly 0.25° TRMM 3B42 data are merged with a 1-km gridded rainfall climatology based on TRMM 2B31 data to account for the sub-grid spatial distribution of rainfall intensities within coarse-scale 0.25° grid cells. The method is implemented for two dryland catchments in Tunisia and Senegal, and validated against gauge data. The outcomes of the validation show that the spatially disaggregated and intensity corrected TRMM time series more closely approximate ground-based measurements than non-corrected data. The method introduced here enables the generation of rainfall intensity time series with realistic temporal and spatial detail for dynamic modelling of runoff and infiltration processes that are especially important to water resource management in arid regions.
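The merging step described above can be sketched as a multiplicative disaggregation: each 1-km pixel scales the coarse-cell intensity by its share of the long-term climatology, so the cell-mean intensity is preserved while the sub-grid pattern follows the climatology. This is an assumed, simplified form of the merging (the paper's scheme also corrects intensities against gauges); the numbers are illustrative.

```python
import numpy as np

def disaggregate(coarse_intensity, fine_climatology):
    # fine_climatology: 2-D array of 1-km climatological rainfall for
    # the pixels inside one coarse 0.25-degree cell.  Pixels wetter
    # than the cell average receive proportionally more of the
    # coarse-scale intensity; the cell mean is preserved.
    weights = fine_climatology / fine_climatology.mean()
    return coarse_intensity * weights

clim = np.array([[2.0, 4.0],
                 [1.0, 3.0]])          # illustrative 1-km climatology
fine = disaggregate(coarse_intensity=8.0, fine_climatology=clim)
```

Applying this per 3-hourly time step yields a fine-scale intensity time series whose spatial pattern reflects the TRMM 2B31 climatology while its temporal evolution follows the 3B42 record.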
Abstract:
Large-scale ocean transports of heat and freshwater have not been well monitored, and yet the regional budgets of these quantities are important to understanding the role of the oceans in climate and climate change. In contrast, atmospheric heat and freshwater transports are commonly assessed from atmospheric reanalysis products, despite their use of non-conserving data assimilation, which draws on the wealth of distributed atmospheric observations as constraints. The ability to carry out ocean reanalyses globally at eddy-permitting resolutions of 1/4° or better, along with new global ocean observation programs, now makes a similar approach viable for the ocean. In this paper we examine the budgets and transports within a global high-resolution ocean model constrained by ocean data assimilation, and compare them with independent oceanic and atmospheric estimates.
Abstract:
We consider an equilibrium birth-and-death type process for a particle system in infinite volume; the latter is described by the space of all locally finite point configurations on R^d. These Glauber type dynamics are Markov processes constructed for pre-given reversible measures. A representation for the "carré du champ" and "second carré du champ" of the associated infinitesimal generators L is calculated in infinite volume and for a large class of functions in a generalized sense. The corresponding coercivity identity is derived, and explicit sufficient conditions for the appearance of a spectral gap of L, together with bounds on its size, are given. These techniques are applied to Glauber dynamics associated to Gibbs measures, and conditions are derived that extend all previously known results; in particular, potentials with negative parts can now be treated. The high temperature regime is extended substantially, and potentials with non-trivial negative part can be included. Furthermore, a special class of potentials is defined for which the size of the spectral gap is at least as large as for the free system and, surprisingly, the spectral gap is independent of the activity. This type of potential should not show any phase transition for a given temperature at any activity.
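For reference, the carré du champ and second carré du champ operators mentioned above have the standard forms (these are the textbook definitions, stated for sufficiently regular f, not formulas specific to this paper):

```latex
\Gamma(f) \;=\; \tfrac{1}{2}\bigl( L(f^{2}) - 2\,f\,Lf \bigr),
\qquad
\Gamma_{2}(f) \;=\; \tfrac{1}{2}\bigl( L\,\Gamma(f) - 2\,\Gamma(f, Lf) \bigr),
```

and, since the Dirichlet form satisfies \(\langle -Lf, f\rangle_{L^{2}(\mu)} = \int \Gamma(f)\,d\mu\) for the reversible measure \(\mu\), a spectral gap \(\lambda > 0\) of \(L\) is equivalent to the Poincaré inequality \(\int \Gamma(f)\,d\mu \ge \lambda\,\mathrm{Var}_{\mu}(f)\).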
Abstract:
Spatial memory is important for locating objects in hierarchical data structures, such as desktop folders. There are, however, some contradictions in the literature concerning the effectiveness of 3D user interfaces when compared to their 2D counterparts. This paper uses a task-based approach to investigate the effectiveness of adding a third dimension to specific user tasks, i.e. the impact of depth on navigation in a 3D file manager. Results highlight issues and benefits of using 3D interfaces for visual and verbal tasks, and suggest a possible correlation between aptitude scores achieved on the Guilford-Zimmerman Orientation Survey and electroencephalography-measured brainwave activity as participants search for targets of variable perceptual salience in 2D and 3D environments.