977 results for "Information complexity"


Abstract:

Typical human immunodeficiency virus-1 subtype B (HIV-1B) sequences present a GPGR signature at the tip of the variable region 3 (V3) loop; however, unusual motifs harbouring a GWGR signature have also been isolated. Although epidemiological studies have detected this variant in approximately 17-50% of the total infections in Brazil, the prevalence of B"-GWGR in the southernmost region of Brazil is not yet clear. This study aimed to investigate the C2-V3 molecular diversity of the HIV-1B epidemic in southernmost Brazil. HIV-1 seropositive patients were analysed at two distinct time points in the state of Rio Grande do Sul (RS98 and RS08) and at one time point in the state of Santa Catarina (SC08). Phylogenetic analysis classified 46 individuals in the RS98 group as HIV-1B and their molecular signatures were as follows: 26% B"-GWGR, 54% B-GPGR and 20% other motifs. In the RS08 group, HIV-1B was present in 32 samples: 22% B"-GWGR, 59% B-GPGR and 19% other motifs. In the SC08 group, 32 HIV-1B samples were found: 28% B"-GWGR, 59% B-GPGR and 13% other motifs. No association could be established between the HIV-1B V3 signatures and exposure categories in the HIV-1B epidemic in RS. However, B-GPGR seemed to be related to heterosexual individuals in the SC08 group. Our results suggest that the established B"-GWGR epidemics in both cities have similar patterns, which is likely due to their geographical proximity and cultural relationship.
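For illustration only, classifying sequences by the V3 crown signature amounts to simple motif matching. The sketch below assumes made-up example sequences and a naive substring test; it is not the study's phylogenetic pipeline.

```python
# Naive illustration of classifying HIV-1B sequences by the V3 crown motif
# (GPGR vs. GWGR vs. other). The sequences below are made-up placeholders,
# not data from the study.
def classify_v3_signature(v3_aa: str) -> str:
    """Classify a V3 loop amino-acid sequence by its crown tetrapeptide."""
    if "GPGR" in v3_aa:
        return "B-GPGR"
    if "GWGR" in v3_aa:
        return 'B"-GWGR'
    return "other"

v3_examples = [
    "CTRPNNNTRKSIHIGPGRAFYTTGEIIGDIRQAHC",   # placeholder consensus-like sequence
    "CTRPNNNTRKSIHIGWGRAFYTTGEIIGDIRQAHC",   # placeholder GWGR variant
]
for seq in v3_examples:
    print(classify_v3_signature(seq))
```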

Abstract:

Fibrocytes are important for understanding the progression of many diseases because they are present in areas where pathogenic lesions are generated. However, the morphology of fibrocytes and their interactions with parasites are poorly understood. In this study, we examined the morphology of peripheral blood fibrocytes and their interactions with Leishmania (L.) amazonensis. Through ultrastructural analysis, we describe the details of fibrocyte morphology and how fibrocytes rapidly internalise Leishmania promastigotes. The parasites differentiated into amastigotes after 2 h in phagolysosomes and the infection was completely resolved after 72 h. Early in the infection, we found increased nitric oxide production and large lysosomes with electron-dense material. These factors may regulate the proliferation and death of the parasites. Because fibrocytes are present at the infection site and are directly involved in developing cutaneous leishmaniasis, they are targets for effective, non-toxic cell-based therapies that control and treat leishmaniasis.

Abstract:

Background: Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors were predefined. We evaluated the effect of using a) real absences, b) pseudo-absences selected randomly from the background, and c) two-step approaches: pseudo-absences selected from low-suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results.

Results: Models built with true absences had the best predictive power and the best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data.

Conclusion: If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences and perform model selection based on an information-theoretic approach. However, the resulting models can be expected to have limited fit.
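As a rough illustration of the random pseudo-absence strategy and the information-theoretic model selection described above, the following Python sketch draws random pseudo-absences from a synthetic background and compares candidate logistic GLMs by AIC. The data, predictors and candidate model sets are invented placeholders, not the study's virtual species.

```python
# Minimal sketch: random pseudo-absences + logistic GLM + AIC-based model selection.
# Synthetic data stand in for the virtual species of the study; names are illustrative.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)

# Background environment: three predictors over candidate cells
n_cells = 5000
env = rng.normal(size=(n_cells, 3))                                  # standardised predictors
true_prob = 1 / (1 + np.exp(-(1.5 * env[:, 0] - 1.0 * env[:, 1])))   # "true" model uses only two predictors
presence_idx = np.where(rng.random(n_cells) < true_prob)[0]

# Presence records plus pseudo-absences drawn at random from the background
pseudo_abs_idx = rng.choice(n_cells, size=len(presence_idx) * 2, replace=False)
X = np.vstack([env[presence_idx], env[pseudo_abs_idx]])
y = np.concatenate([np.ones(len(presence_idx)), np.zeros(len(pseudo_abs_idx))])

# Compare candidate predictor sets by AIC, as in an information-theoretic model selection
candidate_sets = {"x1": [0], "x1+x2": [0, 1], "x1+x2+x3": [0, 1, 2]}
for name, cols in candidate_sets.items():
    design = sm.add_constant(X[:, cols])
    fit = sm.Logit(y, design).fit(disp=False)
    print(f"{name:10s}  AIC = {fit.aic:.1f}")
```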

Abstract:

Since 2008, the intelligence units of six states in the western part of Switzerland have shared a common database for the analysis of high-volume crime. On a daily basis, events reported to the police are analysed, filtered and classified to detect crime repetitions and interpret the crime environment. Several forensic outcomes are integrated into the system, such as matches of traces with persons and links between scenes detected by the comparison of forensic case data. Systematic procedures have been established to integrate links inferred mainly from DNA profiles, shoemark patterns and images. A statistical overview of a retrospective dataset of series recorded in the database from 2009 to 2011 shows, for instance, the number of repetitions detected or confirmed and extended by forensic case data. The time needed to obtain forensic intelligence, which depends on the type of marks treated, is seen as a critical issue. Furthermore, the underlying integration of forensic intelligence into the crime intelligence database raised several difficulties regarding the acquisition of data and the models used in the forensic databases. The solutions found and the operational procedures adopted are described and discussed. This process forms the basis for further research aimed at developing forensic intelligence models.
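As a generic illustration of how scenes linked by shared forensic case data can be grouped into series, the sketch below builds a graph of events connected by common DNA profiles or shoemark codes and extracts its connected components. The event identifiers and data model are invented and do not reflect the Swiss system's actual schema.

```python
# Generic sketch of grouping crime events into series through shared forensic links
# (e.g. the same DNA profile or shoemark pattern recovered at several scenes),
# using connected components. Event IDs and link data are invented for illustration.
from collections import defaultdict

# (event_id, forensic_identifier) pairs: each identifier is e.g. a DNA profile or shoemark code
links = [
    ("E01", "DNA-17"), ("E04", "DNA-17"),      # two scenes linked by one DNA profile
    ("E04", "SHOE-03"), ("E09", "SHOE-03"),    # one of them linked to a third scene by a shoemark
    ("E12", "DNA-08"),                         # isolated event
]

# Build an undirected graph: events are connected when they share a forensic identifier
graph = defaultdict(set)
by_identifier = defaultdict(list)
for event, ident in links:
    by_identifier[ident].append(event)
for events in by_identifier.values():
    for a in events:
        for b in events:
            if a != b:
                graph[a].add(b)

def series(graph, events):
    """Depth-first search over connected components = candidate crime series."""
    seen, groups = set(), []
    for start in events:
        if start in seen:
            continue
        stack, group = [start], []
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            group.append(node)
            stack.extend(graph[node])
        groups.append(sorted(group))
    return groups

all_events = sorted({e for e, _ in links})
print(series(graph, all_events))   # -> [['E01', 'E04', 'E09'], ['E12']]
```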

Abstract:

Two hundred twelve patients with colonization/infection due to amoxicillin-clavulanate (AMC)-resistant Escherichia coli were studied. OXA-1- and inhibitor-resistant TEM (IRT)-producing strains were associated with urinary tract infections, while OXA-1 producers and chromosomal AmpC hyperproducers were associated with bacteremic infections. AMC resistance in E. coli is a complex phenomenon with heterogeneous clinical implications.

Abstract:

AIM: To determine the opinions of infectious diseases professionals on the possibilities of monitoring patients with HIV in Primary Care. DESIGN: Qualitative study using in-depth interviews. LOCATION: Infectious Diseases Unit of the University Hospital "Virgen de la Victoria" in Málaga. PARTICIPANTS: Health professionals with more than one year of experience working in infectious diseases; a total of 25 respondents: 5 doctors, 15 nurses and 5 nursing assistants. METHOD: Convenience sample. Semi-structured interviews were used and later transcribed verbatim. Content analysis was performed according to the Taylor and Bogdan approach with computer support. The information was validated through additional analysis, expert participation, and feedback of part of the results to the participants. RESULTS: Hospital care professionals considered the complexity of HIV as a disease, its treatment, and the social aspects that may affect the organizational level of care. Professionals highlighted the benefits of specialized care, although opinions differed between doctors and nurses regarding follow-up in Primary Care. Some concerns emerged about the level of training, confidentiality and workload in Primary Care, although they mentioned potential advantages related to patients' accessibility. CONCLUSIONS: Physicians perceive difficulties in following up HIV patients in Primary Care, even for patients with good control of their disease. Nurses and nursing assistants are more open to this possibility because of the proximity to home and health promotion in Primary Care.

Abstract:

Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, some effort has been devoted to developing update functions in Boolean models that incorporate recent knowledge. We combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we substitute the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function. Results of this validation hint at the increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of this new model, we visualized the phase transition between order and chaos through the critical regime using Derrida plots. We complement the qualitative nature of Derrida plots with an alternative measure, the criticality distance, which also allows the regimes to be discriminated in a quantitative way. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. This new model includes experimentally derived biological information and recent discoveries, which makes it potentially useful for guiding experimental research. The update function confers additional realism to the model, while reducing the complexity and solution space, thus making it easier to investigate.
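A threshold-based Boolean update and a Derrida-style analysis can be sketched as follows. This is a minimal toy implementation on a random signed network, assuming a simple net-positive-input threshold rule; it is not the study's yeast or mouse embryonic stem cell network, nor its exact update function.

```python
# Minimal sketch of a threshold-based Boolean update on a signed regulatory network,
# plus a one-step Derrida-style measurement. Toy random network for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n_genes = 20

# Signed interactions: W[i, j] = +1 if gene j activates gene i, -1 if it represses it, 0 otherwise
W = rng.choice([-1, 0, 0, 0, 1], size=(n_genes, n_genes))

def threshold_update(state, W, theta=0.0):
    """Synchronous threshold rule: gene i switches on when the summed input of its
    regulators exceeds theta; a gene with no regulators keeps its current state."""
    inputs = W @ state
    new_state = (inputs > theta).astype(int)
    no_input = ~W.any(axis=1)
    new_state[no_input] = state[no_input]
    return new_state

def derrida_point(W, flip=1, samples=500):
    """Average one-step Hamming distance after flipping `flip` genes in a random state."""
    distances = []
    for _ in range(samples):
        s = rng.integers(0, 2, n_genes)
        s_pert = s.copy()
        idx = rng.choice(n_genes, size=flip, replace=False)
        s_pert[idx] ^= 1
        distances.append(np.sum(threshold_update(s, W) != threshold_update(s_pert, W)))
    return np.mean(distances)

# Values near `flip` suggest critical behaviour; well below is ordered, well above is chaotic.
print("mean one-step divergence:", derrida_point(W, flip=1))
```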

Abstract:

Introduction: The high prevalence of disease-related hospital malnutrition justifies the need for screening tools and early detection in patients at risk of malnutrition, followed by an assessment targeted towards diagnosis and treatment. At the same time, there is clear undercoding of malnutrition diagnoses and of the procedures to correct it. Objectives: To describe the INFORNUT program/process and its development as an information system. To quantify performance in its different phases. To cite other tools used as a coding source. To calculate the coding rates for malnutrition diagnoses and related procedures. To show the relationship to Mean Stay, Mortality Rate and Urgent Readmission, and to quantify its impact on the hospital Complexity Index and its effect on the justification of Hospitalization Costs. Material and methods: The INFORNUT® process is based on an automated screening program for the systematic detection and early identification of malnourished patients on hospital admission, as well as their assessment, diagnoses, documentation and reporting. Of the total readmissions with stays longer than three days recorded in 2008 and 2010, we identified patients who underwent analytical screening with an alert for a medium or high risk of malnutrition, as well as the subgroup of patients in whom we were able to administer the complete INFORNUT® process, generating a report for each.

Abstract:

According to the Bethesda Statement on Open Access Policy for libraries and the recommendations of the BOAI10, libraries and librarians have an important role to fulfil in encouraging open access. Taking into account the Competencies for Information Professionals of the 21st Century, drawn up by the Special Libraries Association, and the Librarians' Competencies Profile for Scholarly Publishing and Open Access, we identify the competencies and new areas of knowledge and expertise involved in the development and upkeep of our institutional repository (Repositorio SSPA).

Abstract:

This paper investigates the role of learning by private agents and the central bank (two-sided learning) in a New Keynesian framework in which both sides of the economy have asymmetric and imperfect knowledge about the true data-generating process. We assume that all agents employ the data that they observe (which may be distinct for different sets of agents) to form beliefs about unknown aspects of the true model of the economy, use their beliefs to decide on actions, and revise these beliefs through a statistical learning algorithm as new information becomes available. We study the short-run dynamics of our model and derive its policy recommendations, particularly with respect to central bank communications. We demonstrate that two-sided learning can generate substantial increases in volatility and persistence, and can alter the behavior of the variables in the model in a significant way. Our simulations do not converge to a symmetric rational expectations equilibrium, and we highlight one source that invalidates the convergence results of Marcet and Sargent (1989). Finally, we identify a novel aspect of central bank communication in models of learning: communication can be harmful if the central bank's model is substantially mis-specified.
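The "statistical learning algorithm" by which agents revise beliefs is, in the adaptive-learning literature, typically a recursive least-squares scheme. The sketch below shows generic constant-gain recursive least-squares updating on simulated data; the regressors, gain value and coefficients are illustrative and are not taken from the paper's New Keynesian model.

```python
# Minimal sketch of constant-gain recursive least-squares (RLS) belief updating, the
# kind of statistical learning algorithm used in adaptive-learning models. Agents
# regress an observed variable y_t on regressors x_t and revise coefficient beliefs
# each period. Generic example, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)
n_regressors = 2
true_coeffs = np.array([0.8, -0.3])          # unknown to the agents

beliefs = np.zeros(n_regressors)             # current coefficient estimates
R = np.eye(n_regressors)                     # moment matrix of the regressors
gain = 0.02                                  # constant gain: discounts old data

for t in range(1, 501):
    x = rng.normal(size=n_regressors)        # regressors observed this period
    y = true_coeffs @ x + 0.1 * rng.normal() # realised outcome
    # RLS updating: move beliefs toward the latest forecast error
    R = R + gain * (np.outer(x, x) - R)
    forecast_error = y - beliefs @ x
    beliefs = beliefs + gain * np.linalg.solve(R, x) * forecast_error

print("estimated coefficients:", np.round(beliefs, 3))
```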

Abstract:

Time-lapse geophysical data acquired during transient hydrological experiments are being increasingly employed to estimate subsurface hydraulic properties at the field scale. In particular, crosshole ground-penetrating radar (GPR) data, collected while water infiltrates into the subsurface either by natural or artificial means, have been demonstrated in a number of studies to contain valuable information concerning the hydraulic properties of the unsaturated zone. Previous work in this domain has considered a variety of infiltration conditions and different amounts of time-lapse GPR data in the estimation procedure. However, the particular benefits and drawbacks of these different strategies as well as the impact of a variety of key and common assumptions remain unclear. Using a Bayesian Markov-chain-Monte-Carlo stochastic inversion methodology, we examine in this paper the information content of time-lapse zero-offset-profile (ZOP) GPR traveltime data, collected under three different infiltration conditions, for the estimation of van Genuchten-Mualem (VGM) parameters in a layered subsurface medium. Specifically, we systematically analyze synthetic and field GPR data acquired under natural loading and two rates of forced infiltration, and we consider the value of incorporating different amounts of time-lapse measurements into the estimation procedure. Our results confirm that, for all infiltration scenarios considered, the ZOP GPR traveltime data contain important information about subsurface hydraulic properties as a function of depth, with forced infiltration offering the greatest potential for VGM parameter refinement because of the higher stressing of the hydrological system. Considering greater amounts of time-lapse data in the inversion procedure is also found to help refine VGM parameter estimates. Quite importantly, however, inconsistencies observed in the field results point to the strong possibility that posterior uncertainties are being influenced by model structural errors, which in turn underlines the fundamental importance of a systematic analysis of such errors in future related studies.
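As a rough sketch of the Bayesian Markov-chain-Monte-Carlo machinery involved, the following random-walk Metropolis example inverts synthetic traveltimes for the slownesses of a two-layer toy medium. The forward model, priors and numbers are placeholders; they stand in for, but do not reproduce, the study's GPR forward model or its van Genuchten-Mualem parameterisation.

```python
# Minimal random-walk Metropolis sketch for a Bayesian traveltime inversion.
# Toy forward model: two horizontal layers with unknown slownesses; the observed data
# are straight-ray traveltimes. This only illustrates the MCMC machinery.
import numpy as np

rng = np.random.default_rng(7)

ray_lengths = np.array([[5.0, 0.0],   # ray crossing only layer 1
                        [2.5, 2.5],   # ray crossing both layers
                        [0.0, 5.0]])  # ray crossing only layer 2 (metres per layer)
true_slowness = np.array([10.0, 14.0])            # ns per metre
sigma = 0.5                                       # traveltime noise (ns)
observed = ray_lengths @ true_slowness + sigma * rng.normal(size=3)

def log_posterior(slowness):
    """Gaussian data misfit with a flat prior restricted to positive slownesses."""
    if np.any(slowness <= 0):
        return -np.inf
    residual = observed - ray_lengths @ slowness
    return -0.5 * np.sum((residual / sigma) ** 2)

# Random-walk Metropolis sampling
current = np.array([12.0, 12.0])
current_lp = log_posterior(current)
samples = []
for _ in range(20000):
    proposal = current + 0.2 * rng.normal(size=2)
    proposal_lp = log_posterior(proposal)
    if np.log(rng.random()) < proposal_lp - current_lp:
        current, current_lp = proposal, proposal_lp
    samples.append(current)

samples = np.array(samples[5000:])                # discard burn-in
print("posterior mean slowness:", samples.mean(axis=0))
```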

Abstract:

[Table of contents] 1. Introduction. 2. Structure (introduction, hierarchy). 3. Processes (general aspects, client flows, activity flows, resource flows, temporal aspects, accounting aspects). 4. Descriptors (qualification, quantification). 5. Indicators (definitions, productivity, relevance, adequacy, efficacy, effectiveness, efficiency, standards). 6. Bibliography.

Abstract:

Food intake increases to a varying extent during pregnancy to provide extra energy for the growing fetus. Measuring the respiratory quotient (RQ) during the course of pregnancy (by quantifying O2 consumption and CO2 production with indirect calorimetry) could be potentially useful, since it gives an insight into the evolution of the proportion of carbohydrate vs. fat oxidized during pregnancy and thus allows recommendations on macronutrients for achieving a balanced (or slightly positive) substrate intake. A systematic search of the literature for papers reporting RQ changes during normal pregnancy identified 10 papers reporting original research. The existing evidence supports an increased RQ of varying magnitude in the third trimester of pregnancy, while the discrepant results reported for the first and second trimesters (i.e. no increase in RQ), explained by limited statistical power (small sample size) or fragmentary data, preclude safe conclusions about the evolution of RQ during early pregnancy. From a clinical point of view, measuring RQ during pregnancy not only requires sophisticated and costly indirect calorimeters but also appears to be of limited value outside pure research projects, because of several confounding variables: (1) spontaneous changes in food intake and food composition during the course of pregnancy (which influence RQ); (2) inter-individual differences in weight gain and in the composition of tissue growth; (3) technical factors, notwithstanding the relatively small contribution of fetal metabolism per se (RQ close to 1.0) to the overall metabolism of the pregnant mother.
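For reference, the respiratory quotient from indirect calorimetry and a common simplified (non-protein) reading of it can be written as a short sketch. The linear interpolation between fat oxidation (RQ near 0.70) and carbohydrate oxidation (RQ near 1.00) ignores protein oxidation, and the function names and example values below are illustrative assumptions.

```python
# Small sketch of the respiratory quotient and a simplified (non-protein) reading of it:
# RQ = VCO2 / VO2, interpolated between fat oxidation (RQ ~ 0.70) and carbohydrate
# oxidation (RQ ~ 1.00). Protein oxidation is ignored here for simplicity.
def respiratory_quotient(vco2_l_min: float, vo2_l_min: float) -> float:
    """RQ from indirect calorimetry: CO2 produced over O2 consumed (L/min)."""
    return vco2_l_min / vo2_l_min

def carbohydrate_fraction(rq: float) -> float:
    """Approximate fraction of non-protein energy from carbohydrate (0 at RQ 0.70, 1 at RQ 1.00)."""
    return min(1.0, max(0.0, (rq - 0.70) / (1.00 - 0.70)))

rq = respiratory_quotient(vco2_l_min=0.25, vo2_l_min=0.29)
print(f"RQ = {rq:.2f}, approx. carbohydrate share = {carbohydrate_fraction(rq):.0%}")
```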