4 results for Papillary Patterns Analyze
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
Community-acquired pneumonia (CAP) is a common cause of morbidity among children. Evidence on seasonality, especially on the frequency of viral and bacterial causative agents, is scarce; such information may be useful in an era of changing climate conditions worldwide. To analyze the frequency of distinct infections in relation to meteorological indicators and seasons among children hospitalized for CAP in Salvador, Brazil, nasopharyngeal aspirate and blood samples were collected from 184 patients aged < 5 y over a 21-month period. Fourteen microbes were investigated, and the aetiology was established in 144 (78%) cases. Significant differences in air temperature were found between spring and summer (p = 0.02) or winter (p < 0.001), summer and fall (p = 0.007) or winter (p < 0.001), and fall and winter (p = 0.002), and in precipitation between spring and fall (p = 0.01). Correlations were found between: overall viral infections and relative humidity (p = 0.006; r = 0.6) or precipitation (p = 0.03; r = 0.5), parainfluenza and precipitation (p = 0.02; r = -0.5), respiratory syncytial virus (RSV) and air temperature (p = 0.048; r = -0.4) or precipitation (p = 0.045; r = 0.4), adenovirus and precipitation (p = 0.02; r = 0.5), pneumococcus and air temperature (p = 0.04; r = -0.4), and Chlamydia trachomatis and relative humidity (p = 0.02; r = -0.5). The frequency of parainfluenza infection was highest during spring (32.1%; p = 0.005), and that of RSV infection was highest in the fall (36.4%; p < 0.001). Correlations of moderate strength were found between several microbes and meteorological indicators. Parainfluenza and RSV presented marked seasonal patterns.
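The reported associations (e.g., overall viral infections vs. relative humidity, r = 0.6) are correlations between time series of case frequencies and meteorological indicators. A minimal sketch of such a computation is shown below; the monthly values are invented for illustration, and the choice of the Pearson coefficient is an assumption, since the abstract does not name the statistical test used.

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented monthly series: viral CAP case counts vs. mean relative humidity (%)
viral_cases = [4, 6, 9, 11, 13, 10, 8, 7, 5, 6, 9, 12]
humidity = [70, 72, 78, 81, 84, 80, 76, 74, 71, 73, 79, 83]

print(round(pearson_r(viral_cases, humidity), 2))
```

A positive coefficient here would mirror the study's finding that viral infections track relative humidity; a rank-based statistic such as Spearman's rho would be a common alternative for count data.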
Abstract:
Morphological integration refers to the modular structuring of inter-trait relationships in an organism, which can bias the direction and rate of morphological change, either constraining or facilitating evolution along certain dimensions of the morphospace. Therefore, describing patterns and magnitudes of morphological integration and analyzing their evolutionary consequences are central to understanding the evolution of complex traits. Here we analyze morphological integration in the skull across several mammalian orders, addressing the following questions: are there common patterns of inter-trait relationships? Are these patterns compatible with hypotheses based on shared development and function? Do morphological integration patterns and magnitudes vary in the same way across groups? We digitized more than 3,500 specimens spanning 15 mammalian orders, estimated the corresponding pooled within-group correlation and variance/covariance matrices for 35 skull traits, and compared those matrices among the orders. We also compared observed patterns of integration to theoretical expectations based on common development and function. Our results point to a largely shared pattern of inter-trait correlations, implying that mammalian skull diversity has been produced upon a common covariance structure that has remained similar for at least 65 million years. Comparisons with a rodent genetic variance/covariance matrix suggest that this broad similarity also extends to the genetic factors underlying phenotypic variation. In contrast to the relative constancy of inter-trait correlation/covariance patterns, magnitudes varied markedly across groups. Several morphological modules hypothesized from shared development and function were detected in the mammalian taxa studied.
Our data provide evidence that mammalian skull evolution can be viewed as a history of inter-module parcellation, with the modules themselves more clearly marked in lineages with a lower overall magnitude of integration. The implication of these findings is that the main evolutionary trend in the mammalian skull was one of reducing constraints on evolution by promoting a more modular architecture.
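Comparisons of trait-correlation matrices like those described above are often summarized with a matrix correlation: the element-wise correlation of the off-diagonal entries of two matrices. The sketch below uses invented 4-trait matrices for two hypothetical orders; the abstract does not name the comparison method actually used (random skewers is another common choice), so this is an illustrative assumption, not the paper's procedure.

```python
import math

def lower_triangle(m):
    """Off-diagonal (lower-triangle) entries of a symmetric matrix, row by row."""
    return [m[i][j] for i in range(len(m)) for j in range(i)]

def matrix_correlation(m1, m2):
    """Pearson correlation between the off-diagonal entries of two
    trait-correlation matrices: a simple similarity index for integration patterns."""
    x, y = lower_triangle(m1), lower_triangle(m2)
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented 4-trait correlation matrices for two hypothetical mammalian orders
order_a = [[1.00, 0.60, 0.20, 0.10],
           [0.60, 1.00, 0.30, 0.20],
           [0.20, 0.30, 1.00, 0.50],
           [0.10, 0.20, 0.50, 1.00]]
order_b = [[1.00, 0.50, 0.25, 0.15],
           [0.50, 1.00, 0.35, 0.10],
           [0.25, 0.35, 1.00, 0.55],
           [0.15, 0.10, 0.55, 1.00]]

print(round(matrix_correlation(order_a, order_b), 2))
```

A value near 1 indicates a shared integration pattern; repeating this across all pairs of orders is the kind of comparison that would reveal the broadly conserved covariance structure the study reports.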
Abstract:
To analyze the differential recruitment of the raphe nuclei during different phases of feeding behavior, rats were subjected to a food restriction schedule (food available for 2 h/day over 15 days). The animals were then exposed to different feeding conditions, constituting the experimental groups: search for food (MFS), food ingestion (MFI), satiety (MFSa) and food restriction control (MFC). A baseline condition (BC) group was included as a further control. The MFI and MFC groups, which presented greater autonomic and somatic activation, had more FOS-immunoreactive (FOS-IR) neurons. The MFI group presented more labeled cells in the linear (LRN) and dorsal (DRN) nuclei; the MFC group showed more labeling in the median (MRN), pontine (PRN), magnus (NRM) and obscurus (NRO) nuclei; and the MFSa group had more labeled cells in the pallidus (NRP). The BC group exhibited the lowest number of reactive cells. The PRN presented the highest percentage of activation within the raphe, while the DRN presented the lowest. Additional experiments revealed few double-labeled (FOS-IR + 5-HT-IR) cells within the raphe nuclei in the MFI group, suggesting little serotonergic activation in the raphe during food ingestion. These findings suggest a differential recruitment of raphe nuclei during the various phases of feeding behavior. Such findings may reflect changes in behavioral state (e.g., food-induced arousal versus sleep) that lead to greater motor activation and, consequently, increased FOS expression. While these data are consistent with the idea that the raphe system acts as a gain setter for autonomic and somatic activities, the functional complexity of the raphe is not completely understood. (c) 2008 Elsevier B.V. All rights reserved.
Abstract:
The assessment of routing protocols for mobile wireless networks is a difficult task because of the networks' dynamic behavior and the absence of benchmarks. However, some of these networks, such as intermittent wireless sensor networks, periodic or cyclic networks, and some delay-tolerant networks (DTNs), have more predictable dynamics, as the temporal variations in the network topology can be considered deterministic, which may make them easier to study. Recently, a graph-theoretic model, the evolving graph, was proposed to help capture the dynamic behavior of such networks, with a view to constructing least-cost routing and other algorithms. The algorithms and insights obtained through this model are theoretically very efficient and intriguing. However, there has been no study of the use of such theoretical results in practical situations. Therefore, the objective of our work is to analyze the applicability of evolving graph theory to the construction of efficient routing protocols in realistic scenarios. In this paper, we use the NS2 network simulator first to implement an evolving-graph-based routing protocol, and then to use it as a benchmark when comparing the four major ad hoc routing protocols (AODV, DSR, OLSR and DSDV). Interestingly, our experiments show that evolving graphs have the potential to be an effective and powerful tool in the development and analysis of algorithms for dynamic networks, at least those with predictable dynamics. In order to make this model widely applicable, however, some practical issues, such as adaptive algorithms, still have to be addressed and incorporated into the model. We also discuss such issues in this paper, as a result of our experience.
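One of the least-cost criteria studied on evolving graphs is the foremost journey: the route that reaches a destination at the earliest possible time, allowing a message to wait at a node until a needed link appears. The sketch below is a minimal illustration of that idea, assuming a toy representation (a map from directed edges to the set of time steps at which they exist) and unit-delay hops; it is not the paper's NS2 implementation.

```python
from collections import deque

def earliest_arrival(hops, source, target):
    """Earliest time `target` can be reached from `source` in an evolving graph.

    `hops` maps a directed edge (u, v) to the set of time steps at which it is
    present; traversing an edge takes one time step, and a message may wait at
    a node until the edge it needs next appears.
    """
    arrival = {source: 0}          # earliest known arrival time per node
    queue = deque([source])
    while queue:
        u = queue.popleft()
        t = arrival[u]
        for (a, b), times in hops.items():
            if a != u:
                continue
            # wait at u for the next appearance of edge (u, b), if any
            nxt = min((s for s in times if s >= t), default=None)
            if nxt is not None and nxt + 1 < arrival.get(b, float("inf")):
                arrival[b] = nxt + 1
                queue.append(b)    # re-relax b with its improved arrival time
    return arrival.get(target)     # None if target is never reachable

# The direct edge (A, C) exists only at t = 4, but routing via B arrives earlier:
hops = {("A", "B"): {0, 5}, ("B", "C"): {2}, ("A", "C"): {4}}
print(earliest_arrival(hops, "A", "C"))  # → 3 (A→B at t=0, wait at B, B→C at t=2)
```

This is exactly the kind of schedule-aware decision a topology-oblivious protocol such as AODV or DSR cannot make, which is why an evolving-graph-based protocol can serve as a benchmark when the topology's evolution is known in advance.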