997 results for temporal complexity
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Sustainable computer systems require some flexibility to adapt to unpredictable environmental changes. A solution lies in autonomous software agents, which can adapt autonomously to their environments. Though autonomy allows agents to decide which behavior to adopt, a disadvantage is a lack of control, and as a side effect even untrustworthiness: we want to keep some control over such autonomous agents. How can we control autonomous agents while respecting their autonomy? A solution is to regulate agents' behavior by norms. The normative paradigm makes it possible to control autonomous agents while respecting their autonomy, limiting untrustworthiness and increasing system compliance. It can also facilitate the design of the system, for example by regulating the coordination among agents. However, an autonomous agent will follow norms under some conditions and violate them under others. Under what conditions is a norm binding upon an agent? While autonomy is regarded as the driving force behind the normative paradigm, cognitive agents provide a basis for modeling the bindingness of norms. To cope with the complexity of modeling cognitive agents and normative bindingness, we adopt an intentional stance. Since agents are embedded in a dynamic environment, events may not occur at the same instant. Accordingly, our cognitive model is extended to account for some temporal aspects. Special attention is given to the temporal peculiarities of the legal domain, among others the time in force and the time in efficacy of provisions. Some types of normative modifications are also discussed in the framework. Notably, our temporal account of legal reasoning is integrated with our commonsense temporal account of cognition. As our intention is to build sustainable reasoning systems running in unpredictable environments, we adopt a declarative representation of knowledge.
A declarative representation of norms makes it easier to update their representation in the system, facilitating system maintenance, and improves system transparency, easing system governance. Since agents are bounded and embedded in unpredictable environments, and since conflicts may arise among mental states and norms, agent reasoning has to be defeasible, i.e. new pieces of information can invalidate formerly derivable conclusions. In this dissertation, our model is formalized in a non-monotonic logic, namely a temporal modal defeasible logic, in order to account for the interactions between normative systems and cognitive software agents.
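As a toy illustration of defeasible reasoning of the kind the dissertation formalizes, the sketch below (with invented rule names and a naive priority scheme; not the thesis's temporal modal defeasible logic) lets a higher-priority exception defeat a norm's conclusion:

```python
# Minimal sketch of defeasible inference: a rule's conclusion stands
# unless a higher-priority rule derives the contrary ("~"-prefixed) literal.

def defeasible_conclude(facts, rules):
    """rules: list of (premises, conclusion, priority); higher priority wins."""
    applicable = [r for r in rules if set(r[0]) <= facts]
    conclusions = {}
    for premises, concl, prio in applicable:
        atom = concl.lstrip("~")  # the literal's underlying atom
        if atom not in conclusions or prio > conclusions[atom][1]:
            conclusions[atom] = (concl, prio)
    return {concl for concl, _ in conclusions.values()}

# A norm-derived obligation is defeated by a higher-priority exception.
rules = [
    (["request"], "obliged_reply", 1),            # norm: requests oblige a reply
    (["request", "spam"], "~obliged_reply", 2),   # exception overrides the norm
]
print(defeasible_conclude({"request", "spam"}, rules))  # {'~obliged_reply'}
```

Adding the fact `spam` invalidates the formerly derivable obligation, which is exactly the non-monotonic behavior the abstract describes.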
Abstract:
Although assessment of asthma control is important to guide treatment, it is difficult since the temporal pattern and risk of exacerbations are often unpredictable. In this Review, we summarise the classic methods to assess control with unidimensional and multidimensional approaches. Next, we show how ideas from the science of complexity can explain the seemingly unpredictable nature of bronchial asthma and emphysema, with implications for chronic obstructive pulmonary disease. We show that fluctuation analysis, a method used in statistical physics, can be used to gain insight into asthma as a dynamic disease of the respiratory system, viewed as a set of interacting subsystems (eg, inflammatory, immunological, and mechanical). The basis of the fluctuation analysis methods is the quantification of the long-term temporal history of lung function parameters. We summarise how this analysis can be used to assess the risk of future asthma episodes, with implications for asthma severity and control both in children and adults.
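The fluctuation-analysis methods referred to are typically variants of detrended fluctuation analysis (DFA) applied to long lung-function records; the following is a minimal generic sketch (window scales and linear detrending are illustrative defaults, not the Review's protocol):

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis: F(n) per window size n and scaling exponent."""
    profile = np.cumsum(signal - np.mean(signal))  # integrated series
    F = []
    for n in scales:
        n_windows = len(profile) // n
        sq_resid = []
        for w in range(n_windows):
            seg = profile[w * n:(w + 1) * n]
            t = np.arange(n)
            coef = np.polyfit(t, seg, 1)           # linear detrend per window
            sq_resid.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(sq_resid)))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # slope in log-log
    return np.array(F), alpha
```

For an uncorrelated series the exponent alpha is close to 0.5; persistent long-range correlations, of the kind reported for lung-function histories, push it toward 1.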
Abstract:
Ungulates are important components of a variety of ecosystems worldwide. This dissertation integrates aspects of ungulate and forest ecology to increase our understanding of how they work together in ways that are of interest to natural resource managers, educators, and those who are simply curious about nature. Although animal ecology and ecosystem ecology are often studied separately, one of the general goals of this dissertation is to examine how they interact across spatial and temporal scales. Forest ecosystems are heterogeneous across a range of scales. Spatial and temporal habitat use patterns of forest ungulates tend to be congregated in patches where food and/or cover are readily available. Ungulates interact with ecosystem processes by selectively foraging on plants and excreting waste products in concentrated patches. Positive feedbacks may develop where these activities increase the value of habitat through soil fertilization or the alteration of plant chemistry and architecture. Heterogeneity in ecosystem processes and plant community structure, observed at both stand and local scales, may be the integrated outcome of feedbacks between ungulate behavior and abiotic resource gradients. The first chapter of this dissertation briefly discusses pertinent background information on ungulate ecology, with a focus on white-tailed deer (Odocoileus virginianus) in the Upper Great Lakes region and moose (Alces alces) in Isle Royale National Park, Michigan, USA. The second chapter demonstrates why ecological context is important for studying ungulate ecology in forest ecosystems. Excluding deer from eastern hemlock (Tsuga canadensis) stands, which deer use primarily as winter cover, resulted in less spatial complexity in soil reactive nitrogen and greater complexity in diffuse light compared to unfenced stands.
The spatial patterning of herbaceous-layer cover was more similar to nitrogen where deer were present, and was a combination of nitrogen and light within deer exclosures. This relationship depends on the seasonal timing of deer habitat use because deer fertilize the soil during winter, but leave during the growing season. The third chapter draws upon an eight-year, 39-stand data set of deer fecal pellet counts in hemlock stands to estimate the amount of nitrogen that deer are depositing in hemlock stands each winter. In stands of high winter deer use, deer-excreted nitrogen inputs consistently exceeded those of atmospheric deposition at the stand scale. At the neighborhood scale, deer-excreted nitrogen was often in excess of atmospheric deposition due to the patchy distribution of deer habitat use. Spatial patterns in habitat use were consistent over the eight-year study at both stand and neighborhood scales. The fourth chapter explores how foraging selectivity by moose interacts with an abiotic resource gradient to influence forest structure and composition. Soil depth on Isle Royale varies from east to west according to glacial history. Fir saplings growing in deeper soils on the west side are generally more palatable forage for moose (lower foliar C:N) than those growing in shallower soils on the east side. Therefore, saplings growing in better conditions are less likely to reach the canopy due to moose browsing, and fir is a smaller overstory component on the west side. Lastly, chapter five focuses on issues surrounding eastern hemlock regeneration failure, which is a habitat type that is important to many wildlife species. Increasing hemlock on the landscape is complicated by several factors including disturbance regime and climate change, in addition to the influence of deer.
Abstract:
Brain electric activity is viewed as sequences of momentary maps of potential distribution. Frequency-domain source modeling, estimation of the complexity of the trajectory of the mapped brain field distributions in state space, and microstate parsing were used as analysis tools. Input-presentation as well as task-free (spontaneous thought) data collection paradigms were employed. We found: Alpha EEG field strength is more affected by visualizing mentation than by abstract mentation, both input-driven and self-generated. There are different neuronal populations and brain locations of the electric generators for different temporal frequencies of the brain field. Different alpha frequencies execute different brain functions, as revealed by canonical correlations with mentation profiles. Different modes of mentation engage the same temporal frequencies at different brain locations. The basic structure of alpha electric fields implies inhomogeneity over time: alpha consists of concatenated global microstates in the sub-second range, characterized by quasi-stable field topographies and rapid transitions between the microstates. In general, brain activity is strongly discontinuous, indicating that parsing into field landscape-defined microstates is appropriate. Different modes of spontaneous and induced mentation are associated with different brain electric microstates; these are proposed as candidates for psychophysiological "atoms of thought".
Abstract:
The bone-anchored port (BAP) is an investigational implant intended to be fixed on the temporal bone and provide vascular access. There are a number of implants taking advantage of the stability and available room in the temporal bone. These devices range from implantable hearing aids to percutaneous ports. During temporal bone surgery, injury to critical anatomical structures must be avoided. Several methods for computer-assisted temporal bone surgery have been reported, which typically add a procedure for the patient. We propose a surgical guide in the form of a bone-thickness map displaying anatomical landmarks that can be used for planning of the surgery and for the intra-operative decision of the implant's location. The retro-auricular region of the temporal and parietal bone was marked on cone-beam computed tomography scans, and three-dimensional surfaces displaying the bone thickness were created from this space. We compared this method using a thickness map (n = 10) with conventional surgery without assistance (n = 5) in isolated human anatomical whole head specimens. The use of the thickness map reduced the rate of dura mater exposure from 100% to 20% and suppressed sigmoid sinus exposures. The study shows that a bone-thickness map can be used as a low-complexity method to improve patient safety during BAP surgery in the temporal bone.
Abstract:
Snow avalanches pose a threat to settlements and infrastructure in alpine environments. Due to the catastrophic events of recent years, the public is more aware of this phenomenon. Alpine settlements have always been confronted with natural hazards, but changes in land use and in dealing with avalanche hazards lead to an altering perception of this threat. In this study, a multi-temporal risk assessment is presented for three avalanche tracks in the municipality of Galtür, Austria. Changes in avalanche risk as well as changes in the risk-influencing factors (process behaviour, values at risk (buildings), and vulnerability) between 1950 and 2000 are quantified. An additional focus is placed on the interconnection between these factors and their influence on the resulting risk. The avalanche processes were calculated using different simulation models (SAMOS as well as ELBA+). For each avalanche track, different scenarios were calculated according to the development of mitigation measures. The focus of the study was on a multi-temporal risk assessment; consequently, the models used could be replaced with other snow avalanche models providing the same functionality. The monetary values of buildings were estimated using the volume of the buildings and average prices per cubic meter. The changing size of the buildings over time was inferred from construction plans. The vulnerability of the buildings is understood as a degree of loss to a given element within the area affected by natural hazards. A vulnerability function for different construction types of buildings that depends on avalanche pressure was used to assess the degree of loss. No general risk trend could be determined for the studied avalanche tracks. Due to the high complexity of the variations in risk, small changes in one of several influencing factors can cause considerable differences in the resulting risk.
This multi-temporal approach leads to a better understanding of today's risk by identifying the main changes and the underlying processes. Furthermore, this knowledge can be implemented in strategies for sustainable development in Alpine settlements.
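In assessments of this kind, risk is commonly computed as the product of event probability, the monetary value at risk, and a pressure-dependent vulnerability. The sketch below is a toy illustration: the vulnerability curve and all numbers are invented, not the study's actual function or data.

```python
# Toy risk model: annual risk = p_event * value * vulnerability(pressure).
# The piecewise-linear vulnerability curve below is invented for illustration.

def vulnerability(pressure_kpa):
    """Degree of loss in [0, 1] as a function of avalanche impact pressure."""
    return min(1.0, max(0.0, (pressure_kpa - 1.0) / 30.0))

def annual_risk(p_event, buildings):
    """Sum of expected annual losses over (monetary value, pressure) pairs."""
    return sum(p_event * value * vulnerability(pressure)
               for value, pressure in buildings)

# Two hypothetical buildings exposed to a 1-in-100-year avalanche scenario.
buildings_1950 = [(200_000, 10.0), (150_000, 4.0)]
print(annual_risk(1 / 100, buildings_1950))
```

Because risk is a product of several factors, a small change in any one of them (e.g. a mitigation measure lowering the impact pressure) shifts the result disproportionately, which matches the study's observation that no general trend emerges.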
Abstract:
Quantitative descriptive analysis (QDA) is used to describe the nature and the intensity of sensory properties from a single evaluation of a product, whereas temporal dominance of sensations (TDS) is primarily used to identify dominant sensory properties over time. Previous studies with TDS have focused on model systems, but this is the first study to use a sequential approach, i.e. QDA then TDS, in measuring sensory properties of a commercial product category, using the same set of trained assessors (n = 11). The main objectives of this study were: (1) to investigate the benefits of using a sequential approach of QDA and TDS and (2) to explore the impact of sample composition on taste and flavour perceptions in blackcurrant squashes. The present study proposes an alternative way of determining the choice of attributes for TDS measurement based on data obtained from previous QDA studies, where available. Both methods indicated that the flavour profile was primarily influenced by the level of dilution and the complexity of sample composition combined with blackcurrant juice content. In addition, artificial sweeteners were found to modify the quality of sweetness and could also contribute to bitter notes. Using QDA and TDS in tandem was shown to be more beneficial than using either method on its own, enabling a more complete sensory profile of the products.
Abstract:
Increasingly, studies of genes and genomes are indicating that considerable horizontal transfer has occurred between prokaryotes. Extensive horizontal transfer has occurred for operational genes (those involved in housekeeping), whereas informational genes (those involved in transcription, translation, and related processes) are seldom horizontally transferred. Through phylogenetic analysis of six complete prokaryotic genomes and the identification of 312 sets of orthologous genes present in all six genomes, we tested two theories describing the temporal flow of horizontal transfer. We show that operational genes have been horizontally transferred continuously since the divergence of the prokaryotes, rather than having been exchanged in one, or a few, massive events that occurred early in the evolution of prokaryotes. In agreement with earlier studies, we found that differences in rates of evolution between operational and informational genes are minimal, suggesting that factors other than rate of evolution are responsible for the observed differences in horizontal transfer. We propose that a major factor in the more frequent horizontal transfer of operational genes is that informational genes are typically members of large, complex systems, whereas operational genes are not, thereby making horizontal transfer of informational gene products less probable (the complexity hypothesis).
Abstract:
Analysis of previously published sets of DNA microarray gene expression data by singular value decomposition has uncovered underlying patterns or “characteristic modes” in their temporal profiles. These patterns contribute unequally to the structure of the expression profiles. Moreover, the essential features of a given set of expression profiles are captured using just a small number of characteristic modes. This leads to the striking conclusion that the transcriptional response of a genome is orchestrated in a few fundamental patterns of gene expression change. These patterns are both simple and robust, dominating the alterations in expression of genes throughout the genome. Moreover, the characteristic modes of gene expression change in response to environmental perturbations are similar in such distant organisms as yeast and human cells. This analysis reveals simple regularities in the seemingly complex transcriptional transitions of diverse cells to new states, and these provide insights into the operation of the underlying genetic networks.
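The decomposition described is a standard singular value decomposition of the genes-by-timepoints expression matrix; a small synthetic sketch (the data and the two underlying patterns are invented for illustration) is:

```python
import numpy as np

# Synthetic genes-by-timepoints matrix built from two hidden temporal patterns,
# mimicking expression profiles dominated by a few "characteristic modes".
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 8)
base = np.vstack([np.sin(2 * np.pi * t), t])            # two temporal patterns
X = rng.normal(size=(500, 2)) @ base \
    + 0.05 * rng.normal(size=(500, 8))                  # small measurement noise

# Rows of Vt are the characteristic modes; s**2 gives each mode's weight.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
frac = s ** 2 / np.sum(s ** 2)   # fraction of total variance per mode
print(frac[:3])                  # the first two modes dominate
```

Because the synthetic data really are two patterns plus noise, almost all variance lands in the first two modes, which is the structure the abstract reports for real expression datasets.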
Abstract:
We quantified the morphology of over 350 pyramidal neurons with identified ipsilateral corticocortical projections to the primary (V1) and middle temporal (MT) visual areas of the marmoset monkey, following intracellular injection of Lucifer Yellow into retrogradely labelled cells. Paralleling the results of studies in which randomly sampled pyramidal cells were injected, we found that the size of the basal dendritic tree of connectionally identified cells differed between cortical areas, as did the branching complexity and spine density. We found no systematic relationship between dendritic tree structure and axon target or length. Instead, the size of the basal dendritic tree increased roughly in relation to increasing distance from the occipital pole, irrespective of the length of the connection or the cortical layer in which the neurons were located. For example, cells in the second visual area had some of the smallest and least complex dendritic trees irrespective of whether they projected to V1 or MT, while those in the dorsolateral area (DL) were among the largest and most complex. We also observed that systematic differences in spine number were more marked among V1-projecting cells than MT-projecting cells. These data demonstrate that the previously documented systematic differences in pyramidal cell morphology between areas cannot simply be attributed to variable proportions of neurons projecting to different targets in the various areas. Moreover, they suggest that mechanisms intrinsic to the area in which neurons are located are strong determinants of basal dendritic field structure.
Abstract:
Pattern discovery in temporal event sequences is of great importance in many application domains, such as telecommunication network fault analysis. In reality, not every type of event has an accurate timestamp. Some of them, defined as inaccurate events, may only have an interval as their possible time of occurrence. The existence of inaccurate events may cause uncertainty in event ordering. The traditional support model cannot deal with this uncertainty, which would cause some interesting patterns to be missed. A new concept, precise support, is introduced to evaluate the probability of a pattern being contained in a sequence. Based on this new metric, we define the uncertainty model and present an algorithm to discover interesting patterns in a sequence database that has one type of inaccurate event. In our model, the number of types of inaccurate events can readily be extended to k, though at the cost of increased computational complexity.
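As a toy illustration of the idea behind precise support, assuming a uniform distribution over the occurrence interval (the paper's actual model may differ), the probability that an inaccurately timed event precedes an exactly timed one can be computed as:

```python
# Probability that an inaccurate event, known only to occur in [lo, hi],
# happens before an accurately timestamped event (uniform-time assumption).

def prob_before(interval, timestamp):
    lo, hi = interval
    if timestamp <= lo:
        return 0.0          # the interval lies entirely after the timestamp
    if timestamp >= hi:
        return 1.0          # the interval lies entirely before the timestamp
    return (timestamp - lo) / (hi - lo)

# Pattern A -> B: A is inaccurate (interval [2, 6]), B occurs exactly at t = 5.
print(prob_before((2, 6), 5))   # 0.75
```

A pattern's contribution to precise support in a sequence would then be this fractional probability rather than the 0-or-1 count of the traditional support model, which is how uncertain orderings stop interesting patterns from being missed.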
Abstract:
Computer simulated trajectories of bulk water molecules form complex spatiotemporal structures at the picosecond time scale. This intrinsic complexity, which underlies the formation of molecular structures at longer time scales, has been quantified using a measure of statistical complexity. The method estimates the information contained in the molecular trajectory by detecting and quantifying temporal patterns present in the simulated data (velocity time series). Two types of temporal patterns are found. The first, defined by the short-time correlations corresponding to the velocity autocorrelation decay times (≈0.1 ps), remains asymptotically stable for time intervals longer than several tens of nanoseconds. The second is caused by previously unknown longer-time correlations (found at time scales longer than nanoseconds), leading to a value of statistical complexity that slowly increases with time. A direct measure based on the notion of statistical complexity is introduced that describes how the trajectory explores the phase space and is independent of the particular molecular signal used as the observed time series. © 2008 The American Physical Society.
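The short-time correlations mentioned are characterized by the velocity autocorrelation function; a minimal generic estimator (not the paper's statistical-complexity measure itself) is:

```python
import numpy as np

def autocorr(v, max_lag):
    """Normalized autocorrelation C(k) = <v(t) v(t+k)> / <v(t)^2> of a series."""
    v = np.asarray(v, dtype=float)
    v = v - v.mean()                      # remove any drift in the mean
    denom = np.mean(v * v)
    return np.array([np.mean(v[:len(v) - k] * v[k:]) / denom
                     for k in range(max_lag)])
```

For a simulated velocity component, the lag at which C(k) decays toward zero gives the correlation time (around 0.1 ps for bulk water, per the abstract); the statistical-complexity analysis then looks for patterns beyond this decay.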
Abstract:
Novel molecular complexity measures are designed based on quantum molecular kinematics. The Hamiltonian matrix, constructed in a quasi-topological approximation, describes the temporal evolution of the modelled electronic system and determines the time derivatives of the dynamic quantities. This allows one to define average quantum kinematic characteristics closely related to the curvatures of the electron paths, particularly the torsion, which reflects the chirality of the dynamic system. Special attention has been given to the computational scheme for this chirality measure. Calculations on realistic molecular systems demonstrate reasonable behaviour of the proposed molecular complexity indices.
Abstract:
We use advanced statistical tools of time-series analysis to characterize the dynamical complexity of the transition to optical wave turbulence in a fiber laser. Ordinal analysis and the horizontal visibility graph, applied to the experimentally measured laser output intensity, reveal the presence of temporal correlations during the transition from the laminar to the turbulent lasing regime. Both methods unveil coherent structures with well-defined time scales and strong correlations both in the timing of the laser pulses and in their peak intensities. Our approach is generic and may be used in other complex systems that undergo similar transitions involving the generation of extreme fluctuations.
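Ordinal analysis rests on counting rank-order (permutation) patterns in the measured series; a minimal generic sketch (the pattern length and entropy normalization are standard choices, not necessarily the authors') is:

```python
import numpy as np
from collections import Counter
from math import log, factorial

def permutation_entropy(x, d=3):
    """Normalized Shannon entropy of ordinal patterns of length d in series x."""
    # Each window of d samples is mapped to the permutation that sorts it.
    counts = Counter(tuple(np.argsort(x[i:i + d])) for i in range(len(x) - d + 1))
    n = sum(counts.values())
    H = -sum((c / n) * log(c / n) for c in counts.values())
    return H / log(factorial(d))   # 0 = fully ordered, 1 = fully random
```

A laminar, regular intensity trace concentrates probability on a few ordinal patterns (entropy near 0), while a turbulent one spreads it over all d! patterns (entropy near 1), which is how such analysis exposes temporal correlations across the transition.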