856 results for complexity
Abstract:
The advent of new signal processing methods, such as non-linear analysis techniques, offers a new perspective that adds value to the analysis of brain signals. In particular, Lempel–Ziv complexity (LZC) has proven useful in exploring the complexity of the brain's electromagnetic activity. However, an important problem is the lack of knowledge about the physiological determinants of these measures. Although a correlation between complexity and connectivity has been proposed, this hypothesis was never tested in vivo. Thus, the correlation between the microstructure of anatomical connectivity and the functional complexity of the brain needs to be inspected. In this study we analyzed the correlation between LZC and fractional anisotropy (FA), a scalar quantity derived from diffusion tensors that is particularly useful as an estimate of the functional integrity of myelinated axonal fibers, in a group of sixteen healthy adults (all female, mean age 65.56 ± 6.06 years, range 58–82). Our results showed a positive correlation between FA and LZC scores in regions including clusters in the splenium of the corpus callosum, the cingulum, parahippocampal regions and the sagittal stratum. This study supports the notion of a positive correlation between the functional complexity of the brain and the microstructure of its anatomical connectivity. Our investigation shows that a combination of neuroanatomical and neurophysiological techniques may shed light on the underlying physiological determinants of brain oscillations.
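For readers unfamiliar with the measure, the sketch below illustrates one common way to estimate LZC from a signal: binarize around the median, then count the phrases produced by a Lempel–Ziv parse. This is a minimal illustration (an LZ78-style phrase count), not necessarily the exact parsing used in the study; published LZC work typically follows the 1976 Lempel–Ziv scheme, which differs in detail.

```python
import numpy as np

def lz_complexity(signal):
    """Rough LZC estimate: median binarization + LZ78-style phrase counting.

    Illustrative only; the 1976 Lempel-Ziv parsing used in most LZC
    papers counts phrases slightly differently.
    """
    m = np.median(signal)
    s = ''.join('1' if x > m else '0' for x in signal)  # binarize around the median
    phrases, phrase = set(), ''
    for ch in s:
        phrase += ch
        if phrase not in phrases:   # shortest substring not yet seen as a phrase
            phrases.add(phrase)
            phrase = ''
    c = len(phrases) + (1 if phrase else 0)
    n = len(s)
    return c * np.log2(n) / n       # normalize so values are comparable across lengths

# Example: white noise yields a higher LZC than a regular oscillation.
rng = np.random.default_rng(0)
print(lz_complexity(rng.standard_normal(4096)))      # irregular -> high
print(lz_complexity(np.sin(np.arange(4096) * 0.1)))  # periodic  -> low
```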
Abstract:
Magnetoencephalography (MEG) allows real-time recording of oscillatory neural activity in distributed neural networks. We applied a non-linear complexity analysis to resting-state neural activity as measured using whole-head MEG. Recordings were obtained from 20 unmedicated patients with major depressive disorder and 19 matched healthy controls. After 6 months of pharmacological treatment with the antidepressant mirtazapine 30 mg/day, patients received a second MEG scan. A measure of the complexity of neural signals, the Lempel–Ziv complexity (LZC), was derived from the MEG time series. We found that depressed patients showed higher pre-treatment complexity values compared with controls, and that complexity values decreased after 6 months of effective pharmacological treatment, although this effect was statistically significant only in younger patients. The main treatment effect was to recover the tendency observed in controls of a positive correlation between age and complexity values. Importantly, the reduction of complexity with treatment correlated with the degree of clinical symptom remission. We suggest that LZC, a formal measure of neural activity complexity, is sensitive to the dynamic physiological changes observed in depression and may potentially offer an objective marker of depression and its remission after treatment.
Abstract:
Objective: The neurodevelopmental–neurodegenerative debate is a basic issue in the field of the neuropathological basis of schizophrenia (SCH). Neurophysiological techniques have rarely been brought into this debate, but nonlinear analysis methods may contribute to it.
Methods: Fifteen patients (age range 23–42 years) meeting DSM-IV-TR criteria for SCH, and 15 sex- and age-matched control subjects (age range 23–42 years) underwent a resting-state magnetoencephalographic evaluation, and Lempel–Ziv complexity (LZC) scores were calculated.
Results: Regression analyses indicated that LZC values were strongly dependent on age. Complexity scores increased as a function of age in controls, while SCH patients exhibited a progressive reduction of LZC values. A logistic model including LZC scores, age and the interaction of both variables allowed the classification of patients and controls with high sensitivity and specificity.
Conclusions: Results demonstrated that SCH patients failed to follow the "normal" process of complexity increase as a function of age. In addition, SCH patients exhibited a significant reduction of complexity scores as a function of age, thus paralleling the pattern observed in neurodegenerative diseases.
Significance: Our results support the notion of a progressive defect in SCH, which does not contradict the existence of a basic neurodevelopmental alteration.
Highlights: ► Schizophrenic patients show higher complexity values as compared to controls. ► Schizophrenic patients showed a tendency to reduced complexity values as a function of age, while controls showed the opposite tendency. ► The tendency observed in schizophrenic patients parallels the tendency observed in Alzheimer's disease patients.
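As an illustration of the kind of classifier the abstract describes (a logistic regression on LZC, age, and their interaction), here is a minimal sketch; the variable names and the synthetic data are placeholders, not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Synthetic placeholder data: in the study, lzc comes from resting-state MEG.
rng = np.random.default_rng(1)
n = 60
age = rng.uniform(23, 42, n)
group = rng.integers(0, 2, n)                    # 0 = control, 1 = patient (toy labels)
slope = np.where(group == 0, 0.003, -0.003)      # opposite age trends, as in the paper
lzc = 0.55 + slope * (age - 32) + rng.normal(0, 0.05, n)
df = pd.DataFrame({"group": group, "age": age, "lzc": lzc})

# Logistic model with LZC, age, and their interaction: "lzc * age" expands to
# lzc + age + lzc:age in the formula syntax.
model = smf.logit("group ~ lzc * age", data=df).fit(disp=False)
print(model.summary())
```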
Abstract:
The research in this thesis is related to static cost and termination analysis. Cost analysis aims at estimating the amount of resources that a given program consumes during execution, and termination analysis aims at proving that the execution of a given program will eventually terminate. These analyses are strongly related; indeed, cost analysis techniques rely heavily on techniques developed for termination analysis. Precision, scalability, and applicability are essential in static analysis in general. Precision is related to the quality of the inferred results, scalability to the size of programs that can be analyzed, and applicability to the class of programs that can be handled by the analysis (independently of precision and scalability issues). This thesis addresses these aspects in the context of cost and termination analysis, from both practical and theoretical perspectives. For cost analysis, we concentrate on the problem of solving cost relations (a form of recurrence relation) to obtain closed-form upper and lower bounds, which is the heart of most modern cost analyzers, and also where most of the precision and applicability limitations can be found. We develop tools, and their underlying theoretical foundations, for solving cost relations that overcome the limitations of existing approaches, and demonstrate their superiority in both precision and applicability. A unique feature of our techniques is the ability to smoothly handle both lower and upper bounds, by reversing the corresponding notions in the underlying theory. For termination analysis, we study the hardness of the problem of deciding termination for a specific form of simple loops that arise in the context of cost analysis. This study gives a better understanding of the (theoretical) limits of scalability and applicability for both termination and cost analysis.
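To make the notion of solving a cost relation concrete, here is an illustrative recurrence (my example, not one from the thesis) of the kind a cost analyzer produces for a loop doing linear work per iteration, together with its closed form:

```latex
C(0) = 1, \qquad C(n) = C(n-1) + n \quad (n > 0)
\;\Longrightarrow\; C(n) = 1 + \frac{n(n+1)}{2} \in \Theta(n^2)
```

Since this relation is exact, a solver can report $1 + n(n+1)/2$ as both the closed-form upper bound and the closed-form lower bound; imprecision arises when the relation is non-deterministic or has multiple arguments.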
Abstract:
Over the last decade, Grid computing paved the way for a new level of large scale distributed systems. This infrastructure made it possible to securely and reliably take advantage of widely separated computational resources that are part of several different organizations. Resources can be incorporated into the Grid, building a theoretical virtual supercomputer. In time, cloud computing emerged as a new type of large scale distributed system, inheriting and expanding the expertise and knowledge obtained so far. Some of the main characteristics of Grids naturally evolved into clouds, others were modified and adapted, and others were simply discarded or postponed. Regardless of these technical specifics, Grids and clouds together can be considered one of the most important advances in large scale distributed computing of the past ten years; however, this step in distributed computing has come along with a completely new level of complexity. Grid and cloud management mechanisms play a key role, and correct analysis and understanding of the system behavior are needed. Large scale distributed systems must be able to self-manage, incorporating autonomic features capable of controlling and optimizing all resources and services. Traditional distributed computing management mechanisms analyze each resource separately and adjust specific parameters of each one of them. When trying to adapt the same procedures to Grid and cloud computing, the vast complexity of these systems can make this task extremely complicated. But the complexity of large scale distributed systems may be only a matter of perspective. It could be possible to understand the Grid or cloud behavior as a single entity, instead of a set of resources. This abstraction could provide a different understanding of the system, describing large scale behavior and global events that probably would not be detected by analyzing each resource separately. In this work we define a theoretical framework that combines both ideas, multiple resources and single entity, to develop large scale distributed systems management techniques aimed at system performance optimization, increased dependability and Quality of Service (QoS). The resulting synergy could be the key to addressing the most important difficulties of Grid and cloud management.
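As a toy illustration of the "single entity" idea (my sketch, not the authors' framework): instead of tracking each resource's parameters separately, one can collapse per-resource metrics into a single global state vector and monitor system-level behavior through it.

```python
import numpy as np

def global_state(per_resource_metrics):
    """Collapse per-resource metrics (rows: resources; columns: e.g. CPU load,
    memory use, queue length) into one system-level state vector."""
    m = np.asarray(per_resource_metrics, float)
    # Means capture overall load; standard deviations capture imbalance
    # across resources, a global property invisible to per-resource tuning.
    return np.concatenate([m.mean(axis=0), m.std(axis=0)])

# Three resources with metrics (cpu, mem, queue): the grid/cloud is then
# described by one 6-component vector rather than three separate profiles.
metrics = [[0.7, 0.5, 12], [0.9, 0.6, 30], [0.2, 0.3, 2]]
print(global_state(metrics))
```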
Abstract:
In recent years, cities around the world have invested substantial amounts of money in measures to reduce congestion and car trips. These investments are potential responses to the well-known urban sprawl phenomenon, also called the "development trap", which leads to further congestion and a higher proportion of our time spent in slow-moving cars. In this search for solutions, the complex relationship between the urban environment and travel behaviour has been studied in a number of cases. The main question under discussion is: how can multi-stop tours be encouraged? The objective of this paper is therefore to verify whether unobserved factors influence tour complexity. For this purpose, we use a database from a survey conducted in 2006-2007 in Madrid, a suitable case study for analyzing urban sprawl due to new urban developments and substantial changes in mobility patterns in recent years. A total of 943 individuals were interviewed from 3 selected neighbourhoods (CBD, urban and suburban). We study the effect of unobserved factors on trip frequency. This paper presents the estimation of a hybrid model in which the latent variable is called propensity to travel and the discrete choice model comprises 5 tour-type alternatives. The results show that the characteristics of the neighbourhoods in Madrid are important in explaining trip frequency. The influence of land use variables on trip generation is clear, in particular the presence of commercial retail. Through the estimation of elasticities and forecasting, we determine to what extent land-use policy measures modify travel demand. Comparing aggregate elasticities with percentage variations shows that percentage variations can lead to inconsistent results. The results show that hybrid models explain travel behavior better than traditional discrete choice models.
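For context, a hybrid (integrated choice and latent variable) model of the kind described typically has the following generic structure; the symbols are illustrative, not the paper's notation:

```latex
\begin{aligned}
LV_n &= \gamma^{\top} z_n + \omega_n
  && \text{(structural eq.: latent propensity to travel)}\\
U_{jn} &= \beta^{\top} x_{jn} + \lambda_j\, LV_n + \varepsilon_{jn}
  && \text{(utility of tour type } j \text{ for individual } n\text{)}\\
P_n(j) &= \frac{e^{V_{jn}}}{\sum_{k=1}^{5} e^{V_{kn}}}
  && \text{(logit choice over the 5 tour-type alternatives)}
\end{aligned}
```

Here $V_{jn}$ is the systematic part of $U_{jn}$; the latent variable enters the utilities alongside the observed attributes, which is what lets unobserved factors (attitudes, propensity to travel) shape the predicted tour type.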
Abstract:
Several authors have analysed the changes in the probability density function of solar radiation at different time resolutions. Others have studied the significance of these changes when calculations of produced energy are attempted. We have applied different transformations to four Spanish databases in order to clarify the interrelationship between radiation models and produced energy estimations. Our contribution is straightforward: the complexity of the solar radiation model needed for yearly energy calculations is very low. Twelve monthly mean values of solar radiation are enough to estimate energy with errors below 3%. Time resolutions finer than hourly samples do not significantly improve the results of energy estimations.
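As a first-order illustration of why twelve monthly means can suffice (a standard yield estimate, my sketch rather than the paper's model), the yearly energy produced by a solar system can be approximated as

```latex
E_{\text{year}} \;\approx\; \sum_{m=1}^{12} \bar{H}_m \, N_m \, A \, \eta_{\text{sys}},
```

where $\bar{H}_m$ is the monthly mean daily irradiation, $N_m$ the number of days in month $m$, $A$ the collector area, and $\eta_{\text{sys}}$ the overall system efficiency; only the twelve $\bar{H}_m$ values depend on the radiation model.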
Abstract:
A unified low-complexity sign-bit-correlation-based symbol timing synchronization scheme for a Multiband Orthogonal Frequency Division Multiplexing (MB-OFDM) Ultra Wideband (UWB) receiver is proposed. Using the time-domain sequence of the packet/frame synchronization preamble, the proposed scheme detects the upcoming MB-OFDM symbol and estimates the exact boundary of the start of the Fast Fourier Transform (FFT) window. The proposed algorithm is implemented using an efficient hardware-software co-simulation methodology. The effectiveness of the proposed synchronization scheme and of the optimization criteria is confirmed by hardware implementation results.
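To illustrate the general idea of sign-bit correlation for symbol timing (a generic sketch, not the paper's exact architecture): quantizing both the received samples and the known preamble to their sign bits reduces the correlator's multipliers to sign flips, and the correlation peak marks the symbol boundary, i.e. the start of the FFT window.

```python
import numpy as np

def sign_bit_timing(rx, preamble):
    """Locate the symbol boundary by correlating sign bits only.

    Generic illustration: a real MB-OFDM receiver operates on the
    time-domain packet-sync preamble and adds band-hopping logic.
    """
    s_rx = np.where(np.real(rx) >= 0, 1, -1)        # 1-bit quantized received samples
    s_pr = np.where(np.real(preamble) >= 0, 1, -1)  # 1-bit quantized known preamble
    n = len(s_pr)
    # Sliding correlation; products of +/-1 need no multipliers in hardware.
    corr = np.array([np.dot(s_rx[i:i + n], s_pr)
                     for i in range(len(s_rx) - n + 1)])
    return int(np.argmax(corr))                      # estimated start of FFT window

# Toy check: bury the preamble at offset 37 in noise and recover the offset.
rng = np.random.default_rng(2)
pre = rng.standard_normal(128)
rx = np.concatenate([rng.standard_normal(37), pre, rng.standard_normal(64)])
print(sign_bit_timing(rx, pre))  # -> 37 (with high probability)
```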
Abstract:
Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz–Mancini–Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. Accuracies of 83.9% and 65.9% were reached in discriminating AD patients and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
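The sketch below shows one common way to compute these quantities from a normalized amplitude histogram of an MEG epoch; normalization conventions for the entropic index q and for the LMC complexity vary across papers, so this is an illustrative variant rather than the study's exact definitions.

```python
import numpy as np

def info_measures(p, q=2.0):
    """Entropies, disequilibrium and an LMC-style complexity of a distribution.

    p: histogram of signal amplitudes (nonnegative weights); q: entropic index
    for the Tsallis/Renyi entropies. Conventions vary across papers.
    """
    p = np.asarray(p, float)
    p = p / p.sum()                      # normalize to a probability distribution
    n = p.size
    nz = p[p > 0]
    shannon = -np.sum(nz * np.log(nz))
    tsallis = (1.0 - np.sum(p ** q)) / (q - 1.0)
    renyi = np.log(np.sum(p ** q)) / (1.0 - q)
    ed = np.sum((p - 1.0 / n) ** 2)      # Euclidean disequilibrium vs. uniform
    lmc = (shannon / np.log(n)) * ed     # statistical complexity: H_norm * D
    return dict(shannon=shannon, tsallis=tsallis, renyi=renyi, ed=ed, lmc=lmc)

# A peaked distribution is more "ordered": lower entropy, higher disequilibrium.
print(info_measures([8, 1, 1, 1, 1]))
print(info_measures([1, 1, 1, 1, 1]))   # uniform: max entropy, zero ED and LMC
```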
Abstract:
Today's motivation for autonomous systems research stems from the fact that networked environments have reached a level of complexity and heterogeneity that makes their control and management by human administrators alone more and more difficult. The optimisation of performance metrics for the air traffic management system, as in other networked systems, has become more complex with the increasing number of flights, capacity constraints, environmental factors and safety regulations. It is anticipated that a new structure of planning layers and the introduction of higher levels of automation will reduce complexity and optimise the performance metrics of the air traffic management system. This paper discusses the complexity of optimising air traffic management performance metrics and proposes a way forward based on higher levels of automation.
Abstract:
In informatics there is one kind of complexity that is perceived by everyone. It is the complexity of a concrete, isolated object, normally situated completely within one of the branches universally recognized by the scientific and technical community. Examples of this are the complexity of integrated electronic circuits, the complexity of algorithms and the complexity of software. The first complexity deals with the number of circuit components, the second with computation time and the third with the number of necessary mental discriminations. In order to illustrate my point, I will take up the last complexity, which, moreover, is the least well-known.
Abstract:
Proceedings.
Abstract:
Office automation is one of the fields where the complexity related to technologies and working environments can best be shown. This is the starting point we have chosen to build a theoretical model that shows us a scene quite different from the one traditionally considered. Through the development of the model, the levels of complexity associated with office automation and office environments have been identified, establishing a relationship between them. Thus, the model allows us to state a general principle for the sociotechnical design of office automation systems, comprising the ontological distinctions needed to properly evaluate each particular technology and its virtual contribution to office automation. From this follows the model's taxonomic ability to draw a global perspective of the state of the art in office automation technologies.
Abstract:
One medium-term strategy for helping in the management of complexity is the introduction of a conceptual complexity component at the very centre of university curricula. In very few areas is the growth of complexity as evident as in the information technologies (ITs), the focus of the work presented in the current paper. We have therefore developed an integrated way of tackling the specific field of information technologies by means of an approach to complexity. This paper describes the guidelines of our research effort, placing an emphasis on informatics. Concepts of complexity based on the system metaphor have been substantially drawn upon in this exercise and are thus presented in some detail. Also described is a didactic experiment conducted by the author and designed to provide a new and integrating approach to university curricula for future professionals. The students' "discovery" of complexity is the focal point of the experiment. The findings of this effort are encouraging and call for the continuation and expansion of the experiment.
Abstract:
The influence of CP content and ingredient complexity, feed form, and duration of feeding of the Phase I diets on growth performance and total tract apparent digestibility (TTAD) of energy and nutrients was studied in Iberian pigs weaned at 28 d of age. There were 12 dietary treatments with 2 types of feed (high-quality, HQ, and low-quality, LQ), 2 feed forms (pellets vs. mash), and 3 durations (7, 14, and 21 d) of supply of the Phase I diets.