860 results for Complexity economics
Abstract:
As a nation we have gained world recognition for our ability to utilize our resources. In forestry our greatest accomplishments have been in the mechanization of harvest methods and in improvements in forest products. The renewal of this resource has been our greatest neglect. Though the end of the 19th century marked the beginning of the conservation movement, it was not until a half century later that the force of economics, through the demands of a growing population, made forest re-establishment more than just a desire. Conservation in itself is a Utopian concept which requires other motivating forces to make it a reality. In the post-war years, and as late as the early 1950s, stocked land in the Pacific Northwest could be purchased for less than the cost of planting; the economic incentive was lacking. Only with sustained yield management and increased land values was there a balance in favor of true values. With greater effort placed on forest regeneration there was an increased need for methods of reducing losses to wildlife. The history of forest wildlife damage research, therefore, parallels that of forest land management; after rather austere beginnings, development became predominantly a response to economics. It was not until 1950 that the full time of one scientist was assigned to this important activity. The development of control methods for forest animal damage is a relatively new area of research. All animal life is dependent upon plants for its existence; forest wildlife is no exception. The removal of seed and foliage of undesirable plants often benefits land managers; only when the losses or injuries are in conflict with man's interests is there damage involved. Unfortunately, the feeding activities of wildlife and the interests of land managers are often in conflict. Few realize the breadth, scope, and subtleties associated with forest wildlife damage problems. There are not only numerous species of animals involved, but also a myriad of conditions, each combination possessing unique facets. It is a foregone conclusion that an understanding of the conditions is essential to facilitate a solution to any given problem. Though there are numerous methods of reducing animal damage, all of which have application in some situations, in this discussion emphasis will be placed on the role of chemicals and on western problems. Because of the breadth and complexity of the problem, generalizing is necessary and only brief coverage will be possible. However, an attempt will be made to discuss the use and limitations of various control methods.
Abstract:
This paper addresses the functional reliability and the complexity of reconfigurable antennas using graph models. The correlation between complexity and reliability for any given reconfigurable antenna is defined. Two methods are proposed to reduce failures and improve the reliability of reconfigurable antennas, where the failures are caused by the reconfiguration technique or by the surrounding environment. The proposed failure-reduction methods are tested, and examples verifying them are given.
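The abstract gives no formulas, so the following is only a minimal Python sketch of the general idea, under assumed proxies: the antenna is modeled as a graph whose edges are switched connections, complexity grows with the size of the graph, and reliability decays with the number of switches that can fail. All names and numbers are hypothetical.

import networkx as nx

# Hypothetical graph model of a reconfigurable antenna: vertices are
# antenna parts, edges are switched connections between them.
antenna = nx.Graph()
antenna.add_edges_from([
    ("patch1", "patch2"),   # switch S1
    ("patch2", "patch3"),   # switch S2
    ("patch1", "patch3"),   # switch S3, assumed redundant here
])

def complexity(g):
    # Assumed proxy: complexity grows with the number of parts and switches.
    return g.number_of_nodes() + g.number_of_edges()

def reliability(g, p_switch=0.95):
    # Assumed proxy: probability that no switch fails, so reliability
    # decays as switches (edges) are added.
    return p_switch ** g.number_of_edges()

print(complexity(antenna), reliability(antenna))

# One failure-reduction strategy in the spirit of the abstract: remove
# switches that do not change the set of reachable configurations.
pruned = antenna.copy()
pruned.remove_edge("patch1", "patch3")
print(complexity(pruned), reliability(pruned))  # lower complexity, higher reliability

Under these proxies the inverse correlation between complexity and reliability falls out directly; the paper's actual definitions may differ.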
Abstract:
In this article, we introduce two new variants of the Assembly Line Worker Assignment and Balancing Problem (ALWABP) that allow parallelization of, and collaboration between, heterogeneous workers. These new approaches introduce an additional level of complexity into the line design and assignment process, but also offer greater flexibility, which may be particularly useful in practical situations where the aim is to progressively integrate slow or limited workers into conventional assembly lines. We present linear models and heuristic procedures for these two new problems. Computational results show the efficiency of the proposed approaches and the efficacy of the studied layouts in different situations. (C) 2012 Elsevier B.V. All rights reserved.
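The linear models and heuristics themselves are not reproduced in the abstract; the toy Python sketch below only illustrates the heterogeneous-worker setting the ALWABP variants build on, with invented task times and a brute-force search standing in for the paper's procedures.

import itertools

# task -> {worker: execution time}; None marks a task the worker cannot do.
task_time = {
    "t1": {"w1": 4, "w2": 6},
    "t2": {"w1": 3, "w2": 5},
    "t3": {"w1": None, "w2": 7},  # w1 is a "limited" worker for t3
}
workers = ["w1", "w2"]

def cycle_time(assignment):
    # Cycle time = load of the busiest worker; infeasible assignments -> inf.
    load = {}
    for task, worker in assignment.items():
        t = task_time[task].get(worker)
        if t is None:
            return float("inf")
        load[worker] = load.get(worker, 0) + t
    return max(load.values())

best = min(
    (dict(zip(task_time, combo))
     for combo in itertools.product(workers, repeat=len(task_time))),
    key=cycle_time,
)
print(best, cycle_time(best))

Parallelization and collaboration, the paper's two new variants, would relax the one-worker-per-task assumption encoded in cycle_time above.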
Abstract:
Methods from statistical physics, such as those involving complex networks, have been increasingly used in the quantitative analysis of linguistic phenomena. In this paper, we represented pieces of text with different levels of simplification as co-occurrence networks and found that topological regularity correlated negatively with textual complexity. Furthermore, in less complex texts the distance between concepts, represented as nodes, tended to decrease. The complex network metrics were treated with multivariate pattern recognition techniques, which allowed us to distinguish between original texts and their simplified versions. For each original text, two simplified versions were generated manually with an increasing number of simplification operations. As expected, distinction was easier for the strongly simplified versions, where the most relevant metrics were node strength, shortest paths, and diversity. Also, the discrimination of complex texts was improved with higher-level hierarchical network metrics, thus pointing to the usefulness of considering wider contexts around the concepts. Though the accuracy in the distinction was not as high as in methods using deep linguistic knowledge, the complex network approach is still useful for rapid screening of texts whenever assessing complexity is essential to guarantee accessibility to readers with limited reading ability. Copyright (c) EPLA, 2012
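As a rough illustration of the representation described above (not the paper's exact preprocessing), the Python sketch below links adjacent words of a text into a weighted co-occurrence network and extracts two of the metrics mentioned, node strength and shortest path length.

import networkx as nx

def cooccurrence_network(text):
    words = text.lower().split()
    g = nx.Graph()
    for a, b in zip(words, words[1:]):  # adjacent words co-occur
        w = g.get_edge_data(a, b, default={"weight": 0})["weight"]
        g.add_edge(a, b, weight=w + 1)
    return g

g = cooccurrence_network(
    "the complex text was simplified and the simple text was read")
strength = {n: sum(d["weight"] for _, _, d in g.edges(n, data=True)) for n in g}
print(strength)
print(nx.average_shortest_path_length(g))  # tends to shrink in simpler texts

Comparing such metrics between an original text and its simplified versions is the basic measurement the classifiers in the paper operate on.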
Abstract:
The intention of this paper is to present some Aristotelian arguments regarding motion in the local terrestrial region. Because the explanation is highly sophisticated and complex, we deal briefly with the principles and causes that grounded the theoretical sciences in general and physics in particular. The article is subdivided into eight topics in order to facilitate the understanding of these concepts for readers not familiar with the Aristotelian texts. To avoid a naive, anachronistic, and linear view, the citations are drawn from primary sources or from commentators on Aristotle's works.
Abstract:
Complexity in time series is an intriguing feature of living dynamical systems, with potential use for the identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on a generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the parameter q (qSampEn). qSDiff curves, which consist of the differences between the qSampEn of the original and surrogate series, were then calculated. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series from stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff(max)) at q not equal to 1. The values of q at which the maximum occurs and at which qSDiff is zero were also evaluated. Only the qSDiff(max) values were capable of distinguishing the HRV groups (p-values 5.10 x 10(-3), 1.11 x 10(-7), and 5.50 x 10(-7) for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity, and they suggest a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
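The abstract does not spell out the generalization; a plausible minimal sketch in Python, assuming qSampEn replaces the natural logarithm of classical SampEn with the Tsallis q-logarithm and that surrogates are random shufflings, is given below. The data, tolerance, and shuffling surrogate are all illustrative assumptions.

import numpy as np

def sampen_ratio(x, m=2, r=0.2):
    # A/B ratio of classical sample entropy (Chebyshev distance,
    # tolerance r times the standard deviation of the series).
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None] - templ[None, :]), axis=2)
        return (np.sum(d <= tol) - len(templ)) / 2  # exclude self-matches
    return count(m + 1) / count(m)

def q_log(x, q):
    # Tsallis q-logarithm; recovers ln(x) as q -> 1.
    return np.log(x) if q == 1 else (x ** (1 - q) - 1) / (1 - q)

def q_sampen(x, q, m=2, r=0.2):
    return -q_log(sampen_ratio(x, m, r), q)

rng = np.random.default_rng(0)
rr = 0.8 + 0.01 * np.cumsum(rng.normal(size=500))  # toy heart period series
surrogate = rng.permutation(rr)                    # shuffled surrogate
qs = np.linspace(0.2, 3.0, 15)
q_sdiff = [q_sampen(rr, q) - q_sampen(surrogate, q) for q in qs]
print(qs[int(np.argmax(np.abs(q_sdiff)))])  # q where |qSDiff| peaks (qSDiff_max)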
Abstract:
In the past decades, efforts at quantifying systems complexity with a general tool have usually relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann-Gibbs-Shannon entropy or one of its extensions. In recent years, however, there have been attempts to quantify algorithmic complexities in quantum systems based on Kolmogorov algorithmic complexity, obtaining results that diverge from the classical approach. Therefore, a complexity measure is proposed here using the quantum information formalism, taking advantage of the generality of the classical-based complexities and capable of expressing the complexity of these systems in a framework other than the algorithmic one. To do so, the Shiner-Davison-Landsberg (SDL) complexity framework is considered jointly with the linear entropy of the density operators representing the analyzed systems, along with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.
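As a minimal numerical sketch of the ingredients named above (assuming the simplest SDL form, with disorder given by the normalized linear entropy, and with Werner states standing in for the family of maximally entangled mixed states, which the abstract does not specify):

import numpy as np

d = 4  # two qubits
bell = np.zeros((d, 1))
bell[0, 0] = bell[3, 0] = 1 / np.sqrt(2)  # |Phi+> Bell state
proj = bell @ bell.T

def werner(p):
    # Werner state: a mixture of a Bell projector and white noise.
    return p * proj + (1 - p) * np.eye(d) / d

def linear_entropy(rho):
    # Normalized linear entropy S_L = d/(d-1) * (1 - Tr rho^2), in [0, 1].
    return d / (d - 1) * (1 - np.trace(rho @ rho).real)

def sdl_complexity(rho):
    # SDL complexity with disorder Delta = S_L and exponents alpha = beta = 1,
    # scaled to peak at 1: Gamma = 4 * Delta * (1 - Delta).
    delta = linear_entropy(rho)
    return 4 * delta * (1 - delta)

for p in (0.0, 0.5, 1.0):
    rho = werner(p)
    print(p, round(linear_entropy(rho), 3), round(sdl_complexity(rho), 3))

Both extremes vanish, as SDL-type measures require: the pure Bell state (p = 1, fully ordered) and the maximally mixed state (p = 0, fully disordered) have zero complexity, while intermediate mixedness maximizes it. The tangle would enter when relating this complexity to the amount of entanglement along the family.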
Abstract:
With financial market globalization, foreign investments became vital for economies, mainly in emerging countries. In the last decades, Brazilian exchange rates have been a good indicator for measuring either investors' confidence or risk aversion. Here, some events of global or national financial crisis are analyzed in an attempt to understand how they influenced the evolution of the dollar-real exchange rate. The theoretical tool used is the Lopez-Mancini-Calbet (LMC) complexity measure, which, applied to real exchange rate data, has shown good correspondence between critical events and measured patterns. (C) 2011 Elsevier B.V. All rights reserved.
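The LMC measure itself is standard: C = H * D, the product of a normalized Shannon entropy H and the disequilibrium D, the distance of the observed distribution from the uniform one. Below is a minimal Python sketch applied to a toy exchange-rate path (the binning and the log-return preprocessing are illustrative choices, not necessarily those of the paper).

import numpy as np

def lmc_complexity(series, n_bins=20):
    # C = H * D over the histogram of log-returns of the series.
    returns = np.diff(np.log(series))
    p, _ = np.histogram(returns, bins=n_bins)
    p = p / p.sum()
    nz = p[p > 0]
    h = -np.sum(nz * np.log(nz)) / np.log(n_bins)  # entropy, normalized to [0, 1]
    d = np.sum((p - 1.0 / n_bins) ** 2)            # disequilibrium
    return h * d

rng = np.random.default_rng(1)
rate = 2.0 * np.exp(np.cumsum(rng.normal(0, 0.01, 1000)))  # toy dollar-real path
print(lmc_complexity(rate))

In a crisis study such as the one above, this quantity would be tracked over sliding windows of the rate, looking for pattern changes around critical events.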
Abstract:
Birds that remove ectoparasites and other food material from their hosts are iconic illustrations of mutualistic-commensalistic cleaning associations. To assess the complex pattern of food resource use embedded in cleaning interactions of an assemblage of birds and their herbivorous mammal hosts in open habitats in Brazil, we used a network approach that characterized their patterns of association. Cleaning interactions showed a distinctly nested pattern, related to the number of interactions of cleaners and hosts and to the range of food types that each host species provided. Hosts that provided a wide range of food types (flies, ticks, tissue and blood, and organic debris) were attended by more species of cleaners and formed the core of the web. On the other hand, core cleaner species did not exploit the full range of available food resources, but used a variety of host species to exploit these resources instead. The structure that we found indicates that cleaners rely on cleaning interactions to obtain food types that would not be available otherwise (e.g., blood-engorged ticks or horseflies, wounded tissue). Additionally, a nested organization for the cleaner bird mammalian herbivore association means that both generalist and selective species take part in the interactions and that partners of selective species form an ordered subset of the partners of generalist species. The availability of predictable protein-rich food sources for birds provided by cleaning interactions may lead to an evolutionary pathway favoring their increased use by birds that forage opportunistically. Received 30 June 2011, accepted 10 November 2011.
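A nested pattern of this kind is commonly quantified with the NODF metric; the Python sketch below computes it for a made-up cleaner-host incidence matrix (the paper's data and exact analysis are not reproduced).

import numpy as np
from itertools import combinations

# Made-up binary incidence matrix: rows = host species, columns = cleaner
# bird species, 1 = interaction observed, rows/columns sorted by fill.
M = np.array([
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
])

def nodf(m):
    # NODF (Almeida-Neto et al. 2008): average paired overlap between all
    # pairs of rows and of columns, 0-100 (100 = perfectly nested).
    def paired(vectors):
        scores = []
        for a, b in combinations(vectors, 2):
            fa, fb = a.sum(), b.sum()
            scores.append(0.0 if fa <= fb else 100.0 * (a & b).sum() / fb)
        return scores
    rows = paired(list(m))
    cols = paired(list(m.T))
    return sum(rows + cols) / (len(rows) + len(cols))

print(nodf(M))  # this toy matrix is perfectly nested -> 100.0

In such a matrix, the partners of selective species (bottom rows, right columns) form ordered subsets of the partners of generalists, which is exactly the structure the abstract reports.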
Abstract:
Increasing age is associated with a reduction in overall heart rate variability as well as changes in the complexity of physiologic dynamics. The aim of this study was to verify whether the alterations in autonomic modulation of heart rate caused by the aging process could be detected by Shannon entropy (SE), conditional entropy (CE), and symbolic analysis (SA). Complexity analysis was carried out in 44 healthy subjects divided into two groups: old (n = 23, 63 ± 3 years) and young (n = 21, 23 ± 2 years). SE, CE [complexity index (CI) and normalized CI (NCI)], and SA (0V, 1V, 2LV, and 2ULV patterns) were analyzed over short heart period series (200 cardiac beats) derived from ECG recordings during 15 min of rest in a supine position. The sequences characterized by three heart periods with no significant variations (0V) and those with two significant unlike variations (2ULV) reflect changes in sympathetic and vagal modulation, respectively. The unpaired t test (or Mann-Whitney rank sum test when appropriate) was used in the statistical analysis. In the aging process, the distributions of patterns (SE) remain similar to those of young subjects. However, the regularity is significantly different: the patterns are more repetitive in the old group (a decrease of CI and NCI). The proportions of pattern types also differ: 0V is increased while 2LV and 2ULV are reduced in the old group. These differences indicate a marked change of autonomic regulation. CE and SA are feasible techniques to detect alterations in the autonomic control of heart rate in the old group.
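A minimal Python sketch of the symbolic analysis described above (the quantization into six uniform levels and three-beat words follows the usual Porta-style scheme; the thresholds and toy data are illustrative):

import numpy as np
from collections import Counter

def symbolic_analysis(rr, levels=6):
    # Quantize the heart period series into `levels` uniform bins, build
    # 3-beat words, and classify them as 0V / 1V / 2LV / 2ULV.
    rr = np.asarray(rr, dtype=float)
    edges = np.linspace(rr.min(), rr.max(), levels + 1)
    sym = np.clip(np.digitize(rr, edges) - 1, 0, levels - 1)
    counts = Counter()
    for s1, s2, s3 in zip(sym, sym[1:], sym[2:]):
        d1, d2 = s2 - s1, s3 - s2
        if d1 == 0 and d2 == 0:
            counts["0V"] += 1      # no variations (sympathetic marker)
        elif d1 == 0 or d2 == 0:
            counts["1V"] += 1      # one variation
        elif d1 * d2 > 0:
            counts["2LV"] += 1     # two like variations
        else:
            counts["2ULV"] += 1    # two unlike variations (vagal marker)
    total = sum(counts.values())
    return {k: 100.0 * v / total for k, v in counts.items()}

rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * rng.standard_normal(200)  # toy series of 200 heart periods
print(symbolic_analysis(rr))

The group difference reported above would then show up as a higher 0V percentage and lower 2LV/2ULV percentages in the older subjects.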
Abstract:
We used the statistical measures of information entropy, disequilibrium, and complexity to infer a hierarchy of equations of state for two types of compact stars from the broad class of neutron stars, namely those with hadronic composition and those with strange quark composition. Our results show that, since order costs energy, Nature would favor the exotic strange stars, even though the question of how strange stars form cannot be answered within this approach. (C) 2012 Elsevier B.V. All rights reserved.
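The abstract does not restate the definitions; in their standard discrete form (a sketch, with p_i the probability of configuration i out of N), the three measures combine as in the LMC-type construction:

\[
H = -\sum_{i=1}^{N} p_i \ln p_i , \qquad
D = \sum_{i=1}^{N} \left( p_i - \frac{1}{N} \right)^{2} , \qquad
C = H \cdot D .
\]

The entropy H vanishes for perfect order and the disequilibrium D vanishes for perfect disorder, so C is small at both extremes; the "order costs energy" argument above ranks the equations of state through measures of this kind.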
Abstract:
This article develops an ecological economic interpretation of the Jevons effect. Moreover, it is argued that under the neoclassical paradigm there are no elements with which to foresee the long-term existence of this phenomenon. The objective of these arguments is to demonstrate that the Jevons effect can be used to compare the ability of neoclassical and ecological economics to describe the social appropriation of nature. This is elaborated in two steps. First, we show the importance of the thesis that the economy cannot be cut off from the biophysical materiality of what is produced in giving consistency to the so-called Khazzoom-Brookes postulate. It is made clear that this supposition is exogenous to the neoclassical paradigm. Second, the supposition of the biophysical materiality of what is produced is used to make an ecological economic interpretation of the Jevons effect. Afterwards, a comparison is made between the neoclassical and the ecological economic perspectives. This comparison leads to the following conclusions: (i) the persistent presence of the Jevons effect in the long run is an anomaly in the neoclassical paradigm; (ii) the observation of the non-existence of the Jevons effect would be a refutation of the supposition that economic growth and biophysical materiality are not separable, a central thesis defended by ecological economists. This situation makes it possible to use the Jevons effect as a 'laboratory test' to compare the ability of the neoclassical and ecological economic paradigms to describe the social appropriation of nature. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
Background: In normal aging, a decrease in the syntactic complexity of written production is usually associated with cognitive deficits. This study aimed to analyze the quality of older adults' textual production, as indicated by verbal fluency (number of words) and grammatical complexity (number of ideas), in relation to gender, age, schooling, and cognitive status. Methods: From a probabilistic sample of community-dwelling people aged 65 years and above (n = 900), 577 were selected on the basis of their responses to the Mini-Mental State Examination (MMSE) sentence-writing item, which were submitted to content analysis; 323 were excluded because they left the item blank or produced illegible or meaningless responses. Education-adjusted cut-off scores for the MMSE were used to classify the participants as cognitively impaired or unimpaired. Total and subdomain MMSE scores were computed. Results: 40.56% of the participants whose answers to the MMSE sentence were excluded from the analyses had cognitive impairment, compared to 13.86% among those whose answers were included. The excluded participants were older and less educated. Women and those older than 80 years had the lowest MMSE scores. There was no statistically significant relationship between gender, age, or schooling and textual performance. There was a modest but significant correlation between the number of words written and the scores in the Language subdomain. Conclusions: The results suggest a strong influence of schooling and age on MMSE sentence performance. Failing to write a sentence may suggest cognitive impairment; yet the instructions for the MMSE sentence, i.e. to produce a simple sentence, may limit its clinical interpretation.
Abstract:
This paper presents a new parallel methodology for calculating the determinant of matrices of order n, with computational complexity O(n), using the Gauss-Jordan elimination method and Chio's rule as references. We present our step-by-step methodology in clear mathematical language, demonstrating how to calculate the determinant of a matrix of order n in an analytical format. We also present a computational model with one sequential algorithm and one parallel algorithm, both given in pseudo-code.
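The paper's parallel pseudo-code is not reproduced in the abstract; the sequential Python sketch below shows only the underlying Chio condensation step, in which each pass shrinks an n x n matrix to order n - 1 via 2 x 2 minors against the pivot.

import numpy as np

def det_chio(a):
    # Chio's rule: det(A) = det(B) / a00**(n-2), where B is the
    # (n-1) x (n-1) matrix of minors b[i, j] = a00*a[i+1, j+1]
    # - a[i+1, 0]*a[0, j+1]. A row swap (sign flip) handles a zero pivot.
    a = np.array(a, dtype=float)
    n = a.shape[0]
    if n == 1:
        return a[0, 0]
    if a[0, 0] == 0:
        rows = np.nonzero(a[:, 0])[0]
        if rows.size == 0:
            return 0.0                     # first column all zero -> singular
        a[[0, rows[0]]] = a[[rows[0], 0]]  # swap in a nonzero pivot
        return -det_chio(a)
    b = a[0, 0] * a[1:, 1:] - np.outer(a[1:, 0], a[0, 1:])
    return det_chio(b) / a[0, 0] ** (n - 2)

m = [[2, 1, 3], [0, 4, 1], [5, 2, 2]]
print(det_chio(m), np.linalg.det(m))  # both give -43

All (n - 1)^2 minors of one condensation step are independent of each other, which is what a parallel version can exploit; the O(n) bound claimed above refers to the parallel methodology, not to this sequential sketch.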