927 results for "Paradigm of complexity"


Relevance: 100.00%

Abstract:

Three experiments investigated the effect of complexity on children's understanding of a beam balance. In nonconflict problems, weights or distances varied while the other factor was held constant. In conflict items, both weight and distance varied, and items were of three kinds: weight dominant, distance dominant, or balance (in which neither was dominant). In Experiment 1, 2-year-old children succeeded on nonconflict-weight and nonconflict-distance problems. This result was replicated in Experiment 2, but performance on conflict items did not exceed chance. In Experiment 3, 3- and 4-year-olds succeeded on all except conflict-balance problems, while 5- and 6-year-olds succeeded on all problem types. The results were interpreted in terms of relational complexity theory. Children aged 2 to 4 years succeeded on problems that entailed binary relations, but 5- and 6-year-olds also succeeded on problems that entailed ternary relations. Ternary-relations tasks from other domains, transitivity and class inclusion, accounted for 93% of the age-related variance in balance scale scores. (C) 2002 Elsevier Science (USA).
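The problem taxonomy above reduces to comparing the two weight × distance products (torques). A minimal sketch of that rule (illustrative only, not the authors' experimental materials):

```python
def balance_outcome(wl, dl, wr, dr):
    """Predict which side of a beam balance goes down from the two
    weight x distance products (torques)."""
    left, right = wl * dl, wr * dr
    if left > right:
        return "left"
    if right > left:
        return "right"
    return "balance"

# Nonconflict-weight item: distances equal, only weight varies, so a
# binary relation (compare the weights) suffices.
print(balance_outcome(3, 2, 1, 2))  # -> left

# Conflict item: weight favours the left, distance favours the right;
# the ternary relation (integrating weight and distance) is required.
print(balance_outcome(3, 1, 1, 4))  # -> right
```

The "balance" case is the one neither single cue can decide, which is why conflict-balance items were the last to be mastered.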


The introduction of electricity markets and the integration of Distributed Generation (DG) have been driving changes in the power system's structure. Recently, the smart grid concept has been introduced to guarantee a more efficient operation of the power system, using the advantages of this new paradigm. Basically, a smart grid is a structure that integrates different players, with constant communication between them, to improve power system operation and management. One player of particular importance in this context is the Virtual Power Player (VPP). In the transportation sector, the Electric Vehicle (EV) is arising as an alternative to conventional vehicles propelled by fossil fuels. The power system can benefit from a massive introduction of EVs, taking advantage of the EVs' ability to connect to the electric network to charge, and of the expected future ability of EVs to discharge to the network using Vehicle-to-Grid (V2G) capacity. This thesis proposes alternative strategies to control these two EV modes with the objective of enhancing the management of the power system. Moreover, the power system must ensure the trips of the EVs connected to the electric network: each EV user specifies the amount of energy that must be charged to cover the distance to be travelled. The introduction of EVs turns Energy Resource Management (ERM) in a smart grid environment into a complex problem that can take several minutes or hours to reach the optimal solution. Adequate optimization techniques are required to accommodate this kind of complexity while solving the ERM problem in a reasonable execution time. This thesis presents a tool that solves the ERM considering the intensive use of EVs in the smart grid context.
The objective is to obtain the minimum ERM cost, considering the operation cost of DG, the cost of energy acquired from external suppliers, EV users' payments and remuneration, and penalty costs. The tool is directed at VPPs that manage specific network areas where a high penetration level of EVs is expected. The ERM is solved using two methodologies: the adaptation of a deterministic technique proposed in a previous work, and the adaptation of the Simulated Annealing (SA) technique. To improve the SA performance for this case, three heuristics are additionally proposed, taking advantage of the particularities and specificities of an ERM with these characteristics. A set of case studies is presented in this thesis, considering a 32-bus distribution network and up to 3000 EVs. The first case study solves the scheduling without considering EVs, to be used as a reference case for comparison with the proposed approaches. The second case study evaluates the complexity of the ERM with the integration of EVs. The third case study evaluates the performance of scheduling with different control modes for EVs. These control modes, combined with the proposed SA approach and the developed heuristics, aim at improving the quality of the ERM while drastically reducing its execution time. The proposed control modes are: uncoordinated charging, smart charging, and V2G capability. The fourth and final case study presents the ERM approach applied to consecutive days.
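The SA technique mentioned above can be sketched generically. The schedule encoding, period prices and neighbour move below are hypothetical simplifications for illustration, not the thesis's actual ERM model:

```python
import math
import random

def simulated_annealing(cost, initial, neighbour,
                        t0=10.0, cooling=0.95, steps=2000, seed=42):
    """Generic SA skeleton: worse candidates are accepted with
    probability exp(-delta/T), which shrinks as T cools, letting the
    search escape local minima early and settle later."""
    rng = random.Random(seed)
    current, best = initial, initial
    t = t0
    for _ in range(steps):
        candidate = neighbour(current, rng)
        delta = cost(candidate) - cost(current)
        if delta < 0 or rng.random() < math.exp(-delta / t):
            current = candidate
            if cost(current) < cost(best):
                best = current
        t *= cooling
    return best

# Hypothetical toy objective: each of six EVs charges in one period,
# and each period has a fixed price per charge.
prices = [5.0, 1.0, 3.0, 0.5]
cost = lambda schedule: sum(prices[p] for p in schedule)

def neighbour(schedule, rng):
    # Move one randomly chosen EV to a random period.
    s = list(schedule)
    s[rng.randrange(len(s))] = rng.randrange(len(prices))
    return s

initial = [0] * 6  # all EVs start in the most expensive period
best = simulated_annealing(cost, initial, neighbour)
print(cost(best))  # optimum here is 6 x 0.5 = 3.0
```

In the real problem the cost function also carries network constraints and V2G discharging, which is where the problem-specific heuristics come in.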


Business organisations are excellent examples of what physics and mathematics designate "chaotic" systems. Because a culture of innovation will be vital for organisational survival in the 21st century, this paper proposes that viewing organisations in terms of "complexity theory" may assist leaders in fine-tuning managerial philosophies that provide orderly management emphasising stability within a culture of organised chaos, for it is on the "boundary of chaos" that the greatest creativity occurs. It is argued that 21st-century companies, as chaotic social systems, will no longer be effectively managed by rigid objectives (management by objectives, MBO) or by instructions (management by instructions, MBI). Their capacity for self-organisation will derive essentially from how their members accept a shared set of values or principles for action (management by values, MBV). Complexity theory deals with systems that show complex structures in time or space, often hiding simple deterministic rules. This theory holds that, once these rules are found, it is possible to make effective predictions and even to control the apparent complexity. The state of chaos that self-organises, thanks to the appearance of the "strange attractor", is the ideal basis for creativity and innovation in the company. In this self-organised state of chaos, members are not confined to narrow roles and gradually develop their capacity for differentiation and relationships, growing continuously toward their maximum potential contribution to the efficiency of the organisation. In this way, values act as organisers or "attractors" of disorder, which in chaos theory are equations represented by unusually regular geometric configurations that predict the long-term behaviour of complex systems. In business organisations (as in all kinds of social systems) the starting principles end up as the final principles in the long term. An attractor is a model representation of the behavioural results of a system. The attractor is not a force of attraction or a goal-oriented presence in the system; it simply depicts where the system is headed based on its rules of motion. Thus, in a culture that cultivates or shares values of autonomy, responsibility, independence, innovation, creativity, and proactivity, the risk of short-term chaos is mitigated by an overall long-term sense of direction. A more suitable approach to managing the internal and external complexities that organisations currently confront is to alter their dominant culture under the principles of MBV.
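The claim that complex behaviour can hide simple deterministic rules is commonly illustrated with the logistic map, a textbook example not drawn from the paper itself:

```python
def logistic_orbit(r, x0, n):
    """Iterate the logistic map x' = r * x * (1 - x)."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# At r = 4.0 the map is chaotic: two orbits started 1e-7 apart
# decorrelate within a few dozen iterations (sensitive dependence on
# initial conditions). At r = 3.2, by contrast, the same rule settles
# onto a stable 2-cycle.
a = logistic_orbit(4.0, 0.2, 30)
b = logistic_orbit(4.0, 0.2000001, 30)
print(max(abs(x - y) for x, y in zip(a, b)))  # far larger than 1e-7
```

One simple deterministic rule thus produces either order or chaos depending on a single parameter, which is the intuition the paper borrows for organisational cultures.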



Although tumor heterogeneity is widely accepted, the existence of cancer stem cells (CSCs) and their proposed role in tumor maintenance have always been challenged and remain a matter of debate. Recently, a path-breaking chapter was added to this saga when three independent groups reported the in vivo existence of CSCs in brain, skin and intestinal tumors using lineage tracing, thus strengthening the CSC concept, even though certain fundamental caveats are always associated with the lineage-tracing approach. In principle, the CSC hypothesis proposes that, similar to normal stem cells, CSCs maintain self-renewal and multilineage differentiation properties and are found at the central echelon of the cellular hierarchy present within tumors. However, these cells differ from their normal counterparts in maintaining their malignant potential, altered genomic integrity and epigenetic identity, and the expression of specific surface protein profiles. As CSCs are highly resistant to chemotherapeutics, they are thought to be a crucial factor in tumor relapse and superficially appear to be the ultimate therapeutic target. However, even that is not the end: further complication arises from reports of a bidirectional regeneration mechanism for CSCs, one from their self-renewal capability and another from the recently proposed concept of a dynamic equilibrium between CSCs and non-CSCs via their interconversion. This phenomenon has added a new layer of complexity to the understanding of tumor heterogeneity. In spite of its associated controversies, this area has rapidly emerged as a center of attention for researchers and clinicians because of the conceptual framework it provides for devising new therapies.


Maximum entropy modeling (Maxent) is a widely used algorithm for predicting species distributions across space and time. Properly assessing the uncertainty in such predictions is non-trivial and requires validation with independent datasets. Notably, model complexity (the number of model parameters) remains a major concern in relation to overfitting and, hence, the transferability of Maxent models. An emerging approach is to validate the cross-temporal transferability of model predictions using paleoecological data. In this study, we assess the effect of model complexity on the performance of Maxent projections across time, using two European plant species (Alnus glutinosa (L.) Gaertn. and Corylus avellana L.) with an extensive late Quaternary fossil record in Spain as a study case. We fit 110 models with different levels of complexity under present-time conditions and tested model performance using AUC (area under the receiver operating characteristic curve) and AICc (corrected Akaike Information Criterion) through the standard procedure of randomly partitioning current occurrence data. We then compared these results to an independent validation by projecting the models to mid-Holocene (6000 years before present) climatic conditions in Spain to assess their ability to predict fossil pollen presence-absence and abundance. We find that calibrating Maxent models with default settings results in the generation of overly complex models. While model performance increased with model complexity when predicting current distributions, it was highest at intermediate complexity when predicting mid-Holocene distributions. Hence, models of intermediate complexity offered the best trade-off for predicting species distributions across time. Reliable temporal model transferability is especially relevant for forecasting species distributions under future climate change.
Consequently, species-specific model tuning should be used to find the best modeling settings to control for complexity, notably with paleoecological data to independently validate model projections. For cross-temporal projections of species distributions for which paleoecological data is not available, models of intermediate complexity should be selected.
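The AICc used above penalises parameter count relative to sample size. A sketch with hypothetical log-likelihoods showing how an intermediate model can win the trade-off (the numbers are invented, not the study's results):

```python
def aicc(log_likelihood, k, n):
    """Corrected Akaike Information Criterion:
    AICc = 2k - 2 ln L + 2k(k + 1) / (n - k - 1)."""
    return 2 * k - 2 * log_likelihood + (2 * k * (k + 1)) / (n - k - 1)

# Hypothetical candidates: fit (log-likelihood) improves with the
# number of parameters k, but the small-sample penalty grows faster.
n = 50  # occurrence records
candidates = {
    "simple (k=2)":       aicc(-120.0, 2, n),
    "intermediate (k=6)": aicc(-105.0, 6, n),
    "complex (k=20)":     aicc(-100.0, 20, n),
}
best = min(candidates, key=candidates.get)
print(best)  # -> intermediate (k=6)
```

The correction term blows up as k approaches n, which is exactly the overfitting regime a default-settings Maxent calibration can drift into on small occurrence datasets.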


This paper analyses the effects of manipulating the cognitive complexity of L2 oral tasks on language production. It specifically focuses on self-repairs, which are taken as a measure of accuracy, since they denote both attention to form and an attempt at being accurate. By means of a repeated-measures design, 42 lower-intermediate students were asked to perform three different task types (a narrative task, an instruction-giving task, and a decision-making task), for each of which two degrees of cognitive complexity were established. The narrative task was manipulated along +/− Here-and-Now, the instruction-giving task along +/− elements, and the decision-making task along +/− reasoning demands. Repeated-measures ANOVAs are used to calculate differences between degrees of complexity and among task types. One-way ANOVAs are used to detect potential differences between low-proficiency and high-proficiency participants. Results show an overall effect of task complexity on self-repair behavior across task types, with different behaviors among the three task types. No differences in self-repair behavior are found between the low- and high-proficiency groups. Results are discussed in the light of theories of cognition and L2 performance (Robinson 2001a, 2001b, 2003, 2005, 2007), L1 and L2 language production models (Levelt 1989, 1993; Kormos 2000, 2006), and attention during L2 performance (Skehan 1998; Robinson 2002).
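The ANOVA comparisons rest on the F statistic, which can be computed directly. The self-repair counts below are hypothetical, for illustration only:

```python
def one_way_anova_f(groups):
    """One-way ANOVA F statistic: between-group mean square divided
    by within-group mean square."""
    all_obs = [x for g in groups for x in g]
    grand = sum(all_obs) / len(all_obs)
    k, n = len(groups), len(all_obs)
    means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand) ** 2 for g, m in zip(groups, means))
    ss_within = sum((x - m) ** 2 for g, m in zip(groups, means) for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical self-repair counts per learner under low vs high
# task complexity.
low = [2, 3, 2, 4, 3]
high = [5, 6, 4, 6, 5]
f = one_way_anova_f([low, high])
print(round(f, 2))  # -> 20.57
```

A large F (relative to the F distribution with k-1 and n-k degrees of freedom) indicates that the complexity manipulation shifts self-repair behaviour more than within-group noise would predict.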


Developing software is a difficult and error-prone activity, and the complexity of modern computer applications is significant. Hence, an organised approach to software construction is crucial. Stepwise Feature Introduction, created by R.-J. Back, is a development paradigm in which software is constructed by adding functionality in small increments. The resulting code has an organised, layered structure and can be easily reused. Moreover, interaction with the users of the software and correctness concerns are essential elements of the development process, contributing to the high quality and functionality of the final product. The paradigm of Stepwise Feature Introduction has been successfully applied in an academic environment to a number of small-scale developments. The thesis examines the paradigm and its suitability for the construction of large and complex software systems by focusing on the development of two software systems of significant complexity. Throughout the thesis we propose a number of improvements and modifications that should be applied to the paradigm when developing or reengineering large and complex software systems. The discussion in the thesis covers various aspects of software development that relate to Stepwise Feature Introduction. More specifically, we evaluate the paradigm against the common practices of object-oriented programming and design and of agile development methodologies. We also outline a strategy for testing systems built with the paradigm of Stepwise Feature Introduction.
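The layered structure can be sketched as each layer extending the one below while preserving the behaviour already introduced. The editor example and class names are illustrative, not taken from the thesis:

```python
class TextEditor:                 # layer 0: plain text insertion
    def __init__(self):
        self.text = ""
    def insert(self, s):
        self.text += s

class UndoEditor(TextEditor):     # layer 1: adds undo on top of layer 0
    def __init__(self):
        super().__init__()
        self._history = []
    def insert(self, s):
        self._history.append(self.text)  # snapshot before changing
        super().insert(s)
    def undo(self):
        if self._history:
            self.text = self._history.pop()

class SearchEditor(UndoEditor):   # layer 2: adds search, reusing layers below
    def find(self, s):
        return self.text.find(s)

ed = SearchEditor()
ed.insert("hello ")
ed.insert("world")
ed.undo()                         # layer-1 behaviour still intact
print(ed.text)                    # -> "hello "
print(ed.find("he"))              # -> 0
```

Each layer is independently testable, and lower layers remain usable on their own, which is what gives the incremental structure its reuse value.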


The main purpose of the present doctoral thesis is to investigate subjective experiences and cognitive processes in four different types of altered states of consciousness: naturally occurring dreaming, cognitively induced hypnosis, pharmacologically induced sedation, and pathological psychosis. Both empirical and theoretical research is carried out, resulting in four empirical and four theoretical studies. The thesis begins with a review of the main concepts used in consciousness research, the most influential philosophical and neurobiological theories of subjective experience, the classification of altered states of consciousness, and the main empirical methods used to study consciousness alterations. Next, findings of the original studies are discussed, as follows. Phenomenal consciousness is found to be dissociable from responsiveness, as subjective experiences do occur in unresponsive states, including anaesthetic-induced sedation and natural sleep, as demonstrated by post-awakening subjective reports. Two new tools for the content analysis of subjective experiences and dreams are presented, focusing on the diversity, complexity and dynamics of phenomenal consciousness. In addition, a new experimental paradigm of serial awakenings from non-rapid eye movement sleep is introduced, which enables more rapid sampling of dream reports than has been available in previous studies. It is also suggested that lucid dreaming can be studied using transcranial brain stimulation techniques and systematic analysis of pre-lucid dreaming. For blind judges, dreams of psychotic patients appear to be indistinguishable from waking mentation reports collected from the same patients, which indicates a close resemblance of these states of mind. However, despite phenomenological similarities, dreaming should not be treated as a uniform research model of psychotic or intact consciousness. 
Contrary to this, there seems to be a multiplicity of routes by which different states of consciousness can be associated. For instance, seemingly identical time-perception distortions in different alterations of consciousness may have diverse underlying causes. It is also shown that altered states do not necessarily exhibit impaired cognitive processing compared to a baseline waking state of consciousness: a case study of time perception in a hypnotic virtuoso indicates more consistent perceptual timing under hypnosis than in the waking state. The thesis ends with a brief discussion of the most promising new perspectives for the study of alterations of consciousness.


In this paper, a time-series complexity analysis of dense-array electroencephalogram (EEG) signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to the purely stochastic realm. The present analysis is conducted with the objective of gaining insight into complexity variations related to changing brain dynamics, for EEG recorded in three conditions: a passive, eyes-closed condition; a mental arithmetic task; and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited to short physiological signals such as the EEG, and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue-inducing exercise period. This enhances its utility in detecting subtle changes in brain state, with wider scope for applications in EEG-based brain studies.
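SampEn itself is straightforward to state. A minimal sketch (the tolerance r is treated here as an absolute value; in practice it is often set to 0.2 times the signal's standard deviation):

```python
import math
import random

def sample_entropy(signal, m=2, r=0.2):
    """SampEn = -ln(A/B): B counts pairs of length-m templates within
    Chebyshev distance r; A counts pairs still within r when the
    templates are extended to length m+1. Self-matches are excluded.
    Lower values indicate a more regular signal."""
    n = len(signal)

    def matches(length):
        t = [signal[i:i + length] for i in range(n - length + 1)]
        return sum(
            1
            for i in range(len(t))
            for j in range(i + 1, len(t))
            if max(abs(p - q) for p, q in zip(t[i], t[j])) <= r
        )

    b, a = matches(m), matches(m + 1)
    if a == 0 or b == 0:
        return float("inf")  # no matches at this tolerance
    return -math.log(a / b)

# A strictly periodic signal is highly regular (SampEn near 0),
# while seeded uniform noise is far less predictable.
periodic = [0.0, 1.0] * 50
rng = random.Random(0)
noise = [rng.random() for _ in range(100)]
print(sample_entropy(periodic) < sample_entropy(noise))  # -> True
```

The quadratic pairwise loop is fine for the short epochs typical of EEG analysis; longer recordings usually call for a faster matching scheme.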


With the rapid development of technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It would therefore seem important to develop and extend the understanding of complexity so that industry in general, and the construction industry in particular, can work with greater accuracy and efficiency to provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement, in order to assess its influence upon the accuracy of the quantity surveying profession in UK new-build office construction. Quantitative data came from an analysis of twenty projects of varying size and value, and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty of defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subject to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity, in order to improve the response of quantity surveyors, so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.


The problem of complexity is particularly relevant to the field of control engineering, since many engineering problems are inherently complex. The inherent complexity is such that straightforward computational solutions often produce very poor results. Although parallel processing can alleviate the problem to some extent, it is artificial neural networks (in various forms) that have recently proved particularly effective, even in dealing with the causes of the problem itself. This paper presents an overview of current neural network research that aims to solve the complex problems found in many areas of science and engineering today.


Complexity is integral to planning today. Everyone and everything seem to be interconnected, causality appears ambiguous, unintended consequences are ubiquitous, and information overload is a constant challenge. The nature of complexity, the consequences of it for society, and the ways in which one might confront it, understand it and deal with it in order to allow for the possibility of planning, are issues increasingly demanding analytical attention. One theoretical framework that can potentially assist planners in this regard is Luhmann's theory of autopoiesis. This article uses insights from Luhmann's ideas to understand the nature of complexity and its reduction, thereby redefining issues in planning, and explores the ways in which management of these issues might be observed in actual planning practice via a reinterpreted case study of the People's Planning Campaign in Kerala, India. Overall, this reinterpretation leads to a different understanding of the scope of planning and planning practice, telling a story about complexity and systemic response. It allows the reinterpretation of otherwise familiar phenomena, both highlighting the empirical relevance of the theory and providing new and original insight into particular dynamics of the case study. This not only provides a greater understanding of the dynamics of complexity, but also produces advice to help planners implement structures and processes that can cope with complexity in practice.


This study investigates effects of syntactic complexity, operationalised in terms of movement, intervention and (NP) feature similarity, on the development of A' dependencies in 4-, 6-, and 8-year-old typically developing (TD) French children and children with Autism Spectrum Disorders (ASD). Children completed an off-line comprehension task testing eight syntactic structures classified in four levels of complexity: Level 0, no movement; Level 1, movement without (configurational) intervention; Level 2, movement with intervention from an element which is maximally different or featurally 'disjoint' (mismatched in both lexical NP restriction and number); Level 3, movement with intervention from an element similar in one feature or featurally 'intersecting' (matched in lexical NP restriction, mismatched in number). The results show that syntactic complexity affects TD children across the three age groups, but also indicate developmental differences between these groups. Movement affected all three groups in a similar way, but intervention effects in intersection cases were stronger in younger than in older children, with NP feature similarity affecting only the 4-year-olds. Complexity effects created by the similarity in lexical restriction of an intervener thus appear to be overcome early in development, arguably thanks to other differences of this intervener (which was mismatched in number). Children with ASD performed less well than the TD children, although they were matched on non-verbal reasoning. Overall, syntactic complexity affected their performance in a similar way as their TD controls, but their performance correlated with non-verbal abilities rather than age, suggesting that their grammatical development does not follow the smooth relation to age found in TD children.


This article presents, from the perspective of Complexity Theory, the characteristics of the learning process of Spanish as a foreign language in Teletandem. Data were collected from two pairs of Portuguese-Spanish interagents who were engaged in systematic and regular interaction based on the tandem principles. It was found that the learning experience develops with the peculiarities that arise from the context, the agents and their nuances, which revealed the presence of a shallow space between the systems of the native and foreign languages.