943 results for Complex systems prediction


Relevance: 80.00%

Abstract:

Aims - A National Screening Programme for diabetic eye disease in the UK is in development. We propose a grading and early disease management protocol to detect sight-threatening diabetic retinopathy and any retinopathy, which will allow precise quality assurance at all steps while minimizing false-positive referral to the hospital eye service. Methods - Structured expert panel discussions between 2000 and 2002, with review of existing evidence and grading classifications. Proposals - Principles of the protocol include: separate grading of retinopathy and maculopathy; a minimum number of steps; compatibility with central monitoring; expandability for established, more complex systems and for research; no lesion counting; no 'questionable' lesions; an attempt to detect focal exudative, diffuse and ischaemic maculopathy; and fast-track referral from primary or secondary graders. Sight-threatening diabetic retinopathy is defined as preproliferative retinopathy or worse, sight-threatening maculopathy and/or the presence of photocoagulation. In the centrally reported minimum data set, retinopathy is graded into four levels: none (R0), background (R1), preproliferative (R2) and proliferative (R3). Maculopathy and photocoagulation are graded as absent (M0, P0) or present (M1, P1). Discussion - The protocol developed by the Diabetic Retinopathy Grading and Disease Management Working Party represents a new consensus upon which national guidelines can be based, leading to the introduction of quality-assured screening for people with diabetes.
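
The referral rule above reduces to a simple predicate on the three grades. A minimal illustrative sketch in Python, with hypothetical type and function names (the protocol defines only the grades, not any code):

```python
# Illustrative encoding of the grading protocol described above.
from dataclasses import dataclass

@dataclass
class Grading:
    retinopathy: int       # R0 none, R1 background, R2 preproliferative, R3 proliferative
    maculopathy: int       # M0 absent, M1 present
    photocoagulation: int  # P0 absent, P1 present

def sight_threatening(g: Grading) -> bool:
    """Preproliferative retinopathy or worse, sight-threatening maculopathy,
    and/or the presence of photocoagulation, as defined in the protocol."""
    return g.retinopathy >= 2 or g.maculopathy == 1 or g.photocoagulation == 1

# Example: background retinopathy (R1) with maculopathy (M1) -> refer.
print(sight_threatening(Grading(retinopathy=1, maculopathy=1, photocoagulation=0)))  # True
```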

Relevance: 80.00%

Abstract:

We study the dynamics of a growing crystalline facet where the growth mechanism is controlled by the geometry of the local curvature. For this purpose, a continuum model in (2+1) dimensions is developed in analogy with the Kardar-Parisi-Zhang (KPZ) model. Following standard coarse-graining procedures, it is shown that in the large-time, long-distance limit the continuum model predicts a curvature-independent KPZ phase, thereby suppressing all explicit effects of curvature and local pinning in the system in the "perturbative" limit. A direct numerical integration of this growth equation in 1+1 dimensions supports this observation below a critical parametric range, above which generic instabilities, in the form of isolated pillared structures, lead to deviations from standard scaling behaviour. Possibilities of controlling this instability by introducing statistically "irrelevant" (in the sense of the renormalisation group) higher-order nonlinearities are also discussed.
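
For reference, the standard KPZ equation to which the coarse-grained model is compared (textbook form for a height field h; the paper's curvature-controlled generalization is not reproduced here):

```latex
% Standard (2+1)-dimensional KPZ equation for a height field h:
\[
  \frac{\partial h(\mathbf{x},t)}{\partial t}
    = \nu \nabla^{2} h
    + \frac{\lambda}{2} (\nabla h)^{2}
    + \eta(\mathbf{x},t),
  \qquad
  \langle \eta(\mathbf{x},t)\,\eta(\mathbf{x}',t') \rangle
    = 2D\,\delta^{2}(\mathbf{x}-\mathbf{x}')\,\delta(t-t').
\]
```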

Relevance: 80.00%

Abstract:

Advances in statistical physics relating to our understanding of large-scale complex systems have recently been successfully applied in the context of communication networks. Statistical mechanics methods can be used to decompose global system behavior into simple local interactions. Thus, large-scale problems can be solved or approximated in a distributed manner with iterative lightweight local messaging. This survey discusses how statistical physics methodology can provide efficient solutions to hard network problems that are intractable by classical methods. We highlight three typical examples in the realm of networking and communications. In each case we show how a fundamental idea of statistical physics helps solve the problem in an efficient manner. In particular, we discuss how to perform multicast scheduling with message passing methods, how to improve coding using the crystallization process, and how to compute optimal routing by representing routes as interacting polymers.
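
As a toy illustration of the message-passing idea (not the survey's multicast-scheduling algorithm), here is exact sum-product belief propagation on a three-node chain, where a global marginal emerges from lightweight local messages; all names and numbers are hypothetical:

```python
import numpy as np

# Binary variables on a chain x1 - x2 - x3; psi are pairwise compatibilities,
# phi are local evidence terms.
psi12 = np.array([[1.0, 0.5], [0.5, 1.0]])  # favors agreement of x1 and x2
psi23 = np.array([[1.0, 0.2], [0.2, 1.0]])  # favors agreement of x2 and x3
phi = {1: np.array([0.7, 0.3]),
       2: np.array([0.5, 0.5]),
       3: np.array([0.4, 0.6])}

# On a tree, one inward sweep of sum-product messages gives exact marginals.
m1_to_2 = psi12.T @ phi[1]   # sum over x1 of phi1(x1) * psi12(x1, x2)
m3_to_2 = psi23 @ phi[3]     # sum over x3 of psi23(x2, x3) * phi3(x3)

belief2 = phi[2] * m1_to_2 * m3_to_2
belief2 /= belief2.sum()
print(belief2)  # exact marginal P(x2), computed from purely local messages
```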

Relevance: 80.00%

Abstract:

2000 Mathematics Subject Classification: 60J45, 60K25

Relevance: 80.00%

Abstract:

A quantization scheme is suggested for a spatially inhomogeneous 1+1 Bianchi I model. The scheme consists in quantizing the equations of motion and yields operator (so-called quasi-Heisenberg) equations describing the explicit evolution of the system. A particular gauge suitable for quantization is proposed. The Wheeler-DeWitt equation is considered in the vicinity of zero scale factor and is used to construct a space in which the quasi-Heisenberg operators act. Spatial discretization is suggested as a UV regularization procedure for the equations of motion.
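
Spatial discretization as a UV regulator can be illustrated by the generic lattice replacement of spatial derivatives (a standard construction; the paper's specific scheme is not reproduced here):

```latex
% Generic lattice replacement of a spatial derivative on a grid of spacing a:
\[
  \partial_x^2 \phi(x) \;\longrightarrow\;
  \frac{\phi(x+a) - 2\phi(x) + \phi(x-a)}{a^{2}},
\]
% which removes modes with wavelengths below the lattice spacing a,
% acting as an ultraviolet cutoff.
```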

Relevance: 80.00%

Abstract:

We use advanced statistical tools of time-series analysis to characterize the dynamical complexity of the transition to optical wave turbulence in a fiber laser. Ordinal analysis and the horizontal visibility graph, applied to the experimentally measured laser output intensity, reveal the presence of temporal correlations during the transition from the laminar to the turbulent lasing regime. Both methods unveil coherent structures with well-defined time scales and strong correlations, both in the timing of the laser pulses and in their peak intensities. Our approach is generic and may be used in other complex systems that undergo similar transitions involving the generation of extreme fluctuations.
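
A minimal sketch of the ordinal-analysis step, assuming the standard Bandt-Pompe construction underlies the "ordinal analysis" mentioned above (the authors' full pipeline and the visibility-graph step are not reproduced):

```python
import numpy as np
from collections import Counter

def ordinal_pattern_distribution(x, d=3):
    """Distribution of ordinal (permutation) patterns of order d in series x,
    following the standard Bandt-Pompe construction."""
    counts = Counter(
        tuple(int(j) for j in np.argsort(x[i:i + d]))
        for i in range(len(x) - d + 1)
    )
    total = sum(counts.values())
    return {p: c / total for p, c in counts.items()}

# White noise spreads probability over all d! patterns; a monotone (perfectly
# correlated) series concentrates it on a single pattern.
rng = np.random.default_rng(0)
print(ordinal_pattern_distribution(rng.standard_normal(10_000)))  # roughly uniform
print(ordinal_pattern_distribution(np.arange(100.0)))             # only (0, 1, 2)
```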

Relevance: 80.00%

Abstract:

This chapter deals with the physicochemical aspects of structure-property relationships in synthetic hydrogels, with particular reference to their application in optometry and ophthalmology. It demonstrates the ways in which the amount of water contained in the hydrogel network can be manipulated by changes in copolymer composition and illustrates the advantages and limitations imposed by use of water as a means of influencing surface, transport and mechanical properties of the gel. The chapter then illustrates how this basic understanding has formed a platform for the development of synthetic interpenetrating networks and macroporous materials, and of hybrids of natural and synthetic hydrogels. The behaviour of these more complex systems is not so centrally dominated by the equilibrium water content as is the case with homogeneous synthetic hydrogels, thus providing advantageous ways of extending the properties and applications of these interesting materials.
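
The equilibrium water content referred to above has a standard definition in the hydrogel literature:

```latex
% Standard definition of the equilibrium water content (EWC) of a hydrated gel:
\[
  \mathrm{EWC}\,(\%) = \frac{m_{\text{water}}}{m_{\text{hydrated gel}}} \times 100,
\]
% i.e. the mass of water as a fraction of the total mass of the swollen network.
```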

Relevance: 80.00%

Abstract:

One of the key tasks of quality management is to identify the factors critical to value creation, to qualify or quantify them, and to take preventive and corrective actions that reduce their negative effects. Value creation frequently proceeds through processes composed of activities and tasks; these require suitable staff, whose most important characteristic is the knowledge they hold. The task-resource-knowledge structure is therefore itself a quality management concern. Network science, which analyzes complex systems, can provide tools for this, so its applicability to quality management merits study. To systematize the possible applications, the authors first categorized quality networks according to the types of nodes (vertices) and links (edges or arcs). Focusing on knowledge management, they then defined the multimodal knowledge network (one built from several different node types), consisting of tasks, resources, knowledge items and their interconnections. Using this network, knowledge items can be categorized and their value quantified on the basis of node degrees. In the knowledge-item network derived from the multimodal network, the meaning of cohesive subgroups is defined. Finally, the authors propose a formula for determining the risk of knowledge loss.
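
A minimal sketch of such a multimodal network with a degree-based value for knowledge items, using networkx; the node names, edges, and the use of raw degree as "value" are hypothetical illustrations, not the authors' data or their risk formula:

```python
import networkx as nx

# Multimodal quality network: tasks, resources, and knowledge items as node types.
G = nx.Graph()
G.add_nodes_from(["task:audit", "task:review"], kind="task")
G.add_nodes_from(["res:alice", "res:bob"], kind="resource")
G.add_nodes_from(["know:ISO9001", "know:SPC"], kind="knowledge")

G.add_edges_from([
    ("task:audit", "res:alice"), ("task:review", "res:bob"),
    ("res:alice", "know:ISO9001"), ("res:alice", "know:SPC"),
    ("res:bob", "know:SPC"),
])

# Degree as a simple proxy for the value of a knowledge item: items connected
# to more tasks and resources are riskier to lose.
for n, data in G.nodes(data=True):
    if data["kind"] == "knowledge":
        print(n, "degree =", G.degree(n))
```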

Relevance: 80.00%

Abstract:

The physics of self-organization and complexity is manifested on a variety of biological scales, from large ecosystems to the molecular level. Protein molecules exhibit characteristics of complex systems in terms of their structure, dynamics, and function. Proteins have the extraordinary ability to fold to a specific functional three-dimensional shape, starting from a random coil, in a biologically relevant time. How they accomplish this is one of the secrets of life. In this work, theoretical research into understanding this remarkable behavior is discussed. Thermodynamic and statistical mechanical tools are used to investigate protein folding dynamics and stability. Theoretical analyses of results from computer simulation of the dynamics of a four-helix bundle show that excluded-volume entropic effects are very important in protein dynamics and crucial for protein stability. The dramatic effects of changing the size of side chains imply that strategic placement of amino acid residues of a particular size may be an important consideration in protein engineering. Another investigation deals with modeling protein structural transitions as a phase transition. Using finite-size scaling theory, the nature of the unfolding transition of a four-helix bundle protein was investigated and critical exponents for the transition were calculated for various hydrophobic strengths in the core. It is found that the order of the transition changes from first to higher order as the strength of the hydrophobic interaction in the core region is significantly increased. Finally, a detailed kinetic and thermodynamic analysis was carried out on a model two-helix bundle. The connection between the structural free-energy landscape and folding kinetics was quantified. I show how simple protein engineering, by changing the hydropathy of a small number of amino acids, can enhance protein folding by significantly changing the free-energy landscape so that kinetic traps are removed. The results have general applicability in protein engineering as well as in understanding the underlying physical mechanisms of protein folding.
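
The finite-size scaling analysis mentioned above conventionally rests on an ansatz of the following textbook form (the dissertation's specific observables and exponent values are not reproduced here):

```latex
% Generic finite-size scaling ansatz: for an observable A measured on a system
% of size N near the transition temperature T_c,
\[
  A_N(t) = N^{\rho/\nu}\, f\!\left(t\, N^{1/\nu}\right),
  \qquad t = \frac{T - T_c}{T_c},
\]
% so critical exponents follow from collapsing data for different sizes N
% onto a single scaling function f.
```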

Relevance: 80.00%

Abstract:

Unified Modeling Language (UML) is the most comprehensive and widely accepted object-oriented modeling language, owing to its multi-paradigm modeling capabilities and easy-to-use graphical notations, with strong international organizational support and production-quality tool support. However, there is a lack of precise definition of the semantics of individual UML notations, as well as of the relationships among multiple UML models, which often leads to incomplete and inconsistent software designs in UML, especially for complex systems. Furthermore, there is a lack of methodologies to ensure a correct implementation of a given UML design. The purpose of this investigation is to verify and validate software designs in UML, and to provide dependability assurance for the realization of a UML design. In my research, an approach is proposed to transform UML diagrams into a semantic domain, which is a formal component-based framework. The proposed framework consists of components and interactions through message passing, modeled by two-layer algebraic high-level nets and transformation rules, respectively. In the transformation approach, class diagrams, state machine diagrams and activity diagrams are transformed into component models, and transformation rules are extracted from interaction diagrams. By applying transformation rules to component models, a (sub)system model of one or more scenarios can be constructed. Various techniques, such as model checking and Petri net analysis, can then be adopted to check whether UML designs are complete and consistent. A new component called the property parser was developed and merged into the tool SAM Parser, which realizes (sub)system models automatically. The property parser generates and weaves runtime monitoring code into system implementations automatically for dependability assurance. The framework is flexible in that it can be used not only to verify and validate UML designs but also to build models for various scenarios. As a result of my research, several kinds of previously ignored behavioral inconsistencies can be detected.

Relevance: 80.00%

Abstract:

The freshwater Everglades is a complex system containing thousands of tree islands embedded within a marsh-grassland matrix. The tree island-marsh mosaic is shaped and maintained by hydrologic, edaphic and biological mechanisms that interact across multiple scales. Preserving tree islands requires a more integrated understanding of how scale-dependent phenomena interact in the larger freshwater system. The hierarchical patch dynamics paradigm provides a conceptual framework for exploring multi-scale interactions within complex systems. We used a three-tiered approach to examine the spatial variability and patterning of nutrients in relation to site parameters within and between two hydrologically defined Everglades landscapes: the freshwater Marl Prairie and the Ridge and Slough. Results were scale-dependent and complexly interrelated. Total carbon and nitrogen patterning were correlated with organic matter accumulation, driven by hydrologic conditions at the system scale. Total and bioavailable phosphorus were most strongly related to woody plant patterning within landscapes, and were found to be 3 to 11 times more concentrated in tree island soils compared to surrounding marshes. Below canopy resource islands in the slough were elongated in a downstream direction, indicating soil resource directional drift. Combined multi-scale results suggest that hydrology plays a significant role in landscape patterning and also the development and maintenance of tree islands. Once developed, tree islands appear to exert influence over the spatial distribution of nutrients, which can reciprocally affect other ecological processes.

Relevance: 80.00%

Abstract:

Investigation of the performance of engineering project organizations is critical for understanding and eliminating inefficiencies in today's dynamic global markets. Existing theoretical frameworks consider project organizations as monolithic systems and attribute their performance to the characteristics of their constituents. However, project organizations consist of complex interdependent networks of agents, information, and resources whose interactions give rise to emergent properties that affect the overall performance of project organizations. Yet our understanding of emergent properties in project organizations and their impact on project performance is rather limited. This limitation is one of the major barriers to the creation of integrated theories of performance assessment in project organizations. The objective of this paper is to investigate the emergent properties that affect the ability of project organizations to cope with uncertainty. Based on the theories of complex systems, we propose and test a novel framework in which the likelihood of performance variations in project organizations can be investigated in terms of the environment of uncertainty (i.e., static complexity, dynamic complexity, and external sources of disruption) as well as the emergent properties (i.e., absorptive capacity, adaptive capacity, and restorative capacity) of project organizations. The existence and significance of the different dimensions of the environment of uncertainty and of the emergent properties in the proposed framework are tested through analysis of information collected from interviews with senior project managers in the construction industry. The outcomes of this study provide a novel theoretical lens for proactive, bottom-up investigation of performance in project organizations at the interface of emergent properties and uncertainty.

Relevance: 80.00%

Abstract:

Petri nets are a formal, graphical and executable modeling technique for the specification and analysis of concurrent and distributed systems, and have been widely applied in computer science and many other engineering disciplines. Low-level Petri nets are simple and useful for modeling control flows, but not powerful enough to define data and system functionality. High-level Petri nets (HLPNs) have been developed to support data and functionality definitions, for example by using complex structured data as tokens and algebraic expressions as transition formulas. Compared to low-level Petri nets, HLPNs yield compact system models that are easier to understand, and are therefore more useful for modeling complex systems. There are two issues in using HLPNs: modeling and analysis. Modeling concerns abstracting and representing the systems under consideration using HLPNs, and analysis deals with effective ways to study the behaviors and properties of the resulting HLPN models. In this dissertation, several modeling and analysis techniques for HLPNs are studied and integrated into a framework that is supported by a tool. For modeling, this framework integrates two formal languages: a type of HLPN called the Predicate Transition Net (PrT Net) is used to model a system's behavior, and a first-order linear-time temporal logic (FOLTL) is used to specify the system's properties. The main modeling contribution of this dissertation is a software tool that supports the formal modeling capabilities of this framework. For analysis, the framework combines three complementary techniques: simulation, explicit-state model checking and bounded model checking (BMC). Simulation is straightforward and fast, but covers only some execution paths in an HLPN model. Explicit-state model checking covers all execution paths but suffers from the state-explosion problem. BMC is a tradeoff: it provides a certain level of coverage while being more efficient than explicit-state model checking. The main analysis contribution of this dissertation is adapting BMC to analyze HLPN models and integrating the three complementary analysis techniques in a software tool that supports the formal analysis capabilities of this framework. The SAMTools developed for this framework integrate three tools: PIPE+ for HLPN behavioral modeling and simulation, SAMAT for hierarchical structural modeling and property specification, and PIPE+Verifier for behavioral verification.
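
A toy sketch of the HLPN idea of structured tokens and algebraic transition formulas, in Python; the place names, token structure, and guard are hypothetical illustrations, not the PrT-net semantics implemented in PIPE+:

```python
# Toy high-level Petri net step: tokens are structured data (id, priority) pairs
# and the transition carries an algebraic guard, as HLPNs allow.
places = {
    "pending": [("job1", 3), ("job2", 7)],
    "done": [],
}

def fire_process(places, threshold=5):
    """Transition 'process': guard = priority >= threshold.
    Consumes one matching token from 'pending' and produces it in 'done'."""
    for tok in places["pending"]:
        if tok[1] >= threshold:          # transition formula / guard
            places["pending"].remove(tok)
            places["done"].append(tok)
            return True                  # transition fired
    return False                         # transition not enabled

while fire_process(places):
    pass
print(places)  # {'pending': [('job1', 3)], 'done': [('job2', 7)]}
```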

Relevance: 80.00%

Abstract:

The study of chemical reactions is among the most important topics for understanding the discipline of Chemistry in basic education. However, there are still few studies that treat chemical reactions as a complex system, because this content is generally presented in textbooks, taught, and even researched in a fragmented form. This thesis aims to investigate, identify, and characterize students' mistakes and learning difficulties concerning chemical reactions as a complex system, through analysis of the answers in 126 entrance-examination papers of candidates for the bachelor's degree in Chemistry Teaching at the Federal University of Rio Grande do Norte (UFRN). Mistakes and learning difficulties regarding the parameters ΔG°, Kp and Ea, and regarding the calculation of the amount of substance in a chemical reaction, were identified, as were the students' levels of development of the ability to interpret the chemical reaction as a system. The main theoretical basis of this study is the literature on mistakes and learning difficulties (NÚÑEZ, RAMALHO, 2012) and on chemical reactions as complex systems (NÚÑEZ, 1994; RESHETOVA, 1988; SANDERSON, 1968). Methodologically, analysis of the exam answers was combined with interviews with teachers. The results showed typical mistakes in the study of this subject and, above all, low levels of skill development. No student was able to integrate the different aspects into a systemic understanding of the chemical reaction. From interviews with high-school Chemistry teachers, the reasons the teachers assign to those mistakes and learning difficulties were determined. The interviews revealed that teachers do not work from a perspective of content integration, which leads students to exhibit the difficulties and mistakes described above. The study presents a proposal for organizing the contents of the Chemistry discipline in high school as a possibility for a dialectic systemic integration of contents, understanding that this systemic vision makes important contributions to the development of students' theoretical thinking. One conclusion of this study is that the non-systemic organization of contents does not favor this kind of thinking in students.
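
The parameters named above are connected by standard relations, which is precisely what makes the chemical reaction a system rather than a list of isolated quantities; for instance:

```latex
% Standard relations linking the parameters above (K_p taken relative to the
% standard pressure):
\[
  \Delta G^{\circ} = -RT \ln K_p,
  \qquad
  k = A\, e^{-E_a/RT} \quad \text{(Arrhenius)},
  \qquad
  n = \frac{m}{M} \quad \text{(amount of substance)}.
\]
```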

Relevance: 80.00%

Abstract:

Rapid development in industry has contributed to more complex systems that are prone to failure. In applications where the presence of faults may lead to premature failure, fault detection and diagnosis (FDD) tools are often implemented. The goal of this research is to improve the diagnostic ability of existing FDD methods. Kernel Principal Component Analysis (KPCA) has good fault-detection capability; however, it can only detect a fault and identify a few variables contributing to its occurrence, and is therefore imprecise for diagnosis. Hence, KPCA was used to detect abnormal events, and the most contributing variables were extracted for further analysis in the diagnosis phase. The diagnosis phase was carried out in both a qualitative and a quantitative manner. In the qualitative mode, a network-based causality analysis method was developed to show the causal effects among the most contributing variables in the occurrence of the fault. To make the diagnosis more quantitative, a Bayesian network was constructed to analyze the problem from a probabilistic perspective.
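
A minimal sketch of KPCA-based detection of abnormal events with a percentile threshold, using scikit-learn; the synthetic data, kernel settings, and monitoring statistic are illustrative assumptions, not the study's exact method:

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(1)
normal = rng.standard_normal((500, 4))                 # fault-free training data
test = np.vstack([rng.standard_normal((50, 4)),
                  rng.standard_normal((5, 4)) + 6.0])  # last rows: injected fault

# Fit KPCA on normal operation only, then monitor new samples in score space.
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5)
scores_train = kpca.fit_transform(normal)

# Hotelling-style T^2 statistic on the retained components, thresholded at the
# 99th percentile of the training statistic.
var = scores_train.var(axis=0)
t2 = ((kpca.transform(test) ** 2) / var).sum(axis=1)
limit = np.percentile(((scores_train ** 2) / var).sum(axis=1), 99)
print(np.where(t2 > limit)[0])  # indices flagged as abnormal events
```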