50 results for LARGE SYSTEMS


Relevance: 30.00%

Abstract:

For thousands of years, humans have inhabited locations that are highly vulnerable to the impacts of climate change, earthquakes, and floods. To investigate the extent to which Holocene environmental changes may have affected cultural evolution, we present new geologic, geomorphic, and chronologic data from the Qazvin Plain in northwest Iran that provide a backdrop of natural environmental changes for the simultaneous cultural dynamics observed on the Central Iranian Plateau. Well-resolved archaeological data from the neighbouring settlements of Zagheh (7170–6300 yr BP), Ghabristan (6215–4950 yr BP) and Sagzabad (4050–2350 yr BP) indicate that Holocene occupation of the Hajiarab alluvial fan was interrupted by a 900 year settlement hiatus. Multiproxy climate data from nearby lakes in northwest Iran suggest a transition from arid early-Holocene conditions to more humid middle-Holocene conditions from c. 7550 to 6750 yr BP, coinciding with the settlement of Zagheh, and a peak in aridity at c. 4550 yr BP during the settlement hiatus. Palaeoseismic investigations indicate that large active fault systems in close proximity to the tell sites incurred a series of large (Mw ~7.1) earthquakes with return periods of ~500–1000 years during human occupation of the tells. Mapping and optically stimulated luminescence (OSL) chronology of the alluvial sequences reveal changes in depositional style from coarse-grained unconfined sheet-flow deposits to proximal channel flow and distally prograding alluvial deposits sometime after c. 8830 yr BP, possibly reflecting an increase in moisture following the early-Holocene arid phase. The coincidence of major climate changes, earthquake activity, and varying sedimentation styles with changing patterns of human occupation on the Hajiarab fan indicates links between environmental and anthropogenic systems. However, temporal coincidence does not necessitate a fundamental causative dependency.

Relevance: 30.00%

Abstract:

The K-Means algorithm for cluster analysis is one of the most influential and popular data mining methods. Its straightforward parallel formulation is well suited for distributed memory systems with reliable interconnection networks, such as massively parallel processors and clusters of workstations. However, in large-scale geographically distributed systems the straightforward parallel algorithm can be rendered useless by a single communication failure or high latency in communication paths. The lack of scalable and fault tolerant global communication and synchronisation methods in large-scale systems has hindered the adoption of the K-Means algorithm for applications in large networked systems such as wireless sensor networks, peer-to-peer systems and mobile ad hoc networks. This work proposes a fully distributed K-Means algorithm (EpidemicK-Means) which does not require global communication and is intrinsically fault tolerant. The proposed distributed K-Means algorithm provides a clustering solution which can approximate the solution of an ideal centralised algorithm over the aggregated data as closely as desired. A comparative performance analysis is carried out against state-of-the-art sampling methods and shows that the proposed method overcomes the limitations of sampling-based approaches for skewed cluster distributions. The experimental analysis confirms that the proposed algorithm is very accurate and fault tolerant under unreliable network conditions (message loss and node failures) and is suitable for asynchronous networks of very large and extreme scale.
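The abstract does not spell out the EpidemicK-Means protocol itself, but the epidemic (gossip) communication style such algorithms build on can be sketched with a minimal gossip-averaging routine; the function and parameter names below are illustrative, not from the paper:

```python
import random

def gossip_average(values, rounds=500, seed=0):
    """Epidemic (gossip) averaging: in each round a random pair of nodes
    exchange their local estimates and both keep the average.  Every node
    converges to the global mean with no global reduction step, and a
    lost exchange merely delays convergence rather than breaking it."""
    rng = random.Random(seed)
    est = list(values)  # one local estimate per node
    for _ in range(rounds):
        i, j = rng.sample(range(len(est)), 2)
        est[i] = est[j] = (est[i] + est[j]) / 2.0
    return est
```

Distributed k-means variants of this kind run such averaging over per-cluster partial sums and counts rather than raw values.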

Relevance: 30.00%

Abstract:

An incidence matrix analysis is used to model a three-dimensional network consisting of resistive and capacitive elements distributed across several interconnected layers. A systematic methodology for deriving a descriptor representation of the network with random allocation of the resistors and capacitors is proposed. Using a transformation of the descriptor representation into standard state-space form, amplitude and phase admittance responses of three-dimensional random RC networks are obtained. Such networks display an emergent behavior with a characteristic Jonscher-like response over a wide range of frequencies. A model approximation study of these networks is performed to infer the admittance response using integral and fractional order models. It was found that a fractional order model with only seven parameters can accurately describe the responses of networks composed of more than 70 nodes and 200 branches with 100 resistors and 100 capacitors. The proposed analysis can be used to model charge migration in amorphous materials, which may be associated with specific macroscopic or microscopic scale fractal geometrical structures in composites displaying a viscoelastic electromechanical response, as well as to model the collective responses of processes governed by random events described using statistical mechanics.
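The seven-parameter fractional-order model is not reproduced in the abstract, but the constant-phase behaviour characteristic of a Jonscher-like response can be illustrated with a single fractional (constant-phase-element) admittance term, Y(jω) = k·(jω)^α; names below are illustrative:

```python
import cmath
import math

def fractional_admittance(omega, k, alpha):
    """Evaluate the fractional-order admittance Y(jw) = k * (jw)**alpha.
    For 0 < alpha < 1 the phase is a frequency-independent alpha*90
    degrees and the magnitude follows a power law in omega -- the
    emergent Jonscher-like response described in the abstract."""
    y = k * complex(0.0, omega) ** alpha
    return abs(y), math.degrees(cmath.phase(y))
```

Evaluating this at several frequencies shows the phase staying constant while the magnitude scales as omega**alpha, which is the signature a fitting study would match against the full network's state-space response.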

Relevance: 30.00%

Abstract:

A clinical pathway is an approach to standardising care processes in support of the implementation of clinical guidelines and protocols. It is designed to support the management of treatment processes, including clinical and non-clinical activities, resources and financial aspects. It provides detailed guidance for each stage in the management of a patient, with the aim of improving the continuity and coordination of care across different disciplines and sectors. In practice, however, the lack of knowledge sharing and the limited information accuracy of paper-based clinical pathways burden healthcare staff with a large amount of paperwork, often resulting in medical errors, inefficient treatment processes and thus poor-quality medical services. This paper first presents a theoretical underpinning and a co-design research methodology for integrated pathway management, drawing on organisational semiotics. An approach to integrated clinical pathway management is then proposed, which aims to embed pathway knowledge into treatment processes and existing hospital information systems. The capability of this approach has been demonstrated through a case study in one of the largest hospitals in China. The outcome reveals that medical quality can be improved significantly by classified clinical pathway knowledge and seamless integration with hospital information systems.

Relevance: 30.00%

Abstract:

The anomalously wet winter of 2010 had a very important impact on the Portuguese hydrological system. Given the detrimental effects that reduced precipitation in Portugal has on environmental and socio-economic systems, the 2010 winter was predominantly beneficial, reversing the precipitation deficits accumulated over the previous hydrological years. The recorded anomalously high precipitation amounts contributed to an overall increase in river runoff and dam recharge in the 4 major river basins. In synoptic terms, the 2010 winter was characterised by an anomalously strong westerly flow component over the North Atlantic that triggered high precipitation amounts. A dynamically coherent enhancement in the frequency of mid-latitude cyclones close to Portugal, accompanied by significant increases in the occurrence of cyclonic, southerly and south-westerly circulation weather types, is also noteworthy. Furthermore, the prevalence of a strong negative phase of the North Atlantic Oscillation (NAO) emphasises the main dynamical features of the 2010 winter. A comparison of the hydrological and atmospheric conditions between the 2010 winter and the two previous anomalously wet winters (1996 and 2001) was also carried out to isolate not only their similarities but also their contrasting conditions, highlighting the limitations of estimating winter precipitation amounts in Portugal using the NAO phase as the sole predictor.

Relevance: 30.00%

Abstract:

A primitive equation model is used to study the sensitivity of baroclinic wave life cycles to the initial latitude-height distribution of humidity. Diabatic heating is parametrized only as a consequence of condensation in regions of large-scale ascent. Experiments are performed in which the initial relative humidity is a simple function of model level, and in some cases latitude bands are specified which are initially relatively dry. It is found that the presence of moisture can either increase or decrease the peak eddy kinetic energy of the developing wave, depending on the initial moisture distribution. A relative abundance of moisture at mid-latitudes tends to weaken the wave, while a relative abundance at low latitudes tends to strengthen it. This sensitivity exists because competing processes are at work. These processes are described in terms of energy box diagnostics. The most realistic case lies on the cusp of this sensitivity. Further physical parametrizations are then added, including surface fluxes and upright moist convection. These have the effect of increasing wave amplitude, but the sensitivity to initial conditions of relative humidity remains. Finally, 'control' and 'doubled CO2' life cycles are performed, with initial conditions taken from the time-mean zonal-mean output of equilibrium GCM experiments. The attenuation of the wave resulting from reduced baroclinicity is more pronounced than any effect due to changes in initial moisture.

Relevance: 30.00%

Abstract:

Exascale systems are the next frontier in high-performance computing and are expected to deliver a performance of the order of 10^18 operations per second using massive multicore processors. Very large- and extreme-scale parallel systems pose critical algorithmic challenges, especially related to concurrency, locality and the need to avoid global communication patterns. This work investigates a novel protocol for dynamic group communication that can be used to remove the global communication requirement and to reduce the communication cost in parallel formulations of iterative data mining algorithms. The protocol is used to provide a communication-efficient parallel formulation of the k-means algorithm for cluster analysis. The approach is based on a collective communication operation for dynamic groups of processes and exploits non-uniform data distributions. Non-uniform data distributions can be either found in real-world distributed applications or induced by means of multidimensional binary search trees. The analysis of the proposed dynamic group communication protocol has shown that it does not introduce significant communication overhead. The parallel clustering algorithm has also been extended to accommodate an approximation error, which allows a further reduction of the communication costs. The effectiveness of the exact and approximate methods has been tested in a parallel computing system with 64 processors and in simulations with 1024 processing elements.
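The abstract mentions inducing non-uniform data distributions by means of multidimensional binary search trees; a minimal sketch of that idea, recursive median splits over 2-D points (names and leaf size are illustrative), is:

```python
def median_split(points, depth=0, leaf_size=2):
    """Recursively split 2-D points at the median along alternating axes
    (k-d-tree style).  The resulting leaves are spatially coherent,
    non-uniform partitions of the kind a dynamic group communication
    protocol can exploit to localise communication."""
    if len(points) <= leaf_size:
        return [points]
    pts = sorted(points, key=lambda p: p[depth % 2])
    mid = len(pts) // 2
    return (median_split(pts[:mid], depth + 1, leaf_size)
            + median_split(pts[mid:], depth + 1, leaf_size))
```

Because each leaf holds spatially neighbouring points, most k-means updates can then be resolved within a small group of processes rather than by an all-to-all exchange.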

Relevance: 30.00%

Abstract:

The financial crisis of 2007-2009 has precipitated large-scale regulatory change. Tight deadlines for implementation require organizations to start working on remediation projects before final drafts of regulations are crystallized. Firms are faced with engaging in complex and costly change management programs at a time when profits are diminished. As a consequence of these factors, pre-crisis logics for organizing compliance practices are being questioned and new approaches introduced. Our study explores the use of Investment Management Systems (IMS) in facilitating compliance arrangements. Our motivation is to understand the new logics and the part played by IMS in supporting these approaches. The study adopts an institutional logics perspective to explore the use of such systems at eight financial organizations. It found that the new logics for organizing compliance include consolidation, centralization, harmonization and consistency, and that IMS play an important role in supporting and enabling related activities.

Relevance: 30.00%

Abstract:

The Complex Adaptive Systems, Cognitive Agents and Distributed Energy (CASCADE) project is developing a framework based on Agent Based Modelling (ABM). The CASCADE Framework can be used both to gain policy and industry relevant insights into the smart grid concept itself and as a platform to design and test distributed ICT solutions for smart grid based business entities. ABM is used to capture the behaviors of different social, economic and technical actors, which may be defined at various levels of abstraction. It is applied to understanding their interactions and can be adapted to include learning processes and emergent patterns. CASCADE models ‘prosumer’ agents (i.e., producers and/or consumers of energy) and ‘aggregator’ agents (e.g., traders of energy in both wholesale and retail markets) at various scales, from large generators and Energy Service Companies down to individual people and devices. The CASCADE Framework is formed of three main subdivisions that link models of electricity supply and demand, the electricity market and power flow. It can also model the variability of renewable energy generation caused by the weather, which is an important issue for grid balancing and the profitability of energy suppliers. The development of CASCADE has already yielded some interesting early findings, demonstrating that it is possible for a mediating agent (aggregator) to achieve stable demand flattening across groups of domestic households fitted with smart energy control and communication devices, where direct wholesale price signals had previously been found to produce characteristic complex system instability. In another example, it has demonstrated how large changes in supply mix can be caused even by small changes in demand profile.
Ongoing and planned refinements to the Framework will support investigation of demand response at various scales, the integration of the power sector with transport and heat sectors, novel technology adoption and diffusion work, evolution of new smart grid business models, and complex power grid engineering and market interactions.

Relevance: 30.00%

Abstract:

It is well known that atmospheric concentrations of carbon dioxide (CO2) (and other greenhouse gases) have increased markedly as a result of human activity since the industrial revolution. It is perhaps less appreciated that natural and managed soils are an important source and sink for atmospheric CO2 and that, primarily as a result of the activities of soil microorganisms, there is a soil-derived respiratory flux of CO2 to the atmosphere that overshadows by tenfold the annual CO2 flux from fossil fuel emissions. Therefore small changes in the soil carbon cycle could have large impacts on atmospheric CO2 concentrations. Here we discuss the role of soil microbes in the global carbon cycle and review the main methods that have been used to identify the microorganisms responsible for the processing of plant photosynthetic carbon inputs to soil. We discuss whether application of these techniques can provide the information required to underpin the management of agro-ecosystems for carbon sequestration and increased agricultural sustainability. We conclude that, although crucial in enabling the identification of plant-derived carbon-utilising microbes, current technologies lack the high-throughput ability to quantitatively apportion carbon use among phylogenetic groups, or to determine its use efficiency and destination within the microbial metabolome. It is this information that is required to inform rational manipulation of the plant–soil system to favour organisms or physiologies most important for promoting soil carbon storage in agricultural soil.

Relevance: 30.00%

Abstract:

Global communication requirements and load imbalance of some parallel data mining algorithms are the major obstacles to exploiting the computational power of large-scale systems. This work investigates how non-uniform data distributions can be exploited to remove the global communication requirement and to reduce the communication cost in iterative parallel data mining algorithms. In particular, the analysis focuses on one of the most influential and popular data mining methods, the k-means algorithm for cluster analysis. The straightforward parallel formulation of the k-means algorithm requires a global reduction operation at each iteration step, which hinders its scalability. This work studies a different parallel formulation of the algorithm where the requirement of global communication can be relaxed while still providing the exact solution of the centralised k-means algorithm. The proposed approach exploits a non-uniform data distribution which can be either found in real-world distributed applications or induced by means of multi-dimensional binary search trees. The approach can also be extended to accommodate an approximation error which allows a further reduction of the communication costs.
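The global reduction identified above as the scalability bottleneck can be made concrete with a toy one-dimensional sketch: each partition computes per-centroid partial sums and counts, and a single reduction combines them into the new centroids. The function below simulates that reduction sequentially and is illustrative only, not the paper's formulation:

```python
def kmeans_step(partitions, centroids):
    """One iteration of the straightforward parallel k-means.  Each
    partition (one process's local data) accumulates per-centroid
    partial sums and counts; the global reduction -- simulated here by
    the outer loop -- combines them into the new centroids."""
    k = len(centroids)
    sums, counts = [0.0] * k, [0] * k
    for part in partitions:
        for x in part:
            nearest = min(range(k), key=lambda c: abs(x - centroids[c]))
            sums[nearest] += x    # local partial aggregation
            counts[nearest] += 1
    # combine partials; empty clusters keep their old centroid
    return [sums[c] / counts[c] if counts[c] else centroids[c]
            for c in range(k)]
```

In an MPI-style implementation the combine step would be an all-reduce over `sums` and `counts`, which is exactly the per-iteration global operation the relaxed formulation avoids.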

Relevance: 30.00%

Abstract:

Automatic generation of classification rules has become an increasingly popular technique in commercial applications such as Big Data analytics, rule-based expert systems and decision-making systems. However, a principal problem that arises with most methods for the generation of classification rules is the overfitting of training data. With Big Data, this may result in the generation of a large number of complex rules, which may not only increase computational cost but also lower the accuracy of predictions on further unseen instances. This has led to the need for pruning methods that simplify rules. In addition, once generated, classification rules are used to make predictions; where efficiency is concerned, it is desirable to find the first rule that fires as quickly as possible when searching through a rule set, so a suitable structure is required to represent the rule set effectively. In this chapter, the authors introduce a unified framework for the construction of rule-based classification systems, consisting of three operations on Big Data: rule generation, rule simplification and rule representation. The authors also review some existing methods and techniques used for each of the three operations and highlight their limitations. They then introduce some novel methods and techniques they have developed recently, discussed in comparison with existing ones with respect to efficient processing of Big Data.
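The first-rule-that-fires prediction step described above can be sketched with the simplest rule-set representation, a linear list scanned in order; the rule contents and names below are invented for illustration:

```python
def predict(rules, default_label, instance):
    """First-hit semantics: return the label of the first rule whose
    condition fires on the instance; fall back to a default label."""
    for condition, label in rules:
        if condition(instance):
            return label
    return default_label

# A toy rule set: each rule is a (condition, label) pair,
# ordered so that earlier rules take precedence.
toy_rules = [
    (lambda x: x["temp"] > 37.5, "fever"),
    (lambda x: x["temp"] < 35.0, "hypothermia"),
]
```

More efficient representations (e.g. indexed or tree-structured rule sets) aim to reach the firing rule without this linear scan, which is the representation question the chapter addresses.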

Relevance: 30.00%

Abstract:

This study compared fat and fatty acids in cooked retail chicken meat from conventional and organic systems. Fat contents were 1.7, 5.2, 7.1 and 12.9 g/100 g cooked weight in skinless breast, breast with skin, skinless leg and leg with skin respectively, with organic meat containing less fat overall (P < 0.01). Meat was rich in cis-monounsaturated fatty acids, although organic meat contained less than did conventional meat (1850 vs. 2538 mg/100 g; P < 0.001). Organic meat was also lower (P < 0.001) in 18:3 n−3 (115 vs. 180 mg/100 g) and, whilst it contained more (P < 0.001) docosahexaenoic acid (30.9 vs. 13.7 mg/100 g), this was due to the large effect of one supermarket. This system by supermarket interaction suggests that poultry meat labelled as organic is not a guarantee of higher long chain n−3 fatty acids. Overall there were few major differences in fatty acid contents/profiles between organic and conventional meat that were consistent across all supermarkets.

Relevance: 30.00%

Abstract:

This study examines when “incremental” change is likely to trigger “discontinuous” change, using the lens of complex adaptive systems theory. Going beyond the simulations and case studies through which complex adaptive systems have been approached so far, we study the relationship between incremental organizational reconfigurations and discontinuous organizational restructurings using a large-scale database of U.S. Fortune 50 industrial corporations. We distinguish two types of escalation process in organizations: accumulation and perturbation. Under ordinary conditions, it is perturbation rather than accumulation that is more likely to trigger subsequent discontinuous change. Consistent with complex adaptive systems theory, organizations are more sensitive to both accumulation and perturbation in conditions of heightened disequilibrium. Contrary to expectations, highly interconnected organizations are not more liable to discontinuous change. We conclude with implications for further research, especially the need to attend to the potential role of managerial design and coping when transferring complex adaptive systems theory from natural systems to organizational systems.

Relevance: 30.00%

Abstract:

This paper concerns the innovative use of a blend of systems thinking ideas in the ‘Munro Review of Child Protection’, a high-profile examination of child protection activities in England, conducted for the Department for Education. We go ‘behind the scenes’ to describe the OR methodologies and processes employed. The circumstances that led to the Review are outlined. Three specific contributions that systems thinking made to the Review are then described. First, the systems-based analysis and visualisation of how a ‘compliance culture’ had grown up. Second, the creation of a large, complex systems map of current operations and the effects of past policies on them. Third, how the map gave shape to the range of issues the Review addressed and acted as an organising framework for the systemically coherent set of recommendations made. The paper closes with an outline of the main implementation steps taken so far to create a child protection system with the critically reflective properties of a learning organisation, and methodological reflections on the benefits of systems thinking to support organisational analysis.