108 results for Perimetric complexity
Abstract:
Most models of riverine eco-hydrology and biogeochemistry rely upon bulk parameterization of fluxes. However, the transport and retention of carbon and nutrients in headwater streams are strongly influenced by biofilms (surface-attached microbial communities), creating strong feedbacks between stream hydrodynamics and biogeochemistry. A mechanistic understanding of the interactions between streambed biofilms and nutrient dynamics is lacking. Here we present experimental results linking microscale observations of biofilm community structure to the deposition and resuspension of clay-sized mineral particles in streams. Biofilms were grown in identical 3 m recirculating flumes over periods of 14-50 days. Fluorescent particles were introduced to each flume, and their deposition was traced over 30 minutes. Particle resuspension from the biofilms was then observed under an increased stream flow, mimicking a flood event. We quantified particle fluxes using flow cytometry and epifluorescence microscopy. We directly observed particle adhesion to the biofilm using a confocal laser scanning microscope. 3-D optical coherence tomography (OCT) was used to determine biofilm roughness, areal coverage and void space in each flume. These measurements allow us to link biofilm complexity to particle retention during both baseflow and flood flow. The results suggest that increased biofilm complexity favors deposition and retention of fine particles in streams.
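A minimal sketch of how roughness and areal coverage might be derived from an OCT scan, assuming the scan has already been reduced to a 2-D height map of biofilm thickness; the function name, threshold and synthetic data are illustrative and not the study's actual pipeline:

```python
import numpy as np

def biofilm_metrics(height_map, threshold=5.0):
    """Illustrative roughness/coverage metrics from an OCT biofilm height map.

    height_map : 2-D array of biofilm thickness (micrometres) per pixel.
    threshold  : minimum thickness (micrometres) counted as biofilm cover.
    """
    covered = height_map > threshold
    areal_coverage = covered.mean()                    # fraction of bed covered
    mean_h = height_map.mean()
    roughness_ra = np.abs(height_map - mean_h).mean()  # arithmetic mean roughness
    return areal_coverage, roughness_ra

# Example with synthetic data
rng = np.random.default_rng(0)
hm = rng.gamma(shape=2.0, scale=20.0, size=(256, 256))
print(biofilm_metrics(hm))
```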
Abstract:
Prescribing tasks, which demand pharmacological knowledge, clinical decision-making and practical skill, take place within unpredictable social environments and involve interactions within and between endlessly changing health care teams. Despite this, curriculum designers commonly assume them to be simple to learn and perform. This research used mixed methods to explore how undergraduate medical students learn to prescribe in the 'real world'. It was informed by cognitive psychology, sociocultural theory, and systems thinking. We found that learning to prescribe occurs as a dynamic series of socially negotiated interactions within and between individuals, communities and environments. In addition to a thematic analysis, we developed a framework of three conceptual spaces in which learning opportunities for prescribing occur. This illustrates a complex systems view of prescribing education and defines three major system components: the "social space", where environmental conditions influence or bring about a learning experience; the "process space", describing what happens during the learning experience; and the intra-personal "cognitive space", where the learner may develop aspects of prescribing expertise. This conceptualisation broadens the scope of inquiry of prescribing education research by highlighting the complex interplay between the individual and social dimensions of learning. This perspective is also likely to be relevant to students' learning of other clinical competencies.
Abstract:
OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.
METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius® phantom and seven29® 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and with plan quality. The treatment planning systems (TPS) were grouped according to whether their VMAT modelling was designed specifically for the linear accelerator manufacturer's own treatment delivery system (Type 1) or was vendor-independent for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.
RESULTS: For Varian® linear accelerators (Varian Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPS created poorer quality, more complex plans with significantly higher MUs and lower MCS than Type 1 TPS. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).
CONCLUSION: MU and MCS have a role in assessing plan complexity in audits, alongside plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed together with plan quality.
ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and proved valuable. They showed that more complex plans were created by planning systems that were vendor-independent for VMAT delivery.
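As a rough illustration of the gamma pass-rate criterion referenced above (not the audit's measurement software), here is a simplified 1-D global gamma index calculation; the grid, dose profiles and tolerances are hypothetical:

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, spacing_mm, dd_pct=3.0, dta_mm=3.0):
    """Simplified 1-D global gamma index (e.g. 3%/3 mm).

    ref_dose, eval_dose : dose profiles on the same uniform grid.
    spacing_mm          : grid spacing in millimetres.
    Returns the gamma value at each reference point.
    """
    x = np.arange(len(ref_dose)) * spacing_mm
    dd_norm = dd_pct / 100.0 * ref_dose.max()   # global dose-difference criterion
    gammas = np.empty(len(ref_dose))
    for i, (xi, di) in enumerate(zip(x, ref_dose)):
        dist2 = ((x - xi) / dta_mm) ** 2
        dose2 = ((eval_dose - di) / dd_norm) ** 2
        gammas[i] = np.sqrt((dist2 + dose2).min())
    return gammas

ref = np.exp(-np.linspace(-3, 3, 121) ** 2)
ev = ref * 1.02                              # 2% systematic dose error
g = gamma_1d(ref, ev, spacing_mm=1.0)
print(f"pass rate: {(g <= 1).mean():.1%}")   # fraction of points with gamma <= 1
```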
Abstract:
Pre-processing (PP) of the received symbol vector and channel matrices is an essential prerequisite for Sphere Decoder (SD)-based detection in Multiple-Input Multiple-Output (MIMO) wireless systems. PP is a computationally demanding operation, yet it represents only a small fraction of the overall cost of detecting an OFDM MIMO frame in standards such as 802.11n. Despite this, real-time PP architectures are highly inefficient, dominating the resource cost of real-time SD architectures. This paper resolves this issue. By reorganising the ordering and QR decomposition sub-operations of PP, we describe a Field Programmable Gate Array (FPGA)-based PP architecture for the Fixed Complexity Sphere Decoder (FSD) applied to 4 × 4 802.11n MIMO which reduces resource cost by 50% compared with state-of-the-art solutions whilst maintaining real-time performance.
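For readers unfamiliar with the ordering and QR decomposition sub-operations, the following sketch shows a generic column-norm-sorted QR decomposition in floating point; the FSD uses its own ordering rule and a fixed-point hardware datapath, so this is only a conceptual reference model:

```python
import numpy as np

def sorted_qr(H):
    """Column-norm-sorted QR decomposition, a common MIMO preprocessing step.

    Returns Q, R and the column permutation such that H[:, perm] = Q @ R.
    """
    H = np.asarray(H, dtype=complex)
    m, n = H.shape
    perm = np.arange(n)
    Q = H.copy()
    R = np.zeros((n, n), dtype=complex)
    for k in range(n):
        # pick the remaining column with the smallest residual norm
        j = k + np.argmin(np.linalg.norm(Q[:, k:], axis=0))
        Q[:, [k, j]] = Q[:, [j, k]]
        R[:, [k, j]] = R[:, [j, k]]
        perm[[k, j]] = perm[[j, k]]
        # modified Gram-Schmidt step
        R[k, k] = np.linalg.norm(Q[:, k])
        Q[:, k] /= R[k, k]
        for i in range(k + 1, n):
            R[k, i] = Q[:, k].conj() @ Q[:, i]
            Q[:, i] -= R[k, i] * Q[:, k]
    return Q, R, perm

rng = np.random.default_rng(0)
H = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
Q, R, perm = sorted_qr(H)
print(np.allclose(H[:, perm], Q @ R))  # True
```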
Abstract:
We study the computational complexity of finding maximum a posteriori (MAP) configurations in Bayesian networks whose probabilities are specified by logical formulas. This approach leads to a fine-grained study in which local information such as context-sensitive independence and determinism can be considered. It also allows us to characterize more precisely the jump from tractability to NP-hardness and beyond, and to consider the complexity introduced by evidence alone.
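To make the setting concrete, here is a toy illustration (not from the paper) of MAP inference in a network whose conditional probability table is a deterministic logical formula; the brute-force enumeration also hints at why the general problem is hard:

```python
from itertools import product

# Toy network: Rain -> WetGround <- Sprinkler, with the deterministic
# CPT WetGround = Rain OR Sprinkler. All names are hypothetical.
p_rain = {True: 0.2, False: 0.8}
p_sprinkler = {True: 0.3, False: 0.7}

def p_wet(wet, rain, sprinkler):
    # deterministic CPT expressed as a logical formula
    return 1.0 if wet == (rain or sprinkler) else 0.0

def map_assignment(evidence):
    """Enumerate all configurations consistent with the evidence and
    return the most probable one (exponential in network size)."""
    best, best_p = None, -1.0
    for rain, spr, wet in product([True, False], repeat=3):
        assn = {"Rain": rain, "Sprinkler": spr, "WetGround": wet}
        if any(assn[k] != v for k, v in evidence.items()):
            continue
        p = p_rain[rain] * p_sprinkler[spr] * p_wet(wet, rain, spr)
        if p > best_p:
            best, best_p = assn, p
    return best, best_p

# MAP given that the ground is wet: Rain=False, Sprinkler=True (p = 0.24)
print(map_assignment({"WetGround": True}))
```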
Abstract:
To value something, you first have to know what it is. Bartkowski et al. (2015) reveal a critical weakness: biodiversity has rarely, if ever, been defined in economic valuations of putative biodiversity. Here we argue that a precise definition is available and could help focus valuation studies, but that in using this scientific definition (a three-dimensional measure of total difference), valuation by stated-preference methods becomes, at best, very difficult. We reclassify the valuation studies reviewed by Bartkowski et al. (2015) to better reflect the biological definition of biodiversity and its potential indirect use value as the support for provisioning and regulating services. Our analysis shows that almost all of the studies reviewed by Bartkowski et al. (2015) were not about biodiversity, but rather about the 'vague notion' of naturalness, or sometimes a specific biological component of diversity. Alternative economic methods should be found to value biodiversity as it is defined in natural science. We suggest options based on a production function analogy or on cost-based methods. The first of these, in particular, provides a strong link between economic theory and ecological research and is empirically practical. Since applied science emphasizes a scientific definition of biodiversity in the design and justification of conservation plans, the need for economic valuation of this quantitative meaning of biodiversity is considerable and as yet unfulfilled.
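As a hedged illustration of the production-function idea (all functional forms and numbers below are hypothetical, not drawn from the paper), a biodiversity measure can be treated as an input to a marketed ecosystem service and valued at its marginal product:

```python
# Biodiversity B enters the production of a marketed ecosystem service;
# its marginal value is price * dQ/dB.

def service_output(B, effort, alpha=0.4):
    """Cobb-Douglas-style production: service output as a function of
    a biodiversity measure B and a conventional input (effort)."""
    return (B ** alpha) * (effort ** (1 - alpha))

def marginal_value_of_biodiversity(B, effort, price, dB=1e-6):
    # numerical marginal product of biodiversity, valued at market price
    dQ = service_output(B + dB, effort) - service_output(B, effort)
    return price * dQ / dB

print(marginal_value_of_biodiversity(B=50.0, effort=100.0, price=12.0))
```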
Abstract:
Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest, and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems mean that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called 'reduced complexity' models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. However, there is little consensus on exactly what constitutes a reduced complexity model, and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed 'appropriate complexity modelling' of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) the system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.
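As a concrete example of the kind of 'reduced complexity' model at issue, the classic one-line shoreline model collapses wave-driven alongshore sediment transport into a single diffusion equation for shoreline position. The sketch below, with illustrative parameter values, is not from the paper:

```python
import numpy as np

def one_line_model(y, D=0.02, dx=100.0, dt=3600.0, steps=24 * 365):
    """Explicit finite-difference solution of dy/dt = D * d2y/dx2,
    where y(x, t) is shoreline position (m) and D is a wave-driven
    alongshore diffusivity (m^2/s)."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable; reduce dt"
    y = y.copy()
    for _ in range(steps):
        y[1:-1] += r * (y[2:] - 2 * y[1:-1] + y[:-2])
    return y

x = np.arange(0, 5000.0, 100.0)
y0 = 50.0 * np.exp(-((x - 2500.0) / 400.0) ** 2)  # initial beach bulge (m)
y1 = one_line_model(y0)                            # one year of diffusion
print(f"max shoreline position: {y0.max():.1f} m -> {y1.max():.1f} m")
```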
Abstract:
This letter analyzes the performance of a low complexity detection scheme for multi-carrier index keying (MCIK) with orthogonal frequency division multiplexing (OFDM) over two-wave with diffuse power (TWDP) fading channels. A closed-form expression for the average pairwise error probability (PEP) over TWDP fading channels is derived. This expression is used to analyze the performance of MCIK-OFDM in moderate, severe and extreme fading conditions. The presented results provide insight into the performance of MCIK-OFDM for wireless communication systems that operate in enclosed metallic structures, such as in-vehicle device-to-device (D2D) wireless networks.
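For intuition about the channel model (the letter itself works with a closed-form PEP rather than simulation), the following sketch generates TWDP fading samples from the standard two-specular-plus-diffuse construction; parameter choices are illustrative:

```python
import numpy as np

def twdp_samples(K, delta, n, sigma=1.0, rng=None):
    """TWDP fading: two specular waves plus diffuse scatter, parameterised
    by K = (V1^2 + V2^2) / (2*sigma^2) and Delta = 2*V1*V2 / (V1^2 + V2^2)."""
    rng = rng or np.random.default_rng()
    # recover specular amplitudes V1 >= V2 from (K, Delta)
    v_sq_sum = 2 * K * sigma**2            # V1^2 + V2^2
    v_prod = delta * v_sq_sum / 2          # V1 * V2
    v1_sq = (v_sq_sum + np.sqrt(v_sq_sum**2 - 4 * v_prod**2)) / 2
    v1, v2 = np.sqrt(v1_sq), v_prod / np.sqrt(v1_sq)
    phi1, phi2 = rng.uniform(0, 2 * np.pi, (2, n))
    diffuse = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
    return v1 * np.exp(1j * phi1) + v2 * np.exp(1j * phi2) + diffuse

# severe fading example: strong specular waves of nearly equal power;
# mean power should approach 2*sigma^2*(K + 1) = 22
h = twdp_samples(K=10.0, delta=0.9, n=100_000)
print(f"mean power: {np.mean(np.abs(h) ** 2):.2f}")
```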
Abstract:
Multi-carrier index keying (MCIK) is a recently developed transmission technique that exploits the sub-carrier indices as an additional degree of freedom for data transmission. This paper investigates the performance of a low complexity detection scheme with diversity reception for MCIK with orthogonal frequency division multiplexing (OFDM). For the performance evaluation, exact and approximate closed-form expressions for the pairwise error probability (PEP) of a greedy detector (GD) with maximal ratio combining (MRC) are derived. The presented results show that the performance of the GD is significantly improved when MRC diversity is employed. The proposed hybrid scheme is found to outperform maximum likelihood (ML) detection, with a substantial reduction in the associated computational complexity.
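A minimal sketch of what a greedy detector with MRC might look like for index detection; the array shapes, names, test statistic and noise level here are assumptions for illustration, not the paper's exact receiver:

```python
import numpy as np

def greedy_detect(Y, H, k_active):
    """Greedy index detection with L-branch maximal ratio combining.

    Y, H : (L, N) arrays of received symbols and channel gains.
    Returns the indices of the k_active subcarriers declared active."""
    # MRC per subcarrier: sum_l conj(h_l) * y_l, normalised by channel power
    mrc = np.sum(np.conj(H) * Y, axis=0)
    energy = np.abs(mrc) ** 2 / np.sum(np.abs(H) ** 2, axis=0)
    # declare the k_active subcarriers with the largest combined energy active
    return np.sort(np.argsort(energy)[-k_active:])

rng = np.random.default_rng(1)
L, N, K = 2, 8, 2                        # 2 receive antennas, 8 subcarriers
active = np.sort(rng.choice(N, K, replace=False))
X = np.zeros(N, dtype=complex)
X[active] = 1.0                          # unit-energy symbols on active tones
H = (rng.standard_normal((L, N)) + 1j * rng.standard_normal((L, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((L, N)) + 1j * rng.standard_normal((L, N)))
Y = H * X + noise
print(active, greedy_detect(Y, H, K))    # detected indices should usually match
```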
Abstract:
Complexity and environmental uncertainty in public sector systems require leaders to balance the administrative practices necessary to be aligned and efficient in the management of routine challenges with the adaptive practices required to respond to complex and dynamic circumstances. Conventional notions of leadership in the field of public administration do not fully explain the role of leadership in enabling and balancing the entanglement of formal, top-down, administrative functions and informal, emergent, adaptive functions within public sector settings with different levels of complexity. Drawing on and extending existing complexity leadership constructs, this paper explores how change was enabled over the duration of three urban regeneration projects, representing high, medium and low levels of project complexity, respectively. The data reveal six distinct yet interconnected functions of enabling leadership identified within the three urban regeneration projects. The paper contributes to our understanding of how leadership is enacted and poses questions for those engaged in leading in complex public sector settings.
Abstract:
Several north temperate marine species were recorded on subtidal hard-substratum reef sites selected to produce a gradient of structural complexity. The study employed an established scuba-based census method, the belt transect. The three types of reef examined, with a measured gradient of increasing structural complexity, were natural rocky reef, artificial reef constructed of solid concrete blocks, and artificial reef made of concrete blocks with voids. Surveys were undertaken monthly over a calendar year using randomly placed fixed rope transects. For a number of conspicuous species of fish and invertebrates, significant differences in abundance were found between the levels of habitat complexity. Overall abundance for many of the species examined was 2-3 times higher on the complex artificial habitats than on simple artificial or natural reef habitats. The enhanced habitat availability produced by the increased structural complexity of specifically designed artificial reefs may have the potential to augment faunal abundance while promoting species diversity.
Abstract:
Many have called for medical students to learn how to manage complexity in healthcare. This study examines the nuances of students' challenges in coping with a complex simulation learning activity, using concepts from complexity theory, and suggests strategies to help them better understand and manage complexity. Wearing video glasses, participants took part in a simulation ward-based exercise that incorporated characteristics of complexity. Video footage was used to elicit interviews, which were transcribed. Using complexity theory as a theoretical lens, an iterative approach was taken to identify the challenges that participants faced and possible coping strategies, using both interview transcripts and video footage. Students' challenges in coping with clinical complexity included being: a) unprepared for 'diving in', b) caught in an escalating system, c) captured by the patient, and d) unable to assert boundaries of acceptable practice. Many characteristics of complexity can be recreated in a ward-based simulation learning activity, affording learners an embodied and immersive experience of these complexity challenges. Possible strategies for managing complexity include: a) taking time to size up the system, b) attuning to what emerges, c) reducing complexity, d) boundary practices, and e) working with uncertainty. This study signals pedagogical opportunities for recognizing and dealing with complexity.