921 results for complexity


Relevance: 20.00%

Abstract:

Prescribing tasks, which involve pharmacological knowledge, clinical decision-making and practical skill, take place within unpredictable social environments and involve interactions within and between endlessly changing health care teams. Despite this, curriculum designers commonly assume them to be simple to learn and perform. This research used mixed methods to explore how undergraduate medical students learn to prescribe in the 'real world'. It was informed by cognitive psychology, sociocultural theory, and systems thinking. We found that learning to prescribe occurs as a dynamic series of socially negotiated interactions within and between individuals, communities and environments. Alongside a thematic analysis, we developed a framework of three conceptual spaces in which learning opportunities for prescribing occur. This illustrates a complex systems view of prescribing education and defines three major system components: the "social space", where the environmental conditions influence or bring about a learning experience; the "process space", describing what happens during the learning experience; and the intra-personal "cognitive space", where the learner may develop aspects of prescribing expertise. This conceptualisation broadens the scope of inquiry of prescribing education research by highlighting the complex interplay between individual and social dimensions of learning. This perspective is also likely to be relevant to students' learning of other clinical competencies.

Relevance: 20.00%

Abstract:

OBJECTIVE: To demonstrate the benefit of complexity metrics such as the modulation complexity score (MCS) and monitor units (MUs) in multi-institutional audits of volumetric-modulated arc therapy (VMAT) delivery.

METHODS: 39 VMAT treatment plans were analysed using MCS and MU. A virtual phantom planning exercise was planned and independently measured using the PTW Octavius® phantom and seven29® 2D array (PTW-Freiburg GmbH, Freiburg, Germany). MCS and MU were compared with the median gamma index pass rates (2%/2 mm and 3%/3 mm) and with plan quality. The treatment planning systems (TPS) were grouped by whether their VMAT modelling was specifically designed for the linear accelerator manufacturer's own treatment delivery system (Type 1) or was independent of vendor for VMAT delivery (Type 2). Differences in plan complexity (MCS and MU) between TPS types were compared.

RESULTS: For Varian® linear accelerators (Varian® Medical Systems, Inc., Palo Alto, CA), MCS and MU were significantly correlated with gamma pass rates. Type 2 TPS created poorer quality, more complex plans with significantly higher MUs and MCS than Type 1 TPS. Plan quality was significantly correlated with MU for Type 2 plans. A statistically significant correlation was observed between MU and MCS for all plans (R = -0.84, p < 0.01).

CONCLUSION: MU and MCS have a role in assessing plan complexity in audits along with plan quality metrics. Plan complexity metrics give some indication of plan deliverability but should be analysed with plan quality.

ADVANCES IN KNOWLEDGE: Complexity metrics were investigated for a national rotational audit involving 34 institutions and demonstrated their value. The metrics showed that more complex plans were produced by planning systems that were independent of the vendor's VMAT delivery system.
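The kind of correlation analysis reported above can be sketched numerically. The snippet below computes a Pearson correlation between monitor units and gamma pass rates on synthetic data; the values and the assumed MU-versus-pass-rate trend are illustrative only, not the audit's data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative plan-level metrics for a cohort of 39 VMAT plans: monitor
# units (MU) and a gamma pass rate constructed to fall as MU rises. The
# numbers are synthetic -- they only demonstrate the correlation analysis.
mu = rng.uniform(300, 900, size=39)                     # monitor units per plan
gamma_pass = 99.0 - 0.01 * mu + rng.normal(0, 1.0, 39)  # % passing 3%/3 mm

# Pearson correlation, as used to relate plan complexity to deliverability
r = np.corrcoef(mu, gamma_pass)[0, 1]
print(f"R = {r:.2f}")                                   # negative in this toy data
```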

Relevance: 20.00%

Abstract:

Pre-processing (PP) of the received symbol vector and channel matrices is an essential prerequisite for Sphere Decoder (SD)-based detection in Multiple-Input Multiple-Output (MIMO) wireless systems. PP is a highly complex operation, yet it accounts for only a small fraction of the overall computational cost of detecting an OFDM MIMO frame in standards such as 802.11n. Despite this, real-time PP architectures are highly inefficient, dominating the resource cost of real-time SD architectures. This paper resolves this issue. By reorganising the ordering and QR decomposition sub-operations of PP, we describe a Field Programmable Gate Array (FPGA)-based PP architecture for the Fixed Complexity Sphere Decoder (FSD) applied to 4 × 4 802.11n MIMO which reduces resource cost by 50% compared with state-of-the-art solutions whilst maintaining real-time performance.
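The combined ordering-plus-QR pre-processing described above can be sketched in floating point. The following is an illustrative sorted QR decomposition (SQRD); the FSD's specific ordering rule and the FPGA data-path details are not reproduced here.

```python
import numpy as np

def sorted_qr(H):
    """Sorted QR decomposition (SQRD): at each step pick the remaining column
    of H with the smallest residual norm, then orthogonalise it (modified
    Gram-Schmidt). Returns Q, R and the column permutation. FSD uses its own
    ordering rule; this sketch only shows the ordering-plus-QR mechanics."""
    m, n = H.shape
    Q = H.copy().astype(complex)
    R = np.zeros((n, n), dtype=complex)
    perm = np.arange(n)
    for i in range(n):
        # choose the remaining column with the smallest residual norm
        k = i + int(np.argmin(np.linalg.norm(Q[:, i:], axis=0)))
        Q[:, [i, k]] = Q[:, [k, i]]
        R[:, [i, k]] = R[:, [k, i]]
        perm[[i, k]] = perm[[k, i]]
        R[i, i] = np.linalg.norm(Q[:, i])
        Q[:, i] /= R[i, i]
        for j in range(i + 1, n):
            R[i, j] = Q[:, i].conj() @ Q[:, j]
            Q[:, j] -= R[i, j] * Q[:, i]
    return Q, R, perm

# 4x4 MIMO channel, as in the 802.11n configuration targeted by the paper
rng = np.random.default_rng(1)
H = (rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))) / np.sqrt(2)
Q, R, perm = sorted_qr(H)
print(np.allclose(Q @ R, H[:, perm]))  # → True
```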

Relevance: 20.00%

Abstract:

We study the computational complexity of finding maximum a posteriori (MAP) configurations in Bayesian networks whose probabilities are specified by logical formulas. This approach leads to a fine-grained study in which local information such as context-sensitive independence and determinism can be considered. It also allows us to characterize more precisely the jump from tractability to NP-hardness and beyond, and to consider the complexity introduced by evidence alone.
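The MAP task discussed above can be illustrated on a tiny network by exhaustive enumeration; all names and numbers below are invented for the example, and the deterministic CPT row stands in for the logical (context-specific) structure the paper exploits.

```python
import itertools

# Toy chain network A -> B -> C whose CPTs mix probabilistic and
# deterministic (logical) entries.
pA = {0: 0.4, 1: 0.6}
pB_given_A = {0: {0: 1.0, 1: 0.0},   # B is a deterministic function of A when A = 0
              1: {0: 0.3, 1: 0.7}}
pC_given_B = {0: {0: 0.8, 1: 0.2},
              1: {0: 0.1, 1: 0.9}}

def joint(a, b, c):
    """Joint probability of one full configuration of the network."""
    return pA[a] * pB_given_A[a][b] * pC_given_B[b][c]

# MAP over {A, B} given evidence C = 1, by brute force. (Here every
# non-evidence variable is a MAP variable, so this is the MPE special case;
# general MAP additionally sums out the remaining variables.)
best = max(itertools.product((0, 1), repeat=2),
           key=lambda ab: joint(ab[0], ab[1], 1))
print(best)  # → (1, 1)
```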

Relevance: 20.00%

Abstract:

To value something, you first have to know what it is. Bartkowski et al. (2015) reveal a critical weakness: that biodiversity has rarely, if ever, been defined in economic valuations of putative biodiversity. Here we argue that a precise definition is available and could help focus valuation studies, but that in using this scientific definition (a three-dimensional measure of total difference), valuation by stated-preference methods becomes, at best, very difficult. We reclassify the valuation studies reviewed by Bartkowski et al. (2015) to better reflect the biological definition of biodiversity and its potential indirect use value as the support for provisioning and regulating services. Our analysis shows that almost all of the studies reviewed by Bartkowski et al. (2015) were not about biodiversity, but rather were about the 'vague notion' of naturalness, or sometimes a specific biological component of diversity. Alternative economic methods should be found to value biodiversity as it is defined in natural science. We suggest options based on a production function analogy or cost-based methods. The first of these, in particular, provides a strong link between economic theory and ecological research and is empirically practical. Since applied science emphasizes a scientific definition of biodiversity in the design and justification of conservation plans, the need for economic valuation of this quantitative meaning of biodiversity is considerable and as yet unfulfilled.

Relevance: 20.00%

Abstract:

Coastal and estuarine landforms provide a physical template that not only accommodates diverse ecosystem functions and human activities, but also mediates flood and erosion risks that are expected to increase with climate change. In this paper, we explore some of the issues associated with the conceptualisation and modelling of coastal morphological change at time and space scales relevant to managers and policy makers. Firstly, we revisit the question of how to define the most appropriate scales at which to seek quantitative predictions of landform change within an age defined by human interference with natural sediment systems and by the prospect of significant changes in climate and ocean forcing. Secondly, we consider the theoretical bases and conceptual frameworks for determining which processes are most important at a given scale of interest and the related problem of how to translate this understanding into models that are computationally feasible, retain a sound physical basis and demonstrate useful predictive skill. In particular, we explore the limitations of a primary scale approach and the extent to which these can be resolved with reference to the concept of the coastal tract and application of systems theory. Thirdly, we consider the importance of different styles of landform change and the need to resolve not only incremental evolution of morphology but also changes in the qualitative dynamics of a system and/or its gross morphological configuration. The extreme complexity and spatially distributed nature of landform systems means that quantitative prediction of future changes must necessarily be approached through mechanistic modelling of some form or another. Geomorphology has increasingly embraced so-called ‘reduced complexity’ models as a means of moving from an essentially reductionist focus on the mechanics of sediment transport towards a more synthesist view of landform evolution. 
However, there is little consensus on exactly what constitutes a reduced complexity model and the term itself is both misleading and, arguably, unhelpful. Accordingly, we synthesise a set of requirements for what might be termed ‘appropriate complexity modelling’ of quantitative coastal morphological change at scales commensurate with contemporary management and policy-making requirements: 1) The system being studied must be bounded with reference to the time and space scales at which behaviours of interest emerge and/or scientific or management problems arise; 2) model complexity and comprehensiveness must be appropriate to the problem at hand; 3) modellers should seek a priori insights into what kind of behaviours are likely to be evident at the scale of interest and the extent to which the behavioural validity of a model may be constrained by its underlying assumptions and its comprehensiveness; 4) informed by qualitative insights into likely dynamic behaviour, models should then be formulated with a view to resolving critical state changes; and 5) meso-scale modelling of coastal morphological change should reflect critically on the role of modelling and its relation to the observable world.

Relevance: 20.00%

Abstract:

This letter analyzes the performance of a low complexity detection scheme for a multi-carrier index keying (MCIK) with orthogonal frequency division multiplexing (OFDM) system over two-wave with diffused power (TWDP) fading channels. A closed-form expression for the average pairwise error probability (PEP) over TWDP fading channels is derived. This expression is used to analyze the performance of MCIK-OFDM in moderate, severe and extreme fading conditions. The presented results provide insight into the performance of MCIK-OFDM for wireless communication systems that operate in enclosed metallic structures, such as in-vehicular device-to-device (D2D) wireless networks.
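A TWDP channel can be simulated directly from its definition (two specular waves with random phases plus a diffuse component), which gives a Monte Carlo stand-in for the closed-form PEP analysis. The parameters below (K = 10, delta = 1, SNR = 15 dB, a binary sign decision) are illustrative assumptions, not the letter's setup.

```python
import numpy as np

rng = np.random.default_rng(2)

def twdp_channel(n, K=10.0, delta=1.0):
    """Draw n complex TWDP channel gains: two specular waves with independent
    random phases plus a diffuse (Rayleigh) component. K is the specular-to-
    diffuse power ratio; delta in [0, 1] equals 1 when the two specular
    amplitudes are equal (the most severe TWDP condition). Unit mean power."""
    S = K / (K + 1.0)                                  # total specular power
    v1 = 0.5 * (np.sqrt(S * (1 + delta)) + np.sqrt(S * (1 - delta)))
    v2 = 0.5 * (np.sqrt(S * (1 + delta)) - np.sqrt(S * (1 - delta)))
    sigma = np.sqrt(0.5 / (K + 1.0))                   # per-dimension diffuse std
    ph = rng.uniform(0, 2 * np.pi, size=(2, n))
    diffuse = sigma * (rng.normal(size=n) + 1j * rng.normal(size=n))
    return v1 * np.exp(1j * ph[0]) + v2 * np.exp(1j * ph[1]) + diffuse

# Monte Carlo error-probability estimate for a binary decision over TWDP fading
n, snr = 200_000, 10 ** (15 / 10)                      # 15 dB SNR
h = twdp_channel(n)
noise = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
r = h + noise / np.sqrt(snr)                           # transmit symbol +1
errors = np.real(np.conj(h) * r) < 0                   # coherent sign decision
print(f"P(error) ≈ {errors.mean():.4f}")
```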

Relevance: 20.00%

Abstract:

Multi-carrier index keying (MCIK) is a recently developed transmission technique that exploits the sub-carrier indices as an additional degree of freedom for data transmission. This paper investigates the performance of a low complexity detection scheme with diversity reception for MCIK with orthogonal frequency division multiplexing (OFDM). For the performance evaluation, exact and approximate closed-form expressions for the pairwise error probability (PEP) of a greedy detector (GD) with maximal ratio combining (MRC) are derived. The presented results show that the performance of the GD is significantly improved when MRC diversity is employed. The proposed hybrid scheme is found to outperform maximum likelihood (ML) detection with a substantial reduction in the associated computational complexity.
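The low-complexity detector in question can be sketched in a few lines: after MRC combining across antennas, the greedy detector simply picks the subcarriers with the largest combined energy as the active set. All parameter values below (subblock size, SNR, antenna count) are chosen for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy MCIK-OFDM subblock: n subcarriers of which k are active, received on
# L antennas. The greedy detector (GD) combines the antennas with MRC and
# picks the k subcarriers with the largest combined energy as the active set.
n, k, L = 4, 1, 2
snr = 10 ** (15 / 10)                                  # 15 dB

active = np.sort(rng.choice(n, size=k, replace=False)) # transmitted index pattern
x = np.zeros(n, dtype=complex)
x[active] = 1.0                                        # unit-energy symbols on active tones

H = (rng.normal(size=(L, n)) + 1j * rng.normal(size=(L, n))) / np.sqrt(2)
W = (rng.normal(size=(L, n)) + 1j * rng.normal(size=(L, n))) / np.sqrt(2 * snr)
Y = H * x + W                                          # per-antenna received subblock

# MRC across antennas, then greedy selection of the k strongest subcarriers
combined = np.sum(np.conj(H) * Y, axis=0)
energy = np.abs(combined) ** 2 / np.sum(np.abs(H) ** 2, axis=0)
detected = np.sort(np.argsort(energy)[-k:])
print("transmitted:", active, "detected:", detected)
```

Unlike ML detection, which scores every legal index pattern, the GD needs only one energy sort per subblock, which is where the complexity saving comes from.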

Relevance: 20.00%

Abstract:

Complexity and environmental uncertainty in public sector systems require leaders to balance the administrative practices necessary to be aligned and efficient in the management of routine challenges, and the adaptive practices required to respond to complex and dynamic circumstances. Conventional notions of leadership in the field of public administration do not fully explain the role of leadership in enabling and balancing the entanglement of formal, top-down, administrative functions and informal, emergent, adaptive functions within public sector settings with different levels of complexity. Drawing on and extending existing complexity leadership constructs, this paper explores how change was enabled over the duration of three urban regeneration projects, representing high, medium and low levels of project complexity respectively. The data reveal six distinct yet interconnected functions of enabling leadership that were identified within the three urban regeneration projects. The paper contributes to our understanding of how leadership is enacted and poses questions for those engaged in leading in complex public sector settings.

Relevance: 20.00%

Abstract:

Background Complex medication regimens may adversely affect compliance and treatment outcomes. Complexity can be assessed with the medication regimen complexity index (MRCI), which has proved to be a valid, reliable tool, with potential uses in both practice and research. Objective To use the MRCI to assess medication regimen complexity in institutionalized elderly people. Setting Five nursing homes in mainland Portugal. Methods A descriptive, cross-sectional study of institutionalized elderly people (n = 415) was performed from March to June 2009, including all inpatients aged 65 and over taking at least one medication per day. Main outcome measure Medication regimen complexity index. Results The mean age of the sample was 83.9 years (SD 6.6 years), and 60.2 % were women. The elderly patients were taking a large number of drugs, with 76.6 % taking more than five medications per day. The average medication regimen complexity was 18.2 (SD = 9.6), and was higher in females (p < 0.001). The most decisive factors contributing to the complexity were the number of drugs and dosage frequency. In regimens with the same number of medications, schedule was the most relevant factor in the final score (r = 0.922), followed by pharmaceutical forms (r = 0.768) and additional instructions (r = 0.742). Conclusion Medication regimen complexity proved to be high. There is clear potential for pharmacist intervention to reduce it as part of the medication review routine for all patients.

Relevance: 20.00%

Abstract:

In this paper, the performance and convergence time comparisons of various low-complexity LMS algorithms used for the coefficient update of an adaptive I/Q corrector for quadrature receivers are presented. We identify the optimum LMS algorithm for low complexity, high performance and high-order QAM and PSK constellations. Moreover, the influence of finite bit precision on VLSI implementations of such algorithms is explored through extensive simulations, and optimum wordlengths are established.
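An adaptive I/Q corrector of this family can be sketched as follows. The imbalanced signal is modelled as x = s + k·conj(s), and the corrector forms y = x + w·conj(x), adapting w with an LMS-style rule that drives E[y²] toward zero (a balanced complex baseband signal is circular, so E[y²] = 0). This is one common low-complexity blind variant shown purely as an illustration; the paper's specific LMS variants and fixed-point wordlengths are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(4)

N, mu = 50_000, 5e-4
s = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)  # QPSK
k = 0.10 + 0.05j                       # illustrative image-leakage coefficient
x = s + k * np.conj(s)                 # received signal with I/Q imbalance

w = 0j
for i in range(N):
    y = x[i] + w * np.conj(x[i])       # corrected sample
    w -= mu * y * y                    # circularity-restoring LMS update

# After convergence w ~ -k, so the residual image leakage |k + w| is small
print(f"|k| = {abs(k):.3f} -> residual |k + w| = {abs(k + w):.3f}")
```

The update needs only one complex multiply-accumulate per sample beyond the correction itself, which is what makes this family attractive for VLSI implementation.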

Relevance: 20.00%

Abstract:

In this paper, the digital part of a self-calibrating quadrature receiver, containing a digital calibration engine, is described. The blind source-separation-based calibration engine eliminates the RF impairments in real time, improving the receiver's performance without the need for test/pilot tones, trimming or power-hungry discrete components. Furthermore, an efficient time-multiplexed calibration engine architecture is proposed and implemented on an FPGA utilising a reduced-range multiplier structure. The use of reduced-range multipliers results in a substantial reduction of area as well as power consumption without a compromise in performance when compared with an efficiently designed general-purpose multiplier. The performance of the calibration engine does not depend on the modulation format or the constellation size of the received signal; hence it can be easily integrated into the digital signal processing paths of any receiver.
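The reduced-range multiplier idea can be modelled in a few lines: when a coefficient is known to stay close to 1.0, as calibration coefficients typically do, only the small deviation eps needs to be stored and multiplied. The Python sketch below is an illustrative fixed-point model under an assumed Q0.12 format, not the paper's RTL.

```python
# Compute x*(1 + eps) as x + (x*eps): the explicit multiplier then only has
# to cover the narrow range of eps rather than the full coefficient range.
FRAC = 12                                        # fractional bits for eps (assumed Q0.12)

def reduced_range_mul(x, eps_fixed):
    """Multiply integer sample x by (1 + eps), with eps_fixed = round(eps * 2^FRAC).
    The x * eps_fixed product is far narrower than a full-width x * coefficient."""
    return x + ((x * eps_fixed) >> FRAC)

eps_fixed = round(0.0178 * (1 << FRAC))          # coefficient 1.0178 -> eps = 0.0178
print(reduced_range_mul(1000, eps_fixed))        # → 1017 (exact value 1017.8; shift truncates)
```

In hardware the narrow eps word shrinks the partial-product array of the multiplier, which is where the area and power savings reported above originate.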