907 results for Complexity score
Abstract:
This chapter takes as its central premise the human capacity to adapt to changing environments. This idea is central to complexity theory but receives only modest attention in relation to learning. To explore it, we draw from a range of fields and then consider some recent research in motor control that may extend the discussion in ways not yet considered, while building on advances already made within pedagogy and motor control synergies. Recent work in motor control indicates that humans have far greater capacity to adapt to the ‘product space’ than was previously thought, mainly through fast heuristics and on-line corrections. These are changes that can be made in real (movement) time and are facilitated by what are referred to as ‘feed-forward’ mechanisms, which take advantage of ultra-fast ways of recognizing the likely outcomes of our movements and using these as a source of feedback. We conclude by discussing some possible ideas for pedagogy within the sport and physical activity domains, the implications of which would require a rethink of how motor skill learning opportunities might best be facilitated.
Abstract:
Introduction to Youth Services is a second year Social Work and Human Services unit. In this unit a reflective writing task was introduced to assess students’ reflections on an ongoing tutorial discussion to which they contributed. The discussion was based on a fictional young person each tutorial group ‘worked with’ across eight weeks of a semester. In developing the process and the criteria for the reflective journal, the ideas raised by the Teaching and Assessing Reflective Learning (TARL) in Higher Education project (see Chap. 2) were utilised, scaffolding the work with resources and submission of a draft. The students were also invited to choose the form of reflective process they used: it could be a written journal, but did not need to be. The evidence demonstrated that a reflective journal is an effective tool for students to record their developing understanding of the concept that the issues people experience are complex and compounding. Importantly, it was also a useful vehicle for students to begin to consider the impacts of their own and others’ values and beliefs on their response to the issues raised within the case discussion. The reflective journal also helped participants to consider how this learning contributes to the ongoing development of their professional practice framework.
Abstract:
Biomedical systems involve a large number of entities and intricate interactions among them. Their direct analysis is therefore difficult, and it is often necessary to rely on computational models. These models require significant computational resources, making parallel computing solutions necessary; such approaches are particularly well suited given the inherently parallel nature of biomedical systems. Model hybridisation also permits the integration and simultaneous study of multiple aspects and scales of these systems, thus providing an efficient platform for multidisciplinary research.
Abstract:
Many researchers in the field of civil structural health monitoring (SHM) have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Fieldwork has also been conducted by many researchers and practitioners on more complex operating bridges. Most laboratory structures do not adequately replicate the complexity of truss bridges. Informed by a brief review of the literature, this paper documents the design and proposed test plan of a structurally complex laboratory bridge model that has been specifically designed for the purpose of SHM research. Preliminary results have been presented in the companion paper.
Abstract:
Many researchers in the field of civil structural health monitoring have developed and tested their methods on simple to moderately complex laboratory structures such as beams, plates, frames, and trusses. Fieldwork has also been conducted by many researchers and practitioners on more complex operating bridges. Most laboratory structures do not adequately replicate the complexity of truss bridges. This paper presents some preliminary results of experimental modal testing and analysis of the bridge model presented in the companion paper, using the peak picking method, and compares these results with those of a simple numerical model of the structure. Three dominant modes of vibration were experimentally identified below 15 Hz. The mode shapes and order of the modes matched those of the numerical model; however, the frequencies did not match.
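The peak-picking step mentioned in the abstract can be sketched in a few lines: natural frequencies are read off as local maxima of a response magnitude spectrum. This is a minimal illustration, not the authors' implementation; real modal testing works on frequency response functions with coherence checks, and the data below are invented.

```python
# Hypothetical sketch of peak picking: report the frequencies at which the
# magnitude spectrum has a local maximum above a threshold. Invented data.

def pick_peaks(freqs, magnitude, threshold=0.0):
    """Return the frequencies of local maxima in a sampled spectrum."""
    peaks = []
    for i in range(1, len(magnitude) - 1):
        if magnitude[i] > threshold and magnitude[i - 1] < magnitude[i] > magnitude[i + 1]:
            peaks.append(freqs[i])
    return peaks

freqs = [2, 4, 6, 8, 10, 12, 14]                  # Hz, invented grid
magnitude = [0.1, 0.9, 0.2, 0.5, 0.1, 0.8, 0.2]   # invented spectrum
modes = pick_peaks(freqs, magnitude)              # three modes below 15 Hz
```

On real data the spectrum would first be smoothed, and closely spaced or noisy peaks would need merging; the sketch ignores both.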
Abstract:
Migraine and major depressive disorder (MDD) are comorbid, moderately heritable and to some extent influenced by the same genes. In a previous paper, we suggested the possibility of causality (one trait causing the other) underlying this comorbidity. We present a new application of polygenic (genetic risk) score analysis to investigate the mechanisms underlying the genetic overlap of migraine and MDD. Genetic risk scores were constructed based on data from two discovery samples in which genome-wide association analyses (GWA) were performed for migraine and MDD, respectively. The Australian Twin Migraine GWA study (N = 6,350) included 2,825 migraine cases and 3,525 controls, 805 of whom met the diagnostic criteria for MDD. The RADIANT GWA study (N = 3,230) included 1,636 MDD cases and 1,594 controls. Genetic risk scores for migraine and for MDD were used to predict pure and comorbid forms of migraine and MDD in an independent Dutch target sample (NTR-NESDA, N = 2,966), which included 1,476 MDD cases and 1,058 migraine cases (723 of these individuals had both disorders concurrently). The observed patterns of prediction suggest that the 'pure' forms of migraine and MDD are genetically distinct disorders. The subgroup of individuals with comorbid MDD and migraine were genetically most similar to MDD patients. These results indicate that in at least a subset of migraine patients with MDD, migraine may be a symptom or consequence of MDD. © 2013 Springer-Verlag Berlin Heidelberg.
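The polygenic risk score construction described above can be illustrated with a minimal sketch: each variant's risk-allele count in the target individual is weighted by the effect size estimated in the discovery GWA sample and summed. The variant IDs and effect sizes below are invented for illustration, not taken from the study.

```python
# Hypothetical polygenic risk score: weight each risk-allele count (0, 1 or 2)
# by the effect size (beta) from the discovery sample and sum over variants.
# Variant IDs and betas are invented.

def polygenic_risk_score(allele_counts, effect_sizes):
    """Weighted sum of risk-allele counts across variants."""
    return sum(allele_counts[v] * beta for v, beta in effect_sizes.items())

effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.08}
individual = {"rs0001": 2, "rs0002": 1, "rs0003": 0}
score = polygenic_risk_score(individual, effect_sizes)
```

In the study's design, a score built from migraine effect sizes and one built from MDD effect sizes would each be computed for every target-sample individual and used to predict pure and comorbid case status.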
Abstract:
Background: Prediction of outcome after stroke is important for triage decisions, for prognostic estimates given to families, and for appropriate resource utilization. Prognostication must be timely and simple to apply. Several scales have shown good prognostic value. In Calgary, the Orpington Prognostic Score (OPS) has been used to predict outcome as an aid to rehabilitation triage. However, the OPS has not been assessed at one week for predictive capability. Methods: Among patients admitted to a sub-acute stroke unit, OPS from the first week were examined to determine whether any correlation existed between final disposition after rehabilitation and first week score. The predictive validity of the OPS at one week was compared to the National Institutes of Health Stroke Scale (NIHSS) score at 24 hours using logistic regression and receiver operating characteristic analysis. The primary outcome was final disposition: discharge directly home or death from the stroke unit, or discharge from the inpatient rehabilitation unit. Results: The first week OPS was highly predictive of final disposition. However, no major advantage in using the first week OPS was observed when compared to the 24-h NIHSS score. Both scales were equally predictive of the final disposition of stroke patients post rehabilitation. Conclusion: The first week OPS can be used to predict final outcome. The NIHSS at 24 h provides the same prognostic information.
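The receiver operating characteristic comparison used above rests on the fact that the area under the ROC curve (AUC) equals the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative one, with ties counting half. A minimal, stdlib-only sketch (not the authors' analysis code), with invented scores:

```python
# AUC by the rank (Mann-Whitney) identity: the fraction of (positive,
# negative) pairs in which the positive case scores higher, ties half.

def auc(scores_pos, scores_neg):
    """Probability that a positive case outranks a negative one."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Invented scores: higher score = predicted poor outcome.
poor_outcome = [3, 4, 5]
good_outcome = [1, 2, 3]
area = auc(poor_outcome, good_outcome)
```

Comparing two scales, as the study does for OPS and NIHSS, amounts to comparing their AUCs on the same patients.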
Abstract:
In the past two decades, complexity thinking has emerged as an important theoretical response to the limitations of orthodox ways of understanding educational phenomena. Complexity provides ways of understanding that embrace uncertainty, non-linearity and the inevitable ‘messiness’ that is inherent in educational settings, paying attention to the ways in which the whole is greater than the sum of its parts. This is the first book to focus on complexity thinking in the context of physical education, enabling fresh ways of thinking about research, teaching, curriculum and learning. Written by a team of leading international physical education scholars, the book highlights how the considerable theoretical promise of complexity can be reflected in the actual policies, pedagogies and practices of physical education (PE). It encourages teachers, educators and researchers to embrace notions of learning that are more organic and emergent, to allow the inherent complexity of pedagogical work in PE to be examined more broadly and inclusively. In doing so, Complexity Thinking in Physical Education makes a major contribution to our understanding of pedagogy, curriculum design and development, human movement and educational practice.
Abstract:
This study investigated a new performance indicator to assess climbing fluency (smoothness of the hip trajectory and orientation of a climber, using normalized jerk coefficients) to explore effects of practice and hold design on performance. Eight experienced climbers completed four repetitions of two 10-m-high routes with similar difficulty levels but varying in hold graspability (holds with one edge vs holds with two edges). An inertial measurement unit was attached to the hips of each climber to collect 3D acceleration and 3D orientation data to compute jerk coefficients. Results showed high correlations (r = .99, P < .05) between the normalized jerk coefficients of hip trajectory and orientation. Results also showed higher normalized jerk coefficients for the route with two graspable edges, perhaps due to more complex route finding and action regulation behaviors. This effect decreased with practice. The jerk coefficient of hip trajectory and orientation could be a useful indicator of climbing fluency for coaches, as its computation takes into account both spatial and temporal parameters (i.e., changes in both the climbing trajectory and the time taken to travel this trajectory).
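A normalized jerk coefficient of the kind described above can be sketched as follows: jerk is the third time-derivative of position, estimated here by finite differences, and the integral of squared jerk is scaled by duration and path length to make it dimensionless. The abstract does not give the exact normalization the authors used, so the scaling below is a commonly used assumption, and the 1-D trajectories are invented.

```python
import math

# Hypothetical dimensionless jerk measure for a 1-D trajectory sampled at a
# fixed interval dt. Scaling integrated squared jerk by duration**5 / path**2
# is one common way to make it unit-free; assumed here, not from the study.

def diff(xs, dt):
    """First-order finite-difference derivative."""
    return [(b - a) / dt for a, b in zip(xs, xs[1:])]

def normalized_jerk(position, dt):
    velocity = diff(position, dt)
    jerk = diff(diff(velocity, dt), dt)
    duration = dt * (len(position) - 1)
    path = sum(abs(v) * dt for v in velocity)      # path length travelled
    integral = sum(j * j * dt for j in jerk)       # integrated squared jerk
    return (duration ** 5 / path ** 2) * integral / 2

dt = 0.01
ts = [i * dt for i in range(101)]
smooth = [t for t in ts]                           # constant velocity: zero jerk
wobbly = [t + 0.05 * math.sin(20 * t) for t in ts] # superimposed oscillation
```

A fluent climb (smooth) yields a lower coefficient than a hesitant one (wobbly) over the same duration and similar path, which is the contrast the indicator is designed to capture.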
Abstract:
Objective Risk scores and accelerated diagnostic protocols can identify chest pain patients at low risk of major adverse cardiac events who could be discharged early from the ED, saving time and costs. We aimed to derive and validate a chest pain score and accelerated diagnostic protocol (ADP) that could safely increase the proportion of patients suitable for early discharge. Methods Logistic regression identified statistical predictors for major adverse cardiac events in a derivation cohort. Statistical coefficients were converted to whole numbers to create a score. Clinician feedback was used to improve the clinical plausibility and the usability of the final score (Emergency Department Assessment of Chest pain Score [EDACS]). EDACS was combined with electrocardiogram results and troponin results at 0 and 2 h to develop an ADP (EDACS-ADP). The score and EDACS-ADP were validated and tested for reproducibility in separate cohorts of patients. Results In the derivation (n = 1974) and validation (n = 608) cohorts, the EDACS-ADP classified 42.2% (sensitivity 99.0%, specificity 49.9%) and 51.3% (sensitivity 100.0%, specificity 59.0%) as low risk of major adverse cardiac events, respectively. The intra-class correlation coefficient for categorisation of patients as low risk was 0.87. Conclusion The EDACS-ADP identified approximately half of the patients presenting to the ED with possible cardiac chest pain as having low risk of short-term major adverse cardiac events, with high sensitivity. This is a significant improvement on similar, previously reported protocols. The EDACS-ADP is reproducible and has the potential to make considerable cost reductions to health systems.
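The derivation step described above, converting statistical coefficients to whole numbers to create a score, can be sketched as follows. The predictors, coefficients, point values and threshold are invented for illustration; the real EDACS items and cut-off are not given in the abstract.

```python
# Hypothetical illustration of score derivation: scale the log-odds
# coefficients and round to whole-number points, then combine the score
# with ECG and 0 h / 2 h troponin results into an early-discharge rule.
# All predictors, betas and the threshold are invented.

def to_points(coefficients, scale=2.0):
    """Round scaled logistic-regression coefficients to integer points."""
    return {name: round(beta * scale) for name, beta in coefficients.items()}

coefficients = {"diaphoresis": 1.6, "pain_radiation": 2.4, "prior_cad": 1.1}
points = to_points(coefficients)

def chest_pain_score(findings, points):
    """Sum the points for each finding present in the patient."""
    return sum(points[item] for item in findings)

def low_risk(total, ecg_normal, troponins_negative, threshold=16):
    # ADP logic from the abstract: score AND ECG AND serial troponins
    return total < threshold and ecg_normal and troponins_negative
```

Whole-number points trade a little statistical precision for bedside usability, which is why clinician feedback shaped the final score.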
Abstract:
Background: The Palliative Care Problem Severity Score is a clinician-rated tool to assess problem severity in four palliative care domains (pain, other symptoms, psychological/spiritual, family/carer problems) using a 4-point categorical scale (absent, mild, moderate, severe). Aim: To test the reliability and acceptability of the Palliative Care Problem Severity Score. Design: Multi-centre, cross-sectional study involving pairs of clinicians independently rating problem severity using the tool. Setting/participants: Clinicians from 10 Australian palliative care services: 9 inpatient units and 1 mixed inpatient/community-based service. Results: A total of 102 clinicians participated, with almost 600 paired assessments completed for each domain, involving 420 patients. A total of 91% of paired assessments were undertaken within 2 h. Strength of agreement for three of the four domains was moderate: pain (Kappa = 0.42, 95% confidence interval = 0.36 to 0.49); psychological/spiritual (Kappa = 0.48, 95% confidence interval = 0.42 to 0.54); family/carer (Kappa = 0.45, 95% confidence interval = 0.40 to 0.52). Strength of agreement for the remaining domain (other symptoms) was fair (Kappa = 0.38, 95% confidence interval = 0.32 to 0.45). Conclusion: The Palliative Care Problem Severity Score is an acceptable measure, with moderate reliability across three domains. Variability in inter-rater reliability across sites and participant feedback indicate that ongoing education is required to ensure that clinicians understand the purpose of the tool and each of its domains. Raters familiar with the patient they were assessing found it easier to assign problem severity, but this did not improve inter-rater reliability.
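The agreement statistic quoted above is Cohen's kappa, which compares observed agreement between two raters against the agreement expected by chance from each rater's category frequencies. A stdlib-only sketch with invented ratings on the tool's four-point scale:

```python
from collections import Counter

# Cohen's kappa for two raters over the same patients: observed agreement
# corrected for chance agreement. The ratings below are invented examples
# on the tool's four-point scale, not study data.

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

rater_a = ["absent", "mild", "mild", "moderate"]
rater_b = ["absent", "mild", "moderate", "moderate"]
kappa = cohens_kappa(rater_a, rater_b)
```

By common rules of thumb, values around 0.21–0.40 are "fair" and 0.41–0.60 "moderate", which is how the abstract labels its domain-level results.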
Abstract:
We consider the problem of deciding whether the output of a boolean circuit is determined by a partial assignment to its inputs. This problem is easily shown to be hard, i.e., co-NP-complete. However, many of the consequences of a partial input assignment may be determined in linear time, by iterating the following step: if we know the values of some inputs to a gate, we can deduce the values of some outputs of that gate. This process of iteratively deducing some of the consequences of a partial assignment is called propagation. This paper explores the parallel complexity of propagation, i.e., the complexity of determining whether the output of a given boolean circuit is determined by propagating a given partial input assignment. We give a complete classification of the problem into those cases that are P-complete and those that are unlikely to be P-complete.
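The propagation step the abstract describes can be sketched directly: given a circuit as a DAG of AND/OR/NOT gates and a partial input assignment, repeatedly record gate values that are forced (an AND with a 0 input is 0, an OR with a 1 input is 1, a gate with all inputs known is fully determined) until nothing new follows. This is an illustrative sequential sketch; the paper's subject is the parallel complexity of exactly this process.

```python
# Sequential sketch of propagation. gates maps a gate name to (op, inputs);
# assignment maps input names to known boolean values. Forced gate values
# are recorded until a fixed point is reached.

def propagate(gates, assignment):
    values = dict(assignment)
    changed = True
    while changed:
        changed = False
        for name, (op, inputs) in gates.items():
            if name in values:
                continue
            known = [values[i] for i in inputs if i in values]
            forced = None
            if op == "AND":
                if False in known:
                    forced = False                 # a 0 input forces AND to 0
                elif len(known) == len(inputs):
                    forced = True
            elif op == "OR":
                if True in known:
                    forced = True                  # a 1 input forces OR to 1
                elif len(known) == len(inputs):
                    forced = False
            elif op == "NOT" and known:
                forced = not known[0]
            if forced is not None:
                values[name] = forced
                changed = True
    return values

circuit = {"g1": ("AND", ["x", "y"]), "out": ("OR", ["g1", "z"])}
```

With `{"x": False}` propagation fixes `g1` to 0 but leaves `out` undetermined (z is unknown); with `{"z": True}` it fixes `out` to 1 regardless of the AND gate. Propagation is sound but incomplete: an output can be determined by the partial assignment without propagation discovering it, which is why the decision problem itself is co-NP-complete.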
Abstract:
We address the issue of complexity for vector quantization (VQ) of wide-band speech LSF (line spectrum frequency) parameters. The recently proposed switched split VQ (SSVQ) method provides better rate-distortion (R/D) performance than the traditional split VQ (SVQ) method, and even requires lower computational complexity, but at the expense of much higher memory. We develop the two-stage SVQ (TsSVQ) method, by which we gain both memory and computational advantages while retaining good R/D performance. The proposed TsSVQ method uses a full-dimensional quantizer in its first stage, exploiting all the higher-dimensional coding advantages, and then uses an SVQ method to quantize the residual vector in the second stage so as to reduce complexity. We also develop a transform domain residual coding method within this two-stage architecture that further reduces computational complexity. To design an effective residual codebook for the second stage, variance normalization of Voronoi regions is carried out, leading to two new methods, referred to as normalized two-stage SVQ (NTsSVQ) and normalized two-stage transform domain SVQ (NTsTrSVQ). These two new methods have complementary strengths and are therefore combined in a switched VQ mode, which leads to further improvement in R/D performance while retaining the low complexity requirement. We evaluate the performance of the new methods for wide-band speech LSF parameter quantization and show their advantages over the established SVQ and SSVQ methods.
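The baseline split VQ method that the abstract builds on can be sketched simply: the LSF vector is cut into sub-vectors, and each sub-vector is quantized independently by a nearest-neighbour search in its own codebook. The tiny codebooks below are invented; real codebooks are trained (e.g. by k-means) on speech data.

```python
# Illustrative split VQ: cut the vector into sub-vectors and quantize each
# against its own codebook by squared-error nearest neighbour. Codebooks
# here are tiny and invented; real ones are trained on speech data.

def nearest(vector, codebook):
    """Codeword with smallest squared error to the given sub-vector."""
    return min(codebook, key=lambda c: sum((a - b) ** 2 for a, b in zip(vector, c)))

def split_vq(vector, codebooks, split=2):
    parts = [vector[i:i + split] for i in range(0, len(vector), split)]
    return [nearest(p, cb) for p, cb in zip(parts, codebooks)]

codebooks = [
    [(0.0, 0.0), (0.1, 0.2)],   # codebook for the first sub-vector
    [(0.4, 0.8), (0.9, 0.9)],   # codebook for the second sub-vector
]
quantized = split_vq([0.1, 0.2, 0.5, 0.9], codebooks)
```

A two-stage variant in the spirit of TsSVQ would run a full-dimensional quantizer first and apply `split_vq` to the residual, which is the memory/complexity trade-off the abstract develops.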
Abstract:
Over the past four decades, the histories of art and design have moved away from the canonical study of objects, artists/designers and styles, and turned toward more interdisciplinary research. We nevertheless argue that design historians must continue to extend their use of approaches drawing on material culture and criticality, in order to fill gaps in the history of design and to develop methods and approaches relevant to its study. Drawing on our experience of teaching the 'millennial' generation, who are inclined toward 'activist design', we offer pedagogical examples that have helped our students assimilate responsible, engaged and reflexive design histories, and understand the complexity and criticality of design.
Abstract:
The research in software science has so far concentrated on three measures of program complexity: (a) software effort; (b) cyclomatic complexity; and (c) program knots. In this paper we propose a measure of the logical complexity of programs in terms of the variable dependency of sequences of computations, the inductive effort in writing loops, and the complexity of data structures. The proposed complexity measure is described with the aid of a graph which exhibits diagrammatically the dependence of a computation at a node upon the computations of other (earlier) nodes. Complexity measures of several example programs have been computed and the related issues discussed. The paper also describes the role played by data structures in deciding program complexity.