896 results for Many-body models
Abstract:
The identification of attractors is one of the key tasks in studies of neurobiological coordination from a dynamical systems perspective, and a considerable body of literature has resulted from this task. However, the overwhelming majority of actions studied previously belong to the class of continuous, rhythmical movements. In contrast, very few studies have investigated coordination of discrete movements, particularly multi-articular discrete movements. In the present study, we investigated phase transition behavior in a basketball throwing task in which participants were instructed to shoot at the basket from different distances. Adopting the ubiquitous scaling paradigm, throwing distance was manipulated as a candidate control parameter. Using a cluster analysis approach, clear phase transitions between different movement patterns were observed in the performance of only two of eight participants. The remaining participants used a single movement pattern and varied it according to throwing distance, thereby exhibiting hysteresis effects. Results suggested that, in movement models involving many biomechanical degrees of freedom in degenerate systems, greater movement variation across individuals is available for exploitation. This observation stands in contrast to the movement variation typically observed in studies using more constrained bi-manual movement models. This degenerate system behavior provides new insights and poses fresh challenges to the dynamical systems theoretical approach, requiring further research beyond conventional movement models.
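A minimal sketch of how such a cluster-based transition analysis might look, using simulated joint-angle features and scikit-learn's KMeans; the feature values, distances and cluster count are illustrative assumptions, not the study's data:

```python
# Hypothetical sketch: detecting a switch in movement pattern as a control
# parameter (throwing distance) is scaled, via k-means clustering.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Simulated joint-angle feature vectors for throws at increasing distances:
# near throws drawn around one coordination pattern, far throws around another.
near = rng.normal(loc=[0.2, 1.1, 0.5], scale=0.05, size=(30, 3))
far = rng.normal(loc=[0.9, 0.4, 1.3], scale=0.05, size=(30, 3))
throws = np.vstack([near, far])
distances = np.concatenate([np.linspace(2, 4, 30), np.linspace(4, 7, 30)])

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(throws)

# A phase transition shows up as a systematic change in cluster membership
# as distance increases; a participant scaling a single variable pattern
# would not show such a clean split.
for d, lab in zip(distances[::10], labels[::10]):
    print(f"distance {d:4.1f} m -> pattern cluster {lab}")
```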
Abstract:
‘Beyond the intercultural to the Accented Body’ foregrounds contemporary choreography as a multi-modal practice which is increasingly interdisciplinary and engages with interactive technologies. These concepts are explored in the context of intercultural dance and performance practices, particularly in relation to issues of identity, hybridity, the diaspora and transformation. Four models of intercultural choreography are proposed: in-country immersion; collaborative international exchanges through the sharing of culturally diverse practices; hybrid practices of diasporic artists; and implicit intercultural connections. The latter model is investigated via a case study of an interactive, multi-site and interdisciplinary collaboration, Accented Body.
Abstract:
As militarization of bodies politic continues apace the world over, as military organizations again reveal themselves as primary political, economic and cultural forces in many societies, we argue that the emergent and potentially dominant form of political economic organization is a species of neo-feudal corporatism. Drawing upon Bourdieu, we theorize bodies politic as living habitus. Bodies politic are prepared for war and peace through new mediations, powerful means of public pedagogy. The process of militarization requires the generation of new, antagonistic evaluations of other bodies politic. Such evaluations are inculcated via these mediations, the movement of meanings across time and space, between formerly disparate histories, places, and cultures. New mediations touch new and different aspects of the body politic: its eyes, its ears, its organs, but they are consistently targeted at the formation of dispositions, the prime movers of action.
Abstract:
This article examines the continued relevance of the 16-19 business education curriculum in the UK, stimulated by the doubts expressed by Thomas (1996). We express a concern that business education needs, but is struggling, to respond to significant societal shifts in consumption and production strategies that do not sit easily within the traditional theories of business practice currently underpinning 16-19 business education. We examine, firstly, the extent to which a formal body of knowledge couched in a modernist discourse of facts and objectivity can cope with the changing and fluid developments in much current business practice that is rooted in the cultural and symbolic; and, secondly, the extent to which both academic and vocational competences provide the means for students to develop a framework of critical understanding that can respond effectively to rapidly changing business environments.

Findings are based on research conducted jointly by the University of Manchester and the Manchester Institute for Popular Culture at Manchester Metropolitan University. The growth and dynamism of the cultural industries sector, largely micro-businesses and small and medium-sized enterprises (SMEs), encapsulates forms of business knowledge, business language and business practice which may not immediately fit with the models provided within business education. Results suggest increasingly reflexive forms of consumption being met by similarly reflexive and flexible modes of production.

Our evidence suggests that whilst modernist business knowledge is often the foundation for many 16-19 business education courses, these programmes of study/training do not usually reflect the activities of SME and micro-business practitioners in the cultural industries. Given the importance of the cultural industries in terms of the production strategies required to meet increasingly reflexive markets, it is suggested that there may be a need to incorporate a postmodern approach to the current content and pedagogy; one that is contextual, cultural and discursive.
Abstract:
Background: Diagnosis of epithelial ovarian cancer (EOC) in young women has major implications, including for their reproductive potential. We evaluated depression, anxiety and body image in patients with stage I EOC treated with fertility-sparing surgery (FSS) or radical surgery (RS). We also investigated fertility outcomes after FSS.

Methods: A retrospective study was undertaken in which 62 patients completed questionnaires related to anxiety, depression, body image and fertility outcomes. Additional information on adjuvant therapy after FSS and RS, and demographic details, were abstracted from medical records. Both bi- and multivariate regression models were used to assess the relationship between demographic, clinical and pathological results and scores for anxiety, depression and body image.

Results: Thirty-nine patients underwent RS and the rest FSS. The percentages of patients reporting elevated anxiety and depression (subscores ≥ 11) were 27% and 5%, respectively. The median (interquartile range) score on the body image scale (BIS) was 6 (3-15). None of the demographic or clinical factors examined showed a significant association with anxiety or BIS, with the exception of ‘time since diagnosis’. For depression, post-menopausal status was the only independent predictor. Among the 23 patients treated by FSS, 14 tried to conceive (7 successfully), resulting in 7 live births, one termination of pregnancy and one miscarriage.

Conclusion: This study shows that psychological issues are common in women treated for stage I EOC. Reproduction after FSS is feasible and led to the birth of healthy babies in about half of the patients who wished to have another child. Further prospective studies with standardised instruments are required.
Abstract:
Patients with severe back deformities can greatly benefit from customised medical seating, which is made by taking measurements of each individual patient and building the seat to those measurements. The measuring systems currently employed by the industry are limited to use in clinics, which are generally located only in major population centres. Patients living in remote areas are severely affected by this, as the clinics may be far away and inaccessible. Providing customised medical seating to these patients requires a new measurement system that is portable, so that it can be transported to patients in remote areas. The requirements for such a system were analysed to suit the needs of Equipment Technology Services (ETS) of the Cerebral Palsy League of Queensland. A design for a new measurement system was conceptualised by reviewing systems and technologies in various scientific disciplines, and finalised by optimising each individual component. The final approach was validated by measuring difficult models and repeating the process to check for process variances. This system has now been adopted for clinical evaluation by ETS. Suggestions have been made for further improvements in this new measurement approach.
Abstract:
After the recent prolonged drought conditions in many parts of Australia, it is increasingly recognised that many groundwater systems are under stress. Although this is obvious for systems that are utilised for intensive irrigation, many other groundwater systems are also impacted. Management strategies range from highly variable to non-existent. Policy and regulation are also often inadequate, and are reactive or politically driven. In addition, there is a wide range of opinion among water users and other stakeholders as to what constitutes “reasonable” management practice. These differences are often related to the “value” that is placed on the groundwater resource. Opinions vary from “our right to free water” to an awareness that, without effective management, the resource will be degraded. There is also often misunderstanding of surface water-groundwater linkages, recharge processes, and baseflow to drainage systems.
Abstract:
Business Process Modelling is a fast-growing field in business and information technology, which uses visual grammars to model and execute the processes within an organisation. However, many analysts present such models in a static, iconic 2D manner that is difficult for many stakeholders to understand. Difficulties in understanding such grammars can impede the improvement of processes within an enterprise due to communication problems. In this chapter we present a novel framework for intuitively visualising animated business process models in interactive Virtual Environments. We also show that virtual environment visualisations can be performed with present 2D business process modelling technology, thus providing a low barrier to entry for business process practitioners. Two case studies, from the film production and healthcare domains, illustrate the ease with which these visualisations can be created. This approach can be generalised to other executable workflow systems, for any application domain being modelled.
Abstract:
This paper describes an automated procedure for analysing the significance of each of the many terms in the equations of motion for a serial-link robot manipulator. Significance analysis provides insight into the rigid-body dynamic effects that are significant locally or globally in the manipulator's state space. Deleting those terms that do not contribute significantly to the total joint torque can greatly reduce the computational burden for online control, and a Monte-Carlo style simulation is used to investigate the errors thus introduced. The procedures described are a hybrid of symbolic and numeric techniques, and can be readily implemented using standard computer algebra packages.
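A hedged sketch of this kind of significance analysis for a two-link planar arm, using purely numeric Monte Carlo sampling rather than the paper's hybrid symbolic-numeric procedure; all link parameters, sampling ranges and the deletion threshold are illustrative assumptions:

```python
# A minimal sketch (not the paper's implementation) of Monte Carlo
# significance analysis for the terms in a 2-link planar arm's dynamics.
import numpy as np

rng = np.random.default_rng(1)
m1, m2 = 4.0, 3.0             # link masses (kg), assumed
l1, lc1, lc2 = 0.4, 0.2, 0.2  # link and centroid lengths (m), assumed
I1, I2, g = 0.05, 0.04, 9.81  # link inertias (kg m^2), gravity

a = I1 + I2 + m1*lc1**2 + m2*(l1**2 + lc2**2)
b = m2*l1*lc2
d = I2 + m2*lc2**2

N = 10000  # random states spanning the manipulator's state space
q1, q2 = rng.uniform(-np.pi, np.pi, N), rng.uniform(-np.pi, np.pi, N)
qd1, qd2 = rng.uniform(-2, 2, N), rng.uniform(-2, 2, N)
qdd1, qdd2 = rng.uniform(-5, 5, N), rng.uniform(-5, 5, N)

# Individual torque terms at joint 1: inertial, Coriolis/centripetal, gravity.
terms = {
    "inertial": (a + 2*b*np.cos(q2))*qdd1 + (d + b*np.cos(q2))*qdd2,
    "coriolis": -b*np.sin(q2)*(2*qd1*qd2 + qd2**2),
    "gravity": (m1*lc1 + m2*l1)*g*np.cos(q1) + m2*lc2*g*np.cos(q1 + q2),
}
total = sum(terms.values())
total_rms = np.sqrt(np.mean(total**2))

# A term is "insignificant" if its RMS contribution falls below a threshold
# fraction of the total RMS joint torque; such terms could be dropped from
# the online control computation, at the cost of the error measured here.
for name, t in terms.items():
    ratio = np.sqrt(np.mean(t**2)) / total_rms
    print(f"{name:9s} RMS ratio = {ratio:.3f}"
          + ("  (candidate for deletion)" if ratio < 0.01 else ""))
```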
Abstract:
This thesis addresses computational challenges arising from Bayesian analysis of complex real-world problems. Many of the models and algorithms designed for such analysis are ‘hybrid’ in nature, in that they are a composition of components whose individual properties may be easily described but whose performance as a whole is less well understood. The aim of this research project is to offer a better understanding of the performance of hybrid models and algorithms; the goal of this thesis is to analyse their computational aspects in the Bayesian context.

The first objective of the research focuses on computational aspects of hybrid models, notably a continuous finite mixture of t-distributions. In the mixture model, an inference of interest is the number of components, as this may relate to both the quality of model fit to data and the computational workload. The analysis of t-mixtures using Markov chain Monte Carlo (MCMC) is described and the model is compared to the Normal case on the basis of goodness of fit. Through simulation studies, it is demonstrated that the t-mixture model can be more flexible and more parsimonious in terms of the number of components, particularly for skewed and heavy-tailed data. The study also reveals important computational issues associated with the use of t-mixtures which have not been adequately considered in the literature.

The second objective of the research focuses on computational aspects of hybrid algorithms for Bayesian analysis. Two approaches are considered: a formal comparison of the performance of a range of hybrid algorithms, and a theoretical investigation of the performance of one of these algorithms in high dimensions. For the first approach, the delayed rejection algorithm, the pinball sampler, the Metropolis-adjusted Langevin algorithm, and the hybrid version of the population Monte Carlo (PMC) algorithm are selected as examples of hybrid algorithms. The statistical literature often treats statistical efficiency as the only criterion for an efficient algorithm; in this thesis the algorithms are also considered and compared from a more practical perspective. This extends to the study of how individual components contribute to the overall efficiency of hybrid algorithms, and highlights weaknesses that may be introduced when these components are combined in a single algorithm.

The second approach involves an investigation of the performance of the PMC in high dimensions. It is well known that as a model becomes more complex, computation may become increasingly difficult in real time. In particular, importance sampling-based algorithms, including the PMC, are known to be unstable in high dimensions. This thesis examines the PMC algorithm in a simplified setting, a single step of the general sampler, and explores a fundamental problem that occurs in applying importance sampling to a high-dimensional problem. The precision of the computed estimate in this simplified setting is measured by the asymptotic variance of the estimate under conditions on the importance function. The exponential growth of the asymptotic variance with dimension is demonstrated, and it is illustrated that the optimal covariance matrix for the importance function can be estimated in a special case.
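The instability of importance sampling in high dimensions, central to the PMC analysis described above, can be illustrated with a toy computation: even a mild per-coordinate mismatch between proposal and target drives the effective sample size towards collapse as dimension grows. The Gaussian target, proposal scale and sample size below are illustrative choices, not the thesis setup:

```python
# Toy demonstration: importance weight degeneracy as dimension increases.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(2)
n = 5000

for dim in [1, 5, 10, 20, 50]:
    # Proposal: N(0, 1.5^2 I); target: N(0, I) -- a mild mismatch per coordinate.
    x = rng.normal(0, 1.5, size=(n, dim))
    logw = (multivariate_normal.logpdf(x, mean=np.zeros(dim))
            - multivariate_normal.logpdf(x, mean=np.zeros(dim),
                                         cov=1.5**2 * np.eye(dim)))
    w = np.exp(logw - logw.max())   # stabilise before normalising
    w /= w.sum()
    ess = 1.0 / np.sum(w**2)        # effective sample size (Kong's estimate)
    print(f"dim {dim:3d}: ESS = {ess:8.1f} of {n}")
```

Because the per-coordinate weight factors multiply, the variance of the weights grows geometrically with dimension, which is exactly the exponential growth of the asymptotic variance referred to above.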
Abstract:
This dissertation is primarily an applied statistical modelling investigation, motivated by a case study comprising real data and real questions. Theoretical questions on the modelling and computation of normalization constants arose from pursuit of these data analytic questions. The essence of the thesis can be described as follows. Consider binary data observed on a two-dimensional lattice. A common problem with such data is the ambiguity of the zeroes recorded: these may represent a zero response given some threshold (presence), or that the threshold has not been triggered (absence). Suppose that the researcher wishes to estimate the effects of covariates on the binary responses while taking into account underlying spatial variation, which is itself of some interest. This situation arises in many contexts, and the dingo, cypress and toad case studies described in the motivation chapter are examples of it.

Two main approaches to modelling and inference are investigated in this thesis. The first is frequentist and based on generalized linear models, with spatial variation modelled by using a block structure or by smoothing the residuals spatially. The EM algorithm can be used to obtain point estimates, coupled with bootstrapping or asymptotic MLE estimates for standard errors. The second approach is Bayesian and based on a three- or four-tier hierarchical model, comprising a logistic regression with covariates for the data layer, a binary Markov random field (MRF) for the underlying spatial process, and suitable priors for the parameters of these models. The three-parameter autologistic model is a particular MRF of interest. Markov chain Monte Carlo (MCMC) methods comprising hybrid Metropolis/Gibbs samplers are suitable for computation in this situation. Model performance can be gauged by MCMC diagnostics, and model choice can be assessed by incorporating another tier in the modelling hierarchy. This requires evaluation of a normalization constant, a notoriously difficult problem. The difficulty of estimating the normalization constant for the MRF can be overcome by using a path integral approach, although this is a highly computationally intensive method.

Different methods of estimating ratios of normalization constants (NCs) are investigated, including importance sampling Monte Carlo (ISMC), dependent Monte Carlo based on MCMC simulations (MCMC), and reverse logistic regression (RLR). I develop an idea present, though not fully developed, in the literature, and propose the integrated mean canonical statistic (IMCS) method for estimating log NC ratios for binary MRFs. The IMCS method falls within the framework of the newly identified path sampling methods of Gelman & Meng (1998) and outperforms ISMC, MCMC and RLR. It also does not rely on simplifying assumptions, such as ignoring spatio-temporal dependence in the process. A thorough investigation is made of the application of IMCS to the three-parameter autologistic model. This work introduces background computations required for the full implementation of the four-tier model in Chapter 7.

Two different extensions of the three-tier model to a four-tier version are investigated. The first extension incorporates temporal dependence in the underlying spatio-temporal process. The second allows the successes and failures in the data layer to depend on time. The MCMC computational method is extended to incorporate the extra layer.
A major contribution of the thesis is the development, for the first time, of a fully Bayesian approach to inference for these hierarchical models.
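A minimal sketch of the path-sampling identity underlying methods of the IMCS type, applied to a simple one-parameter Ising-type binary MRF rather than the three-parameter autologistic model of the thesis: for a model p(x|t) ∝ exp(t S(x)), the identity d/dt log Z(t) = E_t[S(x)] means the log NC ratio can be estimated by averaging the canonical statistic S under Gibbs sampling at a grid of parameter values and integrating numerically. The lattice size, parameter grid and sweep counts are illustrative assumptions:

```python
# Path sampling sketch: log Z(t1) - log Z(t0) = integral of E_t[S] over t,
# with E_t[S] estimated by Gibbs sampling on a periodic lattice.
import numpy as np

rng = np.random.default_rng(3)
L = 16  # lattice side

def gibbs_mean_S(t, sweeps=200, burn=50):
    """Estimate E_t[S], where S(x) = sum of neighbour products x_i * x_j."""
    x = rng.integers(0, 2, size=(L, L)) * 2 - 1   # spins in {-1, +1}
    s_vals = []
    for sweep in range(sweeps):
        for i in range(L):
            for j in range(L):
                nb = (x[(i + 1) % L, j] + x[(i - 1) % L, j]
                      + x[i, (j + 1) % L] + x[i, (j - 1) % L])
                p_up = 1.0 / (1.0 + np.exp(-2.0 * t * nb))
                x[i, j] = 1 if rng.random() < p_up else -1
        if sweep >= burn:
            s_vals.append(np.sum(x * np.roll(x, 1, 0))
                          + np.sum(x * np.roll(x, 1, 1)))
    return np.mean(s_vals)

# Integrate the mean canonical statistic over a grid of parameter values.
ts = np.linspace(0.0, 0.4, 9)
means = [gibbs_mean_S(t) for t in ts]
log_nc_ratio = np.trapz(means, ts)   # estimates log Z(0.4) - log Z(0)
print(f"estimated log Z(0.4) - log Z(0): {log_nc_ratio:.1f}")
```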
Abstract:
This thesis applies Monte Carlo techniques to the study of X-ray absorptiometric methods of bone mineral measurement. These studies seek to obtain information that can be used in efforts to improve the accuracy of bone mineral measurements.

A Monte Carlo computer code for X-ray photon transport at diagnostic energies has been developed from first principles. This development was undertaken because there was no readily available code which included electron binding energy corrections for incoherent scattering, and one of the objectives of the project was to study the effects of including these corrections in Monte Carlo models. The code includes the main Monte Carlo program plus utilities for dealing with input data. A number of geometrical subroutines which can be used to construct complex geometries have also been written. The accuracy of the Monte Carlo code has been evaluated against the predictions of theory and the results of experiments. The results show a high correlation with theoretical predictions, and in comparisons with direct experimental measurements, agreement to within the model and experimental variances is obtained. The code is an accurate and valid modelling tool.

A study of the significance of including electron binding energy corrections for incoherent scatter in the Monte Carlo code has been made. The results show this significance to be very dependent upon the type of application; the most significant effect is a reduction of low-angle scatter flux for high atomic number scatterers.

To effectively apply the Monte Carlo code to the study of bone mineral density measurement by photon absorptiometry, the results must be considered in the context of a theoretical framework for the extraction of energy-dependent information from planar X-ray beams. Such a framework is developed, and the two-dimensional nature of tissue decomposition based on attenuation measurements alone is explained. This framework forms the basis for analytical models of bone mineral measurement by dual energy X-ray photon absorptiometry techniques.

Monte Carlo models of dual energy X-ray absorptiometry (DEXA) have been established and used to study the contribution of scattered radiation to the measurements. It has been demonstrated that the measurement geometry has a significant effect upon the scatter contribution to the detected signal; for the geometry of the models studied in this work, the scatter has no significant effect upon the results of the measurements.

The model has also been used to study a proposed technique which involves dual energy X-ray transmission measurements plus a linear measurement of the distance along the ray path. This is designated the DPA(+) technique. The addition of the linear measurement enables the tissue decomposition to be extended to three components: bone mineral, fat and lean soft tissue are the components considered here. The results of the model demonstrate that the measurement of bone mineral using this technique is stable over a wide range of soft tissue compositions, and hence indicate its potential to overcome a major problem of the two-component DEXA technique. However, the results also show that the accuracy of the DPA(+) technique is highly dependent upon the composition of the non-mineral components of bone, and that it has poorer precision (approximately twice the coefficient of variation) than standard DEXA measurements. These factors may limit the usefulness of the technique.
These studies illustrate the value of Monte Carlo computer modelling of quantitative X-ray measurement techniques. The Monte Carlo models of bone densitometry measurement have:

1. demonstrated the significant effects of the measurement geometry upon the contribution of scattered radiation to the measurements;
2. demonstrated that the statistical precision of the proposed DPA(+) three-tissue-component technique is poorer than that of the standard DEXA two-tissue-component technique;
3. demonstrated that the proposed DPA(+) technique has difficulty providing accurate simultaneous measurement of body composition in terms of a three-component model of fat, lean soft tissue and bone mineral; and
4. provided a knowledge base for input to decisions about development (or otherwise) of a physical prototype DPA(+) imaging system.

The Monte Carlo computer code, data, utilities and associated models represent a set of significant, accurate and valid modelling tools for quantitative studies of physical problems in the fields of diagnostic radiology and radiography.
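The two-component decomposition at the heart of DEXA can be summarised in a few lines: the log-attenuation at two energies forms a 2x2 linear system in the areal densities of bone mineral and soft tissue, and the DPA(+) idea adds a path-length measurement to supply a third equation for a third component. A sketch, with placeholder attenuation coefficients rather than reference physical data:

```python
# Two-component dual-energy decomposition: solve a 2x2 system per pixel.
import numpy as np

# Rows: low/high energy; columns: bone mineral, soft tissue.
# Mass attenuation coefficients (cm^2/g) are illustrative placeholders.
mu = np.array([[0.70, 0.25],
               [0.30, 0.20]])

# True areal densities (g/cm^2) used to simulate a measurement (assumed).
sigma_true = np.array([1.2, 8.0])  # [bone, soft]

# Measured log-attenuations ln(I0/I) at the two energies.
logatt = mu @ sigma_true

# Decomposition: invert the 2x2 system. A third component (fat vs lean),
# as in the DPA(+) proposal, needs the extra path-length measurement to
# provide a third equation.
sigma_est = np.linalg.solve(mu, logatt)
print("recovered [bone, soft] areal densities:", sigma_est)
```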
Abstract:
A national-level safety analysis tool is needed to complement existing analytical tools for assessing the safety impacts of roadway design alternatives. FHWA has sponsored the development of the Interactive Highway Safety Design Model (IHSDM), roadway design and redesign software that estimates the safety effects of alternative designs. Considering the importance of IHSDM in shaping the future of safety-related transportation investment decisions, FHWA justifiably sponsored research with the sole intent of independently validating some of the statistical models and algorithms in IHSDM. Statistical model validation aims to accomplish many important tasks, including (a) assessment of the logical defensibility of proposed models, (b) assessment of the transferability of models over future time periods and across different geographic locations, and (c) identification of areas in which future model improvements should be made. These three activities are reported for five proposed types of rural intersection crash prediction models. The internal validation revealed that the crash models potentially suffer from omitted variables that affect safety, site selection and countermeasure selection bias, poorly measured and surrogate variables, and misspecification of model functional forms. The external validation indicated that the models were unable to perform on par with their performance during estimation. Recommendations for improving the state of the practice arising from this research include the systematic conduct of carefully designed before-and-after studies, improvements in data standardization and collection practices, and the development of analytical methods to combine the results of before-and-after studies with cross-sectional studies in a meaningful and useful way.
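One of the external-validation activities described, checking transferability across regions or time periods, can be sketched as follows: fit a negative binomial crash model on one data set and compare its prediction error on external data against its estimation-period error. The simulated covariate (AADT), coefficients and regional shift below are placeholder assumptions, not IHSDM models:

```python
# Hedged sketch of external validation for a crash prediction model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)

def simulate(n, beta, shift=0.0):
    aadt = rng.uniform(500, 20000, n)           # intersection AADT, assumed
    X = sm.add_constant(np.log(aadt))
    lam = np.exp(X @ beta + shift)              # shift mimics regional drift
    y = rng.poisson(rng.gamma(2.0, lam / 2.0))  # NB via gamma-Poisson mixture
    return y, X

beta = np.array([-6.0, 0.6])
y_est, X_est = simulate(800, beta)
y_val, X_val = simulate(400, beta, shift=0.3)   # transferability gap

model = sm.GLM(y_est, X_est, family=sm.families.NegativeBinomial()).fit()

# Mean absolute deviation on estimation vs external data: external error
# noticeably above estimation error signals the kind of degradation the
# validation study reports.
for name, y, X in [("estimation", y_est, X_est), ("external", y_val, X_val)]:
    mad = np.mean(np.abs(y - model.predict(X)))
    print(f"{name:10s} MAD = {mad:.3f}")
```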
Abstract:
Background: Impairments in upper-body function (UBF) are common following breast cancer. However, the relationship between arm morbidity and quality of life (QoL) remains unclear. This investigation uses longitudinal data to describe UBF in a population-based sample of women with breast cancer and examines its relationship with QoL.

Methods: Australian women (n = 287) with unilateral breast cancer were assessed at three-monthly intervals, from six to 18 months post-surgery (PS). Strength, endurance and flexibility were used to assess objective UBF, while the Disability of the Arm, Shoulder and Hand questionnaire and the Functional Assessment of Cancer Therapy-Breast questionnaire were used to assess self-reported UBF and QoL, respectively.

Results: Although mean UBF improved over time, up to 41% of women showed declines in UBF between six and 18 months PS. Older age, lower socioeconomic position, treatment on the dominant side, mastectomy, more extensive lymph node removal and having lymphoedema each increased the odds of declines in UBF by at least twofold (p < 0.05). Lower baseline perceived UBF and declines in perceived UBF between six and 18 months PS were each associated with poorer QoL at 18 months PS (p < 0.05).

Conclusions: Significant upper-body morbidity is experienced by many women following breast cancer treatment, persists longer term, and adversely influences the QoL of breast cancer survivors.
Abstract:
The intent of this note is to succinctly articulate additional points that were not provided in the original paper (Lord et al., 2005) and to help clarify a collective reluctance to adopt zero-inflated (ZI) models for modeling highway safety data. A dialogue on this important issue, just one of many important safety modeling issues, is healthy discourse on the path towards improved safety modeling. This note first provides a summary of prior findings and conclusions of the original paper. It then presents two critical and relevant issues: the maximizing statistical fit fallacy and logic problems with the ZI model in highway safety modeling. Finally, we provide brief conclusions.
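The maximizing-statistical-fit fallacy can be made concrete with a small experiment: fit both a negative binomial (NB) and a zero-inflated NB (ZINB) model to counts and compare AIC, remembering that a better ZINB score alone does not establish the existence of a genuinely "inherently safe" always-zero state. The simulated data and parameter settings below are arbitrary illustrations, not results from the note:

```python
# Illustrative NB vs ZINB comparison on counts simulated from a plain NB.
import numpy as np
from scipy.stats import nbinom
from scipy.optimize import minimize

rng = np.random.default_rng(5)
y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + 1.5), size=1000)  # plain NB

def nb_logpmf(y, mu, alpha):
    # Mean-dispersion parameterisation: size = 1/alpha, prob = size/(size+mu).
    n = 1.0 / alpha
    return nbinom.logpmf(y, n, n / (n + mu))

def nll_nb(theta):
    mu, alpha = np.exp(theta)               # log-parameterised for positivity
    return -np.sum(nb_logpmf(y, mu, alpha))

def nll_zinb(theta):
    mu, alpha = np.exp(theta[:2])
    pi = 1.0 / (1.0 + np.exp(-theta[2]))    # zero-inflation probability
    base = nb_logpmf(y, mu, alpha)
    ll = np.where(y == 0,
                  np.log(pi + (1 - pi) * np.exp(base)),
                  np.log(1 - pi) + base)
    return -np.sum(ll)

fit_nb = minimize(nll_nb, x0=[0.0, 0.0], method="Nelder-Mead")
fit_zinb = minimize(nll_zinb, x0=[0.0, 0.0, -2.0], method="Nelder-Mead")

# Even where ZINB edges out NB on AIC, the extra parameter needs a logical
# justification -- the point of the note's argument.
for name, fit, k in [("NB", fit_nb, 2), ("ZINB", fit_zinb, 3)]:
    print(f"{name:4s} AIC = {2 * k + 2 * fit.fun:.1f}")
```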