369 results for Inter Session Variability Modelling
Abstract:
Low back pain is an increasing problem in industrialised countries and, although it is a major socio-economic problem in terms of medical costs and lost productivity, relatively little is known about the processes underlying the development of the condition. This is in part due to the complex interactions between bone, muscle, nerves and other soft tissues of the spine, and the fact that direct observation and/or measurement of the human spine is not possible using non-invasive techniques. Biomechanical models have been used extensively to estimate the forces and moments experienced by the spine. These models provide a means of estimating the internal parameters which cannot be measured directly. However, application of most of the models currently available is restricted to tasks resembling those for which the model was designed, due to the simplified representation of the anatomy. The aim of this research was to develop a biomechanical model to investigate the changes in forces and moments which are induced by muscle injury. In order to simulate muscle injuries accurately, a detailed quasi-static, three-dimensional model representing the anatomy of the lumbar spine was developed. This model includes the nine major force-generating muscles of the region (erector spinae, comprising the longissimus thoracis and iliocostalis lumborum; multifidus; quadratus lumborum; latissimus dorsi; transverse abdominis; internal oblique and external oblique), as well as the thoracolumbar fascia, through which the transverse abdominis and parts of the internal oblique and latissimus dorsi muscles attach to the spine. The muscles included in the model have been represented using 170 muscle fascicles, each with its own force-generating characteristics and line of action. Particular attention has been paid to ensuring the muscle lines of action are anatomically realistic, particularly for muscles which have broad attachments (e.g. internal and external obliques), muscles which attach to the spine via the thoracolumbar fascia (e.g. transverse abdominis), and muscles whose paths are altered by bony constraints such as the rib cage (e.g. iliocostalis lumborum pars thoracis and parts of the longissimus thoracis pars thoracis). To this end, a separate sub-model, which accounts for the shape of the torso by modelling it as a series of ellipses, has been developed to model the lines of action of the oblique muscles. Likewise, a separate sub-model of the thoracolumbar fascia has also been developed, which accounts for the middle and posterior layers of the fascia and ensures that the line of action of the posterior layer is related to the size and shape of the erector spinae muscle. Published muscle activation data are used to enable the model to predict the maximum forces and moments that may be generated by the muscles. These predictions are validated against published experimental studies reporting maximum isometric moments for a variety of exertions. The model performs well for flexion, extension and lateral bend exertions, but underpredicts the axial twist moments that may be developed. This discrepancy is most likely the result of differences between the experimental methodology and the modelled task. The application of the model is illustrated using examples of muscle injuries created by surgical procedures. The three examples used represent a posterior surgical approach to the spine, an anterior approach to the spine and unilateral total hip replacement surgery.
Although the three examples simulate different muscle injuries, all demonstrate the production of significant asymmetrical moments and/or reduced joint compression following surgical intervention. This result has implications for patient rehabilitation and the potential for further injury to the spine. The development and application of the model has highlighted a number of areas where current knowledge is deficient. These include muscle activation levels for tasks in postures other than upright standing, changes in spinal kinematics following surgical procedures such as spinal fusion or fixation, and a general lack of understanding of how the body adjusts to muscle injuries with respect to muscle activation patterns and levels, the rate of recovery from temporary injuries, and compensatory actions by other muscles. Thus, the comprehensive and innovative anatomical model that has been developed not only provides a tool to predict the forces and moments experienced by the intervertebral joints of the spine, but also highlights areas where further clinical research is required.
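To make the ellipse-based torso sub-model described above concrete, the following is a minimal sketch of how the line of action of an oblique-muscle fascicle might be computed when a torso cross-section is modelled as an ellipse: the fascicle path is wrapped along the ellipse surface and the tangent at the insertion is taken as the effective line of action. All function names, dimensions and attachment angles are illustrative assumptions, not values from the thesis.

    import numpy as np

    def ellipse_point(a, b, theta):
        # Point on a torso cross-section modelled as an ellipse (semi-axes a, b).
        return np.array([a * np.cos(theta), b * np.sin(theta)])

    def ellipse_tangent(a, b, theta):
        # Unit tangent to the ellipse at parameter angle theta.
        t = np.array([-a * np.sin(theta), b * np.cos(theta)])
        return t / np.linalg.norm(t)

    def fascicle_line_of_action(a, b, theta_origin, theta_insertion, n=50):
        # Approximate the path of an oblique fascicle wrapping around the torso
        # surface, and take the tangent at the insertion as the effective line
        # of action there.
        thetas = np.linspace(theta_origin, theta_insertion, n)
        path = np.array([ellipse_point(a, b, t) for t in thetas])
        return path, ellipse_tangent(a, b, theta_insertion)

    # Example: an external-oblique-like fascicle on a 0.15 m x 0.10 m section.
    path, action = fascicle_line_of_action(0.15, 0.10, np.pi / 6, 2 * np.pi / 3)
    print("line of action at insertion:", action)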
Abstract:
Aims: The aim of this study was to conduct an exploratory investigation into the in-session processes and behaviours that occur between therapists and young people in online counselling. Method: The Consensual Qualitative Research method was employed to identify in-session behaviours, and a coding instrument was developed to determine their frequency of use and to assess whether nuances carried in the meaning of text messages have an influential effect during sessions. Eighty-five single-session transcripts were examined in total by two independent coders. Results: Sample statistics revealed that, on average, rapport-building processes were used more consistently across cases than task-focused processes, with both types of processes having a moderately strong positive effect on young people. However, closer examination of these processes revealed weaker positive effects for in-session behaviours that rely more heavily on verbal and non-verbal cues to be accurately interpreted. Implications for Practice and Future Research: These findings imply that therapists may focus more on building rapport than accomplishing tasks with young people during online counselling sessions, due to the absence of verbal and non-verbal information when communicating via text messages.
Abstract:
There are increasing indications that the contribution of holding costs, and their impact on housing affordability, is very significant. Their importance and perceived impact can be gauged from the unprecedented level of attention policy makers have given them recently. This may be evidenced by the embedding of specific strategies to address burgeoning holding costs (particularly the cost savings associated with streamlining regulatory assessment) within statutory instruments such as the Queensland Housing Affordability Strategy and the South East Queensland Regional Plan. However, several key issues require further investigation. Firstly, the computation of and methodology behind holding costs vary widely. In fact, they are not only variable, but in some instances completely ignored. Secondly, some ambiguity exists in terms of the inclusion of various elements of holding costs and the assessment of their relative contribution. Perhaps this may in part be explained by their nature: such costs are not always immediately apparent. They are not as visible as more tangible cost items associated with greenfield development such as regulatory fees, government taxes, acquisition costs, selling fees, commissions and others. Holding costs are also more difficult to evaluate since, for the most part, they must ultimately be assessed over time in an ever-changing environment, based on their strong relationship with opportunity cost, which is in turn dependent, inter alia, upon prevailing inflation and/or interest rates. This paper seeks to provide a more detailed investigation of the elements related to holding costs and, in so doing, to determine the size of their impact specifically on the end user. It extends research in this area by clarifying the extent to which holding costs affect housing affordability. Geographical diversity, indicated by the considerable variation between planning instruments and the length of regulatory assessment periods, suggests that further research should adopt a case study approach in order to test the relevance of the theoretical modelling conducted.
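The opportunity-cost relationship noted above lends itself to a simple worked illustration: capital tied up in a greenfield site accrues holding cost at the prevailing interest rate for the length of the regulatory assessment period. The figures below are invented for demonstration and are not drawn from the paper.

    def holding_cost(capital, annual_rate, months):
        # Compound opportunity cost of holding `capital` for `months` months.
        return capital * ((1 + annual_rate) ** (months / 12) - 1)

    site_value = 400_000       # capital tied up in the site (AUD, assumed)
    rate = 0.07                # prevailing interest rate (assumed)

    for delay in (6, 12, 24):  # regulatory assessment period in months
        print(f"{delay:>2} months: ${holding_cost(site_value, rate, delay):,.0f}")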
Abstract:
In this article we explore young children's development of mathematical knowledge and reasoning processes as they worked on two modelling problems (the Butter Beans Problem and the Airplane Problem). The problems involve authentic situations that need to be interpreted and described in mathematical ways. Both problems include tables of data, together with background information containing specific criteria to be considered in the solution process. Four classes of third-graders (8 years of age) and their teachers participated in the 6-month program, which included preparatory modelling activities along with professional development for the teachers. In discussing our findings we address: (a) Ways in which the children applied their informal, personal knowledge to the problems; (b) How the children interpreted the tables of data, including difficulties they experienced; (c) How the children operated on the data, including aggregating and comparing data, and looking for trends and patterns; (d) How the children developed important mathematical ideas; and (e) Ways in which the children represented their mathematical understandings.
Abstract:
An educational priority of many nations is to enhance mathematical learning in early childhood. One area in need of special attention is that of statistics. This paper argues for a renewed focus on statistical reasoning in the beginning school years, with opportunities for children to engage in data modelling activities. Such modelling involves investigations of meaningful phenomena, deciding what is worthy of attention (i.e., identifying complex attributes), and then progressing to organising, structuring, visualising, and representing data. Results are reported from the first year of a three-year longitudinal study in which three classes of first-grade children and their teachers engaged in activities that required the creation of data models. The theme of “Looking after our Environment,” a component of the children’s science curriculum at the time, provided the context for the activities. Findings focus on how the children dealt with given complex attributes and how they generated their own attributes in classifying broad data sets, and the nature of the models the children created in organising, structuring, and representing their data.
Abstract:
This article examines one approach to promoting creative and flexible use of mathematical ideas within an interdisciplinary context in the primary curriculum, namely, through modelling. Three classes of fifth-grade children worked on a modelling problem, The First Fleet (Australia’s settlement), situated within the curriculum domains of science and studies of society and environment. Reported here are the cycles of development displayed by one group of children as they worked the problem, together with the range of models created across the classes. Children developed mathematisation processes that extended beyond their regular curriculum, including identifying and prioritising key problem elements, exploring relationships among elements, quantifying qualitative data, ranking and aggregating data, and creating and working with weighted scores. Aspects of Goldin’s (2000, 2007) affective structures also appeared to play an important role in the children's mathematical developments.
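As a minimal illustration of the weighted-score process described above, the sketch below quantifies qualitative attributes on a common scale, weights them by importance, and aggregates the weighted scores to rank options. The sites, attributes and weights are invented for illustration and are not taken from the children's work.

    attributes = ["fresh water", "soil quality", "safe anchorage"]
    weights = {"fresh water": 3, "soil quality": 2, "safe anchorage": 1}

    sites = {
        "Site A": {"fresh water": 4, "soil quality": 2, "safe anchorage": 5},
        "Site B": {"fresh water": 5, "soil quality": 4, "safe anchorage": 2},
    }

    def weighted_score(ratings):
        # Aggregate the quantified attributes into a single weighted score.
        return sum(weights[a] * ratings[a] for a in attributes)

    # Rank the options from best to worst by weighted score.
    for site in sorted(sites, key=lambda s: -weighted_score(sites[s])):
        print(site, weighted_score(sites[site]))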
Abstract:
The recent development of indoor wireless local area network (WLAN) standards at 2.45 GHz and 5 GHz has led to increased interest in propagation studies at these frequency bands. Within the indoor environment, human body effects can strongly reduce the quality of wireless communication systems. Human body effects can cause temporal variations and shadowing due to pedestrian movement and antenna-body interaction with portable terminals. This book presents a statistical characterisation, based on measurements, of human body effects on indoor narrowband channels at 2.45 GHz and at 5.2 GHz. A novel cumulative distribution function (CDF) that models the 5 GHz narrowband channel in populated indoor environments is proposed. This novel CDF describes the received envelope in terms of pedestrian traffic. In addition, a novel channel model for the populated indoor environment is proposed for the Multiple-Input Multiple-Output (MIMO) narrowband channel in the presence of pedestrians at 2.45 GHz. Results suggest that practical MIMO systems must be sufficiently adaptive if they are to benefit from the capacity enhancement caused by pedestrian movement.
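As an illustrative sketch only, not the book's actual model, the following simulates a narrowband received envelope whose Ricean K-factor falls as pedestrian traffic increases (body shadowing weakening the dominant path), and then evaluates the empirical CDF at one threshold. The K(n) mapping is assumed purely for demonstration.

    import numpy as np

    rng = np.random.default_rng(1)

    def envelope_samples(n_pedestrians, n_samples=100_000):
        # Assumed mapping: the Ricean K-factor falls as traffic increases.
        k = 10.0 / (1 + n_pedestrians)
        sigma = np.sqrt(1 / (2 * (k + 1)))   # diffuse power per dimension
        los = np.sqrt(k / (k + 1))           # dominant (line-of-sight) term
        x = los + sigma * rng.standard_normal(n_samples)
        y = sigma * rng.standard_normal(n_samples)
        return np.hypot(x, y)                # received envelope, unit power

    # Empirical CDF evaluated at one threshold for several traffic levels.
    for n in (0, 2, 8):
        r = envelope_samples(n)
        print(f"{n} pedestrians: P(envelope < 0.5) = {np.mean(r < 0.5):.3f}")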
Abstract:
Background: In order to design appropriate environments for performance and learning of movement skills, physical educators need a sound theoretical model of the learner and of processes of learning. In physical education, this type of modelling informs the organisation of learning environments and the effective and efficient use of practice time. An emerging theoretical framework in motor learning, relevant to physical education, advocates a constraints-led perspective for acquisition of movement skills and game play knowledge. This framework shows how physical educators could use task, performer and environmental constraints to channel acquisition of movement skills and decision-making behaviours in learners. From this viewpoint, learners generate specific movement solutions to satisfy the unique combination of constraints imposed on them, a process which can be harnessed during physical education lessons. Purpose: In this paper the aim is to provide an overview of the motor learning approach emanating from the constraints-led perspective, and to examine how it can substantiate a platform for a new pedagogical framework in physical education: nonlinear pedagogy. We aim to demonstrate that it is only through theoretically valid and objective empirical work of an applied nature that a conceptually sound nonlinear pedagogy model can continue to evolve and support research in physical education. We present some important implications for designing practices in games lessons, showing how a constraints-led perspective on motor learning could assist physical educators in understanding how to structure learning experiences for learners at different stages, with a specific focus on the design of games teaching programmes in physical education, using exemplars from Rugby Union and Cricket. Findings: Research evidence from recent studies examining movement models demonstrates that physical education teachers need a strong understanding of sport performance so that task constraints can be manipulated in ways that maintain information-movement couplings in a learning environment representative of real performance situations. Physical educators should also understand that movement variability may not necessarily be detrimental to learning and could be an important phenomenon prior to the acquisition of a stable and functional movement pattern. We highlight how the nonlinear pedagogical approach is student-centred and empowers individuals to become active learners via a more hands-off approach to learning. Summary: A constraints-based perspective has the potential to provide physical educators with a framework for understanding how performer, task and environmental constraints shape each individual's physical education. Understanding the underlying neurobiological processes present in a constraints-led perspective on skill acquisition and game play can raise physical educators' awareness that teaching is a dynamic 'art' interwoven with the 'science' of motor learning theories.
Abstract:
Purpose: The component modules in the standard BEAMnrc distribution may appear to be insufficient to model micro-multileaf collimators that have tri-faceted leaf ends and complex leaf profiles. This note indicates, however, that accurate Monte Carlo simulations of radiotherapy beams defined by a complex collimation device can be completed using BEAMnrc's standard VARMLC component module. Methods: That this simple collimator model can produce spatially and dosimetrically accurate micro-collimated fields is illustrated using comparisons with ion chamber and film measurements of the dose deposited by square and irregular fields incident on planar, homogeneous water phantoms. Results: Monte Carlo dose calculations for on- and off-axis fields are shown to produce good agreement with experimental values, even upon close examination of the penumbrae. Conclusions: The use of a VARMLC model of the micro-multileaf collimator, along with a commissioned model of the associated linear accelerator, is therefore recommended as an alternative to the development or use of in-house or third-party component modules for simulating stereotactic radiotherapy and radiosurgery treatments. Simulation parameters for the VARMLC model are provided which should allow other researchers to adapt and use this model to study clinical stereotactic radiotherapy treatments.
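A hedged sketch of the kind of penumbral comparison described above: the 80%-20% penumbra width of a calculated and a measured cross-beam dose profile. The profile data here are synthetic stand-ins for Monte Carlo output and film or ion chamber measurements, not results from the note.

    import numpy as np

    def penumbra_width(x, dose):
        # Distance between the 80% and 20% dose points across one field edge.
        d = dose / dose.max()
        x80 = np.interp(0.8, d, x)   # assumes d is monotonic over this edge
        x20 = np.interp(0.2, d, x)
        return abs(x80 - x20)

    x = np.linspace(-5.0, 0.0, 200)              # mm, one penumbral edge
    calc = 1 / (1 + np.exp(-(x + 2.5) * 2.0))    # synthetic "calculated" profile
    meas = 1 / (1 + np.exp(-(x + 2.5) * 1.8))    # synthetic "measured" profile

    print(f"calculated penumbra: {penumbra_width(x, calc):.2f} mm")
    print(f"measured penumbra:   {penumbra_width(x, meas):.2f} mm")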
Abstract:
Many studies in the area of project management and social networks have identified the significance of project knowledge transfer within and between projects. However, only a few studies have examined intra- and inter-project knowledge transfer activities. Knowledge in projects can be transferred via face-to-face interactions on the one hand, and via IT-based tools on the other. Although companies have allocated many resources to IT tools, it has been found that these are not always effectively utilised, and that people prefer to look for knowledge through social, face-to-face interactions. This paper explores how to effectively leverage the two alternative knowledge transfer techniques, face-to-face interactions and IT-based tools, to facilitate knowledge transfer and enhance knowledge creation for intra- and inter-project knowledge transfer. The paper extends previous research on the relationships between and within teams by examining a project's external and internal knowledge networks concurrently. Qualitative social network analysis, using a case study within a small-to-medium enterprise, was used to examine the knowledge transfer activities within and between projects, and to investigate knowledge transfer techniques. This paper demonstrates the significance of overlapping employees working simultaneously on two or more projects and their impact on facilitating knowledge transfer between projects within a small-to-medium organisation. This research is also crucial to gaining a better understanding of the different knowledge transfer techniques used for intra- and inter-project knowledge exchange. The research provides recommendations on how to achieve better knowledge transfer within and between projects in order to fully utilise a project's knowledge and achieve better project performance.
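The overlapping-employee effect described above can be sketched with a small social-network example: employees are nodes, co-working ties are edges, and an employee assigned to two projects bridges the intra-project clusters. The names, project assignments and use of the networkx library are illustrative assumptions, not data from the case study.

    import networkx as nx

    projects = {
        "P1": ["Ana", "Ben", "Cam"],
        "P2": ["Cam", "Dee", "Eli"],   # Cam overlaps both projects
    }

    g = nx.Graph()
    for team in projects.values():
        # Co-working ties within each project team.
        g.add_edges_from((a, b) for a in team for b in team if a < b)

    overlap = set(projects["P1"]) & set(projects["P2"])
    print("overlapping employees:", overlap)
    # Overlapping employees sit on the shortest paths between the two project
    # clusters, which shows up as high betweenness centrality.
    print("betweenness:", nx.betweenness_centrality(g))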
Abstract:
Ecological problems are typically multi-faceted and need to be addressed from both a scientific and a management perspective. There is a wealth of modelling and simulation software available, each designed to address a particular aspect of the issue of concern. Choosing the appropriate tool, making sense of the disparate outputs, and taking decisions when little or no empirical data is available, are everyday challenges facing the ecologist and environmental manager. Bayesian Networks (BNs) provide a statistical modelling framework that enables analysis and integration of information in its own right as well as integration of a variety of models addressing different aspects of a common overall problem. There has been increased interest in the use of BNs to model environmental systems and issues of concern. However, the development of more sophisticated BNs, utilising dynamic and object oriented (OO) features, is still at the frontier of ecological research. Such features are particularly appealing in an ecological context, since the underlying facts are often spatial and temporal in nature. This thesis focuses on an integrated BN approach which facilitates OO modelling. Our research devises a new heuristic method, the Iterative Bayesian Network Development Cycle (IBNDC), for the development of BN models within a multi-field and multi-expert context. Expert elicitation is a popular method used to quantify BNs when data is sparse but expert knowledge is abundant. The resulting BNs need to be substantiated and validated taking this uncertainty into account. Our research demonstrates the application of the IBNDC approach to support these aspects of BN modelling. The complex nature of environmental issues makes them ideal case studies for the proposed integrated approach to modelling. Moreover, they lend themselves to a series of integrated sub-networks describing different scientific components, combining scientific and management perspectives, or pooling similar contributions developed in different locations by different research groups. In southern Africa the two largest free-ranging cheetah (Acinonyx jubatus) populations are in Namibia and Botswana, where the majority of cheetahs are located outside protected areas. Consequently, cheetah conservation in these two countries is focussed primarily on the free-ranging populations as well as the mitigation of conflict between humans and cheetahs. In contrast, in neighbouring South Africa, the majority of cheetahs are found in fenced reserves. Nonetheless, conflict between humans and cheetahs remains an issue here. Conservation effort in South Africa is also focussed on managing the geographically isolated cheetah populations as one large meta-population. Relocation is one option among a suite of tools used to resolve human-cheetah conflict in southern Africa. Successfully relocating captured problem cheetahs, and maintaining a viable free-ranging cheetah population, are the two environmental issues in cheetah conservation forming the first case study in this thesis. The second case study involves the initiation of blooms of Lyngbya majuscula, a blue-green alga, in Deception Bay, Australia. L. majuscula blooms are toxic and have severe health, ecological and economic impacts on the community located in their vicinity. Deception Bay is an important tourist destination, given its proximity to Brisbane, Australia's third largest city. Lyngbya is one of several algae considered to form Harmful Algal Blooms (HABs). This group of algae includes other widespread blooms such as red tides. The occurrence of Lyngbya blooms is not a local phenomenon: blooms of this toxic weed occur in coastal waters worldwide. With the increase in frequency and extent of these HABs, it is important to gain a better understanding of the underlying factors contributing to their initiation and persistence. This knowledge will contribute to better management practices and to the identification of management actions which could prevent or diminish the severity of these blooms.
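As a toy illustration of the BN approach described in this abstract, the sketch below builds an invented two-parent fragment of a cheetah-relocation network using the pgmpy library. The structure and all probabilities are placeholders for demonstration, not values elicited from experts in the study.

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # All states are binary (0 = favourable, 1 = unfavourable).
    bn = BayesianNetwork([("PreyAvailability", "RelocationSuccess"),
                          ("ConflictHistory", "RelocationSuccess")])

    cpd_prey = TabularCPD("PreyAvailability", 2, [[0.6], [0.4]])
    cpd_conf = TabularCPD("ConflictHistory", 2, [[0.7], [0.3]])
    cpd_succ = TabularCPD(
        "RelocationSuccess", 2,
        [[0.8, 0.5, 0.4, 0.1],    # P(success | prey, conflict), placeholder
         [0.2, 0.5, 0.6, 0.9]],   # P(failure | prey, conflict), placeholder
        evidence=["PreyAvailability", "ConflictHistory"],
        evidence_card=[2, 2])

    bn.add_cpds(cpd_prey, cpd_conf, cpd_succ)
    assert bn.check_model()

    # Query the network: relocation prospects given high prey availability.
    print(VariableElimination(bn).query(["RelocationSuccess"],
                                        evidence={"PreyAvailability": 0}))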
Abstract:
This paper argues for a model of complex system design for sustainable architecture within a framework of entropy evolution. The spectrum of sustainable architecture consists of the efficient use of energy and material resources over the life-cycle of buildings, the active involvement of occupants in micro-climate control within buildings, and the natural environmental context. The interactions of these parameters compose a complex system of sustainable architectural design, for which conventional linear and fragmented design technologies are insufficient to indicate holistic and ongoing environmental performance. The complexity theory of dissipative structures provides a microscopic formulation of open-system evolution, which offers a system design framework for the evolution of building environmental performance towards an optimisation of sustainability in architecture.
Abstract:
Decisions made in the earliest stage of architectural design have the greatest impact on the construction, lifecycle cost and environmental footprint of buildings. Yet the building services, one of the largest contributors to cost, complexity, and environmental impact, are rarely considered as an influence on the design at this crucial stage. In order for efficient and environmentally sensitive built environment outcomes to be achieved, a closer collaboration between architects and services engineers is required at the outset of projects. However, in practice, there are a variety of obstacles impeding this transition towards an integrated design approach. This paper firstly presents a critical review of the existing barriers to multidisciplinary design. It then examines current examples of best practice in the building industry to highlight the collaborative strategies being employed and their benefits to the design process. Finally, it discusses a case study project to identify directions for further research.
Abstract:
The internet infrastructure which supports high data rates has a major impact on the Australian economy and the world. However, in rural Australia, the provision of broadband services to a widely dispersed population over a large geographical area with low population densities remains both an economic and a technical challenge [1]. Furthermore, the implementation of currently available technologies such as fibre-to-the-premises (FTTP), 3G, 4G and WiMAX seems impractical, considering the low population density distributed over a large area. Therefore, new paradigms and innovative telecommunication technologies need to be explored to overcome the challenges of providing faster and more reliable broadband internet services to dispersed rural areas. The research project implements an innovative Multi-User Single-Antenna MIMO (MUSA-MIMO) technology using the spectrum currently allocated to analogue TV. MUSA-MIMO technology can be considered a special case of MIMO technology, which is beneficial when provisioning reliable and high-speed communication channels. In particular, this abstract describes the development of a novel MUSA-MIMO channel model that takes into account temporal variations in the rural wireless environment. This can be considered a novel approach, tailored to rural Australia, for provisioning efficient wireless broadband communications.
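A minimal sketch of a time-varying narrowband MIMO channel of the general kind described above: each entry of H(t) is generated as a sum-of-sinusoids (Jakes-style) fading process, so the channel decorrelates over time. The Doppler frequency, array sizes and sampling are assumptions for demonstration, not the project's actual model.

    import numpy as np

    rng = np.random.default_rng(0)

    def fading_process(t, f_doppler, n_paths=16):
        # One Rayleigh-fading channel coefficient over the time vector t,
        # built as a sum of sinusoids with random angles and phases.
        theta = rng.uniform(0, 2 * np.pi, n_paths)   # arrival angles
        phi = rng.uniform(0, 2 * np.pi, n_paths)     # initial phases
        doppler = 2 * np.pi * f_doppler * np.cos(theta)
        comp = np.exp(1j * (doppler[:, None] * t + phi[:, None]))
        return comp.sum(axis=0) / np.sqrt(n_paths)

    t = np.arange(0.0, 0.1, 1e-3)    # 100 ms at 1 ms steps
    n_rx, n_users = 4, 4             # MUSA-MIMO: one antenna per user
    H = np.array([[fading_process(t, f_doppler=5.0)   # slow rural Doppler (Hz)
                   for _ in range(n_users)] for _ in range(n_rx)])
    print("H over time (rx, users, samples):", H.shape)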
Abstract:
In this thesis we are interested in financial risk and the instrument we want to use is Value-at-Risk (VaR). VaR is the maximum loss over a given period of time at a given confidence level. Many definitions of VaR exist and some will be introduced throughout this thesis. There are two main ways to measure risk and VaR: through volatility and through percentiles. Large volatility in financial returns implies a greater probability of large losses, but also a larger probability of large profits. Percentiles describe tail behaviour. The estimation of VaR is a complex task. It is important to know the main characteristics of financial data in order to choose the best model. The existing literature is very wide, at times controversial, but helpful in drawing a picture of the problem. It is commonly recognised that financial data are characterised by heavy tails, time-varying volatility, asymmetric response to bad and good news, and skewness. Ignoring any of these features can lead to underestimating VaR, with a possible ultimate consequence being the default of the protagonist (firm, bank or investor). In recent years, skewness has attracted special attention. An open problem is the detection and modelling of time-varying skewness. Is skewness constant, or is there some significant variability which in turn can affect the estimation of VaR? This thesis aims to answer this question and to open the way to a new approach to modelling time-varying volatility (conditional variance) and skewness simultaneously. The new tools are modifications of the Generalised Lambda Distributions (GLDs). These are four-parameter distributions which allow the first four moments to be modelled nearly independently: in particular we are interested in what we will call para-moments, i.e., mean, variance, skewness and kurtosis. The GLDs will be used in two different ways. Firstly, semi-parametrically, we consider a moving window to estimate the parameters and calculate the percentiles of the GLDs. Secondly, parametrically, we attempt to extend the GLDs to include time-varying dependence in the parameters. We used local linear regression to estimate the conditional mean and conditional variance semi-parametrically. The method is not efficient enough to capture all the dependence structure in the three indices (ASX 200, S&P 500 and FT 30); however, it provides an idea of the DGP underlying the process and helps in choosing a good technique to model the data. We find that the GLDs suggest that moments up to the fourth order do not always exist; their existence appears to vary over time. This is a very important finding, considering that past papers (see for example Bali et al., 2008; Hashmi and Tay, 2007; Lanne and Pentti, 2007) modelled time-varying skewness while implicitly assuming the existence of the third moment. However, the GLDs suggest that the mean, variance, skewness and, in general, the conditional distribution vary over time, as already suggested by the existing literature. The GLDs give good results in estimating VaR on three real indices, the ASX 200, S&P 500 and FT 30, with results very similar to those provided by historical simulation.
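A minimal sketch of the semi-parametric use of the GLDs described above: fit the FKML quantile function to a moving window of returns by matching sample percentiles, then read VaR directly off the fitted quantile function. The data are simulated, and the window length, probability grid and starting values are assumptions made for illustration.

    import numpy as np
    from scipy.optimize import minimize

    def gld_quantile(u, lam):
        # FKML quantile function:
        # Q(u) = l1 + ((u**l3 - 1)/l3 - ((1 - u)**l4 - 1)/l4) / l2
        l1, l2, l3, l4 = lam
        return l1 + ((u**l3 - 1) / l3 - ((1 - u)**l4 - 1) / l4) / l2

    def fit_gld(returns, probs=np.linspace(0.01, 0.99, 25)):
        # Percentile matching: choose lambdas so the GLD quantiles agree
        # with the sample quantiles over a probability grid.
        target = np.quantile(returns, probs)
        def loss(lam):
            if lam[1] <= 0:          # the scale parameter must stay positive
                return np.inf
            return np.sum((gld_quantile(probs, lam) - target) ** 2)
        start = np.array([np.median(returns), 10.0, 0.1, 0.1])
        return minimize(loss, start, method="Nelder-Mead").x

    rng = np.random.default_rng(42)
    returns = 0.01 * rng.standard_t(df=4, size=1000)   # heavy-tailed returns
    lam = fit_gld(returns[-250:])                      # one moving window
    print("99% VaR (one day):", -gld_quantile(0.01, lam))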