975 results for Average models


Relevance: 20.00%

Abstract:

In this work, a biomechanical model is used to simulate the muscle forces necessary to maintain posture in a car seat under different support conditions.

Relevance: 20.00%

Abstract:

Purpose. To create a binocular statistical eye model based on previously measured ocular biometric data. Methods. Thirty-nine parameters were determined for a group of 127 healthy subjects (37 male, 90 female; 96.8% Caucasian) with an average age of 39.9 ± 12.2 years and a spherical equivalent refraction of −0.98 ± 1.77 D. These parameters described the biometry of both eyes and the subjects' age. Missing parameters were complemented by data from a previously published study. After confirmation of the Gaussian shape of their distributions, these parameters were used to calculate a mean vector and covariance matrix, which in turn defined a multivariate Gaussian distribution. From this distribution, any number of random biometric data sets could be generated, from which a realistic population of random eyes was then selected. Results. All parameters had Gaussian distributions, with the exception of the parameters that describe total refraction (i.e., three parameters per eye). After these non-Gaussian parameters were omitted from the model, the generated data were found to be statistically indistinguishable from the original data for the remaining 33 parameters (TOST [two one-sided t tests]; P < 0.01). Parameters derived from the generated data were also statistically indistinguishable from those calculated with the original data (P > 0.05). The only exception was the lens refractive index, for which the generated data had a significantly larger SD. Conclusions. A statistical eye model can describe the biometric variations found in a population and is a useful addition to the classic eye models.
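
The core of this approach, fitting a multivariate Gaussian to the biometric parameters and sampling synthetic eyes from it, can be illustrated with a short sketch. The snippet below is not the authors' code; the array shapes, function names and placeholder data are assumptions made purely for illustration.

```python
# Minimal sketch (not the authors' code): fit a multivariate Gaussian to a
# matrix of ocular biometric parameters and draw synthetic "eyes" from it.
# `biometry` is a hypothetical (n_subjects x n_parameters) array.
import numpy as np

def fit_statistical_eye_model(biometry):
    """Return the mean vector and covariance matrix of the biometric data."""
    mean = biometry.mean(axis=0)
    cov = np.cov(biometry, rowvar=False)
    return mean, cov

def generate_synthetic_eyes(mean, cov, n_eyes, seed=0):
    """Sample random biometric parameter sets from the fitted Gaussian."""
    rng = np.random.default_rng(seed)
    return rng.multivariate_normal(mean, cov, size=n_eyes)

# Placeholder data standing in for the 33 Gaussian parameters of 127 subjects.
biometry = np.random.default_rng(1).normal(size=(127, 33))
mean, cov = fit_statistical_eye_model(biometry)
synthetic_population = generate_synthetic_eyes(mean, cov, n_eyes=1000)
print(synthetic_population.shape)
```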

Relevance: 20.00%

Abstract:

In 2002, the United Nations Office on Drugs and Crime (UNODC) issued a report entitled Results of a pilot survey of forty selected organized criminal groups in sixteen countries, which established five models of organised crime. This paper reviews these and other common organised crime and drug trafficking models and applies them to cases of South-East Asian drug trafficking in the Australian state of Queensland. The study tests the following hypotheses: (1) South-East Asian drug trafficking groups in Queensland will operate within a criminal network or core group; (2) wholesale drug distributors in Queensland will not fit consistently under any particular UN organised crime model; and (3) street dealers will have no organisational structure. The study concluded that drug trafficking or importation closely resembles a criminal network or core group structure. Wholesale dealers did not fit consistently into any UN organised crime model. Street dealers had no organisational structure, as organisational structure is typically found only in mid- to high-level drug trafficking.

Relevance: 20.00%

Abstract:

PySSM is a Python package developed for the analysis of time series using linear Gaussian state space models (SSMs). PySSM is easy to use; models can be set up quickly and efficiently, and a variety of different settings are available to the user. It takes advantage of the scientific libraries NumPy and SciPy and other high-level features of the Python language. PySSM also serves as a platform for interfacing with optimised and parallelised Fortran routines. These Fortran routines make heavy use of Basic Linear Algebra Subprograms (BLAS) and Linear Algebra Package (LAPACK) functions for maximum performance. PySSM contains classes for filtering, classical smoothing and simulation smoothing.
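
For readers unfamiliar with linear Gaussian state space models, the sketch below shows the Kalman filtering recursion that packages of this kind implement. It is written against plain NumPy and does not use or reproduce PySSM's actual API; all function and parameter names are hypothetical.

```python
# Illustrative only: the Kalman filter for a linear Gaussian state space model
#   x_t = T x_{t-1} + w_t,  w_t ~ N(0, Q)
#   y_t = Z x_t + v_t,      v_t ~ N(0, H)
# This is NOT PySSM's API; it only sketches the recursion such packages implement.
import numpy as np

def kalman_filter(y, T, Z, Q, H, x0, P0):
    """Return the filtered state means for observations y of shape (n_obs, obs_dim)."""
    x, P = x0, P0
    filtered = []
    for yt in y:
        # Predict step
        x = T @ x
        P = T @ P @ T.T + Q
        # Update step
        S = Z @ P @ Z.T + H
        K = P @ Z.T @ np.linalg.inv(S)
        x = x + K @ (yt - Z @ x)
        P = P - K @ Z @ P
        filtered.append(x.copy())
    return np.array(filtered)

# Local level model: random-walk state observed with noise.
y = np.cumsum(np.random.default_rng(0).normal(size=(100, 1)), axis=0)
states = kalman_filter(y, T=np.eye(1), Z=np.eye(1), Q=0.1 * np.eye(1),
                       H=np.eye(1), x0=np.zeros(1), P0=10.0 * np.eye(1))
```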

Relevance: 20.00%

Abstract:

Evidence exists that repositories of business process models used in industrial practice contain significant amounts of duplication. This duplication may stem from the fact that the repository describes variants of the same processes and/or from copy/pasting activity throughout the lifetime of the repository. Previous work has put forward techniques for identifying duplicate fragments (clones) that can be refactored into shared subprocesses. However, these techniques are limited to finding exact clones. This paper analyzes the problem of approximate clone detection and puts forward two techniques for detecting clusters of approximate clones. Experiments show that the proposed techniques are able to accurately retrieve clusters of approximate clones that originate from copy/pasting followed by independent modifications to the copied fragments.
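
As a rough illustration of what grouping near-identical fragments means, the toy sketch below clusters fragments whose pairwise distance falls below a threshold. It is not the technique proposed in the paper, which works on process model graphs; the set-based distance function and the greedy single-link clustering are stand-ins chosen for brevity.

```python
# Toy sketch, not the paper's algorithm: group process-model fragments into
# clusters of approximate clones by thresholding a pairwise distance.
# `fragment_distance` is a stand-in for a real graph-edit-distance measure.
from itertools import combinations

def fragment_distance(f1, f2):
    """Placeholder distance: one minus the Jaccard similarity of activity labels."""
    a, b = set(f1), set(f2)
    return 1.0 - len(a & b) / max(len(a | b), 1)

def cluster_approximate_clones(fragments, threshold=0.5):
    """Greedy single-link clustering: fragments closer than `threshold` share a cluster."""
    clusters = [{i} for i in range(len(fragments))]
    for i, j in combinations(range(len(fragments)), 2):
        if fragment_distance(fragments[i], fragments[j]) <= threshold:
            ci = next(c for c in clusters if i in c)
            cj = next(c for c in clusters if j in c)
            if ci is not cj:
                ci |= cj
                clusters.remove(cj)
    return clusters

# Fragments represented, for illustration, as sets of activity labels.
frags = [{"check order", "approve order", "ship goods"},
         {"check order", "approve order", "send invoice"},
         {"receive payment", "close case"}]
print(cluster_approximate_clones(frags))  # -> [{0, 1}, {2}]
```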

Relevance: 20.00%

Abstract:

In this paper, we argue that second language (L2) reading research, which has been informed by studies involving first language (L1) alphabetic English reading, may be less relevant to L2 readers with non-alphabetic reading backgrounds, such as Chinese readers with an L1 logographic (Chinese character) learning history. We provide both neuroanatomical and behavioural evidence from Chinese language reading studies to support our claims. The paper concludes with an argument outlining the need for a universal L2 reading model which can adequately account for readers with diverse L1 orthographic language learning histories.

Relevance: 20.00%

Abstract:

Purpose – The purpose of this paper is to provide a review of the theory and models underlying project management (PM) research degrees that encourage reflective learning. Design/methodology/approach – Review of the literature and reflection on the practice of being actively involved in conducting and supervising academic research and disseminating academic output. The paper argues the case for the potential usefulness of reflective academic research to PM practitioners. It also highlights theoretical drivers of, and barriers to, reflective academic research by PM practitioners. Findings – A reflective learning approach to research can drive practical results, though it requires a great deal of commitment and support from both academic and industry partners. Practical implications – This paper suggests how PM practitioners can engage in academic research that has practical outcomes and how they can be more effective at disseminating these research outcomes. Originality/value – Advanced academic degrees, in particular those completed by PM practitioners, can provide a valuable source of innovative ideas and approaches that should be more quickly absorbed into the PM profession's sources of knowledge. The value of this paper lies in critically reviewing this work and in helping to reduce the time it takes for useful reflective academic research to be adopted by industry.

Relevance: 20.00%

Abstract:

The Australian income tax regime is generally regarded as a mechanism by which the Federal Government raises revenue, with much of the revenue raised used to support public spending programs. A prime example of this type of spending program is health care. However, a government may also decide that the private sector should provide a greater share of the nation's health care. To achieve such a policy, it can bring about change through positive regulation, or it can use the taxation regime, via tax expenditures, not to raise revenue but to steer or influence individuals in its desired direction. When used for this purpose, tax expenditures steer taxpayers towards or away from certain behaviour by either imposing costs on, or providing benefits to, them. Within the context of the health sector, the Australian Federal Government deploys social steering via the tax system, with the Medicare Levy Surcharge and the 30 percent Private Health Insurance Rebate intended to steer taxpayer behaviour towards the Government's policy goal of increasing the amount of health provision through the private sector. These steering mechanisms are complemented by the 'Lifetime Health Cover Initiative'. This article, through the lens of behavioural economics, considers the ways in which these assorted mechanisms might have been expected to operate and whether they encourage individuals to purchase private health insurance.

Relevance: 20.00%

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of these models. This can detrimentally impact the verification and validation of models and makes the development of extensible and reusable modelling tools difficult. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to the review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning, operational planning and design, security policy and planning, and airport performance review. The models, the performance metrics that they evaluate and their usage scenarios are discussed. It is found that capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link these metrics to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria, such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.
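
To give a concrete sense of the waiting-time metrics that capacity planning models evaluate, the sketch below computes the mean queueing time at a single processing point using a textbook M/M/c (Erlang C) model. It is purely illustrative and is not one of the reviewed airport models; the arrival rate, service rate and number of open lanes are hypothetical.

```python
# Illustrative only: a textbook M/M/c (Erlang C) estimate of the mean queueing
# time at a single processing point such as a security lane. The rates and the
# number of open lanes below are hypothetical, not values from the reviewed models.
from math import factorial

def mmc_mean_wait(arrival_rate, service_rate, servers):
    """Mean time spent queueing (same time units as the rates) in an M/M/c queue."""
    rho = arrival_rate / (servers * service_rate)
    if rho >= 1:
        raise ValueError("Unstable queue: utilisation must be below 1.")
    a = arrival_rate / service_rate
    p0 = 1.0 / (sum(a**k / factorial(k) for k in range(servers))
                + a**servers / (factorial(servers) * (1 - rho)))
    erlang_c = a**servers / (factorial(servers) * (1 - rho)) * p0
    return erlang_c / (servers * service_rate - arrival_rate)

# e.g. 300 passengers/hour arriving, each lane processes 80/hour, 5 lanes open.
print(mmc_mean_wait(arrival_rate=300, service_rate=80, servers=5))
```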

Relevance: 20.00%

Abstract:

The design of pre-contoured fracture fixation implants (plates and nails) that correctly fit the anatomy of a patient utilises 3D models of long bones with accurate geometric representation. 3D data is usually available from computed tomography (CT) scans of human cadavers that generally represent the above 60 year old age group. Thus, despite the fact that half of the seriously injured population comes from the 30 year age group and below, virtually no data exists from these younger age groups to inform the design of implants that optimally fit patients from these groups. Hence, relevant bone data from these age groups is required. The current gold standard for acquiring such data, CT, involves ionising radiation and cannot be used to scan healthy human volunteers. Magnetic resonance imaging (MRI) has been shown to be a potential alternative in previous studies conducted using small bones (tarsal bones) and parts of long bones. However, in order to use MRI effectively for 3D reconstruction of human long bones, further validation using long bones and appropriate reference standards is required. Accurate reconstruction of 3D models from CT or MRI data sets requires an accurate image segmentation method. Currently available sophisticated segmentation methods involve complex programming and mathematics that researchers are not trained to perform. Therefore, an accurate but relatively simple segmentation method is required for segmentation of CT and MRI data. Furthermore, some of the limitations of 1.5T MRI, such as very long scanning times and poor contrast in articular regions, can potentially be reduced by using higher field 3T MRI imaging. However, a quantification of the signal to noise ratio (SNR) gain at the bone–soft tissue interface should be performed; this is not reported in the literature. As MRI scanning of long bones has very long scanning times, the acquired images are more prone to motion artefacts due to random movements of the subject's limbs. One of the artefacts observed is the step artefact, which is believed to occur from the random movements of the volunteer during a scan. This needs to be corrected before the models can be used for implant design.
As the first aim, this study investigated two segmentation methods, intensity thresholding and Canny edge detection, as accurate but simple methods for segmentation of MRI and CT data. The second aim was to investigate the usability of MRI as a radiation-free imaging alternative to CT for reconstruction of 3D models of long bones. The third aim was to use 3T MRI to improve the poor contrast in articular regions and the long scanning times of current MRI. The fourth and final aim was to minimise the step artefact using 3D modelling techniques.
The segmentation methods were investigated using CT scans of five ovine femora. Single level thresholding was performed using a visually selected threshold level to segment the complete femur. For multilevel thresholding, multiple threshold levels calculated from the threshold selection method were used for the proximal, diaphyseal and distal regions of the femur. Canny edge detection was used by delineating the outer and inner contours of 2D images and then combining them to generate the 3D model. Models generated from these methods were compared to the reference standard generated using mechanical contact scans of the denuded bone. The second aim was achieved using CT and MRI scans of five ovine femora and segmenting them using the multilevel threshold method. A surface geometric comparison was conducted between the CT based, MRI based and reference models. To quantitatively compare the 1.5T images to the 3T MRI images, the right lower limbs of five healthy volunteers were scanned using scanners from the same manufacturer. The images obtained using identical protocols were compared by means of the SNR and the contrast to noise ratio (CNR) of muscle, bone marrow and bone. In order to correct the step artefact in the final 3D models, the step was simulated in five ovine femora scanned with a 3T MRI scanner. The step was corrected using an iterative closest point (ICP) algorithm based alignment method.
The present study demonstrated that the multi-threshold approach in combination with the threshold selection method can generate 3D models of long bones with an average deviation of 0.18 mm; the corresponding figure for the single threshold method was 0.24 mm. There was a statistically significant difference between the accuracy of the models generated by the two methods. In comparison, the Canny edge detection method generated an average deviation of 0.20 mm. MRI based models exhibited an average deviation of 0.23 mm compared with the 0.18 mm average deviation of CT based models; the differences were not statistically significant. 3T MRI improved the contrast at the bone–muscle interfaces of most anatomical regions of femora and tibiae, potentially reducing the inaccuracies conferred by the poor contrast of the articular regions. Using the robust ICP algorithm to align the 3D surfaces, the step artefact caused by the volunteer moving the leg was corrected, generating errors of 0.32 ± 0.02 mm when compared with the reference standard. The study concludes that magnetic resonance imaging, together with simple multilevel thresholding segmentation, is able to produce 3D models of long bones with accurate geometric representations. The method is, therefore, a potential alternative to the current gold standard, CT imaging.
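
A minimal sketch of the two segmentation approaches, written with scikit-image, is given below. It is not the thesis pipeline: Otsu's method stands in for the thesis's threshold selection method, and the region boundaries used for the multilevel variant are hypothetical.

```python
# Minimal sketch, not the thesis pipeline: segment a CT/MRI slice by intensity
# thresholding and, alternatively, by Canny edge detection, using scikit-image.
# Otsu's method stands in for the threshold selection method used in the thesis,
# and the region boundaries for the multilevel variant are hypothetical.
import numpy as np
from skimage import feature, filters

def threshold_segment(image):
    """Binary bone mask from a single automatically selected threshold."""
    level = filters.threshold_otsu(image)
    return image > level

def canny_contours(slice_2d, sigma=2.0):
    """Boolean edge map approximating the outer and inner bone contours of a 2D slice."""
    return feature.canny(slice_2d, sigma=sigma)

def multilevel_segment(volume, region_boundaries):
    """Threshold proximal, diaphyseal and distal regions of a volume separately."""
    starts = [0] + region_boundaries
    stops = region_boundaries + [volume.shape[0]]
    masks = [threshold_segment(volume[s:e]) for s, e in zip(starts, stops)]
    return np.concatenate(masks, axis=0)
```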

Relevance: 20.00%

Abstract:

Fire safety of buildings has been recognised as very important by the building industry and the community at large. Gypsum plasterboards are widely used all over the world to protect light gauge steel frame (LSF) walls. Gypsum contains free and chemically bound water in its crystal structure, and plasterboard contains gypsum (CaSO4.2H2O) together with calcium carbonate (CaCO3). The dehydration of gypsum and the decomposition of calcium carbonate absorb heat and are thus able to protect LSF walls from fires. Kolarkar and Mahendran (2008) developed an innovative composite wall panel system in which the insulation was sandwiched between two plasterboards to improve the thermal and structural performance of LSF wall panels under fire conditions. In order to understand the performance of gypsum plasterboards and LSF wall panels under standard fire conditions, many experiments were conducted in the Fire Research Laboratory of Queensland University of Technology (Kolarkar, 2010). Fire tests were conducted on single, double and triple layers of Type X gypsum plasterboards and on load bearing LSF wall panels under standard fire conditions. However, suitable numerical models have not been developed to investigate the thermal performance of LSF walls using the innovative composite panels under standard fire conditions, and continued reliance on expensive and time consuming fire tests is not acceptable. Therefore this research developed suitable numerical models to investigate the thermal performance of both plasterboard assemblies and load bearing LSF wall panels.
SAFIR, a finite element program, was used to investigate the thermal performance of gypsum plasterboard assemblies and LSF wall panels under standard fire conditions. Appropriate values of important thermal properties were proposed for plasterboards and insulations based on laboratory tests, a literature review, and comparisons of finite element analysis results for small scale plasterboard assemblies from this research with corresponding experimental results from Kolarkar (2010). The important thermal properties (thermal conductivity, specific heat capacity and density) of gypsum plasterboard and insulation materials were proposed as functions of temperature and used in the numerical models of load bearing LSF wall panels. Using these thermal properties, the developed finite element models were able to accurately predict the time–temperature profiles of plasterboard assemblies, and predicted them reasonably well for load bearing LSF wall systems despite the many complexities present in these wall systems under fire. This thesis presents the details of the finite element models of plasterboard assemblies and load bearing LSF wall panels, including those with the composite panels developed by Kolarkar and Mahendran (2008). It examines and compares the thermal performance of composite panels based on different insulating materials of varying densities and thicknesses, using 11 small scale tests, and makes suitable recommendations for improved fire performance of stud wall panels protected by these composite panels. It also presents thermal performance data for LSF wall systems and demonstrates the superior performance of LSF wall systems using the composite panels. Using the developed finite element models of LSF walls, this thesis proposes new LSF wall systems with increased fire rating. The developed finite element models are particularly useful for comparing the thermal performance of different wall panel systems without time consuming and expensive fire tests.
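
The way temperature-dependent thermal properties are typically supplied to such finite element models can be sketched as a simple lookup table with interpolation, as below. The temperatures and specific heat values shown are placeholders for illustration only, not the calibrated properties proposed in this research.

```python
# Illustrative only: a temperature-dependent thermal property expressed as a
# lookup table with linear interpolation, in the spirit of the property curves
# supplied to the finite element models. All numbers below are placeholders,
# not the calibrated values proposed in this research.
import numpy as np

# (temperature in deg C, apparent specific heat in J/(kg.K)) - hypothetical values;
# the spike represents the heat absorbed by gypsum dehydration.
TEMPERATURE_C = np.array([20.0, 100.0, 150.0, 200.0, 400.0, 800.0, 1200.0])
SPECIFIC_HEAT = np.array([950.0, 950.0, 15000.0, 950.0, 950.0, 950.0, 950.0])

def specific_heat(temp_c):
    """Linearly interpolated apparent specific heat at a given temperature."""
    return float(np.interp(temp_c, TEMPERATURE_C, SPECIFIC_HEAT))

print(specific_heat(125.0))  # on the rising edge of the dehydration peak
```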

Relevance: 20.00%

Abstract:

The three studies in this thesis focus on happiness and age and seek to contribute to our understanding of how happiness changes over the lifetime. The first study contributes by offering an explanation for what was evolving into a 'stylised fact' in the economics literature: the U-shape of happiness in age. No U-shape is evident from a visual inspection of the age–happiness relationship in the German socio-economic panel data, and it seems counter-intuitive that we just have to wait until we get old to be happy. Eliminating the very young, the very old and the first timers from the analysis did not explain away regression results supporting the U-shape of happiness in age, but fixed effect analysis did. Analysis revealed that reverse causality arising from time-invariant individual traits explained the U-shape of happiness in age in the German population, and the results were robust across six econometric methods. Robustness was added to the German fixed effect finding by replicating it with the Australian and British socio-economic panel data sets. During analysis of the German data an unexpected finding emerged: an exceedingly large negative linear effect of age on happiness in fixed-effect regressions. There is a large self-reported happiness decline among those who remain in the German panel. A similar decline over time was not evident in the Australian or British data. After testing away age, time and cohort effects, a time-in-panel effect was found: Germans who remain in the panel for longer progressively report lower levels of happiness. Because time-in-panel effects have not been included in happiness regression specifications, our estimates may be biased; perhaps some of the economics of happiness studies that used German panel data need revisiting.
The second study builds upon the fixed-effect finding of the first study and extends our view of lifetime happiness to a cohort little visited by economists: children. Initial analysis extends our view of lifetime happiness beyond adulthood and reveals a happiness decline in adolescent (15 to 23 year-old) Australians that is twice the size of the happiness decline we see in older Australians (75 to 86 year-olds), who we expect to be unhappy due to declining income, failing health and the onset of death. To resolve a difference of opinion in the literature as to whether childhood happiness decreases, increases or remains flat with age, survey instruments and an Internet-based survey were developed and used to collect data from four hundred 9 to 14 year-old Australian children. Applying the data to a Model of Childhood Happiness revealed that the natural environment life-satisfaction domain factor did not have a significant effect on childhood happiness. However, the children's school environment and interactions with friends life-satisfaction domain factors explained over half of a steep decline in childhood happiness that is three times larger than what we see in older Australians. Adding personality to the model revealed what we expect to see with adults: extraverted children are happier; unexpectedly, so are conscientious children.
With the steep decline in the happiness of young Australians revealed and explanations offered, the third study builds on the time-invariant individual trait finding from the first study by applying the Australian panel data to an Aggregate Model of Average Happiness over the lifetime. The model's independent variable is the stress that arises from the interaction between personality and the life event shocks that affect individuals and their peers throughout their lives. Interestingly, a graphic depiction of the stress–age relationship reveals an inverse U-shape, the mirror image of the U-shape of happiness in age seen in the first study. The stress arising from life event shocks is found to explain much of the change in average happiness over a lifetime. With the policy recommendations of economists potentially invoking unexpected changes in our lives, the ensuing stress and resulting (un)happiness warrant consideration before economists make policy recommendations.
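
The contrast between pooled and fixed-effects estimates of the age–happiness relationship, which drives the first study's argument, can be sketched on a toy panel as below. The snippet is not the thesis code; it assumes the linearmodels package, and the column names and simulated data are hypothetical stand-ins for the SOEP/HILDA/BHPS panels mentioned in the abstract.

```python
# Minimal sketch, not the thesis code: pooled versus fixed-effects estimates of
# a quadratic age-happiness specification on a simulated toy panel. Assumes the
# linearmodels package; all column names and the simulated data are hypothetical.
import numpy as np
import pandas as pd
from linearmodels.panel import PanelOLS

rng = np.random.default_rng(0)
n_people, n_years = 500, 10
idx = pd.MultiIndex.from_product(
    [range(n_people), range(2000, 2000 + n_years)], names=["person", "year"])
df = pd.DataFrame(index=idx)
df["age"] = rng.integers(18, 70, n_people).repeat(n_years) + np.tile(np.arange(n_years), n_people)
df["age_sq"] = df["age"] ** 2
trait = rng.normal(size=n_people).repeat(n_years)          # time-invariant individual trait
df["happiness"] = 7.0 + 0.5 * trait + rng.normal(scale=1.0, size=len(df))

# The entity (person) fixed effects absorb the time-invariant traits that the
# first study argues lie behind the apparent U-shape.
pooled = PanelOLS.from_formula("happiness ~ 1 + age + age_sq", data=df).fit()
fixed = PanelOLS.from_formula("happiness ~ age + age_sq + EntityEffects", data=df).fit()
print(pooled.params, fixed.params, sep="\n")
```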

Relevance: 20.00%

Abstract:

The question of under what conditions conceptual representation is compositional remains debatable within cognitive science. This paper proposes a well-developed mathematical apparatus for a probabilistic representation of concepts, drawing upon methods developed in quantum theory, to formulate a formal test that can determine whether a specific conceptual combination is compositional or not. This test examines a joint probability distribution modeling the combination, asking whether or not it is factorizable. Empirical studies indicate that some combinations should be considered non-compositional.
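
The factorizability question at the heart of the proposed test can be illustrated with a toy check on a two-variable joint distribution: if the joint probabilities equal the product of the marginals, the combination behaves compositionally under this criterion. The sketch below is far simpler than the paper's quantum-theory-based apparatus, and the counts are invented.

```python
# Toy sketch, far simpler than the paper's quantum-theory-based apparatus:
# check whether an empirical joint probability table over two binary variables
# is close to the product of its marginals, i.e. whether it factorises.
import numpy as np

def factorisation_gap(joint_counts):
    """Largest absolute deviation between P(A, B) and P(A) * P(B)."""
    joint = np.asarray(joint_counts, dtype=float)
    joint = joint / joint.sum()
    p_a = joint.sum(axis=1, keepdims=True)   # marginal distribution of A
    p_b = joint.sum(axis=0, keepdims=True)   # marginal distribution of B
    return float(np.abs(joint - p_a * p_b).max())

# Invented counts for two ways of interpreting a conceptual combination.
correlated = [[40, 10], [10, 40]]    # does not factorise: gap = 0.15
independent = [[25, 25], [25, 25]]   # factorises exactly: gap = 0.0
print(factorisation_gap(correlated), factorisation_gap(independent))
```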

Relevance: 20.00%

Abstract:

Digital human models (DHM) have evolved into useful tools for ergonomic workplace design and product development and are found in various industries and in education. The DHM systems that dominate the market were developed for specific purposes and differ significantly; this is reflected not only in incompatible results of DHM simulations but also in misunderstandings of how DHM simulations relate to real-world problems. While DHM developers are restricted by uncertainty about user needs and by the lack of standards for model data, users are confined to one specific product and cannot exchange results or upgrade to another DHM system, as their previous results would be rendered worthless. Furthermore, the origin and validity of anthropometric and biomechanical data are not transparent to the user. The lack of standardisation in DHM systems has become a major roadblock to further system development, affecting all stakeholders in the DHM industry. Evidently, a framework for standardising digital human models is necessary to overcome current obstructions.

Relevance: 20.00%

Abstract:

Individual-based models describing the migration and proliferation of a population of cells frequently restrict the cells to a predefined lattice. An implicit assumption of this type of lattice-based model is that a proliferative population will always eventually fill the lattice. Here we develop a new lattice-free individual-based model that incorporates cell-to-cell crowding effects. We also derive approximate mean-field descriptions for the lattice-free model in two special cases motivated by commonly used experimental setups. Lattice-free simulation results are compared to these mean-field descriptions and to a corresponding lattice-based model. Data from a proliferation experiment are used to estimate the parameters for the new model, including the cell proliferation rate, showing that the model fits the data well. An important aspect of the lattice-free model is that the confluent cell density is not predefined, as it is with lattice-based models, but is an emergent model property. As a consequence of the more realistic, irregular configuration of cells in the lattice-free model, the population growth rate is much slower at high cell densities and the population cannot reach the same confluent density as an equivalent lattice-based model.
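
A minimal sketch of a lattice-free individual-based simulation with a simple crowding rule is given below. It is not the authors' model: the domain size, rates, step length and overlap criterion are hypothetical, and the mean-field analysis is omitted.

```python
# Minimal sketch, not the authors' model: a lattice-free individual-based
# simulation in 2D where a daughter cell is placed one diameter from its parent
# and accepted only if it does not overlap any other cell (a simple crowding
# rule). The domain size, rates and step length are hypothetical.
import numpy as np

def simulate(n_steps=200, domain=50.0, diameter=1.0,
             prolif_prob=0.05, step_size=0.5, seed=0):
    rng = np.random.default_rng(seed)
    cells = rng.uniform(0.0, domain, size=(10, 2))        # initial population
    for _ in range(n_steps):
        # Migration: every cell takes an unbiased random step.
        angles = rng.uniform(0.0, 2.0 * np.pi, size=len(cells))
        cells = cells + step_size * np.column_stack([np.cos(angles), np.sin(angles)])
        # Proliferation with crowding: attempt division with probability prolif_prob.
        daughters = []
        for cell in cells[rng.random(len(cells)) < prolif_prob]:
            angle = rng.uniform(0.0, 2.0 * np.pi)
            daughter = cell + diameter * np.array([np.cos(angle), np.sin(angle)])
            # Reject the daughter if any cell (other than the parent, which sits
            # exactly one diameter away) is closer than one diameter.
            if np.min(np.linalg.norm(cells - daughter, axis=1)) >= diameter - 1e-9:
                daughters.append(daughter)
        if daughters:
            cells = np.vstack([cells, daughters])
    return cells

print(len(simulate()))  # emergent, crowding-limited population size
```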