214 results for Development models


Relevance: 30.00%

Abstract:

Standard differential equation-based models of collective cell behaviour, such as the logistic growth model, invoke a mean-field assumption, which is equivalent to assuming that individuals within the population interact with each other in proportion to the average population density. Implementing such assumptions implies that the dynamics of the system are unaffected by spatial structure, such as the formation of patches or clusters within the population. Recent theoretical developments have introduced a class of models, known as moment dynamics models, which aim to account for the dynamics of individuals, pairs of individuals, triplets of individuals and so on. Such models enable us to describe the dynamics of populations with clustering; however, little progress has been made with regard to applying moment dynamics models to experimental data. Here, we report new experimental results describing the formation of a monolayer of cells using two different cell types: 3T3 fibroblast cells and MDA MB 231 breast cancer cells. Our analysis indicates that the 3T3 fibroblast cells are relatively motile, and we observe that the 3T3 fibroblast monolayer forms without clustering. In contrast, the MDA MB 231 cells are less motile, and we observe that the MDA MB 231 monolayer formation is associated with significant clustering. We calibrate a moment dynamics model and a standard mean-field model to both data sets. Our results indicate that the mean-field and moment dynamics models provide similar descriptions of the 3T3 fibroblast monolayer formation, whereas these two models give very different predictions for the MDA MB 231 monolayer formation. These outcomes indicate that standard mean-field models of collective cell behaviour are not always appropriate and that care ought to be exercised when implementing such a model.
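The mean-field description calibrated here is the standard logistic growth model. As a point of reference, the following minimal Python sketch evolves that model; the parameter values are illustrative and are not taken from the paper.

```python
# Minimal illustration of the mean-field logistic growth model discussed above.
# Parameter values are illustrative only; they are not taken from the paper.
import numpy as np

def logistic_growth(c0, lam, K, t):
    """Analytic solution of dC/dt = lam * C * (1 - C/K) with C(0) = c0."""
    return K * c0 * np.exp(lam * t) / (K + c0 * (np.exp(lam * t) - 1.0))

t = np.linspace(0.0, 48.0, 97)                       # hours (hypothetical span)
density = logistic_growth(c0=0.05, lam=0.08, K=1.0, t=t)
print(f"density at t = 48 h: {density[-1]:.3f}")
```

A moment dynamics model supplements this single density equation with equations for pair densities, so that clustering can feed back on the net growth rate; that extension is not reproduced here.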

Relevance: 30.00%

Abstract:

The need for a house rental model in Townsville, Australia, is addressed. Models developed for predicting house rental levels are described. An analytical model is built upon a priori selected variables and parameters of rental levels. Regression models are generated to provide a comparison with the analytical model. Issues in model development and performance evaluation are discussed. A comparison of the models indicates that the analytical model performs better than the regression models.
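For context on the regression benchmark, the sketch below fits rental levels by ordinary least squares; the predictors, data and coefficients are hypothetical and are not those reported in the paper.

```python
# Hypothetical ordinary least squares fit of weekly rent against a few
# plausible predictors; variable names and data are illustrative only.
import numpy as np

# columns: bedrooms, distance to CBD (km), floor area (m^2)
X = np.array([[2, 5.0, 90], [3, 2.5, 120], [4, 8.0, 160], [1, 1.0, 55]], dtype=float)
y = np.array([320.0, 450.0, 520.0, 280.0])        # weekly rent (AUD)

X1 = np.column_stack([np.ones(len(X)), X])        # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)     # least-squares fit
print("intercept and coefficients:", np.round(coef, 2))
print("predicted rent (3 bed, 4 km, 110 m^2):", round(float(coef @ [1, 3, 4.0, 110]), 2))
```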

Relevance: 30.00%

Abstract:

Process modeling grammars are used to create scripts of a business domain that a process-aware information system is intended to support. A key grammatical construct of such grammars is known as a Gateway. A Gateway construct is used to describe scenarios in which the workflow of a process diverges or converges according to relevant conditions. Gateway constructs have been subjected to much academic discussion about their meaning, role and usefulness, and have been linked to both process-modeling errors and process-model understandability. This paper examines perceptual discriminability effects of Gateway constructs on an individual's abilities to interpret process models. We compare two ways of expressing two convergence and divergence patterns – Parallel Split and Simple Merge – implemented in a process modeling grammar. On the basis of an experiment with 98 students, we provide empirical evidence that Gateway constructs aid the interpretation of process models due to a perceptual discriminability effect, especially when models are complex. We discuss the emerging implications for research and practice, in terms of revisions to grammar specifications, guideline development and design choices in process modeling.
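To make the two patterns concrete, the following token-flow sketch gives one common reading of Parallel Split and Simple Merge; it is an illustration only, not the grammar or the experimental material used in the study.

```python
# Minimal token-flow reading of the two workflow patterns compared above.

def parallel_split(case_id, branches):
    """Parallel Split (AND-split): the incoming case continues on every
    outgoing branch concurrently."""
    return [(case_id, branch) for branch in branches]

def simple_merge(incoming_tokens):
    """Simple Merge (XOR-join): each arriving token continues on the single
    outgoing branch as it arrives, with no synchronisation."""
    for case_id, _branch in incoming_tokens:
        yield case_id

tokens = parallel_split("order-42", ["check stock", "verify payment"])
print(tokens)                       # the case now flows on both branches
print(list(simple_merge(tokens)))   # each completion passes through separately
```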

Relevance: 30.00%

Abstract:

Biological systems exhibit a wide range of contextual effects, and this often makes it difficult to construct valid mathematical models of their behaviour. In particular, mathematical paradigms built upon the successes of Newtonian physics make assumptions about the nature of biological systems that are unlikely to hold true. After discussing two of the key assumptions underlying the Newtonian paradigm, we discuss two key aspects of the formalism that extended it: Quantum Theory (QT). We draw attention to the similarities between biological and quantum systems, motivating the development of a similar formalism that can be applied to the modelling of biological processes.

Relevance: 30.00%

Abstract:

Total Dik! is a collaborative project between the Queensland University of Technology (QUT) and Queensland Theatre Company (QTC). Total Dik! explores transmedia storytelling in live performance from concept development to delivery and builds on works that use visual art, puppetry, music and film, such as By the Way, Meet Vera Stark (Forrester, 2012), Hotel Modern’s Kamp (2005) and God’s Beard (2012). The project’s first iteration enabled an interrogation of the integration of media-rich elements with live performers in a theatrical environment. Performative transmedia storytelling draws on the tenets of convergent media theory developed by Jenkins (2007, 2012), Dena (2010) and Philips (2012). This exploratory work, juxtaposing transmedia storytelling techniques with live performance, draws on Samuel Beckett’s challenges to theatre orthodoxy, and touches on Brechtian notions of alienation through ‘sleight-of-hand’ or processual unpacking and deconstruction during performance. Total Dik! blends technologies, models, green-screen capture and live dimensions of performance in one narrative, allowing the work’s creators to test new combinations of transmedia storytelling techniques on a traditional performance platform.

Relevance: 30.00%

Abstract:

Recent fire research into the behaviour of light gauge steel frame (LSF) wall systems has developed fire design rules based on the Australian and European cold-formed steel design standards, AS/NZS 4600 and Eurocode 3 Part 1.3. However, these design rules are complex since the LSF wall studs are subjected to non-uniform elevated temperature distributions when the walls are exposed to fire from one side. Therefore this paper proposes an alternative design method for routine predictions of the fire resistance rating of LSF walls. In this method, suitable equations are recommended first to predict the idealised stud time-temperature profiles of eight different LSF wall configurations subject to standard fire conditions, based on full scale fire test results. A new set of equations was then proposed to find the critical hot flange (failure) temperature for a given load ratio for the same LSF wall configurations with varying steel grades and thicknesses. These equations were developed based on detailed finite element analyses that predicted the axial compression capacities and failure times of LSF wall studs subject to non-uniform temperature distributions with varying steel grades and thicknesses. This paper proposes a simple design method in which the two sets of equations developed for time-temperature profiles and critical hot flange temperatures are used to find the failure times of LSF walls. The proposed method was verified by comparing its predictions with the results from full scale fire tests and finite element analyses. This paper presents the details of this study including the finite element models of LSF wall studs, the results from relevant fire tests and finite element analyses, and the proposed equations.
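The proposed method combines the two sets of equations (idealised time-temperature profiles and critical hot flange temperatures) to obtain a failure time. The sketch below only illustrates that combination; the linear expressions are hypothetical placeholders, not the equations developed in the paper.

```python
# How the two sets of equations combine to give a failure time. The linear
# forms below are hypothetical placeholders, not the paper's equations.
import numpy as np

def hot_flange_temperature(t_minutes):
    """Hypothetical idealised hot flange time-temperature profile (deg C)."""
    return 20.0 + 8.0 * t_minutes                 # placeholder linear ramp

def critical_temperature(load_ratio):
    """Hypothetical critical hot flange temperature for a given load ratio."""
    return 650.0 - 400.0 * load_ratio             # placeholder linear relation

def failure_time(load_ratio, t_max=180.0):
    """Failure time: when the hot flange temperature reaches the critical value."""
    t = np.linspace(0.0, t_max, 1801)
    reached = hot_flange_temperature(t) >= critical_temperature(load_ratio)
    return float(t[reached][0]) if reached.any() else None

print(f"failure time at load ratio 0.4: {failure_time(0.4):.1f} min")
```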

Relevance: 30.00%

Abstract:

The determinants and key mechanisms of cancer cell osteotropism have not been identified, mainly due to the lack of reproducible animal models representing the biological, genetic and clinical features seen in humans. An ideal model should be capable of recapitulating as many steps of the metastatic cascade as possible, thus facilitating the development of prognostic markers and novel therapeutic strategies. Most animal models of bone metastasis still have to be derived experimentally as most syngeneic and transgenic approaches do not provide a robust skeletal phenotype and do not recapitulate the biological processes seen in humans. The xenotransplantation of human cancer cells or tumour tissue into immunocompromised murine hosts provides the possibility to simulate early and late stages of the human disease. Human bone or tissue-engineered human bone constructs can be implanted into the animal to recapitulate more subtle, species-specific aspects of the mutual interaction between human cancer cells and the human bone microenvironment. Moreover, the replication of the entire "organ" bone makes it possible to analyse the interaction between cancer cells and the haematopoietic niche and to confer at least partial human immunity to the murine host. This process of humanisation is facilitated by novel immunocompromised mouse strains that allow a high engraftment rate of human cells or tissue. These humanised xenograft models provide an important research tool to study human biological processes of bone metastasis.

Relevance: 30.00%

Abstract:

Introduction: Recent advances in the planning and delivery of radiotherapy treatments have resulted in improvements in the accuracy and precision with which therapeutic radiation can be administered. As the complexity of the treatments increases it becomes more difficult to predict the dose distribution in the patient accurately. Monte Carlo (MC) methods have the potential to improve the accuracy of the dose calculations and are increasingly being recognised as the ‘gold standard’ for predicting dose deposition in the patient [1]. This project has three main aims: 1. To develop tools that enable the transfer of treatment plan information from the treatment planning system (TPS) to an MC dose calculation engine. 2. To develop tools for comparing the 3D dose distributions calculated by the TPS and the MC dose engine. 3. To investigate the radiobiological significance of any errors between the TPS patient dose distribution and the MC dose distribution in terms of Tumour Control Probability (TCP) and Normal Tissue Complication Probabilities (NTCP). The work presented here addresses the first two aims. Methods: (1a) Plan Importing: A database of commissioned accelerator models (Elekta Precise and Varian 2100CD) has been developed for treatment simulations in the MC system (EGSnrc/BEAMnrc). Beam descriptions can be exported from the TPS using the widespread DICOM framework, and the resultant files are parsed with the assistance of a software library (PixelMed Java DICOM Toolkit). The information in these files (such as the monitor units, the jaw positions and gantry orientation) is used to construct a plan-specific accelerator model which allows an accurate simulation of the patient treatment field. (1b) Dose Simulation: The calculation of a dose distribution requires patient CT images which are prepared for the MC simulation using a tool (CTCREATE) packaged with the system. Beam simulation results are converted to absolute dose per MU using calibration factors recorded during the commissioning process and treatment simulation. These distributions are combined according to the MU meter settings stored in the exported plan to produce an accurate description of the prescribed dose to the patient. (2) Dose Comparison: TPS dose calculations can be obtained either via a DICOM export or by direct retrieval of binary dose files from the file system. Dose difference, gamma evaluation and normalised dose difference algorithms [2] were employed for the comparison of the TPS dose distribution and the MC dose distribution. These implementations are spatial resolution independent and able to interpolate for comparisons. Results and Discussion: The tools successfully produced Monte Carlo input files for a variety of plans exported from the Eclipse (Varian Medical Systems) and Pinnacle (Philips Medical Systems) planning systems, ranging in complexity from a single uniform square field to a five-field step-and-shoot IMRT treatment. The simulation of collimated beams has been verified geometrically, and validation of dose distributions in a simple body phantom (QUASAR) will follow. The developed dose comparison algorithms have also been tested with controlled dose distribution changes. Conclusion: The capability of the developed code to independently process treatment plans has been demonstrated.
A number of limitations exist: only static fields are currently supported (dynamic wedges and dynamic IMRT will require further development), and the process has not been tested for planning systems other than Eclipse and Pinnacle. The tools will be used to independently assess the accuracy of the current treatment planning system dose calculation algorithms for complex treatment deliveries such as IMRT in treatment sites where patient inhomogeneities are expected to be significant. Acknowledgements: Computational resources and services used in this work were provided by the HPC and Research Support Group, Queensland University of Technology, Brisbane, Australia. Pinnacle dose parsing made possible with the help of Paul Reich, North Coast Cancer Institute, North Coast, New South Wales.
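For the dose comparison step, the gamma evaluation combines a dose-difference criterion with a distance-to-agreement criterion. The 1-D sketch below uses toy profiles and generic 3%/3 mm criteria to illustrate the idea; it is not the implementation described above.

```python
# Illustrative 1-D gamma evaluation of a reference (e.g. TPS) dose profile
# against an evaluated (e.g. Monte Carlo) profile. Profiles and criteria
# are toy values, not data from this project.
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dose_crit=0.03, dist_crit_mm=3.0):
    """Global gamma index per reference point (default 3%/3 mm)."""
    dd = dose_crit * dose_ref.max()               # global dose criterion
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dist2 = ((x - xi) / dist_crit_mm) ** 2
        dose2 = ((dose_eval - di) / dd) ** 2
        gamma[i] = np.sqrt(np.min(dist2 + dose2))
    return gamma

x = np.arange(0.0, 100.0, 1.0)                    # position (mm)
ref = np.exp(-((x - 50.0) / 20.0) ** 2)           # toy reference profile
ev = 1.02 * np.exp(-((x - 50.5) / 20.0) ** 2)     # toy evaluated profile
g = gamma_1d(x, ref, ev)
print(f"gamma pass rate (gamma <= 1): {100.0 * np.mean(g <= 1.0):.1f}%")
```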

Relevance: 30.00%

Abstract:

This book presents readers with the opportunity to fundamentally re-evaluate the processes of innovation and entrepreneurship, and to rethink how they might best be stimulated and fostered within our organizations and communities. The fundamental thesis of the book is that the entrepreneurial process is not a linear progression from novel idea to successful innovation, but is an iterative series of experiments, where progress depends on the persistence and resilience of the individuals involved, and their ability to learn from failure as well as success. From this premise, the authors argue that the ideal environment for new venture creation is a form of “experimental laboratory,” a community of innovators where ideas are generated, shared, and refined; where experiments are encouraged; and which itself serves as a test environment for those ideas and experiments. This environment is quite different from the traditional “incubator,” which may impose the disciplines of the established firm too early in the development of the new venture.

Relevance: 30.00%

Abstract:

The traditional hospital-based model of cardiac rehabilitation faces substantial challenges, such as cost and accessibility. These challenges have led to the development of alternative models of cardiac rehabilitation in recent years. The aim of this study was to identify and critique evidence for the effectiveness of these alternative models. A total of 22 databases were searched to identify quantitative studies or systematic reviews of quantitative studies regarding the effectiveness of alternative models of cardiac rehabilitation. Included studies were appraised using a Critical Appraisal Skills Programme tool and the National Health and Medical Research Council's designations for Level of Evidence. The 83 included articles described interventions in the following broad categories of alternative models of care: multifactorial individualized telehealth, internet based, telehealth focused on exercise, telehealth focused on recovery, community- or home-based, and complementary therapies. Multifactorial individualized telehealth and community- or home-based cardiac rehabilitation are effective alternative models of cardiac rehabilitation, as they have produced similar reductions in cardiovascular disease risk factors compared with hospital-based programmes. While further research is required to address the paucity of data available regarding the effectiveness of alternative models of cardiac rehabilitation in rural, remote, and culturally and linguistically diverse populations, our review indicates there is no need to rely on hospital-based strategies alone to deliver effective cardiac rehabilitation. Local healthcare systems should strive to integrate alternative models of cardiac rehabilitation, such as brief telehealth interventions tailored to individuals' risk factor profiles as well as community- or home-based programmes, in order to ensure there are choices available for patients that best fit their needs, risk factor profiles, and preferences.

Relevance: 30.00%

Abstract:

This thesis highlights the limitations of existing car-following models in emulating driver behaviour for safety study purposes. It also compares the capabilities of mainstream car-following models in emulating precise driver behaviour parameters such as headways and times to collision. The comparison evaluates the robustness of each car-following model for reproducing safety metrics. A new car-following model, based on the personal space concept and the fish school model, is proposed to simulate traffic metrics more precisely. This new model is capable of reflecting changes in the headway distribution after imposing speed limits from variable speed limit (VSL) systems. This research facilitates assessing Intelligent Transportation Systems on motorways, using microscopic simulation.
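Two of the safety metrics referred to above, time headway and time to collision (TTC), can be computed directly from the kinematics of a follower-leader pair. The sketch below uses generic kinematics; it is not the proposed personal-space / fish-school model.

```python
# Headway and time-to-collision (TTC) for a follower-leader pair.
# Values are illustrative; this is not the thesis's car-following model.
def time_headway(gap_m, follower_speed_ms):
    """Time headway: spacing divided by the follower's speed (seconds)."""
    return gap_m / follower_speed_ms if follower_speed_ms > 0 else float("inf")

def time_to_collision(gap_m, follower_speed_ms, leader_speed_ms):
    """TTC: time until collision if both vehicles keep their current speeds."""
    closing_speed = follower_speed_ms - leader_speed_ms
    return gap_m / closing_speed if closing_speed > 0 else float("inf")

print(f"headway: {time_headway(25.0, 20.0):.2f} s")              # 1.25 s
print(f"TTC:     {time_to_collision(25.0, 20.0, 15.0):.2f} s")   # 5.00 s
```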

Relevance: 30.00%

Abstract:

Invasion waves of cells play an important role in development, disease and repair. Standard discrete models of such processes typically involve simulating cell motility, cell proliferation and cell-to-cell crowding effects in a lattice-based framework. The continuum-limit description is often given by a reaction–diffusion equation that is related to the Fisher–Kolmogorov equation. One of the limitations of a standard lattice-based approach is that real cells move and proliferate in continuous space and are not restricted to a predefined lattice structure. We present a lattice-free model of cell motility and proliferation, with cell-to-cell crowding effects, and we use the model to replicate invasion wave-type behaviour. The continuum-limit description of the discrete model is a reaction–diffusion equation with a proliferation term that is different from lattice-based models. Comparing lattice-based and lattice-free simulations indicates that both models lead to invasion fronts that are similar at the leading edge, where the cell density is low. Conversely, the two models make different predictions in the high density region of the domain, well behind the leading edge. We analyse the continuum-limit description of the lattice-based and lattice-free models to show that both give rise to invasion wave-type solutions that move with the same speed but have very different shapes. We explore the significance of these differences by calibrating the parameters in the standard Fisher–Kolmogorov equation using data from the lattice-free model. We conclude that estimating parameters using this kind of standard procedure can produce misleading results.
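The continuum-limit benchmark referred to above is the Fisher–Kolmogorov equation, du/dt = D d²u/dx² + λu(1 − u), whose long-time fronts travel at speed 2√(Dλ). The finite-difference sketch below uses illustrative parameter values to show that behaviour; it is not the lattice-free discrete model.

```python
# Explicit finite-difference sketch of the Fisher-Kolmogorov equation
#   du/dt = D * d2u/dx2 + lam * u * (1 - u).
# Parameters are illustrative; long-time fronts move at roughly 2*sqrt(D*lam).
import numpy as np

D, lam = 1.0, 1.0
dx, dt = 0.5, 0.05                       # dt < dx**2 / (2 * D) for stability
x = np.arange(0.0, 200.0 + dx, dx)
u = np.where(x < 10.0, 1.0, 0.0)         # initially occupied region on the left

def front_position(u, x, level=0.5):
    return x[np.argmax(u < level)]       # first point where density drops below 0.5

p0 = front_position(u, x)
for _ in range(int(40.0 / dt)):          # evolve to t = 40
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    lap[0], lap[-1] = lap[1], lap[-2]    # crude zero-flux boundaries
    u = u + dt * (D * lap + lam * u * (1.0 - u))

speed = (front_position(u, x) - p0) / 40.0
print(f"measured front speed {speed:.2f} vs 2*sqrt(D*lam) = {2.0 * np.sqrt(D * lam):.2f}")
```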

Relevance: 30.00%

Abstract:

This thesis describes the use of 2- and 3-dimensional cell-based models for studying how skin cells respond to ultraviolet radiation. These methods were used to investigate skin damage and repair after exposure to radiation in the context of skin cancer development. Interactions between different skin cell types were demonstrated as being significant in protecting against ultraviolet radiation-induced skin damage. This has important implications in understanding how skin cancers occur, as well as in the development of new strategies to prevent and treat them.

Relevance: 30.00%

Abstract:

Chlamydia is responsible for a wide range of diseases with enormous global economic and health burden. As the majority of chlamydial infections are asymptomatic, a vaccine has greatest potential to reduce infection and disease prevalence. Protective immunity against Chlamydia requires the induction of a mucosal immune response, ideally at the multiple sites in the body where an infection can be established. Mucosal immunity is most effectively stimulated by targeting vaccination to the epithelium, which is best accomplished by direct vaccine application to mucosal surfaces rather than by injection. The efficacy of needle-free vaccines, however, is reliant on a powerful adjuvant to overcome mucosal tolerance. As very few adjuvants have proven able to elicit mucosal immunity without harmful side effects, there is a need to develop non-toxic adjuvants or safer ways to administer pre-existing toxic adjuvants. In the present study we investigated the novel non-toxic mucosal adjuvant CTA1-DD. The immunogenicity of CTA1-DD was compared to our "gold-standard" mucosal adjuvant combination of cholera toxin (CT) and cytosine-phosphate-guanosine oligodeoxynucleotide (CpG-ODN). We also utilised different needle-free immunisation routes, intranasal (IN), sublingual (SL) and transcutaneous (TC), to stimulate the induction of immunity at the multiple mucosal surfaces in the body where Chlamydia is known to infect. Moreover, administering each adjuvant by different routes may also limit the toxicity of the CT/CpG adjuvant, which is currently restricted from use in humans. Mice were immunised with either adjuvant together with the chlamydial major outer membrane protein (MOMP) to evaluate vaccine safety and quantify the induction of antigen-specific mucosal immune responses. The level of protection against infection and disease was also assessed in vaccinated animals following a live genital or respiratory tract infectious challenge. The non-toxic CTA1-DD was found to be safe and immunogenic when delivered via the IN route in mice, inducing a comparable mucosal response and level of protective immunity against chlamydial challenge to its toxic CT/CpG counterpart administered by the same route. The utilisation of different routes of immunisation strongly influenced the distribution of antigen-specific responses to distant mucosal surfaces and also abrogated the toxicity of CT/CpG. The CT/CpG-adjuvanted vaccine was safe when administered by the SL and TC routes and conferred partial immunity against infection and pathology in both challenge models. This protection was attributed to the induction of antigen-specific pro-inflammatory cellular responses in the lymph nodes regional to the site of infection rather than in the spleen. Development of non-toxic adjuvants and effective ways to reduce the side effects of toxic adjuvants has profound implications for vaccine development, particularly against mucosal pathogens like Chlamydia. Interestingly, we also identified, in both infection models, two contrasting vaccines capable of exclusively preventing either infection or pathology. This indicated that the development of pathology following an infection of vaccinated animals was independent of bacterial load and was instead the result of immunopathology, potentially driven by the adaptive immune response generated following immunisation. While both vaccines induced high levels of interleukin (IL)-17 cytokines, the pathology-protected group displayed significantly reduced expression of the corresponding IL-17 receptors and hence an inhibition of signalling. This indicated that the balance of IL-17-mediated responses defines the degree of protection against infection and tissue damage generated following vaccination. This study has enabled us to better understand the immune basis of pathology and protection, necessary to design more effective vaccines.

Relevance: 30.00%

Abstract:

This paper evaluates the efficiency of a number of popular corpus-based distributional models in performing discovery on very large document sets, including online collections. Literature-based discovery is the process of identifying previously unknown connections from text, often published literature, that could lead to the development of new techniques or technologies. Literature-based discovery has attracted growing research interest ever since Swanson's serendipitous discovery of the therapeutic effects of fish oil on Raynaud's disease in 1986. The successful application of distributional models in automating the identification of indirect associations underpinning literature-based discovery has been demonstrated extensively in the medical domain. However, we wish to investigate the computational complexity of distributional models for literature-based discovery on much larger document collections, as they may provide computationally tractable solutions to tasks including predicting future disruptive innovations. In this paper we perform a computational complexity analysis on four successful corpus-based distributional models to evaluate their fit for such tasks. Our results indicate that corpus-based distributional models that store their representations in fixed dimensions provide superior efficiency on literature-based discovery tasks.
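The efficiency argument for fixed-dimensional representations can be seen in how an indirect (A-B-C) association is scored: the cost per candidate term is proportional to the fixed dimensionality, not to corpus size or co-occurrence density. The sketch below illustrates that mechanism with Swanson's fish oil / Raynaud's disease example; the vectors are random placeholders (in practice they would be learned from a corpus), and this is not one of the four models evaluated in the paper.

```python
# Indirect (A-B-C) association scoring with fixed-dimensional term vectors.
# Vectors here are random placeholders standing in for corpus-derived ones.
import numpy as np

rng = np.random.default_rng(0)
dim = 256                                    # fixed dimensionality
vocab = ["fish oil", "blood viscosity", "platelet aggregation", "raynaud disease"]
vectors = {term: rng.standard_normal(dim) for term in vocab}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Score a candidate target C against the profile of bridging B-terms (assumed
# already linked to the source A-term "fish oil"); the cost per candidate is
# O(dim), independent of vocabulary size or co-occurrence density.
bridges = ["blood viscosity", "platelet aggregation"]
bridge_profile = np.sum([vectors[b] for b in bridges], axis=0)
score = cosine(bridge_profile, vectors["raynaud disease"])
print(f"indirect association score, 'fish oil' -> 'raynaud disease': {score:.3f}")
```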