Abstract:
We consider a discrete agent-based model on a one-dimensional lattice, where each agent occupies L sites and attempts movements over a distance of d lattice sites. Agents obey a strict simple exclusion rule. A discrete-time master equation is derived using a mean-field approximation and careful probability arguments. In the continuum limit, nonlinear diffusion equations that describe the average agent occupancy are obtained. Averaged discrete simulation data are generated and shown to compare very well with the solution to the derived nonlinear diffusion equations. This framework allows us to approach a lattice-free result using all the advantages of lattice methods. Since different cell types have different shapes and speeds of movement, this work offers insight into population-level behavior of collective cellular motion.
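For intuition, here is a minimal Python sketch of the discrete mechanism the abstract describes (our illustration, not the authors' code; the function and parameter names are ours): agents of length `agent_len` attempt hops of `hop` sites and abort any move that violates simple exclusion.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(num_sites=200, agent_len=2, hop=1, num_agents=30, steps=10_000):
    """Minimal sketch of the lattice model: each agent occupies `agent_len`
    contiguous sites and attempts hops of `hop` sites left or right; any move
    that would overlap another agent is aborted (simple exclusion)."""
    positions = np.arange(num_agents) * (agent_len + 1)   # non-overlapping start
    occupied = set()
    for p in positions:
        occupied.update(range(p, p + agent_len))
    for _ in range(steps):
        i = rng.integers(num_agents)                      # random sequential update
        q = positions[i] + (hop if rng.random() < 0.5 else -hop)
        old = set(range(positions[i], positions[i] + agent_len))
        new = set(range(q, q + agent_len))
        if q < 0 or q + agent_len > num_sites or (new - old) & occupied:
            continue                                      # blocked by wall or exclusion
        occupied -= old
        occupied |= new
        positions[i] = q
    return positions

print(np.sort(simulate()))
```

Averaging occupancy over many such realisations is what is compared against the continuum nonlinear diffusion equation in the paper.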
Abstract:
Computational models in physiology often integrate functional and structural information from a large range of spatio-temporal scales, from the ionic to the whole-organ level. Their sophistication raises both expectations and scepticism concerning how computational methods can improve our understanding of living organisms, and also how they can reduce, replace and refine animal experiments. A fundamental requirement for fulfilling these expectations and achieving the full potential of computational physiology is a clear understanding of what models represent and how they can be validated. The present study aims to inform strategies for validation by elucidating the complex interrelations between experiments, models and simulations in cardiac electrophysiology. We describe the processes, data and knowledge involved in the construction of whole-ventricular multiscale models of cardiac electrophysiology. Our analysis reveals that models, simulations and experiments are intertwined in an assemblage that is itself a system: the model-simulation-experiment (MSE) system. Validation must therefore take into account the complex interplay between models, simulations and experiments. Key points for developing strategies for validation are: 1) understanding sources of bio-variability is crucial to the comparison between simulation and experimental results; 2) robustness of techniques and tools is a prerequisite for conducting physiological investigations using the MSE system; 3) the definition and adoption of standards facilitates interoperability of experiments, models and simulations; 4) physiological validation must be understood as an iterative process that defines the specific aspects of electrophysiology the MSE system targets, and is driven by advancements in experimental and computational methods and the combination of both.
Abstract:
This thesis addresses the topic of real-time decision making by driverless (autonomous) city vehicles, i.e. their ability to make appropriate driving decisions in non-simplified urban traffic conditions. After reviewing the state of research and explaining the research question, the thesis presents solutions for the subcomponents relevant to decision making with respect to information input (World Model), information output (Driving Maneuvers), and the real-time decision-making process. The World Model is a software component developed to collect information from the perception and communication subsystems, maintain an up-to-date view of the vehicle's environment, and provide the required input information to the Real-Time Decision Making subsystem in a well-defined and structured way. The real-time decision-making process consists of two consecutive stages. The first stage uses a Petri net to model the safety-critical selection of feasible driving maneuvers; the second stage uses Multiple Criteria Decision Making (MCDM) methods to select the most appropriate driving maneuver, focusing on objectives related to efficiency and comfort. The complex task of autonomous driving is subdivided into subtasks, called driving maneuvers, which represent the output (i.e. decision alternatives) of the real-time decision-making process. Driving maneuvers are implementations of closed-loop control algorithms, each capable of maneuvering the autonomous vehicle in a specific traffic situation. Experiments in both 3D simulation and real-world driving attest that the developed approach is suitable for dealing with the complexity of real-world urban traffic situations.
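As a rough illustration of the two-stage selection, here is a hedged Python sketch in which the Petri-net feasibility stage is abstracted to a boolean flag and the MCDM stage is reduced to a simple weighted sum (one of several MCDM methods; the thesis may use a different one). All maneuver names, weights and scores are illustrative.

```python
from dataclasses import dataclass

@dataclass
class Maneuver:
    name: str
    feasible: bool      # stage 1: safety-critical feasibility (Petri-net output in the thesis)
    efficiency: float   # stage 2 criteria, each normalised to [0, 1]
    comfort: float

def select_maneuver(maneuvers, w_eff=0.6, w_comf=0.4):
    """Sketch of the two-stage process: filter to safety-feasible maneuvers,
    then rank the survivors with a weighted-sum MCDM score."""
    candidates = [m for m in maneuvers if m.feasible]
    if not candidates:
        raise RuntimeError("no feasible maneuver: fall back to emergency stop")
    return max(candidates, key=lambda m: w_eff * m.efficiency + w_comf * m.comfort)

print(select_maneuver([
    Maneuver("follow_lane", True, 0.9, 0.8),
    Maneuver("overtake", True, 0.95, 0.5),
    Maneuver("change_lane_left", False, 1.0, 0.7),  # vetoed by stage 1
]).name)
```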
Abstract:
Plant food materials are in very high demand in the consumer market; improved food products and efficient processing techniques are therefore active areas of food engineering research. In this context, numerical modelling and simulation techniques have a very high potential to reveal the fundamentals of the underlying mechanisms. However, numerical modelling of plant food materials during drying is quite challenging, mainly due to the complexity of the material's multiphase microstructure, which undergoes excessive deformations during drying. Conventional grid-based modelling techniques have limited applicability here because of the fundamental limitations of their inflexible grids. Meshfree methods have recently been developed which, being fundamentally grid-free, offer a more adaptable approach to problem domains of this nature. In this work, a recently developed meshfree two-dimensional plant tissue model is used for a comparative study of microscale morphological changes of several food materials during drying. The model couples Smoothed Particle Hydrodynamics (SPH) and the Discrete Element Method (DEM) to represent the fluid and solid phases of the cellular structure. Simulations are conducted on apple, potato, carrot and grape tissues, and the results are qualitatively and quantitatively compared with experimental findings from the literature. The study revealed that cellular deformations are highly sensitive to cell dimensions, cell wall physical and mechanical properties, middle lamella properties and turgor pressure. In particular, the meshfree model is well suited to simulating critically dried tissue at low moisture content and turgor pressure, conditions that lead to cell wall wrinkling. The findings further highlight the potential of the meshfree approach for modelling large deformations of the plant tissue microstructure during drying, a distinct advantage over state-of-the-art grid-based approaches.
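The fluid-phase half of such a model rests on SPH kernel summation. The Python sketch below shows only that basic ingredient, a standard 2D cubic-spline kernel and a density estimate, not the coupled SPH-DEM tissue model itself; all values are illustrative.

```python
import numpy as np

def cubic_spline_w(r, h):
    """Standard 2D cubic-spline SPH kernel (normalisation 10 / (7 pi h^2))."""
    q = r / h
    sigma = 10.0 / (7.0 * np.pi * h**2)
    return sigma * np.where(q < 1.0, 1 - 1.5 * q**2 + 0.75 * q**3,
                   np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))

def sph_density(positions, masses, h):
    """Density at each particle by kernel summation over all particles --
    the core SPH operation used for the fluid (cell protoplasm) phase."""
    r = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    return (masses[None, :] * cubic_spline_w(r, h)).sum(axis=1)

pts = np.random.default_rng(0).uniform(0, 1e-4, size=(50, 2))  # 50 particles in a cell
print(sph_density(pts, np.full(50, 1e-9), h=2e-5)[:5])
```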
Abstract:
This thesis developed a high-performing alternative numerical technique to investigate microscale morphological changes of plant food materials during drying. The technique is based on a novel meshfree method and is better able to model large deformations of multiphase problem domains than conventional grid-based numerical modelling techniques. The developed cellular model can effectively replicate dried-tissue morphological changes such as shrinkage and cell wall wrinkling, as influenced by moisture reduction and turgor loss.
Abstract:
Existing research and best practice were used to develop the Project Management, Stakeholder Engagement and Change Facilitation (PSC) approach to road safety infrastructure projects. Two case studies involving Queensland Transport and Main Roads demonstrated that the PSC has the potential to create synergies for projects undertaken by multi-disciplinary road safety groups and to complement Safe System projects and philosophy. The case studies, the North West Road Safety Alliance project and the implementation of Road Safety Audit policy, utilised a mix of qualitative and quantitative methods including interviews and surveys.
Abstract:
This paper demonstrates a refined procedure for quantifying surface-enhanced Raman scattering (SERS) enhancement factors with improved precision. The method relies on deducting the resonance Raman scattering (RRS) contribution from surface-enhanced resonance Raman scattering (SERRS), isolating the surface-enhancement (SERS) effect alone. We employed 1,8,15,22-tetraaminophthalocyanato-cobalt(II) (4α-CoIITAPc), a resonance Raman- and electrochemically redox-active chromophore, as the probe molecule for the RRS and SERRS experiments. The number of 4α-CoIITAPc molecules contributing to the RRS and SERRS phenomena on plasmon-inactive glassy carbon (GC) and plasmon-active GC/Au surfaces, respectively, was precisely estimated by cyclic voltammetry. Furthermore, the SERS substrate enhancement factor (SSEF) quantified by our approach is compared with traditionally employed methods. We also demonstrate that the present approach to SSEF quantification can be applied to any SERS substrate by choosing an appropriate laser line and probe molecule.
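In the standard formulation, a substrate enhancement factor compares per-molecule signals on the plasmon-active and plasmon-inactive surfaces. A minimal Python sketch of that ratio, with purely illustrative numbers (the paper's actual intensities and CV-derived molecule counts are not reproduced here):

```python
def ssef(i_serrs, n_serrs, i_rrs, n_rrs):
    """Substrate enhancement factor after removing the resonance contribution:
    per-molecule SERRS signal (plasmon-active GC/Au) divided by per-molecule
    RRS signal (plasmon-inactive GC). The molecule counts N would come from
    cyclic-voltammetry surface-coverage measurements, as in the paper."""
    return (i_serrs / n_serrs) / (i_rrs / n_rrs)

# Illustrative numbers only:
print(f"SSEF ~ {ssef(i_serrs=5.0e6, n_serrs=1.0e9, i_rrs=2.0e4, n_rrs=5.0e11):.1e}")
```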
Abstract:
Background Multi-attribute utility instruments (MAUIs) are preference-based measures that comprise a health state classification system (HSCS) and a scoring algorithm that assigns a utility value to each health state in the HSCS. When developing a MAUI from a health-related quality of life (HRQOL) questionnaire, a HSCS must first be derived. This typically involves selecting a subset of domains and items, because HRQOL questionnaires usually have too many items to be amenable to the valuation task required to develop the scoring algorithm for a MAUI. Currently, exploratory factor analysis (EFA) followed by Rasch analysis is recommended for deriving a MAUI from a HRQOL measure. Aim To determine whether confirmatory factor analysis (CFA) is more appropriate and efficient than EFA for deriving a HSCS from the European Organisation for Research and Treatment of Cancer's core HRQOL questionnaire, the Quality of Life Questionnaire (QLQ-C30), given its well-established domain structure. Methods QLQ-C30 (Version 3) data were collected from 356 patients receiving palliative radiotherapy for recurrent/metastatic cancer (various primary sites). The dimensional structure of the QLQ-C30 was tested with EFA and CFA, the latter informed by the established QLQ-C30 structure and the views of both patients and clinicians on which items are most relevant. Dimensions determined by EFA or CFA were then subjected to Rasch analysis. Results CFA results generally supported the proposed QLQ-C30 structure (comparative fit index = 0.99, Tucker–Lewis index = 0.99, root mean square error of approximation = 0.04). EFA revealed fewer factors, and some items cross-loaded on multiple factors. Further assessment of dimensionality with Rasch analysis allowed better alignment of the EFA dimensions with those detected by CFA. Conclusion CFA was more appropriate and efficient than EFA in producing clinically interpretable results for the HSCS of a proposed new cancer-specific MAUI. Our findings suggest that CFA should generally be recommended when deriving a preference-based measure from a HRQOL measure that has an established domain structure.
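As a rough sketch of the EFA-versus-CFA workflow (not the study's code), using two third-party Python packages, factor_analyzer and semopy, whose availability and exact APIs are assumptions here; the data are synthetic and the item/domain names are placeholders, not the QLQ-C30's real labels:

```python
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
import semopy

# Synthetic stand-in for item-level data (356 patients, as in the study);
# 4 items per placeholder domain.
rng = np.random.default_rng(0)
latent = rng.normal(size=(356, 3))
loadings = np.kron(np.eye(3), np.ones((1, 4)))
items = pd.DataFrame(latent @ loadings + 0.5 * rng.normal(size=(356, 12)),
                     columns=[f"{d}{i}" for d in ("pf", "ef", "fa") for i in range(1, 5)])

# Exploratory: let the data suggest a structure, then inspect cross-loadings.
efa = FactorAnalyzer(n_factors=3, rotation="oblimin")
efa.fit(items.values)
print(efa.loadings_.round(2))

# Confirmatory: impose the (placeholder) domain structure and check fit.
desc = """
physical  =~ pf1 + pf2 + pf3 + pf4
emotional =~ ef1 + ef2 + ef3 + ef4
fatigue   =~ fa1 + fa2 + fa3 + fa4
"""
cfa = semopy.Model(desc)
cfa.fit(items)
print(semopy.calc_stats(cfa)[["CFI", "TLI", "RMSEA"]])  # column names assumed
```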
Abstract:
Interpolation techniques for spatial data are applied frequently in various fields of the geosciences. Although most conventional interpolation methods assume that first- and second-order statistics suffice to characterize random fields, researchers have now realized that these methods cannot always provide reliable interpolation results, since geological and environmental phenomena tend to be very complex, presenting non-Gaussian distributions and/or non-linear inter-variable relationships. This paper proposes a new, highly flexible approach to the interpolation of spatial data. Suitable cross-variable higher-order spatial statistics are developed to measure the spatial relationship between the random variable at an unsampled location and those in its neighbourhood. Given the computed cross-variable higher-order spatial statistics, the conditional probability density function (CPDF) is approximated via polynomial expansions and then used to determine the interpolated value at the unsampled location as an expectation. In addition, the uncertainty associated with the interpolation is quantified by constructing prediction intervals for the interpolated values. The proposed method is applied to a mineral deposit dataset, and the results demonstrate that it outperforms kriging methods in uncertainty quantification. The introduction of cross-variable higher-order spatial statistics noticeably improves the quality of the interpolation, since it enriches the information that can be extracted from the observed data; this benefit is substantial when working with data that are sparse or have non-trivial dependence structures.
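Once the CPDF has been approximated, the final two steps, taking the expectation and building a prediction interval, are mechanical. A minimal Python sketch under the assumption that the CPDF is already available on a grid (the higher-order-statistics and polynomial-expansion machinery is not reproduced):

```python
import numpy as np

def value_from_cpdf(z_grid, cpdf_vals):
    """Given CPDF values approximated on a grid (in the paper, via polynomial
    expansions driven by cross-variable higher-order statistics), return the
    interpolated value as the conditional expectation plus a 90% prediction
    interval from the conditional quantiles."""
    pdf = np.clip(cpdf_vals, 0.0, None)
    dz = np.gradient(z_grid)
    pdf /= (pdf * dz).sum()                # normalise to a density
    estimate = (z_grid * pdf * dz).sum()   # E[Z | neighbourhood data]
    cdf = np.cumsum(pdf * dz)
    lo, hi = np.interp([0.05, 0.95], cdf, z_grid)
    return estimate, (lo, hi)

z = np.linspace(0.0, 10.0, 501)
est, pi = value_from_cpdf(z, np.exp(-0.5 * ((z - 4.0) / 1.2)**2))  # toy CPDF
print(est, pi)
```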
Abstract:
Background The primary health care sector delivers the majority of health care in western countries through small, community-based organizations. However, research into these healthcare organizations is limited by the time constraints and pressure facing them, and the concern by staff that research is peripheral to their work. We developed Q-RARA—Qualitative Rapid Appraisal, Rigorous Analysis—to study small, primary health care organizations in a way that is efficient, acceptable to participants and methodologically rigorous. Methods Q-RARA comprises a site visit, semi-structured interviews, structured and unstructured observations, photographs, floor plans, and social scanning data. Data were collected over the course of one day per site and the qualitative analysis was integrated and iterative. Results We found Q-RARA to be acceptable to participants and effective in collecting data on organizational function in multiple sites without disrupting the practice, while maintaining a balance between speed and trustworthiness. Conclusions The Q-RARA approach is capable of providing a richly textured, rigorous understanding of the processes of the primary care practice while also allowing researchers to develop an organizational perspective. For these reasons the approach is recommended for use in small-scale organizations both within and outside the primary health care sector.
Abstract:
Bayesian networks (BNs) are graphical probabilistic models used for reasoning under uncertainty. These models are becoming increasingly popular in a range of fields, including ecology, computational biology, medical diagnosis, and forensics. In most of these cases, the BNs are quantified using information from experts or from user opinions. Interest therefore lies in how multiple opinions can be represented and used in a BN. This paper proposes the use of a measurement error model to combine opinions for the quantification of a BN. The multiple opinions are treated as a realisation of measurement error, and the model uses the posterior probabilities ascribed to each node in the BN, which are computed from the prior information given by each expert. The proposed model addresses the issues associated with current methods of combining opinions, such as the absence of a coherent probability model, the failure to maintain the conditional independence structure of the BN, and the provision of only a point estimate for the consensus. The proposed model was applied to an existing Bayesian network and performed well when compared with existing methods of combining opinions.
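As a toy illustration of the measurement-error idea (much simpler than the paper's model): treat each expert's probability as a noisy logit-scale measurement of a latent conditional-probability-table entry, so that pooling yields a full posterior distribution rather than only a point consensus. All hyperparameters below are illustrative.

```python
import numpy as np

def combine_opinions(probs, sigma=0.5, mu0=0.0, tau=2.0):
    """Toy measurement-error pooling: expert probabilities are noisy
    logit-scale measurements (s.d. `sigma`) of a latent logit with a
    Normal(mu0, tau^2) prior; the conjugate-Normal posterior gives a full
    distribution on the probability scale."""
    y = np.log(np.asarray(probs) / (1 - np.asarray(probs)))  # logit transform
    n = len(y)
    post_var = 1.0 / (1.0 / tau**2 + n / sigma**2)
    post_mean = post_var * (mu0 / tau**2 + y.sum() / sigma**2)
    draws = np.random.default_rng(1).normal(post_mean, np.sqrt(post_var), 10_000)
    return 1 / (1 + np.exp(-draws))                          # back to probabilities

post = combine_opinions([0.6, 0.7, 0.55])   # three experts' opinions on one node
print(post.mean(), np.percentile(post, [2.5, 97.5]))
```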
Abstract:
Texture enhancement is an important component of image processing with extensive application in science and engineering. The quality of medical images, quantified using image texture, plays a significant role in the routine diagnoses performed by medical practitioners. Most image texture enhancement is performed using classical integral-order differential mask operators. Recently, first-order fractional differential operators have been used to enhance images. Experimentation with these methods led to the conclusion that fractional differential operators not only maintain the low-frequency contour features in the smooth areas of an image, but also nonlinearly enhance the edges and textures corresponding to high-frequency image components. However, whilst these methods perform well in particular cases, they are not routinely useful across all applications. To this end, we apply the second-order Riesz fractional differential operator to improve upon existing approaches to texture enhancement. Compared with classical integral-order differential mask operators and other first-order fractional differential operators, our new algorithms provide higher signal-to-noise ratios and superior image quality.
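For intuition, the Riesz fractional operator of order α has Fourier symbol |k|^α, which suggests a simple frequency-domain sketch (ours, not the paper's mask-based algorithm; parameters are illustrative):

```python
import numpy as np

def riesz_enhance(img, alpha=1.8, weight=0.4):
    """Frequency-domain sketch: the Riesz fractional operator of order alpha
    acts as multiplication by |k|**alpha, so adding a weighted fractional
    derivative back to the image boosts high-frequency texture while leaving
    smooth (low-frequency) regions largely intact."""
    ky = np.fft.fftfreq(img.shape[0])[:, None]
    kx = np.fft.fftfreq(img.shape[1])[None, :]
    symbol = (kx**2 + ky**2) ** (alpha / 2.0)
    frac = np.fft.ifft2(symbol * np.fft.fft2(img)).real
    return img + weight * frac

img = np.random.default_rng(0).random((64, 64))   # stand-in for a medical image
print(riesz_enhance(img).shape)
```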
Abstract:
Objectives Directly measuring disease incidence in a population is difficult and not feasible to do routinely. We describe the development and application of a new method for estimating, at the population level, the number of incident genital chlamydia infections and the corresponding incidence rates by age and sex, using routine surveillance data. Methods A Bayesian statistical approach was developed to calibrate the parameters of a decision-pathway tree against national data on numbers of notifications and tests conducted (2001-2013). Independent beta probability density functions were adopted as priors on the time-independent parameters; the shape parameters of these beta distributions were chosen to match prior estimates sourced from peer-reviewed literature or expert opinion. To best facilitate the calibration, multivariate Gaussian priors on (the logistic transforms of) the time-dependent parameters were adopted, using the Matérn covariance function to favour smooth changes over consecutive years and across adjacent age cohorts. The model outcomes were validated by comparison with other independent empirical epidemiological measures, i.e. prevalence and incidence as reported by other studies. Results Model-based estimates suggest that the total number of people acquiring chlamydia per year in Australia increased by ~120% over 12 years. Nationally, an estimated 356,000 people acquired chlamydia in 2013, which is 4.3 times the number of reported diagnoses. This corresponds to an estimated annual chlamydia incidence of 1.54% in 2013, up from 0.81% in 2001 (a ~90% increase). Conclusions We developed a statistical method which uses routine surveillance (notification and testing) data to produce estimates of the extent of, and trends in, chlamydia incidence.
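The Matérn prior ingredient can be sketched directly. Below, a Matérn covariance with ν = 3/2 over year indices (the study's actual smoothness order and hyperparameters are assumptions here), from which a prior draw of the logit-scale time-dependent parameters could be taken:

```python
import numpy as np

def matern32_cov(x, length_scale=2.0, variance=1.0):
    """Matérn (nu = 3/2) covariance over year indices `x`: nearby years are
    strongly correlated, so the Gaussian prior favours smooth changes over
    consecutive years. Hyperparameters are illustrative."""
    d = np.abs(x[:, None] - x[None, :]) / length_scale
    return variance * (1 + np.sqrt(3) * d) * np.exp(-np.sqrt(3) * d)

years = np.arange(2001, 2014, dtype=float)
cov = matern32_cov(years)
prior_draw = np.random.default_rng(0).multivariate_normal(np.zeros(len(years)), cov)
print(prior_draw.round(2))   # one prior realisation of a logit-scale parameter path
```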
Abstract:
This study provides validity evidence for the Capture-Recapture (CR) method, borrowed from ecology, as a measure of second language (L2) productive vocabulary size (PVS). Two separate “captures” of productive vocabulary were taken using written word association tasks (WAT). At Time 1, 47 bilinguals provided at least 4 associates to each of 30 high-frequency stimulus words in English, their first language (L1), and in French, their L2. A few days later (Time 2), this procedure was repeated with a different set of stimulus words in each language. Since the WAT was used, both Lex30 and CR PVS scores were calculated in each language. Participants also completed an animacy judgment task assessing the speed and efficiency of lexical access. Results indicated that, in both languages, CR and Lex30 scores were significantly positively correlated (evidence of convergent validity). CR scores were also significantly larger in the L1, and correlated significantly with the speed of lexical access in the L2 (evidence of construct validity). These results point to the validity of the technique for estimating relative L2 PVS. However, CR scores are not a direct indication of absolute vocabulary size. A discussion of the method's underlying assumptions and their implications for interpretation is provided.
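For concreteness, the CR estimate itself is a one-line calculation. The sketch below uses Chapman's bias-corrected form of the Lincoln-Petersen estimator (whether the study used this particular correction is an assumption; the counts are illustrative):

```python
def lincoln_petersen(n1, n2, m):
    """Chapman's bias-corrected Lincoln-Petersen estimator: n1 word types
    'captured' at Time 1, n2 at Time 2, m produced on both occasions."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# e.g. 120 types at Time 1, 110 at Time 2, 40 shared:
print(lincoln_petersen(120, 110, 40))  # ~ 326.6 word types
```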
Abstract:
Successful management of design changes is critical for the efficient delivery of construction projects. Building Information Modeling (BIM) is envisioned to play an important role in integrating design, construction and facility management processes through coordinated changes throughout the project life-cycle. BIM currently provides significant benefits in coordinating changes across different views in a single model and in identifying conflicts between different discipline-specific models. However, current BIM tools provide limited support for managing changes across several discipline-specific models. This paper describes an approach to represent, coordinate, and track changes within a collaborative multi-disciplinary BIM environment. The approach was informed by a detailed case study of a large, complex, fast-tracked BIM project, in which we investigated numerous design changes, analyzed change management processes, and evaluated existing BIM tools. Our approach characterises design changes in an ontology representing changed component attributes, dependencies between components, and change impacts. It explores different types of dependencies among design changes and describes how a graph-based approach and a dependency matrix could assist in automating the propagation of changes and the analysis of their impacts in a BIM-based project delivery process.
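As a sketch of the dependency-matrix idea (ours, not the paper's tool): represent inter-component dependencies as a boolean matrix and propagate a change through its transitive closure to list the potentially impacted components. The component names and dependency pattern are illustrative.

```python
import numpy as np

def impacted_components(dep, changed):
    """Change propagation over a component dependency matrix: dep[i, j] = 1
    means component j depends on component i, so a change to i may propagate
    to j. Returns the transitive set of potentially impacted components."""
    impacted = set(changed)
    frontier = list(changed)
    while frontier:
        i = frontier.pop()
        for j in np.nonzero(dep[i])[0]:
            if j not in impacted:
                impacted.add(j)
                frontier.append(j)
    return impacted

# duct -> ceiling grid -> light fixture (illustrative dependencies)
dep = np.array([[0, 1, 0],
                [0, 0, 1],
                [0, 0, 0]])
print(impacted_components(dep, {0}))   # {0, 1, 2}
```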