139 results for statistical model for macromolecules


Relevance: 30.00%

Publisher:

Abstract:

The melting temperature of a nanoscaled particle is known to decrease as the curvature of the solid-melt interface increases. This relationship is most often modelled by a Gibbs--Thomson law, with the decrease in melting temperature proposed to be a product of the curvature of the solid-melt interface and the surface tension. Such a law must break down for sufficiently small particles, since the curvature becomes singular in the limit that the particle radius vanishes. Furthermore, the use of this law as a boundary condition for a Stefan-type continuum model is problematic because it leads to a physically unrealistic form of mathematical blow-up at a finite particle radius. By numerical simulation, we show that the inclusion of nonequilibrium interface kinetics in the Gibbs--Thomson law regularises the continuum model, so that the mathematical blow-up is suppressed. As a result, the solution continues until complete melting, and the corresponding melting temperature remains finite for all time. The results of the adjusted model are consistent with experimental findings of abrupt melting of nanoscaled particles. This small-particle regime appears to be closely related to the problem of melting a superheated particle.
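The breakdown described in this abstract can be sketched numerically. All parameter values below (bulk melting temperature, capillary coefficient `gamma`, kinetic mobility `mu`) are illustrative assumptions, not values from the paper:

```python
# Illustrative sketch of the Gibbs--Thomson law with and without
# nonequilibrium interface kinetics. All parameter values are assumed
# for demonstration only (nondimensional units).

def gibbs_thomson_temp(radius, t_melt_bulk=1.0, gamma=0.01):
    """Equilibrium interface temperature: the depression scales with
    the spherical curvature 2/radius, so it is singular as radius -> 0."""
    return t_melt_bulk * (1.0 - 2.0 * gamma / radius)

def kinetic_interface_temp(radius, velocity, mu=0.5, **kw):
    """Kinetic undercooling adds a term proportional to interface speed,
    which is what regularises the law for very small particles."""
    return gibbs_thomson_temp(radius, **kw) - velocity / mu

# The equilibrium law predicts unboundedly large depression as r -> 0:
for r in (1.0, 0.1, 0.01):
    print(r, gibbs_thomson_temp(r))
```

The loop shows the melting temperature becoming unphysically negative for the smallest radius, which is the singular behaviour the kinetic term is introduced to suppress.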

Relevance: 30.00%

Publisher:

Abstract:

The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a previously developed 3D geological model to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area, resulting in a dominance of the Na–Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. the Na–HCO3 water type) has been observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere, controlled by geological structures, including between GAB aquifers and coal-bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous–Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes.
The integrated approach presented in this study, which combines different complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls.
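As a rough illustration of the multivariate workflow described above, a principal component analysis of standardised hydrochemical data can be sketched as follows. The data are synthetic and numpy-only; the study's actual dataset, preprocessing and the clustering and factor analysis steps are not reproduced here:

```python
import numpy as np

# Minimal PCA sketch for a hydrochemical data matrix: rows = water
# samples, columns = analyte concentrations (e.g. Na, Cl, HCO3, Ca,
# Mg, SO4). Synthetic data for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))

# Standardise each analyte, then extract principal components via SVD.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
U, s, Vt = np.linalg.svd(Z, full_matrices=False)
scores = U * s                       # sample coordinates on the PCs
explained = s**2 / np.sum(s**2)      # variance fraction per component

# Leading components typically summarise salinity / water-type axes;
# score plots of PC1 vs PC2 are what get interpreted spatially.
print(explained[:2])
```

In a study like this one, the component scores would then be mapped back onto the 3D geological model to examine spatial patterns.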

Relevance: 30.00%

Publisher:

Abstract:

This paper presents a layered framework for integrating different Socio-Technical Systems (STS) models and perspectives into a whole-of-systems model. Holistic modelling plays a critical role in the engineering of STS due to the interplay between social and technical elements within these systems and the resulting emergent behaviour. The framework decomposes STS models into components, where each component is either a static object, a dynamic object or a behavioural object. Based on existing literature, a classification of the different elements that make up STS (social, technical or natural environment) is developed; each object can in turn be classified according to the STS elements it represents. Using the proposed framework, it is possible to systematically decompose models to an extent such that points of interface can be identified and the contextual factors required to transform a component of one model to interface with another are obtained. Using an airport inbound passenger facilitation process as a case study socio-technical system, three different models are analysed: a Business Process Modelling Notation (BPMN) model, a Hybrid Queue-based Bayesian Network (HQBN) model and an Agent Based Model (ABM). It is found that the framework enables the modeller to identify non-trivial interface points, such as between the spatial interactions of an ABM and the causal reasoning of a HQBN, and between the process activity representation of a BPMN and simulated behavioural performance in a HQBN. Such a framework is a necessary enabler for integrating different modelling approaches in understanding and managing STS.


Relevance: 30.00%

Publisher:

Abstract:

A FitzHugh-Nagumo monodomain model has been used to describe the propagation of the electrical potential in heterogeneous cardiac tissue. In this paper, we consider a two-dimensional fractional FitzHugh-Nagumo monodomain model on an irregular domain. The model consists of a coupled Riesz space fractional nonlinear reaction-diffusion model and an ordinary differential equation describing the ionic fluxes as a function of the membrane potential. First, we use a decoupling technique and focus on solving the Riesz space fractional nonlinear reaction-diffusion model. Second, a novel spatially second-order accurate semi-implicit alternating direction method (SIADM) for this model on an approximate irregular domain is proposed. Third, stability and convergence of the SIADM are proved. Finally, some numerical examples are given to support our theoretical analysis, and these numerical techniques are employed to simulate a two-dimensional fractional FitzHugh-Nagumo model on both an approximate circular and an approximate irregular domain.
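For orientation, the reaction kinetics alone (the classical FitzHugh-Nagumo ordinary differential equations, without the Riesz space-fractional diffusion term the paper treats) can be integrated with a simple explicit Euler sketch. Parameter values are standard illustrative choices, not those of the paper:

```python
# Explicit Euler integration of the classical FitzHugh-Nagumo ODEs:
# v is the membrane potential, w the recovery variable. This is only
# the ionic-kinetics part of the monodomain model; the (fractional)
# diffusion term and the SIADM scheme are not reproduced here.

def fhn_step(v, w, dt, a=0.7, b=0.8, eps=0.08, i_ext=0.5):
    """One explicit Euler step with an assumed constant stimulus i_ext."""
    dv = v - v**3 / 3.0 - w + i_ext
    dw = eps * (v + a - b * w)
    return v + dt * dv, w + dt * dw

v, w = -1.0, 1.0
for _ in range(10000):                 # integrate to t = 100
    v, w = fhn_step(v, w, dt=0.01)
print(round(v, 3), round(w, 3))
```

With these parameters the trajectory settles onto the model's well-known relaxation oscillation; the decoupling technique in the paper handles these kinetics separately from the fractional diffusion solve.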

Relevance: 30.00%

Publisher:

Abstract:

This work explores the potential of Australian native plants as a source of second-generation biodiesel for internal combustion engine applications. Biodiesels were evaluated from a number of non-edible oil seeds which grow naturally in Queensland, Australia. The quality of the produced biodiesels was investigated by several experimental and numerical methods. The research methodology and numerical model developed in this study can be used for a broad range of biodiesel feedstocks and for the future development of renewable native biodiesel in Australia.

Relevance: 30.00%

Publisher:

Abstract:

In this paper we present a new method for performing Bayesian parameter inference and model choice for low count time series models with intractable likelihoods. The method involves incorporating an alive particle filter within a sequential Monte Carlo (SMC) algorithm to create a novel pseudo-marginal algorithm, which we refer to as alive SMC^2. The advantages of this approach over competing approaches are that it is naturally adaptive, it does not involve the between-model proposals required in reversible jump Markov chain Monte Carlo, and it does not rely on potentially rough approximations. The algorithm is demonstrated on Markov process and integer autoregressive moving average models applied to real biological datasets of hospital-acquired pathogen incidence, animal health time series and the cumulative number of prion disease cases in mule deer.
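The building block of such pseudo-marginal methods is a particle filter that returns an unbiased estimate of an intractable likelihood. Below is a minimal sketch for a low count series using a plain bootstrap filter rather than the paper's alive variant, with an assumed latent random-walk log-intensity and Poisson observations; the model and all parameter values are illustrative:

```python
import math
import random

# Bootstrap particle filter log-likelihood estimate for a low-count
# time series: latent log-intensity follows a Gaussian random walk,
# observations are Poisson. Illustrative model, not the paper's.

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam**k / math.factorial(k)

def pf_loglik(obs, n_particles=500, sigma=0.3, seed=1):
    rng = random.Random(seed)
    x = [0.0] * n_particles                            # log-intensity particles
    loglik = 0.0
    for y in obs:
        x = [xi + rng.gauss(0.0, sigma) for xi in x]   # propagate
        w = [poisson_pmf(y, math.exp(xi)) for xi in x] # weight by likelihood
        loglik += math.log(sum(w) / n_particles)       # running estimate
        x = rng.choices(x, weights=w, k=n_particles)   # multinomial resample
    return loglik

print(pf_loglik([1, 0, 2, 1, 3]))
```

The alive filter in the paper replaces the fixed-size resampling step with one that continues simulating until a target number of particles are compatible with the observation, which keeps the likelihood estimate well behaved for very low counts.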

Relevance: 30.00%

Publisher:

Abstract:

Scope: Coffee is among the most frequently consumed beverages. Its consumption is inversely associated with the incidence of diseases related to reactive oxygen species; this may be due to its antioxidant properties. Our primary objective was to investigate the impact of consumption of a coffee containing high levels of chlorogenic acids on the oxidation of proteins, DNA and membrane lipids; additionally, other redox biomarkers were monitored in an intervention trial. Methods and results: The treatment group (n=36) consumed instant coffee co-extracted from green and roasted beans, whereas the control group consumed water (800 mL/P/day, 5 days). A global statistical analysis of four main biomarkers selected as primary outcomes showed that the overall changes were significant. 8-Isoprostaglandin F2α in urine declined by 15.3%, 3-nitrotyrosine decreased by 16.1%, and DNA migration due to oxidized purines and pyrimidines was reduced (not significantly) in lymphocytes by 12.5 and 14.1%, respectively. Other markers such as the total antioxidant capacity were moderately increased; e.g. LDL and malondialdehyde were shifted towards a non-significant reduction. Conclusion: Since the oxidation of DNA, lipids and proteins is associated with the incidence of various diseases, the observed protection against their oxidative damage may be indicative of beneficial health effects of coffee.

Relevance: 30.00%

Publisher:

Abstract:

Provides an accessible foundation to Bayesian analysis using real-world models. This book presents an introduction to Bayesian modelling and computation by considering real case studies drawn from diverse fields spanning ecology, health, genetics and finance. Each chapter comprises a description of the problem, the corresponding model, the computational method, results and inferences, as well as the issues that arise in the implementation of these approaches. Case Studies in Bayesian Statistical Modelling and Analysis:

• Illustrates how to do Bayesian analysis in a clear and concise manner using real-world problems.
• Devotes each chapter to a real-world problem and describes the way in which the problem may be analysed using Bayesian methods.
• Features approaches that can be used in a wide range of applications, such as health, the environment, genetics, information science, medicine, biology, industry and remote sensing.

Case Studies in Bayesian Statistical Modelling and Analysis is aimed at statisticians, researchers and practitioners who have some expertise in statistical modelling and analysis, and some understanding of the basics of Bayesian statistics, but little experience in its application. Graduate students of statistics and biostatistics will also find this book beneficial.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND CONTEXT: The Neck Disability Index is frequently used to measure neck outcomes. The statistical rigor of the Neck Disability Index has been assessed with conflicting outcomes. To date, Confirmatory Factor Analysis of the Neck Disability Index has not been reported for a suitably large population study. Because the Neck Disability Index is not a condition-specific measure of neck function, initial Confirmatory Factor Analysis should consider problematic neck patients as a homogeneous group. PURPOSE: We sought to analyze the factor structure of the Neck Disability Index through Confirmatory Factor Analysis in a symptomatic, homogeneous neck population, with respect to the pooled population and gender subgroups. STUDY DESIGN: This was a secondary analysis of pooled data. PATIENT SAMPLE: A total of 1,278 symptomatic neck patients (67.5% female, median age 41 years): 803 nonspecific and 475 with whiplash-associated disorder. OUTCOME MEASURES: The Neck Disability Index was used to measure outcomes. METHODS: We analyzed pooled baseline data from six independent studies of patients with neck problems who completed Neck Disability Index questionnaires at baseline. The Confirmatory Factor Analysis was considered in three scenarios: the full sample and each sex separately. Models were compared empirically for best fit. RESULTS: Two-factor models have good psychometric properties across both the pooled and sex subgroups. However, according to these analyses, the one-factor solution is preferable on the grounds of both statistical fit and parsimony. The two-factor model was close to significant for the male subgroup (p<.07), where questions separated into constructs of mental function (pain, reading, headaches and concentration) and physical function (personal care, lifting, work, driving, sleep, and recreation).
CONCLUSIONS: The Neck Disability Index demonstrated a one-factor structure when analyzed by Confirmatory Factor Analysis in a pooled, homogeneous sample of neck problem patients. However, a two-factor model did approach significance for male subjects, where questions separated into constructs of mental and physical function. Further investigations in different conditions and in subgroup- and sex-specific populations are warranted.

Relevance: 30.00%

Publisher:

Abstract:

Aerosol deposition in cylindrical tubes is a subject of interest to researchers and engineers in many applications of aerosol physics and metrology. Investigation of nano-particles in settings such as the lungs, upper airways, batteries and vehicle exhaust gases is vital because of their smaller size, adverse health effects and the greater difficulty of trapping them compared with micro-particles. Lagrangian particle tracking provides an effective method for simulating the deposition of nano-particles as well as micro-particles, as it accounts for the particle inertia effect as well as the Brownian excitation. However, the use of the Lagrangian approach for simulating ultrafine particles has been limited due to computational cost and numerical difficulties. In this paper, the deposition of nano-particles in cylindrical tubes under laminar conditions is studied using the Lagrangian particle tracking method. The commercial Fluent software is used to simulate the fluid flow in the pipes and to study the deposition and dispersion of nano-particles. Different particle diameters as well as different flow rates are examined. A point analysis in a uniform flow is performed to validate the Brownian motion. The results show good agreement between the calculated deposition efficiency and the analytic correlations in the literature. Furthermore, for nano-particles with diameters greater than 40 nm, the deposition efficiency calculated by the Lagrangian method is less than that of the analytic correlations based on the Eulerian method, due to statistical error or the inertia effect.
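The Brownian excitation mentioned above can be validated in isolation by a point analysis in quiescent fluid, comparing the simulated mean-square displacement against the Stokes-Einstein prediction. The particle and fluid properties below are illustrative assumptions (air at room temperature, a 40 nm particle), not the paper's exact setup:

```python
import math
import random

KB = 1.380649e-23          # Boltzmann constant, J/K

def diffusivity(d_p, temp=293.0, mu=1.8e-5):
    """Stokes-Einstein diffusion coefficient for particle diameter d_p (m)
    in a fluid of dynamic viscosity mu (Pa*s)."""
    return KB * temp / (3.0 * math.pi * mu * d_p)

def simulate_msd(d_p, dt=1e-6, steps=1000, n=2000, seed=0):
    """1-D mean-square displacement after steps*dt of pure Brownian motion,
    averaged over n independent particles."""
    rng = random.Random(seed)
    sd = math.sqrt(2.0 * diffusivity(d_p) * dt)   # per-step std deviation
    total = 0.0
    for _ in range(n):
        x = 0.0
        for _ in range(steps):
            x += rng.gauss(0.0, sd)
        total += x * x
    return total / n

d_p = 40e-9                                       # 40 nm particle
msd = simulate_msd(d_p)
print(msd, 2.0 * diffusivity(d_p) * 1e-3)         # simulated vs analytic 2*D*t
```

Agreement of the simulated mean-square displacement with the analytic 2*D*t is the standard check that the random-force model is implemented correctly before it is coupled to the tube flow.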

Relevance: 30.00%

Publisher:

Abstract:

As statistical education becomes more firmly embedded in the school curriculum and its value across the curriculum is recognised, attention moves from knowing procedures, such as calculating a mean or drawing a graph, to understanding the purpose of a statistical investigation in decision making in many disciplines. As students learn to complete the stages of an investigation, the question of meaningful assessment of the process arises. This paper considers models for carrying out a statistical inquiry and, based on a four-phase model, creates a developmental sequence that can be used for the assessment of outcomes from each of the four phases as well as for the complete inquiry. The developmental sequence is based on the SOLO model, focussing on the "observed" outcomes during the inquiry process.

Relevance: 30.00%

Publisher:

Abstract:

Spatial data analysis has become more and more important in studies of ecology and economics during the last decade. One focus of spatial data analysis is how to select predictors, variance functions and correlation functions. In general, however, the true covariance function is unknown and the working covariance structure is often misspecified. In this paper, our target is to find a good strategy for identifying the best model from the candidate set using model selection criteria. We evaluate the ability of several information criteria (the corrected Akaike information criterion, Bayesian information criterion (BIC) and residual information criterion (RIC)) to choose the optimal model when the working correlation function, the working variance function and the working mean function are correct or misspecified. Simulations are carried out for small to moderate sample sizes. Four candidate covariance functions (exponential, Gaussian, Matern and rational quadratic) are used in the simulation studies. Summarising the simulation results, we find that a misspecified working correlation structure can still capture some spatial correlation information in model fitting. When the sample size is large enough, BIC and RIC perform well even if the working covariance is misspecified. Moreover, the performance of these information criteria is related to the average level of model fit, which can be indicated by the average adjusted R-squared, and overall RIC performs well.
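The criteria compared above are computed directly from a candidate model's maximised log-likelihood. A sketch with made-up log-likelihood values follows (RIC is omitted, as it has several variants; the candidate names mirror the covariance functions in the abstract but the numbers are invented):

```python
import math

# Standard information-criterion formulas: loglik is the maximised
# log-likelihood, k the number of parameters, n the sample size.

def aicc(loglik, k, n):
    """Corrected Akaike information criterion (small-sample AIC)."""
    aic = -2.0 * loglik + 2.0 * k
    return aic + 2.0 * k * (k + 1) / (n - k - 1)

def bic(loglik, k, n):
    """Bayesian information criterion."""
    return -2.0 * loglik + k * math.log(n)

# Choose the candidate covariance model with the smallest criterion.
# (loglik, n_params) pairs below are made up for illustration.
candidates = {"exponential": (-120.4, 3),
              "gaussian": (-118.9, 3),
              "matern": (-118.2, 4)}
n = 100
best = min(candidates, key=lambda m: bic(*candidates[m], n))
print(best)
```

Note how BIC's log(n) penalty lets the 3-parameter Gaussian model beat the slightly better-fitting 4-parameter Matern model in this toy comparison.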

Relevance: 30.00%

Publisher:

Abstract:

The export of sediments from coastal catchments can have detrimental impacts on estuaries and nearshore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and an often limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature. (C) 2014 Elsevier B.V. All rights reserved.

Relevance: 30.00%

Publisher:

Abstract:

We consider the development of statistical models for prediction of the constituent concentration of riverine pollutants, which is a key step in load estimation from frequent flow rate data and less frequently collected concentration data. We consider how to capture the impacts of past flow patterns via the average discounted flow (ADF), which discounts the past flux based on the time elapsed: more recent fluxes are given more weight. However, the effectiveness of the ADF depends critically on the choice of the discount factor, which reflects the unknown environmental cumulating process of the concentration compounds. We propose to choose the discount factor by maximizing the adjusted R² values or the Nash-Sutcliffe model efficiency coefficient. The R² values are adjusted to take account of the number of parameters in the model fit. The resulting optimal discount factor can be interpreted as a measure of the constituent exhaustion rate during flood events. To evaluate the performance of the proposed regression estimators, we examine two different sampling scenarios by resampling fortnightly and opportunistically from two real daily datasets, which come from two United States Geological Survey (USGS) gaging stations located in the Des Plaines River and Illinois River basins. The generalized rating-curve approach produces biased estimates of the total sediment loads, by -30% to 83%, whereas the new approaches produce much lower biases, ranging from -24% to 35%. This substantial improvement in the estimates of the total load is due to the fact that the predictability of concentration is greatly improved by the additional predictors.
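The average discounted flow described above can be sketched as an exponentially discounted average of past flows. The recursive form and the initialisation used here are assumptions for illustration, not necessarily the paper's exact construction:

```python
# Sketch of an average discounted flow (ADF) series: each value is an
# exponentially discounted average of past flows, so recent fluxes are
# weighted more heavily. alpha is the discount factor the paper
# proposes to tune (e.g. by maximising adjusted R-squared).

def average_discounted_flow(flows, alpha):
    """Return the ADF series via adf_t = alpha*adf_{t-1} + (1-alpha)*q_t.
    Initialised at the first observed flow (an assumed choice)."""
    adf, out = flows[0], []
    for q in flows:
        adf = alpha * adf + (1.0 - alpha) * q
        out.append(adf)
    return out

flows = [10.0, 10.0, 100.0, 50.0, 20.0, 10.0]   # a flood event at t=2
print(average_discounted_flow(flows, alpha=0.8))
```

A large alpha makes the flood's influence persist for many time steps, mimicking a slowly exhausting constituent store; a small alpha makes the ADF track the instantaneous flow.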