990 results for Heitler-London method
Abstract:
In recent years, it has been found that many phenomena in engineering, physics, chemistry and other sciences can be described very successfully by models using mathematical tools from fractional calculus. Recently, a new space and time fractional Bloch-Torrey equation (ST-FBTE) was proposed (see Magin et al. (2008)) and successfully applied to analyse diffusion images of human brain tissues, providing new insights for further investigations of tissue structures. In this paper, we consider the ST-FBTE on a finite domain. The time and space derivatives in the ST-FBTE are replaced by the Caputo and the sequential Riesz fractional derivatives, respectively. Firstly, we propose a new effective implicit numerical method (INM) for the ST-FBTE, whereby we discretize the Riesz fractional derivative using a fractional centered difference. Secondly, we prove that the implicit numerical method for the ST-FBTE is unconditionally stable and convergent, with order of convergence O(τ^(2-α) + h_x^2 + h_y^2 + h_z^2). Finally, some numerical results are presented to support our theoretical analysis.
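The fractional centered difference used above to discretize the Riesz derivative can be sketched as follows. The weight formula is the standard one (Ortigueira's fractional centered difference); the helper names and the one-dimensional riesz_apply wrapper are illustrative assumptions, not the paper's code:

```python
from math import gamma

def centered_weight(alpha, k):
    """Fractional centered-difference weight
    g_k = (-1)^k * Gamma(alpha+1) / (Gamma(alpha/2 - k + 1) * Gamma(alpha/2 + k + 1)),
    symmetric in k, for a Riesz derivative of order 1 < alpha <= 2."""
    return ((-1) ** k * gamma(alpha + 1)
            / (gamma(alpha / 2 - k + 1) * gamma(alpha / 2 + k + 1)))

def riesz_apply(f, alpha, h):
    """Second-order approximation of the Riesz derivative on a uniform grid:
    d^alpha f / d|x|^alpha at node i is about -h^(-alpha) * sum_k g_k * f[i-k]."""
    n = len(f)
    return [-h ** (-alpha) * sum(centered_weight(alpha, k) * f[i - k]
                                 for k in range(i - n + 1, i + 1))
            for i in range(n)]
```

For alpha = 2 the weights collapse to the classical second difference (2 at k = 0, -1 at k = ±1), which is a quick sanity check on the formula.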
A finite volume method for solving the two-sided time-space fractional advection-dispersion equation
Abstract:
The field of fractional differential equations provides a means for modelling transport processes within complex media which are governed by anomalous transport. Indeed, the application to anomalous transport has been a significant driving force behind the rapid growth and expansion of the literature in the field of fractional calculus. In this paper, we present a finite volume method to solve the time-space two-sided fractional advection-dispersion equation on a one-dimensional domain. Such an equation allows modelling different flow regime impacts from either side. The finite volume formulation provides a natural way to handle fractional advection-dispersion equations written in conservative form. The novel spatial discretisation employs fractionally-shifted Grünwald formulas to discretise the Riemann-Liouville fractional derivatives at control volume faces in terms of function values at the nodes, while the L1-algorithm is used to discretise the Caputo time fractional derivative. Results of numerical experiments are presented to demonstrate the effectiveness of the approach.
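The two discretisation ingredients named above admit compact weight formulas. A minimal sketch, assuming the standard Grünwald-Letnikov recursion w_k = w_{k-1} * (1 - (alpha+1)/k) and the usual L1 weights b_j = (j+1)^(1-gamma) - j^(1-gamma); the function names are illustrative:

```python
def grunwald_weights(alpha, n):
    """Grünwald-Letnikov weights w_k = (-1)^k * C(alpha, k) for k = 0..n,
    computed by the standard recursion. In a shifted Grünwald formula the
    weight w_k multiplies the function value at x - (k - p) h (shift p,
    typically p = 1)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def l1_weights(gam, n):
    """L1 weights b_j = (j+1)^(1-gam) - j^(1-gam), j = 0..n-1, for the
    Caputo time-fractional derivative of order 0 < gam < 1; the weights
    decrease monotonically towards older time levels."""
    return [(j + 1) ** (1.0 - gam) - j ** (1.0 - gam) for j in range(n)]
```

A quick check: for alpha = 2 the Grünwald weights reduce to 1, -2, 1, 0, ..., the classical second-difference stencil.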
Abstract:
A fractional differential equation is used to describe a fractal model of mobile/immobile transport with a power law memory function. This equation is the limiting equation that governs continuous time random walks with heavy tailed random waiting times. In this paper, we firstly propose a finite difference method to discretize the time variable and obtain a semi-discrete scheme, and then discuss its stability and convergence. Secondly, we consider a meshless method based on radial basis functions (RBF) to discretize the space variable. In contrast to conventional FDM and FEM, the meshless method is demonstrated to have distinct advantages: calculations can be performed independently of a mesh, it is more accurate, and it can be used to solve complex problems. Finally, the convergence order is verified numerically, and an example is presented to describe the fractal model of the mobile/immobile transport process with different problem domains. The numerical results indicate that the present meshless approach is very effective for modelling and simulating fractional differential equations, and that it has good potential in the development of a robust simulation tool for problems in engineering and science that are governed by various types of fractional differential equations.
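As a rough illustration of the RBF ingredient of such a meshless method (not the paper's scheme), a global interpolant in one dimension is built by solving the collocation system A c = f with A_ij = phi(|x_i - x_j|). The Gaussian kernel, the shape parameter eps, and all helper names here are assumptions for the sketch:

```python
from math import exp

def solve(A, b):
    """Gaussian elimination with partial pivoting for small dense systems."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            fac = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= fac * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_interpolant(nodes, values, eps=2.0):
    """Global RBF interpolation with Gaussian kernel phi(r) = exp(-(eps*r)^2):
    solve the symmetric collocation system, return the interpolant as a closure."""
    phi = lambda r: exp(-(eps * r) ** 2)
    A = [[phi(abs(xi - xj)) for xj in nodes] for xi in nodes]
    c = solve(A, list(values))
    return lambda x: sum(ck * phi(abs(x - xk)) for ck, xk in zip(c, nodes))
```

Note that no mesh or connectivity enters anywhere: only pairwise distances between scattered nodes, which is the "mesh-independence" advantage the abstract refers to.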
Abstract:
In this paper, a method of separating variables is effectively implemented for solving a time-fractional telegraph equation (TFTE) in two and three dimensions. We discuss and derive the analytical solution of the TFTE in two and three dimensions with nonhomogeneous Dirichlet boundary conditions. This method can be extended to other kinds of boundary conditions.
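In separation-of-variables solutions of time-fractional equations, the temporal modes are typically expressed through Mittag-Leffler functions, the fractional analogue of the exponential (the TFTE solution itself involves more general multi-term variants). A minimal sketch of the one-parameter function by truncated series, adequate only for moderate |z|:

```python
from math import gamma

def mittag_leffler(alpha, z, terms=80):
    """One-parameter Mittag-Leffler function
    E_alpha(z) = sum_{k>=0} z^k / Gamma(alpha*k + 1),
    evaluated by direct series truncation (illustrative; not suitable
    for large |z| or very small alpha)."""
    return sum(z ** k / gamma(alpha * k + 1) for k in range(terms))
```

Two classical special cases give a quick check: E_1(z) = e^z and E_2(z) = cosh(sqrt(z)).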
Abstract:
Data quality has become a major concern for organisations. The rapid growth in the size and technology of databases and data warehouses has brought significant advantages in accessing, storing, and retrieving information. At the same time, great challenges arise with rapid data throughput and heterogeneous accesses in terms of maintaining high data quality. Yet, despite the importance of data quality, the literature has usually condensed data quality into detecting and correcting poor data such as outliers and incomplete or inaccurate values. As a result, organisations are unable to efficiently and effectively assess data quality. Having an accurate and proper data quality assessment method will enable users to benchmark their systems and monitor their improvement. This paper introduces a granule-mining approach for measuring the random degree of error data, which will enable decision makers to conduct accurate quality assessment and locate the most severe data, thereby providing an accurate estimation of the human and financial resources needed for conducting quality improvement tasks.
Abstract:
This paper formulates an edge-based smoothed conforming point interpolation method (ES-CPIM) for solid mechanics using triangular background cells. In the ES-CPIM, a technique for obtaining conforming PIM shape functions (CPIM) is used to create a continuous and piecewise quadratic displacement field over the whole problem domain. The smoothed strain field is then obtained through a smoothing operation over each smoothing domain associated with the edges of the triangular background cells. The generalized smoothed Galerkin weak form is then used to create the discretized system equations. Numerical studies have demonstrated that the ES-CPIM possesses the following good properties: (1) ES-CPIM creates conforming quadratic PIM shape functions, and can always pass the standard patch test; (2) ES-CPIM produces a quadratic displacement field without introducing any additional degrees of freedom; (3) the results of ES-CPIM are generally of very high accuracy.
Abstract:
Background When large scale trials are investigating the effects of interventions on appetite, it is paramount to efficiently monitor large amounts of human data. The original hand-held Electronic Appetite Ratings System (EARS) was designed to facilitate the administration and data management of visual analogue scales (VAS) of subjective appetite sensations. The purpose of this study was to validate a novel hand-held method (EARS II (HP® iPAQ)) against the standard Pen and Paper (P&P) method and the previously validated EARS. Methods Twelve participants (5 male, 7 female, aged 18-40) were involved in a fully repeated measures design. Participants were randomly assigned, in a crossover design, to either high fat (>48% fat) or low fat (<28% fat) meal days, one week apart, and completed ratings using the three data capture methods, ordered according to a Latin square. The first set of appetite sensations was completed in a fasted state, immediately before a fixed breakfast. Thereafter, appetite sensations were completed every thirty minutes for 4 h. An ad libitum lunch was provided immediately before completing a final set of appetite sensations. Results Repeated measures ANOVAs were conducted for ratings of hunger, fullness and desire to eat. There were no significant differences between P&P compared with either EARS or EARS II (p > 0.05). Correlation coefficients between P&P and EARS II, controlling for age and gender, were performed on Area Under the Curve ratings. R2 for Hunger (0.89), Fullness (0.96) and Desire to Eat (0.95) were statistically significant (p < 0.05). Conclusions EARS II was sensitive to the impact of a meal and recovery of appetite during the postprandial period and is therefore an effective device for monitoring appetite sensations. This study provides evidence and support for further validation of the novel EARS II method for monitoring appetite sensations during large scale studies.
The added versatility means that future uses of the system provide the potential to monitor a range of other behavioural and physiological measures that are often important in clinical and free-living trials.
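The Area Under the Curve ratings mentioned in the Results are conventionally computed with the trapezoidal rule over the sampling times; a minimal sketch (the function name and the minute/mm units are illustrative):

```python
def auc_trapezoid(times_min, ratings):
    """Area under a VAS appetite-rating profile by the trapezoidal rule:
    sum of (t1 - t0) * (r0 + r1) / 2 over consecutive samples
    (times in minutes, ratings e.g. in mm on a 0-100 VAS)."""
    if len(times_min) != len(ratings):
        raise ValueError("times and ratings must have equal length")
    return sum((t1 - t0) * (r0 + r1) / 2.0
               for t0, t1, r0, r1 in zip(times_min, times_min[1:],
                                         ratings, ratings[1:]))
```

For the design above (ratings every thirty minutes for 4 h), times_min would run 0, 30, ..., 240.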
Abstract:
The lymphedema diagnostic method used in descriptive or intervention studies may influence the results found. The purposes of this work were to compare baseline lymphedema prevalence in the physical activity and lymphedema (PAL) trial cohort and to subsequently compare the effect of the weight-lifting intervention on lymphedema, according to four standard diagnostic methods. The PAL trial was a randomized controlled intervention study, involving 295 women who had previously been treated for breast cancer, and evaluated the effect of 12 months of weight lifting on lymphedema status. Four diagnostic methods were used to evaluate lymphedema outcomes: (i) interlimb volume difference through water displacement, (ii) interlimb size difference through sum of arm circumferences, (iii) interlimb impedance ratio using bioimpedance spectroscopy, and (iv) a validated self-report survey. Of the 295 women who participated in the PAL trial, between 22 and 52% were considered to have lymphedema at baseline according to the four diagnostic criteria used. No between-group differences were noted in the proportion of women who had a change in interlimb volume, interlimb size, interlimb ratio, or survey score of ≥5%, ≥5%, ≥10%, and 1 unit, respectively (the cumulative incidence ratio at study end for each measure ranged between 0.6 and 0.8, with confidence intervals spanning 1.0). The variation in proportions of women within the PAL trial considered to have lymphedema at baseline highlights the potential impact of the diagnostic criteria on population surveillance regarding the prevalence of this common morbidity of treatment. Importantly though, progressive weight lifting was shown to be safe for women following breast cancer, even for those at risk of or with lymphedema, irrespective of the diagnostic criteria used.
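The four change thresholds quoted above can be wrapped in a small classification helper; the function name and signature are hypothetical, and the thresholds are those stated in the abstract:

```python
def lymphedema_change_flags(vol_diff_pct, circ_diff_pct,
                            imp_ratio_change_pct, survey_change):
    """Flag a change in lymphedema status under four PAL-trial-style
    criteria: >=5% interlimb volume difference, >=5% sum-of-circumferences
    difference, >=10% impedance-ratio change, and a 1-unit survey-score
    change (illustrative helper, not the trial's analysis code)."""
    return {
        "volume": vol_diff_pct >= 5.0,
        "circumference": circ_diff_pct >= 5.0,
        "impedance": imp_ratio_change_pct >= 10.0,
        "survey": survey_change >= 1.0,
    }
```

Running the same participant through all four criteria makes the abstract's point concrete: the flags need not agree, so prevalence depends on which criterion is chosen.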
Abstract:
The major limitation of current typing methods for Streptococcus pyogenes, such as emm sequence typing and T typing, is that these are based on regions subject to considerable selective pressure. Multilocus sequence typing (MLST) is a better indicator of the genetic backbone of a strain but is not widely used due to high costs. The objective of this study was to develop a robust and cost-effective alternative to S. pyogenes MLST. A 10-member single nucleotide polymorphism (SNP) set that provides a Simpson’s Index of Diversity (D) of 0.99 with respect to the S. pyogenes MLST database was derived. A typing format involving high-resolution melting (HRM) analysis of small fragments nucleated by each of the resolution-optimized SNPs was developed. The fragments were 59–119 bp in size and, based on differences in G+C content, were predicted to generate three to six resolvable HRM curves. The combination of curves across each of the 10 fragments can be used to generate a melt type (MelT) for each sequence type (ST). The 525 STs currently in the S. pyogenes MLST database are predicted to resolve into 298 distinct MelTs, and the method is calculated to provide a D of 0.996 against the MLST database. The MelTs are concordant with the S. pyogenes population structure. To validate the method we examined clinical isolates of S. pyogenes of 70 STs. Curves were generated as predicted by G+C content, discriminating the 70 STs into 65 distinct MelTs.
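Simpson's Index of Diversity, used above to characterise both the SNP set (D = 0.99) and the MelT scheme (D = 0.996), is conventionally computed as D = 1 - sum n_i(n_i - 1) / (N(N - 1)), the probability that two isolates drawn at random belong to different types; a minimal sketch:

```python
from collections import Counter

def simpsons_diversity(type_labels):
    """Simpson's Index of Diversity D = 1 - sum n_i(n_i-1) / (N(N-1)),
    where n_i is the count of type i among N typed isolates: the chance
    that two randomly drawn isolates have different types."""
    counts = Counter(type_labels)
    n = sum(counts.values())
    if n < 2:
        raise ValueError("need at least two isolates")
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))
```

D = 1.0 when every isolate has a unique type and 0.0 when all isolates share one type, which brackets the values reported in the abstract.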