893 results for Boussinesq approximation
Abstract:
A condensation technique for degrees of freedom is first proposed to improve the computational efficiency of the meshfree method with Galerkin weak form. In the present method, scattered nodes without connectivity are divided into several subsets by cells of arbitrary shape. The local discrete equations are established over each cell using moving kriging interpolation, in which the nodes located in the cell are used for the approximation. The condensation technique is then introduced into the local discrete equations by transferring the equations of inner nodes to the equations of boundary nodes on a cell-by-cell basis. In this scheme, the calculation of each cell is carried out by the meshfree method with Galerkin weak form, and a local search is used in the interpolation. Numerical examples show that the present method has high computational efficiency and convergence, and good accuracy is also obtained.
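The cell-level condensation described above corresponds to standard static condensation of the interior degrees of freedom. A minimal sketch in Python/NumPy, assuming the local cell matrix and load vector have already been assembled by the moving kriging/Galerkin procedure (index sets and array names are illustrative):

    import numpy as np

    def condense_cell(K, f, boundary, interior):
        # Partition the local cell system into boundary (b) and interior (i)
        # degrees of freedom and eliminate the interior ones, so that the
        # assembled global system involves boundary nodes only.
        Kbb = K[np.ix_(boundary, boundary)]
        Kbi = K[np.ix_(boundary, interior)]
        Kib = K[np.ix_(interior, boundary)]
        Kii = K[np.ix_(interior, interior)]
        fb, fi = f[boundary], f[interior]

        Kii_inv = np.linalg.inv(Kii)
        K_cond = Kbb - Kbi @ Kii_inv @ Kib      # condensed stiffness
        f_cond = fb - Kbi @ Kii_inv @ fi        # condensed load
        return K_cond, f_cond

The condensed pairs (K_cond, f_cond) from all cells are then assembled into the global system in the usual way.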
Abstract:
An investigation of the drying of spherical food particles was performed, using peas as the model material. In the development of a mathematical model for drying curves, moisture diffusion was modelled using Fick’s second law for mass transfer. The resulting partial differential equation was solved using a forward-time central-space finite difference approximation, with the assumption of variable effective diffusivity. To test the model, experimental data were collected for the drying of green peas in a fluidised bed at three drying temperatures. By fitting three equation types for effective diffusivity to the data, it was found that a linear equation form, in which diffusivity increased with decreasing moisture content, was most appropriate. The final model accurately described the drying curves at the three experimental temperatures, with an R² value greater than 98.6% for all temperatures.
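For readers unfamiliar with the forward-time central-space (FTCS) scheme, the sketch below solves Fick's second law in a sphere with a moisture-dependent (linear) effective diffusivity; the radius, step sizes, diffusivity coefficients and boundary moisture are illustrative placeholders, not the fitted values of the study:

    import numpy as np

    R, nr = 4e-3, 51                      # particle radius [m], radial grid points
    dr, dt, n_steps = R / (nr - 1), 0.1, 100_000
    r = np.linspace(0.0, R, nr)

    def D_eff(M, a=1e-10, b=5e-10):       # linear form: D rises as moisture falls
        return a + b * (1.0 - M)

    M = np.ones(nr)                       # dimensionless initial moisture
    M_surface = 0.05                      # equilibrium moisture at the surface

    for _ in range(n_steps):
        # spherical diffusion: dM/dt = (1/r^2) d/dr ( r^2 D(M) dM/dr )
        flux = r**2 * D_eff(M) * np.gradient(M, dr)
        M_new = M.copy()
        M_new[1:-1] = M[1:-1] + dt * np.gradient(flux, dr)[1:-1] / r[1:-1]**2
        M_new[0] = M_new[1]               # symmetry condition at the centre
        M_new[-1] = M_surface             # surface held at equilibrium
        M = M_new

Averaging M over the sphere volume at each step gives the predicted drying curve to be fitted against the experimental data.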
Abstract:
This paper presents an investigation into event detection in crowded scenes, where the event of interest co-occurs with other activities and only binary labels at the clip level are available. The proposed approach incorporates a fast feature descriptor from the MPEG domain, and a novel multiple instance learning (MIL) algorithm using sparse approximation and random sensing. MPEG motion vectors are used to build particle trajectories that represent the motion of objects in uniform video clips, and the MPEG DCT coefficients are used to compute a foreground map to remove background particles. Trajectories are transformed into the Fourier domain, and the Fourier representations are quantized into visual words using the K-Means algorithm. The proposed MIL algorithm models the scene as a linear combination of independent events, where each event is a distribution of visual words. Experimental results show that the proposed approaches achieve promising results for event detection compared to the state-of-the-art.
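As a rough illustration of the bag-of-words step, the sketch below quantises Fourier-transformed trajectories into visual words with K-Means and represents a clip as a distribution over those words; array shapes, the vocabulary size and the use of scikit-learn are assumptions for the example, not details taken from the paper:

    import numpy as np
    from sklearn.cluster import KMeans

    def fourier_features(trajectories):
        # trajectories: (n_traj, traj_len, 2) array of per-frame (dx, dy) motion
        return np.abs(np.fft.rfft(trajectories, axis=1)).reshape(len(trajectories), -1)

    # vocabulary learned once over training trajectories (mock data here)
    train = fourier_features(np.random.randn(500, 16, 2))
    vocab = KMeans(n_clusters=64, n_init=10).fit(train)

    def clip_histogram(trajectories):
        words = vocab.predict(fourier_features(trajectories))
        hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
        return hist / hist.sum()          # the clip as a distribution of visual words

The MIL step then models each labelled clip histogram as a sparse combination of independent event distributions.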
Abstract:
Discretization of a geographical region is quite common in spatial analysis. There have been few studies into the impact of different geographical scales on the outcome of spatial models for different spatial patterns. This study aims to investigate the impact of spatial scales and spatial smoothing on the outcomes of modelling spatial point-based data. Given a spatial point-based dataset (such as occurrence of a disease), we study the geographical variation of residual disease risk using regular grid cells. The individual disease risk is modelled using a logistic model with the inclusion of spatially unstructured and/or spatially structured random effects. Three spatial smoothness priors for the spatially structured component are employed in modelling, namely an intrinsic Gaussian Markov random field, a second-order random walk on a lattice, and a Gaussian field with a Matérn correlation function. We investigate how changes in grid cell size affect model outcomes under different spatial structures and different smoothness priors for the spatial component. A realistic example (the Humberside data) is analyzed and a simulation study is described. Bayesian computation is carried out using an integrated nested Laplace approximation. The results suggest that the performance and predictive capacity of the spatial models improve as the grid cell size decreases for certain spatial structures. It also appears that different spatial smoothness priors should be applied for different patterns of point data.
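The discretisation step itself is straightforward; a minimal sketch, assuming synthetic point locations and case labels (coordinates, bounds and cell sizes are placeholders), aggregates the point data to regular grids of decreasing cell size before the logistic model with spatial random effects is fitted:

    import numpy as np

    def grid_counts(x, y, is_case, cell_size, bounds):
        xmin, xmax, ymin, ymax = bounds
        xbins = np.arange(xmin, xmax + cell_size, cell_size)
        ybins = np.arange(ymin, ymax + cell_size, cell_size)
        cases, _, _ = np.histogram2d(x[is_case], y[is_case], bins=[xbins, ybins])
        total, _, _ = np.histogram2d(x, y, bins=[xbins, ybins])
        return cases, total               # per-cell inputs to the spatial model

    x, y = np.random.rand(2, 1000)        # mock point locations in the unit square
    is_case = np.random.rand(1000) < 0.1  # mock binary case indicator
    for cell in (0.2, 0.1, 0.05):         # decreasing grid cell size
        cases, total = grid_counts(x, y, is_case, cell, (0.0, 1.0, 0.0, 1.0))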
Abstract:
Organizational transformations reliant on successful ICT system developments (continue to) fail to deliver projected benefits even when contemporary governance models are applied rigorously. Modifications to traditional program, project and systems development management methods have produced little material improvement in transformation success, as they are unable to routinely address the complexity and uncertainty of the dynamic alignment of IS investments and innovation. Complexity theory provides insight into why this phenomenon occurs and is used to develop a conceptualization of complexity in IS-driven organizational transformations. This research-in-progress aims to identify complexity formulations relevant to organizational transformation. Political/power-based influences, interrelated business rules, socio-technical innovation, impacts on stakeholders and emergent behaviors are commonly considered to characterize complexity, while the proposed conceptualization accommodates these as connectivity, irreducibility, entropy and/or information gain in hierarchical approximation and scaling, the number of states in a finite automaton and/or the dimension of an attractor, and information and/or variety.
Abstract:
Background: The overuse of antibiotics is an increasing concern. Antibiotic resistance, which increases both the burden of disease and the cost of health services, is perhaps the most profound impact of antibiotic overuse. Attempts have been made to develop instruments to measure the psychosocial constructs underlying antibiotic use; however, none of these instruments have undergone thorough psychometric validation. This study evaluates the psychometric properties of the Parental Perceptions on Antibiotics (PAPA) scales, which attempt to measure the factors influencing parental use of antibiotics in children. Methods: 1111 parents of children younger than 12 years old were recruited from primary schools’ parental meetings in the Eastern Province of Saudi Arabia from September 2012 to January 2013. The structure of the PAPA instrument was validated using Confirmatory Factor Analysis (CFA), with measurement model fit evaluated using the raw and scaled χ², the Goodness of Fit Index, and the Root Mean Square Error of Approximation. Results: A five-factor model was confirmed, with the model showing good fit. Constructs in the model include: Knowledge and Beliefs, Behaviors, Sources of information, Adherence, and Awareness about antibiotic resistance. The instrument was shown to have good internal consistency, and good discriminant and convergent validity. Conclusion: The availability of an instrument able to measure the psychosocial factors underlying antibiotic usage allows the risk factors underlying antibiotic use and overuse to be investigated.
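For reference, the Root Mean Square Error of Approximation used to judge model fit can be computed directly from the χ² statistic, its degrees of freedom and the sample size; the values below are illustrative, not those reported in the study:

    import math

    def rmsea(chi2, df, n):
        # RMSEA = sqrt( max(chi2 - df, 0) / (df * (n - 1)) )
        return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

    print(rmsea(chi2=850.0, df=400, n=1111))   # ~0.03, conventionally a close fit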
Abstract:
Spreading cell fronts play an essential role in many physiological processes. Classically, models of this process are based on the Fisher-Kolmogorov equation; however, such continuum representations are not always suitable as they do not explicitly represent behaviour at the level of individual cells. Additionally, many models examine only the large time asymptotic behaviour, where a travelling wave front with a constant speed has been established. Many experiments, such as a scratch assay, never display this asymptotic behaviour, and in these cases the transient behaviour must be taken into account. We examine the transient and asymptotic behaviour of moving cell fronts using techniques that go beyond the continuum approximation via a volume-excluding birth-migration process on a regular one-dimensional lattice. We approximate the averaged discrete results using three methods: (i) mean-field, (ii) pair-wise, and (iii) one-hole approximations. We discuss the performance of these methods, in comparison to the averaged discrete results, across a range of parameter space, examining both the transient and asymptotic behaviours. The one-hole approximation, based on techniques from statistical physics, is not capable of predicting transient behaviour but provides excellent agreement with the asymptotic behaviour of the averaged discrete results, provided that cells are proliferating fast enough relative to their rate of migration. The mean-field and pair-wise approximations give indistinguishable asymptotic results, which agree with the averaged discrete results when cells are migrating much more rapidly than they are proliferating. The pair-wise approximation performs better in the transient region than does the mean-field, despite having the same asymptotic behaviour. Our results show that each approximation only works in specific situations, thus we must be careful to use a suitable approximation for a given system, otherwise inaccurate predictions could be made.
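As an illustration of the simplest of the three closures, the sketch below integrates mean-field equations for the occupancy C_j of a volume-excluding birth-migration process on a 1D lattice (unbiased motility at rate Pm, proliferation into adjacent empty sites at rate Pp); the rates, lattice size and initial condition are illustrative, and the pair-wise and one-hole closures are not shown:

    import numpy as np

    Pm, Pp, dt, L, n_steps = 1.0, 0.01, 0.1, 200, 5000
    C = np.zeros(L)
    C[:20] = 1.0                               # initially occupied region at one end

    for _ in range(n_steps):
        left, right = np.roll(C, 1), np.roll(C, -1)
        left[0], right[-1] = C[0], C[-1]       # reflecting boundaries
        migration = 0.5 * Pm * (left + right - 2.0 * C)
        proliferation = 0.5 * Pp * (left + right) * (1.0 - C)
        C = C + dt * (migration + proliferation)

Tracking the position where C falls below a threshold at each time gives the transient and long-time front speed to compare against the averaged discrete simulations.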
Abstract:
Dermal wound repair involves complex interactions between cells, cytokines and mechanics to close injuries to the skin. In particular, we investigate the contribution of fibroblasts, myofibroblasts, TGFβ, collagen and local tissue mechanics to wound repair in the human dermis. We develop a morphoelastic model where a realistic representation of tissue mechanics is key, and a fibrocontractive model that involves a reasonable approximation to the true kinetics of the important bioactive species. We use each of these descriptions to elucidate the mechanisms that generate pathologies such as hypertrophic scars, contractures and keloids. We find that for hypertrophic scar and contracture development, factors regulating the myofibroblast phenotype are critical, with heightened myofibroblast activation, reduced myofibroblast apoptosis or prolonged inflammation all predicted as mediators for scar hypertrophy and contractures. Prevention of these pathologies is predicted when myofibroblast apoptosis is induced, myofibroblast activation is blocked or TGFβ is neutralised. To investigate keloid invasion, we develop a caricature representation of the fibrocontractive model and find that TGFβ spread is the driving factor behind keloid growth. Blocking activation of TGFβ is found to cause keloid regression. Thus, we recommend myofibroblasts and TGFβ as targets for clinicians when developing intervention strategies for prevention and cure of fibrotic scars.
Abstract:
The numerical solution in one space dimension of advection-reaction-diffusion systems with nonlinear source terms may incur a high computational cost when the presently available methods are used. Numerous examples of finite volume schemes with high order spatial discretisations, together with various techniques for approximating the advection term, can be found in the literature. Almost all such techniques result in a nonlinear system of equations as a consequence of the finite volume discretisation, especially when there are nonlinear source terms in the associated partial differential equation models. This work introduces a new technique that avoids such nonlinear systems of equations being generated by the spatial discretisation process when the nonlinear source terms in the model equations can be expanded in positive powers of the dependent function of interest. The basis of the method is a new linearisation technique for the temporal integration of the nonlinear source terms, used as a supplement to a more typical finite volume method. The resulting linear system of equations is shown to be both accurate and significantly faster to solve than methods that necessitate the use of solvers for nonlinear systems of equations.
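A generic version of such a source-term linearisation (not necessarily the paper's exact scheme) expands S(u^{n+1}) about the previous time level, S(u^{n+1}) ≈ S(u^n) + S'(u^n)(u^{n+1} - u^n), so that each time step of the finite volume discretisation reduces to a single linear, tridiagonal solve. A sketch for a logistic source S(u) = u(1 - u) with zero-flux boundaries (all parameters illustrative):

    import numpy as np

    def step(u, dx, dt, D=1.0):
        n = len(u)
        S  = u * (1.0 - u)                 # source at the old time level
        dS = 1.0 - 2.0 * u                 # its derivative
        A, b = np.zeros((n, n)), np.zeros(n)
        for i in range(n):
            A[i, i] = 1.0 / dt + 2.0 * D / dx**2 - dS[i]
            if i > 0:
                A[i, i - 1] = -D / dx**2
            if i < n - 1:
                A[i, i + 1] = -D / dx**2
            b[i] = u[i] / dt + S[i] - dS[i] * u[i]
        A[0, 1] = A[-1, -2] = -2.0 * D / dx**2   # ghost-cell zero-flux boundaries
        return np.linalg.solve(A, b)             # one *linear* solve per time step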
Abstract:
An important aspect of decision support systems involves applying sophisticated and flexible statistical models to real datasets and communicating the results to decision makers in interpretable ways. An important class of problem is the modelling of incidence, such as fire or disease. Models of incidence known as point processes or Cox processes are particularly challenging as they are ‘doubly stochastic’, i.e. obtaining the probability mass function of incidents requires two integrals to be evaluated. Existing approaches to the problem either use simple models that obtain predictions using plug-in point estimates and do not distinguish between Cox processes and density estimation, but do use sophisticated 3D visualization for interpretation, or employ sophisticated non-parametric Bayesian Cox process models but do not use visualization to render complex spatio-temporal forecasts interpretable. The contribution here is to fill this gap by inferring predictive distributions of log Gaussian Cox processes and rendering them using state-of-the-art 3D visualization techniques. This requires performing inference on an approximation of the model on a large discretized grid and adapting an existing spatial-diurnal kernel to the log Gaussian Cox process context.
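The grid approximation at the heart of this approach treats the event count in each cell as Poisson with rate equal to the cell area times the exponential of a latent Gaussian field. A minimal sketch with a simple squared-exponential covariance (grid size, length scale and intercept are illustrative, and no inference scheme is shown):

    import numpy as np

    n, cell_area, mu = 30, 1.0, -2.0
    xy = np.stack(np.meshgrid(np.arange(n), np.arange(n)), -1).reshape(-1, 2)
    d2 = ((xy[:, None, :] - xy[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2.0 * 5.0**2)) + 1e-6 * np.eye(n * n)   # GP covariance
    f = np.linalg.cholesky(K) @ np.random.randn(n * n)        # latent log-intensity
    counts = np.random.poisson(cell_area * np.exp(mu + f))    # simulated cell counts

    def log_likelihood(f, counts):
        lam = cell_area * np.exp(mu + f)
        return np.sum(counts * np.log(lam) - lam)             # up to a constant

Inference then targets the posterior over f (and the kernel hyperparameters) given the observed counts, with the fitted intensity surface passed to the 3D visualization layer.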
Abstract:
The Quantum Probability Ranking Principle (QPRP) has recently been proposed, and accounts for interdependent document relevance when ranking. However, to be instantiated, the QPRP requires a method to approximate the "interference" between two documents. In this poster, we empirically evaluate a number of different methods of approximation on two TREC test collections for subtopic retrieval. It is shown that these approximations can lead to significantly better retrieval performance over the state of the art.
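A greedy QPRP-style ranking with one hedged choice of interference approximation (the negated cosine similarity between document vectors, scaled by the geometric mean of the relevance estimates) might look as follows; this is only one of several possible approximations, not necessarily those evaluated in the poster:

    import numpy as np

    def qprp_rank(P, vecs):
        # P: per-document relevance estimates; vecs: document term vectors
        def cos(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
        remaining, ranked = list(range(len(P))), []
        while remaining:
            def score(d):
                interference = sum(-2.0 * np.sqrt(P[d] * P[r]) * cos(vecs[d], vecs[r])
                                   for r in ranked)
                return P[d] + interference
            best = max(remaining, key=score)
            ranked.append(best)
            remaining.remove(best)
        return ranked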
Abstract:
Dynamic light scattering (DLS) has become a primary nanoparticle characterization technique, with applications from materials characterization to biological and environmental detection. As DLS use expands from homogeneous spheres to more complicated nanostructures, accuracy decreases. Much research has been performed to develop diffusion models that account for these vastly different structures, but little attention has been given to the effect on the light scattering properties in relation to DLS. In this work, small (core size < 5 nm) core-shell nanoparticles were used as a case study to measure the thickness of a dodecanethiol (DDT) capping layer on Au and ZnO nanoparticles by DLS. We find that the DDT shell has very little effect on the scattering properties of the inorganic core and hence can be ignored to a first approximation. However, this means that conventional DLS analysis overestimates the hydrodynamic size in the volume- and number-weighted distributions. By introducing a simple correction formula that yields more accurate hydrodynamic size distributions, a more precise determination of the molecular shell thickness is obtained. With this correction, the measured thickness of the DDT shell was found to be 7.3 ± 0.3 Å, much less than the extended chain length of 16 Å. This organic layer thickness suggests that, on small nanoparticles, the DDT monolayer adopts a compact disordered structure rather than an open ordered structure on both ZnO and Au nanoparticle surfaces. These observations are in agreement with published molecular dynamics results.
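The principle behind such a correction can be illustrated in the Rayleigh regime, where the scattered intensity of a small core-shell particle is dominated by the core (roughly ∝ d_core^6) while the measured hydrodynamic diameter is d_core + 2·t_shell, so re-weighting with the core diameter rather than the full hydrodynamic diameter shifts the number-weighted distribution. This is an illustration of the idea only, not the paper's specific formula, and all sizes are placeholders:

    import numpy as np

    d_hydro = np.linspace(2.0, 10.0, 200)               # nm, hydrodynamic size bins
    I = np.exp(-0.5 * ((d_hydro - 6.0) / 1.0) ** 2)     # mock intensity-weighted data
    t_shell = 0.73                                      # nm, trial shell thickness
    d_core = d_hydro - 2.0 * t_shell

    number_conventional = I / d_hydro**6                # weights by the full size
    number_corrected = I / d_core**6                    # weights by the core only
    number_conventional /= number_conventional.sum()
    number_corrected /= number_corrected.sum()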
Abstract:
The Dragon stream cipher is one of the focus ciphers which have reached Phase 2 of the eSTREAM project. In this paper, we present a new method of building a linear distinguisher for Dragon. The distinguisher is constructed by exploiting the biases of two S-boxes and the modular addition that are basic components of the nonlinear function F. The bias of the distinguisher is estimated to be around 2^-75.32, which is better than the bias of the distinguisher presented by Englund and Maximov. We show that Dragon is distinguishable from a random cipher using around 2^150.6 keystream words and 2^59 memory. In addition, we present a very efficient algorithm for computing the bias of a linear approximation of modular addition.
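The bias of a linear approximation of modular addition can be checked by brute force for small word sizes (the cipher's 32-bit words are far beyond enumeration, which is why an efficient algorithm is needed); the masks and word size below are illustrative:

    def parity(v):
        return bin(v).count("1") & 1

    def bias(mask_x, mask_y, mask_z, n):
        # Pr[ mask_x·x ^ mask_y·y ^ mask_z·((x + y) mod 2^n) = 0 ] - 1/2
        count, total = 0, 1 << (2 * n)
        for x in range(1 << n):
            for y in range(1 << n):
                z = (x + y) & ((1 << n) - 1)
                if (parity(mask_x & x) ^ parity(mask_y & y) ^ parity(mask_z & z)) == 0:
                    count += 1
        return count / total - 0.5

    print(bias(0b1000, 0b1000, 0b1000, n=4))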
Abstract:
Traction force microscopy (TFM) is commonly used to estimate cells’ traction forces from the deformation they cause on their substrate. The accuracy of TFM depends strongly on the computational methods used to measure the deformation of the substrate and estimate the forces, and also on the specifics of the experimental set-up. Computer simulations can be used to evaluate the effect of both the computational methods and the experimental set-up without the need to perform numerous experiments. Here, we present one such TFM simulator that addresses several limitations of the existing ones. As a proof of principle, we recreate a TFM experimental set-up and apply a classic 2D TFM algorithm to recover the forces. In summary, our simulator provides a valuable tool to study the performance of TFM methods, to refine experimental set-ups, and to guide the extraction of biological conclusions from TFM experiments.
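As a flavour of the forward problem such a simulator solves, the sketch below evaluates substrate displacements at marker bead positions produced by a set of tangential point tractions, using the Boussinesq/Cerruti half-space Green's function commonly employed in 2D TFM; the material parameters, bead positions and forces are illustrative, and a real simulator includes many effects omitted here:

    import numpy as np

    E, nu = 10_000.0, 0.45                        # gel Young's modulus [Pa], Poisson ratio

    def greens(dx, dy):
        r = np.hypot(dx, dy) + 1e-9               # regularise the r = 0 singularity
        c = (1.0 + nu) / (np.pi * E)
        Gxx = c * ((1.0 - nu) / r + nu * dx * dx / r**3)
        Gyy = c * ((1.0 - nu) / r + nu * dy * dy / r**3)
        Gxy = c * (nu * dx * dy / r**3)
        return Gxx, Gxy, Gyy

    def displacements(beads, point_forces):
        # beads: (m, 2) marker positions; point_forces: iterable of (x, y, fx, fy)
        u = np.zeros_like(beads, dtype=float)
        for x, y, fx, fy in point_forces:
            Gxx, Gxy, Gyy = greens(beads[:, 0] - x, beads[:, 1] - y)
            u[:, 0] += Gxx * fx + Gxy * fy
            u[:, 1] += Gxy * fx + Gyy * fy
        return u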
Abstract:
A condensation technique for degrees of freedom is first proposed to improve the computational efficiency of the meshfree method with Galerkin weak form for elastic dynamic analysis. In the present method, scattered nodes without connectivity are divided into several subsets by cells of arbitrary shape. A local discrete equation is established over each cell using moving Kriging interpolation, in which the nodes located in the cell are used for the approximation. The local discrete equations are then simplified by condensation of degrees of freedom, which transfers the equations of inner nodes to the equations of boundary nodes on a cell basis. The global dynamic system equations are obtained by assembling all local discrete equations and are solved using the standard implicit Newmark time integration scheme. In this scheme, the calculation of each cell is carried out by the meshfree method, and a local search is used in the interpolation. Numerical examples show that the present method has high computational efficiency and good accuracy in solving elastic dynamic problems.
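The implicit Newmark update used to advance the assembled global system M a + K u = F(t) can be sketched as follows (average-acceleration parameters β = 1/4, γ = 1/2; M, K, F and the time step are assumed to come from the condensed meshfree assembly, and damping is omitted for brevity):

    import numpy as np

    def newmark(M, K, F, u0, v0, dt, n_steps, beta=0.25, gamma=0.5):
        a0, a2, a3 = 1.0 / (beta * dt**2), 1.0 / (beta * dt), 1.0 / (2.0 * beta) - 1.0
        u, v = u0.copy(), v0.copy()
        a = np.linalg.solve(M, F(0.0) - K @ u)        # consistent initial acceleration
        K_eff = K + a0 * M                            # effective stiffness
        for step in range(1, n_steps + 1):
            t = step * dt
            F_eff = F(t) + M @ (a0 * u + a2 * v + a3 * a)
            u_new = np.linalg.solve(K_eff, F_eff)
            a_new = a0 * (u_new - u) - a2 * v - a3 * a
            v = v + dt * ((1.0 - gamma) * a + gamma * a_new)
            u, a = u_new, a_new
        return u, v, a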