279 results for 080204 Mathematical Software


Relevance:

100.00%

Publisher:

Abstract:

The R statistical environment and language has demonstrated particular strengths for interactive development of statistical algorithms, as well as data modelling and visualisation. Its current implementation has an interpreter at its core, which may result in a performance penalty in comparison to directly executing user algorithms in the native machine code of the host CPU. In contrast, the C++ language has no built-in visualisation capabilities, handling of linear algebra or even basic statistical algorithms; however, user programs are converted to high-performance machine code ahead of execution. A new method avoids possible speed penalties in R by using the Rcpp extension package in conjunction with the Armadillo C++ matrix library. In addition to the inherent performance advantages of compiled code, Armadillo provides an easy-to-use template-based meta-programming framework, allowing the automatic pooling of several linear algebra operations into one, which in turn can lead to further speedups. With the aid of Rcpp and Armadillo, conversion of linear algebra-centred algorithms from R to C++ becomes straightforward. The algorithms retain their overall structure and readability, while maintaining a bidirectional link with the host R environment. Empirical timing comparisons of R and C++ implementations of a Kalman filtering algorithm indicate a speedup of several orders of magnitude.
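The predict/update recursion that such ports preserve can be sketched in its simplest scalar form. The sketch below is an illustrative one-dimensional Kalman filter in Python, not the paper's R or C++/Armadillo code, and the noise variances are assumed values:

```python
def kalman_1d(measurements, q=1e-4, r=0.1, x0=0.0, p0=1.0):
    """Filter a sequence of noisy scalar measurements of a constant state.

    q: process-noise variance, r: measurement-noise variance (assumed values);
    x0, p0: prior state estimate and its variance.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: the state is modelled as constant, so only uncertainty grows.
        p = p + q
        # Update: blend prediction and measurement via the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates
```

The same loop, rewritten with Armadillo matrices in place of the scalars, keeps this structure essentially line for line, which is the readability point the abstract makes.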

Relevance:

100.00%

Publisher:

Abstract:

Deterministic computer simulations of physical experiments are now common techniques in science and engineering, since physical experiments are often too time-consuming, expensive or impossible to conduct. In such cases, complex computer models, or codes, take the place of physical experiments, leading to the study of computer experiments, which are used to investigate many scientific phenomena of this nature. A computer experiment consists of a number of runs of the computer code with different input choices. The design and analysis of computer experiments is a rapidly growing area of statistical experimental design. This thesis investigates some practical issues in the design and analysis of computer experiments and attempts to answer some of the questions faced by experimenters using them. In particular, it studies how many runs a computer experiment needs and how an existing design should be augmented, with attention given to the case where the response is a function over time.
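A common space-filling starting point for choosing the input settings of such runs is the Latin hypercube design. The sketch below is a minimal random construction for illustration, not the thesis's design or augmentation method:

```python
import random

def latin_hypercube(n_runs, n_inputs, seed=0):
    """Random Latin hypercube design on [0, 1)^n_inputs.

    Each input's range is cut into n_runs equal strata, and each stratum
    is sampled exactly once in a shuffled order, so the points spread
    evenly over every input dimension.
    """
    rng = random.Random(seed)
    columns = []
    for _ in range(n_inputs):
        strata = list(range(n_runs))
        rng.shuffle(strata)
        columns.append([(s + rng.random()) / n_runs for s in strata])
    # Transpose: each row is the input vector for one run of the code.
    return [list(row) for row in zip(*columns)]
```

Each row of the returned design is then one input choice for a run of the computer code; augmentation strategies add further rows while trying to preserve this spread.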

Relevance:

100.00%

Publisher:

Abstract:

Pesticides used in agricultural systems must be applied in economically viable and environmentally sensitive ways, and this often requires expensive field trials on spray deposition and retention by plant foliage. Computational models to describe whether a spray droplet sticks (adheres), bounces or shatters on impact, and whether any rebounding parent or shatter daughter droplets are recaptured, would provide an estimate of spray retention and thereby act as a useful guide prior to any field trials. Parameter-driven interactive software has been implemented to enable the end-user to study and visualise droplet interception and impaction on a single, horizontal leaf. Living chenopodium, wheat and cotton leaves have been scanned to capture their surface topography, and realistic virtual leaf surface models have been generated. Individual leaf models have then been subjected to virtual spray droplets and predictions made of droplet interception with the virtual plant leaf. Thereafter, the impaction behaviour of the droplets and the subsequent behaviour of any daughter droplets, up until re-capture, are simulated to give the predicted total spray retention by the leaf. A series of critical thresholds for the stick, bounce and shatter elements in the impaction process have been developed for different combinations of formulation, droplet size and velocity, and leaf surface characteristics to provide this output. The results show that droplet properties, spray formulations and leaf surface characteristics all influence the predicted amount of spray retained on a horizontal leaf surface. Overall, the predicted spray retention increases as formulation surface tension, static contact angle, droplet size and velocity decrease. Predicted retention on cotton is much higher than on chenopodium.
The average predicted retention on a single horizontal leaf, across all droplet size, velocity and formulation scenarios tested, is 18%, 30% and 85% for chenopodium, wheat and cotton, respectively.
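A standard dimensionless quantity used in stick/bounce/shatter criteria of this kind is the Weber number, which compares impact inertia to surface tension. The sketch below classifies an impact with placeholder thresholds; the threshold values are hypothetical and not the paper's fitted criteria, which also depend on formulation and leaf surface characteristics:

```python
def weber_number(density, velocity, diameter, surface_tension):
    """Dimensionless impact energy: We = rho * v^2 * d / sigma (SI units)."""
    return density * velocity**2 * diameter / surface_tension

def impact_regime(we, bounce_threshold=5.0, shatter_threshold=80.0):
    """Classify a droplet impact by Weber number.

    The two thresholds are illustrative placeholders only: in the model
    described above they vary with formulation, droplet properties and
    leaf surface characteristics.
    """
    if we < bounce_threshold:
        return "stick"
    if we < shatter_threshold:
        return "bounce"
    return "shatter"
```

For a 100-micron water droplet (surface tension about 0.072 N/m), this toy classifier predicts sticking at 1 m/s, bouncing at 5 m/s and shattering at 10 m/s.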

Relevance:

100.00%

Publisher:

Abstract:

Unsaturated water flow in soil is commonly modelled using Richards' equation, which requires the hydraulic properties of the soil (e.g., porosity, hydraulic conductivity) to be characterised. Naturally occurring soils, however, are heterogeneous in nature; that is, they are composed of a number of interwoven homogeneous soils, each with its own set of hydraulic properties. When the length scale of these soil heterogeneities is small, numerical solution of Richards' equation is computationally impractical due to the immense effort and refinement required to mesh the actual heterogeneous geometry. A classic way forward is to use a macroscopic model, where the heterogeneous medium is replaced with a fictitious homogeneous medium that attempts to give the average flow behaviour at the macroscopic scale (i.e., at a scale much larger than the scale of the heterogeneities). Using homogenisation theory, a macroscopic equation can be derived that takes the form of Richards' equation with effective parameters. A disadvantage of the macroscopic approach, however, is that it fails in cases when the assumption of local equilibrium does not hold. This limitation has seen the introduction of two-scale models that include, at each point in the macroscopic domain, an additional flow equation at the scale of the heterogeneities (the microscopic scale). This report outlines a well-known two-scale model and contributes to the literature a number of important advances in its numerical implementation. These include the use of an unstructured control volume finite element method and image-based meshing techniques, which allow irregular micro-scale geometries to be treated, and the use of an exponential time integration scheme that permits both scales to be resolved simultaneously in a completely coupled manner.
Numerical comparisons against a classical macroscopic model confirm that only the two-scale model correctly captures the important features of the flow for a range of parameter values.
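For reference, the mixed form of Richards' equation discussed above is commonly written as:

```latex
% Mixed form of Richards' equation: \theta is volumetric moisture
% content, h pressure head, K(h) hydraulic conductivity, and z the
% vertical coordinate taken positive upward.
\frac{\partial \theta(h)}{\partial t}
  = \nabla \cdot \bigl( K(h)\, \nabla h \bigr)
  + \frac{\partial K(h)}{\partial z}
```

In the heterogeneous setting, θ and K vary rapidly in space with the interwoven soil types, which is what makes direct meshing of the geometry impractical.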

Relevance:

30.00%

Publisher:

Abstract:

Self-segregation and compartmentalisation are observed experimentally to occur spontaneously on live membranes as well as reconstructed model membranes. It is believed that many of these processes are caused or supported by anomalous diffusive behaviours of biomolecules on membranes due to the complex and heterogeneous nature of these environments. These phenomena are on the one hand of great interest in biology, since they may be an important way for biological systems to selectively localize receptors, regulate signaling or modulate kinetics; and on the other, they provide an inspiration for engineering designs that mimic natural systems. We present an interactive software package we are developing for the purpose of simulating such processes numerically using a fundamental Monte Carlo approach. This program includes the ability to simulate kinetics and mass transport in the presence of either mobile or immobile obstacles and other relevant structures such as liquid-ordered lipid microdomains. We also present preliminary simulation results regarding the selective spatial localization and chemical kinetics modulating power of immobile obstacles on the membrane, obtained using the program.
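The fundamental Monte Carlo approach mentioned can be illustrated with a minimal obstructed random walk. The Python sketch below is not the package's code (the lattice size, step count and obstacle fraction are arbitrary choices); it shows how immobile obstacles reduce the mean squared displacement, i.e. the anomalous slowing of diffusion described:

```python
import random

def random_walk_msd(n_particles, n_steps, obstacle_fraction, size=50, seed=0):
    """Mean squared displacement of random walkers on a 2-D periodic
    lattice with randomly placed immobile obstacles.  Moves onto an
    obstacle site are rejected, so crowding slows effective diffusion."""
    rng = random.Random(seed)
    obstacles = {(x, y) for x in range(size) for y in range(size)
                 if rng.random() < obstacle_fraction}
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    total = 0.0
    for _ in range(n_particles):
        # Start each walker on a free site.
        while True:
            px, py = rng.randrange(size), rng.randrange(size)
            if (px, py) not in obstacles:
                break
        x = y = 0  # unwrapped displacement from the start point
        for _ in range(n_steps):
            dx, dy = rng.choice(moves)
            nx, ny = (px + dx) % size, (py + dy) % size
            if (nx, ny) not in obstacles:  # blocked moves are rejected
                px, py = nx, ny
                x, y = x + dx, y + dy
        total += x * x + y * y
    return total / n_particles
```

Comparing `obstacle_fraction=0.0` with a crowded value such as `0.4` shows the obstructed walkers covering markedly less ground over the same number of steps.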

Relevance:

30.00%

Publisher:

Abstract:

To address issues of divisive ideologies in the Mathematics Education community and to subsequently advance educational practice, an alternative theoretical framework and operational model is proposed which represents a consilience of constructivist learning theories whilst acknowledging the objective but improvable nature of domain knowledge. Based upon Popper's three-world model of knowledge, the proposed theory supports the differentiation and explicit modelling of both shared domain knowledge and idiosyncratic personal understanding using a visual nomenclature. The visual nomenclature embodies Piaget's notion of reflective abstraction and so may support an individual's experience-based transformation of personal understanding with regard to shared domain knowledge. Using the operational model and visual nomenclature, seminal literature regarding early-number counting and addition was analysed and described. Exemplars of the resultant visual artefacts demonstrate the proposed theory's viability as a tool with which to characterise the reflective abstraction-based organisation of a domain's shared knowledge. Utilising such a description of knowledge, future research needs to consider the refinement of the operational model and visual nomenclature to include the analysis, description and scaffolded transformation of personal understanding. A detailed model of knowledge and understanding may then underpin the future development of educational software tools such as computer-mediated teaching and learning environments.

Relevance:

30.00%

Publisher:

Abstract:

The Toolbox, combined with MATLAB® and a modern workstation computer, is a useful and convenient environment for the investigation of machine vision algorithms. For modest image sizes the processing rate can be sufficiently "real-time" to allow for closed-loop control. Focus-of-attention methods such as dynamic windowing (not provided) can be used to increase the processing rate. With input from a FireWire or web camera (support provided) and output to a robot (not provided), it would be possible to implement a visual servo system entirely in MATLAB. The Toolbox provides many functions that are useful in machine vision and vision-based control, as well as in photometry, photogrammetry and colorimetry. It includes over 100 functions spanning operations such as image file reading and writing, acquisition, display, filtering, blob, point and line feature extraction, mathematical morphology, homographies, visual Jacobians, camera calibration and color space conversion.

Relevance:

30.00%

Publisher:

Abstract:

Fracture healing is a complicated coupling of many processes. Yet despite the apparent complexity, fracture repair is usually effective. There is, however, no comprehensive mathematical model addressing the multiple interactions of cells, cytokines and oxygen that includes extra-cellular matrix production and that results in the formation of the early-stage soft callus. This thesis develops a one-dimensional continuum transport model in the context of early fracture healing. Although fracture healing is a complex interplay of many local factors, critical components are identified and used to construct a hypothesis about regulation of the evolution of early callus formation. Multiple cell lines, cellular differentiation, oxygen levels and cytokine concentrations are examined as factors affecting this model of early bone repair. The model presumes diffusive and chemotactic cell migration mechanisms. It is proposed that the initial signalling regime and oxygen availability, arising as consequences of bone fracture, are sufficient to determine the quantity and quality of early soft callus formation. Readily available software and purpose-written algorithms have been used to obtain numerical solutions representative of various initial conditions. These numerical distributions of cellular populations reflect available histology obtained from murine osteotomies. The behaviour of the numerical system in response to differing initial conditions can be described by alternative in vivo healing pathways. An experimental basis, as illustrated in murine fracture histology, has been utilised to validate the mathematical model outcomes. The model developed in this thesis has potential for future extension, to incorporate processes leading to woven bone deposition, while maintaining the characteristics that regulate early callus formation.
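Diffusive and chemotactic cell migration of the kind described is typically expressed through a Keller–Segel-type flux. A generic one-dimensional form (a standard template, not the thesis's exact system) is:

```latex
% c: cell density, g: cytokine concentration, D: cell diffusivity,
% \chi: chemotactic sensitivity, S: lumped source terms (proliferation,
% differentiation), which may also depend on local oxygen levels.
\frac{\partial c}{\partial t}
  = \frac{\partial}{\partial x}\!\left( D \frac{\partial c}{\partial x}
      - \chi\, c\, \frac{\partial g}{\partial x} \right)
  + S(c, g)
```

One such equation per cell line, coupled to transport equations for the cytokines and oxygen, gives the kind of system whose numerical solutions are compared against the murine histology.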

Relevance:

30.00%

Publisher:

Abstract:

If the land sector is to make significant contributions to mitigating anthropogenic greenhouse gas (GHG) emissions in coming decades, it must do so while concurrently expanding production of food and fiber. In our view, mathematical modeling will be required to provide scientific guidance to meet this challenge. In order to be useful in GHG mitigation policy measures, models must simultaneously meet scientific, software engineering, and human capacity requirements. They can be used to understand GHG fluxes, to evaluate proposed GHG mitigation actions, and to predict and monitor the effects of specific actions; the latter applications require a change in mindset that has parallels with the shift from research modeling to decision support. We compare and contrast six agro-ecosystem models (FullCAM, DayCent, DNDC, APSIM, WNMM, and AgMod), chosen because they are used in Australian agriculture and forestry. Underlying structural similarities in the representations of carbon flows through plants and soils in these models are complemented by a diverse range of emphases and approaches to the subprocesses within the agro-ecosystem. None of these agro-ecosystem models handles all land sector GHG fluxes, and considerable model-based uncertainty exists for soil C fluxes and enteric methane emissions. The models also show diverse approaches to the initialisation of model simulations, software implementation, distribution, licensing, and software quality assurance; each of these will differentially affect their usefulness for policy-driven GHG mitigation prediction and monitoring. Specific requirements imposed on the use of models by Australian mitigation policy settings are discussed, and areas for further scientific development of agro-ecosystem models for use in GHG mitigation policy are proposed.

Relevance:

30.00%

Publisher:

Abstract:

Species identification based on short sequences of DNA markers, that is, DNA barcoding, has emerged as an integral part of modern taxonomy. However, software for the analysis of large and multilocus barcoding data sets is scarce. The Basic Local Alignment Search Tool (BLAST) is currently the fastest tool capable of handling large databases (e.g. >5000 sequences), but its accuracy is a concern, and it has been criticized for its local optimization. Current, more accurate software requires sequence alignment or complex calculations, which are time-consuming when dealing with large data sets during data preprocessing or during the search stage. It is therefore imperative to develop a practical program for both accurate and scalable species identification for DNA barcoding. In this context, we present VIP Barcoding: user-friendly software with a graphical user interface for rapid DNA barcoding. It adopts a hybrid, two-stage algorithm. First, an alignment-free composition vector (CV) method is utilized to reduce the search space by screening a reference database. The alignment-based K2P distance nearest-neighbour method is then employed to analyse the smaller data set generated in the first stage. In comparison with other software, we demonstrate that VIP Barcoding has (i) higher accuracy than Blastn and several alignment-free methods and (ii) higher scalability than alignment-based distance methods and character-based methods. These results suggest that this platform is able to deal with both large-scale and multilocus barcoding data with accuracy and can contribute to DNA barcoding for modern taxonomy. VIP Barcoding is free and available at http://msl.sls.cuhk.edu.hk/vipbarcoding/.
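The K2P (Kimura two-parameter) distance used in the second stage has a standard closed form. Below is a minimal Python implementation of that textbook formula for illustration; VIP Barcoding's own implementation may differ in details such as gap and ambiguity handling:

```python
import math

# Pairs of bases related by a transition (purine<->purine, pyrimidine<->pyrimidine).
TRANSITIONS = {("A", "G"), ("G", "A"), ("C", "T"), ("T", "C")}

def k2p_distance(seq1, seq2):
    """Kimura two-parameter distance between two aligned DNA sequences
    of equal length: d = -1/2 ln(1 - 2P - Q) - 1/4 ln(1 - 2Q), where
    P and Q are the observed transition and transversion proportions."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    n = len(seq1)
    transitions = transversions = 0
    for a, b in zip(seq1.upper(), seq2.upper()):
        if a == b:
            continue
        if (a, b) in TRANSITIONS:
            transitions += 1
        else:
            transversions += 1
    p, q = transitions / n, transversions / n
    return -0.5 * math.log(1 - 2 * p - q) - 0.25 * math.log(1 - 2 * q)
```

A nearest-neighbour classifier then assigns a query to the species of the reference sequence with the smallest such distance within the candidate set returned by the CV screening stage.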

Relevance:

30.00%

Publisher:

Abstract:

The phosphine distribution in a cylindrical silo containing grain is predicted. A three-dimensional mathematical model, which accounts for multicomponent gas phase transport and the sorption of phosphine into the grain kernel, is developed. In addition, a simple model is presented to describe the death of insects within the grain as a function of their exposure to phosphine gas. The proposed model is solved using the commercially available computational fluid dynamics (CFD) software, FLUENT, together with our own C code to customize the solver in order to incorporate the models for sorption and insect extinction. Two types of fumigation delivery are studied, namely, fan-forced from the base of the silo and tablet from the top of the silo. An analysis of the predicted phosphine distribution shows that during fan-forced fumigation, the position of the leaky area is very important to the development of the gas flow field and the phosphine distribution in the silo. If the leak is in the lower section of the silo, insects that exist near the top of the silo may not be eradicated. However, the position of a leak does not affect phosphine distribution during tablet fumigation. For such fumigation in a typical silo configuration, phosphine concentrations remain low near the base of the silo. Furthermore, we find that half-life pressure test readings are not an indicator of phosphine distribution during tablet fumigation.
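The insect-death model is described only qualitatively above. A common way to model fumigant toxicity is toxic-load (Haber's-law style) accumulation, sketched here with a placeholder exponent and lethal-dose threshold that are illustrative only, not the paper's fitted mortality parameters:

```python
def insect_survives(conc_series, dt, n=1.0, lethal_dose=50.0):
    """Toxic-load exposure sketch: an insect at a grid point dies once
    its accumulated dose sum(C**n * dt) over the fumigation exceeds a
    lethal threshold.  conc_series is the local phosphine concentration
    at successive time steps of length dt; n and lethal_dose are
    hypothetical placeholder values."""
    dose = 0.0
    for c in conc_series:
        dose += c ** n * dt  # toxic load accumulated over this time step
        if dose >= lethal_dose:
            return False  # lethal exposure reached: insect eradicated
    return True  # insect survives the fumigation
```

Evaluating such a rule at every point of the CFD solution is what identifies regions, such as the top of a silo with a low leak, where insects may escape eradication.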