996 results for Distortion modelling


Abstract:

Quasiconformal mappings are natural generalizations of conformal mappings. They are homeomorphisms with 'bounded distortion', of which several characterizations exist. In this work we study dimension distortion properties of quasiconformal mappings both in the plane and in higher-dimensional Euclidean spaces. The thesis consists of a summary and three research articles. A basic property of quasiconformal mappings is local Hölder continuity. It has long been conjectured that this regularity holds at the Sobolev level (Gehring's higher integrability conjecture). Optimal regularity would also provide sharp bounds for the distortion of Hausdorff dimension. The higher integrability conjecture was solved in the plane by Astala in 1994 and is still open in higher dimensions. Thus, in the plane we have a precise description of how Hausdorff dimension changes under quasiconformal deformations of general sets. The first two articles address two remaining issues in the planar theory. The first concerns distortion of more special sets; for rectifiable sets we expect improved bounds to hold. The second issue is understanding distortion of dimension on a finer level, namely at the level of Hausdorff measures. In the third article we study flatness properties of quasiconformal images of spheres in a quantitative way. These also lead to nontrivial bounds for their Hausdorff dimension, even in the n-dimensional case.
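
For reference, the precise planar description alluded to is Astala's dimension distortion bound (a standard statement added here for context, not quoted from the thesis): for a K-quasiconformal map f of the plane and a compact set E,

\[
\frac{1}{\dim f(E)} - \frac{1}{2} \;\ge\; \frac{1}{K}\left(\frac{1}{\dim E} - \frac{1}{2}\right),
\qquad\text{equivalently}\qquad
\dim f(E) \;\le\; \frac{2K\,\dim E}{2 + (K-1)\dim E},
\]

and this bound is sharp for general sets, which is what leaves room for the improved bounds sought for rectifiable sets.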

Abstract:

The financial health of beef cattle enterprises in northern Australia has declined markedly over the last decade due to an escalation in production and marketing costs and a real decline in beef prices. Historically, gains in animal productivity have offset the effect of declining terms of trade on farm incomes. This raises the question of whether future productivity improvements can remain a key path for lifting enterprise profitability sufficiently to ensure that the industry remains economically viable over the longer term. The key objective of this study was to assess the production and financial implications for north Australian beef enterprises of a range of technology interventions (development scenarios), including genetic gain in cattle, nutrient supplementation, and alteration of the feed base through introduced pastures and forage crops, across a variety of natural environments. To achieve this objective, a beef systems model was developed that is capable of simulating livestock production at the enterprise level, including reproduction, growth and mortality, based on energy and protein supply from natural C4 pastures that are subject to high inter-annual climate variability. Comparisons between simulation outputs and enterprise performance data in three case study regions suggested that the simulation model (the Northern Australia Beef Systems Analyser) can adequately represent the performance of beef cattle enterprises in northern Australia. Testing of a range of development scenarios suggested that the application of individual technologies can substantially lift productivity and profitability, especially where the entire feed base was altered through legume augmentation. The simultaneous implementation of multiple technologies that benefit different aspects of animal productivity resulted in the greatest increases in cattle productivity and enterprise profitability, with projected weaning rates increasing by 25%, liveweight gain by 40% and net profit by 150% above current baseline levels, although gains of this magnitude might not necessarily be realised in practice. While there were slight increases in total methane output from these development scenarios, methane emissions per kg of beef produced were reduced by 20% in the scenarios with higher productivity gains. Combinations of technologies or innovative practices applied in a systematic and integrated fashion thus offer scope for providing the productivity and profitability gains necessary to maintain viable beef enterprises in northern Australia into the future.
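
The abstract does not reproduce the model's equations; the sketch below is a deliberately crude illustration of the kind of annual enterprise-level bookkeeping such a simulator performs. Every parameter value is an invented placeholder, not an input or output of the Northern Australia Beef Systems Analyser.

```python
# Toy steady-state herd budget for one year; all numbers are illustrative placeholders.

def annual_gross_margin(breeders, weaning_rate=0.70, mortality=0.03,
                        sale_weight=550.0, price_per_kg=2.80,
                        cost_per_head=120.0, supplement_cost=0.0):
    """Very coarse gross margin: weaners produced, steers grown out and sold,
    minus a flat per-head cost and any supplementation cost."""
    weaners = breeders * weaning_rate
    steers_sold = 0.5 * weaners * (1.0 - mortality)   # assume half the weaners are steers
    revenue = steers_sold * sale_weight * price_per_kg
    herd_size = breeders + weaners
    costs = herd_size * (cost_per_head + supplement_cost)
    return revenue - costs

baseline = annual_gross_margin(breeders=1000)
improved = annual_gross_margin(breeders=1000,
                               weaning_rate=0.70 * 1.25,       # hypothetical weaning-rate lift
                               sale_weight=550.0 * 1.05,       # hypothetical growth gain
                               supplement_cost=15.0)           # cost of the intervention
print(f"baseline ${baseline:,.0f}, with interventions ${improved:,.0f}")
```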

Abstract:

Objectives: Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model that represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied in DSTs; and b) to empirically evaluate some popular models and alternatives. Design and methods: This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (the linear, relative yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated over a wide range of weed densities, maximum potential yield losses and maximum yield losses per weed. Results: Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effects of important factors (weather, tillage and growth stage of the crop at weed emergence) influencing weed–crop interaction and to improve prediction accuracy. This limited their applicability for use in DSTs, as they became less generalised and often applicable to a much narrower range of conditions than would be encountered in the use of DSTs; these factors' effects could be better accounted for by other techniques. Among the models empirically assessed, the linear model is very simple and appears to work well at sparse weed densities, but it produces unrealistic behaviour at high densities. The relative yield model exhibits the expected behaviour at high densities and high levels of maximum yield loss per weed, but probably underestimates yield loss at low to intermediate densities. The hyperbolic model behaves reasonably at lower weed densities, but produces biologically unreasonable behaviour at low rates of loss per weed combined with high yield loss at the maximum weed density. The density-scaled model is insensitive to the yield loss at maximum weed density in terms of the number of weeds that produce a given proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions. Conclusions: Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
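
For concreteness, the two most familiar of these functional forms can be written down directly. The sketch below uses a capped linear form and the standard rectangular-hyperbola (Cousens-type) parameterisation with illustrative parameter values; the relative yield, double-scaled and density-scaled forms are not reproduced here.

```python
import numpy as np

def linear_loss(density, i, a):
    """Linear model: yield loss (%) rises by i per weed, capped at the maximum loss a."""
    return np.minimum(i * density, a)

def hyperbolic_loss(density, i, a):
    """Rectangular-hyperbola model: i is the loss per weed as density -> 0,
    a is the asymptotic maximum yield loss (%) at very high density."""
    return i * density / (1.0 + i * density / a)

densities = np.array([1.0, 5.0, 20.0, 100.0, 400.0])  # weeds per m^2, illustrative
for name, f in (("linear", linear_loss), ("hyperbolic", hyperbolic_loss)):
    print(f"{name:10s}", np.round(f(densities, i=1.0, a=60.0), 1))
```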

Abstract:

Mycobacterium leprae, which has undergone reductive evolution leaving behind a minimal set of essential genes, has retained intervening sequences in four of its genes, implicating a vital role for them in the survival of the leprosy bacillus. A single in-frame intervening sequence has been found embedded within its recA gene. Comparison of the M. leprae recA intervening sequence with known intervening sequences indicated that it has the consensus amino acid sequence necessary to be a LAGLIDADG-type homing endonuclease. In light of the massive gene decay and loss of function in the leprosy bacillus, we sought to investigate whether its recA intervening sequence encodes a catalytically active homing endonuclease. Here we show that the purified M. leprae RecA intein (PI-MleI) binds to cognate DNA and displays endonuclease activity in the presence of alternative divalent cations, Mg2+ or Mn2+. A combination of approaches, including four complementary footprinting assays (DNase I, Cu/phenanthroline, methylation protection and KMnO4), enhancement of 2-aminopurine fluorescence and mapping of the cleavage site, revealed that PI-MleI binds to cognate DNA flanking its insertion site, induces helical distortion at the cleavage site and generates two staggered double-strand breaks. Taken together, these results indicate that PI-MleI possesses a modular structure with separate domains for DNA target recognition and cleavage, each with distinct sequence preferences. From a biological standpoint, it is tempting to speculate that our findings have implications for understanding the evolution of the LAGLIDADG family of homing endonucleases.

Abstract:

Heat stress can cause sterility in sorghum, and the anticipated increase in the frequency of high temperature events implies increasing risk to sorghum productivity in Australia. Here we summarise our research on specific varietal attributes associated with heat stress tolerance in sorghum and evaluate, through crop simulation analysis, how they might affect yield outcomes in production environments. We have recently conducted a range of controlled environment and field experiments to study the physiology and genetics of high temperature effects on growth and development of sorghum. Sorghum seed set was reduced by high temperature effects (>36–38°C) on pollen germination around flowering, but genotypes differed in their tolerance to high temperature stress. These effects were quantified in a manner that enabled their incorporation into the APSIM sorghum crop model. Simulation analysis indicated that the risk of high temperature damage and yield loss depended on sowing date and variety. While climate trends will exacerbate high temperature effects, avoidance through crop management and genetic tolerance appears possible.
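
The abstract does not give the fitted response, but such effects are typically incorporated into a crop model as a seed-set fraction that declines once flowering-period temperature exceeds a threshold. The sketch below assumes that form; the threshold and the temperature at which seed set reaches zero are placeholders, not the published APSIM parameters.

```python
def seed_set_fraction(tmax_c, threshold=36.0, zero_at=44.0):
    """Fraction of seed set retained at a given daily maximum temperature (deg C).
    No damage below `threshold`; linear decline to zero at `zero_at`.
    Both values are illustrative placeholders."""
    if tmax_c <= threshold:
        return 1.0
    if tmax_c >= zero_at:
        return 0.0
    return 1.0 - (tmax_c - threshold) / (zero_at - threshold)

# Apply over a flowering window of daily maxima, e.g. read from a weather file.
flowering_tmax = [34.0, 37.5, 41.0, 39.0, 35.0]
window_seed_set = sum(seed_set_fraction(t) for t in flowering_tmax) / len(flowering_tmax)
print(f"mean seed-set fraction over flowering window: {window_seed_set:.2f}")
```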

Abstract:

A central tenet in the theory of reliability modelling is the quantification of the probability of asset failure. In general, reliability depends on asset age and the maintenance policy applied. Usually, failure and maintenance times are the primary inputs to reliability models. However, in many organisations, different aspects of these data are recorded in different databases (e.g. work order notifications, event logs, condition monitoring data, and process control data). These records cannot be interpreted individually, since they typically do not contain all the information necessary to ascertain failure and preventive maintenance times. This paper presents a methodology for extracting failure and preventive maintenance times from commonly available, real-world data sources. A text-mining approach is employed to extract keywords indicative of the source of the maintenance event. Using these keywords, a Naïve Bayes classifier is then applied to attribute each machine stoppage to one of two classes: failure or preventive maintenance. The accuracy of the algorithm is assessed and the classified failure time data are presented. The applicability of the methodology is demonstrated on a maintenance data set from an Australian electricity company.
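
A minimal sketch of the classification step described here — keyword features from stoppage text fed to a Naïve Bayes classifier — using scikit-learn. The example records and labels are invented; this is an illustration of the technique, not the authors' pipeline.

```python
# Classify maintenance work-order text as 'failure' vs 'preventive' (toy example).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_text = [
    "bearing seized, unplanned shutdown of conveyor",
    "motor tripped on overload, replaced burnt winding",
    "scheduled lubrication and inspection of gearbox",
    "routine filter change during planned outage",
]
train_label = ["failure", "failure", "preventive", "preventive"]

model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(train_text, train_label)

new_stoppages = ["pump failed due to cracked impeller",
                 "annual planned maintenance of transformer"]
print(list(zip(new_stoppages, model.predict(new_stoppages))))
```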

Abstract:

Progress in crop improvement is limited by the ability to identify favourable combinations of genotypes (G) and management practices (M) in relevant target environments (E), given the resources available to search among the myriad of possible combinations. To underpin yield advance we require prediction of phenotype based on genotype. In plant breeding, traditional phenotypic selection methods have involved measuring the phenotypic performance of large segregating populations in multi-environment trials and applying rigorous statistical procedures based on quantitative genetic theory to identify superior individuals. Recent developments in the ability to inexpensively and densely map and sequence genomes have facilitated a shift from the level of the individual (genotype) to the level of the genomic region. Molecular breeding strategies using genome-wide prediction and genomic selection approaches have developed rapidly. However, their applicability to complex traits remains constrained by gene–gene and gene–environment interactions, which restrict the predictive power of associations between genomic regions and phenotypic responses. Here it is argued that crop ecophysiology and functional whole-plant modelling can provide an effective link between the molecular and organism scales and enhance molecular breeding by adding value to genetic prediction approaches. A physiological framework that facilitates dissection and modelling of complex traits can inform phenotyping methods for marker/gene detection and underpin prediction of the likely phenotypic consequences of trait and genetic variation in target environments. This approach holds considerable promise for more effectively linking genotype to phenotype for complex adaptive traits. Specific examples focused on drought adaptation are presented to highlight the concepts.
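
As context for the genome-wide prediction approaches mentioned, their statistical core is a whole-genome regression of phenotype on marker scores. The sketch below is a minimal ridge-regression version on simulated markers and effects; it is illustrative only and not the analysis argued for in the paper.

```python
# Minimal genome-wide prediction sketch: ridge regression of phenotype on SNP markers.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
n_lines, n_markers = 200, 1000
X = rng.integers(0, 3, size=(n_lines, n_markers)).astype(float)   # 0/1/2 allele counts
true_effects = rng.normal(0.0, 0.05, size=n_markers)
y = X @ true_effects + rng.normal(0.0, 1.0, size=n_lines)          # phenotype = genetics + noise

train, test = slice(0, 150), slice(150, None)
model = Ridge(alpha=100.0).fit(X[train], y[train])
r = np.corrcoef(model.predict(X[test]), y[test])[0, 1]
print(f"predictive correlation in held-out lines: {r:.2f}")
```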

Abstract:

The phosphine distribution in a cylindrical silo containing grain is predicted. A three-dimensional mathematical model, which accounts for multicomponent gas-phase transport and the sorption of phosphine into the grain kernel, is developed. In addition, a simple model is presented to describe the death of insects within the grain as a function of their exposure to phosphine gas. The proposed model is solved using the commercially available computational fluid dynamics (CFD) software FLUENT, together with our own C code to customise the solver in order to incorporate the models for sorption and insect mortality. Two types of fumigation delivery are studied, namely fan-forced from the base of the silo and tablet from the top of the silo. An analysis of the predicted phosphine distribution shows that during fan-forced fumigation, the position of the leaky area is very important to the development of the gas flow field and the phosphine distribution in the silo. If the leak is in the lower section of the silo, insects near the top of the silo may not be eradicated. However, the position of a leak does not affect the phosphine distribution during tablet fumigation. For such fumigation in a typical silo configuration, phosphine concentrations remain low near the base of the silo. Furthermore, we find that half-life pressure test readings are not an indicator of phosphine distribution during tablet fumigation.
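
The abstract describes the insect mortality model only as a function of exposure to phosphine. One common way to express such dose–response behaviour is through the accumulated concentration×time (Ct) product; the sketch below assumes that form, and the rate constant and exposure profile are placeholders rather than the paper's fitted values.

```python
import numpy as np

def surviving_fraction(concentration_mg_m3, dt_hours, k=0.002):
    """Toy exposure model: survival decays exponentially with the accumulated
    concentration x time (Ct) product. The rate constant k is an illustrative placeholder."""
    ct = np.cumsum(np.asarray(concentration_mg_m3) * dt_hours)   # mg h / m^3
    return np.exp(-k * ct)

# Hourly phosphine concentrations at one location in the grain bulk (illustrative).
conc = np.full(72, 300.0)                       # 300 mg/m^3 held for 72 hours
print(surviving_fraction(conc, dt_hours=1.0)[-1])   # fraction still alive after 72 h
```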

Abstract:

A generalization of the isotropic theory of Batchelor & Proudman (1954) is developed to estimate the effect of sudden but arbitrary three-dimensional distortion on homogeneous, initially axisymmetric turbulence. The energy changes due to distortion are expressed in terms of the Fourier coefficients of an expansion in zonal harmonics of the two independent scalar functions that describe the axisymmetric spectral tensor. However, for two special but non-trivial forms of this tensor, which represent possibly the simplest kinds of non-isotropic turbulence and specify the angular distribution but not the wavenumber dependence, the energy ratios have been determined in closed form. The deviation of the ratio from its isotropic value is the product of a factor containing R, the initial value of the ratio of the longitudinal to the transverse energy component, and another factor depending only on the geometry of the distortion. It is found that, in axisymmetric and large two-dimensional contractions, the isotropic theory gives nearly the correct longitudinal energy but (when R > 1) over-estimates the increase in the transverse energy; the product of the two intensities varies little unless the distortion is very large, thus accounting for the stress-freezing observed in rapidly accelerated shear flows. Comparisons with available experimental data for the spectra and for the energy ratios show reasonable agreement. The different ansatzes predict results in broad qualitative agreement with a simple stratagem suggested by Reynolds & Tucker (1975), but the quantitative differences are not always negligible.
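
In symbols, the closed-form result described above has the schematic structure (the symbols below are introduced here only to restate the sentence; the explicit forms of F and G are given in the paper and are not reproduced):

\[
\varepsilon - \varepsilon_{\mathrm{iso}} \;=\; F(R)\, G(\mathcal{D}),
\]

where \(\varepsilon\) is the energy ratio after distortion, \(\varepsilon_{\mathrm{iso}}\) its value under the isotropic theory, \(R\) the initial longitudinal-to-transverse energy ratio, and \(\mathcal{D}\) the geometry of the applied distortion.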

Abstract:

The noted 19th-century biologist Ernst Haeckel put forward the idea that the growth (ontogenesis) of an organism recapitulates the history of its evolutionary development. While this idea is defunct within biology, it has been promoted in areas such as education (the idea of an education being a repetition of the civilisations that came before). In the research presented in this paper, recapitulation is used as a metaphor within computer-aided design as a way of grouping together different generations of spatial layouts. In most CAD programs, a spatial layout is represented as a series of objects (lines, or boundary representations) that stand in for walls. The relationships between spaces are not usually explicitly stated. A representation using Lindenmayer systems (originally designed for modelling plant morphology) is put forward as a way of representing the morphology of a spatial layout. The aim of this research is not just to describe an individual layout, but to find representations that link together lineages of development. This representation can be used in generative design as a way of creating more meaningful layouts with particular characteristics. The use of genetic operators (mutation and crossover) is also considered, making the representation suitable for use with genetic algorithms.
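
A minimal illustration of the underlying mechanism — an L-system as a parallel string-rewriting grammar, applied here to made-up symbols standing for spaces rather than plant organs. The symbols and rules are hypothetical placeholders, not the grammar used in the paper.

```python
# Minimal L-system rewriting: every symbol is replaced in parallel each generation.
# H = hall, R = room, D = door are invented placeholder symbols.
rules = {
    "H": "H D R",   # a hall grows a door leading to a new room
    "R": "R",       # rooms persist unchanged
    "D": "D",
}

def rewrite(axiom, generations):
    symbols = axiom.split()
    for _ in range(generations):
        symbols = [s for token in symbols for s in rules.get(token, token).split()]
    return " ".join(symbols)

for g in range(4):
    print(f"gen {g}: {rewrite('H', g)}")
```

Successive generations form the "lineage" of a layout, which is what makes mutation and crossover on the rule set natural genetic operators.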

Abstract:

The phenomenon of adsorption is governed by the various interactions among the constituents of the interface, and the forms of adsorption isotherms hold the clue to the nature of these interactions. An understanding of this phenomenon may be said to be complete only when the parameters occurring in such expressions for isotherms are interpretable in terms of molecular/electronic interactions. This objective, viz. expressing the composition of the isotherm parameters through microscopic modelling, is by no means a simple one. The task is made particularly difficult in the case of charged interfaces, where idealisation is difficult to make and, when made, not so easy to justify.
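
A familiar example of the kind of isotherm–parameter relationship in question (given only as an illustration; the paper's charged-interface treatment is more involved) is the Langmuir isotherm,

\[
\theta \;=\; \frac{K c}{1 + K c}, \qquad K = \exp\!\left(-\Delta G_{\mathrm{ads}}/RT\right),
\]

where the surface coverage \(\theta\) at bulk concentration \(c\) is governed by a single parameter \(K\) that is directly interpretable as a free energy of adsorption.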

Abstract:

The term acclimation has been used with several connotations in the field of acclimatory physiology. An attempt is made in this paper to define the term "acclimation" precisely, for effective modelling of acclimatory processes. Acclimation is defined, with respect to a specific variable, as the cumulative experience gained by the organism when subjected to a step change in the environment. Experimental observations on a large number of variables in animals exposed to sustained stress show that, after an initial deviation from the basal value (defined as "growth"), the variables tend to return to basal levels (defined as "decay"). This forms the basis for modelling biological responses in terms of their growth and decay. Hierarchical systems theory, as presented by Mesarovic, Macko & Takahara (1970), facilitates modelling of complex and partially characterised systems. This theory, in conjunction with "growth–decay" analysis of biological variables, is used to model the temperature-regulating system in animals exposed to cold. The approach appears to be applicable at all levels of biological organization. Regulation of hormonal activity, which forms a part of the temperature-regulating system, and the relationship of the latter with the "energy" system of the animal, of which it forms a part, are also effectively modelled by this approach. It is believed that this systematic approach would eliminate much of the current circular thinking in the area of acclimatory physiology.
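
One simple functional form consistent with the growth-then-decay behaviour described (an assumption for illustration; the paper works at the level of hierarchical systems theory rather than prescribing this curve) is a difference of exponentials,

\[
x(t) - x_{\mathrm{basal}} \;=\; A\left(e^{-t/\tau_{\mathrm{decay}}} - e^{-t/\tau_{\mathrm{growth}}}\right), \qquad \tau_{\mathrm{decay}} > \tau_{\mathrm{growth}},
\]

which deviates from the basal value after the step change in environment and then returns towards it as the acclimatory response develops.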

Abstract:

Recommender systems assist users in finding what they want. The challenging issue is how to efficiently acquire user preferences, or user information needs, for building personalised recommender systems. This research explores the acquisition of user preferences using data taxonomy information to enhance personalised recommendations and alleviate the cold-start problem. A concept hierarchy model is proposed, which provides a two-dimensional hierarchy for acquiring user preferences. The language model is also extended for the proposed hierarchy in order to generate an effective recommendation algorithm. Amazon.com book and music datasets are both used to evaluate the proposed approach, and the experimental results show that the approach is promising.
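
A rough sketch of the general idea of lifting sparse item-level evidence to category level so that a new (cold-start) user can still be scored. The paper's two-dimensional hierarchy and extended language model are not reproduced; the taxonomy, counts and smoothing below are invented placeholders.

```python
from collections import Counter

# Invented two-level taxonomy: item -> category (not the paper's hierarchy).
taxonomy = {
    "dune": "science fiction", "neuromancer": "science fiction",
    "emma": "classic fiction", "persuasion": "classic fiction",
}

def category_preferences(purchased_items):
    """Lift sparse item-level evidence to category level, as a crude stand-in
    for acquiring preferences through a concept hierarchy."""
    counts = Counter(taxonomy[i] for i in purchased_items if i in taxonomy)
    total = sum(counts.values()) or 1
    return {cat: n / total for cat, n in counts.items()}

def score(candidate_item, prefs, smoothing=0.05):
    """Score an unseen item by its category's preference mass, with flat smoothing."""
    return prefs.get(taxonomy.get(candidate_item, ""), 0.0) + smoothing

prefs = category_preferences(["dune"])          # a cold-start user with one purchase
for item in ["neuromancer", "emma"]:
    print(item, round(score(item, prefs), 3))
```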