947 results for MEAN-FIELD MODELS


Relevance: 30.00%

Abstract:

Options for the integrated management of white blister (caused by Albugo candida) of Brassica crops include the use of well-timed overhead irrigation, resistant cultivars, programs of weekly fungicide sprays or strategic fungicide applications based on the disease risk prediction model, Brassicaspot™. Initial systematic surveys of radish producers near Melbourne, Victoria, indicated that crops irrigated overhead in the morning (0800-1200 h) had a lower incidence of white blister than those irrigated overhead in the evening (2000-2400 h). A field trial was conducted from July to November 2008 on a broccoli crop located west of Melbourne to determine the efficacy and economics of different practices used for white blister control, modifying irrigation timing, growing a resistant cultivar and timing spray applications based on Brassicaspot™. Growing the resistant cultivar, 'Tyson', instead of the susceptible cultivar, 'Ironman', reduced disease incidence on broccoli heads by 99%. Overhead irrigation at 0400 h instead of 2000 h reduced disease incidence by 58%. A weekly spray program or a spray regime based on either of two versions of the Brassicaspot™ model provided similar disease control and reduced disease incidence by 72 to 83%. However, use of the Brassicaspot™ models greatly reduced the number of sprays required for control from 14 to one or two. An economic analysis showed that growing the more resistant cultivar increased farm profit per ha by 12%, choosing morning irrigation by 3% and using the disease risk predictive models compared with weekly sprays by 15%. The disease risk predictive models were 4% more profitable than the unsprayed control.

Relevance: 30.00%

Abstract:

A zonally averaged version of the Goddard Laboratory for Atmospheric Sciences (GLAS) climate model is used to study the sensitivity of the northern hemisphere (NH) summer mean meridional circulation to changes in the large scale eddy forcing. A standard solution is obtained by prescribing the latent heating field and climatological horizontal transports of heat and momentum by the eddies. The radiative heating and surface fluxes are calculated by model parameterizations. This standard solution is compared with the results of several sensitivity studies. When the eddy forcing is reduced to 0.5 times or increased to 1.5 times the climatological values, the strength of the Ferrel cells decreases or increases proportionally. It is also seen that such changes in the eddy forcing can influence the strength of the NH Hadley cell significantly. The possible impact of such changes in the large scale eddy forcing on the monsoon circulation via changes in the Hadley circulation is discussed. Sensitivity experiments including only one component of eddy forcing at a time show that the eddy momentum fluxes seem to be more important in maintaining the Ferrel cells than the eddy heat fluxes. In the absence of the eddy heat fluxes, the observed eddy momentum fluxes alone produce subtropical westerly jets which are weaker than those in the standard solution. On the other hand, the observed eddy heat fluxes alone produce subtropical westerly jets which are stronger than those in the standard solution.

Relevance: 30.00%

Abstract:

This thesis consists of an introduction, four research articles and an appendix. The thesis studies relations between two different approaches to the continuum limit of models of two-dimensional statistical mechanics at criticality. The approach of conformal field theory (CFT) can be thought of as the algebraic classification of some basic objects in these models; it has been successfully used by physicists since the 1980s. The other approach, Schramm-Loewner evolutions (SLEs), is a recently introduced set of mathematical methods to study random curves or interfaces occurring in the continuum limit of the models. The first and second included articles argue, on the basis of statistical mechanics, what would be a plausible relation between SLEs and conformal field theory. The first article studies multiple SLEs, that is, several random curves simultaneously in a domain. The proposed definition is compatible with a natural commutation requirement suggested by Dubédat. The curves of multiple SLE may form different topological configurations, "pure geometries". We conjecture a relation between the topological configurations and the CFT concepts of conformal blocks and operator product expansions. Example applications of multiple SLEs include crossing probabilities for percolation and the Ising model. The second article studies SLE variants that represent models with boundary conditions implemented by primary fields. The most well known of these, SLE(kappa, rho), is shown to be simple in terms of the Coulomb gas formalism of CFT. In the third article the space of local martingales for variants of SLE is shown to carry a representation of the Virasoro algebra. Finding this structure is guided by the relation of SLEs and CFTs in general, but the result is established in a straightforward fashion. This article, too, emphasizes multiple SLEs and proposes a possible way of treating pure geometries in terms of the Coulomb gas. The fourth article states results of applications of the Virasoro structure to the open questions of SLE reversibility and duality. Proofs of the stated results are provided in the appendix. The objective is an indirect computation of certain polynomial expected values. Provided that these expected values exist, in generic cases they are shown to possess the desired properties, thus giving support for both reversibility and duality.
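
For readers new to the topic, a minimal statement of the standard chordal SLE definition that work of this kind builds on (this is the textbook form, not quoted from the thesis): the random curve is encoded by conformal maps solving the Loewner equation driven by scaled Brownian motion.

```latex
% Chordal SLE(kappa): the conformal maps g_t satisfy the Loewner equation,
% with driving process W_t = sqrt(kappa) B_t for standard Brownian motion B_t.
\[
  \partial_t g_t(z) \;=\; \frac{2}{g_t(z) - W_t},
  \qquad g_0(z) = z,
  \qquad W_t = \sqrt{\kappa}\, B_t .
\]
```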

Relevance: 30.00%

Abstract:

Genetics, the science of heredity and variation in living organisms, has a central role in medicine, in breeding crops and livestock, and in studying fundamental topics of the biological sciences such as evolution and cell functioning. The field of genetics is currently undergoing rapid development because of recent advances in technologies by which molecular data can be obtained from living organisms. To extract the most information from such data, the analyses need to be carried out using statistical models that are tailored to the particular genetic processes. In this thesis we formulate and analyze Bayesian models for genetic marker data of contemporary individuals. The major focus is on modeling the unobserved recent ancestry of the sampled individuals (say, for tens of generations or so), which is carried out by using explicit probabilistic reconstructions of the pedigree structures accompanied by the gene flows at the marker loci. For such a recent history, the recombination process is the major genetic force that shapes the genomes of the individuals, and it is included in the model by assuming that the recombination fractions between the adjacent markers are known. The posterior distribution of the unobserved history of the individuals is studied conditionally on the observed marker data by using a Markov chain Monte Carlo (MCMC) algorithm. The example analyses consider estimation of the population structure, relatedness structure (both at the level of whole genomes and at each marker separately), and haplotype configurations. For situations where the pedigree structure is partially known, an algorithm to create an initial state for the MCMC algorithm is given. Furthermore, the thesis includes an extension of the model for the recent genetic history to situations where a quantitative phenotype has also been measured from the contemporary individuals. In that case the goal is to identify positions on the genome that affect the observed phenotypic values. This task is carried out within the Bayesian framework, where the number and the relative effects of the quantitative trait loci are treated as random variables whose posterior distribution is studied conditionally on the observed genetic and phenotypic data. In addition, the thesis contains an extension of a widely used haplotyping method, the PHASE algorithm, to settings where genetic material from several individuals has been pooled together and the allele frequencies of each pool are determined in a single genotyping.

Relevance: 30.00%

Abstract:

Objectives: Decision support tools (DSTs) for invasive species management have had limited success in producing convincing results and meeting users' expectations. The problems could be linked to the functional form of the model which represents the dynamic relationship between the invasive species and crop yield loss in the DSTs. The objectives of this study were: a) to compile and review the models tested in field experiments and applied to DSTs; and b) to empirically evaluate some popular models and alternatives.

Design and methods: This study surveyed the literature and documented strengths and weaknesses of the functional forms of yield loss models. Some widely used models (linear, relative yield and hyperbolic models) and two potentially useful models (the double-scaled and density-scaled models) were evaluated for a wide range of weed densities, maximum potential yield loss and maximum yield loss per weed.

Results: Popular functional forms include hyperbolic, sigmoid, linear, quadratic and inverse models. Many basic models were modified to account for the effect of important factors (weather, tillage and growth stage of crop at weed emergence) influencing weed–crop interaction and to improve prediction accuracy. This limited their applicability for use in DSTs as they became less generalized in nature and often were applicable to a much narrower range of conditions than would be encountered in the use of DSTs. These factors' effects could be better accounted for by using other techniques. Among the models empirically assessed, the linear model is a very simple model which appears to work well at sparse weed densities, but it produces unrealistic behaviour at high densities. The relative-yield model exhibits expected behaviour at high densities and high levels of maximum yield loss per weed but probably underestimates yield loss at low to intermediate densities. The hyperbolic model demonstrated reasonable behaviour at lower weed densities, but produced biologically unreasonable behaviour at low rates of loss per weed and high yield loss at the maximum weed density. The density-scaled model is not sensitive to the yield loss at maximum weed density in terms of the number of weeds that will produce a certain proportion of that maximum yield loss. The double-scaled model appeared to produce more robust estimates of the impact of weeds under a wide range of conditions.

Conclusions: Previously tested functional forms exhibit problems for use in DSTs for crop yield loss modelling. Of the models evaluated, the double-scaled model exhibits desirable qualitative behaviour under most circumstances.
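
As a rough illustration of why a linear form breaks down at high weed densities while a saturating form does not, here is a minimal sketch of two commonly cited yield loss functions: the simple linear model and Cousens' rectangular hyperbola. The parameter values are illustrative only, and the double-scaled and density-scaled forms evaluated in the study are not reproduced here because their exact equations are not given in the abstract.

```python
# Hedged sketch: two standard weed-crop yield loss functions from the
# weed-science literature; parameters i (loss per weed at low density) and
# a (asymptotic maximum percent loss) are illustrative.
import numpy as np

def yield_loss_linear(density, i):
    """Percent yield loss = i * weed density (sensible only at sparse densities)."""
    return i * density

def yield_loss_hyperbolic(density, i, a):
    """Cousens' hyperbolic model: loss rises at rate i per weed at low density
    and saturates at the asymptotic maximum loss a (percent)."""
    return i * density / (1.0 + i * density / a)

densities = np.array([0, 5, 10, 50, 200, 1000])        # weeds per m^2 (illustrative)
print(yield_loss_linear(densities, i=0.5))              # exceeds 100% at high density
print(yield_loss_hyperbolic(densities, i=0.5, a=60.0))  # saturates near a
```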

Relevance: 30.00%

Abstract:

Bats of the genus Pteropus (flying-foxes) are the natural host of Hendra virus (HeV) which periodically causes fatal disease in horses and humans in Australia. The increased urban presence of flying-foxes often provokes negative community sentiments because of reduced social amenity and concerns of HeV exposure risk, and has resulted in calls for the dispersal of urban flying-fox roosts. However, it has been hypothesised that disturbance of urban roosts may result in a stress-mediated increase in HeV infection in flying-foxes, and an increased spillover risk. We sought to examine the impact of roost modification and dispersal on HeV infection dynamics and cortisol concentration dynamics in flying-foxes. The data were analysed in generalised linear mixed models using restricted maximum likelihood (REML). The difference in mean HeV prevalence in samples collected before (4.9%), during (4.7%) and after (3.4%) roost disturbance was small and non-significant (P = 0.440). Similarly, the difference in mean urine specific gravity-corrected urinary cortisol concentrations was small and non-significant (before = 22.71 ng/mL, during = 27.17, after = 18.39) (P = 0.550). We did find an underlying association between cortisol concentration and season, and cortisol concentration and region, suggesting that other (plausibly biological or environmental) variables play a role in cortisol concentration dynamics. The effect of roost disturbance on cortisol concentration approached statistical significance for region, suggesting that the relationship is not fixed, and plausibly reflecting the nature and timing of disturbance. We also found a small positive statistical association between HeV excretion status and urinary cortisol concentration. Finally, we found that the level of flying-fox distress associated with roost disturbance reflected the nature and timing of the activity, highlighting the need for a ‘best practice’ approach to dispersal or roost modification activities. The findings usefully inform public discussion and policy development in relation to Hendra virus and flying-fox management.
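
For readers unfamiliar with the analysis, a minimal sketch of how a cortisol response could be modelled with a mixed model fitted by REML, using disturbance phase as a fixed effect and a random intercept per roost. The data frame, column names and values below are hypothetical, and this linear mixed model is only a simplified stand-in for the generalised linear mixed models actually used in the study.

```python
# Hypothetical sketch: linear mixed model for urinary cortisol fitted by REML
# (statsmodels' default estimation method). All data are invented.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "cortisol": [22.7, 27.2, 18.4, 25.0, 30.1, 19.8, 21.3, 26.5, 17.9, 23.8, 28.4, 20.2],
    "phase":    ["before", "during", "after"] * 4,      # roost disturbance phase
    "roost":    ["A"] * 3 + ["B"] * 3 + ["C"] * 3 + ["D"] * 3,
})

# Fixed effect: phase; random intercept: roost (grouping factor).
model = smf.mixedlm("cortisol ~ phase", df, groups=df["roost"])
result = model.fit()   # REML by default
print(result.summary())
```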

Relevance: 30.00%

Abstract:

Flight directionality of the rust-red flour beetle, Tribolium castaneum (Herbst) (Coleoptera: Tenebrionidae), was investigated under glasshouse and field conditions using sticky traps placed around dense experimental infestations of T. castaneum derived from field-collected samples. Although beetles of this species are known to fly quite readily, information on flight of beetles away from grain resources is limited. Under still glasshouse conditions, T. castaneum did not demonstrate strong horizontal or vertical trajectories in its initial flight behaviour. Flight was significantly directional in half of the replicates, but trapped beetles were only weakly concentrated around the mean direction of flight. In the field, by contrast, emigration of T. castaneum was strongly directional soon after flight initiation. The mean vector lengths were generally >0.5, which indicates that trapped beetles were strongly concentrated around the calculated mean flight direction. A circular-circular regression of mean flight vs. mean downwind direction suggested that flight direction was generally correlated with downwind direction. The mean height at which T. castaneum individuals initially flew was 115.4 ± 7.0 cm, with 58.3% of beetles caught no more than 1 m above the ground. The height at which beetles were trapped did not correlate with wind speed at the time of sampling, but the data do indicate that wind speed significantly affected T. castaneum flight initiation, because no beetles (or very few; no more than three) were trapped in the field when the mean wind speed was above 3 m s⁻¹. This study thus demonstrates that wind speed and direction are both important aspects of flight behaviour of T. castaneum, and therefore of the spatio-temporal dynamics of this species.
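
A minimal sketch of the circular statistics behind these results: the mean flight direction and the mean vector length R, where values of R near 1 indicate bearings tightly concentrated around the mean. The trap bearings used here are made up, and the circular-circular regression against downwind direction is not reproduced.

```python
# Hedged sketch: mean direction and mean resultant vector length for a set of
# trap bearings (degrees clockwise from north). Bearings are illustrative.
import numpy as np

def circular_mean_and_r(bearings_deg):
    theta = np.deg2rad(bearings_deg)
    c, s = np.cos(theta).sum(), np.sin(theta).sum()
    n = len(theta)
    r = np.hypot(c, s) / n                      # mean vector length, 0..1
    mean_dir = np.rad2deg(np.arctan2(s, c)) % 360.0
    return mean_dir, r

bearings = [350, 10, 20, 355, 5, 15]            # tightly clustered around north
print(circular_mean_and_r(bearings))            # r close to 1 => strong directionality
```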

Relevance: 30.00%

Abstract:

Few data exist on direct greenhouse gas emissions from pen manure at beef feedlots. However, emission inventories attempt to account for these emissions. This study used a large chamber to isolate N₂O and CH₄ emissions from pen manure at two Australian commercial beef feedlots (stocking densities, 13–27 m² per head) and related these emissions to a range of potential emission control factors, including masses and concentrations of volatile solids, NO₃⁻, total N, NH₄⁺, and organic C (OC), and additional factors such as total manure mass, cattle numbers, manure pack depth and density, temperature, and moisture content. Mean measured pen N₂O emissions were 0.428 kg ha⁻¹ d⁻¹ (95% confidence interval [CI], 0.252-0.691) and 0.00405 kg ha⁻¹ d⁻¹ (95% CI, 0.00114-0.0110) for the northern and southern feedlots, respectively. Mean measured CH₄ emission was 0.236 kg ha⁻¹ d⁻¹ (95% CI, 0.163-0.332) for the northern feedlot and 3.93 kg ha⁻¹ d⁻¹ (95% CI, 2.58-5.81) for the southern feedlot. Nitrous oxide emission increased with density, pH, temperature, and manure mass, whereas negative relationships were evident with moisture and OC. Strong relationships were not evident between N₂O emission and masses or concentrations of NO₃⁻ or total N in the manure. This is significant because many standard inventory calculation protocols predict N₂O emissions using the mass of N excreted by the animal.

Relevance: 30.00%

Abstract:

Downscaling to station-scale hydrologic variables from large-scale atmospheric variables simulated by general circulation models (GCMs) is usually necessary to assess the hydrologic impact of climate change. This work presents CRF-downscaling, a new probabilistic downscaling method that represents the daily precipitation sequence as a conditional random field (CRF). The conditional distribution of the precipitation sequence at a site, given the daily atmospheric (large-scale) variable sequence, is modeled as a linear chain CRF. CRFs do not make assumptions on independence of observations, which gives them flexibility in using high-dimensional feature vectors. Maximum likelihood parameter estimation for the model is performed using limited memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization. Maximum a posteriori estimation is used to determine the most likely precipitation sequence for a given set of atmospheric input variables using the Viterbi algorithm. Direct classification of dry/wet days as well as precipitation amount is achieved within a single modeling framework. The model is used to project the future cumulative distribution function of precipitation. Uncertainty in precipitation prediction is addressed through a modified Viterbi algorithm that predicts the n most likely sequences. The model is applied for downscaling monsoon (June-September) daily precipitation at eight sites in the Mahanadi basin in Orissa, India, using the MIROC3.2 medium-resolution GCM. The predicted distributions at all sites show an increase in the number of wet days, and also an increase in wet day precipitation amounts. A comparison of current and future predicted probability density functions for daily precipitation shows a change in shape of the density function with decreasing probability of lower precipitation and increasing probability of higher precipitation.
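
To make the decoding step concrete, here is a minimal sketch of the Viterbi algorithm for a linear-chain model, recovering the highest-scoring dry/wet sequence from per-day scores. The score matrices are hypothetical; the study's CRF features, L-BFGS training and n-best (modified Viterbi) extension are not reproduced.

```python
# Hedged sketch: Viterbi decoding for a linear-chain model with illustrative
# emission and transition scores (log-probabilities); labels: 0 = dry, 1 = wet.
import numpy as np

def viterbi(emission, transition):
    """emission: (T, K) per-day label scores; transition: (K, K) scores.
    Returns the highest-scoring label sequence of length T."""
    T, K = emission.shape
    delta = np.zeros((T, K))
    backptr = np.zeros((T, K), dtype=int)
    delta[0] = emission[0]
    for t in range(1, T):
        cand = delta[t - 1][:, None] + transition + emission[t][None, :]
        backptr[t] = cand.argmax(axis=0)   # best previous label for each current label
        delta[t] = cand.max(axis=0)
    path = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(backptr[t, path[-1]]))
    return path[::-1]

emission = np.log(np.array([[0.8, 0.2], [0.4, 0.6], [0.3, 0.7]]))
transition = np.log(np.array([[0.7, 0.3], [0.4, 0.6]]))
print(viterbi(emission, transition))   # -> [0, 1, 1]
```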

Relevance: 30.00%

Abstract:

Despite positive testing in animal studies, more than 80% of novel drug candidates fail to prove their efficacy when tested in humans. This is primarily due to the use of preclinical models that are not able to recapitulate the physiological or pathological processes in humans. Hence, one of the key challenges in the field of translational medicine is to “make the model organism mouse more human.” To get answers to questions that would be prognostic of outcomes in human medicine, the mouse's genome can be altered in order to create a more permissive host that allows the engraftment of human cell systems. It has been shown in the past that these strategies can improve our understanding of tumor immunology. However, the translational benefits of these platforms have still to be proven. In the 21st century, several research groups and consortia around the world have taken up the challenge to improve our understanding of how to humanize the animal's genetic code, its cells and, based on tissue engineering principles, its extracellular microenvironment, its tissues, or entire organs, with the ultimate goal to foster the translation of new therapeutic strategies from bench to bedside. This article provides an overview of the state of the art of humanized models of tumor immunology and highlights future developments in the field, such as the application of tissue engineering and regenerative medicine strategies to further enhance humanized murine model systems.

Relevance: 30.00%

Abstract:

Genetic and environmental factors affect white matter connectivity in the normal brain, and they also influence diseases in which brain connectivity is altered. Little is known about genetic influences on brain connectivity, despite wide variations in the brain's neural pathways. Here we applied the 'DICCCOL' framework to analyze structural connectivity in 261 twin pairs (522 participants, mean age 21.8 ± 2.7 y). We encoded connectivity patterns by projecting the white matter (WM) bundles of all 'DICCCOLs' as a tracemap (TM). Next we fitted an A/C/E structural equation model to estimate additive genetic (A), common environmental (C), and unique environmental/error (E) components of the observed variations in brain connectivity. We found 44 'heritable DICCCOLs' whose connectivity was genetically influenced (a² > 1%); half of them showed significant heritability (a² > 20%). Our analysis of genetic influences on WM structural connectivity suggests high heritability for some WM projection patterns, yielding new targets for genome-wide association studies.
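
For intuition about the A/C/E decomposition, a minimal sketch of the classical Falconer approximation, which splits trait variance using monozygotic (MZ) and dizygotic (DZ) twin correlations. This is a textbook shortcut, not the maximum-likelihood structural equation model fitted in the study, and the correlations used below are made up.

```python
# Hedged sketch: Falconer's approximation of the A/C/E variance components
# from MZ and DZ twin correlations (illustrative values only).
def falconer_ace(r_mz, r_dz):
    a2 = 2.0 * (r_mz - r_dz)   # additive genetic share (narrow-sense heritability)
    c2 = 2.0 * r_dz - r_mz     # common (shared) environment share
    e2 = 1.0 - r_mz            # unique environment + measurement error
    return a2, c2, e2

print(falconer_ace(r_mz=0.55, r_dz=0.35))   # ≈ (0.40, 0.15, 0.45)
```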

Relevance: 30.00%

Abstract:

The electrical conduction in insulating materials is a complex process and several theories have been suggested in the literature. Many phenomenological empirical models are in use in the DC cable literature. However, beyond claims of relative accuracy, the impact of using different models for cable insulation has not been investigated until now. The steady state electric field in the DC cable insulation is known to be a strong function of DC conductivity. The DC conductivity, in turn, is a complex function of electric field and temperature. As a result, under certain conditions, the stress at the cable screen is higher than that at the conductor boundary. The paper presents detailed investigations on using different empirical conductivity models suggested in the literature for HV DC cable applications. It is expressly shown that certain models give rise to erroneous results in electric field and temperature computations. It is pointed out that the use of these models in the design or evaluation of cables will lead to errors.
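
As an illustration of what such an empirical conductivity model looks like, a minimal sketch using one commonly quoted exponential form, sigma(T, E) = sigma0 · exp(alpha·T + beta·E). The parameter values and operating conditions are purely illustrative and are not taken from the paper, which compares several such forms.

```python
# Hedged sketch: an exponential empirical conductivity model often quoted for
# HVDC cable insulation; sigma0, alpha, beta and the operating point are
# illustrative, not values from the paper.
import numpy as np

def conductivity(T, E, sigma0=1e-16, alpha=0.1, beta=0.03e-6):
    """sigma in S/m; T in deg C; E in V/m."""
    return sigma0 * np.exp(alpha * T + beta * np.abs(E))

# A strong temperature drop across the insulation changes the conductivity
# contrast between conductor and screen, which is why the peak stress can
# migrate towards the screen under load.
print(conductivity(T=70.0, E=20e6))   # hot conductor side
print(conductivity(T=40.0, E=20e6))   # cooler screen side
```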

Relevance: 30.00%

Abstract:

This work belongs to the field of computational high-energy physics (HEP). The key methods used in this thesis work to meet the challenges raised by the Large Hadron Collider (LHC) era experiments are object-orientation with software engineering, Monte Carlo simulation, the computer technology of clusters, and artificial neural networks. The first aspect discussed is the development of hadronic cascade models, used for the accurate simulation of medium-energy hadron-nucleus reactions, up to 10 GeV. These models are typically needed in hadronic calorimeter studies and in the estimation of radiation backgrounds. Various applications outside HEP include the medical field (such as hadron treatment simulations), space science (satellite shielding), and nuclear physics (spallation studies). Validation results are presented for several significant improvements released in the Geant4 simulation tool, and the significance of the new models for computing in the Large Hadron Collider era is estimated. In particular, we estimate the ability of the Bertini cascade to simulate the Compact Muon Solenoid (CMS) hadron calorimeter (HCAL). LHC test beam activity has a tightly coupled cycle of simulation-to-data analysis. Typically, a Geant4 computer experiment is used to understand test beam measurements. Another aspect of this thesis is therefore a description of studies related to developing new CMS H2 test beam data analysis tools and performing data analysis on the basis of CMS Monte Carlo events. These events have been simulated in detail using Geant4 physics models, full CMS detector description, and event reconstruction. Using the ROOT data analysis framework we have developed an offline ANN-based approach to tag b-jets associated with heavy neutral Higgs particles, and we show that this kind of NN methodology can be successfully used to separate the Higgs signal from the background in the CMS experiment.

Relevance: 30.00%

Abstract:

Cosmological inflation is the dominant paradigm in explaining the origin of structure in the universe. According to the inflationary scenario, there has been a period of nearly exponential expansion in the very early universe, long before nucleosynthesis. Inflation is commonly considered as a consequence of some scalar field or fields whose energy density starts to dominate the universe. The inflationary expansion converts the quantum fluctuations of the fields into classical perturbations on superhorizon scales, and these primordial perturbations are the seeds of the structure in the universe. Moreover, inflation also naturally explains the high degree of homogeneity and spatial flatness of the early universe. The real challenge of inflationary cosmology lies in trying to establish a connection between the fields driving inflation and theories of particle physics. In this thesis we concentrate on inflationary models at scales well below the Planck scale. The low scale allows us to seek candidates for the inflationary matter within extensions of the Standard Model but typically also implies fine-tuning problems. We discuss a low scale model where inflation is driven by a flat direction of the Minimal Supersymmetric Standard Model. The relation between the potential along the flat direction and the underlying supergravity model is studied. The low inflationary scale requires an extremely flat potential, but we find that in this particular model the associated fine-tuning problems can be solved in a rather natural fashion in a class of supergravity models. For this class of models, the flatness is a consequence of the structure of the supergravity model and is insensitive to the vacuum expectation values of the fields that break supersymmetry. Another low scale model considered in the thesis is the curvaton scenario, where the primordial perturbations originate from quantum fluctuations of a curvaton field, which is different from the fields driving inflation. The curvaton gives a negligible contribution to the total energy density during inflation but its perturbations become significant in the post-inflationary epoch. The separation between the fields driving inflation and the fields giving rise to primordial perturbations opens up new possibilities to lower the inflationary scale without introducing fine-tuning problems. The curvaton model typically gives rise to a relatively large level of non-Gaussian features in the statistics of primordial perturbations. We find that the level of non-Gaussian effects is heavily dependent on the form of the curvaton potential. Future observations that provide more accurate information about the non-Gaussian statistics can therefore place constraining bounds on the curvaton interactions.

Relevance: 30.00%

Abstract:

Within Australia, there have been many attempts to pass voluntary euthanasia (VE) or physician-assisted suicide (PAS) legislation. From 16 June 1993 until the date of writing, 51 Bills have been introduced into Australian parliaments dealing with legalising VE or PAS. Despite these numerous attempts, the only successful Bill was the Rights of the Terminally Ill Act 1995 (NT), which was enacted in the Northern Territory, but a short time later overturned by the controversial Euthanasia Laws Act 1997 (Cth). Yet, in stark contrast to the significant political opposition, for decades Australian public opinion has overwhelmingly supported law reform legalising VE or PAS. While there is ongoing debate in Australia, both through public discourse and scholarly publications, about the merits and dangers of reform in this field, there has been remarkably little analysis of the numerous legislative attempts to reform the law, and the context in which those reform attempts occurred. The aim of this article is to better understand the reform landscape in Australia over the past two decades. The information provided in this article will better equip Australians, both politicians and the general public, to have a more nuanced understanding of the political context in which the euthanasia debate has been and is occurring. It will also facilitate a more informed debate in the future.