888 results for Full-Range Model


Relevance:

30.00%

Publisher:

Abstract:

Background: Betaretroviruses infect a wide range of species including primates, rodents, ruminants, and marsupials. They exist in both endogenous and exogenous forms and are implicated in animal diseases such as lung cancer in sheep, and in human disease, with members of the human endogenous retrovirus-K (HERV-K) group of endogenous betaretroviruses (βERVs) associated with human cancers and autoimmune diseases. To improve our understanding of betaretroviruses in an evolutionarily distinct host species, we characterized βERVs present in the genomes and transcriptomes of mega- and microbats, which are an important reservoir of emerging viruses. Results: A diverse range of full-length βERVs were discovered in mega- and microbat genomes and transcriptomes including the first identified intact endogenous retrovirus in a bat. Our analysis revealed that the genus Betaretrovirus can be divided into eight distinct sub-groups with evidence of cross-species transmission. Betaretroviruses are revealed to be a complex retrovirus group, within which one sub-group has evolved from complex to simple genomic organization through the acquisition of an env gene from the genus Gammaretrovirus. Molecular dating suggests that bats have contended with betaretroviral infections for over 30 million years. Conclusions: Our study reveals that a diverse range of betaretroviruses have circulated in bats for most of their evolutionary history, and cluster with extant betaretroviruses of divergent mammalian lineages, suggesting that their distribution may be largely unrestricted by host species barriers. The presence of βERVs with the ability to transcribe active viral elements in a major animal reservoir for viral pathogens has potential implications for public health. © 2013 Hayward et al.; licensee BioMed Central Ltd.

Relevance:

30.00%

Publisher:

Abstract:

Maize is one of the most important crops in the world. The products generated from this crop are largely used in the starch industry, the animal and human nutrition sector, and biomass energy production and refineries. For these reasons, there is much interest in estimating the potential grain yield of maize genotypes in relation to the environment in which they will be grown, as productivity directly affects agribusiness or farm profitability. Questions like these can be investigated with ecophysiological crop models, which can be organized according to different philosophies and structures. The main objective of this work is to conceptualize a stochastic model for predicting maize grain yield and productivity under different conditions of water supply while considering the uncertainties of daily climate data. Therefore, one focus is to explain the model construction in detail, and the other is to present some results in light of the philosophy adopted. A deterministic model was built as the basis for the stochastic model. The former performed well in terms of the curve shape of the above-ground dry matter over time as well as the grain yield under full and moderate water deficit conditions. Through the use of a triangular distribution for the harvest index and a bivariate normal distribution of the averaged daily solar radiation and air temperature, the stochastic model satisfactorily simulated grain productivity, i.e., it was found that 10,604 kg ha⁻¹ is the most likely grain productivity, very similar to the productivity simulated by the deterministic model and to that observed under real conditions in a field experiment. © 2012 American Society of Agricultural and Biological Engineers.
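The stochastic layer described above can be sketched with a simple Monte Carlo scheme: draw the harvest index from a triangular distribution and the season-averaged radiation and temperature from a bivariate normal, then push each draw through a deterministic yield response. The sketch below is illustrative only, under assumed parameter values (distribution limits, radiation-use efficiency, interception fraction, season length); it is not the authors' deterministic model.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 10_000

# Triangular distribution for harvest index (hypothetical min/mode/max values).
hi = rng.triangular(left=0.40, mode=0.48, right=0.55, size=n_draws)

# Bivariate normal for season-averaged daily solar radiation (MJ m-2 d-1) and
# air temperature (deg C); means, variances and correlation are illustrative.
mean = [20.0, 24.0]
cov = [[4.0, 1.2],
       [1.2, 2.5]]
rad, temp = rng.multivariate_normal(mean, cov, size=n_draws).T

# Placeholder deterministic yield response: radiation-driven biomass accumulation
# scaled by a simple temperature factor, then converted to grain via harvest index.
season_days = 120
rue = 1.6                                   # g dry matter per MJ intercepted (assumed)
fint = 0.6                                  # mean fraction of radiation intercepted (assumed)
temp_factor = np.clip(1.0 - 0.02 * np.abs(temp - 24.0), 0.0, 1.0)
biomass = rad * fint * rue * temp_factor * season_days * 10.0   # kg ha-1
grain_yield = biomass * hi

print(f"median yield: {np.median(grain_yield):.0f} kg/ha")
print(f"5th-95th percentile: {np.percentile(grain_yield, [5, 95]).round(0)}")
```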

Relevance:

30.00%

Publisher:

Abstract:

Hydrogen cyanide (HCN) is a toxic chemical that can potentially cause mild to severe reactions in animals when grazing forage sorghum. Developing technologies to monitor the level of HCN in the growing crop would benefit graziers, so that they can move cattle into paddocks with acceptable levels of HCN. In this study, we developed near-infrared spectroscopy (NIRS) calibrations to estimate HCN in forage sorghum and hay. The full spectral NIRS range (400-2498 nm) was used as well as specific spectral ranges within the full spectral range, i.e., visible (400-750 nm), shortwave (800-1100 nm) and near-infrared (NIR) (1100-2498 nm). Using the full-spectrum approach and partial least-squares (PLS), the calibration produced a coefficient of determination (R²) = 0.838 and standard error of cross-validation (SECV) = 0.040%, while the validation set had an R² = 0.824 with a low standard error of prediction (SEP = 0.047%). When using a multiple linear regression (MLR) approach, the best model (NIR spectra) produced an R² = 0.847 and standard error of calibration (SEC) = 0.050% and an R² = 0.829 and SEP = 0.057% for the validation set. The MLR models built from these spectral regions all used nine wavelengths. Two specific wavelengths, 2034 and 2458 nm, were of interest, with the former associated with C=O carbonyl stretch and the latter associated with C-N-C stretching. The most accurate PLS and MLR models produced a ratio of standard error of prediction to standard deviation of 3.4 and 3.0, respectively, suggesting that the calibrations could be used for screening breeding material. The results indicated that it should be feasible to develop calibrations using PLS or MLR models for a number of users, including breeding programs to screen for genotypes with low HCN, as well as graziers to monitor crop status to help with grazing efficiency.
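A full-spectrum PLS calibration of the kind reported above can be prototyped with scikit-learn. The sketch below uses random placeholder arrays in place of real spectra and HCN reference values, and the number of latent variables is an assumption; with real data the cross-validated R² and SECV would be interpreted as in the abstract.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

# Placeholder data: rows are samples, columns are absorbances on a 400-2498 nm grid.
rng = np.random.default_rng(0)
n_samples, n_wavelengths = 120, 1050        # e.g. 400-2498 nm at 2 nm steps (assumed)
X = rng.normal(size=(n_samples, n_wavelengths))
y = rng.uniform(0.0, 0.2, size=n_samples)   # HCN concentration (%), illustrative range

# Full-spectrum PLS calibration with cross-validation, analogous to the
# R^2 / SECV statistics reported above.
pls = PLSRegression(n_components=8)         # number of latent variables is an assumption
y_cv = cross_val_predict(pls, X, y, cv=10).ravel()

r2_cv = r2_score(y, y_cv)
secv = np.sqrt(mean_squared_error(y, y_cv))
print(f"cross-validated R^2 = {r2_cv:.3f}, SECV = {secv:.3f} %")
```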

Relevance:

30.00%

Publisher:

Abstract:

Costs of purchasing new piglets and of feeding them until slaughter are the main variable expenditures in pig fattening. They both depend on slaughter intensity, the nature of feeding patterns and the technological constraints of pig fattening, such as genotype. Therefore, it is of interest to examine the effect of production technology and changes in input and output prices on feeding and slaughter decisions. This study examines the problem by using a dynamic programming model that links genetic characteristics of a pig to feeding decisions and the timing of slaughter and takes into account how these jointly affect the quality-adjusted value of a carcass. The model simulates the growth mechanism of a pig under alternative feeding and slaughter patterns and then solves the optimal feeding and slaughter decisions recursively. The state of nature and the genotype of a pig are known in the analysis.

The main contribution of this study is the dynamic approach that explicitly takes into account carcass quality while simultaneously optimising feeding and slaughter decisions. The method maximises the internal rate of return to the capacity unit. Hence, the results can have a vital impact on the competitiveness of pig production, which is known to be quite capital-intensive.

The results suggest that the producer can significantly benefit from improvements in the pig's genotype, because they improve the efficiency of pig production. The annual benefits from obtaining pigs of improved genotype can be more than €20 per capacity unit. The annual net benefits of animal breeding to pig farms can also be considerable. Animals of improved genotype can reach optimal slaughter maturity more quickly and produce leaner meat than animals of poor genotype. In order to fully utilise the benefits of animal breeding, the producer must adjust feeding and slaughter patterns on the basis of genotype.

The results suggest that the producer can benefit from flexible feeding technology. The flexible feeding technology segregates pigs into groups according to their weight, carcass leanness, genotype and sex, and thereafter optimises feeding and slaughter decisions separately for these groups. Typically, such a technology provides incentives to feed piglets with protein-rich feed such that the genetic potential to produce leaner meat is fully utilised. When the pig approaches slaughter maturity, the share of protein-rich feed in the diet gradually decreases and the amount of energy-rich feed increases. Generally, the optimal slaughter weight is within the weight range that pays the highest price per kilogram of pig meat. The optimal feeding pattern and the optimal timing of slaughter depend on price ratios. In particular, an increase in the price of pig meat provides incentives to increase the growth rates up to the pig's biological maximum by increasing the amount of energy in the feed. Price changes and changes in slaughter premium can also have large income effects.

Key words: barley, carcass composition, dynamic programming, feeding, genotypes, lean, pig fattening, precision agriculture, productivity, slaughter weight, soybeans
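The backward-recursive structure of such a model can be illustrated with a deliberately simplified dynamic program: at each weekly stage the producer either slaughters the pig or chooses a feed level for one more week, and the stage values are solved from the final stage backwards. All growth, price and carcass-value functions below are invented placeholders, and the objective is a plain net value per fattening place rather than the thesis's internal rate of return per capacity unit.

```python
import numpy as np

# Schematic stage/state grid: weekly decisions over a 20-week fattening period,
# pig live weight discretised from 25 to 135 kg.
weeks, weights = 20, np.arange(25, 136, 1.0)
feed_levels = [1.8, 2.2, 2.6]                 # daily feed energy options (illustrative units)

MEAT_PRICE, FEED_PRICE, PIGLET_PRICE = 1.5, 0.18, 60.0   # assumed prices

def weekly_gain(w, feed):
    """Placeholder growth response: gain rises with feed, tapers at high weight."""
    return 7.0 * feed / (1.0 + w / 120.0)

def carcass_value(w):
    """Placeholder quality-adjusted value with a premium weight range."""
    lean_bonus = 1.05 if 105 <= w <= 125 else 1.0
    return MEAT_PRICE * 0.75 * w * lean_bonus  # 0.75 = assumed dressing share

# Backward recursion: value[t, i] = best of (slaughter now) vs (feed one more week).
value = np.zeros((weeks + 1, len(weights)))
value[weeks] = [carcass_value(w) for w in weights]
for t in range(weeks - 1, -1, -1):
    for i, w in enumerate(weights):
        slaughter_now = carcass_value(w)
        continue_best = -np.inf
        for f in feed_levels:
            w_next = min(w + weekly_gain(w, f), weights[-1])
            j = min(int(np.searchsorted(weights, w_next)), len(weights) - 1)
            continue_best = max(continue_best, value[t + 1][j] - FEED_PRICE * f * 7)
        value[t][i] = max(slaughter_now, continue_best)

start = int(np.searchsorted(weights, 25))
print(f"value of one fattening place (25 kg piglet): "
      f"{value[0][start] - PIGLET_PRICE:.1f} (illustrative units)")
```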

Relevance:

30.00%

Publisher:

Abstract:

ABSTRACT: Infrared studies of synthetic alamethicin fragments and model peptides containing α-aminoisobutyric acid (Aib) have been carried out in solution. Tripeptides and larger fragments exhibit a strong tendency to form β-turns, stabilized by 4→1 (10-atom) hydrogen bonds. Dipeptides show less well-defined structures, though C5 and C7 conformations are detectable. Conformational restrictions imposed by Aib residues result in these peptides populating a limited range of states. Integrated intensities of the hydrogen-bonded N-H stretching band can be used to quantitate the number of intramolecular hydrogen bonds. Predictions made from infrared data are in excellent agreement with nuclear magnetic resonance and X-ray diffraction studies. Assignments of the urethane and tertiary amide carbonyl groups in the free state have been made in model peptides. Shifts to lower frequency on hydrogen bonding are observed for the carbonyl groups. The 1-6 segment of alamethicin is shown to adopt a 3₁₀-helical structure stabilized by four intramolecular hydrogen bonds. The fragments Boc-Leu-Aib-Pro-Val-Aib-OMe (12-16) and Boc-Gly-Leu-Aib-Pro-Val-Aib-OMe (11-16) possess structures involving 4→1 and 5→1 hydrogen bonds. Supporting evidence for these structures is obtained from proton nuclear magnetic resonance studies.

Relevance:

30.00%

Publisher:

Abstract:

The use of maize simulation models to determine the optimum plant population for rainfed environments allows the evaluation of plant populations over multiple years and locations at a lower cost than traditional field experimentation. However, the APSIM maize model that has been used to conduct some of these 'virtual' experiments assumes that the maximum rate of soil water extraction by the crop root system is constant across plant populations. This untested assumption may cause grain yield to be overestimated in lower plant populations. A field experiment was conducted to determine whether maximum rates of water extraction vary with plant population, and the maximum rate of soil water extraction was estimated for three plant populations (2.4, 3.5 and 5.5 plants m⁻²) under water-limited conditions. Maximum soil water extraction rates in the field experiment decreased linearly with plant population, and no difference was detected between plant populations for the crop lower limit of soil water extraction. Re-analysis of previous maize simulation experiments demonstrated that the use of inappropriately high extraction-rate parameters at low plant populations inflated predictions of grain yield, and could cause erroneous recommendations to be made for plant population. The results demonstrate the importance of validating crop simulation models across the range of intended treatments. © 2013 Elsevier B.V. All rights reserved.
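The extraction-rate assumption under discussion can be illustrated with a first-order uptake rule of the kind used in APSIM-style soil water routines, where daily extraction from a layer is a fraction kl of the water held above the crop lower limit. The kl values, their pairing with the three plant populations, and the soil water figures below are all hypothetical; the sketch only shows how a population-dependent kl changes simulated extraction.

```python
import numpy as np

def simulate_extraction(days, theta0, lower_limit, kl):
    """Daily soil water extraction from one layer: uptake = kl * (theta - lower_limit)."""
    theta = np.empty(days + 1)
    theta[0] = theta0
    for d in range(days):
        uptake = kl * max(theta[d] - lower_limit, 0.0)
        theta[d + 1] = theta[d] - uptake
    return theta

# Illustrative values: plant-available water per layer (mm) and crop lower limit (mm).
LL, THETA0, DAYS = 40.0, 90.0, 60
for population, kl in [(2.4, 0.045), (3.5, 0.060), (5.5, 0.080)]:   # hypothetical pairs
    theta = simulate_extraction(DAYS, THETA0, LL, kl)
    print(f"{population} plants/m^2: kl={kl:.3f}, "
          f"water remaining after {DAYS} d = {theta[-1]:.1f} mm")
```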

Relevance:

30.00%

Publisher:

Abstract:

Commercial environments may receive only a fraction of the expected genetic gains for growth rate predicted from the selection environment. This fraction is a result of undesirable genotype-by-environment interactions (GxE) and is measured by the genetic correlation (rg) of growth between environments. Rapid, single-generation estimates of the genetic correlation are notoriously difficult to obtain with precision. A new design is proposed where genetic correlations can be estimated by utilising artificial mating from cryopreserved semen and unfertilised eggs stripped from a single female. We compare a traditional phenotype analysis of growth to a threshold model where only the largest fish are genotyped for sire identification. The threshold model was robust to differences in family mortality of up to 30%. The design is unique as it negates potential re-ranking of families caused by an interaction between common maternal environmental effects and growing environment. The design is suitable for rapid assessment of GxE over one generation, with a true 0.70 genetic correlation yielding standard errors as low as 0.07. Different design scenarios were tested for bias and accuracy with a range of heritability values, number of half-sib families created, number of progeny within each full-sib family, number of fish genotyped, number of fish stocked, differing family survival rates and at various simulated genetic correlation levels.
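The kind of simulation used to assess bias and precision in such a design can be sketched as follows: half-sib sire effects are drawn for two environments with a chosen true genetic correlation, offspring phenotypes are simulated, and rg is estimated from the attenuation-corrected correlation of family means. Family numbers, family sizes, heritability and the number of replicates are illustrative, and the threshold-genotyping and survival components of the actual design are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_rg(n_sires=100, n_offspring=40, h2=0.3, rg_true=0.70):
    """Simulate half-sib family means in two environments and estimate the
    genetic correlation from the correlation of family means (simplified)."""
    # Sire effects for growth in the two environments; sire variance = h2/4.
    cov = np.array([[1.0, rg_true], [rg_true, 1.0]]) * h2 / 4.0
    sire_bv = rng.multivariate_normal([0.0, 0.0], cov, size=n_sires)
    resid_sd = np.sqrt(1.0 - h2 / 4.0)
    means = np.empty((n_sires, 2))
    for env in (0, 1):
        offspring = sire_bv[:, [env]] + rng.normal(0, resid_sd, (n_sires, n_offspring))
        means[:, env] = offspring.mean(axis=1)
    # Correct the family-mean correlation for the non-genetic noise in the means.
    t = (h2 / 4.0) / (h2 / 4.0 + (1.0 - h2 / 4.0) / n_offspring)
    return np.corrcoef(means[:, 0], means[:, 1])[0, 1] / t

estimates = np.array([simulate_rg() for _ in range(200)])
print(f"mean estimate = {estimates.mean():.2f}, empirical SE = {estimates.std():.2f}")
```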

Relevance:

30.00%

Publisher:

Abstract:

Digital elevation models (DEMs) have been an important topic in geography and surveying sciences for decades due to their geomorphological importance as the reference surface for gravitation-driven material flow, as well as the wide range of uses and applications. When a DEM is used in terrain analysis, for example in automatic drainage basin delineation, errors of the model accumulate in the analysis results. Investigation of this phenomenon is known as error propagation analysis, which has a direct influence on the decision-making process based on interpretations and applications of terrain analysis. Additionally, it may have an indirect influence on data acquisition and the DEM generation.

The focus of the thesis was on the fine toposcale DEMs, which are typically represented in a 5-50 m grid and used in the application scale 1:10 000-1:50 000. The thesis presents a three-step framework for investigating error propagation in DEM-based terrain analysis. The framework includes methods for visualising the morphological gross errors of DEMs, exploring the statistical and spatial characteristics of the DEM error, making analytical and simulation-based error propagation analysis and interpreting the error propagation analysis results. The DEM error model was built using geostatistical methods.

The results show that appropriate and exhaustive reporting of various aspects of fine toposcale DEM error is a complex task. This is due to the high number of outliers in the error distribution and morphological gross errors, which are detectable with the presented visualisation methods. In addition, the use of global characterisation of DEM error is a gross generalisation of reality due to the small extent of the areas in which the assumption of stationarity is not violated. This was shown using an exhaustive high-quality reference DEM based on airborne laser scanning and local semivariogram analysis. The error propagation analysis revealed that, as expected, an increase in the DEM vertical error will increase the error in surface derivatives. However, contrary to expectations, the spatial autocorrelation of the model appears to have varying effects on the error propagation analysis depending on the application. The use of a spatially uncorrelated DEM error model has been considered as a 'worst-case scenario', but this opinion is now challenged because none of the DEM derivatives investigated in the study had maximum variation with spatially uncorrelated random error. Significant performance improvement was achieved in simulation-based error propagation analysis by applying process convolution in generating realisations of the DEM error model. In addition, a typology of uncertainty in drainage basin delineations is presented.
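Simulation-based error propagation of the sort described can be sketched in a few lines: spatially autocorrelated error realisations are generated by process convolution (white noise smoothed with a Gaussian kernel), added to a DEM, and the spread of a surface derivative such as slope is examined across realisations. The DEM below is synthetic, and the error standard deviation and correlation range are assumed values.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(7)

# Synthetic 10 m DEM: a smooth hill used as a placeholder for a real elevation grid.
n, cell = 200, 10.0
x, y = np.meshgrid(np.arange(n), np.arange(n))
dem = 50.0 * np.exp(-((x - n / 2) ** 2 + (y - n / 2) ** 2) / (2 * 60.0 ** 2))

def slope_deg(z, cell):
    """Slope (degrees) from finite differences of the elevation grid."""
    dzdy, dzdx = np.gradient(z, cell)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

def error_realisation(sigma_z=1.0, corr_range_cells=5.0):
    """Spatially autocorrelated error field via process convolution:
    white noise smoothed with a Gaussian kernel, rescaled to the target sd."""
    field = gaussian_filter(rng.normal(size=(n, n)), sigma=corr_range_cells)
    return field / field.std() * sigma_z

# Monte Carlo propagation of DEM error into slope.
slopes = np.stack([slope_deg(dem + error_realisation(), cell) for _ in range(100)])
print("mean slope sd across realisations (degrees):", slopes.std(axis=0).mean().round(3))
```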

Relevance:

30.00%

Publisher:

Abstract:

Frictions are factors that hinder trading of securities in financial markets. Typical frictions include limited market depth, transaction costs, lack of infinite divisibility of securities, and taxes. Conventional models used in mathematical finance often gloss over these issues, which affect almost all financial markets, by arguing that the impact of frictions is negligible and, consequently, the frictionless models are valid approximations. This dissertation consists of three research papers, which are related to the study of the validity of such approximations in two distinct modeling problems. Models of price dynamics that are based on diffusion processes, i.e., continuous strong Markov processes, are widely used in the frictionless scenario. The first paper establishes that diffusion models can indeed be understood as approximations of price dynamics in markets with frictions. This is achieved by introducing an agent-based model of a financial market where finitely many agents trade a financial security, the price of which evolves according to price impacts generated by trades. It is shown that, if the number of agents is large, then under certain assumptions the price process of the security, which is a pure-jump process, can be approximated by a one-dimensional diffusion process. In a slightly extended model, in which agents may exhibit herd behavior, the approximating diffusion model turns out to be a stochastic volatility model. Finally, it is shown that when agents' tendency to herd is strong, logarithmic returns in the approximating stochastic volatility model are heavy-tailed. The remaining papers are related to no-arbitrage criteria and superhedging in continuous-time option pricing models under small-transaction-cost asymptotics. Guasoni, Rásonyi, and Schachermayer have recently shown that, in such a setting, any financial security admits no arbitrage opportunities and there exist no feasible superhedging strategies for European call and put options written on it, as long as its price process is continuous and has the so-called conditional full support (CFS) property. Motivated by this result, CFS is established for certain stochastic integrals and a subclass of Brownian semistationary processes in the two papers. As a consequence, a wide range of possibly non-Markovian local and stochastic volatility models have the CFS property.
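For reference, the conditional full support (CFS) condition mentioned above is usually stated roughly as follows; this is a sketch of the standard formulation, not a quotation from the dissertation.

```latex
% Sketch of the CFS property for an adapted, continuous, positive price process
% (S_t), t in [0, T]: conditionally on the information at time t, the law of the
% remaining path charges every neighbourhood of every admissible continuous path.
\[
  \operatorname{supp}\,
  \mathrm{Law}\!\bigl( (S_u)_{u \in [t,T]} \,\big|\, \mathcal{F}_t \bigr)
  = C^{+}_{S_t}\!\bigl([t,T]\bigr)
  \quad \text{a.s. for every } t \in [0,T),
\]
% where $C^{+}_{x}([t,T])$ denotes the set of continuous functions
% $f\colon [t,T]\to(0,\infty)$ with $f(t)=x$.
```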

Relevance:

30.00%

Publisher:

Abstract:

Objectives:
1. Estimate population parameters required for a management model. These include survival, density, age structure, growth, age and size at maturity and at recruitment to the adult eel fishery. Estimate their variability among individuals in a range of habitats.
2. Develop a management population dynamics model and use it to investigate management options.
3. Establish baseline data and sustainability indicators for long-term monitoring.
4. Assess the applicability of the above techniques to other eel fisheries in Australia, in collaboration with NSW. Distribute developed tools via the Australia and New Zealand Eel Reference Group.

Relevance:

30.00%

Publisher:

Abstract:

A generalized isothermal effectiveness factor correlation has been proposed for catalytic reactions whose intrinsic kinetics are based on the redox model. In this correlation, which is exact for asymptotic values of the Thiele parameter, the effects of the parameters appearing in the model, the order of the reaction and the particle geometry are incorporated in a modified form of the Thiele parameter. The relationship takes the usual form and predicts the effectiveness factor with an error of less than 2% over a range of Thiele parameter values that accommodates both the kinetic and diffusion control regimes.
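The abstract does not reproduce the correlation itself; the "usual form" for generalized effectiveness-factor correlations of this type is η = tanh(Λ)/Λ in a suitably modified Thiele parameter Λ, and the sketch below assumes that form purely to show the two asymptotic regimes. The definition of Λ, which would fold in the redox-model parameters, reaction order and particle geometry, is not shown here.

```python
import numpy as np

def effectiveness_factor(thiele):
    """Assumed generalized correlation eta = tanh(L)/L, with L the modified
    Thiele parameter discussed above (its definition is not reproduced here)."""
    return np.tanh(thiele) / thiele

for L in (0.1, 0.5, 1.0, 3.0, 10.0, 30.0):
    eta = effectiveness_factor(L)
    print(f"Lambda = {L:5.1f}:  eta = {eta:.4f}   (1/Lambda = {1 / L:.4f})")
# Small Lambda: eta -> 1 (kinetic control); large Lambda: eta -> 1/Lambda (diffusion control).
```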

Relevance:

30.00%

Publisher:

Abstract:

Stochastic volatility models are of fundamental importance to the pricing of derivatives. One of the most commonly used models of stochastic volatility is the Heston model, in which the price and volatility of an asset evolve as a pair of coupled stochastic differential equations. The computation of asset prices and volatilities involves the simulation of many sample trajectories with conditioning. The problem is treated using the method of particle filtering. While the simulation of a shower of particles is computationally expensive, each particle behaves independently, making such simulations ideal for massively parallel heterogeneous computing platforms. In this paper, we present our portable OpenCL implementation of the Heston model and discuss its performance and efficiency characteristics on a range of architectures including Intel CPUs, Nvidia GPUs, and Intel Many-Integrated-Core (MIC) accelerators.
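The coupled price/variance dynamics being simulated can be sketched with a full-truncation Euler scheme and correlated normal increments; this is the embarrassingly parallel kernel that the paper maps onto OpenCL, shown here in plain Python with illustrative parameter values and without the particle-filtering/conditioning layer.

```python
import numpy as np

# Illustrative Heston parameters (not taken from the paper).
S0, v0 = 100.0, 0.04
kappa, theta, xi, rho, r = 2.0, 0.04, 0.5, -0.7, 0.01
T, n_steps, n_paths = 1.0, 252, 10_000
dt = T / n_steps

rng = np.random.default_rng(3)
S = np.full(n_paths, S0)
v = np.full(n_paths, v0)

for _ in range(n_steps):
    z1 = rng.standard_normal(n_paths)
    z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal(n_paths)
    v_pos = np.maximum(v, 0.0)                       # full-truncation Euler scheme
    S *= np.exp((r - 0.5 * v_pos) * dt + np.sqrt(v_pos * dt) * z1)
    v += kappa * (theta - v_pos) * dt + xi * np.sqrt(v_pos * dt) * z2

# Monte Carlo price of a European call as a simple sanity check on the paths.
K = 100.0
call = np.exp(-r * T) * np.maximum(S - K, 0.0).mean()
print(f"European call (K={K}): {call:.3f}")
```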

Relevance:

30.00%

Publisher:

Abstract:

In this research we modelled computer network devices to ensure their communication behaviours meet various network standards. By modelling devices as finite-state machines and examining their properties in a range of configurations, we discovered a flaw in a common network protocol and produced a technique to improve organisations' network security against data theft.
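The finite-state-machine style of modelling referred to above can be illustrated with a generic state machine that rejects any (state, event) pair not permitted by a specification. The "protocol" below is a made-up handshake used only to show the technique; the actual devices, protocol and discovered flaw are not identified in the abstract and are not reproduced here.

```python
from dataclasses import dataclass, field

@dataclass
class ProtocolFSM:
    """Generic finite-state machine: allowed (state, event) -> state transitions
    taken from a specification, plus a trace of everything observed."""
    transitions: dict
    state: str = "CLOSED"
    trace: list = field(default_factory=list)

    def handle(self, event: str) -> bool:
        nxt = self.transitions.get((self.state, event))
        self.trace.append((self.state, event, nxt))
        if nxt is None:
            return False          # event not allowed in this state: flag for review
        self.state = nxt
        return True

# Hypothetical handshake specification (illustrative only, not a real protocol).
spec = {
    ("CLOSED", "hello"): "NEGOTIATING",
    ("NEGOTIATING", "key_exchange"): "SECURED",
    ("SECURED", "data"): "SECURED",
    ("SECURED", "close"): "CLOSED",
}

fsm = ProtocolFSM(spec)
for ev in ["hello", "data", "key_exchange", "data", "close"]:
    ok = fsm.handle(ev)
    print(f"{ev:12s} -> {'ok' if ok else 'REJECTED'} (state={fsm.state})")
```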

Relevance:

30.00%

Publisher:

Abstract:

Electronic, magnetic, or structural inhomogeneities ranging in size from nanoscopic to mesoscopic scales seem endemic and are possibly generic to colossal magnetoresistance manganites and other transition metal oxides. They are hence of great current interest and understanding them is of fundamental importance. We show here that an extension, to include long-range Coulomb interactions, of a quantum two-fluid l-b model proposed recently for manganites [Phys. Rev. Lett. 92, 157203 (2004)] leads to an excellent description of such inhomogeneities. In the l-b model two very different kinds of electronic states, one localized and polaronic (l) and the other extended or broad band (b), coexist. For model parameters appropriate to manganites and even within a simple dynamical mean-field theory (DMFT) framework, it describes many of the unusual phenomena seen in manganites, including colossal magnetoresistance (CMR), qualitatively and quantitatively. However, in the absence of long-ranged Coulomb interaction, a system described by such a model would actually phase separate into macroscopic regions of l and b electrons, respectively. As we show in this paper, in the presence of Coulomb interactions, the macroscopic phase separation gets suppressed and instead nanometer-scale regions of polarons interspersed with band electron puddles appear, constituting a kind of quantum Coulomb glass. We characterize the size scales and distribution of the inhomogeneity using computer simulations. For realistic values of the long-range Coulomb interaction parameter V₀, our results for the thresholds for occupancy of the b states are in agreement with, and hence support, the earlier approach mentioned above based on a configuration-averaged DMFT treatment which neglects V₀; but the present work has features that cannot be addressed in the DMFT framework. Our work points to an interplay of strong correlations, long-range Coulomb interaction, and dopant ion disorder, all inevitably present in transition metal oxides, as the origin of nanoscale inhomogeneities rather than disorder-frustrated phase competition as is generally believed. As regards manganites, it argues against explanations for CMR based on disorder-frustrated phase separation and for an intrinsic origin of CMR. Based on this, we argue that the observed micrometer (meso) scale inhomogeneities owe their existence to extrinsic causes, e.g., strain due to cracks and defects. We suggest possible experiments to validate our speculation.

Relevance:

30.00%

Publisher:

Abstract:

Traffic-related air pollution has been associated with a wide range of adverse health effects. One component of traffic emissions that has been receiving increasing attention is ultrafine particles (UFP, < 100 nm), which are of concern to human health due to their small diameters. Vehicles are the dominant source of UFP in urban environments. Small-scale variation in ultrafine particle number concentration (PNC) can be attributed to local changes in land use and road abundance. UFPs are also formed as a result of particle formation events. Modelling the spatial patterns in PNC is integral to understanding human UFP exposure and also provides insight into particle formation mechanisms that contribute to air pollution in urban environments. Land-use regression (LUR) is a technique that can be used to improve the prediction of air pollution.
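A land-use regression of the kind mentioned in the closing sentence is, at its core, an ordinary regression of measured PNC on land-use and traffic predictors at monitoring sites, which is then used to predict PNC at unmeasured locations. The predictors, coefficients and site data below are synthetic placeholders chosen only to show the workflow.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n_sites = 80

# Synthetic predictors for each monitoring site: major-road length within 100 m (m),
# traffic intensity on the nearest road (veh/day), and industrial land-use share.
road_len = rng.uniform(0, 500, n_sites)
traffic = rng.uniform(500, 50_000, n_sites)
industrial = rng.uniform(0, 1, n_sites)
X = np.column_stack([road_len, traffic, industrial])

# Synthetic "measured" particle number concentration (particles/cm^3).
pnc = 5_000 + 8 * road_len + 0.3 * traffic + 4_000 * industrial \
      + rng.normal(0, 2_000, n_sites)

lur = LinearRegression().fit(X, pnc)
cv_r2 = cross_val_score(lur, X, pnc, cv=10, scoring="r2").mean()
print("coefficients:", lur.coef_.round(2), " cross-validated R^2:", round(cv_r2, 2))
```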