960 results for Perfect matches


Abstract:

High-density oligonucleotide (oligo) arrays are a powerful tool for transcript profiling. Arrays based on GeneChip® technology are amongst the most widely used, although GeneChip® arrays are currently available for only a small number of plant and animal species. We have therefore developed a method to improve the sensitivity of high-density oligonucleotide arrays when applied to heterologous species, and tested the method by analysing the transcriptome of Brassica oleracea L., a species for which no GeneChip® array is available, using a GeneChip® array designed for Arabidopsis thaliana (L.) Heynh. Genomic DNA from B. oleracea was labelled and hybridised to the ATH1-121501 GeneChip® array. Arabidopsis thaliana probe-pairs that hybridised to the B. oleracea genomic DNA, judged on the basis of the perfect-match (PM) probe signal, were then selected for subsequent B. oleracea transcriptome analysis using a .cel file parser script to generate probe mask files. The transcriptional response of B. oleracea to a mineral nutrient (phosphorus, P) stress was quantified using probe mask files generated for a wide range of gDNA hybridisation intensity thresholds. An example probe mask file generated with a gDNA hybridisation intensity threshold of 400 removed >68% of the available PM probes from the analysis but retained >96% of the available A. thaliana probe-sets. Ninety-nine genes were then identified as significantly regulated under P stress in B. oleracea, including homologues of P-stress-responsive genes in A. thaliana. Increasing the gDNA hybridisation intensity threshold used for probe selection up to 500 increased the sensitivity of the GeneChip® array to detect regulation of gene expression in B. oleracea under P stress by up to 13-fold. Our open-source software for creating probe mask files is freely available at http://affymetrix.arabidopsis.info/xspecies/ and may be used to facilitate transcriptomic analyses of a wide range of plant and animal species in the absence of custom arrays.
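
The probe-masking step can be illustrated with a short Python sketch that keeps only PM probes whose gDNA hybridisation signal exceeds a chosen threshold; this is a simplified stand-in for the published .cel file parser, and the input file layout and column names are hypothetical.

```python
# Illustrative sketch (not the published .cel parser): keep PM probes whose
# gDNA hybridisation signal exceeds a threshold and write a simple probe mask.
# The tab-separated input layout and column names below are hypothetical.
import csv

THRESHOLD = 400  # gDNA hybridisation intensity cut-off

def build_probe_mask(gdna_pm_table, mask_file, threshold=THRESHOLD):
    retained = {}  # probe-set ID -> list of retained (x, y) probe coordinates
    with open(gdna_pm_table, newline="") as fh:
        for row in csv.DictReader(fh, delimiter="\t"):
            if float(row["pm_intensity"]) > threshold:
                retained.setdefault(row["probeset_id"], []).append((row["x"], row["y"]))
    with open(mask_file, "w") as out:
        for probeset, coords in sorted(retained.items()):
            for x, y in coords:
                out.write(f"{probeset}\t{x}\t{y}\n")
    return retained

# e.g. build_probe_mask("ath1_gdna_pm_intensities.tsv", "probe_mask_400.txt")
```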

Abstract:

Viral replication occurs within cells, with release (and onward infection) primarily achieved through two alternative mechanisms: lysis, in which virions emerge as the infected cell dies and bursts open; or budding, in which virions emerge gradually from a still-living cell by appropriating a small part of the cell membrane. Virus budding is a poorly understood process that challenges current models of vesicle formation. Here, a plausible mechanism for arenavirus budding is presented, building on recent evidence that viral proteins embed in the inner lipid layer of the cell membrane. Experimental results confirm that viral protein is associated with increased membrane curvature, and a mathematical model is used to show that localized increases in curvature alone are sufficient to generate viral buds. The magnitude of the protein-induced curvature is calculated from the size of the amphipathic region hypothetically removed from the inner membrane as a result of translation, and the change in membrane stiffness is estimated from observed differences in virion deformation following protein depletion. Numerical results are based on experimental data and estimates for three arenaviruses, but the mechanisms described are more broadly applicable. The hypothesized mechanism is shown to be sufficient to generate spontaneous budding that matches experimental observations well, both qualitatively and quantitatively.
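
As a rough illustration of why protein-induced curvature alone can drive budding, the sketch below evaluates the standard Helfrich bending energy of a complete spherical bud; this is a generic estimate, not the authors' specific model, and the bending modulus, bud radius and spontaneous-curvature values are assumed.

```python
# Illustrative estimate using the standard Helfrich bending energy (not the
# authors' specific model): for a complete spherical bud of radius R, the mean
# curvature is H = 1/R, so E = 2*kappa*(H - C0)^2 * 4*pi*R^2 = 8*pi*kappa*(1 - C0*R)^2.
# The bending modulus, radius and spontaneous curvatures below are assumptions.
import math

kappa = 20.0   # bending modulus in units of kT (typical lipid-bilayer value)
R = 50.0       # bud radius in nm (order of a small enveloped virion)

def bud_bending_energy(C0, kappa=kappa, R=R):
    """Bending energy (in kT) of a spherical bud with spontaneous curvature C0 (1/nm)."""
    return 8.0 * math.pi * kappa * (1.0 - C0 * R) ** 2

for C0 in (0.0, 0.01, 1.0 / R):
    print(f"C0 = {C0:.3f} nm^-1 -> E = {bud_bending_energy(C0):.1f} kT")
# As the protein-induced spontaneous curvature approaches 1/R, the bending
# penalty vanishes, so buds of that radius can form spontaneously.
```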

Abstract:

The electronic structure and oxidation state of atomic Au adsorbed on a perfect CeO2(111) surface have been investigated in detail by means of periodic density functional theory-based calculations, using the LDA+U and GGA+U potentials for a broad range of U values, complemented with calculations employing the HSE06 hybrid functional. In addition, the effects of the lattice parameter a0 and of the starting point for the geometry optimization have also been analyzed. From the present results we suggest that LDA+U, GGA+U, and HSE06 density functional calculations do not conclusively predict the oxidation state of single Au atoms on CeO2(111), and that the final picture depends strongly on the method chosen and on the construction of the surface model. In some cases we have been able to locate two well-defined states which are close in energy but have very different electronic structures and local geometries, one with Au fully oxidized and one with neutral Au. The energy difference between the two states is typically within the limits of the accuracy of the present exchange-correlation potentials, and therefore a clear lowest-energy state cannot be identified. These results suggest the possibility of a dynamic distribution of Au0 and Au+ atomic species at the regular sites of the CeO2(111) surface.

Abstract:

Monte Carlo field-theoretic simulations (MCFTS) are performed on melts of symmetric diblock copolymer for invariant polymerization indexes extending down to experimentally relevant values of N̅ ∼ 10^4. The simulations are performed with a fluctuating composition field, W_−(r), and a pressure field, W_+(r), that follows the saddle-point approximation. Our study focuses on the disordered-state structure function, S(k), and the order−disorder transition (ODT). Although short-wavelength fluctuations cause an ultraviolet (UV) divergence in three dimensions, this is readily compensated for with the use of an effective Flory−Huggins interaction parameter, χ_e. The resulting S(k) matches the predictions of renormalized one-loop (ROL) calculations over the full range of χ_eN and N̅ examined in our study, and agrees well with Fredrickson−Helfand (F−H) theory near the ODT. Consistent with the F−H theory, the ODT is discontinuous for finite N̅ and the shift in (χ_eN)_ODT follows the predicted N̅^−1/3 scaling over our range of N̅.
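
For context, the Fredrickson−Helfand estimate for a symmetric diblock melt places the transition at (χ_eN)_ODT ≈ 10.495 + 41.0 N̅^−1/3; the snippet below simply evaluates this standard expression to show the size of the N̅^−1/3 shift, and the coefficient fitted to the present simulations may differ.

```python
# Fredrickson-Helfand estimate of the ODT for a symmetric diblock melt,
# (chi_e N)_ODT ~= 10.495 + 41.0 * Nbar**(-1/3), shown only to illustrate the
# Nbar^(-1/3) shift from the mean-field value 10.495; the coefficient fitted
# to the simulations summarised above may differ.
for nbar in (1e2, 1e3, 1e4, 1e8):
    chiN_odt = 10.495 + 41.0 * nbar ** (-1.0 / 3.0)
    print(f"Nbar = {nbar:.0e}: (chi_e N)_ODT ~= {chiN_odt:.2f}")
```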

Abstract:

In projections of twenty-first century climate, Arctic sea ice declines and at the same time exhibits strong interannual anomalies. Here, we investigate the potential to predict these strong sea-ice anomalies under a perfect-model assumption, using the Max-Planck-Institute Earth System Model in the same setup as in the Coupled Model Intercomparison Project Phase 5 (CMIP5). We study two cases of strong negative sea-ice anomalies: a 5-year-long anomaly for present-day conditions, and a 10-year-long anomaly for conditions projected for the middle of the twenty-first century. We treat these anomalies in the CMIP5 projections as the truth, and use exactly the same model configuration for predictions of this synthetic truth. We start ensemble predictions at different times during the anomalies, considering lagged-perfect and sea-ice-assimilated initial conditions. We find that the onset and amplitude of the interannual anomalies are not predictable. However, the further deepening of the anomaly can be predicted for typically 1 year lead time if predictions start after the onset but before the maximal amplitude of the anomaly. The magnitude of an extremely low summer sea-ice minimum is hard to predict: the skill of the prediction ensemble is not better than a damped-persistence forecast for lead times of more than a few months, and is not better than a climatology forecast for lead times of two or more years. Predictions of the present-day anomaly are more skillful than predictions of the mid-century anomaly. Predictions using sea-ice-assimilated initial conditions are competitive with those using lagged-perfect initial conditions for lead times of a year or less, but yield degraded skill for longer lead times. The results presented here suggest that there is limited prospect of predicting the large interannual sea-ice anomalies expected to occur throughout the twenty-first century.
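
For reference, the two benchmark forecasts mentioned above can be sketched as follows, using an illustrative Python example on a synthetic anomaly series rather than MPI-ESM output: damped persistence scales the initial anomaly by the lag autocorrelation of a control series, while the climatology benchmark simply forecasts the mean anomaly.

```python
# Minimal sketch of the benchmark forecasts: damped persistence relaxes the
# initial anomaly toward climatology at the rate given by the lag
# autocorrelation of a (here synthetic) control series.
import numpy as np

rng = np.random.default_rng(0)
control = rng.standard_normal(200)           # stand-in for detrended sea-ice anomalies

def damped_persistence(initial_anomaly, lead, series):
    """Forecast anomaly at 'lead' steps: initial anomaly scaled by the lag autocorrelation."""
    x = series - series.mean()
    r = np.corrcoef(x[:-lead], x[lead:])[0, 1] if lead > 0 else 1.0
    return max(r, 0.0) * initial_anomaly

initial = -2.0                                # strong negative anomaly at forecast start
for lead in (1, 3, 6, 24):
    print(lead, damped_persistence(initial, lead, control))
# The climatology forecast is simply 0 (the mean anomaly) at every lead time.
```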

Abstract:

The phosphine-stabilised gold cluster [Au6(Ph2P-o-tolyl)6](NO3)2 is converted into an active nanocatalyst for the oxidation of benzyl alcohol through low-temperature, peroxide-assisted removal of the phosphines, avoiding the high-temperature calcination process. The process was monitored using in situ X-ray absorption spectroscopy, which revealed that, after a certain period of reaction with tert-butyl hydroperoxide, the phosphine ligands are removed to form gold nanoparticles; this period matches the induction period seen in the catalytic reaction. Density functional theory calculations show that the energies required to remove the ligands from the [Au6Ln]2+ cluster increase significantly with successive removal steps, suggesting that the process occurs not all at once but sequentially. The calculations also reveal that ligand removal is accompanied by dramatic rearrangements in the topology of the cluster core.

Abstract:

Numerical climate models constitute the best available tools to tackle the problem of climate prediction. Two assumptions lie at the heart of their suitability: (1) a climate attractor exists, and (2) the numerical climate model's attractor lies on the actual climate attractor, or at least on the projection of the climate attractor on the model's phase space. In this contribution, the Lorenz '63 system is used both as a prototype system and as an imperfect model to investigate the implications of the second assumption. By comparing results drawn from the Lorenz '63 system and from numerical weather and climate models, the implications of using imperfect models for the prediction of weather and climate are discussed. It is shown that the imperfect model's orbit and the system's orbit are essentially different, purely due to model error and not to sensitivity to initial conditions. Furthermore, if a model is a perfect model, then the attractor, reconstructed by sampling a collection of initialised model orbits (forecast orbits), will be invariant to forecast lead time. This conclusion provides an alternative method for the assessment of climate models.
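
The comparison can be reproduced in a few lines: the sketch below integrates the standard Lorenz '63 equations twice from the same initial condition, once with the classical parameters (the "system") and once with a perturbed parameter (the "imperfect model"), so any divergence of the two orbits is due to model error rather than initial-condition error. The size of the parameter perturbation is an arbitrary illustrative choice.

```python
# Sketch of the perfect- vs imperfect-model comparison with the Lorenz '63 system.
import numpy as np

def lorenz63(state, sigma=10.0, r=28.0, b=8.0 / 3.0):
    x, y, z = state
    return np.array([sigma * (y - x), x * (r - z) - y, x * y - b * z])

def integrate(state, r, dt=0.01, steps=2000):
    traj = [np.array(state, dtype=float)]
    for _ in range(steps):
        s = traj[-1]
        # simple fourth-order Runge-Kutta step
        k1 = lorenz63(s, r=r)
        k2 = lorenz63(s + 0.5 * dt * k1, r=r)
        k3 = lorenz63(s + 0.5 * dt * k2, r=r)
        k4 = lorenz63(s + dt * k3, r=r)
        traj.append(s + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0)
    return np.array(traj)

x0 = [1.0, 1.0, 1.0]
truth = integrate(x0, r=28.0)    # the "system"
model = integrate(x0, r=28.5)    # the "imperfect model" (perturbed parameter)
print("separation after 20 time units:", np.linalg.norm(truth[-1] - model[-1]))
```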

Abstract:

The Plaut, McClelland, Seidenberg and Patterson (1996) connectionist model of reading was evaluated at two points early in its training against reading data collected from British children on two occasions during their first year of literacy instruction. First, the network’s non-word reading was poor relative to word reading when compared with the children. Second, the network made more non-lexical than lexical errors, the opposite pattern to the children. Three adaptations were made to the training of the network to bring it closer to the learning environment of a child: an incremental training regime was adopted; the network was trained on grapheme–phoneme correspondences; and a training corpus based on words found in children’s early reading materials was used. The modifications caused a sharp improvement in non-word reading, relative to word reading, resulting in a near-perfect match to the children’s data on this measure. The modified network, however, continued to make predominantly non-lexical errors, although evidence from a small-scale implementation of the full triangle framework suggests that this limitation stems from the lack of a semantic pathway. Taken together, these results suggest that, when properly trained, connectionist models of word reading can offer insights into key aspects of reading development in children.

Abstract:

This work investigates a type of wireless power system in which analysis of the system yields the construction of a prototype, modeled as a single technological artifact. Exploration of that artifact forms the intellectual basis not only for its prototypical forms but also for variant forms not yet discovered. Through this process, the role of the artifact, its most suitable application given the constraints of the delivery problem, and optimization strategies to improve it are greatly clarified. In order to improve maturity and contribute to a body of knowledge, this document proposes research utilizing efficient inductive transfer in the mid-field region for the purpose of removing wired connections and electrical contacts. While this description states the purpose of the work, it does not convey the compromise involved in having to redraw the lines of demarcation between the near and far field drawn in the traditional treatment of broadcasting. Two striking scenarios are addressed in this thesis: first, the mathematical explanation of wireless power is due to J.C. Maxwell's original equations; second, the behavior of wireless power in the circuit is due to Joseph Larmor's fundamental work on the dynamics of the field concept. A model of propagation will be presented which matches observations in experiments. A modified model of the dipole will be presented to address the phenomena observed in the theory and experiments. Two distinct sets of experiments will test the concepts of single and two coupled modes. In the more esoteric context of the zero- and first-order magnetic field, a third coupled mode is suggested. Through this remaking of wireless power, the author intends to show the reader that ideas lost to history, bound to a path of complete obscurity, are once again innovative and useful.
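
For orientation, the sketch below uses the textbook coupled-mode-theory result for a resonant inductive link, not the modified dipole model developed in the thesis: with coupling coefficient k and coil quality factors Q1 and Q2, the figure of merit is U = k·sqrt(Q1·Q2) and the maximum link efficiency is U²/(1 + sqrt(1 + U²))². The example coil parameters are assumptions.

```python
# Textbook coupled-mode-theory estimate of link efficiency for two resonant
# coils (not the thesis's modified dipole model). All example numbers are assumed.
import math

def max_efficiency(k, q1, q2):
    u = k * math.sqrt(q1 * q2)                      # figure of merit U = k*sqrt(Q1*Q2)
    return u ** 2 / (1.0 + math.sqrt(1.0 + u ** 2)) ** 2

# High-Q coils in the mid-field, where coupling is weak but Q is large.
for k in (0.001, 0.01, 0.1):
    print(f"k = {k}: efficiency = {max_efficiency(k, 1000, 1000):.2%}")
```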

Abstract:

Energy storage is a potential alternative to conventional network reinforcement of the low voltage (LV) distribution network to ensure the grid’s infrastructure remains within its operating constraints. This paper presents a study on the control of such storage devices, owned by distribution network operators. A deterministic model predictive control (MPC) controller and a stochastic receding horizon controller (SRHC) are presented, where the objective is to achieve the greatest peak reduction in demand, for a given storage device specification, taking into account the high level of uncertainty in the prediction of LV demand. The algorithms presented in this paper are compared to a standard set-point controller and benchmarked against a control algorithm with a perfect forecast. A specific case study, using storage on the LV network, is presented, and the results of each algorithm are compared. A comprehensive analysis is then carried out simulating a large number of LV networks of varying numbers of households. The results show that the performance of each algorithm is dependent on the number of aggregated households. However, on a typical aggregation, the novel SRHC algorithm presented in this paper is shown to outperform each of the comparable storage control techniques.
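
The following Python sketch illustrates the deterministic receding-horizon (MPC) idea for peak reduction using the cvxpy modelling library; the demand profile, storage size, one-hour time step and perfect forecast are assumptions, and the paper's SRHC would replace the single forecast with a set of demand scenarios.

```python
# Deterministic receding-horizon sketch of DNO-owned storage for peak shaving.
# Demand data, storage sizes, and the perfect forecast are all assumptions.
import cvxpy as cp
import numpy as np

def plan_storage(demand_forecast, e0, e_max=50.0, p_max=10.0):
    """Minimise the peak of (demand + charge - discharge) over the horizon (1 h steps)."""
    horizon = len(demand_forecast)
    charge = cp.Variable(horizon, nonneg=True)      # kW into the battery
    discharge = cp.Variable(horizon, nonneg=True)   # kW out of the battery
    peak = cp.Variable()
    energy = e0 + cp.cumsum(charge - discharge)     # kWh state of charge
    constraints = [
        charge <= p_max, discharge <= p_max,
        energy >= 0, energy <= e_max,
        demand_forecast + charge - discharge <= peak,
    ]
    cp.Problem(cp.Minimize(peak), constraints).solve()
    return charge.value, discharge.value

# Receding-horizon loop: apply only the first step of each plan, then re-plan.
demand = np.array([20, 22, 25, 40, 55, 48, 30, 25, 22, 20], dtype=float)  # kW, assumed
soc = 25.0
for t in range(len(demand) - 4):
    ch, dis = plan_storage(demand[t:t + 4], soc)
    soc += ch[0] - dis[0]
    print(f"t={t}: net demand {demand[t] + ch[0] - dis[0]:.1f} kW, SoC {soc:.1f} kWh")
```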

Abstract:

Distributed generation plays a key role in reducing CO2 emissions and losses in the transmission of power. However, due to the nature of renewable resources, distributed generation requires suitable control strategies to assure reliability and optimality for the grid. Multi-agent systems are perfect candidates for providing distributed control of distributed generation stations as well as providing reliability and flexibility for grid integration. The proposed multi-agent energy management system consists of single-type agents that control one or more grid entities, which are represented as generic sub-agent elements. The agent applies one control algorithm across all elements and uses a cost function to evaluate the suitability of each element as a supplier. The behavior set by the agent's user defines which parameters of an element have greater weight in the cost function, allowing the user to specify supplier preferences dynamically. This study shows the ability of the multi-agent energy management system to select suppliers according to the selection behavior given by the user. The optimality of the supplier for the required demand is ensured by the cost function, which is based on the parameters of the element.
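
A minimal sketch of the supplier-selection idea is given below in Python; the element parameters, weights and behavior profiles are hypothetical and are not taken from the study.

```python
# Minimal sketch of weighted-cost supplier selection over generic elements.
# Parameter names, weights, and behavior profiles are hypothetical.
from dataclasses import dataclass

@dataclass
class Element:
    name: str
    price: float         # cost per kWh
    co2: float            # kg CO2 per kWh
    available_kw: float   # headroom the element can currently supply

BEHAVIORS = {
    "cheapest": {"price": 1.0, "co2": 0.1},
    "greenest": {"price": 0.1, "co2": 1.0},
}

def cost(element, weights):
    return weights["price"] * element.price + weights["co2"] * element.co2

def select_supplier(elements, demand_kw, behavior):
    weights = BEHAVIORS[behavior]
    candidates = [e for e in elements if e.available_kw >= demand_kw]
    return min(candidates, key=lambda e: cost(e, weights)) if candidates else None

elements = [
    Element("diesel_genset", price=0.30, co2=0.80, available_kw=100),
    Element("pv_plus_storage", price=0.12, co2=0.05, available_kw=40),
    Element("grid_import", price=0.20, co2=0.40, available_kw=500),
]
print(select_supplier(elements, demand_kw=30, behavior="greenest").name)
```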

Abstract:

This article seeks to explore the absence of the body in the depiction of dying women in a selection of seventeenth-century diaries. It considers the cultural forces that made this absence inevitable, and the means by which the physical body was replaced in death by a spiritual presence. The elevation of a dying woman from physical carer to spiritual nurturer in the days before death ensured that gender codes were not broken. The centrality of the body of the dying woman, within a female circle of care and support, was paradoxically juxtaposed with an effacement of the body in descriptions of a good death. In death, a woman might achieve the stillness, silence and compliance so essential to perfect early modern womanhood, and retrospective diary entries can achieve this ideal by replacing the body with images that deflect from the essential physicality of the woman.

Abstract:

Realistic representation of sea ice in ocean models involves the use of a non-linear free surface, a real freshwater flux and observance of the requisite conservation laws. We show here that these properties can be achieved in practice through use of a rescaled vertical coordinate, "z*", in z-coordinate models, which allows one to follow undulations in the free surface under sea ice loading. In particular, the adoption of z* avoids the difficult issue of vanishing levels under thick ice. Details of the implementation within MITgcm are provided. A high-resolution global ocean-sea ice simulation illustrates the robustness of the z* formulation and reveals a source of oceanic variability associated with sea ice dynamics and ice-loading effects. The use of the z* coordinate allows one to achieve perfect conservation of fresh water, heat and salt, as shown in an extended integration of a coupled ocean-sea ice-atmosphere model.
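
A minimal sketch of the rescaling is given below, using the commonly cited definition z* = H(z − η)/(H + η), which maps the instantaneous water column [−H, η] onto the fixed range [−H, 0]; the exact MITgcm implementation includes further details (for example, expressing ice loading as an equivalent surface elevation).

```python
# Sketch of the rescaled coordinate using the commonly cited definition
# z* = H (z - eta) / (H + eta); the MITgcm implementation may differ in detail.
def z_star(z, eta, depth_h):
    """Rescaled vertical coordinate for free-surface height eta over bottom depth H."""
    return depth_h * (z - eta) / (depth_h + eta)

def z_from_z_star(zs, eta, depth_h):
    """Inverse mapping back to geopotential height z."""
    return eta + zs * (depth_h + eta) / depth_h

# With ice loading depressing the free surface (eta = -2 m) over a 100 m column,
# the surface level never vanishes in z*:
print(z_star(-2.0, -2.0, 100.0), z_star(-100.0, -2.0, 100.0))   # -> 0.0, -100.0
```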

Abstract:

Population modelling is increasingly recognised as a useful tool for pesticide risk assessment. For vertebrates that may ingest pesticides with their food, such as the woodpigeon (Columba palumbus), population models that simulate foraging behaviour explicitly can help predict both exposure and population-level impact. Optimal foraging theory is often assumed to explain the individual-level decisions driving distributions of individuals in the field, but it may not adequately predict the spatial and temporal characteristics of woodpigeon foraging because of the woodpigeons’ excellent memory, ability to fly long distances, and distinctive flocking behaviour. Here we present an individual-based model (IBM) of the woodpigeon. We used the model to predict distributions of foraging woodpigeons that use one of six alternative foraging strategies: optimal foraging, memory-based foraging and random foraging, each with or without flocking mechanisms. We used pattern-oriented modelling to determine which of the foraging strategies is best able to reproduce observed data patterns. Data used for model evaluation were gathered during a long-term woodpigeon study conducted between 1961 and 2004 and a radiotracking study conducted in 2003 and 2004, both in the UK, and are summarised here as three complex patterns: the distributions of foraging birds between vegetation types during the year, the number of fields visited daily by individuals, and the proportion of fields revisited by them on subsequent days. The model with a memory-based foraging strategy and a flocking mechanism was the only one to reproduce these three data patterns, and the optimal foraging model produced poor matches to all of them. The random foraging strategy reproduced two of the three patterns but was not able to guarantee population persistence. We conclude that, with the memory-based foraging strategy and flocking mechanism, our model is realistic enough to estimate the potential exposure of woodpigeons to pesticides. We discuss how exposure can be linked to our model, and how the model could be used for risk assessment of pesticides, for example predicting exposure and effects in heterogeneous landscapes planted seasonally with a variety of crops, while accounting for differences in land use between landscapes.
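
As a toy illustration of the contrast between two of these strategies, the following Python sketch compares random field choice with memory-based choice plus a simple flocking bonus; it is not the published woodpigeon IBM, and the intake values, memory rule and flocking bonus are all invented for illustration.

```python
# Toy contrast between random and memory-based (plus flocking) field choice.
# Illustration only; not the published IBM. All parameter values are assumptions.
import random

random.seed(1)
FIELDS = 20
intake = [random.uniform(0.0, 1.0) for _ in range(FIELDS)]   # food value of each field

def run(strategy, birds=50, days=30, flock_bonus=0.2):
    memory = [[0.0] * FIELDS for _ in range(birds)]           # remembered intake per field
    occupancy = [0] * FIELDS
    total = 0.0
    for _ in range(days):
        new_occupancy = [0] * FIELDS
        for b in range(birds):
            if strategy == "random":
                f = random.randrange(FIELDS)
            else:  # memory-based: best remembered field, nudged toward occupied fields
                score = [memory[b][i] + flock_bonus * (occupancy[i] > 0) for i in range(FIELDS)]
                f = max(range(FIELDS), key=lambda i: score[i]) if max(score) > 0 else random.randrange(FIELDS)
            gain = intake[f]
            memory[b][f] = gain                                # remember what this field yielded
            new_occupancy[f] += 1
            total += gain
        occupancy = new_occupancy
    return total / (birds * days)

print("random:", run("random"), "memory+flock:", run("memory"))
```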

Abstract:

Ships and wind turbines generate noise, which can have a negative impact on marine mammal populations by scaring animals away. Effective modelling of how this affects the populations has to take account of the location and timing of disturbances. Here we construct an individual-based model of harbour porpoises in the Inner Danish Waters. Individuals have their own energy budgets constructed using established principles of physiological ecology. Data are lacking on the spatial distribution of food, which is instead inferred from knowledge of time-varying porpoise distributions. The model produces plausible patterns of population dynamics and matches well the age distribution of porpoises caught in by-catch. It estimates the effect of existing wind farms as a 10% reduction in population size when food recovers fast (after two days). Proposed new wind farms and ships do not result in further population declines. The population is, however, sensitive to variations in mortality resulting from by-catch and to the speed at which food recovers after being depleted. If food recovers slowly the effect of wind turbines becomes negligible, whereas ships are estimated to have a significant negative impact on the population. Annual by-catch rates ≥10% lead to monotonically decreasing populations and to extinction, and even the estimated by-catch rate from the adjacent area (approximately 4.1%) has a strong impact on the population. This suggests that conservation efforts should be more focused on reducing by-catch in commercial gillnet fisheries than on limiting the amount of anthropogenic noise. Individual-based models are unique in their ability to take account of the location and timing of disturbances and to show their likely effects on populations. The models also identify deficiencies in the existing database and can be used to set priorities for future field research.
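
A back-of-the-envelope projection of the by-catch sensitivity can be written as below; the intrinsic growth rate used is a hypothetical value chosen only to illustrate why annual by-catch fractions above roughly r/(1+r) force a monotonic decline, and it is not an estimate from the porpoise IBM.

```python
# Back-of-the-envelope projection: a population growing at an assumed rate r
# declines whenever the annual by-catch fraction exceeds roughly r/(1+r).
# The growth rate below is hypothetical, not an output of the porpoise IBM.
def project(n0, r, bycatch, years=50):
    n = n0
    for _ in range(years):
        n = n * (1.0 + r) * (1.0 - bycatch)
    return n

for bycatch in (0.0, 0.041, 0.10):
    print(f"by-catch {bycatch:.1%}: population after 50 yr = {project(10000, 0.04, bycatch):.0f}")
```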