890 results for Zero sets of bivariate polynomials
Abstract:
Climate change is a major threat to global biodiversity, and its impacts can act synergistically to heighten the severity of other threats. Most research on projecting species range shifts under climate change has not been translated into informing priority management strategies on the ground. We develop a prioritization framework to assess strategies for managing threats to biodiversity under climate change and apply it to the management of invasive animal species across one-sixth of the Australian continent, the Lake Eyre Basin. We collected information from key stakeholders and experts on the impacts of invasive animals on 148 of the region's most threatened species and 11 potential strategies. Assisted by models of current distributions of threatened species and their projected distributions, experts estimated the cost, feasibility, and potential benefits of each strategy for improving the persistence of threatened species with and without climate change. We discover that the relative cost-effectiveness of invasive animal control strategies is robust to climate change, with the management of feral pigs being the highest priority for conserving threatened species overall. Complementary sets of strategies to protect as many threatened species as possible under limited budgets change when climate change is considered, with additional strategies required to avoid impending extinctions from the region. Overall, we find that the ranking of strategies by cost-effectiveness was relatively unaffected by incorporating climate change into decision-making, even though the benefits of the strategies were lower. Future climate conditions and impacts on range shifts become most important to consider when designing comprehensive management plans for the control of invasive animals under limited budgets to maximize the number of threatened species that can be protected.
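The prioritization logic summarized above lends itself to a small sketch. Assuming a cost-effectiveness score of the form benefit × feasibility / cost (a common choice in conservation project prioritization, not necessarily the exact formula used in this study), with entirely hypothetical strategy names and numbers:

```python
# Sketch of a cost-effectiveness ranking for threat-management strategies,
# assuming the score CE = benefit * feasibility / cost.
# Strategy names and numbers below are hypothetical illustrations.

strategies = [
    # (name, expected benefit (persistence gain), feasibility 0-1, cost in $M)
    ("feral pig control", 120.0, 0.8, 10.0),
    ("feral cat control",  90.0, 0.5, 15.0),
    ("rabbit control",     60.0, 0.9,  6.0),
]

def cost_effectiveness(benefit, feasibility, cost):
    """Expected benefit per dollar, discounted by feasibility."""
    return benefit * feasibility / cost

ranked = sorted(strategies,
                key=lambda s: cost_effectiveness(s[1], s[2], s[3]),
                reverse=True)

for name, b, f, c in ranked:
    print(f"{name}: CE = {cost_effectiveness(b, f, c):.2f}")
```

Under a fixed budget, strategies would then be funded in ranked order until the budget is exhausted, which is why rank stability under climate change matters.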
Abstract:
Gut bacterial communities are now known to influence a range of fitness-related aspects of organisms. But how much the microbial community differs between closely related species, and whether these differences can be interpreted as adaptive, is still unclear. In this study we compared microbial communities in two sets of closely related sympatric crater lake cichlid fish species pairs that show similar adaptations along the limnetic-benthic axis. The gut microbial community composition differs in the species pair inhabiting the older of the two crater lakes. One major difference, relative to other fish, is that in these cichlids, which live in hypersaline crater lakes, the microbial community is largely made up of Oceanospirillales (52.28%), which are halotolerant or halophilic bacteria. This analysis opens up further avenues to identify candidate symbiotic or co-evolved bacteria that play a role in adaptation to similar diets and lifestyles, or even a role in speciation. Future functional and phylosymbiotic analyses might help to address these issues.
Abstract:
To further investigate susceptibility loci identified by genome-wide association studies, we genotyped 5,500 SNPs across 14 associated regions in 8,000 samples from a control group and 3 diseases: type 2 diabetes (T2D), coronary artery disease (CAD) and Graves' disease. We defined, using Bayes' theorem, credible sets of SNPs that were 95% likely, based on posterior probability, to contain the causal disease-associated SNPs. In 3 of the 14 regions, TCF7L2 (T2D), CTLA4 (Graves' disease) and CDKN2A-CDKN2B (T2D), much of the posterior probability rested on a single SNP, and, in 4 other regions (CDKN2A-CDKN2B (CAD) and CDKAL1, FTO and HHEX (T2D)), the 95% sets were small, thereby excluding most SNPs as potentially causal. Very few SNPs in our credible sets had annotated functions, illustrating the limitations in understanding the mechanisms underlying susceptibility to common diseases. Our results also show the value of more detailed mapping to target sequences for functional studies. © 2012 Nature America, Inc. All rights reserved.
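The credible-set construction described above can be sketched in a few lines: rank SNPs by posterior probability and accumulate until 95% of the probability is covered. The SNP identifiers and posterior values below are hypothetical.

```python
# Minimal sketch of a 95% credible set as used in Bayesian fine-mapping:
# sort SNPs by posterior probability and accumulate until the cumulative
# probability reaches the chosen level.

def credible_set(posteriors, level=0.95):
    """posteriors: dict of SNP id -> posterior probability (sums to ~1)."""
    ranked = sorted(posteriors.items(), key=lambda kv: kv[1], reverse=True)
    total, chosen = 0.0, []
    for snp, p in ranked:
        chosen.append(snp)
        total += p
        if total >= level:
            break
    return chosen

posteriors = {"rs1": 0.90, "rs2": 0.06, "rs3": 0.03, "rs4": 0.01}
print(credible_set(posteriors))  # most probability on one SNP -> small set
```

When a single SNP carries most of the posterior probability, as in the TCF7L2 and CTLA4 regions above, the resulting set is very small.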
Abstract:
Drivers behave in different ways, and these different behaviors are a cause of traffic disturbances. A key objective for simulation tools is to correctly reproduce this variability, in particular for car-following models. From data collection to the sampling of realistic behaviors, a chain of key issues must be addressed. This paper discusses data filtering, robustness of calibration, correlation between parameters, and sampling techniques of acceleration-time continuous car-following models. The robustness of calibration is systematically investigated with an objective function that allows confidence regions around the minimum to be obtained. Then, the correlation between sets of calibrated parameters and the validity of the joint distributions sampling techniques are discussed. This paper confirms the need for adapted calibration and sampling techniques to obtain realistic sets of car-following parameters, which can be used later for simulation purposes.
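A minimal sketch of the calibration step discussed above, assuming an RMSE objective over observed speeds and a deliberately simple spacing-speed law (not one of the paper's acceleration-time continuous models); the "observed" data are synthetic, generated from known parameters so the search should recover them:

```python
# Sketch of calibrating a car-following model against trajectory data with
# an RMSE objective function. The model v = min(v_max, s / T) is a toy
# spacing-speed law with two parameters: free speed v_max and time gap T.

import math

def model_speed(spacing, v_max, T):
    return min(v_max, spacing / T)

# Synthetic observations generated with v_max = 30 m/s, T = 1.5 s.
spacings = [10, 20, 30, 40, 50, 60]
observed = [model_speed(s, 30.0, 1.5) for s in spacings]

def rmse(v_max, T):
    errs = [(model_speed(s, v_max, T) - o) ** 2
            for s, o in zip(spacings, observed)]
    return math.sqrt(sum(errs) / len(errs))

# Coarse grid search over the two parameters; a real calibration would map
# the objective around the minimum to obtain confidence regions.
best = min((rmse(v, t), v, t)
           for v in [20.0, 25.0, 30.0, 35.0]
           for t in [1.0, 1.5, 2.0])
print(best)  # lowest RMSE at the true parameters
```

Repeating such a calibration per driver yields the sets of parameters whose correlations and joint distributions the paper examines before sampling.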
Abstract:
Molecular phylogenetic studies of homologous sequences of nucleotides often assume that the underlying evolutionary process was globally stationary, reversible, and homogeneous (SRH), and that a model of evolution with one or more site-specific and time-reversible rate matrices (e.g., the GTR rate matrix) is enough to accurately model the evolution of data over the whole tree. However, an increasing body of data suggests that evolution under these conditions is an exception, rather than the norm. To address this issue, several non-SRH models of molecular evolution have been proposed, but they either ignore heterogeneity in the substitution process across sites (HAS) or assume it can be modeled accurately using the Γ distribution. As an alternative to these models of evolution, we introduce a family of mixture models that approximate HAS without the assumption of an underlying predefined statistical distribution. This family of mixture models is combined with non-SRH models of evolution that account for heterogeneity in the substitution process across lineages (HAL). We also present two algorithms for searching model space and identifying an optimal model of evolution that is less likely to over- or underparameterize the data. The performance of the two new algorithms was evaluated using alignments of nucleotides with 10 000 sites simulated under complex non-SRH conditions on a 25-tipped tree. The algorithms were found to be very successful, identifying the correct HAL model with a 75% success rate (the average success rate for assigning rate matrices to the tree's 48 edges was 99.25%) and, for the correct HAL model, identifying the correct HAS model with a 98% success rate. Finally, parameter estimates obtained under the correct HAL-HAS model were found to be accurate and precise.
The merits of our new algorithms were illustrated with an analysis of 42 337 second codon sites extracted from a concatenation of 106 alignments of orthologous genes encoded by the nuclear genomes of Saccharomyces cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, S. castellii, S. kluyveri, S. bayanus, and Candida albicans. Our results show that second codon sites in the ancestral genome of these species contained 49.1% invariable sites, 39.6% variable sites belonging to one rate category (V1), and 11.3% variable sites belonging to a second rate category (V2). The ancestral nucleotide content was found to differ markedly across these three sets of sites, and the evolutionary processes operating at the variable sites were found to be non-SRH and best modeled by a combination of eight edge-specific rate matrices (four for V1 and four for V2). The number of substitutions per site at the variable sites also differed markedly, with sites belonging to V1 evolving slower than those belonging to V2 along the lineages separating the seven species of Saccharomyces. Finally, sites belonging to V1 appeared to have ceased evolving along the lineages separating S. cerevisiae, S. paradoxus, S. mikatae, S. kudriavzevii, and S. bayanus, implying that they might have become so selectively constrained that they could be considered invariable sites in these species.
Abstract:
This work grew out of an attempt to understand a conjectural remark made to the author by Professor Kyoji Saito about a possible link between the Fox-calculus description of the symplectic structure on the moduli space of representations of the fundamental group of surfaces into a Lie group, and pairs of mutually dual sets of generators of the fundamental group. In fact, in his paper [3], Prof. Kyoji Saito gives an explicit description of the system of dual generators of the fundamental group.
Abstract:
Article 38(1) of the Statute of the International Court of Justice (hereinafter ICJ) is today generally seen as a direction to the significant sources of international law that the world court must consider in resolving disputes; however, the list is neither exhaustive nor does it encompass all the formal and material sources of the international legal system. Article 38 of the Statute of the ICJ was written ninety years ago, in a different world, and the question is under debate in many states whether the sources mentioned in Article 38 are compatible with the needs of the 21st century. In recent decades, many new actors have come onto the stage and transformed international law, which now governs not only relations among states but also covers many international organizations. Article 38(2) does refer to other possible sources but does not define them. Moreover, law is a set of rules that citizens must follow to maintain peace and order in society. These laws are binding on both the individual and the state at the domestic and international levels. Do states regard a given rule as a rule of international law? The modern legal system of a state is a specified and well-organized set of rules regulating the affairs of its different organs. States also need a body of rules for their intercourse with each other; these sets of rules among states are called "international law." This article examines international law, its foundation, and its sources. It considers whether international conventions and treaties are the only way states can create international law, or whether there is a need for clarity about the sources of international law. The article is divided into two parts: the first deals with the sources of international law discussed in Article 38 of the Statute of the International Court of Justice, whereas the second discusses the material and formal sources of law, which still need recognition as sources of law.
Abstract:
The Reeb graph tracks topology changes in level sets of a scalar function and finds applications in scientific visualization and geometric modeling. This paper describes a near-optimal two-step algorithm that constructs the Reeb graph of a Morse function defined over manifolds in any dimension. The algorithm first identifies the critical points of the input manifold, and then connects these critical points in the second step to obtain the Reeb graph. A simplification mechanism based on topological persistence aids in the removal of noise and unimportant features. A radial layout scheme results in a feature-directed drawing of the Reeb graph. Experimental results demonstrate the efficiency of the Reeb graph construction in practice and its applications.
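The first step of the algorithm, identifying critical points, can be sketched for the piecewise-linear case: a vertex of a PL function on a triangulated 2-manifold is classified by the connectivity of its lower link (the link vertices and edges with lower function value). The octahedral mesh and function values below are a toy example, not data from the paper:

```python
# Sketch of PL critical-point classification by lower-link connectivity:
# empty lower link -> minimum; full link -> maximum; one component ->
# regular; two or more components -> saddle.

from collections import defaultdict

def classify(vertices, triangles):
    """vertices: dict v -> function value; triangles: list of 3-tuples."""
    link = defaultdict(set)              # v -> set of link edges
    for a, b, c in triangles:
        link[a].add(frozenset((b, c)))
        link[b].add(frozenset((a, c)))
        link[c].add(frozenset((a, b)))
    kinds = {}
    for v, fv in vertices.items():
        nbrs = {u for e in link[v] for u in e}
        lower = {u for u in nbrs if vertices[u] < fv}
        # adjacency restricted to the lower link
        adj = defaultdict(set)
        for e in link[v]:
            u, w = tuple(e)
            if u in lower and w in lower:
                adj[u].add(w); adj[w].add(u)
        # count connected components of the lower link by DFS
        seen, comps = set(), 0
        for u in lower:
            if u not in seen:
                comps += 1
                stack = [u]
                while stack:
                    x = stack.pop()
                    if x not in seen:
                        seen.add(x)
                        stack.extend(adj[x] - seen)
        if not lower:
            kinds[v] = "minimum"
        elif lower == nbrs:
            kinds[v] = "maximum"
        elif comps == 1:
            kinds[v] = "regular"
        else:
            kinds[v] = "saddle"
    return kinds

# Octahedron: apexes T and B, equator cycle E1-E2-E3-E4.
verts = {"T": 1.0, "B": -1.0, "E1": 2.0, "E2": 1.5, "E3": 2.2, "E4": 0.0}
tris = [("T","E1","E2"), ("T","E2","E3"), ("T","E3","E4"), ("T","E4","E1"),
        ("B","E2","E1"), ("B","E3","E2"), ("B","E4","E3"), ("B","E1","E4")]
print(classify(verts, tris))
```

The second step of the algorithm would then connect these critical points to form the Reeb graph; the sketch stops at classification.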
Abstract:
Cellular materials, which are often observed in biological systems, exhibit excellent mechanical properties at remarkably low densities. Luffa sponge is one such material, with a complex interconnecting porous structure. In this paper, we studied the relationship between its structural and mechanical properties at different levels of its hierarchical organization, from a single fiber to a segment of the whole sponge. The tensile mechanical behavior of three single fibers was examined with an Instron testing machine, and the ultrastructure of a fractured single fiber was observed in a scanning electron microscope. Moreover, the compressive mechanical behaviors of foam-like blocks from different locations in the sponge were examined, and the compressive stress-strain responses of four sets of segmental samples were compared. The results show that the single fiber is a porous composite material consisting mainly of cellulose fibrils and a lignin/hemicellulose matrix, and that its Young's modulus and strength are comparable to those of wood. The mechanical behavior of the block samples from the hoop wall is superior to that of samples from the core part. Furthermore, the influence of the inner surface on the mechanical properties of the segmental sample is stronger than that of the core part; in particular, the former's Young's modulus, strength, and strain energy absorbed are about 1.6 times higher. The present work improves our understanding of the structure-function relationship of this natural material, which may inspire the fabrication of new biomimetic foams with desirable mechanical efficiency for applications in anti-crushing devices and super-light sandwich panels.
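As a minimal illustration of the kind of quantity the tensile tests above report, Young's modulus can be estimated as the slope of the initial linear region of the stress-strain curve. The data points below are hypothetical, not measurements from the paper:

```python
# Sketch of extracting Young's modulus from tensile stress-strain data as
# the least-squares slope of the initial linear region.

def linear_slope(xs, ys):
    """Ordinary least-squares slope of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

# strain (dimensionless) and stress (MPa) in the initial linear region
strain = [0.000, 0.001, 0.002, 0.003, 0.004]
stress = [0.0, 2.0, 4.0, 6.0, 8.0]

E = linear_slope(strain, stress)   # Young's modulus in MPa
print(f"E = {E:.0f} MPa")
```

The same slope extraction applies to the compressive stress-strain responses of the block and segmental samples.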
Abstract:
Poor pharmacokinetics is one of the reasons for the withdrawal of drug candidates from clinical trials. There is an urgent need for investigating in vitro ADME (absorption, distribution, metabolism and excretion) properties and recognising unsuitable drug candidates as early as possible in the drug development process. Current throughput of in vitro ADME profiling is insufficient because effective new synthesis techniques, such as drug design in silico and combinatorial synthesis, have vastly increased the number of drug candidates. Assay technologies for larger sets of compounds than are currently feasible are critically needed. The first part of this work focused on the evaluation of cocktail strategy in studies of drug permeability and metabolic stability. N-in-one liquid chromatography-tandem mass spectrometry (LC/MS/MS) methods were developed and validated for the multiple component analysis of samples in cocktail experiments. Together, cocktail dosing and LC/MS/MS were found to form an effective tool for increasing throughput. First, cocktail dosing, i.e. the use of a mixture of many test compounds, was applied in permeability experiments with Caco-2 cell culture, which is a widely used in vitro model for small intestinal absorption. A cocktail of 7-10 reference compounds was successfully evaluated for standardization and routine testing of the performance of Caco-2 cell cultures. Secondly, cocktail strategy was used in metabolic stability studies of drugs with UGT isoenzymes, which are among the most important phase II drug metabolizing enzymes. The study confirmed that the determination of intrinsic clearance (Clint) as a cocktail of seven substrates is possible. The LC/MS/MS methods that were developed were fast and reliable for the quantitative analysis of a heterogeneous set of drugs from Caco-2 permeability experiments and the set of glucuronides from in vitro stability experiments.
The performance of a new ionization technique, atmospheric pressure photoionization (APPI), was evaluated through comparison with electrospray ionization (ESI), with both techniques used for the analysis of Caco-2 samples. Like ESI, APPI proved to be a reliable technique for the analysis of Caco-2 samples, and it was even more flexible than ESI because of its wider linear dynamic range. The second part of the experimental study focused on metabolite profiling. Different mass spectrometric instruments and commercially available software tools were investigated for profiling metabolites in urine and hepatocyte samples. All the instruments tested (triple quadrupole, quadrupole time-of-flight, ion trap) exhibited both strengths and weaknesses in searching for and identifying expected and non-expected metabolites. Although current profiling software is helpful, it is still insufficient, and a time-consuming, largely manual approach is still required for metabolite profiling from complex biological matrices.
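The intrinsic clearance (Clint) mentioned above is commonly estimated from substrate-depletion data: fit ln(concentration) against time, take the elimination rate constant from the slope, and scale by incubation volume per amount of enzyme protein. The concentrations, volume, and protein amount below are hypothetical, and this standard calculation is shown only as a sketch of the general approach:

```python
# Sketch of the substrate-depletion estimate of intrinsic clearance:
# k from the slope of ln(conc) vs time, Clint = k * volume / protein.

import math

times = [0, 10, 20, 30, 40]                  # min
conc  = [1.00, 0.74, 0.55, 0.41, 0.30]       # uM, roughly exponential decay

# least-squares slope of ln(conc) against time
lnc = [math.log(c) for c in conc]
n = len(times)
mt, ml = sum(times) / n, sum(lnc) / n
k = -sum((t - mt) * (l - ml) for t, l in zip(times, lnc)) / \
     sum((t - mt) ** 2 for t in times)        # elimination constant, 1/min

half_life = math.log(2) / k                   # min
incubation_volume = 0.5                       # mL
protein = 0.25                                # mg protein in the incubation
clint = k * incubation_volume / protein       # mL/min/mg protein

print(f"k = {k:.4f} 1/min, t1/2 = {half_life:.1f} min, Clint = {clint:.4f}")
```

In a cocktail experiment the same fit is repeated per substrate, using the LC/MS/MS signal for each compound in the mixture.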
Abstract:
Topology-based methods have been successfully used for the analysis and visualization of piecewise-linear functions defined on triangle meshes. This paper describes a mechanism for extending these methods to piecewise-quadratic functions defined on triangulations of surfaces. Each triangular patch is tessellated into monotone regions, so that existing algorithms for computing topological representations of piecewise-linear functions may be applied directly to the piecewise-quadratic function. In particular, the tessellation is used for computing the Reeb graph, a topological data structure that provides a succinct representation of level sets of the function.
Abstract:
Epigenetics plays a crucial role in schizophrenia susceptibility. In a previous study, we identified over 4500 differentially methylated sites in prefrontal cortex (PFC) samples from schizophrenia patients. We believe this was the first genome-wide methylation study performed on human brain tissue using the Illumina Infinium HumanMethylation450 BeadChip. To understand the biological significance of these results, we sought to identify a smaller number of differentially methylated regions (DMRs) of more functional relevance compared with individual differentially methylated sites. Since our schizophrenia whole genome methylation study was performed, another study analysing two separate data sets of post-mortem tissue in the PFC from schizophrenia patients has been published. We analysed all three data sets using the bumphunter function found in the Bioconductor package minfi to identify regions that are consistently differentially methylated across distinct cohorts. We identified seven regions that are consistently differentially methylated in schizophrenia, despite considerable heterogeneity in the methylation profiles of patients with schizophrenia. The regions were near CERS3, DPPA5, PRDM9, DDX43, REC8, LY6G5C and a region on chromosome 10. Of particular interest is PRDM9, which encodes a histone methyltransferase that is essential for meiotic recombination and is known to tag genes for epigenetic transcriptional activation. These seven DMRs are likely to be key epigenetic factors in the aetiology of schizophrenia and normal brain neurodevelopment.
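The idea behind the bumphunter approach can be sketched in simplified form: scan probes ordered by genomic position and collect runs of adjacent probes whose case-control methylation difference exceeds a cutoff, merging probes closer than a maximum gap into one candidate DMR. The probe positions and effect sizes below are hypothetical, and the real bumphunter additionally smooths the effect estimates and assesses candidate regions by permutation:

```python
# Simplified sketch of the "bump hunting" idea: group above-cutoff probes
# that lie within max_gap of each other into candidate regions.

def find_bumps(positions, deltas, cutoff=0.1, max_gap=500):
    """positions: sorted probe coordinates; deltas: methylation differences."""
    regions, current = [], None
    for pos, d in zip(positions, deltas):
        if abs(d) >= cutoff:
            if current and pos - current[1] <= max_gap:
                current = (current[0], pos, current[2] + 1)   # extend region
            else:
                if current:
                    regions.append(current)
                current = (pos, pos, 1)                       # start region
        # sub-cutoff probes do not close a region; only distance does
    if current:
        regions.append(current)
    return regions  # list of (start, end, n_probes)

positions = [100, 300, 450, 2000, 2100, 2250, 9000]
deltas    = [0.15, 0.12, 0.02, 0.20, 0.18, 0.16, 0.05]
print(find_bumps(positions, deltas))
```

Requiring regions like these to recur across all three cohorts is what reduces thousands of individual sites to the seven consistent DMRs reported above.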
Abstract:
The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and corresponding confidence limits as given in anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with the effectiveness of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed to determine that a negative binomial distribution was a more appropriate distribution to describe pre-treatment helminth egg counts in faeces than a normal distribution. Simulated egg counts for control animals were generated stochastically from negative binomial distributions and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied. The first was that advocated in the SCA guidelines, the second similar to the first but basing the variance estimates on negative binomial distributions, and the third using Wadley’s method with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on the geometric means. 
A wide selection of parameters was investigated, and for each set 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. These simulations provide the basis for setting conditions under which the methods can be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper confidence limit of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where the effectiveness of the product is close to the cut-off point for defining resistance.
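The arithmetic-mean FECRT calculation discussed above can be sketched as follows, with a simple bootstrap confidence interval standing in for the variance formulas in the guidelines. The egg counts are hypothetical (though, like real counts, overdispersed), and the bootstrap is an illustrative substitute for, not a reproduction of, the methods compared in the study:

```python
# Sketch of the arithmetic-mean faecal egg count reduction:
# reduction = 100 * (1 - mean(treated) / mean(control)),
# with a percentile-bootstrap confidence interval.

import random

control = [480, 1200, 260, 900, 2100, 640, 150, 980, 330, 760]
treated = [40, 0, 110, 20, 260, 0, 15, 90, 5, 60]

def percent_reduction(ctrl, trt):
    return 100.0 * (1.0 - (sum(trt) / len(trt)) / (sum(ctrl) / len(ctrl)))

def bootstrap_ci(ctrl, trt, reps=2000, alpha=0.05, seed=1):
    rng = random.Random(seed)
    stats = sorted(
        percent_reduction(rng.choices(ctrl, k=len(ctrl)),
                          rng.choices(trt, k=len(trt)))
        for _ in range(reps))
    lo = stats[int(reps * alpha / 2)]
    hi = stats[int(reps * (1 - alpha / 2))]
    return lo, hi

pr = percent_reduction(control, treated)
lo, hi = bootstrap_ci(control, treated)
print(f"reduction = {pr:.1f}%, 95% CI = ({lo:.1f}%, {hi:.1f}%)")
# under the suggested redefinition, resistance is flagged when the upper
# confidence limit of percent reduction falls below 95%
```

The study's simulations instead drew counts from negative binomial distributions and compared several variance estimators, but the point estimate above is the same arithmetic-mean quantity.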
Abstract:
In this paper we have used the method of characteristics, developed for two-dimensional unsteady flow problems, to study a simplified axial turbine problem. The system consists of two sets of blades: the guide vanes, which are fixed, and the rotor blades, which move perpendicular to these vanes. The initially undisturbed constant flow in the system is perturbed by introducing a small velocity normal to the rotor blades to simulate a slight constant inclination. The resulting perturbed flow is periodic after the first three cycles. We have studied the perturbed density distribution throughout the system during a period.
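A minimal method-of-characteristics sketch, far simpler than the two-dimensional unsteady turbine problem above: for the linear advection equation u_t + a u_x = 0, the solution is constant along the characteristic lines x - a t = const, so u(x, t) = u0(x - a t). The velocity a and the initial profile below are hypothetical:

```python
# Sketch of the method of characteristics for 1-D linear advection:
# trace the characteristic through (x, t) back to the initial line t = 0.

import math

a = 2.0                                   # advection speed

def u0(x):
    """Initial small perturbation."""
    return 0.1 * math.sin(x)

def u(x, t):
    # the solution is constant along x - a*t = const
    return u0(x - a * t)

# the perturbation at (x, t) equals the initial value carried along the line
print(u(1.0 + a * 0.5, 0.5))  # same value as u0(1.0)
```

In the turbine problem the characteristics additionally interact with moving blade boundaries, which is what produces the periodic perturbed flow described above.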
Abstract:
The principal objective of this study was to determine whether Campylobacter jejuni genotyping methods based upon resolution-optimised sets of single nucleotide polymorphisms (SNPs) and binary genetic markers are capable of identifying epidemiologically linked clusters of chicken-derived isolates. Eighty-eight C. jejuni isolates of known flaA RFLP type were included in the study. They encompassed three groups of ten isolates that were obtained at the same time and place and possessed the same flaA type; these were regarded as being epidemiologically linked. Twenty-six unlinked C. jejuni flaA type I isolates were included to test the ability of SNP and binary typing to resolve isolates that were not resolved by flaA RFLP. The remaining isolates were of different flaA types. All isolates were typed by real-time PCR interrogation of the resolution-optimised sets of SNPs and binary markers. According to each typing method, the three epidemiologically linked clusters were three different clones that were well resolved from the other isolates. The 26 unlinked C. jejuni flaA type I isolates were resolved into 14 SNP-binary types, indicating that flaA typing can be unreliable for revealing epidemiological linkage. Comparison of the data with data from a fully typed set of isolates associated with human infection revealed that abundant lineages in the chicken isolates that were also found in the human isolates belonged to clonal complexes CC-21 and CC-353, with the usually rare CC-353 member ST-524 being especially abundant in the chicken collection. The chicken isolates selected to be diverse according to flaA were also diverse according to SNP and binary typing. CC-48 was absent from the chicken isolates despite being very common in Australian human infection isolates, indicating that this clonal complex may be a major cause of human disease that is not chicken associated.