990 results for "Generating function"
Abstract:
The current state of the practice in Blackspot Identification (BSI) uses safety performance functions based on total crash counts to identify transport system sites with potentially high crash risk. This paper postulates that total crash count variation over a transport network is the result of multiple distinct crash-generating processes, including geometric characteristics of the road, spatial features of the surrounding environment, and driver behaviour factors. However, these multiple sources are ignored in current modelling methodologies, both in explaining and in predicting crash frequencies across sites. Instead, current practice employs models that imply that a single underlying crash-generating process exists. This model mis-specification may lead to correlating crashes with the incorrect sources of contributing factors (e.g. concluding a crash is predominantly caused by a geometric feature when it is a behavioural issue), which may ultimately lead to inefficient use of public funds and misidentification of true blackspots. This study proposes a latent class model consistent with a multiple-crash-process theory and investigates the influence this model has on correctly identifying crash blackspots. We first present the theoretical and corresponding methodological approach, in which a Bayesian Latent Class (BLC) model is estimated assuming that crashes arise from two distinct risk-generating processes: engineering factors and unobserved spatial factors. The Bayesian model is used to incorporate prior information about the contribution of each underlying process to the total crash count. The methodology is applied to the state-controlled roads in Queensland, Australia, and the results are compared to an Empirical Bayesian Negative Binomial (EB-NB) model. A comparison of goodness-of-fit measures illustrates significantly improved performance of the proposed model compared to the EB-NB model.
The detection of blackspots was also improved when compared to the EB-NB model. In addition, modelling crashes as the result of two fundamentally separate underlying processes reveals more detailed information about unobserved crash causes.
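The EB-NB benchmark above can be illustrated with the standard Empirical Bayes shrinkage estimate used with negative binomial safety performance functions. The sketch below is a minimal illustration only, not the paper's latent class model; the weight parameterisation is one common convention and all numbers are made up:

```python
def eb_estimate(observed, mu, phi):
    """Empirical Bayes crash estimate for one site.

    observed : recorded crash count at the site
    mu       : prediction from the NB safety performance function
    phi      : NB overdispersion parameter (one common parameterisation;
               conventions for the weight vary in the literature)
    """
    w = 1.0 / (1.0 + mu / phi)          # shrinkage weight toward the model
    return w * mu + (1.0 - w) * observed

# Illustrative numbers only: a site with 9 observed crashes and a model
# prediction of 4 is shrunk part-way toward the prediction.
print(eb_estimate(observed=9, mu=4.0, phi=2.0))
```

Sites are then ranked by the EB estimate (or by its excess over the model prediction) to flag candidate blackspots.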
Abstract:
A sensitive dimerization assay for DNA-binding proteins has been developed using gene fusion technology. For this purpose, we engineered a gene fusion between the protein A gene of Staphylococcus aureus and the C gene, the late-gene transactivator of bacteriophage Mu. The C gene was fused to the 3' end of the gene for protein A to generate an A-C fusion. The overexpressed fusion protein was purified in a single step using immunoglobulin affinity chromatography. The purified fusion protein exhibits DNA-binding activity, as demonstrated by electrophoretic mobility shift assays. When the fusion protein A-C was mixed with C and analyzed for DNA binding, in addition to C- and A-C-specific complexes, a single intermediate complex comprising a heterodimer of the C and A-C fusion proteins was observed. Further, the protein A moiety in the fusion protein A-C does not contribute to DNA binding, as demonstrated by proteolytic cleavage and circular dichroism (CD) analysis. The assay has also been applied to analyze the DNA-binding domain of the C protein by generating fusions between protein A and N- and C-terminal deletion mutants of C. The results indicate a role for the region towards the carboxy terminus of the protein in DNA binding. The general applicability of this method is discussed.
Abstract:
Genetic Algorithms are robust search and optimization techniques. A Genetic Algorithm based approach for determining the optimal input distributions for generating random test vectors is proposed in this paper. A cost function based on the COP testability measure for determining the efficacy of the input distributions is discussed. A brief overview of Genetic Algorithms (GAs) and the specific details of our implementation are described. Experimental results based on the ISCAS-85 benchmark circuits are presented. The performance of our GA-based approach is compared with previous results. While the GA generates more efficient input distributions than previous methods, which are based on gradient-descent search, the overhead of the GA in computing the input distributions is larger. To account for the relatively quick convergence of the gradient-descent methods, we analyze the landscape of the COP-based cost function. We prove that the cost function is unimodal in the search space. This feature makes the cost function amenable to optimization by gradient-descent techniques as compared to random search methods such as Genetic Algorithms.
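As a rough illustration of the approach, the sketch below evolves a vector of input signal probabilities with a simple elitist GA. The two-gate "circuit" and its COP-style cost function are stand-ins invented for this example; they are not the paper's benchmark circuits or its exact cost function:

```python
import random

random.seed(0)

# Toy stand-in for a COP-based cost (invented for this sketch): signal
# probabilities propagate through an AND and an OR gate, and the cost
# rewards input distributions that sensitise both gate outputs.
def cost(p):
    p_and = p[0] * p[1] * p[2]                  # P(AND output = 1)
    p_or = 1.0 - (1.0 - p[0]) * (1.0 - p[3])    # P(OR output = 1)
    return p_and * p_or

def ga(pop_size=30, genes=4, gens=60, mut=0.1):
    """Elitist GA over input-probability vectors in [0, 1]^genes."""
    pop = [[random.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost, reverse=True)
        survivors = pop[: pop_size // 2]        # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, genes)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut:           # occasional mutation
                child[random.randrange(genes)] = random.random()
            children.append(child)
        pop = survivors + children
    return max(pop, key=cost)

best = ga()
```

Because the survivors are carried over unchanged, the best cost never decreases from one generation to the next.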
Abstract:
Super-resolution imaging techniques are of paramount interest for applications in bioimaging and fluorescence microscopy. Recent advances in bioimaging demand application-tailored point spread functions. Here, we present several approaches for generating application-tailored point spread functions along with fast imaging capabilities. Aperture engineering techniques provide interesting solutions for obtaining desired system point spread functions. Specially designed spatial filters, realized by optical masks, are outlined in both single-lens and 4Pi configurations. Applications include depth imaging, multifocal imaging, and super-resolution imaging. Such an approach is suitable for fruitful integration with most existing state-of-the-art imaging microscopy modalities.
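A minimal Fourier-optics sketch of aperture engineering, under scalar paraxial assumptions: the intensity PSF of a single-lens system is modelled as the squared magnitude of the Fourier transform of the (engineered) pupil. The masks below are illustrative, not the paper's filters, and the 4Pi configuration is not covered:

```python
import numpy as np

def psf_from_pupil(mask, n=256):
    """Intensity PSF as |FT(pupil)|^2 for a given engineered aperture.

    mask : function of normalised pupil coordinates (x, y) returning the
           transmission of the spatial filter (amplitude masks here).
    """
    x = np.linspace(-1.0, 1.0, n)
    xx, yy = np.meshgrid(x, x)
    inside = xx**2 + yy**2 <= 1.0               # circular pupil support
    pupil = np.where(inside, mask(xx, yy), 0.0)
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(pupil)))
    psf = np.abs(field) ** 2
    return psf / psf.max()

# Open circular aperture versus a hypothetical annular mask (central
# obstruction), which trades sidelobe energy for a narrower central lobe.
open_psf = psf_from_pupil(lambda x, y: np.ones_like(x))
annular_psf = psf_from_pupil(lambda x, y: (x**2 + y**2 > 0.5).astype(float))
```

Swapping in a phase mask (a complex-valued `mask`) covers the broader class of engineered filters in the same framework.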
Abstract:
Methods for generating a new population are a fundamental component of estimation of distribution algorithms (EDAs). They serve to transfer the information contained in the probabilistic model to the newly generated population. In EDAs based on Markov networks, methods for generating new populations usually discard information contained in the model to gain efficiency. Other methods, such as Gibbs sampling, use information about all interactions in the model but are computationally very costly. In this paper we propose new methods for generating new solutions in EDAs based on Markov networks. We introduce approaches based on inference methods for computing the most probable configurations and on model-based template recombination. We show that applying different variants of inference methods can increase the EDAs' convergence rate and reduce the number of function evaluations needed to find the optimum of binary and non-binary discrete functions.
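For orientation, the sketch below shows the generic EDA loop with the simplest possible probabilistic model, independent univariate marginals, on a toy onemax problem. Markov-network EDAs replace this model (and the sampling step) with ones that capture variable interactions, which is the subject of the paper:

```python
import random

random.seed(1)

def onemax(x):
    """Toy fitness: number of ones in the bit string."""
    return sum(x)

def umda(n_bits=20, pop_size=50, gens=40, trunc=0.5):
    """Univariate EDA: learn marginal bit probabilities from the selected
    individuals, then sample the next population from them."""
    probs = [0.5] * n_bits
    best = None
    for _ in range(gens):
        pop = [[int(random.random() < p) for p in probs]
               for _ in range(pop_size)]
        pop.sort(key=onemax, reverse=True)
        if best is None or onemax(pop[0]) > onemax(best):
            best = list(pop[0])
        selected = pop[: int(pop_size * trunc)]
        # Clamped marginal frequencies keep some exploration alive.
        probs = [min(0.95, max(0.05,
                 sum(ind[i] for ind in selected) / len(selected)))
                 for i in range(n_bits)]
    return best

best = umda()
```

The paper's methods slot into the sampling step: instead of drawing bits independently, new solutions are built from most-probable-configuration inference or template recombination on the learned Markov network.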
Abstract:
We generalize the Faddeev-Jackiw canonical path-integral quantization from the case of a unit Jacobian (J = 1) to the general case of a non-unit Jacobian, give the representation of the quantum transition amplitude in terms of symplectic variables, and obtain the generating functionals of the Green function and the connected Green function. We deduce a unified expression for the symplectic field-variable functions in terms of the Green function, or the connected Green function, with external sources. Furthermore, we obtain generating functionals of the general proper vertices for arbitrary n-point cases, both with and without Grassmann variables; they are regular and take the simplest forms relative to the usual field theory.
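For reference, the standard relations between the generating functionals mentioned here are, schematically (with \xi^a the symplectic variables and \det M[\xi] standing for the non-unit Jacobian factor in the measure, whose precise form is the subject of the paper):

```latex
Z[J] = \int \mathcal{D}\xi \, \det M[\xi] \,
  \exp\!\Big\{ i \int d^4x \, \big( \mathcal{L}(\xi) + J_a \, \xi^a \big) \Big\},
\qquad
W[J] = -\,i \ln Z[J],

\xi_c^a(x) = \frac{\delta W[J]}{\delta J_a(x)},
\qquad
\Gamma[\xi_c] = W[J] - \int d^4x \, J_a(x) \, \xi_c^a(x),

\Gamma^{(n)}_{a_1 \cdots a_n} =
  \frac{\delta^n \Gamma[\xi_c]}
       {\delta \xi_c^{a_1} \cdots \delta \xi_c^{a_n}} \bigg|_{\xi_c = 0}.
```

Here W[J] generates connected Green functions, and the Legendre transform \Gamma[\xi_c] generates the proper (one-particle-irreducible) vertices.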
Abstract:
Components of a xenobiotic detoxication/toxication system involving mixed-function oxygenases are present in Mytilus edulis. Our paper critically reviews the recent literature on this topic, which reported the apparent absence of such a system in bivalve molluscs, and attempts to reconcile this viewpoint with our own findings on NADPH neotetrazolium reductase, glucose-6-phosphate dehydrogenase and aldrin epoxidation, and with other reports of the presence of mixed-function oxygenases. New experimental data are presented which indicate that some elements of the detoxication/toxication system in M. edulis can be induced by aromatic hydrocarbons derived from crude oil. This includes a brief review of the results of long-term experiments in which mussels were exposed to low concentrations of the water-accommodated fraction of North Sea crude oil (7.7–68 µg l−1), where general stress responses such as reduced physiological scope for growth, cytotoxic damage to lysosomal integrity and cellular damage are considered characteristics of the general stress syndrome induced by the toxic action of the xenobiotics. In addition, induction in the blood cells of microsomal NADPH neotetrazolium reductase (associated with mixed-function oxygenases) and of the NADPH-generating enzyme glucose-6-phosphate dehydrogenase is considered to be a specific biological response to the presence of aromatic hydrocarbons. The consequences of this detoxication/toxication system for Mytilus edulis are discussed in terms of the formation of toxic electrophilic intermediate metabolites, which are highly reactive and can combine with DNA, RNA and proteins, with subsequent damage to these cellular constituents. Implications for neoplasms associated with the blood cells are also discussed.
Finally, in view of the increased use of mussel species in pollutant monitoring programmes, the induction phenomenon which is associated with microsomal enzymes in the blood cells is considered as a possible tool for the detection of the biological effects of environmental contamination by low concentrations of certain groups of organic xenobiotics.
Abstract:
At the start of the industrial revolution (circa 1750) the atmospheric concentration of carbon dioxide (CO2) was around 280 ppm. Since that time the burning of fossil fuel, together with other industrial processes such as cement manufacture and changing land use, has increased this value to 400 ppm, for the first time in over 3 million years. With CO2 being a potent greenhouse gas, the consequence of this rise for global temperatures has been dramatic, and not only for air temperatures. Global Sea Surface Temperature (SST) has warmed by 0.4–0.8 °C during the last century, although regional differences are evident (IPCC, 2007). This rise in atmospheric CO2 levels and the resulting global warming have to some extent been ameliorated by the oceanic uptake of around one quarter of anthropogenic CO2 emissions (Sabine et al., 2004). Initially this was thought to have little or no impact on ocean chemistry because of the capacity of the ocean's carbonate buffering system to neutralise the acidity caused when CO2 dissolves in seawater. However, this assumption was challenged by Caldeira and Wickett (2005), who used model predictions to show that the rate at which carbonate buffering can act is far too slow to moderate significant changes to oceanic chemistry over the next few centuries. Their model predicted that since pre-industrial times, ocean surface water pH had fallen by 0.1 pH unit, indicating a 30% increase in the concentration of H+ ions. Their model also showed that the pH of surface waters could fall by up to 0.4 units before 2100, driven by continued and unabated use of fossil fuels. Alongside increasing levels of dissolved CO2 and H+ (reduced pH), an increase in bicarbonate ions together with a decrease in carbonate ions occurs. These chemical changes are now collectively recognised as “ocean acidification”.
Concern now stems from the knowledge that the concentrations of H+, CO2, bicarbonate and carbonate ions impact upon many important physiological processes vital to maintaining health and function in marine organisms. Additionally, species have evolved under conditions where the carbonate system has remained relatively stable for millions of years, leaving them with potentially reduced capacity to adapt to this rapid change. Evidence suggests that, whilst the impact of ocean acidification is complex, when considered alongside ocean warming the net effect on the health and productivity of the oceans will be detrimental.
Abstract:
There is an increasing demand for environmental assessments of the marine environment to include ecosystem function. However, existing schemes are predominantly based on taxonomic (i.e. structural) measures of biodiversity. Biodiversity and Ecosystem Function (BEF) relationships have been suggested as a mechanism for converting taxonomic information into surrogates of ecosystem function. This review assesses the evidence for marine BEF relationships and their potential to be used in practical monitoring applications (i.e. operationalized). Five key requirements were identified for the practical application of BEF relationships: (1) a complete understanding of the strength, direction and prevalence of marine BEF relationships; (2) an understanding of which biological components are influential within specific BEF relationships; (3) the biodiversity of the selected biological components can be measured easily; (4) the ecological mechanisms that are most important for generating marine BEF relationships (i.e. identity effects or complementarity) are known; and (5) the proportion of the overall functional variance explained by biodiversity, and hence by BEF relationships, has been established. Numerous positive and some negative BEF relationships were found within the literature, although many poorly reproduced the natural species richness, trophic structures or multiple functions of real ecosystems (requirement 1). Null relationships were also reported. The consistency of the positive and negative relationships was often low, which compromised the ability to generalize BEF relationships and the confident application of BEF within marine monitoring. Equally, some biological components and functions have received little or no investigation. Expert judgement was used to attribute biological components using spatial extent, presence and functional rate criteria (requirement 2).
This approach highlighted the biological components contributing most to specific ecosystem functions, and showed that many of the particularly influential components have received the least research attention. The need for biodiversity to be measurable (requirement 3) can be met for most biological components, although with difficulty for the functionally important microbes. Identity effects underpinned most marine BEF relationships (requirement 4). As such, processes that translated structural biodiversity measures into functional diversity were found to generate better BEF relationships. The contribution made by biodiversity, over and above abiotic influences, to the total expression of a particular ecosystem function was rarely measured or considered (requirement 5). Hence it is not possible to determine the overall importance of BEF relationships within the total ecosystem functioning observed. In the few studies where abiotic factors had been considered, it was clear that these modified BEF relationships and had their own direct influence on functional rate. Based on the five requirements, the information required for the immediate ‘operationalization’ of BEF relationships within marine functional monitoring is lacking. However, the concept of BEF inclusion within practical monitoring applications, supported by ecological modelling, shows promise for providing surrogate indicators of functioning.
Abstract:
Caches hide the growing latency of accesses to the main memory from the processor by storing the most recently used data on-chip. To limit the search time through the caches, they are organized in a direct-mapped or set-associative way. Such an organization introduces many conflict misses that hamper performance. This paper studies randomizing set index functions, a technique to place the data in the cache in such a way that conflict misses are avoided. The performance of such a randomized cache strongly depends on the randomization function. This paper discusses a methodology to generate randomization functions that perform well over a broad range of benchmarks. The methodology uses profiling information to predict the conflict miss rate of randomization functions. Then, using this information, a search algorithm finds the best randomization function. Due to implementation issues, it is preferable to use a randomization function that is extremely simple and can be evaluated in little time. For these reasons, we use randomization functions where each randomized address bit is computed as the XOR of a subset of the original address bits. These functions are chosen such that they operate on as few address bits as possible and have few inputs to each XOR. This paper shows that to index a 2^m-set cache, it suffices to randomize m+2 or m+3 address bits and to limit the number of inputs to each XOR to 2 bits to obtain the full potential of randomization. Furthermore, it is shown that the randomization function that we generate for one set of benchmarks also works well for an entirely different set of benchmarks. Using the described methodology, it is possible to reduce the implementation cost of randomization functions with only an insignificant loss in conflict reduction.
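A sketch of such an XOR-based set-index function. The bit assignments below are hypothetical, chosen only to show how two addresses that collide under conventional modulo indexing can be separated:

```python
def make_xor_index(bit_subsets):
    """Build a randomised set-index function: randomised index bit i is
    the XOR of the original address bits listed in bit_subsets[i]."""
    def index(addr):
        idx = 0
        for i, subset in enumerate(bit_subsets):
            bit = 0
            for b in subset:
                bit ^= (addr >> b) & 1
            idx |= bit << i
        return idx
    return index

# Hypothetical layout for a 16-set cache (m = 4) with 4-byte blocks:
# conventional indexing would use address bits 2..5. Here each index bit
# XORs one of those bits with one higher (tag) bit -- two inputs per XOR,
# matching the paper's finding that an XOR fan-in of 2 suffices.
index_fn = make_xor_index([(2, 8), (3, 9), (4, 10), (5, 11)])

def conventional(addr):
    return (addr >> 2) & 0xF

a, b = 0x12C, 0x52C      # two addresses differing only in bit 10
# a and b collide under conventional indexing but not under the XOR scheme.
```

In the paper the subsets fed to each XOR are not hand-picked as above but found by a profile-driven search over candidate functions.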
Abstract:
A nonparametric, small-sample-size test for the homogeneity of two psychometric functions against the left- and right-shift alternatives has been developed. The test is designed to determine whether it is safe to amalgamate psychometric functions obtained in different experimental sessions. The sum of the lower and upper p-values of the exact (conditional) Fisher test for several 2 × 2 contingency tables (one for each point of the psychometric function) is employed as the test statistic. The probability distribution of the statistic under the null (homogeneity) hypothesis is evaluated to obtain corresponding p-values. Power functions of the test have been computed by randomly generating samples from Weibull psychometric functions. The test is free of any assumptions about the shape of the psychometric function; it requires only that all observations are statistically independent. © 2011 Psychonomic Society, Inc.
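A sketch of the test statistic described above, computed with hand-rolled hypergeometric tails (equivalent to the one-sided p-values of the conditional exact Fisher test). Evaluating the statistic's null distribution and the power computations are not shown, and the response counts are illustrative:

```python
from math import comb

def fisher_tails(yes_a, no_a, yes_b, no_b):
    """Lower and upper p-values of the conditional exact (Fisher) test
    for one 2 x 2 table: hypergeometric tails with all margins fixed."""
    n_a, n_b = yes_a + no_a, yes_b + no_b
    k = yes_a + yes_b                        # fixed column margin
    denom = comb(n_a + n_b, k)
    lo, hi = max(0, k - n_b), min(k, n_a)
    probs = {a: comb(n_a, a) * comb(n_b, k - a) / denom
             for a in range(lo, hi + 1)}
    p_lower = sum(p for a, p in probs.items() if a <= yes_a)
    p_upper = sum(p for a, p in probs.items() if a >= yes_a)
    return p_lower, p_upper

def shift_statistic(session_a, session_b):
    """Sum of lower and upper p-values over the points of the psychometric
    function. Each session is a list of (n_yes, n_no) counts, one pair per
    stimulus level; small values suggest a shift between sessions."""
    return sum(sum(fisher_tails(ya, na, yb, nb))
               for (ya, na), (yb, nb) in zip(session_a, session_b))

# Identical sessions: each point contributes 1 + P(observed table), the
# maximum possible, so the statistic is large when there is no shift.
same = shift_statistic([(8, 2), (5, 5)], [(8, 2), (5, 5)])
shifted = shift_statistic([(10, 0), (9, 1)], [(0, 10), (1, 9)])
```

Note that for any table the two tails sum to 1 + P(observed), so the statistic always lies above the number of points; the decision to amalgamate sessions rests on comparing it with its exact null distribution.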
Abstract:
The Zubarev equation-of-motion method has been applied to an anharmonic crystal to O(λ⁴). All possible decoupling schemes have been interpreted in order to determine finite-temperature expressions for the one-phonon Green's function (and self-energy) to O(λ⁴) for a crystal in which every atom is on a site of inversion symmetry. In order to provide a check of these results, the Helmholtz free-energy expressions derived from the self-energy expressions have been shown to agree in the high-temperature limit with the results obtained from the diagrammatic method. Expressions for the correlation functions that are related to the mean-square displacement have been derived to O(λ⁴) in the high-temperature limit.
Abstract:
The attached file was created with Scientific WorkPlace (LaTeX).
Abstract:
Germin and germin-like proteins (GLPs) are encoded by a family of genes found in all plants. They are part of the cupin superfamily of biochemically diverse proteins, a superfamily that has a conserved tertiary structure, though with limited similarity in primary sequence. The subgroups of GLPs have different enzyme functions that include the two hydrogen peroxide-generating enzymes, oxalate oxidase (OxO) and superoxide dismutase. This review summarizes the sequence and structural details of GLPs and also discusses their evolutionary progression, particularly their amplification in gene number during the evolution of the land plants. In terms of function, the GLPs are known to be differentially expressed during specific periods of plant growth and development, a pattern of evolutionary subfunctionalization. They are also implicated in the response of plants to biotic (viruses, bacteria, mycorrhizae, fungi, insects, nematodes, and parasitic plants) and abiotic (salt, heat/cold, drought, nutrient, and metal) stress. Most detailed data come from studies of fungal pathogenesis in cereals. This involvement with the protection of plants from environmental stress of various types has led to numerous plant breeding studies that have found links between GLPs and QTLs for disease and stress resistance. In addition, the OxO enzyme has considerable commercial significance, based principally on its use in the medical determination of oxalate concentrations in plasma and urine. Finally, this review provides information on the nutritional importance of these proteins in the human diet, as several members are known to be allergenic, a feature related to their thermal stability and evolutionary connection to the seed storage proteins, also members of the cupin superfamily.
Abstract:
A basic principle in data modelling is to incorporate available a priori information regarding the underlying data-generating mechanism into the modelling process. We adopt this principle and consider grey-box radial basis function (RBF) modelling capable of incorporating prior knowledge. Specifically, we show how to explicitly incorporate two types of prior knowledge: that the underlying data-generating mechanism exhibits a known symmetry property, and that the underlying process obeys a set of given boundary value constraints. The class of orthogonal least squares regression algorithms can readily be applied to construct parsimonious grey-box RBF models with enhanced generalisation capability.
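One way to hard-wire the symmetry type of prior is sketched below: each basis function is a Gaussian pair mirrored about the origin, so every model in the class is even by construction. Plain least squares stands in for the orthogonal least squares subset selection described in the abstract, and all names and numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def symmetric_rbf_design(x, centres, width):
    """Design matrix whose basis functions are Gaussian pairs mirrored
    about the origin, so any fitted model satisfies f(-x) = f(x)."""
    d_plus = np.subtract.outer(x, centres)      # x - c
    d_minus = np.subtract.outer(x, -centres)    # x + c
    return np.exp(-d_plus**2 / width) + np.exp(-d_minus**2 / width)

# Toy even target with additive noise (all numbers illustrative).
x = np.linspace(-3.0, 3.0, 60)
y = np.cos(2.0 * x) + 0.05 * rng.standard_normal(x.size)

centres = np.linspace(0.0, 3.0, 8)              # centres on one side only
Phi = symmetric_rbf_design(x, centres, width=0.5)
weights, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ weights

# The symmetry prior holds exactly at any test point, not just on average.
f = symmetric_rbf_design(np.array([-2.0, 2.0]), centres, width=0.5) @ weights
```

Because the symmetry is built into the basis rather than penalised, no amount of asymmetric noise in the data can produce an asymmetric fit.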