988 results for level sets


Relevance:

30.00%

Publisher:

Abstract:

The first-level data cache in modern processors has become a major consumer of energy due to its increasing size and high frequency access rate. In order to reduce this high energy consumption, we propose in this paper a straightforward filtering technique based on a highly accurate forwarding predictor. Specifically, a simple structure predicts whether a load instruction will obtain its corresponding data via forwarding from the load-store structure - thus avoiding the data cache access - or whether it will be provided by the data cache. This mechanism manages to reduce the data cache energy consumption by an average of 21.5% with a negligible performance penalty of less than 0.1%. Furthermore, in this paper we also address static cache energy consumption by disabling a portion of the sets of the L2 associative cache. Overall, when merging both proposals, the combined L1 and L2 total energy consumption is reduced by an average of 29.2% with a performance penalty of just 0.25%. Keywords: Energy consumption; filtering; forwarding predictor; cache hierarchy
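As a rough illustration of the kind of mechanism described in the abstract, the sketch below implements a PC-indexed table of saturating counters that predicts whether a load will be served by store-to-load forwarding, so the L1 data cache probe can be skipped. The table size, counter width and update policy are illustrative assumptions, not the paper's actual design.

```python
# Hypothetical sketch of a forwarding predictor: a small PC-indexed table of
# saturating counters guesses whether a load will be served by store-to-load
# forwarding (skip the L1 data cache probe) or must access the cache.
class ForwardingPredictor:
    def __init__(self, entries=1024, bits=2):
        self.entries = entries
        self.max_count = (1 << bits) - 1
        self.table = [0] * entries          # 0 = "access cache", max = "forwarded"

    def _index(self, pc):
        return (pc >> 2) % self.entries     # drop byte offset, fold PC into the table

    def predict_forwarded(self, pc):
        # Predict "forwarded" only when the counter is in its upper half.
        return self.table[self._index(pc)] > self.max_count // 2

    def update(self, pc, was_forwarded):
        i = self._index(pc)
        if was_forwarded:
            self.table[i] = min(self.table[i] + 1, self.max_count)
        else:
            self.table[i] = max(self.table[i] - 1, 0)


if __name__ == "__main__":
    pred = ForwardingPredictor()
    trace = [(0x400, True), (0x500, False)] * 1000   # toy load trace
    correct = 0
    for pc, forwarded in trace:
        correct += pred.predict_forwarded(pc) == forwarded
        pred.update(pc, forwarded)
    print(f"prediction accuracy on toy trace: {correct / len(trace):.3f}")
```

In a real pipeline, a load mispredicted as "forwarded" would simply fall back to the cache access, which is consistent with the small performance penalty the abstract reports.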

Relevance:

30.00%

Publisher:

Abstract:

El objetivo final de las investigaciones recogidas en esta tesis doctoral es la estimación del volumen de hielo total de los más de 1600 glaciares de Svalbard, en el Ártico, y, con ello, su contribución potencial a la subida del nivel medio del mar en un escenario de calentamiento global. Los cálculos más exactos del volumen de un glaciar se efectúan a partir de medidas del espesor de hielo obtenidas con georradar. Sin embargo, estas medidas no son viables para conjuntos grandes de glaciares, debido al coste, dificultades logísticas y tiempo requerido por ellas, especialmente en las regiones polares o de montaña. Frente a ello, la determinación de áreas de glaciares a partir de imágenes de satélite sí es viable a escalas global y regional, por lo que las relaciones de escala volumen-área constituyen el mecanismo más adecuado para las estimaciones de volúmenes globales y regionales, como las realizadas para Svalbard en esta tesis. Como parte del trabajo de tesis, hemos elaborado un inventario de los glaciares de Svalbard en los que se han efectuado radioecosondeos, y hemos realizado los cálculos del volumen de hielo de más de 80 cuencas glaciares de Svalbard a partir de datos de georradar. Estos volúmenes han sido utilizados para calibrar las relaciones volumen-área desarrolladas en la tesis. Los datos de georradar han sido obtenidos en diversas campañas llevadas a cabo por grupos de investigación internacionales, gran parte de ellas lideradas por el Grupo de Simulación Numérica en Ciencias e Ingeniería de la Universidad Politécnica de Madrid, del que forman parte la doctoranda y los directores de tesis. Además, se ha desarrollado una metodología para la estimación del error en el cálculo de volumen, que aporta una novedosa técnica de cálculo del error de interpolación para conjuntos de datos del tipo de los obtenidos con perfiles de georradar, que presentan distribuciones espaciales con unos patrones muy característicos pero con una densidad de datos muy irregular. Hemos obtenido en este trabajo de tesis relaciones de escala específicas para los glaciares de Svalbard, explorando la sensibilidad de los parámetros a diferentes morfologías glaciares, e incorporando nuevas variables. En particular, hemos efectuado experimentos orientados a verificar si las relaciones de escala obtenidas caracterizando los glaciares individuales por su tamaño, pendiente o forma implican diferencias significativas en el volumen total estimado para los glaciares de Svalbard, y si esta partición implica algún patrón significativo en los parámetros de las relaciones de escala. Nuestros resultados indican que, para un valor constante del factor multiplicativo de la relación de escala, el exponente que afecta al área en la relación volumen-área decrece según aumentan la pendiente y el factor de forma, mientras que las clasificaciones basadas en tamaño no muestran un patrón significativo. Esto significa que los glaciares con mayores pendientes y de tipo circo son menos sensibles a los cambios de área. Además, los volúmenes de la población total de los glaciares de Svalbard calculados con fraccionamiento en grupos por tamaño y pendiente son un 1-4% menores que los obtenidos usando la totalidad de glaciares sin fraccionamiento en grupos, mientras que los volúmenes calculados fraccionando por forma son un 3-5% mayores. También realizamos experimentos multivariables para obtener estimaciones óptimas del volumen total mediante una combinación de distintos predictores.
Nuestros resultados muestran que un modelo potencial simple volumen-área explica el 98.6% de la varianza. Sólo el predictor longitud del glaciar proporciona significación estadística cuando se usa además del área del glaciar, aunque el coeficiente de determinación disminuye en comparación con el modelo más simple V-A. El predictor intervalo de altitud no proporciona información adicional cuando se usa además del área del glaciar. Nuestras estimaciones del volumen de la totalidad de glaciares de Svalbard usando las diferentes relaciones de escala obtenidas en esta tesis oscilan entre 6890 y 8106 km3, con errores relativos del orden de 6.6-8.1%. El valor medio de nuestras estimaciones, que puede ser considerado como nuestra mejor estimación del volumen, es de 7.504 km3. En términos de equivalente en nivel del mar (SLE), nuestras estimaciones corresponden a una subida potencial del nivel del mar de 17-20 mm SLE, promediando 19 ± 2 mm SLE, donde el error corresponde al error en volumen antes indicado. En comparación, las estimaciones usando las relaciones V-A de otros autores son de 13-26 mm SLE, promediando 20 ± 2 mm SLE, donde el error representa la desviación estándar de las distintas estimaciones. ABSTRACT The final aim of the research involved in this doctoral thesis is the estimation of the total ice volume of the more than 1600 glaciers of Svalbard, in the Arctic region, and thus their potential contribution to sea-level rise under a global warming scenario. The most accurate calculations of glacier volumes are those based on ice thicknesses measured by ground-penetrating radar (GPR). However, such measurements are not viable for very large sets of glaciers, due to their cost, logistic difficulties and time requirements, especially in polar or mountain regions. On the contrary, the calculation of glacier areas from satellite images is perfectly viable at global and regional scales, so the volume-area scaling relationships are the most useful tool to determine glacier volumes at global and regional scales, as done for Svalbard in this PhD thesis. As part of the PhD work, we have compiled an inventory of the radio-echo sounded glaciers in Svalbard, and we have performed the volume calculations for more than 80 glacier basins in Svalbard from GPR data. These volumes have been used to calibrate the volume-area relationships derived in this dissertation. Such GPR data have been obtained during fieldwork campaigns carried out by international teams, often led by the Group of Numerical Simulation in Science and Engineering of the Technical University of Madrid, to which the PhD candidate and her supervisors belong. Furthermore, we have developed a methodology to estimate the error in the volume calculation, which includes a novel technique to calculate the interpolation error for data sets of the type produced by GPR profiling, which show very characteristic data distribution patterns but with very irregular data density. We have derived in this dissertation scaling relationships specific for Svalbard glaciers, exploring the sensitivity of the scaling parameters to different glacier morphologies and adding new variables. In particular, we performed experiments aimed at verifying whether scaling relationships obtained through characterization of individual glacier shape, slope and size imply significant differences in the estimated volume of the total population of Svalbard glaciers, and whether this partitioning implies any noticeable pattern in the scaling relationship parameters.
Our results indicate that, for a fixed value of the factor in the scaling relationship, the exponent of the area in the volume-area relationship decreases as slope and shape factor increase, whereas size-based classifications do not reveal any clear trend. This means that glaciers with steeper slopes and cirque-type glaciers are less sensitive to changes in glacier area. Moreover, the volumes of the total population of Svalbard glaciers calculated according to partitioning in subgroups by size and slope are smaller (by 1-4%) than those obtained considering all glaciers without partitioning into subgroups, whereas the volumes calculated according to partitioning in subgroups by shape are 3-5% larger. We also performed multivariate experiments attempting to optimally predict the volume of Svalbard glaciers from a combination of different predictors. Our results show that a simple power-type V-A model explains 98.6% of the variance. Only the predictor glacier length provides statistical significance when used in addition to the predictor glacier area, though the coefficient of determination decreases as compared with the simpler V-A model. The predictor elevation range did not provide any additional information when used in addition to glacier area. Our estimates of the volume of the entire population of Svalbard glaciers using the different scaling relationships derived throughout this thesis range within 6890-8106 km3, with estimated relative errors in total volume of the order of 6.6-8.1%. The average value of all of our estimates, which could be used as a best estimate for the volume, is 7,504 km3. In terms of sea-level equivalent (SLE), our volume estimates correspond to a potential contribution to sea-level rise within 17-20 mm SLE, averaging 19 ± 2 mm SLE, where the quoted error corresponds to our estimated relative error in volume. For comparison, the estimates using the V-A scaling relations found in the literature range within 13-26 mm SLE, averaging 20 ± 2 mm SLE, where the quoted error represents the standard deviation of the different estimates.
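A minimal sketch of the volume-area scaling calibration described above, assuming a power law V = c·A^γ fitted by least squares in log-log space; the glacier areas and volumes used here are made-up numbers for illustration, not the thesis data.

```python
# Sketch: calibrate V = c * A**gamma from GPR-derived volumes, then apply it to an
# inventory of glacier areas to get a regional volume estimate.
import numpy as np

areas_km2 = np.array([1.2, 3.5, 10.0, 25.0, 60.0, 110.0])      # hypothetical glacier areas
volumes_km3 = np.array([0.05, 0.25, 1.1, 3.8, 12.0, 26.0])     # hypothetical GPR volumes

# Linear regression of log V on log A gives gamma (slope) and log c (intercept).
gamma, log_c = np.polyfit(np.log(areas_km2), np.log(volumes_km3), 1)
c = np.exp(log_c)
print(f"V ≈ {c:.3f} * A^{gamma:.2f}")

# Applying the calibrated relation to an area inventory and summing gives a
# regional volume estimate of the kind reported for Svalbard.
inventory_areas = np.array([0.8, 2.0, 5.5, 14.0, 40.0, 95.0])  # hypothetical inventory
total_volume = np.sum(c * inventory_areas ** gamma)
print(f"estimated total volume: {total_volume:.1f} km^3")
```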

Relevance:

30.00%

Publisher:

Abstract:

We describe here a method to generate combinatorial libraries of oligonucleotides mutated at the codon-level, with control of the mutagenesis rate so as to create predictable binomial distributions of mutants. The method allows enrichment of the libraries with single, double or larger multiplicity of amino acid replacements by appropriate choice of the mutagenesis rate, depending on the concentration of synthetic precursors. The method makes use of two sets of deoxynucleoside-phosphoramidites bearing orthogonal protecting groups [4,4′-dimethoxytrityl (DMT) and 9-fluorenylmethoxycarbonyl (Fmoc)] in the 5′ hydroxyl. These phosphoramidites are divergently combined during automated synthesis in such a way that wild-type codons are assembled with commercial DMT-deoxynucleoside-methyl-phosphoramidites while mutant codons are assembled with Fmoc-deoxynucleoside-methyl-phosphoramidites in an NNG/C fashion in a single synthesis column. This method is easily automated and suitable for low mutagenesis rates and large windows, such as those required for directed evolution and alanine scanning. Through the assembly of three oligonucleotide libraries at different mutagenesis rates, followed by cloning at the polylinker region of plasmid pUC18 and sequencing of 129 clones, we concluded that the method performs essentially as intended.
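The codon-level control of the mutagenesis rate implies a binomial distribution of the number of replacements per molecule, which is what allows the libraries to be enriched in single, double or higher-order mutants. A small sketch of that calculation, with an illustrative window size and rate rather than the paper's values:

```python
# With n mutable codons and a per-codon mutagenesis rate p (set by the ratio of
# Fmoc- to DMT-phosphoramidite precursors), the number of mutated codons per
# molecule follows Binomial(n, p).
from math import comb

def mutation_distribution(n_codons, p):
    """Probability of k mutated codons for k = 0..n_codons."""
    return [comb(n_codons, k) * p**k * (1 - p)**(n_codons - k) for k in range(n_codons + 1)]

if __name__ == "__main__":
    n, p = 30, 0.05          # e.g. a 30-codon window mutated at a 5% per-codon rate
    dist = mutation_distribution(n, p)
    for k, prob in enumerate(dist[:4]):
        print(f"P({k} replacements) = {prob:.3f}")
    # Raising p enriches the library in double and higher-order replacements.
```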

Relevance:

30.00%

Publisher:

Abstract:

Transformed-rule up and down psychophysical methods have gained great popularity, mainly because they combine criterion-free responses with an adaptive procedure allowing rapid determination of an average stimulus threshold at various criterion levels of correct responses. The statistical theory underlying the methods now in routine use is based on sets of consecutive responses with assumed constant probabilities of occurrence. The response rules requiring consecutive responses prevent the possibility of using the most desirable response criterion, that of 75% correct responses. The earliest transformed-rule up and down method, whose rules included nonconsecutive responses, did not contain this limitation but failed to become generally accepted, lacking a published theoretical foundation. Such a foundation is provided in this article and is validated empirically with the help of experiments on human subjects and a computer simulation. In addition to allowing the criterion of 75% correct responses, the method is more efficient than the methods excluding nonconsecutive responses in their rules.
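For context, the sketch below simulates a conventional transformed up-down staircase (the 2-down/1-up rule, which converges near 70.7% correct) against a simulated observer. It illustrates the general adaptive procedure only, not the nonconsecutive-response rules analysed in this article, and all parameters are illustrative.

```python
# Simulate a 2-down/1-up staircase with a logistic 2AFC observer and estimate the
# threshold from the average of the late reversal levels.
import random
import math

def observer_correct(level, threshold=0.0, slope=1.0, guess=0.5):
    """Simulated 2AFC observer: probability correct rises with stimulus level."""
    p = guess + (1 - guess) / (1 + math.exp(-(level - threshold) / slope))
    return random.random() < p

def run_staircase(n_trials=400, start=5.0, step=0.5):
    level, run_correct, reversals, last_dir = start, 0, [], None
    for _ in range(n_trials):
        if observer_correct(level):
            run_correct += 1
            if run_correct == 2:               # two correct in a row -> step down
                run_correct, direction = 0, -1
            else:
                continue                       # level unchanged until the rule fires
        else:
            run_correct, direction = 0, +1     # any error -> step up
        if last_dir is not None and direction != last_dir:
            reversals.append(level)
        last_dir = direction
        level += direction * step
    tail = reversals[-8:] if reversals else [level]
    return sum(tail) / len(tail)

if __name__ == "__main__":
    print(f"estimated threshold: {run_staircase():.2f}")
```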

Relevance:

30.00%

Publisher:

Abstract:

Immunization of mice with rat type II collagen (CII), a cartilage-specific protein, leads to development of collagen-induced arthritis (CIA), a model for rheumatoid arthritis. To define the interaction between the immune system and cartilage, we produced two sets of transgenic mice. In the first we point mutated the mouse CII gene to express an earlier defined T-cell epitope, CII-(256-270), present in rat CII. In the second we mutated the mouse type I collagen gene to express the same T-cell epitope. The mice with mutated type I collagen showed no T-cell reactivity to rat CII and were resistant to CIA. Thus, the CII-(256-270) epitope is immunodominant and critical for development of CIA. In contrast, the mice with mutated CII had an intact B-cell response and had T cells which could produce gamma interferon, but not proliferate, in response to CII. They developed CIA, albeit with a reduced incidence. Thus, we conclude that T cells recognize CII derived from endogenous cartilage and are partially tolerized but may still be capable of mediating CIA.

Relevance:

30.00%

Publisher:

Abstract:

Despite the vast research examining the evolution of Caribbean education systems, little is chronologically tied to the postcolonial theoretical perspectives of specific island-state systems, such as the Jamaican education system and its relationship with the underground shadow education system. This dissertation study sought to address the gaps in the literature by critically positioning postcolonial theories in education to examine the macro- and micro-level impacts of extra lessons on secondary education in Jamaica. The following postcolonial theoretical (PCT) tenets in education were contextualized from a review of the literature: (a) PCT in education uses colonial discourse analysis to critically deconstruct and decolonize imperialistic and colonial representations of knowledge throughout history; (b) PCT in education uses an anti-colonial discursive framework to re-position indigenous knowledge in schools, colleges, and universities to challenge hegemonic knowledge; (c) PCT in education involves the "unlearning" of dominant, normative ideologies, the use of self-reflexivity, and deconstruction; and (d) PCT in education calls for critical pedagogical approaches that reject the banking concept of education and introduce inclusive pedagogy to facilitate "the passage from naïve to critical transitivity" (Freire, 1973, p. 32). Specifically, using a transformative mixed-methods design, grounded and informed by a postcolonial theoretical lens, I quantitatively uncovered and then qualitatively highlighted how, if at all, extra lessons can improve educational outcomes for students at the secondary level in Jamaica. Accordingly, the quantitative data were used to test the hypotheses that the practice of extra lessons in schools is related to student academic achievement and that the practice of critical-inclusive pedagogy in extra lessons is related to academic achievement. The two-level hierarchical linear model analysis revealed that hours spent in extra lessons, average household monthly income, and critical-inclusive pedagogical tenets were the best predictors of academic achievement. In turn, the holistic multi-case study explored how extra lessons produce increased academic achievement. The data revealed new ways of knowledge construction and critical pedagogical approaches to galvanize systemic change in secondary education. Furthermore, the data showed that extra lessons can improve educational outcomes for students at the secondary level if the conditions for learning are met. This study sets the stage for new forms of knowledge construction and implications for policy change.
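A hedged sketch of the kind of two-level hierarchical linear model referred to above, with students nested in schools and achievement regressed on hours of extra lessons, household income and a critical-inclusive pedagogy score; the variable names and simulated data are hypothetical, not the study's data set.

```python
# Random-intercept model: achievement varies across schools (level 2); fixed effects
# for the student-level predictors (level 1). Data below are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_schools, n_students = 20, 30
school = np.repeat(np.arange(n_schools), n_students)
school_effect = rng.normal(0, 5, n_schools)[school]            # random intercept per school
hours = rng.uniform(0, 10, school.size)
income = rng.normal(50, 15, school.size)
pedagogy = rng.uniform(1, 5, school.size)
achievement = (40 + 2.0 * hours + 0.1 * income + 3.0 * pedagogy
               + school_effect + rng.normal(0, 8, school.size))

data = pd.DataFrame(dict(school=school, hours=hours, income=income,
                         pedagogy=pedagogy, achievement=achievement))

model = smf.mixedlm("achievement ~ hours + income + pedagogy", data, groups=data["school"])
print(model.fit().summary())
```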

Relevance:

30.00%

Publisher:

Abstract:

A growing body of research focuses on the expanding roles of NGOs in global and supranational governance. The research emphasizes the increasing number of participation patterns of NGOs in policymaking and cross-national cooperation. It has produced important insights into the evolving political role of NGOs and their growing involvement in governance. The focus on activities at a transnational level has, however, led to the virtual exclusion of research on other levels of governance. It has not been possible to tell whether the locus of their political activity is shifting from the national to the transnational environment, or whether it is simply broadening. Missing from the literature is an examination of the variety of cooperative relationships, including those between NGOs, which impact policy involvement across different levels of governance. To bridge this gap, I address two key questions: 1) Is the strategy of cooperation among NGOs a common feature of social movement activity across levels of governance, and if so, what does the structure of cooperation look like? 2) What impact, if any, does cooperation have on the expanding political involvement of NGOs, both within and across levels of governance? Using data from an original survey of migrant and refugee organizations across much of Europe, I test several hypotheses that shed light on these issues. The findings broadly indicate that 1) cooperation is a widely used strategy across levels of governance, 2) cooperation with specific sets of actors increases the likelihood of NGO involvement at different levels of governance - specifically, cooperation with EU-level actors increases the likelihood of national-level involvement - and 3) NGOs are more likely to extend their involvement across a range of institutions if they cooperate with a broad range of actors.

Relevance:

30.00%

Publisher:

Abstract:

DNA Microarray is a powerful tool to measure the level of a mixed population of nucleic acids at one time, which has great impact in many aspects of life sciences research. In order to distinguish nucleic acids with very similar composition by hybridization, it is necessary to design microarray probes with high specificities and sensitivities. Highly specific probes correspond to probes having unique DNA sequences; whereas highly sensitive probes correspond to those with melting temperature within a desired range and having no secondary structure. The selection of these probes from a set of functional DNA sequences (exons) constitutes a computationally expensive discrete non-linear search problem. We delegate the search task to a simple yet effective Evolution Strategy algorithm. The computational efficiency is also greatly improved by making use of an available bioinformatics tool.
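As a rough illustration of delegating probe selection to an Evolution Strategy, the sketch below runs a (1+1)-ES over candidate probe positions within an exon, scoring candidates by the closeness of a Wallace-rule melting temperature to a target and by a crude hairpin penalty. The fitness function and all parameters are assumptions for illustration, not the paper's actual method.

```python
# (1+1)-ES over probe start positions: mutate the window position, keep the better
# of parent and offspring according to a simple sensitivity-oriented fitness.
import random

COMPLEMENT = str.maketrans("ACGT", "TGCA")

def wallace_tm(seq):
    """Rough melting temperature by the Wallace rule: 2(A+T) + 4(G+C)."""
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

def hairpin_penalty(seq, stem=4):
    """Penalize probes whose 5' end is reverse-complementary to the 3' end."""
    head, tail = seq[:stem], seq[-stem:]
    return 10.0 if head == tail.translate(COMPLEMENT)[::-1] else 0.0

def fitness(probe, target_tm=60.0):
    return abs(wallace_tm(probe) - target_tm) + hairpin_penalty(probe)

def select_probe(exon, length=25, iterations=2000):
    pos = random.randrange(len(exon) - length + 1)
    best = exon[pos:pos + length]
    for _ in range(iterations):
        # Mutation: shift the probe window by a small random offset, clamped to the exon.
        new_pos = min(max(pos + random.randint(-5, 5), 0), len(exon) - length)
        candidate = exon[new_pos:new_pos + length]
        if fitness(candidate) <= fitness(best):      # keep the better of the two
            best, pos = candidate, new_pos
    return best

if __name__ == "__main__":
    exon = "".join(random.choice("ACGT") for _ in range(500))   # toy exon sequence
    probe = select_probe(exon)
    print(probe, wallace_tm(probe))
```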

Relevance:

30.00%

Publisher:

Abstract:

Retrieving large amounts of information over wide area networks, including the Internet, is problematic due to issues arising from latency of response, lack of direct memory access to data serving resources, and fault tolerance. This paper describes a design pattern for solving the issues of handling results from queries that return large amounts of data. Typically these queries would be made by a client process across a wide area network (or Internet), with one or more middle-tiers, to a relational database residing on a remote server. The solution involves implementing a combination of data retrieval strategies, including the use of iterators for traversing data sets and providing an appropriate level of abstraction to the client, double-buffering of data subsets, multi-threaded data retrieval, and query slicing. This design has recently been implemented and incorporated into the framework of a commercial software product developed at Oracle Corporation.
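A hedged Python sketch of the pattern (the paper's implementation sits inside a commercial Oracle product, so everything here is an illustrative stand-in): the client consumes a plain iterator, while the result set is fetched in slices (query slicing) and the next slice is prefetched on a background thread (double-buffering). fetch_slice is a hypothetical placeholder for the remote database call.

```python
# Iterator + double-buffering sketch: one slice is consumed while the next is fetched.
import threading
from queue import Queue

def fetch_slice(offset, size):
    """Hypothetical remote query returning one slice of a large result set."""
    end = min(offset + size, 1000)                     # pretend the full result has 1000 rows
    return list(range(offset, end))

def sliced_results(slice_size=100):
    """Iterator that hides slicing and prefetching from the client."""
    buffer = Queue(maxsize=1)                          # holds at most one prefetched slice

    def producer():
        offset = 0
        while True:
            rows = fetch_slice(offset, slice_size)
            buffer.put(rows)
            if len(rows) < slice_size:                 # short slice => end of results
                break
            offset += slice_size

    threading.Thread(target=producer, daemon=True).start()
    while True:
        rows = buffer.get()
        yield from rows                                # client consumes row by row
        if len(rows) < slice_size:
            break

if __name__ == "__main__":
    total = sum(1 for _ in sliced_results())
    print(f"rows retrieved: {total}")
```

The bounded queue is what provides the double-buffering: the producer can stay at most one slice ahead of the consumer, capping memory use while hiding network latency.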

Relevance:

30.00%

Publisher:

Abstract:

DNA microarray is a powerful tool to measure the level of a mixed population of nucleic acids at one time, which has great impact in many aspects of life sciences research. In order to distinguish nucleic acids with very similar composition by hybridization, it is necessary to design probes with high specificity, i.e. uniqueness, and also high sensitivity, i.e. a suitable melting temperature and no secondary structure. We make use of available bioinformatics tools to obtain the necessary sequence information for human chromosome 12, combined with an evolution strategy (ES) to find unique subsequences representing all predicted exons. The results are presented and discussed.
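Complementing the melting-temperature side shown earlier, the sketch below illustrates the specificity (uniqueness) criterion: a candidate subsequence is accepted only if it occurs exactly once in the chromosome sequence. The toy random string stands in for human chromosome 12; a real pipeline would use an indexed search (e.g. BLAST or a suffix array) rather than naive scanning.

```python
# Uniqueness check for a candidate probe against a (toy) chromosome sequence.
import random

def count_occurrences(chromosome, probe):
    """Count (possibly overlapping) occurrences of probe in the chromosome."""
    count, start = 0, 0
    while True:
        i = chromosome.find(probe, start)
        if i == -1:
            return count
        count += 1
        start = i + 1

def is_unique(chromosome, probe):
    return count_occurrences(chromosome, probe) == 1

if __name__ == "__main__":
    chromosome = "".join(random.choice("ACGT") for _ in range(100_000))  # toy sequence
    probe = chromosome[5000:5025]       # a 25-mer taken from the sequence itself
    print("unique:", is_unique(chromosome, probe))
```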

Relevance:

30.00%

Publisher:

Abstract:

We study the dynamics of on-line learning in multilayer neural networks where training examples are sampled with repetition and where the number of examples scales with the number of network weights. The analysis is carried out using the dynamical replica method aimed at obtaining a closed set of coupled equations for a set of macroscopic variables from which both training and generalization errors can be calculated. We focus on scenarios whereby training examples are corrupted by additive Gaussian output noise and regularizers are introduced to improve the network performance. The dependence of the dynamics on the noise level, with and without regularizers, is examined, as well as that of the asymptotic values obtained for both training and generalization errors. We also demonstrate the ability of the method to approximate the learning dynamics in structurally unrealizable scenarios. The theoretical results show good agreement with those obtained by computer simulations.
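A simulation sketch of the scenario described above, under stated assumptions: online gradient descent on a two-layer soft committee machine, with a fixed example set sampled with repetition, additive Gaussian noise on the teacher outputs, and a weight-decay regularizer standing in for the regularizers studied analytically. Network sizes, rates and the noise level are illustrative only.

```python
# Online learning on a restricted training set with noisy teacher outputs and weight decay.
import numpy as np

rng = np.random.default_rng(1)
N, K, P = 100, 3, 400            # input dim, hidden units, size of the restricted training set
eta, decay, sigma = 0.1, 1e-3, 0.3

teacher = rng.normal(size=(K, N)) / np.sqrt(N)
student = rng.normal(size=(K, N)) / np.sqrt(N)

X = rng.normal(size=(P, N))                                   # fixed example set
noise = rng.normal(0, sigma, size=P)                          # additive Gaussian output noise
y = np.sum(np.tanh(X @ teacher.T), axis=1) / np.sqrt(K) + noise

def output(W, x):
    return np.sum(np.tanh(W @ x)) / np.sqrt(len(W))

for step in range(20000):
    i = rng.integers(P)                                        # example drawn with repetition
    x, t = X[i], y[i]
    h = student @ x
    err = output(student, x) - t
    grad = err * (1 - np.tanh(h) ** 2)[:, None] * x[None, :] / np.sqrt(K)
    student -= eta / N * grad + decay * student                # SGD step plus weight decay

# Generalization error on fresh, noise-free examples from the teacher.
X_test = rng.normal(size=(2000, N))
y_test = np.sum(np.tanh(X_test @ teacher.T), axis=1) / np.sqrt(K)
pred = np.array([output(student, x) for x in X_test])
print("generalization error:", 0.5 * np.mean((pred - y_test) ** 2))
```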

Relevance:

30.00%

Publisher:

Abstract:

In this paper we summarise key elements of retail change in Britain over a twenty-year period. The time period is that covered by a funded study into long-term change in grocery shopping habits in Portsmouth, England. The major empirical findings—to which we briefly allude—are reported elsewhere: the present task is to assess the wider context underlying that change. For example, it has frequently been stated that retailing in the UK is not as competitive as in other leading economies. As a result, the issue of consumer choice has become increasingly important politically. Concerns over concentration in the industry, new format development and market definition have been expressed by local planners, competition regulators and consumer groups. Macro level changes over time have also created market inequality in consumer opportunities at a local level—hence our decision to attempt a local-level study. Situational factors affecting consumer experiences over time at the local level involve the changing store choice sets available to particular consumers. Using actual consumer experiences thus becomes a yardstick for assessing the practical effectiveness of policy making. The paper demonstrates that choice at local level is driven by store use and that different levels of provision reflect real choice at the local level. Macro-level policy and ‘one size fits all’ approaches to regulation, it is argued, do not reflect the changing reality of grocery shopping. Accordingly, arguments for a more local and regional approach to regulation are made.

Relevance:

30.00%

Publisher:

Abstract:

This thesis analyses the impact of workplace stressors and mood on innovation activities. Based on three competing frameworks offered by cognitive spreading activation theory, the mood repair perspective, and mood-as-information theory, different sets of predictions are developed. These hypotheses are tested in a field study involving 41 R&D teams and 123 individual R&D workers, and in an experimental study involving 54 teams of students. Results of the field study suggest that stressors and mood interact to predict innovation activities in such a way that with increasing stressors a high positive (or negative) mood is more detrimental to innovation activities than a low positive (or negative) mood, lending support to the mood repair perspective. These effects are found for both individuals and teams. In the experimental study this effect is replicated and potential boundary conditions and mediators are tested. In addition, this thesis includes the development of an instrument to assess creativity and implementation activities within the realm of task-related innovative performance.

Relevance:

30.00%

Publisher:

Abstract:

We propose and demonstrate an optical liquid level sensor based on surrounding medium refractive index (SRI) sensing using an excessively tilted fibre grating (ETFG). When the ETFG is submerged in water, two sets of cladding modes are coupled, corresponding to air- and water-surrounded grating structures, respectively. The coupling strengths of the two sets of cladding modes evolve with the submerged length of the grating, providing a mechanism to measure the liquid level. Compared with a long-period fibre grating based liquid level sensor, the ETFG sensor has a much higher SRI responsivity for liquids with refractive index around 1.33 and a lower thermal cross-sensitivity. © 2013 Published by Elsevier B.V. All rights reserved.
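A hedged illustration of the read-out principle, assuming the coupling strength of the water-surrounded resonance grows roughly in proportion to the submerged length of the grating while the air-surrounded one shrinks accordingly; with a two-point calibration (fully dry and fully submerged dip depths) the liquid level follows from the two measured dip strengths. All numbers below are made up.

```python
# Estimate submerged length from the relative strengths of the two cladding-mode
# resonances, assuming an approximately linear dependence on submerged length.
def level_from_couplings(water_dip_db, air_dip_db, water_dip_full_db, air_dip_dry_db,
                         grating_length_mm):
    """Estimate submerged length (mm) from the two measured dip depths (dB)."""
    frac_water = water_dip_db / water_dip_full_db          # 0 when dry, 1 when fully submerged
    frac_air = 1.0 - air_dip_db / air_dip_dry_db           # same quantity from the air resonance
    frac = 0.5 * (frac_water + frac_air)                   # average the two estimates
    return max(0.0, min(1.0, frac)) * grating_length_mm

if __name__ == "__main__":
    # Hypothetical calibration: 12 dB dip depth when fully dry / fully submerged.
    print(level_from_couplings(water_dip_db=6.0, air_dip_db=4.0,
                               water_dip_full_db=12.0, air_dip_dry_db=12.0,
                               grating_length_mm=20.0))     # ~11.7 mm
```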

Relevance:

30.00%

Publisher:

Abstract:

The exponential growth of studies on the biological response to ocean acidification over the last few decades has generated a large amount of data. To facilitate data comparison, a data compilation hosted at the data publisher PANGAEA was initiated in 2008 and is updated on a regular basis (doi:10.1594/PANGAEA.149999). By January 2015, a total of 581 data sets (over 4 000 000 data points) from 539 papers had been archived. Here we present the developments of this data compilation five years since its first description by Nisumaa et al. (2010). Most of the study sites from which data have been archived are still in the Northern Hemisphere, and the number of archived data sets from studies in the Southern Hemisphere and polar oceans remains relatively low. Data from 60 studies that investigated the response of a mix of organisms or natural communities were all added after 2010, indicating a welcome shift from the study of individual organisms to communities and ecosystems. The initial imbalance of considerably more data archived on calcification and primary production than on other processes has improved. There is also a clear tendency towards more data archived from multifactorial studies after 2010. For easier and more effective access to ocean acidification data, the ocean acidification community is strongly encouraged to contribute to the data archiving effort, to help develop standard vocabularies describing the variables, and to define best practices for archiving ocean acidification data.