50 results for Optimisations


Relevance: 10.00%

Abstract:

Several medical tests, such as breast cancer screening, rely on the observation of tissue sections under a microscope. These tests depend on the interpretation of a specialist, and the results can vary from one expert to another because of the subjectivity of the observations. An analytical technique offering quantification and identification of molecular targets in a tissue section would allow experts to produce more objective diagnoses and could reduce the number of false diagnoses. The work presented in this thesis concerns the development of an SPRi-MALDI-IMS technique enabling two-dimensional imaging of proteins contained in a tissue section. MALDI-IMS is the technique of choice for imaging biomolecules in tissue sections, but on its own it cannot provide absolute quantification of the material adsorbed on the surface. Coupling MALDI-IMS with SPRi therefore enables absolute two-dimensional quantification of proteins and yields a technique that meets the needs of medical experts. To this end, we studied the effect of surface chemistry on the nature and quantity of material adsorbed on the sensor surface. In addition, the kinetics of protein transfer from the tissue to the sensor had to be optimised to produce imprints corresponding to the original tissue and to reach the dynamic range of the SPRi and MALDI-IMS instruments. The technique resulting from these optimisations yields the first quantitative and qualitative two-dimensional images of proteins from a single tissue section.

Relevance: 10.00%

Abstract:

Identification of nucleic acid sub-sequences within larger background sequences is a fundamental need of the biology community, with applications in the search for homologous regions, in diagnostics, and in many related activities. This paper details approaches to sub-sequence identification using hidden Markov models and associated scoring optimisations. Techniques for locating conserved basal promoter elements support promoter, and hence gene, identification. The case study centred on the TATA box basal promoter element: the background is a gene sequence and the TATA box is the target. The outcomes of the research highlight generic algorithms for sub-sequence identification, so these processes can be transposed to any case study in which a target sequence must be identified. Work extending from this investigation has led to the development of a generic framework for the future application of hidden Markov models to biological sequence analysis in a computational context.
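The following sketch illustrates the general idea of locating a short motif such as the TATA box with a hidden Markov model; it is not the paper's implementation, and the consensus motif, emission probabilities and transition probabilities are assumptions chosen purely for illustration.

```python
# A minimal sketch of motif location with an HMM: one background state plus one
# state per motif position, decoded with the Viterbi algorithm. Positions decoded
# as motif states mark a hit. All probabilities below are illustrative assumptions.
import numpy as np

BASES = {"A": 0, "C": 1, "G": 2, "T": 3}
MOTIF = "TATAAA"                     # illustrative consensus, not from the paper
N_STATES = 1 + len(MOTIF)            # state 0 = background, states 1..6 = motif

# Emissions: background is uniform, each motif state favours its consensus base.
emis = np.full((N_STATES, 4), 0.25)
for i, base in enumerate(MOTIF, start=1):
    emis[i] = 0.05
    emis[i, BASES[base]] = 0.85

# Transitions: background mostly loops on itself and occasionally enters the motif;
# motif states step deterministically forward and then return to background.
trans = np.zeros((N_STATES, N_STATES))
trans[0, 0], trans[0, 1] = 0.98, 0.02
for i in range(1, len(MOTIF)):
    trans[i, i + 1] = 1.0
trans[len(MOTIF), 0] = 1.0

def viterbi(seq):
    """Return the most likely state index for every position of seq."""
    obs = [BASES[b] for b in seq]
    with np.errstate(divide="ignore"):
        log_e, log_t = np.log(emis), np.log(trans)
    dp = np.full((len(obs), N_STATES), -np.inf)
    back = np.zeros((len(obs), N_STATES), dtype=int)
    dp[0] = log_e[:, obs[0]] + np.log(1.0 / N_STATES)
    for t in range(1, len(obs)):
        scores = dp[t - 1][:, None] + log_t          # score of every transition
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_e[:, obs[t]]
    path = [int(dp[-1].argmax())]
    for t in range(len(obs) - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

sequence = "GGCGCGTATAAAGGCTAGCT"
states = viterbi(sequence)
hits = [i for i, s in enumerate(states) if s > 0]
print("motif positions:", hits)       # expected to cover the TATAAA stretch
```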

Relevance: 10.00%

Abstract:

This thesis describes the optimisation of the encoding and decoding processes used to transmit and receive frequency-coded data tones acoustically during the operation of an underwater diver navigation system. The aim was to reduce the time required both to generate these data tones for transmission and to decode them on reception. Encoding of the data tones is performed using a phase-locked loop under the control of a microcontroller. A technique was developed which combined hardware and software modifications to effectively halve the phase-locked-loop settling time, and therefore the time required to generate these tones. Decoding of the data tones is achieved using the Fast Fourier Transform. Alternative forms of the Discrete Fourier Transform were explored to find the most efficient in terms of execution time. Numerous software optimisations were then applied, which led to a reduction in program execution time of 54% with no penalty in program complexity or length. Testing of the system under identical real-life operating conditions showed no evidence of any degradation in system performance as a result of these optimisations.
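When only a handful of known tone frequencies must be detected, a single-bin DFT such as the Goertzel algorithm is a common alternative to a full FFT. The sketch below is not the thesis code; the sample rate, block size and tone frequencies are assumptions used only to illustrate the idea.

```python
# A minimal sketch of single-tone detection with the Goertzel algorithm, which
# evaluates one DFT bin with a single multiply-accumulate per sample.
import math

def goertzel_power(samples, sample_rate, tone_hz):
    """Return the squared magnitude of the DFT bin nearest tone_hz."""
    n = len(samples)
    k = int(round(n * tone_hz / sample_rate))      # nearest bin index
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:                              # Goertzel recursion
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    # Bin power computed from the final two recursion values.
    return s_prev ** 2 + s_prev2 ** 2 - coeff * s_prev * s_prev2

# Usage: a 2 kHz tone sampled at 16 kHz should dominate its own bin, not a 3 kHz bin.
fs, n = 16_000, 256
tone = [math.sin(2 * math.pi * 2_000 * i / fs) for i in range(n)]
print(goertzel_power(tone, fs, 2_000) > goertzel_power(tone, fs, 3_000))  # True
```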

Relevance: 10.00%

Abstract:

Oxygen production by air separation is of great importance in both environmental and industrial processes, as most large-scale clean energy technologies require oxygen as a feed gas. Currently, the conventional cryogenic air separation unit is a major economic impediment to the deployment of these clean energy technologies with carbon capture (i.e. oxy-fuel combustion). Dense ceramic perovskite membranes are envisaged to replace cryogenics and reduce O2 production costs by 35% or more, which could cut the energy penalty by around 50% when integrated into an oxy-fuel power plant for CO2 capture. This paper reviews current progress in the development of dense ceramic membranes for oxygen production. The principles, advantages and disadvantages, and the crucial problems of each kind of membrane are discussed. Materials development, optimisation guidelines and suggestions for future research directions are also included. Areas already reviewed elsewhere are treated in less detail.

Relevance: 10.00%

Abstract:

We extend the standard solution to comic rendering with a comic-style specular component. To minimise the computational overhead associated with this extension, we introduce two optimising approximations: the perspective correction angle and the vertex face-orientation measure. Both optimisations are generally applicable, but they are especially well suited to applications where a physically correct lighting simulation is not required. Using our optimisations we achieve performance comparable to the standard solution. As our approximations favour large models, we even outperform the standard approach for models consisting of 10,000 triangles or more, which we render at over 40 frames per second, including the specular component.
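The paper's two approximations are not reproduced here; the sketch below only illustrates what a comic-style specular component means in practice, namely a conventional Blinn-Phong specular term quantised into a flat, hard-edged highlight. The shininess and cutoff values are assumptions.

```python
# A minimal sketch of a comic-style ("toon") specular term: the smooth Blinn-Phong
# highlight is thresholded into a flat spot with a hard edge.
import numpy as np

def toon_specular(normal, light_dir, view_dir, shininess=32.0, cutoff=0.5):
    """Return 1.0 inside the cartoon highlight, 0.0 outside."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    v = view_dir / np.linalg.norm(view_dir)
    h = (l + v) / np.linalg.norm(l + v)            # Blinn-Phong half vector
    spec = max(float(n @ h), 0.0) ** shininess     # smooth specular intensity
    return 1.0 if spec > cutoff else 0.0           # quantise to a flat highlight

# Usage: a surface facing half-way between light and viewer lies inside the spot.
print(toon_specular(np.array([0.0, 0.0, 1.0]),
                    np.array([0.3, 0.0, 1.0]),
                    np.array([-0.3, 0.0, 1.0])))   # 1.0
```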

Relevance: 10.00%

Abstract:

The Orange River, South Africa's largest river, is a critical water resource for the country. In spite of the clear economic benefits of regulating river flows through a series of impoundments, one of the significant undesirable ecological consequences of this regulation has been regular outbreaks of the pest blackfly species Simulium chutteri and S. damnosum s.l. (Diptera: Simuliidae). The current control programme, carried out by the South African National Department of Agriculture, uses regular applications, by helicopter, of the target-specific bacterial larvicide Bacillus thuringiensis var. israelensis (B.t.i.). While cost-benefit analyses show significant benefits to the control programme, benefits could potentially be increased further by applying smaller volumes of larvicide in an optimised manner that accounts for the residual amounts of pesticide carried downstream from upstream applications. Applying an optimisation technique used in the West African Onchocerciasis Control Programme to a 136 km stretch of the Orange River that includes 31 blackfly breeding sites, we demonstrate that 28.5% less larvicide could potentially be used to achieve the same control of blackfly. This translates into potential annual savings of between R540 000 and R1 800 000. A comparison of larvicide volumes estimated using the traditional and optimised approaches at different discharges illustrates that the savings from optimisation decline linearly with increasing flow volumes. Larvicide applications at the lowest discharge considered (40 m³·s⁻¹) showed the greatest benefits from optimisation, with benefits remaining but decreasing to a theoretical 30% up to median flows of 100 m³·s⁻¹. Given that almost 70% of flows in July are less than 100 m³·s⁻¹, we suggest that an optimised approach is appropriate for the Orange River Blackfly Control Programme, particularly for flow volumes of less than 100 m³·s⁻¹. We recommend that trials be undertaken over two reaches of the Orange River, one using the traditional approach and another using the optimised approach, to test the efficacy of using optimised volumes of B.t.i.
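The essence of the optimisation is that a dose applied upstream still contributes to control at downstream breeding sites, so not every site needs a full dose. The sketch below is not the programme's model; the number of sites, the carry factor and the dose requirement are invented purely to illustrate how such a dosing plan can be posed as a linear programme.

```python
# A minimal sketch of dose optimisation with downstream carry: minimise total
# larvicide subject to every breeding site receiving at least one effective dose
# unit, where a dose applied upstream still contributes (with decay) downstream.
import numpy as np
from scipy.optimize import linprog

n_sites = 6          # breeding sites ordered upstream to downstream (assumed)
carry = 0.5          # fraction of a dose still effective one site downstream (assumed)

# Effectiveness matrix A[i, j]: contribution of a dose applied at site j to site i.
A = np.zeros((n_sites, n_sites))
for j in range(n_sites):
    for i in range(j, n_sites):
        A[i, j] = carry ** (i - j)

# linprog minimises c @ x subject to A_ub @ x <= b_ub, so require -A @ x <= -1
# (every site receives an effective dose of at least 1 unit), with x >= 0.
res = linprog(c=np.ones(n_sites),
              A_ub=-A, b_ub=-np.ones(n_sites),
              bounds=[(0, None)] * n_sites, method="highs")

naive_total = float(n_sites)              # one full dose at every site
print("optimised doses:", np.round(res.x, 2))
print("saving vs dosing every site:", f"{100 * (1 - res.fun / naive_total):.0f}%")
```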

Relevance: 10.00%

Abstract:

This paper is concerned with the use of distributed vibration neutralisers to control the transmission of flexural waves on a beam. Of particular interest are an array of beam-like neutralisers and a continuous plate-like neutraliser. General expressions for wave transmission and reflection metrics on either side of the distributed neutralisers are derived. Based on transmission efficiency, the characteristics of multiple neutralisers are investigated in terms of the minimum transmission efficiency, the normalised bandwidth and the shape factor, allowing optimisation of their performance. Analytical results show that the band-stop property of the neutraliser array depends on various factors, including the neutraliser damping, mass, separation distance in the array and the moment arm of each neutraliser. Moreover, it is found that the particular attachment configuration of an uncoupled force-moment-type neutraliser can be used to improve the overall performance. It is also shown that, in the limit of many neutralisers in the array, the performance tends to that of a continuous neutraliser. © 2011 Elsevier Ltd.
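The paper's transmission and reflection metrics are not reproduced here. As background, the sketch below evaluates only the textbook attachment-point dynamic stiffness of a single mass-spring-damper neutraliser, whose sharp peak near the tuned frequency is what suppresses wave transmission on the host beam; the parameter values are assumptions.

```python
# A minimal sketch (textbook mass-spring-damper theory, not the paper's model) of a
# single vibration neutraliser's attachment-point dynamic stiffness.
import numpy as np

m, k, c = 0.1, 1.0e4, 2.0                 # neutraliser mass, stiffness, damping (assumed)
w_tuned = np.sqrt(k / m)                  # tuned circular frequency

def dynamic_stiffness(w):
    """Complex dynamic stiffness K(w) seen by the host structure."""
    kc = k + 1j * w * c                   # spring and damper in parallel
    return -w**2 * m * kc / (kc - w**2 * m)

w = np.linspace(0.5 * w_tuned, 1.5 * w_tuned, 1001)
K = np.abs(dynamic_stiffness(w))
print(f"tuned frequency: {w_tuned / (2 * np.pi):.1f} Hz")
print(f"peak |K| occurs at {w[K.argmax()] / (2 * np.pi):.1f} Hz")  # close to the tuned value
```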

Relevance: 10.00%

Abstract:

This work aims at assessing the acoustic efficiency of different thin noise barrier models. These designs frequently feature complex profiles, and their implementation in shape optimisation processes may not always be easy in terms of determining their topological feasibility. A methodology is proposed to conduct both overall-shape and top-edge optimisations of thin cross-section acoustic barriers by idealising them as profiles with null boundary thickness.

Relevance: 10.00%

Abstract:

This work considers the reconstruction of strong gravitational lenses from their observed effects on the light distribution of background sources. After reviewing the formalism of gravitational lensing and the most common and relevant lens models, new analytical results on the elliptical power-law lens are presented, including new expressions for the deflection, potential, shear and magnification, which naturally lead to a fast numerical scheme for practical calculation. The main part of the thesis investigates lens reconstruction with extended sources by means of the forward reconstruction method, in which the lenses and sources are given by parametric models. The numerical realities of the problem make it necessary to find targeted optimisations for the forward method in order to make it feasible for general application to modern, high-resolution images. The result of these optimisations is presented in the Lensed algorithm. Subsequently, a number of tests for general forward reconstruction methods are created to decouple the influence of source from lens reconstructions, in order to objectively demonstrate the constraining power of the reconstruction. The final chapters on lens reconstruction contain two sample applications of the forward method. One is the analysis of images from a strong lensing survey. Such surveys today contain around 100 strong lenses, and much larger sample sizes are expected in the future, making it necessary to quickly and reliably analyse catalogues of lenses with a fixed model. The second application deals with the opposite situation of a single observation that is to be confronted with different lens models, where the forward method allows for natural model building. This is demonstrated using an example reconstruction of the "Cosmic Horseshoe". An appendix presents independent work on the use of weak gravitational lensing to investigate theories of modified gravity which exhibit screening in the non-linear regime of structure formation.
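The sketch below is not the Lensed algorithm; it is a toy illustration of the forward method, in which a parametric lens and a parametric source are ray-traced to a model image that is fitted to the observation by minimising a chi-square. A singular isothermal sphere lens and a circular Gaussian source are used, and all grids, parameter values and noise levels are assumptions.

```python
# A toy forward lens reconstruction: ray-trace a parametric lens + source to a
# model image and fit the parameters to a mock observation by chi-square.
import numpy as np
from scipy.optimize import minimize

n = 64
x = np.linspace(-2.0, 2.0, n)                 # image-plane grid, arbitrary angular units
tx, ty = np.meshgrid(x, x)

def model_image(params):
    theta_e, src_x, src_y, src_r = params
    r = np.hypot(tx, ty) + 1e-9
    # SIS deflection points toward the lens centre with constant magnitude theta_e.
    bx = tx - theta_e * tx / r                # lens equation: beta = theta - alpha
    by = ty - theta_e * ty / r
    return np.exp(-((bx - src_x) ** 2 + (by - src_y) ** 2) / (2.0 * src_r ** 2))

# Mock "observation" with known true parameters plus noise.
rng = np.random.default_rng(0)
truth = (0.9, 0.15, 0.05, 0.12)
noise_sigma = 0.02
data = model_image(truth) + rng.normal(0.0, noise_sigma, (n, n))

def chi2(params):
    return float(np.sum((data - model_image(params)) ** 2) / noise_sigma ** 2)

fit = minimize(chi2, x0=(0.8, 0.0, 0.0, 0.2), method="Nelder-Mead")
print("recovered parameters:", np.round(fit.x, 3))   # should be close to `truth`
```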

Relevance: 10.00%

Abstract:

In laser sintering, the powder bed is preheated by radiant heaters in order to reach a temperature just below the melting point of the material at the powder surface. The temperature distribution over the surface should be as homogeneous as possible, so that part properties are uniform across the entire build volume and part distortion is kept low. Experience, however, shows very inhomogeneous temperature distributions, which is why the integration of new or improved process-monitoring systems into the machines is frequently demanded. One potentially suitable system is a thermographic camera, which records surface temperatures over an area and thus provides information about the temperatures at the powder-bed surface. Cold regions on the surface can thereby be identified and taken into account during process preparation. At the same time, thermography allows the temperatures during laser exposure to be observed, and hence relationships between process parameters and melt temperatures to be derived. In the investigations carried out, an IR camera system was successfully integrated as a permanent installation in a laser-sintering machine, and solutions were developed for the problems that arose. Subsequently, the temperature distribution on the powder-bed surface and the factors influencing its homogeneity were investigated. Further investigations determined the melt temperatures as a function of various process parameters. Based on these measurement results, conclusions were drawn about the optimisations required, and the usability of thermography in laser sintering for process monitoring, process control and machine maintenance was assessed as a first interim result of the investigations.

Relevance: 10.00%

Abstract:

The Right to Food, as enshrined in international law, has found its way into national constitutions and practices. What matters from a national and international legal point of view is how this policy objective is implemented. In Switzerland, a number of policies and their instruments are relevant here, namely agricultural, supply/stockpile, trade and development policies. This paper (in German) asks whether the policy instruments are coherent and how implementation conflicts and negative spill-over effects could be minimised. It finds that the four policy objectives enshrined in the Federal Constitution are not in themselves incoherent. However, certain Swiss agricultural policy instruments, even where they are compatible with the relevant rules of the World Trade Organization (WTO), do have an avoidable negative impact on the Right to Food of developing-country producers, because Swiss food security is overwhelmingly and increasingly defined by agricultural self-reliance policies ("Food Sovereignty"). This implies higher domestic food prices, commercial displacement and food dumping. The conclusions suggest a number of optimisations as a contribution to the ongoing reform of the National Economic Supply Act 1983 (NESA), such as virtual stockpiles and taxpayer-financed stockpile costs.

Relevance: 10.00%

Abstract:

For perceptual-cognitive skill training, a variety of intervention methods have been proposed, including the so-called "color-cueing method", which aims at superior gaze-path learning by applying visual markers. However, recent findings challenge this method, especially with regard to its actual effects on gaze behavior. Consequently, after a preparatory study on the identification of appropriate visual cues for life-size displays, a perceptual-training experiment on decision-making in beach volleyball was conducted, contrasting two cueing interventions (functional vs. dysfunctional gaze path) with a conservative control condition (anticipation-related instructions). Gaze analyses revealed learning effects for the dysfunctional group only. Regarding decision-making, all groups showed enhanced performance, with the largest improvements for the control group, followed by the functional and the dysfunctional group. Hence, the results confirm cueing effects on gaze behavior, but they also question the method's benefit for enhancing decision-making. However, before completely denying the method's value, possible optimisations should be examined regarding, for instance, cueing-pattern characteristics and gaze-related feedback.

Relevance: 10.00%

Abstract:

This thesis studies full reduction in lambda calculi. In a nutshell, full reduction consists in evaluating the bodies of functions in a functional programming language with binders. The classical (i.e., pure untyped) lambda calculus is taken as the formal system that models the functional paradigm. Full reduction is a prominent technique when programs are treated as data objects, for instance when performing optimisations by partial evaluation, or when some attribute of the program is represented by a program itself, like the type in modern proof assistants. A notable feature of many full-reducing operational semantics is their hybrid nature, which is introduced formally and which constitutes the guiding theme of the thesis. In the lambda calculus, the hybrid nature amounts to a 'phase distinction' in the treatment of abstractions when considered either from outside or from inside themselves. This distinction entails a layered structure in which a hybrid semantics depends on one or more subsidiary semantics. From a programming-languages standpoint, the thesis shows how to derive implementations of full-reducing operational semantics from their specifications by using program transformation techniques. These techniques are syntactic transformations that preserve the semantic equivalence of programs, and the existing techniques are adjusted to work with implementations of hybrid semantics. The thesis also shows how full reduction impacts implementations that use the environment technique, a key ingredient of real-world implementations of abstract machines which helps to circumvent the issue with binders. From a formal-systems standpoint, the thesis discloses a novel consistent theory for the call-by-value variant of the lambda calculus which accounts for full reduction. This theory entails a notion of observational equivalence that distinguishes more points than other existing theories for the call-by-value lambda calculus, and it helps to establish a 'standard theory' in that calculus analogous to the 'standard theory' advocated by Barendregt for the classical lambda calculus. Some proof-theoretical results are presented, and insights into the model-theoretical study are given.
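As a small illustration of what full reduction means, the sketch below implements a normal-order normaliser for the pure lambda calculus that also reduces under binders, using de Bruijn indices so that substitution is capture-avoiding by construction. It is not one of the thesis's derived implementations, and the representation and helper names are assumptions.

```python
# A minimal sketch of full reduction: normal-order beta-normalisation that also
# reduces inside lambda bodies, over a de Bruijn representation of terms.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    idx: int                      # de Bruijn index

@dataclass(frozen=True)
class Lam:
    body: object

@dataclass(frozen=True)
class App:
    fun: object
    arg: object

def shift(t, d, cutoff=0):
    """Shift the free indices of t by d (indices >= cutoff are free)."""
    if isinstance(t, Var):
        return Var(t.idx + d) if t.idx >= cutoff else t
    if isinstance(t, Lam):
        return Lam(shift(t.body, d, cutoff + 1))
    return App(shift(t.fun, d, cutoff), shift(t.arg, d, cutoff))

def subst(t, s, j=0):
    """Substitute s for index j in t, lowering the indices above j."""
    if isinstance(t, Var):
        return s if t.idx == j else Var(t.idx - 1 if t.idx > j else t.idx)
    if isinstance(t, Lam):
        return Lam(subst(t.body, shift(s, 1), j + 1))
    return App(subst(t.fun, s, j), subst(t.arg, s, j))

def whnf(t):
    """Weak head normal form: beta-reduce at the head only, never under binders."""
    while isinstance(t, App):
        fun = whnf(t.fun)
        if isinstance(fun, Lam):
            t = subst(fun.body, t.arg)            # head beta step
        else:
            return App(fun, t.arg)
    return t

def normalise(t):
    """Full beta-normalisation in normal order, including under binders."""
    t = whnf(t)
    if isinstance(t, Lam):
        return Lam(normalise(t.body))             # this step is full reduction
    if isinstance(t, App):
        return App(normalise(t.fun), normalise(t.arg))
    return t

# Usage: (\f. \x. f x) (\z. z) normalises to \x. x, reducing under the binder.
term = App(Lam(Lam(App(Var(1), Var(0)))), Lam(Var(0)))
print(normalise(term))                            # Lam(body=Var(idx=0))
```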

Relevance: 10.00%

Abstract:

Ontology-Based Data Access (OBDA) allows accessing different kinds of data sources (traditionally databases) using a more abstract model provided by an ontology. Query rewriting uses such an ontology to rewrite a query into a rewritten query that can be evaluated on the data source. The rewritten queries retrieve the answers that are entailed by the combination of the data explicitly stored in the data source, the original query and the ontology. Because it operates on queries only, query rewriting enables OBDA over any data source that can be queried, regardless of whether it can be modified. However, producing and evaluating the rewritten queries are both costly processes that generally become more complex as the expressiveness and size of the ontology and the queries increase. In this thesis we explore several optimisations that can be performed both in the rewriting process and in the rewritten queries to improve the applicability of OBDA in real contexts. Our main technical contribution is a query rewriting system that implements the optimisations presented in this thesis. These optimisations are the core contributions of the thesis and can be grouped into three groups:
- optimisations that can be applied when considering which predicates in the ontology are actually mapped to the data sources;
- engineering optimisations that can be applied by handling the query rewriting process in a way that reduces the computational load of the query generation process;
- optimisations that can be applied when considering additional meta-information about the characteristics of the ABox.
In this thesis we provide formal proofs of the correctness of the proposed optimisations, and an empirical evaluation of their impact. As an additional contribution, and as part of this empirical approach, we propose a benchmark for the evaluation of query rewriting systems. We also provide some guidelines for the creation and expansion of this kind of benchmark.
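As a toy illustration of the core idea (not the system described in the thesis), the sketch below rewrites a query over one class into the union of queries over all classes that the ontology's subclass axioms entail to be subsumed by it; the ontology and query are invented for the example.

```python
# A minimal sketch of query rewriting for OBDA: a query atom over a class is
# expanded, using subclass axioms, into a union of atoms that can each be
# evaluated directly on the underlying data source.
from collections import defaultdict

# Subclass axioms: subclass -> superclass (e.g., "every Professor is a Teacher").
axioms = [("Professor", "Teacher"), ("Lecturer", "Teacher"), ("Teacher", "Person")]

subclasses = defaultdict(set)
for sub, sup in axioms:
    subclasses[sup].add(sub)

def rewrite(cls):
    """Return every class whose instances are entailed to be instances of cls."""
    result, frontier = {cls}, [cls]
    while frontier:                       # transitive closure over the subclass graph
        for sub in subclasses[frontier.pop()]:
            if sub not in result:
                result.add(sub)
                frontier.append(sub)
    return result

# Usage: the query "Person(x)" becomes a union of queries, one per returned class.
print(sorted(rewrite("Person")))   # ['Lecturer', 'Person', 'Professor', 'Teacher']
```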

Relevance: 10.00%

Abstract:

The aim of this thesis is to present numerical investigations of the polarisation mode dispersion (PMD) effect. Outstanding issues in the numerical implementation of PMD are resolved, and the proposed methods are further optimised for computational efficiency and physical accuracy. Methods for the mitigation of the PMD effect are taken into account, and simulations of transmission systems with added PMD are presented. The basic outline of the work focusing on PMD is as follows. First, the widely used coarse-step method for simulating the PMD phenomenon, as well as a method derived from the Manakov-PMD equation, are implemented and investigated separately through the distribution of the state of polarisation on the Poincaré sphere and the evolution of the dispersion of a signal. Next, these two methods are statistically examined and compared to well-known analytical models of the probability distribution function (PDF) and the autocorrelation function (ACF) of the PMD phenomenon. Important computational optimisations are achieved for each of the aforementioned implementations. In addition, the ACF of the coarse-step method is considered separately, based on the result indicating that the numerically produced ACF exaggerates the correlation between different frequencies. Moreover, the mitigation of the PMD phenomenon is considered, in the form of numerically implemented low-PMD spun fibres. Finally, all of the above are combined in simulations that demonstrate the impact of PMD on the quality factor (Q-factor) of different transmission systems. For this purpose, a numerical solver based on the coupled nonlinear Schrödinger equation is created, which is also tested against the most important transmission impairments in the early chapters of this thesis.
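The sketch below illustrates the basic coarse-step idea (not the thesis's optimised implementation): the fibre is modelled as a concatenation of short birefringent sections with random polarisation rotations between them, and the differential group delay is extracted from the frequency dependence of the overall Jones matrix. The section count, per-section delay, frequency step and rotation parametrisation are all assumptions.

```python
# A minimal sketch of the coarse-step PMD model: random mode coupling between
# short birefringent sections, with the DGD obtained by Jones matrix eigenanalysis.
import numpy as np

rng = np.random.default_rng(1)
n_sections = 100
tau_section = 0.1e-12          # birefringent delay per section, seconds (assumed)
d_omega = 2 * np.pi * 1e9      # angular-frequency step for the derivative (assumed)

def random_rotation():
    """Random unitary (SU(2)) Jones rotation between sections."""
    a, b, c = rng.uniform(0, 2 * np.pi, 3)
    return np.array([[np.exp(1j * a) * np.cos(c), np.exp(1j * b) * np.sin(c)],
                     [-np.exp(-1j * b) * np.sin(c), np.exp(-1j * a) * np.cos(c)]])

def total_jones(omega, rotations):
    """Concatenate birefringent sections (delay tau_section) and fixed rotations."""
    T = np.eye(2, dtype=complex)
    for R in rotations:
        B = np.diag([np.exp(1j * omega * tau_section / 2),
                     np.exp(-1j * omega * tau_section / 2)])
        T = B @ R @ T
    return T

def dgd(rotations):
    """DGD from the eigenvalues of T(w + dw) T(w)^(-1)."""
    t0 = total_jones(0.0, rotations)
    t1 = total_jones(d_omega, rotations)
    eig = np.linalg.eigvals(t1 @ np.linalg.inv(t0))
    return abs(np.angle(eig[0] / eig[1])) / d_omega

samples = [dgd([random_rotation() for _ in range(n_sections)]) for _ in range(200)]
# The mean DGD of a long fibre grows as the square root of the number of sections.
print(f"mean DGD: {np.mean(samples) * 1e12:.2f} ps")
```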